
I need hardware-accelerated H.264 decoding for a research project, to test a protocol of my own design.

Searching the web, I have found a few ways to perform hardware-accelerated video decoding on Android:

  1. Use ffmpeg with libstagefright (overview of libstagefright), or use libstagefright in the OS directly, as here.
  2. Use OpenMAX on a specific hardware platform, as here for Samsung devices and here for the Qualcomm Snapdragon series.
  3. Some people have mentioned PVplayer.

Some people "say" libstagefright is the only way, while the Qualcomm people have obviously succeeded with their own approach.

Currently I am not sure which of these would work, and I am a little confused. If they all work, I would certainly prefer a hardware-independent method.

I have tested the H/W acceleration of a few video players (VLC, Mobo, Rock, vplayer) on a Galaxy Tab 7.7 (Android 3.2, Exynos): Rock and Mobo work fine, VLC does not, and vplayer seems to have a rendering bug that costs it performance.

Anyway, I did an 'operation' on RockPlayer and deleted all of its .so libraries in data\data\com.redirecting\rockplayer: software decoding now crashes, while hardware decoding still works fine! I wonder how they did that. It suggests to me that hardware-accelerated decoding could be independent of the hardware platform.

Can someone nail down this problem, or provide a reference with additional information or better details?

    • I am a bit confused! Do you want direct access (without the Android media APIs) to the H/W-accelerated decoder to decode your bitstreams? Because all modern phone SoCs decode H.264 using H/W acceleration.
    • @OakBytes, I want to implement H/W-accelerated decoding, however it is done. Currently I only know how to decode the stream with ffmpeg's software decoding. By H/W acceleration I mean a level of performance around 1080p@30fps, whereas software decoding is much weaker. I have avoided calling software decoding "CPU decoding" because the H/W acceleration module is also part of the CPU. What do you mean by all modern phones already using H/W acceleration?
    • When the Gallery media player is used to play H.264 clips, all recent Android phones use the H/W-accelerated H.264 decoder. I guess you plan to use the H.264 decoder to decode a raw H.264 bitstream and get the decoded output, rather than play a file containing H.264 video and some audio.
    • @OakBytes You are right, that's exactly what I want: just raw bitstreams, no MKV or MP4 containers. Sorry I didn't make it clearer. I want to invoke H/W decoding at the NAL or frame level on a raw bitstream, rather than set up a media player for files.
    • @Holyglenn - have you succeeded with your project? Maybe you have found some new information on the subject?

To answer the above question, let me introduce a few concepts related to Android.

OpenMAX Android uses OpenMAX as its codec interface, so all native codecs (hardware-accelerated or otherwise) expose an OpenMAX interface. This interface is used by StageFright (the player framework) to decode media with a codec.

NDK Android allows Java applications to interact with underlying C/C++ native libraries through the NDK. This requires using JNI (the Java Native Interface).
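To make the JNI boundary concrete, here is a minimal sketch of the Java side of such a bridge. The class, method, and library names are hypothetical, not an actual Android API; the native side would be a C/C++ function (e.g. Java_NativeDecoder_decodeNal) built with the NDK that hands the data to the OMX decoder.

```java
// Hypothetical Java-side stub for a native decoder wrapped via JNI.
final class NativeDecoder {
    private NativeDecoder() {}

    /** Loads libnativedecoder.so, which would be built with the NDK and bundled in the APK. */
    static void init() {
        System.loadLibrary("nativedecoder");
    }

    /**
     * Declared native: the JVM resolves this to a symbol in the loaded .so.
     * In a real implementation the C/C++ side would pass the NAL unit to the
     * OMX decoder and return a status code.
     */
    static native int decodeNal(byte[] nal, int length);
}
```

Calling a native method before its library has been loaded throws UnsatisfiedLinkError, which is the usual symptom when the .so is missing from the APK.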

Now, coming to your question: how to tap the native decoder to decode a raw video bitstream?

In Android 4.0 and below, Android did not provide access to the underlying video decoders at the Java layer. You would need to write native code that interacts directly with the OMX decoder. Though this is possible, it is not trivial, as it requires knowing how OMX works and how to expose OMX to the application through the NDK.

In 4.1 (Jelly Bean), Android seems to provide access to hardware-accelerated decoders at the application level through Java APIs. More details about the new APIs: http://developer.android.com/about/versions/android-4.1.html#Multimedia
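Whichever route you take (OMX natively or the Jelly Bean Java APIs), feeding the decoder at NAL level means first splitting the raw Annex B bitstream on its 00 00 01 / 00 00 00 01 start codes. A minimal, platform-independent sketch (class and method names are my own):

```java
import java.util.ArrayList;
import java.util.List;

class AnnexBSplitter {
    /** Splits an Annex B H.264 stream into NAL units, with start codes stripped. */
    public static List<byte[]> split(byte[] stream) {
        List<Integer> prefixes = new ArrayList<>(); // index where each start code begins
        List<Integer> starts = new ArrayList<>();   // index of the first payload byte after it
        for (int i = 0; i + 3 <= stream.length; i++) {
            if (stream[i] == 0 && stream[i + 1] == 0) {
                if (stream[i + 2] == 1) {                       // 3-byte start code
                    prefixes.add(i); starts.add(i + 3); i += 2;
                } else if (i + 4 <= stream.length
                        && stream[i + 2] == 0 && stream[i + 3] == 1) { // 4-byte start code
                    prefixes.add(i); starts.add(i + 4); i += 3;
                }
            }
        }
        List<byte[]> nals = new ArrayList<>();
        for (int n = 0; n < starts.size(); n++) {
            // Each NAL unit runs up to the next start code (or the end of the stream).
            int end = (n + 1 < starts.size()) ? prefixes.get(n + 1) : stream.length;
            byte[] nal = new byte[end - starts.get(n)];
            System.arraycopy(stream, starts.get(n), nal, 0, nal.length);
            nals.add(nal);
        }
        return nals;
    }
}
```

Each returned byte[] is one NAL unit that could then be handed to the decoder, one dequeued input buffer at a time.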

    • @Holygenn I have added a link to the media player APIs in Jelly Bean. In the case of RockPlayer, does it display the video directly using the hardware-accelerated decoder, or does it give you output buffers? With hardware-accelerated decoders, the former is easier to do than the latter.
    • RockPlayer is closed-source; only its ffmpeg configuration is open, so I am not sure which it uses. My guess is that after demuxing, RockPlayer feeds the raw video into a switch that lets the user choose between the third-party software decoder and the system hardware decoder, as its menu suggests, and the result is then processed and displayed. I did the RockPlayer experiment under Honeycomb 3.2. So it seems there could be a way to tap the system codecs without NDK/JNI; not that NDK is too much trouble, I am just exploring the possibility.
    • As I understand it, the Jelly Bean MediaCodec API exposes the system codecs (SW/HW) for raw bitstreams. That seems to be exactly what I need: decoding a raw video stream in hardware and displaying it. Curiosity drives me further, though: I am even more eager to know how to achieve H/W-accelerated decoding without Jelly Bean. Thank you very much for your help so far.
    • Thank you for clarifying the concepts and for the answers. I have looked through the references, only to find it all too true that I must dig into the OS framework. This would definitely work.
    • However, as the RockPlayer experiment suggests (I deleted all the .so libraries, yet hardware decoding still works while software decoding fails), there could be a simpler way to do this on Android 4.0 and below. For my raw bitstream decoding and the rest, I may have to figure out the whole OMX thing anyway. Can you give the Java API in Jelly Bean?