Christoph Christoph - 1 month ago 30
Java Question

Decoding raw h264 with MediaCodec stream results in black surface

Hello Stack Overflow,

I'm currently writing a framework to achieve a VR experience with a smartphone. Graphical content gets rendered on a server (stereoscopically), encoded, and sent to a smartphone. The one I use is the Nexus 5X from LG.
The app I'm writing originally consisted of two texture views and the logic to decode and display the frames.
However, Android's MediaCodec class crashed in every attempt, so I tried to create a minimal working example with only one surface, based on working code I've written before. But although the MediaCodec no longer throws a CodecException, the surface still remains black.

public class MainActivity extends Activity implements SurfaceHolder.Callback
{
    private DisplayThread displayThread = null;

    @Override
    protected void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        SurfaceView sv = new SurfaceView(this);
        sv.getHolder().addCallback(this);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
        setContentView(sv);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height)
    {
        if (displayThread == null)
        {
            displayThread = new DisplayThread(holder.getSurface());
            displayThread.start();
        }
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) { }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) { }

    private class DisplayThread extends Thread
    {
        private MediaCodec codec;
        private Surface surface;
        private UdpReceiver m_renderSock;

        public DisplayThread(Surface surface)
        {
            this.surface = surface;
        }

        @Override
        public void run()
        {
            m_renderSock = new UdpReceiver(9091);

            // Configuring the media decoder
            try {
                codec = MediaCodec.createDecoderByType("video/avc");
            } catch (IOException e) {
                throw new RuntimeException(e.getMessage());
            }

            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);

            codec.configure(format, surface, null, 0);
            codec.start();

            while (!Thread.interrupted())
            {
                byte[] frameData = m_renderSock.receive();

                // Just for the moment, to cope with the first packets getting lost
                // because of missing ARP, see
                // http://stackoverflow.com/questions/11812731/first-udp-message-to-a-specific-remote-ip-gets-lost
                if (frameData.length == 1)
                    continue;

                /* Edit: this part may be left out */
                final int NAL_START = 1;
                // 103, 104 -> SPS, PPS | 101 -> data (IDR slice)
                int id = 0;
                int dataOffset = 0;

                // Later on this will be done server-side, but for now...
                // Separate the SPS/PPS from the data
                for (int i = 0; i < frameData.length - 4; i++)
                {
                    // Mask each byte to unsigned before shifting; otherwise negative
                    // byte values sign-extend and corrupt the 32-bit comparison
                    id = (frameData[i] & 0xFF) << 24 | (frameData[i + 1] & 0xFF) << 16
                            | (frameData[i + 2] & 0xFF) << 8 | (frameData[i + 3] & 0xFF);

                    if (id == NAL_START && frameData[i + 4] == 101)
                    {
                        dataOffset = i;
                    }
                }

                byte[] SPSPPS = Arrays.copyOfRange(frameData, 0, dataOffset);
                byte[] data = Arrays.copyOfRange(frameData, dataOffset, frameData.length);

                if (SPSPPS.length != 0)
                {
                    int inIndex = codec.dequeueInputBuffer(100000);

                    if (inIndex >= 0)
                    {
                        ByteBuffer input = codec.getInputBuffer(inIndex);
                        input.clear();
                        input.put(SPSPPS);
                        codec.queueInputBuffer(inIndex, 0, SPSPPS.length, 16, MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
                    }
                }
                /* Edit end */

                int inIndex = codec.dequeueInputBuffer(10000);
                if (inIndex >= 0)
                {
                    ByteBuffer inputBuffer = codec.getInputBuffer(inIndex);
                    inputBuffer.clear();
                    //inputBuffer.put(data);
                    inputBuffer.put(frameData);
                    //codec.queueInputBuffer(inIndex, 0, data.length, 16, 0);
                    codec.queueInputBuffer(inIndex, 0, frameData.length, 16, 0);
                }

                MediaCodec.BufferInfo buffInfo = new MediaCodec.BufferInfo();
                int outIndex = codec.dequeueOutputBuffer(buffInfo, 10000);

                switch (outIndex)
                {
                    case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                        break;
                    case MediaCodec.INFO_TRY_AGAIN_LATER:
                        break;
                    case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED: // -3; this solves it
                        break;
                    default:
                        codec.releaseOutputBuffer(outIndex, true);
                }
            }
        }
    }
}


So basically this code had worked in the past. But at that time the MediaCodec API had
ByteBuffer[]
arrays for the input and output buffers. Also, there was no need to separate the SPS/PPS data from the frame data (at least I didn't do it and it worked; that could also be because Nvcuvenc separated every NALU).

I inspected the contents of the two buffers and this is the result:

SPSPPS:
0 0 0 1 103 100 0 32 -84 43 64 40 2 -35 -128 -120 0 0 31 64 0 14 -90 4 120 -31 -107
0 0 1 104 -18 60 -80

Data:
0 0 0 1 101 -72 4 95 ...


For me, this looks correct. The h264 stream is created with Nvidia's NVenc API and, if saved to disk, is playable with VLC without any problem.
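Those byte dumps do decode to the expected NAL unit types: the type is the low five bits of the first byte after the 0 0 0 1 start code. A quick sketch confirming this (class and method names are mine, just for illustration):

```java
public class NalInspector {
    // The NAL unit type is nal_unit_type, the low 5 bits of the
    // header byte that follows the 0 0 0 1 start code.
    static int nalType(byte header) {
        return header & 0x1F;
    }

    public static void main(String[] args) {
        // Header bytes from the dumps above
        System.out.println(nalType((byte) 103)); // 7 = SPS
        System.out.println(nalType((byte) 104)); // 8 = PPS
        System.out.println(nalType((byte) 101)); // 5 = IDR slice (frame data)
    }
}
```

So 103/104/101 really are SPS, PPS, and an IDR slice, matching what the decoder should accept.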

I'm sorry for the large code-lump.
Thanks for your help!

Answer

So the only problem was that dequeueOutputBuffer may still return -3, a.k.a. MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED, which is marked as deprecated. Very nice. By not handling this return value, or more specifically, by passing the constant's value as an index to getOutputBuffer(), the codec throws an error -> black screen.
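Put differently: every negative return value from dequeueOutputBuffer is a status code, not a buffer index, and only non-negative values may be handed to getOutputBuffer(). A minimal sketch of that guard (the helper name is mine; the constant values match the Android documentation):

```java
public class OutputIndexGuard {
    // Status codes per the Android MediaCodec docs:
    // INFO_TRY_AGAIN_LATER = -1, INFO_OUTPUT_FORMAT_CHANGED = -2,
    // INFO_OUTPUT_BUFFERS_CHANGED = -3 (deprecated, but still returned).
    static boolean isRealBufferIndex(int outIndex) {
        // Only non-negative values are valid indices for getOutputBuffer()
        return outIndex >= 0;
    }

    public static void main(String[] args) {
        System.out.println(isRealBufferIndex(-3)); // false: INFO_OUTPUT_BUFFERS_CHANGED
        System.out.println(isRealBufferIndex(0));  // true: a real output buffer index
    }
}
```

Checking `>= 0` before touching the buffer covers any status code the API adds later, not just the three documented ones.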

Edit: Oh, and apparently the whole NAL separation isn't needed either, even though the API states that the SPS and PPS NALUs have to be provided before start. I marked the part that can be left out in my question.
