Vladimir Kulyk - 1 month ago
Android Question

Android Camera2 API onImageAvailable ImageReader wrong size

I'm getting preview frames using an OnImageAvailableListener:


    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = null;
        try {
            image = reader.acquireLatestImage();
            Image.Plane[] planes = image.getPlanes();
            ByteBuffer buffer = planes[0].getBuffer();
            byte[] data = new byte[buffer.capacity()];
            buffer.get(data);
            // data.length=332803; width=3264; height=2448
            Log.e(TAG, "data.length=" + data.length + "; width=" + image.getWidth() + "; height=" + image.getHeight());
            // TODO data processing
        } catch (Exception e) {
            e.printStackTrace();
        }
        if (image != null) {
            image.close();
        }
    }


Each time, the length of data is different, but the image width and height stay the same.

Main problem: data.length is too small for a resolution of 3264x2448.

The size of the data array should be 3264*2448 = 7,990,272, not 300,000 to 600,000.

What is wrong?

The ImageReader is created like this:

    imageReader = ImageReader.newInstance(1920, 1080, ImageFormat.JPEG, 5);

Answer

I am not sure, but I think you are taking only one of the planes of the YUV_420_888 format (the luminance plane).
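
(For this plane-based approach to apply, the ImageReader has to be created with ImageFormat.YUV_420_888 rather than ImageFormat.JPEG; with JPEG, getPlanes() returns a single plane of compressed data whose size varies per frame. A minimal sketch, where myListener and backgroundHandler are placeholder names for your own objects:)

    // Sketch: a reader whose images expose the three Y/U/V planes via getPlanes().
    // myListener and backgroundHandler are placeholders for your own objects.
    ImageReader imageReader = ImageReader.newInstance(
            1920, 1080, ImageFormat.YUV_420_888, 5);
    imageReader.setOnImageAvailableListener(myListener, backgroundHandler);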

In my case, I usually transform the image to a byte[] this way:

    private byte[] imageToByteArray(Image m_img) {
        Log.v(LOG_TAG, "Format -> " + m_img.getFormat());

        Image.Plane Y = m_img.getPlanes()[0];
        Image.Plane U = m_img.getPlanes()[1];
        Image.Plane V = m_img.getPlanes()[2];

        int Yb = Y.getBuffer().remaining();
        int Ub = U.getBuffer().remaining();
        int Vb = V.getBuffer().remaining();

        // Your data length should be this byte array's length.
        byte[] data = new byte[Yb + Ub + Vb];

        Y.getBuffer().get(data, 0, Yb);
        U.getBuffer().get(data, Yb, Ub);
        V.getBuffer().get(data, Yb + Ub, Vb);

        // width and height are needed later for the RGB conversion.
        final int width = m_img.getWidth();
        final int height = m_img.getHeight();
        return data;
    }

I then use this byte array to convert the frame to RGB.
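
(For reference, a minimal sketch of that YUV-to-RGB step. It assumes the planes were packed tightly, i.e. pixel stride 1 and no row padding, so the Y plane is width*height bytes and the U and V planes are width*height/4 bytes each; real code should check Plane#getPixelStride() and Plane#getRowStride(). yuvToArgb and clamp are illustrative names, not part of any API:)

    // Sketch: convert the concatenated Y+U+V bytes to ARGB pixels,
    // assuming tightly packed planes (pixel stride 1, no row padding).
    private int[] yuvToArgb(byte[] data, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        int uStart = frameSize;                  // U plane follows Y
        int vStart = frameSize + frameSize / 4;  // V plane follows U (4:2:0)

        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int y = data[row * width + col] & 0xFF;
                // One chroma sample covers a 2x2 block of luma pixels.
                int chromaIndex = (row / 2) * (width / 2) + (col / 2);
                int u = (data[uStart + chromaIndex] & 0xFF) - 128;
                int v = (data[vStart + chromaIndex] & 0xFF) - 128;

                // Standard full-range BT.601 conversion.
                int r = clamp(y + (int) (1.402f * v));
                int g = clamp(y - (int) (0.344f * u + 0.714f * v));
                int b = clamp(y + (int) (1.772f * u));
                argb[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private int clamp(int value) {
        return Math.max(0, Math.min(255, value));
    }

Under those same packing assumptions, the total data length works out to width*height*3/2, i.e. about 11,985,408 bytes for a 3264x2448 frame, rather than the 7,990,272 the question expects.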

Hope this helps.

Cheers. Unai.