Python Question

What does the numpy array store when I convert a grayscale image into a numpy array?

I converted a 32 x 32 grayscale image into a numpy array using this procedure:

from skimage import io
image = io.imread('bn2.bmp')

When I print the numpy array, it prints six different matrices. When I printed the size of the numpy array, it showed me 3072. I suppose the calculation is something like 3 x 32 x 32, but I would like to know why. What is the numpy array storing?


Looking at the documentation on the page A crash course on NumPy for images:

If you use image.shape, this will give the dimensions of the image as something like

(32, 32, 3)

which gives the size you found (32 x 32 x 3 = 3072)
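
As a quick check, you can inspect both properties directly (a minimal sketch, assuming the image was loaded with scikit-image's io.imread as in your question):

from skimage import io

image = io.imread('bn2.bmp')  # any RGB image file works here
print(image.shape)  # e.g. (32, 32, 3): rows, columns, channels
print(image.size)   # 3072, i.e. 32 * 32 * 3 total values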

What this shows is that your image is a 32-by-32 pixel image with three channels (red, green, and blue): the array stores one intensity value per channel for each pixel, so the file has evidently been saved as RGB rather than true grayscale. If it were grayscale, the size would be 32 x 32 = 1024, corresponding to a shape of:

(32, 32)

Incidentally, to convert your image to grayscale, you would need to use something like rgb2gray from skimage.color (see the scikit-image documentation).
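
A minimal sketch of that conversion, assuming the same bn2.bmp file and scikit-image:

from skimage import io
from skimage.color import rgb2gray

image = io.imread('bn2.bmp')  # shape (32, 32, 3)
gray = rgb2gray(image)        # collapses the three channels into one

print(gray.shape)  # (32, 32) -- one luminance value per pixel
print(gray.size)   # 1024, i.e. 32 * 32

Note that rgb2gray returns a floating-point array with values in [0, 1], not the original 0-255 integers.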