
# What does the numpy array store when I convert a grayscale image into a numpy array?

I converted a 32 x 32 grayscale image into a numpy array using this procedure:

```python
import numpy as np
from PIL import Image

image = Image.open('bn2.bmp')
nparray = np.array(image)
```

When I print the numpy array, it prints six different matrices. When I printed the size of the numpy array, it showed me 3072. I suppose the calculation is something like 3 x 32 x 32, but I would like to know why: what is the numpy array storing?

See the scikit-image documentation page *A crash course on NumPy for images*.

If you inspect the array's `shape` attribute,

```python
nparray.shape
```

this will give the dimensions of the image as something like

```python
(32, 32, 3)
```

which gives the size you found (32 x 32 x 3 = 3072)

What this shows is that your image is a 32-by-32 pixel image with three channels (red, green, and blue), so it is being stored as RGB rather than grayscale. If it were truly grayscale, there would be no channel axis: the size would be 32 x 32 = 1024, corresponding to a shape of:

`(32, 32)`
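As a quick sketch of the relationship between shape and size in both cases, using a small in-memory test image (a stand-in for `bn2.bmp`, which isn't available here):

```python
import numpy as np
from PIL import Image

# In-memory 32x32 RGB test image (stand-in for bn2.bmp)
rgb_image = Image.new('RGB', (32, 32))
rgb_array = np.array(rgb_image)
print(rgb_array.shape)  # (32, 32, 3)
print(rgb_array.size)   # 3072

# Pillow's convert('L') collapses the image to one grayscale channel
gray_array = np.array(rgb_image.convert('L'))
print(gray_array.shape)  # (32, 32)
print(gray_array.size)   # 1024
```

The `size` attribute is simply the product of the dimensions in `shape`, which is why dropping the channel axis divides it by three.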

Incidentally, to convert your image to grayscale, you could use something like `skimage.color.rgb2gray` (see the scikit-image documentation) or Pillow's `Image.convert('L')`.
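A grayscale conversion is essentially a weighted sum over the channel axis. The NumPy-only sketch below illustrates the idea; the weights shown are illustrative luminance coefficients and are not necessarily the exact values `rgb2gray` uses:

```python
import numpy as np

# A (32, 32, 3) RGB array: every pixel pure red
rgb = np.zeros((32, 32, 3), dtype=np.uint8)
rgb[..., 0] = 255

# Collapse the channel axis with a weighted sum over (R, G, B);
# the weights here are assumed for illustration only
weights = np.array([0.2125, 0.7154, 0.0721])
gray = rgb @ weights
print(gray.shape)  # (32, 32) -- one value per pixel, no channel axis
```

After the conversion the array has shape `(32, 32)` and size 1024, matching the grayscale case described above.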
