I have a big multidimensional array, and I want it to occupy as little memory as possible. In Python, this array occupies about 66 MB:
import numpy as np
import sys

m = np.zeros([1000, 70, 1, 1000], dtype='bool')
size = sys.getsizeof(m) / 1024 / 1024
print("Size: %s MB" % size)
The equivalent array in R occupies roughly 267 MB, four times as much:

m <- array(FALSE, dim = c(1000, 70, 1, 1000))
format(object.size(m), units = "auto")
Your assertion that these arrays are the same is wrong: if they really were the same, R would need the same memory allocation as any other language.
From the help page ?integer:

"Note that current implementations of R use 32-bit integers for integer vectors"
So the 4x memory usage comes from R storing logical values as 32-bit (4-byte) integers, whereas numpy's bool dtype uses 8-bit (1-byte) elements.
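A quick way to see this arithmetic (sketched in Python, since numpy exposes the per-element size directly via itemsize; the shape is the one from the question):

```python
import numpy as np

shape = (1000, 70, 1, 1000)
n = int(np.prod(shape))  # 70,000,000 elements

# numpy's bool dtype stores one byte per element
bool_bytes = n * np.zeros(1, dtype='bool').itemsize
# R's logical vectors use 32-bit integers, the same layout as numpy's int32:
# four bytes per element
int32_bytes = n * np.zeros(1, dtype='int32').itemsize

print(bool_bytes / 1024 / 1024)   # about 66.8 MB
print(int32_bytes / 1024 / 1024)  # about 267 MB
print(int32_bytes / bool_bytes)   # 4.0
```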
To get 8-bit (1-byte) elements in R, you can use raw vectors. From the help page ?raw:

"The raw type is intended to hold raw bytes"
m3 <- array(raw(0), dim = c(1000, 70, 1, 1000))
format(object.size(m3), units = "auto")
"66.8 Mb"
This is essentially identical to the value you report for Python.
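As an aside, if the goal is truly minimal memory and the array only ever holds TRUE/FALSE, the Python side can go below one byte per element by bit-packing with np.packbits, which is eight times smaller than dtype='bool' at the cost of unpacking before use (a sketch, not part of the answer above):

```python
import numpy as np

m = np.zeros((1000, 70, 1, 1000), dtype='bool')

# pack 8 boolean elements into each byte: 70,000,000 bits -> 8,750,000 bytes
packed = np.packbits(m)
print(packed.nbytes / 1024 / 1024)  # about 8.3 MB

# unpack and reshape to recover the original array
# (element count here is divisible by 8, so no trimming is needed)
restored = np.unpackbits(packed).reshape(m.shape).astype(bool)
assert (restored == m).all()
```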