I'd like to read the contents of multiple files, process their data individually (because of performance and hardware constraints), and write the results into one 'big' netCDF4 file.
Right now I'm able to read the files and process their data, but I struggle with the resulting multiple arrays; I haven't been able to merge them correctly.
I've got a 3-D array (time, long, lat) containing my calculated value for each day. What I'd like to do is merge all the arrays I've got into one big array (all days in one array) before I write it into my netCDF4 file.
Here are two example arrays:
Let's start with some random data.
>>> import numpy as np
>>> day1 = np.random.randint(255, size=(1, 81, 141))
Your array has a dimension of size 1, so every time you want to access an element you have to painstakingly type
day1[0,x,y]. You can remove that unnecessary dimension with np.squeeze:
>>> day1[0,50,50]
36
>>> day1 = np.squeeze(day1)
>>> day1.shape
(81, 141)
>>> day1[50,50]
36
Now let's make some more of these.
>>> day2 = np.random.randint(255, size=day1.shape)
>>> day3 = np.random.randint(255, size=day1.shape)
You can put all of these in one big list and pass it to np.array(), which will create an array of shape (N, 81, 141), where N is the number of days you have.
>>> allDays = np.array([day1, day2, day3])
>>> allDays.shape
(3, 81, 141)
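Since the question mentions processing many input files, the same idea scales to a loop: collect each day's 2-D result in a plain Python list and stack once at the end, which is much cheaper than growing an array inside the loop. A minimal sketch — `process_file` is a hypothetical placeholder for your real per-file computation:

```python
import numpy as np

# Hypothetical stand-in for reading one file and processing its data;
# replace with your actual per-file computation.
def process_file(seed):
    rng = np.random.default_rng(seed)
    return rng.integers(0, 255, size=(81, 141))

# Accumulate 2-D daily arrays in a list, then convert once.
daily_results = [process_file(seed) for seed in range(10)]
allDays = np.array(daily_results)   # shape (10, 81, 141)
```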
All the data from day1 are at index 0, from day2 at index 1, and so on.
>>> allDays[0,50,50]
36