
Re: [ferret_users] memory problem with large file when regridding



Dividing the data along one of its indices (e.g. L, I, J, or K) is a good way to save a large file in pieces.
I don't know how many L values you have, but suppose the total is Lmax (i.e. L=1:Lmax); then you may try the following:

save/clobber/file=file_0.1.nc/llimits=1:Lmax/l=1 mask   ! put Lmax as a number, say 365 or so
repeat/range=2:Lmax/name=aa (save/append/file=file_0.1.nc mask[l=`aa`])

Hope it works. (The same qualifier family works for the other axes too, e.g. /ILIMITS for I; see the sketch below.)
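If the memory problem comes from the regridding itself, the same trick can be applied along I, so only one slab is evaluated and written at a time. A rough sketch of the idea; the names var_fine, var_coarse, xcoarse, ycoarse, coarse.nc, and Imax are just placeholders for your own variables, axes, and sizes, not anything from your script:

! define the coarse field; a LET definition is only evaluated when it is requested
let var_coarse = var_fine[gx=xcoarse@ave,gy=ycoarse@ave]

! write the first I slab and pre-declare the full I extent of the output file
save/clobber/file=coarse.nc/ilimits=1:Imax/i=1 var_coarse   ! put Imax as a number

! append the remaining I slabs one at a time, so only one slab is in memory at once
repeat/range=2:Imax/name=ii (save/append/file=coarse.nc var_coarse[i=`ii`])

If one I column at a time is too slow, the same loop can step through wider I ranges instead.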

On Mon, May 20, 2019 at 6:35 PM Patrick Brockmann <patrick.brockmann@xxxxxxxxxxxx> wrote:
Hi all,

I have a 5 GB variable at 0.01x0.01 resolution and I would like to save it at a coarser resolution.
My regridding script cannot be run because it requires too much memory.

Is there a strategy to run it?
I have tried to save it with several /append calls but could not figure it out,
because an unlimited dimension seems to be possible only for the time dimension (L).
