
[ferret_users] converting binary to netCDF: memory insufficient?



Dear Ferret users,

I seem to be missing something, but I can't even list a single point of my 4D variable:

yes? save/clobber/i=1:1/j=1:1/k=1:1/l=1:1/file=tmp.nc temp[i=1,j=1,k=1,l=1]
. . . memory error . . . 
yes? list/i=1/j=1/k=1/l=1 temp
. . . memory error . . . 

(Some more details are at the end of this message).

Basically, I just want to convert a 4D variable stored in raw binary format to a netCDF file.

Is there any fundamental limit to the size of the underlying raw binary file?

The array is in the regular Fortran order (i, j, k, l), and because i=1:500 and j=1:400, a single x-y plane (500 x 400) isn't so large: only 500 x 400 x 8 bytes (about 1.5 MB) in double precision.  So I thought saving it slab by slab with

repeat/L=1:73 ( \
    repeat/K=1:67 ( \
        save/append/klimits=1:67/file=output.nc temp ) )

would be manageable.
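
For reference, here is a rough sketch of the whole conversion I have in mind (the axis and grid names xax, yax, zax, tax, gtmp below are just placeholders standing in for my real grid, grid_xt_yt_zt_ctime, which I define elsewhere in my script):

! placeholder axes matching the binary layout: 500 x 400 x 67 x 73
define axis/x=1:500:1 xax
define axis/y=1:400:1 yax
define axis/z=1:67:1 zax
define axis/t=1:73:1 tax
define grid/x=xax/y=yax/z=zax/t=tax gtmp

! attach the raw stream file to that grid, one r8 variable
file/form=stream/type=r8/var=temp/grid=gtmp temp-pntd.bin

! write one time level per pass; the repeat loop sets the L context
repeat/L=1:73 ( save/append/file=output.nc temp )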

Regards,
Ryo
---------
  NOAA/PMEL TMAP
  PyFerret v7.5 (optimized)
  Linux 4.15.0-1061-azure - 11/14/19
   1-May-20 16:40
[. . . ]
yes? file/form=STREAM/type=r8/var=temp/grid=grid_xt_yt_zt_ctime\
 temp-pntd.bin
yes? set memory/size=300
yes? save/clobber/i=1:1/j=1:1/k=1:1/l=1:1/file=tmp.nc temp[i=1,j=1,k=1,l=1]
 **ERROR: request exceeds memory setting
    To fulfill this request would exceed the current SET MEMORY/SIZE= limit of 300 megawords
    At the moment that