
Re: [ferret_users] Regarding memory limits



Dear Karl,
G'day, and happy new year (apologies for the late wishes).

Thank you for your email, but I am still not able to understand the difference between the previous version of Ferret and the latest one. Previously, when we used e.g. repeat/l=1:300:1 (save/append/file=var.nc var; set mem/size=xxxx), the RAM was cleared after each time step was saved. That is, while processing the first time step my whole 16 GB of RAM might be occupied, but once that step's data was saved, all of it was available again to process the next time step. In the latest version, however, issuing set mem/size=xxxx no longer clears the cached data of the previous time step from RAM.

Hence I am facing problems saving a large model output, and this is making my machine extremely slow. I would appreciate any idea on how to overcome this. Below is the command I am using to save my data in chunks from a model output at 1/10-degree resolution (the inner repeat body parenthesized so that both the save and the set mem run on each inner iteration):
repeat/range=1:3600:100/name=ii (repeat/range=1:450:15/name=jj (save/append/file=mss19932012.nc/i=`ii`:`ii+99`/j=`jj`:`jj+14` mss; set mem/size=3000); set mem/size=3000)
and the set mem command is not clearing the memory here either.
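If the goal is to free the data Ferret has cached after each chunk is written, one variant worth trying is to issue cancel memory/all inside the loop instead of re-issuing set mem/size. This is only a sketch, not a tested fix: cancel memory/all is the Ferret command that clears all cached variables, but its effect on the resident memory of the process may differ under the dynamic memory management of recent Ferret versions.

```
! Sketch (untested): append each i/j chunk, then explicitly clear
! Ferret's cache of computed variables before the next iteration.
repeat/range=1:3600:100/name=ii (\
  repeat/range=1:450:15/name=jj (\
    save/append/file=mss19932012.nc/i=`ii`:`ii+99`/j=`jj`:`jj+14` mss; \
    cancel memory/all))
```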

I hope I have stated my problem clearly.

Cheers, Saurabh




--


REGARDS

Saurabh Rathore
Research Scholar (PhD.)
Centre For Oceans, Rivers, Atmosphere & Land Science Technology
Indian Institute Of Technology, Kharagpur
contact :- 91- 8345984434
