[S] Memory problems

Matthew Wiener (mcw@ln.nimh.nih.gov)
Sat, 21 Feb 1998 14:59:13 -0500 (EST)


I am working with an extremely large minimization problem. I have 1325
entries in the vector over which I am minimizing.

I've been using nlminb.
One of the things I do is pass it a matrix of conditional
probabilities, of dimension 1325 x 1325.

The memory S-PLUS is using (as measured by the function memory.size)
grows every time either the function I'm minimizing or the gradient
function is called, and it always grows by the size of a 1325 x 1325 array.
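To put a number on it (a back-of-the-envelope sketch; doubles are 8 bytes
each):

```s
# Size of one copy of the matrix, in bytes:
1325 * 1325 * 8        # 14045000 bytes, about 14 MB
# memory.size(), checked before and after a call to my objective
# function, jumps by just about this amount each time.
```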

I've checked my functions, and I'm not calculating any 1325 x 1325 arrays,
even as temporary results. This suggests to me that S-PLUS is making
copies of the 1325 x 1325 matrix that I pass through nlminb.
This is confirmed by the fact that the memory goes up at exactly the step
where I use the matrix.

So far, I haven't found any documentation on how to prevent this.
I'm looking into making the matrix a global variable (or assigning it to
my frame) so that I don't have to pass it, to see whether that might help.
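In case it's not clear what I mean, here's roughly the change I'm going to
try (a sketch only; f, theta0, and the objective body are placeholders,
and cond.probs is my 1325 x 1325 matrix):

```s
# Instead of passing the matrix through nlminb's "..." argument
# (which seems to copy it on every call):
#   fit <- nlminb(start = theta0, objective = f, P = cond.probs)
# assign it to frame 1, so every function in this top-level
# expression can see it without its being an argument:
assign("cond.probs", cond.probs, frame = 1)
f <- function(theta) {
  P <- get("cond.probs", frame = 1)  # look it up rather than receive it
  # ... compute the objective from P and theta here ...
  sum((P %*% theta)^2)               # placeholder objective body
}
fit <- nlminb(start = theta0, objective = f)
```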

Any suggestions?

Thanks,
Matt Wiener

-----------------------------------------------------------------------
This message was distributed by s-news@wubios.wustl.edu. To unsubscribe
send e-mail to s-news-request@wubios.wustl.edu with the BODY of the
message: unsubscribe s-news