[S] nlregb() memory problems

Joe Mortzheim (jmortz@snake1.cr.usgs.gov)
Tue, 12 May 1998 09:23:47 -0400

I am using S+ code with a call to nlregb() that has performed admirably many times. On one particularly large dataset (7800+ observations), however, the call uses all of my computer's memory and shows no sign of reaching a solution in a reasonable time frame (I have waited 45+ minutes). Could I change the syntax to speed up the estimation?

The formula I am trying to estimate parameters for is shown below:
# (I am trying to estimate B1, B2, B3, B4, B5, B6)
# (D, S, T, B are the explanatory variables)
# (H is the response variable)

H = 4.5 * B1 * ((1 - exp( -(B2 * D)))^ B3) * (S ^ B4) * (T ^ B5) * (B ^ B6)
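As a quick numerical sanity check of the formula, it can be evaluated directly for a single observation; all parameter and covariate values below are made up purely for illustration (Python used as a stand-in for S+):

```python
import math

# Hypothetical parameter values, for illustration only
B1, B2, B3, B4, B5, B6 = 2.0, 0.05, 1.5, 0.3, 0.2, 0.1
# Hypothetical values of the explanatory variables
D, S, T, B = 40.0, 25.0, 50.0, 120.0

# H = 4.5 * B1 * ((1 - exp(-(B2 * D)))^B3) * (S^B4) * (T^B5) * (B^B6)
H = 4.5 * B1 * (1.0 - math.exp(-(B2 * D))) ** B3 \
    * S ** B4 * T ** B5 * B ** B6
print(H)
```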
I rewrote the formula as shown below:

Nlregb.SI.BA <- function(Param)
    Reg.Data[, 2] - 4.5 * Param[1] * (1 - exp(-Param[2] * Reg.Data[, 1]))^Param[3]
I made the call to nlregb as shown below:

Output <- nlregb(nres = dim(Reg.Data)[1],
                 start = Start.Values,
                 residuals = Nlregb.SI.BA,
                 lower = Lower.Limits,
                 upper = Upper.Limits)
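For what it is worth, one general way to cut both run time and memory in a bounded least-squares fit like this is to supply an analytic Jacobian instead of letting the optimizer approximate it by finite differences, which costs one extra pass over all the residuals per parameter per iteration. A minimal sketch of the idea, using Python/SciPy as a modern stand-in for the S+ call and only the B1-B3 part of the model (all data and starting values below are synthetic):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Synthetic stand-in for the D column and the response H, using only
# the part of the model the residual function covers:
#   H = 4.5 * B1 * (1 - exp(-B2 * D)) ^ B3
true_params = np.array([2.0, 0.05, 1.5])
D = rng.uniform(1.0, 100.0, size=5000)

def model(p, D):
    B1, B2, B3 = p
    return 4.5 * B1 * (1.0 - np.exp(-B2 * D)) ** B3

H = model(true_params, D) + rng.normal(scale=0.01, size=D.size)

def residuals(p):
    return H - model(p, D)

def jacobian(p):
    # Analytic Jacobian of the residuals: one row per observation,
    # one column per parameter.  Supplying this avoids the repeated
    # finite-difference evaluations that dominate cost on big data.
    B1, B2, B3 = p
    g = 1.0 - np.exp(-B2 * D)      # inner term, strictly in (0, 1)
    J = np.empty((D.size, 3))
    J[:, 0] = -4.5 * g ** B3
    J[:, 1] = -4.5 * B1 * B3 * g ** (B3 - 1.0) * D * np.exp(-B2 * D)
    J[:, 2] = -4.5 * B1 * g ** B3 * np.log(g)
    return J

fit = least_squares(residuals, x0=[1.0, 0.1, 1.0], jac=jacobian,
                    bounds=([1e-6, 1e-6, 1e-6], [10.0, 1.0, 5.0]))
print(fit.x)
```

If memory serves, nlregb() accepts an analogous jacobian argument, with the same layout (one row per residual, one column per parameter), so the same trick should carry over directly to the S+ call.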

I certainly would appreciate any suggestions for making this call more efficient. It works perfectly for datasets of up to about 4,000 observations, but I periodically encounter datasets of up to about 10,000 observations and don't know how to handle them. The call to nlregb() is made within a function that is itself called from another function. I am using S+ version 3.3 under Windows NT on a Pentium 166 with 64 MB of memory.

Thanks in advance for any ideas.
This message was distributed by s-news@wubios.wustl.edu. To unsubscribe
send e-mail to s-news-request@wubios.wustl.edu with the BODY of the
message: unsubscribe s-news