Re: logit transformation when predictor's a continuous variable

Frank E Harrell Jr (fharrell@virginia.edu)
Thu, 22 Jan 1998 09:20:17 -0500


Jan,

In doing simulations to study bootstrap error estimates, you should not
have to deal with any p=0 or p=1, as you use maximum likelihood to
estimate the regression coefficients and don't actually use the logit
transformation in the fitting process.
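
For example, something along these lines (a rough sketch only; the sample
size and coefficients below are made up purely for illustration):

# Simulate 0/1 responses with a continuous predictor
n <- 200
x <- rnorm(n)
y <- rbinom(n, 1, plogis(-0.5 + 1.2 * x))   # plogis(eta) = 1/(1 + exp(-eta))

# Maximum likelihood fit: the logit enters only as the link for the fitted
# probabilities, so observed y = 0 or y = 1 causes no difficulty
fit <- glm(y ~ x, family = binomial)
coef(fit)

# Equivalent fit with the Design library's lrm function:
# library(Design); lrm(y ~ x)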

I have an old tech report in which I compared the .632 estimator to the
ordinary bootstrap and to cross-validation for binary logistic modeling.
Let me know if you want me to mail you a copy. My simulations were all done
in S-PLUS using my Design library's lrm function (available from StatLib).

---------------------------------------------------------------------------
Frank E Harrell Jr
Professor of Biostatistics and Statistics
Director, Division of Biostatistics and Epidemiology
Dept of Health Evaluation Sciences
University of Virginia School of Medicine
http://www.med.virginia.edu/medicine/clinical/hes/biostat.htm

-----Original Message-----
From: Jan Muska <jmuska@almaak.usc.edu>
To: S-news@utstat.toronto.edu <S-news@utstat.toronto.edu>
Date: Thursday, January 22, 1998 2:29 AM
Subject: logit transformation when predictor's a continuous variable

>I want to do a simulation study on Efron's prediction error estimators
>(.632 and .632+, and some others). One of the decision rules I want to
>test this on is logistic regression (logit). My problem is that I cannot
>find FORTRAN code to do the logit transformation when the predictor is a
>continuous variable. I can't even find any reference; I just do not know
>where to look. Does anybody know how to do this transformation
>log(p/(1-p)) when p = 1 or p = 0?
>
>I found on the internet an approximation such as: if p = 1, then use
>p = 1 - 1/(2n). But this does not seem to give the same answer as the
>GLM procedure in S-PLUS with the binomial family.
>
>If you can send me a reference on where to look, or tell me how to do
>this transformation, I will be grateful. This is the only thing keeping
>me from running the simulation and finishing my dissertation.
>
>Thanks, Jan Muska
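
As a rough illustration of why the ad hoc boundary adjustment in the quoted
message disagrees with glm: with ungrouped 0/1 data the adjusted logit takes
only two values, so least squares on the transformed response is not the
same thing as the maximum likelihood fit. A sketch, with simulated data and
0.5 used as the boundary adjustment:

# Transform-then-regress versus maximum likelihood (simulated data)
n <- 500
x <- rnorm(n)
y <- rbinom(n, 1, plogis(-0.5 + x))

# Push p = 0 and p = 1 away from the boundary before taking logs.
# With one Bernoulli trial per observation the adjusted logit collapses
# to just two values, so it cannot recover the GLM coefficients.
z <- log((y + 0.5) / (1 - y + 0.5))

coef(lm(z ~ x))                      # least squares on the transformed response
coef(glm(y ~ x, family = binomial))  # maximum likelihood fit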