--------------------------------------------------------- Repost
We used iterative generalized least squares (the lm.gls procedure
from Prof. Ripley's Exercises) for meta-analysis of survival data.
The method of Dear (Biometrics, 1994) regresses the survival
estimates S_i on other factors with a known covariance matrix
derived from the survival estimates themselves. For distinct
failure times k > j within a cohort, the correlation is
    Corr(S_j, S_k) = ((1 - S_j) * S_k / (S_j * (1 - S_k)))^0.5
I coded this as follows.
glsV <- function(vS, sSse) {
    # vS:   vector of survival estimates for one cohort
    # sSse: vector of standard errors of vS
    n <- length(vS)
    C <- diag(n)    # correlation matrix; unit diagonal (not 0, or the variances vanish)
    for(i in 1:(n - 1))
        for(j in (i + 1):n)
            C[i, j] <- C[j, i] <- sqrt((1 - vS[i]) * vS[j] / (vS[i] * (1 - vS[j])))
    diag(sSse) %*% C %*% diag(sSse)    # covariance = D C D
}
In combining different studies we stack their survival vectors, so
the covariance matrix is block diagonal; it is updated at each
iteration with the current survival estimates.
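This block-diagonal construction can be sketched as follows (in
Python rather than S; surv_cov and stacked_cov are hypothetical
names, and the correlation formula is the one given above):

```python
import numpy as np

def surv_cov(s, se):
    """Covariance of one study's survival vector s, given its SEs,
    using Corr(S_j, S_k) = sqrt((1-S_j)*S_k / (S_j*(1-S_k))) for j < k."""
    n = len(s)
    C = np.eye(n)
    for i in range(n - 1):
        for j in range(i + 1, n):
            C[i, j] = C[j, i] = np.sqrt((1 - s[i]) * s[j] / (s[i] * (1 - s[j])))
    return np.diag(se) @ C @ np.diag(se)

def stacked_cov(studies):
    """Block-diagonal covariance for stacked survival vectors;
    studies is a list of (s, se) pairs, one per study."""
    blocks = [surv_cov(np.asarray(s), np.asarray(se)) for s, se in studies]
    n = sum(b.shape[0] for b in blocks)
    V = np.zeros((n, n))
    pos = 0
    for b in blocks:
        k = b.shape[0]
        V[pos:pos + k, pos:pos + k] = b   # studies are independent, so
        pos += k                          # off-block covariances stay 0
    return V
```

At each IGLS iteration one would rebuild this matrix from the current
survival estimates and refit.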
Questions:
1. The QR decomposition is rather unstable with this covariance
structure. It reliably fails for a cohort whose survival is
unchanged at successive time points, e.g. (entries are survival % (se)):

   Time       1       2       3       4       5
   Study 1  49(12)  46(12)  42(12)  40(12)  40(12)
   Study 2  54(10)  47(13)  40(13)  40(13)

How can we get around this?
--- Note: "QR decomposition" above was wrong, sorry about that. As
--- Prof. Ripley pointed out after I posted, it is the eigen-
--- decomposition. Dr. Dear, the original author, suggested jiggling
--- the initial parameters a bit to get started. He has also provided
--- me with SAS IML code, but I haven't had time to test it out.
--- Please email me if you're interested; I replied once I figured it
--- out myself. [Ba's notes]
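To see why the decomposition fails there (a Python sketch, not the
original S code; corr_matrix is a hypothetical helper): when two
successive survival estimates are equal, the formula gives a
correlation of exactly 1, so two rows of the correlation matrix
coincide and the matrix is singular; a small jiggle of the tied
estimate restores full rank.

```python
import numpy as np

def corr_matrix(s):
    """Correlation matrix implied by
    Corr(S_j, S_k) = sqrt((1-S_j)*S_k / (S_j*(1-S_k))) for j < k."""
    n = len(s)
    C = np.eye(n)
    for i in range(n - 1):
        for j in range(i + 1, n):
            C[i, j] = C[j, i] = np.sqrt((1 - s[i]) * s[j] / (s[i] * (1 - s[j])))
    return C

# Study 1 from the table above: survival flat at 0.40 at the last two times.
s = np.array([0.49, 0.46, 0.42, 0.40, 0.40])
C = corr_matrix(s)
print(np.linalg.matrix_rank(C))   # 4, not 5: rows for the tied times coincide

# A jiggle in the spirit of Dr. Dear's suggestion makes it full rank:
s_jit = s.copy()
s_jit[-1] -= 1e-4
print(np.linalg.matrix_rank(corr_matrix(s_jit)))   # 5
```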
2. We could not extract SEs from some studies. If we estimate the SE
as SE(S_i) = (S_i * (1 - S_i) / n_i)^0.5, with n_i the number at
risk, what are the arguments against doing this?
--- Dr. Dear thinks it's OK to proceed this way and assess the
--- impact later, possibly adjusting the # at risk if we know
--- something about the censoring. [Ba's notes]
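Written out, the approximation in point 2 is the binomial standard
error (a Python sketch; binom_se is a hypothetical name):

```python
import numpy as np

def binom_se(s, n_at_risk):
    """Binomial approximation SE(S_i) = sqrt(S_i * (1 - S_i) / n_i).
    Treats the n_i patients at risk as a simple binomial sample, so it
    ignores the variance contributed by earlier censoring (which
    Greenwood's formula would account for)."""
    s = np.asarray(s, dtype=float)
    n = np.asarray(n_at_risk, dtype=float)
    return np.sqrt(s * (1 - s) / n)

# e.g. S = 0.46 with 50 patients at risk:
binom_se([0.46], [50])   # about 0.070
```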
3. If a published survival estimate S(i) was computed including
censored patients, but the report details only the initial n and the
total number censored, could we use an effective sample size of
(n - # censored)? In effect we ignore the censored patients'
contribution to the likelihood and shrink the sample to the patients
with an event (this definitely won't work for studies with small
failure rates). What are the weak points of doing this?
--- Same idea as in point 2: use simulation to assess the validity
--- and the impact on the pooled estimates. [Ba's notes]
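Point 3 amounts to reusing the binomial SE from point 2 with a
reduced denominator (a Python sketch; se_effective is a hypothetical
name):

```python
import numpy as np

def se_effective(s, n_initial, n_censored):
    """Ad hoc SE for point 3: treat only the n - #censored patients as
    the sample behind S(i). This inflates the SE relative to using the
    full n and, as noted above, breaks down for studies with small
    failure rates."""
    n_eff = n_initial - n_censored
    return np.sqrt(s * (1 - s) / n_eff)

# 100 patients, 20 censored, S = 0.40:
#   full n:      sqrt(0.4*0.6/100) ~ 0.049
#   effective n: sqrt(0.4*0.6/80)  ~ 0.055
```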
4. Is there any work out there on computing least-squares estimates
of particular factors from a glm or lm object?
--- No reply on this one; I'll find time to work on it. [Ba's notes]
Thank you for bearing with me on these questions.
Regards,
Ba' Pham
Thomas C. Chalmers Centre for Systematic Reviews
Children's Hospital of Eastern Ontario Research Institute
Phone (613) 738 3951
-----------------------------------------------------------------------
This message was distributed by s-news@wubios.wustl.edu. To unsubscribe
send e-mail to s-news-request@wubios.wustl.edu with the BODY of the
message: unsubscribe s-news