Two-step jackknife bias reduction for logistic regression MLEs
Authors:
S.B. Bull,
W.W. Hauck,
C.M.T. Greenwood
Journal:
Communications in Statistics - Simulation and Computation
(Taylor; available online 1994)
Volume/Issue:
Volume 23,
Issue 1
Pages: 59-88
ISSN: 0361-0918
Year: 1994
DOI: 10.1080/03610919408813156
Publisher: Marcel Dekker, Inc.
Keywords: bias correction; categorical outcome; finite sample; maximum likelihood; logistic model
Source: Taylor
Abstract:
Maximum likelihood estimates (MLEs) for logistic regression coefficients are known to be biased in finite samples and consequently may produce misleading inferences. Bias-adjusted estimates can be calculated using the first-order asymptotic bias derived from a Taylor series expansion of the log likelihood. Jackknifing can also be used to obtain bias-corrected estimates, but the approach is computationally intensive, requiring an additional series of iterations (steps) for each observation in the dataset. Although the one-step jackknife has been shown to be useful in logistic regression diagnostics and in the estimation of classification error rates, it does not effectively reduce bias. The two-step jackknife, however, can reduce computation in moderate-sized samples, provide estimates of dispersion and classification error, and appears to be effective in bias reduction. Another alternative, a two-step closed-form approximation, is found to be similar to the Taylor series method in certain circumstances. Monte Carlo simulations indicate that all the procedures, but particularly the multi-step jackknife, may tend to over-correct in very small samples. Comparison of the various bias correction procedures in an example from the medical literature illustrates that bias correction can have a considerable impact on inference.
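For illustration, the fully iterated delete-one jackknife bias correction that the abstract describes as computationally intensive can be sketched as follows. This is a minimal sketch on simulated data, not the paper's two-step procedure (which limits each leave-one-out refit to two Newton steps); the data-generating parameters and the Newton-Raphson fitting routine are assumptions for the example.

```python
import numpy as np

def logit_mle(X, y, n_iter=25):
    """Unpenalized logistic-regression MLE via Newton-Raphson (full iteration)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))        # fitted probabilities
        grad = X.T @ (y - p)                         # score vector
        hess = X.T @ (X * (p * (1.0 - p))[:, None])  # observed information
        beta = beta + np.linalg.solve(hess, grad)
    return beta

def jackknife_bias_corrected(X, y):
    """Delete-one jackknife: n * beta_hat - (n - 1) * mean(leave-one-out MLEs)."""
    n = len(y)
    beta_full = logit_mle(X, y)
    loo = np.array([logit_mle(np.delete(X, i, axis=0), np.delete(y, i))
                    for i in range(n)])  # one full refit per observation
    return n * beta_full - (n - 1) * loo.mean(axis=0)

# Simulated moderate-sized sample (hypothetical; not from the paper)
rng = np.random.default_rng(0)
n = 60
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])           # intercept + one covariate
true_beta = np.array([0.5, 1.0])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ true_beta)))).astype(float)

beta_mle = logit_mle(X, y)
beta_jack = jackknife_bias_corrected(X, y)
print("MLE:      ", beta_mle)
print("Jackknife:", beta_jack)
```

Because each of the n leave-one-out fits is iterated to convergence, the cost grows with sample size; the paper's two-step variant trims each refit to two Newton steps to reduce that computation.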