1. Assessment of the Safety and Efficacy Data for the Hypnotic Halcion(R): Results of an Analysis by an Institute of Medicine Committee

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 993-1002
Robert D. Gibbons,
Byron W. M. Brown,
Daniel L. Azarnoff,
William E. Bunney,
Robert Cancro,
John C. Gillin,
Sandra Hullett,
Keith F. Killam,
John H. Krystal,
David J. Kupfer,
Paul D. Stolley,
Andrew M. Pope,
Geoffrey S. French
Abstract:
Recent estimates indicate a 10% prevalence of chronic insomnia in the adult population of the United States, with an associated annual cost of more than $90 billion. Since its approval in 1982 for use in the treatment of insomnia, an estimated 11 billion prescriptions for Halcion(R)(triazolam) have been filled worldwide. Concerns about the safety of Halcion began to emerge when a Dutch psychiatrist reported a possible link between the drug and a syndrome that included depression, amnesia, hallucinations, and anxiety. Since that time, the United Kingdom, Brazil, Argentina, Norway, and Denmark removed Halcion from the market, and the manufacturer, Upjohn, withdrew Halcion from the market in The Netherlands. Other countries, including the United States and Canada, modified the labeling to reduce the recommended dose and duration of treatment and to heighten awareness regarding possible side effects affecting behavior and cognition. The labeling changes raised questions regarding the hypnotic effectiveness of these lower doses of Halcion. Based on a 1996 U.S. Food and Drug Administration (FDA) task force report, the Institute of Medicine (IOM) of the National Academy of Sciences assessed the adequacy, quality, and overall confidence in the data on the effectiveness and safety of Halcion at different doses and durations, including those specified in the current product labeling. This article provides a summary of the IOM report titled “Halcion: An Independent Assessment of Safety and Efficacy Data” and a more detailed overview of the statistical analysis that led to the committee's conclusions.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473852
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor

2. Multicriterion Decision Merging: Competitive Development of an Aboriginal Whaling Management Procedure

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 1003-1014
Geof H. Givens
Abstract:
International Whaling Commission management of aboriginal subsistence whaling will eventually use an aboriginal whaling management procedure (AWMP) chosen from a collection of candidate procedures after grueling simulation testing. An AWMP is a fully automatic algorithm designed to operate on the results of an assessment (i.e., a statistical estimation problem relying on sparse series of whale abundance data) to produce a catch limit in each year of real or simulated management. An AWMP should, as much as possible, meet the conflicting objectives of low population risk, high satisfaction of needed catch, and high rate of population recovery. The choice of the best procedure falls naturally in the multicriterion decision making framework, because one of several candidates must be chosen on the basis of high-dimensional simulated performance summaries over a wide range of assumptions about whales and whaling. However, standard multicriterion decision making methods are impractical and unsatisfying for this problem. A method is developed to merge competing procedures into a new procedure that is an admissible Bayes rule. The approach is constructive rather than selective, meaning that it is not intended to produce an automatic winner, but rather a promising new candidate. This merging approach allows the best performance aspects of competing procedures to be combined. Ideally, and in examples shown, the newly constructed procedure outperforms all previous candidates. The approach also permits tuning of a single procedure to enhance performance or to more closely reflect design goals, without a simulation-intensive search over the tuning parameter space. These methods are generalizable to a larger class of decision problems.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473853
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor

3. Modeling Epidemiologic Typing Data and Likelihood Inference for Disease Spread

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 1015-1024
Beverly G. Mellen
Abstract:
A model for epidemiologic typing data is introduced, and likelihood ratio methods are developed for evaluating these data as evidence about disease spread. The observed data consist of microorganism subtypes from an index case of infectious disease, cases clustered with the index case, and a reference sample. The likelihood methods are evaluated via probabilities of observing epidemiologic subtypes that represent strong and sometimes misleading evidence favoring one hypothesis over another. A general bound is identified for the probability of observing strong evidence favoring a close epidemiologic relationship between the index and cluster cases vis-à-vis no relationship when in fact there is none. The advantages of this approach versus alternate approaches to measuring the strength of typing evidence are discussed.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473854
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor

4. A Method-of-Moments Estimation Procedure for Categorical Quality-of-Life Data with Nonignorable Missingness

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 1025-1034
Marco Bonetti,
Bernard F. Cole,
Richard D. Gelber
Abstract:
Quality-of-life outcomes collected during clinical trials often have considerable amounts of missing data, which, if not appropriately accounted for, may lead to bias in inferences. We introduce a method-of-moments (MM) estimating procedure for a model designed to handle nonignorable missingness arising in categorical data measured on independent populations. The missingness mechanism is assumed to be the same across the populations. We derive necessary and sufficient conditions for the identifiability of the model and fit the model to quality-of-life data collected as part of a breast cancer clinical trial. We compare the MM estimator to the maximum likelihood estimator in a simulation study.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473855
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor

5. Comparison of Partially Measured Latent Traits across Nominal Subgroups

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 1035-1044
Jon D. Cohen,
Tao Jiang,
Abstract:
This article presents a method for estimating the subgroup distributions of a latent trait measured as a normal variate in the population. This problem occurs in large-scale assessments, such as the National Assessment of Educational Progress, the National Adult Literacy Survey, and other programs where proficiencies are estimated via marginal maximum likelihood through a model based on item-response theory. Heretofore, subgroup means were often estimated using ad hoc assumptions about within-group distributions, which conflicted with concurrent assumptions about normality of the population distributions. The method presented here removes the conflict, using consistent distributional assumptions for population and subgroup estimates.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473856
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor

6. An Evaluation of California's Inmate Classification System Using a Generalized Regression Discontinuity Design

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 1045-1052
Richard A. Berk,
Jan de Leeuw,
Abstract:
Published studies using the regression discontinuity design have been limited to cases in which linear regression is applied to a categorical treatment indicator and an equal interval outcome. This is unnecessarily narrow. We show here how a generalization of the usual regression discontinuity design can be applied in a wider range of situations. We focus on the use of categorical treatment and response variables, but we also consider the more general case of any regression relationship. We also show how a resampling sensitivity analysis may be used to address the credibility of the assumed assignment process. The broader formulation is applied to an evaluation of California's inmate classification system, which is used to allocate prisoners to different kinds of confinement.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473857
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor
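As a reader's aside, the standard sharp regression discontinuity design that this article generalizes can be sketched in a few lines. The data, cutoff, and effect size below are entirely hypothetical (not the California classification data), and the two-sided linear fit is the textbook baseline, not the authors' generalized method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a classification score with a hypothetical cutoff at 30,
# above which units are assigned to treatment (e.g., stricter confinement).
n = 500
score = rng.uniform(0, 60, n)
treated = (score >= 30).astype(float)
# Outcome depends smoothly on the score, plus a treatment jump of 2.0.
y = 0.05 * score + 2.0 * treated + rng.normal(0, 1, n)

def rd_estimate(score, y, cutoff):
    """Sharp RD: fit a line on each side of the cutoff and take the
    difference of the two fitted values at the cutoff itself."""
    left = score < cutoff
    coef_left = np.polyfit(score[left], y[left], 1)
    coef_right = np.polyfit(score[~left], y[~left], 1)
    return np.polyval(coef_right, cutoff) - np.polyval(coef_left, cutoff)

effect = rd_estimate(score, y, 30.0)
print(round(effect, 2))  # recovers a value near the true jump of 2.0
```

The identifying assumption is that the outcome would vary smoothly through the cutoff absent treatment, so any discontinuity at the cutoff is attributed to the treatment; the article's contribution is extending this logic beyond linear regression with equal-interval outcomes.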

7. Causal Effects in Nonexperimental Studies: Reevaluating the Evaluation of Training Programs

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 1053-1062
Rajeev H. Dehejia,
Sadek Wahba,
Abstract:
This article uses propensity score methods to estimate the treatment impact of the National Supported Work (NSW) Demonstration, a labor training program, on postintervention earnings. We use data from Lalonde's evaluation of nonexperimental methods that combine the treated units from a randomized evaluation of the NSW with nonexperimental comparison units drawn from survey datasets. We apply propensity score methods to this composite dataset and demonstrate that, relative to the estimators that Lalonde evaluates, propensity score estimates of the treatment impact are much closer to the experimental benchmark estimate. Propensity score methods assume that the variables associated with assignment to treatment are observed (referred to as ignorable treatment assignment, or selection on observables). Even under this assumption, it is difficult to control for differences between the treatment and comparison groups when they are dissimilar and when there are many preintervention variables. The estimated propensity score (the probability of assignment to treatment, conditional on preintervention variables) summarizes the preintervention variables. This offers a diagnostic on the comparability of the treatment and comparison groups, because one has only to compare the estimated propensity score across the two groups. We discuss several methods (such as stratification and matching) that use the propensity score to estimate the treatment impact. When the range of estimated propensity scores of the treatment and comparison groups overlap, these methods can estimate the treatment impact for the treatment group. A sensitivity analysis shows that our estimates are not sensitive to the specification of the estimated propensity score, but are sensitive to the assumption of selection on observables. 
We conclude that when the treatment and comparison groups overlap, and when the variables determining assignment to treatment are observed, these methods provide a means to estimate the treatment impact. Even though propensity score methods are not always applicable, they offer a diagnostic on the quality of nonexperimental comparison groups in terms of observable preintervention variables.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473858
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor
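The stratification-on-the-propensity-score idea summarized in this abstract can be sketched as follows. The data are synthetic stand-ins (not the NSW or survey data), and the hand-rolled logistic fit with quintile stratification is one simple variant of the family of methods the article discusses:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic observational data: a covariate x drives both treatment
# assignment and the outcome, so a naive comparison is confounded.
n = 4000
x = rng.normal(0, 1, n)
t = rng.binomial(1, 1 / (1 + np.exp(-0.8 * x)))   # treatment indicator
y = 1.0 * t + 2.0 * x + rng.normal(0, 1, n)       # true effect = 1.0

# Naive difference in means is biased upward: treated units have larger x.
naive = y[t == 1].mean() - y[t == 0].mean()

# Estimate the propensity score with logistic regression
# (Newton-Raphson on intercept + slope).
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    w = p * (1 - p)
    beta += np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (t - p))
e_hat = 1 / (1 + np.exp(-X @ beta))

# Stratify on quintiles of the estimated score; within each stratum the
# treated and comparison groups have similar covariates, so the weighted
# average of within-stratum differences estimates the treatment impact.
edges = np.quantile(e_hat, [0.2, 0.4, 0.6, 0.8])
stratum = np.digitize(e_hat, edges)
est, total = 0.0, 0
for s in range(5):
    m = stratum == s
    if t[m].sum() > 0 and (1 - t[m]).sum() > 0:
        est += m.sum() * (y[m & (t == 1)].mean() - y[m & (t == 0)].mean())
        total += m.sum()
est /= total
print(round(naive, 2), round(est, 2))  # stratified estimate lies near 1.0
```

This also illustrates the diagnostic role the abstract mentions: comparing the distribution of `e_hat` across the two groups reveals immediately whether they overlap enough to be compared at all.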

8. Account-Level Modeling for Trade Promotion: An Application of a Constrained Parameter Hierarchical Model

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 1063-1073
Peter Boatwright,
Robert McCulloch,
Peter Rossi,
Abstract:
We consider the problem of utilizing data at the retail/market level on sales and marketing mix variables to help manufacturers optimize the allocation of trade promotional budgets across areas. Major consumer packaged goods manufacturers budget at least one-half of their total marketing expenses to trade promotions. Trade promotional deals are designed to encourage retailers to promote products by temporarily reducing the price, putting them in in-store displays, or advertising in local media. A profit-based trade promotional allocation system will require estimates of the responsiveness of sales at each retailer to a given promotion. A major barrier to the use of retailer data is the proliferation of incorrectly signed coefficients in standard least squares analyses. Even more sophisticated adaptive shrinkage methods will not remove the problem of improper signs. We propose a hierarchical model for retailer response that uses a first-stage prior with inequality constraints on the regression coefficients. We demonstrate the usefulness of our modeling approach with data on more than 75 retailers. We find substantial profit opportunities from our response-based promotional allocation scheme over and above what might be achieved by a standard volume-oriented allocation scheme.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473859
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor

9. Hierarchical Bayes Estimation of Unemployment Rates for the States of the U.S.

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 1074-1082
G. S. Datta,
P. Lahiri,
T. Maiti,
K. L. Lu
Abstract:
Under a federal-state cooperative program, the U.S. Bureau of Labor Statistics (BLS) publishes monthly unemployment rate estimates for the 50 states and the District of Columbia. The primary source of data for this estimation problem is the Current Population Survey (CPS). However, the CPS state unemployment rate estimates are unreliable, because the survey provides relatively few observations per state. Various federal agencies use state-level unemployment rate estimates for policy making and fund allocation. Thus it is important to improve on the CPS estimates. For this, we propose a hierarchical Bayes (HB) method using a time series generalization of a widely used cross-sectional model in small-area estimation. The proposed method is compared in detail with the corresponding HB method, which uses the HB analog of the well-known Fay-Herriot cross-sectional model. A third model based on a time series approach to repetitive surveys is found to be very hard to implement for these data; the resulting estimates are very unstable and not meaningful. If we ignore some important factors from this model, then the reduced model can be fit, but the resulting model is found to be less than adequate. Gibbs sampling is used to obtain the posterior means and variances of the state unemployment rates. Based on some diagnostic tools recently developed for hierarchical models, our proposed model emerges as the best. The coefficients of variation of the proposed HB estimates are considerably lower than those of the rival estimates.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473860
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor
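For context, the Fay-Herriot cross-sectional model that the proposed time series method generalizes can be sketched in a few lines. Everything below is a synthetic stand-in (not the CPS data), and the between-area variance component is treated as known rather than estimated, which the real method would not do:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical small-area setup in the spirit of the Fay-Herriot model:
# direct estimates y_i = theta_i + e_i with known sampling variances D_i,
# and a linking model theta_i = x_i * b + v_i, v_i ~ N(0, A).
m = 51                               # e.g., 50 states plus D.C.
x = rng.uniform(3, 8, m)             # an area-level covariate (hypothetical)
b_true, A = 1.0, 0.25
theta = x * b_true + rng.normal(0, np.sqrt(A), m)   # true area means
D = rng.uniform(0.5, 1.5, m)         # known, unequal sampling variances
y = theta + rng.normal(0, np.sqrt(D))               # noisy direct estimates

def fh_predict(y, x, D, A):
    """Best predictor given A: shrink each direct estimate toward the
    regression fit, with weight gamma_i = A / (A + D_i)."""
    w = 1 / (A + D)
    b = np.sum(w * x * y) / np.sum(w * x * x)  # weighted LS for b
    gamma = A / (A + D)
    return gamma * y + (1 - gamma) * x * b

theta_hat = fh_predict(y, x, D, A)

# Model-based estimates should be closer to the truth on average
# than the noisy direct survey estimates.
mse_direct = np.mean((y - theta) ** 2)
mse_fh = np.mean((theta_hat - theta) ** 2)
print(round(mse_direct, 3), round(mse_fh, 3))
```

Areas with large sampling variance `D_i` borrow heavily from the regression fit (small `gamma_i`), while well-measured areas keep most of their direct estimate; the article's contribution is extending this borrowing across time as well as across areas.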

10. Evaluation and Comparison of EEG Traces: Latent Structure in Nonstationary Time Series

Journal of the American Statistical Association,
Volume 94,
Issue 448,
1999,
Pages 1083-1095
Mike West,
Raquel Prado,
Andrew D. Krystal
Abstract:
We explore and illustrate the use of time series decomposition methods for evaluating and comparing latent structure in nonstationary electroencephalographic (EEG) traces obtained from depressed patients during brain seizures induced as part of electroconvulsive therapy (ECT). Analysis of the patterns of change over time in the frequency structure of such EEG data provides insight into the neurophysiological mechanisms of action of this effective but poorly understood antidepressant treatment, and allows clinicians to modify ECT treatments to optimize therapeutic benefits while minimizing associated side effects. Our work has introduced new methods of time-frequency analysis of EEG series that identify the complete pattern of time evolution of frequency structure over the course of a seizure, and usefully assist in these scientific and clinical studies. New methods of decomposition of flexible dynamic models provide time domain decompositions of individual EEG series into collections of latent components in different frequency bands. This allows us to explore ECT seizure characteristics via inferences on the time-varying parameters that characterize these latent components, and to relate differences in such characteristics across seizures to differences in the therapeutic effectiveness and cognitive side effects of those seizures. This article discusses the scientific context and problems, development of nonstationary time series models and new methods of decomposition to explore time-frequency structure, and aspects of model fitting and analysis. We include applied studies on two datasets from recent clinical ECT studies. One is an initial illustrative analysis of a single EEG trace; the second compares the EEG data recorded during two types of ECT treatment that differ in therapeutic effectiveness and cognitive side effects.
The uses of these models and time series decomposition methods in extracting and contrasting key features of the seizure underlying the EEG signals are highlighted. Through the use of these models we have quantified, for the first time, decreases in the dominant frequencies of low-frequency EEG components during ECT seizures. We have also identified preliminary evidence that such decreases are enhanced under the more effective ECTs at higher electrical dosages, a finding consistent with prior reports and the hypothesis that more effective forms of ECT are more effective in eliciting neurophysiological inhibitory processes.
ISSN: 0162-1459
DOI: 10.1080/01621459.1999.10473861
Publisher: Taylor & Francis Group
Year: 1999
Data source: Taylor