31. Bayesian Robust Multivariate Linear Regression with Incomplete Data
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1219-1227
Chuanhai Liu
Abstract:
The multivariate t distribution and other normal/independent multivariate distributions, such as the multivariate slash distribution and the multivariate contaminated distribution, are used for robust regression with complete or incomplete data. Most previous work focused on the method of maximum likelihood estimation for linear regression using normal/independent distributions. This article considers Bayesian estimation of multivariate linear regression models using normal/independent distributions with fully observed predictor variables and possible missing values in the outcome variables. A monotone data augmentation algorithm for posterior simulation of the parameters and missing-data imputation is presented. The posterior distributions of functions of the parameters can be obtained using Monte Carlo methods. The monotone data augmentation algorithm can also be used to create multiple imputations for incomplete data sets. An illustrative example using the multivariate t distribution is also included.
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10476991
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor
32. Reasoning to a Foregone Conclusion
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1228-1235
Joseph B. Kadane, Mark J. Schervish, Teddy Seidenfeld
Abstract:
When can a Bayesian select a hypothesis H and design an experiment (or a sequence of experiments) to make certain that, given the experimental outcome(s), the posterior probability of H will be greater than its prior probability? We discuss an elementary result that establishes sufficient conditions under which this reasoning to a foregone conclusion cannot occur. We illustrate how, when the sufficient conditions fail because probability is finitely but not countably additive, a Bayesian may be able to design an experiment to lead his or her posterior probability to a foregone conclusion. The problem has a decision-theoretic version in which a Bayesian might rationally pay not to see the outcome of certain cost-free experiments, which we discuss from several perspectives. We also relate this issue in Bayesian hypothesis testing to various concerns about “optional stopping.”
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10476992
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor
33. The Equivalence of Constrained and Weighted Designs in Multiple Objective Design Problems
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1236-1244
Merlise Clyde, Kathryn Chaloner
Abstract:
Several competing objectives may be relevant in the design of an experiment. The competing objectives may not be easy to characterize in a single optimality criterion. One approach to these design problems has been to weight each criterion and find the design that optimizes the weighted average of the criteria. An alternative approach has been to optimize one criterion subject to constraints on the other criteria. An equivalence theorem is presented for the Bayesian constrained design problem. Equivalence theorems are essential in verifying optimality of proposed designs, especially when (as in most nonlinear design problems) numerical optimization is required. This theorem is used to show that the results of Cook and Wong on the equivalence of the weighted and constrained problems apply much more generally. The results are applied to Bayesian nonlinear design problems with several objectives.
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10476993
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor
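In generic notation (not taken from the article), the two formulations the abstract contrasts choose a design η over K criteria φ_k either by weighting or by constraining:

```latex
% Weighted formulation: maximize a convex combination of design criteria
\max_{\eta}\; \sum_{k=1}^{K} \lambda_k\, \phi_k(\eta),
\qquad \lambda_k \ge 0,\quad \sum_{k=1}^{K} \lambda_k = 1.
% Constrained formulation: optimize one criterion with floors on the rest
\max_{\eta}\; \phi_1(\eta)
\quad \text{subject to} \quad \phi_k(\eta) \ge c_k,\quad k = 2, \dots, K.
```

The equivalence result says, roughly, that a solution of the constrained problem is also a solution of a weighted problem for suitable weights, playing the role of Lagrange multipliers.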
34. Nonparametric Importance Sampling
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1245-1253
Ping Zhang
Abstract:
Importance sampling is a widely used variance reduction simulation technique for the evaluation of high-dimensional integrals. A key step in the implementation of importance sampling is to choose a proper distribution function from which pseudorandom numbers are generated. Parametric sampling distributions, if available at all, are often inadequate for high-dimensional integrals over irregular regions. One possible remedy is to use a nonparametric method to estimate the unknown optimal sampling function. We show that the nonparametric approach yields integral estimates that converge faster than estimates obtained from parametric approaches. We also demonstrate that an adaptive method, which has been used successfully in parametric settings, does not yield better results than simple one-step methods in the nonparametric setting.
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10476994
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor
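The identity behind the abstract is ordinary importance sampling: E_f[h(X)] = E_g[h(X) f(X)/g(X)] for any sampling density g that dominates f. A minimal parametric sketch in Python — the target, proposal, and test function are illustrative choices, and the paper's contribution (nonparametric estimation of the optimal sampling density) is not reproduced here:

```python
import math
import random

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def importance_sampling(h, n, seed=0):
    """Estimate E_f[h(X)] for the target f = N(0, 1) using draws from a
    wider proposal g = N(0, 2^2); each draw is weighted by f(x)/g(x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 2.0)                      # draw from the proposal g
        w = normal_pdf(x) / normal_pdf(x, 0.0, 2.0)  # importance weight
        total += w * h(x)
    return total / n

# E[X^2] under N(0, 1) is exactly 1, so the estimate should be near 1
est = importance_sampling(lambda x: x * x, n=100_000)
```

Choosing a proposal with heavier tails than the target keeps the weights f/g bounded, which keeps the variance of the estimate finite.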
35. Fitting Full-Information Item Factor Models and an Empirical Investigation of Bridge Sampling
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1254-1267
Xiao-Li Meng, Stephen Schilling
Abstract:
Based on item response theory, Bock and Aitken introduced a method of item factor analysis, termed full-information item factor (FIIF) analysis by Bartholomew because it uses all distinct item response vectors as data. A limitation of their fitting algorithm, however, is its reliance on fixed-point Gauss–Hermite quadrature, which can produce appreciable numerical errors, especially in high-dimensional problems. The first purpose of this article is to offer more reliable methods by using recent advances in statistical computation. Specifically, we illustrate two ways of implementing a Monte Carlo Expectation-Maximization (EM) algorithm to fit a FIIF model, using the Gibbs sampler to carry out the computation for the E steps. We also show how to use bridge sampling to simulate the likelihood ratios for monitoring the convergence of a Monte Carlo EM, a strategy that is useful in general. Simulations demonstrate substantial improvement over Bock and Aitken's algorithm in recovering known factor loadings in high dimensions. To test our methods, we also apply them to data from the LSAT and from a survey on the quality of American life, and compare the results to those from the fixed-point approach. Using the FIIF model as a working example, the second purpose of this article is to provide an empirical investigation of the theoretical development of Meng and Wong on bridge sampling, an efficient method for computing normalizing constants. In contrast to importance sampling, which uses draws from one density, bridge sampling uses draws from two (or more) densities and then introduces intermediate densities to “bridge” them. Our empirical investigation confirms the results of Meng and Wong and echoes the empirical evidence documented in computational physics: bridge sampling can reduce simulation errors by orders of magnitude compared to importance sampling with the same simulation sizes.
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10476995
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor
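As generic background for the bridge sampling comparison in the abstract — a toy sketch, not the article's FIIF application — the ratio of normalizing constants c1/c2 of two unnormalized densities q1, q2 satisfies the bridge identity c1/c2 = E_p2[q1·α] / E_p1[q2·α] for any bridge function α; the geometric bridge α = 1/√(q1·q2) is a common choice. The Gaussian kernels below are illustrative:

```python
import math
import random

# Two unnormalized Gaussian kernels: q1 is the N(0, 1) kernel and q2 the
# N(0, 2^2) kernel, so c1 = sqrt(2*pi), c2 = sqrt(8*pi), and c1/c2 = 0.5.
def q1(x): return math.exp(-x * x / 2.0)
def q2(x): return math.exp(-x * x / 8.0)

def bridge_ratio(n, seed=0):
    """Estimate c1/c2 with the geometric bridge alpha = 1/sqrt(q1*q2),
    for which q1*alpha = sqrt(q1/q2) and q2*alpha = sqrt(q2/q1)."""
    rng = random.Random(seed)
    num = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 2.0)          # draw from p2 = N(0, 2^2)
        num += math.sqrt(q1(x) / q2(x))
    den = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)          # draw from p1 = N(0, 1)
        den += math.sqrt(q2(x) / q1(x))
    return num / den                      # the two 1/n factors cancel

ratio = bridge_ratio(50_000)              # should be near 0.5
```

Both expectations involve only the overlap region of the two densities, which is what makes bridge estimators more stable than one-sided importance sampling when the densities are far apart.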
36. Assessing Evidence in Multiple Hypotheses
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1268-1277
Constantinos Goutis, George Casella, Martin T. Wells
Abstract:
We formulate the problem of choosing between two hypotheses as a problem of constructing a data-dependent evidential measure for or against the null hypothesis. Our main focus is on the multivariate case, and in particular we examine multivariate evidential measures constructed as combinations of univariate ones. Such measures should obey some minimal intuitive desiderata, which we state as axioms. These axioms formalize the acceptable behavior for various values of the individual pieces of evidence, and we discuss in detail the rationale behind each axiom. Furthermore, we investigate other properties of multivariate evidence that are desirable but not indispensable. These properties concern mainly the performance of methods of combining evidence as the dimension of the problem varies. We critically examine the behavior of common rules for assessing evidence in higher dimensions, such as combinations of p values and Bayes posterior probabilities, and clarify the connection between hypothesis testing and our approach. We include a discussion comparing different methods of assessing evidence.
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10476996
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor
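Among the “common rules” the abstract examines is the combination of p values. A standard concrete instance — generic background, not the article's proposal — is Fisher's rule, which maps k independent p values to one via T = -2 Σ log p_i, chi-squared with 2k degrees of freedom under the null:

```python
import math

def fisher_combined_pvalue(pvalues):
    """Fisher's method for k independent p values: under the null,
    T = -2 * sum(log p_i) has a chi-square distribution with 2k df,
    whose upper tail has a closed form for even degrees of freedom."""
    k = len(pvalues)
    t = -2.0 * sum(math.log(p) for p in pvalues)
    half = t / 2.0
    # P(chi2 with 2k df > t) = exp(-t/2) * sum_{j=0}^{k-1} (t/2)^j / j!
    return math.exp(-half) * sum(half ** j / math.factorial(j)
                                 for j in range(k))

combined = fisher_combined_pvalue([0.04, 0.20, 0.10])
```

For a single p value the rule returns it unchanged; as individually modest p values are combined, the evidence against the null accumulates — the kind of dimension-dependent behavior the article's axioms are designed to scrutinize.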
37. Bootstrap for Imputed Survey Data
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1278-1288
Jun Shao, Randy R. Sitter
Abstract:
Most surveys use imputation to compensate for missing data. However, treating the imputed data set as the complete data set and directly applying existing methods (e.g., the linearization, the jackknife, and the bootstrap) for variance estimation and/or statistical inference does not produce valid results, because these methods do not account for the effect of missing data and/or imputation. In this article we show that correct bootstrap estimates can be obtained by imitating the process of imputing the original data set in the bootstrap resampling; that is, by imputing the bootstrap data sets in exactly the same way that the original data set is imputed. The proposed bootstrap is asymptotically valid irrespective of the sampling design, the imputation method, or the type of statistic used in inference. This enables us to use a unified method in a variety of problems, and in fact this is the only method that works without any restriction on the sampling design, the imputation method, or the type of statistic.
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10476997
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor
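A minimal sketch of the proposed scheme — re-imputing each bootstrap resample exactly as the original incomplete data set was imputed — under simplifying assumptions: simple random sampling with replacement, mean imputation, and the sample mean as the statistic. The data values are made up for illustration:

```python
import random
import statistics

def impute_mean(values):
    """Fill missing entries (None) with the mean of the observed ones."""
    obs = [v for v in values if v is not None]
    m = statistics.fmean(obs)
    return [m if v is None else v for v in values]

def bootstrap_variance_reimputed(incomplete, n_boot=500, seed=0):
    """Bootstrap variance of the mean that re-imputes each resample,
    imitating how the original incomplete data set was imputed."""
    rng = random.Random(seed)
    n = len(incomplete)
    reps = []
    for _ in range(n_boot):
        # resample the *incomplete* data, then impute the resample
        star = [incomplete[rng.randrange(n)] for _ in range(n)]
        reps.append(statistics.fmean(impute_mean(star)))
    return statistics.pvariance(reps)

data = [4.2, None, 5.1, 3.8, None, 4.9, 5.5, 4.4, 3.9, 5.0]
var_hat = bootstrap_variance_reimputed(data)
```

Resampling the already-imputed data instead would treat imputed values as real observations and understate the variance; re-imputing inside each replicate is what propagates the imputation uncertainty.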
38. Efficient Estimators with Simple Variance in Unequal Probability Sampling
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1289-1300
Carl-Erik Särndal
Abstract:
For unequal probability sampling designs, design-based variance estimation is cumbersome because it requires second-order inclusion probabilities. For most fixed-sample-size probability-proportional-to-size (πps) schemes, these probabilities are difficult to compute, and the variance estimation depends on them through a tedious double-sum calculation. We show how to replace the traditional πps scenario with simpler design/estimator alternatives that preserve the high efficiency characteristic of πps schemes. These use the generalized regression estimator, and the variance estimation entails only the calculation of a simple weighted sum of squared residuals.
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10476998
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor
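The generalized regression (GREG) estimator mentioned in the abstract adjusts the Horvitz-Thompson estimator using an auxiliary variable whose population total is known. A single-auxiliary sketch with hypothetical numbers — the article's variance estimator is not reproduced here:

```python
def greg_total(y, x, pi, x_pop_total):
    """Generalized regression (GREG) estimator of the population total of
    y, given sample values y, auxiliary values x, first-order inclusion
    probabilities pi, and the known population total of x."""
    # Horvitz-Thompson totals of y and x
    ht_y = sum(yi / p for yi, p in zip(y, pi))
    ht_x = sum(xi / p for xi, p in zip(x, pi))
    # design-weighted regression slope of y on x (no intercept)
    b = (sum(yi * xi / p for yi, xi, p in zip(y, x, pi))
         / sum(xi * xi / p for xi, p in zip(x, pi)))
    # correct the HT estimate by the known auxiliary total
    return ht_y + b * (x_pop_total - ht_x)

est = greg_total(y=[2.0, 4.0, 6.0], x=[1.0, 2.0, 3.0],
                 pi=[0.5, 0.5, 0.5], x_pop_total=10.0)
```

When y is exactly proportional to x, the regression adjustment removes all the sampling error: here y = 2x and the known auxiliary total is 10, so the estimate is exactly 20.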
39. On the Asymptotic Properties of LDU-Based Tests of the Rank of a Matrix
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1301-1309
John G. Cragg, Stephen G. Donald
Abstract:
Gill and Lewbel recently introduced a test for the rank of a matrix based on the LDU decomposition. Unfortunately, the asymptotic distribution they suggest is incorrect except in a very limited class of problems. In general, the asymptotic distribution is that of a highly complicated nonlinear function of a normally distributed random vector that appears to defy useful characterization. The LDU decomposition can, however, be used to produce a valid test asymptotically equivalent to the minimum chi-squared test.
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10476999
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor
40. Nonlinear Additive Models for Environmental Time Series, with Applications to Ground-Level Ozone Data Analysis
Journal of the American Statistical Association, Volume 91, Issue 435, 1996, Pages 1310-1321
Xu-Feng Niu
Abstract:
Environmental time series usually vary systematically in response to meteorological conditions and thus are often nonstationary. In this article a class of additive models is introduced for environmental time series, in which both the mean levels and the variances of the series are nonlinear functions of relevant meteorological variables. Backfitting algorithms from nonlinear regression are adopted to estimate the unknown functions in the model, and the maximum likelihood method is used to estimate the parameters in the noise component. Asymptotic properties of the parameter estimates, including consistency and limiting distributions, are derived under mild conditions. The model is applied to daily maxima of ground-level ozone concentrations in the Chicago area for possible long-term trend assessment. Compared to alternative models, the proposed models gave more accurate estimates of the 95th and 99th percentiles of the ozone distribution.
ISSN: 0162-1459
DOI: 10.1080/01621459.1996.10477000
Publisher: Taylor & Francis Group
Year: 1996
Source: Taylor