1. Characterizing Water Diffusion In Fixed Baboon Brain
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 3-15
G. Larry Bretthorst, Christopher D. Kroenke, Jeffrey J. Neil
Abstract:
In the Biomedical Magnetic Resonance Laboratory in St. Louis, Missouri, there is an ongoing project to characterize water diffusion in fixed baboon brain using diffusion weighted magnetic resonance imaging as a means of monitoring development throughout gestation. Magnetic resonance images can be made sensitive to diffusion by applying magnetic field gradients during the pulse sequence. A full diffusion tensor model does not fit the diffusion weighted images well: the estimated standard deviation of the noise exhibits structures corresponding to known baboon brain anatomy. However, the full diffusion tensor plus a constant model overfits the data: the residuals in the brain are smaller than in regions where there is no signal. Consequently, the full diffusion tensor plus a constant model has too many parameters and must be simplified, which can be done by imposing axial symmetry on the diffusion tensor. There are three axially symmetric diffusion tensor models (prolate, oblate, and isotropic) and two other models (no signal and full diffusion tensor) that could characterize the diffusion weighted images. Each of these five models may or may not include a constant offset, giving 10 models in total that could describe the diffusion process. In this paper the Bayesian calculations needed to select which of the 10 models best characterizes the diffusion data are presented. The various outputs from the analysis are illustrated using one of our baboon brain data sets. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835192
Publisher: AIP
Year: 2004
Data source: AIP
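The 10-way model choice the abstract describes can be miniaturized. Below is a hedged numpy sketch that compares a mono-exponential "isotropic" decay model against the same model plus a constant offset on synthetic diffusion data, using the BIC as a rough stand-in for the full Bayesian evidence. Every number (b-values, diffusivity, noise level) is invented for illustration; none of it comes from the paper.

```python
# Toy evidence-based model comparison: isotropic decay vs. decay + offset.
# BIC approximates -2*log(evidence); lower BIC ~ higher approximate evidence.
import numpy as np

rng = np.random.default_rng(0)
b = np.linspace(0.0, 3000.0, 20)          # hypothetical diffusion weightings
data = np.exp(-b * 7e-4) + rng.normal(0.0, 0.01, b.size)
sigma = 0.01                              # assumed known noise level

def neg_log_like(pred):
    # Gaussian negative log-likelihood up to a constant
    return 0.5 * np.sum(((data - pred) / sigma) ** 2)

def bic(num_params, nll):
    return 2.0 * nll + num_params * np.log(b.size)

Ds = np.linspace(1e-4, 2e-3, 200)         # grid over the diffusivity
nll_iso = min(neg_log_like(np.exp(-b * D)) for D in Ds)
offsets = np.linspace(-0.05, 0.05, 21)    # grid over the constant offset
nll_off = min(neg_log_like(np.exp(-b * D) + c) for D in Ds for c in offsets)

scores = {"isotropic": bic(1, nll_iso), "isotropic+offset": bic(2, nll_off)}
best = min(scores, key=scores.get)        # model with the lowest BIC
```

The extra parameter always fits at least as well (the offset grid contains zero), so the comparison hinges on the complexity penalty, which is the point of evidence-based selection.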
2. Bayesian Wavelet Domain Segmentation
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 19-26
Patrice Brault, Ali Mohammad‐Djafari
Abstract:
We have recently demonstrated that fully unsupervised segmentation of still images and 2D+T sequences is possible with Bayesian methods, based on a Hidden Markov Model (HMM) and a Potts‐Markov Random Field (PMRF), in the pixel domain. The large number of iterations needed to reach convergence when the number of segments, or class labels, is large makes the algorithm rather slow for processing large quantities of data, as in image sequences. We have more recently worked out a new version of this algorithm that operates in the wavelet transform domain rather than in the direct domain. Doing so, we take advantage of the local decay property, or "peaky" distribution, of the wavelet coefficients in an orthogonal decomposition. This decomposition is a fast pyramidal, O(N²), decomposition, so the Bayesian segmentation is performed once on the coarsest image and then on all sub‐bands up to the highest resolution level. Moreover, we have improved our Potts‐Markov model to take into account the three main orientations of the wavelet band‐pass, or so‐called detail, subbands. The main advantage of this algorithm over direct‐domain Bayesian segmentation is that the high frequency coefficients, i.e. the coefficients of all sub‐bands except the coarsest, are segmented into only two classes, one for the weak‐energy coefficients and one for the few, most representative, high‐energy coefficients, which speeds up the convergence of the segmentation. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835193
Publisher: AIP
Year: 2004
Data source: AIP
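As a toy illustration of the two-class labelling of detail coefficients described above, the sketch below runs one level of a 2D Haar decomposition in plain numpy and thresholds a detail band into a "weak" class and a sparse "high-energy" class. This is a stand-in for the authors' Bayesian Potts‐Markov segmentation, not their algorithm; the image and threshold are arbitrary assumptions.

```python
# One-level 2D Haar decomposition + two-class labelling of a detail band.
import numpy as np

def haar2d(img):
    """Return the approximation band and the (h, v, d) detail bands."""
    a = (img[0::2, 0::2] + img[0::2, 1::2] + img[1::2, 0::2] + img[1::2, 1::2]) / 4.0
    h = (img[0::2, 0::2] - img[0::2, 1::2] + img[1::2, 0::2] - img[1::2, 1::2]) / 4.0
    v = (img[0::2, 0::2] + img[0::2, 1::2] - img[1::2, 0::2] - img[1::2, 1::2]) / 4.0
    d = (img[0::2, 0::2] - img[0::2, 1::2] - img[1::2, 0::2] + img[1::2, 1::2]) / 4.0
    return a, (h, v, d)

img = np.zeros((8, 8))
img[:, 3:] = 1.0                     # a vertical edge inside a block pair

approx, (h, v, d) = haar2d(img)

def two_class(coefs, thresh=0.1):
    # class 2 = the few high-energy coefficients, class 1 = everything else
    return np.where(np.abs(coefs) > thresh, 2, 1)

labels = two_class(h)                # h responds to the vertical edge
```

Only the handful of coefficients straddling the edge land in class 2, which is why the two-class segmentation of detail bands converges so much faster than a many-class pixel-domain one.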
3. Multigrid Priors for fMRI time series analysis
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 27-34
Nestor Caticha, Selene da Rocha Amaral, Said R. Rabbani
Abstract:
We deal with the problem of constructing priors for data analysis in order to assess brain activity in functional Magnetic Resonance Imaging (fMRI). Our method is an example of how a prior distribution can incorporate what could be termed conventional prior information, as well as other information such as that stemming from knowledge of what constitutes a reasonable likelihood. Brain activity during a cognitive, sensorial or motor task presents a certain level of localization, and spatial correlations at different scales are involved in the problem. These suggest a multiscale iterative procedure for constructing the prior. Grids of different scales are constructed over the image. Spatially coarse-grained data variables are defined for each scale, until a single voxel time series is obtained. The process consists of iterating back to finer scales, determining for each coarse scale a set of posterior probabilities. The posterior on a coarse scale is used as the prior for activity at the next finer scale. We have applied our method both to real and to synthetic data from block experiments. A linear model and a standard hemodynamic response function are used to construct the likelihood. ROC curves are used to compare the results with other Bayesian and orthodox methods. By systematically deleting images in each period or by corrupting the signal with noise, we can study the robustness of the method under information loss. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835194
Publisher: AIP
Year: 2004
Data source: AIP
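The coarse-to-fine prior construction can be sketched in a few lines: the posterior probability of activity computed on a coarse grid becomes the prior for each of its child voxels on the next finer grid. The Gaussian two-hypothesis likelihood and all statistic values below are illustrative assumptions, not the paper's linear/HRF model.

```python
# Coarse posterior -> fine prior, on a toy 1D "image" of activation statistics.
import numpy as np

def posterior_active(stat, prior, mu=2.0, sigma=1.0):
    """P(active | stat): stat ~ N(mu, sigma) if active, N(0, sigma) if not."""
    like_active = np.exp(-0.5 * ((stat - mu) / sigma) ** 2)
    like_rest = np.exp(-0.5 * (stat / sigma) ** 2)
    return like_active * prior / (like_active * prior + like_rest * (1.0 - prior))

fine_stats = np.array([2.1, 1.9, 0.1, -0.2])   # two active voxels, two inactive
coarse_stats = fine_stats.reshape(2, 2).mean(axis=1)  # coarse-grain by pairs

p_coarse = posterior_active(coarse_stats, prior=0.5)       # flat prior at top
p_fine = posterior_active(fine_stats, prior=np.repeat(p_coarse, 2))
```

The coarse pass concentrates prior mass on the active region, so the fine-scale posteriors separate the two classes more sharply than a single flat-prior pass would.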
4. Model Fitting and Model Evidence for Multiscale Image Texture Analysis
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 35-42
Mihai Datcu, Dan Alexandru Stoichescu, Klaus Seidel, Cristian Iorga
Abstract:
This paper gives an overview of the two levels of Bayesian inference, model fitting and model selection, and shows how they can be used for image texture analysis. The models applied are the Gauss‐Markov and Gibbs auto‐binomial Random Fields. In the second part, the article introduces a linear model for the image wavelet coefficients that fully describes the spatial, inter‐scale and inter‐band behavior of a multi‐resolution decomposed image. The model parameters, model variance and evidence are used to characterize the image texture. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835195
Publisher: AIP
Year: 2004
Data source: AIP
5. Integrated Approaches in Fusion Data Analysis
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 43-51
A. Dinklage, R. Fischer, H. Dreier, J. Svensson, Yu. Turkin
Abstract:
The concept of integrated data analysis in nuclear fusion requires the linkage of data and physical information. Summarizing the key steps for the analysis of transport in the core plasma, the benefits of probabilistic modelling of single diagnostics are discussed. Concepts for full diagnostic models consisting of several diagnostic modules, linked through mapping procedures, are presented as Bayesian graphical models. Coupling to theory codes is demonstrated by the error estimation of a neoclassical analysis, allowing quantitative physical model validation. As an inverted use of the integrated data analysis approach, goals for the design of diagnostics and sets of diagnostics (meta‐diagnostics) are outlined. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835196
Publisher: AIP
Year: 2004
Data source: AIP
6. Bayesian Data analysis for ERDA measurements
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 52-59
E. Edelmann, K. Arstila, J. Keinonen
Abstract:
Elastic recoil detection analysis (ERDA) is an important ion beam analysis (IBA) method for the analysis of thin films. It does, however, suffer from broadening of the energy spectra due to multiple and plural scattering and surface roughness, with loss of depth resolution as a result. We present a method based on Bayesian probability theory to improve the depth resolution, utilising a simulation code to simulate the ERDA measurement process. The method is demonstrated on a simulated measurement of a WxCyN(1−x−y)/SiO2/Si sample, for which multiple and plural scattering becomes a large problem with the traditional data analysis methods used with ERDA, due to the heavy mass of tungsten. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835197
Publisher: AIP
Year: 2004
Data source: AIP
7. Relative Entropy Credibility Theory
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 60-67
Juan José Fernández‐Durán, María Mercedes Gregorio‐Domínguez
Abstract:
Consider a portfolio of personal motor insurance policies in which, for each policyholder in the portfolio, we want to assign a credibility factor at the end of each policy period that reflects the claim experience of the policyholder compared with the claim experience of the entire portfolio. In this paper we present the calculation of credibility factors based on the concept of relative entropy between the claim size distribution of the entire portfolio and the claim size distribution of the policyholder. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835198
Publisher: AIP
Year: 2004
Data source: AIP
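The core quantity, relative entropy between claim-size distributions, is easy to sketch. Below, the KL divergence between a policyholder's discretised claim-size distribution and the portfolio's is mapped to a number in (0, 1] via exp(-KL); that mapping, and the claim-band probabilities, are illustrative assumptions rather than the paper's formula.

```python
# Credibility-style comparison of claim-size distributions via relative entropy.
import numpy as np

def kl(p, q):
    """Relative entropy D(p||q) for discrete distributions on the same bands."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                      # 0*log(0) terms contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

portfolio = [0.50, 0.30, 0.15, 0.05]  # claim-size bands, whole portfolio
typical   = [0.48, 0.32, 0.15, 0.05]  # policyholder close to the portfolio
risky     = [0.10, 0.20, 0.30, 0.40]  # policyholder with heavy claims

# Hypothetical credibility factor: 1 when the experiences coincide,
# decaying toward 0 as the policyholder diverges from the portfolio.
cred_typical = np.exp(-kl(typical, portfolio))
cred_risky   = np.exp(-kl(risky, portfolio))
```

A policyholder whose claim experience matches the portfolio gets a factor near 1, while a divergent one gets a markedly smaller factor, which is the qualitative behaviour a credibility weight should have.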
8. Reconstruction of piecewise homogeneous images from partial knowledge of their Fourier Transform
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 68-75
Olivier Féron, Zouaoui Chama, Ali Mohammad‐Djafari
Abstract:
The Fourier synthesis (FS) inverse problem consists of reconstructing a multi‐variable function from measured data that correspond to partial and uncertain knowledge of its Fourier Transform (FT). By partial knowledge we mean either partial support and/or knowledge of only the modulus, and by uncertain we mean both uncertainty of the model and noisy data. This inverse problem arises in many applications such as optical imaging, radio astronomy, magnetic resonance imaging (MRI) and diffraction scattering (ultrasound or microwave imaging). Most classical inversion methods are based on interpolation of the data and a fast inverse FT. But when the data do not fill the Fourier domain uniformly, or when the phase of the signal is missing as in optical interferometry, the results obtained by such methods are not satisfactory, because these inverse problems are ill‐posed. The Bayesian estimation approach, via appropriate modeling of the unknown functions, offers the possibility of compensating for the lack of information in the data, thus giving satisfactory results. In this paper we study the case where the observations are part of the FT modulus of objects composed of a small number of homogeneous materials. To model such objects we use Hierarchical Hidden Markov Modeling (HMM) and propose a Bayesian inversion method using appropriate Markov Chain Monte Carlo (MCMC) algorithms. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835199
Publisher: AIP
Year: 2004
Data source: AIP
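A small numpy experiment makes the ill-posedness concrete: with the full (complex) Fourier transform a piecewise homogeneous object is recovered exactly by the inverse FT, while inverting the modulus alone discards the phase and fails. This only motivates the problem; it is not the paper's HMM/MCMC method, and the test object is invented.

```python
# Why modulus-only Fourier data defeat naive inverse-FT reconstruction.
import numpy as np

img = np.zeros((8, 8))
img[2:5, 3:6] = 1.0                        # one homogeneous block on background

F = np.fft.fft2(img)
recon_full = np.fft.ifft2(F).real          # full FT: exact recovery
recon_modulus = np.fft.ifft2(np.abs(F)).real  # modulus only: phase lost

err_full = np.max(np.abs(recon_full - img))
err_modulus = np.max(np.abs(recon_modulus - img))
```

The full-data error is at floating-point level, while the modulus-only error is large; recovering the object from the modulus therefore needs strong priors, such as the piecewise homogeneity prior the paper encodes with a hierarchical hidden Markov model.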
9. Bayesian Experimental Design — Studies for Fusion Diagnostics
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 76-83
R. Fischer
Abstract:
The design of fusion diagnostics is essential for the physics program of future fusion devices. The goal is to maximize the information gain of a future experiment with respect to various constraints. A measure of information gain is the mutual information between the posterior and the prior distribution. The Kullback‐Leibler distance is used as a utility function to calculate the expected information gain marginalizing over data and parameter space. The expected utility function is maximized with respect to the design parameters of the experiment. The method will be applied to the design of a Thomson scattering experiment. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835200
Publisher: AIP
Year: 2004
Data source: AIP
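For the simplest case of the utility described above, a linear Gaussian measurement y = g(d)·theta + noise, the expected Kullback-Leibler gain of the posterior over the prior has a closed form, 0.5·log(1 + g(d)²·prior_var/noise_var), which can be scanned over a design parameter. The sensitivity curve g(d) below is a made-up example, not a Thomson scattering model.

```python
# Expected information gain for a linear Gaussian design, scanned over d.
import numpy as np

prior_var, noise_var = 1.0, 0.1            # illustrative variances

def expected_gain(g):
    # E_y[KL(posterior || prior)] for y = g*theta + N(0, noise_var),
    # theta ~ N(0, prior_var): the standard conjugate-Gaussian result.
    return 0.5 * np.log(1.0 + g ** 2 * prior_var / noise_var)

designs = np.linspace(0.0, 2.0, 101)       # hypothetical design settings d
g = np.sin(np.pi * designs / 2.0)          # made-up sensitivity, peaks at d = 1
gains = expected_gain(g)
best_design = designs[np.argmax(gains)]    # maximiser of the expected utility
```

The optimum lands where the measurement is most sensitive to the parameter, illustrating the marginalize-then-maximize structure of the design problem.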
10. Bayesian estimation methods in metrology
AIP Conference Proceedings, Volume 735, Issue 1, 2004, Pages 84-95
M. G. Cox, A. B. Forbes, P. M. Harris
Abstract:
In metrology (the science of measurement), a measurement result must be accompanied by a statement of its associated uncertainty. The degree of validity of a measurement result is determined by the validity of the uncertainty statement. In recognition of the importance of uncertainty evaluation, the International Organization for Standardization published the Guide to the Expression of Uncertainty in Measurement in 1995, and the Guide has been widely adopted. The validity of uncertainty statements is tested in interlaboratory comparisons, in which an artefact is measured by a number of laboratories and their measurement results are compared. Since the introduction of the Mutual Recognition Arrangement, key comparisons have been undertaken to determine the degree of equivalence of laboratories for particular measurement tasks. In this paper, we discuss the possible development of the Guide to reflect Bayesian approaches and the evaluation of key comparison data using Bayesian estimation methods. © 2004 American Institute of Physics
ISSN: 0094-243X
DOI: 10.1063/1.1835201
Publisher: AIP
Year: 2004
Data source: AIP