E-book, English, 384 pages
Liang / Liu / Carroll: Advanced Markov Chain Monte Carlo Methods
1st edition, 2010
ISBN: 978-0-470-66973-0
Publisher: John Wiley & Sons
Format: PDF
Copy protection: Adobe DRM
Learning from Past Samples
Series: Wiley Series in Computational Statistics
Markov Chain Monte Carlo (MCMC) methods are now an indispensable tool in scientific computing. This book discusses recent developments of MCMC methods with an emphasis on those making use of past sample information during simulations. The application examples are drawn from diverse fields such as bioinformatics, machine learning, social science, combinatorial optimization, and computational physics.
Key Features:
* Expanded coverage of the stochastic approximation Monte Carlo and dynamic weighting algorithms that are essentially immune to local trap problems.
* A detailed discussion of the Monte Carlo Metropolis-Hastings algorithm that can be used for sampling from distributions with intractable normalizing constants.
* Up-to-date accounts of recent developments of the Gibbs sampler.
* Comprehensive overviews of the population-based MCMC algorithms and the MCMC algorithms with adaptive proposals.
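To give a flavor of the subject matter, here is a minimal random-walk Metropolis-Hastings sketch. This is not code from the book; the standard normal target, Gaussian proposal, step scale, and chain length are all illustrative choices.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, scale=1.0, seed=0):
    """Random-walk Metropolis-Hastings with a Gaussian proposal."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, scale)
        # Accept with probability min(1, pi(proposal) / pi(x)),
        # computed on the log scale for numerical stability.
        log_alpha = log_target(proposal) - log_target(x)
        if math.log(rng.random()) < log_alpha:
            x = proposal
        samples.append(x)
    return samples

# Standard normal target, up to an additive constant in the log density.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
```

Note that only the ratio of target densities enters the acceptance step, so the target need only be known up to a normalizing constant; the "intractable normalizing constant" setting covered in the book refers to the harder case where even this ratio cannot be evaluated directly.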
This book can be used as a textbook or a reference book for a one-semester graduate course in statistics, computational biology, engineering, and computer sciences. Applied or theoretical researchers will also find this book beneficial.
Authors/Editors
Further Information & Material
Preface
Acknowledgements
List of Figures
List of Tables
1 Bayesian Inference and Markov chain Monte Carlo
1.1 Bayes
1.2 Bayes output
1.3 Monte Carlo Integration
1.4 Random variable generation
1.5 Markov chain Monte Carlo
Exercises
2 The Gibbs sampler
2.1 The Gibbs sampler
2.2 Data Augmentation
2.3 Implementation strategies and acceleration methods
2.4 Applications
Exercises
3 The Metropolis-Hastings Algorithm
3.1 The Metropolis-Hastings Algorithm
3.2 Some Variants of the Metropolis-Hastings Algorithm
3.3 Reversible Jump MCMC Algorithm for Bayesian Model Selection Problems
3.4 Metropolis-within-Gibbs Sampler for ChIP-chip Data Analysis
Exercises
4 Auxiliary Variable MCMC Methods
4.1 Simulated Annealing
4.2 Simulated Tempering
4.3 Slice Sampler
4.4 The Swendsen-Wang Algorithm
4.5 The Wolff Algorithm
4.6 The Møller algorithm
4.7 The Exchange Algorithm
4.8 Double MH Sampler
4.9 Monte Carlo MH Sampler
4.10 Applications
Exercises
5 Population-Based MCMC Methods
5.1 Adaptive Direction Sampling
5.2 Conjugate Gradient Monte Carlo
5.3 Sample Metropolis-Hastings Algorithm
5.4 Parallel Tempering
5.5 Evolutionary Monte Carlo
5.6 Sequential Parallel Tempering for Simulation of High Dimensional Systems
5.7 Equi-Energy Sampler
5.8 Applications
Exercises
6 Dynamic Weighting
6.1 Dynamic Weighting
6.2 Dynamically Weighted Importance Sampling
6.3 Monte Carlo Dynamically Weighted Importance Sampling
6.4 Sequentially Dynamically Weighted Importance Sampling
Exercises
7 Stochastic Approximation Monte Carlo
7.1 Multicanonical Monte Carlo
7.2 1/k-Ensemble Sampling
7.3 Wang-Landau Algorithm
7.4 Stochastic Approximation Monte Carlo
7.5 Applications of Stochastic Approximation Monte Carlo
7.6 Variants of Stochastic Approximation Monte Carlo
7.7 Theory of Stochastic Approximation Monte Carlo
7.8 Trajectory Averaging: Toward the Optimal Convergence Rate
Exercises
8 Markov Chain Monte Carlo with Adaptive Proposals
8.1 Stochastic Approximation-based Adaptive Algorithms
8.2 Adaptive Independent Metropolis-Hastings Algorithms
8.3 Regeneration-based Adaptive Algorithms
8.4 Population-based Adaptive Algorithms
Exercises
References
Index