Winkler | Image Analysis, Random Fields and Markov Chain Monte Carlo Methods | E-Book | www.sack.de

E-book, English, Volume 27, 387 pages

Series: Stochastic Modelling and Applied Probability

Winkler, Image Analysis, Random Fields and Markov Chain Monte Carlo Methods

A Mathematical Introduction
2nd edition, 2003
ISBN: 978-3-642-55760-6
Publisher: Springer
Format: PDF
Copy protection: PDF watermark




"This book is concerned with a probabilistic approach for image analysis, mostly from the Bayesian point of view, and the important Markov chain Monte Carlo methods commonly used. … This book will be useful, especially to researchers with a strong background in probability and an interest in image analysis. The author has presented the theory with rigor … he doesn't neglect applications, providing numerous examples of applications to illustrate the theory." -- MATHEMATICAL REVIEWS


Target audience


Research


Authors/Editors


Further information & material


I. Bayesian Image Analysis: Introduction
1. The Bayesian Paradigm: 1.1 Warming up for Absolute Beginners; 1.2 Images and Observations; 1.3 Prior and Posterior Distributions; 1.4 Bayes Estimators
2. Cleaning Dirty Pictures: 2.1 Boundaries and Their Information Content; 2.2 Towards Piecewise Smoothing; 2.3 Filters, Smoothers, and Bayes Estimators; 2.4 Boundary Extraction; 2.5 Dependence on Hyperparameters
3. Finite Random Fields: 3.1 Markov Random Fields; 3.2 Gibbs Fields and Potentials; 3.3 Potentials Continued

II. The Gibbs Sampler and Simulated Annealing
4. Markov Chains: Limit Theorems: 4.1 Preliminaries; 4.2 The Contraction Coefficient; 4.3 Homogeneous Markov Chains; 4.4 Exact Sampling; 4.5 Inhomogeneous Markov Chains; 4.6 A Law of Large Numbers for Inhomogeneous Chains; 4.7 A Counterexample for the Law of Large Numbers
5. Gibbsian Sampling and Annealing: 5.1 Sampling; 5.2 Simulated Annealing; 5.3 Discussion
6. Cooling Schedules: 6.1 The ICM Algorithm; 6.2 Exact MAP Estimation Versus Fast Cooling; 6.3 Finite Time Annealing

III. Variations of the Gibbs Sampler
7. Gibbsian Sampling and Annealing Revisited: 7.1 A General Gibbs Sampler; 7.2 Sampling and Annealing Under Constraints
8. Partially Parallel Algorithms: 8.1 Synchronous Updating on Independent Sets; 8.2 The Swendsen-Wang Algorithm
9. Synchronous Algorithms: 9.1 Invariant Distributions and Convergence; 9.2 Support of the Limit Distribution; 9.3 Synchronous Algorithms and Reversibility

IV. Metropolis Algorithms and Spectral Methods
10. Metropolis Algorithms: 10.1 Metropolis Sampling and Annealing; 10.2 Convergence Theorems; 10.3 Best Constants; 10.4 About Visiting Schemes; 10.5 Generalizations and Modifications; 10.6 The Metropolis Algorithm in Combinatorial Optimization
11. The Spectral Gap and Convergence of Markov Chains: 11.1 Eigenvalues of Markov Kernels; 11.2 Geometric Convergence Rates
12. Eigenvalues, Sampling, Variance Reduction: 12.1 Samplers and Their Eigenvalues; 12.2 Variance Reduction; 12.3 Importance Sampling
13. Continuous Time Processes: 13.1 Discrete State Space; 13.2 Continuous State Space

V. Texture Analysis
14. Partitioning: 14.1 How to Tell Textures Apart; 14.2 Bayesian Texture Segmentation; 14.3 Segmentation by a Boundary Model; 14.4 Julesz's Conjecture and Two Point Processes
15. Random Fields and Texture Models: 15.1 Neighbourhood Relations; 15.2 Random Field Texture Models; 15.3 Texture Synthesis
16. Bayesian Texture Classification: 16.1 Contextual Classification; 16.2 Marginal Posterior Modes Methods

VI. Parameter Estimation
17. Maximum Likelihood Estimation: 17.1 The Likelihood Function; 17.2 Objective Functions
18. Consistency of Spatial ML Estimators: 18.1 Observation Windows and Specifications; 18.2 Pseudolikelihood Methods; 18.3 Large Deviations and Full Maximum Likelihood; 18.4 Partially Observed Data
19. Computation of Full ML Estimators: 19.1 A Naive Algorithm; 19.2 Stochastic Optimization for the Full Likelihood; 19.3 Main Results; 19.4 Error Decomposition; 19.5 L2-Estimates

VII. Supplement
20. A Glance at Neural Networks: 20.1 Boltzmann Machines; 20.2 A Learning Rule
21. Three Applications: 21.1 Motion Analysis; 21.2 Tomographic Image Reconstruction; 21.3 Biological Shape

VIII. Appendix
A. Simulation of Random Variables: A.1 Pseudorandom Numbers; A.2 Discrete Random Variables; A.3 Special Distributions
B. Analytical Tools: B.1 Concave Functions; B.2 Convergence of Descent Algorithms; B.3 A Discrete Gronwall Lemma; B.4 A Gradient System
C. Physical Imaging Systems
D. The Software Package AntsInFields
References. Symbols.
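Parts II and III of the book center on the Gibbs sampler for Gibbs fields such as the Ising model. As a rough illustration only (not taken from the book; the free-boundary Ising setup, the function name `gibbs_sweep`, and all variable names are my own assumptions), one single-site sweep might be sketched as:

```python
import math
import random

def gibbs_sweep(x, beta, rng):
    """One full sweep of a single-site Gibbs sampler over an n x n
    Ising grid. x is a list of lists with entries +1/-1, beta is the
    inverse temperature, rng is a random.Random instance. Each site is
    redrawn from its conditional distribution given its 4-neighbourhood
    (free boundary conditions)."""
    n = len(x)
    for i in range(n):
        for j in range(n):
            # sum of the neighbouring spins (missing neighbours at the
            # boundary contribute nothing)
            s = 0
            if i > 0:     s += x[i - 1][j]
            if i < n - 1: s += x[i + 1][j]
            if j > 0:     s += x[i][j - 1]
            if j < n - 1: s += x[i][j + 1]
            # conditional probability of spin +1 given the neighbours
            p = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
            x[i][j] = 1 if rng.random() < p else -1
    return x

# small demo: one sweep on an 8 x 8 grid started in the all-plus state
rng = random.Random(0)
x = [[1] * 8 for _ in range(8)]
gibbs_sweep(x, beta=1.0, rng=rng)
```

Simulated annealing (Chapters 5-6) would run such sweeps while slowly increasing beta according to a cooling schedule; the book develops the conditions under which this converges.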


