Gray | Entropy and Information Theory | E-Book | www.sack.de

E-Book, English, 409 pages

Gray Entropy and Information Theory


2nd edition, 2011
ISBN: 978-1-4419-7970-4
Publisher: Springer US
Format: PDF
Copy protection: PDF watermark




This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.

New in this edition:
- Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
- Expanded discussion of results from ergodic theory relevant to information theory
- Expanded treatment of B-processes, the processes formed by stationary coding of memoryless sources
- New material on trading off information and distortion, including the Marton inequality
- New material on the properties of optimal and asymptotically optimal source codes
- New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary (AMS) sources, which may be neither ergodic nor stationary, and for d-bar continuous channels.
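As a purely illustrative aside (not taken from the book): a finite-length sliding-block code produces each output symbol by applying one fixed rule to a window of consecutive source symbols, and driving such a stationary code with a memoryless source yields a B-process of the kind mentioned above. A minimal Python sketch, with an invented majority-vote rule standing in for the coding function:

import random

def sliding_block_code(source, window=3):
    # Finite-length sliding-block code: every output symbol is the same
    # fixed function (here, an illustrative majority vote) of a window
    # of consecutive source symbols.
    return [1 if sum(source[i:i + window]) > window // 2 else 0
            for i in range(len(source) - window + 1)]

# A memoryless (i.i.d.) binary source; coding it with the stationary
# sliding-block code above gives an example of a B-process.
random.seed(0)
x = [random.randint(0, 1) for _ in range(20)]
print(x)
print(sliding_block_code(x))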

Robert M. Gray is the Alcatel-Lucent Technologies Professor of Communications and Networking in the School of Engineering and Professor of Electrical Engineering at Stanford University. For over four decades he has done research, taught, and published in the areas of information theory and statistical signal processing. He is a Fellow of the IEEE and the Institute of Mathematical Statistics. He has won several professional awards, including a Guggenheim Fellowship, the Society Award and Education Award of the IEEE Signal Processing Society, the Claude E. Shannon Award from the IEEE Information Theory Society, the Jack S. Kilby Signal Processing Medal, Centennial Medal, and Third Millennium Medal from the IEEE, and a Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring (PAESMEM). He is a member of the National Academy of Engineering.




Further Information & Material


Entropy and Information Theory (p. 3)
Preface (p. 7)
Contents (p. 13)
Introduction (p. 17)
Chapter 1 Information Sources (p. 29)
  1.1 Probability Spaces and Random Variables (p. 29)
  1.2 Random Processes and Dynamical Systems (p. 33)
  1.3 Distributions (p. 35)
  1.4 Standard Alphabets (p. 40)
  1.5 Expectation (p. 41)
  1.6 Asymptotic Mean Stationarity (p. 44)
  1.7 Ergodic Properties (p. 45)
Chapter 2 Pair Processes: Channels, Codes, and Couplings (p. 48)
  2.1 Pair Processes (p. 48)
  2.2 Channels (p. 49)
  2.3 Stationarity Properties of Channels (p. 52)
  2.4 Extremes: Noiseless and Completely Random Channels (p. 56)
    Noiseless Channels (p. 56)
    Completely Random Channels (p. 56)
  2.5 Deterministic Channels and Sequence Coders (p. 57)
  2.6 Stationary and Sliding-Block Codes (p. 58)
    Finite-Length Sliding-Block Codes (p. 60)
    Sliding-Block Codes and Partitions (p. 62)
    B-Processes (p. 62)
  2.7 Block Codes (p. 64)
    Block Independent Processes (p. 64)
    Sliding-Block vs. Block Codes (p. 64)
  2.8 Random Punctuation Sequences (p. 65)
  2.9 Memoryless Channels (p. 69)
  2.10 Finite-Memory Channels (p. 69)
  2.11 Output Mixing Channels (p. 70)
  2.12 Block Independent Channels (p. 72)
  2.13 Conditionally Block Independent Channels (p. 73)
  2.14 Stationarizing Block Independent Channels (p. 73)
  2.15 Primitive Channels (p. 75)
  2.16 Additive Noise Channels (p. 76)
  2.17 Markov Channels (p. 76)
  2.18 Finite-State Channels and Codes (p. 77)
  2.19 Cascade Channels (p. 78)
  2.20 Communication Systems (p. 79)
  2.21 Couplings (p. 79)
  2.22 Block to Sliding-Block: The Rohlin-Kakutani Theorem (p. 80)
    Partitions (p. 82)
    Gadgets (p. 83)
    Strengthened Rohlin-Kakutani Theorem (p. 84)
Chapter 3 Entropy (p. 88)
  3.1 Entropy and Entropy Rate (p. 88)
  3.2 Divergence Inequality and Relative Entropy (p. 92)
  3.3 Basic Properties of Entropy (p. 96)
    Concavity of Entropy (p. 99)
    Convexity of Divergence (p. 101)
    Entropy and Binomial Sums (p. 101)
    Variational Description of Divergence (p. 103)
  3.4 Entropy Rate (p. 105)
  3.5 Relative Entropy Rate (p. 108)
  3.6 Conditional Entropy and Mutual Information (p. 109)
  3.7 Entropy Rate Revisited (p. 117)
  3.8 Markov Approximations (p. 118)
  3.9 Relative Entropy Densities (p. 120)
Chapter 4 The Entropy Ergodic Theorem (p. 123)
  4.1 History (p. 123)
  4.2 Stationary Ergodic Sources (p. 126)
  4.3 Stationary Nonergodic Sources (p. 132)
  4.4 AMS Sources (p. 136)
  4.5 The Asymptotic Equipartition Property (p. 140)
Chapter 5 Distortion and Approximation (p. 142)
  5.1 Distortion Measures (p. 142)
  5.2 Fidelity Criteria (p. 145)
  5.3 Average Limiting Distortion (p. 146)
  5.4 Communications Systems Performance (p. 148)
  5.5 Optimal Performance (p. 149)
  5.6 Code Approximation (p. 149)
  5.7 Approximating Random Vectors and Processes (p. 154)
  5.8 The Monge/Kantorovich/Vasershtein Distance (p. 157)
  5.9 Variation and Distribution Distance (p. 157)
  5.10 Coupling Discrete Spaces with the Hamming Distance (p. 159)
  5.11 Process Distance and Approximation (p. 160)
    The dp-distance (p. 162)
    Evaluating Process Distortion (p. 166)
  5.12 Source Approximation and Codes (p. 166)
  5.13 d-bar Continuous Channels (p. 167)
Chapter 6 Distortion and Entropy (p. 172)
  6.1 The Fano Inequality (p. 172)
  6.2 Code Approximation and Entropy Rate (p. 175)
    Dynamical Systems and Random Processes (p. 176)
  6.3 Pinsker’s and Marton’s Inequalities (p. 177)
  6.4 Entropy and Isomorphism (p. 181)
    Isomorphic Measurable Spaces (p. 182)
    Isomorphic Probability Spaces (p. 182)
    Isomorphism Mod 0 (p. 183)
    Isomorphic Dynamical Systems (p. 183)
    Isomorphic Random Processes (p. 183)
  6.5 Almost Lossless Source Coding (p. 185)
    Almost-Lossless Block Codes (p. 186)
    Asynchronous Block Code (p. 188)
    Sliding-Block Code (p. 190)
  6.6 Asymptotically Optimal Almost Lossless Codes (p. 193)
  6.7 Modeling and Simulation (p. 194)
Chapter 7 Relative Entropy (p. 197)
  7.1 Divergence (p. 197)
    Variational Description of Divergence (p. 211)
  7.2 Conditional Relative Entropy (p. 213)
    Generalized Conditional Relative Entropy (p. 224)
  7.3 Limiting Entropy Densities (p. 226)
  7.4 Information for General Alphabets (p. 228)
  7.5 Convergence Results (p. 240)
Chapter 8 Information Rates (p. 243)
  8.1 Information Rates for Finite Alphabets (p. 243)
  8.2 Information Rates for General Alphabets (p. 245)
  8.3 A Mean Ergodic Theorem for Densities (p. 249)
  8.4 Information Rates of Stationary Processes (p. 251)
  8.5 The Data Processing Theorem (p. 258)
  8.6 Memoryless Channels and Sources (p. 259)
Chapter 9 Distortion and Information (p. 261)
  9.1 The Shannon Distortion-Rate Function (p. 261)
  9.2 Basic Properties (p. 263)
    IID Sources (p. 265)
  9.3 Process Definitions of the Distortion-Rate Function (p. 266)
  9.4 The Distortion-Rate Function as a Lower Bound (p. 274)
  9.5 Evaluating the Rate-Distortion Function (p. 276)
    Support of Shannon Optimal Distributions (p. 286)
Chapter 10 Relative Entropy Rates (p. 288)
  10.1 Relative Entropy Densities and Rates (p. 288)
  10.2 Markov Dominating Measures (p. 291)
  10.3 Stationary Processes (p. 295)
  10.4 Mean Ergodic Theorems (p. 298)
    Finite Alphabets (p. 298)
    Standard Alphabets (p. 301)
Chapter 11 Ergodic Theorems for Densities (p. 303)
  11.1 Stationary Ergodic Sources (p. 303)
  11.2 Stationary Nonergodic Sources (p. 308)
  11.3 AMS Sources (p. 312)
  11.4 Ergodic Theorems for Information Densities (p. 315)
Chapter 12 Source Coding Theorems (p. 317)
  12.1 Source Coding and Channel Coding (p. 317)
  12.2 Block Source Codes for AMS Sources (p. 318)
    Reference Letters (p. 321)
    Performance and Distortion-Rate Functions (p. 322)
  12.3 Block Source Code Mismatch (p. 329)
  12.4 Block Coding Stationary Sources (p. 332)
  12.5 Block Coding AMS Ergodic Sources (p. 334)
  12.6 Subadditive Fidelity Criteria (p. 341)
  12.7 Asynchronous Block Codes (p. 343)
  12.8 Sliding-Block Source Codes (p. 345)
  12.9 A Geometric Interpretation (p. 355)
Chapter 13 Properties of Good Source Codes (p. 357)
  13.1 Optimal and Asymptotically Optimal Codes (p. 357)
  13.2 Block Codes (p. 359)
    Moment Properties (p. 364)
  13.3 Sliding-Block Codes (p. 365)
    Asymptotically Optimal Sliding-Block Codes (p. 372)
    Process Approximation (p. 372)
    Moment Conditions (p. 374)
    Finite-Order Distribution Shannon Conditions for IID Processes (p. 376)
    Asymptotic Uncorrelation (p. 378)
Chapter 14 Coding for Noisy Channels (p. 380)
  14.1 Noisy Channels (p. 380)
  14.2 Feinstein’s Lemma (p. 382)
  14.3 Feinstein’s Theorem (p. 385)
  14.4 Channel Capacity (p. 388)
  14.5 Robust Block Codes (p. 393)
  14.6 Block Coding Theorems for Noisy Channels (p. 396)
  14.7 Joint Source and Channel Block Codes (p. 398)
  14.8 Synchronizing Block Channel Codes (p. 401)
  14.9 Sliding-Block Source and Channel Coding (p. 405)
    Totally Ergodic Sources (p. 405)
    Ergodic Sources (p. 412)
References (p. 416)
Index (p. 425)


