E-book, English, 581 pages
Series: Engineering (R0)
Ahlswede / Althöfer / Deppe: Probabilistic Methods and Distributed Information
1st edition, 2018
ISBN: 978-3-030-00312-8
Publisher: Springer Nature Switzerland
Format: PDF
Copy protection: PDF watermark
Rudolf Ahlswede's Lectures on Information Theory 5
The fifth volume of Rudolf Ahlswede's lectures on Information Theory focuses on several problems that were at the heart of much of his research. One of the highlights of the entire lecture note series is surely Part I of this volume, on arbitrarily varying channels (AVC), a subject in which Ahlswede was probably the world's leading expert. Appended to Part I is a survey by Holger Boche and Ahmed Mansour on recent results concerning the AVC and the arbitrarily varying wiretap channel (AVWC). After a short Part II on continuous data compression, Part III, the longest part of the book, is devoted to distributed information. This part discusses a variety of related topics; among them, two are famously associated with Ahlswede: 'multiple descriptions', on which he produced some of the best research worldwide, and 'network coding', whose pioneering paper he co-authored. The final Part IV, on 'Statistical Inference Under Communication Constraints', is mainly based on Ahlswede's joint paper with Imre Csiszár, which received the Best Paper Award of the IEEE Information Theory Society.

The lectures presented in this work, which consists of 10 volumes, are suitable for graduate students in Mathematics, and also for those working in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics. The lectures can be used either as the basis for courses or to supplement them in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find questions which form the basis of entire research programs.
Rudolf Ahlswede (1938-2010) studied Mathematics in Göttingen and held postdoc positions in Erlangen, Germany, and Ohio, USA. From 1977 he was Full Professor of Applied Mathematics at the University of Bielefeld. His work represents an essential contribution to information theory and networking. He developed and contributed to a number of central areas, including network coding and the theory of identification, while also advancing the fields of combinatorics and number theory. These efforts culminated in his research program on the 'Development of a General Theory of Information Transfer'. In recognition of his work, he received several Best Paper awards as well as the distinguished Shannon Award.
Authors/Editors
Further Information & Material
Preface (7)
Words and Introduction of the Editors (9)
Contents (12)
Part I: Arbitrarily Varying Channels (18)
1 Preliminaries (19)
  1.1 Introduction (19)
  1.2 Basic Definitions (21)
  1.3 The Models of Arbitrarily Varying Channels (22)
  1.4 AVC and Zero-Error Capacity (24)
  1.5 Positivity of the Capacity (26)
  References (30)
2 Random Correlated Codes for the AVC and the Compound Channels (32)
  2.1 The Capacity of the Compound Channel (32)
  2.2 The Random Code Capacity (33)
  2.3 Channels Without a Strong Converse (36)
  References (37)
3 Elimination and Robustification Techniques (38)
  3.1 Elimination Technique and Dichotomy Formula (38)
  3.2 Robustification Techniques (40)
  3.3 A Capacity Formula of Arbitrarily Varying Multiple-Access Channels (42)
  3.4 Arbitrarily Varying Channels with State Sequence Known to the Sender (46)
  References (51)
4 Arbitrarily Varying Channels with Worst Channels (53)
  4.1 Arbitrarily Varying Channels with Binary Output Alphabet (53)
  4.2 A Channel with Additive Gaussian Noise of Arbitrarily Varying Variances (55)
  References (59)
5 Non-standard Decoders (60)
  5.1 Arbitrarily Varying Channels with the Criterion of Maximum Probability of Error (60)
  5.2 Positivity of Arbitrarily Varying Channels with Average Error Probability Criterion (67)
  5.3 The Smallest List Size of Codes for an Arbitrarily Varying Channel with Average Probability of Error Criterion (76)
  5.4 A Channel with Additive Gaussian Noise of Arbitrarily Varying Means (84)
  5.5 Interior of the Achievable Regions of Arbitrarily Varying Multiple-Access Channels with Average Probability of Error Criterion (95)
  References (103)
6 Feedback and Correlated Sources (105)
  6.1 Arbitrarily Varying Channels with Noiseless Feedback (105)
  6.2 Correlated Sources Help the Transmission Over Arbitrarily Varying Channels (126)
  6.3 Arbitrarily Varying Multiple-Access Channels with Correlated Sender's Side Information or Correlated Messages (129)
  References (142)
7 Arbitrarily Varying Source (143)
  7.1 Single-User Arbitrarily Varying Source (143)
  7.2 Multi-user Arbitrarily Varying Sources and Coloring Hypergraphs (145)
  References (157)
8 Applications and Related Problems (159)
  8.1 Coding for Channels with Localized Errors and Arbitrarily Varying Channels (159)
  8.2 Application to Writing-Type Memories and OV-Channels (177)
  References (185)
9 Appendix to Part I: The AVC and AVWC (187)
  9.1 Channel Models (187)
    9.1.1 Code Concepts (189)
    9.1.2 Capacity Results (195)
    9.1.3 Motivation (197)
  9.2 Basic Tools and Main Properties (198)
    9.2.1 Basic Tools (198)
    9.2.2 Analytical Properties of the Correlated Random Capacities (201)
    9.2.3 Discontinuity Behaviour Under List Decoding (202)
    9.2.4 Additivity and Super-Activation Under List Decoding (204)
  9.3 Further Applications and Open Problems (206)
    9.3.1 ε-Capacity (206)
    9.3.2 Secure Identification (207)
    9.3.3 Correlated Jamming (209)
  References (210)
Part II: Continuous Data Compression (212)
10 Ergodic Theory and Encoding of Individual Sequences (213)
  10.1 Introduction (213)
  10.2 Formal Statement of the Problem and Results (213)
  10.3 Proof of Theorem 10.1 (217)
  10.4 Proof of Theorem 10.2 (Converse Part) (226)
  References (229)
11 The Slepian-Wolf Theorem for Individual Sequences (231)
  11.1 Introduction (231)
  11.2 Formal Statement of the Problem and the Main Result (232)
  11.3 Auxiliary Results (235)
  11.4 Proof of the Direct Part (238)
  11.5 Proof of the Converse Part (242)
  References (244)
Part III: Distributed Information (246)
12 A Wringing Method: An Elementary Proof of the Strong Converse Theorem for Multiple-Access Channels (249)
  12.1 Introduction (249)
  12.2 The Strong Converse Theorem for the MAC (249)
  12.3 The Packing Lemma and a Bound on Codes for the MAC (251)
  12.4 Wringing Techniques (257)
  12.5 Proof of Theorem 12.1 (263)
  References (265)
13 Extremal Properties of Rate-Distortion Functions (266)
  13.1 Basic Concepts and Auxiliary Results (266)
  13.2 The Key Ideas and a Basic Inequality (269)
  13.3 Schur-Concavity in the Hamming Case (271)
  13.4 The Counterexample (275)
  13.5 A Consequence for Error Exponents (278)
  References (279)
14 Multiple Descriptions (280)
  14.1 Introduction (280)
  14.2 Definitions and Formulation of Basic Results (281)
  14.3 Preliminaries (283)
  14.4 Wringing Techniques (287)
  14.5 Proof of Theorem 14.1 (290)
  14.6 Witsenhausen's Hyperbola Conjecture for a Binary Source (293)
  14.7 A Zero-Distortion Problem (295)
  14.8 On Team Guessing (295)
  14.9 Proof of Theorem 14.2 (299)
  14.10 Proof of Theorem 14.3 (300)
  14.11 Continuity Properties (303)
  14.12 A Missing Step in Work on Breakdown Degradation (306)
  References (307)
15 Distributive Information Storage (308)
  15.1 Introduction (308)
  15.2 Single-User Distributed Storage (310)
  15.3 Multiple Users with Common Information (319)
  15.4 Storage of Independent Information (324)
  15.5 Floating Parity for Disk Arrays (329)
  15.6 The Use of One Bit of Memory to Store Information About a Bernoulli Sequence (333)
  References (339)
16 Network Coding (340)
  16.1 The Network Coding Homepage (340)
  16.2 Information Flow in Networks (342)
  16.3 A Missed Theory and Possible Implications … (355)
  References (363)
17 Random Network Coding (365)
  17.1 The Benefits of Coding Over Routing in a Randomized Setting (365)
  17.2 Another Look on the Subject: Practical Network Coding (374)
  17.3 Further Progress on Random Linear Network Coding (376)
  17.4 Error Correction Capability of Random Network Error Correction Codes (378)
  References (389)
18 On Perfect Codes and Related Concepts (390)
  18.1 Introduction (390)
  18.2 A Local Inequality (395)
  18.3 Examples of D-Diameter Perfect Codes in J(n,k) (399)
  18.4 Examples of D-Diameter Perfect Codes in H_q(n) (401)
  18.5 Tiling in J(n,k) with Caps (403)
  18.6 Open Problems (406)
  References (406)
19 On Error Control Codes for Random Network Coding (408)
  19.1 Introduction (408)
  19.2 Bounds on the Size of Codes (410)
  19.3 Insertion/Deletion Channel (413)
  19.4 Code Construction (415)
  References (416)
20 Classical Work: Edge-Disjoint Branchings, Min-Max Theorems, and Shortest Connection Networks (418)
  20.1 Edge-Disjoint Branchings (418)
  20.2 On Two Minimax Theorems in a Graph (421)
  20.3 On the Shortest Spanning Subtree of a Graph and the Traveling Salesman Problem (427)
  20.4 Shortest Connection Networks and Some Generalizations (429)
  References (437)
21 On the Advantage of Network Coding (439)
  21.1 Introduction (440)
  21.2 Problems in the Multicast Networks (443)
  21.3 Network Switching for Multisource Multicast Networks with Links … (448)
  21.4 Multicast Network Switching Formulated as Matrix Game (455)
  21.5 Computation of Maximum Achievable Information Rate for Single-Source Multicast Network Switching (470)
  21.6 Computation of Achievable Information Rate Region for Multisource Multicast Network Switching (482)
  21.7 Maximum Information Flow with Network Switching Versus Maximum Information Flow with Network Coding (494)
  21.8 Achievable Information Rate Regions for a Class of Multisource Multicast Networks with Network Switching and Network Coding (504)
  21.9 Conclusion (508)
  References (509)
Part IV: Statistical Inference Under Communication Constraints (511)
22 Hypothesis Testing Under Communication Constraints (512)
  22.1 Introduction (512)
  22.2 Statement and Discussion of Results (514)
  22.3 Lower Bound to θ(R) (520)
  22.4 Independence of the Exponent of ε (527)
    22.4.1 Blowing Up Lemma (527)
  22.5 Identification in a Large Population (532)
  References (534)
23 Estimation Under Communication Constraints (536)
  23.1 Introduction (536)
  23.2 A Model for Parameter Estimation in the Presence of Side Information (538)
  23.3 On Fisher Information, Mutual Information, and the J Function (539)
  23.4 The Informational Inequality (542)
  23.5 Encoding the Side Information (547)
  23.6 Regularity Conditions for Achievability of the Informational Bound (554)
  23.7 Asymptotic Achievability of the Informational Bound in Case of a Finite X (560)
  23.8 Does J Single-Letterize in the Symmetric Bernoulli Case? (563)
  References (565)
Supplement (567)
Author Index (574)
Subject Index (577)




