Book, English, 326 pages, dimensions (W × H): 183 mm × 260 mm, weight: 1920 g
Series: Undergraduate Texts in Mathematics
ISBN: 978-0-387-94704-4
Publisher: Springer
This book introduces coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory on finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters treat information theory: the efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters cover coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and specific codes such as the Hamming codes, the simplex codes, and many others.
Target audience
Lower undergraduate
Authors/Editors
Subject areas
- Interdisciplinary Sciences | Research and Information | Information Theory, Coding Theory
- Mathematics | Computer Science | Logic, Formal Languages, Automata
- Mathematics | Stochastics | Mathematical Statistics
- Mathematics | Stochastics | Probability Theory
- Mathematics | Computer Science | Mathematics for Computer Scientists
- Mathematics | Computer Science | Data / Databases | Information Theory, Coding Theory
Further Information & Material
Introduction:
Preliminaries; Miscellany; Some Probability; Matrices
I Information Theory
1. An Introduction to Codes
Strings and Things; What Are Codes?; Uniquely Decipherable Codes; Instantaneous Codes and Kraft's Theorem
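Not part of the book's material, but a minimal Python sketch of the inequality at the heart of Kraft's Theorem: an instantaneous (prefix-free) r-ary code with codeword lengths l_1, ..., l_n exists if and only if the Kraft sum is at most 1.

```python
def kraft_sum(lengths, r=2):
    """Return the Kraft sum sum(r**-l) for codeword lengths over an r-ary alphabet."""
    return sum(r ** -l for l in lengths)

# Lengths 1, 2, 3, 3 admit a binary prefix code (e.g. 0, 10, 110, 111):
print(kraft_sum([1, 2, 3, 3]))  # 1.0 -> inequality holds
# Lengths 1, 1, 2 do not: the sum exceeds 1.
print(kraft_sum([1, 1, 2]))     # 1.25 -> no prefix code exists
```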
2. Efficient Encoding
Information Sources; Average Codeword Length; Huffman Encoding; The Proof that Huffman Encoding is the Most Efficient
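As an illustration (not taken from the text), a compact heap-based Huffman construction: repeatedly merge the two least probable subtrees, prefixing their codewords with 0 and 1.

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code for {symbol: weight}; returns {symbol: codeword}."""
    # Heap entries are (weight, tiebreak, {symbol: partial codeword}); the integer
    # tiebreak keeps Python from ever comparing the dicts.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)  # two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# Source probabilities 0.4, 0.3, 0.2, 0.1 yield codeword lengths 1, 2, 3, 3,
# for an average length of 1.9 bits per symbol.
code = huffman_code({"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1})
print(code)
```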
3. Noiseless Coding
Entropy; Properties of Entropy; Extensions of an Information Source; The Noiseless Coding Theorem
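A quick sketch of the entropy computation this chapter builds on (the example distribution is my own, not the book's): the Noiseless Coding Theorem bounds the minimum average codeword length L of a binary instantaneous code by H <= L < H + 1.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(p) = -sum p_i * log2(p_i); terms with p_i = 0 contribute 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

# For the source (0.4, 0.3, 0.2, 0.1), H is about 1.846 bits, so any binary
# instantaneous code has average length at least 1.846 and one exists below 2.846
# (Huffman achieves 1.9 here).
print(entropy([0.4, 0.3, 0.2, 0.1]))
```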
II Coding Theory
4. The Main Coding Theory Problem
Communications Channels; Decision Rules; Nearest Neighbor Decoding;
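Nearest neighbor decoding can be sketched in a few lines (a hypothetical illustration, not the book's treatment): decode a received word to the codeword at minimum Hamming distance.

```python
def hamming_distance(u, v):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(u, v))

def nearest_neighbor_decode(received, codewords):
    """Decode to the codeword at minimum Hamming distance from the received word."""
    return min(codewords, key=lambda c: hamming_distance(received, c))

# Binary repetition code of length 3: any single bit error is corrected.
code = ["000", "111"]
print(nearest_neighbor_decode("010", code))  # 000
```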