Book:Robert B. Ash/Information Theory

Contents

 * Preface


 * A Measure of Information
 * 1.1 Introduction
 * 1.2 Axioms for the Uncertainty Measure
 * 1.3 Three Interpretations of the Uncertainty Function
 * 1.4 Properties of the Uncertainty Function; Joint and Conditional Uncertainty
 * 1.5 The Measure of Information
 * 1.6 Notes and Remarks


 * Noiseless Coding
 * 2.1 Introduction
 * 2.2 The Problem of Unique Decipherability
 * 2.3 Necessary and Sufficient Conditions for the Existence of Instantaneous Codes
 * 2.4 Extension of the Condition $\ds \sum_{i \mathop = 1}^M D^{-n_i} \le 1$ to Uniquely Decipherable Codes
 * 2.5 The Noiseless Coding Theorem
 * 2.6 Construction of Optimal Codes
 * 2.7 Notes and Remarks


 * The Discrete Memoryless Channel
 * 3.1 Models for Communication Channels
 * 3.2 The Information Processed by a Channel; Channel Capacity; Classification of Channels
 * 3.3 Calculation of Channel Capacity
 * 3.4 Decoding Schemes; the Ideal Observer
 * 3.5 The Fundamental Theorem
 * 3.6 Exponential Error Bounds
 * 3.7 The Weak Converse to the Fundamental Theorem
 * 3.8 Notes and Remarks


 * Error Correcting Codes
 * 4.1 Introduction; Minimum Distance Principle
 * 4.2 Relation between Distance and Error Correcting Properties of Codes; the Hamming Bound
 * 4.3 Parity Check Coding
 * 4.4 The Application of Group Theory to Parity Check Coding
 * 4.5 Upper and Lower Bounds on the Error Correcting Ability of Parity Check Codes
 * 4.6 Parity Check Codes Are Adequate
 * 4.7 Precise Error Bounds for General Binary Codes
 * 4.8 The Strong Converse for the Binary Symmetric Channel
 * 4.9 Non-Binary Coding
 * 4.10 Notes and Remarks


 * Further Theory of Error Correcting Codes
 * 5.1 Feedback Shift Registers and Cyclic Codes
 * 5.2 General Properties of Binary Matrices and Their Cycle Sets
 * 5.3 Properties of Cyclic Codes
 * 5.4 Bose-Chaudhuri-Hocquenghem Codes
 * 5.5 Single Error Correcting Cyclic Codes; Automatic Decoding
 * 5.6 Notes and Remarks


 * Information Sources
 * 6.1 Introduction
 * 6.2 A Mathematical Model for an Information Source
 * 6.3 Introduction to the Theory of Finite Markov Chains
 * 6.4 Information Sources; Uncertainty of a Source
 * 6.5 Order of a Source; Approximation of a General Information Source by a Source of Finite Order
 * 6.6 The Asymptotic Equipartition Property
 * 6.7 Notes and Remarks


 * Channels with Memory
 * 7.1 Introduction
 * 7.2 The Finite-State Channel
 * 7.3 The Coding Theorem for Finite-State Regular Channels
 * 7.4 The Capacity of a General Discrete Channel; Comparison of the Weak and Strong Converses
 * 7.5 Notes and Remarks


 * Continuous Channels
 * 8.1 Introduction
 * 8.2 The Time-Discrete Gaussian Channel
 * 8.3 Uncertainty in the Continuous Case
 * 8.4 The Converse to the Coding Theorem for the Time-Discrete Gaussian Channel
 * 8.5 The Time-Continuous Gaussian Channel
 * 8.6 Band-Limited Channels
 * 8.7 Notes and Remarks


 * Appendix
 * 1. Compact and Symmetric Operators on $L_2 \sqbrk {a, b}$
 * 2. Integral Operators
 * 3. The Karhunen-Loève Theorem
 * 4. Further Results Concerning Integral Operators Determined by a Covariance Function


 * Tables of Values of $-\log_2 p$ and $-p \log_2 p$


 * Solutions to Problems


 * References


 * Index