Codes and Cryptography, by Dominic Welsh

Subject Matter

 * Communication Theory
 * Cryptography

Contents

 * Preface


 * Notation


 * 1. Entropy = Uncertainty = Information
   * 1.1 Uncertainty
   * 1.2 Entropy and its properties
   * 1.3 Conditional entropy
   * 1.4 Information
   * 1.5 Conclusion


 * 2. The noiseless coding theorem for memoryless sources
   * 2.1 Memoryless sources
   * 2.2 Instantaneous and uniquely decipherable codes
   * 2.3 The Kraft-McMillan inequalities
   * 2.4 The noiseless coding theorem for memoryless sources
   * 2.5 Constructing compact codes


 * 3. Communication through noisy channels
   * 3.1 The discrete memoryless channel
   * 3.2 Connecting the source to the channel
   * 3.3 Codes and decoding rules
   * 3.4 The capacity of a channel
   * 3.5 The noisy coding theorem
   * 3.6 Capacity is the bound to accurate communication


 * 4. Error-correcting codes
   * 4.1 The coding problem
   * 4.2 The sphere-packing and Gilbert-Varshamov bounds; perfect codes
   * 4.3 Linear codes
   * 4.4 Using linear codes
   * 4.5 Minimum-distance decoding for linear codes
   * 4.6 Binary Hamming codes
   * 4.7 Cyclic codes
   * 4.8 The Mariner code; Reed-Muller codes
   * 4.9 Conclusion


 * 5. General sources
   * 5.1 The entropy of a general source
   * 5.2 Stationary sources
   * 5.3 Typical messages of a memoryless source
   * 5.4 Typical messages of general sources -- ergodicity
   * 5.5 Markov sources
   * 5.6 The coding theorems for ergodic sources


 * 6. The structure of natural languages
   * 6.1 English as a mathematical source
   * 6.2 The entropy of English
   * 6.3 Zipf's law and word entropy
   * 6.4 The redundancy of a language


 * 7. Cryptosystems
   * 7.1 Basic principles
   * 7.2 Breaking a cryptosystem
   * 7.3 Equivocation and perfect secrecy
   * 7.4 Combining cryptosystems
   * 7.5 Unicity
   * 7.6 Hellman's extension of linear shift-register sequences
   * 7.7 Conclusion


 * 8. The one-time pad and linear shift-register sequences
   * 8.1 The one-time pad
   * 8.2 Linear shift-register sequences
   * 8.3 The insecurity of linear shift-register sequences
   * 8.4 Generating cyclic codes


 * 9. Computational complexity
   * 9.1 The intrinsic difficulty of a problem: examples
   * 9.2 P = Polynomial time
   * 9.3 NP = Nondeterministic Polynomial time
   * 9.4 NP-complete/hard problems
   * 9.5 Circuit complexity
   * 9.6 Randomized algorithms
   * 9.7 Effective versus intractable computations


 * 10. One-way functions
   * 10.1 Informal approach: the password problem
   * 10.2 Using NP-hard problems as cryptosystems
   * 10.3 The Data Encryption Standard (DES)
   * 10.4 The discrete logarithm
   * 10.5 Using the discrete logarithm to solve the key-distribution problem
   * 10.6 A cryptosystem with no keys
   * 10.7 On the difficulty of factoring and taking discrete logarithms


 * 11. Public key cryptosystems
   * 11.1 The idea of a trapdoor function
   * 11.2 The Rivest-Shamir-Adleman (RSA) system
   * 11.3 Knapsack-based systems
   * 11.4 A public-key system as intractable as factoring
   * 11.5 A public-key system based on the discrete logarithm
   * 11.6 Error-correcting codes as a public-key system


 * 12. Authentication and digital signatures
   * 12.1 Introduction
   * 12.2 Authentication in a communication system
   * 12.3 Signature schemes based on conventional cryptosystems
   * 12.4 Using public key networks to send signed messages
   * 12.5 Faster signatures but less privacy
   * 12.6 Attacks and cracks in trapdoor signature schemes


 * 13. Randomized encryption
   * 13.1 Introduction
   * 13.2 Semantic security and the Goldwasser-Micali scheme
   * 13.3 Cryptographically secure pseudo-random numbers
   * 13.4 Wyner's wiretap channel
   * 13.5 Effective entropy


 * Appendices
   * 1. Proof of the uniqueness theorem that $H = -\sum p_i \log p_i$
   * 2. Letter frequencies of English


 * Answers to exercises


 * Answers and hints to problems


 * References


 * Index

