
Elements of Information Theory
About this book
The latest edition of this classic is updated with new problem sets and material.
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points. The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
Table of contents
- Cover
- Half Title page
- Title page
- Copyright page
- Preface to the Second Edition
- Preface to the First Edition
- Acknowledgments for the Second Edition
- Acknowledgments for the First Edition
- Chapter 1: Introduction and Preview
- Chapter 2: Entropy, Relative Entropy, and Mutual Information
- Chapter 3: Asymptotic Equipartition Property
- Chapter 4: Entropy Rates of a Stochastic Process
- Chapter 5: Data Compression
- Chapter 6: Gambling and Data Compression
- Chapter 7: Channel Capacity
- Chapter 8: Differential Entropy
- Chapter 9: Gaussian Channel
- Chapter 10: Rate Distortion Theory
- Chapter 11: Information Theory and Statistics
- Chapter 12: Maximum Entropy
- Chapter 13: Universal Source Coding
- Chapter 14: Kolmogorov Complexity
- Chapter 15: Network Information Theory
- Chapter 16: Information Theory and Portfolio Theory
- Chapter 17: Inequalities in Information Theory
- Bibliography
- List of Symbols
- Index