Information Theory Tools for Visualization
eBook - ePub

  1. 194 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android
About this book

This book explores information theory (IT) tools, which have become state of the art for solving and better understanding many problems in visualization. It covers all of the relevant literature to date and is the first book devoted solely to this subject, written by leading experts in the field.

By Min Chen, Miquel Feixas, Ivan Viola, Anton Bardera, Han-Wei Shen, and Mateu Sbert. Available in PDF and ePUB format. Subject: Computer Science & Computer Graphics.
CHAPTER 1
Basic Concepts of Information Theory
CONTENTS
1.1 Entropy
1.2 Relative Entropy and Mutual Information
1.3 Information Specific to a Particular Symbol
1.4 Entropy Rate
1.5 Jensen–Shannon Divergence
1.6 Information Bottleneck Method
1.7 Summary
Information theory was founded in 1948 with Shannon’s paper ā€œA Mathematical Theory of Communicationā€ [107]. In this paper, Shannon defined entropy and mutual information (initially called the rate of transmission), and introduced the fundamental laws of data compression and transmission. In information theory, information is simply the outcome of a selection among a finite number of possibilities, and an information source is modeled as a random variable or a random process. While Shannon entropy expresses the uncertainty or the information content of a single random variable, mutual information quantifies the dependence between two random variables and plays an important role in the analysis of a communication channel, a system in which the output depends probabilistically on its input [33, 134, 148]. Since its birth, information theory has interacted with many different fields, such as statistical inference, computer science, mathematics, physics, chemistry, economics, and biology.
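To make the channel picture concrete, the following is a minimal Python sketch (not code from the book; the function name and the particular joint distribution are assumptions for illustration). It computes the standard mutual information I(X;Y) = āˆ‘_{x,y} p(x,y) log (p(x,y) / (p(x)p(y))) for a noisy binary channel whose output flips the input with probability 0.1:

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution p(x, y) given as a
    nested list joint[x][y]; marginals are obtained by summation."""
    px = [sum(row) for row in joint]            # p(x): sum over y
    py = [sum(col) for col in zip(*joint)]      # p(y): sum over x
    return sum(
        pxy * log2(pxy / (px[x] * py[y]))
        for x, row in enumerate(joint)
        for y, pxy in enumerate(row)
        if pxy > 0                              # 0 log 0 = 0 convention
    )

# Binary symmetric channel: uniform input X, output Y flips X with
# probability 0.1, so the output depends probabilistically on the input.
joint = [[0.45, 0.05],
         [0.05, 0.45]]
print(mutual_information(joint))   # ~0.531 bits
```

A perfectly reliable channel (flip probability 0) would give I(X;Y) = 1 bit here, and a completely random one (flip probability 0.5) would give 0 bits; the noise reduces the dependence between input and output.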
This chapter presents Shannon’s information measures (entropy, conditional entropy, and mutual information) for discrete and continuous random variables, together with Kullback–Leibler distance, entropy rate, some relevant inequalities, and the information bottleneck method. Two main references on information theory are the books by Cover and Thomas [33] and Yeung [148].
1.1 ENTROPY
Let X be a discrete random variable with alphabet X and probability distribution {p(x)}, where p(x) = Pr{X = x} and x ∈ X. The probability distribution {p(x)} will also be denoted by p(X) or simply p. For instance, a discrete random variable can be used to describe the toss of a fair coin, with alphabet X = {head, tail} and probability distribution p(X) = {1/2, 1/2}. The entropy H(X) of a discrete random variable X is defined as H(X) = āˆ’āˆ‘_{x ∈ X} p(x) log p(x), where the logarithm is taken in base 2 (so entropy is expressed in bits) and the convention 0 log 0 = 0 is used.
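As a quick check of the fair-coin example, here is a minimal Python sketch (the function name and the representation of {p(x)} as a list are illustrative assumptions, not code from the book):

```python
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution given as a list
    of probabilities {p(x)}; terms with p(x) = 0 contribute nothing,
    matching the 0 log 0 = 0 convention."""
    return -sum(px * log2(px) for px in p if px > 0)

# Fair coin: X = {head, tail}, p(X) = {1/2, 1/2}  ->  H(X) = 1 bit
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is less uncertain, so its entropy is lower
print(entropy([0.9, 0.1]))   # ~0.469
```

The fair coin attains the maximum entropy for a two-symbol alphabet (1 bit), since the uniform distribution maximizes uncertainty.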

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright Page
  5. Table of Contents
  6. Foreword
  7. Preface
  8. CHAPTER 1 ā–  Basic Concepts of Information Theory
  9. CHAPTER 2 ā–  Visualization and Information Theory
  10. CHAPTER 3 ā–  Viewpoint Metrics and Applications
  11. CHAPTER 4 ā–  Volume Visualization
  12. CHAPTER 5 ā–  Flow Visualization
  13. CHAPTER 6 ā–  Information Visualization
  14. Bibliography
  15. Index