Chapter 1
Introduction to the Theory of Information
‘Do not worry about your difficulties in Mathematics. I can assure you mine are still greater.’
Albert Einstein
1.1 Introduction
Information processing is the most important and the most energy-consuming human activity. Our brains contain approximately 3 × 10^9 neurons, and each of them forms approximately 10^4 connections with other neurons. This impressive network is dense: each cubic millimetre of neural tissue contains up to 10^9 synaptic junctions. While the brain constitutes only 2% of body mass, it consumes 20% of the body's energy demand at rest. The fact that Nature invests so much energy in nervous systems suggests how much we depend on our personal 'CPUs'. The importance of information storage, transfer and processing has been appreciated through the ages; today, digital electronic technologies have revolutionized all of these aspects of information processing.
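A quick back-of-the-envelope calculation, sketched below in Python, shows the scale implied by these approximate figures (the counts are the chapter's estimates, not measured values):

```python
# Rough estimate of the total number of synaptic connections in the brain,
# using the approximate figures quoted above.
neurons = 3e9              # approximate number of neurons
synapses_per_neuron = 1e4  # approximate connections per neuron

total_synapses = neurons * synapses_per_neuron
print(f"Estimated total synaptic connections: {total_synapses:.1e}")  # ~3.0e13
```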
There is no direct correspondence between brains and computers, but the two systems show some functional and structural analogies. Their building blocks are relatively simple and operate according to well-defined rules; the complex functions they perform are a result of their structural complexity (i.e. an emergent feature of the system); and communication between structural elements is digital. The last point is obvious for electronic computers, but spikes of action potential can also be regarded as digital signals: it is not the amplitude of the signal but the sequence of otherwise identical pulses that carries the information.
1.2 Definition and Properties of Information
We all intuitively use and understand the notion of information, but it defies precise definition. The concept of information has many meanings, depending on the context. It is usually associated with language, data, knowledge or perception, but in thermodynamics it is a notion closely related to entropy. Its technical definition is usually understood to be an ordered sequence of symbols. Information can also be regarded as any kind of sensory input for humans, animals, plants and artificial devices; it should carry a pattern that influences the interaction of the system with other sensory inputs or other patterns. This definition separates information from consciousness, as interaction with patterns (or pattern circulation) can take place in inanimate systems as well.
While the psychological definition of information is ambiguous, technological applications must be based on strict definitions and measures. Information can be regarded as a physical or structural feature of a system, understood as its degree of order. This (structural) form of information is usually regarded as a third component of the Universe, along with matter and energy: every object, phenomenon or process can be described in terms of matter (the type and number of particles), energy (physical motion) and information (structure). In other words, information may be another manifestation of a primary element. In the same way that the special theory of relativity expresses the equivalence of mass and energy [1],

E = mc^2    (1.1)

the equivalence of energy and information can be shown within information theory (vide infra).
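As a quick numerical illustration of equation (1.1) (a sketch, not from the source), even a small mass corresponds to an enormous amount of energy:

```python
# Energy equivalent of 1 kg of mass via E = mc^2.
c = 2.998e8   # speed of light, m/s
m = 1.0       # mass, kg

E = m * c ** 2
print(f"E = {E:.2e} J")  # ~8.99e16 J
```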
The most precise definition of information is given by the syntactic theory of Hartley and Shannon. According to this theory, information is a measure of the probability of a certain event: it is the amount of uncertainty removed upon the occurrence of an event or the transmission of data. The less probable the event, the higher its information value. According to Hartley, the amount of information I_i carried by an event x_i can be expressed as [2, 3]

I_i = -log_r p_i = log_r (1/p_i)    (1.2)
where p_i denotes the probability of the event x_i and r is the base of the logarithm. The amount of information expressed in this way is also a measure of the entropy associated with the event x_i. The average amount of information carried by an event from a set of events X is the weighted mean of the information carried by the individual events.
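The following minimal Python sketch (illustrative, not from the source) computes the information of individual events from equation (1.2) and their probability-weighted average:

```python
import math

def hartley_information(p: float, r: float = 2.0) -> float:
    """Information carried by an event of probability p, eq. (1.2): I = -log_r(p)."""
    return -math.log(p, r)

def average_information(probabilities: list[float], r: float = 2.0) -> float:
    """Probability-weighted mean of the information of each event."""
    return sum(p * hartley_information(p, r) for p in probabilities if p > 0)

# A fair coin: each outcome carries 1 bit, so the average is also 1 bit.
print(hartley_information(0.5))          # 1.0
print(average_information([0.5, 0.5]))   # 1.0

# A biased coin: the rare outcome carries more information (~3.32 bits),
# but the weighted average over both outcomes is lower (~0.47 bits).
print(hartley_information(0.1))
print(average_information([0.9, 0.1]))
```

Note how the biased coin illustrates the key property stated above: the less probable the event, the higher its information value, while the average reflects the overall uncertainty of the whole set of events.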