Introduction to the Theory of Information
‘Do not worry about your difficulties in Mathematics. I can assure you mine are still greater.’ — Albert Einstein
Information processing is the most important and the most energy-consuming human activity. Our brains contain approximately 3 × 10⁹ neurons, and each of them makes approximately 10⁴ connections with other neurons. This impressive network is dense: each cubic millimetre of neural tissue contains up to 10⁹ synaptic junctions. While the brain constitutes only 2% of body mass, it consumes 20% of the body's energy demand at rest. We must really need our personal ‘CPUs’, since Mother Nature invests so much energy in nervous systems. The importance of information storage, transfer and processing has been greatly appreciated through the ages. Nowadays digital electronic technologies have revolutionized all aspects of information processing.
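As a back-of-the-envelope check, the figures quoted above can be combined to estimate the total number of synapses and the implied volume of neural tissue. This is a rough sketch using the stated approximations, not measured data:

```python
# Rough estimate based on the figures quoted above (approximations, not data).
neurons = 3e9                  # approximate number of neurons in the brain
connections_per_neuron = 1e4   # approximate connections per neuron

# Total synaptic connections: ~3 x 10^13
total_synapses = neurons * connections_per_neuron
print(f"Estimated synapses: {total_synapses:.0e}")

# At up to 10^9 synaptic junctions per cubic millimetre, this network
# would occupy at least this much neural tissue:
volume_mm3 = total_synapses / 1e9
print(f"Implied minimum tissue volume: {volume_mm3:.0f} mm^3")
```

The implied minimum volume (about 3 × 10⁴ mm³, i.e. 30 cm³) is comfortably below the actual brain volume, which is consistent with the density figure being an upper bound.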
It is impossible to draw a direct equivalence between brains and computers, but the two systems show some functional and structural analogies: their building blocks are relatively simple and operate according to well-defined rules; the complex functions they can perform are a result of structural complexity (i.e. they are an emergent feature of the system); and communication between structural elements is digital. This is quite obvious for electronic computers, but spikes of action potential can also be regarded as digital signals, as it is not the amplitude of the ...