Chapter 12. Digital Cognition and Agency

The electric things have their life, too.

—PHILIP K. DICK

Shannon’s Logic

For the realm of information technology, the word information has a specific history. Just as ecological psychologist James J. Gibson chose the word for his work in psychology, Claude Shannon (1916–2001) appropriated it for his own, separate purposes. An American mathematician, electronic engineer, and cryptographer—often called the “Father of Information Theory”—Shannon was the prime mover behind a way of understanding and using information that has led to the digital revolution we’re experiencing today.[242] His work during World War II, and later at Bell Labs and MIT, is foundational to anything that relies on packaging up information into bits (the word “bit” being a contraction of “binary digit”) and transmitting it over any distance.

One important part of Shannon’s work is how he applied mathematical logic to the problem of transmission, using an encoded (or encrypted) form. Previously, engineers had tried to improve electronic transmission by boosting the signal’s power. But that approach could help only up to a certain point, beyond which physics got in the way: pushing electrons through wires or air over a long enough distance eventually generates noise, corrupting the signal.

Shannon’s revolutionary discovery: accuracy is improved by encoding the information in a way that works best for machines, not for humans. This turn goes beyond the ...
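Shannon’s actual coding schemes are far more sophisticated than anything shown here, but the core principle—that redundant, machine-oriented encoding can beat a noisy channel where raw power cannot—can be illustrated with a toy repetition code. This sketch is my own illustration, not an example from the book: each bit is sent three times, and the receiver takes a majority vote, so a single flipped copy no longer corrupts the message.

```python
import random

def encode(bits, n=3):
    # Repetition code: transmit each bit n times.
    return [b for b in bits for _ in range(n)]

def noisy_channel(bits, flip_prob, rng):
    # Each transmitted bit flips independently with probability flip_prob.
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

def decode(received, n=3):
    # Majority vote over each group of n received copies.
    return [1 if sum(received[i:i + n]) > n // 2 else 0
            for i in range(0, len(received), n)]

rng = random.Random(42)
message = [rng.randint(0, 1) for _ in range(1000)]

sent = encode(message)
received = noisy_channel(sent, flip_prob=0.05, rng=rng)
recovered = decode(received)

# With a 5% flip rate, an unencoded message would average ~50 errors
# per 1,000 bits; majority voting drives that down to a handful.
errors = sum(m != r for m, r in zip(message, recovered))
```

Repetition is the crudest possible code—it triples the bandwidth used—and Shannon’s theory shows that far more efficient codes exist, but even this sketch captures the turn from amplifying signals to engineering their structure.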
