8.1. The AI Spring?

The AI winter was not just a write-off for the venture capitalists who had drunk too deeply of the Kool-Aid; it spawned many genuine innovations, which came from asking, in a scientific way, what had gone wrong. The symbolic view of AI, built on predicate calculus and logic programming, had its limitations. Learning to solve problems in the really messy, noisy, dynamic world was a different matter from proving theorems and playing chess.

Scientists looking for successful models of learning and adaptive behavior do not have to look far. Birds do it, bees do it, even monkeys in the trees do it. And they all do it using wetware that we understand well enough to draw crucial lessons for the next generation of AI paradigms. The first lesson is massive parallelism: computation is going on all over the place, not in one instruction stream. Brains do not have accumulators.
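To make the contrast concrete, here is a minimal sketch in Python (the toy layer of simulated neurons and the function names are my illustration, not anyone's production code) of the difference between one instruction stream grinding a sum through an accumulator and many simple units all computing at once:

    import numpy as np

    # The von Neumann way: one instruction stream, one accumulator,
    # one neuron finished before the next one starts.
    def serial_update(weights, inputs):
        outputs = []
        for row in weights:                  # one neuron at a time
            acc = 0.0                        # the accumulator
            for w, x in zip(row, inputs):
                acc += w * x                 # one multiply-add per step
            outputs.append(max(acc, 0.0))    # simple threshold nonlinearity
        return np.array(outputs)

    # The brain-style view: every unit computes simultaneously.
    # (NumPy expresses the parallel semantics; whether the arithmetic
    # actually runs in parallel depends on the hardware underneath.)
    def parallel_update(weights, inputs):
        return np.maximum(weights @ inputs, 0.0)

    rng = np.random.default_rng(0)
    W = rng.normal(size=(256, 256))          # 256 "neurons", 256 inputs each
    x = rng.normal(size=256)
    assert np.allclose(serial_update(W, x), parallel_update(W, x))

On genuinely parallel hardware, each unit's multiply-adds would land on its own processing element; on an ordinary machine, NumPy merely batches them, which foreshadows the simulation point below.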

AI went parallel. Thinking Machines, founded by computational superstar Danny Hillis (a protégé of Marvin Minsky, the pope of symbolic AI), gathered some of the leading lights to build and program massively parallel machines with up to 64K (65,536) processors. That is a lot more than one, but still a lot less than the roughly 100 billion neurons in the brain.

You don't need a machine with a billion processors to try out solutions that would use them. A simulator will do fine, if not as fast. For theory buffs, this is an example of universal computation: a Turing machine, or anything equivalent to one, can emulate any other computer you want. The Nintendo ...
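In that spirit, here is a sketch of the simulator idea (a toy of my own, assuming nothing about Thinking Machines' actual software): 64K lockstep "processors," each holding one cell of an elementary cellular automaton, stepped synchronously by a single ordinary loop.

    # A toy SIMD machine on one instruction stream: every "processor"
    # holds one bit and applies the same rule (elementary cellular
    # automaton rule 110) to itself and its two neighbors.
    N_PROCESSORS = 65_536            # the Connection Machine's full complement

    def step(cells):
        """One synchronous update of all simulated processors."""
        nxt = [0] * len(cells)
        for i in range(len(cells)):  # a serial stand-in for parallel hardware
            left = cells[i - 1]                      # wraps at the edges
            right = cells[(i + 1) % len(cells)]
            pattern = (left << 2) | (cells[i] << 1) | right
            nxt[i] = (110 >> pattern) & 1            # rule 110 lookup table
        return nxt

    cells = [0] * N_PROCESSORS
    cells[N_PROCESSORS // 2] = 1     # seed a single live cell
    for _ in range(100):
        cells = step(cells)
    print(sum(cells), "live cells after 100 steps")

Rule 110 was chosen because it is itself known to be Turing complete, so the toy makes the universality point twice over: a plain serial loop emulates 64K processors running in lockstep, and the rule those processors run could, in principle, emulate anything else. It is also, as promised, not fast.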
