Over the holiday break, I re-read Andy Hunt’s Pragmatic Thinking and Learning on my phone. I had started it mainly to force myself to re-evaluate the iBooks reading experience, but quickly became immersed (again). The book offers an informed but opinionated introduction to brain architecture, learning theory, and neuroscience. The compelling central theme is that knowledge workers, programmers in particular, are hopelessly bad at using their minds.
As a parent of two young children, hopelessness is something I’m acutely familiar with. From the outset, a decidedly verbal father like me is forced to communicate with his kid using a range of awkward, non-verbal techniques. It sucks. Even as they’ve grown older and grasped language, I’ve continued to be astonished by how ineffective words are at teaching key behaviors. “Go back to sleep” is easy for an expert adult to say, but nearly meaningless to a kid. Similarly, kids need a different tool than language to distinguish between their “whiny voice” and a tolerable one.
Words fail us
Hunt argues that this blindness about the limitations of language is a particular weakness of programmers, who are tremendously attached to representing everything verbally. He recounts a story from the Inner Game of Tennis about teaching an older neophyte:
The next exercise was to listen to the sound of the ball hitting the racket. If you’ve never played, the ball makes a particularly sweet, clear sound when it hits just the right spot on the racket. This fact wasn’t made explicit; our student was merely told to listen.
Next, it was time to serve. First, she was to just hum a phrase while watching Gallwey serve in order to get the rhythm of the motion. No description of the movements; just watch and hum. Next, she tried the serve—humming the same tune and focusing on the rhythm, not the motions. After twenty minutes of this sort of thing, it was time to play. She made the first point of the game and played a very respectable, lengthy set of volleys.
It is easy to dismiss this focus on sound and movement as irrelevant to programming, but that’s the trap. What non-verbal tools do we ignore as we collaborate and teach?
Programmer pedagogy is terrible in general, so it’s an obvious place to start. What would it look like if we showed a learner, in the most literal sense, how to write good tests, how to hunt down a bug, or how to plan a requirements doc? This is part of the draw of pair programming, but we rarely reference or emphasize any of its non-verbal elements.
Pictures in particular
As a programmer who never ever uses UML, I am often surprised by how attached and excited my non-programmer colleagues get about a diagram of a software system or process. There’s a whole lot packed into one of these sketches, even without the words:
What would change about internal communication if we spent as much time drawing an icon to represent a new project as we spend arguing about its name?
Many HTTP APIs are too “chatty.” How would our API design change if we drew a picture of the desired interplay without ever writing a line of code?
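To make “chatty” concrete, here is a minimal sketch (all endpoint names and the client are invented for illustration) contrasting a page that assembles its data through several round trips with one that asks the server for the composed picture in a single request:

```python
class CountingClient:
    """Toy stand-in for an HTTP client that records round trips."""

    def __init__(self):
        self.requests = 0

    def get(self, path):
        self.requests += 1
        # Canned data shaped loosely like a real response.
        return {"author_id": 1, "path": path}


def chatty_fetch(client, post_id):
    # One round trip per piece of data the page needs: three in total.
    post = client.get(f"/posts/{post_id}")
    author = client.get(f"/users/{post['author_id']}")
    comments = client.get(f"/posts/{post_id}/comments")
    return {"post": post, "author": author, "comments": comments}


def composed_fetch(client, post_id):
    # One round trip: the server assembles the related data for us.
    return client.get(f"/posts/{post_id}?include=author,comments")
```

Sketching the desired interplay as a sequence diagram before writing any code makes the difference between three arrows and one arrow immediately visible, in a way that a paragraph of prose rarely does.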
There has to be more opportunity for non-verbal thinking than just images. Is there an opportunity in expressing the auditory layer of programming? The movements? It all sounds terribly New Age, but then I remember trying to talk to a six-week-old kid.