Learn to code. And cook, perform open-heart surgery, write kanji, master thermodynamics, blog…
For such an interesting and useful debate, what silly arguments are being advanced. I think the coders need to lift their heads from their screens and spend a year learning the humanities.
Clay Johnson (@cjoh) believes that coding is like literacy: if you don’t learn it, you’re shut out of the world. Matt Galligan (@mg) suggests it is like cooking: it doesn’t matter that you can’t compete against Jamie Oliver with a whisk, simply being knowledgeable is important. (I’ll link to their tweets shortly.)
Both are deeply smart. Yet I beg to differ on two grounds.
First, we live in a resource-constrained world, and one cannot pursue everything one would like. I’ve never read Plutarch despite knowing I’d be a better person for it.
Second, there is value in concentrating one’s efforts where the payoff is biggest; the idea of comparative advantage. An old economics textbook — was it Samuelson? — used the example of why President Roosevelt ought not type his own letters even if he were a faster typist than his secretary.
I have no principled objection to learning to code, either rudimentarily or more seriously, if that is what one wishes. But insisting that it is somehow essential to learn is ridiculous.
Surely the same arguments could be made for other things that affect us every day, such as food (learn to farm!) or health (learn to sequence genomes!). We drive cars: must we learn how they work? We surf the internet: must we tinker with IPv6 header fields and the protocol stack? Where does it end?
Benjamin Franklin detested that schools in his day taught Latin and Greek as standard fare — far better to learn living languages and actually talk to people, he recommended. I’m not as narrowly practical as that. (After all, Latin let scholars across Europe communicate, as the term “Latin Quarter” in Paris suggests. And it opens the mind to a world of great works.) Yet there is a lot of sense in the idea that we should be discriminating with our time.
Yes, yes. I understand that as more facets of life are dominated by computers, with algorithms making decisions that were once made by people, it is essential that the public have a basic understanding of how software works, so that they can appreciate its limitations — and can act on that knowledge as citizens, voters, consumers, parents, etc. I get it. (I’m even writing a book on big data that deals with this.)
Still, the principles of software, though useful to know, hold no sacrosanct importance that should let them jump the queue of priorities; coding can hardly claim some categorical imperative that elevates it to something every honorable person must know. Rather, it is like most other things in life: nice to know if you can, but one can avail oneself of the marketplace to bring in the skills when they’re needed. Typists never needed to know the innards of a mechanical typewriter. Newspaper subscribers don’t need to learn about presses, or the stylebook, or HTML5.
What may be most surprising is that people are surprised. Of course coders urge everyone to code. Priests want parishioners to pray. Boy Scouts want us to camp. Generals ask that we be prepared to defend. When you’re a hammer, everything looks like a nail. Frogs see the universe as a pond. Lawyers want us to learn the code too, just not the kind software engineers mean.
In the Middle Ages, music was one of the seven liberal arts that every educated man was expected to learn. Should today’s computer scientists think less of themselves if they cannot sight-read a staff?