The Appification of Computing
It appears that the relentless drive toward simplicity in user interfaces has had the side effect of serving as a disincentive for students to bother learning more about how computers work.
By Ryan McGreal.
643 words. Approximately a 2 to 4 minute read.
Posted June 27, 2012 in Blog.
(Last Updated June 28, 2012)
Contents
1. Computer Illiteracy
2. Knowledge-Destroying Idea
The historian and philosopher Walter J. Ong drew a distinction between what he called "craft literacy" and "social literacy". The former is the kind of literacy you find in cultures that have systems of writing but do not have cheap, ubiquitous publishing - like Europe before Gutenberg's invention of movable type.
In a culture with craft literacy, most people can't read and write and those who can - scribes - do so as a vocation.
When a society moves toward full literacy, made possible by universal access to both reading and writing material, a number of huge social transformations occur - but one such change is that the mere ability to read and write stops being a rare skill, possessed only by professionals.
For some time I've believed that computer literacy - by which I mean the ability to read and write computer programs - is currently at the level of craft literacy in our society.
I've expressed the hope that we will eventually move toward social literacy in computing, in which most people routinely learn to read and write computer programs in the same way we currently learn to read and write text.
1 Computer Illiteracy ↑
So a recent article in the Globe and Mail gave me serious pause. Titled "Are we breeding a generation of app-living, web-addicted digital illiterates?", the article quotes Sang-Jin Bae, a computer animation technical director and instructor at New York University's Tisch School of the Arts:
"When kids come into my class they divide into three groups," he says. There are the pure geeks who love technology. There are those trying to understand. And then there is the biggest group: "Those who couldn't care less."
As remarkable as it is to consider, this hip, articulate 36-year-old computer whiz makes a heck of an argument that the computer age is entering a dark new era: the age of the digital illiterate.
Today's teens grew up on SMS and Facebook. Everything is being presented to them all the time. Web companies love it, since kids are addicted to their products. But, he says, "They expect less and less from the Web and the software they use."
Mr. Bae is not just talking about obscure, high-end animation tools. Instead, he sees an essential dumbing down of bedrock computing skills.
"The kids I have, and that is roughly two dozen of the brightest young digital artists a semester, often have no idea what Microsoft Word is. They can't tell a Mac from a PC. And forget Excel," he says. He struggles to get his students to use basic computing etiquette.
It appears that the relentless drive toward simplicity in user interfaces has had the side effect of serving as a disincentive for students to bother learning more about how computers work.
2 Knowledge-Destroying Idea ↑
Lately I've been reading Edward Glaeser's book Triumph of the City, a sprawling tour-de-force in support of the thesis that cities are our most important engines of creativity, innovation and growth and that we would do well to understand how cities work so that we can run our cities more effectively.
Writing about the spectacular decline of Detroit, formerly one of America's greatest cities, Glaeser argues that the seed of Detroit's failure was the invention of the assembly line:
By turning a human being into a cog in a vast industrial enterprise, [Henry] Ford made it possible to be highly productive without having to know all that much. But if people need to know less, they also have less need for cities that spread knowledge. When a city creates a powerful enough knowledge-destroying idea, it sets itself up for self-destruction.
I wonder if the 'app-ification' of computing is turning out to be the same thing: a powerful, knowledge-destroying idea that is actually crippling our collective ability to use computers as tools of creation rather than merely as vectors of consumption.
Update: interesting discussion on Hacker News.