Computer Literacy: What It Was and Eventually Became

By Cait Etherington March 12, 2018


In 1978, a new concept entered the English language: computer literacy. Early on, some people assumed that computer literacy meant the ability to program a computer or even build one, while others assumed it referred simply to the ability to use a computer. Despite the ubiquity of the concept, computer literacy has never had a fixed meaning. Forty years after its arrival, there are also compelling reasons to wonder whether it still matters at all.

The History of Computer Literacy Since 1978

A search of the New York Times archive reveals that the term “computer literacy” began to seep into the vernacular in the late 1970s. The first reference to computer literacy occurred in the New York Times in early January 1979 in an article headlined, “Johnny’s New Learning Challenge: Computer Literacy.” As the article’s author correctly predicted, “If ‘Why can’t Johnny read and write?’ has been the education story of the 1970’s, then the story of the 1980’s might well be ‘Why can’t Johnny use a computer?'” However, at the time, there appeared to be no common understanding of what computer literacy actually meant.

The term “computer literacy” was coined by Andrew R. Molnar in 1978, just a few months before the term first appeared in a New York Times headline. At the time, Molnar was working at the National Science Foundation’s Office of Computing Activities (OCA), where he played a critical role in helping the office establish a strategic national vision on technological change. In a 1991 interview with William Aspray at the Center for the History of Electrical Engineering, Molnar reflected on his time at the OCA and how the concept of “computer literacy” was born.

“In 1978, I wrote a paper entitled, ‘The Next Great Crisis in American Education: Computer Literacy’,” explains Molnar. “It was reprinted in about a dozen publications and caused a minor stir. It is sort of ironic, because literacy programs were in vogue, but no one knew exactly what computer literacy was or should be. Several writers tried to define it, but there was no broad consensus about what all students should know about computers. We liked the term because it was broad enough that we could get all of the various introductory programs together under one roof.”

While computer literacy would remain a vague term, it nevertheless saw widespread uptake from the 1980s into the early 2000s. A search on WorldCat reveals that since the late 1970s, over 14,000 book titles and over 30,000 journal articles on the subject of computer literacy have been published. In addition, 121 journals and magazines have adopted the term as a title over the past forty years.

A 1985 article published in the Journal of Higher Education defined computer literacy as “that compendium of knowledge and skill which ordinary, educated people need to have about computers in order to function effectively at work and in their private lives in American society for the remainder of this century.” The article also emphasized several key skills that are part of this new literacy, including word processing, the ability to use spreadsheet programs, and the ability to use a computer to retrieve and share information.

While it is now difficult to imagine simple computing tasks, such as word processing or email, as types of literacy, in the mid-1980s both were considered integral aspects of computer know-how. To appreciate why sending an email, for example, was once considered a key part of computer literacy, one need only watch this 1984 clip from the BBC program, Database. In this episode, the program broadcast a live demonstration of a British couple connecting a home computer to the Internet over a rotary-dial telephone in order to send what was described at the time as an “E mail.”

Does Computer Literacy Still Matter?

Between 1978 and 1988, the New York Times published 155 articles on the topic of computer literacy. By contrast, between 2008 and 2018, only 36 articles were published on the topic. This is likely due in part to the fact that in a culture where even two-year-olds are frequently found using phones, tablets, and computers, the idea that simply using a computer constitutes a form of literacy has lost much of its force. Another factor is no doubt connected to a much-needed shift in terminology. By the end of the first decade of the new millennium, “media literacy,” “information literacy,” and most notably, “digital literacy” had also gained currency. So, what has computer literacy become, and does it still matter?

While the concept of computer literacy is still deployed, especially in relation to vocational training, in general the term has been replaced by the concept of “digital literacy.” The terminology shift is by no means insignificant. If this new literacy was once attached to a specific machine (the computer), it is now attached to a broad range of devices and platforms. On this basis, one may also conclude that computer literacy, which was never easy to define, was arguably a misleading concept from the start. Just as we wouldn’t talk about “pencil literacy” or “pen literacy,” to speak of “computer literacy” is to place too much emphasis on a single technological tool. The shift from computer to digital literacy is not about the decline of computer literacy, but rather a recognition that to be literate in a digital age doesn’t mean knowing how to use a single device. Instead, it means knowing how to engage with a network of devices and platforms, and to do so in myriad roles. This, of course, is something that likely would have been difficult if not impossible for Molnar to predict back in 1978 when he coined the term “computer literacy.” After all, at the time, there was little indication that by 2018, most computing would happen on phones rather than desktop computers.
