In its earliest forms, literacy was purely the process of interpreting symbols or hieroglyphics. Many ancient writings have been recovered and interpreted, showing that literacy has, in some form, existed since antiquity. As time passed and new forms of language and literature surfaced, literacy became known as the ability to read and write,1 or, more specifically, one's ability to convey one's thoughts through a medium understood by others. At first these mediums took nearly any form: mankind has used everything from rocks to canvas. Eventually, with inventions such as the printing press, the mediums people used became standardized, and the definition of literacy needed no revision.
With the onset of the technology brought by the twentieth century, society began to accept new mediums through which to express itself. Multimedia, the use of several different forms of media, has become commonplace in everyday life since the advent of the internet, a worldwide conglomeration of computers networked together via telephone lines, optical fiber, and satellite connections (among other forms of digital communication). Content in seemingly boundless quantities is available in nearly every household in the US thanks to the so-called Information Superhighway. Capitalizing on such a useful resource requires that people of all ages be able to competently operate the application software of a personal computer, or, as our definition of literacy adapts to the twenty-first century, that all people become computer literate on some level.
Since its conception in 1962 at MIT and its eventual growth through the late 60s and 70s,2 the internet has grown by leaps and bounds. This sudden access to nearly infinite amounts of information has caused society to move toward a more convenient, more abundant medium through which to express oneself. However, due to its rapid growth in popularity, several large populations have been left behind. Even those who did achieve a certain level of competency with computer applications may find that within as little as one year their knowledge has been superseded, outdated by newer, evolving technologies.
Thus, computers and their software have brought an almost burdensome issue along with their immense usefulness: obsolescence. In the past, becoming literate meant, for the most part, that one would remain literate for the rest of one's natural life (barring any severe mental trauma). However, becoming literate in the use of computer application software does not ensure that one will remain so for very long. Even the simplest of applications (take the word processor, for example) have evolved beyond recognition within the past several years. This further complicates our definition of literacy, as it introduces the element of time into the very threads of the definition. No longer can one be assured that one's literacy will remain intact in this rapidly changing environment.
Just how can computer literacy be defined? We have already concluded that it encompasses some basic understanding of computer application software, and that these applications evolve at such a rate that one cannot ignore one's own pending slide back into the realm of illiteracy. But what does this basic understanding include? At first, being computer literate meant understanding the inner workings of a computer:3 both the element of hardware (physical equipment) and the element of software (digital binary