Acronym (pronounced “ask-ee”) for American Standard Code for Information Interchange. ASCII was developed by ANSI (neat: acronyms that rhyme, I feel a song coming on) to provide a standard way for computer systems to handle the text characters we use. When we type ASCII characters at the keyboard (which look like words to us), the computer interprets them as binary numbers so they can be read, manipulated, stored, and retrieved. Each character in the ASCII set is represented by a number from 0 to 127, which fits in 7 bits of binary information. For example, an uppercase “A” is ASCII character #65, which in binary (or to a computer) looks like 1000001.

ASCII files are commonly known as text files, and because the encoding is standardized, most computers can read them, which is one big reason why it is so easy to share text files between different operating systems on radically different computers. There is also an extended ASCII set that adds an 8th bit. It supports additional characters (using numbers 128-255), which is where many of the special (non-English) characters and symbols are represented.

Historically, one of the ways complex computer data was (and sometimes still is) sent over the Internet is by converting it into an ASCII format and sending it as text. That way the receiving computer could take the text and convert it back into a form it could read locally, even though the two computers (or their operating systems) might “speak” different languages and normally not be able to communicate with each other.
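To make this concrete, here is a small Python sketch. It shows the “A” = 65 = 1000001 mapping described above, and then demonstrates the binary-to-text idea using base64, which is one common scheme for sending arbitrary data as ASCII text (the entry itself doesn’t name a specific scheme, so base64 is an illustrative choice):

```python
import base64

# Each ASCII character maps to a number from 0 to 127; an uppercase "A"
# is character #65, which is 1000001 in 7 bits of binary.
code = ord("A")
print(code)                 # 65
print(format(code, "07b"))  # 1000001

# Sending binary data as ASCII text: base64 is one widely used scheme
# for this. Arbitrary bytes go in, plain ASCII characters come out.
data = bytes([0, 255, 128])  # bytes that are not printable text
text = base64.b64encode(data).decode("ascii")
print(text.isascii())        # True: safe to send as a text file

# The receiving side decodes the text back into the original bytes.
print(base64.b64decode(text) == data)  # True: a perfect round trip
```

Because every byte of the base64 output is an ordinary ASCII character, the message survives any channel that can carry plain text, regardless of what kind of computer is on the other end.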