Computers: Past, Present, and Future


Many people know what a computer is, but not how one works. The National Institute of Standards and Technology’s Computer Security Resource Center (What a mouthful! It’s usually called NIST CSRC) defines a computer as “a device that accepts digital data and manipulates the information based on a program or sequence of instructions for how data is to be processed.” [NIST CSRC glossary page, 2010] 

This definition is important because computers have changed a great deal since the first few were built. Someone could look at what was considered a computer in the late 1900s or early 2000s and not know how the object worked or what it could do. The basics have remained the same, but what we in tech call the user interface, or the lack of one, has changed how people interact with computers.

For example, you may have read in school that the abacus is considered one of the first computers. An abacus is a counting tool used for arithmetic, most famously in China. More information about the abacus and other early calculating devices can be found in [Ryerson University, 2015].

Modern computing starts with Charles Babbage, known as the Father of Modern Computing. The device he designed and set out to build was the Difference Engine; he later designed the more general Analytical Engine. There were multiple designs and at least two serious attempts to build the Difference Engine. The Computer History Museum has more information on how the Analytical and Difference Engines work. Both machines give us the blueprint for computers as we know them today. [Computer History Museum, circa 2008]

According to Khan Academy, every computer shares four functions: it takes input, stores information, processes information, and outputs results. One example is displaying a sentence on a computer screen: the input comes from the keyboard, which a user presses to put the letters they want on screen. Another example is having a computer do a simple calculation. A calculator is a type of computer, but most computers can take input like (2+2), translate it into something the machine understands, carry out the calculation using instructions stored in the computer, arrive at the right answer, and output that answer in a form humans can read. [Khan Academy, circa 2018]
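To make that input, storage, processing, and output cycle concrete, here is a minimal Python sketch of the calculator example above. The function name and prompt text are our own illustration, not taken from Khan Academy or any particular system.

```python
# A minimal sketch (illustrative only) of the four-part cycle:
# take input, store it, process it, and output the result.

def run_simple_calculator() -> None:
    # Input: read a line such as "2+2" typed on the keyboard
    expression = input("Enter a sum like 2+2: ")

    # Store: hold the two operands in memory after splitting on '+'
    left_text, right_text = expression.split("+")
    left, right = int(left_text), int(right_text)

    # Process: carry out the arithmetic
    result = left + right

    # Output: turn the answer back into text a person can read
    print(f"{left} + {right} = {result}")

if __name__ == "__main__":
    run_simple_calculator()
```

Run it and type 2+2, and it prints 2 + 2 = 4, following the same journey from input to output that the paragraph above describes.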

Computers can be sorted into five generations based on when and how they were built. Devices created from 1945 to 1956 were built with vacuum tubes and are classified as First Generation computers. Second Generation computers, from 1956 to 1963, were built with transistors. Third Generation computers, from 1964 to 1971, were built from integrated circuits. The computers used today belong to the Fourth and Fifth Generations. Fourth Gen computers are built around microprocessors, made by companies like Intel; you may see a sticker with that company's name on your desktop or laptop. Fifth Gen computing involves Artificial Intelligence and Machine Learning, but on everyday devices we are unlikely to see much beyond assistants such as Cortana on Windows computers and Siri on Apple mobile devices, and even those are not true artificial intelligences yet. That frontier is still being explored heavily by computer scientists. [KmacIMS, 2019]

This article is meant as a brief history and technical explanation of what computers are and where they came from.

We’d love to hear your opinion on where computers are headed next. Share below in the comments!

This is part of our Cybersecurity 101 series.
