With the surge of so many new tablets, smartphones and laptops, it’s really amazing to think about some of the old computers we used to work on.
How old do you think the idea of a computer is? The real answer will knock your socks off! Some readers may point to the IBM PC-XT, while others may say a Tandy product. Old school “hackers” may point to the DEC PDP series or the DEC Alpha. All of these people would be wrong. Have a seat, dear reader, and let’s take a computerized stroll through the ages of computing.
Old Computers Timeline
For those who like to know word origins, the word “computer” dates back to the 17th century and referred to people who performed calculations or computed sums.
Charles Babbage Creates the Difference Engine — 1822
Yes, you read that correctly — I said 1822. This was when inventor Charles Babbage created what he called the Difference Engine. This was a machine designed to compute a series of numbers and print out the results, although, due to a lack of funding, it was never completed in his lifetime.
Fifteen years later, Babbage began work on what he called the Analytical Engine. However, again, money problems kept him from finishing it. His son Henry finished a portion of it in 1910 and got it to perform its first basic calculations, proving the concept: old computers are really old.
The First “Modern Computer” — 1938
German inventor Konrad Zuse is the person we can thank for what is considered to be the first “modern” computer (Z1), which he completed in his parents’ living room in 1938. The reason it’s considered modern is that this was the first truly programmable (and fully functional) computer. Computer historians call it the first electro-mechanical, binary programmable computer.
The First Electric Programmable Computer — 1943
The first all-electric computer was created to help with the war effort by assisting codebreakers with the job of decoding German radio messages encoded using the Lorenz cipher machine. (Cryptology is the science of making messages unreadable through the use of codes and the process of breaking those codes.) This computer was the “Colossus,” so named because, well, it was colossal, taking up several rooms and consuming the same amount of power as several thousand of today’s personal computers. British engineer Tommy Flowers is credited with developing it in 1943.
We Officially Enter the “Digital Age” — 1942
The world’s first digital computer is often thought to be the ENIAC (Electronic Numerical Integrator and Computer), another room-sized behemoth. The honors actually go to Professor John Vincent Atanasoff and graduate student Cliff Berry at Iowa State College (now University) in 1942. This computer, known as the Atanasoff-Berry Computer (ABC for short), was the first to use Boolean logic and binary math to complete complex calculations. It had no central processing unit and used vacuum tubes.
Storing a Computer Program — 1949
The British-designed EDSAC, completed in 1949, is known as the first electronic computer that allowed for the storing of a written program. A few years later, it hosted one of the first graphical computer games ever created and played, with moves entered using a rotary dial similar to that found on a rotary phone. Oh yeah, the game was called “OXO,” a version of noughts and crosses (tic-tac-toe).
The First Computer Company (not IBM!) is Formed — 1949
At this time in history, IBM was still making calculators and cash registers. The title of the first computer company in history goes to the guys who created the ENIAC, J. Presper Eckert and John Mauchly. The company was named after them: the Eckert-Mauchly Computer Corporation, more commonly known as EMCC. They were responsible for the UNIVAC family of computers.
Speaking of the UNIVAC, it was the first computer line that allowed programs to be stored electronically and read into memory. An early model sold under the name, the 1101, was originally built by Engineering Research Associates (ERA) and was delivered to the U.S. military.
The ’50s — Many Firsts
IBM entered the computer industry with the first mass-produced electronic computer in 1953. This was introduced in the first week of April that year and was named the “701.” Two years later, MIT introduced the world to Random Access Memory (RAM), which we still use to this day. The computer in question, the Whirlwind, had magnetic-core RAM and was also the first to offer real-time graphics.
In 1956, MIT introduced the first computer to use transistors, the TX-0 (Transistorized Experimental computer zero).
The ’60s — The Heyday of Hackers
The term “hacker” originally didn’t carry a negative connotation; it meant people who shared their information, resources, and programs for the benefit of all. They came of age in the ’60s with computers made by Digital Equipment Corp. (DEC), beginning with the PDP-1, which was considered a small computer by the standards of the day. We’d still call it a behemoth. HP was the first to begin mass-marketing desktop computers, with its HP 9100A (1968).
The First Microprocessor — 1971
Intel introduced the first microprocessor, the 4004, in 1971. Two years later, Xerox introduced the Alto, the first workstation, with a central processor, mouse, and keyboard (plus windows, icons and menus). Believe it or not, the first “laptop computer” was introduced by IBM (the IBM 5100) in 1975. However, it was the size of a suitcase and quite heavy, weighing 55 pounds! And in 1975, we also saw the introduction of the Altair 8800, the world’s first personal computer. Then Steve Wozniak introduced his first computer, the Apple I, in 1976.
The ’80s and ’90s — Explosion of the PC Market
IBM introduced the PC-XT and PC-AT, and others started “cloning” them — making models that were fully compatible with the IBM products. This is the time frame when Intel’s “x86” processor architecture, which began with the 8086, advanced through the 286, 386 and 486 processors, which were then followed by the Pentium series.
Cover Photo Credit: Wikimedia Commons