The History Of Computing

The Prehistory of the Computer

The earliest device used to calculate was the abacus. This number-crunching device has been found in use as far back as the Sumerians, circa 2700 BC. The abacus can be found throughout Asia, the Middle East, and India through ancient history. Don’t worry, the rate of innovation always speeds up as multiple technologies get combined. Leonardo da Vinci sketched out the first known plans for a calculator. But it was the 17th century, or the early modern period in Europe, that gave us the Scientific Revolution. Names like Kepler, Leibniz, Boyle, Newton, and Hooke brought us calculus, telescopes, microscopes, and even electricity. 

The term computer is first found in 1613, describing a person who did computations. Wilhelm Schickard built the first calculator in 1623, which he described in a letter to Kepler. This opening of humanity’s mind led people like Blaise Pascal to theorize about vacuums, and then Pascal did something very special: he built a mechanical calculator that could add and subtract numbers, do multiplication, and even division. And more important than building a prototype, he sold a few! His programming language was a lantern gear. It took him 50 prototypes and many years, but he presented the calculator in 1645, earning him a royal privilege in France for calculators. That’s feudal French for a patent. 

Leibniz added repetition to the mechanical calculator in his Step Reckoner. He was also a huge proponent of binary, although he didn’t use it in his mechanical calculator. Binary would become even more important later, when electronics came to computers. But as with many great innovations, it took a while to percolate. In many ways, the Age of Enlightenment was about taking the theories from the previous century and building on them. The early Industrial Revolution, though, was about automation. And so the mechanical calculator was finally ready for daily use in 1820, when another Frenchman, Thomas de Colmar, built the arithmometer, based on Leibniz’s design. 

A few years earlier, another innovation had occurred: memory. Memory came in the form of punched cards, an innovation that would remain in use well past World War II. The Jacquard loom was used to weave textiles. The punched cards controlled how rods moved and thus formed the basis of the pattern of the weave. Punching cards was an early form of programming: you recorded a set of instructions onto a card and the loom performed them. The bash programming of today is similar.
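To make that analogy concrete, here is a minimal sketch in bash. The script and its “card” contents are purely illustrative (no loom ever ran anything like this): a fixed list of instructions is recorded up front, and the machine simply reads and performs them in order, just as the loom read a punched card.

  #!/usr/bin/env bash
  # A sketch of the punched-card idea: a fixed list of instructions
  # ("the card") that the machine reads and performs one after another.

  card="
  lift warp threads 1,3,5
  pass the shuttle
  lift warp threads 2,4,6
  pass the shuttle
  "

  while read -r instruction; do
    [ -z "$instruction" ] && continue    # skip the blank lines in the card
    echo "loom performs: $instruction"   # stand-in for actually moving the rods
  done <<< "$card"

The point is the same in 1804 as it is today: the program is just data, recorded ahead of time, that a machine steps through.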

Charles Babbage expanded on the ideas of Pascal and Leibniz and added to mechanical computing, designing the Difference Engine, the inspiration of many a steampunk. Babbage had multiple engineers building components for the engine, and after he scrapped his first, he moved on to the Analytical Engine, adding conditional branching, loops, and memory – and further complicating the machine. The engine borrowed the punched-card tech from the Jacquard loom and applied that same logic to math.

Ada Lovelace contributed an algorithm for computing Bernoulli numbers on the engine, giving us a glimpse into what an open source collaboration might someday look like. She was in many ways the first programmer – and the daughter of Lord Byron and Anne Milbanke, a math whiz. She became fascinated with the engine and ended up becoming an expert at creating a set of instructions to punch onto cards, making her the first programmer of the Analytical Engine and putting her far ahead of her time. In fact, there would be no programmer with her depth of understanding for another 100 years. Not to make you feel inadequate, but she was 27 in 1843. 

The engine was a bit too advanced for its time. While Babbage is credited as the father of computing because of his ideas, shipping is a feature. Having said that, it has since been shown that if the build had been completed to specifications, the device would have worked. Sometimes the best of plans just can’t be operationalized unless you reduce scope. Babbage added scope. 

Babbage had trouble keeping contractors who could build such complex machinery, but his interests ranged widely: he looked to tree rings to predict weather, and as a mathematician he worked with keys and ciphers. As with Isaac Newton 150 years earlier, the British government also let a great scientist and engineer reform a political institution, in Babbage’s case the postal system. You see, he was also an early proponent of applying the scientific method to the management and administration of governmental, commercial, and industrial processes. He also got one of the first government R&D grants to help build the Difference Engine, although he ended up putting in some of his own money as well, of course. Babbage died in 1871 and thus ended computing. For a bit.

The typewriter came in 1874, as parts kept getting smaller and people kept tinkering with ideas to automate all the things. 

Herman Hollerith filed for a patent in 1884 to use a machine to punch and count punched cards. He used that first in health care management and then in the 1890 census. He later formed the Tabulating Machine Company, in 1896. In the meantime, Julius E. Pitrap patented a computing scale in 1885. William S. Burroughs (not that one, the other one) formed the American Arithmometer Company in 1886. Sales exploded for these machines, and in 1911 several such companies merged, creating the Computing-Tabulating-Recording Company. Thomas J. Watson, Sr. joined the company in 1914, soon became president, and expanded the business, especially outside of the United States. The name of the company was changed to International Business Machines, or IBM for short, in 1924.

Konrad Zuse built the first electric computer from 1936 to 1938 in his parents’ living room, which marked the beginning of the end of mechanical computing. It was called the Z1. OK, so electric is a stretch, how about electromechanical… In 1936, Alan Turing proposed the Turing machine, which printed symbols on tape, simulating a human following a set of instructions. Maybe he accidentally found one of Ada Lovelace’s old papers. The first truly programmable electric computer came in 1943, with Colossus, built by Tommy Flowers to break German codes. The first truly digital computer came from Professor John Vincent Atanasoff and his grad student Cliff Berry at Iowa State University. The ABC, or Atanasoff-Berry Computer, took from 1937 to 1942 to build and was the first to use vacuum tubes. 

The ENIAC came from J. Presper Eckert and John Mauchly at the University of Pennsylvania, built from 1943 to 1946. At 1,800 square feet and roughly ten times that many vacuum tubes, ENIAC weighed about 30 tons. ENIAC is considered to be the first digital computer because, unlike the ABC, it was fully functional. The Small-Scale Experimental Machine from Frederic Williams and Tom Kilburn at the University of Manchester came in 1948 and added the ability to store and execute a program. That program was run by Tom Kilburn on June 21st, 1948. Up to this point, computing devices were mostly being built in universities, with the exception of the Z1. But in 1950, Konrad Zuse sold the Z4, thus creating the commercial computer industry. IBM got into the business of selling computers in 1952 as well, basically outright owning the market until grunge killed the suit in the 90s.

MIT added RAM in 1955 and then transistors in 1956. The PDP-1 was released in 1960 by Digital Equipment Corporation (DEC). This was the first minicomputer. My first computer was a DEC.

Pier Giorgio Perotto introduced the first desktop computer, the Programma 101, in 1964. HP began to sell the HP 9100A in 1968. All of this steam led to the first microprocessor, the Intel 4004, released in 1971. The first truly personal computer was released in 1975 by Ed Roberts, who was the first to call it that. It was the Altair 8800. The IBM 5100 was the first portable computer, released the same year. I guess it’s portable if 55 pounds is considered portable. And the end of ancient history came the next year, when the Apple I was developed by Steve Wozniak, which I’ve always considered the start of the modern era of computing.