Who invented the computer? It doesn't seem like such a complicated question. We know who invented the steam engine, the flushing toilet, and many other things that changed the world. Why then is it so difficult to determine who invented the computer?
For starters, it depends on how you define a computer, a word whose earliest recorded usage dates back to 1613. Then, a computer was simply a person who completed calculations, such as a clerk or an accountant. The term was still in regular use in the 1930s and 1940s, and we continued to differentiate between manual computers (i.e., people) and automatic computers (i.e., machines) right through until the 1970s.
If you agree that a computer is just a device that carries out computations, then the competition for who gets to be crowned inventor of the computer is suddenly blown wide open. Should it go to Blaise Pascal, who developed the first mechanical calculator in 1642? Or is William Oughtred - inventor of the slide rule - a more worthy recipient?
As it turns out, even these early inventions might be too late to be considered the very first computers. The Antikythera mechanism, for example, was discovered in 1901 on a shipwreck off the coast of the Greek island that gives the device its name. Because it could track and predict astronomical positions such as lunar orbits and solar eclipses for calendar and timekeeping purposes, many historians have cited this 2,000-year-old piece of complex clockwork as the very first analog computer.
Although it would be nearly one thousand years before the equivalent technology of the Antikythera mechanism was reproduced, astronomical clocks with similar designs and mechanisms have existed throughout history worldwide. The most accurate of these was the Castle Clock, built by Al-Jazari in 1206. Standing roughly 3.3 meters (11 feet) high and powered by water, this complex device was used for more than just basic timekeeping and included the functionality to adjust the length of day and night as these changed throughout the year. For this reason, it is considered the world's first programmable analog computer.
Yet, while there are valid arguments in both cases, neither of these nor any of the other computational devices invented much before the Industrial Revolution looks - or behaves - anything like what we would consider a computer in 2019. And, realistically, nobody alive today is actually referring to a mechanical calculator or a slide rule when they use the word computer. Instead, they're talking about the machines that evolved into laptops and desktop PCs and, much further down the line, smartphones and tablets. For the purposes of this article, so will we.
Although ruling out everything pre-19th century by default doesn't make answering the question any easier.
Charles Babbage (1791 - 1871) was a British mathematician, philosopher, and mechanical engineer credited with pioneering the concept of a programmable computer. In 1822 he designed an elaborate machine capable of automatically calculating values using only simple addition, avoiding more complicated operations like multiplication and division. He started building this Difference Engine in 1823 with a grant from the British government, but after 10 years and considerable funding, it remained unfinished. The Science Museum in London eventually completed a full working model in 1991.
During these 10 years, Babbage began focusing on a more complex version of his idea, which he called the Analytical Engine. It would use punch cards to control a mechanical calculator. Because those punch cards could also represent the answers to previous computations, the machine is deemed programmable.
In reality, the only thing that sets the Analytical Engine apart from any other mechanical calculator invented at the time, or in the following one hundred years, is that it was designed to make use of conditional computing (i.e., doing something only if something else is true), as well as standard operations like addition and subtraction. This is what aligns it with computers by modern standards.
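To see why conditional computing matters so much, it helps to contrast it with a fixed sequence of operations. The sketch below is a modern illustration only (the function names and the choice of Euclid's algorithm are mine, not Babbage's): a plain calculator can only run the same steps on every input, whereas an algorithm like greatest-common-divisor needs to test an intermediate result to decide what to do next.

```python
# A fixed sequence of operations: the only kind of "program" a plain
# mechanical calculator can run. Every input follows the same path.
def fixed_sequence(a, b):
    return a + b - 1

# Conditional computing: the next step depends on an intermediate result.
# Euclid's algorithm must ask "is the remainder zero yet?" on every round,
# which is the kind of decision the Analytical Engine was designed to make.
def gcd(a, b):
    while b != 0:        # branch: keep going only if something is true
        a, b = b, a % b  # how many rounds run depends on the inputs
    return a

print(gcd(48, 36))  # prints 12
```

No sequence of additions and subtractions fixed in advance can compute gcd for all inputs; the loop has to run a different number of times depending on the data, and that is exactly what a conditional test buys you.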
And, had it actually been completed before he died, Babbage would certainly stand uncontested as the inventor of the world's first mechanical computer.
Alan Turing (1912 - 1954) is the man considered responsible for formalizing those modern standards. Aside from cracking the Enigma code and shortening the Second World War by approximately two years, Turing also holds a pretty strong claim to the Inventor of the Computer crown. Theoretically, at least.
In 1936, Turing published a paper in which he suggested that it was possible to design a method, or algorithm, that could solve any mathematical problem, as long as it could be expressed in simple, coded instructions. He proved that no single such method (these methods are now called Turing Machines) could solve all types of problems. But he also demonstrated that if you could design a Universal Machine (or method) that incorporated all individual Turing Machines, you could theoretically get it to do anything for which you could write the instructions. It might have existed only on paper at the time, but you could argue that Alan Turing had just invented what somebody in this century would recognize as the computer.
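The idea of a machine driven entirely by a table of coded instructions is easier to grasp with a toy example. The sketch below is a hypothetical illustration, not anything from Turing's paper: a minimal Turing Machine simulator in Python, where a rule table maps (state, symbol) to (symbol to write, head move, next state), here programmed to invert a string of bits. Swapping in a different rule table changes what the "machine" computes, which is the essence of the Universal Machine idea.

```python
# Minimal Turing Machine simulator (simplified: the tape only grows to
# the right, and every machine must eventually reach the "halt" state).
def run_turing_machine(rules, tape, state="start", blank="_"):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Rule table for a machine that walks right and inverts every bit,
# halting when it reads a blank cell.
invert_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(invert_rules, "10110"))  # prints 01001
```

The simulator itself plays the role of the Universal Machine here: it is one fixed mechanism that will imitate any individual machine you feed it as a rule table.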
In March 1946, after the war, Turing drew up detailed plans for a digital computer. It was called the Automatic Computing Engine (ACE), and its design included storing programs in memory, but, crucially, it wasn't built until 1950.
However, Konrad Zuse (1910 - 1995) had been busy experimenting with the concept of computers since 1935. The Z1 - a binary mechanical calculator with limited programming and capable of reading instructions - was finished in 1938.
Presumably because of World War II, Zuse was unaware of his contemporaries' breakthroughs in computer science. His work was, without question, completed independently. He created the Z2 - an improved version of the Z1 - in 1940 and then the Z3 in 1941.
The Z3 was an even more refined version of the Z2: a binary calculator with a 22-bit word length, memory, a separate calculation unit, and more extensive programmability. It was, without doubt, the world's first programmable, fully automatic computer. But it was electromechanical, not electronic, and it lacked conditional operations, a capability modern computer science considers essential to general-purpose computing.
Zuse began work on his Z4 in 1942 and, upon its completion in July 1950, sold it to Eduard Stiefel of the Swiss Federal Institute of Technology. He is therefore also credited with inventing the world's first commercially available computer.
In December 1943, Tommy Flowers and a team at Bletchley Park demonstrated the successful operation of Colossus Mark 1: a prototype computer intended to crack Nazi codes. Colossus used valves and tubes to complete arithmetical and logical operations, making it the world's first programmable electronic computer, although reprogramming it meant physically rearranging the machine's components, and it had no stored-program functionality.
ENIAC was completed in 1946 by the University of Pennsylvania and was patented as the world's first digital computer. It had been designed for the U.S. Army's Ballistic Research Laboratory and was initially used to calculate artillery firing tables but had many different applications before being decommissioned in 1956.
Let's muddy the waters further and mention the Atanasoff-Berry Computer (ABC), which was developed at (what is now) Iowa State University by Professor John Vincent Atanasoff and his graduate student, Cliff Berry, between 1937 and 1942.
The ABC was an electronic computer that also used valves and tubes to complete digital computations. Although, unlike ENIAC, it was not programmable, the ABC was completed more than four years earlier and beat it to the title of the world's first digital computer. This was clarified by U.S. Federal Judge Earl R. Larson in 1973, who ruled the ENIAC patent invalid and named Atanasoff as the sole inventor of the digital computer. However, the ABC was never proved to be fully functioning, so despite the ruling on the patent, the legitimacy of its claim is questionable.
The first electronic digital computer capable of storing and executing a program was the Small-Scale Experimental Machine (SSEM), dubbed the Manchester Baby. It was developed at the University of Manchester, UK, in 1948. Although it was only ever intended as a device to test random-access memory (RAM), the Baby had all the components of a fully functioning computer.
Almost immediately after proving the Manchester Baby's functionality, the team at the University of Manchester set about making it more usable. They finished building the Manchester Automatic Digital Machine (MADM), more commonly known as the Manchester Mk 1, in June 1949, which provided the university with its first computing resource.
Having collaborated with the university on developing the two Manchester computers, the local electronics company Ferranti used the Manchester Mk 1 as the prototype for its own machine. The Ferranti Mk 1 (also known as the Manchester Electronic Computer, or the Manchester Ferranti) was a smaller, faster version of the Manchester Mk 1 with better storage capacity and additional instructions. It was formally sold to the University of Manchester in February 1951 and became the world's first commercially available computer with stored-program facilities.
The answer to who invented the computer is a subjective one and depends on the context. The term computer was initially only used to describe people who completed calculations.
Charles Babbage is widely considered the father of the computer, but he never saw his concepts realized: his Difference Engine wasn't built until 1991. Alan Turing never actually built anything either, but in 1936 he formally defined the concept of a universal computer.
In the meantime, Konrad Zuse developed the world's first programmable computer, in complete intellectual isolation, in 1941, and the first commercially available electromechanical computer in 1950. If you agree with the Oxford English Dictionary that a computer is inherently electronic, however, then nothing before Colossus counts for anything, and Tommy Flowers is your only viable contender for the inventor of the computer.
If you refuse to accept that a device incapable of storing and executing a program can even be considered a computer in the modern sense, your answer is the team behind the Manchester Baby. Anything after that is just gravy.