Last week we published an article about Moore’s Law and the rapid development of
computing technology. In that article we said that a singing birthday card has
more computing power than the Allied Forces of WWII. Since that comparison invites
debate, we thought we would use this week's article to clear up what we meant. It
all boils down to one question: when was the first computer invented? While one
could argue that it was unfair to compare the card to the military because the
latter had no transistors, and therefore no computing power at all, we at Sketchy
Science contended that computing technology goes back a lot further than that.

To uncover the ancestor of the modern computer we have to go back in time. Conveniently enough, our first stop is
around the time of the events that led to last week’s contentious comparison.
As we saw in our discussion of Moore’s Law, modern computers are tied to
transistors. The first transistor was built in 1947, only 2 years after the end
of WWII.

Transistors were a huge leap forward in computing technology. They
allowed for unprecedented control of electrons and formed the foundation for all
21st-century electronics (so far). If your definition of a computer includes the
words "electronic" and "transistor," our argument is at an end. But to stop here
would be to ignore an influential 19th-century philosopher, mathematician, and
engineer with the delightfully British name of Charles Babbage.

The 1800s were a tough time to be a mathematician. Science had progressed far enough to begin dealing with some seriously complex equations, but not so
far as to have the technology to compute the answers automatically. The result
was long hours spent hunched over a desk with quill in hand, writing out the
calculations that could ultimately prove your ideas. A man after my own heart,
Charles Babbage had a touch of laziness in him and thought there must be a
better way. In one of the great feats of laziness-inspired motivation (he
wasn't really lazy; he just valued efficiency and accuracy), Babbage drafted
plans for a mechanical device that could compute complex polynomials without the
need for a person to work out every step.

In 1822, Babbage presented a paper to the Royal Astronomical Society
called “Note on the application of machinery to the computation of astronomical and mathematical tables.” In it he laid out how his machine would work.
The Difference Engine, as it was called, would have several columns numbered 1
to N (N simply standing for however many columns were needed). Each column would
contain numbered wheels. In theory, once the alignment and timing were worked
out based on mathematical principles, you could set the first column to the
value of any equation at a known point and run the machine through a series of
calculations to determine the value of that same equation after any given number
of changes.
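The trick Babbage mechanized is the method of finite differences: for a polynomial of degree n, the n-th differences are constant, so every new value can be cranked out with additions alone, no multiplication required. A minimal sketch of the idea (the polynomial and function names here are our own illustrative choices, not anything from Babbage's design):

```python
# Sketch of the method of finite differences behind the Difference Engine.
# Each "column" holds one number; turning the crank updates every column
# by addition only, and the first column yields successive polynomial values.

def difference_table(p, degree):
    """Seed the columns: p(0) plus the leading differences of each order."""
    values = [p(x) for x in range(degree + 1)]
    columns = [values[0]]
    while len(values) > 1:
        values = [b - a for a, b in zip(values, values[1:])]
        columns.append(values[0])
    return columns  # one starting value per column

def crank(columns, steps):
    """Turn the crank `steps` times; collect the first column's values."""
    cols = list(columns)
    results = [cols[0]]
    for _ in range(steps):
        # Ascending order: each column absorbs the next column's
        # value from the previous turn -- additions only.
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]
        results.append(cols[0])
    return results

p = lambda x: 2 * x * x + 3 * x + 5     # an arbitrary example polynomial
cols = difference_table(p, 2)           # [5, 5, 4]: value, 1st diff, 2nd diff
print(crank(cols, 5))                   # [5, 10, 19, 32, 49, 70]
```

Note that once the table is seeded, the crank loop never multiplies; that is exactly why the scheme suited a machine built from gears and numbered wheels.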

The bigwigs at the Royal Society were obviously
impressed by all this. In 1823, they gave Babbage
£1,700 (the equivalent of about £190,000, or $317,000 US, today) to go build his machine. Unfortunately for them, for Babbage, and for science, things got a little out of hand. Instead of building a machine that could perform basic polynomial calculations (equations with multiple terms), Babbage took the opportunity to try to build a machine capable of advanced analytics. In the end, after a total investment of £17,000 (about $3.17 million US in today's money), the project was scrapped (Campbell-Kelly, 2004).
The first working difference engine was eventually
built in 1855. Babbage's machine, as he conceived it, was finally built (more or
less just to see if it would actually work) in 1991. It works flawlessly to this
day and now resides at the Computer History Museum in Mountain View, California, USA.

If you want to get really pedantic, we could go back
even further and tie the birth of computing to the abacus or some
comparably simple device; but in terms of raw automatic computing power,
Babbage's Difference Engine is what got the ball rolling. Famously, the Allied
Forces in WWII were able to crack Nazi codes using machines
that worked on similar principles. So it appears that our initial comparison
holds some water after all. This isn't about being right, though; it's about the
fun of finding out the truth.

Campbell-Kelly, Martin (2004). *Computer: A History of the Information Machine*, 2nd ed. Boulder, CO: Westview Press. ISBN 978-0-8133-4264-1.