IDEAS THAT CHANGED THE WORLD
In the strictest sense of the word, computers have existed for centuries. That is, machines that could calculate of their own accord, using gears, wheels, abacus beads, and the like. For that is what a computer is, essentially – a sophisticated calculator.
There were three generations in the computer’s development. First, in the 1940s, scientists built the electronic computer – in essence much like the little hand-held calculators we all use nowadays, only vastly larger; the ENIAC, completed in 1946, filled an entire room with vacuum tubes.
Second, they took advantage of the invention of the transistor (soon to be very popular in radios) at Bell Labs in 1947 (the first step was the hardest, and so took the longest) to build computers on a much smaller scale, and also paved the way for advanced “memory.” You may have heard of the UNIVAC computer, developed by Eckert and Mauchly and sold by the Remington Rand corporation – though it still ran on vacuum tubes, it was the model government computers used.
Finally, the microchip allowed whole circuits to be integrated – thousands of transistors etched onto a single sliver of silicon, all talking to each other, much as one side of our brain talks to the other. That is what created the consumer and industrial computer boom of the ’70s, which has continued to this day.
It is important to note the division of the schools of thought which began with the advent of Apple computers in the mid-’70s. Apple was started in 1976 by two mavericks, Steve Jobs and Steve Wozniak, with the idea that a friendly personal machine would appeal to the general public more than the stodgy business-oriented models IBM came to represent. With their breakthrough Macintosh computer in 1984, and subsequent improvements, they indeed created a second computer movement.
So for the last twenty years the two companies have been intense rivals (the IBM PC’s open design let smaller companies build compatible clones, which is why its version is the more widespread), and their vastly different technologies were incompatible.
That seems to be changing now. The newest machines have begun to bridge the two systems, so that they can indeed share files, run much of the same software, and communicate with each other.