A History of Artificial Intelligence
In 1956, a group of scientists and computer engineers met at Dartmouth College to better understand where their collective research into a specific style of computing was headed. While there, John McCarthy of the mathematics department called the branch “artificial intelligence,” and the name stuck.
“At the 1956 Dartmouth Artificial Intelligence Conference, an audacious, outrageous even, intellectual Zeitgeist emerged: that the core of humanity, our ability to think and reason, was subject to our own technological understanding, a recursive formulation of our very nature. And the participants were right,” said Rodney Brooks, the Panasonic Professor of Robotics and director of the Massachusetts Institute of Technology (MIT) Computer Science and Artificial Intelligence Lab.
The earliest computers were merely devices for performing mathematical tasks. The history of AI and computing can be traced back 5,000 years to Asia and the development of the abacus. This device, still in use in parts of the world today, established principles of numerical calculation that also underlie the first computer programs.
It wasn't until 1642, however, that Blaise Pascal invented the world's first automatic calculating machine, called the Pascaline. From this humble beginning, people spent the next several centuries finding ways to make machines perform more complex calculations. Milestone inventions on the path to today's supercomputers include Gottfried Wilhelm von Leibniz's calculating machine of 1694, noteworthy for its use of algorithms, a concept still employed by today's devices.
In 1805, Joseph-Marie Jacquard devised a method for automated weaving that is seen as a precursor to early computer technology: his looms were directed by instructions encoded on a series of punched cards, much as computers would later be programmed.
Charles Babbage wrote Observations on the Application of Machinery to the Computation of Mathematical Tables, earning him the British Astronomical Society's first gold medal in 1821. Babbage began building his difference engine a year later but abandoned it as unwieldy. He then imagined a more general calculating machine, the analytical engine, and was aided by Ada Lovelace, Lord Byron's daughter, whose ideas about programming it have earned her recognition as the first programmer. She wrote in 1843 of how the analytical engine could be employed to play chess or compose music.
Herman Hollerith built upon the advances of the nineteenth century and patented an electromechanical tabulating device that used punched cards. It won the competition to process the 1890 U.S. Census, introducing the use of electricity in a major data-processing project. Hollerith founded the Tabulating Machine Company in 1896, which eventually evolved into International Business Machines, or IBM.
In 1940, Alan Turing and a group of mathematicians and electrical engineers at Bletchley Park set out to crack Germany's military codes during World War II. They used telephone relays and other electromagnetic parts to construct Robinson, an early code-breaking machine. As the Germans introduced ever more complex ciphers, the Allied team kept pace, replacing Robinson in 1943 with Colossus, which used roughly 2,000 vacuum tubes. Colossus and nine similar machines worked throughout World War II to counter the German offensive.
In 1941, Konrad Zuse developed the world's first fully programmable digital computer, the Z-3. Arnold Fast, a blind mathematician, was hired to program the Z-3, making him the first working programmer. After this point, the development of computing machines sped up, with machines such as the Mark I, ENIAC, and UNIVAC, though all were huge, slow-processing machines dependent on vacuum tubes.
William Bradford Shockley, Walter Houser Brattain, and John Bardeen invented the transistor in 1947, revolutionizing electronic devices in general. As computer speeds improved, the hardware also began to shrink, since transistors took up far less space than the older vacuum tubes.