
Practical Applications of Artificial Intelligence


While scientists, engineers, and computer programmers were excited by the development of artificial intelligence throughout the 1950s, practical applications were slow to appear until the military discovered uses for AI. The Defense Advanced Research Projects Agency (DARPA) was founded in 1958 in response to the Soviet Union's launch of the first artificial satellite, Sputnik. Part of DARPA's mission was to anticipate applications for AI, considering what the military leaders of tomorrow might need to be more effective. After the Gulf War of 1991, DARPA stated that the savings from using AI to schedule and deploy military units in the Middle East more than repaid its entire investment in AI research since 1958. Over the years, the military has employed AI to identify enemy aircraft and weapons and to target distant objects.

The use of autopilot technology on airplanes was one of the earliest civilian applications of AI. Developed by the Sperry Corporation, the autopilot took control of the hydraulically operated rudder, elevator, and ailerons, allowing an aircraft to fly straight and level without the pilot's touch. Those control tasks had consumed up to 80 percent of a pilot's flight time; automating them allowed pilots to take periodic breaks from flying and led to a dramatic reduction in pilot error. Today's autopilots help control the takeoff, ascent, level flight, approach, and landing of airplanes.
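The wing-leveling task described above can be sketched in a few lines. This is a deliberately simplified illustration of the feedback idea behind early autopilots, not real flight-control logic; the gains, limits, and response factor are invented numbers.

```python
# Toy sketch of a wing-leveling autopilot: a proportional controller
# deflects the ailerons against any measured roll error.
# All constants here are illustrative, not real flight-control values.

def aileron_command(roll_deg, gain=0.5, max_deflection=20.0):
    """Return an aileron deflection (degrees) opposing the roll error."""
    command = -gain * roll_deg          # steer against the measured roll
    return max(-max_deflection, min(max_deflection, command))

def simulate(roll_deg, steps=50, response=0.2):
    """Crudely simulate the aircraft responding to repeated corrections."""
    for _ in range(steps):
        roll_deg += response * aileron_command(roll_deg)
    return roll_deg

# A 15-degree bank is steadily flattened back toward wings-level.
print(round(simulate(15.0), 3))
```

Each pass shrinks the roll error by a fixed fraction, so the bank angle decays toward zero; a real autopilot adds integral and derivative terms, sensor filtering, and hard safety limits.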

During the 1970s, AI began to be used to scan and recognize the printed word and to synthesize a voice, allowing machines to read to the blind. Similar scanning technology turns the printed page into digital files for electronic publishing and archiving. A related field, voice recognition, translates the spoken word into computer-generated text. People with disabilities have benefited greatly from AI, which is also used to control wheelchairs and other devices.
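A minimal sketch can convey the pattern-matching idea behind early print recognition: compare a scanned glyph against stored templates and pick the closest match. The tiny 3x3 bitmaps below are invented for illustration; real OCR systems use far richer features and learned models.

```python
# Toy template-matching OCR: each letter is a 3x3 bitmap of '0'/'1' pixels.
TEMPLATES = {
    "I": ("010",
          "010",
          "010"),
    "L": ("100",
          "100",
          "111"),
    "O": ("111",
          "101",
          "111"),
}

def match_glyph(glyph):
    """Return the template letter whose pixels differ least from glyph."""
    def distance(a, b):
        # Count mismatched pixels between two bitmaps.
        return sum(p != q for ra, rb in zip(a, b) for p, q in zip(ra, rb))
    return min(TEMPLATES, key=lambda letter: distance(TEMPLATES[letter], glyph))

# A noisy "L" (one flipped pixel) is still recognized.
noisy_L = ("100", "110", "111")
print(match_glyph(noisy_L))  # -> L
```

Tolerance to a few flipped pixels is what lets such systems cope with smudged or imperfect print.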


As scientists searched for ways to make machines calculate and then think, writers kept pace. One could argue that Mary Wollstonecraft Shelley's Frankenstein featured the first fictional use of AI, since the creature received a transplanted brain. The first noteworthy fictional piece about AI, however, was by Karel Capek, whose 1921 drama R.U.R. (Rossum's Universal Robots) introduced the word “robot.” The play postulated intelligent machines, built as servants, that subsequently rebelled and destroyed their makers.

As computing devices continued to develop, AI increasingly became a centerpiece of fiction. George Orwell envisioned a bleak future of constant technological surveillance in his classic dystopia, 1984. Another story of AI run amok is 2001: A Space Odyssey, the collaboration between Arthur C. Clarke and Stanley Kubrick.

Fritz Lang's silent film Metropolis featured one of the first notable robots in a sympathetic light. Since then, artificial life-forms have populated novels, plays, movies, television programs, and comic books. In the Star Wars saga, such mechanical beings are nicknamed “droids,” short for androids, although the non-humanoid R2-D2 is really a robot. Star Trek: The Next Generation's Data is a perfect example of an android. More recently, Steven Spielberg directed A.I. Artificial Intelligence, a film about a world of androids that included a young boy model played by Haley Joel Osment. The somber production showed how these constructs formed their own culture and society.

The medical world has taken full advantage of AI to help improve patient diagnosis. For instance, neural networks (interconnected groups of artificial neurons) are used to help doctors find patterns and relationships in data. A study conducted by Lars Edenbrandt, M.D., Ph.D., and coauthor Bo Heden, M.D., Ph.D., of the University Hospital in Lund, Sweden, found that neural networks could read electrocardiograms more accurately than experienced cardiologists. Cardiologists interpret these tests, normally used on heart attack patients, to help determine the best course of treatment. “The neural networks performed higher than an experienced cardiologist, indicating that they may be useful as decision support,” Edenbrandt told the American Heart Association. To train the system, the computer program was loaded with thousands of electrocardiogram readings so it could learn to determine when a patient was suffering a heart attack.
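The training idea described above, feeding a model many labeled readings so it learns to separate "heart attack" from "normal", can be illustrated with a toy classifier. Everything here is a stand-in: the two numeric features, the synthetic data, and the simple perceptron bear no relation to the actual network or data in the Lund study.

```python
# Hedged toy illustration: train a perceptron on synthetic, labeled
# "readings" so it learns to flag the abnormal class.
import random

random.seed(0)

def make_reading(is_attack):
    """Fabricate a two-number feature pair, shifted for the attack class."""
    base = (1.5, -0.5) if is_attack else (0.2, 0.3)
    return [b + random.gauss(0, 0.2) for b in base]

# "Loaded with thousands of readings" -> a labeled training set.
training = [(make_reading(label), label) for label in [0, 1] * 1000]

# Perceptron rule: nudge the weights whenever a reading is misclassified.
w = [0.0, 0.0]
bias = 0.0
for _ in range(20):
    for features, label in training:
        predicted = 1 if sum(f * wi for f, wi in zip(features, w)) + bias > 0 else 0
        error = label - predicted
        w = [wi + 0.1 * error * f for wi, f in zip(w, features)]
        bias += 0.1 * error

def classify(features):
    """1 = flag as attack, 0 = normal."""
    return 1 if sum(f * wi for f, wi in zip(features, w)) + bias > 0 else 0

correct = sum(classify(f) == label for f, label in training)
print(f"training accuracy: {correct / len(training):.2%}")
```

Real diagnostic networks use many more input features, hidden layers, and held-out test data, but the loop is the same in spirit: show examples, measure errors, adjust weights.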

Hearing aids have also improved thanks to AI, as reported at the 2006 Canadian Hard of Hearing Association meeting. Karla Rissling of the Medicine Hat Hearing Centre reported that AI can optimize speech-specific user preferences, such as differentiating between loud and quiet environments.
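The loud-versus-quiet distinction can be sketched as a level detector that switches between user presets. The threshold, preset names, and sample values below are invented for illustration; real hearing aids analyze the sound in much finer detail.

```python
# Toy environment classifier: estimate loudness from a buffer of audio
# samples (RMS level) and pick an amplification preset accordingly.
import math

def rms(samples):
    """Root-mean-square level of a buffer of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def choose_preset(samples, threshold=0.3):
    """Switch presets based on the measured level (threshold is invented)."""
    return "noisy-environment" if rms(samples) > threshold else "quiet-environment"

quiet = [0.02, -0.01, 0.03, -0.02]
loud = [0.6, -0.7, 0.5, -0.8]
print(choose_preset(quiet), choose_preset(loud))
```

A production device would smooth the level estimate over time and look at frequency content, so that a single door slam does not flip the preset.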

AI is now being employed to create more lifelike prosthetics. The Los Angeles Times noted in 2006, “One knee … will even mimic lost muscle activity by powering ankle and leg amputees up stairs, or up from a sitting position. But that's just the beginning. Advances in robotics, electronics and tissue engineering ultimately could create ways to lengthen damaged limbs, grow new cartilage, skin and bone, and permanently affix a prosthetic device to the body. Some researchers are even designing a so-called biohybrid limb—a prosthesis that can be controlled by the user's thoughts.” Work along these lines is now under way at Cyberkinetics Neurotechnology Systems, Inc., which is developing a system called BrainGate. Its software decodes neural signals picked up by a small chip implanted in the brain's primary motor cortex, turning them into computer commands.
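The decoding step can be illustrated with a toy linear decoder that maps per-channel firing rates to a cursor velocity. The channel directions, baseline, and gain below are invented; real systems like BrainGate calibrate a decoder for each user from recorded sessions.

```python
# Toy neural decoder: each recording channel contributes a preferred
# (x, y) direction, scaled by how far its firing rate sits above baseline.
# All numbers are illustrative stand-ins.

CHANNEL_DIRECTIONS = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]

def decode_velocity(firing_rates, baseline=10.0, gain=0.1):
    """Turn per-channel firing rates (spikes/s) into a cursor velocity."""
    vx = vy = 0.0
    for rate, (dx, dy) in zip(firing_rates, CHANNEL_DIRECTIONS):
        drive = (rate - baseline) * gain   # activity above baseline pushes
        vx += drive * dx
        vy += drive * dy
    return vx, vy

# A channel preferring +x fires above baseline -> the cursor moves right.
print(decode_velocity([30.0, 10.0, 10.0, 10.0]))
```

Summing many noisy channels this way is what makes the approach robust: no single electrode has to carry the whole command.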

Announcements are constantly being made about new products and services that use AI, which means the associated career opportunities are particularly bright. Dr. Anthony Francis of Google notes, “There are so many areas of AI—military applications (general planning and education), information retrieval (search and data mining), game AI (believable characters), theoretical research (cool algorithms), robotics (even cooler whirring gears and der blinkenlights), simulation (crowds, traffic, etc.), and cognitive science (models of mind and the human connection).” (“Blinkenlights” is hacker slang for the diagnostic lights of old mainframe computers.)

