Creating Artificial Intelligence Based on the Real Thing

By Steve Lohr

Ever since the early days of modern computing in the 1940s, the biological metaphor has been irresistible. The first computers — room-size behemoths — were referred to as “giant brains” or “electronic brains,” in headlines and everyday speech. As computers improved and became capable of some tasks familiar to humans, like playing chess, the term used was “artificial intelligence.” DNA, it is said, is the original software.

For the most part, the biological metaphor has long been just that — a simplifying analogy rather than a blueprint for how to do computing. Engineering, not biology, guided the pursuit of artificial intelligence. As Frederick Jelinek, a pioneer in speech recognition, put it, “airplanes don’t flap their wings.”

Yet the principles of biology are gaining ground as a tool in computing. The shift in thinking results from advances in neuroscience and computer science, and from the prod of necessity.

The physical limits of conventional computer designs are within sight — not today or tomorrow, but soon enough. Nanoscale circuits cannot shrink much further. Today’s chips are power hogs, running hot, which curbs how much of a chip’s circuitry can be used. These limits loom as demand is accelerating for computing capacity to make sense of a surge of new digital data from sensors, online commerce, social networks, video streams and corporate and government databases.

To meet the challenge, without gobbling the world’s energy supply, a different approach will be needed. And biology, scientists say, promises to contribute more than metaphors. “Every time we look at this, biology provides a clue as to how we should pursue the frontiers of computing,” said John E. Kelly, the director of research at I.B.M.

Dr. Kelly points to Watson, the question-answering computer that plays “Jeopardy!” and beat two human champions earlier this year. I.B.M.’s clever machine consumes 85,000 watts of electricity, more than 4,000 times the 20 watts or so on which the human brain runs. “Evolution figured this out,” Dr. Kelly said.

Several biologically inspired paths are being explored by computer scientists in universities and corporate laboratories worldwide. But researchers from I.B.M. and four universities — Cornell, Columbia, the University of Wisconsin, and the University of California, Merced — are engaged in a project that seems particularly intriguing.

The project, a collaboration of computer scientists and neuroscientists begun three years ago, has been encouraging enough that in August it won a $21 million round of government financing from the Defense Advanced Research Projects Agency, bringing the total to $41 million in three rounds. In recent months, the team has developed prototype “neurosynaptic” microprocessors, or chips that operate more like neurons and synapses than like conventional semiconductors.

But the project itself has evolved since it began in 2008, becoming more focused and, in some ways, scaled back. Its experience suggests what designs, concepts and techniques might be usefully borrowed from biology to push the boundaries of computing, and what cannot be applied, or even understood.

At the outset, Dharmendra S. Modha, the I.B.M. computer scientist leading the project, described the research grandly as “the quest to engineer the mind by reverse-engineering the brain.” The project embarked on supercomputer simulations intended to equal the complexity of animal brains — a cat and then a monkey. In science blogs and online forums, some neuroscientists sharply criticized I.B.M. for what they regarded as exaggerated claims of what the project could achieve.

These days at the I.B.M. Almaden Research Center in San Jose, Calif., there is not a lot of talk of reverse-engineering the brain. Wide-ranging ambitions that narrow over time, Dr. Modha explained, are part of research and discovery, even if his earlier rhetoric was inflated or misunderstood.

“Deciding what not to do is just as important as deciding what to do,” Dr. Modha said. “We’re not trying to replicate the brain. That’s impossible. We don’t know how the brain works, really.”

The discussion and debate across disciplines have helped steer the research as the team pursues the goals set out by Darpa, the Pentagon’s research agency. According to the guidelines, the technology produced should be self-organizing, should “learn” rather than merely respond to conventional programming commands, and should consume very little power.

“We have this fantastic network of specialists who talk to each other,” said Giulio Tononi, a psychiatrist and neuroscientist at the University of Wisconsin. “It focuses our thinking as neuroscientists and guides the thinking of the computer scientists.”

In early 2010, Dr. Modha made a decision that put the project on its current path. While away from the lab for a few weeks, because of a Hawaiian vacation and a bout of flu, he decided to streamline the work of the far-flung researchers. The biologically inspired chip under development would come first, Dr. Modha said. That meant a lot of experimental software already written was scrapped. But, he said, “chip-first as an organizing principle gave us a coherent plan.”

Neuroscience was a guiding principle as well in designing the chips, so-called neuromorphic chips, which bear some structural resemblance to the brain. Brains are low-power, nimble computing mechanisms — real-world proof that such computing is possible.

A brain does its computing with a design drastically different from today’s computers. Its processors — neurons — are, in computing terms, massively distributed; there are billions in a human brain. These neuron processors are surrounded by the brain’s data memory devices — synapses — so that processing and memory sit together, and the brain’s paths of communication, the neurons’ axons that conduct electrical impulses, are extremely efficient and diverse.
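What that co-location of processing and memory looks like can be sketched in a few lines of code. The toy model below is a standard leaky integrate-and-fire neuron, not the project’s actual design, and every name in it is illustrative: the neuron keeps its synaptic weights locally and responds to incoming spike events.

```python
import numpy as np

class SpikingNeuron:
    """Toy leaky integrate-and-fire neuron. Its memory (the synaptic
    weights) lives inside the unit that computes with it, so there is
    no separate memory bank to fetch operands from."""

    def __init__(self, n_inputs, threshold=1.0, leak=0.9):
        rng = np.random.default_rng(0)
        self.weights = rng.uniform(0.0, 0.5, n_inputs)  # local "synapses"
        self.potential = 0.0                            # membrane potential
        self.threshold = threshold
        self.leak = leak                                # decay per time step

    def step(self, spikes):
        """Integrate a 0/1 vector of incoming spikes; fire when the
        leaky running sum crosses the threshold, then reset."""
        self.potential = self.leak * self.potential + self.weights @ spikes
        if self.potential >= self.threshold:
            self.potential = 0.0
            return 1
        return 0

# Feed spike events to the neuron step by step.
neuron = SpikingNeuron(n_inputs=4)
inputs = [np.array([1, 0, 1, 0]), np.array([0, 1, 1, 1]), np.array([1, 1, 1, 1])]
print([neuron.step(s) for s in inputs])   # the neuron's output spike train
```

A chip built this way scales by adding more such units side by side, rather than by making one central processor faster.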

A machine that adopts that approach, Dr. Modha said, would represent “a crucial shift away from von Neumann computing.” He was referring to a design with processor and memory physically separated and connected by a narrow communications channel, or bus, and operating according to step-by-step sequential methods — the von Neumann architecture used in current computers, named after the mathematician John von Neumann.
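For contrast, here is an equally small caricature of the von Neumann pattern just described; it is a made-up toy machine, not any real instruction set. The point is that every instruction and every operand must cross the same narrow channel between the separated memory and processor.

```python
# Toy von Neumann machine: one processor, one separate memory, and a
# single narrow "bus" that every instruction and operand must cross.
memory = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12), 3: ("HALT", None),
          10: 2, 11: 3, 12: 0}   # program at 0-3, data at 10-12

acc, pc, bus_trips = 0, 0, 0     # accumulator, program counter, bus counter
while True:
    op, addr = memory[pc]        # fetch the next instruction over the bus
    bus_trips += 1
    pc += 1
    if op == "HALT":
        break
    elif op == "LOAD":           # fetch an operand over the bus
        acc = memory[addr]
        bus_trips += 1
    elif op == "ADD":
        acc += memory[addr]
        bus_trips += 1
    elif op == "STORE":          # write the result back over the bus
        memory[addr] = acc
        bus_trips += 1

print(acc, bus_trips)            # prints "5 7": even 2 + 3 costs seven bus trips
```

That sequential fetch-and-shuttle loop is exactly what neuromorphic designs try to avoid by keeping memory next to each processing unit.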

The concept of neuromorphic electronic systems is more than two decades old; Carver Mead, a renowned computer scientist, described such devices in 1990 in an article in the Proceedings of the IEEE. Earlier biologically inspired devices, scientists say, were mostly analog, single-purpose sensors that mimicked one function, like an electronic equivalent of a retina for sensing image data.

But the I.B.M. and university researchers are pursuing a more versatile digital technology. “It seems that we can build a computing architecture that is quite general-purpose and could be used for a large class of applications,” said Rajit Manohar, a professor of electrical and computer engineering at Cornell University.

What might such applications be, 5 or 10 years from now, if the technology proves successful? They would be the sorts of tasks that humans find effortless and that computers struggle with — the pattern recognition of seeing and identifying someone, walking down a crowded sidewalk without running into people, learning from experience. Specifically, the scientists say, the applications might include robots that can navigate a battlefield environment and be trained; low-power prosthetic devices that would allow blind people to see; and computerized health-care monitors that watch over people in nursing homes and send alerts to human workers if a resident’s behavior suggests illness.

It is an appealing vision, but there are formidable obstacles. The prototype chip has 256 neuron-like nodes, surrounded by more than 262,000 synaptic memory modules. That is impressive, until one considers that the human brain is estimated to house up to 100 billion neurons. In the Almaden research lab, a computer running the chip has learned to play the primitive video game Pong, correctly moving an on-screen paddle to hit a bouncing cursor. It can also recognize numbers 1 through 10 written by a person on a digital pad — most of the time. But the project still has a long way to go.
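The kind of learning in the Pong demonstration can be suggested, very loosely, with a toy example. The sketch below is not the team’s method; it is a generic one-step reward update (a simple bandit-style rule) that acquires a paddle policy from trial and error rather than from explicit instructions, and all of its names and parameters are invented for illustration.

```python
import random

random.seed(0)
ACTIONS = (-1, 0, 1)   # move the paddle down, stay, move up
# One value per (state, action); the state is just whether the ball is
# below (-1), level with (0), or above (+1) the paddle.
Q = {s: {a: 0.0 for a in ACTIONS} for s in (-1, 0, 1)}

def state(ball_y, paddle_y):
    return (ball_y > paddle_y) - (ball_y < paddle_y)

alpha, epsilon = 0.5, 0.1       # learning rate, exploration rate
for _ in range(2000):           # random one-step trials
    ball_y, paddle_y = random.randint(0, 9), random.randint(0, 9)
    s = state(ball_y, paddle_y)
    a = (random.choice(ACTIONS) if random.random() < epsilon
         else max(Q[s], key=Q[s].get))
    paddle_y = min(9, max(0, paddle_y + a))
    # reward closeness to the ball; 1.0 for lining up with it exactly
    reward = 1.0 if paddle_y == ball_y else -abs(paddle_y - ball_y) / 9
    Q[s][a] += alpha * (reward - Q[s][a])   # nudge toward observed reward

print({s: max(Q[s], key=Q[s].get) for s in (-1, 0, 1)})
# learned policy: ball below -> move down, level -> stay, above -> move up
```

No move-toward-the-ball rule is ever written down; after a couple of thousand trials the table encodes it anyway, which is the flavor of behavior the Darpa guidelines call for.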

It is still questionable whether the scientists can successfully assemble large clusters of neuromorphic chips. And though the intention is for the machines to evolve more from learning than from being programmed, the software that performs that magic for any kind of complex task has yet to be written.

The project’s Pentagon sponsor is encouraged. “I’m surprised that we’re so far along, and I don’t see any fundamental reason why it can’t be done,” said Todd Hylton, the Darpa program manager.

If it succeeds, the project would seem to make peace with the “airplanes don’t flap their wings” critique. “Yes, they are different, but bird wings and plane wings both depend on the same aerodynamic principles to get lift,” said Christopher T. Kello, director of the Cognitive Mechanics Lab at the University of California, Merced. “It’s the same with this project. You can use essential design elements from biology.”