The next big bang: Man meets machine
In science-fiction fantasies, the melding of organic matter and digital technology usually takes human form, from Steve Austin's six-million-dollar bionics to the replicants running amok in "Blade Runner" to the Terminator.
Yet research on multiple fronts in digital technology, biotechnology and nanotechnology may, over the next half century, alter the way we think about computers and information, and our relationship to them. With these changes, bionic body parts won't seem so far-fetched as we increasingly develop ways to integrate high-tech materials into our mortal flesh.
And the reverse is true as well. Researchers are now looking to biological materials such as bacteria, viruses, proteins and DNA to replace mechanical parts in computers. And as the age of genetic engineering matures, scientists are already borrowing techniques from software developers to build libraries of genetic information.
All of these overlapping strands of scientific inquiry are known colloquially as "BANG," which stands for bits, atoms, neurons and genes. "All these things are converging because biology, nanotech and organic chemistry are running together," says Mark Bunger, an analyst with Lux Research. "The boundaries are really getting sketchy."
Some of the advances are in the earliest phases of research and won't produce actual products for years, if at all. But some of these concepts have quietly been with us for years. Sixty thousand people worldwide, for example, have cochlear implants, surgically implanted devices that do electronically what the ear can no longer do naturally--transform vibrations into signals the brain interprets as sound. Prosthetic limbs are increasing in sophistication. And now, tech applications are making their way into other parts of the human body.
Mind control
One of the best examples of this new world, where man meets machine and biology and digital technology come together with stunning results, is an unassuming young man from the suburbs south of Boston.
Matthew Nagle was a normal American guy who played football in high school and loved his local teams. A few years after graduation, he was looking into a job with the U.S. Postal Service--until a July night in 2001 when he was knifed in the neck during a fight at the beach. The blow severed his spinal cord and left him paralyzed from the neck down.
Young, optimistic and otherwise healthy, Nagle at age 24 volunteered to be a human guinea pig--the first recipient of an implant developed at Brown University. Nagle spent a year connected to the BrainGate system, with a chip the size of a lentil resting on a part of his brain that controls motor functions. The chip, 16 square millimeters with 100 gold spikes on it, was sensitive enough to pick up Matt's brain activity when he thought about movement.
The chip was connected to a cable that emerged from the top of Matt's skull and ran into a contraption that resembled devices from "The Matrix" movies. In those films, Keanu Reeves is hooked up to a computer from a box in the back of his neck, which downloads skills into him. ("Whoa," he says upon waking. "I know kung fu.") Nagle's connection went the other way; the implant uploaded brain signals into a software program that, with some tweaking, learned to interpret what they meant.
Here's how it works: When the patient's neurons fire, the electrodes pick up the electrical activity; each firing produces a brief electrical pulse known as a "spike." The software reads the pattern of these spikes as "movement intention."
Elizabeth Razee, a spokeswoman for Cyberkinetics Neurotechnology Systems, which ran the BrainGate trial, describes the process. "When you want to move your arm up and to the left, for example, the neurons on your motor cortex actually fire in a specific sequence. The computer software reads that intention and translates it into cursor action on the screen 'up and to the left.'"
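In software terms, the translation Razee describes amounts to a decoder: a mapping, learned during a calibration session, from each electrode's firing rate to an intended cursor velocity. The sketch below shows the shape of that idea in Python. It is not Cyberkinetics' actual algorithm; the linear model, the calibration data and every number are assumptions made purely for illustration.

```python
import numpy as np

# A minimal sketch of the decoding idea, not Cyberkinetics' algorithm:
# a linear map from per-electrode firing rates to 2-D cursor velocity,
# fit during a hypothetical calibration session.

N_CHANNELS = 100  # electrodes on the implanted array

rng = np.random.default_rng(0)

# Made-up calibration data: spike counts per time bin on each channel,
# recorded while the patient imagines moving the cursor toward known targets.
rates = rng.poisson(lam=10, size=(500, N_CHANNELS)).astype(float)
true_tuning = rng.normal(size=(N_CHANNELS, 2))  # stands in for real neural tuning
velocities = rates @ true_tuning                # the "intended" movements

# Fit the decoder by least squares: one weight per channel per axis.
weights, *_ = np.linalg.lstsq(rates, velocities, rcond=None)

def decode(spike_counts):
    """Translate one time bin of spike counts into a cursor step (dx, dy)."""
    return spike_counts @ weights

print("cursor step:", decode(rng.poisson(lam=10, size=N_CHANNELS)))
```

In a real system the calibration loop runs continuously, with a technician tweaking the model as the patient practices; the least-squares fit here is only the simplest stand-in for that process.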
Nagle quickly learned how to control an on-screen cursor and other visual interfaces, such as a "Pong" paddle, with his mind. The footage is surreal. Nagle sits immobile in his wheelchair, speaking with the aid of a ventilator and playing "Pong" or "Tetris" or changing channels on a TV.
Nagle's year with the BrainGate ended last fall, and the implant has now been removed, but Cyberkinetics provided archive video and interviews with Nagle. "It's kind of a trip to think that my brain signals were controlling a mouse," he says. "Who knows, in two or three years, they might put it back in. I'd do it all over again. It did a lot of good."
Lou Gehrig's disease
Cyberkinetics now has another spinal cord patient using BrainGate, but unlike Nagle, the new patient has chosen to remain anonymous. The company says the next step is to test the system with patients suffering from amyotrophic lateral sclerosis, or ALS, also known as Lou Gehrig's disease, named after the New York Yankee who retired in 1939 after his diagnosis and died two years later.
ALS patients slowly become "locked" into their own bodies. Their cognition remains intact, but their muscle and motor functions are cruelly stripped away, including the ability to communicate with the outside world; often only hearing and vision are spared. Many die because they can no longer breathe.
Researchers in Boston are recruiting patients for the BrainGate ALS trials, but with the leap in complexity from spinal cord injury, or SCI, to ALS, success is far from assured. "ALS patients often come to me and say, 'I've learned about (BrainGate), why aren't we doing this?'" says the ALS Association's science director, Lucie Bruijn. "You have to appreciate that with SCI, there's an injury in one area, (but) then there's not much progression. ALS is diffuse. It affects motor neurons throughout the body, and it's progressive."
One ALS specialist who advised on the design of the upcoming BrainGate trial says applying the technology to fight ALS is much more of a leap into the unknown. Primate studies that may give guidance aren't possible with ALS, says Dr. Merit Cudkowicz of Partners HealthCare System. "They can model spinal cord injury in monkeys, but no one will develop primate models for ALS. It's such a horrible disease. There's no shortcut to going straight to people."
Hundreds of researchers around the world are working on various aspects of this brain-computer interface, including noninvasive systems such as caps full of electrodes that pick up brain activity through the skull. Prominent participating institutions include Duke University's Nicolelis Lab, the state of New York's Wadsworth Center in Albany and the Cleveland Clinic. In Europe, the Graz University of Technology in Austria has a brain-computer interface lab. In Japan, where ALS patients are living longer and progressing more deeply into the "locked in" phase, corporations such as Hitachi have joined forces with university researchers.
Biocomputing
Less miraculous than letting paralyzed people control machines with their minds, but just as far-reaching, is the future of computers themselves. Various research disciplines, each in itself a vast and complex area of knowledge, are looking ahead to a day when we reach the physical limits of current computers and their components: silicon chips, metal batteries, cathode-ray monitors.
Some of these limitations come from the materials themselves. Silicon and other semiconductors begin to lose key properties, such as the ability to dissipate heat, as components shrink. But other constraints are a function of the interface between humans and computers. Anyone who has suffered from carpal tunnel syndrome or dry, aching eyes after too long at a monitor knows there's room for improvement on the interface front.
To delve deeply into the biological inroads researchers are making into each layer of the computing "stack" would fill textbooks. But to provide an overview of advances in each layer, we'll follow the example of analyst Mark Bunger, who co-authored a report last year for Forrester Research called "Biochemical Computing."
First, what could replace the semiconductor? Several labs are working on the inherent computational power of our natural world. The basic building blocks of life--DNA, enzymes, proteins--process instructions to carry out incredibly complex biological tasks. With our nascent ability to manipulate these molecular structures, could we harness them to carry out computations of our own design?
"Like the carefully orchestrated molecular processes that occur within living cells, biomolecular computation can in principle occur autonomously, without the need for any external intervention during the computation," writes Erik Winfree, a professor at the California Institute of Technology in Pasadena, Calif. "Being able to design and understand such systems is our ultimate goal."
In addition to Winfree, work by Drew Endy at the Massachusetts Institute of Technology and others has led to an open-source biotech project called BioBricks. The idea: to build a library of biological components that can be used to create synthetic organisms.
For-profit companies are starting to tap into this idea, too. Craig Venter, the scientist who raced the U.S. government to crack the human genome, has a new company that aims to re-create basic genetic components from bacteria and other sources. It's akin to the way software programmers have access to sophisticated libraries of code and tools when they build applications for a specific operating system.
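The software analogy can be made literal. The toy registry below shows the shape of the idea: standardized parts with documented functions, composed into a larger construct the way programmers assemble applications from library code. The part names and sequences are placeholders invented for this sketch, not real BioBricks entries.

```python
# A toy "parts registry": every sequence here is a made-up placeholder,
# but the structure mirrors the idea of a library of composable genetic
# components with known, documented functions.

PARTS = {
    "promoter":   "TTGACAGCTAGCTCAGT",   # drives transcription of what follows
    "rbs":        "AGGAGGACAGCT",        # ribosome binding site
    "reporter":   "ATGCGTAAAGGAGAATAA",  # coding sequence for a visible marker
    "terminator": "CCAGGCATCAAATAAAAC",  # stops transcription
}

def assemble(*part_names: str) -> str:
    """Compose named parts, in order, into one synthetic construct."""
    return "".join(PARTS[name] for name in part_names)

# A classic "device": promoter + ribosome binding site + reporter + terminator.
reporter_device = assemble("promoter", "rbs", "reporter", "terminator")
print(len(reporter_device), "bases:", reporter_device)
```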
Memory and storage
As recent headlines about Google and the National Security Agency underscore, the need to store and sort data for all kinds of purposes is growing at a 40 percent annual compound rate, according to Forrester. As cameras become ever more ubiquitous--built into phones, monitoring street corners or orbiting the globe--a flood of still and video images will join the data mix.
At some point, the magnetic storage media of disk and tape will be tapped out. Some of the most far-out bioinformatic research is taking place in the field of DNA storage. DNA, of course, is the ultimate storage device. Each cell in your body has a complete copy, which stores 3 billion base pairs.
Instead of strings of zeroes and ones, DNA stores information in strands of four bases: adenine, cytosine, thymine and guanine. Each base can encode two bits, so 3 billion base pairs work out to roughly 6 gigabits, or about 750MB, of storage per genome copy. And people have a hundred trillion cells in their bodies, which makes living things the world's most redundant storage devices.
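The arithmetic is easy to check in code. The two-bits-per-base mapping below is the textbook scheme, not any particular lab's method, and the sample payload is arbitrary.

```python
# Two bits per base: the four bases map onto the four 2-bit values. The
# particular assignment below is arbitrary; any one-to-one mapping works.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"home video")
assert decode(strand) == b"home video"
print(strand)  # four bases per byte

# Sanity check on the genome numbers: 3 billion bases at 2 bits each.
print(3_000_000_000 * 2 / 8 / 1e6, "MB per genome copy")  # 750.0
```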
DNA is also chemically stable, so unlike a hard drive, bits of it can stick around for decades. Just ask a forensic scientist investigating a long-cold crime scene.
Storing our home videos in DNA, however, will take quite a bit of genetic engineering, so don't hold your breath. But at least two laboratories are working on the problem: the biocomputation project at the U.S. Defense Advanced Research Projects Agency, or DARPA, the same folks who first cooked up the Internet; and the Department of Energy's Pacific Northwest National Laboratory.
Robo-grunts
Today the frontier of the brain-computer interface is being pushed as a remedy for paralysis, but the military is also interested in the technology for able-bodied soldiers, who could use it to control machines remotely.
The Air Force, for example, has long been interested in what it calls "alternative control technology" to allow its pilots to fly planes hands-free. DARPA is running or funding several projects on that theme, including work at Duke's Nicolelis Lab similar to Cyberkinetics' BrainGate, as well as efforts to develop exoskeletons that enhance battlefield performance.
Whether drastic procedures such as invasive brain implants ever reach beyond the military into the mass market is anyone's guess. But don't underestimate the determination of otherwise healthy people to augment their bodies in all manner of once-unbelievable ways. Indeed, with the ubiquity of personal devices on the streets these days, it's surprising no one's tried to have his cell phone or iPod directly implanted under the skin. That would do away once and for all with fumbling about in your bag or the fear of leaving those devices behind.
Implants or not, the way we interact with computers is in dire need of a rethink, as the digital elite might say. Our keyboards and mice make our hands hurt, our monitors give us headaches and double vision, our desk chairs reinforce our bad posture. On the whole, the organic constituents of our bodies and the inert materials of our computers remain more adversarial than complementary. It's too soon to say when this will change, but we can be sure that change it will.
© 2006 The Deal.com. All rights reserved.