In his groundbreaking cyberpunk novel, "Neuromancer," William Gibson writes of a dystopian society whose citizens are irrevocably enhanced and inevitably corrupted by technology. Technology surrounds them. Technology permeates everything they do. Technology is now part of their bodies:
This was it. This was what he was, who he was, his being. He forgot to eat. ... Sometimes he resented having to leave the deck to use the chemical toilet they'd set up in a corner of the loft. … Its rainbow pixel maze was the first thing he saw when he woke. He'd go straight to the deck, not bothering to dress, and jack in. He was cutting it. He was working. He lost track of days.
… His gaze swept past her, to the rack of blank monitors. He seemed to shiver. … "We cause the brain to become allergic to certain of its own neurotransmitters, resulting in a peculiarly pliable imitation of autism." His head swayed sideways, recovered. "I understand that the effect is now more easily obtained with an embedded microchip. ..."
There was a time when the idea of microchips embedded in a person's brain was the stuff of fantasy and science-fiction novels. Three summers ago, "Business Week" ran a special double issue on the future of technology in the consumer and business worlds. In that issue, Louise Lee wrote of the many ways technology can be, is being, or could be used to enhance the human body.
Among the examples cited was the use of pacemakers by researchers at the Cleveland Clinic to treat obsessive-compulsive disorder by sending electrical impulses to the patients' brains. At Brown University, according to Lee, a few paralyzed patients have had chips implanted in their brains that relay neurological signals to computers (which, in turn, carry out certain tasks like moving cursors). At the University of Southern California, researchers are learning how to use computer chips to take over the functions of damaged tissue within the brain, as a means of circumventing brain damage – or even, as in Gibson's "Neuromancer," enhancing a healthy brain's memory capacity.
Many readers may have heard of the cochlear implant, a device that processes sound in order to impart signals to a hearing-impaired person. (A famous conservative talk-radio host uses one, in fact.) The device sends its signals through the auditory nerve to the brain. It is, therefore, a form of brain-computer interface – a "neuroprosthesis" that works with the human nervous system to restore functions of the body lost to disease or injury.
Research on brain-computer interfaces, including animal tests and experiments involving human beings, continues to push the envelope of what can be accomplished with neuroprosthetics. In April, the New York Times reported on a new prosthetic device that replaces lost fingers. Made by a Scottish company, each motorized replacement finger is controlled by computer chips in the prosthesis itself. The action of the fingers is triggered by moving the muscles in the wearer's palm. While not a direct link to the brain, it is at least an indirect cue from the wearer's nervous system that tells the prosthesis what to do.
Implanting chips in the brains of living beings is no longer speculation. As a reality, it is now being discussed in terms of its ethical implications. "Worldwide," wrote Ellen M. McGee and G. Q. Maguire, Jr., "there are at least 3 million people living with artificial implants. In particular, research on the cochlear implant and retinal vision have furthered the development of interfaces between neural tissues and silicon substrate micro probes. The cochlear implant, which directly stimulates the auditory nerve, enables over 10,000 totally deaf people to hear sound; the retinal implantable chip for prosthetic vision may restore vision to the blind." The authors go on to raise fascinating questions about the line between fixing what's broken and enhancing what's already working – asking when and whether it is appropriate to "play God" with devices wired directly to the human brain.
San Francisco-based company Emotiv, meanwhile, has developed a headset that reads your brainwaves to allow you to control hardware with the software in your own gray matter. While still relatively primitive, the controller demonstrates how equipment that looks for pattern matches in your brain activity can be used to trigger certain events in the device's software (and connected hardware). We previously discussed the dangers of pattern-recognition software in "Government machines that can read your mind," in which we showed you how the TSA uses elaborate scanning devices to match what you might be thinking to ill intent. The technology we're discussing isn't fiction. It's coming eventually, and it's coming for good or for ill.
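The principle behind such a controller can be illustrated in miniature. The following Python sketch is purely hypothetical – it is not Emotiv's actual software, and the "brainwave" numbers are invented – but it shows the general idea: slide a stored template over an incoming stream of signal samples and fire an event wherever the window correlates strongly with the template.

```python
# Hypothetical sketch of template-based pattern matching on a signal stream.
# This is NOT Emotiv's real API; the template, data, and threshold are
# invented for illustration only.

def correlation(a, b):
    """Normalized (Pearson) correlation between two equal-length windows."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    den_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    if den_a == 0 or den_b == 0:
        return 0.0  # a flat window carries no pattern to match
    return num / (den_a * den_b)

def detect_events(stream, template, threshold=0.9):
    """Slide the template over the stream; return indices of strong matches."""
    w = len(template)
    hits = []
    for i in range(len(stream) - w + 1):
        if correlation(stream[i:i + w], template) >= threshold:
            hits.append(i)  # in a real device, this would trigger an action
    return hits

# An invented trace: flat baseline with one spike shaped like the template,
# which stands in for a trained mental command.
template = [0.0, 1.0, 2.0, 1.0, 0.0]
stream = [0.0] * 10 + template + [0.0] * 10
print(detect_events(stream, template))  # prints [10]
```

Real headsets work with far noisier, multi-channel data and trained classifiers rather than a single template, but the basic contract is the same: recognized pattern in, software event out.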
How technology is used, or misused, isn't even the worst concern we should have. In the very first edition of Technocracy, we discussed the dangers of advances in RFID technology, particularly implantable radio frequency ID chips that can be put into people and animals. Now a researcher in the U.K. has deliberately infected his implanted chip with a computer virus to see what would happen. "Our research," Mark Gasson wrote, "shows that implantable technology has developed to the point where implants are capable of communicating, storing and manipulating data." A device that can do that, Gasson points out, can be infected with a virus, and therefore affected negatively ... and maliciously.
We, as a society, crawl, walk and even run inexorably toward William Gibson's technologically saturated future. When we begin to accept technology into our bodies, we begin to absorb potential for great harm. As unreliable as any machine can be, dare we introduce this danger for any but the most necessary of reasons? What is the line between restoring one's lost functions and augmenting one's senses to a superhuman degree? At what point do we accept the possibility that an implanted device could experience a failure engineered by another person to hurt us?
Technology, while morally neutral, continues to raise moral questions. In this case, there are no easy answers.