The Electric Mind

One woman’s battle against paralysis at the frontiers of science.

By Jessica Benko

The Atavist Magazine, No. 15


Jessica Benko is a freelance journalist focusing on stories about science, medicine, technology, and the environment. Formerly a producer for WNYC’s Radiolab and science editor for WNYC’s Studio 360, her written work has appeared in National Geographic and The Virginia Quarterly Review. She lives in Brooklyn, New York.

Editor: Evan Ratliff
Producers: Olivia Koski and Gray Beltran
Copy Editor: Sean Cooper
Fact Checker: Spencer Woodman
Illustrator: Damien Scogin, dls4.com
Audiobook Voice Artist: Liz Stephens

Published in May 2012. Design updated in 2021.

Chapter One

“The primary aim, object, and purpose of consciousness is control.”

—Conwy Lloyd Morgan, ‘An Introduction to Comparative Psychology’

The first thing Cathy Hutchinson became aware of upon waking from three weeks in the quiet of a coma was the rhythmic alternation of surge then draw: whoosh, hiss, whoosh, hiss. As the contours of a room began to resolve before her eyes, she discovered the source of the sounds—a ventilator machine beside her bed. Her eyes followed the curve of a plastic tube issuing from the noisy box until it disappeared under her chin, entering her body through the opening in her throat left by a tracheotomy. When she tried to raise her head, she discovered that she could not. No amount of effort allowed her to lift her hand or flex her feet.

Her last memories were of feeling sick, of passing out as her 18-year-old son, Brian, helped her up the stairs to her bedroom, of waking briefly on the rough carpet of the hallway, unable to move. She was 43, a healthy nonsmoker, single mother of two, post office employee. She and Brian had taken a break from planting their annual vegetable garden to check the score of a basketball game when she began to hear a loud buzzing in her ears and was overcome by a wave of nausea.            

On that spring day in 1996, it took doctors nearly 12 hours following Brian’s emergency call to discover that Cathy had suffered a catastrophic brain-stem stroke. The brain stem is located at the base of the skull, a small region of primitive structures crucial to survival. It governs the critical functions of breathing, swallowing, blood-pressure regulation, and consciousness and conducts all messages between the brain and the spinal cord.

A brain-stem stroke is the sort of medical event that can result in death immediately or soon thereafter. But in Cathy, who was young and in otherwise good health, the stroke disconnected her brain from the descending motor tracts of her brain stem—the neural pathways carrying instructions to her muscles—leaving her “locked in,” not only quadriplegic but also unable to speak. The ascending tracts, which carry sensory information from body to brain, remained intact, allowing her the experience of pain, itch, heat, and cold but not the possibility of addressing them. She had a sensate, lucid mind incapable of action.

The best-known locked-in person is Jean-Dominique Bauby, the former editor of French Elle magazine who, like Cathy, had a brain-stem stroke at the age of 43. He wrote a book about the experience, The Diving Bell and the Butterfly, by communicating with an assistant by blinking his left eye. “But my communication system disqualifies repartee,” he wrote. “The keenest rapier grows dull and falls flat when it takes several minutes to thrust it home. By the time you strike, even you no longer understand what had seemed so witty before you started to dictate it, letter by letter.”

Unlike Cathy, Bauby endured his condition for just 18 months, eventually succumbing to pneumonia. When I first met Cathy, she had been unable to move or speak for 14 years. She was a participant in a promising medical study I was researching, involving experimental trials that tested the limits of science’s ability to tap into the brain of someone in her condition and read meaningful signals. In strict accordance with privacy protocols, the scientists identified her only as S3, but when I investigated their work further I discovered that Cathy had been featured in a television segment about the early years of the study. I was, I admit, intrigued by the extreme nature of her disability, and I wanted to know more about the research from her point of view. She was a scientific pioneer, it seemed to me. The question was, how did she view herself and the experiment? I scoured the Internet until I found contact information for someone I thought might be her son. It turned out to be Brian, who relayed my request for an interview to his mother. Once she agreed, and after I had been vetted by the directors of the research study, I set up our first meeting.

Bauby’s sharp observations of the dark ironies of his condition left me uncertain about what to expect from her. It’s not hard to imagine a poisonous strain of bitterness growing over a decade and a half of insurmountable helplessness and inexpressible opinions—or, more precisely, it’s not hard to imagine that happening to me if I were in her place. (“In the past,” Bauby wrote, “it was known as a ‘massive stroke,’ and you simply died. But improved resuscitation techniques have now prolonged and refined the agony.”) Yet, survival for so many years with her condition seemed incompatible with that sort of resentment, and I half wondered if I might meet a paragon of nonattachment, preternaturally skilled in quieting the echo chamber of her skull.

Chapter Two

“Against the assault of laughter, nothing can stand.”

—Mark Twain

I first met Cathy on a warm August day at her nursing home in Dorchester, Massachusetts. She was outside in a courtyard with other residents, in the shade of a large tree. At 57, she had skin that was still beautifully smooth, free of the lines and discolorations of age. Her nails were neatly manicured, her hands drawn into her lap. Her forearms were in braces to combat the contraction of her wrists and fingers into the sharp angles that result from neurological damage and disuse. A seat belt secured her to a motorized wheelchair.

What recovery her body was able to make had happened early on after the stroke. She could control her eyes, she could swallow and breathe on her own, and she could move her head slightly, which allowed her to operate the wheelchair with a button on the headrest behind her. She smiled at me, which I hadn’t realized she could do, and flicked her eyes upward in agreement when I said I was pleased to meet her.

Cathy’s daughter, Holly, had come to help with our meeting. She visited her mom often, driving from her home just over the Rhode Island state line. She had shoulder-length chestnut hair framing a round face, a warm smile, and eyes just like her mother’s. She was proud of her mother’s resilience and happy to help her tell her story. We made small talk as Cathy led us down the long hallways to her shared bedroom, where her roommate was watching a biography of Doris Day on television. Holly gathered up her mother’s link to the outside world: a computer equipped with a head-tracking device. We navigated out past the nurses’ desks and open common areas to a long, empty family room.

We had planned for Cathy to use her computer to communicate with me. The machine itself, a system modified by engineers at UMass Dartmouth’s Center for Rehabilitation Engineering to allow Cathy to use email and do some basic Web browsing, sat on a rolling computer stand that could be positioned in front of her. A camera tracked a small white target on the bridge of her eyeglasses, and when her gaze rested in the same place for a few moments, the software would perform a mouse click. In this way, she could slowly pick out letters on a keyboard. I wanted to be in her line of sight, as in a normal conversation, but with the computer placed deliberately in front of her face I couldn’t sit across from her and still see her. I wound up standing somewhat awkwardly beyond the screen, so we could at least see each other’s faces with ease. 

Head tracking requires total focus by the operator; stray movements can easily cause the software to misfire. Parked in the middle of the family room, Cathy struggled to control the program, and minutes elapsed as she attempted to compose an answer to my first question. Holly approached to readjust the placement of the computer. “What’s happening?” the synthesized female voice of the computer interjected. “Hi! I am Fred.”

After 10 minutes of effort that yielded only 17 words, we gave up on the computer, instead using the more reliable method of a transparent acrylic alphabet board, which was stored inside a polka-dot bag hanging from the back of the wheelchair. Tracking eye movement through the alphabet board takes practice, so Holly acted as translator. Holding up the board between herself and her mother, Holly shifted it until she thought her eyes met Cathy’s on the flip side of a letter, which she then named aloud. If she was correct, Cathy looked up to the right for yes. If not, she looked down to the left for no. It was easy to get tangled up in the recitation of letters and lose sight of the words she was trying to form. Holly called out letters in succession until, like the predictive-text function of a cell phone, she thought she could complete the intended word. The pace hinted at the dissonance between the speed of Cathy’s thoughts and the speed with which she could communicate them.

Over email, which allowed Cathy to take more time with her communication, and in two visits together with Holly, we pieced together the progression of years since her stroke that afternoon in May 1996. Like Brian, Holly had been a teenager, age 17, at the time of Cathy’s stroke. She remembers her mom as playful and active, an avid gardener and an enthusiastic cook. “My mom was always goofing around and singing and dancing,” she told me. “That’s one of my cherished memories, dancing in the kitchen.”

The weeks of Cathy’s coma are a blur for her kids. They lived with extended-family members while Cathy’s sister took over guardianship of their mother. “We relied on social workers, because we didn’t know how to navigate anything, it was so foreign to us,” Holly said.

Cathy’s sister was the first person to recognize that Cathy had emerged from her coma and had voluntary control over her eye movements. She alerted a junior resident at Massachusetts General Hospital, Dr. Aneesh Singhal, and he and Cathy began to communicate, at first using the binary code of yes and no. But the discovery that she was conscious and mentally alert didn’t mean an optimistic prognosis. “They never sugarcoated my condition,” Cathy told me, “nor did the doctors offer any hope for recovering. I appreciated their honesty, but I would not accept paralysis as a permanent way of life.” She had raised her children as a single parent, and for the first time she felt helpless.

She was firm in her resolve to battle her way back from her stroke. Though she was, in the neurological lingo, awake and alert, she struggled at first to control her emotions and found herself laughing hysterically for no identifiable reason. Over time she regained her emotional equilibrium and accepted that although she disliked being dependent on others for even the simplest tasks, she had no choice but to consent to the assistance. Her family and her children drew closer together, depending on one another in ways they never anticipated.

At the nursing home, Cathy’s electric wheelchair gave her a cherished degree of independence, allowing her to move around without assistance. Arrow indicators on a panel in front of her cycled through the cardinal points, and when the desired arrow lit up she bumped the button with her head to cause the chair to move. I laughed when I noticed a speedometer on the control panel. Holly told me that when her mom had gotten the new chair, she’d said, “‘Let’s go out in the parking lot and see how fast she can go.’ It goes up to five miles an hour, and when you’re in five-miles-an-hour mode it spins just as fast, too. She jumped out of her skin!” Cathy laughed, a short burst of air that vibrated across vocal cords she can’t voluntarily control.

The wheelchair enabled her to keep up with her three grandkids, A…N… A…N…G…E…L… A…N…D… T…W…O… T…E…R…R…O…R…S, she joked of her granddaughter, 9 at the time, and her 2- and 3-year-old grandsons. The older boy, Holly said, liked to sit on Cathy’s lap in her seatbelt and drive around with her in her wheelchair, yelling, “Faster, Nana, faster!”


In our early conversations, Cathy was careful to maintain an air of hopefulness and determination in her answers to my questions. There could be no spontaneity in our exchanges, and she betrayed few of the harder emotions that must have taken hold of her at times. Her responses followed well-trodden paths through the narrative of overcoming extreme hardship: She had a greater appreciation for life. She realized that she’d taken her life for granted. She was happy to have a second chance. She learned not to allow her condition to stop her life. They were stock sentiments but sincere. And they were likely the anchors that kept her mind from drifting into darker waters. Her lack of self-pity seemed genuine. She was never concerned with the why of her stroke, only with the how: how it had happened and how she could overcome the worst of it.

But her dreams of greater autonomy were close to the surface, even as they remained maddeningly out of reach. When I asked what her days consisted of in the nursing home, she replied, F…R…U…S…T…R…A…T…I…O…N. Her options were limited. She read books, she said, something she had never had time for before. Her favorites were biographies—she particularly enjoyed one about Rose Kennedy, a fellow resilient Massachusetts mother whom she admired—but the selection of books programmed to work with her computer was small. She could watch TV to the point of saturation if she liked, but Cathy could only stand to watch the news.

Tony Judt, a historian who died in 2010 from ALS, a progressive neurological disorder that leaves a patient locked in before his eventual death, dictated an essay, entitled “Night,” after he had become essentially quadriplegic but before he had lost the ability to speak. Judt described the psychological pain of thwarted desires, of the inability to stretch or scratch or adjust in response to discomfort. Hardest to bear was the nighttime, when, for seven unattended hours, “there I lie: trussed, myopic, and motionless like a modern-day mummy, alone in my corporeal prison, accompanied for the rest of the night only by my thoughts.”

At one point, I asked Cathy if she had come up with a system that kept her comfortable when she was alone in her room at night. She laughed again and spelled out A…M…B…I…E…N.

“Loss is loss,” wrote Judt, “and nothing is gained by calling it by a nicer name. My nights are intriguing, but I could do without them.” 

Chapter Three

“Mind knows the world and operates on the world by means of its body. It is hard to escape the conclusion that bodies existed before minds and minds only exist because there are bodies fit for them.”

—A. D. Ritchie, ‘The Natural History of Mind’

Humans have long sought to stave off death by replacing what they could not heal. Many diseases and injuries of the body do not impact the life of the mind. The scientific endeavor to overcome the frailty of the body to preserve the life of a sound mind has been closely linked with experiments to determine which parts of our bodies we could live without, leading toward the conclusion that the brain, the seat of self and consciousness, needs little more than circulating blood, oxygenated and nutrient rich, to survive. Whether that blood is supplied by the biological body the brain was born with or by some other means may not be of critical importance.

Robert J. White, a devout Catholic and one of the country’s leading transplant researchers, was a neurosurgeon at Case Western Reserve University. He spent his career seeking a way to allow humans to replace diseased parts. He believed it could be possible to use a healthy body from an otherwise brain-dead patient to replace the unhealthy body of a cognitively intact human with heart disease, diabetes, or any other disease or disorder. First he had to make certain that the only role of the human body was cycling blood containing oxygen and nutrients to the brain through the arteries and veins that connected it to the heart and lungs. In the early 1960s, he immobilized a living monkey and methodically removed the face, eyes, tongue, scalp—in fact, every scrap of tissue—from the head and neck of the pitiable creature, leaving only the major arteries and veins to circulate its blood. He then removed the skull as well, allowing the still active brain to rest, alive and intact, for several hours before disconnecting it from the heart, ending the life of the monkey.

The success of this experiment set the stage for the procedure that would bring White infamy. In 1970, he and his team transplanted the head of one rhesus monkey onto the body of another, carefully splicing the circulatory system of the donor body onto the recipient head. The new, hybrid monkey was supplied with blood and oxygen through the body of the donor and regained consciousness after several hours, tracking the researchers with its eyes, chewing food it was given, and snapping its teeth savagely if they came too near. It lived for a day and a half, long enough for the procedure to be considered a success, but its body remained quadriplegic because, then as now, there was no known way to attach a brain to a spinal cord. Regardless of the health of the new body, it could sense the intentions of the monkey’s brain no better than a cold hunk of metal could.

White’s reports of this and several subsequent head-transplant experiments were widely met with outrage. The experiments were cruel—not an unusual feature of animal experimentation—but they were also useless to medical science. There was no chance White would ever be allowed to try such an experiment with humans (though he believed that would not be the case in some other countries), it would be prohibitively expensive and dangerous even if it could be done, and still the patient would have neither movement nor speech.

In the decades since, researchers have continued to struggle with the problem of finding a way to reroute the signals from an active brain to an otherwise functioning body. Without a functioning spinal cord, even the healthiest brain has no agency. Until recently, doctors held out little hope for patients like Cathy, but seven years ago she enrolled in a pilot study that could radically change the prospects for people trapped inside damaged or diseased bodies.

A friend of Cathy’s, a nurse, had come across a call for participants for a project called BrainGate, run out of Brown University. The researchers were seeking patients with quadriplegia for a pioneering experiment in which an electrode-studded implant would be embedded directly into the brain, with the hope of identifying and decoding the neurological activity that governed physical movement. The short-term goal was to use signals from the brain to control computers and then assistive devices; the long-term goal was to bypass damaged sections of the spinal cord and restore movement. The study’s codirector, a conscientious young neuroscientist named Leigh Hochberg, was blunt with Cathy: Whatever the failures or successes of the study, she could not hope that the results would assist her in her lifetime. “There are no expected benefits this early on in the research,” Hochberg told me. “What we’re doing, and what Cathy knew when we were starting and what she enthusiastically joined, is an endeavor to test and develop a device we hope will help other people with paralysis in the future.”

Cathy was the third patient chosen to participate. The first was a young man named Matthew Nagle, a former high school football star who had been injured defending a friend in a brawl. His spinal cord was severed at the C4 vertebra, leaving him paralyzed from the shoulders down but still able to speak. Competitive and determined, he threw himself into the research, receiving his implant in 2004 and helping the BrainGate team build a foundation of data for decoding the instructions flashing through his motor cortex. Because he was only in his twenties, Matthew held out hope that BrainGate could restore him to independence in his lifetime.

The health complications of his spinal injury proved too grave for medical science to overcome, however, and Matthew died in 2007. The second participant had his implant removed after a year, due to repeated failure of the hardware outside his skull to record the signals from the electrodes. Cathy’s device, the BrainGate Neural Interface System, was implanted in 2005. For six years, she worked with researchers one or two times a week to allow them to read her intentions from inside her brain, in an attempt to release them from her unresponsive body.

Chapter Four

“The ancestor of every action is a thought.”

—Ralph Waldo Emerson

John Donoghue runs the Brown University lab that developed the BrainGate system. He has spent his career in neuroscience focused on unraveling how the brain turns thought into action—essentially, what happens in our neurons that results in movement. After all, Donoghue points out, you can’t do anything without movement. His grand, wood-paneled office in the old Victorian home in Providence that houses the Department of Neuroscience is composed and orderly, rather like the man himself. On a cold winter day, the radiators banged loudly as we sat at a long wooden table next to an enormous flatscreen monitor hanging on the wall. The only personal touches were several large tile coasters painted with monkeys and a basket on his desk with a monkey eating a banana sitting on the edge, nods to the importance of nonhuman primates in his team’s work.

The human brain contains around 100 billion nerve cells, or neurons, which communicate among themselves using electrical and chemical signals. A neuron can fire a message to others by releasing a barrage of chemical particles, triggering a spike in the electrical charge of a nearby cell and starting a chain reaction along the branching pathways of the brain. Scientists have known about the existence of electrical activity in the brain only since the late 1800s, and it has proved to be a difficult phenomenon to study, given the significant obstacles encountered in any attempt to poke around in the brain tissue of a living animal to take measurements.

The first major breakthrough came in 1924. Years earlier, a young Bavarian soldier named Hans Berger fell off his horse while on duty with his regiment. Miles away, at the same time, his sister felt a strong premonition that he had met with an accident and insisted that their father send a telegram to inquire after Hans. Berger would later write, “This was a case of spontaneous telepathy in which at a time of mortal danger, and as I contemplated certain death, I transmitted my thoughts, while my sister, who was particularly close to me, acted as the receiver.”

The incident inspired Berger to pursue a career in medicine and psychiatric research, with the hope of detecting the psychic waves he believed had been the medium of telepathic communication with his sister. Presented with a patient who had gaps in his skull following removal of a brain tumor, Berger took the opportunity to measure the electrical current from the brain by placing two electrodes under the patient’s scalp. He eventually succeeded in producing recordings from electrodes placed on the outside of the scalp, as well, using his son and himself as test subjects. He called the recording technique Hirnspiegel—brain mirror—which we now know as electroencephalography, or EEG.

EEG reads what is called field potential, a kind of aura of the ebb and flow of the chemical and electrical activity of neurons. It is a noninvasive detection of brain activity that renders a sort of smeared version of what’s going on inside our heads. And it remains the foundation of research into electrical activity in the human brain. In the past decade, German neurobiologist Niels Birbaumer was able to use EEG to decode intended speech in locked-in patients, but only after months of training and at the painful pace of about a minute a word.

The investigators of BrainGate wanted to delve closer to the source, intercepting the flickering electrochemical signals of a human’s thoughts as they coalesce into intentions inside the brain. Unlike EEG, implanted electrodes can render the electrical activity of populations of individual neurons in fine detail. An implant like the BrainGate sensor allows for communication between brain and computer in a shared language: electrical impulses. But the electrodes must be physically embedded in brain tissue to pick up the signals. Specifically, they must be in the motor cortex, a narrow region on the surface of the brain, spanning from ear to ear, that governs movement by translating intentions into electrical directives to be carried out by muscles. “If they’re not really close,” Donoghue explained, “they don’t get them. It’s like being outside the range of a cell-phone tower.”

For nearly 25 years before launching the BrainGate project, Donoghue investigated the brain’s signals using rhesus monkeys—the nonhuman primates with the unfortunate distinction of having the motor cortex most similar to our own. Using at first single electrodes and then increasingly complex arrays of them, Donoghue recorded the brain signals of living, moving monkeys, amassing enormous quantities of data correlating the locations and signatures of brain activity with specific movements of the monkeys’ bodies. Then, in the late ’90s, a team of engineers at the University of Utah developed an array of 100 microelectrodes made of silicon and platinum, materials that could safely be used inside the body. In 2004, Donoghue and the BrainGate team used the Utah array to demonstrate that the electrodes could last for over a year in the brains of three rhesus monkeys, collecting reliable data and causing no harm. That reliability had to be shown for the team to receive approval from the FDA to begin testing the implant in humans. “There were something like 12 or 14 boxes of papers, all consolidated, sent to the FDA,” Donoghue recalled. Later that year, the implants were cleared for preliminary trials in human subjects.

By 2005, it was time for Cathy to receive her implant. Working with data accumulated from the monkeys, the neurosurgical team had a good idea where in Cathy’s brain to look for signals intended for her arm. In surgery at Rhode Island Hospital, a neurosurgeon working with the BrainGate team removed a bottle-cap-sized piece of her skull and opened the delicate membrane that lay just beneath, exposing the neuron-threaded tissues of the arm area of her motor cortex. Using a pneumatic device, like a tiny air hammer, the team fired the implant into the surface of her brain. They then carefully closed the membrane around a set of wires, leashed to the implant, leading to a titanium pedestal that covered the removed portion of her skull.

The wires were housed under a gray plastic port, like a tiny top hat, protruding from Cathy’s head. When opened, the port allowed the wires to be linked to computers, which recorded electrical data sensed by the array.

The researchers began trials with Cathy a month after her surgery, and Donoghue was unsure whether the region of her brain that corresponded to her hand movements would still fire signals after a decade of disuse. “We have this concept of brain plasticity, which I certainly adhere to: that our brains are always changing, rewiring themselves in some way, changing their interactions. You’d think that a piece of the brain that had lost its job for a few years would retire.” There was reason to be hopeful: They had been able to read useful information from Matthew Nagle’s brain. But Matthew had been paralyzed for only two years when he joined the BrainGate trials.

The team hooked Cathy’s implant up to a cart of computers and processors the size of a mini refrigerator that received the signals being detected by the electrodes in her brain. They asked her to follow a cursor moving on a screen in front of her by imagining that she was controlling a computer mouse with her own hand. The scrolling screens lit up with peaks of her brain activity. Donoghue was delighted. “The brain was actually very active, and in fact was active in ways that resembled what we expected to happen when a person was actually moving.” After years of paralysis, Cathy’s brain was still primed for action.

For the computer to act as a useful translator—taking the neural signals in her brain and interpreting them as directions to action—researchers still needed to decipher the unique dialect of Cathy’s motor cortex. Using her still vivid imagination, Cathy commanded her arm to move in all the ways the researchers requested. Over and over, she tried to move her hand forward and back, left and right, flexing, grasping, and releasing. The computer systems were trained to recognize the electrical signatures preceding each action, filter out any noise, and amplify the signals they identified as meaningful.

To this point, all the action had been taking place, as usual, only inside Cathy’s head. But the researchers weren’t aiming just to recognize her intentions from her brain activity. They wanted to use the brain signals to direct actions outside her body. Their first target was a computer cursor. As Cathy concentrated on moving her hand, her efforts unspooled on screens in front of the researchers, who tried to use the information from her brain as a sort of virtual mind-controlled mouse. When the researchers turned control of the cursor over to Cathy’s neurons, the cursor immediately began to move haltingly across the screen. Cathy couldn’t believe her eyes. “I was numb with shock and disbelief,” she wrote to me, “so I moved the cursor all over the screen.”

From there, over the following months of weekly sessions, it was a matter of refining her control. The first task she mastered required moving the cursor from a center mark on the screen to tag an image that appeared at the top, bottom, or side of the screen before returning to the center. As the trials progressed, she learned to play a game the researchers called “neural pong” for its similarity to the early Atari game modeled on table tennis. In the neural version, the playing field was a box framing the screen. At the bottom was a sliding plank, controlled by Cathy’s brain. A circle would bounce against the sides of the frame, and when it approached the bottom of the screen Cathy moved the plank to intercept it and send it ricocheting back up.

She also learned to use the cursor to navigate a simplified computer interface. She could open email and select music. She was able to pick out letters on a virtual keyboard. As the researchers refined the algorithms, the system learned to assist Cathy in her goals: If it detected two different intended motions in succession, it would supply the movement in between.

The cursor experiments were a huge achievement; for the BrainGate team, they were important steps toward the goal of turning thought into action. But moving cursors on a screen involves interpretation of only two dimensions of intended movement in a digital environment. The next trials would require a great leap. The researchers wanted to give Cathy the ability to operate in physical space. They hoped to allow her to control a sophisticated robotic arm to stretch, grab, and move real objects in her surroundings—her first chance to do so in nearly 15 years.

Chapter Five

My heart is human / My blood is boiling / My brain IBM

—“Mr. Roboto,” Styx

In April of 2010, the BrainGate researchers secured the use of an advanced humanoid robotic arm made by the German space agency DLR. The sophisticated system is much in demand, and arranging for it to travel to the BrainGate researchers was made difficult by the constraints of its schedule. “It’s like a visiting professor,” Hochberg says, “and then it returns home.” It is equipped with a shoulder, an elbow, a wrist, and articulated fingers complete with fingernails, and can move in the same directions as a human arm: The shoulder and elbow can swing and raise, the wrist can rotate, and the fingers can squeeze. Controlling it requires coordination of seven degrees of freedom of movement, as well as compensation for complicating factors like mass, inertia, and gravity.

The BrainGate researchers set up their equipment in a beige-and-gray visiting room in Cathy’s nursing home facility, the site of the past four years of their work with her. Video and lighting rigs stood at strategic angles around the room, documenting the details of the trials. Cathy sat in her wheelchair, surrounded by stacks of computer processors and screens, a gray plug the size of a Rubik’s cube protruding from her short brown hair. Underneath, the neurons of her motor cortex were securely woven through the 100 platinum-tipped microelectrodes, the combined size of a baby aspirin, emerging from the implant embedded in the surface of her brain. A thick cable ran from the plug to a box attached to the back of her wheelchair, and from there it split off to the surrounding computers. Pink and blue lines zigzagged across a graph on one of the screens, registering the activity of neurons detected by the electrodes of her implant.

At first the blue and silver robotic arm, weighing 30 pounds and much larger than Cathy’s own, was placed at a remove from her body, hanging over a cloth-covered table marked with small colored circular targets.

I have seen video of the trials, which were preliminary demonstrations and, at the time, mere reflections of in-process research, not ready for publication. In one clip, John Donoghue stands beside Cathy, whose face is blurred in accordance with privacy protocols. He asks her to try opening and closing the hand. A second later the robotic fingertips bend inward. Donoghue and three researchers standing behind Cathy break into grins. In the next clip, shot over Cathy’s shoulder, a young researcher standing to her left asks Cathy to lower the arm, and a moment later the robot’s joints adjust smoothly until the hand rests on the tabletop. More commands follow. She raises the arm halfway, then drops it back to the table. She lifts it all the way up, then brings it down to rest again.

In another experiment, a bottle of juice is placed on the table to the left of the robotic hand. A voice off camera instructs Cathy to try to pick up the bottle. The hand begins to inch slowly toward it, joints extending, and captures the bottle between fingers and thumb. The bottle starts to tip over, and the arm retracts slightly until the bottle is standing again. Then the arm extends once more, the bottle leaning, and retracts again, then pauses. It reaches for the bottle a third time, the fingers closing around it, and shifts upward, lifting it off the table. Cathy is then asked to move the bottle to the middle target on the table, several inches to the right of where it sat. The arm retracts too far at first, then adjusts and releases the bottle gently within the target, which is just barely wider in diameter than the bottle itself. Later on, with more authority, the hand sweeps a wineglass resting near the edge of the table into its grasp, fingers closing around the bowl. It wavers indecisively, sets the glass down, then picks it back up and readjusts its positioning to place the glass on the target.

Even from the distance of a video, what I was seeing was difficult to believe. There, in that moment of indecision, was Cathy, dissatisfied with her performance, insisting on greater precision. The robot’s movements were matching Cathy’s thoughts. The smiles on the faces of the researchers in the room reflected the significance of these moments. The team had worked with Cathy for more than four years. The hundreds of hours spent on tedious and repetitive tasks had led them to this: for the first time in 14 years—indeed, for the first time for any quadriplegic—Cathy was able to reach out into the world.

The trials were an attempt at brain control of a physical object and important proof of the BrainGate team’s progress translating brain signals into directions for an assistive device. In this case, Cathy had control over only two dimensions of the robot arm’s movement at a time, either the horizontal plane or the vertical plane; programmed instructions governed the third. When she imagined the hand grasping, it triggered the robot not only to close its fingers but to lift the bottle off the table so it could be moved. After she directed the hand to a specified target, her next grasp command caused the arm to lower the bottle and then release it.

When directing the hand to reach the bottle, Cathy was thinking of her own hand in the same way she did to control the cursor in previous trials, and the computers read her intended direction and velocity of movement, adjusting the arm’s joints to make the hand follow those intentions. This is the same way we control our biological arms. When we plan to reach out and touch something, we don’t think of moving each joint individually. We set our focus on the endpoint, and our brain does the calculations to control the muscles to get us there.


The next challenge was building filters to detect Cathy’s intended movement in a three-dimensional space, giving her greater control over the robot and greater flexibility of movement. It was not a simple task. The team needed to record and classify the signals of the ensemble of neurons surrounding her implant as they sent instructions in the three dimensions: left-right, down-up, and toward-away. They calibrated the computers by asking Cathy to watch carefully as the arm moved through a series of programmed actions, imagining herself doing the same with her own arm. Once they had rough filters built, they transferred control of the arm to her implant and refined the translation. As in previous trials, Cathy’s focus was on moving the hand to the correct destination, while the computers detected her intended speed and trajectory and adjusted the joints to follow, now in three dimensions.

The targets, which, in a rare lapse into informal language, Hochberg refers to as “raspberry ice cream cones,” were made of pink foam balls set in black cones at the end of flexible rods. They lay flat on the table until activated, when motorized levers brought them up one at a time to varied spots in the space in front of Cathy. Her task was to maneuver the robot hand to meet the balls, which were only slightly smaller in diameter than the open hand, and, if possible, grasp them. The task was made harder by the springiness of the rods: If the hand brushed against the target, it would bounce, making it even more difficult to grab.

Cathy performed the trials with both the DLR arm and an advanced prosthetic made by New Hampshire–based DEKA Research, founded by the inventor Dean Kamen and developed with funding from the military in the hope of creating sophisticated prosthetics for amputees. The results of the trials, published in May 2012 in the journal Nature, report that Cathy and a second participant, called T2, another brain-stem-stroke survivor, were able to touch or grasp the targets a significant portion of the time.

The BrainGate system worked like a prosthetic spinal cord leading to an artificial limb, conveying the will of the mind without the need for speech or movement. The researchers were able to read the signals of the brain, not in vague generalities but in specific detail, and their computers passed those instructions to a robot nearly as quickly as they would have been communicated to a living limb. BrainGate, in fact, could have applications not only for people with brain and spinal-cord injuries, but for others with neurological disorders and for people who have lost limbs to injury or illness. The technology promises to relink the brain to working bodies both artificial and, potentially, biological.

With the ice cream cone task accomplished, researchers had a special plan in store for Cathy. After the successes of the preliminary demonstrations with the DLR arm, the BrainGate team decided to bring the arm into range of her body.

They knew that Cathy loved coffee, but she required assistance to lift a cup to her mouth to allow her to drink from a straw. So the team placed a red metal thermos of coffee, emblazoned with the initials and insignias of the research team and sponsors and topped with a straw, on the table in front of her. With pursed lips and a look of intense concentration, Cathy guided the remote hand to the bottle and closed the fingers around it. She lifted it off the table and brought it close to her chest, and when she triggered the arm with a grasp command, it tilted the thermos toward her, allowing her to reach the straw. She took a sip, gave it another grasp command to right the bottle, and set it back down on the table, breaking into a shocked-looking smile.

“Drinking from a cup felt very natural,” she told me later. It’s what the BrainGate researchers most wanted to hear: that controlling the robot arm felt like moving one’s own. “My mind raced back to the early days of BrainGate, when a team member told me I would be able to control a cursor,” Cathy said. “I was in disbelief when I was able to control a cursor, and here I was controlling a robotic arm, drinking from a cup.”

Chapter Six

“If real is what you can feel, smell, taste, and see, then ‘real’ is simply electrical signals interpreted by your brain.”

—Morpheus, ‘The Matrix’

What lies beyond Cathy’s grasp of a coffee cup? Speech, requiring the coordination of tongue and lips to shape sound, is similarly the result of a complex electrical storm of intentions funneled into simplified directives in the motor cortex, making it a candidate for eavesdropping electrodes. And, in fact, research is under way at Boston University that uses microelectrodes implanted in the speech area of a locked-in volunteer’s brain to attempt to identify intended phonemes, the units of sound we string together to make up speech. Though successful detection rates are still low, they are higher for some sounds than for others and are gaining accuracy with time and research. It’s another advance that would have inestimable impact on the quality of life of people with a disease or injury that has left them unable to speak.

Other glimpses of the future of human brain implants can be seen in the lab of Duke University neuroscientist Miguel Nicolelis. Supported by a $26 million grant from the Defense Advanced Research Projects Agency—DARPA, the R&D arm of the U.S. military—Nicolelis’s lab has been investigating the possibilities for remote control of supplemental limbs and for the augmentation of sensory systems for the past decade. Rhesus monkeys are again the preferred subjects—or collaborators, as Nicolelis likes to refer to them—when testing not just the safety of cortical implants but also their possibilities. With them, Nicolelis and his team are able to push the boundaries of experimental design well beyond what can be achieved using a patient like Cathy.

In 2001, Nicolelis’s lab acquired a middle-aged rhesus monkey called Aurora, rejected from other labs for being difficult to work with. Nicolelis describes her as a slow learner, disinclined to participate in complex or repetitive tasks. After months in the lab, Aurora finally began to show interest in experimental assignments, spurred on by the fruit-juice rewards commonly used with monkeys. The researchers discovered that they could entice her to play video games with a joystick, and soon she excelled. She became a star in the lab and was chosen as the subject for 2002’s Project MANE—an acronym for Mother of All Neurophysiological Experiments.

Like Cathy, Aurora received an implant of electrodes in her motor cortex, connected to decoding computers, which mapped the activity in her brain to her arm movements and translated them into directions for a robotic arm. The arm, in Aurora’s case, had a range of movement similar to the monkey’s own and a set of fingerlike pincers for grasping. According to Nicolelis, Aurora couldn’t see the robotic arm and had no idea that it was in a separate room mimicking her movements. What she knew was that if she successfully performed video game tasks—using her joystick to control a cursor on a screen and intercept moving targets—a high-frequency beep would sound and she would receive a few drops of the juice she loved.

Once Aurora’s brain signals were being reliably read by the computers, a researcher entered Aurora’s room and removed the joystick from her reach. The team switched control of the video game from the joystick over to the robot arm. The targets in the video game continued to move across the screen, but the cursor lay dormant, not receiving any commands. After a period of confusion and false starts, during which she grabbed at the targets on the screen with her hand, Aurora began following the video game closely, watching the targets intently. The only way for her brain implant to send the signals that would cause the robot arm to move the cursor was to imagine that she was controlling the cursor with her joystick, as she had been before with her own hand.

Eventually, the signals her brain was sending to the computers began to resemble the patterns they had shown when she was moving the joystick. As the computer decoded her intentions, the robotic arm began moving the cursor on the screen. Every time her brain successfully caused the robot to intercept the target on the screen, she received a reward. Aurora, in other words, had figured out that she could play the video game with just her imagination. She could choose not to move her limbs while continuing to generate brain activity the computers could interpret. After about a month, she discovered that not only could she relax her body while playing the game with her mind, but she could actually use her limbs for other purposes—like scratching—at the same time.

Combing through the data produced by Aurora’s brain, the Duke researchers found evidence of three important populations of neurons. The first was active in the same or similar ways whether Aurora was playing the game with the joystick or with her mind. The second was active only when her biological arm was moving. The third was active only when she was controlling the video game with her brain alone. That third group of neurons was largely functioning as though she had carved out space in her brain for a new phantom arm and integrated it into her mental model of her body.

In 2007, Nicolelis’s lab conducted an experiment—for demonstration, not peer-reviewed research—with a monkey called Idoya, focusing on the lower limbs rather than the arms. Using tips discovered in a 100-year-old report of Russian circus-training techniques, the researchers were able to teach Idoya to walk upright on a treadmill, rewarding her with her preferred snacks of Cheerios and raisins. The researchers modeled her brain activity as she mastered walking forward and backward, shifting directions, and changing pace.

Fluorescent markings painted on Idoya’s hips, knees, and ankles enabled researchers to track their positions in 3-D space. As Idoya walked on the treadmill, the computers matched the activity of her brain to the positions of her joints. Those models were mapped to the legs of a five-foot-tall, 200-pound humanoid robot in a laboratory in Kyoto, Japan. The electrical spikes of Idoya’s motor cortex governing her leg movements were used to control the robot’s movements as it hung suspended just above its own treadmill. A large screen covered the wall in front of Idoya, filling her visual field with a live video feed of the robot. As Idoya found her walking pace, she watched the robot match her movement for movement. She continued walking and watching for an hour, as the robot legs churned in time with her own.

The researchers switched off her treadmill, and Idoya slowed to a stop, still watching the screen. But her brain continued sending intelligible signals through the computer link, maintaining control of the robot and continuing to direct its movement for several more minutes as she watched.

Taking their exploration of the rhesus brain further, the Duke researchers placed electrodes in the sensory cortex of two monkeys. They could stimulate the monkeys’ brains by sending small electrical signals to the implants. In research results published in Nature in 2011, they trained the monkeys on a simple video game task: Three circular targets appeared on a screen in front of the monkeys. In some trials, the monkeys used a joystick to control a cursor or an image of a virtual arm to select the targets. In other trials, the cursor or virtual arm was controlled by signals from the implant in their motor cortices. The virtual hand would pass across targets as though the monkeys were reaching out and running their fingers over them. 

An artificial “texture” was assigned to each target and was communicated directly into the monkey’s sensory implant as a pattern of electrical stimulation. When the cursor passed over the two dummy targets, a sequence of high-frequency pulses was delivered to their brains. When the cursor passed over the reward target, signals of a slightly different frequency were sent. The monkeys could distinguish between the two artificial “sensations,” allowing them to use the pulses sent to their implants to choose the reward target from among the three. Though it is impossible to know what the monkeys felt when their neurons were buzzed by the researchers, they were able to recognize the input and use it to play the game.

Taken together, these discoveries indicate that monkeys can control artificial bodies while maintaining control over their own, and that they can integrate artificial sensations. They suggest intriguing possibilities: that there is room inside the brain for the adoption of additional limbs, and that substitute, and supplemental, body parts can not only receive signals from inside the brain but also send new forms of sensory information back.

In 2002, at DARPATech, a showcase of defense technologies, Eric Eisenstadt of the Pentagon’s Defense Sciences Office spoke about the agency’s vision for the future of biology. “Picture a time when humans see in the UV and IR portions of the electromagnetic spectrum, or hear speech on the noisy flight deck of an aircraft carrier, or when soldiers communicate by thought alone,” he said. “Imagine a time when the human brain has its own wireless modem so that instead of acting on thoughts, war fighters have thoughts that act.” He described the goals of the Pentagon’s Brain Machine Interface Program as allowing the human brain to incorporate synthetic devices as though they were part of the biological body, giving the capabilities of machines to intelligent human operators. “Who knows?” he added. “If we can eavesdrop on the brain, maybe we can sort out deceit from honesty, truth from fiction. What a lie detector that would be!” The 2013 budget for DARPA includes funding for a project called Avatar, which, according to the agency, is intended to “develop interfaces and algorithms to enable a soldier to effectively partner with a semi-autonomous bi-pedal machine and allow it to act as the soldier’s surrogate.”

Chapter Seven

“You’re only a clear, glowing mind animating a metal body, like a candle flame in a glass. And as precariously vulnerable to the wind.”

—C. L. Moore, “No Woman Born”

Advanced bio-hybrid fighting machines held little interest for Cathy. For her, and for the BrainGate researchers, the goals they cared most about were more prosaic: achieving small degrees of meaningful function to improve the lot of people with paralysis or limb loss. Day by day, trial by trial, they worked toward more precise readings of Cathy’s brain. During a visit last winter, Cathy told me that being a part of BrainGate kept her going, providing a welcome distraction and a chance to help others, an opportunity she intended to take full advantage of. M…Y… L…I…F…E… I…S… O…V…E…R, she spelled, as Holly translated using the alphabet board, tears filling her eyes. “But young people have lives ahead of them and I can’t imagine kids spending childhood in a wheelchair.” (Later, Cathy sent me an email asking to clarify that her life wasn’t really “over.”)

Cathy used to love to ride bikes with her kids, to sing and dance around the kitchen, making meals from the tomatoes and squash and peppers she had grown in her carefully tended garden—things she hoped her participation in the BrainGate research would someday allow other injured people to do. Holly handed me a photo of her mother before the stroke, standing with three friends in long winter coats, dressed up in front of a limo they rented to celebrate the 40th birthday of one of the women. It took a moment for me to identify Cathy, but then I saw her. Her face was fuller then, her hair styled for a night out. She was smiling broadly.

There is a science-fiction story from 1944, written by C. L. Moore, called “No Woman Born,” about a beautiful singer, Deirdre, whose badly burned body is replaced by an artificial model. Moore describes her “bare, golden skull … the most delicate suggestion of cheekbones, narrowing in the blankness below the mask to the hint of a human face.” Her makers chose not to attempt to re-create her face as it had looked in life but focused on her motion, which, Deirdre says, “is the other basis of recognition, after actual physical likeness.”

Cathy and I talked about the future, and though she was not much given to sci-fi thought experiments, we pondered the idea of a replacement body, a robot body that would behave like her own, back when it worked. She would take it, she said—she wasn’t particularly sentimental about her human flesh—as long as her mind stayed intact. There was something, though, that worried her. She told me she had “a trust issue” with the robots and the computers, that she felt entirely dependent on them not to malfunction. One day, during one of the BrainGate experiments, she was startled to see the large robot arm drop suddenly, possibly from a power outage. It made her realize something, she said.

“I was not in control.” 

Chapter Eight

“Cyberspace. A consensual hallucination experienced daily by legitimate operators, in every nation. … A graphic representation of data abstracted from the banks of every computer in the human system.”

—William Gibson, Neuromancer

“I am an inexperienced pilot of Skype. I’m waiting for the version just by thought!” Miguel Nicolelis joked, eyes crinkling merrily on my computer screen as we video-chatted. Unlike the BrainGate scientists, who are extremely reluctant to discuss unpublished research and even less inclined to speculate about the future, Nicolelis believes in stimulating public conversation about his work. “We had this pattern of conduct—scientists don’t speak to the press, scientists don’t talk to people, scientists don’t come down from their pristine castles to explain what they do. I don’t believe in any of this. And I think that scientists should speculate. They’re paid to.”

When the monkeys in Nicolelis’s lab control an extra limb or make decisions based on the sensations from a virtual arm, he says, they are demonstrating their ability to dissociate their minds from their biological bodies. Their physical body has no impact on their ability to enact a voluntary motor command across long distances or in a virtual universe. “I like to say that this is when you free the brain from the body,” he told me. He believes this dissociation will one day make it possible to use computers, drive cars, and communicate with one another by brain activity alone. “Your presence is going to be pretty much anywhere you want to be,” he told me. “You don’t need to send a manned mission to Mars. You send your avatar there. And you experience being there. Your physical presence will be represented by this device.”

He described the brain-implant surgery as a “trivial procedure,” from a neurosurgical point of view, and told me he believed that one day these kinds of implants will be considered no more extreme than cosmetic surgery. Recently, he said, he was speaking about his work to a group of high school students. Following his presentation, he asked the students if, were it safe and available, they would get an implant that allowed them to play video games faster and better than any of their friends. “The kids, nearly unanimously, said, ‘Yeah, of course!’” Nicolelis said.

Though he may not share their enthusiasm for video games, Kevin Warwick, a systems engineer and professor of cybernetics at the University of Reading, does share their willingness to receive a brain implant. Warwick is best known for his Cyborg 2.0 project, a 2002 experiment in which he had an early version of the Utah electrode array wired into his nervous system, fired into the median nerve of his wrist by surgeons at the Radcliffe Infirmary at Oxford who had practiced the procedure on sheep carcasses. He used the electrical spikes that traveled through the median nerve when he closed and opened his hand to pilot a wheelchair and, via the Internet, to control a robotic hand located across the Atlantic Ocean. He was also able to feel and interpret signals fed back through the electrodes to judge the pressure of the robot hand’s grip and to gauge distances while blindfolded, using feedback from a sonar transponder attached to a hat—a form of sensory substitution.

Warwick’s real interests, echoing Hans Berger’s, lie in what he calls radiotelepathy—communication from one human brain to another, nervous system to nervous system, in a rich and unedited wave of sensation. He believes it will be possible to transmit colors, images, and emotional and physical states in a vastly superior form of communication. He speaks freely of his desire to experiment with a brain implant like Cathy’s. “From a scientific point of view, there are things I want to find out, things I want to experience, and I don’t want to die without having experienced them.”


When I asked Donoghue about goals like these, he dismissed them as an unworthy focus. “We do all that already,” he said. “It’s an immensely complex way to solve a problem where we have a beautiful solution like literature. That is your interface with somebody else’s brain.” In any case, it is unclear whether electrical signals fed into the neurons via electrode will ever feel like sensations that have traveled the lengths of our bodies, through billions of networked nerves. It could be little more than a new form of language. It could be another method for conveying an impression of thought, but it wouldn’t be a truer experience of thought itself.

Even if we could transmit the complexity of a person’s thoughts, memories, or emotions through a computer to another person, they would encounter an unfamiliar environment. Donoghue referred to the words of Charles Sherrington, a neurologist working in the first half of the 20th century, who described the activity of the human brain as “an enchanted loom, where millions of flashing shuttles weave a dissolving pattern, always a meaningful pattern though never an abiding one; a shifting harmony of subpatterns.” At any given moment, our brains are suffused by swirls and eddies of chemicals, flickering electrical storms morphing as our interactions with our environments do. It is as though the same spot on a map is at some times a desert, at others a rainforest, at still others buried in snow. Identical sensory input may be met with a completely different response in the brain depending on the instant of the encounter, as when a rush of adrenaline dulls the experience of pain.

This prevents us from achieving the fantasy of the mind meld, but it also protects us from dystopian scenarios of unwelcome mind hacking. Our brains are more like palimpsests than blank canvases. It is impossible to draw the experiences of one brain onto another without them being warped by layer upon layer of past impression stretching back to the moment our lives begin. The connections of 100 billion neurons are unique in every individual; we can make sense of impulses in the brain only by being grossly reductive, letting the activity of a few dozen neurons being read by electrode-studded implants stand in for the activity of billions of others, wiping away any subtlety.

With just those few dozen neurons, though, the BrainGate researchers were able to do something real, tangible, and potentially life changing. Even cursor control would give Cathy an important degree of freedom in her everyday life. The implanted electrodes would allow her to control her computer and type with greater reliability and precision than her head-tracking device allowed her, increasing her ability to communicate without human assistance. But for the technology to be usable, not only do the computer algorithms have to be unfailing, but the entire system has to be fully wireless—no open port through her skull. And for it to be wireless, it would require a power device, batteries implanted under the skin that would have to be recharged with a special device or periodically replaced. Those advances are many years, and countless safety trials, away.


Chapter Nine

“The scientific man does not aim at an immediate result. He does not expect that his advanced ideas will be readily taken up. His work is like that of a planter—for the future. His duty is to lay the foundation for those who are to come, and point the way.”

—Nikola Tesla

The halting progress Cathy has made toward greater independence is picking up speed. Five years ago, she agreed to be the lead plaintiff in a class action against the state on behalf of the Brain Injury Association of Massachusetts. State-administered Medicaid allowed brain-injured patients to live only in nursing homes and long-term-care facilities—sterile, rule-bound environments often focused more on preventing patients from dying than on giving them a life worth living. But many brain-injured patients like Cathy are in good physical health, with long life expectancies, far too long to be committed to an institution under rules that make it difficult, if not impossible, to participate in a community outside the nursing home.

In 2008, Massachusetts settled the suit, the first of its kind in the United States. Under the agreement, the state is required to reduce unnecessary institutionalization. Patients can apply for a waiver that allows them to use the money that would be spent on housing them in large facilities to assist them in living in their communities, in provider-run small group homes or in private residences.

Cathy received a waiver under the decision, and last fall her opportunity finally arrived. In September 2011, after 15 years in nursing facilities, she moved into a real home. It has two porches, two living rooms, soft couches and curtains, and a big open kitchen with dark wood cabinets. At Thanksgiving, the families of the residents descended on the house, and Cathy’s son, Brian, cooked a feast for all.

She has her own bedroom at last, with bright yellow walls and a white wooden dresser. There is no roommate and no beeping medical equipment; it is a welcome return to privacy after so many years. She calls it her sanctuary. She shares the home with several other brain-injured residents, and a small staff assists when necessary. Cathy leaves home during the day for programs and therapies, and a personal-care attendant takes her in a handicapped-accessible van to the mall, to the grocery store, or to visit friends and family at her request. Last October, she even went trick-or-treating at Halloween. Holly describes her mother’s trajectory as going from seemingly permanent despair to genuine happiness.

In our recent correspondence, Cathy does seem very pleased with her new arrangement. She has close relationships with her housemates and her personal-care attendants and sees her family frequently. The new control she has over her day-to-day life has given her a great sense of liberation. Her network of friends and family support her in her continuing push toward autonomy and recovery for herself and others like her.

But there are no longer twice-weekly visits from the BrainGate researchers. Several months before moving, Cathy decided to have the implant removed and exit the research trials. “I was hopeless before BrainGate,” she told me. “Being a participant in the trials is one of the best experiences of my life.” She had grown close with the researchers, especially Leigh Hochberg, whom she still consults about her plans.

She made the decision, she says, not because she had given up on the promise of BrainGate, which she believes will eventually lead to a functional cure for paralysis. But controlling an assistive device, like a robotic arm, isn’t her goal anymore. She wants to move her own arm.

To that end, she wants to pursue functional electrical stimulation, FES, which uses electrical currents to trigger muscle contractions and enable limb movement. The BrainGate researchers hope to someday link the motor-cortex implant of a paralyzed patient to FES devices, reconnecting the brain to the limbs. For now, though, and likely for years to come, they need to work on refining the devices that read and record signals in the brain before adding the additional factor of an external device injecting signals into the body.

Cathy doesn’t want to wait that long to feel her body move again. She’s eager to participate in other trials, other studies. She has undergone tendon-lengthening surgery on her arms and wrists to ease the contractures, and with physical therapy, she says, she has regained some range of motion, in the hope that FES can one day give her useful control. She still can’t move her limbs or speak, but she has lived at the vanguard of scientific research and of advocacy.

Her current technology, however, continues to be unreliable. In recent weeks, as she tried to fill me in on the latest developments in her life, her computer began to fail, cutting off our line of communication and erasing her hard-fought email responses to my questions. Ultimately, the technologies she relies on—the wheelchair that carries her body, the computer that projects her voice into the world—are indifferent to her needs, unaware of their importance in her life. What is left, when even the most sophisticated robots and advanced computers fail, is other people, who will continue to care for one another, connected by a shared understanding of human fragility.