Ethics as a cornerstone of neural engineering research

Submitted by Kate Goldyn

Written by Aleenah Ansari and originally published on the Center for Sensorimotor Neural Engineering website. 

The BioRobotics Lab at the University of Washington (UW) strives to improve people’s lives through neural engineering research and the development of technology for minimally invasive robot-assisted surgery. This lab is currently co-directed by Center for Sensorimotor Neural Engineering (CSNE) member and UW Department of Electrical Engineering (UWEE) professor Howard Chizeck.

Researchers in the lab’s neural engineering research group focus on optimizing closed-loop deep brain stimulators (DBS), implantable devices that can be used to manage motor and mental conditions like essential tremor, Parkinson’s disease and dystonia. Essential tremor is the most common neurological movement disorder; it causes uncontrollable rhythmic motion, often of the upper extremities, which can make it difficult for people with the condition to perform everyday tasks. As the disease progresses, medications become less effective at addressing the symptoms, but current DBS technology may offer a promising alternative. This technology delivers a consistent stream of electrical stimulation to an implantation site located deep in the brain; however, because stimulation is always on, the DBS’s battery may deplete quickly, which necessitates invasive surgery for replacement.

“It might be a more optimal therapy if we could only turn stimulation on when people are moving in a way that’s causing tremors,” said Brady Houston, a fifth-year PhD student in the UW Graduate Program in Neuroscience and graduate research assistant in the UW BioRobotics Lab. “That’s the goal of [my] project – create a closed loop system where [users are] only receiving stimulation when they actually need it, which is during movement that’s causing tremors.”

Houston and other researchers in the lab are working on a new generation of DBS systems called brain-computer interface (BCI)-triggered DBS. These devices would allow patients to control the DBS’s level of stimulation, and to turn it on or off based on their preference. The BioRobotics Lab’s research on the use of implanted BCI-triggered DBS to address essential tremor is paving the way for the development of neuroprosthetic devices; however, engineering this technology is only part of the process.

Allowing the user to control the stimulator, or building a self-regulating device that automatically changes stimulation, can raise challenges to the user’s personal identity and autonomy, and can render users vulnerable to hacking. Because DBS devices can be used to alter brain activity, engineers are responsible for considering the ethical implications of this technology.

Initiating the conversation about neuroethics in research

The CSNE is dedicated to proactively identifying these challenges, as evidenced by the Center’s neuroethics research thrust, which focuses on the ethical implications of emerging neural engineering technologies, such as the need to protect the user’s autonomy or agency.

“If you neglect [neuroethics and patient experience], you could end up with results that are scientifically impressive, but not necessarily applicable or valuable in the same ways,” said Maggie Thompson, a third-year PhD student in UWEE and graduate student researcher in the UW BioRobotics Lab. “The CSNE understands that mission, and works to provide the funding to enable a collaborative project like this.”

In line with this goal, the CSNE and the National Science Foundation currently fund a research assistant position for Tim Brown, a sixth-year PhD student in the UW Department of Philosophy, to work as a neuroethicist in the UW BioRobotics Lab. In this role, Brown explores the ethical implications of DBS controlled by a BCI, which is especially relevant as this technology becomes more advanced.

“I have a feeling that these new systems will have an effect on the users’ agencies even more, in ways that are not accounted for currently in the research,” Brown said.

Brown’s involvement with the CSNE, and the integration of neuroethics into the CSNE’s work, began when Thomas Daniel, a professor in the UW Department of Biology, was the interim director of the CSNE. Daniel knew that the Center was creating cutting-edge technology in neural engineering and believed that researchers needed to consider the ethical implications of the research while creating neural engineering devices. To achieve this goal, Daniel reached out to Sara Goering, a CSNE neuroethics research leader and an associate professor in the UW Department of Philosophy. Goering invited Brown to join the CSNE neuroethics team as a graduate student researcher.

“I think [Sara] knew me better than I knew myself,” Brown said. “I had no idea what neuroethics was before that. [Now,] I see [neuroethics] as a platform and way to take interesting philosophy and give it an application.”

Brown’s research focuses on a user’s sense of self, identity and agency when they use devices like DBS or BCIs, and ethical considerations related to experimental testing of neural devices. His research can provide relevant context for the findings of his colleagues at the CSNE and the UW BioRobotics Lab. For this reason, information from his research can be incorporated into their papers to better explain the ethical challenges accompanying DBS and BCI use.

Capturing the user’s firsthand perspective

Although Brown values the use of quantitative data in neural engineering research, he believes that the impact of DBS systems on users cannot be fully captured by a machine’s output or numerical score.

“I think we need a more nuanced understanding of how people are impacted by a deep brain stimulator,” Brown said. “What we have to do is ask them, listen to sub-text and nuance, and give them scenarios to react to. It’s not going to be quantifiable.”

Brown hopes that his research and direct conversations with BCI and DBS users can help capture the experience of patients who are using this technology in the lab. Brown works with three patients in-depth to conduct semi-structured interviews about the impact of DBS on their self-perception. Because these interviews occur in-situ, or while the DBS or BCI is implanted, patients provide feedback about the DBS system in real time.

Brown’s questions cover topics like the impact of DBS on body image and [users’] sense of autonomy once the DBS device is implanted, whether they feel in control, and what the implantation surgery felt like for them. These kinds of questions can help researchers understand how people think, talk and feel about themselves once they begin using these devices.

In his research, Brown sees people as experts on their own experience and encourages frequent conversation with the end users of technology like DBS and BCIs.

“This research is about figuring out what the public is worried about, and making ourselves accountable to that,” Brown said. “[We need to figure] out what measures to put in place to protect people who use their technologies, and change our language to address their worries in ways that are productive and make them feel better about using certain devices.”

Although Houston’s project is currently focused on proof of concept, Brown’s questions, and patients’ responses, can inform future iterations of DBS technology.

“Knowing [patients’] preferences makes it more likely for the system to be successful for that specific patient earlier on in the process,” Houston said. 

Thompson noted that Brown’s research informs her own research papers by capturing the patient perspective. For example, in a conference paper she published about preliminary BCI work, Thompson included a whole section about patient perceptions of BCI use with quotes from Brown’s interviews.

“You can report an accuracy percentage in terms of how successfully they performed the task, and plenty of researchers do that. That doesn’t tell you if the person was controlling it with their eyes closed or if they were sweating—some of our patients found it very fatiguing to use the system,” Thompson said. “Or if a patient was controlling the system proficiently but didn’t feel like they had control, those things aren’t necessarily reported.”

In the lab, both the engineers and Brown benefit from this collaboration and continued conversation about the ethical implications of research.

“I find that [my work and presence] has an impact on how people talk about their own research. They’re more attentive to the moral implications of [their projects],” Brown said. “There were times when, in my experiments, the engineers would ask people questions that I wouldn’t have thought of about the patient’s experience ... I’m always proud when they do that.”

Brown also hopes to actively consider and honor the perspectives of people with disabilities as well as other potential end users of the lab’s technology.

“If we don’t access the perspective [of people with disabilities], we have no chance to address the harms that are possible with these technologies,” Brown said. “We need to make their voices heard and apply our previous ethical frameworks to addressing some of their concerns.”

To encourage collaboration between ethicists and engineers beyond the UW, Brown and Thompson are currently co-writing a paper on ways that engineering research centers can consider the neuroethical implications of their research, or approach interdisciplinary collaborations of any form. Some suggestions focus on recognizing the financial commitment and time needed for this endeavor, or simply engaging in conversations with people from other fields.

Ultimately, Brown strives to inform people of the innovations and potential benefits of DBS and BCI technology, but also of the risks to users’ autonomy, agency and privacy.

“Getting ethicists involved [early is valuable because] they often ask questions that engineers wouldn’t necessarily think about,” Houston said. “That’s important in the progress of these projects … so we get another point of view and figure out what patients actually want, and what would be beneficial to them.”

Brown has shared the implications of neuroethics research with academic communities by giving lectures to programs like Math Science Upward Bound and to neuroscience classes at the UW. The significance of neuroethics is also beginning to show up in curricula facilitated by the CSNE for engineering students. For example, educational programs like the CSNE’s Graduate Certificate Program in Neural Engineering at the UW now embed neuroethics in the curriculum, which reflects the CSNE’s effort to encourage rising engineers to proactively consider the neuroethical implications of their work. With these outreach efforts, Brown hopes to create a two-way dialogue about neuroethics with the public.

“We’re striving to better inform engineers, the public [and] neuroethicists to make our work informative for all the stakeholders,” Brown said.

For more information, visit the CSNE’s neuroethics page or contact Tim Brown.