Inviting neural engineers to proactively consider the ethical implications of their work

Aleenah Ansari

Although most people are somewhat familiar with the concept of ethics, which describes fundamental principles of decent human conduct, the idea of ethics as applied to neural engineering, or “neuroethics,” might be less familiar. As neural engineering technology becomes increasingly prevalent, it is imperative that future engineers, researchers and ethicists recognize how these devices can impact current and future users of neurotechnology.

“When you have driverless cars, Elon Musk with Neuralink and all of these tech companies starting to interface with neuroscience and human capabilities, [neuroethics] is something that students have to think about,” said Laura Specker Sullivan, an interdisciplinary ethicist, cross-cultural bioethicist and philosopher currently based at Harvard Medical School's Center for Bioethics. “I want to make sure they have the skills to be leaders in the field, and have the tools to guide these discussions about what these technologies are, how we should be pursuing them, and if we should be pursuing them.” Specker Sullivan was formerly a postdoctoral neuroethics fellow at the Center for Sensorimotor Neural Engineering (CSNE), as well as an affiliate assistant professor in the Department of Philosophy at the University of Washington (UW).

Members of the CSNE neuroethics research thrust study the ethical issues that arise from neural engineering technology, such as privacy, shifts in personal identity and moral responsibility. This is vital to explore because the impact of neural technology extends beyond the lab setting into people’s everyday lives, particularly because neural engineering devices are created with the long-term goal of helping people heal, feel and move again.

“When they put in the grant to [the National Science Foundation], they recognized that there were lots of ethical and philosophical implications of doing research with implants in the brain, and really wanted to be proactive in terms of thinking through those issues,” said Eran Klein, a neuroethics thrust leader and affiliate assistant professor in the UW Department of Philosophy. “Over the initial couple of years, there was recognition that it needed to be as fundamental as other areas within neural engineering. It became a [research] thrust alongside some of the other thrusts at the CSNE.”

Students from the neuroethics team are part of the research groups in CSNE-affiliated labs, and they engage in regular conversation about neuroethics and the impact of neural engineering technology on end-users.

“That’s one of the benefits of the way that the CSNE has integrated ethics – it makes space for conversations in a lab and not having to go to a separate lecture,” Klein said. “When [an] issue comes up, there’s someone there to talk about it and explore.”

Encouraging proactive consideration of neuroethics in the classroom

To continue the conversation about neuroethics with current students who will go on to work with neural engineering technology or participate in research, Specker Sullivan taught a UW course called “Neuroethics” during spring 2017, offered under “Advanced Topics in Philosophy.” She hoped the class would help student researchers, philosophers and future engineers learn about ethics and identify their personal stances on these issues.

“Having a class and having people come together for a full quarter shows a sustained commitment to training the next generation of scholars and [neuroethicists],” Specker Sullivan said.

The course focused on the ethical issues that arise in neuroscience and neural engineering, such as user privacy, security and legal responsibility. The first few classes centered on discussions of autonomy and agency in deep brain stimulation, because this topic is a focal point for many neuroethical discussions in neural engineering. From there, students explored a range of contemporary topics in neuroethics, such as medical treatment versus human enhancement, and neural security.

“We start by focusing on the [individual] with these questions about authenticity and then go outward. By the end, they’re thinking about disability and justice in the context of society,” Specker Sullivan said.

The CSNE is an international leader in developing a philosophical approach to neuroethics, which is why this class emphasized argumentation and identifying the motivations behind students’ beliefs, a topic that is infrequently discussed in typical engineering courses. This approach helped students step back from academic pursuits and reflect on how their work aligned with their own personal values.

“It’s important to identify the reasons behind why you do what you do. Otherwise, you’ll be [easily] moved by social influences and emotions,” Specker Sullivan said. “In many ways, I see being able to pick up argumentation as a really important life skill.”

Creating community and encouraging interdisciplinary discussion

The class is open to students who conduct research at the CSNE as well as other students who are interested in exploring ethics related to technology. Specker Sullivan chose a discussion-based model to encourage collaborative learning among students from several different disciplines and career interests.

“We view the class as a service to the CSNE and the students in the CSNE, because it’s a way to go into some in-depth thinking about these issues but also give them exposure to perspectives outside engineering through the students who are studying other things,” Klein said. “It’s a way of bringing our students into the community by having a class.”

Specker Sullivan hopes that students will encourage each other to proactively and critically consider the ethical implications of their work, and how it relates to their personal values.

“I [want] students to be able to think about the ethical significance of science and technology, not just in terms of its downstream implications,” Specker Sullivan said. “I think the really interesting questions about new technology are about how these technologies came to be in the first place – what types of social values and assumptions and biases are driving the fact that this is an important technology to have at this time. That’s a less common way of thinking about ethics, and it’s something that I really want students to be able to pick up.”

Specker Sullivan strove to create a classroom where people would feel comfortable sharing their opinions and perspectives, which is why she made sure that everyone learned each other’s names and set up the tables in a circle. She also worked to facilitate conversations among students and encouraged them to be collaborative colleagues who could discuss their work and opinions in a productive way.

“I really want students to be constructively asking each other about how they are forming their arguments,” Specker Sullivan said. “By the end of the quarter, they can be giving presentations, and it’s essentially like they’re a strong professional working group where they have this shared basis of information and can talk to each other about what they are doing.”

Yunqian (Waterlilly) Huang is a double major in neurobiology and biology at the UW and currently works in Eberhard Fetz’s lab, a CSNE-affiliated lab that investigates the neural mechanisms involved in executing hand movements. Huang said she valued hearing from classmates in philosophy, and she hopes to apply what she learned in the class as she works toward her dream of creating a neural engineering start-up.

“I want to really understand what my values are and what my definitions are before I enter the field and design anything,” Huang said.

In addition to classroom discussions, Specker Sullivan assigned a case analysis project where each student was paired with a CSNE lab and tasked with writing about the significance of the lab’s research and the neuroethical implications of its work. For some students, it was their first experience visiting a lab environment.

“Part of the reasoning for this assignment is that I don’t want to just tell them what’s ethically significant. A real skill is [looking] at scientific practices and technologies and think[ing] about how they are going to play into society, and what values went into their creation,” Specker Sullivan said. “It’s creating this connection between ethics theory and reality.”

Huang enjoyed the classroom discussions, and she emphasized that everyone has a stake in the conversation about the ethical implications of neural engineering research.

“I do think it should be a graduation requirement for everyone … [because] the biomedical field has become so important in everyone’s lives,” Huang said. “We need to have a society where people want to talk about current neuroethical issues, and there should be a community to share these ideas.”

The class asks students to actively identify their own values and the implications of their work. This is especially important as students go on to design neural engineering devices with end users’ needs in mind, conduct research or engage with the field in other ways.

“There are topics like autonomy, agency and responsibility that are important to dig deep into, but it’s more a recognition that there are issues within each of those areas that are controversial and complicated,” Klein said. “We’re thinking hard about [them] because they impact people’s lives who are going to be exposed to these technologies.”

For more information, contact Eran Klein.