JAMES GIORDANO: James Giordano is a professor of neurology, chief of the Neuroethics Studies Program, and co-director of the O’Neill-Pellegrino Program in Brain Science and Global Health Law and Policy at Georgetown University Medical Center.
Nearly two years ago, Juliano Pinto, a 29-year-old paraplegic man, kicked off the World Cup in Brazil with the help of a brain-machine interface that allowed his thoughts to control a robotic exoskeleton. Audiences watching Pinto make his gentle kick, aided as he was by helpers and an elaborate rig, could be forgiven for not seeing much danger in the thrilling achievement. Yet like most powerful scientific breakthroughs, neurotechnologies that allow brains to control machines—or machines to read or control brains—inevitably bring with them the threat of weaponization and misuse. It is a threat that existing UN conventions designed to limit biological and chemical weapons do not yet cover, and to which ethical discussions of these new technologies tend to give short shrift. (It may seem like science fiction, but according to a September 2015 article in Foreign Policy, “The same brain-scanning machines meant to diagnose Alzheimer’s disease or autism could potentially read someone’s private thoughts. Computer systems attached to brain tissue that allow paralyzed patients to control robotic appendages with thought alone could also be used by a state to direct bionic soldiers or pilot aircraft. And devices designed to aid a deteriorating mind could alternatively be used to implant new memories, or to extinguish existing ones, in allies and enemies alike.”)
Despite the daunting complexity of the task, it’s time for the nations of the world to start closing these legal and ethical gaps—and taking other security precautions—if they hope to control the neuroweapons threat.
The technology on display in São Paulo, pioneered by Miguel Nicolelis of Duke University, exhibited the growing capability of neurorobotics—the study of artificial neural systems. The medical benefits for amputees and other patients are obvious, yet the power to read or manipulate human brains carries with it more nefarious possibilities as well, foreshadowing a bold new chapter in the long history of psychological warfare and opening another front in the difficult struggle against the proliferation of exceptionally dangerous weapons.
The full range of potential neuroweapons covers everything from stimulation devices to artificial drugs to natural toxins, some of which have been studied and used for decades, including by militaries. Existing conventions on biological and chemical weapons have limited research on, and stockpiling of, certain toxins and “neuro-microbiologicals” (such as ricin and anthrax, respectively), while other powerful substances and technologies—some developed for medical purposes and readily available on the commercial market—remain ungoverned by existing international rules. Some experts also worry about an ethics lag among scientists and researchers; as the September 2015 Foreign Policy article pointed out, a 200-page report put out last spring on the ethics of the Obama administration’s BRAIN Initiative didn’t once mention “dual use” or “weaponization.” In the United States, federally funded medical research with potential military applications can be regulated under Dual-Use Research of Concern policies at the National Institutes of Health, which reflect the general tenor of the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. Yet these policies do not account for research in other countries, or for research undertaken (or underwritten) by non-state actors. They might even create security concerns for the United States if they cause American efforts to lag behind those of other states hiding behind the excuse of health research or routine experimentation, or behind those of commercial entities sheltered by industry norms protecting proprietary interests and intellectual property.
In addition to a more robust effort on the part of scientists to better understand and define the ethics of neuroscience in this new era, one obvious solution to the neuroweapons threat would be progress on the bioweapons convention itself. In preparation for the Biological Weapons Convention’s Eighth Review Conference at the end of this year, member states should establish a clearer view of today’s neuroscience and neurotechnology, a better understanding of present and future capabilities, and a realistic picture of emerging threats. They should also revise the current definitions of what constitutes a bioweapon and what is weaponizable, and set up criteria to more accurately assess and analyze neuroscience research and development going forward.
I would also argue that the United States and its allies should take the proper security precautions in the form of increased surveillance of neuroscience R&D around the world. As a preliminary measure, government monitors can develop a better understanding of the field by paying attention to “tacit knowledge”—the unofficial know-how that accumulates among individuals in labs and other venues where a particular science is practiced or studied. (For more on tacit knowledge and arms control, see Sonia Ben Ouagrham-Gormley’s recent Bulletin article about its crucial importance for the bioweapons convention.) In a similar vein, authorities should also follow the neuroscience literature in an effort to assess trends, gauge progress, and profile emerging tools and techniques that could be enlisted for weaponization.
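To make the literature-monitoring idea concrete, the sketch below shows one way such trend-tracking could begin, using the public NCBI PubMed E-utilities API to count publications per year for a given search term. This is a minimal illustration, not a proposal: the search terms, year range, and the pubmed_count helper are assumptions chosen for the example, and a real monitoring effort would involve careful curation and analysis well beyond publication counts.

```python
"""Illustrative sketch: tracking publication trends via PubMed E-utilities.

The search terms and years below are assumptions for illustration only.
"""
import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term: str, year: int) -> int:
    """Return the number of PubMed records matching `term` published in `year`."""
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": term,
        "rettype": "count",   # ask only for the hit count
        "retmode": "json",
        "datetype": "pdat",   # filter on publication date
        "mindate": str(year),
        "maxdate": str(year),
    })
    with urllib.request.urlopen(f"{EUTILS}?{params}") as resp:
        data = json.load(resp)
    return int(data["esearchresult"]["count"])

if __name__ == "__main__":
    # Hypothetical example terms; a real effort would curate these carefully.
    for term in ("brain-machine interface", "transcranial stimulation"):
        counts = {year: pubmed_count(term, year) for year in range(2010, 2016)}
        print(term, counts)
```

Even this crude publication count, plotted over time, would reveal the kind of trend lines and emerging tool sets that monitors could then investigate more deeply.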
Of course these are only preliminary measures, easily stymied by proprietary restrictions in the case of commercial research and state-secret classifications in the case of government work. Thus deeper surveillance will require a wider effort to collect intelligence from a variety of sources and indicators, including university and industrial programs and projects that have direct dual-use applications; governmental and private investment in, and support of, neuroscience and neurotech R&D; researchers and scholars with specific types of knowledge and skills; product and device commercialization; and current and near-term military postures regarding neurotechnology. This type of surveillance, while requiring more nuanced and more extensive investigations, could produce highly valuable empirical models to plot realistic possibilities for the near future of neuroscience and neurotechnology. These could then be used to better anticipate threats and create contingency plans.
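As a rough illustration of what the empirical models mentioned above might look like at their simplest, the sketch below combines analyst scores for the indicator categories listed in the text into a single composite figure for prioritizing further analysis. Every indicator name, weight, and score here is invented for the example; any real model would be far more sophisticated and empirically grounded.

```python
"""Deliberately simple sketch: a weighted composite of surveillance indicators.

All weights and scores are hypothetical values invented for illustration.
"""
from dataclasses import dataclass

# Hypothetical weights for the indicator categories named in the text;
# each indicator is scored by an analyst on a 0-1 scale.
WEIGHTS = {
    "dual_use_programs": 0.30,   # university/industrial dual-use projects
    "investment": 0.20,          # state and private R&D funding
    "specialist_talent": 0.20,   # researchers with relevant knowledge and skills
    "commercialization": 0.15,   # product and device commercialization
    "military_posture": 0.15,    # stated military interest in neurotechnology
}

@dataclass
class Assessment:
    name: str
    scores: dict  # indicator -> score in [0, 1]

    def composite(self) -> float:
        """Weighted average over whichever indicators were actually scored."""
        total = sum(WEIGHTS[k] * v for k, v in self.scores.items())
        weight = sum(WEIGHTS[k] for k in self.scores)
        return total / weight if weight else 0.0

if __name__ == "__main__":
    example = Assessment("Program X (hypothetical)", {
        "dual_use_programs": 0.7,
        "investment": 0.4,
        "military_posture": 0.6,
    })
    print(f"{example.name}: {example.composite():.2f}")
```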
This type of surveillance carries dangers of its own, however. As a 2008 report by the National Academies in Washington warned, increased surveillance could lead to a kind of arms race, as nations react to new developments by creating countering agents or improving upon one another’s discoveries. This could be the case not only for incapacitating agents and devices but also for performance-enhancing technologies. As a 2014 report by the National Academies readily acknowledged, this type of escalation is a realistic possibility with the potential to affect international security.
The United States and its allies should therefore be cautious if they deem it necessary to establish this kind of deep surveillance. And on the international front, they should simultaneously support efforts to improve the Biological Weapons Convention to account for neuroweapons threats in the offing.
Finally, they should keep in mind just how hard it is to regulate neuroscience and neurotechnology during this time of great discovery and expansion. Ethical ideals can be developed to shape guidelines and policies that are sensitive to real-world scenarios, but the flexibility of such approaches also means they offer no conclusive answers. Those charged with monitoring potential threats must remain constantly vigilant in the face of changing technologies and fuzzy distinctions between medical and military uses, all while navigating the complexities of the health-care industry, political and military ethics, and international law. In light of the work ahead, it remains to be seen just how well the nations of the world will rally to face the neuroweapons threat.
Author’s note: The views expressed in this article do not necessarily reflect those of DARPA, the Joint Staff, or the United States Department of Defense.