The neuroweapons threat

JAMES GIORDANO: James Giordano is a professor of neurology, chief of the Neuroethics Studies Program, and co-director of the O’Neill-Pellegrino Program in Brain Science and Global Health Law and Policy at Georgetown University Medical Center.

Nearly two years ago, Juliano Pinto, a 29-year-old paraplegic man, kicked off the World Cup in Brazil with the help of a brain-machine interface that allowed his thoughts to control a robotic exoskeleton. Audiences watching Pinto make his gentle kick, aided as he was by helpers and an elaborate rig, could be forgiven for not seeing much danger in the thrilling achievement. Yet like most powerful scientific breakthroughs, neurotechnologies that allow brains to control machines—or machines to read or control brains—inevitably bring with them the threat of weaponization and misuse, a threat that existing UN conventions designed to limit biological and chemical weapons do not yet cover and to which ethical discussions of these new technologies tend to give short shrift. (It may seem like science fiction, but according to a September 2015 article in Foreign Policy, “The same brain-scanning machines meant to diagnose Alzheimer’s disease or autism could potentially read someone’s private thoughts. Computer systems attached to brain tissue that allow paralyzed patients to control robotic appendages with thought alone could also be used by a state to direct bionic soldiers or pilot aircraft. And devices designed to aid a deteriorating mind could alternatively be used to implant new memories, or to extinguish existing ones, in allies and enemies alike.”)

Despite the daunting complexity of the task, it’s time for the nations of the world to start closing these legal and ethical gaps—and taking other security precautions—if they hope to control the neuroweapons threat.

The technology on display in São Paulo, pioneered by Miguel Nicolelis of Duke University, exhibited the growing capability of neurorobotics—the study of artificial neural systems. The medical benefits for amputees and other patients are obvious, yet the power to read or manipulate human brains carries with it more nefarious possibilities as well, foreshadowing a bold new chapter in the long history of psychological warfare and opening another front in the difficult struggle against the proliferation of exceptionally dangerous weapons.

The full range of potential neuroweapons covers everything from stimulation devices to artificial drugs to natural toxins, some of which have been studied and used for decades, including by militaries. Existing conventions on biological and chemical weapons have limited research on, and stockpiling of, certain toxins and “neuro-microbiologicals” (such as ricin and anthrax, respectively), while other powerful substances and technologies—some developed for medical purposes and readily available on the commercial market—remain ungoverned by existing international rules. Some experts also worry about an ethics lag among scientists and researchers; as the September 2015 Foreign Policy article pointed out, a 200-page report put out last spring on the ethics of the Obama administration’s BRAIN Initiative didn’t once mention “dual use” or “weaponization.” In America, federally funded medical research with potential military applications can be regulated by Dual-Use Research of Concern policies at the National Institutes of Health, which reflect the general tenor of the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. Yet these policies do not account for research in other countries, or for research undertaken (or underwritten) by non-state actors. They might even create security concerns for the United States, should American efforts lag behind those of other states hiding behind the excuse of health research or routine experimentation, or behind commercial entities sheltered by industry norms protecting proprietary interests and intellectual property.

In addition to a more robust effort on the part of scientists to better understand and define the ethics of neuroscience in this new era, one obvious solution to the neuroweapons threat would be progress on the bioweapons convention itself. In preparation for the Biological Weapons Convention’s Eighth Review Conference at the end of this year, member states should establish a clearer view of today’s neuroscience and neurotechnology, a better understanding of present and future capabilities, and a realistic picture of emerging threats. They should also revise the current definitions of what constitutes a bioweapon, and what is weaponizable, and set up criteria to more accurately assess and analyze neuroscience research and development going forward.

I would also argue that the United States and its allies should take the proper security precautions in the form of increased surveillance of neuroscience R&D around the world. As a preliminary measure, government monitors can develop a better understanding of the field by paying attention to “tacit knowledge”—the unofficial know-how that accumulates among individuals in labs and other venues where a particular science is practiced or studied. (For more on tacit knowledge and arms control, see Sonia Ben Ouagrham-Gormley’s recent Bulletin article about its crucial importance for the bioweapons convention.) In a similar vein, authorities should also follow the neuroscience literature in an effort to assess trends, gauge progress, and profile emerging tools and techniques that could be enlisted for weaponization.
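To make the literature-watching idea concrete, here is one crude form it could take: counting publications per year for a neurotechnology search term via NCBI’s public PubMed E-utilities endpoint. This is a sketch only; the query term and year range below are illustrative assumptions, not a vetted indicator set, and raw trend-counting is at best a first-pass proxy for the deeper tacit-knowledge tracking described above.

```python
import json
import time
import urllib.parse
import urllib.request

# Count PubMed records per year for a neurotechnology search term, using
# NCBI's public E-utilities endpoint. The term and year range are purely
# illustrative; a real monitoring effort would use a vetted indicator set.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term: str, year: int) -> int:
    """Return the number of PubMed records matching `term` published in `year`."""
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": term,
        "datetype": "pdat",   # filter on publication date
        "mindate": str(year),
        "maxdate": str(year),
        "retmax": 0,          # we only need the total count, not the record IDs
        "retmode": "json",
    })
    with urllib.request.urlopen(f"{EUTILS}?{params}") as resp:
        return int(json.load(resp)["esearchresult"]["count"])

if __name__ == "__main__":
    term = '"brain-computer interface" OR "brain-machine interface"'
    for year in range(2010, 2016):
        print(year, pubmed_count(term, year))
        time.sleep(0.4)  # stay politely under NCBI's ~3 requests/second limit
```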

Of course these are only preliminary measures, easily stymied by proprietary restrictions in the case of commercial research and state-secret classifications in the case of government work. Thus deeper surveillance will require a wider effort to collect intelligence from a variety of sources and indicators, including university and industrial programs and projects that have direct dual-use applications; governmental and private investment in, and support of, neuroscience and neurotech R&D; researchers and scholars with specific types of knowledge and skills; product and device commercialization; and current and near-term military postures regarding neurotechnology. This type of surveillance, while requiring more nuanced and more extensive investigations, could produce highly valuable empirical models to plot realistic possibilities for the near future of neuroscience and neurotechnology. These could then be used to better anticipate threats and create contingency plans.

It’s important to note the danger of this type of surveillance as well. As a 2008 report by the National Academies in Washington warned, increased surveillance could lead to a kind of arms race, as nations react to new developments by creating countering agents or improving upon one another’s discoveries. This could be the case not only for incapacitating agents and devices but also for performance-enhancing technologies. As a 2014 report by the National Academies readily acknowledged, this type of escalation is a realistic possibility with the potential to affect international security.

The United States and its allies should therefore be cautious if they deem it necessary to establish this kind of deep surveillance. And on the international front, they should simultaneously support efforts to improve the Biological Weapons Convention to account for neuroweapons threats in the offing.

Finally, they should keep in mind just how hard it is to regulate neuroscience and neurotechnology during this time of great discovery and expansion. Ethical ideals can be developed to shape guidelines and policies that are sensitive to real-world scenarios, but the flexibility of these approaches also means that they are not conclusive. Those charged with monitoring potential threats must be constantly vigilant in the face of changing technologies and fuzzy distinctions between medical and military uses, all while navigating the complexities of the health-care industry, political and military ethics, and international law. In light of the work ahead, it remains to be seen just how well the nations of the world will rally to face the neuroweapons threat.

Author’s note: The views expressed in this article do not necessarily reflect those of DARPA, the Joint Staff, or the United States Department of Defense. Source: https://thebulletin.org

Smartphones uncover how the world sleeps


A pioneering study of worldwide sleep patterns combines math modelling, mobile apps and big data to parse the roles society and biology each play in setting sleep schedules. The study, led by University of Michigan mathematicians, used a free smartphone app that reduces jetlag to gather robust sleep data from thousands of people in 100 nations. The researchers examined how age, gender, amount of light and home country affect the amount of shut-eye people around the globe get, when they go to bed, and when they wake up.

Among their findings is that cultural pressures can override natural circadian rhythms, with the effects showing up most markedly at bedtime. While morning responsibilities like work, kids and school play a role in wake-time, the researchers say they're not the only factor. Population-level trends agree with what they would expect from current knowledge of the circadian clock.

"Across the board, it appears that society governs bedtime and one's internal clock governs wake time, and a later bedtime is linked to a loss of sleep," says Daniel Forger, who holds faculty positions in mathematics at the U-M College of Literature, Science, and the Arts, and in the U-M Medical School's Department of Computational Medicine and Bioinformatics. "At the same time, we found a strong wake-time effect from users' biological clocks - not just their alarm clocks. These findings help to quantify the tug-of-war between solar and social timekeeping."

When Forger talks about internal or biological clocks, he's referring to circadian rhythms - fluctuations in bodily functions and behaviors that are tied to the planet's 24-hour day. These rhythms are set by a grain-of-rice-sized cluster of 20,000 neurons behind the eyes. They're regulated by the amount of light, particularly sunlight, our eyes take in. Circadian rhythms have long been thought to be the primary driver of sleep schedules, even after the advent of artificial light and 9-to-5 work schedules. The new research helps to quantify the role that society plays.

Here's how Forger and colleague Olivia Walch arrived at their findings. Several years ago, they released an app called Entrain that helps travelers adjust to new time zones. It recommends custom schedules of light and darkness. To use the app, you have to plug in your typical hours of sleep and light exposure, and you are given the option of submitting your information anonymously to U-M. The quality of the app's recommendations depended on the accuracy of the users' information, and the researchers say this motivated users to be particularly careful in reporting their lighting history and sleep habits.

With information from thousands of people in hand, they then analysed it for patterns. Any correlations that bubbled up, they put to the test in what amounts to a circadian rhythm simulator. The simulator - a mathematical model - is based on the field's deep knowledge of how light affects the brain's suprachiasmatic nucleus (that's the cluster of neurons behind the eyes that regulates our internal clocks). With the model, the researchers could dial the sun up and down at will to see if the correlations still held in extreme conditions (a toy version of this kind of model is sketched below). "In the real world, bedtime doesn't behave how it does in our model universe," Walch says. "What the model is missing is how society affects that."

The spread of national averages of sleep duration ranged from a minimum of around 7 hours, 24 minutes of sleep for residents of Singapore and Japan to a maximum of 8 hours, 12 minutes for those in the Netherlands.
That's not a huge window, but the researchers say every half hour of sleep makes a big difference in terms of cognitive function and long-term health. The findings, the researchers say, point to an important lever for the sleep-deprived - a set that the Centers for Disease Control and Prevention is concerned about. A recent CDC study found that across the US, one in three adults aren't getting the recommended minimum of seven hours. Sleep deprivation, the CDC says, increases the risk of obesity, diabetes, high blood pressure, heart disease, stroke and stress. 
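The article doesn't reproduce the model itself, but the "dial the sun up and down" experiment is easy to picture with a toy version. The Python sketch below is a deliberately minimal phase-only oscillator nudged toward the day-night cycle by a square-wave light schedule; the period, coupling strength, and light function are illustrative assumptions, not the study's actual suprachiasmatic-nucleus model.

```python
import math

# A toy light-entrained circadian clock: one phase variable, phi, is pulled
# toward the phase of the 24-hour day whenever the lights are on. All the
# constants are illustrative assumptions, not values from the U-M model.
TAU = 24.2                 # intrinsic period of the internal clock, in hours
OMEGA = 2 * math.pi / TAU  # intrinsic angular frequency
K = 0.1                    # strength of the light's pull on the clock (assumed)
DT = 0.05                  # Euler integration step, in hours

def light(t, dawn=7.0, dusk=19.0):
    """Square-wave light schedule: 1.0 while the lights are on, else 0.0."""
    return 1.0 if dawn <= (t % 24.0) < dusk else 0.0

def entrained_phase(days=60, dawn=7.0, dusk=19.0):
    """Integrate dphi/dt = OMEGA + K * L(t) * sin(day_phase - phi)."""
    phi = 0.0
    for step in range(int(days * 24.0 / DT)):
        t = step * DT
        day_phase = 2 * math.pi * (t % 24.0) / 24.0
        phi += DT * (OMEGA + K * light(t, dawn, dusk) * math.sin(day_phase - phi))
    return phi % (2 * math.pi)

# "Dialing the sun up and down": extend evening light by four hours and see
# how far the entrained clock shifts - the toy analogue of a later bedtime.
normal = entrained_phase(dusk=19.0)
late = entrained_phase(dusk=23.0)
shift = ((late - normal + math.pi) % (2 * math.pi) - math.pi) / (2 * math.pi) * 24
print(f"phase shift from 4 extra hours of evening light: {shift:+.2f} hours")
```

The real simulator encodes detailed suprachiasmatic-nucleus dynamics rather than a single phase variable, but the logic of the experiment is the same: hold the model's biology fixed, change the light schedule, and see whether an observed correlation survives.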
The U-M researchers also found that:
  • Middle-aged men get the least sleep, often getting less than the recommended 7 to 8 hours.
  • Women schedule more sleep than men, about 30 minutes more on average. They go to bed a bit earlier and wake up later. This is most pronounced between the ages of 30 and 60.
  • People who spend some time in the sunlight each day tend to go to bed earlier and get more sleep than those who spend most of their time in indoor light.
  • Habits converge as we age. Sleep schedules were more similar among the older-than-55 set than those younger than 30, which could be related to a narrowing window in which older individuals can fall and stay asleep.
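Mechanically, findings like these are group-level aggregations over the app's anonymized reports. The sketch below shows what that step might look like, assuming a hypothetical tabular export with one row per user; the file name and column names are illustration-only assumptions, since the study's actual schema and pipeline are not described here.

```python
import pandas as pd

# Hypothetical aggregation over per-user sleep reports. The file name and
# column names are assumptions for illustration only.
records = pd.read_csv("entrain_reports.csv")  # columns: user_id, age, gender,
                                              # bedtime_hour, sleep_hours

# Gender gap in scheduled sleep (the study reports roughly 30 minutes).
print(records.groupby("gender")["sleep_hours"].mean())

# Convergence of habits with age: bedtime variability within each age band.
records["age_band"] = pd.cut(records["age"], bins=[0, 30, 55, 100],
                             labels=["under 30", "30-55", "over 55"])
print(records.groupby("age_band", observed=True)["bedtime_hour"].std())
```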
Sleep is more important than a lot of people realise, the researchers say. Even if you get six hours a night, you're still building up a sleep debt, says Walch, a doctoral student in the mathematics department and a co-author on the paper. "It doesn't take that many days of not getting enough sleep before you're functionally drunk," she said. "Researchers have figured out that being overly tired can have that effect. And what's terrifying at the same time is that people think they're performing tasks way better than they are. Your performance drops off but your perception of your performance doesn't."

Aside from the findings themselves, the researchers say the work demonstrates that mobile technology can be a reliable way to gather massive data sets at very low cost. "This is a cool triumph of citizen science," Forger said. The work is funded by the Army Research Laboratory, the Air Force Office of Scientific Research and the National Science Foundation. Source: domain-b.com

Study shows how the brain can trigger a deep sleep

Scientists have discovered that switching on one area of the brain chemically can trigger a deep sleep. The new study, which explored how sedatives work in the brain's neural pathways, could lead to better remedies for insomnia and more effective anaesthetic drugs.

Scientists from Imperial College London found that certain types of sedative drugs work by 'switching on' neurons in a particular area of the brain, called the preoptic hypothalamus. Their work, in mice, showed that it is these neurons that are responsible for shutting down the areas of the brain that are inactive during deep sleep.

Following a period of sleep deprivation, the brain triggers a process that leads to a deep recovery sleep. The researchers found that the process that is triggered by the sedatives is very similar. In mice, when the researchers used a chemical to activate only specific neurons in the preoptic hypothalamus, this produced a recovery sleep in the animals.

The new research is important because although scientists understand how sedatives bind to certain receptors to cause their desired effects, it had previously been assumed that they had a general effect throughout the brain. The knowledge that one distinct area of the brain triggers this kind of deep sleep paves the way for the development of better targeted sedative drugs and sleeping pills. These new drugs could directly hijack this natural mechanism to work more effectively, with fewer side effects and shorter recovery times.

"If you don't sleep for a long period, your body shuts down – almost as if you had taken a drug," said study co-author Professor Bill Wisden, from the Department of Life Sciences at Imperial College London. "We've shown that sedative drugs trigger the same neurons, making the two types of unconsciousness very similar."

"Although we know that certain sedatives are effective, there are lots of gaps in scientists' knowledge in terms of precisely what sedatives are doing in the brain. We looked at the class of sedative drugs commonly used for patients undergoing investigative procedures or minor operations, to try and identify the circuitry in the brain that they are affecting," explained Nick Franks, also from the Department of Life Sciences at Imperial College London. "What we found was really striking. Most people might think that sedative drugs would work by directly shutting down certain neural pathways but actually what happened was that they first switched on one particular area – the preoptic hypothalamus – and this then caused other parts of the brain to shut down."

"Lack of sleep is a really serious problem for many people, such as people suffering from stress or people working irregular shifts, and it affects their physical and mental health," added Professor Wisden. "There are many different sleeping pills available but none of them provide rest that is as restorative as natural sleep. We hope that our new research will ultimately lead to new ways of addressing this problem."

In the study, published in Nature Neuroscience, the researchers used a genetic tagging system to mark neurons in mice that were activated both during sedation and in recovery sleep. When the researchers subsequently targeted those neurons in the mice with a selective chemical, this was sufficient to produce a recovery sleep in the mice. The team plan to continue their investigations into sleep induction in the brain, to try to understand more of the complex chemical circuitry governing our response to tiredness.
The research is funded by the Medical Research Council, the Biotechnology and Biological Sciences Research Council, the Wellcome Trust, the UK-China Scholarships for Excellence Scheme, and the ERASMUS Program.