Nineteen years ago today, on July 17, 1990, as scientists embarked on the mapping of the human genome, Congress and then-President George H.W. Bush proclaimed the 1990s the 'Decade of the Brain,' stating:
The human brain, a 3-pound mass of interwoven nerve cells that controls our activity, is one of the most magnificent--and mysterious--wonders of creation.
The seat of human intelligence, interpreter of senses, and controller of movement, this incredible organ continues to intrigue scientists and layman alike.
Over the years, our understanding of the brain--how it works, what goes wrong when it is injured or diseased--has increased dramatically. However, we still have much more to learn. ...
While the learning continues, the effort embedded the field of neuroscience in the nation's consciousness, and the '90s saw a remarkable evolution in our study and understanding of this master organ. Some top examples included:
- The advent of modern brain-imaging technology, today an indispensable tool for detecting both traumatic brain injury and even post-traumatic stress in injured troops.
- Scientists' success in using primate (monkey) brain impulses to physically move robotic arms, a sort of mind-machine melding akin to a special Navy helicopter flight suit that gives pilots hands-free, precision control of their craft.
- The discovery that new human brain cells can, indeed, be created, shattering the previous notion that we are born with a finite number.
Two questions follow: What is the mind? And what of the military?
In educational interest, article(s) quoted from extensively.
To answer the first question, an excerpt from An Interpersonal Neurobiology of Psychotherapy: The Developing Mind and the Resolution of Trauma by Daniel J. Siegel, which is Chapter 1 in the book Healing Trauma: Attachment, Mind, Body and Brain edited by Marion F. Solomon and Daniel J. Siegel:
But what is the mind?
One way to address this important question is by looking at the definition of the psyche. Webster's Dictionary defines psyche as follows: "1. the soul; 2. the intellect; and 3. in psychiatry -- the mind considered as a subjectively perceived, functional entity, based ultimately upon physical processes but with complex processes of its own: it governs the total organism and its interaction with the environment." ...
A variety of disciplines explore the nature of the mind in its ability to process information and to regulate the function of the individual in adapting to the environment. These various conceptualizations of mind often share the notion that the mind is more than a physical entity -- such as brain activity alone -- and yet emerges from and also regulates the "self" and the physiological processes from which it emerges. ...
Recent technological advances have permitted truly new insights into the nature of the mind. For example, our modern view of the brain and its response to experience has shed some new light on how experience directly affects gene function, neuronal connections, and the organization of the mind.
Most such advances now fall under the neuroscience umbrella. Author Steven Rose writes in his book, The Future of the Brain: The Promise and Perils of Tomorrow's Neuroscience:
Formal designations apart, the huge expansion of the neurosciences which has taken place over recent years has led many to suggest that the first ten years of this new century should be claimed as The Decade of the Mind. Capitalising on the scale and technological success of the Human Genome Project, understanding -- even decoding -- the complex interconnected web between the languages of the brain and those of mind has come to be seen as science's final frontier. With its hundred billion nerve cells, with their hundred trillion interconnections, the human brain is the most complex phenomenon in the known universe -- always, of course, excepting the interaction of some six billion such brains and their owners within the socio-technological culture of our planetary ecosystem!
The global scale of the research effort now put into the neurosciences, primarily in the US, but closely followed by Europe and Japan, has turned them from classical 'little sciences' into a major industry engaging large teams of researchers, involving billions of dollars from government -- including its military wing -- and the pharmaceutical industry. The consequence is that what were once disparate fields -- anatomy, physiology, molecular biology, genetics and behavior -- are now all embraced within 'neurobiology.' However, its ambitions have reached still further, into the historically disputed terrain between biology, psychology and philosophy; hence the more all-embracing phrase: 'the neurosciences.'
In May, New Scientist writer Linda Geddes reported on ways the military was harnessing science to create the ultimate warrior:
Battalions of super-soldiers could be selected for specific duties on the basis of their genetic make-up and then constantly monitored for signs of weakness. So says a report by the US National Academies of Science (NAS).
If a soldier is struggling, a digital "buddy" might step in and warn them about nearby threats, or advise comrades to zap them with an electromagnet to increase their alertness. If the whole unit is falling apart, biosensors could warn central commanders to send in a replacement team.
As advances in neuroscience bring all this into the realms of reality, there are ethical issues to consider. Last week, the NAS released a report assessing the military potential of neuroscience, providing a rare insight into how the military might invest its money to create future armies.
Sponsored by the US army and written by a panel of 14 prominent neuroscientists, the report focuses on those areas with "high-payoff potential" - where the science is sufficiently reliable to turn into useful technologies (see "Where should the money go?"). ...
Within five years, biomarkers might be used to assess how well a soldier's brain is functioning, and within 10 years, it should be possible to predict how individuals are likely to respond to environmental stresses like extreme heat and cold, or endurance exercises.
Genetic testing might also enable recruitment officers to determine which soldiers are best for specialist jobs. For example, by combining psychological testing with genetic tests for levels of brain chemicals, a clearer picture of a soldier's competencies might shine through. "We might say that given this person's high levels of brain serotonin, they're going to be calmer under pressure, so they might make a good sniper," says Paul Zak of Claremont Graduate University in California, who was on the NAS panel. Alternatively, someone with low dopamine might be less likely to take risks, he says, and therefore be better suited as a commanding officer in a civilian area. ...
"There are lots of stories of soldiers who refuse to shoot other soldiers," says Zak. "If you could get rid of that empathy response you might create a soldier that's more prepared to engage in battle and risk their life."
The panel recognised that such ethical dilemmas might be an inevitable consequence of their work. For this reason, they recommended that the US military should recruit ethicists to examine the ramifications of such developments before they occur. "They need to be explored because at some point someone's going to do them," says Zak. "Controls have to be put in place."
For our purposes, an April 2007 Bulletin of the Atomic Scientists post by George Mason University professor Hugh Gusterson offers a solid look at the militarization of neuroscience, giving a brief history as well as a peek at possible future outcomes of the mushrooming relationship:
We've seen this story before: The Pentagon takes an interest in a rapidly changing area of scientific knowledge, and the world is forever changed. And not for the better.
During World War II, the scientific field was atomic physics. Afraid that the Nazis were working on an atomic bomb, the U.S. government mounted its own crash project to get there first. The Manhattan Project was so secret that Congress did not know what it was funding and Vice President Harry S. Truman did not learn about it until FDR's death made him president. In this situation of extreme secrecy, there was almost no ethical or political debate about the Bomb before it was dropped on two cities by a bureaucratic apparatus on autopilot.
Despite J. Robert Oppenheimer's objections, a few Manhattan Project scientists organized a discussion on the implications of the "Gadget" for civilization shortly before the bomb was tested. Another handful issued the Franck Report, advising against dropping the bomb on cities without a prior demonstration and warning of the dangers of an atomic arms race. Neither initiative had any discernible effect. We ended up in a world where the United States had two incinerated cities on its conscience, and its pursuit of nuclear dominance created a world of nuclear overkill and mutually assured destruction.
This time we have a chance to do better. The science in question now is not physics, but neuroscience, and the question is whether we can control its militarization.
According to Jonathan Moreno's fascinating and frightening new book, Mind Wars: Brain Research and National Defense (Dana Press 2006), the Defense Advanced Research Projects Agency has been funding research in the following areas:
- Mind-machine interfaces ("neural prosthetics") that will enable pilots and soldiers to control high-tech weapons by thought alone.
- "Living robots" whose movements could be controlled via brain implants. This technology has already been tested successfully on "roborats" and could lead to animals remotely directed for mine clearance, or even to remotely controlled soldiers.
- "Cognitive feedback helmets" that allow remote monitoring of soldiers' mental state.
- MRI technologies ("brain fingerprinting") for use in interrogation or airport screening for terrorists. Quite apart from questions about their error rate, such technologies would raise the issue of whether involuntary brain scans violate the Fifth Amendment right against self-incrimination.
- Pulse weapons or other neurodisruptors that play havoc with enemy soldiers' thought processes.
- "Neuroweapons" that use biological agents to excite the release of neurotoxins. (The Biological and Toxin Weapons Convention bans the stockpiling of such weapons for offensive purposes, but not "defensive" research into their mechanisms of action.)
- New drugs that would enable soldiers to go without sleep for days, to excise traumatic memories, to suppress fear, or to repress psychological inhibitions against killing.
Moreno's book is important since there has been little discussion about the ethical implications of such research, and the science is at an early enough stage that it might yet be redirected in response to public discussion.
While many have missed the first years of this discussion, the topic is now breaking into the wider public square for review and consideration. A good place to begin pondering this intersection of the military and the field of neuroscience is the following highly recommended video.
In a program running a little over 50 minutes, WPSU's Conversations from Penn State recently interviewed the above-mentioned Jonathan Moreno. From their intro:
Super soldiers equipped with neural implants, suits that contain biosensors, and thought scans of detainees may become reality sooner than you think. Find out how neuroscience is changing modern warfare, and discover the ethical implications with guest Jonathan Moreno.
For more on the topic, download Chapter 8, "Toward an Ethics of Neurosecurity," [pdf] of Moreno's book. An excerpt:
One theme I want to highlight in this chapter is the need for the scientific community to be more engaged in dealing with the unintended consequences of their work. Michael Moodie, the former director of the Chemical and Biological Arms Control Institute, has observed that “the attitudes of those working in the life sciences contrast sharply with the nuclear community.
Physicists since the beginning of the nuclear age, including Albert Einstein, understood the dangers of atomic power, and the need to participate actively in managing these risks. The life sciences sectors lag in this regard. Many neglect thinking about the potential risks of their work.” My experience suggests that an increased sense of the need to be publicly involved is taking hold among life scientists, especially in the face of recent controversies about stem cell research and intelligent design. Questions of dual use also require the informed engagement of our best scientific thinkers.
The dual use issue becomes more pressing as the science becomes more powerful and as more people possess the knowledge to apply it. We’ve seen that the applications of neuroscience and other brain-targeting fields to national security are no longer in the realms of science fiction or paranoid fantasy, that tremendous advances have been and are being made in understanding the way the brain works and, more slowly, in modifying it.
Even though some of the claims that are being made are likely exaggerated, especially by companies trying to sell their products, not all are. Separating the wheat from the chaff is a challenge, but it does seem clear that the fascinating science I’ve described is on a course that, although not wholly predictable, almost surely points to greater understanding of and control over brain-related processes, and from various approaches.
When I’ve raised questions about the dual use implications of advances in brain science and technology at scientific meetings, many neuroscientists are surprised. Although they may receive Pentagon or CIA funding, brain scientists generally don’t regard themselves as contributing to warfare. Those whose research is funded wholly by civilian agencies are taken aback when I suggest that their published results might well be examined by national security agencies to assess their implications. A number of neuroscientists have told me that they have received phone calls out of the blue from security officials interested in their work in areas such as monitoring or altering neural processes.
Among those researchers who do accept national security agency funding, some tend to dismiss the idea that anything of military use will come of their research. Some believe, or prefer to believe, that they can manipulate their funding sources so that they can do the work they want to do without serving the goals of their benefactors, and that their results are going to be benign no matter what others might be looking for.
Often it’s true that scientists are smart enough to get their grants without delivering the goods their funders want. But in the long run, as enough knowledge is gathered, the opportunities for dual use can’t be completely avoided. For those who are deeply concerned about the exploitation of science for military purposes, an obvious answer seems to be that the scientific community should simply swear off cooperation with national security agencies, including accepting research contracts. Call this the purist approach. Based on some historical experience I shall elaborate, I believe the purist answer is shortsighted.
In the real world, this kind of research is going to continue, and it’s best that university researchers be those who do it, rather than building top secret science fortresses with researchers who are not answerable to anyone but their commanders. It is critical for the well-being of our democratic society that the civilian scientific community is kept in the loop and that the rest of us can have at least a general idea of the kind of work that is being done, even though for legitimate reasons many of the details may not be generally available.