Cambridge, U.K.
Kingsley Amis, the British novelist, once explained that everything that had gone wrong with his country in the second half of the last century could be summarized in the word “workshop.” His point is sound. No two syllables better conjure up the mandatory “sharing,” the regimented bonhomie and bogus cheerfulness, the mincing and posturing, the smiley-faced Maoism that descended upon corporate and academic culture a generation ago and shows no signs of abating. The word alone suggests a string of horrifying cognates: “teamwork,” “role playing,” “brainstorming,” “trust building,” “leadership” . . . Brrrr.
I think I’ve found a workshop Amis would have approved of, however, if only because it wasn’t like a workshop at all—no falling backwards into your colleagues’ threaded arms, no happy talk about building your brand. Its title, “The Uses and Abuses of Biology,” referred to a series of papers commissioned by the Faraday Institute at St. Edmund’s College of Cambridge University and presented there in late September. The purpose was to discover how evolutionary biology is used to illuminate economics, sociology, education, religion, ethics, philosophy, and other academic disciplines, and whether it can illuminate anything beyond itself. The conclusion was surprising and uplifting.
Evolutionary biology is imperialistic, overtaking entire fields of endeavor simply by attaching the prefix bio- or neuro- to their names: bioethics, neuroeconomics, even, God help us, neurotheology. Its logic is deployed against hapless laymen as a bully’s truncheon or an argument stopper. A famous example of biological imperialism was offered by one of the greatest biologists of them all, Francis Crick, who believed his discovery (with James Watson) of the structure of DNA had exposed all philosophical problems, from free will to the nature of the self, as meaningless.
“You,” he wrote, “your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules. Who you are is nothing but a pack of neurons.”
For 50 years this reductionism has been a prevailing view among biologists and their publicists in academic philosophy and science journalism. It’s particularly common when the subject turns to neurobiology, the biological study of the brain. Neuroscience has been greatly aided as a popular topic by functional magnetic resonance imaging, or fMRI. With their familiar shapes and pretty colors, brain scans provide editors and art designers with dynamic, off-the-shelf images to illustrate stories about how the brain determines what we do and why we think we do it. The complexities of the brain often come off as quite simple, forming a tidy causal link from brain to behavior: If your amygdala is turning magenta, you must be in love. And the reverse: If you’re in love, it’s because your amygdala is turning magenta.
“Everybody’s interested in the brain and likes to talk about it, because everybody’s got one,” said Duncan Astle, a researcher at Cambridge’s Cognition and Brain Sciences Unit. But the popularity of neuroscience, along with the loose talk of journalists and other popularizers, has led to a large number of what Astle called “neuromyths.”
Not surprisingly, educational institutions, which are staffed by education majors, are especially vulnerable. School districts in the United States and the U.K. have spent millions in public money on “Brain Gym” and other programs said to specialize in “neuro-sculpting,” “brain training,” and “mental fitness.” The idea is that specific physical exercises can increase student learning far beyond the undoubted aerobic benefits. Try this: Massage your chest just below the clavicle with your extended thumb and second finger while rubbing your navel with your other palm. Feel smarter? Maybe you’re not doing it right.
There is no persuasive evidence supporting the claims of the neuro-sculptors and brain gym coaches, just as there isn’t any support for the popular, and allegedly scientific, belief that “right brain learning” is somehow different from “left brain learning.” “We use both sides of our brain for most tasks,” Astle said. Nor has any experimental basis been found for the theory of the three learning styles—auditory, visual, and tactile—that many educators now accept as dogma. “Everybody pretty much learns the same way,” Astle said. The idea of learning styles, pounded into children from an early age, can even impede learning. Convince a child over years of schooling that he’s an auditory learner, and he won’t learn as well when he thinks he’s being taught visually, even though the lesson itself is identical.
Brain scans are commonly brought out when fads like this are questioned. But fMRI, as Astle noted, measures only changes in blood flow: Any neural activity not reflected in blood flow won’t be captured by the image of the brain. In “reading” a scan, we can only infer brain activity, not observe it. Details like this are ignored when pop scientists set out to prove that every task we perform is “caused” by some neurobiological activity that can be isolated and measured. As other researchers noted at the workshop, the recent publication of Brainwashed: The Seductive Appeal of Mindless Neuroscience, by Sally Satel and Scott Lilienfeld, has done much to introduce a note of cautionary realism into our MRI obsession and the reductionism it often feeds.
Skepticism isn’t as contagious as credulity, especially in journalism, but it can spread in unexpected places. The James S. McDonnell Foundation, once known as a “neuromill” for all the brain studies it funded, has added a disclaimer to its grant guidelines. “Proposals proposing to use functional imaging to identify the ‘neural correlates’ of cognitive or behavioral tasks (for example, mapping the parts of the brain that ‘light up’ when different groups of subjects play chess, solve physics problems, or choose apples over oranges) are not funded through this program. In general, JSMF and its expert advisors have taken an unfavorable view of . . . functional imaging studies using poorly characterized tasks as proxies for complex behavioral issues involving empathy, moral judgments, or social decision-making.”
Daniel McKaughan, a philosopher at Boston College, offered the workshoppers a textbook example of a neuromyth in action: the oxytocin fad. There are signs the fad may be in remission, but it was goosed along a few years ago with a pop science book by a “neuroeconomist” named Paul J. Zak, who doubles as a professor of neurology at Loma Linda University Medical Center. The book’s title, The Moral Molecule: The Source of Love and Prosperity, guaranteed that it would be feted by a less-than-skeptical press, and Zak has gone on to fame as a speaker at conferences and (of course) workshops around the world.
The molecule of his title is oxytocin, naturally created in the brain. Various experiments in the 2000s showed that a supplemental whiff of the stuff could elevate levels of trust, compassion, and sociability in human subjects. In this way, Zak reasoned, oxytocin could be said to control your moral sense and mine. Hence the “moral molecule”!
“We can turn the [moral] behavioral response on and off like a garden hose,” Zak told an interviewer for the Wall Street Journal. (Among serious news outlets, the Journal has a uniquely ravenous appetite for the claims of neurononsense.)
As McKaughan showed, however, the performance of oxytocin is hardly uniform. Along with its happy effects, oxytocin can have unhappy ones. Depending on the subject and situation, oxytocin is associated with feelings of envy, group prejudice, estrangement, favoritism, irritability, and schadenfreude (and not only in Germans). Under many conditions, McKaughan pointed out, the moral molecule can also double as the immoral molecule.
And so it goes with the specious scheme of cause and effect in pop biology. We get the “monogamy gene”—or the “cheatin’ gene,” as the newsreader Brian Williams labeled it for viewers of the NBC Nightly News—and the “God gene” and also, though you’ll think I’m kidding, the Kobe Bryant neuron. That one lights up on a brain scan when subjects are shown pictures of Kobe Bryant.
Neuroscientists are getting better at policing their ranks: these days you are more likely to hear the “moral molecule” dismissed as the “hype hormone.” The objection to such boneheaded fantasies isn’t that behavior is unrelated to brain activity; behavior may even, in some sense, be caused by it. The objection is that brain activity, like the operation of genes, is almost infinitely complex, and its chain of cause and effect has scarcely begun to be understood, much less “mapped.” Simplifying it for the sake of marketing or journalism is a cheat on the public, which is too busy with other business to question the neuromyths. It also serves to advance the reductionism advocated by Crick and his fellow determinists.
And what’s wrong with that? The commonest criticism of reductionism—the idea that we are a pack of neurons and nothing more—is that it will lead us to treat our fellow human beings as if . . . well, as if they were a pack of neurons and nothing more. John Evans, a sociologist of religion at the University of California, San Diego, has set about testing whether the criticism has any merit.
In the Western humanistic tradition, he finds three definitions of the human: the theological, the philosophical, and the biological (his term for reductionist). Evans arranged a large national survey of a random sample of Americans. He asked them which of the three definitions they agreed or disagreed with most. Women, churchgoers, and conservatives were more likely than men, non-churchgoers, and liberals to disagree with the reductionist account of human life. Then he asked a series of questions designed to elicit their attitudes toward behavior. Were they in favor of allowing experiments on prisoners without their consent? Selling human organs for profit? Allowing people to commit suicide in order to save money? Intervening to stop genocide?
Sure enough, he found that people who hold the reductionist view—who deny the special status of the human species in nature, who believe behavior is determined by physical processes alone—were far more likely to countenance the maltreatment of human beings. Evans can’t draw conclusions about whether determinism causes those views. But the correlations between them, he said, are unmistakable.
You can question the reliability of any survey, however large, on such complicated questions. It offers little more than a glimpse of what people say they might do rather than of how they would truly behave. And of course it does nothing to confirm or deny the ultimate truth of the determinism that still has so many biologists and science writers in its grip.
But the grip is loosening. This is thanks in part to the new field of epigenetics, which suggests that environmental factors can alter the way genes are expressed and that some of those changes may be passed on to our children—an idea deemed heretical since the dawn of modern genetics. Some evolutionary biologists have even begun to speak timorously of “predictable evolution,” a process in which certain patterns recur and to which evolutionary adaptations conform. More heresy: Nothing can rile a dogmatic biological determinist like the suggestion that evolution might point in a certain direction or have anything like an ordained outcome. Who knows where such thinking might lead?
The great quantum physicist (turned Anglican priest) John Polkinghorne once noted that very few physicists, a century ago, doubted that the mechanical model of Newtonian physics was the whole truth about how the world works. Yet today, after a hundred years of relativity and quantum mechanics, not to mention dark matter and quarks and Higgs bosons, comprehensive certainty in physics is impossible. Nowadays, Polkinghorne said, evolutionary biology is in the position of physics a hundred years ago: a young discipline full of certainties—dogmas, really—that are soon to crumble in the face of greater understanding.
Some of us will consider this wonderful news, even if it takes a whole series of workshops to spread it.
Andrew Ferguson is a senior editor at The Weekly Standard.