Neuro-imaging can now trace the processes that underlie our ability to tell right from wrong. It allows us to examine the brains of psychopaths, and to map the areas involved in guilt, indignation, charity and racial prejudice. Recent studies have identified brain areas sensitive to fairness, and even common neural pathways in the frontal and temporal lobes that are active whenever we make a moral judgement: brain areas that might embody our very “moral sense.” But what is unclear, and what both worries and excites ethicists, is how these advances might change the way we think about ethics.
To what extent is morality built into the brain? Experiments on chimpanzees suggest that they possess a primitive sense of fairness. The Harvard psychologist Marc Hauser postulates that all humans share an innate “moral grammar,” while Jonathan Haidt, a psychologist at the University of Virginia, sees the basis of morality in a far messier set of conflicting emotional responses, shaped by both nature and nurture. But if there is a common biological basis to human morality, it must be one compatible with a vast diversity of moral outlooks, and it offers little help in overcoming the profound disagreements that seem inherent to morality and politics.
Science has also begun to shed light on the biological sources of moral conflict. One important strand of recent research has focused on disgust. At some point in our evolutionary history, primitive disgust reactions (to toxic or contaminated food, for instance) seem to have extended into our moral lives. Haidt and others have shown that this link can be triggered experimentally: ask people a moral question in a messy rather than a tidy environment, and the messier the setting, the harsher their moral judgements tend to be. More astoundingly, Haidt and his colleague Thalia Wheatley have used hypnosis to prime people to feel a sharp pang of disgust whenever they read the word “often” in a sentence. The result was an inclination to treat even innocent acts as morally suspect: subjects would claim that the sentence “he often makes decisions quickly” implied an immoral intent (which it clearly does not), but would change their minds when the offending word was removed.
Going further, neuro-imaging has located overlapping neural networks common to physical disgust and moral repugnance. Psychologists at the University of Toronto have reported that people who are treated unfairly automatically activate the facial muscle central to the expression of disgust. Other researchers are now applying such insights in more contentious settings, even examining the role of disgust in politics, on the hypothesis that some traditionally conservative attitudes (such as views on homosexuality) may in fact be rooted in feelings of disgust.
Why does any of this matter? In part because, by understanding the neural basis of opposing moral views, we might get a better grip on why people sometimes seem to inhabit different moral universes. But underneath this lies the more intriguing possibility that neuroscience could help us decide which side is right. Some liberal thinkers, such as the philosopher Martha Nussbaum, have argued that conservative views could be undermined if we can trace their roots to this primitive feeling of disgust. Others think we should reject, or at least question closely, moral views that neuro-imaging reveals to be based in the affective centres of our brains rather than in those associated with reason and judgement.
This is an interesting idea, but the suggestion that we could resolve our most profound moral disagreements in this way is one we should treat with deep scepticism. Why? In part because we must be wary of inferring moral principles from the way the world happens to be, of what philosophers sometimes describe as deriving “an ought from an is.” The idea that we can debunk moral beliefs by tracing their origins or “genealogy” is an old one, familiar from the writings of Nietzsche and Marx.
So, should we dismiss moral intuitions that neuroscientists show to be based in mere emotion, or in evolutionarily older parts of our brains? The problem is that snapshots of the brain cannot on their own tell us whether emotion clouds our moral thinking, or is absolutely necessary for sound judgement. Arguments from “neural is” to “moral ought” often make an interpretative leap well beyond the scientific facts, illicitly smuggling in moral assumptions that can only be established by ethical reflection. Fascinating as the neuroscientific findings may be, there is no escaping the hard task of thinking through moral questions and striving, in the face of recalcitrant disagreement, to achieve consensus.