The Science Of Morality


MrsSpringsteen

Is Morality Natural?

Science is tracing the biological roots of our intuitive sense of what is right and what is wrong.
Marc D. Hauser, Ph.D.
NEWSWEEK
Updated: 1:56 PM ET Sep 13, 2008

On Jan. 2, 2007, a large woman entered the Cango caves of South Africa and wedged herself into the only exit, trapping 22 tourists behind her. Digging her out appeared not to be an option, which left a terrible moral dilemma: take the woman's life to free the 22, or leave her to die along with her fellow tourists? It is a dilemma because it pushes us to decide between saving many and using someone else's life as a means to this end.

A new science of morality is beginning to uncover how people in different cultures judge such dilemmas, identifying the factors that influence judgment and the actions that follow. These studies suggest that nature provides a universal moral grammar, designed to generate fast, intuitive and universally held judgments of right and wrong.

Consider yourself a subject in an experiment on the Moral Sense Test, a site presenting dilemmas such as these: Would you drive your boat faster to save the lives of five drowning people knowing that a person in your boat will fall off and drown? Would you fail to give a drug to a terminally ill patient knowing that he will die without it but his organs could be used to save three other patients? Would you suffocate your screaming baby if it would prevent enemy soldiers from finding and killing you both, along with the eight others hiding out with you?

These are moral dilemmas because there are no clear-cut answers that obligate duty to one party over the other. What is remarkable is that people with different backgrounds, including atheists and those of faith, respond in the same way. Moreover, when asked why they make their decisions, most people are clueless, but confident in their choices. In these cases, most people say that it is acceptable to speed up the boat, but iffy to withhold care from the patient. Although many people initially respond that it is unthinkable to suffocate the baby, they later often say that it is permissible in that situation.

Why these patterns? Cases 1 and 3 require actions, case 2 the omission of an action. All three cases result in a clear win in terms of lives saved: five, three and nine over one death. In cases 1 and 2, one person is made worse off, whereas in case 3, the baby dies no matter what choice is made. In case 1, the harm to the one arises as a side effect. The goal is to save five, not drop off and drown the one. In case 2, the goal is to end the life of the patient, as he is the means to saving three others.

Surprisingly, our emotions do not appear to have much effect on our judgments about right and wrong in these moral dilemmas. A study of individuals with damage to an area of the brain that links decision-making and emotion found that when faced with a series of moral dilemmas, these patients generally made the same moral judgments as most people. This suggests that emotions are not necessary for such judgments.

Our emotions do, however, have a great impact on our actions. How we judge what is right or wrong may well be different from what we choose to do in a situation. For example, we may all agree that it is morally permissible to kill one person in order to save the lives of many. When it comes to actually taking someone's life, however, most of us would turn limp.

Another example of the role that emotions have on our actions comes from recent studies of psychopaths. Take the villains portrayed by Heath Ledger and Javier Bardem, respectively, in "The Dark Knight" and "No Country for Old Men." Do such psychopathic killers know right from wrong? New, preliminary studies suggest that clinically diagnosed psychopaths do recognize right from wrong, as evidenced by their responses to moral dilemmas. What is different is their behavior. While all of us can become angry and have violent thoughts, our emotions typically restrain our violent tendencies. In contrast, psychopaths are free of such emotional restraints. They act violently even though they know it is wrong because they are without remorse, guilt or shame.

These studies suggest that nature handed us a moral grammar that fuels our intuitive judgments of right and wrong. Emotions play their strongest role in influencing our actions—reinforcing acts of virtue and punishing acts of vice. We generally do not commit wrong acts because we recognize that they are wrong and because we do not want to pay the emotional price of doing something we perceive as wrong.

So, would you have killed the large woman stuck in the cave or allowed her to die with the others? If you are like other subjects taking the moral sense test, you would say that it is permissible to take her life because you don't make her worse off. But could you really do it? Fortunately, there was a simpler solution: she was popped out with paraffin after 10 hours.



Moral Sense Test

moral.wjh.harvard.edu
 
It's not solely about morality, but this interesting article concerns the difficulty humans have in disentangling the metaphorical from the literal, with major implications for our moral decisionmaking, among other things.

New York Times, Nov. 14

...Consider an animal (including a human) that has started eating some rotten, fetid, disgusting food. As a result, neurons in an area of the brain called the insula will activate. Gustatory disgust. Smell the same awful food, and the insula activates as well. Think about what might count as a disgusting food (say, taking a bite out of a struggling cockroach). Same thing. Now read in the newspaper about a saintly old widow who had her home foreclosed by a sleazy mortgage company, her medical insurance canceled on flimsy grounds, and got a lousy, exploitative offer at the pawn shop where she tried to hock her kidney dialysis machine. You sit there thinking, those bastards, those people are scum, they’re worse than maggots, they make me want to puke…and your insula activates. Think about something shameful and rotten that you once did…same thing. Not only does the insula “do” sensory disgust; it does moral disgust as well. Because the two are so viscerally similar. When we evolved the capacity to be disgusted by moral failures, we didn’t evolve a new brain region to handle it. Instead, the insula expanded its portfolio.

Or consider pain. Somebody pokes your big left toe with a pin. Spinal reflexes cause you to instantly jerk your foot back just as they would in, say, a frog. Evolutionarily ancient regions activate in the brain as well, telling you about things like the intensity of the pain, or whether it’s a sharp localized pain or a diffuse burning one. But then there’s a fancier, more recently evolved brain region in the frontal cortex called the anterior cingulate that’s involved in the subjective, evaluative response to the pain. A piranha has just bitten you? That’s a disaster. The shoes you bought are a size too small? Well, not as much of a disaster. Now instead, watch your beloved being poked with the pin. And your anterior cingulate will activate, as if it were you in pain. There’s a neurotransmitter called Substance P that is involved in the nuts and bolts circuitry of pain perception. Administer a drug that blocks the actions of Substance P to people who are clinically depressed, and they often feel better, feel less of the world’s agonies. When humans evolved the ability to be wrenched with feeling the pain of others, where was it going to process it? It got crammed into the anterior cingulate. And thus it “does” both physical and psychic pain.

Another truly interesting domain in which the brain confuses the literal and metaphorical is cleanliness. In a remarkable study, Chen-Bo Zhong of the University of Toronto and Katie Liljenquist of Northwestern University demonstrated how the brain has trouble distinguishing between being a dirty scoundrel and being in need of a bath. Volunteers were asked to recall either a moral or immoral act in their past. Afterward, as a token of appreciation, Zhong and Liljenquist offered the volunteers a choice between the gift of a pencil or of a package of antiseptic wipes. And the folks who had just wallowed in their ethical failures were more likely to go for the wipes. In the next study, volunteers were told to recall an immoral act of theirs. Afterward, subjects either did or did not have the opportunity to clean their hands. Those who were able to wash were less likely to respond to a request for help (that the experimenters had set up) that came shortly afterward. Apparently, Lady Macbeth and Pontius Pilate weren’t the only ones to metaphorically absolve their sins by washing their hands.

This potential to manipulate behavior by exploiting the brain’s literal-metaphorical confusions about hygiene and health is also shown in a study by Mark Landau and Daniel Sullivan of the University of Kansas and Jeff Greenberg of the University of Arizona. Subjects either did or didn’t read an article about the health risks of airborne bacteria. All then read a history article that used imagery of a nation as a living organism with statements like, “Following the Civil War, the United States underwent a growth spurt.” Those who read about scary bacteria before thinking about the U.S. as an organism were then more likely to express negative views about immigration.

Another example of how the brain links the literal and the metaphorical comes from a study by Lawrence Williams of the University of Colorado and John Bargh of Yale. Volunteers would meet one of the experimenters, believing that they would be starting the experiment shortly. In reality, the experiment began when the experimenter, seemingly struggling with an armful of folders, asked the volunteer to briefly hold their coffee. As the key experimental manipulation, the coffee was either hot or iced. Subjects then read a description of some individual, and those who had held the warmer cup tended to rate the individual as having a warmer personality, with no change in ratings of other attributes.

Another brilliant study by Bargh and colleagues concerned haptic sensations (I had to look the word up—haptic: related to the sense of touch). Volunteers were asked to evaluate the resumes of supposed job applicants where, as the critical variable, the resume was attached to a clipboard of one of two different weights. Subjects who evaluated the candidate while holding the heavier clipboard tended to judge candidates to be more serious, with the weight of the clipboard having no effect on how congenial the applicant was judged. After all, we say things like “weighty matter” or “gravity of a situation.”

What are we to make of the brain processing literal and metaphorical versions of a concept in the same brain region? Or that our neural circuitry doesn’t cleanly differentiate between the real and the symbolic? What are the consequences of the fact that evolution is a tinkerer and not an inventor, and has duct-taped metaphors and symbols to whichever pre-existing brain areas provided the closest fit?

Jonathan Haidt, of the University of Virginia, has shown how viscera and emotion often drive our decisionmaking, with conscious cognition mopping up afterward, trying to come up with rationalizations for that gut decision. The viscera that can influence moral decisionmaking and the brain’s confusion about the literalness of symbols can have enormous consequences.

Part of the emotional contagion of the genocide of Tutsis in Rwanda arose from the fact that when militant Hutu propagandists called for the eradication of the Tutsi, they iconically referred to them as “cockroaches.” Get someone to the point where his insula activates at the mention of an entire people, and he’s primed to join the bloodletting.

But if the brain confusing reality and literalness with metaphor and symbol can have adverse consequences, the opposite can occur as well. At one juncture just before the birth of a free South Africa, Nelson Mandela entered secret negotiations with an Afrikaans general with death squad blood all over his hands, a man critical to the peace process because he led a large, well-armed Afrikaans resistance group. They met in Mandela’s house, the general anticipating tense negotiations across a conference table. Instead, Mandela led him to the warm, homey living room, sat beside him on a comfy couch, and spoke to him in Afrikaans. And the resistance melted away. ...
 