Moral decision making is rife with internal conflict, say developmental psychologists

Findings challenge popular notion that we rely on our "guts" and don't think through challenging questions of right and wrong

Understanding how people make decisions in extreme hypothetical situations illuminates how people make decisions and act in everyday life, says developmental psychologist Audun Dahl. "Almost every human conflict involves judgments about right and wrong," he says. (Photos by Carolyn Lagattuta)

A new in-depth study of moral reasoning challenges the popular notion that people are unable to think through difficult moral problems and rely primarily on automatic "gut" reactions to make tough decisions.

The new findings, which shed light on how we make moral and political decisions on polarizing issues such as abortion, immigration, and waterboarding, reveal that adolescents and adults can—and do—reason deeply about highly complex moral issues.

"When confronted with very, very hard questions about the value of life, decisions are grounded in multiple and sometimes competing considerations about harm, welfare, individual rights, fairness, and justice," said lead author Audun Dahl, associate professor of psychology at the University of California, Santa Cruz. "And contrary to popular belief, people are quite able to articulate all of this when asked to justify how they arrived at their decision."

The "trolley dilemma"

The study explores moral reasoning in the context of the famous "trolley dilemma." In 1967, moral philosopher Philippa Foot introduced the now-classic hypothetical dilemma: A train is hurtling down a track about to hit and kill five people, but a bystander can throw a switch and divert the train to another track, saving five lives. If they do, however, they will kill one person who is tied to the other track. What is the right thing to do? Most people say they would divert the train.

In a second scenario, five people are tied to the track. A bystander on a footbridge above can push one man onto the track below, where his body would stop the train, taking one life to save five others. In this scenario, most people say it would be wrong to push the man, although the number of saved lives would be the same as in the earlier scenario with the switch.

Why do people respond so differently to the two scenarios? In 2001, an influential neuroscience paper proposed that the responses stem from two distinct psychological processes: moral reasoning in the switch scenario, based on the number of lives saved, and an automatic "gut reaction" in the footbridge scenario. The "dual-process" theory of morality holds that people rely on emotional, automatic reactions to make moral judgments most of the time, and they use moral reasoning only occasionally. The theory has changed the fields of moral psychology and philosophy, and it even spurred a subfield known as "trolleyology." It has also gained widespread popular acceptance, with some pundits attributing the current cultural and political divide to innate, automatic intuitions instead of reasoning.

A developmental approach

Dahl, however, was unconvinced. A developmental psychologist, he sees reasoning about moral issues emerge in children as young as 3 years of age. "Even preschool-age children can reason, and despite their linguistic limitations, they can provide explanations for their moral judgments," he said. "So we wanted to take our interview methods and apply them to the trolley scenarios and see if adolescents and adults were really unable to reason about these scenarios."

Understanding how people make decisions in extreme hypothetical situations illuminates how they decide and act in everyday life, said Dahl. "Almost every human conflict involves judgments about right and wrong," he noted.

Dahl and his coauthors posed the trolley scenarios to 432 adolescents, college students, and other adults and conducted in-depth interviews in search of insights into their moral reasoning.

"Our findings rebut the notion that adults can't reason about moral issues," said Dahl. "Both adolescents and adults considered a number of factors in addition to the number of lives that would be saved: the fundamental value of life, the intrinsic rights of individuals, their involvement and their responsibilities in the scenarios, as well as guilt and social repercussions," said Dahl. "People were able to provide reasons for their decisions in both scenarios, and their justifications were consistent with their judgments across situations."

If moral reasoning were just about counting lives, reasoning about the trolley dilemmas would be easy, noted Dahl.

"But our participants found both scenarios to be extremely hard," he said. "They brought the same competing considerations to both." In both situations, people recognized the value of life, they want to maximize the welfare of all, and they bring notions of innocence into the scenarios. "People expressed a lot of conflict even about the switch scenario," added Dahl, "Either way, they said they would feel horrible. It shows how fundamentally central conflict is to moral reasoning about hard cases."

It's complicated

There is scant research on how people reason about the value of life, and even less on how they reason when such values conflict.

"When people reason about moral issues, they often bring a host of different principles that are in tension, and they struggle with it. People experience conflict," said Dahl. "Everybody agrees that the value of life is important, but when they are asked to judge, it gets complicated. On the one hand, people think it’s better to save more people. On the other hand, they view the value of each individual life as unmeasurable. Their own internal conflict can help them accept that other people arrive at different judgments."

This is where Dahl sees hope that a deeper understanding of the process of moral reasoning will foster better discussions about divisive moral issues such as capital punishment, abortion, torture, drones, and border security.

"We need to recognize that moral decision making, especially around contentious issues, involves being conflicted," said Dahl. "People with divergent views are often weighing the same principles, but they prioritize them differently." One of the great things about these anonymous research interviews, as opposed to public debates, is that people acknowledge how conflicted they sometimes are."

Whereas philosophical theories of morality often aim for a single “right answer” to moral dilemmas, Dahl's findings indicate that inconsistency and tension are integral parts of human moral reasoning. "Being a morally mature, developed person doesn't mean you have a clean set of coherent beliefs," he said. "Some inner tensions are part of our moral system."

Dahl hopes his team's findings will reach beyond developmental psychology and lay the foundation for additional investigation.

Dahl's paper, "Moral Reasoning about Human Welfare in Adolescents and Adults: Judging Conflicts Involving Sacrificing and Saving Lives," appears in the current issue of Monographs of the Society for Research in Child Development. His coauthors are Matthew Gingo, assistant professor of psychology at Wheaton College; Kevin Uttich, senior researcher at Conifer Research; and Elliot Turiel, Jerome A. Hutto Professor of Education at UC Berkeley.