Introduction
If you were given the choice to divert a trolley down a different track, killing one person instead of the five on its current path, would you do it? What if the trolley were headed toward five people and your only option was to push a stranger off a footbridge onto the track to stop the trolley before it reached them? Would you still make that choice?
Here we encounter a moral dilemma, an ethical paradox where either choice can be interpreted as correct depending on one’s specific beliefs. Morality is the foundational framework through which social norms are enforced and decisions are executed to establish a collective sense of justice and fairness in society [1]. A sense of right and wrong allows societies to reward virtuous actions while condemning unprincipled ones. When confronted with a moral dilemma, our brains utilize overlapping cognitive and emotional neural networks to determine an ethically sound decision. This intricate interplay between neurological processes across varying brain regions demonstrates the complexity and diversity of human behavior. These internal mechanisms manifest in external expressions of thought and reasoning that reinforce our morals and facilitate social decision-making [1].
There are several theories around the emergence of moral reasoning: is it an innate ability or is moral sense formulated through social interaction? Past research has shown that an individual’s moral values develop through both inherent neural pathways and exposure to cultural and social elements in the environment [2]. Moral sense is theorized to be constructed through innatism and socio-constructivism.
The theory of innatism holds that moral thought is a natural characteristic present at birth [2]. Studies suggest that before infants are taught social behaviors, they follow predetermined principles such as fairness, avoidance of harm, group support, and respect for authority. Researchers tested fairness in 9- and 4-month-old infants by unfairly dividing cookies between two animated puppets and examining the infants’ responses [3]. Both infant groups detected a violation and expected the experimenter to divide the cookies equally, suggesting that basic moral principles are embedded into innate cognitive processing in infants [3]. Infants enter the world carrying these foundational instinctive responses that promote prosocial moral behaviors intended to benefit others [4]. The limbic system regulates several of these innate behaviors responsible for mating, maternal care, and defense mechanisms. The amygdala, a key region in the limbic system, receives inputs from sensory processing regions such as the thalamus; the amygdala then passes signals to regions such as the hypothalamus that initiate physiological responses to the sensory input. The communication between these regions facilitates the body’s “fight or flight” response, an example of an autonomic fear response [5]. These behaviors are involved in moral cognition and are evolutionarily beneficial for human survival. For example, human populations that engaged in cooperation and altruistic prosocial behavior had a reduced likelihood of negative interactions and increased reproductive success [2]. Additionally, these altruistic responses incorporate feelings of empathy and selflessness that permit humans to perceive the implications of their moral actions and act in ways that are socially acceptable and that further facilitate productive human interaction.
Therefore, innate moral judgments and inclinations toward prosocial behavior in preverbal infants and children have been reinforced and retained through natural selection [2].
Alternatively, the theory of socio-constructivism asserts that moral sense grows over time through social relations [6]. It proposes that moral sense develops progressively, allowing individuals to distinguish between right and wrong as they are exposed to varying social surroundings over time. While infants carry innate tendencies that promote ethical behavior, their moral sense has not entirely developed. At ages three and four, children begin to construct advanced categorical judgments of right and wrong based on learned behavior and perceived social standards, exhibiting core components of moral cognition [6]. For example, one study shows that while 3- to 4-year-olds distribute resources equally, 5- to 6-year-olds allocate resources equitably [6]. Internal moral judgments are continually shaped and transformed through social interactions and relationships that facilitate the progression of moral thought [2]. A study examining moral reasoning and aggressive subtypes in economically impoverished children illustrates the influence of external factors on moral development [7]. In this study, impoverished children completed a story-interview that tested moral reasoning strategies surrounding accidental harm. Researchers observed that children in systematically disinvested communities showed lower levels of physical proactive aggression when reasoning with conventional strategies (e.g., rule adherence), whereas reasoning about property damage and resource distribution was associated with higher levels of physical proactive aggression. The study suggests that impoverished children place greater value on obedience to authority during moral reasoning, as this may be more rewarding in their community, and that they show increased aggression around resource distribution because they likely carry stronger desires for equity within their community [7]. Such differences in moral reasoning indicate that socio-cultural contexts exert varying influences on moral development.
Both theories of innatism and socio-constructivism are delicately intertwined to ultimately define our moral sense.
Dual Process Theory
Let’s return to the moral dilemmas in the trolley scenarios above. Most people would choose to pull the switch and change the direction of the trolley in order to save five lives over one. Conversely, in the second scenario, most people refuse to push a stranger in front of the trolley, although the outcome is identical: five lives saved at the cost of one [8].
Both moral dilemmas involve similar situations but typically result in different choices, signifying the multifaceted framework through which moral decisions are constructed. The first dilemma is characterized as a moral impersonal dilemma, as the individual is only indirectly involved in the process that may result in harm [1]. Alternatively, the second dilemma is termed a moral personal dilemma, as the individual feels emotionally involved, being responsible for pushing the person on the track. These two styles are both components of the dual-process theory, which states that moral decision-making and reasoning are products of two competing processes: a slow, utilitarian, reason-based cognitive response and a fast, intuitive, automatic emotion-based response [1]. The first trolley dilemma is impersonal and individuals utilize cognitive reasoning to formulate a utilitarian response. Utilitarianism is a theory of morality rooted in utilizing cognitive processes to derive an outcome that will produce the greatest good. In moral impersonal dilemmas, when an emotional threshold is not reached or surpassed, cognitive processing systems dominate the emotional response and promote utilitarian ‘reasonable’ thought [1]. In contrast, the second dilemma is an example of a moral personal dilemma that elicits an aversive emotional response since the individual feels directly accountable for the death of someone else. The intense emotional response overrides the cognitive system, and an emotional response is delivered. The dual-process theory suggests that humans possess the cognitive capacity to engage in multiple forms of thinking involving separate neural mechanisms to produce moral decisions.
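The two-process competition described above can be caricatured as a simple threshold rule. The sketch below is purely illustrative: the function, the salience values, and the 0.7 threshold are hypothetical conveniences invented for this example, not quantities drawn from the cited studies.

```python
def moral_judgment(emotional_salience, lives_saved, lives_lost,
                   emotional_threshold=0.7):
    """Toy dual-process rule for a trolley-style dilemma.

    If the dilemma's emotional salience exceeds the threshold (a
    moral-personal dilemma), the fast emotional response dominates and
    the harmful action is rejected. Otherwise the slower cognitive
    process compares outcomes and endorses whichever option saves more
    lives (the utilitarian response).
    """
    if emotional_salience > emotional_threshold:
        return "reject action (emotion-driven response)"
    if lives_saved > lives_lost:
        return "endorse action (utilitarian response)"
    return "reject action (utilitarian response)"


# Switch dilemma: impersonal, low emotional salience
print(moral_judgment(emotional_salience=0.3, lives_saved=5, lives_lost=1))
# Footbridge dilemma: personal, high emotional salience
print(moral_judgment(emotional_salience=0.9, lives_saved=5, lives_lost=1))
```

Under this toy rule, the impersonal switch dilemma falls below the threshold and yields the utilitarian endorsement, while the personal footbridge dilemma crosses it and triggers the emotional refusal, mirroring the behavioral pattern described above.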
Cognitive utilitarian processing in moral and ethical decision-making
Cognitive processes in moral decision-making primarily focus on formulating reason-based, utilitarian responses. Neurons in the prefrontal cortex and striatum encode the expected cost and reward associated with the presented stimulus or situation and use these predicted outcomes to generate optimal responses; because moral choices offer inherent social benefits, this process likely yields a morally sound response [9]. The dorsolateral prefrontal cortex (DLPFC) plays a particularly important role in rational utilitarian cognitive processing during moral dilemmas. One study used transcranial magnetic stimulation to disrupt the DLPFC during moral-personal decisions [1]. When the DLPFC was disrupted, participants deemed scenarios describing actions that could result in serious bodily harm to be more appropriate, suggesting the DLPFC's key involvement in reinforcing utilitarian decision-making during cognitive processing [1]. Another study investigated the DLPFC's role in prosocial behaviors in moral-personal dilemmas [10]. In contrast to the previous study, this one activated the right DLPFC while inhibiting the left, which still resulted in participants making fewer utilitarian choices. The study suggests that the active right DLPFC participates in integrating emotional elements when constructing utilitarian moral judgments. This result was observed only during high-conflict (moral-personal) dilemmas, when emotional weight was high. Since moral-personal dilemmas favor the emotional response, high-conflict dilemmas require greater cognitive control to suppress the initial emotional response, promoting stronger activity in the prefrontal cortex [10]. The DLPFC competes with other brain structures such as the orbitofrontal cortex (OFC) and ventromedial prefrontal cortex (VMPFC), both also located in the frontal lobe, to moderate emotionally charged reactions [11].
Since the VMPFC is involved in the mediation of emotions during moral processing, patients with VMPFC lesions have demonstrated increased utilitarian responses [11]. The DLPFC, VMPFC, and OFC demonstrate integrative decision-making as these key structures are involved in differing moral processes yet must extensively interact to formulate a moral decision.
Theory of Mind, empathy, and emotion-based processing when making moral decisions
Emotions arise when situational stimuli are identified and induce psychological and physiological changes, eliciting a rapid, automatic response. Emotional processing is multifaceted, incorporating several brain structures responsible for separate mechanisms that contribute to overall moral judgment [12]. For example, the insula is involved in identifying emotional expressions of different body states such as disgust, while the orbitofrontal cortex is involved in the integration of emotional and motivational information. Together, accompanied by other structures, these individual and interconnected networks contribute to the emotional response that affects moral decision-making [12].
The theory of mind (TOM) framework and the various brain structures involved are essential in comprehending the emotional processing systems that guide social interactions and moral decision-making [1]. TOM is the cognitive process of interpreting and explaining the beliefs, knowledge, and intentions of others. TOM processes are incorporated in moral thought, especially in the moral judgment of others’ decisions. For example, TOM allows individuals to cognitively decipher situations by placing themselves in the decision-maker’s position, evaluating the dilemma from another’s perspective and producing a response. This can result in a shared emotional response with the decision-maker that influences the emotional nature of the overall moral judgment of the decision-maker’s response [1].
This TOM network includes structures such as the bilateral temporoparietal junction (TPJ) [13]. During the decision-making process, the TPJ is heavily involved in understanding others’ reasoning and integrating personal subjective perspectives into moral responses. During emotional processing, individuals formulate responses that place greater weight on beliefs and intentions than on reason and consequences [10]. As mentioned earlier, the moral-impersonal dilemma elicits a greater utilitarian response. Studies show that stimulation of the TPJ results in a less utilitarian response and greater emotional influence than generally observed in moral-impersonal dilemmas, revealing a connection between the TPJ and emotional systems [10]. This suggests that the TPJ incorporates personal beliefs and intentions during emotional moral processing [10].
Several other brain structures play a part in the TOM framework. The superior temporal sulcus (STS), located in the temporal lobe, is involved in the perception and representation of social information used to draw inferences about others’ behaviors. The STS initially processes social perception and interprets beliefs. It exhibits higher activity during moral dilemmas, which require deeper understanding and analysis of the “victims’” actions to justify their behavior [11]. The orbitofrontal cortex is also active during processing of emotionally salient statements. Specifically, the orbitofrontal and ventromedial prefrontal cortices are heavily involved in the emotional response, competing with the left DLPFC, which presents the cognitive response [11]. The amygdala works alongside the orbitofrontal cortex, assigning reward value to inputs from visual processing areas that interpret stimuli. Both of these structures inform the anterior cingulate cortex of the importance of the stimuli, including the value of outcomes and rewards. The anterior cingulate cortex then links reward and punishment information, which elicits emotional responses through observable behaviors and actions [14]. Together, these structures construct the emotional processing systems engaged during moral cognition.
Psychopaths and moral decision-making
A psychopath takes a different approach to interpreting the trolley dilemma. A typical individual would deem pushing a stranger in front of a trolley impermissible due to the harm it causes. Most would also use cognitive reasoning to recognize that saving five lives outweighs saving one. Holding both of these interpretations, the individual would face moral conflict. Psychopaths, however, respond differently: faced with the same scenario, they are more likely to decide to push the stranger without hesitation [15].
Limited neuroimaging research has revealed that individuals with high levels of psychopathy engage distinct neural circuits to arrive at moral judgments [15]. While individuals with low psychopathy utilize emotion-based processes centered on the amygdala, psychopathic individuals rely on abstract reasoning and semantic knowledge and engage regions centered on the DLPFC. This suggests that psychopaths heavily favor deliberate cognitive reasoning over emotional processing during decision-making [15].
While psychopathy is not clearly defined, it is marked by significant emotional dysfunction, characterized by a lack of empathy, diminished feelings of guilt, and abnormal moral reasoning [16]. Neuroanatomically, psychopaths show minimal activity in the amygdala and frontal cortical regions during fear-based learning and decision-making, and they exhibit blunted reactions when viewing fearful expressions. Their inability to identify others’ fear suggests that neural deficits in fear processing impair moral judgments and inhibit empathetic reactions to others’ distress [15].
A study has also observed volumetric differences and structural abnormalities in the temporal lobe, VMPFC, and amygdala in the brains of psychopaths, abnormalities associated with antisocial behavior and pathological lying [15]. These regions are essential for the integration of emotional and rational thought into decision-making. The amygdala is essential for stimulus-reinforcement learning, which allows specific representations of stimuli from the temporal cortex to be connected to emotional information. The amygdala is also responsible for categorizing stimuli as emotionally “good” or “bad.” After categorizing stimuli, the amygdala sends signals to the VMPFC to facilitate the representation and reinforcement of specific objects and actions, which other structures then use to anticipate consequences and formulate rational decisions [15]. During moral conflict, psychopaths lack the ability to appropriately integrate emotional and rational thought across these neural structures to execute a morally “correct” reaction. Though research on cognition in psychopathy has been performed, it has not been sufficiently replicated and often does not control for environmental factors, so more thorough trials must be performed to reach conclusive results.
Conclusion
The exploration of moral decision-making through the lens of neuroscience provides profound insights into how different neural processes shape ethical behavior. Cognitive and emotional neural mechanisms within the dual-process theory recruit structures from overlapping brain regions to produce moral thought and shape human behavior. Moral responses arise from thousands of simultaneous chemical reactions occurring across several brain structures every second of the day. These brain structures assign meaning to ideas and values, allowing humans to hold moral beliefs that reinforce the intricate nature of human life. Morality guides societies, dictating social norms and which behaviors are deemed socially acceptable in both justice and education systems. Still, moral reasoning is diverse and defined differently across cultures. While some cultures are individualistic and value personal expression and autonomy, others are collectivistic and prioritize harmony in the community. Morality is therefore relative, as there are no absolute universal moral standards. Moral responses are constantly re-evaluated and modified by experience, changing cultural contexts, and societal influences. While the neural networks that guide moral thought are common to all humans, the specific interpretation of stimuli varies, and different people characterize the same actions as morally correct or wrong. Variations in moral responses demonstrate not only the neural processes behind moral reasoning and ethical decision-making but also the wide neural diversity between human brains that bleeds individuality into humanity.
References
1. Jeurissen, D., Sack, A. T., Roebroeck, A., Russ, B. E., & Pascual-Leone, A. (2014). TMS affects moral judgment, showing the role of DLPFC and TPJ in cognitive and emotional processing. Frontiers in Neuroscience, 8, 18. https://doi.org/10.3389/fnins.2014.00018
2. Limone, P., & Toto, G. A. (2022). Origin and development of moral sense: A systematic review. Frontiers in Psychology, 13, 887537. https://doi.org/10.3389/fpsyg.2022.887537
3. Dawkins, M. B., Sloane, S., & Baillargeon, R. (2019). Do infants in the first year of life expect equal resource allocations? Frontiers in Psychology, 10, 116. https://doi.org/10.3389/fpsyg.2019.00116
4. Hamlin, J. K. (2014). The origins of human morality: Complex socio-moral evaluations by preverbal infants. In J. Decety & Y. Christen (Eds.), New Frontiers in Social Neuroscience (pp. 165–188). Springer. https://doi.org/10.1007/978-3-319-02904-7_10
5. Sokolowski, K., & Corbin, J. G. (2012). Wired for behaviors: From development to function of innate limbic system circuitry. Frontiers in Molecular Neuroscience, 5, 55. https://doi.org/10.3389/fnmol.2012.00055
6. Dahl, A., & Killen, M. (2018). A developmental perspective on the origins of morality in infancy and early childhood. Frontiers in Psychology, 9, 1736. https://doi.org/10.3389/fpsyg.2018.01736
7. Baker, E. R., Huang, R., Battista, C., & Liu, Q. (2023). Head Start children's moral reasoning predicts aggressive forms and functions. Early Childhood Education Journal, 51, 443–455. https://doi.org/10.1007/s10643-022-01313-6
8. Thomson, J. J. (1985). The trolley problem. The Yale Law Journal, 94(6), 1395–1415. https://doi.org/10.2307/796133
9. Hosokawa, T., Kennerley, S. W., Sloan, J., & Wallis, J. D. (2013). Single-neuron mechanisms underlying cost-benefit analysis in frontal cortex. The Journal of Neuroscience, 33(44), 17385–17397. https://doi.org/10.1523/JNEUROSCI.2221-13.2013
10. Zheng, H., Lu, X., & Huang, D. (2018). tDCS over DLPFC leads to less utilitarian response in moral-personal judgment. Frontiers in Neuroscience, 12, 193. https://doi.org/10.3389/fnins.2018.00193
11. Pascual, L., Rodrigues, P., & Gallardo-Pujol, D. (2013). How does morality work in the brain? A functional and structural perspective of moral behavior. Frontiers in Integrative Neuroscience, 7, 65. https://doi.org/10.3389/fnint.2013.00065
12. Helion, C., & Ochsner, K. N. (2018). The role of emotion regulation in moral judgment. Neuroethics, 11(3), 297–308. https://doi.org/10.1007/s12152-016-9261-z
13. Ye, H., Chen, S., Huang, D., Zheng, H., Jia, Y., & Luo, J. (2015). Modulation of neural activity in the temporoparietal junction with transcranial direct current stimulation changes the role of beliefs in moral judgment. Frontiers in Human Neuroscience, 9, 659. https://doi.org/10.3389/fnhum.2015.00659
14. Rolls, E. T. (2019). The cingulate cortex and limbic systems for emotion, action, and memory. Brain Structure & Function, 224(9), 3001–3018. https://doi.org/10.1007/s00429-019-01945-2
15. Cardinale, E. M., & Marsh, A. A. (2015). Impact of psychopathy on moral judgments about causing fear and physical harm. PLoS ONE, 10(5), e0125708. https://doi.org/10.1371/journal.pone.0125708
16. Muñoz-Negro, J. E., Martínez Barbero, J. P., Smith, F., Leonard, B., Martínez, J. P., & Ibáñez-Casas, I. (2018). The controversial relationship between neuroscience and moral responsibility in psychopaths. Egyptian Journal of Forensic Sciences, 8, 40. https://doi.org/10.1186/s41935-018-0071-9