Presentation on theme: "An Investigation into Moral Intuition and Reasoning."— Presentation transcript:

An Investigation into Moral Intuition and Reasoning
Stuart Pugh, Glen Carrigan, Dr Andrew Churchill, Dr Andy Morley and Dr Lea Pilgrim
Correspondence: Stuart Pugh, School of Psychology, University of Central Lancashire, Preston, England, United Kingdom, PR1 2HE. E-mail: SMPugh@uclan.ac.uk

Introduction
Moral dilemmas are a familiar component of modern life, yet most people are fortunate enough to encounter problems devoid of any real jeopardy. So how would a person react when lives are at risk? Philosophers and psychologists have long used the Trolley Problem (Foot, 1967) to measure responses in a life-threatening situation. Studies consistently show that both utilitarian and deontological decisions are mediated by many factors, including whether problems are posed as instrumental (the action directly results in death) or incidental (death is a foreseen yet unintended consequence of the action) (the Doctrine of Double Effect; Aquinas, 1952); the use of emotive language (Borg, 2006, analysing Greene, 2001); and the number of victims compared with the number of lives saved (Nakamura, 2012). However, aside from Navarrete (2012), which used virtual reality, most studies pose purely hypothetical problems. Would the hypothetical findings of the literature hold in a more realistic situation?

Example Trolley Problem
A trolley is running down the track out of control. The line ahead has five workmen on it, who will be killed if the trolley continues. However, you could divert the trolley onto a siding away from these five, but you know there is a single workman on this line who would be killed instead. Will you throw the switch?

Study
The current study used a model version of the Trolley Problem to measure whether the decision time available (slow train, 10 secs vs. fast train, 5 secs), whether the numbers involved were made explicit or not (implicit vs. explicit), or the ratio of lives lost to lives saved produced any significant differences. Forty-eight participants were tested across all three ratios after being assigned to either a fast or slow, and an implicit or explicit, condition. Following a verbal brief of the Trolley Problem, the train was released at a predetermined speed, approaching human figures on the track. Participants were asked to pull a lever in order to choose whether to keep the train on the main track (deontological decision) or switch it to the alternative track (utilitarian decision).

Results
A 2 x 3 x 2 ANOVA found a significant main effect of ratio (F(2, 88) = 17.04, p < 0.001), but no significant main effect of decision time or victim information, and no significant interactions. Post hoc t-tests with a Bonferroni adjustment were carried out to identify where the differences lay within the ratios, and found significant differences between 5vs1 and 2vs1 (t(47) = 2.59, p = 0.013), between 5vs1 and 5vs4 (t(47) = 5.08, p < 0.001), and between 2vs1 and 5vs4 (t(47) = 3.74, p = 0.001); significance levels were adjusted for the multiple comparisons.

[Graph: willingness to sacrifice across the different ratios: 96%, 83%, 60%]
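For illustration only (this is not the authors' analysis code), the sketch below shows how Bonferroni-adjusted pairwise paired t-tests of this kind could be computed in Python with NumPy and SciPy, assuming each of the 48 participants contributes one utilitarian-decision score per ratio condition; the scores here are random placeholders rather than study data.

```python
# Illustrative sketch only: Bonferroni-adjusted pairwise paired t-tests across
# the three ratio conditions. The scores are random placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_participants = 48  # each participant was tested in all three ratio conditions

# Hypothetical per-participant proportions of utilitarian choices in each ratio.
scores = {
    "5vs1": rng.uniform(0.0, 1.0, n_participants),
    "2vs1": rng.uniform(0.0, 1.0, n_participants),
    "5vs4": rng.uniform(0.0, 1.0, n_participants),
}

pairs = [("5vs1", "2vs1"), ("5vs1", "5vs4"), ("2vs1", "5vs4")]
alpha_adjusted = 0.05 / len(pairs)  # Bonferroni: divide alpha by the number of comparisons

for a, b in pairs:
    t_stat, p_value = stats.ttest_rel(scores[a], scores[b])  # paired t-test, df = 47
    verdict = "significant" if p_value < alpha_adjusted else "not significant"
    print(f"{a} vs {b}: t(47) = {t_stat:.2f}, p = {p_value:.3f} "
          f"({verdict} at adjusted alpha = {alpha_adjusted:.4f})")
```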
Conclusion
The condition offering the most benefit (5vs1) elicited significantly more utilitarian decisions than the next most beneficial (2vs1, when considered as a ratio), which in turn elicited significantly more utilitarian decisions than the least beneficial (5vs4, when considered as a ratio). The results appear to support previous findings that utilitarian decisions are the product of a mental calculation that weighs lives to be saved against lives to be lost, as well as the size of the impact (Tage et al., 2009; Churchill, in preparation).

Proposed Research
Building on the work of Greene (2004), who investigated moral dilemmas using fMRI, and considering the results of this study, we intend to present a mixture of Trolley Problems (with different ratios) while recording an electroencephalogram (EEG), specifically event-related potentials (ERPs). Greene argues that either the emotional area or the problem-solving area of the brain is activated depending on the type of moral problem presented. If this is the case, then it should be possible to isolate the respective temporal components that correlate with whether the decision was governed by emotion or by problem solving, providing further evidence for Greene's hypothesis.

Benefits of Future Research
Currently, the neurophysiological evidence for moral reasoning is limited to fMRI findings. Because EEG allows us to measure the timing of neural activation, we hope it will give a better understanding of the processing taking place when a decision is made, i.e. whether it is a methodical process or an instantaneous reaction to the stimuli.
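The study does not name an EEG analysis toolbox, so as a purely illustrative sketch the example below uses the open-source MNE-Python library to show how ERPs time-locked to the decision could be averaged separately for utilitarian and deontological choices and then compared; the file name and event codes are hypothetical placeholders.

```python
# Illustrative sketch only: epoching EEG around the decision event and comparing
# average ERPs for utilitarian vs. deontological choices with MNE-Python.
# The file name and event codes below are hypothetical placeholders.
import mne

raw = mne.io.read_raw_fif("participant01_raw.fif", preload=True)  # hypothetical recording
events = mne.find_events(raw)  # assumes decision markers were sent to a stimulus channel

# Hypothetical event codes: 1 = utilitarian (switched track), 2 = deontological (did not switch).
event_id = {"utilitarian": 1, "deontological": 2}

# Epoch from 200 ms before to 800 ms after each decision marker, baseline-corrected.
epochs = mne.Epochs(raw, events, event_id=event_id, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)

# Average within each decision type to obtain the ERPs, then compare their time courses.
evokeds = {name: epochs[name].average() for name in event_id}
mne.viz.plot_compare_evokeds(evokeds, picks="eeg", combine="mean")
```

Differences between the two averaged waveforms in particular time windows would then be candidates for the temporal components described above.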


References
Churchill, A. (in preparation). Examining the effects of closing one's eyes and using mental imagery on decision making in moral dilemmas.
Greene, J.D. (2001). An fMRI investigation of emotional engagement in moral judgment. Science, 293.
Greene, J.D. (2004). The neural bases of cognitive conflict and control in moral judgement. Neuron, 389-400.
Nakamura, K. (2012). The footbridge dilemma reflects more utilitarian thinking than the trolley dilemma: Effect of number of victims in moral dilemmas. Thinking and Reasoning.
Navarrete, D. (2012). Virtual morality: Emotion and action in a simulated three-dimensional "trolley problem". Emotion, 364-370.
Tage, S.R. (2009). Moral principles or consumer preferences? Alternative framings of the trolley problem. Cognitive Science, 34, 311-321.

Acknowledgements
A big thank you to all the participants for taking part. Also, thank you to Dr Lea Pilgrim for providing training on the EEG equipment.