The Cognitive Psychology of Moral Discrepancy  

 

During the Second World War, under the German occupation of France, the French existentialist philosopher Jean-Paul Sartre co-founded the underground resistance group Socialisme et Liberté. He also contributed to several illegal newspapers and magazines, writing articles in opposition to the occupiers. However, he also famously accepted a lecturing position that had been vacated by a Jewish teacher after the Nazis banned Jews from teaching in the country.

Jean-Paul Sartre

Further, after Sartre approached several people about joining Socialisme et Liberté and was met with indecision and uncertainty, the group soon dissolved, and he took no further part in active resistance. Sartre's philosophy espoused the value of freedom and the moral duty of human beings, and given this, there has been much debate about whether his actions, or lack of action, during this period were consistent with his professed beliefs.

The study of this relationship between 'espoused' moral values and actual behaviour has a long history and continues to this day. Espoused Theory (developed principally by Chris Argyris; see e.g. Argyris and Schön 1974) states that:

When someone is asked how he would behave under certain circumstances, the answer he usually gives is his espoused theory of action for that situation. This is the theory of action to which he gives allegiance, and which, upon request, he communicates to others. However, the theory that actually governs his actions is this theory-in-use [actual behaviour]. (Argyris and Schön 1974: 6-7)

Jonathan Haidt (2001) goes further: espoused moral beliefs and actual behaviours are governed by completely different mental systems. In Haidt's "Social Intuitionist Model" (set out in his paper 'The Emotional Dog and Its Rational Tail'), the vast majority of real, in-the-moment moral judgments and behaviours arise from one's intuitive reaction to the situation rather than from step-by-step reasoning. Reasoning, Haidt argues, is generally used only to construct after-the-fact justifications for moral decisions that have already been made intuitively, or to explain one's moral beliefs to others in a theoretical context.

So what is the reason for this discrepancy? Both Espoused Theory and the Social Intuitionist Model offer little explanation beyond the proposal that the two phenomena are governed by different 'theories' or 'mental systems'. Why do these systems behave differently? One possibility comes from research on cognitive biases. First, consider the two versions of the 'disease' problem below:

 

Disease Problem: V1

Imagine you are in charge of the health department of a country experiencing a national disease outbreak. You have quarantined all the affected cases, 600 people in total. Your advisor presents you with the only two treatments available. You are told that treatment A will definitely save 200 lives, while treatment B has a 33% chance of saving all 600 but a 66% chance of saving no one.

Which treatment option do you choose?

 


Disease Problem: V2

The situation is the same: 600 people quarantined. This time, however, you are told that treatment A will definitely kill 400 people, while treatment B has a 33% chance that no one will die but a 66% chance that all 600 people will die.

Now which treatment do you choose?

 

While the decision to be made is precisely equivalent in the two versions, it has been consistently shown that a majority of people opt for treatment A in the 'lives saved' framing (V1), while a majority opt for treatment B in the 'deaths' framing (V2). The same effect has been found in many other experiments with related problems, and the general consensus is that when faced with 'gains' people tend to choose the safe, certain option, while when faced with 'losses' people tend to choose the risky option – even when the two decisions are precisely equal.
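To see why the two versions really are the same decision, it helps to write the expected outcomes out explicitly. The minimal sketch below (plain Python; it assumes the 33% / 66% figures stand for the exact one-third / two-thirds probabilities used in Tversky and Kahneman's original version of the problem) shows that each framing offers a certain outcome of 200 survivors and 400 deaths against a gamble with exactly the same expected outcome.

```python
# Expected outcomes for the two framings of the disease problem.
# Assumes the 33% / 66% figures stand for exact one-third / two-thirds
# probabilities, as in Tversky and Kahneman's original version.

TOTAL = 600

# V1 ("lives saved" framing)
saved_A = 200                           # certain: 200 of 600 saved
saved_B = (1 / 3) * 600 + (2 / 3) * 0   # gamble: 1/3 chance all 600 saved

# V2 ("deaths" framing)
dead_A = 400                            # certain: 400 of 600 die
dead_B = (2 / 3) * 600 + (1 / 3) * 0    # gamble: 2/3 chance all 600 die

print(saved_A, TOTAL - dead_A)   # 200 200     -> treatment A is identical in both versions
print(saved_B, TOTAL - dead_B)   # 200.0 200.0 -> treatment B has the same expected outcome
```

Only the description changes between the two versions; the underlying options do not.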

This asymmetry, closely tied to what Kahneman and Tversky called 'loss aversion', fed into their 1979 'Prospect Theory', a cornerstone of modern behavioural economics. In their 1974 paper ('Judgment under Uncertainty: Heuristics and Biases') they proposed that people are also susceptible to a wide variety of other cognitive biases (including 'anchoring', the 'base rate fallacy', the 'conjunction fallacy' and many more). Further, in his best-selling 2011 book 'Thinking, Fast and Slow', Kahneman lays out his view that these biases are built into the design of the mental 'System 1' (Haidt's intuitive system) and can be overcome by greater use of, and education in, the mental 'System 2' (Haidt's reasoning system). In Kahneman's model, both systems have their virtues and vices: System 1 makes decisions quickly and can handle a large amount of complexity, but it makes mistakes. System 2 is slower but more methodical and so makes fewer mistakes. In the moment, System 2 will often be too slow to determine how to behave, so we rely predominantly on System 1.

System 1 vs System 2 [Illustration by David Plunkert, via The New York Times]
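For readers who like to see the machinery, here is a rough sketch of how Prospect Theory's value function can produce the reversal seen in the disease problem. It uses the commonly cited parameter estimates from Kahneman and Tversky's later work (alpha = beta = 0.88, lambda = 2.25) and, for simplicity, omits their probability-weighting function; it is an illustration of the idea, not a reproduction of any particular study's model.

```python
# A minimal sketch of Prospect Theory's value function. Outcomes are coded
# relative to the reference point implied by each framing: "lives saved"
# treats 0 saved as the reference point (gains); "deaths" treats 0 deaths
# as the reference point (losses). Parameters are the commonly cited
# estimates alpha = beta = 0.88, lambda = 2.25; probability weighting is
# omitted for simplicity.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of an outcome x (a gain if positive, a loss if negative)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

# V1 ("lives saved" framing): a sure gain of 200 vs a 1/3 chance of gaining 600
sure_gain = value(200)
risky_gain = (1 / 3) * value(600)

# V2 ("deaths" framing): a sure loss of 400 vs a 2/3 chance of losing 600
sure_loss = value(-400)
risky_loss = (2 / 3) * value(-600)

print(f"V1 (gains):  sure {sure_gain:.1f} vs risky {risky_gain:.1f} "
      f"-> model prefers the {'sure' if sure_gain > risky_gain else 'risky'} option")
print(f"V2 (losses): sure {sure_loss:.1f} vs risky {risky_loss:.1f} "
      f"-> model prefers the {'sure' if sure_loss > risky_loss else 'risky'} option")
```

With these parameters the sure option wins under the gains framing and the gamble wins under the losses framing, purely because the value function is concave for gains and convex for losses (diminishing sensitivity on both sides of the reference point).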

So, perhaps we have the best intentions but are simply incapable of carrying them out in the moment due to the cognitive limitations of System 1?

 

An Experimental Test

In a recent paper, Schwitzgebel and Cushman (2015) set out to test whether theoretical expertise in moral philosophy affects this kind of in-the-moment decision making. To examine this, the authors compared philosophers (people with philosophy degrees) with similarly educated non-philosophers on the two disease problems. They also collected data on each philosopher's level of expertise and on whether ethics was their area of specialisation.


 

Results

The study first replicated previous results, with a large majority of participants choosing the risky option when the problem was framed in terms of 'deaths', and far fewer choosing it when it was framed in terms of 'lives saved'. Furthermore, the size of this framing effect was similar for non-philosophers (83% vs 43%) and for philosophers (79% vs 32%), and no difference was seen even for philosophers specialising in ethics.
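As a quick back-of-the-envelope check on that 'similar effect size' claim, the snippet below takes the percentages quoted above and computes, for each group, the gap in risky choices between the two framings. (This is just the difference in proportions quoted above, not the paper's own statistical analysis.)

```python
# Share of participants choosing the risky option under each framing,
# using the percentages quoted above from Schwitzgebel and Cushman (2015).
risky_choice_rates = {
    "non-philosophers": {"deaths": 0.83, "lives saved": 0.43},
    "philosophers":     {"deaths": 0.79, "lives saved": 0.32},
}

for group, rates in risky_choice_rates.items():
    framing_effect = rates["deaths"] - rates["lives saved"]
    print(f"{group}: framing effect = {framing_effect:.0%}")
# -> roughly 40 vs 47 percentage points: a sizeable framing effect in both groups
```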

Another Approach

This all fits with Espoused Theory, the Social Intuitionist Model and the cognitive biases approach. Philosophers are trained to deal with ethical problems slowly and precisely (using 'System 2', in Kahneman's language), but when faced with problems like the disease scenarios, their System 1 is just as vulnerable to the framing effect as anyone else's.

But can this approach explain all moral discrepancy? Does it even explain the story we began with? Can Sartre's actions during the war really be put down to cognitive biases and framing effects? He certainly had time to consider whether to disband his resistance group, and whether to take the lecturing post. Can we really class these as in-the-moment, intuitive decisions? Professor Schwitzgebel (of Schwitzgebel and Cushman) has another theory. He has devoted a large part of his career to empirical studies of the moral behaviour of professors of ethics in particular, asking whether they are any kinder, fairer or more moral than other people.

Over the years, Professor Schwitzgebel and colleagues have looked at a vast range of behaviours, including donating to charity, responding to student emails, organ and blood donation, how often people call their mothers, eating meat, theft of library books, and so on. The overall finding? No difference. Professors of philosophy studying ethics were no better or worse on this range of behaviours than other people.

Further, especially in regard to eating meat and giving to charity, the ethics professors differed significantly from other groups in their espoused beliefs about how morally bad eating meat is (they thought it was worse) and how much of one's salary should be given to charity (they thought it should be more). But when it came to actual behaviour? No difference.

So why the discrepancy here? The cognitive biases approach doesn't seem any more relevant here than in Sartre's case – there are no obvious framing effects, and people have all the time they need to make these decisions. From all his studies and interviews, Professor Schwitzgebel believes one fact clearly shines through: morally, he says, people just want to be about as good as the other people around them. Studying ethics will change your idea of what an 'ideal person' is – but it won't change your desire to be that ideal person. You will still just aim to be about average, and no amount of theoretical expertise will change this. So it seems that even when we have time to employ our System 2 and really think about our behaviour, 'good enough' is good enough, and we shouldn't expect those with extensive training in ethical philosophy, or even those who profess these beliefs, like Sartre, to stand by them in practice. Schwitzgebel calls this 'Cheeseburger Ethics', and you can find out why by reading his excellent post here: http://aeon.co/magazine/philosophy/how-often-do-ethics-professors-call-their-mothers/

Godspeed!

References

Haidt, J. (2001). The Emotional Dog and Its Rational Tail: A Social Intuitionist Approach to Moral Judgment. Psychological Review, 108(4), 814–834. doi:10.1037/0033-295X.108.4.814

Kahneman, D. (2011). Thinking, fast and slow. Macmillan.

Kahneman, D., & Tversky, A. (1979). Prospect theory: an analysis of decision under risk. Econometrica, 47(2), 263–292. Retrieved from http://www.jstor.org/stable/1914185

Argyris, C., & Schön, D. A. (1974). Theory in Practice: Increasing Professional Effectiveness. San Francisco: Jossey-Bass.

Schwitzgebel, E., & Cushman, F. (2015). Philosophers’ biased judgments persist despite training, expertise and reflection. Cognition, 141, 127–137. doi:10.1016/j.cognition.2015.04.015

Tversky, A, & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science (New York, N.Y.), 185(4157), 1124–1131. doi:10.1126/science.185.4157.1124
