Ultimate envy

By now, only the most fundamentalist of libertarians cling to the belief that humans behave as Homo economicus, the mythical self-interested rational agent on which much of free-market economic thinking is based. Studies of real humans behaving in the real world (or at least in research laboratory settings) have revealed that we all exhibit a variety of cognitive biases, and that these biases affect our decision-making in such a way that we regularly diverge from ‘rational’ behaviour.

If you wish, you could join the Less Wrong crowd who have, in recent years, been attempting to modify their own behaviours so that they conform better to pure rationality. Their reasons for doing so vary, but include something along these lines: perfect rationality is our best approximation of a condition conducive to human happiness. At a minimum, rationality is less wrong than what we have now.

It is not hard to demonstrate irrational behaviour among humans. One of the more compelling ways to do so is to ask people to participate in the Ultimatum game. A darling of neuroeconomists, the Ultimatum game gives you and a partner a sum of money (for best results, real money is used), let’s say $20. Your partner is given the task of deciding how to divide the money between the two of you. Your task is to decide whether you accept the partner’s offer, in which case you both keep the money, or reject it, in which case you both get nothing. If the partner offers you $10, the decision is easy and you both get 50% of the winnings. But when the partner is viewed as acting unfairly, offering only $1 out of the $20 pot (5%), people often reject the offer. [Similar effects are seen if the pot is $200, but if the pot becomes sufficiently large – say, a million dollars – most people say yes to an offer of only 5%.] The perfect rationalist idealized as Homo economicus would never do such a thing – why, after all, would anyone turn down a free dollar? The Less Wrong crowd would take a moment to consider what cognitive biases might cause individuals to turn down a free dollar under such circumstances, and work to nullify them. Real people in the real world turn down unfair offers with regularity.
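For readers who prefer code to prose, here is a minimal sketch of the game’s payoff logic in Python. The function name and the rejection threshold are my own illustrative inventions, not anything from the studies discussed here; the threshold simply stands in for whatever bias drives real responders to walk away from unfair splits.

```python
# A minimal sketch of the Ultimatum game's payoff structure.
# The responder model is illustrative: a purely 'rational' agent
# accepts any positive offer, while a real-ish human rejects
# offers below some fairness threshold.

def play_ultimatum(pot, offer, min_acceptable_share=0.25):
    """Return (proposer_payoff, responder_payoff) for one round.

    offer: amount the proposer gives to the responder.
    min_acceptable_share: fraction of the pot below which the
    responder rejects out of perceived unfairness (assumed value).
    """
    if offer >= min_acceptable_share * pot:
        return pot - offer, offer   # accepted: both keep their shares
    return 0, 0                     # rejected: both get nothing

# Homo economicus would accept anything positive:
print(play_ultimatum(20, 1, min_acceptable_share=0.0))  # (19, 1)

# A typical human responder turns the same 5% offer down:
print(play_ultimatum(20, 1))                            # (0, 0)

# With a large enough pot, even 5% becomes hard to refuse:
print(play_ultimatum(1_000_000, 50_000, min_acceptable_share=0.04))
```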

What sorts of cognitive biases cause people to spurn an offer of free money? In the Ultimatum game, it seems that the unfairness causes people to feel pangs of disgust, and this emotional response is thought to modify rational thinking. The phenomenon is also a form of altruistic punishment, and has long been thought to act as a sort of social glue: members of society punish people who act unfairly, even if they do so at a cost to themselves. Put into this context, it might make a bit of sense to act this way – perhaps rationality plays out not in the self-interested way that libertarian economists would have us believe, but rather in buttressing a social order that, in the long run, serves the interests of everyone.
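Behavioural economists have formalized this intuition: one standard treatment, along the lines of Fehr and Schmidt’s inequity-aversion model, subtracts a penalty from an agent’s utility when payoffs are unequal. The sketch below is a simplified, one-sided version with an assumed envy parameter, offered only to show how an agent with a taste for fairness can coherently prefer $0 to an unfair $1.

```python
# A simplified, one-sided inequity-aversion utility, loosely in the
# spirit of Fehr & Schmidt (1999). The envy parameter alpha is an
# assumed value chosen purely for illustration.

def utility(my_payoff, their_payoff, alpha=0.8):
    """Money received, minus a penalty for disadvantageous inequality."""
    envy = max(their_payoff - my_payoff, 0)
    return my_payoff - alpha * envy

# Accepting a $1 offer from a $20 pot: I get $1, they keep $19.
print(utility(1, 19))   # 1 - 0.8 * 18 = -13.4

# Rejecting: we both get nothing.
print(utility(0, 0))    # 0.0
```

On these assumed numbers, rejection maximizes utility: the dollar is worth less than the sting of watching the proposer walk off with $19. On the reading developed later in this post, meditation shrinks the envy term toward zero, at which point the function collapses back to plain self-interest and the lousy dollar is worth taking.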

Or so the theory goes.

Ulrich Kirk, Jonathan Downar & Read Montague have just published the results of an experiment in which they compared two groups of people playing the Ultimatum game – a group of control individuals and a group of experienced Buddhist meditators (not 25,000 hours of experience, but enough to characterize them as experts, with somewhere between six months and twenty years of meditation practice). At relatively even splits of the pot, both groups behaved similarly, but when unfairness reared its head – when participants were offered only $1 or $2 of a $20 pot – the results diverged quite dramatically. The experienced meditators were significantly more likely to accept these ‘unfair’ offers.

One conclusion that Kirk et al. draw from their data is that experienced meditators behave more like rational agents than control subjects do when playing the Ultimatum game. Well, yes, they behave more like rational agents, but are they doing so because they are unaffected by the cognitive biases that plague the rest of us? Kirk et al. provide some support for this position. When subjects’ brains were examined using fMRI as ‘unfair’ offers were made, the areas normally associated with feelings of disgust were activated in control subjects but not in the meditators. By suppressing the emotional processing that might subvert rationality (emotional regulation being one of the outcomes of long-term meditation practice), the meditators were in a stronger position to arrive at the rational decision: accepting the lousy dollar on offer. But the fMRI data also show that the meditators were not engaging those parts of the brain typically involved in rational thought. Despite the absence of disgust responses, the meditators were not simply thinking ‘rationally’ and thereby accepting the money.

I would submit that there may be another interpretation that explains at least part of these meditators’ actions. Buddhist philosophy would neither endorse nor condemn the (rational) decision to take the money, but would question the (irrational) decision to engage in (altruistic) punishment. Put another way, it is only the ego that responds with outrage that someone else may get more of a free gift than we might; meditation helps one put those ego-driven emotions aside. Once that happens, the dollar doesn’t seem like such a bad deal after all. Indeed, there is every possibility that these experienced meditators were simply grateful for what was being offered, and not envious that others had more.

Were we to eliminate that cognitive bias, we might all be less wrong.
