Decision Making

Battle of Sexes

The battle of the sexes is a classic example of a game with multiple Nash equilibria. So what is it?

A and B are a couple who want to spend the evening together. A likes to watch football, but B prefers dancing. They leave for work agreeing to call each other in the evening and decide where to meet. Because of network issues, they can't communicate in the evening. So what is the strategy for the couple? Note that it has become a simultaneous-move game. Let's check the payoffs from each player's point of view.

             B: Football     B: Dance
A: Football  A: 10, B: 5     A: 0, B: 0
A: Dance     A: 0, B: 0      A: 5, B: 10

A reasons that B will most likely go to her favourite dance. Therefore, A decides to go to the dance as well. Since A has no liking for dancing, he can't get the highest payoff of 10, but he is still happy (payoff 5) to be with B. B, on the other hand, enjoys dancing with her partner and gets her maximum payoff of 10.

From B's angle, A could head for the football he loves. If both end up at the grounds, the payoffs mirror the previous case, with the roles reversed.

The game thus has two Nash equilibria: a football equilibrium and a dancing equilibrium. So what could really happen? One possibility is a complete lack of coordination, each going to the other's favourite, and The Gift of the Magi, that perfect tale of love and sacrifice, is re-enacted: zero payoff but maximum drama!

Reaching equilibrium

An equilibrium can be reached under any of three circumstances:
1) Equal-partner case: they end up at the same place by luck
2) A-dominating case: B has no doubt that A will be at the football ground
3) B-dominating case: A knows B is determined to be on her favourite dance floor
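The two equilibria can also be found mechanically. Here is a minimal sketch in Python, using the payoff numbers from the matrix earlier in the post; the move names are mine.

```python
from itertools import product

# Payoffs from the matrix above: (A's payoff, B's payoff),
# indexed by (A's move, B's move).
payoff = {
    ("football", "football"): (10, 5),
    ("football", "dance"):    (0, 0),
    ("dance",    "football"): (0, 0),
    ("dance",    "dance"):    (5, 10),
}
moves = ["football", "dance"]

def is_nash(a, b):
    """(a, b) is a Nash equilibrium if neither player gains
    by unilaterally switching their own move."""
    pa, pb = payoff[(a, b)]
    best_a = all(payoff[(alt, b)][0] <= pa for alt in moves)
    best_b = all(payoff[(a, alt)][1] <= pb for alt in moves)
    return best_a and best_b

equilibria = [(a, b) for a, b in product(moves, moves) if is_nash(a, b)]
print(equilibria)  # [('football', 'football'), ('dance', 'dance')]
```

Both matching pairs survive the check, while the two mismatched outings do not: exactly the two equilibria discussed above.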


When the Dominant Strategy Leads to Doom

We have seen in the previous post that, from a country's standpoint, it is more advantageous to do nothing for the climate and let others do the work for you. We've also seen that the others, playing their own little games, will follow suit, each finding its equilibrium to the detriment of the whole planet. In other words, individual rationality leads to collective irrationality.

The Paris Climate Accord

The role of the Paris Accord is to push countries towards the non-dominant strategy {contribute, contribute}. It can do so in the following ways.

First, the Paris Agreement is legally binding, so all the signatories have to contribute. But here is the catch: the amount of the contribution is determined by each country itself, through what are known as Nationally Determined Contributions (NDCs).

The second option is to provide incentives. The Paris Agreement provides a framework for financial, technical, and capacity-building support for those who need it. Financial support enables capacity building, which, in turn, reduces costs through what is known as the learning rate.
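As a rough sketch of the learning-rate idea: a technology's unit cost falls by a fixed fraction every time cumulative deployed capacity doubles. The 20% rate and the starting cost below are illustrative assumptions on my part, not figures from the Agreement.

```python
import math

def unit_cost(capacity, c0=100.0, capacity0=1.0, learning_rate=0.20):
    """Unit cost after cumulative capacity grows from capacity0 to capacity:
    cost drops by `learning_rate` with each doubling (an experience curve)."""
    doublings = math.log2(capacity / capacity0)
    return c0 * (1 - learning_rate) ** doublings

print(unit_cost(1))             # 100.0 (starting cost)
print(unit_cost(2))             # 80.0  (one doubling: 20% cheaper)
print(round(unit_cost(8), 1))   # 51.2  (three doublings: 100 x 0.8^3)
```

The point for the Accord is that early financial support buys the capacity that drives costs down for everyone who follows.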

The third option is to impose a cost on non-compliance. As part of the Enhanced Transparency Framework (ETF), countries are required to report back in 2024 on the actions they have taken. The process can impose a moral cost on the participating countries, providing the push needed to follow up promises with actions. Moreover, the measurement, reporting, and verification are legally binding.

Paris Agreement: UNFCCC


The Game Called Climate

International cooperation on climate change is an example of game theory with a Nash equilibrium and a dominant strategy. Let's look at the problem from one country's viewpoint and construct the payoff matrix. We call it country MY.

The premise is that the climate crisis is real, but the solution is costly because it requires developing new technologies. If neither country MY nor the rest of the world invests, the game ends in total calamity: {(MY: -10), (RE: -10)}. If country MY remains a passive free-rider while the rest of the world does the job, MY gets the maximum benefit (MY: 12). If, instead, country MY makes all the effort alone, it suffers great losses (MY: -15) while the rest free-rides. Finally, if everybody cooperates according to their respective capacities, all of them benefit: {(MY: 10), (RE: 10)}. The payoff matrix is

                             Country MY doesn't contribute   Country MY contributes
The rest doesn't contribute  MY: -10, RE: -10                MY: -15, RE: 12
The rest contributes         MY: 12, RE: 8                   MY: 10, RE: 10

Country MY calculates that it is better off not acting, irrespective of what the rest does. It also anticipates that, in such a scenario, the rest will still act, because they would lose more by not acting. From an individual country's standpoint, this logic of not participating makes it economically advantageous.

But there is an error in this thinking: any (or all) of the countries in the rest can follow country MY's suit. That leads to total failure, and no one benefits: {-10, -10}.
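The dominance argument can be verified mechanically. A minimal sketch, with the payoffs taken from the matrix above and "ignore" standing in for "doesn't contribute":

```python
# Payoffs from the matrix above, indexed by (MY's move, the rest's move).
payoff = {
    ("ignore",     "ignore"):     (-10, -10),
    ("ignore",     "contribute"): ( 12,   8),
    ("contribute", "ignore"):     (-15,  12),
    ("contribute", "contribute"): ( 10,  10),
}

# "Ignore" is a dominant strategy for MY: it pays strictly more
# than contributing, whatever the rest of the world does.
for rest in ("ignore", "contribute"):
    assert payoff[("ignore", rest)][0] > payoff[("contribute", rest)][0]

# Yet if everyone follows the same logic, all land on the worst-for-all cell.
print(payoff[("ignore", "ignore")])          # (-10, -10)
print(payoff[("contribute", "contribute")])  # (10, 10)
```

The assertions pass, confirming the individual logic; the last two lines show the collective cost of everyone applying it.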


The Tragedy of Pareto Inefficient Nash Equilibrium

We have seen the prisoner's dilemma in an older post. The rational decision-maker, the prisoner, will confess because it gives the best outcome, with no regret, irrespective of what the other does. Therefore, it is a Nash equilibrium, named after the American mathematician John Nash: each prisoner's best response to the choice of the other.

But we know that {confess, confess} is not the best result for either of the players. In other words, the outcome (the Nash equilibrium) is not Pareto efficient! An outcome is Pareto efficient when no alternative exists that makes someone better off without making anyone worse off. For this game, Pareto efficiency would have been reached had the prisoners cooperated and stayed silent. But then, confess is the dominant strategy.
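A quick way to see the mismatch is to test both properties on a payoff matrix. The jail-term numbers below (negated, so higher is better) are typical textbook values, an assumption of mine rather than figures from the earlier post.

```python
# Prisoner's dilemma with illustrative payoffs (negative years in jail).
payoff = {
    ("confess", "confess"): (-5, -5),
    ("confess", "silent"):  ( 0, -10),
    ("silent",  "confess"): (-10, 0),
    ("silent",  "silent"):  (-1, -1),
}
moves = ["confess", "silent"]

def is_nash(a, b):
    """Neither player can gain by unilaterally changing their move."""
    pa, pb = payoff[(a, b)]
    return (all(payoff[(m, b)][0] <= pa for m in moves)
            and all(payoff[(a, m)][1] <= pb for m in moves))

def pareto_dominated(a, b):
    """Some other outcome leaves no one worse off and someone better off."""
    pa, pb = payoff[(a, b)]
    return any(qa >= pa and qb >= pb and (qa, qb) != (pa, pb)
               for qa, qb in payoff.values())

print(is_nash("confess", "confess"))           # True: the equilibrium
print(pareto_dominated("confess", "confess"))  # True: {silent, silent} beats it
print(pareto_dominated("silent", "silent"))    # False: Pareto efficient
```

The equilibrium is Pareto dominated by mutual silence, which is exactly the tragedy described above.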

Another example of a Pareto inefficient Nash equilibrium occurs when participants over-consume common resources, in what is known as the tragedy of the commons. It is a tragedy because the parties, each acting out of self-interest, consume and deplete the shared resources.


Endowment Effect

The endowment effect is an emotional attachment to an object in one's possession, whereby the individual demands a higher price for it than what she would be willing to pay to acquire it. In other words, an individual's willingness to accept (WTA) compensation for selling a good exceeds her willingness to pay (WTP) for a similar good.

The root of the endowment effect may be the same as that of status quo bias: loss aversion.

In the famous experiment carried out by Kahneman et al., students of Simon Fraser University were randomly divided into three groups: sellers, buyers, and choosers. Sellers were handed coffee mugs that they could sell at prices between $0.25 and $9.25. Buyers had the option to buy the mugs at those prices. Choosers had no mugs but could choose, at each of those prices, between receiving a mug or that amount of money.

You may notice that the chooser group is equivalent to the sellers but without possession of the good, and is therefore the ideal control for the experiment. The results were startling: the average asking price of the sellers ($7.12) was more than twice what the choosers ($3.12) and buyers ($2.87) would have settled for.

Kahneman; Knetsch; Thaler, “The Endowment Effect, Loss Aversion, and Status Quo Bias”, Journal of Economic Perspectives, 5(1), 1991


Loss Aversion and Status Quo

Does rational decision-making exist? Well, that depends on how you define it. We have seen the expected value and expected utility theories. The expected value is a straightforward multiplication of the value of an option by its probability of being realised. In this scheme, a 20% chance of $600 (0.2 x $600 = $120) is a clear winner over a sure $100. But we know that is not the automatic choice for people. If one finds more utility in the sure $100, the certainty is worth more than the possibility of $600; remember, if you are among the lucky 20%, you get $600, not $120. At the same time, a gambler may take the chance not just for $600 but even for $400!
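The arithmetic above can be made concrete, with a square-root utility standing in for "more utility for $100". The square-root form is an illustrative assumption: any sufficiently concave function tells the same story.

```python
import math

# Expected value: the gamble beats the sure thing on paper.
ev_gamble = 0.2 * 600
ev_sure = 100.0
print(ev_gamble)  # 120.0

# Expected utility with a concave (risk-averse) u(x) = sqrt(x):
# now the sure $100 wins.
eu_sure = math.sqrt(100)           # 10.0
eu_gamble = 0.2 * math.sqrt(600)   # ~4.9
print(eu_sure > eu_gamble)  # True
```

Same numbers, opposite conclusions, depending only on whether you value money linearly or with diminishing returns.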

These two rules are still not enough to categorise all choices. There is another principle that governs choice, known as the reference-dependent model, i.e. an individual's preference depends on the assets she already possesses. Driven by psychology, a sort of inertia creeps into such situations.

Loss aversion is one such instance. It means that, for someone who already possesses something (the reference point), the perceived downsides weigh heavier than the potential upsides. As a result, you decide to stay where you are: a case of status quo bias. Tversky and Kahneman sketch this value function as concave for gains and convex, and steeper, for losses.
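Since the sketch itself isn't reproduced here, the functional form can be written out. The parameter values below (alpha = 0.88, lambda = 2.25) come from Tversky and Kahneman's later 1992 estimates, an assumption on my part rather than numbers from the 1991 paper.

```python
ALPHA, LAMBDA = 0.88, 2.25

def value(x):
    """Subjective value of a gain or loss x relative to the reference point:
    concave for gains, convex and steeper (by factor LAMBDA) for losses."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# Losses loom larger than equal-sized gains:
print(value(100))   # ~57.5
print(value(-100))  # ~-129.5, more than twice the magnitude of the gain
```

The asymmetry is the whole point: a $100 loss hurts more than a $100 gain pleases, so people cling to what they have.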

There is supporting data from field studies on the choice of medical plans. The researchers found that a new medical scheme is more likely to be chosen by a new employee than by someone hired before that plan became available.

Tversky and Kahneman, "Loss Aversion in Riskless Choice: A Reference-Dependent Model," Quarterly Journal of Economics, 1991


Biases in Decision Making

Be it massive industrial projects or simple personal decisions, good-quality decisions are essential for success. We have seen how the value and utility models help in understanding how a rational (or real) decision-maker operates.

Biases are among the biggest blockers of rational decision-making. Let's see some of the most common types.

Optimism bias

It refers to an individual's tendency to anticipate more positive future outcomes than what could realistically occur. The decision-maker develops an overconfidence that prevents her from thinking about negative outcomes. It is also sometimes called the illusion of invulnerability. While for individuals this may bring a few benefits (a happier life), for critical decision-making, say in a business, it is a major cause for alarm. Imagine launching a product while ignoring your competition.

Status quo bias

The preference for the current state of affairs is a type of decision-making based on what comes naturally rather than on what is important or what the evidence suggests. In politics, conservative ideologies, as their name suggests, favour the status quo, often thwarting progressive changes to society.

Confirmation bias

Confirmation bias makes people ignore contradictory evidence in their eagerness to deliver what they are committed to. We all have it to some degree, and it is very difficult to get rid of completely.

Sunk cost bias

It happens when you continue a project long after you should have abandoned it. And the reason? Well, "I have spent a lot already." Call it emotional attachment, over-optimism, or just faith: businesses and individuals throw good money after bad all the time.

Not invented here (NIH) bias

Typical of well-established corporations: a hesitance to adopt outside technologies and an insistence on developing their own, even when replicating existing solutions would suffice. A notable exception that reaped rich rewards is Microsoft Corporation.

Anchoring

We have seen anchoring before. This happens when the decision-maker depends heavily on the initial piece of information. Discount sales in shops are a good example. Rather than analysing the merit of that particular (final) price tag, the buyer gets hooked to the anchor, i.e. the original (often inflated) higher price.

Groupthink

It's about supporting the consensus and ignoring divergent alternatives. Such a tendency can originate from our urge to harmonise with the rest (especially superiors), from overconfidence in the judgement of the crowd (argumentum ad populum), or merely from the comfort of sharing the blame in case the decision fails!

The Intelligence Trap

Beware of the automatic assumption that smart, quick, reactive answers mean an effective thinking process has taken place.


Utility and Psychology of Sunk Cost

We have already seen that human decision-making is complex and not always tied to the value or utility of a good (or money). Tversky and Kahneman describe another survey of two groups, each of about 200 people.

To group #1:
Imagine you paid $10 for a ticket to a play. You reached the theatre and discovered that you had lost the ticket. Will you spend $10 on another ticket?

54% of the people said NO to that question. Apparently, more than half felt that $20 was too expensive for a ticket.

To group #2:
Imagine you went for a play where admission is $10 per ticket. You reached the theatre and discovered that you had lost a $10 bill. Will you spend $10 on a ticket?

88% of the respondents said YES to the purchase.

Same loss, different feelings

The main difference is that, in the second case, the lost money was not accounted against the ticket purchase. Spending $10 on the ticket was a separate event, unconnected to the loss of the $10 bill. I lost a ten-dollar bill due to negligence, but that doesn't deprive me of watching the play (or rather, the play is a good distraction to forget my loss)!

The re-purchase of the ticket, on the other hand, is a painful decision: spending double on a ticket because of my own carelessness!

Tversky, A.; Kahneman, D., Science, 1981, 211, 453


The Framing of the Risk in Decision Making

We saw three questions and the public's responses to them in the last post. While expected value theory provides a decent conceptual basis, real-life decisions are typically made on perceptions of risk, sometimes captured in utility functions. Let's analyse the options and the popular answers to the three questions.

The 78% preference for a sure $30 over an 80% chance of $45 is a clear case of risk aversion. People were willing to ignore that 8 out of 10 of them would have got $45 had they given up the $30 in the bank.

The response to the second question is easy to understand: the first stage was mandatory for the participants, and the options in the second stage were identical to those of the first question.

The intriguing response was the third one. In terms of overall probabilities, the second and third questions are identical. Yet almost half of the people who went for the sure-shot $30 before are now willing to bet on the $45!
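The equivalence is just compound probability: folding the 25% first stage into the second-stage options of question 2 gives exactly the odds of question 3.

```python
# Question 2: 25% chance to reach stage two, then choose C or D.
p_stage1 = 0.25

p_C = p_stage1 * 1.0   # sure $30 after surviving the first stage
p_D = p_stage1 * 0.8   # 80% chance of $45 after the first stage

print(p_C)  # 0.25 -> same as option E: 25% chance of $30
print(p_D)  # 0.2  -> same as option F: 20% chance of $45
```

Only the framing differs, which is why the split in the answers is so revealing.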

Tversky, A.; Kahneman, D., Science, 1981, 211, 453


Utility Model in practice

We have seen how a rational decision-maker may operate using either the expected value or the expected utility theory. Real life, however, is not so straightforward. In a famous Tversky-Kahneman experiment, three groups were presented with three situations.

1: Which of the following do you prefer?
A. a sure win of $30
B. 80% chance to win $45

2: There is a two-stage game with a 25% chance to advance to the second stage. On reaching the second stage, you get the following choices, but you must state your preference before the first stage. If you fail in the first stage, you get nothing.
C. a sure win of $30
D. 80% chance to win $45

3: Which of the following do you prefer?
E. 25% chance to win $30
F. 20% chance to win $45

Expected Values

We will look at the expected value of each of the options. You may argue that this is not how people make decisions in real life; still, keep it as a reference. Remember: EV = value x chance, summed over the outcomes.

Case   EV
A      $30
B      $36 ($45 x 0.8)
C      $7.5 (0.25 x $30)
D      $9 (0.25 x $45 x 0.8)
E      $7.5 (0.25 x $30)
F      $9 (0.2 x $45)
Note that the higher EV in each pair belongs to B, D, and F.
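The table can be reproduced with a one-line expected-value calculation over the options as listed above:

```python
# Each option as a list of (value, probability) outcomes; EV = sum(v * p).
options = {
    "A": [(30, 1.0)],
    "B": [(45, 0.8)],
    "C": [(30, 0.25)],        # 25% to reach stage two, then a sure $30
    "D": [(45, 0.25 * 0.8)],  # 25% to reach stage two, then 80% of $45
    "E": [(30, 0.25)],
    "F": [(45, 0.2)],
}
ev = {name: sum(v * p for v, p in outcomes)
      for name, outcomes in options.items()}
print(ev)  # {'A': 30.0, 'B': 36.0, 'C': 7.5, 'D': 9.0, 'E': 7.5, 'F': 9.0}
```

Note that C and E (and likewise D and F) come out identical, which is the equivalence the next post picks apart.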

What did people say?

In the first group, 78% of the participants chose option A. In the second, 74% favoured option C. The final group was almost split: 42% for E and 58% for F.

These three problems are, in one way, similar to each other. We will see that next.

Tversky, A.; Kahneman, D., Science, 1981, 211, 453
