When the Dominant Strategy Leads to Doom

We have seen in the previous post that, from a country's standpoint, it appears more advantageous to do nothing for the climate and let others do the work for your benefit. We've also seen that the others, playing the same game, will follow suit, each finding its equilibrium to the detriment of the whole planet. In other words, individual rationality leads to collective irrationality.

The Paris Climate Accord

The role of the Paris Accord is to bring countries to the non-dominant strategy {contribute, contribute}. It attempts this in the following ways.

First, the Paris Agreement is legally binding, so all the signatories have to contribute. But here is the catch: the amount of the contribution is determined by each country itself through what are known as Nationally Determined Contributions (NDCs).

The second option is to provide incentives. The Paris Agreement provides a framework for financial, technical, and capacity-building support for those who need it. Financial support is necessary to achieve capacity building, which, in turn, reduces costs through a feature known as the learning rate.

The third option is to impose a cost on non-compliance. As part of the Enhanced Transparency Framework (ETF), countries are required to report back in 2024 and show the actions they have taken. The process can impose a moral cost on the participating countries, providing the push required to follow up promises with actions. Moreover, the measurement, reporting, and verification are legally binding.

Paris Agreement: UNFCCC


The Game Called Climate

International cooperation on climate change is an example of a game with a Nash equilibrium and a dominant strategy. Let's look at the problem from one country's viewpoint and construct the payoff matrix. Call that country MY.

The premise is that the climate crisis is real, but the solution is costly because it requires developing new technologies. If neither country MY nor the rest invests anything, the game ends in total calamity: {(MY: -10), (RE: -10)}. If country MY remains a passive free-rider while the rest of the world does the job, the former gets the maximum benefit (MY: 12). If, instead, country MY makes all the effort alone, it suffers great losses (MY: -15). Finally, if everybody cooperates according to their respective capacities, all of them benefit: {(MY: 10), (RE: 10)}. The payoff matrix is

                              Country MY              Country MY
                              doesn't contribute      contributes
The rest doesn't contribute   MY: -10, RE: -10        MY: -15, RE: 12
The rest contributes          MY: 12,  RE: 8          MY: 10,  RE: 10

Country MY calculates that it is better off not acting, irrespective of what the rest does. It also anticipates that, in such a scenario, the rest will lose more by not acting. From an individual country's standpoint, this logic of not participating looks economically advantageous.

But there is an error in this thinking: any (or all) of the countries in the rest can follow country MY's suit. That leads to total failure, and no one benefits: {-10, -10}.
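The dominant-strategy argument can be checked mechanically. Here is a minimal sketch in Python using the payoffs from the post (the action labels "defect" and "contribute" are my own):

```python
# Payoffs from the post, keyed by (MY's action, the rest's action)
payoffs = {
    ("defect", "defect"): (-10, -10),        # nobody acts: total calamity
    ("defect", "contribute"): (12, 8),       # MY free-rides on the rest
    ("contribute", "defect"): (-15, 12),     # MY makes all the effort alone
    ("contribute", "contribute"): (10, 10),  # everybody cooperates
}

def best_response_MY(rest_action):
    """MY's payoff-maximising action, given what the rest does."""
    return max(("defect", "contribute"),
               key=lambda a: payoffs[(a, rest_action)][0])

# "defect" wins for MY whichever way the rest moves: a dominant strategy
print(best_response_MY("defect"))      # -> defect
print(best_response_MY("contribute"))  # -> defect
```

Since defecting beats contributing in both columns of MY's row, no further reasoning about the rest's behaviour is needed to predict MY's choice.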


The Tragedy of Pareto Inefficient Nash Equilibrium

We have seen the prisoner's dilemma in an older post. The rational decision-maker, the prisoner, will confess because it gives the best outcome, with no regret, irrespective of what the other does. It is therefore a Nash equilibrium, named after the American mathematician John Nash: each prisoner's choice is the best response to the choice of the other.

But we know that {confess, confess} is not the best result for either player. In other words, the outcome (the Nash equilibrium) is not Pareto efficient! An outcome is Pareto efficient when no alternative exists that makes someone better off without making anybody worse off. For this game, Pareto efficiency would have been achieved had the prisoners cooperated. But then, confess is the dominant strategy.

Another example of a Pareto inefficient Nash equilibrium is when participants over-consume common resources, in what is known as the tragedy of the commons. It is a tragedy because the parties, out of self-interest, consume and deplete the shared resources.
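The definition of Pareto efficiency can be made concrete with a small sketch. The prisoner's-dilemma payoffs below (years in prison, negated) are illustrative numbers of my own, not from any particular source:

```python
# Illustrative prisoner's-dilemma payoffs: (prisoner 1, prisoner 2)
outcomes = {
    ("confess", "confess"): (-5, -5),
    ("confess", "silent"): (0, -10),
    ("silent", "confess"): (-10, 0),
    ("silent", "silent"): (-1, -1),
}

def pareto_efficient(o):
    """True if no other outcome makes someone better off and nobody worse off."""
    a = outcomes[o]
    return not any(all(b[i] >= a[i] for i in range(2)) and b != a
                   for b in outcomes.values())

# The Nash equilibrium {confess, confess} is dominated by {silent, silent}
print(pareto_efficient(("confess", "confess")))  # -> False
print(pareto_efficient(("silent", "silent")))    # -> True
```

Mutual silence at (-1, -1) improves both players over (-5, -5), which is exactly why the equilibrium is called Pareto inefficient.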


SSP in CMIP6

A key feature of the climate report is the abundance of acronyms, and the title of today's post is a deliberate attempt to introduce a couple of them!

One of the goals of establishing a model framework is forecasting. And the prerequisite for reliable forecasting is a good fit with the historical data. We saw in the previous post the importance of CMIP and the role of climate models in matching the historical trends as closely as possible.

SSPs

These projections of future scenarios are based on five pathways, called Shared Socioeconomic Pathways (SSPs), SSP1 through SSP5. They range, in terms of impact on climate change, from the mildest, SSP1 (sustainable development), to the harshest, SSP5 (high energy demand, fossil-fuel development). These pathways are then combined with the global effective radiative forcing (ERF) values (W/m2) envisaged for 2100 to get the SSP matrix.

Representation of SSP scenarios

An SSP scenario is represented by the SSP pathway number followed by the 2100 forcing value. For example, the sustainable pathway at 1.9 W/m2 ERF is SSP1-1.9. Chapter 4 of the WG1 report of AR6 focusses on five scenarios: SSP1-1.9, SSP1-2.6, SSP2-4.5, SSP3-7.0, and SSP5-8.5.
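The naming convention is simple enough to handle programmatically. A toy helper of my own (requires Python 3.9+ for removeprefix), not anything from the report itself:

```python
def parse_ssp(label):
    """Split a label like 'SSP1-1.9' into (pathway number, 2100 ERF in W/m2)."""
    pathway, forcing = label.split("-")
    return int(pathway.removeprefix("SSP")), float(forcing)

print(parse_ssp("SSP5-8.5"))  # -> (5, 8.5)
print(parse_ssp("SSP1-1.9"))  # -> (1, 1.9)
```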

As per these scenarios, a 1.5 °C increase of the global mean surface air temperature (GSAT) within the next 20 years is highly likely in SSP5-8.5, likely in SSP2-4.5 and SSP3-7.0, and more likely than not in SSP1-1.9 and SSP1-2.6!

Climate Reports: IPCC

Introduction to Climate Models, CMIP


Climate Models

The backbone of the IPCC climate assessment is the Coupled Model Intercomparison Project (CMIP). The term coupled means that the model evaluates the whole system, i.e., atmosphere and ocean. Intercomparison suggests that climate models, developed by various groups at different points in time, are harmonised using the same set of inputs (provided by CMIP). This way, if there are differences between the models' predictions, one can be assured that they are due not to variations in experimental design but to differences in physics or mathematical treatment.

Climate models are mathematical representations of complex geo-chemical-physical processes and relate various inputs to the observed features of global warming. They give better control over the underlying science, but, more importantly, they serve as a framework for forecasting.

The project, currently at version 6 (CMIP6), is a collaborative framework aiming to improve the understanding of climate change related to global warming. It compares climate models, developed by various groups, with the experimental data and with each other.

The multi-model mean captured by CMIP6 closely matches the observed Global Mean Surface Temperature (GMST), although individual models' predictions can differ from the observed data.
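As a toy illustration of what a multi-model mean is: average the models' outputs at each time step. The numbers below are made up; real CMIP6 ensembles contain dozens of models and gridded monthly fields:

```python
# Hypothetical temperature-anomaly series (degrees C) from three toy models
model_runs = [
    [0.10, 0.35, 0.80, 1.10],
    [0.05, 0.30, 0.70, 1.00],
    [0.15, 0.40, 0.90, 1.20],
]

# Multi-model mean: average across models at each time step
multi_model_mean = [sum(col) / len(col) for col in zip(*model_runs)]
# approximately [0.10, 0.35, 0.80, 1.10]
```

Individual runs scatter above and below the observations, but the ensemble average tends to cancel out model-specific quirks, which is why the multi-model mean tracks GMST better than most single models.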

Climate Reports: IPCC

Introduction to Climate Models, CMIP


Many Indicators, One Message

We may divide the ecological system we reside in into four parts, viz., the atmosphere, biosphere, cryosphere, and oceans. The climate report makes an exhaustive list of evidence of changes in these sub-systems. Some of it, such as the rise in sea levels and the increased frequency of floods, provides direct evidence of a changing climate, whereas the rest, e.g., the temperature rise or the elevated levels of greenhouse gases in the atmosphere, is evidence of its root causes.

Data from the atmosphere

The first and foremost is carbon dioxide (CO2) in the atmosphere. Its concentration reached 409.9 (±0.4) ppm in 2019. The corresponding values for the other greenhouse gases (GHGs) were 1866.3 (±3.3) ppb for methane and 332.1 (±0.4) ppb for nitrous oxide. This level of CO2 is the highest in the last two million years! These molecules are ominous for the system because of their power to increase the so-called effective radiative forcing (ERF); the resulting gain in energy warms the system.

Data from the biosphere

Global Mean Surface Temperature (GMST), the poster child of the climate change crisis, has increased by 1.09 °C [0.95 – 1.20] since industrialisation: 1.59 °C over land and 0.88 °C over the oceans.

Global land precipitation has increased since 1950, and its pace has picked up further since the 1980s.

Data from the cryosphere

The annual mean and late-summer extents of Arctic ice coverage are the lowest since 1850. The decadal mean of the Arctic sea-ice area has decreased from 6.23 million km² in the 1980s to 3.76 million km² in the last decade for September, and from 14.52 to 13.42 million km² for March.
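From these decadal means, the relative decline works out as a quick back-of-the-envelope calculation:

```python
# September and March decadal means of Arctic sea-ice area (million km2),
# numbers as quoted in the post
sep_1980s, sep_recent = 6.23, 3.76
mar_1980s, mar_recent = 14.52, 13.42

sep_loss = (sep_1980s - sep_recent) / sep_1980s * 100  # ~39.6% decline
mar_loss = (mar_1980s - mar_recent) / mar_1980s * 100  # ~7.6% decline
```

The late-summer minimum has thus shrunk by roughly two-fifths in four decades, far faster than the winter maximum.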

Data from the oceans

The global mean sea level (GMSL) has risen by 0.2 m since 1901, and the rate is accelerating. Meanwhile, the ocean heat content has increased, and the pH and oxygen content have decreased.

Climate Change Indicators: US EPA

Climate Reports: IPCC


AR6 is getting Ready

The first two Working Group (WG) reports of the sixth assessment (AR6) of the Intergovernmental Panel on Climate Change (IPCC) are now available for public view. Two more reports, the third working group (WGIII) and the final synthesis report (SYR), are due later this year.

IPCC and working groups

The IPCC, formed in 1988 by the World Meteorological Organization (WMO) and the United Nations Environment Programme (UNEP), is the United Nations (UN) body for assessing the science related to climate change. The body has done an honourable job for more than 30 years in providing policymakers with scientific information about climate change. While the IPCC does not conduct its own research, it gathers input from thousands of scientists and mathematicians working in the field globally and facilitates expert review.

It has three working groups and a task force. WGI deals with the science of climate change, WGII with its impact, and WGIII with the mitigation plans.

Assessment cycles

The current assessment cycle, the sixth, started where the fifth ended (2013-14) and had its first report (AR6-WGI) released in 2021.

WGI: physical science basis of climate change

Through its 12 chapters spread over 4000 pages, the report summarises the current state of knowledge about the climate and human-induced climate change. We'll go through some of its findings in the next post.


Endowment Effect

The endowment effect is an emotional attachment to an object in one's possession whereby the individual asks a higher value for it than what they would be willing to pay for it. In other words, an individual's willingness to accept (WTA) compensation to sell a good exceeds their willingness to pay (WTP) for a similar good.

The root of the endowment effect may be the same as that of status quo bias: loss aversion.

In the famous experiment carried out by Kahneman et al., students of Simon Fraser University were randomly divided into three groups: sellers, buyers, and choosers. Sellers were handed coffee mugs that they could sell at prices between $0.25 and $9.25. Buyers had the option to buy the mugs at those prices. Choosers had no mugs but could choose, at each of the selling prices, between a mug and that amount of money.

You may notice that the chooser group is equivalent to the seller group but without possession of the good, and is therefore the ideal control for the experiment. The results were startling: the average asking price of the sellers ($7.12) was more than twice what the choosers ($3.12) and buyers ($2.87) would have settled for.

Kahneman, Knetsch, and Thaler, "The Endowment Effect, Loss Aversion, and Status Quo Bias", Journal of Economic Perspectives, 5(1), 1991


Loss Aversion and Status Quo

Does rational decision-making exist? Well, that depends on how you define it. We have seen the expected value and expected utility theories. The expected value is a straightforward multiplication of the value of an option by its probability of being realised. In this scheme, a 20% chance of $600 (0.2 x $600 = $120) is a clear winner over a sure $100. But we know that is not the automatic choice for people. If one finds more utility in a sure $100, the certainty is worth more than the possibility of $600; remember, if you are among the lucky 20%, you get $600, not $120. At the same time, a gambler may take the bet not just for $600 but even for $400!
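The arithmetic behind the two theories fits in a few lines. The square-root utility below is just one illustrative concave choice, not the only possible one:

```python
import math

# Expected value: probability times payoff
sure_thing = 1.0 * 100   # $100 for certain -> EV = 100
gamble_ev = 0.2 * 600    # 20% chance of $600 -> EV = 120, so EV favours the gamble

# Expected utility with a concave (diminishing-returns) utility function
u = math.sqrt            # illustrative utility: u(x) = sqrt(x)
eu_sure = 1.0 * u(100)   # = 10.0
eu_gamble = 0.2 * u(600) # about 4.9, so utility favours the sure $100
```

The same pair of options flips its ranking depending on whether we compare expected values or expected utilities, which is exactly the distinction the post draws.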

These two rules are still not enough to categorise all choices. There is another principle that governs choice, known as the reference-dependent model: an individual's preference depends on the assets she already possesses. Driven by psychology, a sort of inertia creeps into such situations.

Loss aversion is one such instance. It means that, for someone in possession of a good (the reference point), the perceived downsides weigh heavier than the potential upsides. As a result, you decide to stay where you are: a case of status quo bias. Tversky and Kahneman sketch this value function in the following form.
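A value function of this shape can be sketched in code. The functional form follows the standard prospect-theory formulation; the specific parameters (alpha = 0.88, lambda = 2.25) come from Tversky and Kahneman's later work and are purely illustrative here:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory-style value: concave for gains, steeper for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Losses loom larger than gains: a $100 loss hurts more than a $100 gain pleases
print(abs(value(-100)) > value(100))  # -> True
```

The kink at the reference point (x = 0) and the factor lam > 1 on the loss branch are what encode loss aversion.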

There is data available from field studies on the choice of medical plans. The researchers found that a new medical scheme is more likely to be chosen by a new employee than by someone hired before that plan was available.

Tversky and Kahneman, "Loss Aversion in Riskless Choice: A Reference-Dependent Model", Quarterly Journal of Economics, 1991


Biases in Decision Making

Be it massive industrial projects or simple personal decisions, good-quality decisions are essential for success. We have seen how the value and utility models help in understanding how a rational (or real) decision-maker operates.

Biases are among the biggest blockers of rational decision-making. Let's see some of the most common types.

Optimism bias

It refers to an individual's tendency to anticipate more positive future outcomes than could occur in reality. The decision-maker gains an overconfidence that prevents her from thinking about negative outcomes. It is also sometimes called the illusion of invulnerability. While for individuals this may bring a few benefits (a happier life), for critical decision-making, say in a business, it is a major cause for alarm. Imagine launching a product while ignoring your competition.

Status quo bias

The preference for the current state of affairs is a type of decision-making based on what comes naturally rather than on what is important or what the evidence suggests. In politics, conservative ideologies, as their name suggests, favour the status quo, often thwarting progressive changes to society.

Confirmation bias

A person with confirmation bias ignores contradictory evidence, desperate to deliver what they are committed to. We all have it to some degree, and it is very difficult to get rid of completely.

Sunk cost bias

It happens when you continue a project long after you should have abandoned it. And the reason? Well, "I have spent a lot already." Call it emotional attachment, over-optimism, or just faith: businesses and individuals throw good money after bad all the time.

Not invented here (NIH) bias

Typical of well-established corporations: the hesitance to adopt outside technologies and the insistence on developing their own, even when replicating existing solutions would suffice. A notable exception that reaped rich rewards is Microsoft Corporation.

Anchoring

We have seen anchoring before. It happens when the decision-maker depends heavily on the initial piece of information. Discount sales in shops are a good example: rather than analysing the merit of the final price tag, the buyer gets hooked on the anchor, i.e., the original (often inflated) higher price.

Groupthink

It's about supporting the consensus and ignoring divergent alternatives. Such a tendency could originate from our urge to harmonise with the rest (especially our superiors), overconfidence in the judgement of the crowd (argumentum ad populum), or merely the comfort of sharing the blame in case the decision fails!

The Intelligence Trap

Beware of the automatic assumption that smart, reactive, quick answers mean an effective thinking process has taken place.
