Have you heard of the ‘ultimatum game’ of economics? Here’s how it goes.
A person – let’s call him the proposer – is given a hundred bucks and asked to split the money with a stranger, the responder. The split doesn’t need to be equal. The proposer could split it 50-50, or he could keep 90 for himself and offer 10 to the stranger. But there’s a condition: if the responder rejects the offer, neither of them gets any money.
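The payoff rule of the game can be sketched in a few lines of Python (the function name and amounts are mine, purely for illustration):

```python
def ultimatum_game(total, offer, responder_accepts):
    """Payoffs for one round of the ultimatum game.

    The proposer keeps (total - offer) and the responder gets the offer,
    but only if the responder accepts; a rejection leaves both with nothing.
    """
    if responder_accepts:
        return total - offer, offer   # (proposer's share, responder's share)
    return 0, 0                       # rejection wipes out both payoffs

# A lopsided 90-10 split, accepted: the responder still walks away with money.
print(ultimatum_game(100, 10, True))   # (90, 10)

# The same split, rejected out of a sense of unfairness:
print(ultimatum_game(100, 10, False))  # (0, 0)
```

The second call is the whole puzzle in miniature: rejecting costs the responder 10 bucks, yet real people reject lopsided offers all the time.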
If you were the responder, at what split ratio would you accept the offer?
50-50? Most people would consider that fair. But is it rational?
What if you didn’t know the total sum involved in the deal and were told only the amount the proposer offers you? Isn’t it like free money, something you found lying on the street? Why would you reject even 5 bucks that way?
But that’s not how humans think. Right?
The knowledge that someone else got a better deal (at our cost) makes us humans feel cheated.
“Not fair,” we cry. “How dare the proposer offer less than 50 to me?”
Some would even argue that the proposer should keep less than 50 for himself and offer more to the responder.
The ultimatum game isn’t a term I have coined myself. Wikipedia mentions –
When carried out between members of a shared social group (e.g., a village, a tribe, a nation, humanity) people offer “fair” (i.e., 50:50) splits, and offers of less than 30% are often rejected.
But how exactly do we define what’s fair? Here’s another hypothetical situation. I have taken this example from Prof. Sanjay Bakshi’s post –
You are in charge of running a retail store and one of your cashiers, an elderly woman, is caught committing a minor embezzlement. Fearing that she might be dismissed, she approaches you to plead forgiveness. She tells you that this is the first time she embezzled money from the company and promises that she’ll never do it again. She tells you about her sad situation, namely that her husband is very ill and that she was going to use the money to buy medicines for him. She becomes extremely emotional and your heart is melting. What do you do?
There’s no right or wrong answer here. It’s an open-ended question and how you think about it is more important than what you decide to do in the end. Prof Bakshi writes –
The possible actions are: (1) She is lying and you fire her (good outcome – because it cures the problem and sends the right signals); (2) She is telling the truth and you fire her (bad outcome for her but good outcome for system integrity); (3) She is lying and you pardon her (bad outcome for system integrity); and (4) She is telling the truth and you pardon her (bad outcome for system integrity because it will send the wrong signal that it’s ok to embezzle once).
What’s fair to the elderly lady may not necessarily be fair to the larger system or to society. Prof Bakshi has termed this mental model the “law of higher good.” So one way to solve this problem is to make a decision based on what’s good for the larger group of people, i.e., the people in the organization. Of course, this course of action assumes that your decision will lead to a good outcome for the larger system, which again may not be true.
Devdutt Pattanaik, a medical doctor turned mythologist, in his book How To Take Decisions, writes –
At the time of action, our decision is based on a set of assumptions. The assumptions may be wrong. Leaders have to constantly deal with uncertainty, give hope to the people even when nothing is clear. Decisions become good or bad in hindsight. We would like to believe that a decision is rational. More often than not, decisions are rationalized.
Now the question of fairness leads us to a bigger and even more interesting conundrum. It’s about morality.
For decades, the trolley problem has troubled philosophers.
A trolley with no brakes is hurtling down its track at a dangerous speed. A few hundred metres down the same track, five people are working, unaware of the oncoming trolley. Fortunately, you are sitting in the control room and can see the precarious situation. You can save those five lives by pulling a lever that will divert the trolley onto a different track. But doing so will ensure the death of another man who is working alone on the second track. Should you kill one to save five?
Most people say that they would pull the lever and save five lives at the cost of one. Fair enough. But here’s a little twist in the tale. Now imagine you’re standing on a footbridge above the track and can see the trolley hurtling towards those five people. A fat man is standing next to you, and you know that his weight would be enough to stop the trolley. Would you push him onto the track as the trolley passes under the footbridge? Effectively, it’s the same situation. The only difference is that now you have to push a man instead of pulling a lever.
A real moral dilemma, isn’t it? There’s even a book on this particular problem, called Would You Kill the Fat Man?
Now you might think that it’s a purely hypothetical question that armchair philosophers have created for their own amusement. Not really. This problem is perhaps giving Elon Musk sleepless nights these days.
Imagine you’re out on a long drive in your brand new self-driving car. A Tesla, maybe. The car is gliding along a deserted mountain road. All of a sudden, a group of cyclists appears out of nowhere. The car’s software, supposedly powered by advanced artificial intelligence, does a few million complex calculations in a couple of microseconds and determines that a collision is inevitable unless the car takes a sharp turn to the right and plunges several hundred feet down the hill, killing the passenger. What would the car do?
It’ll probably depend on how Mr. Musk chose to address the trolley problem. Or maybe it’ll boil down to a much simpler question, i.e., which version of the software you bought for your autonomous car. Is it the cheaper altruistic version? Or is it the selfish one for which you paid an exorbitant price?
We started this discussion with the ultimate game of economics and then drifted off to the subjects of fairness and morality. How’s all this related to investing?
Let’s say you spend months researching a stock. You read the annual reports of the past several years, you crunch the numbers, you attend the AGM and study the conference-call transcripts diligently. After all this hard work you decide to invest in the business, only to find a few months later that the stock has crashed because of a change in government regulation that negatively impacted the business. Is that fair?
In this situation, if you harp on the question of fairness, you may lose even more money as the stock continues its downward journey while you shake your head in utter disbelief, crying, “It’s not fair.”
Peter Lynch said, “In this business, if you’re good, you’re right six times out of ten. You’re never going to be right nine times out of ten.”
That’s why you need diversification. If you have 15-20 stocks in your portfolio, you don’t need to make money on every one of them. In investing, you don’t have to make your money back the same way you lost it.
Extending the “law of higher good” mental model to this scenario would mean ensuring a positive outcome at the portfolio level and not getting disheartened about a few losers.
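A toy calculation shows why Lynch’s six-out-of-ten hit rate can still work at the portfolio level. The return figures below are entirely made up for illustration – the point is only that winners can more than offset losers:

```python
# Hypothetical equal-weight portfolio of 10 stocks:
# 6 winners gain 50% each, 4 losers drop 30% each (illustrative numbers).
returns = [0.50] * 6 + [-0.30] * 4

# Equal weighting means the portfolio return is just the average.
portfolio_return = sum(returns) / len(returns)
print(f"Portfolio return: {portfolio_return:.0%}")  # Portfolio return: 18%
```

Four of the ten picks lost money, yet the portfolio as a whole gained 18% – the “higher good” showed up at the portfolio level, not in every individual stock.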
Let’s talk about morality and investing.
Should you invest in a business that manufactures tobacco products? How about a company that sells alcohol? These products are harmful to society, so isn’t it unethical to promote such businesses? But scientists have shown that sugar is harmful too. Should you then reject businesses that use sugar in their products? If yes, you’ll have to filter out almost every business selling packaged foods.
It’s a very difficult question, and the answer is very subjective. Call it the investing version of the trolley problem. By the way, if you haven’t watched Harvard professor Michael Sandel’s lecture – The Moral Side of Murder – you must watch it as soon as you’re done reading this post.
Coming back to the proposer–responder game: it’s not just the ultimatum game of economics but the ultimate game of life. How should one deal with seemingly unfair situations in life? Should we fight for our fair share? Or should we move on?
My intention in this discussion is to nudge you to think about these paradoxes of fairness and morality in the context of your own life and the decisions that you’ve made or are going to make in the future.
And while you do that, let me go and find a scientist who would like to play the ultimatum game with me. It would be a good opportunity to make a few easy bucks.