The difference between Risk and Uncertainty

What’s the difference between risk and uncertainty? Both imply doubt and ambiguity in the outcome of an event, but for different reasons.

RISK is when we don’t know what the outcome is, but we do know the distribution of the outcomes.

UNCERTAINTY is when we don’t know what the outcome will be, and we don’t know the distribution either.

This sounds like a subtle difference, but it is an important one. As we will see later, because of the psychology of the human mind, our perception of risk and uncertainty is non-linear. This leads to some well-documented “paradoxes”, which we'll look into shortly. First, here's a very memorable quote related to this topic:

“ There are known knowns; there are things we know that we know. There are known unknowns; that is to say, there are things that we now know we don't know. But there are also unknown unknowns – there are things we do not know we don't know. ”

— United States Secretary of Defense, Donald Rumsfeld

Expected Value

Mathematicians (and gamblers) know of the concept of Expected Value.

Expected value is the outcome that would be obtained if it were possible to repeat an experiment an infinite number of times and average the results.
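
For a discrete set of possible outcomes, the expected value is simply a probability-weighted average: multiply each outcome by its probability and add up the products.

E[X] = x₁p₁ + x₂p₂ + … + xₙpₙ

For example, the expected value of a single roll of a fair die is (1 + 2 + 3 + 4 + 5 + 6) × ⅙ = 3.5, even though 3.5 itself can never be rolled.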

Let’s try a few thought experiments to understand this property, and how risk modifies people’s perception and acceptance of it. There are no right or wrong answers to these questions: every person has their own individual tolerance for risk, and casinos make their living in the arbitrage of risk.

Thought Experiments

Let’s imagine there are two boxes, we’ll call them Box A and Box B.

Box A contains $10,000 in cash. This amount of money is certain to be in the box.

Box B contains either $20,000 in cash, or nothing. Both options are equally likely (The person who prepared the boxes flipped a fair coin. If it landed heads, he put the cash in the box. If it landed tails, he left the box empty). It’s a 50:50 chance for either option.

You are allowed to take one of the boxes, and only one. You are not permitted to examine, weigh or touch either of the boxes before making your selection.

Question: Which box would you take?


Mathematically, both of these boxes have the same expected outcome. In theory, it should not matter which box you select. This assumes, though, that we could repeat the experiment over and over again (Remember the expected outcome is the average of outcomes if we could repeat the experiment an infinite number of times).

For this experiment, however, we’re only allowed to run it once. The law of large numbers does not apply. If you take Box A, you’re certain of $10,000. If you select Box B, and are unlucky, you get nothing.
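
If we could repeat the experiment, the averages really would line up. A quick Python sketch that simulates a million plays of each box illustrates the point:

```python
import random

TRIALS = 1_000_000

# Box A always contains $10,000, so its average payout is $10,000 by definition.
box_a_average = 10_000

# Box B: a fair coin flip decides between $20,000 (heads) and nothing (tails).
box_b_total = 0
for _ in range(TRIALS):
    if random.random() < 0.5:   # heads
        box_b_total += 20_000

print("Box A average:", box_a_average)          # 10000
print("Box B average:", box_b_total / TRIALS)   # very close to 10000
```

Over a single play, of course, Box B never pays anything close to its $10,000 average; it pays either $20,000 or nothing.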

When most people are presented with this question, they say they would take Box A with the justification, “Hey, it’s better to get at least $10,000. If I took the other box, sure, I could get $20,000, but I might end up with nothing. At least this way I get something.”

This is an example of RISK. The distribution of outcomes is known, even if the outcome is not.

Even if we increased the expected value for Box B by adjusting the potential payout to $21,000 (or nothing), making its expected value 0.5 × $21,000 = $10,500, I suspect most people would still select Box A. The higher expected outcome is not enough of an incentive to make people give up the certainty of $10,000.

Sure, if we could repeat the experiment dozens of times and average the results we’d plump for the box with the higher expected value, but we can’t.

Different folks, different strokes

Different people have different sensitivities to risk. As stated before, there is no right or wrong answer. It is also not just based on the psychology of the person; it is situationally dependent. Someone with very little disposable income might cling to the certainty of $10,000 more than someone who earns that much every day.

Ask yourself the question: how much would the potential prize in Box B need to be increased before you would take it instead of Box A? Would you take the box if it might contain $25,000 (or nothing)? $100,000 (or nothing)? $500,000 (or nothing)? …

When the absolute difference between the two minimum possible outcomes is smaller, it seems (from my limited research) that people are more inclined to take the risk (even if the expected outcome is, paradoxically, lower). For example, if the money in Box A is changed to $100,000 sum-certain, and Box B is changed to contain either $90,000 (with a chance of ¾) or $125,000 (with a chance of ¼), people seem to want to take Box B.
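
Checking the arithmetic on that last example shows the modified Box B really does have the lower expected value:

E[Box B] = ¾ × $90,000 + ¼ × $125,000 = $67,500 + $31,250 = $98,750

which is less than the $100,000 sum-certain sitting in Box A.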

Ellsberg Paradox

Let’s play a different thought experiment. Imagine there are two urns.

Urn A contains 50 red marbles and 50 white marbles.

Urn B contains an unknown mixture of red and white marbles (in an unspecified ratio).

You can select either of the Urns, then select from it a random (unseen) marble. If you pick a red marble, you win a prize. Which Urn do you pick from?


In theory, it should not matter which urn you select from. Urn A gives a 50:50 chance of selecting a red marble. Urn B also gives you the same 50:50 chance.

Even though we don’t know the distribution of marbles in the second urn, it contains only red and white marbles, and we have no reason to believe one colour is more likely than the other, so the ambiguity averages out to the same 50:50 chance.
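
One way to see this is to treat every possible mixture as equally likely. A short Python sketch (which assumes Urn B holds 100 marbles and that any count of red marbles from 0 to 100 is equally likely; neither is stated in the setup, but the symmetry is the point) comes out at roughly a 50% chance of red:

```python
import random

TRIALS = 1_000_000
reds = 0

for _ in range(TRIALS):
    # Assumption: every red count from 0 to 100 is equally likely in Urn B.
    red_count = random.randint(0, 100)
    # Draw one marble at random from the urn's 100 marbles.
    if random.randint(1, 100) <= red_count:
        reds += 1

print("Chance of red from Urn B:", reds / TRIALS)   # roughly 0.5
```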

For various reasons, most people prefer to pick from Urn A. It seems that people prefer a known risk rather than ambiguity.

This ambiguity aversion was documented extensively by Daniel Ellsberg (yes, that one), and even earlier by John Maynard Keynes.

Ellsberg continued …

Next experiment: This time there is only one urn. In this urn is a mixture of Red, White and Blue marbles (how patriotic).

There are 90 marbles in total. 30 are Red, and the other 60 are a mixture of White and Blue (in an unknown ratio). You are given a choice of two gambles:

 Gamble 1  You win $100 if you pick a Red marble.

 Gamble 2  You win $100 if you pick a White marble.

Which gamble do you take? Now that you’ve read the section above, you will not be surprised that most people seem to select Gamble 1. They prefer their risk to be unambiguous. A quick check of the expected value of both gambles shows they are equivalent (each with a ⅓ chance of winning), yet people go with the known quantity.
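
To spell that check out (treating every possible White count from 0 to 60 as equally likely):

P(win, Gamble 1) = 30/90 = ⅓

P(win, Gamble 2) = (30 White marbles on average)/90 = ⅓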

OK, next question. You are now given two more gambles to select between:

 Gamble 3  You win $100 if you pick a Red or Blue marble.

 Gamble 4  You win $100 if you pick a White or Blue marble.

In this exercise, the majority of people select Gamble 4 (the number of White and Blue marbles combined is 60).

This is a paradox: the phrase "or Blue marble" is common to both Gamble 3 and Gamble 4, and it is the only thing that differentiates these choices from Gamble 1 and Gamble 2.

If people selected Gamble 1 over Gamble 2 the majority of the time, then simply adding "or Blue marble" to both options should keep them on the same selection; but, paradoxically, these people seem to flip to preferring Gamble 4 over Gamble 3.
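
The same check applies here. Gamble 4 wins on any of the 60 White or Blue marbles, a known 60/90 = ⅔ chance. Gamble 3 wins on the 30 Red marbles plus an unknown number of Blue ones, so its chance lies somewhere between 30/90 = ⅓ and 90/90 = 1, and averages ⅔ under the same equally-likely assumption. Once again the expected values match, and once again people gravitate to the gamble with no ambiguity.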

Allais paradox

A slightly different example of this phenomenon is called the Allais Paradox.
