
Prospecting Mimetic Fractals with Gerd Gigerenzer:  Double-Tonguing, Dread-Risk Fears and Rules of Thumb in the Face of Uncertainty

1/30/2019

Two posts ago I referenced a book called Risk Savvy by Gerd Gigerenzer of the Max Planck Institute.  This book has some interesting things to say at the intersection between the Prospecting and Fractal Lenses, with a peek through the Mimetic Lens as well.  It also provides some practical ways of recognizing the best paths to good decision-making.

The first idea we explored from the book under the Prospecting Lens was the System 1 bias known as:

THE LAW OF SMALL NUMBERS.  Our brains have a difficult time with statistics. Small samples are more prone to extreme outcomes than large samples, but we tend to lend the outcomes of small samples more credence than statistics warrant.  System 1 is impressed with the outcome of small samples but shouldn’t be.  Small samples are not representative of large samples.

THE LAW OF SMALL NUMBERS is often used to make small variations seem like large ones.  Professor Gigerenzer notes that these practices are endemic in the area of medical tests and treatments and the way they are marketed.

Basically, it works like this: when a proponent is trying to push a narrative using statistics, he or she will focus on relative differences to amplify the supposed problem or effect of treatment.  In contrast, when a proponent is trying to downplay or oppose a narrative, he or she will cite base rates to minimize the problem or effect of treatment.  In reality, the base rates are the important statistics; the relative statistics are usually misleadingly large and often lead to a cognitive misapprehension of the true situation, causing the reader to overvalue the meaning of small numbers, just as the heuristic suggests.

Professor Gigerenzer has a simple solution for this problem, which will keep us on System 2 thinking.  He says:  “Always ask: What is the absolute risk increase?”

For example, one should never rely on “5-year survival rates,” a relative measurement that overvalues the test or treatment.  Instead, one should consider overall mortality rates with and without the test or treatment as the measure of actual benefit.  Often there is only a negligible difference.
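To make the distinction concrete, here is a minimal sketch in Python (the numbers are purely illustrative, not figures from the book) showing how the same small effect reads as a tiny absolute risk change but a large relative one:

```python
# Illustrative only: the same effect as an absolute vs. a relative risk reduction.

def risk_summary(deaths_control, deaths_treated, n):
    """Compare absolute vs. relative risk reduction per n people."""
    p_control = deaths_control / n
    p_treated = deaths_treated / n
    absolute_reduction = p_control - p_treated            # difference in base rates
    relative_reduction = absolute_reduction / p_control   # the headline-friendly number
    return absolute_reduction, relative_reduction

# Hypothetical screening study: 5 vs. 4 deaths per 1,000 people.
abs_rr, rel_rr = risk_summary(deaths_control=5, deaths_treated=4, n=1000)
print(f"Absolute risk reduction: {abs_rr:.3%}")  # 0.100% -- i.e., 1 person in 1,000
print(f"Relative risk reduction: {rel_rr:.0%}")  # 20% -- the marketing number
```

A “double-tongued” report would headline the 20% while describing any harms in the 1-in-1,000 format.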

He also warns to beware of the scourge of “Double-Tonguing,” especially as to medical treatment.  Double-tonguing is a dubious practice or “trick to make the benefit of a drug (treatment) appear large and its harms smaller.  Typically, benefits are reported in relative risks (big numbers) and harms in absolute risks (small numbers).”

Another area where THE LAW OF SMALL NUMBERS comes into play is with what Professor Gigerenzer calls “Dread Risk Fear.”  Dread Risk Fear is an unconscious psychological reaction that “is triggered by real or imagined situations in which many people suddenly die, such as the 9/11 attacks.  Due to an evolved pattern of fear and flight, people tend to avoid these situations. In contrast, it is hard to make people fear situations in which as many or more people die distributed over the year, such as by driving and smoking. Dread-risk fear may have been adaptive in human history when our ancestors lived in small groups, where the sudden death of a substantial part threatened the survival of the rest.”

We previously explored the concept of Dread Risk Fear in the context of terrorism in the post The Realities, Scare-Tactics and Scapegoating of Terrorism.  As discussed therein, the Prospecting Lens tells us that the other cognitive biases that may come into play in Dread Risk Fear scenarios include: 
  • OVERLOOKING STATISTICS
  • THE POSSIBILITY EFFECT
  • OVERESTIMATING THE LIKELIHOOD OF RARE EVENTS
  • THE AVAILABILITY HEURISTIC
  • AVAILABILITY CASCADES
  • THE ANCHORING EFFECT
  • THE NARRATIVE FALLACY
  • THE HINDSIGHT ILLUSION
  • THE ILLUSION OF VALIDITY
A Dread Risk Fear can really get us going, especially when combined with self-selected information that plays to our CONFIRMATION BIAS.

Besides terrorism, Professor Gigerenzer notes the swine flu and mad cow disease scares as two examples of Dread Risk Fear creating mass hysteria that turned out to be unjustified.  Both resulted in extreme precautions and expense over something that turned out to be relatively trivial.  For mad cow, the predicted death toll was 100,000, but the actual number of deaths over 10 years was about 150.  For swine flu, the WHO estimated that 2 billion people would be infected.  In reality, fewer people died from flu that year than usual, and virtually all of them from plain influenza, not swine flu.

Professor Gigerenzer advises that Dread Risk Fear is so strong that sometimes the only way to combat it is with other powerful emotions.  He says:  “If reason conflicts with a strong emotion, don’t try to argue. Enlist a conflicting and stronger emotion.”

Taking a quick look through the Mimetic Lens, we can see how Dread Risk Fear can be used to incite crowds and mass hysteria as discussed in Crowd Sourcing And Political Campaigns:  A Riot Is An Ugly Old Habit.  When incited, such fears provide fertile ground for scapegoating and mob mentalities.

         *          *          *          *          *          *          *          *          *          *       
 
A second focus of Risk Savvy is the difference between “Risk” and “Uncertainty” and how best to make decisions in each kind of situation.  In this discussion, Professor Gigerenzer arrives at one of the important variations in how to use the Prospecting Lens.  But it takes the Fractal Lens to see it.

Briefly, a situation that is governed by Risk is relatively predictable.  “Risk” means that the chances or odds of various outcomes can be calculated and are known in advance or can be estimated through analysis of prior data to a high degree of certainty.

By contrast, a situation that is governed by Uncertainty is one where the odds are not known in advance and cannot be calculated accurately from prior data.  And what we know from looking through the Fractal Lens is that these situations usually involve complex or chaotic systems, like storms, earthquakes or financial markets.  While one can know with some certainty that such events will occur, one cannot predict exactly when or where they will occur, their magnitude, or precisely what the trigger will be.

People naturally seek and desire certainty, and are willing to believe in it even when it is not there.  As Professor Gigerenzer summarizes:

“The quest for certainty is an old human endeavor. Magical cults, soothsayers, and authority figures who know what’s right and wrong are its proponents. Similarly, for centuries many philosophers have been misled by looking for certainties where none exist, equating knowledge with certainty and belief with uncertainty, as John Dewey, the great pragmatist philosopher, pointed out. Today, modern technologies, from mathematical stock prediction methods to medical imaging machines, compete for the confidence promised by religion and authority. The quest for certainty is the biggest obstacle to becoming risk savvy. While there are things we can know, we must also be able to recognize when we cannot know something. We know almost for sure that Halley’s Comet will return in the year 2062, but we can rarely predict natural disasters and stock crashes. “Only fools, liars, and charlatans predict earthquakes,” said Charles Richter, namesake of the scale that measures their magnitude. Similarly, an analysis of thousands of forecasts by political and economic experts revealed that they rarely did better than dilettantes or dart-throwing chimps. . . .”

“The quest for certainty is a deep human desire. The dream that all thinking can be reduced to calculation is an old and beautiful one. In the seventeenth century, the philosopher Gottfried Wilhelm Leibniz envisioned establishing numbers or symbols for all ideas, which would enable determining the best answer for every question. That would put an end to all scholarly bickering: If a dispute arose, the contending parties could settle it quickly and peacefully by sitting down and saying, “Let’s calculate.” The only problem was that the great Leibniz never managed to find this universal calculus—nor has anyone else.”

Professor Gigerenzer labels the tendency to seek and believe in certainty even when there is some risk present as “the Zero Risk Illusion.”  We also know this through the Prospecting Lens as:

CONFIDENCE OVER DOUBT.  System 1 suppresses ambiguity and doubt by constructing coherent stories from mere scraps of data.  System 2 is our inner skeptic, weighing those stories, doubting them, and suspending judgment.  But because disbelief requires lots of work, System 2 sometimes fails to do its job and allows us to slide into certainty.  We have a bias toward believing.  Because our brains are pattern-recognition devices, we tend to attribute causality where none exists or to conclude erroneously that scientific tests (e.g., blood, fingerprints, DNA) are infallible.

But the more interesting questions concern the line between Risk and Uncertainty, which Professor Gigerenzer dubs “The Turkey Illusion,” after Nassim Taleb’s Thanksgiving turkey, which continues to believe it will be fed even as it is about to be slaughtered, because it could not know the potential outcomes involved -- and it wrongly assumed that the risk could be calculated.  What Professor Gigerenzer and the Fractal Lens tell us is that these two conditions are different enough to require a different general approach.

In the case of RISK:  “If risks are known, good decisions require logic and statistical thinking.” This is a straight application of System 2 thinking.  Risks can be calculated when these three conditions are present:  (1) there is low uncertainty – the environment is stable and predictable; (2) there are few alternatives, such that not too many risk factors have to be estimated; and (3) there is a high amount of data available to make these estimations.

In the case of UNCERTAINTY:  “If some risks are unknown, good decisions also require intuition and smart rules of thumb.”  This is an important variation to straight System 2 logical thinking.  What your System 2 should be saying in situations involving uncertainty is that more and more calculations will not lead to better decisions.  Rather, better decisions are more likely to come from selecting the correct heuristic or rule of thumb.

Professor Gigerenzer presents these rules in a way that suggests “sometimes System 1 thinking is better than System 2 thinking and that intuition trumps logical decision-making.”  However, even if one concludes that is the case, one would still have to use System 2 to recognize whether one is dealing with a Risk Situation or an Uncertainty Situation!  Thus, I did not see this as the great conflict he does, but more as a recognition that these types of situations require some forethought and planning.

For example, expert intuition can often be the right decision-making tool when time is of the essence and the expert has a great deal of experience in the decision to be made, like landing a plane without instruments or hitting a baseball off a major league pitcher.  Yet the Prospecting Lens already accounts for this exception when it cautions against having too much trust in expert intuitions:

TRUSTING EXPERT INTUITION.  “We are confident when the story we tell ourselves comes easily to mind, with no contradiction and no competing scenario. But ease and coherence do not guarantee that a belief held with confidence is true. The associative machine is set to suppress doubt and to evoke ideas and information that are compatible with the currently dominant story,” (page 239). Kahneman is skeptical of experts because they often overlook what they do not know. Kahneman trusts experts when two conditions are met: the expert is in an environment that is sufficiently regular to be predictable and the expert has learned these regularities through prolonged practice.

The last sentence addresses Professor Gigerenzer’s preference for expert intuition in certain circumstances.  Of course, the real difficulty in a situation involving Uncertainty is picking the correct rule of thumb or heuristic for the situation.  Professor Gigerenzer provides some examples:

Equal Weighting in Investing.  For deciding how much of a stock portfolio to invest in each particular stock, the heuristic of “1/N”, or equal weighting, has been shown to be superior to more complicated methods of diversification.  Thus, Harry Markowitz, who won a Nobel Prize for inventing the “mean-variance” model that answers this question mathematically, does not use the “mean-variance” method himself, but favors the 1/N heuristic.  The reason for this is that the “mean-variance” model and other complex methods are designed to work in a situation where Risk is defined, not in a world of Uncertainty where it is not.  When it comes to financial markets, one would need about 500 years’ worth of data to move from the Uncertainty scenario to the Risk scenario.

“The moral of the story is that in a world of risk that matches with the mathematical assumptions of the mean-variance portfolio, the calculations pay. But in the real world of investment, simple intuitive rules can be smarter. The same holds generally in uncertain worlds.”
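To see why estimation error favors the heuristic, here is a minimal sketch in Python (my illustration, not code from Risk Savvy): a plug-in mean-variance optimizer must estimate means and covariances from data, and with only a few years of history those estimates -- and hence the “optimal” weights -- swing wildly from sample to sample, while 1/N never moves:

```python
import numpy as np

rng = np.random.default_rng(0)
n_assets, n_months = 5, 60  # five years of monthly returns -- far short of 500 years

def mean_variance_weights(returns):
    """Plug-in Markowitz weights: solve Sigma * w = mu, then normalize to sum to 1."""
    mu = returns.mean(axis=0)              # estimated mean returns
    sigma = np.cov(returns, rowvar=False)  # estimated covariance matrix
    raw = np.linalg.solve(sigma, mu)
    return raw / raw.sum()

one_over_n = np.full(n_assets, 1 / n_assets)  # the heuristic: no estimation at all

# Three equally plausible histories of the same market (identical true parameters):
for trial in range(3):
    returns = rng.normal(0.01, 0.05, size=(n_months, n_assets))
    print(f"Sample {trial}: MV weights = {np.round(mean_variance_weights(returns), 2)}")
print(f"1/N weights (every sample):  {one_over_n}")
```

Since all five simulated assets are statistically identical, 1/N is in fact the true optimum here; the mean-variance weights differ from it only by estimation noise.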

The mathematical reason for this is that complex models have more variance, and therefore require large amounts of data to be accurate – much more data than is often available.  Professor Gigerenzer identifies three concepts to help decide when to use a heuristic or rule of thumb (Uncertainty Scenario) and when to calculate using complex methods (Risk Scenario):

“First, the more uncertainty, the more we should make it simple. The less uncertainty, the more complex it should be. The stock market is highly uncertain in the sense that it is extremely unpredictable. That speaks for a simple method like 1/N.  Second, the more alternatives, the more we should simplify; the fewer, the more complex it can be. The reason is that complex methods need to estimate risk factors, and more alternatives mean that more factors need to be estimated, which leads to more estimation errors being made. In contrast, 1/N is not affected by more alternatives because it does not need to make estimates from past data. Finally, the more past data there are, the more beneficial for the complex methods.  That is why the Markowitz calculations pay off with 500 years of stock data…”

Later he addresses the issue more explicitly as the “Bias-variance Dilemma.”  Essentially, the more parameters that are being analyzed, the greater the potential variance and the more likely a rule of thumb will outperform complex calculations.  Specifically:

“Bias-variance dilemma. A statistical theory that explains less-is-more effects; that is, when and why simple heuristics can lead to more accurate predictions than more complex methods. The key idea is that the total error consists of three components: Total error = (bias^2) + variance + noise.  Noise is irreducible (measurement) error, while the other two kinds of error can be influenced. Bias is the difference between mean estimate and true state, and variance is the variability (instability) of individual estimates (based on different samples) around the mean estimate. For instance, 1/N has no free parameters and therefore has bias only (it makes the same allocation independent of specific samples). Models with many free parameters tend to have less bias, but more variance. Too much variance is one reason why “less can be more.””
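As a minimal sketch of the dilemma (my illustration of the general idea, not an example from the book), one can fit a simple straight line and a flexible degree-9 polynomial to many small noisy samples of the same curve and measure the two error components directly:

```python
import numpy as np

rng = np.random.default_rng(1)
x_grid = np.linspace(0, 3, 25)  # points where we judge the predictions

def fit_predict(degree, n_samples=200, n_points=10, noise=0.3):
    """Fit a polynomial of the given degree to many small samples of noisy sin(x)."""
    preds = np.empty((n_samples, x_grid.size))
    for i in range(n_samples):
        x = rng.uniform(0, 3, n_points)
        y = np.sin(x) + rng.normal(0, noise, n_points)
        coeffs = np.polyfit(x, y, degree)  # high degrees may warn; fine for a sketch
        preds[i] = np.polyval(coeffs, x_grid)
    return preds

for degree in (1, 9):
    preds = fit_predict(degree)
    bias_sq = np.mean((preds.mean(axis=0) - np.sin(x_grid)) ** 2)
    variance = np.mean(preds.var(axis=0))
    print(f"degree {degree}: bias^2 = {bias_sq:.3f}, variance = {variance:.3f}")
```

The straight line is biased (it cannot bend like a sine) but stable; the degree-9 fit has little bias on average yet enormous variance with only ten data points per sample, so its total error is worse -- “less is more.”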
 
Looking through the Fractal Lens, the more varied the inputs and potential outcomes, the more likely it is that a heuristic will yield a better (and safer or more robust) decision than a data-optimized plan.

But again, the problem then often becomes deciding what rule of thumb to choose.  Just grasping at any old System 1 heuristic is not likely to yield superior results to some form of calculation.  Professor Gigerenzer touches on some other rules of thumb that tend to work well in certain types of Uncertainty Scenarios:

The 37% Rule (a/k/a the Optimal Stopping Problem).  When faced with a series of choices where one must pick one option and cannot go back, the best strategy is to look at (and pass over) the first 37%, then pick the first subsequent option that is better than all those seen so far, settling for the last one if none is.  This can be used whether hiring an assistant or choosing a spouse, although I suggest you don't reveal that to your husband or wife.
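A minimal simulation in Python (my illustration, not from the book) shows where the rule's name comes from: following it identifies the single best candidate roughly 37% of the time, which is the best any no-going-back strategy can achieve:

```python
import random

def pick_with_37_rule(candidates):
    """Pass over the first 37%, then take the first candidate better than all seen."""
    cutoff = int(len(candidates) * 0.37)
    best_seen = max(candidates[:cutoff]) if cutoff else float("-inf")
    for value in candidates[cutoff:]:
        if value > best_seen:
            return value
    return candidates[-1]  # forced to settle for the last one

n, trials, hits = 100, 20_000, 0
for _ in range(trials):
    candidates = random.sample(range(n * 10), n)  # 100 distinct scores, random order
    if pick_with_37_rule(candidates) == max(candidates):
        hits += 1
print(f"Picked the single best candidate in {hits / trials:.1%} of trials")  # ~37%
```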

Satisficing.  This is also a choice mechanism for when there are many alternatives available and one would need infinite (or at least too much) time to examine them all.  The rules are:  (1) set your level of aspiration before looking – i.e., decide what is “good enough”; (2) choose the first alternative you find that meets that level.
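In code, the whole mechanism is a few lines; here is a minimal sketch (the apartment scores are hypothetical) showing how fixing the aspiration level first turns an open-ended comparison into an early-stopping search:

```python
def satisfice(options, good_enough):
    """Return the first option meeting the pre-set aspiration level, else None."""
    for option in options:
        if option >= good_enough:
            return option  # stop searching immediately -- no ranking of the rest
    return None

apartments = [62, 70, 85, 90, 78]             # scores in the order you view them
print(satisfice(apartments, good_enough=80))  # -> 85; the 90 is never even visited
```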

Parenting Rule of Thumb.  Try to divide your time equally among your children.  Similar to “1/N” as a principle for diversifying among stocks or funds.

Safe Banking Rule of Thumb.  This one is from Mervyn King, the former governor of the Bank of England:  “Don’t use leverage ratios above 10:1.”  (Note that the average leverage ratio of U.S. banks exceeded 10:1 in the last quarter of 2018 for the first time since the Financial Crisis of 2008.)

Simple Business Decision-Making.  When faced with myriad data about the future behavior of customers or potential customers, often the most predictive methods are the simplest.  These include the following (a sketch of the first rule follows this list):
  • The Recency of Purchase (or Hiatus) rule: customers who have not purchased in the last nine months are much less likely to respond to new promotions and can be classified as “inactive”, whereas “active” customers are most likely to buy again.
  • The All Reasons are Equal (or Tallying) rule: every reason known to have a causal connection to sales should be equally weighted.
  • The Take the Best and Ignore the Rest rule: the single most causal or correlated reason is enough to pursue a particular customer or opportunity.
Each of these simple rules has been found to outperform more complex data analysis in a variety of business and sales contexts.
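As promised above, here is a minimal sketch of the Hiatus rule in Python (my illustration; the dates are hypothetical and the 9 × 30-day approximation of nine months is mine):

```python
from datetime import date, timedelta

HIATUS = timedelta(days=9 * 30)  # roughly nine months

def classify(last_purchase: date, today: date) -> str:
    """One comparison, no model: past the hiatus means 'inactive'."""
    return "inactive" if today - last_purchase > HIATUS else "active"

today = date(2019, 1, 30)
print(classify(date(2018, 11, 2), today))  # active   -- bought about 3 months ago
print(classify(date(2018, 2, 14), today))  # inactive -- well past the hiatus
```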

         *          *          *          *          *          *          *          *          *          *

Finally, Risk Savvy also peers a bit more through the Mimetic Lens, noting that many of our fears and avoidance mechanisms are merely products of social imitation, or mimicking the fears of those around us.  For example, North Americans are often fearful of eating mushrooms and would never use lit candles on Christmas trees, but these practices are common and not feared in Europe.  The level of fear is reversed when it comes to firearms.  Raw milk is favored in France and Italy, but banned in much of North America.  Genetically modified crops are allowed in America, but banned and deemed morally unacceptable in much of Europe.  Radiation is greatly feared in Germany and Austria, but not so much in France or the U.S.

This type of social-fears mimicry also contributes to “Defensive Decision-Making”.  This type of decision-making is not driven by trying to be correct or accurate, but by trying not to vary too much from the crowd.  Thus, the most common prediction that financial market forecasters make is that this year will be like last year and not too different from what the consensus view is likely to be.  This “safety-in-numbers” decision-making leads to worthless forecasts.  “Fear of personal responsibility creates a market for worthless products delivered by high-paid experts.”  (The other would-be financial-guru trick is to make the same dire prediction every year and not keep records when you are wrong – you are bound to be right sometime.)

For more on becoming more Risk Literate with Professor Gigerenzer, I leave you with this TEDx Talk: