
What to trust in a "post-truth" world

  • 0:02 - 0:05
    Belle Gibson was a happy young Australian.
  • 0:05 - 0:08
    She lived in Perth,
    and she loved skateboarding.
  • 0:08 - 0:13
    But in 2009, Belle learned that she had
    brain cancer and four months to live.
  • 0:13 - 0:17
    Two months of chemo
    and radiotherapy had no effect.
  • 0:17 - 0:19
    But Belle was determined.
  • 0:19 - 0:21
    She'd been a fighter her whole life.
  • 0:21 - 0:24
    From age six, she had to cook
    for her brother, who had autism,
  • 0:24 - 0:27
    and her mother,
    who had multiple sclerosis.
  • 0:27 - 0:28
    Her father was out of the picture.
  • 0:29 - 0:32
    So Belle fought, with exercise,
    with meditation
  • 0:32 - 0:35
    and by ditching meat
    for fruit and vegetables.
  • 0:35 - 0:38
    And she made a complete recovery.
  • 0:39 - 0:40
    Belle's story went viral.
  • 0:40 - 0:44
    It was tweeted, blogged about,
    shared and reached millions of people.
  • 0:44 - 0:47
    It showed the benefits of shunning
    traditional medicine
  • 0:47 - 0:49
    for diet and exercise.
  • 0:49 - 0:54
    In August 2013, Belle launched
    a healthy eating app,
  • 0:54 - 0:55
    The Whole Pantry,
  • 0:55 - 0:59
    downloaded 200,000 times
    in the first month.
  • 1:01 - 1:04
    But Belle's story was a lie.
  • 1:05 - 1:07
    Belle never had cancer.
  • 1:08 - 1:12
    People shared her story
    without ever checking if it was true.
  • 1:13 - 1:16
    This is a classic example
    of confirmation bias.
  • 1:16 - 1:21
    We accept a story uncritically
    if it confirms what we'd like to be true.
  • 1:21 - 1:24
    And we reject any story
    that contradicts it.
  • 1:25 - 1:27
    How often do we see this
  • 1:27 - 1:30
    in the stories
    that we share and we ignore?
  • 1:30 - 1:34
    In politics, in business,
    in health advice.
  • 1:35 - 1:39
    Oxford Dictionaries'
    word of the year for 2016 was "post-truth."
  • 1:40 - 1:43
    And the recognition that we now live
    in a post-truth world
  • 1:43 - 1:47
    has led to a much needed emphasis
    on checking the facts.
  • 1:47 - 1:49
    But the punch line of my talk
  • 1:49 - 1:52
    is that just checking
    the facts is not enough.
  • 1:52 - 1:55
    Even if Belle's story were true,
  • 1:55 - 1:57
    it would be just as irrelevant.
  • 1:58 - 2:00
    Why?
  • 2:00 - 2:03
    Well, let's look at one of the most
    fundamental techniques in statistics.
  • 2:03 - 2:06
    It's called Bayesian inference.
  • 2:06 - 2:09
    And the very simple version is this:
  • 2:09 - 2:12
    We care about "does the data
    support the theory?"
  • 2:13 - 2:17
    Does the data increase our belief
    that the theory is true?
  • 2:18 - 2:22
    But instead, we end up asking,
    "Is the data consistent with the theory?"
  • 2:23 - 2:25
    But being consistent with the theory
  • 2:25 - 2:28
    does not mean that the data
    supports the theory.
  • 2:29 - 2:30
    Why?
  • 2:30 - 2:34
    Because of a crucial
    but forgotten third term --
  • 2:34 - 2:37
    the data could also be consistent
    with rival theories.
  • 2:38 - 2:43
    But due to confirmation bias,
    we never consider the rival theories,
  • 2:43 - 2:46
    because we're so protective
    of our own pet theory.
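The Bayesian point above can be made concrete with a small sketch (the numbers here are illustrative, not from the talk): data that is "consistent with" a theory only raises our belief in it if that data is *less* likely under the rival theories.

```python
# Bayes' rule for one theory vs. a single rival theory.
# If the data is equally likely under both, observing it moves nothing.

def posterior(prior, likelihood, rival_likelihood):
    """Probability the theory is true after seeing the data."""
    evidence = prior * likelihood + (1 - prior) * rival_likelihood
    return prior * likelihood / evidence

prior = 0.5  # start undecided between the theory and its rival

# Case 1: the data is just as likely under the rival theory.
# The posterior equals the prior -- consistency alone taught us nothing.
unchanged = posterior(prior, likelihood=0.8, rival_likelihood=0.8)

# Case 2: the same data is much less likely under the rival theory.
# Now the observation genuinely supports the theory.
updated = posterior(prior, likelihood=0.8, rival_likelihood=0.2)

print(unchanged)  # 0.5
print(updated)    # 0.8
```

Forgetting the rival-theory term is exactly what lets a story like Belle's feel like support: it is consistent with "diet cures cancer," but just as consistent with "she was misdiagnosed."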
  • 2:47 - 2:49
    Now, let's look at this for Belle's story.
  • 2:49 - 2:53
    Well, we care about:
    Does Belle's story support the theory
  • 2:53 - 2:55
    that diet cures cancer?
  • 2:55 - 2:57
    But instead, we end up asking,
  • 2:57 - 3:01
    "Is Belle's story consistent
    with diet curing cancer?"
  • 3:02 - 3:03
    And the answer is yes.
  • 3:04 - 3:08
    If diet did cure cancer,
    we'd see stories like Belle's.
  • 3:09 - 3:12
    But even if diet did not cure cancer,
  • 3:12 - 3:14
    we'd still see stories like Belle's.
  • 3:15 - 3:20
    A single story in which
    a patient apparently self-cured
  • 3:20 - 3:23
    just due to being misdiagnosed
    in the first place.
  • 3:24 - 3:27
    Just like, even if smoking
    was bad for your health,
  • 3:27 - 3:30
    you'd still see one smoker
    who lived until 100.
  • 3:31 - 3:32
    (Laughter)
  • 3:32 - 3:35
    Just like, even if education
    was good for your income,
  • 3:35 - 3:39
    you'd still see one multimillionaire
    who didn't go to university.
  • 3:39 - 3:44
    (Laughter)
  • 3:44 - 3:48
    So the biggest problem with Belle's story
    is not that it was false.
  • 3:48 - 3:51
    It's that it's only one story.
  • 3:51 - 3:55
    There might be thousands of other stories
    where diet alone failed,
  • 3:55 - 3:57
    but we never hear about them.
  • 3:58 - 4:02
    We share the outlier cases
    because they are new,
  • 4:02 - 4:04
    and therefore they are news.
  • 4:05 - 4:07
    We never share the ordinary cases.
  • 4:07 - 4:10
    They're too ordinary,
    they're what normally happens.
  • 4:11 - 4:14
    And that's the true
    99 percent that we ignore.
  • 4:14 - 4:17
    Just like in society, you can't just
    listen to the one percent,
  • 4:17 - 4:18
    the outliers,
  • 4:18 - 4:21
    and ignore the 99 percent, the ordinary.
  • 4:22 - 4:25
    Because that's the second example
    of confirmation bias.
  • 4:25 - 4:28
    We accept a fact as data.
  • 4:29 - 4:33
    The biggest problem is not
    that we live in a post-truth world;
  • 4:33 - 4:37
    it's that we live in a post-data world.
  • 4:38 - 4:42
    We prefer a single story to tons of data.
  • 4:43 - 4:46
    Now, stories are powerful,
    they're vivid, they bring it to life.
  • 4:46 - 4:48
    They tell you to start
    every talk with a story.
  • 4:48 - 4:49
    I did.
  • 4:50 - 4:54
    But a single story
    is meaningless and misleading
  • 4:54 - 4:57
    unless it's backed up by large-scale data.
  • 4:59 - 5:02
    But even if we had large-scale data,
  • 5:02 - 5:04
    that might still not be enough.
  • 5:04 - 5:07
    Because it could still be consistent
    with rival theories.
  • 5:08 - 5:09
    Let me explain.
  • 5:10 - 5:13
    A classic study
    by psychologist Peter Wason
  • 5:13 - 5:15
    gives you a set of three numbers
  • 5:15 - 5:18
    and asks you to think of the rule
    that generated them.
  • 5:19 - 5:23
    So if you're given 2, 4, 6,
  • 5:23 - 5:24
    what's the rule?
  • 5:25 - 5:28
    Well, most people would think,
    it's successive even numbers.
  • 5:29 - 5:30
    How would you test it?
  • 5:30 - 5:34
    Well, you'd propose other sets
    of successive even numbers:
  • 5:34 - 5:37
    4, 6, 8 or 12, 14, 16.
  • 5:38 - 5:40
    And Peter would say these sets also work.
  • 5:41 - 5:44
    But knowing that these sets also work,
  • 5:44 - 5:48
    knowing that perhaps hundreds of sets
    of successive even numbers also work,
  • 5:49 - 5:50
    tells you nothing.
  • 5:51 - 5:54
    Because this is still consistent
    with rival theories.
  • 5:55 - 5:58
    Perhaps the rule
    is any three even numbers.
  • 5:59 - 6:01
    Or any three increasing numbers.
  • 6:02 - 6:05
    And that's the third example
    of confirmation bias:
  • 6:05 - 6:09
    accepting data as evidence,
  • 6:09 - 6:12
    even if it's consistent
    with rival theories.
  • 6:13 - 6:16
    Data is just a collection of facts.
  • 6:16 - 6:21
    Evidence is data that supports
    one theory and rules out others.
  • 6:23 - 6:25
    So the best way to support your theory
  • 6:25 - 6:29
    is actually to try to disprove it,
    to play devil's advocate.
  • 6:29 - 6:34
    So test something, like 4, 12, 26.
  • 6:35 - 6:39
    If you got a yes to that,
    that would disprove your theory
  • 6:39 - 6:41
    of successive even numbers.
  • 6:41 - 6:43
    Yet this test is powerful,
  • 6:43 - 6:48
    because if you got a no, it would rule out
    "any three even numbers"
  • 6:48 - 6:50
    and "any three increasing numbers."
  • 6:50 - 6:53
    It would rule out the rival theories,
    but not rule out yours.
  • 6:54 - 6:59
    But most people are too afraid
    of testing the 4, 12, 26,
  • 6:59 - 7:03
    because they don't want to get a yes
    and prove their pet theory to be wrong.
  • 7:05 - 7:10
    Confirmation bias is not only
    about failing to search for new data,
  • 7:10 - 7:14
    but it's also about misinterpreting
    data once you receive it.
  • 7:14 - 7:18
    And this applies outside the lab
    to important, real-world problems.
  • 7:18 - 7:21
    Indeed, Thomas Edison famously said,
  • 7:21 - 7:23
    "I have not failed,
  • 7:23 - 7:27
    I have found 10,000 ways that won't work."
  • 7:28 - 7:31
    Finding out that you're wrong
  • 7:31 - 7:34
    is the only way to find out what's right.
  • 7:35 - 7:38
    Say you're a university
    admissions director
  • 7:38 - 7:40
    and your theory is that only
    students with good grades
  • 7:40 - 7:42
    from rich families do well.
  • 7:42 - 7:45
    So you only let in such students.
  • 7:45 - 7:46
    And they do well.
  • 7:46 - 7:49
    But that's also consistent
    with the rival theory.
  • 7:50 - 7:52
    Perhaps all students
    with good grades do well,
  • 7:52 - 7:54
    rich or poor.
  • 7:54 - 7:58
    But you never test that theory
    because you never let in poor students
  • 7:58 - 8:01
    because you don't want to be proven wrong.
  • 8:03 - 8:04
    So, what have we learned?
  • 8:05 - 8:09
    A story is not fact,
    because it may not be true.
  • 8:09 - 8:12
    A fact is not data,
  • 8:12 - 8:16
    it may not be representative
    if it's only one data point.
  • 8:17 - 8:19
    And data is not evidence --
  • 8:19 - 8:23
    it may not be supportive
    if it's consistent with rival theories.
  • 8:24 - 8:26
    So, what do you do?
  • 8:27 - 8:30
    When you're at
    the inflection points of life,
  • 8:30 - 8:33
    deciding on a strategy for your business,
  • 8:33 - 8:35
    a parenting technique for your child
  • 8:35 - 8:38
    or a regimen for your health,
  • 8:38 - 8:41
    how do you ensure
    that you don't have a story
  • 8:41 - 8:43
    but you have evidence?
  • 8:44 - 8:46
    Let me give you three tips.
  • 8:47 - 8:51
    The first is to actively seek
    other viewpoints.
  • 8:51 - 8:54
    Read and listen to people
    you flagrantly disagree with.
  • 8:54 - 8:58
    Ninety percent of what they say
    may be wrong, in your view.
  • 8:59 - 9:01
    But what if 10 percent is right?
  • 9:02 - 9:03
    As Aristotle said,
  • 9:03 - 9:06
    "The mark of an educated man
  • 9:06 - 9:09
    is the ability to entertain a thought
  • 9:09 - 9:11
    without necessarily accepting it."
  • 9:13 - 9:15
    Surround yourself with people
    who challenge you,
  • 9:15 - 9:19
    and create a culture
    that actively encourages dissent.
  • 9:19 - 9:22
    Some banks suffered from groupthink,
  • 9:22 - 9:26
    where staff were too afraid to challenge
    management's lending decisions,
  • 9:26 - 9:28
    contributing to the financial crisis.
  • 9:29 - 9:33
    In a meeting, appoint someone
    to be devil's advocate
  • 9:33 - 9:35
    against your pet idea.
  • 9:36 - 9:38
    And don't just hear another viewpoint --
  • 9:38 - 9:40
    listen to it, as well.
  • 9:41 - 9:44
    As author Stephen Covey said,
  • 9:44 - 9:47
    "Listen with the intent to understand,
  • 9:47 - 9:49
    not the intent to reply."
  • 9:50 - 9:53
    A dissenting viewpoint
    is something to learn from
  • 9:53 - 9:55
    not to argue against.
  • 9:56 - 10:00
    Which takes us to the other
    forgotten terms in Bayesian inference.
  • 10:00 - 10:03
    Because data allows you to learn,
  • 10:03 - 10:06
    but learning is only relative
    to a starting point.
  • 10:06 - 10:12
    If you started with complete certainty
    that your pet theory must be true,
  • 10:12 - 10:14
    then your view won't change --
  • 10:14 - 10:16
    regardless of what data you see.
  • 10:17 - 10:21
    Only if you are truly open
    to the possibility of being wrong
  • 10:21 - 10:22
    can you ever learn.
  • 10:24 - 10:26
    As Leo Tolstoy wrote,
  • 10:26 - 10:28
    "The most difficult subjects
  • 10:28 - 10:31
    can be explained to the most
    slow-witted man
  • 10:31 - 10:34
    if he has not formed
    any idea of them already.
  • 10:34 - 10:36
    But the simplest thing
  • 10:36 - 10:39
    cannot be made clear
    to the most intelligent man
  • 10:39 - 10:43
    if he is firmly persuaded
    that he knows already."
  • 10:44 - 10:48
    Tip number two is "listen to experts."
  • 10:49 - 10:53
    Now, that's perhaps the most
    unpopular advice that I could give you.
  • 10:53 - 10:54
    (Laughter)
  • 10:54 - 10:59
    British politician Michael Gove
    famously said that people in this country
  • 10:59 - 11:01
    have had enough of experts.
  • 11:02 - 11:05
    A recent poll showed that more people
    would trust their hairdresser --
  • 11:05 - 11:08
    (Laughter)
  • 11:08 - 11:09
    or the man on the street
  • 11:09 - 11:14
    than they would leaders of businesses,
    the health service and even charities.
  • 11:14 - 11:18
    So we respect a teeth-whitening formula
    discovered by a mom,
  • 11:18 - 11:21
    or we listen to an actress's view
    on vaccination.
  • 11:21 - 11:24
    We like people who tell it like it is,
    who go with their gut,
  • 11:24 - 11:26
    and we call them authentic.
  • 11:27 - 11:30
    But gut feel can only get you so far.
  • 11:31 - 11:35
    Gut feel would tell you never to give
    water to a baby with diarrhea,
  • 11:35 - 11:38
    because it would just
    flow out the other end.
  • 11:38 - 11:40
    Expertise tells you otherwise.
  • 11:41 - 11:45
    You'd never trust your surgery
    to the man on the street.
  • 11:45 - 11:48
    You'd want an expert
    who spent years doing surgery
  • 11:48 - 11:50
    and knows the best techniques.
  • 11:52 - 11:55
    But that should apply
    to every major decision.
  • 11:55 - 12:00
    Politics, business, health advice
  • 12:00 - 12:03
    require expertise, just like surgery.
  • 12:04 - 12:08
    So then, why are experts so mistrusted?
  • 12:09 - 12:12
    Well, one reason
    is they're seen as out of touch.
  • 12:12 - 12:16
    A millionaire CEO couldn't possibly
    speak for the man on the street.
  • 12:17 - 12:21
    But true expertise is founded on evidence.
  • 12:21 - 12:24
    And evidence stands up
    for the man on the street
  • 12:24 - 12:26
    and against the elites.
  • 12:26 - 12:29
    Because evidence forces you to prove it.
  • 12:30 - 12:34
    Evidence prevents the elites
    from imposing their own view
  • 12:34 - 12:35
    without proof.
  • 12:37 - 12:39
    A second reason
    why experts are not trusted
  • 12:39 - 12:42
    is that different experts
    say different things.
  • 12:42 - 12:47
    For every expert who claimed that leaving
    the EU would be bad for Britain,
  • 12:47 - 12:49
    another expert claimed it would be good.
  • 12:49 - 12:53
    Half of these so-called experts
    will be wrong.
  • 12:54 - 12:58
    And I have to admit that most papers
    written by experts are wrong.
  • 12:59 - 13:02
    Or at best, make claims that
    the evidence doesn't actually support.
  • 13:03 - 13:06
    So we can't just take
    an expert's word for it.
  • 13:07 - 13:13
    In November 2016, a study
    on executive pay hit national headlines.
  • 13:13 - 13:16
    Even though none of the newspapers
    who covered the study
  • 13:16 - 13:18
    had even seen the study.
  • 13:19 - 13:20
    It wasn't even out yet.
  • 13:21 - 13:23
    They just took the author's word for it,
  • 13:24 - 13:25
    just like with Belle.
  • 13:26 - 13:29
    Nor does it mean that we can
    just handpick any study
  • 13:29 - 13:31
    that happens to support our viewpoint --
  • 13:31 - 13:33
    that would, again, be confirmation bias.
  • 13:33 - 13:35
    Nor does it mean
    that if seven studies show A
  • 13:35 - 13:37
    and three show B,
  • 13:37 - 13:39
    that A must be true.
  • 13:39 - 13:42
    What matters is the quality,
  • 13:42 - 13:45
    and not the quantity of expertise.
  • 13:46 - 13:48
    So we should do two things.
  • 13:48 - 13:53
    First, we should critically examine
    the credentials of the authors.
  • 13:54 - 13:58
    Just like you'd critically examine
    the credentials of a potential surgeon.
  • 13:58 - 14:02
    Are they truly experts in the matter,
  • 14:02 - 14:04
    or do they have a vested interest?
  • 14:05 - 14:07
    Second, we should pay particular attention
  • 14:07 - 14:11
    to papers published
    in the top academic journals.
  • 14:12 - 14:16
    Now, academics are often accused
    of being detached from the real world.
  • 14:17 - 14:20
    But this detachment gives you
    years to spend on a study.
  • 14:20 - 14:22
    To really nail down a result,
  • 14:22 - 14:24
    to rule out those rival theories,
  • 14:24 - 14:27
    and to distinguish correlation
    from causation.
  • 14:28 - 14:32
    And academic journals involve peer review,
  • 14:32 - 14:34
    where a paper is rigorously scrutinized
  • 14:34 - 14:35
    (Laughter)
  • 14:35 - 14:37
    by the world's leading minds.
  • 14:38 - 14:41
    The better the journal,
    the higher the standard.
  • 14:41 - 14:46
    The most elite journals
    reject 95 percent of papers.
  • 14:47 - 14:51
    Now, academic evidence is not everything.
  • 14:51 - 14:54
    Real-world experience is critical, also.
  • 14:54 - 14:58
    And peer review is not perfect,
    mistakes are made.
  • 14:59 - 15:01
    But it's better to go
    with something checked
  • 15:01 - 15:02
    than something unchecked.
  • 15:03 - 15:06
    If we latch onto a study
    because we like the findings,
  • 15:06 - 15:10
    without considering who it's by
    or whether it's even been vetted,
  • 15:10 - 15:13
    there is a massive chance
    that that study is misleading.
  • 15:15 - 15:17
    And those of us who claim to be experts
  • 15:17 - 15:21
    should recognize the limitations
    of our analysis.
  • 15:21 - 15:26
    Very rarely is it possible to prove
    or predict something with certainty,
  • 15:26 - 15:31
    yet it's so tempting to make
    a sweeping, unqualified statement.
  • 15:31 - 15:35
    It's easier to turn into a headline
    or to be tweeted in 140 characters.
  • 15:36 - 15:40
    But even evidence may not be proof.
  • 15:40 - 15:45
    It may not be universal,
    it may not apply in every setting.
  • 15:45 - 15:50
    So don't say, "Red wine
    causes longer life,"
  • 15:50 - 15:55
    when the evidence is only that red wine
    is correlated with longer life.
  • 15:55 - 15:58
    And only then in people
    who exercise as well.
  • 16:00 - 16:04
    Tip number three
    is "pause before sharing anything."
  • 16:05 - 16:08
    The Hippocratic oath says,
    "First, do no harm."
  • 16:09 - 16:12
    What we share is potentially contagious,
  • 16:12 - 16:16
    so be very careful about what we spread.
  • 16:17 - 16:20
    Our goal should not be
    to get likes or retweets.
  • 16:20 - 16:24
    Otherwise, we only share the consensus;
    we don't challenge anyone's thinking.
  • 16:24 - 16:27
    Otherwise, we only share what sounds good,
  • 16:27 - 16:29
    regardless of whether it's evidence.
  • 16:30 - 16:33
    Instead, we should ask the following:
  • 16:34 - 16:36
    If it's a story, is it true?
  • 16:36 - 16:39
    If it's true, is it backed up
    by large-scale evidence?
  • 16:39 - 16:41
    If it is, who is it by,
    what are their credentials?
  • 16:41 - 16:44
    Is it published,
    how rigorous is the journal?
  • 16:45 - 16:47
    And ask yourself
    the million-dollar question:
  • 16:48 - 16:52
    If the same study was written by the same
    authors with the same credentials
  • 16:53 - 16:55
    but found the opposite results,
  • 16:56 - 16:59
    would you still be willing
    to believe it and to share it?
  • 17:01 - 17:04
    Treating any problem --
  • 17:04 - 17:08
    a nation's economic problem
    or an individual's health problem,
  • 17:08 - 17:09
    is difficult.
  • 17:09 - 17:14
    So we must ensure that we have
    the very best evidence to guide us.
  • 17:14 - 17:17
    Only if it's true can it be fact.
  • 17:18 - 17:20
    Only if it's representative
    can it be data.
  • 17:21 - 17:24
    Only if it's supportive
    can it be evidence.
  • 17:24 - 17:29
    And only with evidence
    can we move from a post-truth world
  • 17:30 - 17:31
    to a pro-truth world.
  • 17:32 - 17:34
    Thank you very much.
  • 17:34 - 17:35
    (Applause)
Title:
What to trust in a "post-truth" world
Speaker:
Alex Edmans
Description:

Only if you are truly open to the possibility of being wrong can you ever learn, says researcher Alex Edmans. In an insightful talk, he explores how confirmation bias -- the tendency to only accept information that supports your personal beliefs -- can lead you astray on social media, in politics and beyond, and offers three practical tools for finding evidence you can actually trust. (Hint: appoint someone to be the devil's advocate in your life.)

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
17:47
