
To stop fake news, challenge your own views first | Lionel Page | TEDxQUT

  • 0:09 - 0:11
    Today, in the palm of your hand,
  • 0:11 - 0:14
    you have access to a world of information.
  • 0:15 - 0:18
    You can reach a multitude of news sources
  • 0:18 - 0:21
    and exchange your views
    with a wide range of people
  • 0:21 - 0:22
    all over the world.
  • 0:23 - 0:27
    This new reality should allow us
    to share our wisdom,
  • 0:27 - 0:30
    to communicate, to understand each other
  • 0:31 - 0:34
    and to become more understanding
    of our differences,
  • 0:34 - 0:36
    more tolerant of our differences.
  • 0:37 - 0:41
    But 2016 seems to have replaced
    the information age
  • 0:41 - 0:43
    with the post-truth era.
  • 0:44 - 0:46
    With Brexit and the Trump election,
  • 0:46 - 0:48
    we have discovered
  • 0:48 - 0:52
    that more communication
    doesn't mean more information.
  • 0:54 - 0:56
    Let me show you what I mean.
  • 0:59 - 1:04
    Here is a picture of the
    Trump inauguration ceremony in 2017,
  • 1:04 - 1:08
    and the same picture of the
    Obama inauguration ceremony in 2009.
  • 1:09 - 1:12
    The White House declared
    that the Trump ceremony
  • 1:12 - 1:14
    had been the largest in history.
  • 1:17 - 1:19
    Did you know
  • 1:19 - 1:22
    that 28% of Trump supporters
  • 1:22 - 1:26
    said they believed there were
    as many people, if not more,
  • 1:26 - 1:28
    at the Trump ceremony?
  • 1:28 - 1:30
    30% said they didn't know.
    They were not sure.
  • 1:31 - 1:34
    And 41%, that is less than half,
  • 1:34 - 1:38
    said that they disagreed
    with the White House statement.
  • 1:39 - 1:43
    When you see these pictures,
    these views may seem crazy.
  • 1:44 - 1:47
    So how is it that people can believe
    something so clearly untrue?
  • 1:48 - 1:49
    But wait a minute.
  • 1:49 - 1:53
    Are you sure that you would never
    believe something so clearly untrue?
  • 1:54 - 1:55
    Think about it.
  • 1:55 - 1:58
    These pictures are not proof.
  • 1:58 - 2:00
    They could have been photoshopped.
  • 2:00 - 2:02
    They could have been swapped.
  • 2:03 - 2:05
    So when somebody raises
    these doubts to you,
  • 2:05 - 2:08
    you will have to use your own judgment
  • 2:08 - 2:10
    to weigh the evidence
    and make up your mind.
  • 2:11 - 2:14
    So now suppose that you were
    a Trump supporter.
  • 2:14 - 2:16
    Are you sure
  • 2:16 - 2:18
    that you would not give any credence
  • 2:18 - 2:19
    to somebody raising these doubts?
  • 2:20 - 2:22
    Are you sure that you
    would never entertain
  • 2:22 - 2:25
    that perhaps there were more people
    at the Trump ceremony,
  • 2:25 - 2:27
    and that these pictures are not proof?
  • 2:28 - 2:30
    Today, I want to put to you
  • 2:30 - 2:34
    that the cause of the fake news success
    lies primarily with us.
  • 2:34 - 2:36
    Fake news works
  • 2:36 - 2:38
    because we are willing to believe it.
  • 2:38 - 2:41
    And because we are willing to believe it,
    we lie to ourselves.
  • 2:43 - 2:45
    If we want a sound public debate,
  • 2:45 - 2:49
    we need to stop lying to ourselves
    when we engage with the news.
  • 2:51 - 2:54
    Let's consider an ideal public debate,
  • 2:54 - 2:57
    and think about it as a battle of ideas.
  • 2:58 - 3:00
    All ideas are voiced and debated.
  • 3:01 - 3:05
    A good idea is convincing,
    and it wins over less convincing ideas.
  • 3:06 - 3:08
    The philosopher Karl Popper
  • 3:08 - 3:11
    said that this process
    should lead public debate
  • 3:11 - 3:13
    to select the best ideas.
  • 3:14 - 3:18
    Unconvincing ideas disappear,
    and only the good ones survive.
  • 3:19 - 3:23
    And science seems like a perfect example
  • 3:23 - 3:26
    where public debate
    leads to selection of the best ideas.
  • 3:27 - 3:29
    With this view in mind,
  • 3:29 - 3:33
    the Internet should have
    a positive effect on public debate.
  • 3:33 - 3:37
    On the Internet, ideas are free
    to be voiced and criticized.
  • 3:38 - 3:42
    Good ideas should be able to convince
    more people and spread around,
  • 3:43 - 3:44
    and bad ideas are abandoned.
  • 3:45 - 3:47
    But when you look,
  • 3:47 - 3:50
    that's not necessarily what is happening
    on the Internet at the moment.
  • 3:51 - 3:52
    So why is that?
  • 3:52 - 3:58
    Well, perhaps, this view
    of the public debate is a bit unrealistic.
  • 3:59 - 4:04
    In 1949, Max Planck famously joked
    about this vision from Popper.
  • 4:04 - 4:09
    Max Planck was a theoretical physicist
    who eventually won a Nobel Prize.
  • 4:10 - 4:11
    He said,
  • 4:11 - 4:16
    "A new scientific truth doesn't triumph
    because it convinces its opponents;
  • 4:16 - 4:20
    rather, these opponents get old,
  • 4:20 - 4:21
    and eventually they die,
  • 4:21 - 4:24
    and they are replaced by
    a new generation of scientists."
  • 4:25 - 4:28
    And what Max Planck
    was alluding to with irony
  • 4:28 - 4:33
    was the reality of debates
    among humans - us.
  • 4:34 - 4:38
    We humans are not designed
    as perfectly rational thinkers
  • 4:38 - 4:40
    looking only for the truth.
  • 4:40 - 4:44
    The fact is that often,
    we are attached to our ideas.
  • 4:45 - 4:47
    We can be attached to ideas
  • 4:47 - 4:50
    because some ideas
    may be convenient for us -
  • 4:50 - 4:51
    out of self-interest.
  • 4:52 - 4:56
    So returning to Max Planck's
    example of the scientists,
  • 4:56 - 4:58
    these scientists may have become famous
  • 4:58 - 5:01
    because of the ideas
    that they proposed in the past,
  • 5:01 - 5:03
    and which are now old ideas.
  • 5:03 - 5:07
    Abandoning these ideas would mean
    losing part of the credit they got
  • 5:07 - 5:10
    for proposing these ideas
    in the first place.
  • 5:11 - 5:12
    But it's not just in science.
  • 5:12 - 5:14
    If you think of politics,
  • 5:14 - 5:17
    if a government proposes
    to extend social welfare,
  • 5:18 - 5:20
    those who would receive social welfare
  • 5:20 - 5:22
    have an interest in believing
    it's good for the country.
  • 5:23 - 5:26
    And those who would have to pay
    for the social welfare
  • 5:26 - 5:29
    have an interest in thinking
    it's bad for the country,
  • 5:29 - 5:31
    it's a bad policy for the country.
  • 5:32 - 5:35
    But we are not just attached to ideas
    for material interests;
  • 5:35 - 5:36
    it's much more complex.
  • 5:36 - 5:39
    Often, we can be
    emotionally attached to our ideas.
  • 5:40 - 5:42
    They may be part of our identity.
  • 5:42 - 5:46
    So if I'm a Christian conservative
    or a left-wing liberal,
  • 5:47 - 5:49
    these ideas may be part of who I am.
  • 5:50 - 5:54
    Abandoning these ideas
    could be losing part of my identity.
  • 5:56 - 5:59
    As a consequence,
    we are attached to our ideas.
  • 5:59 - 6:02
    So we're not neutral judges
  • 6:03 - 6:05
    when we're considering
    the evidence for or against them.
  • 6:06 - 6:11
    On the contrary,
    behavioral science
  • 6:11 - 6:14
    shows that we typically
    engage in self-deception,
  • 6:14 - 6:18
    which means that we are
    building beliefs which are compatible
  • 6:19 - 6:21
    with our interests
    and with our other beliefs.
  • 6:22 - 6:25
    Self-deception is subtle;
  • 6:25 - 6:28
    it takes place all the time
    in your everyday life.
  • 6:28 - 6:30
    So I'm going to give you two ways
  • 6:30 - 6:32
    in which self-deception
    can change what we believe in
  • 6:32 - 6:34
    and produce convenient
    views for ourselves.
  • 6:35 - 6:37
    First, when you receive some news,
  • 6:38 - 6:41
    you have some flexibility
    in how you consider it.
  • 6:42 - 6:46
    If it's positive news
    which is compatible with your beliefs,
  • 6:46 - 6:49
    you can accept it as positive evidence.
  • 6:50 - 6:51
    And if it's negative news,
  • 6:51 - 6:53
    you can instead
    choose to discount it
  • 6:53 - 6:56
    and not consider it.
  • 6:56 - 6:58
    Let me tell you about a study.
  • 6:58 - 7:01
    A group of people
    were asked about their beliefs,
  • 7:01 - 7:04
    both political
    and non-political.
  • 7:05 - 7:07
    For their political beliefs,
    they had to say
  • 7:07 - 7:11
    whether they believed in statements
    such as "Abortion should be legal" -
  • 7:11 - 7:14
    very loaded statements,
    typical of political discourse.
  • 7:15 - 7:19
    And non-political beliefs
    were statements such as
  • 7:19 - 7:22
    "Second-hand smoking
    is dangerous to your health."
  • 7:24 - 7:25
    So what happened is
  • 7:25 - 7:29
    that these people were confronted
    with contradictions to these beliefs.
  • 7:29 - 7:31
    What do you think people did?
  • 7:31 - 7:35
    How did they react when they were
    confronted with these contradictions?
  • 7:35 - 7:40
    Well, here's how people reacted
    with their non-political beliefs.
  • 7:41 - 7:45
    So you have the strength of their beliefs
    before the contradiction and after it.
  • 7:46 - 7:48
    So when faced with a series
    of contradictions,
  • 7:48 - 7:50
    they updated their beliefs
  • 7:50 - 7:54
    and the strength of their beliefs
    was lower after facing contradictions.
  • 7:54 - 7:56
    But now look at what happened
  • 7:56 - 7:59
    when they were faced with contradictions
    to their political beliefs.
  • 8:00 - 8:04
    Here you can see that people resisted
    the contradiction much more
  • 8:04 - 8:07
    and held to their political beliefs.
  • 8:08 - 8:12
    That's one way we selectively
    interpret the news.
  • 8:12 - 8:13
    But there is another way.
  • 8:13 - 8:17
    We're not just receiving the news;
    we're looking out for it.
  • 8:17 - 8:19
    We are selecting where we want
    to look for information.
  • 8:20 - 8:22
    And typically, we look for
    confirming information.
  • 8:23 - 8:24
    If you are a conservative,
  • 8:24 - 8:27
    you are more likely
    to read a conservative newspaper,
  • 8:27 - 8:29
    to watch a conservative news channel,
  • 8:29 - 8:32
    and perhaps even to turn off
    the TV or the radio
  • 8:32 - 8:34
    when a left-wing politician
    is being interviewed.
  • 8:35 - 8:39
    Let me show you a hypothetical scenario.
  • 8:40 - 8:43
    Let's say that you wake up in the morning
    and you open your newspaper,
  • 8:44 - 8:45
    and in one scenario,
  • 8:45 - 8:47
    you've got some news which is not good -
  • 8:47 - 8:49
    it's a contradiction to your beliefs.
  • 8:50 - 8:53
    Let's say that this news suggests
    that your favorite politician
  • 8:53 - 8:55
    is involved in a political scandal.
  • 8:57 - 9:00
    And consider the other situation,
    where to the contrary,
  • 9:00 - 9:05
    the news in the newspaper is positive -
    it goes with your usual beliefs.
  • 9:06 - 9:10
    Perhaps it's another politician,
    a politician you do not like,
  • 9:10 - 9:12
    who is involved in the political scandal.
  • 9:13 - 9:15
    Do you think, do you feel,
  • 9:16 - 9:19
    that you'd react in the same way
    to these two situations?
  • 9:20 - 9:22
    Well, research shows that you would not.
  • 9:22 - 9:26
    Most likely, what happens
    is that if you find a contradiction,
  • 9:26 - 9:29
    you tend to look at other news sources.
  • 9:29 - 9:31
    You give yourself a chance
    to find something
  • 9:31 - 9:33
    which will contradict this negative news.
  • 9:34 - 9:35
    Perhaps another newspaper,
  • 9:35 - 9:38
    perhaps you will read
    the fine print in the newspaper
  • 9:38 - 9:42
    to make sure that the headline
    really reflected the information.
  • 9:42 - 9:45
    On the contrary,
    if you have the positive news,
  • 9:45 - 9:47
    you're more likely to stop there,
  • 9:47 - 9:50
    you're more likely to be happy to consider
    that this piece of evidence
  • 9:50 - 9:53
    is enough for you to make up
    your mind on this issue.
  • 9:55 - 9:59
    So as much as we would like to think
    of ourselves as rational thinkers,
  • 10:00 - 10:04
    it is a fact that this tendency
    to look for confirming news
  • 10:04 - 10:08
    and to reject negative information
    is ingrained in us.
  • 10:10 - 10:11
    So now before you panic
  • 10:11 - 10:16
    and you think that our irrationality
    is making public debate impossible,
  • 10:17 - 10:19
    you can relax.
  • 10:19 - 10:24
    These biases have existed forever,
    way before social media.
  • 10:24 - 10:27
    So what's happening with social media
  • 10:27 - 10:30
    is that they are exacerbating
    some of the effects of these biases.
  • 10:31 - 10:34
    Let me give you two ways
    in which they are doing so.
  • 10:34 - 10:37
    First, on the Internet
    you have much more freedom
  • 10:37 - 10:40
    to look for the information
    which is convenient
  • 10:40 - 10:42
    to whatever beliefs you have.
  • 10:43 - 10:45
    The Internet is like a giant supermarket.
  • 10:46 - 10:47
    For any kind of idea,
  • 10:47 - 10:50
    you'll be able to find arguments
    supporting this idea.
  • 10:51 - 10:54
    Let's consider a crazy idea as an example.
  • 10:54 - 10:58
    Let's consider that you believe
    that the Earth is flat.
  • 10:59 - 11:02
    Well, 30 years ago,
    you would have been a bit alone
  • 11:02 - 11:05
    and maybe struggling to find people
    to give you evidence for this.
  • 11:06 - 11:08
    Today, you can just
    connect to the Internet,
  • 11:08 - 11:10
    contact the Flat Earth Society,
  • 11:10 - 11:14
    and this society is going to provide you
    with elements of evidence,
  • 11:14 - 11:16
    arguments in favor of your belief.
  • 11:18 - 11:19
    And here's another way
  • 11:19 - 11:22
    in which the Internet helps you
    engage in self-deception:
  • 11:22 - 11:24
    you can do it collectively.
  • 11:24 - 11:27
    You now have a multitude
    of communities on the Internet
  • 11:27 - 11:30
    that have created
    informational bubbles on their own.
  • 11:31 - 11:34
    In these bubbles,
    people select information,
  • 11:34 - 11:37
    interpret information,
    and repackage it
  • 11:37 - 11:40
    in a way that is compatible
    with the community's views.
  • 11:40 - 11:43
    When a new fact is discussed
    in this community,
  • 11:43 - 11:47
    the images which are positive
    for the community are reinforced,
  • 11:47 - 11:50
    and those which are not are dropped,
  • 11:50 - 11:52
    and the nuances are lost.
  • 11:53 - 11:56
    These communities build simple views
  • 11:56 - 11:59
    compatible with the beliefs
    of the community.
  • 11:59 - 12:03
    So social media have not created
    a unified public space
  • 12:03 - 12:05
    where ideas are debated;
  • 12:05 - 12:11
    instead, social media have increased
    our ability to connect specifically
  • 12:11 - 12:14
    with the people whose views
    mostly match our own.
  • 12:16 - 12:17
    And whatever your views,
  • 12:17 - 12:20
    you can find a community
    of like-minded people
  • 12:20 - 12:25
    with whom you can share arguments
    supporting your existing beliefs
  • 12:25 - 12:26
    and not challenging them.
  • 12:28 - 12:29
    But what do you want?
  • 12:30 - 12:36
    Do you want a unified public space
    where ideas are debated, discussed,
  • 12:36 - 12:38
    and where maybe the best ideas can win,
  • 12:38 - 12:40
    or a compartmentalized public space
  • 12:40 - 12:44
    where different visions of the world
    can coexist unchallenged?
  • 12:45 - 12:50
    Well, if we want to defend and support
  • 12:50 - 12:52
    the existence of an open public space,
  • 12:52 - 12:56
    we need to realize
    that the problem starts with us,
  • 12:57 - 12:59
    with how we form opinions
  • 12:59 - 13:01
    and how we share them on social media.
  • 13:02 - 13:06
    So perhaps, in the post-truth era
    we need some guidelines
  • 13:06 - 13:09
    about how to interact on social media.
  • 13:10 - 13:11
    Here are three steps.
  • 13:12 - 13:14
    Step 1:
  • 13:14 - 13:19
    Avoid narrowly selecting information
    that closely matches your views.
  • 13:20 - 13:22
    Make sure that in your timeline
  • 13:22 - 13:24
    you've got some news sources
    which challenge your views.
  • 13:25 - 13:29
    Try to find people with whom you disagree
  • 13:29 - 13:31
    and exchange views with them.
  • 13:31 - 13:33
    Listen to what they have to say.
  • 13:34 - 13:37
    Try to appreciate the points
    that they may have.
  • 13:38 - 13:40
    Give them a chance to change your mind.
  • 13:42 - 13:43
    Step 2:
  • 13:44 - 13:45
    Question your views.
  • 13:46 - 13:49
    The more you'd like an idea to be true,
  • 13:49 - 13:53
    the more you need to distrust the way
    you made up your mind about it.
  • 13:53 - 13:56
    Are you sure that you considered
  • 13:56 - 13:59
    all the best counterarguments
    to the views you have?
  • 13:59 - 14:02
    That you've considered the possibility
    of weak points in your reasoning?
  • 14:02 - 14:04
    Try to find them.
  • 14:05 - 14:07
    Whenever you really want
    an idea to be true,
  • 14:07 - 14:09
    remember
    that at some point in our lives,
  • 14:09 - 14:12
    we all liked to believe in Santa.
  • 14:14 - 14:17
    And even though we really wanted
    Santa to be true,
  • 14:18 - 14:20
    it didn't make him any more real in the end.
  • 14:20 - 14:22
    So the more you'd like an idea to be true,
  • 14:22 - 14:26
    the more you need to wonder:
    Is it too good to be true?
  • 14:26 - 14:29
    Do I believe this
    because I want to believe it
  • 14:29 - 14:31
    or because of the evidence?
  • 14:31 - 14:33
    Could it be another Santa for me?
  • 14:35 - 14:36
    Step 3:
  • 14:36 - 14:39
    Avoid contributing
    to the distortion of facts.
  • 14:39 - 14:42
    When you want to forward
    information on social media,
  • 14:42 - 14:44
    make sure you've read it,
    you understand it,
  • 14:44 - 14:46
    and avoid simplifying it in a way
  • 14:46 - 14:49
    which is going to conform
    with the views of the community.
  • 14:49 - 14:52
    Try not to contribute
    to an echo chamber effect
  • 14:52 - 14:55
    within the communities
    you participate in.
  • 14:56 - 15:00
    If you follow these guidelines,
    you are going to do yourself a favor:
  • 15:01 - 15:05
    you're going to stop building
    a convenient alternative reality;
  • 15:05 - 15:09
    you're going to avoid spreading
    half truths and distorted facts;
  • 15:10 - 15:15
    and you will limit the self-reinforcing
    loops of confirming exchanges,
  • 15:16 - 15:19
    which push people
    into different informational bubbles.
  • 15:20 - 15:23
    You will help ideas
    to be questioned and challenged,
  • 15:23 - 15:27
    and you will contribute to making
    the public space an open space
  • 15:27 - 15:28
    where the best ideas can win.
  • 15:29 - 15:32
    Will you allow yourself
    to change your views
  • 15:33 - 15:35
    and to abandon old ones?
  • 15:36 - 15:38
    Smart people change their mind.
  • 15:38 - 15:40
    Choose to be one.
  • 15:41 - 15:43
    (Applause)
Description:

The information era seems to have given way to the post-truth era, where the lines between facts and fiction have been blurred by fake news. How can fake news be so effective? Lionel Page, Professor in Behavioural Economics, argues that fake news works because we are willing to believe it. To stop fake news and improve public debate, we have to change how we engage with the news and spread it on social media.

Lionel is the Head of the Queensland Behavioural Economics group. He received the 2016 Young Economist Prize from the Economic Society of Australia. His research focuses on the study of economic and political behavior. He is a dual French and Australian citizen having worked in economics in Paris, London, and Brisbane.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx

Video Language:
English
Team:
closed TED
Project:
TEDxTalks
Duration:
15:57
