Information pollution on the Internet, the fake news era | Adriana Garcia | TEDxLaçador

  • 0:11 - 0:15
    Hi. Who among you, nowadays,
    besides using social media,
  • 0:15 - 0:17
    has also used WhatsApp?
  • 0:17 - 0:18
    Raise your hand.
  • 0:19 - 0:21
    Everybody, right?
  • 0:21 - 0:26
    So, the Internet was a wonderful thing
    that emerged in the world.
  • 0:26 - 0:31
    And it's still young, it's only a couple
    of decades old, right?
  • 0:32 - 0:36
    It's brought lots of information,
    given voice to those who didn't have one,
  • 0:36 - 0:38
    and told wonderful stories.
  • 0:38 - 0:42
    But, last year, I discovered
    another side of the Internet.
  • 0:42 - 0:47
    I was invited to work
    on a project called "Comprova,"
  • 0:47 - 0:50
    which had the goal of mapping
  • 0:50 - 0:55
    the informational pollution
    that circulates around the Internet,
  • 0:55 - 1:00
    and combating deceptive content
  • 1:00 - 1:02
    prior to the Brazilian
    presidential elections.
  • 1:02 - 1:07
    Comprova was a coalition
    of 24 media channels -
  • 1:07 - 1:10
    large, small, startups,
  • 1:10 - 1:16
    radio, TV, newspapers, magazines -
  • 1:16 - 1:19
    everybody united
    in a common mission
  • 1:19 - 1:24
    of trying to uncover
    and investigate rumors.
  • 1:24 - 1:28
    Rumors are a type of information
    that circulates through media,
  • 1:28 - 1:31
    and we don't know where they come from.
  • 1:31 - 1:34
    We don't know who started them
    nor what their intention was.
  • 1:35 - 1:39
    The volume of these rumors
    increased a lot before the elections.
  • 1:40 - 1:43
    Just to give you an idea ...
    in three months,
  • 1:43 - 1:50
    we conducted 146 investigations of rumors
    that were going viral on a large scale.
  • 1:50 - 1:52
    We relied on technology
  • 1:52 - 1:57
    because the same platforms
    this information circulates on
  • 1:57 - 2:00
    helped us monitor
    what was possible to monitor.
  • 2:01 - 2:04
    Facebook, YouTube ...
  • 2:05 - 2:07
    WhatsApp, which we couldn't monitor
  • 2:07 - 2:10
    because it's encrypted
    and closed, as you know.
  • 2:10 - 2:14
    But we asked people,
  • 2:14 - 2:18
    through a WhatsApp number we bought,
  • 2:18 - 2:22
    to send us the content
    that they were seeing on WhatsApp
  • 2:22 - 2:27
    should they have doubts or wonder
    if the information made sense,
  • 2:27 - 2:30
    if it was true, or if it wasn't true.
  • 2:30 - 2:31
    Anyway ...
  • 2:31 - 2:38
    We started looking at what was going viral
  • 2:38 - 2:41
    and when content became really popular,
  • 2:41 - 2:43
    we would look at it more closely.
  • 2:44 - 2:48
    We investigated 146 stories in depth.
  • 2:48 - 2:52
    Can you imagine how many were true?
  • 2:53 - 2:57
    Of those 146, only 8%
  • 2:57 - 2:59
    were completely true.
  • 2:59 - 3:01
    The others were either false,
  • 3:01 - 3:02
    partially false,
  • 3:02 - 3:05
    or biased, carrying a little bit of truth,
  • 3:05 - 3:10
    along with information that wasn't
    so truthful, aiming to confuse people.
  • 3:10 - 3:11
    Well ...
  • 3:12 - 3:15
    the fact is that the platforms helped us.
  • 3:15 - 3:19
    They financed us because they know
    this informational pollution
  • 3:19 - 3:21
    is circulating there too,
  • 3:21 - 3:26
    as well as lots of wonderful
    information and content,
  • 3:26 - 3:29
    like what we heard about here today,
    which is also circulating.
  • 3:29 - 3:34
    But the 24 channels concluded
    that they needed to unite,
  • 3:34 - 3:37
    for the first time in history,
    to investigate together.
  • 3:37 - 3:40
    That's 24 competing
    communication channels,
  • 3:40 - 3:43
    including two from here,
    Rio Grande do Sul.
  • 3:44 - 3:46
    Well, so what did we find out?
  • 3:46 - 3:49
    As we were investigating,
  • 3:49 - 3:52
    people were hearing
    about our WhatsApp number
  • 3:52 - 3:54
    and starting to send us content.
  • 3:54 - 3:55
    In the beginning of August,
  • 3:55 - 3:57
    there were dozens.
  • 3:57 - 4:00
    Later, there were hundreds per day.
  • 4:00 - 4:04
    And, finally, during the runoff election,
    there were thousands per day.
  • 4:04 - 4:10
    We couldn't even process
    the messages we were getting.
  • 4:10 - 4:10
    The result:
  • 4:10 - 4:14
    in three months, we got messages
  • 4:14 - 4:18
    from almost 70 thousand people
  • 4:18 - 4:20
    who were deeply distressed by doubts
  • 4:20 - 4:24
    about what they were reading
    on the Internet
  • 4:24 - 4:26
    and on their WhatsApp messages.
  • 4:27 - 4:29
    What can I say to you?
  • 4:30 - 4:33
    Misleading information ...
  • 4:33 - 4:36
    is a very easy thing to produce,
  • 4:36 - 4:39
    but awfully hard to debunk.
  • 4:39 - 4:44
    And the debunked version
    never sounds as cool as the lie, right?
  • 4:44 - 4:46
    It's less spectacular.
  • 4:46 - 4:48
    I'll give you an example
  • 4:48 - 4:51
    of a job that took ten days to complete.
  • 4:51 - 4:53
    We reached a stage of the campaign
  • 4:53 - 4:57
    where content, images,
    and videos started circulating
  • 4:57 - 5:01
    because, in fact, information
    that circulates in the form of rumors
  • 5:01 - 5:03
    has many formats.
  • 5:03 - 5:10
    It can circulate as a GIF, meme,
    photo, audio, video, or text,
  • 5:10 - 5:11
    even a really long text.
  • 5:12 - 5:14
    We started to see lots of content
  • 5:14 - 5:20
    questioning the reliability
    of Brazilian voting machines.
  • 5:20 - 5:21
    I don't know if you remember that.
  • 5:22 - 5:26
    We saw this was going viral,
    so we decided to investigate.
  • 5:27 - 5:29
    But investigating this was difficult
  • 5:29 - 5:33
    because there were a lot
    of contradictory studies.
  • 5:33 - 5:35
    Summarizing the investigation,
    so you can have an idea
  • 5:35 - 5:39
    of how much work is involved
    in checking information,
  • 5:39 - 5:43
    this investigation took over a week,
  • 5:43 - 5:46
    involved a dozen journalists,
  • 5:46 - 5:49
    and generated 40 pages of reports
  • 5:49 - 5:51
    before we were sure
  • 5:51 - 5:55
    that we could authenticate
    and publish that, yes,
  • 5:55 - 5:58
    Brazilian voting machines were reliable,
  • 5:58 - 6:03
    according to the best international
    peer-reviewed studies
  • 6:03 - 6:08
    and also from feedback on the use
    of the same machines in other countries.
  • 6:08 - 6:12
    But this was one example;
    there were many more.
  • 6:12 - 6:16
    And then, I learned a little ...
    I learned a lot, actually,
  • 6:16 - 6:19
    about information pollution.
  • 6:19 - 6:23
    But I also learned
    how we all, including you,
  • 6:23 - 6:27
    have to be responsible
    when we publish things.
  • 6:27 - 6:28
    Because, what did the Internet do?
  • 6:28 - 6:35
    It democratized access
    to publications and to information.
  • 6:35 - 6:40
    But this democratization also
    brought a responsibility
  • 6:40 - 6:45
    regarding what we'll consume
    and what we'll share.
  • 6:45 - 6:47
    It works something like this:
  • 6:47 - 6:49
    when you go to a grocery store
  • 6:49 - 6:51
    to buy food,
  • 6:51 - 6:53
    you want to know the origin of the food,
  • 6:53 - 6:55
    you want to know where it comes from,
  • 6:55 - 6:59
    you want to know if it is still
    within the expiration period,
  • 6:59 - 7:01
    if it has a certified origin,
  • 7:02 - 7:05
    and that it isn't spoiled, right?
  • 7:05 - 7:09
    So when we think about information,
    it's the same thing.
  • 7:09 - 7:11
    We must ask ourselves:
  • 7:11 - 7:13
    "Who sent me that?
  • 7:13 - 7:15
    What was their intention?
  • 7:15 - 7:18
    Is there other similar content?"
  • 7:19 - 7:21
    And we keep thinking, and thinking,
  • 7:21 - 7:23
    and, after all this experience
  • 7:23 - 7:27
    trying to help the media
    and the journalists
  • 7:27 - 7:29
    check this information,
  • 7:29 - 7:36
    I came up with three recommendations
    that you can use to protect yourselves
  • 7:36 - 7:41
    and to spread and share
    information responsibly.
  • 7:41 - 7:46
    This is what we need
    to have a healthy informational diet
  • 7:46 - 7:47
    and a healthy society.
  • 7:47 - 7:49
    So first,
  • 7:49 - 7:54
    if someone sent you a WhatsApp message
    where it's written "Forwarded" on top,
  • 7:54 - 8:00
    this means that your friend,
    your relative, your aunt, or your grandpa
  • 8:00 - 8:02
    passed along this content to you.
  • 8:02 - 8:05
    They didn't write or produce it.
  • 8:05 - 8:10
    Someone sent it to them,
    and they are only forwarding it on to you.
  • 8:10 - 8:13
    So, when you see "Forwarded,"
    reflect upon it.
  • 8:14 - 8:16
    The second thing is,
  • 8:16 - 8:21
    who is the author of the content
    you see on your social media,
  • 8:21 - 8:24
    or that came to you
    through a message system?
  • 8:24 - 8:25
    Did someone sign it?
  • 8:25 - 8:28
    Did someone take responsibility
    for that information?
  • 8:28 - 8:31
    Is someone paying for that information?
  • 8:31 - 8:32
    Who is this person?
  • 8:33 - 8:36
    And the third one is,
  • 8:36 - 8:39
    that message you received,
  • 8:39 - 8:41
    is it really "over the top"?
  • 8:41 - 8:43
    Does it come with a lot
    of exclamation points,
  • 8:43 - 8:47
    with questions,
    and does it say it's urgent?
  • 8:47 - 8:48
    "Please spread this!"
  • 8:48 - 8:50
    "Please forward!"
  • 8:50 - 8:52
    "Please, share!"
  • 8:52 - 8:54
    If so, be suspicious.
  • 8:54 - 8:57
    The best way to check information
  • 8:57 - 9:01
    is to use a search engine.
  • 9:01 - 9:04
    If it's fake content,
  • 9:04 - 9:07
    it's possible that many people
    have already uncovered it.
  • 9:07 - 9:09
    This reminds me of another phenomenon
  • 9:09 - 9:12
    that's very recurrent,
    nowadays, on the Internet,
  • 9:12 - 9:14
    and that's the zombie rumors.
  • 9:14 - 9:15
    It works like this:
  • 9:15 - 9:22
    photos, images and videos vanish
    and then reappear, like zombies.
  • 9:22 - 9:27
    For example, you take a photo
    of a demonstration from five years ago,
  • 9:27 - 9:32
    make a new post,
    and alter the date and location.
  • 9:32 - 9:35
    So, it's true, that photo
    is from some demonstration somewhere,
  • 9:35 - 9:39
    but it's not from the demonstration
    that occurred just days ago,
  • 9:39 - 9:41
    or the day you were there.
  • 9:41 - 9:43
    So, this is very common.
  • 9:43 - 9:50
    Increasingly, technology will offer
    more sophisticated ways to alter content,
  • 9:50 - 9:51
    including videos.
  • 9:51 - 9:55
    Even changing what people are saying
    through lip manipulation,
  • 9:55 - 9:58
    which is what people call "deepfakes."
  • 9:58 - 10:04
    So, we have to be alert
    about the information we're consuming,
  • 10:04 - 10:09
    try to check if the source is reliable,
  • 10:09 - 10:12
    and make a habit of going to the web to check.
  • 10:12 - 10:16
    Today, there are many basic,
    simple pieces of software
  • 10:16 - 10:21
    for you to do a reverse search
    on photos and videos.
  • 10:21 - 10:25
    You can quickly discover
  • 10:25 - 10:27
    if they are real or not.
  • 10:27 - 10:28
    So, it's like this:
  • 10:28 - 10:33
    in the same way we concern ourselves
    with our everyday nourishment
  • 10:33 - 10:37
    and that we want to eat
    a wide variety of healthy food,
  • 10:37 - 10:42
    we also need a wide range
    of information that we choose ourselves,
  • 10:42 - 10:44
    information that doesn't just come to us
  • 10:44 - 10:47
    courtesy of our phone's data plan
    and our mobile operator,
  • 10:47 - 10:52
    and that we research and compare
    with other ideas
  • 10:52 - 10:54
    and other content,
  • 10:54 - 10:58
    so we can make our own
    daily decisions as citizens.
  • 10:58 - 10:59
    And that's all.
  • 10:59 - 11:02
    Happy informational dieting to everyone.
  • 11:02 - 11:03
    Thank you.
  • 11:03 - 11:05
    (Applause)
Description:

In a very timely talk, Adriana Garcia discusses information pollution on the Internet, particularly fake news. She talks about the difficulty of verifying the information that reaches us through so many social media platforms and apps, and how we can protect ourselves from it.

A journalist, mother of two girls, international correspondent, and teacher for over 20 years, Adriana Garcia has worked for Reuters in Brazil and the USA, studied innovation and leadership in journalism at Stanford University, and is today the Operations Director of Projor, the Institute for the Development of Journalism. Adriana is a co-founder of the LabCOLab project, which focuses on co-created education for children and teenagers. In 2018, she launched Comprova, an initiative bringing together more than 24 communication channels with the main goal of fighting misinformation during the Brazilian presidential campaign. She believes that creativity, affection, and genuine human relationships are key abilities for dealing with the complexities of the present and the future.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx

Video Language:
Portuguese, Brazilian
Duration:
11:09
