How we can protect truth in the age of misinformation
[0:01] So, on April 23 of 2013, the Associated Press put out the following tweet on Twitter. It said, "Breaking news: Two explosions at the White House and Barack Obama has been injured." This tweet was retweeted 4,000 times in less than five minutes, and it went viral thereafter.

[0:29] Now, this tweet wasn't real news put out by the Associated Press. In fact, it was false news, or fake news, propagated by Syrian hackers who had infiltrated the Associated Press Twitter handle. Their purpose was to disrupt society, but they disrupted much more, because automated trading algorithms immediately seized on the sentiment of this tweet and began trading based on the potential that the president of the United States had been injured or killed in this explosion. And as they started trading, they immediately sent the stock market crashing, wiping out 140 billion dollars in equity value in a single day.
[1:13] Robert Mueller, special counsel prosecutor in the United States, issued indictments against three Russian companies and 13 Russian individuals for a conspiracy to defraud the United States by meddling in the 2016 presidential election. And the story this indictment tells is the story of the Internet Research Agency, the shadowy arm of the Kremlin on social media.

[1:43] During the presidential election alone, the Internet Research Agency's efforts reached 126 million people on Facebook in the United States, issued three million individual tweets and uploaded 43 hours' worth of YouTube content. All of it was fake: misinformation designed to sow discord in the US presidential election.
[2:09] A recent study by Oxford University showed that in the recent Swedish elections, one third of all of the information spreading on social media about the election was fake or misinformation. In addition, these kinds of social-media misinformation campaigns can spread what has been called "genocidal propaganda," for instance against the Rohingya in Burma, and can trigger mob killings, as in India.

[2:38] We studied fake news, and began studying it before it was a popular term. And we recently published the largest-ever longitudinal study of the spread of fake news online, on the cover of "Science" in March of this year. We studied all of the verified true and false news stories that ever spread on Twitter, from its inception in 2006 to 2017.
[3:05] And when we studied this information, we relied on news stories verified by six independent fact-checking organizations, so we knew which stories were true and which stories were false. We could measure their diffusion: the speed of their diffusion, the depth and breadth of their diffusion, how many people became entangled in the information cascade, and so on.
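To make "depth" and "breadth" concrete, here is a minimal sketch, not the study's actual code, of how those cascade metrics might be computed from hypothetical retweet records:

```python
# Minimal sketch (assumed data shape, not the study's code): each edge maps
# a retweet to the tweet it reshared, rooted at the original post.
from collections import defaultdict, deque

def cascade_metrics(edges, root):
    """edges: list of (parent_tweet_id, child_tweet_id) retweet pairs."""
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)

    # Breadth-first walk assigns each tweet its generation (depth) in the cascade.
    depth_of = {root: 0}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for child in children[node]:
            depth_of[child] = depth_of[node] + 1
            queue.append(child)

    depths = list(depth_of.values())
    max_depth = max(depths)
    # Breadth: the size of the largest single generation of the cascade.
    breadth = max(depths.count(level) for level in range(max_depth + 1))
    return {"size": len(depths), "depth": max_depth, "breadth": breadth}

# One root tweet, two direct retweets, one retweet of a retweet:
print(cascade_metrics([("t0", "t1"), ("t0", "t2"), ("t1", "t3")], "t0"))
# {'size': 4, 'depth': 2, 'breadth': 2}
```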
[3:29] And what we did in this paper was compare the spread of true news to the spread of false news. And here's what we found. We found that false news diffused further, faster, deeper and more broadly than the truth in every category of information that we studied, sometimes by an order of magnitude.

[3:48] And in fact, false political news was the most viral. It diffused further, faster, deeper and more broadly than any other type of false news. When we saw this, we were at once worried but also curious. Why? Why does false news travel so much further, faster, deeper and more broadly than the truth?
[4:08] The first hypothesis that we came up with was, "Well, maybe people who spread false news have more followers or follow more people, or tweet more often, or maybe they're more often 'verified' users of Twitter, with more credibility, or maybe they've been on Twitter longer." So we checked each one of these in turn. And what we found was exactly the opposite. False-news spreaders had fewer followers, followed fewer people, were less active, were less often "verified" and had been on Twitter for a shorter period of time. And yet, false news was 70 percent more likely to be retweeted than the truth, controlling for all of these and many other factors.
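As a sketch of what "controlling for these factors" can look like in code, here is a hypothetical logistic regression; the file and column names are assumptions, not the study's actual data layout:

```python
# Hedged sketch: regress "was this tweet retweeted?" on veracity plus
# account-level controls. Columns are assumed; is_false is coded 0/1.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tweets.csv")  # hypothetical file: one row per tweet

model = smf.logit(
    "retweeted ~ is_false + followers + followees + tweets_per_day"
    " + is_verified + account_age_days",
    data=df,
).fit()

# exp(coefficient) is the odds ratio for false vs. true news;
# a value near 1.7 corresponds to "70 percent more likely to be retweeted."
print(np.exp(model.params["is_false"]))
```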
[4:48] So we had to come up with other explanations. And we devised what we called a "novelty hypothesis." If you read the literature, it is well known that human attention is drawn to novelty, to things that are new in the environment. And if you read the sociology literature, you know that we like to share novel information. It makes us seem like we have access to inside information, and we gain in status by spreading this kind of information.

[5:18] So what we did was measure the novelty of each incoming true or false tweet, compared to the corpus of what that individual had seen on Twitter in the 60 days prior.
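The study measured novelty with information-theoretic distances between topic distributions; the sketch below swaps in plain smoothed word counts and a KL divergence, just to show the shape of the computation:

```python
# Toy novelty score: KL divergence between a tweet's word distribution and
# the distribution of what the user saw in the prior 60 days. Word counts
# stand in for the topic models used in the actual study.
import math
from collections import Counter

def word_dist(texts, vocab, alpha=1.0):
    """Laplace-smoothed unigram distribution over a fixed vocabulary."""
    counts = Counter(w for t in texts for w in t.lower().split())
    total = sum(counts[w] for w in vocab) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}

def novelty(tweet, history):
    vocab = {w for t in [tweet] + history for w in t.lower().split()}
    p = word_dist([tweet], vocab)   # the incoming tweet
    q = word_dist(history, vocab)   # the user's 60-day corpus
    return sum(p[w] * math.log(p[w] / q[w]) for w in vocab)  # KL(p || q)

seen = ["markets rally on jobs report", "jobs numbers beat forecasts"]
print(novelty("explosions reported at the white house", seen))  # more novel
print(novelty("jobs report lifts markets", seen))               # more familiar
```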
[5:31] But that wasn't enough, because we thought to ourselves, "Well, maybe false news is more novel in an information-theoretic sense, but maybe people don't perceive it as more novel." So to understand people's perceptions of false news, we looked at the information and the sentiment contained in the replies to true and false tweets.
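A toy sketch of that kind of reply scoring, in the spirit of lexicon-based emotion measurement; the handful of words below is purely illustrative, whereas real emotion lexicons such as NRC map thousands of words:

```python
# Toy lexicon-based emotion profiling of replies. The lexicon here is a
# made-up fragment; only its shape (word -> emotion) reflects real tools.
from collections import Counter

LEXICON = {
    "unbelievable": "surprise", "shocking": "surprise", "wow": "surprise",
    "gross": "disgust", "disgusting": "disgust", "scary": "fear",
    "hope": "anticipation", "excited": "anticipation",
    "happy": "joy", "reliable": "trust",
}

def emotion_profile(replies):
    """Share of emotion-bearing words per emotion, across all replies."""
    counts = Counter(
        LEXICON[w] for r in replies for w in r.lower().split() if w in LEXICON
    )
    total = sum(counts.values()) or 1
    return {emotion: n / total for emotion, n in counts.items()}

print(emotion_profile(["wow unbelievable", "shocking if true", "gross"]))
# Replies to false tweets would skew toward surprise and disgust.
```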
[5:54] And what we found was that, across a bunch of different measures of sentiment (surprise, disgust, fear, sadness, anticipation, joy and trust), replies to false tweets exhibited significantly more surprise and disgust, and replies to true tweets exhibited significantly more anticipation, joy and trust. The surprise corroborates our novelty hypothesis: this is new and surprising, and so we're more likely to share it.
[6:31] At the same time, there was congressional testimony in front of both houses of Congress in the United States, looking at the role of bots in the spread of misinformation. So we looked at this too: we used multiple sophisticated bot-detection algorithms to find the bots in our data and to pull them out. We pulled them out, we put them back in, and we compared what happened to our measurements.
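A sketch of that bots-in, bots-out robustness check under an assumed data layout; the `bot` column stands in for the output of a real detector such as Botometer:

```python
# Recompute the false-vs-true spread gap with and without flagged accounts.
# Column names (cascade_id, is_false, bot) are assumptions for illustration.
import pandas as pd

df = pd.read_csv("retweets.csv")  # hypothetical: one row per retweet

def spread_ratio(frame):
    """Mean cascade size for false news divided by that for true news."""
    sizes = frame.groupby(["cascade_id", "is_false"]).size().reset_index(name="n")
    means = sizes.groupby("is_false")["n"].mean()
    return means.loc[True] / means.loc[False]

print("with bots:   ", spread_ratio(df))
print("without bots:", spread_ratio(df[~df["bot"]]))
# Roughly equal ratios would mirror the finding: bots accelerated true and
# false news alike, so they cannot explain the gap.
```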
[6:55] And what we found was that, yes indeed, bots were accelerating the spread of false news online, but they were accelerating the spread of true news at approximately the same rate. Which means bots are not responsible for the differential diffusion of truth and falsity online. We can't abdicate that responsibility, because we, humans, are responsible for that spread.

[7:22] Now, everything that I have told you so far, unfortunately for all of us, is the good news. The reason is that it's about to get a whole lot worse. And two specific technologies are going to make it worse. We are going to see the rise of a tremendous wave of synthetic media: fake video and fake audio that is very convincing to the human eye.
[7:51] And this will be powered by two technologies. The first of these is known as "generative adversarial networks." This is a machine-learning model with two networks: a discriminator, whose job it is to determine whether something is real or fake, and a generator, whose job it is to generate synthetic media. So the generator generates synthetic video or audio, and the discriminator tries to tell, "Is this real or is this fake?" In fact, it is the job of the generator to maximize the likelihood that it will fool the discriminator into thinking that the synthetic video and audio it is creating is actually real. Imagine a machine in a hyperloop, trying to get better and better at fooling us.
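That adversarial loop, sketched in PyTorch on toy vectors standing in for media; this is a schematic of the technique, not any production synthetic-media system:

```python
# Minimal GAN: the generator learns to fool the discriminator, which in turn
# learns to separate "real" samples from generated ones.
import torch
import torch.nn as nn

DIM, NOISE, BATCH = 64, 16, 32
G = nn.Sequential(nn.Linear(NOISE, 64), nn.ReLU(), nn.Linear(64, DIM))
D = nn.Sequential(nn.Linear(DIM, 64), nn.ReLU(), nn.Linear(64, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(BATCH, DIM) + 3.0   # stand-in for real media samples
    fake = G(torch.randn(BATCH, NOISE))    # synthetic samples

    # Discriminator: push real toward label 1, generated toward label 0.
    d_loss = bce(D(real), torch.ones(BATCH, 1)) + bce(
        D(fake.detach()), torch.zeros(BATCH, 1)
    )
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator: maximize the odds the discriminator labels its output real.
    g_loss = bce(D(fake), torch.ones(BATCH, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```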
[8:39] Combine this with the second technology, which is essentially the democratization of artificial intelligence: the ability for anyone, without any background in artificial intelligence or machine learning, to deploy these kinds of algorithms to generate synthetic media. Together, the two make it ultimately so much easier to create convincing fake videos.

[9:02] The White House issued a false, doctored video of a journalist interacting with an intern who was trying to take his microphone. They removed frames from this video in order to make his actions seem more punchy. And when videographers and stuntmen and stuntwomen were interviewed about this type of technique, they said, "Yes, we use this in the movies all the time to make our punches and kicks look more choppy and more aggressive." They then put out this video and partly used it as justification to revoke the White House press pass of Jim Acosta, the reporter in the video. And CNN had to sue to have that press pass reinstated.

[9:49] There are about five different paths that I can think of that we can follow to try to address some of these very difficult problems today. Each one of them has promise, but each one of them has its own challenges.
[10:03] The first one is labeling. Think about it this way: when you go to the grocery store to buy food to consume, it's extensively labeled. You know how many calories it has, how much fat it contains. And yet when we consume information, we have no labels whatsoever. What is contained in this information? Is the source credible? Where is this information gathered from? We have none of that information when we are consuming information.
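As a thought experiment, such a label might look something like the record below; every field name is an assumption for illustration, since no such standard exists, and the open question remains who fills it in:

```python
# Hypothetical "nutrition label" for a piece of information. The fields are
# invented for illustration; nothing like this is standardized today.
from dataclasses import dataclass

@dataclass
class InformationLabel:
    source: str                # publishing account or outlet
    source_verified: bool      # identity confirmed by the platform
    original_url: str | None   # where the claim first appeared, if known
    fact_check_status: str     # e.g. "unchecked", "disputed", "confirmed"
    fact_checkers: list[str]   # organizations that reviewed the claim

label = InformationLabel(
    source="@AP",
    source_verified=True,
    original_url=None,         # unknown provenance stays unknown
    fact_check_status="disputed",
    fact_checkers=["org-a", "org-b"],
)
print(label)
```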
[10:30] That is a potential avenue, but it comes with its challenges. For instance, who gets to decide, in society, what's true and what's false? Is it the governments? Is it Facebook? Is it an independent consortium of fact-checkers? And who's checking the fact-checkers?
[10:50] Another potential avenue is incentives. We know that during the US presidential election there was a wave of misinformation that came from Macedonia that didn't have any political motive but instead had an economic motive. And this economic motive existed because false news travels so much farther, faster and more deeply than the truth, and you can earn advertising dollars as you garner eyeballs and attention with this type of information. But if we can depress the spread of this information, perhaps it would reduce the economic incentive to produce it in the first place.

[11:29] Third, we can think about regulation, and certainly, we should think about this option. In the United States, currently, we are exploring what might happen if Facebook and others are regulated. While we should consider things like regulating political speech, labeling the fact that it's political speech, and making sure foreign actors can't fund political speech, regulation also has its own dangers. For instance, Malaysia just instituted a six-year prison sentence for anyone found spreading misinformation. And in authoritarian regimes, these kinds of policies can be used to suppress minority opinions and to continue to extend repression.
[12:13] The fourth possible option is transparency. We want to know how Facebook's algorithms work. How does the data combine with the algorithms to produce the outcomes that we see? We want Facebook to open up and show us exactly the inner workings of how it operates. And if we want to know social media's effect on society, we need scientists, researchers and others to have access to this kind of information. But at the same time, we are asking Facebook to lock everything down, to keep all of the data secure. So Facebook and the other social media platforms are facing what I call a transparency paradox: we are asking them to be open and transparent and, at the same time, secure. This is a very difficult needle to thread, but they will need to thread it if we are to achieve the promise of social technologies while avoiding their peril.
[13:13] The final thing that we could think about is algorithms and machine learning: technology devised to root out and understand fake news, how it spreads, and to try to dampen its flow.
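As a bare-bones illustration of what such detection technology involves, here is a text-only classifier sketch with made-up training examples; real systems also draw on network, user and propagation features:

```python
# Tiny fake-news text classifier: TF-IDF features plus logistic regression.
# The training data is invented; 1 = known false, 0 = known true.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

headlines = [
    "Explosions at the White House, president injured",
    "Shocking miracle cure the government is hiding",
    "Senate passes budget resolution after long debate",
    "Local council approves new bridge construction",
]
labels = [1, 1, 0, 0]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(headlines, labels)

# Probability that a new headline is false, according to the toy model:
print(clf.predict_proba(["Breaking: president injured in explosion"])[:, 1])
```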
[13:26] Humans have to be in the loop of this technology, because we can never escape the fact that underlying any technological solution or approach is a fundamental ethical and philosophical question about how we define truth and falsity, to whom we give the power to define truth and falsity, which opinions are legitimate, which types of speech should be allowed, and so on. Technology is not a solution for that. Ethics and philosophy are a solution for that.

[13:59] Nearly every theory of human decision making, human cooperation and human coordination has some sense of the truth at its core. But with the rise of fake news, the rise of fake video, the rise of fake audio, we are teetering on the brink of the end of reality, where we cannot tell what is real from what is fake. And that's potentially incredibly dangerous.

[14:27] We have to be vigilant in defending the truth against misinformation: with our technologies, with our policies and, perhaps most importantly, with our own individual responsibilities, decisions, behaviors and actions.

[14:46] Thank you very much.

(Applause)
Title: How we can protect truth in the age of misinformation
Speaker: Sinan Aral
Description: Fake news can sway elections, tank economies and sow discord in everyday life. Data scientist Sinan Aral demystifies how and why it spreads so quickly, citing one of the largest studies on misinformation, and identifies five strategies to help us unweave the tangled web between true and false.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 15:03