How you can help transform the internet into a place of trust
-
0:02 - 0:04No matter who you are or where you live,
-
0:04 - 0:06I'm guessing that you have
at least one relative -
0:06 - 0:09who likes to forward those emails.
-
0:09 - 0:11You know the ones I'm talking about --
-
0:11 - 0:14the ones with dubious claims
or conspiracy videos. -
0:14 - 0:17And you've probably
already muted them on Facebook -
0:17 - 0:19for sharing social posts like this one.
-
0:19 - 0:21It's an image of a banana
-
0:21 - 0:23with a strange red cross
running through the center. -
0:23 - 0:26And the text around it is warning people
-
0:26 - 0:28not to eat fruits that look like this,
-
0:28 - 0:30suggesting they've been
injected with blood -
0:30 - 0:32contaminated with the HIV virus.
-
0:32 - 0:35And the social share message
above it simply says, -
0:35 - 0:37"Please forward to save lives."
-
0:38 - 0:41Now, fact-checkers have been debunking
this one for years, -
0:41 - 0:44but it's one of those rumors
that just won't die. -
0:44 - 0:45A zombie rumor.
-
0:46 - 0:48And, of course, it's entirely false.
-
0:48 - 0:51It might be tempting to laugh
at an example like this, to say, -
0:51 - 0:53"Well, who would believe this, anyway?"
-
0:53 - 0:55But the reason it's a zombie rumor
-
0:55 - 0:59is because it taps into people's
deepest fears about their own safety -
0:59 - 1:01and that of the people they love.
-
1:02 - 1:05And if you spend as much time
as I have looking at misinformation, -
1:05 - 1:08you know that this is just
one example of many -
1:08 - 1:11that taps into people's deepest
fears and vulnerabilities. -
1:11 - 1:16Every day, across the world,
we see scores of new memes on Instagram -
1:16 - 1:19encouraging parents
not to vaccinate their children. -
1:19 - 1:23We see new videos on YouTube
explaining that climate change is a hoax. -
1:23 - 1:28And across all platforms, we see
endless posts designed to demonize others -
1:28 - 1:31on the basis of their race,
religion or sexuality. -
1:32 - 1:35Welcome to one of the central
challenges of our time. -
1:36 - 1:40How can we maintain an internet
with freedom of expression at the core, -
1:40 - 1:43while also ensuring that the content
that's being disseminated -
1:43 - 1:47doesn't cause irreparable harms
to our democracies, our communities -
1:47 - 1:49and to our physical and mental well-being?
-
1:50 - 1:52Because we live in the information age,
-
1:52 - 1:56yet the central currency
upon which we all depend -- information -- -
1:56 - 1:58is no longer deemed entirely trustworthy
-
1:58 - 2:00and, at times, can appear
downright dangerous. -
2:01 - 2:05This is thanks in part to the runaway
growth of social sharing platforms -
2:05 - 2:06that allow us to scroll through,
-
2:06 - 2:09where lies and facts sit side by side,
-
2:09 - 2:12but with none of the traditional
signals of trustworthiness. -
2:12 - 2:16And goodness -- our language around this
is horribly muddled. -
2:16 - 2:19People are still obsessed
with the phrase "fake news," -
2:19 - 2:22despite the fact that
it's extraordinarily unhelpful -
2:22 - 2:25and used to describe a number of things
that are actually very different: -
2:25 - 2:28lies, rumors, hoaxes,
conspiracies, propaganda. -
2:29 - 2:32And I really wish
we could stop using a phrase -
2:32 - 2:35that's been co-opted by politicians
right around the world, -
2:35 - 2:36from the left and the right,
-
2:36 - 2:39used as a weapon to attack
a free and independent press. -
2:40 - 2:45(Applause)
-
2:45 - 2:48Because we need our professional
news media now more than ever. -
2:49 - 2:52And besides, most of this content
doesn't even masquerade as news. -
2:52 - 2:55It's memes, videos, social posts.
-
2:55 - 2:58And most of it is not fake;
it's misleading. -
2:58 - 3:01We tend to fixate on what's true or false.
-
3:01 - 3:05But the biggest concern is actually
the weaponization of context. -
3:07 - 3:09Because the most effective disinformation
-
3:09 - 3:12has always been that
which has a kernel of truth to it. -
3:12 - 3:15Let's take this example
from London, from March 2017, -
3:15 - 3:17a tweet that circulated widely
-
3:17 - 3:21in the aftermath of a terrorist incident
on Westminster Bridge. -
3:21 - 3:23This is a genuine image, not fake.
-
3:23 - 3:26The woman who appears in the photograph
was interviewed afterwards, -
3:26 - 3:29and she explained that
she was utterly traumatized. -
3:29 - 3:30She was on the phone to a loved one,
-
3:30 - 3:33and she wasn't looking
at the victim out of respect. -
3:33 - 3:37But it still was circulated widely
with this Islamophobic framing, -
3:37 - 3:40with multiple hashtags,
including: #BanIslam. -
3:40 - 3:43Now, if you worked at Twitter,
what would you do? -
3:43 - 3:45Would you take that down,
or would you leave it up? -
3:47 - 3:50My gut reaction, my emotional reaction,
is to take this down. -
3:50 - 3:52I hate the framing of this image.
-
3:53 - 3:55But freedom of expression
is a human right, -
3:55 - 3:58and if we start taking down speech
that makes us feel uncomfortable, -
3:58 - 3:59we're in trouble.
-
4:00 - 4:02And this might look like a clear-cut case,
-
4:02 - 4:04but, actually, most speech isn't.
-
4:04 - 4:06These lines are incredibly
difficult to draw. -
4:06 - 4:08What's a well-meaning
decision by one person -
4:08 - 4:10is outright censorship to the next.
-
4:11 - 4:14What we now know is that
this account, Texas Lone Star, -
4:14 - 4:17was part of a wider Russian
disinformation campaign, -
4:17 - 4:19one that has since been taken down.
-
4:19 - 4:21Would that change your view?
-
4:21 - 4:22It would mine,
-
4:23 - 4:25because now it's a case
of a coordinated campaign -
4:25 - 4:26to sow discord.
-
4:26 - 4:28And for those of you who'd like to think
-
4:28 - 4:31that artificial intelligence
will solve all of our problems, -
4:31 - 4:33I think we can agree
that we're a long way away -
4:33 - 4:36from AI that's able to make sense
of posts like this. -
4:37 - 4:39So I'd like to explain
three interlocking issues -
4:39 - 4:42that make this so complex
-
4:42 - 4:45and then think about some ways
we can consider these challenges. -
4:45 - 4:49First, we just don't have
a rational relationship to information, -
4:49 - 4:51we have an emotional one.
-
4:51 - 4:55It's just not true that more facts
will make everything OK, -
4:55 - 4:58because the algorithms that determine
what content we see, -
4:58 - 5:01well, they're designed to reward
our emotional responses. -
5:01 - 5:02And when we're fearful,
-
0:02 - 5:05oversimplified narratives,
conspiratorial explanations -
5:05 - 5:09and language that demonizes others
are far more effective. -
5:10 - 5:11And besides, many of these companies,
-
5:11 - 5:14their business model
is attached to attention, -
5:14 - 5:18which means these algorithms
will always be skewed towards emotion. -
5:18 - 5:23Second, most of the speech
I'm talking about here is legal. -
5:23 - 5:25It would be a different matter
-
5:25 - 5:27if I was talking about
child sexual abuse imagery -
5:27 - 5:29or content that incites violence.
-
5:29 - 5:32It can be perfectly legal
to post an outright lie. -
5:33 - 5:37But people keep talking about taking down
"problematic" or "harmful" content, -
5:37 - 5:40but with no clear definition
of what they mean by that, -
5:40 - 5:41including Mark Zuckerberg,
-
5:41 - 5:45who recently called for global
regulation to moderate speech. -
5:45 - 5:47And my concern is that
we're seeing governments -
5:47 - 5:48right around the world
-
5:48 - 5:51rolling out hasty policy decisions
-
5:51 - 5:54that might actually trigger
much more serious consequences -
5:54 - 5:56when it comes to our speech.
-
5:56 - 6:00And even if we could decide
which speech to take up or take down, -
6:00 - 6:02we've never had so much speech.
-
6:02 - 6:04Every second, millions
of pieces of content -
6:04 - 6:06are uploaded by people
right around the world -
6:06 - 6:07in different languages,
-
6:07 - 6:10drawing on thousands
of different cultural contexts. -
6:10 - 6:13We've simply never had
effective mechanisms -
6:13 - 6:14to moderate speech at this scale,
-
6:15 - 6:17whether powered by humans
or by technology. -
6:18 - 6:22And third, these companies --
Google, Twitter, Facebook, WhatsApp -- -
6:22 - 6:25they're part of a wider
information ecosystem. -
6:25 - 6:28We like to lay all the blame
at their feet, but the truth is, -
6:28 - 6:32the mass media and elected officials
can also play an equal role -
6:32 - 6:35in amplifying rumors and conspiracies
when they want to. -
6:36 - 6:41As can we, when we mindlessly forward
divisive or misleading content -
6:41 - 6:42without trying.
-
6:42 - 6:44We're adding to the pollution.
-
6:45 - 6:48I know we're all looking for an easy fix.
-
6:48 - 6:50But there just isn't one.
-
6:50 - 6:54Any solution will have to be rolled out
at a massive scale, internet scale, -
6:54 - 6:58and yes, the platforms,
they're used to operating at that level. -
6:58 - 7:01But can and should we allow them
to fix these problems? -
7:02 - 7:03They're certainly trying.
-
7:03 - 7:07But most of us would agree that, actually,
we don't want global corporations -
7:07 - 7:09to be the guardians of truth
and fairness online. -
7:09 - 7:12And I also think the platforms
would agree with that. -
7:12 - 7:15And at the moment,
they're marking their own homework. -
7:15 - 7:16They like to tell us
-
7:16 - 7:19that the interventions
they're rolling out are working, -
7:19 - 7:22but because they write
their own transparency reports, -
7:22 - 7:25there's no way for us to independently
verify what's actually happening. -
7:26 - 7:30(Applause)
-
7:30 - 7:33And let's also be clear
that most of the changes we see -
7:33 - 7:36only happen after journalists
undertake an investigation -
7:36 - 7:37and find evidence of bias
-
7:37 - 7:40or content that breaks
their community guidelines. -
7:41 - 7:45So yes, these companies have to play
a really important role in this process, -
7:45 - 7:47but they can't control it.
-
7:48 - 7:49So what about governments?
-
7:50 - 7:53Many people believe
that global regulation is our last hope -
7:53 - 7:56in terms of cleaning up
our information ecosystem. -
7:56 - 7:59But what I see are lawmakers
who are struggling to keep up to date -
7:59 - 8:01with the rapid changes in technology.
-
8:01 - 8:03And worse, they're working in the dark,
-
8:03 - 8:05because they don't have access to data
-
8:05 - 8:08to understand what's happening
on these platforms. -
8:08 - 8:11And anyway, which governments
would we trust to do this? -
8:11 - 8:14We need a global response,
not a national one. -
8:15 - 8:18So the missing link is us.
-
8:18 - 8:21It's those people who use
these technologies every day. -
8:21 - 8:26Can we design a new infrastructure
to support quality information? -
8:26 - 8:28Well, I believe we can,
-
8:28 - 8:31and I've got a few ideas about
what we might be able to actually do. -
8:31 - 8:34So firstly, if we're serious
about bringing the public into this, -
8:34 - 8:37can we take some inspiration
from Wikipedia? -
8:37 - 8:38They've shown us what's possible.
-
8:38 - 8:40Yes, it's not perfect,
-
8:40 - 8:42but they've demonstrated
that with the right structures, -
8:42 - 8:45with a global outlook
and lots and lots of transparency, -
8:45 - 8:48you can build something
that will earn the trust of most people. -
8:48 - 8:51Because we have to find a way
to tap into the collective wisdom -
8:51 - 8:53and experience of all users.
-
8:54 - 8:56This is particularly the case
for women, people of color -
8:56 - 8:58and underrepresented groups.
-
8:58 - 8:59Because guess what?
-
8:59 - 9:01They are experts when it comes
to hate and disinformation, -
9:02 - 9:05because they have been the targets
of these campaigns for so long. -
9:05 - 9:07And over the years,
they've been raising flags, -
9:07 - 9:09and they haven't been listened to.
-
9:09 - 9:10This has got to change.
-
9:11 - 9:15So could we build a Wikipedia for trust?
-
9:15 - 9:19Could we find a way that users
can actually provide insights? -
9:19 - 9:23They could offer insights around
difficult content-moderation decisions. -
9:23 - 9:25They could provide feedback
-
9:25 - 9:28when platforms decide
they want to roll out new changes. -
9:28 - 9:32Second, people's experiences
with information are personalized. -
9:32 - 9:35My Facebook news feed
is very different to yours. -
9:35 - 9:38Your YouTube recommendations
are very different to mine. -
9:38 - 9:40That makes it impossible for us
to actually examine -
9:40 - 9:42what information people are seeing.
-
9:43 - 9:44So could we imagine
-
9:44 - 9:49developing some kind of centralized
open repository for anonymized data, -
9:49 - 9:52with privacy and ethical
concerns built in? -
9:52 - 9:54Because imagine what we would learn
-
9:54 - 9:57if we built out a global network
of concerned citizens -
9:57 - 10:01who wanted to donate
their social data to science. -
10:01 - 10:03Because we actually know very little
-
10:03 - 10:06about the long-term consequences
of hate and disinformation -
10:06 - 10:08on people's attitudes and behaviors.
-
10:08 - 10:09And what we do know,
-
10:09 - 10:12most of that has been
carried out in the US, -
10:12 - 10:14despite the fact that
this is a global problem. -
10:14 - 10:16We need to work on that, too.
-
10:16 - 10:17And third,
-
10:17 - 10:20can we find a way to connect the dots?
-
10:20 - 10:23No one sector, let alone nonprofit,
start-up or government, -
10:23 - 10:25is going to solve this.
-
10:25 - 10:27But there are very smart people
right around the world -
10:27 - 10:29working on these challenges,
-
10:29 - 10:32from newsrooms, civil society,
academia, activist groups. -
10:32 - 10:34And you can see some of them here.
-
10:34 - 10:37Some are building out indicators
of content credibility. -
10:37 - 10:38Others are fact-checking,
-
10:38 - 10:42so that false claims, videos and images
can be down-ranked by the platforms. -
10:42 - 10:44A nonprofit I helped
to found, First Draft, -
10:44 - 10:47is working with normally competitive
newsrooms around the world -
10:47 - 10:51to help them build out investigative,
collaborative programs. -
10:51 - 10:54And Danny Hillis, a software architect,
-
10:54 - 10:56is designing a new system
called The Underlay, -
10:56 - 10:59which will be a record
of all public statements of fact -
10:59 - 11:00connected to their sources,
-
11:00 - 11:04so that people and algorithms
can better judge what is credible. -
11:05 - 11:08And educators around the world
are testing different techniques -
11:08 - 11:12for finding ways to make people
critical of the content they consume. -
11:13 - 11:16All of these efforts are wonderful,
but they're working in silos, -
11:16 - 11:18and many of them are woefully underfunded.
-
11:19 - 11:21There are also hundreds
of very smart people -
11:21 - 11:22working inside these companies,
-
11:22 - 11:25but again, these efforts
can feel disjointed, -
11:25 - 11:29because they're actually developing
different solutions to the same problems. -
11:29 - 11:31How can we find a way
to bring people together -
11:31 - 11:35in one physical location
for days or weeks at a time, -
11:35 - 11:37so they can actually tackle
these problems together -
11:37 - 11:39but from their different perspectives?
-
11:39 - 11:40So can we do this?
-
11:40 - 11:44Can we build out a coordinated,
ambitious response, -
11:44 - 11:47one that matches the scale
and the complexity of the problem? -
11:48 - 11:49I really think we can.
-
11:49 - 11:52Together, let's rebuild
our information commons. -
11:53 - 11:54Thank you.
-
11:54 - 11:58(Applause)
- Title:
- How you can help transform the internet into a place of trust
- Speaker:
- Claire Wardle
- Description:
-
How can we stop the spread of misleading, sometimes dangerous content while maintaining an internet with freedom of expression at its core? Misinformation expert Claire Wardle explores the new challenges of our polluted online environment and maps out a plan to transform the internet into a place of trust -- with the help of everyday users. "Together, let's rebuild our information commons," she says.
- Video Language:
- English
- Team:
- closed TED
- Project:
- TEDTalks
- Duration:
- 12:33