Dear Facebook, this is how you're breaking democracy
Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times that I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so, just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home.
As a former CIA officer and diplomat who spent years working on counterextremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary. And so I started digging in, and I started speaking out, which eventually led me to being hired at Facebook and ultimately brought me here today to continue warning you about how these platforms are manipulating and radicalizing so many of us, and to talk about how to reclaim our public square.
I was a foreign service officer in Kenya just a few years after the September 11 attacks, and I led what some call "hearts and minds" campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging. I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists, and while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults, and in some cases we even worked together on areas of mutual interest. The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want.
So what I see happening online today is especially heartbreaking and a much harder problem to tackle. We are being manipulated by the current information ecosystem, entrenching so many of us so far into absolutism that compromise has become a dirty word.
Because right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible. And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own. So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having on our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election, so I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this.
When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides. Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media right now. The people who are sucked down these rabbit holes of social media outrage often seem far harder to break out of their ideological mindsets than those vulnerable communities I worked with ever were.
So when Facebook called me in 2018 and offered me this role heading its elections integrity operations for political advertising, I felt I had to say yes. I had no illusions that I would fix it all, but when offered the opportunity to help steer the ship in a better direction, I had to at least try.
I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in election interference efforts, which was Russia's tactic ahead of 2016. So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it.
Now, I still do believe in the power of the internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies as currently constructed are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions in a business where optimizing engagement and user growth are the two most important metrics for success. There's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something, and question their own assumptions before engaging.
The unfortunate reality is: lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses.
And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it. A recent "Wall Street Journal" article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on their platform and for polarizing their users.
But keeping us engaged is how they make their money. The modern information environment is crystallized around profiling us and then segmenting us into more and more narrow categories to perfect this personalization process. We're then bombarded with information confirming our views, reinforcing our biases, and making us feel like we belong to something. These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of persuading them to change their behavior.
Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day, my title and job description were changed and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected. And so I lasted just shy of six months.
But here is my biggest takeaway from my time there. There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place, but as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, they will never truly address how the platform is contributing to hatred, division and radicalization. And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society, and agreeing to alter the entire product and profit model.
So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility from the way their platform is amplifying harmful content and pushing some users towards extreme views.
And Facebook could, if they wanted to, fix some of this. They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth.
But they've made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law. Because as it stands, there are no US laws compelling Facebook, or any social media company, to protect our public square, our democracy and even our elections. We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit internet companies.
Is this what we want? A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking? I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy. But not the way it's happening right now.
And it bears emphasizing: I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society. It is time for our governments to step up and do their jobs of protecting our citizenry.
And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society. And they could do so in part by insisting on actual transparency around how these recommendation engines are working, around how the curation, amplification and targeting are happening.
You see, I want these companies held accountable not for whether an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it.

I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and hopefully inspire more people to demand this accountability.
My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority. We need a whole-society approach to fix this.
And my message to the leaders of my former employer Facebook is this: right now, people are using your tools exactly as they were designed, to sow hatred, division and distrust, and you're not just allowing it, you are enabling it. And yes, there are lots of great stories of positive things happening on your platform around the globe, but that doesn't make any of this OK. And it's only getting worse as we head into our election and, even more concerning, face our biggest potential crisis yet, if the results aren't trusted and if violence breaks out.

So when in 2021 you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers, your own employees are shouting from the rooftops that your policies and your business practices are harming people and democracy. You own your decisions, but you can no longer say that you couldn't have seen it coming.

Thank you.
Title: Dear Facebook, this is how you're breaking democracy
Speaker: Yaël Eisenstat
Description: "Lies are more engaging online than truth," says former CIA analyst, diplomat and Facebook employee Yaël Eisenstat. "As long as [social media] algorithms' goals are to keep us engaged, they will feed us the poison that plays to our worst instincts and human weaknesses." In this bold talk, Eisenstat explores how social media companies like Facebook incentivize inflammatory content, contributing to a culture of political polarization and mistrust -- and calls on governments to hold these platforms accountable in order to protect civil discourse and democracy.
Video Language: English
Duration: 13:30