
Dear Facebook, this is how you're breaking democracy

  • 0:01 - 0:03
    Around five years ago,
  • 0:03 - 0:05
    it struck me that I was losing the ability
  • 0:05 - 0:08
    to engage with people
    who aren't like-minded.
  • 0:09 - 0:13
    The idea of discussing hot-button issues
    with my fellow Americans
  • 0:13 - 0:15
    was starting to give me more heartburn
  • 0:15 - 0:19
    than the times that I engaged
    with suspected extremists overseas.
  • 0:20 - 0:23
    It was starting to leave me feeling
    more embittered and frustrated.
  • 0:23 - 0:25
    And so just like that,
  • 0:25 - 0:27
    I shifted my entire focus
  • 0:27 - 0:29
    from global national security threats
  • 0:29 - 0:32
    to trying to understand
    what was causing this push
  • 0:32 - 0:36
    towards extreme polarization at home.
  • 0:36 - 0:38
    As a former CIA officer and diplomat
  • 0:38 - 0:41
    who spent years working
    on counterextremism issues,
  • 0:41 - 0:45
    I started to fear that this was becoming
    a far greater threat to our democracy
  • 0:45 - 0:48
    than any foreign adversary.
  • 0:48 - 0:50
    And so I started digging in,
  • 0:50 - 0:51
    and I started speaking out,
  • 0:51 - 0:54
    which eventually led me
    to being hired at Facebook
  • 0:54 - 0:57
    and ultimately brought me here today
  • 0:57 - 1:00
    to continue warning you
    about how these platforms
  • 1:00 - 1:04
    are manipulating
    and radicalizing so many of us
  • 1:04 - 1:07
    and to talk about
    how to reclaim our public square.
  • 1:08 - 1:10
    I was a foreign service officer in Kenya
  • 1:10 - 1:13
    just a few years after
    the September 11 attacks,
  • 1:13 - 1:15
    and I led what some call
    "hearts and minds" campaigns
  • 1:15 - 1:17
    along the Somalia border.
  • 1:17 - 1:21
    A big part of my job
    was to build trust with communities
  • 1:21 - 1:23
    deemed the most susceptible
    to extremist messaging.
  • 1:24 - 1:28
    I spent hours drinking tea
    with outspoken anti-Western clerics
  • 1:28 - 1:32
    and even dialogued
    with some suspected terrorists,
  • 1:32 - 1:35
    and while many of these engagements
    began with mutual suspicion,
  • 1:35 - 1:39
    I don't recall any of them
    resulting in shouting or insults,
  • 1:39 - 1:43
    and in some cases we even worked together
    on areas of mutual interest.
  • 1:44 - 1:48
    The most powerful tools we had
    were to simply listen, learn
  • 1:48 - 1:50
    and build empathy.
  • 1:50 - 1:53
    This is the essence
    of hearts and minds work,
  • 1:53 - 1:56
    because what I found again and again
    is that what most people wanted
  • 1:56 - 2:00
    was to feel heard,
    validated and respected.
  • 2:00 - 2:03
    And I believe that's what most of us want.
  • 2:03 - 2:06
    So what I see happening online today
    is especially heartbreaking
  • 2:06 - 2:08
    and a much harder problem to tackle.
  • 2:09 - 2:13
    We are being manipulated
    by the current information ecosystem
  • 2:13 - 2:16
    entrenching so many of us
    so far into absolutism
  • 2:16 - 2:19
    that compromise has become a dirty word.
  • 2:20 - 2:21
    Because right now,
  • 2:21 - 2:23
    social media companies like Facebook
  • 2:23 - 2:27
    profit off of segmenting us
    and feeding us personalized content
  • 2:27 - 2:31
    that both validates
    and exploits our biases.
  • 2:31 - 2:34
    Their bottom line depends
    on provoking a strong emotion
  • 2:34 - 2:36
    to keep us engaged,
  • 2:36 - 2:40
    often incentivizing the most
    inflammatory and polarizing voices,
  • 2:40 - 2:45
    to the point where finding common ground
    no longer feels possible.
  • 2:45 - 2:49
    And despite a growing chorus of people
    crying out for the platforms to change,
  • 2:49 - 2:52
    it's clear they will not
    do enough on their own.
  • 2:52 - 2:55
    So governments must define
    the responsibility
  • 2:55 - 2:59
    for the real-world harms being caused
    by these business models
  • 2:59 - 3:02
    and impose real costs
    on the damaging effects
  • 3:02 - 3:07
    they're having on our public health,
    our public square and our democracy.
  • 3:07 - 3:12
    But unfortunately, this won't happen
    in time for the US presidential election,
  • 3:12 - 3:15
    so I am continuing to raise this alarm,
  • 3:15 - 3:18
    because even if one day
    we do have strong rules in place,
  • 3:18 - 3:20
    it will take all of us to fix this.
  • 3:21 - 3:24
    When I started shifting my focus
    from threats abroad
  • 3:24 - 3:26
    to the breakdown
    in civil discourse at home,
  • 3:26 - 3:30
    I wondered if we could repurpose
    some of these hearts and minds campaigns
  • 3:30 - 3:32
    to help heal our divides.
  • 3:32 - 3:36
    Our more than 200-year
    experiment with democracy works
  • 3:36 - 3:40
    in large part because we are able
    to openly and passionately
  • 3:40 - 3:43
    debate our ideas for the best solutions.
  • 3:43 - 3:45
    But while I still deeply believe
  • 3:45 - 3:47
    in the power of face-to-face
    civil discourse,
  • 3:47 - 3:49
    it just cannot compete
  • 3:49 - 3:53
    with the polarizing effects
    and scale of social media right now.
  • 3:53 - 3:55
    The people who are sucked
    down these rabbit holes
  • 3:55 - 3:56
    of social media outrage
  • 3:56 - 4:00
    often seem far harder to break
    of their ideological mindsets
  • 4:00 - 4:03
    than those vulnerable communities
    I worked with ever were.
  • 4:04 - 4:06
    So when Facebook called me in 2018
  • 4:06 - 4:07
    and offered me this role
  • 4:07 - 4:11
    heading its elections integrity operations
    for political advertising,
  • 4:11 - 4:13
    I felt I had to say yes.
  • 4:13 - 4:16
    I had no illusions
    that I would fix it all,
  • 4:16 - 4:18
    but when offered the opportunity
  • 4:18 - 4:20
    to help steer the ship
    in a better direction,
  • 4:20 - 4:21
    I had to at least try.
  • 4:23 - 4:25
    I didn't work directly on polarization,
  • 4:25 - 4:29
    but I did look at which issues
    were the most divisive in our society
  • 4:29 - 4:33
    and therefore the most exploitable
    in election interference efforts,
  • 4:33 - 4:36
    which was Russia's tactic ahead of 2016.
  • 4:37 - 4:39
    So I started by asking questions.
  • 4:39 - 4:42
    I wanted to understand
    the underlying systemic issues
  • 4:42 - 4:44
    that were allowing all of this to happen,
  • 4:44 - 4:46
    in order to figure out how to fix it.
  • 4:48 - 4:50
    Now I still do believe
    in the power of the internet
  • 4:50 - 4:53
    to bring more voices to the table,
  • 4:53 - 4:56
    but despite their stated goal
    of building community,
  • 4:56 - 4:59
    the largest social media companies
    as currently constructed
  • 4:59 - 5:03
    are antithetical to the concept
    of reasoned discourse.
  • 5:03 - 5:05
    There's no way to reward listening,
  • 5:05 - 5:07
    to encourage civil debate
  • 5:07 - 5:11
    and to protect people
    who sincerely want to ask questions
  • 5:11 - 5:14
    in a business where optimizing
    engagement and user growth
  • 5:14 - 5:17
    are the two most important
    metrics for success.
  • 5:17 - 5:21
    There's no incentive
    to help people slow down,
  • 5:21 - 5:24
    to build in enough friction
    that people have to stop,
  • 5:24 - 5:27
    recognize their emotional
    reaction to something,
  • 5:27 - 5:29
    and question their own
    assumptions before engaging.
  • 5:31 - 5:33
    The unfortunate reality is:
  • 5:33 - 5:36
    lies are more engaging online than truth,
  • 5:36 - 5:39
    and salaciousness beats out
    wonky, fact-based reasoning
  • 5:39 - 5:42
    in a world optimized
    for frictionless virality.
  • 5:43 - 5:47
    As long as algorithms' goals
    are to keep us engaged,
  • 5:47 - 5:51
    they will continue to feed us the poison
    that plays to our worst instincts
  • 5:51 - 5:52
    and human weaknesses.
  • 5:53 - 5:56
    And yes, anger, mistrust,
  • 5:56 - 5:58
    the culture of fear, hatred:
  • 5:58 - 6:01
    none of this is new in America.
  • 6:01 - 6:04
    But in recent years,
    social media has harnessed all of that
  • 6:04 - 6:08
    and, as I see it,
    dramatically tipped the scales.
  • 6:08 - 6:10
    And Facebook knows it.
  • 6:10 - 6:12
    A recent "Wall Street Journal" article
  • 6:12 - 6:16
    exposed an internal
    Facebook presentation from 2018
  • 6:16 - 6:20
    that specifically points
    to the company's own algorithms
  • 6:20 - 6:23
    for growing extremist groups'
    presence on their platform
  • 6:23 - 6:26
    and for polarizing their users.
  • 6:27 - 6:30
    But keeping us engaged
    is how they make their money.
  • 6:30 - 6:35
    The modern information environment
    is crystallized around profiling us
  • 6:35 - 6:38
    and then segmenting us
    into more and more narrow categories
  • 6:38 - 6:41
    to perfect this personalization process.
  • 6:41 - 6:45
    We're then bombarded
    with information confirming our views,
  • 6:45 - 6:47
    reinforcing our biases,
  • 6:47 - 6:50
    and making us feel
    like we belong to something.
  • 6:50 - 6:54
    These are the same tactics
    we would see terrorist recruiters
  • 6:54 - 6:56
    using on vulnerable youth,
  • 6:56 - 7:00
    albeit in smaller, more localized ways
    before social media,
  • 7:00 - 7:03
    with the ultimate goal
    of influencing their behavior.
  • 7:03 - 7:08
    Unfortunately, I was never empowered
    by Facebook to have an actual impact.
  • 7:08 - 7:12
    In fact, on my second day,
    my title and job description were changed
  • 7:12 - 7:15
    and I was cut out
    of decision-making meetings.
  • 7:15 - 7:16
    My biggest efforts,
  • 7:16 - 7:18
    trying to build plans
  • 7:18 - 7:21
    to combat disinformation
    and voter suppression in political ads,
  • 7:22 - 7:23
    were rejected.
  • 7:23 - 7:26
    And so I lasted just shy of six months.
  • 7:26 - 7:30
    But here is my biggest takeaway
    from my time there.
  • 7:30 - 7:32
    There are thousands of people at Facebook
  • 7:32 - 7:34
    who are passionately working on a product
  • 7:34 - 7:38
    that they truly believe
    makes the world a better place,
  • 7:38 - 7:42
    but as long as the company continues
    to merely tinker around the margins
  • 7:42 - 7:44
    of content policy and moderation,
  • 7:44 - 7:46
    as opposed to considering
  • 7:46 - 7:49
    how the entire machine
    is designed and monetized,
  • 7:49 - 7:52
    they will never truly address
    how the platform is contributing
  • 7:52 - 7:57
    to hatred, division and radicalization.
  • 7:57 - 8:01
    And that's the one conversation
    I never heard happen during my time there,
  • 8:01 - 8:04
    because that would require
    fundamentally accepting
  • 8:04 - 8:08
    that the thing you built
    might not be the best thing for society
  • 8:08 - 8:11
    and agreeing to alter
    the entire product and profit model.
  • 8:12 - 8:14
    So what can we do about this?
  • 8:15 - 8:19
    I'm not saying that social media
    bears the sole responsibility
  • 8:19 - 8:21
    for the state that we're in today.
  • 8:21 - 8:26
    Clearly, we have deep-seated
    societal issues that we need to solve.
  • 8:26 - 8:30
    But Facebook's response,
    that it is just a mirror to society,
  • 8:30 - 8:33
    is a convenient attempt
    to deflect any responsibility
  • 8:34 - 8:38
    for the way their platform
    is amplifying harmful content
  • 8:38 - 8:41
    and pushing some users
    towards extreme views.
  • 8:42 - 8:44
    And Facebook could, if they wanted to,
  • 8:44 - 8:46
    fix some of this.
  • 8:46 - 8:50
    They could stop amplifying
    and recommending the conspiracy theorists,
  • 8:50 - 8:53
    the hate groups,
    the purveyors of disinformation
  • 8:53 - 8:57
    and, yes, in some cases
    even our president.
  • 8:57 - 9:00
    They could stop using
    the same personalization techniques
  • 9:00 - 9:04
    to deliver political rhetoric
    that they use to sell us sneakers.
  • 9:04 - 9:06
    They could retrain their algorithms
  • 9:06 - 9:08
    to focus on a metric
    other than engagement,
  • 9:08 - 9:13
    and they could build in guardrails
    to stop certain content from going viral
  • 9:13 - 9:14
    before being reviewed.
  • 9:15 - 9:17
    And they could do all of this
  • 9:17 - 9:21
    without becoming what they call
    the arbiters of truth.
  • 9:21 - 9:24
    But they've made it clear
    that they will not go far enough
  • 9:24 - 9:27
    to do the right thing
    without being forced to,
  • 9:27 - 9:30
    and, to be frank, why should they?
  • 9:30 - 9:34
    The markets keep rewarding them,
    and they're not breaking the law.
  • 9:34 - 9:35
    Because as it stands,
  • 9:35 - 9:40
    there are no US laws compelling Facebook,
    or any social media company,
  • 9:40 - 9:42
    to protect our public square,
  • 9:42 - 9:43
    our democracy
  • 9:43 - 9:46
    and even our elections.
  • 9:46 - 9:50
    We have ceded the decision-making
    on what rules to write and what to enforce
  • 9:50 - 9:53
    to the CEOs of for-profit
    internet companies.
  • 9:54 - 9:56
    Is this what we want?
  • 9:56 - 9:59
    A post-truth world
    where toxicity and tribalism
  • 9:59 - 10:02
    trump bridge-building
    and consensus-seeking?
  • 10:03 - 10:07
    I do remain optimistic that we still
    have more in common with each other
  • 10:07 - 10:10
    than the current media
    and online environment portray,
  • 10:10 - 10:14
    and I do believe that having
    more perspectives surface
  • 10:14 - 10:17
    makes for a more robust
    and inclusive democracy.
  • 10:17 - 10:20
    But not the way it's happening right now.
  • 10:20 - 10:24
    And it bears emphasizing,
    I do not want to kill off these companies.
  • 10:24 - 10:28
    I just want them held
    to a certain level of accountability,
  • 10:28 - 10:29
    just like the rest of society.
  • 10:31 - 10:35
    It is time for our governments
    to step up and do their jobs
  • 10:35 - 10:37
    of protecting our citizenry.
  • 10:37 - 10:40
    And while there isn't
    one magical piece of legislation
  • 10:40 - 10:41
    that will fix this all,
  • 10:41 - 10:46
    I do believe that governments
    can and must find the balance
  • 10:46 - 10:48
    between protecting free speech
  • 10:48 - 10:52
    and holding these platforms accountable
    for their effects on society.
  • 10:52 - 10:57
    And they could do so in part
    by insisting on actual transparency
  • 10:57 - 11:00
    around how these recommendation
    engines are working,
  • 11:00 - 11:05
    around how the curation, amplification
    and targeting are happening.
  • 11:05 - 11:07
    You see, I want these companies
    held accountable
  • 11:07 - 11:10
    not for whether an individual
    posts misinformation
  • 11:10 - 11:12
    or extreme rhetoric,
  • 11:12 - 11:15
    but for how their
    recommendation engines spread it,
  • 11:16 - 11:19
    how their algorithms
    are steering people towards it,
  • 11:19 - 11:22
    and how their tools are used
    to target people with it.
  • 11:23 - 11:27
    I tried to make change
    from within Facebook and failed,
  • 11:27 - 11:30
    and so I've been using my voice again
    for the past few years
  • 11:30 - 11:33
    to continue sounding this alarm
  • 11:33 - 11:37
    and hopefully inspire more people
    to demand this accountability.
  • 11:37 - 11:40
    My message to you is simple:
  • 11:40 - 11:42
    pressure your government representatives
  • 11:42 - 11:47
    to step up and stop ceding
    our public square to for-profit interests.
  • 11:47 - 11:49
    Help educate your friends and family
  • 11:49 - 11:52
    about how they're being
    manipulated online.
  • 11:52 - 11:56
    Push yourselves to engage
    with people who aren't like-minded.
  • 11:56 - 11:59
    Make this issue a priority.
  • 11:59 - 12:02
    We need a whole-society
    approach to fix this.
  • 12:03 - 12:09
    And my message to the leaders
    of my former employer Facebook is this:
  • 12:09 - 12:15
    right now, people are using your tools
    exactly as they were designed
  • 12:15 - 12:17
    to sow hatred, division and distrust,
  • 12:17 - 12:21
    and you're not just allowing it,
    you are enabling it.
  • 12:21 - 12:24
    And yes, there are lots of great stories
  • 12:24 - 12:28
    of positive things happening
    on your platform around the globe,
  • 12:28 - 12:31
    but that doesn't make any of this OK.
  • 12:31 - 12:34
    And it's only getting worse
    as we're heading into our election,
  • 12:34 - 12:36
    and even more concerning,
  • 12:36 - 12:38
    as we face our biggest potential crisis yet,
  • 12:38 - 12:42
    if the results aren't trusted,
    and if violence breaks out.
  • 12:42 - 12:47
    So when in 2021 you once again say,
    "We know we have to do better,"
  • 12:47 - 12:50
    I want you to remember this moment,
  • 12:50 - 12:53
    because it's no longer
    just a few outlier voices.
  • 12:53 - 12:56
    Civil rights leaders, academics,
  • 12:56 - 12:59
    journalists, advertisers,
    your own employees,
  • 12:59 - 13:01
    are shouting from the rooftops
  • 13:01 - 13:03
    that your policies
    and your business practices
  • 13:03 - 13:06
    are harming people and democracy.
  • 13:07 - 13:09
    You own your decisions,
  • 13:09 - 13:13
    but you can no longer say
    that you couldn't have seen it coming.
  • 13:14 - 13:15
    Thank you.
Title: Dear Facebook, this is how you're breaking democracy
Speaker: Yaël Eisenstat
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 13:30
