Inside the bizarre world of internet trolls and propagandists

  • 0:01 - 0:02
    I spent the past three years
  • 0:02 - 0:05
    talking to some of the worst
    people on the internet.
  • 0:06 - 0:09
    Now, if you've been online recently,
  • 0:09 - 0:12
    you may have noticed that there's
    a lot of toxic garbage out there:
  • 0:12 - 0:17
    racist memes, misogynist propaganda,
    viral misinformation.
  • 0:17 - 0:20
    So I wanted to know
    who was making this stuff.
  • 0:20 - 0:22
    I wanted to understand
    how they were spreading it.
  • 0:22 - 0:23
    Ultimately, I wanted to know
  • 0:23 - 0:26
    what kind of impact
    it might be having on our society.
  • 0:26 - 0:30
    So in 2016, I started tracing
    some of these memes back to their source,
  • 0:30 - 0:34
    back to the people who were making them
    or who were making them go viral.
  • 0:34 - 0:35
    I'd approach those people and say,
  • 0:35 - 0:38
    "Hey, I'm a journalist.
    Can I come watch you do what you do?"
  • 0:38 - 0:40
    Now, often the response would be,
  • 0:40 - 0:42
    "Why in hell would I want to talk to
  • 0:42 - 0:44
    some low-T soy-boy
    Brooklyn globalist Jew cuck
  • 0:44 - 0:46
    who's in cahoots with the Democrat Party?"
  • 0:46 - 0:47
    (Laughter)
  • 0:47 - 0:51
    To which my response would be,
    "Look, man, that's only 57 percent true."
  • 0:51 - 0:52
    (Laughter)
  • 0:53 - 0:54
    But often I got the opposite response.
  • 0:54 - 0:56
    "Yeah, sure, come on by."
  • 0:57 - 0:59
    So that's how I ended up
    in the living room
  • 0:59 - 1:02
    of a social media propagandist
    in Southern California.
  • 1:02 - 1:05
    He was a married white guy
    in his late 30s.
  • 1:05 - 1:08
    He had a table in front of him
    with a mug of coffee,
  • 1:08 - 1:10
    a laptop for tweeting,
  • 1:10 - 1:11
    a phone for texting
  • 1:11 - 1:15
    and an iPad for livestreaming
    to Periscope and YouTube.
  • 1:15 - 1:16
    That was it.
  • 1:17 - 1:18
    And yet, with those tools,
  • 1:18 - 1:22
    he was able to propel his fringe,
    noxious talking points
  • 1:22 - 1:24
    into the heart of
    the American conversation.
  • 1:25 - 1:27
    For example, one of the days I was there,
  • 1:27 - 1:30
    a bomb had just exploded in New York,
  • 1:30 - 1:34
    and the guy accused of planting the bomb
    had a Muslim-sounding name.
  • 1:34 - 1:38
    Now, to the propagandist in California,
    this seemed like an opportunity,
  • 1:38 - 1:40
    because one of the things he wanted
  • 1:40 - 1:42
    was for the US to cut off
    almost all immigration,
  • 1:42 - 1:45
    especially from Muslim-majority countries.
  • 1:45 - 1:47
    So he started livestreaming,
  • 1:48 - 1:50
    getting his followers
    worked up into a frenzy
  • 1:50 - 1:53
    about how the open borders agenda
    was going to kill us all
  • 1:53 - 1:54
    and asking them to tweet about this,
  • 1:54 - 1:56
    and use specific hashtags,
  • 1:56 - 1:58
    trying to get those hashtags trending.
  • 1:58 - 1:59
    And tweet they did --
  • 1:59 - 2:01
    hundreds and hundreds of tweets,
  • 2:01 - 2:03
    a lot of them featuring
    images like this one.
  • 2:03 - 2:05
    So that's George Soros.
  • 2:05 - 2:08
    He's a Hungarian billionaire
    and philanthropist,
  • 2:08 - 2:10
    and in the minds
    of some conspiracists online,
  • 2:10 - 2:13
    George Soros is like
    a globalist bogeyman,
  • 2:13 - 2:17
    one of a few elites who are secretly
    manipulating all of global affairs.
  • 2:17 - 2:21
    Now, just to pause here:
    if this idea sounds familiar to you,
  • 2:21 - 2:23
    that there are a few elites
    who control the world
  • 2:23 - 2:26
    and a lot of them happen to be rich Jews,
  • 2:26 - 2:29
    that's because it is one of the most
    anti-Semitic tropes in existence.
  • 2:30 - 2:34
    I should also mention that the guy
    in New York who planted that bomb,
  • 2:34 - 2:35
    he was an American citizen.
  • 2:36 - 2:38
    So whatever else was going on there,
  • 2:38 - 2:40
    immigration was not the main issue.
  • 2:41 - 2:44
    And the propagandist in California,
    he understood all this.
  • 2:44 - 2:47
    He was a well-read guy.
    He was actually a lawyer.
  • 2:47 - 2:48
    He knew the underlying facts,
  • 2:48 - 2:52
    but he also knew that facts
    do not drive conversation online.
  • 2:52 - 2:53
    What drives conversation online
  • 2:53 - 2:55
    is emotion.
  • 2:55 - 2:57
    See, the original premise of social media
  • 2:57 - 3:00
    was that it was going
    to bring us all together,
  • 3:00 - 3:02
    make the world more open
    and tolerant and fair ...
  • 3:02 - 3:03
    And it did some of that.
  • 3:05 - 3:07
    But the social media algorithms
    have never been built
  • 3:07 - 3:10
    to distinguish between
    what's true or false,
  • 3:10 - 3:14
    what's good or bad for society,
    what's prosocial and what's antisocial.
  • 3:14 - 3:16
    That's just not what those algorithms do.
  • 3:16 - 3:19
    A lot of what they do
    is measure engagement:
  • 3:19 - 3:22
    clicks, comments, shares,
    retweets, that kind of thing.
  • 3:22 - 3:25
    And if you want your content
    to get engagement,
  • 3:25 - 3:26
    it has to spark emotion,
  • 3:26 - 3:30
    specifically, what behavioral scientists
    call "high-arousal emotion."
  • 3:30 - 3:33
    Now, "high arousal" doesn't only
    mean sexual arousal,
  • 3:33 - 3:36
    although it's the internet,
    obviously that works.
  • 3:36 - 3:40
    It means anything, positive or negative,
    that gets people's hearts pumping.
  • 3:40 - 3:42
    So I would sit with these propagandists,
  • 3:42 - 3:44
    not just the guy in California,
    but dozens of them,
  • 3:44 - 3:48
    and I would watch as they did this
    again and again successfully,
  • 3:48 - 3:51
    not because they were Russian hackers,
    not because they were tech prodigies,
  • 3:51 - 3:54
    not because they had
    unique political insights --
  • 3:54 - 3:56
    just because they understood
    how social media worked,
  • 3:56 - 3:59
    and they were willing
    to exploit it to their advantage.
  • 3:59 - 4:02
    Now, at first I was able to tell myself
    this was a fringe phenomenon,
  • 4:02 - 4:04
    something that was
    relegated to the internet.
  • 4:04 - 4:09
    But there's really no separation anymore
    between the internet and everything else.
  • 4:09 - 4:11
    This is an ad that ran
    on multiple TV stations
  • 4:11 - 4:14
    during the 2018 congressional elections,
  • 4:14 - 4:16
    alleging with very little evidence
    that one of the candidates
  • 4:17 - 4:19
    was in the pocket of
    international manipulator George Soros,
  • 4:19 - 4:23
    who is awkwardly photoshopped here
    next to stacks of cash.
  • 4:23 - 4:26
    This is a tweet from
    the President of the United States,
  • 4:26 - 4:28
    alleging, again with no evidence,
  • 4:28 - 4:31
    that American politics is being
    manipulated by George Soros.
  • 4:31 - 4:35
    This stuff that once seemed so shocking
    and marginal and, frankly, just ignorable,
  • 4:35 - 4:38
    it's now so normalized
    that we hardly even notice it.
  • 4:38 - 4:40
    So I spent about
    three years in this world.
  • 4:40 - 4:42
    I talked to a lot of people.
  • 4:42 - 4:44
    Some of them seemed to have
    no core beliefs at all.
  • 4:44 - 4:47
    They just seemed to be betting,
    perfectly rationally,
  • 4:47 - 4:49
    that if they wanted
    to make some money online
  • 4:49 - 4:51
    or get some attention online,
  • 4:51 - 4:53
    they should just be
    as outrageous as possible.
  • 4:53 - 4:55
    But I talked to other people
    who were true ideologues.
  • 4:56 - 5:00
    And to be clear, their ideology
    was not traditional conservatism.
  • 5:00 - 5:03
    These were people who wanted
    to revoke female suffrage.
  • 5:03 - 5:06
    These were people who wanted
    to go back to racial segregation.
  • 5:06 - 5:09
    Some of them wanted to do away
    with democracy altogether.
  • 5:09 - 5:12
    Now, obviously these people
    were not born believing these things.
  • 5:12 - 5:15
    They didn't pick them up
    in elementary school.
  • 5:15 - 5:18
    A lot of them, before they went
    down some internet rabbit hole,
  • 5:18 - 5:20
    they had been libertarian
    or they had been socialist
  • 5:20 - 5:23
    or they had been something else entirely.
  • 5:23 - 5:24
    So what was going on?
  • 5:25 - 5:27
    Well, I can't generalize about every case,
  • 5:27 - 5:29
    but a lot of the people I spoke to,
  • 5:29 - 5:33
    they seemed to have a combination
    of a high IQ and a low EQ.
  • 5:33 - 5:36
    They seemed to take comfort
    in anonymous, online spaces
  • 5:36 - 5:38
    rather than connecting in the real world.
  • 5:39 - 5:41
    So often they would retreat
    to these message boards
  • 5:41 - 5:42
    or these subreddits,
  • 5:42 - 5:45
    where their worst impulses
    would be magnified.
  • 5:45 - 5:48
    They might start out saying
    something just as a sick joke,
  • 5:48 - 5:51
    and then they would get so much
    positive reinforcement for that joke,
  • 5:51 - 5:54
    so many meaningless
    "internet points," as they called it,
  • 5:54 - 5:57
    that they might start
    believing their own joke.
  • 5:58 - 6:02
    I talked a lot with one young woman
    who grew up in New Jersey,
  • 6:02 - 6:04
    and then after high school,
    she moved to a new place
  • 6:04 - 6:06
    and suddenly she just felt
    alienated and cut off
  • 6:06 - 6:08
    and started retreating into her phone.
  • 6:09 - 6:11
    She found some of these
    spaces on the internet
  • 6:11 - 6:14
    where people would post
    the most shocking, heinous things.
  • 6:14 - 6:16
    And she found this stuff
    really off-putting
  • 6:16 - 6:18
    but also kind of engrossing,
  • 6:19 - 6:21
    kind of like she couldn't
    look away from it.
  • 6:21 - 6:24
    She started interacting with people
    in these online spaces,
  • 6:24 - 6:27
    and they made her feel smart,
    they made her feel validated.
  • 6:27 - 6:29
    She started feeling a sense of community,
  • 6:29 - 6:32
    started wondering if maybe
    some of these shocking memes
  • 6:32 - 6:34
    might actually contain a kernel of truth.
  • 6:34 - 6:38
    A few months later, she was in a car
    with some of her new internet friends
  • 6:38 - 6:40
    headed to Charlottesville, Virginia,
  • 6:40 - 6:42
    to march with torches
    in the name of the white race.
  • 6:43 - 6:45
    She'd gone, in a few months,
    from Obama supporter
  • 6:45 - 6:48
    to fully radicalized white supremacist.
  • 6:49 - 6:51
    Now, in her particular case,
  • 6:51 - 6:55
    she actually was able to find her way
    out of the cult of white supremacy.
  • 6:56 - 6:58
    But a lot of the people
    I spoke to were not.
  • 6:59 - 7:00
    And just to be clear:
  • 7:00 - 7:03
    I was never so convinced
    that I had to find common ground
  • 7:03 - 7:05
    with every single person I spoke to
  • 7:05 - 7:07
    that I was willing to say,
  • 7:07 - 7:10
    "You know what, man,
    you're a fascist propagandist, I'm not,
  • 7:10 - 7:13
    whatever, let's just hug it out,
    all our differences will melt away."
  • 7:13 - 7:15
    No, absolutely not.
  • 7:16 - 7:20
    But I did become convinced that we cannot
    just look away from this stuff.
  • 7:20 - 7:23
    We have to try to understand it,
    because only by understanding it
  • 7:23 - 7:26
    can we even start to inoculate
    ourselves against it.
  • 7:27 - 7:31
    In my three years in this world,
    I got a few nasty phone calls,
  • 7:31 - 7:32
    even some threats,
  • 7:32 - 7:36
    but it wasn't a fraction of what
    female journalists get on this beat.
  • 7:37 - 7:38
    And yeah, I am Jewish,
  • 7:38 - 7:42
    although, weirdly, a lot of the Nazis
    couldn't tell I was Jewish,
  • 7:42 - 7:45
    which I frankly just found
    kind of disappointing.
  • 7:45 - 7:47
    (Laughter)
  • 7:47 - 7:51
    Seriously, like, your whole job
    is being a professional anti-Semite.
  • 7:51 - 7:54
    Nothing about me
    is tipping you off at all?
  • 7:54 - 7:55
    Nothing?
  • 7:55 - 7:57
    (Laughter)
  • 7:58 - 7:59
    This is not a secret.
  • 7:59 - 8:02
    My name is Andrew Marantz,
    I write for "The New Yorker,"
  • 8:02 - 8:04
    my personality type
    is like if a Seinfeld episode
  • 8:04 - 8:06
    was taped at the Park Slope Food Coop.
  • 8:06 - 8:07
    Nothing?
  • 8:08 - 8:10
    (Laughter)
  • 8:13 - 8:15
    Anyway, look -- ultimately,
    it would be nice
  • 8:15 - 8:17
    if there were, like, a simple formula:
  • 8:17 - 8:21
    smartphone plus alienated kid
    equals 12 percent chance of Nazi.
  • 8:22 - 8:24
    It's obviously not that simple.
  • 8:24 - 8:25
    And in my writing,
  • 8:25 - 8:29
    I'm much more comfortable
    being descriptive, not prescriptive.
  • 8:29 - 8:32
    But this is TED,
  • 8:32 - 8:34
    so let's get practical.
  • 8:34 - 8:35
    I want to share a few suggestions
  • 8:35 - 8:39
    of things that citizens
    of the internet like you and I
  • 8:39 - 8:42
    might be able to do to make things
    a little bit less toxic.
  • 8:43 - 8:45
    So the first one is to be a smart skeptic.
  • 8:46 - 8:48
    So, I think there are
    two kinds of skepticism.
  • 8:48 - 8:52
    And I don't want to drown you in technical
    epistemological information here,
  • 8:52 - 8:55
    but I call them smart and dumb skepticism.
  • 8:56 - 8:59
    So, smart skepticism:
  • 8:59 - 9:00
    thinking for yourself,
  • 9:00 - 9:01
    questioning every claim,
  • 9:01 - 9:03
    demanding evidence --
  • 9:03 - 9:04
    great, that's real skepticism.
  • 9:05 - 9:08
    Dumb skepticism:
    it sounds like skepticism,
  • 9:08 - 9:11
    but it's actually closer
    to knee-jerk contrarianism.
  • 9:12 - 9:13
    Everyone says the earth is round,
  • 9:13 - 9:15
    you say it's flat.
  • 9:15 - 9:16
    Everyone says racism is bad,
  • 9:16 - 9:19
    you say, "I dunno,
    I'm skeptical about that."
  • 9:20 - 9:24
    I cannot tell you how many young white men
    I have spoken to in the last few years
  • 9:24 - 9:25
    who have said,
  • 9:25 - 9:28
    "You know, the media, my teachers,
    they're all trying to indoctrinate me
  • 9:28 - 9:31
    into believing in male privilege
    and white privilege,
  • 9:31 - 9:33
    but I don't know about that,
    man, I don't think so."
  • 9:33 - 9:37
    Guys -- contrarian
    white teens of the world --
  • 9:37 - 9:39
    look:
  • 9:39 - 9:42
    if you are being a round earth skeptic
    and a male privilege skeptic
  • 9:42 - 9:45
    and a racism is bad skeptic,
  • 9:45 - 9:47
    you're not being a skeptic,
    you're being a jerk.
  • 9:47 - 9:51
    (Applause)
  • 9:52 - 9:56
    It's great to be independent-minded,
    we all should be independent-minded,
  • 9:56 - 9:57
    but just be smart about it.
  • 9:58 - 10:00
    So this next one is about free speech.
  • 10:00 - 10:04
    You will hear smart, accomplished people
    who will say, "Well, I'm pro-free speech,"
  • 10:04 - 10:07
    and they say it in this way,
    as if they're settling a debate,
  • 10:07 - 10:11
    when actually, that is the very beginning
    of any meaningful conversation.
  • 10:12 - 10:14
    All the interesting stuff
    happens after that point.
  • 10:14 - 10:16
    OK, you're pro-free speech.
    What does that mean?
  • 10:16 - 10:19
    Does it mean that David Duke
    and Richard Spencer
  • 10:19 - 10:21
    need to have active Twitter accounts?
  • 10:21 - 10:24
    Does it mean that anyone
    can harass anyone else online
  • 10:24 - 10:25
    for any reason?
  • 10:25 - 10:28
    You know, I looked through
    the entire list of TED speakers this year.
  • 10:28 - 10:30
    I didn't find a single
    round earth skeptic.
  • 10:30 - 10:33
    Is that a violation of free speech norms?
  • 10:33 - 10:37
    Look, we're all pro-free speech,
    it's wonderful to be pro-free speech,
  • 10:37 - 10:39
    but if that's all you know
    how to say again and again,
  • 10:39 - 10:42
    you're standing in the way
    of a more productive conversation.
  • 10:44 - 10:47
    Making decency cool again, so ...
  • 10:47 - 10:48
    Great!
  • 10:48 - 10:50
    (Applause)
  • 10:50 - 10:52
    Yeah. I don't even need to explain it.
  • 10:52 - 10:56
    So in my research, I would go
    to Reddit or YouTube or Facebook,
  • 10:56 - 10:58
    and I would search for "sharia law"
  • 10:58 - 11:00
    or I would search for "the Holocaust,"
  • 11:01 - 11:04
    and you might be able to guess
    what the algorithms showed me, right?
  • 11:04 - 11:07
    "Is sharia law sweeping
    across the United States?"
  • 11:07 - 11:09
    "Did the Holocaust really happen?"
  • 11:10 - 11:12
    Dumb skepticism.
  • 11:13 - 11:15
    So we've ended up in this
    bizarre dynamic online,
  • 11:15 - 11:17
    where some people see bigoted propaganda
  • 11:17 - 11:20
    as being edgy or being dangerous and cool,
  • 11:20 - 11:23
    and people see basic truth
    and human decency as pearl-clutching
  • 11:23 - 11:26
    or virtue-signaling or just boring.
  • 11:26 - 11:30
    And the social media algorithms,
    whether intentionally or not,
  • 11:30 - 11:32
    they have incentivized this,
  • 11:32 - 11:35
    because bigoted propaganda
    is great for engagement.
  • 11:35 - 11:37
    Everyone clicks on it,
    everyone comments on it,
  • 11:37 - 11:39
    whether they love it or they hate it.
  • 11:39 - 11:42
    So the number one thing
    that has to happen here
  • 11:42 - 11:45
    is social networks need
    to fix their platforms.
  • 11:45 - 11:49
    (Applause)
  • 11:50 - 11:53
    So if you're listening to my voice
    and you work at a social media company
  • 11:53 - 11:56
    or you invest in one
    or, I don't know, own one,
  • 11:57 - 11:58
    this tip is for you.
  • 11:58 - 12:02
    If you have been optimizing
    for maximum emotional engagement
  • 12:02 - 12:06
    and maximum emotional engagement turns out
    to be actively harming the world,
  • 12:06 - 12:09
    it's time to optimize for something else.
  • 12:09 - 12:12
    (Applause)
  • 12:15 - 12:18
    But in addition to putting pressure
    on them to do that
  • 12:18 - 12:21
    and waiting for them
    and hoping that they'll do that,
  • 12:21 - 12:24
    there's some stuff that
    the rest of us can do, too.
  • 12:24 - 12:28
    So, we can create some better pathways
    or suggest some better pathways
  • 12:28 - 12:30
    for angsty teens to go down.
  • 12:30 - 12:33
    If you see something that you think
    is really creative and thoughtful
  • 12:33 - 12:36
    and you want to share that thing,
    you can share that thing,
  • 12:36 - 12:39
    even if it's not flooding you
    with high arousal emotion.
  • 12:39 - 12:41
    Now that is a very small step, I realize,
  • 12:41 - 12:43
    but in the aggregate,
    this stuff does matter,
  • 12:43 - 12:46
    because these algorithms,
    as powerful as they are,
  • 12:46 - 12:48
    they are taking their
    behavioral cues from us.
  • 12:50 - 12:51
    So let me leave you with this.
  • 12:52 - 12:54
    You know, a few years ago
    it was really fashionable
  • 12:54 - 12:57
    to say that the internet
    was a revolutionary tool
  • 12:57 - 12:59
    that was going to bring us all together.
  • 12:59 - 13:01
    It's now more fashionable to say
  • 13:01 - 13:04
    that the internet is a huge,
    irredeemable dumpster fire.
  • 13:05 - 13:07
    Neither caricature is really true.
  • 13:07 - 13:09
    We know the internet
    is just too vast and complex
  • 13:09 - 13:11
    to be all good or all bad.
  • 13:11 - 13:13
    And the danger with
    these ways of thinking,
  • 13:13 - 13:16
    whether it's the utopian view
    that the internet will inevitably save us
  • 13:16 - 13:20
    or the dystopian view that it
    will inevitably destroy us,
  • 13:20 - 13:22
    either way, we're letting
    ourselves off the hook.
  • 13:24 - 13:26
    There is nothing inevitable
    about our future.
  • 13:27 - 13:29
    The internet is made of people.
  • 13:29 - 13:32
    People make decisions
    at social media companies.
  • 13:32 - 13:35
    People make hashtags trend or not trend.
  • 13:35 - 13:38
    People make societies progress or regress.
  • 13:39 - 13:41
    When we internalize that fact,
  • 13:41 - 13:44
    we can stop waiting
    for some inevitable future to arrive
  • 13:44 - 13:45
    and actually get to work now.
  • 13:47 - 13:50
    You know, we've all been taught
    that the arc of the moral universe is long
  • 13:50 - 13:52
    but that it bends toward justice.
  • 13:54 - 13:56
    Maybe.
  • 13:57 - 13:58
    Maybe it will.
  • 13:59 - 14:01
    But that has always been an aspiration.
  • 14:01 - 14:03
    It is not a guarantee.
  • 14:04 - 14:06
    The arc doesn't bend itself.
  • 14:06 - 14:10
    It's not bent inevitably
    by some mysterious force.
  • 14:10 - 14:11
    The real truth,
  • 14:11 - 14:14
    which is scarier and also more liberating,
  • 14:15 - 14:16
    is that we bend it.
  • 14:17 - 14:18
    Thank you.
  • 14:18 - 14:21
    (Applause)
Title:
Inside the bizarre world of internet trolls and propagandists
Speaker:
Andrew Marantz
Video Language:
English
Project:
TEDTalks
Duration:
14:36
