
ADL International Leadership Award Presented to Sacha Baron Cohen at Never Is Now 2019

  • 0:02 - 0:04
    Thank you very much.
    Thanks to everyone who stood up.
  • 0:05 - 0:07
    Unnecessary but very flattering.
  • 0:07 - 0:10
    Thank you, Jonathan,
    for your very kind words.
  • 0:10 - 0:12
    Thank you, the Anti-Defamation League,
  • 0:12 - 0:15
    for this recognition
  • 0:15 - 0:18
    and your work in fighting racism,
    hate and bigotry.
  • 0:19 - 0:20
    And to be clear,
  • 0:20 - 0:22
    when I say racism, hate, and bigotry
  • 0:23 - 0:26
    I'm not referring to the names
    of Stephen Miller's labradoodles.
  • 0:27 - 0:28
    [audience laughs]
  • 0:30 - 0:34
    Now, I realize that
    some of you may be thinking:
  • 0:34 - 0:38
    "What the hell is a comedian doing
    speaking at a conference like this?"
  • 0:39 - 0:40
    I certainly am.
  • 0:40 - 0:41
    [audience laughs]
  • 0:41 - 0:45
    I've spent most of the past two decades
    in character.
  • 0:45 - 0:50
    In fact, this is the first ever time
    that I've stood up and given a speech
  • 0:50 - 0:53
    as my least popular character:
    Sacha Baron Cohen.
  • 0:53 - 0:54
    [audience laughs]
  • 0:54 - 0:56
    [audience cheers]
  • 0:57 - 1:01
    And, I have to confess, it is terrifying.
  • 1:03 - 1:07
    I realize that my presence here
    may also be unexpected for another reason:
  • 1:08 - 1:11
    At times, some critics
    have said my comedy
  • 1:11 - 1:14
    risks reinforcing old stereotypes.
  • 1:14 - 1:19
    The truth is, I've been passionate about
    challenging bigotry and intolerance
  • 1:19 - 1:20
    throughout my life.
  • 1:21 - 1:22
    As a teenager in England,
  • 1:22 - 1:25
    I marched against
    the fascist National Front
  • 1:25 - 1:27
    and to abolish apartheid.
  • 1:27 - 1:30
    As an undergraduate,
    I traveled around America
  • 1:30 - 1:33
    and wrote my thesis
    about the Civil Rights Movement
  • 1:33 - 1:35
    with the help of the archives of the ADL;
  • 1:36 - 1:39
    and as a comedian,
    I've tried to use my characters
  • 1:39 - 1:42
    to get people to let down their guard
  • 1:42 - 1:44
    and reveal what they actually believe,
  • 1:44 - 1:46
    including their own prejudice.
  • 1:46 - 1:49
    Now, I'm not gonna claim
    everything I've done
  • 1:49 - 1:50
    has been for a higher purpose.
  • 1:51 - 1:53
    Yes, some of my comedy –
  • 1:53 - 1:54
    okay, probably half my comedy –
  • 1:54 - 1:56
    has been absolutely juvenile.
  • 1:56 - 1:57
    [audience laughs]
  • 1:57 - 2:00
    And the other half completely puerile.
  • 2:00 - 2:02
    But...
    [audience laughs]
  • 2:02 - 2:05
    I admit there was nothing
    particularly enlightening about me
  • 2:05 - 2:07
    as Borat from Kazakhstan,
  • 2:07 - 2:09
    the first 'fake news' journalist,
  • 2:09 - 2:10
    [audience laughs]
  • 2:10 - 2:13
    running through a conference
    of mortgage brokers
  • 2:13 - 2:14
    while I was completely naked.
  • 2:14 - 2:15
    [audience laughs]
  • 2:16 - 2:20
    But when Borat was able to get
    an entire bar in Arizona to sing
  • 2:20 - 2:22
    ♪ throw the Jew down the well ♪
  • 2:22 - 2:23
    [audience laughs]
  • 2:23 - 2:27
    it did reveal people's indifference
    to anti-Semitism.
  • 2:27 - 2:31
    When, as Bruno, the gay
    fashion reporter from Austria,
  • 2:32 - 2:35
    I started kissing a man
    in a cage fight in Arkansas,
  • 2:35 - 2:37
    nearly starting a riot,
  • 2:37 - 2:40
    it showed the violent potential
    of homophobia.
  • 2:41 - 2:44
    And, when disguised
    as an ultra-woke developer,
  • 2:44 - 2:48
    I proposed building a mosque
    in one rural community,
  • 2:48 - 2:51
    prompting a resident to proudly admit:
  • 2:51 - 2:54
    "I am racist against Muslims,"
  • 2:54 - 2:57
    it showed the growing acceptance
    of Islamophobia.
  • 2:58 - 3:02
    That's why I really appreciate
    the opportunity to be here with you.
  • 3:02 - 3:04
    Today, around the world,
  • 3:04 - 3:07
    demagogues appeal
    to our worst instincts.
  • 3:07 - 3:11
    Conspiracy theories,
    once confined to the fringe,
  • 3:11 - 3:12
    are going mainstream.
  • 3:13 - 3:18
    It's as if the age of reason,
    the era of evidential argument
  • 3:18 - 3:19
    is ending,
  • 3:19 - 3:22
    and now knowledge
    is increasingly delegitimized
  • 3:22 - 3:25
    and scientific consensus is dismissed.
  • 3:26 - 3:29
    Democracy,
    which depends on shared truths,
  • 3:30 - 3:31
    is in retreat,
  • 3:31 - 3:34
    and autocracy,
    which depends on shared lies,
  • 3:34 - 3:36
    is on the march.
  • 3:36 - 3:38
    Hate crimes are surging,
  • 3:38 - 3:41
    as are murderous attacks
    on religious and ethnic minorities.
  • 3:42 - 3:45
    Now, what do all these
    dangerous trends have in common?
  • 3:46 - 3:47
    I'm just a comedian and an actor,
  • 3:47 - 3:51
    I'm not a scholar,
    but one thing is pretty clear to me:
  • 3:52 - 3:55
    All this hate and violence
    is being facilitated
  • 3:56 - 3:58
    by a handful of Internet companies
  • 3:58 - 4:02
    that amount to the greatest
    propaganda machine in history.
  • 4:02 - 4:04
    [audience applauds]
  • 4:07 - 4:09
    The greatest
    propaganda machine in history.
  • 4:09 - 4:12
    Let's think about it.
    Facebook, YouTube...
  • 4:12 - 4:14
    ...Google, Twitter and others –
  • 4:14 - 4:16
    they reach billions of people.
  • 4:16 - 4:18
    The algorithms
    these platforms depend on
  • 4:19 - 4:20
    deliberately amplify
  • 4:21 - 4:23
    the type of content
    that keeps users engaged;
  • 4:24 - 4:27
    stories that appeal
    to our baser instincts,
  • 4:27 - 4:29
    and that trigger outrage and fear.
  • 4:30 - 4:35
    It's why YouTube recommended videos
    by the conspiracist Alex Jones
  • 4:35 - 4:37
    billions of times.
  • 4:37 - 4:40
    It's why fake news outperforms real news,
  • 4:41 - 4:44
    because studies show that
    lies spread faster than truth.
  • 4:45 - 4:48
    And it's no surprise that the greatest
    propaganda machine in history
  • 4:49 - 4:52
    has spread the oldest
    conspiracy theory in history:
  • 4:52 - 4:55
    the lie that Jews are somehow dangerous.
  • 4:56 - 4:58
    As one headline put it:
  • 4:58 - 5:01
    "Just think what Goebbels
    could have done with Facebook."
  • 5:01 - 5:03
    [audience groans]
  • 5:03 - 5:04
    On the internet,
  • 5:04 - 5:07
    everything can appear equally legitimate.
  • 5:07 - 5:10
    Breitbart resembles the BBC,
  • 5:10 - 5:13
    the fictitious Protocols
    of the Elders of Zion
  • 5:13 - 5:16
    look as valid as an ADL report,
  • 5:16 - 5:19
    and the rantings of a lunatic
    seem as credible
  • 5:19 - 5:21
    as the findings of a Nobel Prize winner.
  • 5:22 - 5:26
    We have lost, it seems,
    a shared sense of basic facts
  • 5:26 - 5:29
    upon which democracy depends.
  • 5:29 - 5:32
    When I, as the wannabe-gangsta Ali G,
  • 5:32 - 5:35
    asked the astronaut Buzz Aldrin:
  • 5:35 - 5:37
    "what woz it like to walk on de sun??"
  • 5:38 - 5:40
    [audience laughs]
  • 5:42 - 5:47
    the joke worked because we,
    the audience, shared the same facts.
  • 5:47 - 5:50
    If you believe the moon landing was a hoax
  • 5:50 - 5:52
    the joke doesn't work.
  • 5:52 - 5:56
    When Borat got that bar
    in Arizona to agree that,
  • 5:56 - 6:01
    ♪ Jews control everybody's money
    and they never give it back ♪
  • 6:02 - 6:05
    the joke worked because
    the audience shared the fact
  • 6:05 - 6:08
    that the depiction of Jews as miserly
  • 6:09 - 6:13
    is a conspiracy theory
    originating in the Middle Ages.
  • 6:13 - 6:17
    But when, thanks to social media,
    conspiracies take hold,
  • 6:17 - 6:20
    it is easier for hate groups to recruit,
  • 6:20 - 6:23
    easier for foreign intelligence agencies
  • 6:23 - 6:25
    to interfere in our elections,
  • 6:25 - 6:32
    and easier for a country like Myanmar
    to commit genocide against the Rohingya.
  • 6:33 - 6:36
    [audience applauds]
  • 6:39 - 6:41
    Now, it's actually quite shocking
  • 6:41 - 6:45
    how easy it is to turn
    conspiracy thinking into violence.
  • 6:45 - 6:47
    In my last show, "Who is America?"
  • 6:47 - 6:51
    I found an educated, normal guy
    who'd held down a good job,
  • 6:51 - 6:53
    but who, on social media,
  • 6:53 - 6:56
    repeated many of the conspiracy theories
  • 6:56 - 6:58
    that President Trump, using Twitter,
  • 6:59 - 7:05
    has spread more than 1700 times
    to his 67 million Twitter followers.
  • 7:05 - 7:10
    The president even tweeted that
    he was considering designating Antifa
  • 7:11 - 7:13
    who are anti-fascists
    who march against the far-right,
  • 7:14 - 7:16
    as a terror organization.
  • 7:16 - 7:21
    So, disguised as an Israeli
    anti-terrorism expert,
  • 7:21 - 7:23
    Colonel Erran Morad,
  • 7:23 - 7:24
    [audience laughs]
  • 7:25 - 7:27
    [in a foreign accent]
    "Yalla. Let's go."
  • 7:27 - 7:28
    [audience laughs]
  • 7:28 - 7:30
    Disguised as him,
    I told my interviewee
  • 7:30 - 7:32
    that at the Women's March in San Francisco
  • 7:33 - 7:38
    Antifa were plotting to put hormones
    into babies' diapers in order to
  • 7:38 - 7:41
    [in a foreign accent]
    "make them transgender,"
  • 7:41 - 7:42
    [audience laughs]
  • 7:42 - 7:43
    And this man believed it.
  • 7:44 - 7:49
    I instructed him to plant small devices
    on three innocent people at the march,
  • 7:49 - 7:51
    and explained that,
    when he pushed a button,
  • 7:51 - 7:53
    he'd trigger an explosion
    that would kill them all.
  • 7:54 - 7:57
    They weren't real explosives,
    of course, but he thought they were.
  • 7:57 - 8:00
    I wanted to see, would he actually do it?
  • 8:00 - 8:02
    The answer was yes.
  • 8:03 - 8:07
    He pushed the button and thought
    he had actually killed three human beings.
  • 8:07 - 8:10
    Voltaire was right when he said,
  • 8:10 - 8:14
    "Those who can make you
    believe absurdities
  • 8:14 - 8:17
    "can make you commit atrocities."
  • 8:18 - 8:24
    And social media lets authoritarians
    push absurdities to billions of people.
  • 8:25 - 8:26
    In their defense,
  • 8:26 - 8:30
    these social media companies
    have taken some steps to reduce hate
  • 8:30 - 8:32
    and conspiracies on their platforms,
  • 8:32 - 8:35
    but these steps
    have been mainly superficial.
  • 8:35 - 8:37
    And I'm talking about this today
  • 8:37 - 8:40
    because I believe that
    our pluralistic democracies
  • 8:41 - 8:44
    are on a precipice
    and that the next 12 months,
  • 8:44 - 8:47
    and the role of social media
    could be determinant.
  • 8:48 - 8:50
    British voters will go to the polls
  • 8:50 - 8:52
    while online conspiracists
  • 8:52 - 8:56
    promote the despicable theory
    of the "great replacement,"
  • 8:56 - 8:59
    that white Christians
    are being deliberately replaced
  • 8:59 - 9:01
    by Muslim immigrants.
  • 9:02 - 9:04
    Americans will vote for president
  • 9:04 - 9:11
    while trolls and bots perpetuate
    the disgusting lie of a Hispanic invasion.
  • 9:12 - 9:16
    And after years of YouTube videos
    calling climate change a hoax,
  • 9:16 - 9:19
    the United States is on track,
    a year from now,
  • 9:20 - 9:23
    to formally withdraw
    from the Paris Accords.
  • 9:23 - 9:27
    A sewer of bigotry
    and vile conspiracy theories
  • 9:27 - 9:31
    that threaten our democracy,
    and to some degree, our planet.
  • 9:32 - 9:36
    This can't possibly be what
    the creators of the internet had in mind.
  • 9:36 - 9:41
    I believe that it's time for
    a fundamental rethink of social media
  • 9:41 - 9:44
    and how it spreads
    hate, conspiracies and lies.
  • 9:45 - 9:48
    [audience cheers, applauds]
  • 9:53 - 9:58
    Last month, however,
    Mark Zuckerberg of Facebook
  • 9:58 - 10:00
    delivered a major speech
  • 10:00 - 10:04
    that, not surprisingly,
    warned against new laws and regulations
  • 10:04 - 10:06
    on companies like his.
  • 10:06 - 10:10
    Well, some of these arguments
    are simply...
  • 10:10 - 10:12
    ...pardon my French, bullshit.
  • 10:13 - 10:14
    Let's count the ways.
  • 10:14 - 10:18
    First, Zuckerberg
    tried to portray this whole issue as
  • 10:18 - 10:21
    "choices around free expression."
  • 10:21 - 10:23
    That is ludicrous.
  • 10:23 - 10:27
    This is not about
    limiting anyone's free speech.
  • 10:27 - 10:28
    This is about giving people –
  • 10:28 - 10:32
    including some of the most
    reprehensible people on earth –
  • 10:32 - 10:36
    the biggest platform in history
    to reach a third of the planet.
  • 10:36 - 10:40
    Freedom of speech
    is not freedom of reach.
  • 10:40 - 10:42
    Sadly, there will always be racists,
  • 10:42 - 10:45
    misogynists,
    anti-semites and child abusers,
  • 10:45 - 10:47
    but I think we can all agree
  • 10:48 - 10:50
    that we shouldn't be
    giving bigots and pedophiles
  • 10:51 - 10:55
    a free platform to amplify their views
    and target their victims.
  • 10:55 - 10:58
    [audience applauds]
  • 11:01 - 11:04
    Second, Mark Zuckerberg
    claimed that new limits
  • 11:04 - 11:06
    on what's posted on social media
  • 11:06 - 11:09
    would be
    "to pull back on free expression."
  • 11:10 - 11:11
    This is utter nonsense.
  • 11:12 - 11:15
    The First Amendment
    says that, and I quote:
  • 11:15 - 11:18
    "Congress shall make no law...
    abridging freedom of speech."
  • 11:18 - 11:23
    However, this does not apply
    to private businesses like Facebook.
  • 11:23 - 11:27
    We're not asking these companies
    to determine the boundaries of free speech
  • 11:27 - 11:28
    across society,
  • 11:29 - 11:32
    we just want them
    to be responsible on their platforms.
  • 11:33 - 11:36
    If a neo-Nazi comes
    goose-stepping into a restaurant
  • 11:36 - 11:39
    and starts threatening other customers
    and says he wants to kill Jews,
  • 11:40 - 11:42
    would the owner of the restaurant,
    a private business,
  • 11:43 - 11:47
    be required to serve him
    an elegant 8-course meal?
  • 11:47 - 11:49
    Of course not!
  • 11:49 - 11:52
    The restaurant owner
    has every legal right –
  • 11:52 - 11:55
    and indeed, I would argue,
    a moral obligation –
  • 11:55 - 11:56
    to kick that Nazi out.
  • 11:57 - 11:59
    And so do these internet companies.
  • 11:59 - 12:01
    [audience applauds]
  • 12:05 - 12:06
    Now third.
  • 12:06 - 12:10
    Mark Zuckerberg seemed to equate
    regulation of companies like his
  • 12:10 - 12:14
    to the actions of
    "the most repressive societies."
  • 12:15 - 12:16
    Incredible.
  • 12:16 - 12:21
    This from one of the six people
    who decide what information
  • 12:21 - 12:23
    so much of the world sees:
  • 12:24 - 12:26
    Zuckerberg at Facebook,
  • 12:26 - 12:28
    Sundar Pichai at Google,
  • 12:28 - 12:32
    at its parent company Alphabet,
    Larry Page and Sergey Brin,
  • 12:32 - 12:36
    Brin’s ex-sister-in-law,
    Susan Wojcicki at YouTube,
  • 12:36 - 12:38
    and Jack Dorsey at Twitter.
  • 12:38 - 12:40
    The Silicon Six.
  • 12:41 - 12:43
    All billionaires, all Americans,
  • 12:43 - 12:46
    who care more
    about boosting their share price
  • 12:47 - 12:49
    than about protecting democracy.
  • 12:49 - 12:51
    [audience applauds]
  • 12:54 - 12:57
    This is ideological imperialism.
  • 12:58 - 13:01
    Six unelected individuals
    in Silicon Valley
  • 13:01 - 13:04
    imposing their vision
    on the rest of the world,
  • 13:04 - 13:07
    unaccountable to any government
  • 13:07 - 13:10
    and acting like
    they're above the reach of law.
  • 13:10 - 13:12
    It's like we're living
    in the Roman Empire,
  • 13:12 - 13:14
    and Mark Zuckerberg is Caesar.
  • 13:15 - 13:17
    At least that would explain his haircut.
  • 13:17 - 13:19
    [audience laughs]
  • 13:19 - 13:21
    Now here's an idea.
  • 13:22 - 13:26
    Instead of letting the Silicon Six
    decide the fate of the world,
  • 13:27 - 13:29
    let our elected representatives,
  • 13:30 - 13:34
    voted for by the people
    of every democracy in the world,
  • 13:34 - 13:36
    have at least some say.
  • 13:37 - 13:39
    Fourth, Zuckerberg
    speaks of welcoming
  • 13:39 - 13:42
    "a diversity of ideas,"
  • 13:42 - 13:44
    and last year he gave us an example.
  • 13:44 - 13:48
    He said that he found
    posts denying the Holocaust
  • 13:48 - 13:49
    "deeply offensive,"
  • 13:50 - 13:52
    but he didn't think
    Facebook should take them down
  • 13:52 - 13:57
    "because I think there are things
    that different people get wrong."
  • 13:57 - 14:02
    At this very moment, there are still
    Holocaust deniers on Facebook,
  • 14:02 - 14:04
    and Google still takes you
  • 14:04 - 14:08
    to the most repulsive
    Holocaust denial sites
  • 14:08 - 14:09
    with a simple click.
  • 14:10 - 14:15
    One of the heads of Google, in fact,
    told me that these sites just show
  • 14:15 - 14:17
    "both sides" of the issue.
  • 14:17 - 14:18
    [audience groans]
  • 14:18 - 14:19
    This is madness.
  • 14:19 - 14:21
    To quote Edward R. Murrow:
  • 14:21 - 14:25
    One "cannot accept
    that there are, on every story,
  • 14:25 - 14:28
    "two equal and logical sides
    to an argument."
  • 14:28 - 14:32
    We have, unfortunately,
    millions of pieces of evidence
  • 14:32 - 14:33
    for the Holocaust.
  • 14:33 - 14:36
    It is an historical fact.
  • 14:36 - 14:39
    And denying it is not
    some random opinion.
  • 14:39 - 14:43
    Those who deny the Holocaust
    aim to encourage another one.
  • 14:44 - 14:46
    [audience applauds]
  • 14:51 - 14:55
    Still, Zuckerberg says that
    "people should decide what is credible,
  • 14:55 - 14:57
    "not tech companies."
  • 14:57 - 15:00
    But at a time
    when two-thirds of Millennials
  • 15:00 - 15:03
    say they haven't even heard of Auschwitz,
  • 15:03 - 15:05
    how are they supposed to know
    what's credible?
  • 15:06 - 15:11
    How are they supposed to know
    that the lie is a lie?
  • 15:11 - 15:14
    There is such a thing as objective truth.
  • 15:14 - 15:16
    Facts do exist.
  • 15:17 - 15:20
    And if these internet companies
    really want to make a difference
  • 15:20 - 15:24
    they should hire enough monitors
    to actually monitor,
  • 15:24 - 15:28
    work closely with groups like the ADL
    and the NAACP,
  • 15:29 - 15:35
    insist on facts, and purge these lies
    and conspiracies from their platforms.
  • 15:35 - 15:37
    [audience applauds]
  • 15:40 - 15:45
    Now, fifth, when discussing
    the difficulty of removing content,
  • 15:45 - 15:46
    Zuckerberg...
  • 15:47 - 15:50
    Mark Zuckerberg asked,
    "where do you draw the line?"
  • 15:51 - 15:55
    Yes, drawing the line can be difficult,
  • 15:55 - 15:57
    but here's what he's really saying:
  • 15:57 - 16:00
    removing more
    of these lies and conspiracies
  • 16:00 - 16:02
    is just too expensive.
  • 16:02 - 16:05
    These are
    the richest companies in the world,
  • 16:05 - 16:07
    and they have
    the best engineers in the world.
  • 16:08 - 16:11
    They could fix these problems
    if they wanted to.
  • 16:11 - 16:14
    Twitter could deploy an algorithm
  • 16:14 - 16:17
    to remove more
    white supremacist hate speech,
  • 16:17 - 16:19
    but they reportedly haven't
  • 16:20 - 16:23
    because it would eject
    some very prominent politicians
  • 16:23 - 16:24
    from their platform.
  • 16:24 - 16:26
    [audience groans]
  • 16:26 - 16:28
    Maybe that wouldn't be such a bad thing.
  • 16:28 - 16:30
    [audience applauds, cheers]
  • 16:36 - 16:37
    The truth is
  • 16:38 - 16:40
    these companies
    won't fundamentally change
  • 16:40 - 16:42
    because their entire business model
  • 16:43 - 16:46
    relies on generating more engagement,
  • 16:46 - 16:48
    and nothing generates more engagement
  • 16:48 - 16:51
    than lies, fear and outrage.
  • 16:51 - 16:53
    So it's time
    to finally call these companies
  • 16:53 - 16:57
    what they really are:
    the largest publishers in history.
  • 16:57 - 16:59
    So here's an idea for them:
  • 16:59 - 17:02
    abide by basic standards and practices
  • 17:02 - 17:07
    just like newspapers,
    magazines and TV news do everyday.
  • 17:07 - 17:11
    We have standards and practices
    in television and the movies.
  • 17:11 - 17:14
    There are certain things
    we cannot say or do.
  • 17:14 - 17:16
    In England, I was told that Ali G
  • 17:16 - 17:19
    couldn't curse
    when he appeared before 9 p.m.
  • 17:20 - 17:24
    Here in the US,
    the Motion Picture Association of America
  • 17:24 - 17:26
    regulates and rates what we see.
  • 17:27 - 17:31
    I've had scenes in my movies cut
    or reduced to abide by those standards.
  • 17:32 - 17:36
    If there are standards and practices
    for what cinemas and TV channels can show
  • 17:36 - 17:40
    then, surely
    companies that publish material
  • 17:40 - 17:41
    to billions of people
  • 17:41 - 17:44
    should have to abide by
    basic standards and practices too.
  • 17:44 - 17:45
    [audience applauds]
  • 17:50 - 17:53
    Take the issue of political ads,
  • 17:54 - 17:56
    on which Facebook has been resolute.
  • 17:56 - 17:59
    Fortunately, Twitter finally banned them,
  • 17:59 - 18:02
    and Google, today I read,
    is making changes too.
  • 18:03 - 18:04
    But, if you pay them,
  • 18:05 - 18:09
    Facebook will run
    any political ad you want,
  • 18:09 - 18:10
    even if it's a lie.
  • 18:11 - 18:15
    And they'll even help you
    micro-target those ads to their users
  • 18:15 - 18:17
    for maximum effect.
  • 18:17 - 18:20
    Under this twisted logic,
  • 18:20 - 18:23
    if Facebook were around in the 1930s
  • 18:23 - 18:26
    it would have allowed Hitler
    to post 30-second ads
  • 18:26 - 18:29
    on his solution to the "Jewish problem."
  • 18:29 - 18:32
    So here's a good standard and practice:
  • 18:32 - 18:39
    Facebook, start fact-checking
    political ads before you run them,
  • 18:39 - 18:43
    stop micro-targeted lies immediately,
  • 18:43 - 18:44
    and when the ads are false,
  • 18:44 - 18:47
    give back the money
    and don't publish them.
  • 18:48 - 18:50
    [audience applauds]
  • 18:55 - 18:57
    Here's another good practice:
  • 18:57 - 18:59
    slow down.
  • 19:00 - 19:04
    Every single post doesn't need
    to be published immediately.
  • 19:04 - 19:07
    Oscar Wilde once said:
    "We live in an age
  • 19:07 - 19:11
    "when unnecessary things
    are our only necessity."
  • 19:11 - 19:12
    But...
  • 19:13 - 19:16
    But, let me ask you,
    is having every thought
  • 19:16 - 19:19
    or video posted instantly online,
  • 19:19 - 19:21
    even if it's racist
    or criminal or murderous,
  • 19:22 - 19:24
    really a necessity?
  • 19:24 - 19:25
    Of course not.
  • 19:25 - 19:29
    The shooter who
    massacred Muslims in New Zealand
  • 19:29 - 19:32
    live streamed his atrocity on Facebook
  • 19:32 - 19:34
    where it then spread across the internet
  • 19:34 - 19:37
    and was viewed likely millions of times.
  • 19:37 - 19:42
    It was a snuff film,
    brought to you by social media.
  • 19:42 - 19:44
    Why can't we have more of a delay
  • 19:45 - 19:47
    so that this trauma-inducing filth
  • 19:47 - 19:52
    can be caught and stopped
    before it's posted in the first place?
  • 19:52 - 19:55
    [audience applauds]
  • 19:57 - 20:01
    Finally, Zuckerberg said that
    social media companies should
  • 20:01 - 20:04
    "live up to their responsibilities,"
  • 20:04 - 20:09
    but he's totally silent about
    what should happen when they don't.
  • 20:09 - 20:12
    By now, it's pretty clear
  • 20:12 - 20:15
    they can't be trusted
    to regulate themselves.
  • 20:16 - 20:19
    As with the Industrial Revolution,
    it's time for regulation
  • 20:19 - 20:24
    and legislation to curb the greed
    of these high-tech robber barons.
  • 20:25 - 20:27
    [audience applauds]
  • 20:30 - 20:32
    In every other industry,
  • 20:32 - 20:36
    a company can be held liable
    when their product is defective.
  • 20:36 - 20:40
    When engines explode
    or seat belts malfunction,
  • 20:40 - 20:44
    car companies recall
    tens of thousands of vehicles
  • 20:44 - 20:46
    at a cost of billions of dollars.
  • 20:47 - 20:50
    It only seems fair
    to say to Facebook, YouTube and Twitter:
  • 20:51 - 20:53
    Your product is defective,
  • 20:53 - 20:55
    you are obliged to fix it,
  • 20:55 - 20:57
    no matter how much it costs
  • 20:57 - 21:01
    and no matter how many moderators
    you need to employ.
  • 21:02 - 21:05
    [audience applauds, cheers]
  • 21:08 - 21:11
    In every under-- sorry
  • 21:11 - 21:15
    In every other industry
    you can be sued for the harm you cause.
  • 21:15 - 21:20
    Publishers can be sued for libel,
    people can be sued for defamation.
  • 21:20 - 21:23
    I've been sued many times!
  • 21:23 - 21:24
    [audience laughs]
  • 21:24 - 21:26
    I'm being sued right now
  • 21:27 - 21:31
    by someone whose name I won't mention
    because he might sue me again!
  • 21:31 - 21:32
    [audience laughs]
  • 21:32 - 21:37
    But social media companies
    are largely protected from liability
  • 21:37 - 21:39
    for the content their users post –
  • 21:39 - 21:41
    no matter how indecent it is –
  • 21:41 - 21:44
    by Section 230 of, get ready for it,
  • 21:44 - 21:47
    the Communications Decency Act.
  • 21:47 - 21:48
    It's absurd!
  • 21:49 - 21:54
    Fortunately, internet companies can now
    be held responsible for pedophiles
  • 21:54 - 21:57
    who use their site to target children.
  • 21:57 - 21:58
    So I say,
  • 21:58 - 22:00
    let's also hold these companies
  • 22:00 - 22:03
    responsible for those who use their sites
  • 22:03 - 22:06
    to advocate
    for the mass murder of children
  • 22:06 - 22:08
    because of their race or religion.
  • 22:09 - 22:10
    [audience applauds]
  • 22:15 - 22:17
    And maybe fines are not enough.
  • 22:18 - 22:24
    Maybe it's time to tell Mark Zuckerberg
    and the CEOs of these companies
  • 22:24 - 22:28
    you already allowed one foreign power
  • 22:28 - 22:29
    to interfere in our elections.
  • 22:30 - 22:34
    You already facilitated
    one genocide in Myanmar.
  • 22:34 - 22:37
    Do it again and you go to jail.
  • 22:37 - 22:38
    [audience applauds]
  • 22:43 - 22:48
    In the end, it all comes down to
    what kind of world we want.
  • 22:49 - 22:53
    In his speech, Zuckerberg
    said that one of his main goals
  • 22:53 - 22:58
    is to "uphold as wide a definition
    of freedom of expression as possible."
  • 22:59 - 23:00
    It sounds good.
  • 23:00 - 23:04
    Yet our freedoms
    are not only an end in themselves.
  • 23:04 - 23:07
    They're also the means to another end –
  • 23:07 - 23:09
    as you say here in the US:
  • 23:09 - 23:13
    the right to life, liberty
    and the pursuit of happiness.
  • 23:13 - 23:19
    But today these rights are threatened
    by hate, conspiracies and lies.
  • 23:19 - 23:24
    So allow me to leave you with a suggestion
    for a different aim for society:
  • 23:25 - 23:27
    The ultimate aim of society
  • 23:27 - 23:31
    should be to make sure
    that people are not targeted,
  • 23:31 - 23:34
    not harassed and not murdered
  • 23:34 - 23:37
    because of who they are,
    where they come from,
  • 23:37 - 23:39
    who they love or how they pray.
  • 23:40 - 23:42
    [audience applauds, cheers]
  • 23:55 - 23:58
    If we make that our aim –
  • 23:58 - 24:00
    if we prioritize truth over lies,
  • 24:00 - 24:02
    tolerance over prejudice,
  • 24:03 - 24:05
    empathy over indifference,
  • 24:05 - 24:08
    and experts over ignoramuses –
  • 24:08 - 24:09
    [audience laughs]
  • 24:09 - 24:11
    then maybe, just maybe,
  • 24:12 - 24:16
    we can stop the greatest
    propaganda machine in history.
  • 24:16 - 24:18
    We can save democracy,
  • 24:18 - 24:22
    we can still have a place
    for free speech and free expression,
  • 24:23 - 24:27
    and most importantly,
    my jokes will still work.
  • 24:27 - 24:28
    Thank you very much.
  • 24:28 - 24:31
    [cheers and applause]
Description:

Sacha Baron Cohen is the well-deserved recipient of ADL’s International Leadership Award, which goes to exceptional individuals who combine professional success with a profound personal commitment to community involvement and to crossing borders and barriers with a message of diversity and equal opportunity.

Over 100 years ago Supreme Court Justice Louis Brandeis wrote: “Sunlight is said to be the best disinfectant.” Through his alter egos, many of whom represent anti-Semites, racists and neo-Nazis, Baron Cohen shines a piercing light on people’s ignorance and biases.

https://www.adl.org/news/article/sacha-baron-cohens-keynote-address-at-adls-2019-never-is-now-summit-on-anti-semitism

copyright © 2019 ADL

Video Language:
English
Duration:
24:44
