How Amazon, Apple, Facebook and Google manipulate our emotions

  • 0:05 - 0:10
    So this is the first and the last slide
    each of my 6400 students
  • 0:10 - 0:12
    for the last 15 years have seen.
  • 0:12 - 0:16
    I do not believe you can build
    a multibillion dollar organization
  • 0:16 - 0:21
    unless you are clear on which instinct
    or organ you are targeting.
  • 0:22 - 0:25
    Our species has a need for a superbeing.
  • 0:25 - 0:28
    Our competitive advantage
    as a species is our brain,
  • 0:28 - 0:31
    our brain is robust enough to ask
    these really difficult questions,
  • 0:31 - 0:34
    but unfortunately it doesn't have
    the processing power to answer them,
  • 0:34 - 0:38
    which creates a need for a superbeing
    that we can pray to
  • 0:38 - 0:39
    and look to for answers.
  • 0:39 - 0:40
    What is prayer?
  • 0:40 - 0:43
    Sending a query into the universe,
  • 0:43 - 0:45
    and hopefully there's some sort of
    divine intervention.
  • 0:45 - 0:47
    We don't need to understand
    what's going on
  • 0:47 - 0:50
    from an all-knowing, all-seeing superbeing
  • 0:50 - 0:53
    that gives us authority
    that this is the right answer.
  • 0:54 - 0:56
    Will my kid be all right?
  • 0:57 - 0:58
    You have your planet of stuff,
  • 0:58 - 1:00
    you have your planet of work,
  • 1:00 - 1:02
    you have your planet of friends.
  • 1:02 - 1:03
    If you have kids,
  • 1:03 - 1:06
    you know that once something
    comes off the rails with your kids,
  • 1:06 - 1:08
    everything melts,
  • 1:08 - 1:11
    and your universe,
    to the summit, is your kids.
  • 1:11 - 1:14
    Will my kid be all right?
  • 1:14 - 1:17
    Symptoms and treatment of croup
    in the Google query box.
  • 1:18 - 1:22
    One in six queries presented to Google
    have never been asked before
  • 1:22 - 1:23
    in the history of mankind.
  • 1:23 - 1:26
    What priest, teacher, rabbi,
    scholar, mentor, boss
  • 1:26 - 1:28
    has so much credibility
  • 1:28 - 1:31
    that one in six questions
    posed to that person
  • 1:31 - 1:33
    have never been asked before?
  • 1:33 - 1:36
    Google is our modern man's God.
  • 1:36 - 1:40
    Imagine your face and your name
    above everything you've put into that box,
  • 1:40 - 1:45
    and you're going to realize you trust
    Google more than any entity
  • 1:45 - 1:46
    in your history.
  • 1:46 - 1:48
    Let's move further down the torso.
  • 1:49 - 1:50
    (Laughter)
  • 1:51 - 1:53
    One of the other wonderful things
    about our species
  • 1:53 - 1:56
    is we not only need to be loved
    but we need to love others.
  • 1:57 - 2:00
    Children with poor nutrition
    but a lot of affection
  • 2:00 - 2:04
    have better outcomes than children
    with good nutrition and poor affection.
  • 2:05 - 2:08
    However, the best signal
    that you might make it
  • 2:08 - 2:12
    to be part of the number-one fastest
    growing demographic in the world --
  • 2:12 - 2:13
    centenarians,
  • 2:13 - 2:15
    people who live to triple digits --
  • 2:15 - 2:16
    there are three signals.
  • 2:16 - 2:17
    In reverse order,
  • 2:17 - 2:18
    your genetics --
  • 2:18 - 2:20
    not as important as you'd like to think.
  • 2:20 - 2:22
    So you can continue to treat
    your body like shit
  • 2:22 - 2:24
    and think that, "Oh,
    Uncle Joe lived to 95,
  • 2:24 - 2:25
    the die has been cast."
  • 2:25 - 2:27
    It's less important than you think.
  • 2:27 - 2:28
    Number two is lifestyle.
  • 2:28 - 2:31
    Don't smoke, don't be obese, prescreen --
  • 2:31 - 2:34
    get rid of about two-thirds
    of early cancers
  • 2:34 - 2:35
    and cardiovascular disease.
  • 2:35 - 2:39
    The number one indicator or signal
    that you'll make it to triple digits:
  • 2:39 - 2:42
    how many people do you love?
  • 2:43 - 2:46
    Caretaking is the security camera --
  • 2:46 - 2:48
    what we call the low-resolution
    security camera in our brain --
  • 2:48 - 2:51
    deciding whether or not
    you are adding value.
  • 2:51 - 2:54
    Facebook taps into our instinctive need
    not only to be loved,
  • 2:54 - 2:56
    but to love others,
  • 2:56 - 2:58
    mostly through pictures
    that create empathy,
  • 2:58 - 3:00
    catalyze and reinforce our relationships.
  • 3:00 - 3:03
    Let's continue our journey down the torso.
  • 3:04 - 3:06
    Amazon is our consumptive gut.
  • 3:06 - 3:09
    The instinct of more is hardwired into us.
  • 3:10 - 3:13
    The penalty for too little
    is starvation and malnutrition.
  • 3:14 - 3:15
    Open your cupboards,
  • 3:15 - 3:16
    open your closets,
  • 3:16 - 3:20
    you have 10 to 100 times
    what you need.
  • 3:20 - 3:21
    Why?
  • 3:21 - 3:23
    Because the penalty for too little
  • 3:23 - 3:25
    is much greater than
    the penalty for too much.
  • 3:25 - 3:29
    So more for less is a business strategy
    that never goes out of style.
  • 3:29 - 3:31
    It's the strategy of China,
  • 3:31 - 3:32
    it's the strategy of Wal-Mart,
  • 3:32 - 3:36
    and now it's the strategy of the most
    successful company in the world,
  • 3:36 - 3:37
    Amazon.
  • 3:37 - 3:39
    You get more for less into your gut,
  • 3:39 - 3:40
    digest,
  • 3:40 - 3:43
    send it to your muscular
    and skeletal system of consumption.
  • 3:43 - 3:45
    Moving further,
  • 3:45 - 3:47
    once we know we will survive --
  • 3:47 - 3:48
    the basic instinct --
  • 3:48 - 3:51
    we move to the second
    most powerful instinct,
  • 3:51 - 3:53
    and that is to spread and select
  • 3:53 - 3:55
    the strongest, smartest
    and fastest seed
  • 3:55 - 3:57
    to the four corners of the earth,
  • 3:57 - 3:59
    or pick the best seed.
  • 4:00 - 4:01
    This is not a timepiece.
  • 4:01 - 4:03
    I haven't wound it in five years.
  • 4:03 - 4:07
    It's my vain attempt to say to people
    if you mate with me,
  • 4:07 - 4:09
    your children are more likely to survive
  • 4:09 - 4:11
    than if you mate with somebody
    wearing a Swatch watch.
  • 4:11 - 4:13
    (Laughter)
  • 4:13 - 4:17
    The key to business is tapping into
    the irrational organs.
  • 4:17 - 4:21
    Irrational is Harvard Business School's
    and New York Business School's term
  • 4:21 - 4:23
    for fat profit margins
  • 4:23 - 4:25
    and shareholder value.
  • 4:25 - 4:28
    "High caloric [paste] for your children,"
  • 4:28 - 4:29
    no?
  • 4:29 - 4:31
    You love your choosy mom.
  • 4:31 - 4:33
    Why choosy moms choose JIF:
  • 4:33 - 4:34
    you love your kids more.
  • 4:34 - 4:37
    The greatest algorithm for shareholder
    value creation from World War II
  • 4:37 - 4:39
    to the advent of Google
  • 4:39 - 4:42
    was taking an average product
    and appealing to people's hearts.
  • 4:42 - 4:43
    You're a better mom,
  • 4:43 - 4:44
    a better person,
  • 4:44 - 4:45
    a better patriot
  • 4:45 - 4:48
    if you buy this average soap
    versus this average soap.
  • 4:48 - 4:52
    Now the number one algorithm
    for shareholder value isn't technology.
  • 4:52 - 4:53
    Look at the Forbes 400,
  • 4:53 - 4:54
    take out inherited wealth,
  • 4:54 - 4:55
    take out finance,
  • 4:55 - 4:58
    the number one source of wealth creation:
  • 4:58 - 5:00
    appealing to your reproductive organs.
  • 5:00 - 5:01
    The Lauders;
  • 5:01 - 5:04
    the number one
    wealthiest man in Europe --
  • 5:04 - 5:05
    LVMH.
  • 5:05 - 5:06
    Numbers two and three:
  • 5:06 - 5:07
    H&M and Inditex.
  • 5:07 - 5:13
    You want to target the most
    irrational organs for shareholder value.
  • 5:13 - 5:14
    As a result,
  • 5:14 - 5:16
    these four companies --
  • 5:16 - 5:19
    Apple, Amazon, Facebook and Google --
    have disarticulated who we are.
  • 5:19 - 5:21
    God, love, consumption, sex.
  • 5:21 - 5:25
    The proportion in your approach
    to those things is who you are,
  • 5:25 - 5:29
    and they have reassembled who we are
    in the form of for-profit companies.
  • 5:29 - 5:30
    At the end of the Great Recession,
  • 5:30 - 5:32
    the market capitalization
    of these companies
  • 5:32 - 5:35
    was equivalent to the GDP of Niger;
  • 5:35 - 5:37
    now it is equivalent to the GDP of India,
  • 5:37 - 5:39
    having blown past Russia and Canada
  • 5:39 - 5:41
    in '13 and '14.
  • 5:41 - 5:42
    There are only five nations
  • 5:42 - 5:47
    that have a GDP greater than the combined
    capitalization of these four firms.
  • 5:47 - 5:49
    Something is happening though.
  • 5:50 - 5:54
    The conversation just a year ago
    was which CEO was more Jesus-like,
  • 5:54 - 5:57
    who was running for president.
  • 5:57 - 5:58
    Now the worm has turned.
  • 5:58 - 6:00
    Everything they're doing is bothering us.
  • 6:00 - 6:02
    We're worried they're tax avoiders.
  • 6:02 - 6:06
    Wal-mart, since the Great Recession,
    has paid 64 billion dollars
  • 6:06 - 6:07
    in corporate income tax;
  • 6:07 - 6:09
    Amazon has paid 1.4.
  • 6:10 - 6:13
    How do we pay our firefighters,
    our soldiers and our social workers
  • 6:13 - 6:17
    if the most successful companies
    in the world don't pay their fair share?
  • 6:17 - 6:18
    Pretty easy.
  • 6:18 - 6:22
    That means the less successful companies
    have to pay more than their fair share.
  • 6:22 - 6:24
    Alexa, is this a good thing?
  • 6:24 - 6:26
    This is despite that fact --
  • 6:26 - 6:27
    (Laughter)
  • 6:27 - 6:30
    This is despite the fact
  • 6:30 - 6:34
    that Amazon has added the entire
    market capitalization of Wal-Mart
  • 6:34 - 6:38
    to its market cap in the last 19 months.
  • 6:40 - 6:41
    Whose fault is it?
  • 6:41 - 6:43
    It's our fault.
  • 6:43 - 6:47
    We're electing regulators
    who don't have the backbone
  • 6:47 - 6:49
    to actually go after these companies.
  • 6:49 - 6:51
    Facebook lies to EU regulators
  • 6:51 - 6:53
    and says, "It would be impossible
  • 6:53 - 6:56
    for us to share the data between
    our core platform
  • 6:56 - 6:58
    and our proposed acquisition of WhatsApp.
  • 6:58 - 6:59
    Approve the merger."
  • 6:59 - 7:01
    They approve the merger and then --
  • 7:01 - 7:02
    spoiler alert --
  • 7:02 - 7:03
    they figure it out.
  • 7:03 - 7:05
    And the EU says, "I feel lied to,
  • 7:05 - 7:08
    we're fining you 120 million dollars,"
  • 7:08 - 7:12
    about .6 percent of the acquisition price
    of 19 billion dollars.
  • 7:13 - 7:15
    If Mark Zuckerberg could
    take out an insurance policy
  • 7:15 - 7:18
    that the acquisition would
    go through for .6 percent,
  • 7:18 - 7:19
    wouldn't he do it?
  • 7:19 - 7:21
    Anticompetitive behavior.
  • 7:22 - 7:24
    A two-and-a-half billion dollar fine,
  • 7:24 - 7:27
    three billion of the cash flow,
  • 7:27 - 7:29
    three percent of the cash
    on Google's balance sheet.
  • 7:30 - 7:32
    We are telling these companies
  • 7:32 - 7:34
    the smart thing to do,
  • 7:34 - 7:36
    the shareholder-driven thing to do,
  • 7:36 - 7:38
    is to lie and to cheat.
  • 7:38 - 7:42
    We are issuing 25-cent parking tickets
  • 7:42 - 7:45
    on a meter that costs 100 dollars an hour.
  • 7:45 - 7:48
    The smart thing to do is lie.
    Job destruction!
  • 7:48 - 7:51
    Amazon only needs one person
    for every two at Macy's.
  • 7:51 - 7:53
    If they grow their business
    20 billion dollars this year,
  • 7:53 - 7:54
    which they will,
  • 7:54 - 7:57
    we will lose 53,000 cashiers and clerks.
  • 7:57 - 7:58
    This is nothing unusual;
  • 7:58 - 8:00
    this has happened all through our economy,
  • 8:00 - 8:03
    we've just never seen
    companies this good at it.
  • 8:03 - 8:05
    That's one Yankee Stadium of workers.
  • 8:05 - 8:06
    It's even worse in media.
  • 8:06 - 8:10
    Facebook and Google grow their businesses
    22 billion dollars this year --
  • 8:10 - 8:11
    which they will --
  • 8:11 - 8:14
    we're going to lose approximately
    150,000 creative directors,
  • 8:14 - 8:16
    planners and copywriters.
  • 8:16 - 8:19
    Or we can fill up two-and-a-half
    Yankee Stadiums
  • 8:19 - 8:22
    and say, "You are out of work
    courtesy of Amazon."
  • 8:22 - 8:25
    We now get the majority of our news
    from our social media feeds,
  • 8:25 - 8:30
    and the majority of our news
    coming off of social media feeds is ...
  • 8:30 - 8:31
    fake news.
  • 8:31 - 8:33
    (Laughter)
  • 8:33 - 8:36
    I am not allowed to be political
    or use curse words,
  • 8:36 - 8:38
    or talk about religion in class,
  • 8:38 - 8:42
    so I can definitely not say ...
  • 8:42 - 8:44
    "Zuckerberg has become Putin's bitch."
  • 8:44 - 8:46
    I definitely cannot say that.
  • 8:46 - 8:47
    (Laughter)
  • 8:47 - 8:48
    Their defense:
  • 8:48 - 8:50
    "Facebook is not a media company,
  • 8:50 - 8:52
    it's a technology company."
  • 8:52 - 8:54
    You create original content,
  • 8:54 - 8:57
    you pay sports leagues
    to give you original content,
  • 8:57 - 8:58
    you run advertising against it --
  • 8:58 - 9:00
    boom, you're a media company.
  • 9:00 - 9:01
    Just in the last few days,
  • 9:01 - 9:04
    Sheryl Sandberg has repeated this lie
  • 9:04 - 9:06
    that "we are not a media company."
  • 9:06 - 9:09
    Facebook has openly embraced
    the margins of celebrity
  • 9:09 - 9:12
    and the influence of a media company,
  • 9:12 - 9:15
    yet seems to be allergic
    to the responsibilities
  • 9:15 - 9:17
    of a media company.
  • 9:17 - 9:19
    Imagine McDonald's.
  • 9:19 - 9:22
    We find 80 percent of their beef is fake,
  • 9:22 - 9:23
    and it's giving us encephalitis,
  • 9:23 - 9:25
    and we're making terrible decisions.
  • 9:26 - 9:27
    And we say, "McDonald's,
  • 9:27 - 9:29
    we're pissed off."
  • 9:29 - 9:30
    And they say, "Wait,
  • 9:30 - 9:32
    wait,
  • 9:32 - 9:34
    we're not a fast-food restaurant,
  • 9:34 - 9:36
    we're a fast-food platform."
  • 9:36 - 9:38
    (Laughter)
  • 9:38 - 9:41
    These companies and CEOs wrap themselves
  • 9:41 - 9:45
    in a neon-blue pink rainbow
    and blue blanket
  • 9:45 - 9:46
    to create an illusionist trick,
  • 9:46 - 9:48
    distracting from their behavior each day,
  • 9:48 - 9:52
    which is more indicative
    of the spawn of Darth Vader and Ayn Rand.
  • 9:52 - 9:53
    Why?
  • 9:53 - 9:57
    Because we as progressives
    are seen as nice but weak.
  • 9:57 - 10:00
    If Sheryl Sandberg had written
    a book on gun rights,
  • 10:00 - 10:03
    or on the pro-life movement,
  • 10:03 - 10:06
    would they be flying Sheryl to Cannes?
  • 10:06 - 10:07
    No.
  • 10:07 - 10:09
    And I'm not doubting
    their progressive values,
  • 10:09 - 10:11
    but it foots to shareholder value,
  • 10:11 - 10:14
    because we as progressives
    are seen as weak.
  • 10:14 - 10:15
    They're so nice --
  • 10:15 - 10:16
    remember Microsoft?
  • 10:16 - 10:17
    They didn't seem as nice,
  • 10:17 - 10:22
    and regulators stepped in much earlier
    than the regulators now
  • 10:22 - 10:25
    who would never step in
    on those nice, nice people.
  • 10:25 - 10:27
    I'm about to get on a plane tonight,
  • 10:27 - 10:30
    and I'm going to have a guy
    named Roy from TSA molest me.
  • 10:30 - 10:33
    If I am suspected
    of a DUI on the way home,
  • 10:33 - 10:36
    I can have blood taken from my person.
  • 10:37 - 10:40
    But wait, don't tap into the iPhone;
  • 10:40 - 10:41
    it's sacred.
  • 10:41 - 10:43
    This is our new cross.
  • 10:43 - 10:44
    It shouldn't be the iPhone X,
  • 10:44 - 10:46
    it should be called the iPhone cross.
  • 10:46 - 10:47
    We have our religion;
  • 10:47 - 10:48
    it's Apple.
  • 10:48 - 10:50
    Our Jesus Christ is Steve Jobs,
  • 10:50 - 10:54
    and we've decided this is holier
    than our person, our house
  • 10:54 - 10:55
    or our computer.
  • 10:55 - 10:57
    We have become totally out of control
  • 10:57 - 11:01
    with the gross idolatry
    of innovation and of youth.
  • 11:01 - 11:04
    We no longer worship
    at the altar of character,
  • 11:04 - 11:05
    of kindness,
  • 11:05 - 11:08
    but of innovation and people
    who create shareholder value.
  • 11:08 - 11:11
    Amazon has become so powerful
    in the marketplace
  • 11:11 - 11:13
    it can conduct Jedi mind tricks.
  • 11:13 - 11:16
    It can begin damaging other
    industries just by looking at them.
  • 11:16 - 11:18
    Nike announces they're
    distributing on Amazon,
  • 11:18 - 11:19
    their stock goes up,
  • 11:19 - 11:21
    every other footwear stock goes down.
  • 11:21 - 11:24
    When Amazon stock goes up
    the rest of retail stocks go down
  • 11:24 - 11:27
    because they assume what's good
    for Amazon is bad for everybody else.
  • 11:27 - 11:32
    They cut the price of salmon 33 percent
    when they acquired Whole Foods.
  • 11:32 - 11:35
    In between the time they announced
    the acquisition of Whole Foods
  • 11:35 - 11:36
    and when it closed,
  • 11:36 - 11:38
    Kroger, the largest
    pure-play grocer in America,
  • 11:38 - 11:40
    shed one-third of its value
  • 11:40 - 11:47
    because Amazon purchased a grocer
    one-eleventh the size of Kroger.
  • 11:47 - 11:48
    I got very lucky.
  • 11:49 - 11:52
    I predicted the acquisition
    of Whole Foods by Amazon
  • 11:52 - 11:53
    the week before it happened.
  • 11:53 - 11:54
    This is me boasting.
  • 11:54 - 11:56
    I said this publicly in the media,
  • 11:56 - 11:58
    this was the largest
    acquisition in their history,
  • 11:58 - 12:01
    they'd never made
    an acquisition over one billion,
  • 12:01 - 12:03
    and people asked, "How did you know this?"
  • 12:03 - 12:06
    So I'm letting this very impressive
    audience in on the secret.
  • 12:06 - 12:07
    How did I know this?
  • 12:07 - 12:09
    I'm going to tell you how I knew.
  • 12:09 - 12:12
    I bark at Alexa all day long
  • 12:12 - 12:14
    and try to figure out what's going on.
  • 12:14 - 12:17
    "Alexa, buy whole milk."
  • 12:17 - 12:20
    (Alexa) I couldn't find
    anything for whole milk,
  • 12:20 - 12:22
    so I've added whole milk
    to your shopping list.
  • 12:23 - 12:24
    SG: Then I asked,
  • 12:24 - 12:27
    "Alexa, buy organic foods."
  • 12:27 - 12:29
    (Alexa) The top search result
    for organic food
  • 12:29 - 12:31
    is Plum Organic's baby food,
  • 12:31 - 12:32
    banana and pumpkin.
  • 12:32 - 12:34
    12 pack of four ounces each.
  • 12:34 - 12:36
    It's 15 dollars total.
  • 12:36 - 12:38
    Would you like to buy it?
  • 12:38 - 12:40
    SG: And then, as often
    happens at my age,
  • 12:40 - 12:41
    I got confused.
  • 12:41 - 12:44
    "Alexa, buy Whole Foods."
  • 12:44 - 12:48
    (Alexa) I have purchased the outstanding
    stock of Whole Foods Incorporated
  • 12:48 - 12:49
    at 42 dollars per share.
  • 12:49 - 12:53
    I have charged 13.7 billion
    to your American Express card.
  • 12:53 - 12:55
    (Laughter)
  • 12:56 - 12:57
    I thought that'd be funnier.
  • 12:58 - 12:59
    (Laughter)
  • 12:59 - 13:01
    We've personified these companies,
  • 13:01 - 13:04
    and just as when you're really angry
    over every little thing someone does
  • 13:04 - 13:06
    in your life and relationships,
  • 13:06 - 13:07
    you've got to ask yourself,
  • 13:07 - 13:08
    "What's going on here?
  • 13:08 - 13:10
    Why are we so disappointed in technology?"
  • 13:10 - 13:13
    And I believe it's because
    the ratio of one-percent pursuit
  • 13:13 - 13:14
    of shareholder value
  • 13:14 - 13:17
    and 99 percent the betterment of humanity
  • 13:17 - 13:18
    that technology used to play
  • 13:18 - 13:19
    has been flipped,
  • 13:19 - 13:21
    and now we're totally focused
    on shareholder value
  • 13:21 - 13:22
    instead of humanity.
  • 13:22 - 13:25
    100,000 people came together
    for the Manhattan Project
  • 13:25 - 13:26
    and literally saved the world.
  • 13:26 - 13:28
    Technology saved the world.
  • 13:28 - 13:32
    My mother was a four-year-old Jew
    living in London at the outset of the war.
  • 13:32 - 13:35
    If we had not formed the footrace
    towards splitting the atom,
  • 13:35 - 13:37
    would she have survived?
  • 13:37 - 13:38
    It's unlikely.
  • 13:39 - 13:40
    25 years later,
  • 13:40 - 13:42
    the most impressive accomplishment,
  • 13:42 - 13:45
    arguably ever in all of humankind --
  • 13:45 - 13:46
    put a man on the moon.
  • 13:46 - 13:49
    430,000 Canadians, British
    and Americans came together,
  • 13:49 - 13:51
    again with very basic technology,
  • 13:51 - 13:53
    and put a man on the moon.
  • 13:53 - 13:56
    Now we have the 700,000
    best and brightest,
  • 13:56 - 14:00
    and these are the best and brightest
    from the four corners of the earth.
  • 14:00 - 14:04
    They are literally playing with lasers
    relative to slingshots,
  • 14:04 - 14:05
    relative to the squirt gun.
  • 14:05 - 14:07
    They have the GDP of India to work with.
  • 14:07 - 14:10
    And after studying
    these companies for 10 years,
  • 14:10 - 14:12
    I know what their mission is.
  • 14:12 - 14:14
    Is it to organize the world's information?
  • 14:14 - 14:16
    Is it to connect us?
  • 14:16 - 14:18
    Is it to create greater comity of man?
  • 14:19 - 14:20
    It isn't.
  • 14:20 - 14:22
    I know why we have been brought together --
  • 14:22 - 14:26
    I know that the greatest collection
    of IQ capital and creativity,
  • 14:26 - 14:30
    that their sole mission is
    to sell another fucking Nissan.
  • 14:30 - 14:32
    My name is Scott Galloway,
  • 14:32 - 14:33
    I teach at NYU
  • 14:33 - 14:34
    and I appreciate your time.
  • 14:34 - 14:36
    (Applause)
  • 14:43 - 14:44
    Chris Anderson: Not planned,
  • 14:44 - 14:47
    but you prompted
    some questions in me, Scott.
  • 14:47 - 14:48
    (Laughter)
  • 14:48 - 14:50
    That was a spectacular rant.
  • 14:51 - 14:52
    SG: Is this like Letterman?
  • 14:52 - 14:53
    When you do well,
  • 14:53 - 14:54
    he calls you onto the couch?
  • 14:55 - 14:58
    CA: No, no, you're going to the heart
    of the conversation right now.
  • 14:59 - 15:05
    Everyone's aware that after years
    of worshipping Silicon Valley,
  • 15:05 - 15:07
    suddenly the worm has turned,
  • 15:07 - 15:08
    and in such a big way.
  • 15:08 - 15:09
    To some people here,
  • 15:09 - 15:12
    it will just feel like you're piling on,
  • 15:12 - 15:15
    you're kicking the kids who've
    already been kicked to pieces anyway.
  • 15:15 - 15:17
    Don't you feel any empathy
    for them at all?
  • 15:17 - 15:19
    SG: None whatsoever.
  • 15:19 - 15:21
    Look, this is the issue.
  • 15:22 - 15:24
    It's not their fault, it's our fault.
  • 15:24 - 15:26
    They're for-profit companies.
  • 15:26 - 15:28
    They're not concerned
    with the condition of our souls.
  • 15:28 - 15:31
    They're not going to take care
    of us when we get older.
  • 15:31 - 15:35
    We have set up a society that values
    shareholder value over everything,
  • 15:35 - 15:37
    and they're doing what they're
    supposed to be doing,
  • 15:37 - 15:38
    but we need to elect people,
  • 15:38 - 15:40
    and we need to force
    ourselves to force them
  • 15:40 - 15:42
    to be subject to the same scrutiny
  • 15:42 - 15:44
    that the rest of business
    endures full-stop.
  • 15:44 - 15:46
    CA: There's another narrative
  • 15:46 - 15:50
    that is arguably equally
    consistent with the facts,
  • 15:50 - 15:57
    which is that there actually is good
    intent in much of the leadership --
  • 15:57 - 15:58
    I won't say everyone necessarily --
  • 15:58 - 16:00
    many of the employees.
  • 16:01 - 16:03
    We all know people who work
    in those companies,
  • 16:03 - 16:06
    and they still are pretty convincing
  • 16:06 - 16:07
    that their mission is to do --
  • 16:07 - 16:09
    so the alternative narrative
  • 16:09 - 16:12
    is that there have been
    unintended consequences here,
  • 16:12 - 16:14
    that the technologies
    that we're unleashing,
  • 16:14 - 16:18
    the algorithms that we're attempting
    to personalize the internet,
  • 16:18 - 16:19
    for example,
  • 16:19 - 16:25
    have a) resulted in weird effects
    like filter bubbles
  • 16:25 - 16:27
    that we weren't expecting,
  • 16:27 - 16:30
    and b) made themselves vulnerable
    to weird things like --
  • 16:30 - 16:31
    oh, I don't know,
  • 16:31 - 16:33
    Russian hackers creating accounts
  • 16:33 - 16:35
    and doing things that we didn't expect.
  • 16:35 - 16:39
    Isn't the unintended consequence
    a possibility here?
  • 16:40 - 16:41
    SG: I don't think --
  • 16:41 - 16:42
    so I'm pretty sure statistically
  • 16:42 - 16:45
    they're no worse or better people
    than any other organization
  • 16:45 - 16:47
    that has 100,000 or more people.
  • 16:47 - 16:48
    I don't think they're bad people.
  • 16:48 - 16:49
    As a matter of fact,
  • 16:49 - 16:53
    I would argue that there's a lot
    of very civic-minded, decent leadership,
  • 16:53 - 16:54
    but this is the issue.
  • 16:54 - 16:59
    When you control 90 percent
    share in a market -- search --
  • 16:59 - 17:02
    that is now bigger than the entire
    advertising market of any nation,
  • 17:02 - 17:06
    and you're primarily compensated
    and trying to develop economic security
  • 17:06 - 17:08
    for you and the families
    of your employees,
  • 17:08 - 17:09
    to increase that market share,
  • 17:09 - 17:12
    you can't help but leverage
    all the power at your disposal.
  • 17:13 - 17:15
    And that is the basis for regulation,
  • 17:15 - 17:17
    and it's the basis for the truism
    throughout history
  • 17:17 - 17:18
    that power corrupts.
  • 17:18 - 17:19
    They're not bad people,
  • 17:19 - 17:22
    we've just let them get out of control.
  • 17:22 - 17:26
    CA: So maybe the case
    is slightly overstated?
  • 17:26 - 17:29
    I know at least a bit --
  • 17:29 - 17:30
    Larry Page for example,
  • 17:30 - 17:31
    Jeff Bezos --
  • 17:31 - 17:33
    I don't actually think
    they wake up thinking,
  • 17:33 - 17:35
    "I've got to sell a fucking Nissan."
  • 17:35 - 17:37
    I don't think they think that.
  • 17:37 - 17:40
    I think they are trying to build
    something cool,
  • 17:40 - 17:42
    and are probably,
  • 17:42 - 17:44
    in moments of reflection,
  • 17:44 - 17:48
    as horrified as we might be
    at some of the things that have happened.
  • 17:48 - 17:52
    So is there a different way
    of framing this?
  • 17:52 - 17:57
    To say that when your model is advertising
  • 17:57 - 18:02
    that there are dangers there
    that you have to take on more explicitly?
  • 18:03 - 18:06
    SG: I think it's very difficult
    to set an organization up
  • 18:06 - 18:07
    as we do,
  • 18:07 - 18:10
    to pursue shareholder value
    above all else.
  • 18:10 - 18:11
    They're not non-profits.
  • 18:11 - 18:15
    The reason people go to work there
    is they want to create economic security
  • 18:15 - 18:16
    for them and their families,
  • 18:16 - 18:18
    mostly first and foremost.
  • 18:18 - 18:19
    And when you get to a point
  • 18:19 - 18:21
    where you control so much
    economic power,
  • 18:21 - 18:23
    you use all the weapons at your disposal.
  • 18:23 - 18:25
    I don't think they're bad people,
  • 18:25 - 18:28
    but I think the role of government
    and the role of us as consumers
  • 18:28 - 18:29
    and people who elect our officials,
  • 18:29 - 18:31
    is to ensure that there
    are some checks here.
  • 18:31 - 18:34
    And we have given
    them the mother of all hall passes
  • 18:34 - 18:36
    because we find them just so fascinating.
  • 18:36 - 18:38
    CA: Scott, eloquently put,
  • 18:38 - 18:39
    spectacularly put.
  • 18:39 - 18:44
    Mark Zuckerberg, Jeff Bezos,
    Larry Page, Tim Cook,
  • 18:44 - 18:45
    if you're watching,
  • 18:45 - 18:48
    you're welcome to come and make
    the counter argument as well.
  • 18:48 - 18:49
    Scott, thank you so much.
  • 18:49 - 18:50
    SG: Thanks very much.
  • 18:50 - 18:51
    (Applause)
Title:
How Amazon, Apple, Facebook and Google manipulate our emotions
Speaker:
Scott Galloway
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
19:05
