
34C3 - Social Cooling - big data’s unintended side effect

  • 0:15 - 0:22
    Herald (H): Yeah. Welcome to our next
    talk, Social Cooling. You know, people say
  • 0:22 - 0:29
    "I have no problem with surveillance. I
    have nothing to hide," but then, you know,
  • 0:29 - 0:37
    maybe the neighbors and maybe this and
    maybe that. So, tonight we're going to
  • 0:37 - 0:45
    hear Tijmen Schep who's from Holland. He's
    a privacy designer and a freelance
  • 0:45 - 0:53
    security researcher and he's gonna give
    a talk about how digital surveillance
  • 0:53 - 1:04
    changes our social way of interacting. So,
    please, let's have a hand for Tijmen Schep!
  • 1:04 - 1:15
    Applause
  • 1:15 - 1:18
    Tijmen Schep (TS): Hi everyone. Really
    cool that you're all here and really happy
  • 1:18 - 1:24
    to talk here. It's really an honor. My
    name is Tijmen Schep and I am a technology
  • 1:24 - 1:31
    critic. And that means that it's my job to
    not believe [audio cuts out] tells us and
  • 1:31 - 1:37
    that's really a lot of fun. [audio cuts
    out] is, how do I get a wider audience
  • 1:37 - 1:40
    involved in understanding technology and
    the issues that are arising from
  • 1:40 - 1:46
    technology? Because I believe that change
    comes when the public demands it. I think
  • 1:46 - 1:51
    that's really one of the important things
    when change happens. And for me as a
  • 1:51 - 1:55
    technology critic, for me words are very
    much how I hack the system, how I try to
  • 1:55 - 2:02
    hack this world. And so, tonight I'm going
    to talk to you about one of these words
  • 2:02 - 2:08
    that I think could help us. Framing the
    issue is half the battle. [audio cuts out]
  • 2:08 - 2:13
    and frame the problem - if we can explain,
    what the problem is in a certain frame...
  • 2:13 - 2:18
    that, you know, makes certain positions
    already visible, that's really half the
  • 2:18 - 2:25
    battle won. So, that frame is social
    cooling. But before I go into it, I want
  • 2:25 - 2:32
    to ask you a question. Who here recognizes
    this? You're on Facebook or some other
  • 2:32 - 2:39
    social site, and you want to click on a link
    because you think "Oh I could [audio cuts
  • 2:39 - 2:44
    out] listen [audio cuts out] could click
    on this, but it might look bad. It might
  • 2:44 - 2:47
    be remembered by someone. Some agency
    might remember it, and I could click on
  • 2:47 - 2:52
    it, but I'm hesitating to click."
    Microphone buzzing
  • 2:52 - 2:55
    laughter
  • 3:02 - 3:07
    TS: That better? Can everyone hear me now?
    Audience: No.
  • 3:07 - 3:17
    TS: No. Okay, that... yeah. Should I start
    again? Okay. So, you're on Facebook, and
  • 3:17 - 3:20
    you're thinking "Oh, that's an interesting
    link. I could click on that," but you're
  • 3:20 - 3:23
    hesitating because maybe someone's gonna
    remember
  • 3:23 - 3:26
    that. And that might come back to me
    later, and who here recognizes that
  • 3:26 - 3:32
    feeling? So, pretty much almost everybody.
    And that's increasingly what I find, when
  • 3:32 - 3:37
    I talk about the issue, that people really
    start to recognize this. And I think a
  • 3:37 - 3:42
    word we could use to describe that is
    "Click Fear." This hesitation, it could be
  • 3:42 - 3:46
    click fear. And you're not alone.
    Increasingly, we find that research
  • 3:46 - 3:51
    points out that this is a widespread problem, that
    people are hesitating to click some of the
  • 3:51 - 3:55
    links. For example, after the Snowden
    revelations, people were less likely to
  • 3:55 - 3:59
    research issues about terrorism and other
    things on Wikipedia because they thought
  • 3:59 - 4:04
    "Well, maybe the NSA wouldn't like it if I
    [audio cuts out] that. Okay, not gonna
  • 4:04 - 4:11
    move." Visits to Google dropped as well. So this
    is a pattern that research is
  • 4:11 - 4:15
    pointing to. And it's not very strange, of
    course. I mean, we all understand that if
  • 4:15 - 4:18
    you feel you're being
    watched, you change your behavior. It's a
  • 4:18 - 4:23
    very logical thing that we all understand.
    And I believe that technology is really
  • 4:23 - 4:27
    amplifying this effect. I think that's
    something that we really have to come to
  • 4:27 - 4:32
    grips with. And that's why I think social
    cooling could be useful with that. Social
  • 4:32 - 4:36
    cooling describes in a way how, in an
    increasingly digital world, where our
  • 4:36 - 4:43
    lives are increasingly digitized, it
    becomes easier to feel this pressure, to
  • 4:43 - 4:49
    feel these normative effects of these
    systems. And very much you see that,
  • 4:49 - 4:53
    because increasingly, your data is being
    turned into thousands of scores by data
  • 4:53 - 4:56
    brokers and other companies. And those
    scores are increasingly
  • 4:56 - 5:01
    influencing your chances in life. And this
    is creating an engine of oppression, an
  • 5:01 - 5:08
    engine of change that we have to
    understand. And the fun thing is that in a
  • 5:08 - 5:14
    way this idea is really being helped by
    Silicon Valley, who for a long time has
  • 5:14 - 5:17
    said "Data is the new gold," but they've
    recently, in the last five years, changed
  • 5:17 - 5:22
    that narrative. Now they're saying "Data
    is the new oil," and that's really funny,
  • 5:22 - 5:25
    because if data is the new oil, then
    immediately you get the question "Wait,
  • 5:25 - 5:31
    oil gave us global warming, so then, what
    does data give us?" And I believe that if
  • 5:31 - 5:35
    oil leads to global warming, then data
    could lead to social cooling. That could
  • 5:35 - 5:40
    be the word that we use for these negative
    effects of big data. In order to really
  • 5:40 - 5:43
    understand this, and go into it, we have
    to look at three things. First, we're
  • 5:43 - 5:47
    going to talk about the reputation
    economy, how that system works. Second
  • 5:47 - 5:51
    chapter, we're going to look at behavior
    change, how it is influencing us and
  • 5:51 - 5:55
    changing our behavior. And finally, to not
    let you go home depressed, I'm gonna talk
  • 5:55 - 6:02
    about how we can deal with this. So first.
    The reputation economy. Already we've seen
  • 6:02 - 6:08
    today that China is building this new
    system, the social credit system. It's a
  • 6:08 - 6:11
    system that will give every citizen in
    China a score that basically represents
  • 6:11 - 6:16
    how well-behaved they are. And it
    will influence your ability to get a job,
  • 6:16 - 6:21
    a loan, a visa and even a date. And for
    example, the current version of the
  • 6:21 - 6:26
    system, Sesame Credit, one of the early
    prototypes, already gives everybody that
  • 6:26 - 6:31
    wants one a score, but it is also connected
    to the largest dating website in China.
  • 6:31 - 6:35
    So, you can kind of find out "Is this
    person that I'm dating... what kind of
  • 6:35 - 6:41
    person is this? Is this someone who's,
    you know, well viewed by Chinese society?"
  • 6:42 - 6:45
    This is where it gets really heinous for
    me, because until now you could say "Well,
  • 6:45 - 6:49
    these reputation systems, they're fair, if
    you're a good person, you get a higher
  • 6:49 - 6:52
    score. If you're a bad person, you get a
    lower score," but it's not that simple. I
  • 6:52 - 6:55
    mean, your friends' score influences your
    score, and your score influences your
  • 6:55 - 7:01
    friends' score, and that's where you
    really start to see how complex social
  • 7:01 - 7:05
    pressures arise, and where we can see the
    effects of data stratification, where
  • 7:05 - 7:07
    people are starting to think "Hey, who are
    my friends, and who should I be friends
  • 7:07 - 7:14
    with?" You could think "That only happens
    in China. Those Chinese people are, you
  • 7:14 - 7:19
    know, different." But the exact
    same thing is happening here in the West,
  • 7:19 - 7:22
    except we're letting the market build it.
    I'll give you an example. This is a
  • 7:22 - 7:26
    company called "deemly" - a Danish company
    - and this is their video for their
  • 7:26 - 7:29
    service.
    Video narrator (VN): ... renting
  • 7:29 - 7:34
    apartments from others, and she loves to
    swap trendy clothes and dresses. She's
  • 7:34 - 7:38
    looking to capture her first lift from a
    RideShare app, but has no previous reviews
  • 7:38 - 7:40
    to help support her.
    Video background voices: Awww.
  • 7:40 - 7:44
    VN: Luckily, she's just joined deemly,
    where her positive feedback from the other
  • 7:44 - 7:51
    sites appears as a deemly score, helping
    her to win a RideShare in no time. Deemly
  • 7:51 - 7:56
    is free to join and supports users across
    many platforms, helping you to share and
  • 7:56 - 8:01
    benefit from the great reputation you've
    earned. Imagine the power of using your
  • 8:01 - 8:04
    deemly score alongside your CV for a job
    application...
  • 8:04 - 8:06
    TS: Like in China.
    VN: ... perhaps to help get a bank loan...
  • 8:06 - 8:08
    TS: Like...
    VN: or even to link to from your dating
  • 8:08 - 8:09
    profile.
    TS: Like in China!
  • 8:09 - 8:16
    VN: Sign up now at deemly.co. Deemly:
    better your sharing.
  • 8:16 - 8:22
    Applause
    TS: Thanks. There is a change. There is a
  • 8:22 - 8:26
    difference, though. The funny thing
    here is that it's highly invisible to us.
  • 8:26 - 8:30
    The Chinese government is very open about
    what they're building, but here we are
  • 8:30 - 8:33
    very blind to what's going on. Mostly,
    when we talk about these things, then
  • 8:33 - 8:37
    we're talking about these systems that
    give us a very clear rating, like Airbnb,
  • 8:37 - 8:42
    Uber, and of course the Chinese system.
    The thing is, most of these systems are
  • 8:42 - 8:47
    invisible to us. There's a huge market of
    data brokers who are, you know, not
  • 8:47 - 8:52
    visible to you, because you are not the
    customer. You are the product. And these
  • 8:52 - 8:57
    data brokers, well, what they do is, they
    gather as much data as possible about you.
  • 8:57 - 9:04
    And that's not all. They then create up to
    eight thousand scores about you. In the
  • 9:04 - 9:08
    United States, these companies have up to
    8,000 scores, and in Europe it's a little
  • 9:08 - 9:13
    less, around 600. These are scores about
    things like your IQ, your psychological
  • 9:13 - 9:19
    profile, your gullibility, your religion,
    your estimated life span. 8,000 of these
  • 9:19 - 9:24
    different things about you. And how does
    that work? Well, it works by machine
  • 9:24 - 9:29
    learning. So, machine learning algorithms
    can find patterns in society that we can
  • 9:29 - 9:36
    really not anticipate. For example, let's
    say you're a diabetic, and, well, let's
  • 9:36 - 9:41
    say this data broker company has a mailing
    list, or has an app, that diabetic
  • 9:41 - 9:44
    patients use. And they also have the data
    of these diabetic patients about what they
  • 9:44 - 9:48
    do on Facebook. Well, there you can start
    to see correlations. So, if diabetic
  • 9:48 - 9:54
    patients more often like gangster-rap and
    pottery on Facebook, well, then you could
  • 9:54 - 9:58
    deduce from that if you also like
    gangster-rap or pottery on Facebook, then
  • 9:58 - 10:03
    perhaps you also are more likely to have
    or get diabetes. It is highly
  • 10:03 - 10:09
    unscientific, but this is how the system
    works. And this is an example of how that
  • 10:09 - 10:13
    works with just your Facebook scores.
    Woman in the video: ... accuracy was lowest, about
  • 10:13 - 10:18
    60% when it came to predicting whether a
    user's parents were still together when
  • 10:18 - 10:21
    they were 21. People whose parents
    divorced before they were 21 tended to
  • 10:21 - 10:26
    like statements about relationships. Drug
    users were ID'd with about 65% accuracy.
  • 10:26 - 10:33
    Smokers with 73%, and drinkers with 70%.
    Sexual orientation was also easier to
  • 10:33 - 10:40
    distinguish among men. 88% right there.
    For women, it was about 75%. Gender, by
  • 10:40 - 10:45
    the way, race, religion, and political
    views, were predicted with high accuracy
  • 10:45 - 10:50
    as well. For instance: White versus black:
    95%.
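    To make the kind of correlation-based inference described above concrete, here is a minimal sketch. It assumes a hypothetical binary table of Facebook-style likes plus a list of users a broker already knows to be diabetic (for instance through the diabetes-app example above). The column layout, the "diabetes propensity" label, and the choice of a plain logistic regression are illustrative assumptions, not a description of any real broker's models; with the random placeholder data below the resulting score is meaningless, only the mechanism matters.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        # Hypothetical like-matrix: rows are users, columns are pages
        # ("gangster rap", "pottery", ...). 1 = the user liked that page.
        likes = rng.integers(0, 2, size=(1000, 50))

        # Labels for the subset the broker already knows about,
        # e.g. users of a diabetes app it owns: 1 = diabetic, 0 = not.
        labels = rng.integers(0, 2, size=1000)

        # Fit a simple classifier: it learns which likes co-occur with
        # the known label and weights them accordingly.
        model = LogisticRegression(max_iter=1000).fit(likes, labels)

        # For any other user with the same 50 like-columns, the broker can
        # now emit a "diabetes propensity" score, even though that user
        # never disclosed anything medical.
        new_user = rng.integers(0, 2, size=(1, 50))
        score = model.predict_proba(new_user)[0, 1]
        print(f"estimated 'diabetes propensity' score: {score:.2f}")

    Repeating this for one attribute after another is how you end up with the thousands of derived scores per person mentioned in the talk.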
  • 10:50 - 10:54
    TS: So, the important thing to understand
    here is that this isn't really about your
  • 10:54 - 10:58
    data anymore. Like, oftentimes when we
    talk about data protection, we talk about
  • 10:58 - 11:03
    "Oh, I want to keep control of my data."
    But this is their data. This is data that
  • 11:03 - 11:10
    they deduce, that they derive from your
    data. These are opinions about you. And
  • 11:10 - 11:14
    these things are what, you know, make it
    so that even though you never filled in a
  • 11:14 - 11:19
    psychological test, they'd have one. A
    great example of that, how that's used, is
  • 11:19 - 11:24
    a company called Cambridge Analytica. This
    company has created detailed profiles
  • 11:24 - 11:29
    about us through what they call
    psychographics and I'll let them explain
  • 11:29 - 11:33
    it themselves.
    Man in the video: By having hundreds and
  • 11:33 - 11:37
    hundreds of thousands of Americans
    undertake this survey, we were able to
  • 11:37 - 11:40
    form a
    model to predict the personality of every
  • 11:40 - 11:46
    single adult in the United States of
    America. If you know the personality of
  • 11:46 - 11:50
    the people you're targeting you can nuance
    your messaging to resonate more
  • 11:50 - 11:56
    effectively with those key audience
    groups. So, for a highly neurotic and
  • 11:56 - 12:01
    conscientious audience, you're going to
    need a message that is rational and fear-
  • 12:01 - 12:06
    based, or emotionally-based. In this case,
    the threat of a burglary, and the
  • 12:06 - 12:10
    insurance policy of a gun is very
    persuasive. And we can see where these
  • 12:10 - 12:15
    people are on the map. If we wanted to
    drill down further, we could resolve the
  • 12:15 - 12:20
    data to an individual level, where we have
    somewhere close to four or five thousand
  • 12:20 - 12:24
    data points on every adult in the United
    States.
  • 12:24 - 12:28
    TS: So, yeah. This is the company that
    worked both for the Brexit
  • 12:28 - 12:34
    campaign and for the Trump campaign. Of
    course, a little after the Trump campaign, all
  • 12:34 - 12:38
    the data was leaked, data on 200
    million Americans. And
  • 12:38 - 12:40
    increasingly, you can see
    this data described as "modeled voter
  • 12:40 - 12:46
    ethnicities and religions." So, this is
    this derived data. You might think that
  • 12:46 - 12:50
    when you go online and use Facebook and
    use all these services, that advertisers
  • 12:50 - 12:53
    are paying for you. That's a common
    misperception. That's not really the case.
  • 12:53 - 12:58
    What's really going on is that, according
    to FTC research, the majority of the
  • 12:58 - 13:02
    money made in this data broker market is
    made from risk management. All right, so,
  • 13:02 - 13:07
    in a way you could say that it's not
    really marketers that are paying for you,
  • 13:07 - 13:13
    it's your bank. It's insurers. It's your
    employer. It's governments. These kinds of
  • 13:13 - 13:19
    organizations are the ones who buy these
    profiles. The most. More than the other
  • 13:19 - 13:24
    ones. Of course, the promise of big data
    is that you can then manage risk. Big data
  • 13:24 - 13:28
    is the idea that with data you can
    understand things and then manage them.
  • 13:28 - 13:32
    So the real innovation in this big
    data world, this data economy, is the
  • 13:32 - 13:36
    democratization of the background check.
    That's really the core of this market:
  • 13:36 - 13:39
    that now you can find out everything about
    everyone.
  • 13:39 - 13:45
    So, yeah, in the past, only perhaps
    your bank knew your credit score, but
  • 13:45 - 13:48
    now your greengrocer knows your
    psychological profile.
  • 13:48 - 13:55
    Right, that's a new level of what's
    going on here. It's not
  • 13:55 - 13:59
    only invisible, but it's also huge.
    According to the same FTC research,
  • 13:59 - 14:05
    this market was already worth 150 billion
    dollars in 2015. So, it's invisible, it's
  • 14:05 - 14:11
    huge and hardly anyone knows about it. But
    that's probably going to change. And that
  • 14:11 - 14:18
    brings us to the second part: Behavioral
    change. We already see the first part of
  • 14:18 - 14:21
    this, how behavioral change is happening
    through these systems. That's through outside
  • 14:21 - 14:26
    influence, and we've talked a lot
    about this in this conference. For example
  • 14:26 - 14:29
    we see how Facebook and advertisers try to
    do that. We've also seen how China is
  • 14:29 - 14:33
    doing that, trying to influence you. Russia
    has recently tried to use Facebook to
  • 14:33 - 14:37
    influence the elections and of course
    companies like Cambridge Analytica try to
  • 14:37 - 14:39
    do the same thing.
    And here you can have a debate on, you
  • 14:39 - 14:43
    know, to what extent are they really
    influencing us, but I think that's not
  • 14:43 - 14:48
    actually the most interesting
    question. What interests me most of all is
  • 14:48 - 14:54
    how we are doing it ourselves, how we are
    creating new forms of self-censorship and
  • 14:54 - 14:59
    are proactively anticipating these
    systems. Because once you realize that
  • 14:59 - 15:02
    this is really about risk management, that
    this is about banks and
  • 15:02 - 15:06
    employers trying to understand you, people
    start to understand that this will go beyond
  • 15:06 - 15:10
    click fear, if you remember. This will go
    beyond that;
  • 15:10 - 15:14
    when people find out, this will be, you
    know, not getting a job, for example.
  • 15:14 - 15:19
    This'll be about getting really expensive
    insurance. It'll be about all these kinds
  • 15:19 - 15:22
    of problems and people are increasingly
    finding this out. So for example in the
  • 15:22 - 15:29
    United States, the IRS is now
    use data profile... are now using data
  • 15:29 - 15:34
    profiles to find out who they should
    audit. So I was talking recently to a girl
  • 15:34 - 15:38
    and she said: "Oh, I recently tweeted
    a negative tweet about the IRS," and
  • 15:38 - 15:40
    she immediately grabbed her phone to
    delete it.
  • 15:40 - 15:45
    When she realized that, you know, this
    could now be used against her in a way.
  • 15:45 - 15:49
    And that's the problem. Of course we see all
    kinds of other crazy examples that
  • 15:49 - 15:54
    the wider public
    is picking up on, like how we now have
  • 15:54 - 15:59
    algorithms that can find out if you're gay
    or not. And these things scare people and
  • 15:59 - 16:03
    these things are something we have to
    understand. So, chilling effects: that's what
  • 16:03 - 16:08
    this boils down to. For me, more important
    than these influences of these big
  • 16:08 - 16:13
    companies and nation states is how people
    themselves are experiencing these chilling
  • 16:13 - 16:18
    effects like you yourself have as well.
    That brings us back to social cooling. For
  • 16:18 - 16:24
    me, social cooling is about these two
    things combined: on the one hand, the
  • 16:24 - 16:29
    increasing ability of agents and groups
    to influence you, and on the other hand the
  • 16:29 - 16:33
    increasing willingness of people
    themselves to change their own behavior, to
  • 16:33 - 16:39
    proactively engage with this issue.
    There are three long-term consequences
  • 16:39 - 16:44
    that I want to dive into. The first is how
    this affects the individual, the second is
  • 16:44 - 16:49
    how it affects society, and the third is
    how it affects the market. So let's look
  • 16:49 - 16:55
    at the individual. Here we've seen, there's a
    rising culture of self-censorship. It
  • 16:55 - 16:59
    started for me with an article that I read in
    the New York Times, where a student was saying:
  • 16:59 - 17:02
    "Well, we're very, very reserved." She wasn't
    going to do things like spring break.
  • 17:02 - 17:06
    She said: "Well, you don't have to defend
    yourself later," so you don't do it.
  • 17:06 - 17:09
    And what she's talking about, she's
    talking about doing crazy things, you
  • 17:09 - 17:12
    know, letting go, having fun. She's
    worried that the next day it'll be on
  • 17:12 - 17:16
    Facebook.
    So what's happening here is that you do
  • 17:16 - 17:18
    have all kinds of freedoms: You have the
    freedom to look up things, you have the
  • 17:18 - 17:22
    freedom to say things, but you're
    hesitating to use it. And that's really
  • 17:22 - 17:28
    insidious. That has an effect on a wider
    society and here we really see the
  • 17:28 - 17:34
    societal value of privacy. Because in
    society often minority values later become
  • 17:34 - 17:41
    majority values. An example is weed.
    I'm from... I'm from the Netherlands and
  • 17:41 - 17:45
    there you see, you know, at first it's
    something that you just don't do and it's
  • 17:45 - 17:49
    you know a bit of a "uhh", but then "Oh, maybe
    yeah, you should... you should try it as well," and
  • 17:49 - 17:53
    people try it and slowly under the surface
    of the society, people change their minds
  • 17:53 - 17:56
    about these things. And then, after a while
    it's like, you know, "What are we still
  • 17:56 - 17:59
    worried about?"
    And the same pattern happens of
  • 17:59 - 18:03
    course with way bigger things like this:
    Martin Luther King: "I must honestly say to
  • 18:03 - 18:10
    you that I never intend to adjust myself
    to racial segregation and discrimination."
  • 18:12 - 18:14
    TS: This is the same pattern
    that's happening
  • 18:14 - 18:18
    for all kinds of things that change
    in society, and that's what privacy is so
  • 18:18 - 18:20
    important for, and that's why it's so
    important that people have the ability to
  • 18:20 - 18:23
    look things up and to change their minds
    and to talk with each other without
  • 18:23 - 18:28
    feeling so watched all the time.
    The third thing is how this impacts the
  • 18:28 - 18:34
    market. Here we see very much the rise of
    a culture of risk avoidance. An example
  • 18:34 - 18:37
    here is that in
    1995 already, doctors in New York were
  • 18:37 - 18:44
    given scores, and what happened was that
    the doctors who tried to help advanced
  • 18:44 - 18:47
    stage cancer patients, complex patients, who
    tried to do difficult
  • 18:47 - 18:52
    operations, got a low score, because these
    people more often died, while doctors that
  • 18:52 - 18:57
    didn't lift a finger and didn't try to
    help got a high score, because, well, their patients
  • 18:57 - 19:01
    didn't die. So you see here that these
    systems bring all kinds of
  • 19:01 - 19:05
    perverse incentives. They, you know,
    lower the willingness of everybody
  • 19:05 - 19:08
    to take a risk and in some areas of
    society we really like people to take
  • 19:08 - 19:15
    risks, like entrepreneurs, doctors.
    So on the whole you could say that
  • 19:15 - 19:19
    this, what we're seeing here, is some kind
    of trickle-down risk aversion, where
  • 19:19 - 19:23
    the way that
    companies and governments want to
  • 19:23 - 19:27
    manage risk, that's trickling down to us.
    And we, of course, want them to like
  • 19:27 - 19:30
    us, want to have a job, we want to have
    insurance, and then we increasingly start
  • 19:30 - 19:37
    to think "Oh, maybe I should not do this." It's
    a subtle effect. So how do we deal with
  • 19:37 - 19:40
    this?
    Well, together. I think this is a really
  • 19:40 - 19:43
    big problem. I think this is such a big
    problem that it can't be managed by
  • 19:43 - 19:47
    just some hackers or nerds building
    something, or by politicians, making a law.
  • 19:47 - 19:53
    This is really a society-wide problem.
    So I want to talk about all these groups
  • 19:53 - 19:58
    that should get into this: the public,
    politicians, business, and us.
  • 19:58 - 20:03
    So the public. I think we have to talk
    about and maybe extend the metaphor of the
  • 20:03 - 20:07
    cloud and say we have to learn to see the
    stars behind the cloud. Alright, that's one
  • 20:07 - 20:12
    way that we could... that's a narrative we
    could use. I really like to use humor to
  • 20:12 - 20:17
    explain this to a wider audience, so for
    example, last year I was part of an
  • 20:17 - 20:22
    exhibition... I helped develop exhibits about
    dubious devices and one of the devices
  • 20:22 - 20:25
    there was called "Taste your status"
    which was a coffee machine that gave you
  • 20:25 - 20:30
    coffee based on your area code. So if you
    live in a good area code, you get nice coffee.
  • 20:30 - 20:34
    You live in a bad area code, you get bad
    coffee.
  • 20:34 - 20:37
    music
    laughter
  • 20:37 - 20:42
    applause
    I won't go into it all, but... oftentimes
  • 20:42 - 20:44
    you can use humor to explain these things
    to a wider audience. I really like that
  • 20:44 - 20:48
    method, that approach.
    We've got a long way to go though. I mean,
  • 20:48 - 20:50
    if we look at the long, you know, how long
    it took for us to understand global
  • 20:50 - 20:53
    warming, to really, you know, come to a stage
    where most people understand what it is
  • 20:53 - 20:58
    and care about it except Donald Trump.
    Well, with data we really got a long way to
  • 20:58 - 21:02
    go, we're really at the beginning of
    understanding this issue like this.
  • 21:02 - 21:08
    Okay, so the second group that has to
    really wake up is politicians. And they have
  • 21:08 - 21:11
    to understand that this is really about the
    balance of power. This is really about
  • 21:11 - 21:16
    power. And if you permit me, I'll go into
    the big picture a little bit, as a media
  • 21:16 - 21:22
    theorist. So, this is Gilles Deleuze.
    He's a French philosopher, and he explained
  • 21:22 - 21:26
    in his work something that I find really
    useful. He said you have two systems of
  • 21:26 - 21:30
    control in society, and one is the
    institutional one, and that's the one we
  • 21:30 - 21:33
    all know.
    You know, the judicial system: so
  • 21:33 - 21:38
    you're free to do what you want, but then
    you cross a line, you break a law, and the
  • 21:38 - 21:41
    police get you, you go before a judge, you
    go to prison. That's the system we
  • 21:41 - 21:43
    understand.
    But he says there's another system, which
  • 21:43 - 21:47
    is the social system. This is a social
    pressure system, and this for a long time
  • 21:47 - 21:50
    wasn't really designed.
    But now increasingly we are able to do
  • 21:50 - 21:53
    that. So this is the system where you
    perform suboptimal behavior, and then that
  • 21:53 - 21:58
    gets measured and judged, and then you get
    subtly nudged in the right direction. And
  • 21:58 - 22:01
    there are some very important differences
    between these two systems. The institutional
  • 22:01 - 22:05
    system, you know, has this idea that
    you're a free citizen that makes up your
  • 22:05 - 22:10
    own mind, while the social system
    is working all the time,
  • 22:10 - 22:13
    constantly; it doesn't matter if you're
    guilty or innocent, it's always trying to
  • 22:13 - 22:19
    push you. The old system, the institutional
    system, is very much about punishment: if
  • 22:19 - 22:21
    you
    break the rules, you get punished, but
  • 22:21 - 22:24
    people sometimes don't really care about
    punishment; sometimes it's cool to get
  • 22:24 - 22:28
    punishment. But the social system uses
    something way more powerful, which is the
  • 22:28 - 22:34
    fear of exclusion. We are social animals
    and we really care about belonging to a group.
  • 22:34 - 22:37
    The other difference is that it's very
    important that the institutional system is
  • 22:37 - 22:41
    accountable, you know, democratically, to us,
    while the social system at the moment is
  • 22:41 - 22:45
    really, really invisible: these
    algorithms, how they work, where the data is
  • 22:45 - 22:49
    going, it's very hard to understand. And of
    course it's exactly what China loved so
  • 22:49 - 22:53
    much about it, right? You can
    stand in front of a tank, but you can't
  • 22:53 - 22:57
    really stand in front of the cloud. So
    yeah, that also helps me
  • 22:57 - 23:01
    to understand when people say "I have
    nothing to hide." I really understand that,
  • 23:01 - 23:03
    because when people say "I have nothing to
    hide," what they're saying is "I have nothing
  • 23:03 - 23:06
    to hide from the old system, from the
    classic system, from the institutional
  • 23:06 - 23:09
    system." They're saying "I want to help the
    police, I trust our
  • 23:09 - 23:13
    government, I trust our institutions," and that's
    actually really a positive thing to say.
  • 23:13 - 23:17
    The thing is, they don't really see the other
    part of the system: how increasingly there
  • 23:17 - 23:22
    are parts that are not in your control,
    that are not democratically checked, and that's
  • 23:22 - 23:28
    really a problem. So the third group that
    I think has to wake up is business.
  • 23:28 - 23:33
    Business has to see that this is not so
    much a problem, perhaps, but that it could
  • 23:33 - 23:36
    be an opportunity. I think I'm still
    looking for a metaphor here but perhaps,
  • 23:36 - 23:40
    if we, you know, again compare this issue
    to global warming, we could say that we need
  • 23:40 - 23:45
    something like ecological food, but for data.
    But I don't know what that's gonna
  • 23:45 - 23:48
    look like or how we're gonna explain that;
    maybe we have to talk about fast food
  • 23:48 - 23:54
    versus fast data versus ecological data
    but we need a metaphor here. Of course
  • 23:54 - 24:08
    laws are also really helpful. So we might
    get things like this. I'm actually working
  • 24:08 - 24:19
    on this, it's funny. Or if things get really
    out of hand, we might get here, right?
  • 24:19 - 24:23
    So luckily we see that in Europe
    the politicians are awake and are
  • 24:23 - 24:26
    really trying to push this market. I think
    that's really great, so I think in the
  • 24:26 - 24:29
    future we'll get to a moment where people
    say, well, I prefer European smart products,
  • 24:29 - 24:33
    for example. I think that's a good thing, I
    think this is really positive. Finally, I
  • 24:33 - 24:37
    want to get to all of us, to what each of us
    can do. I think here, again, there's a
  • 24:37 - 24:40
    parallel to global warming where at its
    core it's not so much about the new
  • 24:40 - 24:44
    technology and all the issues, it's about
    a new mindset, a new way of looking at the
  • 24:44 - 24:48
    world. And here I think we have to stop
    saying that we have nothing to hide, for
  • 24:48 - 24:52
    example. If I've learned anything in the
    past years of understanding and researching
  • 24:52 - 24:58
    privacy and this big data trade market, it is that
    privacy is the right to be imperfect. All
  • 24:58 - 25:01
    right, increasingly there's pressure to be
    the perfect citizen, to be the perfect
  • 25:01 - 25:06
    consumer and privacy is a way of getting
    out of that. So this is how I would
  • 25:06 - 25:09
    reframe privacy: it's not just about
    which bits and bytes go where, but it's
  • 25:09 - 25:13
    about, you know, the human right to be
    imperfect, because of course we are human, we are
  • 25:13 - 25:17
    all imperfect. Sometimes when I talk at a
    technology conference, people say, well,
  • 25:17 - 25:22
    privacy was just a phase. You know, it's
    like ebb and flow, and we got it and
  • 25:22 - 25:26
    it's gonna go away again. That's crazy, you
    know; you don't say women's rights were
  • 25:26 - 25:31
    just a phase, we had them for a while and
    they're gonna go again. Right? And of course
  • 25:31 - 25:34
    Edward Snowden explains it way better. He
    says arguing that you don't care about the
  • 25:34 - 25:37
    right to privacy because you have nothing
    to hide is no different than saying you
  • 25:37 - 25:41
    don't care about free speech because you
    have nothing to say. What an eloquent
  • 25:41 - 25:47
    system admin. So I think what we have to
    strive for here is that we develop a
  • 25:47 - 25:51
    more nuanced understanding of all these
    issues. I think we have to move away from
  • 25:51 - 25:54
    this idea that more data is better, that
    data is automatically progress. No, it's
  • 25:54 - 25:59
    not: data is a trade-off. For example, for
    the individual, more data might mean less
  • 25:59 - 26:04
    psychological security, less willingness
    to share, less willingness to try things. For
  • 26:04 - 26:09
    a country it might mean less autonomy for
    citizens and citizens need their own
  • 26:09 - 26:12
    autonomy they need to know what's going on
    they need to be able to vote in their own
  • 26:12 - 26:17
    autonomous way and decide what they
    want. In business you could say more data
  • 26:17 - 26:21
    might lead to less creativity right less
    willingness to share new ideas to come up
  • 26:21 - 26:30
    with new ideas - that's again an issue
    there. So in conclusion social cooling is
  • 26:30 - 26:32
    a way of understanding these issues or a
    way of framing these issues that I think
  • 26:32 - 26:37
    could be useful for us. That could help us
    understand and engage with these issues.
  • 26:37 - 26:42
    And yes, social cooling is an alarm, it's
    alarmist - it is, we're trying to say this
  • 26:42 - 26:46
    is the problem and we have to deal with
    this. But it's also really about hope. All
  • 26:46 - 26:50
    right. I trust not so much in technology; I
    trust in us, in people, that we can fix this
  • 26:50 - 26:54
    once we understand the issue in the same
    way that when we understood the problem
  • 26:54 - 26:56
    with global warming we started to deal
    with it. It's
  • 26:56 - 27:00
    slow progress, but we're doing it,
    and we can do the same thing with data.
  • 27:00 - 27:06
    It'll take a while but we'll get there.
    And finally this is about starting to
  • 27:06 - 27:10
    understand the difference between shallow
    optimism and deep optimism. All right,
  • 27:10 - 27:13
    oftentimes the technology sector is like "cool,
    new technology, we're going to fix
  • 27:13 - 27:17
    this by creating an app," and for me that,
    you know, "we have to be
  • 27:17 - 27:21
    optimistic" attitude, that's very shallow optimism,
    the TEDx kind of optimism. True optimism
  • 27:21 - 27:24
    recognizes that each technology comes with
    a downside and we have to recognize that
  • 27:24 - 27:28
    it's not a problem to
    point out these problems; it's a good
  • 27:28 - 27:32
    thing, because once you understand the problems
    you can deal with them and, you know, come
  • 27:32 - 27:37
    up with better solutions. If we don't
    change this mindset, then we might
  • 27:37 - 27:41
    create a world where we're all more well-
    behaved, but perhaps also a little bit
  • 27:41 - 28:00
    less human. Thank you.
    Applause
  • 28:00 - 28:04
    H: Thank you, Tijmen.
    TS: You are welcome.
  • 28:04 - 28:13
    Applause
    H: We still have five more minutes we'll
  • 28:13 - 28:18
    take some questions if you like. First
    microphone number 2.
  • 28:18 - 28:23
    Microphone 2 (M2): Hello, thanks that was
    a really interesting talk. I have a
  • 28:23 - 28:30
    question that I hope will work, it's a bit
    complicated. There's a project called Indy
  • 28:30 - 28:34
    by a foundation called the Sovrin
    Foundation, do you know about it? Okay, very
  • 28:34 - 28:39
    great, perfect. So, just to quickly
    explain, these people want to create an
    identity layer that will be self-sovereign,
    identity layer that will be self sovereign
    which means people can reveal what they
  • 28:43 - 28:48
    want about themselves only when they want
    but it is one unique identity on the entire
    internet. So that can potentially be very
    internet so that can potentially be very
    liberating because you control all your
  • 28:52 - 28:57
    identity and individual data. But at the
    same time it could be used to enable
  • 28:57 - 29:00
    something like the personal scores we were
    showing earlier on, so it made me think about
  • 29:00 - 29:03
    that and I wanted to know if you had an
    opinion on this.
  • 29:03 - 29:09
    TS: Yes, well, um, the first thing I think
    about is that, as I tried to explain, you see
  • 29:09 - 29:11
    a lot of initiatives have tried to be
    about: "Ooh, you have to control your own
  • 29:11 - 29:15
    data". But that's really missing the point
    that it's no longer really about your data
  • 29:15 - 29:19
    it's about this derived data. And of course
    it can help to manage what you share,
  • 29:19 - 29:24
    you know, then they can't derive anything
    from it. But I see too little of that
  • 29:24 - 29:29
    awareness. Second of all, this is very much
    for me an example of what nerds and
  • 29:29 - 29:32
    technologists are really good at, it's like:
    "oh, we've got a social problem, let's
  • 29:32 - 29:36
    create a technology app and then we'll fix
    it." Well, what I'm trying to explain is
  • 29:36 - 29:40
    that this is such a big problem that we
    cannot fix this with just one group alone
  • 29:40 - 29:42
    - not the politicians, not the designers,
    not the nerds; this is something that we
  • 29:42 - 29:48
    have to really get together, you know,
    fix together, because this is such a
  • 29:48 - 29:51
    fundamental issue right. The idea that
    risk is a problem that we want to manage
  • 29:51 - 29:56
    is so deeply ingrained in people,
    you know; this stuff based in fear is
  • 29:56 - 29:59
    fundamental and it's everywhere so it's
    not enough for one group to try to fix
  • 29:59 - 30:02
    that; it's something that we have to come
    to grips with together.
  • 30:02 - 30:06
    M2: Thanks a lot.
    H: Ok, the signal angel has a
  • 30:06 - 30:10
    question from the internet, I think.
    Signal Angel (SigA): Yes and BarkingSheep
  • 30:10 - 30:13
    is asking: "do you think there's a
    relationship between self-censorship and
  • 30:13 - 30:17
    echo chambers in a sense that people
    become afraid to challenge their own
  • 30:17 - 30:23
    belief and thus isolate themselves in
    groups with the same ideology?".
  • 30:23 - 30:28
    TS: That's a... that's a really
    big answer to that one.
  • 30:28 - 30:31
    pauses
    TS: Actually, I was e-mailing Vint Cerf,
  • 30:31 - 30:36
    and miraculously he responded, and he
    said what you really have to look at is
  • 30:36 - 30:39
    this: not just the reputation economy, but
    also the attention economy and how they're
  • 30:39 - 30:45
    linked. So for a while I've been looking
    for that link, and there's a lot to
  • 30:45 - 30:50
    say there and there definitely is a link.
    I think what's important to understand, to
  • 30:50 - 30:54
    get nuance here, is that I'm not saying
    that everybody will become really well
  • 30:54 - 31:00
    behaved, gray bookworm people. The
    thing is that what this situation is
  • 31:00 - 31:03
    creating is that we're all becoming
    theater players, playing an identity
  • 31:03 - 31:06
    more and more, because we're watched more
    of the time. And for some people that
  • 31:06 - 31:11
    might mean that they're, you know, I think
    most people will be more conservative and
  • 31:11 - 31:15
    more careful, some people will go really
    all out and really enjoy the stage! You
  • 31:15 - 31:19
    know? We have those people as well, and I
    think those people could really benefit
  • 31:19 - 31:24
    and that the attention economy could
    really you know give them a lot of
  • 31:24 - 31:27
    attention through that. So I think
    there's a link there. I could
    go on more, but I think that's it for now,
    go on more but I think it's for now, where
    as far as I'm aware.
  • 31:30 - 31:34
    H: Okay, we're short on time, we'll take,
    I'm sorry one more question. The number
  • 31:34 - 31:37
    one?
    Microphone 1 (M1): So, the, I think the
  • 31:37 - 31:39
    audience you're talking about, ...
    H: Louder, please.
  • 31:39 - 31:44
    M1: The audience you're talking to
    here is already very aware, but I'm asking
  • 31:44 - 31:50
    for, like tactics, or your tips, to spread
    your message and to talk to people that
  • 31:50 - 31:55
    are in this, they say: "Uh, I don't care,
    they can surveil me." Like, what's
  • 31:55 - 32:01
    your approach, like in a practical way?
    How do you actually do this?
  • 32:01 - 32:08
    TS: Yeah, so, I'm really glad to be here
    because I am, yes, I am a nerd, but I'm
  • 32:08 - 32:12
    also a philosopher or thinker, you know,
    and that means that
  • 32:12 - 32:16
    for me what I work with, it's not just
    Arduinos, but words and ideas. I think those,
  • 32:16 - 32:19
    as I've been trying to show, can be really
    powerful, like a word can be a really
  • 32:19 - 32:29
    powerful way to frame a debate or engage
    people. So, I haven't found yet a way to
  • 32:29 - 32:33
    push all this tar. Like, I was making joke
    that I can tell you in one sentence, what
  • 32:33 - 32:36
    privacy is and why it matters but I have
    to give a whole talk before that, all
  • 32:36 - 32:40
    right? Privacy is the right to be imperfect,
    but in order to understand that you have
  • 32:40 - 32:43
    to understand the rise of the reputation
    economy, and how it affects your chances
  • 32:43 - 32:47
    in life. The fun thing is that that
    will happen by itself, that people will
  • 32:47 - 32:51
    become more aware of that, they will run
    into these problems. They will not get a
  • 32:51 - 32:56
    job or they might get other issues, and
    then they will start to see the problem.
  • 32:56 - 32:59
    And so my goal is not so much to help
    people understand it, but to help them
  • 32:59 - 33:03
    understand it before they run into the
    wall, right? That's usually how society at
  • 33:03 - 33:07
    the moment deals with technology problems.
    It's like "Oh we'll, we'll, oh ... Oh?
  • 33:07 - 33:11
    it's a problem? Oh well, now we'll try to
    fix it." Well, I believe you can really
  • 33:11 - 33:15
    see these problems coming way earlier, and I
    think the humanities are
  • 33:15 - 33:20
    really helpful in that; like, Deleuze was
    already really,
  • 33:20 - 33:28
    really clearly explaining what the problem
    is in 1995. So yeah, I think that I
  • 33:28 - 33:33
    don't have a short way of explaining, you
    know, why privacy matters but I think
  • 33:33 - 33:38
    it'll become easier over time as people
    start to really feel these pressures.
  • 33:38 - 33:43
    H: Sorry, thank you very much for the
    question. I think we all should go out and
  • 33:43 - 33:49
    spread the message. This talk is over, I'm
    awfully sorry. When you people leave,
  • 33:49 - 33:52
    please take your bottles, and your cups,
    ...
  • 33:52 - 33:54
    applause
    H: ... and all your junk, and thank you
  • 33:54 - 34:06
    very much again Tijmen Schep!
    applause
  • 34:06 - 34:10
    music
  • 34:10 - 34:27
    subtitles created by c3subtitles.de
    in the year 2017. Join, and help us!