0:00:14.702,0:00:22.030
Herald (H): Yeah. Welcome to our next[br]talk, Social Cooling. You know, people say
0:00:22.030,0:00:28.696
"I have no problem with surveillance. I[br]have nothing to hide," but then, you know,
0:00:28.696,0:00:37.453
maybe the neighbors and maybe this and[br]maybe that. So, tonight we're going to
0:00:37.453,0:00:45.210
hear Tijmen Schep who's from Holland. He's[br]a privacy designer and a freelance
0:00:45.210,0:00:53.193
security researcher, and he's gonna give[br]a talk about how digital surveillance
0:00:53.193,0:01:04.179
changes the way we interact socially. So,[br]please, let's have a hand for Tijmen Schep!
0:01:04.179,0:01:14.710
Applause
0:01:14.710,0:01:18.289
Tijmen Schep (TS): Hi everyone. Really[br]cool that you're all here and really happy
0:01:18.289,0:01:24.430
to talk here. It's really an honor. My[br]name is Tijmen Schep and I am a technology
0:01:24.430,0:01:30.889
critic. And that means that it's my job to[br]not believe [audio cuts out] tells us and
0:01:30.889,0:01:36.810
that's really a lot of fun. [audio cuts[br]out] is, how do I get a wider audience
0:01:36.810,0:01:40.350
involved in understanding technology and[br]the issues that are arising from
0:01:40.350,0:01:46.049
technology? Because I believe that change[br]comes when the public demands it. I think
0:01:46.049,0:01:51.299
that's really one of the important things[br]when change happens. And for me as a
0:01:51.299,0:01:55.279
technology critic, for me words are very[br]much how I hack the system, how I try to
0:01:55.279,0:02:01.579
hack this world. And so, tonight I'm going[br]to talk to you about one of these words
0:02:01.579,0:02:08.310
that I think could help us. Framing the[br]issue is half the battle. [audio cuts out]
0:02:08.310,0:02:13.069
and frame the problem - if we can explain[br]what the problem is in a certain frame...
0:02:13.069,0:02:17.990
that, you know, makes certain positions[br]already visible, that's really half the
0:02:17.990,0:02:24.691
battle won. So, that frame is social[br]cooling. But before I go into it, I want
0:02:24.691,0:02:31.880
to ask you a question. Who here recognizes[br]this? You're on Facebook or some other
0:02:31.880,0:02:38.990
social site, and you click on the link[br]because you think "Oh I could [audio cuts
0:02:38.990,0:02:43.650
out] listen [audio cuts out] could click[br]on this, but it might look bad. It might
0:02:43.650,0:02:46.501
be remembered by someone. Some agency[br]might remember it, and I could click on
0:02:46.501,0:02:51.960
it, but I'm hesitating to click."[br]Microphone buzzing
0:02:51.960,0:02:54.960
laughter
0:03:02.380,0:03:07.340
TS: That better? Can everyone hear me now?[br]Audience: No.
0:03:07.340,0:03:16.570
TS: No. Okay, that... yeah. Should I start[br]again? Okay. So, you're on Facebook, and
0:03:16.570,0:03:19.850
you're thinking "Oh, that's an interesting[br]link. I could click on that," but you're
0:03:19.850,0:03:22.600
hesitating because maybe someone's gonna[br]remember
0:03:22.600,0:03:26.320
that. And that might come back to me[br]later, and who here recognizes that
0:03:26.320,0:03:31.581
feeling? So, pretty much almost everybody.[br]And that's increasingly what I find, when
0:03:31.581,0:03:36.520
I talk about the issue, that people really[br]start to recognize this. And I think a
0:03:36.520,0:03:41.540
word we could use to describe that is[br]"Click Fear." This hesitation, it could be
0:03:41.540,0:03:46.261
click fear. And you're not alone.[br]Increasingly, we find that research
0:03:46.261,0:03:50.880
points to this being a widespread problem, that[br]people are hesitating to click certain
0:03:50.880,0:03:55.090
links. For example, after the Snowden[br]revelations, people were less likely to
0:03:55.090,0:03:59.290
research issues about terrorism and other[br]things on Wikipedia because they thought
0:03:59.290,0:04:03.520
"Well, maybe the NSA wouldn't like it if I[br][audio cuts out] that. Okay, not gonna
0:04:03.520,0:04:10.780
do that." And the same happened with visiting Google as well. So this[br]is a pattern that research is pointing to.
0:04:10.780,0:04:15.380
pointing to. And it's not very strange, of[br]course. I mean, we all understand that if
0:04:15.380,0:04:17.661
you feel you're being[br]watched, you change your behavior. It's a
0:04:17.661,0:04:23.400
very logical thing that we all understand.[br]And I believe that technology is really
0:04:23.400,0:04:27.440
amplifying this effect. I think that's[br]something that we really have to come to
0:04:27.440,0:04:32.090
grips with. And that's why I think social[br]cooling could be useful with that. Social
0:04:32.090,0:04:36.249
cooling describes, in a way, how in an[br]increasingly digital world, where our
0:04:36.249,0:04:42.870
lives are increasingly digitized, it[br]becomes easier to feel this pressure, to
0:04:42.870,0:04:49.220
feel these normative effects of these[br]systems. And very much you see that,
0:04:49.220,0:04:52.740
because increasingly, your data is being[br]turned into thousands of scores by data
0:04:52.740,0:04:56.289
brokers and other companies. And those[br]scores are increasingly
0:04:56.289,0:05:01.349
influencing your chances in life. And this[br]is creating an engine of oppression, an
0:05:01.349,0:05:07.990
engine of change that we have to[br]understand. And the fun thing is that in a
0:05:07.990,0:05:13.689
way this idea is really being helped by[br]Silicon Valley, who for a long time has
0:05:13.689,0:05:17.120
said "Data is the new gold," but they've[br]recently, in the last five years, changed
0:05:17.120,0:05:22.029
that narrative. Now they're saying "Data[br]is the new oil," and that's really funny,
0:05:22.029,0:05:25.270
because if data is the new oil, then[br]immediately you get the question "Wait,
0:05:25.270,0:05:31.210
oil gave us global warming, so then, what[br]does data give us?" And I believe that if
0:05:31.210,0:05:35.319
oil leads to global warming, then data[br]could lead to social cooling. That could
0:05:35.319,0:05:40.360
be the word that we use for these negative[br]effects of big data. In order to really
0:05:40.360,0:05:43.169
understand this, and go into it, we have[br]to look at three things. First, we're
0:05:43.169,0:05:47.090
going to talk about the reputation[br]economy, how that system works. Second
0:05:47.090,0:05:51.099
chapter, we're going to look at behavior[br]change, how it is influencing us and
0:05:51.099,0:05:55.419
changing our behavior. And finally, to not[br]let you go home depressed, I'm gonna talk
0:05:55.419,0:06:01.939
about how can we deal with this. So first.[br]The reputation economy. Already we've seen
0:06:01.939,0:06:08.152
today that China is building this new[br]system, the social credit system. It's a
0:06:08.152,0:06:11.210
system that will give every citizen in[br]China a score that basically represents
0:06:11.210,0:06:15.650
how well-behaved they are. And it[br]will influence your ability to get a job,
0:06:15.650,0:06:21.349
a loan, a visa and even a date. And for[br]example, the current version of the
0:06:21.349,0:06:26.339
system, Sesame Credit, one of the early[br]prototypes, already gives everybody who
0:06:26.339,0:06:30.689
wants one a score, but it is also connected[br]to the largest dating website in China.
0:06:30.689,0:06:34.810
So, you can kind of find out "Is this[br]person that I'm dating... what kind of
0:06:34.810,0:06:40.819
person is this? Is this someone who's,[br]you know, well viewed by Chinese society?"
0:06:42.239,0:06:45.189
This is where it gets really heinous for[br]me, because until now you could say "Well,
0:06:45.189,0:06:48.860
these reputation systems, they're fair, if[br]you're a good person, you get a higher
0:06:48.860,0:06:52.470
score. If you're a bad person, you get a[br]lower score," but it's not that simple. I
0:06:52.470,0:06:55.499
mean, your friends' score influences your[br]score, and your score influences your
0:06:55.499,0:07:00.759
friends' score, and that's where you[br]really start to see how complex social
0:07:00.759,0:07:04.581
pressures arise, and where we can see the[br]effects of data stratification, where
0:07:04.581,0:07:07.389
people are starting to think "Hey, who are[br]my friends, and who should I be friends
0:07:07.389,0:07:14.350
with?" You could think "That only happens[br]in China. Those Chinese people are, you
0:07:14.350,0:07:18.999
know, different." But the exact[br]same thing is happening here in the West,
0:07:18.999,0:07:22.031
except we're letting the market build it.[br]I'll give you an example. This is a
0:07:22.031,0:07:25.750
company called "deemly" - a Danish company[br]- and this is their video for their
0:07:25.750,0:07:28.849
service.[br]Video narrator (VN): ... renting
0:07:28.849,0:07:33.600
apartments from others, and she loves to[br]swap trendy clothes and dresses. She's
0:07:33.600,0:07:38.059
looking to capture her first lift from a[br]RideShare app, but has no previous reviews
0:07:38.059,0:07:40.020
to help support her.[br]Video background voices: Awww.
0:07:40.020,0:07:44.259
VN: Luckily, she's just joined deemly,[br]where her positive feedback from the other
0:07:44.259,0:07:50.699
sites appears as a deemly score, helping[br]her to win a RideShare in no time. Deemly
0:07:50.699,0:07:55.879
is free to join and supports users across[br]many platforms, helping you to share and
0:07:55.879,0:08:01.159
benefit from the great reputation you've[br]earned. Imagine the power of using your
0:08:01.159,0:08:04.040
deemly score alongside your CV for a job[br]application...
0:08:04.040,0:08:06.159
TS: Like in China.[br]VN: ... perhaps to help get a bank loan...
0:08:06.159,0:08:08.339
TS: Like...[br]VN: or even to link to from your dating
0:08:08.339,0:08:09.339
profile.[br]TS: Like in China!
0:08:09.339,0:08:15.759
VN: Sign up now at deemly.co. Deemly:[br]better your sharing.
0:08:15.759,0:08:21.590
Applause[br]TS: Thanks. There is a change. There is a
0:08:21.590,0:08:26.499
difference, though. The funny thing[br]here is that it's highly invisible to us.
0:08:26.499,0:08:29.580
The Chinese government is very open about[br]what they're building, but here we are
0:08:29.580,0:08:33.360
very blind to what's going on. Mostly,[br]when we talk about these things, then
0:08:33.360,0:08:37.450
we're talking about these systems that[br]give us a very clear rating, like Airbnb,
0:08:37.450,0:08:41.909
Uber, and of course the Chinese system.[br]The thing is, most of these systems are
0:08:41.909,0:08:47.350
invisible to us. There's a huge market of[br]data brokers who are, you know, not
0:08:47.350,0:08:52.330
visible to you, because you are not the[br]customer. You are the product. And these
0:08:52.330,0:08:57.160
data brokers, well, what they do is, they[br]gather as much data as possible about you.
0:08:57.160,0:09:03.590
And that's not all. They then create up to[br]eight thousand scores about you. In the
0:09:03.590,0:09:07.820
United States, these companies have up to[br]8,000 scores, and in Europe it's a little
0:09:07.820,0:09:13.190
less, around 600. These are scores about[br]things like your IQ, your psychological
0:09:13.190,0:09:19.020
profile, your gullibility, your religion,[br]your estimated life span. 8,000 of these
0:09:19.020,0:09:23.690
different things about you. And how does[br]that work? Well, it works by machine
0:09:23.690,0:09:28.580
learning. So, machine learning algorithms[br]can find patterns in society that we can
0:09:28.580,0:09:35.750
really not anticipate. For example, let's[br]say you're a diabetic, and, well, let's
0:09:35.750,0:09:40.880
say this data broker company has a mailing[br]list, or has an app, that diabetic
0:09:40.880,0:09:44.311
patients use. And they also have the data[br]of these diabetic patients about what they
0:09:44.311,0:09:48.440
do on Facebook. Well, there you can start[br]to see correlations. So, if diabetic
0:09:48.440,0:09:53.840
patients more often like gangster-rap and[br]pottery on Facebook, well, then you could
0:09:53.840,0:09:58.120
deduce from that if you also like[br]gangster-rap or pottery on Facebook, then
0:09:58.120,0:10:03.460
perhaps you also are more likely to have[br]or get diabetes. It is highly
0:10:03.460,0:10:08.690
unscientific, but this is how the system[br]works. And this is an example of how that
0:10:08.690,0:10:13.470
works with just your Facebook scores.[br]Woman in the video: ... accuracy was lowest, about
0:10:13.470,0:10:17.570
60% when it came to predicting whether a[br]user's parents were still together when
0:10:17.570,0:10:21.310
they were 21. People whose parents[br]divorced before they were 21 tended to
0:10:21.310,0:10:26.330
like statements about relationships. Drug[br]users were ID'd with about 65% accuracy.
0:10:26.330,0:10:33.120
Smokers with 73%, and drinkers with 70%.[br]Sexual orientation was also easier to
0:10:33.120,0:10:40.400
distinguish among men. 88% right there.[br]For women, it was about 75%. Gender, by
0:10:40.400,0:10:44.870
the way, race, religion, and political[br]views, were predicted with high accuracy
0:10:44.870,0:10:50.270
as well. For instance: White versus black:[br]95%.
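As a toy illustration of the kind of like-based prediction the clip describes, here is a minimal sketch with entirely fabricated data: a simple classifier learns which "likes" correlate with a private trait and then guesses that trait for someone who never disclosed it. The pages, the trait, and the labels are all invented, and with four samples the result is statistically meaningless - which is exactly the "highly unscientific" point made above.

```python
# Minimal sketch (fabricated data) of like-based prediction: a classifier
# learns which "likes" correlate with a private trait, then guesses it
# for a new person who never disclosed anything.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Each string lists the pages one (fictional) person likes.
likes = [
    "gangster_rap pottery cooking",
    "pottery gardening gangster_rap",
    "football beer cars",
    "cars football travel",
]
has_trait = [1, 1, 0, 0]          # e.g. some health condition (made-up labels)

vectorizer = CountVectorizer().fit(likes)   # turn like-lists into feature vectors
model = LogisticRegression().fit(vectorizer.transform(likes), has_trait)

# A new person gets a "score" for the trait, whether they asked for one or not:
new_person = ["pottery gangster_rap movies"]
print(model.predict_proba(vectorizer.transform(new_person))[0, 1])
```

This is the shape of the derived data the talk goes on to describe: not facts you handed over, but opinions inferred about you.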
0:10:50.270,0:10:54.070
TS: So, the important thing to understand[br]here is that this isn't really about your
0:10:54.070,0:10:57.890
data anymore. Like, oftentimes when we[br]talk about data protection, we talk about
0:10:57.890,0:11:03.190
"Oh, I want to keep control of my data."[br]But this is their data. This data that
0:11:03.190,0:11:10.190
they deduce, that they derive from your[br]data. These are opinions about you. And
0:11:10.190,0:11:14.300
these things are what, you know, make it[br]so that even though you never filled in a
0:11:14.300,0:11:18.960
psychological test, they'd have one. A[br]great example of that, how that's used, is
0:11:18.960,0:11:24.220
a company called Cambridge Analytica. This[br]company has created detailed profiles
0:11:24.220,0:11:28.690
about us through what they call[br]psychographics and I'll let them explain
0:11:28.690,0:11:33.100
it themselves.[br]Man in the video: By having hundreds and
0:11:33.100,0:11:36.930
hundreds of thousands of Americans[br]undertake this survey, we were able to
0:11:36.930,0:11:40.090
form a[br]model to predict the personality of every
0:11:40.090,0:11:46.230
single adult in the United States of[br]America. If you know the personality of
0:11:46.230,0:11:50.300
the people you're targeting you can nuance[br]your messaging to resonate more
0:11:50.300,0:11:55.640
effectively with those key audience[br]groups. So, for a highly neurotic and
0:11:55.640,0:12:01.200
conscientious audience, you're going to[br]need a message that is rational and fear-
0:12:01.200,0:12:05.700
based, or emotionally-based. In this case,[br]the threat of a burglary, and the
0:12:05.700,0:12:10.410
insurance policy of a gun is very[br]persuasive. And we can see where these
0:12:10.410,0:12:15.280
people are on the map. If we wanted to[br]drill down further, we could resolve the
0:12:15.280,0:12:19.780
data to an individual level, where we have[br]somewhere close to four or five thousand
0:12:19.780,0:12:24.460
data points on every adult in the United[br]States.
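To illustrate the targeting step the speaker in the clip describes, here is a hypothetical sketch of "psychographic" message selection: picking an ad variant from estimated personality scores. The trait names, thresholds, and messages are all invented for illustration; a real system would be far more elaborate and would feed these scores from derived data like that described above.

```python
# Hypothetical sketch of psychographic message selection: choose an ad
# variant based on estimated big-five personality scores (all made up).

def pick_message(profile: dict) -> str:
    # profile holds estimated traits in [0, 1]; thresholds are arbitrary
    if profile["neuroticism"] > 0.7 and profile["conscientiousness"] > 0.7:
        return "fear-based: protect your family, insure against burglary"
    if profile["openness"] > 0.7:
        return "identity-based: a tradition of independence and self-reliance"
    return "generic: our product, your choice"

voter = {"openness": 0.4, "conscientiousness": 0.8, "neuroticism": 0.9,
         "extraversion": 0.3, "agreeableness": 0.5}  # would come from derived data
print(pick_message(voter))
```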
0:12:24.460,0:12:28.120
TS: So, yeah. This is the company that[br]worked both on the Brexit
0:12:28.120,0:12:33.570
campaign and on the Trump campaign. Of[br]course, a little after the Trump campaign, all
0:12:33.570,0:12:37.510
the data was leaked, so data on 200[br]million Americans was leaked. And
0:12:37.510,0:12:40.380
increasingly, you can see[br]this data described as "modeled voter
0:12:40.380,0:12:46.440
ethnicities and religions." So, this is[br]this derived data. You might think that
0:12:46.440,0:12:49.630
when you go online and use Facebook and[br]use all these services, that advertisers
0:12:49.630,0:12:53.310
are paying for you. That's a common[br]misperception. That's not really the case.
0:12:53.310,0:12:57.500
What's really going on is that, according[br]to FTC research, the majority of the
0:12:57.500,0:13:01.980
money made in this data broker market is[br]made from risk management. All right, so,
0:13:01.980,0:13:07.380
in a way you could say that it's not[br]really marketers that are paying for you,
0:13:07.380,0:13:12.930
it's your bank. It's insurers. It's your[br]employer. It's governments. These kinds of
0:13:12.930,0:13:19.210
organizations are the ones who buy these[br]profiles the most, more than the other
0:13:19.210,0:13:23.530
ones. Of course, the promise of big data[br]is that you can then manage risk. Big data
0:13:23.530,0:13:27.720
is the idea that with data you can[br]understand things and then manage them.
0:13:27.720,0:13:31.700
So the real innovation in this big[br]data world, this data economy, is the
0:13:31.700,0:13:36.110
democratization of the background check.[br]That's really the core of this, this market
0:13:36.110,0:13:39.310
that now you can find out everything about[br]everyone.
0:13:39.310,0:13:44.810
So, yeah, in the past, perhaps only[br]your bank could know your credit score, but
0:13:44.810,0:13:47.980
now your greengrocer knows your[br]psychological profile.
0:13:47.980,0:13:55.070
Right, that's a new level of what's[br]going on here. It's not
0:13:55.070,0:13:58.880
only invisible, but it's also huge:[br]according to the same research by the FTC,
0:13:58.880,0:14:05.340
this market was already worth 150 billion[br]dollars in 2015. So, it's invisible, it's
0:14:05.340,0:14:10.760
huge and hardly anyone knows about it. But[br]that's probably going to change. And that
0:14:10.760,0:14:18.040
brings us to the second part: Behavioral[br]change. We already see this first part of
0:14:18.040,0:14:21.370
this, how behavioral change is happening[br]through these systems. That's through outside
0:14:21.370,0:14:25.520
influence, and we've talked a lot[br]about this at this conference. For example,
0:14:25.520,0:14:29.110
we see how Facebook and advertisers try to[br]do that. We've also seen how China is
0:14:29.110,0:14:32.870
doing that, trying to influence you. Russia[br]has recently tried to use Facebook to
0:14:32.870,0:14:36.710
influence the elections and of course[br]companies like Cambridge Analytica try to
0:14:36.710,0:14:39.420
do the same thing.[br]And here you can have a debate on, you
0:14:39.420,0:14:43.480
know, to what extent are they really[br]influencing us, but I think that's not
0:14:43.480,0:14:48.090
actually the really, the most interesting[br]question. What interests me most of all is
0:14:48.090,0:14:53.840
how we are doing it ourselves, how we are[br]creating new forms of self-censorship and
0:14:53.840,0:14:59.040
are proactively anticipating these[br]systems. Because once you realize that
0:14:59.040,0:15:01.870
this is really about risk management, that[br]this is about banks and
0:15:01.870,0:15:06.400
employers trying to understand you, people[br]start to understand that this will go beyond
0:15:06.400,0:15:10.470
click fear, if you remember. This will go[br]beyond that; this will become, you know,
0:15:10.470,0:15:14.230
when people find out, this will be about[br]not getting a job, for example.
0:15:14.230,0:15:18.980
This'll be about getting really expensive[br]insurance. It'll be about all these kinds
0:15:18.980,0:15:22.080
of problems and people are increasingly[br]finding this out. So for example in the
0:15:22.080,0:15:28.970
United States, the IRS is[br]now using data
0:15:28.970,0:15:33.860
profiles to find out who they should[br]audit. So I was talking recently to a girl
0:15:33.860,0:15:37.560
and she said: "Oh, I recently tweeted[br]a negative tweet about the IRS," and
0:15:37.560,0:15:39.800
she immediately grabbed her phone to[br]delete it.
0:15:39.800,0:15:44.600
When she realized that, you know, this[br]could now be used against her in a way.
0:15:44.600,0:15:49.320
And that's the problem. Of course, we see all[br]kinds of other crazy examples that the
0:15:49.320,0:15:54.450
wider public[br]is picking up on. So we now have
0:15:54.450,0:15:58.610
algorithms that can find out if you're gay[br]or not. And these things scare people and
0:15:58.610,0:16:03.460
these things are something we have to[br]understand. So, chilling effects: this is what
0:16:03.460,0:16:08.380
this boils down to. For me, more important[br]than the influence of these big
0:16:08.380,0:16:12.850
companies and nation states is how people[br]themselves are experiencing these chilling
0:16:12.850,0:16:17.860
effects like you yourself have as well.[br]That brings us back to social cooling. For
0:16:17.860,0:16:23.810
me, social cooling is about these two[br]things combined: on the one hand, this
0:16:23.810,0:16:28.720
increasing ability of agents and groups[br]to influence you, and on the other hand, the
0:16:28.720,0:16:33.240
increasing willingness of people[br]themselves to change their own behavior to
0:16:33.240,0:16:38.680
proactively engage with this issue.[br]There are three long-term consequences
0:16:38.680,0:16:44.120
that I want to dive into. The first is how[br]this affects the individual, the second is
0:16:44.120,0:16:49.320
how it affects society, and the third is[br]how it affects the market. So let's look at
0:16:49.320,0:16:55.350
the individual. Here we've seen there's a[br]rising culture of self-censorship. It
0:16:55.350,0:16:58.970
started for me with an article that I read in the[br]New York Times, where a student was saying:
0:16:58.970,0:17:02.430
"Well, we're very, very reserved." She's[br]talking about things like spring break:
0:17:02.430,0:17:05.819
"You don't want to have to defend[br]yourself later, so you don't do it."
0:17:05.819,0:17:08.859
And what she's talking about, she's[br]talking about doing crazy things, you
0:17:08.859,0:17:12.410
know, letting go, having fun. She's[br]worried that the next day it'll be on
0:17:12.410,0:17:15.839
Facebook.[br]So what's happening here is that you do
0:17:15.839,0:17:18.190
have all kinds of freedoms: You have the[br]freedom to look up things, you have the
0:17:18.190,0:17:22.290
freedom to say things, but you're[br]hesitating to use it. And that's really
0:17:22.290,0:17:28.460
insidious. That has an effect on a wider[br]society and here we really see the
0:17:28.460,0:17:34.260
societal value of privacy. Because in[br]society often minority values later become
0:17:34.260,0:17:40.559
majority values. An example is weed.[br]I'm from the Netherlands, and
0:17:40.559,0:17:45.059
there you see, you know, at first it's[br]something that you just don't do and it's
0:17:45.059,0:17:48.800
you know a bit of a "uhh", but then "Oh, maybe[br]yeah, you should... you should try it as well," and
0:17:48.800,0:17:53.020
people try it and slowly under the surface[br]of the society, people change their minds
0:17:53.020,0:17:55.980
about these things. And then, after a while[br]it's like, you know, "What are we still
0:17:55.980,0:17:59.300
worried about?"[br]The same pattern happens, of
0:17:59.300,0:18:03.070
course with way bigger things like this:[br]Martin Luther King: "I must honestly say to
0:18:03.070,0:18:10.470
you that I never intend to adjust myself[br]to racial segregation and discrimination."
0:18:12.410,0:18:14.100
TS: This is the same pattern[br]that's happening
0:18:14.100,0:18:17.680
for all kinds of things that change[br]in society, and that's what privacy is so
0:18:17.680,0:18:20.370
important for, and that's why it's so[br]important that people have the ability to
0:18:20.370,0:18:22.880
look things up and to change their minds[br]and to talk about each other without
0:18:22.880,0:18:27.930
feeling so watched all the time.[br]The third thing is how this impacts the
0:18:27.930,0:18:33.770
market. Here we see very much the rise of[br]a culture of risk avoidance. An example
0:18:33.770,0:18:36.580
here is that in[br]1995 already, doctors in New York were
0:18:36.580,0:18:43.510
given scores, and what happened was that[br]the doctors who tried to help advanced-
0:18:43.510,0:18:47.010
stage cancer patients, complex patients, who[br]tried to do the difficult
0:18:47.010,0:18:52.380
operations, got a low score, because these[br]patients more often died, while doctors who
0:18:52.380,0:18:56.800
didn't lift a finger and didn't try to[br]help got a high score. Because, well, people
0:18:56.800,0:19:00.780
didn't die. So you see here that these[br]systems bring all kinds of
0:19:00.780,0:19:04.670
perverse incentives. They, you know, lower[br]the willingness of everybody
0:19:04.670,0:19:07.760
to take a risk and in some areas of[br]society we really like people to take
0:19:07.760,0:19:15.150
risks - like entrepreneurs, doctors.[br]So on the whole, you could say that
0:19:15.150,0:19:19.010
this, what we're seeing here, is some kind[br]of trickle-down risk aversion, where
0:19:19.010,0:19:23.130
the way that[br]companies and governments want to
0:19:23.130,0:19:27.440
manage risk is trickling down to us.[br]And we, of course, want them to like
0:19:27.440,0:19:30.309
us, we want to have a job, we want to have[br]insurance, and so we increasingly start
0:19:30.309,0:19:36.830
to think "Oh, maybe I should not do this." It's[br]a subtle effect. So how do we deal with
0:19:36.830,0:19:39.960
this?[br]Well, together. I think this is a really
0:19:39.960,0:19:43.490
big problem. I think this is such a big[br]problem that it can't be managed by
0:19:43.490,0:19:47.460
just some hackers or nerds building[br]something, or by politicians making a law.
0:19:47.460,0:19:52.640
This is really a society-wide problem.[br]So I want to talk about all these groups
0:19:52.640,0:19:57.920
that should get into this: the public,[br]politicians, business, and us.
0:19:57.920,0:20:03.270
So the public. I think we have to talk[br]about and maybe extend the metaphor of the
0:20:03.270,0:20:06.890
cloud and say we have to learn to see the[br]stars behind the cloud. Alright, that's one
0:20:06.890,0:20:11.679
way that we could... that's a narrative we[br]could use. I really like to use humor to
0:20:11.679,0:20:17.050
explain this to a wider audience, so for[br]example, last year I was part of an
0:20:17.050,0:20:22.240
exhibition... I helped develop exhibits about[br]dubious devices, and one of the devices
0:20:22.240,0:20:25.010
there was called "Taste your status"[br]which was a coffee machine that gave you
0:20:25.010,0:20:30.210
coffee based on your area code. So if you[br]live in a good area code, you get nice coffee.
0:20:30.210,0:20:33.630
You live in a bad area code, you get bad[br]coffee.
0:20:33.630,0:20:36.960
music[br]laughter
0:20:36.960,0:20:41.820
applause[br]I won't go into it further, but... oftentimes
0:20:41.820,0:20:44.130
you can use humor to explain these things[br]to a wider audience. I really like that
0:20:44.130,0:20:47.670
method, that approach.[br]We've got a long way to go though. I mean,
0:20:47.670,0:20:50.150
if we look at, you know, how long[br]it took for us to understand global
0:20:50.150,0:20:53.150
warming, to really, you know, come to a stage[br]where most people understand what it is
0:20:53.150,0:20:58.260
and care about it - except Donald Trump -[br]well, with data we've really got a long way to
0:20:58.260,0:21:01.890
go; we're really at the beginning of[br]understanding this issue.
0:21:01.890,0:21:07.970
Okay, so the second group that has to[br]really wake up is politicians. And they have
0:21:07.970,0:21:10.910
to understand that this is really about the[br]balance of power. This is really about
0:21:10.910,0:21:16.380
power. And if you permit me, I'll go into[br]the big picture a little bit, as a media
0:21:16.380,0:21:21.750
theorist. So, this is Gilles Deleuze.[br]He's a French philosopher and he explained
0:21:21.750,0:21:26.000
in his work something that I find really[br]useful. He said you have two systems of
0:21:26.000,0:21:30.152
control in society, and one is the[br]institutional one, and that's the one we
0:21:30.152,0:21:32.940
all know.[br]You know, the judicial system:
0:21:32.940,0:21:38.140
you're free to do what you want, but then[br]you cross a line, you break a law, the
0:21:38.140,0:21:40.610
police get you, you get charged, you[br]go to prison. That's the system we
0:21:40.610,0:21:43.010
understand.[br]But he says there's another system which
0:21:43.010,0:21:46.840
is the social system. This is a social[br]pressure system, and for a long time this
0:21:46.850,0:21:49.670
wasn't really designed.[br]But now increasingly we are able to do
0:21:49.670,0:21:53.480
that. So this is the system where you[br]perform suboptimal behavior, and then that
0:21:53.480,0:21:58.240
gets measured and judged and then you get[br]subtly nudged in the right direction. And
0:21:58.240,0:22:01.130
There are some very important differences[br]between these two systems. The institutional
0:22:01.130,0:22:05.010
system, you know, has this idea that[br]you're a free citizen who makes up your
0:22:05.010,0:22:10.500
own mind. The social system[br]isn't like that: it's working all the time,
0:22:10.500,0:22:13.280
constantly; it doesn't matter if you're[br]guilty or innocent, it's always trying to
0:22:13.280,0:22:19.020
push you. The old system, the institutional[br]system is very much about punishment so if
0:22:19.020,0:22:21.020
you[br]break the rules you get punished, but
0:22:21.020,0:22:23.570
people sometimes don't really care about[br]punishment - sometimes it's cool to get
0:22:23.570,0:22:27.670
punished. But the social system uses[br]something way more powerful, which is the
0:22:27.670,0:22:33.890
fear of exclusion. We are social animals[br]and we really care to belong to a group.
0:22:33.890,0:22:37.350
The other difference is that, very[br]importantly, the institutional system is
0:22:37.350,0:22:40.920
accountable - you know, democratically - to us,[br]while the social system at the moment is
0:22:40.920,0:22:45.220
really, really invisible: these[br]algorithms, how they work, where the data is
0:22:45.220,0:22:48.720
going - it's very hard to understand. And of[br]course it's exactly what China loves so
0:22:48.720,0:22:52.530
much about it, right? You can[br]stand in front of a tank, but you can't
0:22:52.530,0:22:56.690
really stand in front of the cloud. So,[br]yeah, that's... that's great. It also helps me
0:22:56.690,0:23:00.660
to understand when people say "I have[br]nothing to hide." I really understand that,
0:23:00.660,0:23:03.410
because when people say "I have nothing to[br]hide," what they're saying is "I have nothing
0:23:03.410,0:23:06.340
to hide from the old system, from the[br]classic system, from the institutional
0:23:06.340,0:23:09.460
system." They're saying "I want to help the[br]police, I trust our
0:23:09.460,0:23:13.382
government, I trust our institutions," and that's[br]actually really a positive thing to say.
0:23:13.382,0:23:16.980
The thing is, they don't really see the other[br]part of the system: how increasingly there
0:23:16.980,0:23:21.970
are parts that are not in your control,[br]that are not democratically checked, and that's
0:23:21.970,0:23:28.499
really a problem. So the third group that[br]I think has to wake up is business:
0:23:28.499,0:23:32.560
business has to see that this is perhaps not so[br]much a problem, but that it could
0:23:32.560,0:23:36.110
be an opportunity. I think I'm still[br]looking for a metaphor here but perhaps,
0:23:36.110,0:23:40.350
if we, you know, again compare this issue[br]to global warming, we could say that we need
0:23:40.350,0:23:44.930
something like ecological food, but for data.[br]But I don't know what that's gonna
0:23:44.930,0:23:48.270
look like or how we're gonna explain that -[br]maybe we have to talk about fast food
0:23:48.270,0:23:54.040
versus... fast data versus ecological data -[br]but we need a metaphor here. Of course,
0:23:54.040,0:24:07.750
laws are also really helpful. So we might[br]get things like this. I'm actually working
0:24:07.750,0:24:18.600
on this - it's funny. Or if things get really[br]out of hand, we might end up here, right?
0:24:18.600,0:24:23.020
So luckily we see that in Europe[br]the politicians are awake and are
0:24:23.020,0:24:25.630
really trying to push this market. I think[br]that's really great, and I think in the
0:24:25.630,0:24:28.679
future we'll get to a moment where people[br]say well I prefer European smart products
0:24:28.679,0:24:33.320
for example. I think that's a good thing; I[br]think this is really positive. Finally, I
0:24:33.320,0:24:36.930
want to get to all of us, to what each of us[br]can do. I think here again there's a
0:24:36.930,0:24:40.412
parallel to global warming where at its[br]core it's not so much about the new
0:24:40.412,0:24:43.770
technology and all the issues, it's about[br]a new mindset, a new way of looking at the
0:24:43.770,0:24:48.140
world. And here I think we have to stop[br]saying that we have nothing to hide, for
0:24:48.140,0:24:52.460
example. If I've learned anything in the[br]past years understanding and researching
0:24:52.460,0:24:58.200
privacy and this big data trade market, it is that[br]privacy is the right to be imperfect. All
0:24:58.200,0:25:01.390
right, increasingly there's pressure to be[br]the perfect citizen, to be the perfect
0:25:01.390,0:25:05.840
consumer and privacy is a way of getting[br]out of that. So this is how I would
0:25:05.840,0:25:09.470
reframe privacy: it's not just about[br]which bits and bytes go where, but it's
0:25:09.470,0:25:13.040
about, you know, the human right to be[br]imperfect, because of course we are human, we are
0:25:13.040,0:25:17.440
all imperfect. Sometimes when I talk at a[br]technology conference, people say, "Well,
0:25:17.440,0:25:21.500
privacy was just a phase. You know, it's[br]like ebb and flow: we had it, and
0:25:21.500,0:25:25.980
it's gonna go away again." That's crazy, you[br]know; you don't say "women's rights were
0:25:25.980,0:25:30.710
just a phase, we had them for a while and[br]they're gonna go away again." Right? And of course,
0:25:30.710,0:25:34.341
Edward Snowden explains it way better. He[br]says arguing that you don't care about the
0:25:34.341,0:25:37.200
right to privacy because you have nothing[br]to hide is no different than saying you
0:25:37.200,0:25:40.930
don't care about free speech because you[br]have nothing to say. What an eloquent
0:25:40.930,0:25:47.210
system admin. So I think what we have to[br]strive for here is that we develop a
0:25:47.210,0:25:50.799
more nuanced understanding of all these[br]issues. I think we have to go away from
0:25:50.799,0:25:54.330
this idea that more data is better, that[br]data is automatically progress. No, it's
0:25:54.330,0:25:58.940
not. Data is a trade-off. For example, for[br]the individual, more data might mean less
0:25:58.940,0:26:04.381
psychological security, less willingness[br]to share, less willingness to try things. For
0:26:04.381,0:26:09.110
a country it might mean less autonomy for[br]citizens and citizens need their own
0:26:09.110,0:26:12.120
autonomy they need to know what's going on[br]they need to be able to vote in their own
0:26:12.120,0:26:17.050
autonomous way and decide what they[br]want. In business, you could say more data
0:26:17.050,0:26:21.140
might lead to less creativity, right? Less[br]willingness to share new ideas, to come up
0:26:21.140,0:26:29.510
with new ideas - that's again an issue[br]there. So in conclusion social cooling is
0:26:29.510,0:26:32.460
a way of understanding these issues or a[br]way of framing these issues that I think
0:26:32.460,0:26:37.030
could be useful for us. That could help us[br]understand and engage with these issues.
0:26:37.030,0:26:41.620
And yes, social cooling is an alarm, it's[br]alarmist - it is, we're trying to say "This
0:26:41.620,0:26:45.940
is the problem and we have to deal with[br]it." But it's also really about hope. All
0:26:45.940,0:26:50.020
right, I trust not so much in technology; I[br]trust in us, in people - that we can fix this
0:26:50.020,0:26:53.530
once we understand the issue in the same[br]way that when we understood the problem
0:26:53.530,0:26:56.460
with global warming, we started to deal[br]with it. It's
0:26:56.460,0:27:00.030
slow progress, but we're doing it,[br]and we can do the same thing with data.
0:27:00.030,0:27:06.360
It'll take a while but we'll get there.[br]And finally this is about starting to
0:27:06.360,0:27:10.350
understand the difference between shallow[br]optimism and deep optimism. All right,
0:27:10.350,0:27:13.290
oftentimes the technology sector says "Right, cool,[br]new technology, and we're going to fix
0:27:13.290,0:27:16.570
this by creating an app," and for me that's,[br]you know - they say "we have to be
0:27:16.570,0:27:20.880
optimistic" - that's very shallow optimism,[br]the TEDx kind of optimism. True optimism
0:27:20.880,0:27:24.480
recognizes that each technology comes with[br]a downside and we have to recognize that
0:27:24.480,0:27:28.420
it's not a problem to[br]point out these problems - it's a good
0:27:28.420,0:27:32.049
thing: once you understand the problems,[br]you can deal with them and, you know, come
0:27:32.049,0:27:37.150
up with better solutions. If we don't[br]change this mindset, then we might
0:27:37.150,0:27:41.370
create a world where we're all more well-[br]behaved, but perhaps also a little bit
0:27:41.370,0:27:59.750
less human. Thank you.[br]Applause
0:27:59.750,0:28:04.440
H: Thank you, Tijmen.[br]TS: You are welcome.
0:28:04.440,0:28:13.080
Applause[br]H: We still have five more minutes we'll
0:28:13.080,0:28:18.179
take some questions if you like. First[br]microphone number 2.
0:28:18.179,0:28:23.170
Microphone 2 (M2): Hello, thanks that was[br]a really interesting talk. I have a
0:28:23.170,0:28:29.790
question that I hope will work; it's a bit[br]complicated. There's a project called Indy,
0:28:29.790,0:28:34.230
by a foundation called the Sovrin[br]Foundation - do you know about it? Okay,
0:28:34.230,0:28:38.720
great, perfect. So, just to quickly[br]explain: these people want to create an
0:28:38.720,0:28:42.669
identity layer that will be self-sovereign,[br]which means people can reveal what they
0:28:42.669,0:28:47.530
want about themselves, only when they want,[br]but it is one unique identity on the entire
0:28:47.530,0:28:51.750
internet so that can potentially be very[br]liberating because you control all your
0:28:51.750,0:28:57.130
identity and individual data. But at the[br]same time it could be used to enable
0:28:57.130,0:29:00.330
something like the personal scores you were[br]showing earlier on, so it made me think about
0:29:00.330,0:29:02.720
that and I wanted to know if you had an[br]opinion on this.
0:29:02.720,0:29:09.059
TS: Yes, well, um, the first thing I think[br]about is that, as I tried to explain,
0:29:09.059,0:29:11.440
a lot of initiatives have tried to be[br]about "Ooh, you have to control your own
0:29:11.440,0:29:15.490
data". But that's really missing the point[br]that it's no longer really about your data
0:29:15.490,0:29:19.410
but about this derived data. And of course[br]it can help to manage what you share -
0:29:19.410,0:29:23.600
you know, then they can't derive anything[br]from it. But I see too little of that
0:29:23.600,0:29:29.360
awareness. Second of all this is very much[br]for me an example of what nerds and
0:29:29.360,0:29:32.480
technologists are really good at; it's like:[br]"Oh, we've got a social problem, let's
0:29:32.480,0:29:36.299
create a technology app and then we'll fix[br]it". Well what I'm trying to explain is
0:29:36.299,0:29:39.690
that this is such a big problem that we[br]cannot fix this with just one group alone
0:29:39.690,0:29:42.320
- not the politicians, not the designers,[br]not the nerds. This is something that we
0:29:42.320,0:29:47.799
have to really get together, you know, and[br]fix together, because this is such a
0:29:47.799,0:29:50.930
fundamental issue, right? The idea that[br]risk is a problem, that we want to manage
0:29:50.930,0:29:55.570
risk, is so deeply ingrained in people,[br]you know - it's based in fear, it's
0:29:55.570,0:29:59.169
fundamental and it's everywhere - so it's[br]not enough for one group to try to fix
0:29:59.169,0:30:02.450
that it's something that we have to come[br]to grips with together.
0:30:02.450,0:30:06.390
M2: Thanks a lot.[br]H: OK, the signal angel has a
0:30:06.390,0:30:09.730
question from the internet I think.[br]Signal Angel (SigA): Yes and BarkingSheep
0:30:09.730,0:30:13.160
is asking: "do you think there's a[br]relationship between self-censorship and
0:30:13.160,0:30:16.980
echo chambers in a sense that people[br]become afraid to challenge their own
0:30:16.980,0:30:22.799
belief and thus isolate themselves in[br]groups with the same ideology?".
0:30:22.799,0:30:27.830
TS: That's a... that's a really[br]big answer to that one.
0:30:27.830,0:30:31.440
pauses[br]TS: Actually, I was e-mailing Vint Cerf,
0:30:31.440,0:30:35.610
and miraculously he, he responded, and he[br]said what you really have to look for is
0:30:35.610,0:30:39.480
this, not just a reputation economy, but[br]also the attention economy and how they're
0:30:39.480,0:30:45.280
linked. So for a while I've been looking[br]for that, that link and there's a lot to
0:30:45.280,0:30:49.990
say there, and there definitely is a link.[br]I think what's important to understand, to
0:30:49.990,0:30:53.540
get the nuance here, is that I'm not saying[br]that everybody will become really well-
0:30:53.540,0:30:59.700
behaved, grey bookworm people. The[br]thing is that what this situation is
0:30:59.700,0:31:03.230
creating is that we're all becoming[br]theater players, playing an identity
0:31:03.230,0:31:06.230
more and more, because we're watched more[br]of the time. And for some people that
0:31:06.230,0:31:10.660
might mean that they're, you know, I think[br]most people will be more conservative and
0:31:10.660,0:31:15.400
more careful, but some people will go really[br]all out and really enjoy the stage! You
0:31:15.400,0:31:18.850
know? We have those people as well, and I[br]think those people could really benefit
0:31:18.850,0:31:23.800
and that the attention economy could[br]really you know give them a lot of
0:31:23.800,0:31:26.859
attention through that. So I think[br]there's a link there, and I could
0:31:26.859,0:31:29.549
go on more, but I think that's it for now,[br]as far as I'm aware.
0:31:29.549,0:31:33.819
H: Okay, we're short on time, we'll take,[br]I'm sorry one more question. The number
0:31:33.819,0:31:36.850
one?[br]Microphone 1 (M1): So, the, I think the
0:31:36.850,0:31:38.710
audience you're talking about, ...[br]H: Louder, please.
0:31:38.710,0:31:44.350
M1: The audience you're talking to[br]here is already very aware, but I'm asking
0:31:44.350,0:31:50.210
for, like tactics, or your tips, to spread[br]your message and to talk to people that
0:31:50.210,0:31:55.440
are like this, who say: "Uh, I don't care,[br]they can surveil me." Like, what's
0:31:55.440,0:32:01.430
your approach, like in a practical way?[br]How do you actually do this?
0:32:01.430,0:32:08.230
TS: Yeah, so, I'm really glad to be here[br]because I am, yes, I am a nerd, but I'm
0:32:08.230,0:32:11.539
also a philosopher, a thinker, you know,[br]and that means that
0:32:11.539,0:32:15.820
for me, what I work with is not just[br]Arduinos, but words and ideas. I think those,
0:32:15.820,0:32:18.860
as I've been trying to show, can be really[br]powerful - a word can be a really
0:32:18.860,0:32:29.229
powerful way to frame a debate or engage[br]people. So, I haven't yet found a way to
0:32:29.229,0:32:33.030
compress all of this. Like, I was making a joke[br]that I can tell you in one sentence what
0:32:33.030,0:32:36.500
privacy is and why it matters but I have[br]to give a whole talk before that, all
0:32:36.500,0:32:39.530
right? Privacy is a right to be imperfect[br]but in order to understand that you have
0:32:39.530,0:32:42.710
to understand the rise of the reputation[br]economy, and how it affects your chances
0:32:42.710,0:32:47.061
in life. The fun thing is that that[br]will happen by itself - that people will
0:32:47.061,0:32:50.700
become more aware of that, they will run[br]into these problems. They will not get a
0:32:50.700,0:32:55.660
job or they might get other issues, and[br]then they will start to see the problem.
0:32:55.660,0:32:59.150
And so my quest is not so much to help[br]people understand it, but to help them
0:32:59.150,0:33:02.980
understand it before they run into the[br]wall, right? That's how usually society at
0:33:02.980,0:33:06.530
the moment deals with technology problems.[br]It's like "Oh we'll, we'll, oh ... Oh?
0:33:06.530,0:33:10.820
it's a problem? Oh well, now we'll try to[br]fix it." Well, I believe you can really
0:33:10.820,0:33:15.020
see these problems coming way earlier, and I[br]think the humanities, where I come from,
0:33:15.020,0:33:19.630
are really helpful in that - you know, like[br]Deleuze was already really,
0:33:19.630,0:33:28.040
really clearly explaining what the problem[br]was, back in 1995. So yeah, I think I
0:33:28.040,0:33:32.530
don't have a short way of explaining, you[br]know, why privacy matters but I think
0:33:32.530,0:33:38.220
it'll become easier over time as people[br]start to really feel these pressures.
0:33:38.220,0:33:42.530
H: Sorry, thank you very much for the[br]question. I think we all should go out and
0:33:42.530,0:33:49.320
spread the message. This talk is over, I'm[br]awfully sorry. When you people leave,
0:33:49.320,0:33:51.620
please take your bottles, and your cups,[br]...
0:33:51.620,0:33:54.190
applause[br]H: ... and all your junk, and thank you
0:33:54.190,0:34:05.500
very much again Tijmen Schep![br]applause
0:34:05.500,0:34:09.899
music
0:34:09.899,0:34:27.000
subtitles created by c3subtitles.de[br]in the year 2017. Join, and help us!