WEBVTT
00:00:14.702 --> 00:00:22.030
Herald (H): Yeah. Welcome to our next
talk, Social Cooling. You know, people say
00:00:22.030 --> 00:00:28.696
"I have no problem with surveillance. I
have nothing to hide," but then, you know,
00:00:28.696 --> 00:00:37.453
maybe the neighbors and maybe this and
maybe that. So, tonight we're going to
00:00:37.453 --> 00:00:45.210
hear Tijmen Schep who's from Holland. He's
a privacy designer and a freelance
00:00:45.210 --> 00:00:53.193
security researcher, and he's gonna give
a talk about how digital surveillance
00:00:53.193 --> 00:01:04.179
changes our social way of interacting. So,
please, let's have a hand for Tijmen Schep!
00:01:04.179 --> 00:01:14.710
Applause
00:01:14.710 --> 00:01:18.289
Tijmen Schep (TS): Hi everyone. Really
cool that you're all here and really happy
00:01:18.289 --> 00:01:24.430
to talk here. It's really an honor. My
name is Tijmen Schep and I am a technology
00:01:24.430 --> 00:01:30.889
critic. And that means that it's my job to
not believe [audio cuts out] tells us and
00:01:30.889 --> 00:01:36.810
that's really a lot of fun. [audio cuts
out] is, how do I get a wider audience
00:01:36.810 --> 00:01:40.350
involved in understanding technology and
the issues that are arising from
00:01:40.350 --> 00:01:46.049
technology? Because I believe that change
comes when the public demands it. I think
00:01:46.049 --> 00:01:51.299
that's really one of the important things
when change happens. And for me as a
00:01:51.299 --> 00:01:55.279
technology critic, words are very
much how I hack the system, how I try to
00:01:55.279 --> 00:02:01.579
hack this world. And so, tonight I'm going
to talk to you about one of these words
00:02:01.579 --> 00:02:08.310
that I think could help us. Framing the
issue is half the battle. [audio cuts out]
00:02:08.310 --> 00:02:13.069
and frame the problem - if we can explain
what the problem is in a certain frame...
00:02:13.069 --> 00:02:17.990
that, you know, makes certain positions
already visible, that's really half the
00:02:17.990 --> 00:02:24.691
battle won. So, that frame is social
cooling. But before I go into it, I want
00:02:24.691 --> 00:02:31.880
to ask you a question. Who here recognizes
this? You're on Facebook or some other
00:02:31.880 --> 00:02:38.990
social site, and you want to click on a link
because you think "Oh I could [audio cuts
00:02:38.990 --> 00:02:43.650
out] listen [audio cuts out] could click
on this, but it might look bad. It might
00:02:43.650 --> 00:02:46.501
be remembered by someone. Some agency
might remember it, and I could click on
00:02:46.501 --> 00:02:51.960
it, but I'm hesitating to click."
Microphone buzzing
00:02:51.960 --> 00:02:54.960
laughter
00:03:02.380 --> 00:03:07.340
TS: That better? Can everyone hear me now?
Audience: No.
00:03:07.340 --> 00:03:16.570
TS: No. Okay, that... yeah. Should I start
again? Okay. So, you're on Facebook, and
00:03:16.570 --> 00:03:19.850
you're thinking "Oh, that's an interesting
link. I could click on that," but you're
00:03:19.850 --> 00:03:22.600
hesitating because "maybe someone's gonna
remember
00:03:22.600 --> 00:03:26.320
that, and that might come back to me
later." And who here recognizes that
00:03:26.320 --> 00:03:31.581
feeling? So, pretty much almost everybody.
And that's increasingly what I find, when
00:03:31.581 --> 00:03:36.520
I talk about the issue, that people really
start to recognize this. And I think a
00:03:36.520 --> 00:03:41.540
word we could use to describe that is
"Click Fear." This hesitation, it could be
00:03:41.540 --> 00:03:46.261
click fear. And you're not alone.
Increasingly, we find that research
00:03:46.261 --> 00:03:50.880
points to this being a widespread problem: that
people are hesitating to click on some of the
00:03:50.880 --> 00:03:55.090
links. For example, after the Snowden
revelations, people were less likely to
00:03:55.090 --> 00:03:59.290
research issues about terrorism and other
things on Wikipedia because they thought
00:03:59.290 --> 00:04:03.520
"Well, maybe the NSA wouldn't like it if I
[audio cuts out] that. Okay, not gonna
00:04:03.520 --> 00:04:10.780
do that. And visits to Google went down as well. So this
is a pattern that research is
00:04:10.780 --> 00:04:15.380
pointing to. And it's not very strange, of
course. I mean, we all understand that if
00:04:15.380 --> 00:04:17.661
you feel you're being
watched, you change your behavior. It's a
00:04:17.661 --> 00:04:23.400
very logical thing that we all understand.
And I believe that technology is really
00:04:23.400 --> 00:04:27.440
amplifying this effect. I think that's
something that we really have to come to
00:04:27.440 --> 00:04:32.090
grips with. And that's why I think social
cooling could be useful with that. Social
00:04:32.090 --> 00:04:36.249
cooling describes in a way how, in an
increasingly digital world, where our
00:04:36.249 --> 00:04:42.870
lives are increasingly digitized, it
becomes easier to feel this pressure, to
00:04:42.870 --> 00:04:49.220
feel these normative effects of these
systems. And very much you see that,
00:04:49.220 --> 00:04:52.740
because increasingly, your data is being
turned into thousands of scores by data
00:04:52.740 --> 00:04:56.289
brokers and other companies. And those
scores are increasingly
00:04:56.289 --> 00:05:01.349
influencing your chances in life. And this
is creating an engine of oppression, an
00:05:01.349 --> 00:05:07.990
engine of change that we have to
understand. And the fun thing is that in a
00:05:07.990 --> 00:05:13.689
way this idea is really being helped by
Silicon Valley, who for a long time has
00:05:13.689 --> 00:05:17.120
said "Data is the new gold," but they've
recently, in the last five years, changed
00:05:17.120 --> 00:05:22.029
that narrative. Now they're saying "Data
is the new oil," and that's really funny,
00:05:22.029 --> 00:05:25.270
because if data is the new oil, then
immediately you get the question "Wait,
00:05:25.270 --> 00:05:31.210
oil gave us global warming, so then, what
does data give us?" And I believe that if
00:05:31.210 --> 00:05:35.319
oil leads to global warming, then data
could lead to social cooling. That could
00:05:35.319 --> 00:05:40.360
be the word that we use for these negative
effects of big data. In order to really
00:05:40.360 --> 00:05:43.169
understand this, and go into it, we have
to look at three things. First, we're
00:05:43.169 --> 00:05:47.090
going to talk about the reputation
economy, how that system works. Second
00:05:47.090 --> 00:05:51.099
chapter, we're going to look at behavior
change, how it is influencing us and
00:05:51.099 --> 00:05:55.419
changing our behavior. And finally, to not
let you go home depressed, I'm gonna talk
00:05:55.419 --> 00:06:01.939
about how we can deal with this. So first.
The reputation economy. Already we've seen
00:06:01.939 --> 00:06:08.152
today that China is building this new
system, the social credit system. It's a
00:06:08.152 --> 00:06:11.210
system that will give every citizen in
China a score that basically represents
00:06:11.210 --> 00:06:15.650
how well-behaved they are. And it
will influence your ability to get a job,
00:06:15.650 --> 00:06:21.349
a loan, a visa, and even a date. And for
example, the current version of the
00:06:21.349 --> 00:06:26.339
system, Sesame Credit, one of the early
prototypes, already gives everybody that
00:06:26.339 --> 00:06:30.689
wants one a score, and it's also connected
to the largest dating website in China.
00:06:30.689 --> 00:06:34.810
So, you can kind of find out "Is this
person that I'm dating... what kind of
00:06:34.810 --> 00:06:40.819
person is this? Is this someone who's,
you know, well viewed by Chinese society?"
00:06:42.239 --> 00:06:45.189
This is where it gets really heinous for
me, because until now you could say "Well,
00:06:45.189 --> 00:06:48.860
these reputation systems, they're fair, if
you're a good person, you get a higher
00:06:48.860 --> 00:06:52.470
score. If you're a bad person, you get a
lower score," but it's not that simple. I
00:06:52.470 --> 00:06:55.499
mean, your friends' score influences your
score, and your score influences your
00:06:55.499 --> 00:07:00.759
friends' score, and that's where you
really start to see how complex social
00:07:00.759 --> 00:07:04.581
pressures arise, and where we can see the
effects of data stratification, where
00:07:04.581 --> 00:07:07.389
people are starting to think "Hey, who are
my friends, and who should I be friends
00:07:07.389 --> 00:07:14.350
with?" You could think "That only happens
in China. Those Chinese people are, you
00:07:14.350 --> 00:07:18.999
know, different." But the exact
same thing is happening here in the West,
00:07:18.999 --> 00:07:22.031
except we're letting the market build it.
I'll give you an example. This is a
00:07:22.031 --> 00:07:25.750
company called "deemly" - a Danish company
- and this is their video for their
00:07:25.750 --> 00:07:28.849
service.
Video narrator (VN): ... renting
00:07:28.849 --> 00:07:33.600
apartments from others, and she loves to
swap trendy clothes and dresses. She's
00:07:33.600 --> 00:07:38.059
looking to capture her first lift from a
RideShare app, but has no previous reviews
00:07:38.059 --> 00:07:40.020
to help support her.
Video background voices: Awww.
00:07:40.020 --> 00:07:44.259
VN: Luckily, she's just joined deemly,
where her positive feedback from the other
00:07:44.259 --> 00:07:50.699
sites appears as a deemly score, helping
her to win a RideShare in no time. Deemly
00:07:50.699 --> 00:07:55.879
is free to join and supports users across
many platforms, helping you to share and
00:07:55.879 --> 00:08:01.159
benefit from the great reputation you've
earned. Imagine the power of using your
00:08:01.159 --> 00:08:04.040
deemly score alongside your CV for a job
application...
00:08:04.040 --> 00:08:06.159
TS: Like in China.
VN: ... perhaps to help get a bank loan...
00:08:06.159 --> 00:08:08.339
TS: Like...
VN: or even to link to from your dating
00:08:08.339 --> 00:08:09.339
profile.
TS: Like in China!
00:08:09.339 --> 00:08:15.759
VN: Sign up now at deemly.co. Deemly:
better your sharing.
00:08:15.759 --> 00:08:21.590
Applause
TS: Thanks. There is a
00:08:21.590 --> 00:08:26.499
difference, though. The funny thing
here is that it's highly invisible to us.
00:08:26.499 --> 00:08:29.580
The Chinese government is very open about
what they're building, but here we are
00:08:29.580 --> 00:08:33.360
very blind to what's going on. Mostly,
when we talk about these things,
00:08:33.360 --> 00:08:37.450
we're talking about these systems that
give us a very clear rating, like Airbnb,
00:08:37.450 --> 00:08:41.909
Uber, and of course the Chinese system.
The thing is, most of these systems are
00:08:41.909 --> 00:08:47.350
invisible to us. There's a huge market of
data brokers who are, you know, not
00:08:47.350 --> 00:08:52.330
visible to you, because you are not the
customer. You are the product. And these
00:08:52.330 --> 00:08:57.160
data brokers, well, what they do is, they
gather as much data as possible about you.
00:08:57.160 --> 00:09:03.590
And that's not all. They then create up to
eight thousand scores about you. In the
00:09:03.590 --> 00:09:07.820
United States, these companies have up to
8,000 scores, and in Europe it's a little
00:09:07.820 --> 00:09:13.190
less, around 600. These are scores about
things like your IQ, your psychological
00:09:13.190 --> 00:09:19.020
profile, your gullibility, your religion,
your estimated life span. 8,000 of these
00:09:19.020 --> 00:09:23.690
different things about you. And how does
that work? Well, it works by machine
00:09:23.690 --> 00:09:28.580
learning. So, machine learning algorithms
can find patterns in society that we can
00:09:28.580 --> 00:09:35.750
really not anticipate. For example, let's
say you're a diabetic, and, well, let's
00:09:35.750 --> 00:09:40.880
say this data broker company has a mailing
list, or has an app, that diabetic
00:09:40.880 --> 00:09:44.311
patients use. And they also have the data
of these diabetic patients about what they
00:09:44.311 --> 00:09:48.440
do on Facebook. Well, there you can start
to see correlations. So, if diabetic
00:09:48.440 --> 00:09:53.840
patients more often like gangster-rap and
pottery on Facebook, well, then you could
00:09:53.840 --> 00:09:58.120
deduce that if you also like
gangster-rap or pottery on Facebook, then
00:09:58.120 --> 00:10:03.460
perhaps you also are more likely to have
or get diabetes. It is highly
00:10:03.460 --> 00:10:08.690
unscientific, but this is how the system
works. And this is an example of how that
00:10:08.690 --> 00:10:13.470
works with just your Facebook likes.
Woman in the video: ... accuracy was lowest, about
00:10:13.470 --> 00:10:17.570
60% when it came to predicting whether a
user's parents were still together when
00:10:17.570 --> 00:10:21.310
they were 21. People whose parents
divorced before they were 21 tended to
00:10:21.310 --> 00:10:26.330
like statements about relationships. Drug
users were ID'd with about 65% accuracy.
00:10:26.330 --> 00:10:33.120
Smokers with 73%, and drinkers with 70%.
Sexual orientation was also easier to
00:10:33.120 --> 00:10:40.400
distinguish among men. 88% right there.
For women, it was about 75%. Gender, by
00:10:40.400 --> 00:10:44.870
the way, race, religion, and political
views, were predicted with high accuracy
00:10:44.870 --> 00:10:50.270
as well. For instance: White versus black:
95%.
00:10:50.270 --> 00:10:54.070
TS: So, the important thing to understand
here is that this isn't really about your
00:10:54.070 --> 00:10:57.890
data anymore. Like, oftentimes when we
talk about data protection, we talk about
00:10:57.890 --> 00:11:03.190
"Oh, I want to keep control of my data."
But this is their data: data that
00:11:03.190 --> 00:11:10.190
they deduce, that they derive from your
data. These are opinions about you. And
00:11:10.190 --> 00:11:14.300
these things are what, you know, make it
so that even though you never filled in a
00:11:14.300 --> 00:11:18.960
psychological test, they'd still have one. A
great example of that, how that's used, is
00:11:18.960 --> 00:11:24.220
a company called Cambridge Analytica. This
company has created detailed profiles
00:11:24.220 --> 00:11:28.690
about us through what they call
psychographics and I'll let them explain
00:11:28.690 --> 00:11:33.100
it themselves.
Man in the video: By having hundreds and
00:11:33.100 --> 00:11:36.930
hundreds of thousands of Americans
undertake this survey, we were able to
00:11:36.930 --> 00:11:40.090
form a
model to predict the personality of every
00:11:40.090 --> 00:11:46.230
single adult in the United States of
America. If you know the personality of
00:11:46.230 --> 00:11:50.300
the people you're targeting you can nuance
your messaging to resonate more
00:11:50.300 --> 00:11:55.640
effectively with those key audience
groups. So, for a highly neurotic and
00:11:55.640 --> 00:12:01.200
conscientious audience, you're going to
need a message that is rational and fear-
00:12:01.200 --> 00:12:05.700
based, or emotionally-based. In this case,
the threat of a burglary, and the
00:12:05.700 --> 00:12:10.410
insurance policy of a gun is very
persuasive. And we can see where these
00:12:10.410 --> 00:12:15.280
people are on the map. If we wanted to
drill down further, we could resolve the
00:12:15.280 --> 00:12:19.780
data to an individual level, where we have
somewhere close to four or five thousand
00:12:19.780 --> 00:12:24.460
data points on every adult in the United
States.
00:12:24.460 --> 00:12:28.120
TS: So, yeah. This is the company that
worked both for the Brexit
00:12:28.120 --> 00:12:33.570
campaign and for the Trump campaign. Of
course, a little after the Trump campaign, all
00:12:33.570 --> 00:12:37.510
the data was leaked - data on 200
million Americans. And
00:12:37.510 --> 00:12:40.380
increasingly, you can see
this data described as "modeled voter
00:12:40.380 --> 00:12:46.440
ethnicities and religions." So, this is
this derived data. You might think that
00:12:46.440 --> 00:12:49.630
when you go online and use Facebook and
use all these services, that advertisers
00:12:49.630 --> 00:12:53.310
are paying for you. That's a common
misperception. That's not really the case.
00:12:53.310 --> 00:12:57.500
What's really going on is that, according
to FTC research, the majority of the
00:12:57.500 --> 00:13:01.980
money made in this data broker market is
made from risk management. All right, so,
00:13:01.980 --> 00:13:07.380
in a way you could say that it's not
really marketers that are paying for you,
00:13:07.380 --> 00:13:12.930
it's your bank. It's insurers. It's your
employer. It's governments. These kinds of
00:13:12.930 --> 00:13:19.210
organizations are the ones who buy these
profiles the most, more than the other
00:13:19.210 --> 00:13:23.530
ones. Of course, the promise of big data
is that you can then manage risk. Big data
00:13:23.530 --> 00:13:27.720
is the idea that with data you can
understand things and then manage them.
00:13:27.720 --> 00:13:31.700
So what really is innovation in this big
data world, this data economy, is the
00:13:31.700 --> 00:13:36.110
democratization of the background check.
That's really the core of this market:
00:13:36.110 --> 00:13:39.310
that now you can find out everything about
everyone.
00:13:39.310 --> 00:13:44.810
So, yeah: in the past, only perhaps
your bank could know your credit score, but
00:13:44.810 --> 00:13:47.980
now your greengrocer knows your
psychological profile.
00:13:47.980 --> 00:13:55.070
Right, that's a new level of what's
going on here. It's not
00:13:55.070 --> 00:13:58.880
only invisible, but it's also huge:
according to the same research by the FTC,
00:13:58.880 --> 00:14:05.340
this market was already worth 150 billion
dollars in 2015. So, it's invisible, it's
00:14:05.340 --> 00:14:10.760
huge and hardly anyone knows about it. But
that's probably going to change. And that
00:14:10.760 --> 00:14:18.040
brings us to the second part: Behavioral
change. We already see the first part of
00:14:18.040 --> 00:14:21.370
this: how behavioral change is happening
through these systems, through outside
00:14:21.370 --> 00:14:25.520
influence. And we've talked a lot
about this in this conference. For example
00:14:25.520 --> 00:14:29.110
we see how Facebook and advertisers try to
do that. We've also seen how China is
00:14:29.110 --> 00:14:32.870
doing that, trying to influence you. Russia
has recently tried to use Facebook to
00:14:32.870 --> 00:14:36.710
influence the elections and of course
companies like Cambridge Analytica try to
00:14:36.710 --> 00:14:39.420
do the same thing.
And here you can have a debate on, you
00:14:39.420 --> 00:14:43.480
know, to what extent they are really
influencing us, but I think that's not
00:14:43.480 --> 00:14:48.090
actually the most interesting
question. What interests me most of all is
00:14:48.090 --> 00:14:53.840
how we are doing it ourselves, how we are
creating new forms of self-censorship and
00:14:53.840 --> 00:14:59.040
are proactively anticipating these
systems. Because once you realize that
00:14:59.040 --> 00:15:01.870
this is really about risk management,
that this is about banks and
00:15:01.870 --> 00:15:06.400
employers trying to understand you, people
start to understand that this will go beyond
00:15:06.400 --> 00:15:10.470
click fear, if you remember. This will go
beyond that:
00:15:10.470 --> 00:15:14.230
when people find out, this will be, you
know, about not getting a job, for example.
00:15:14.230 --> 00:15:18.980
This'll be about getting really expensive
insurance. It'll be about all these kinds
00:15:18.980 --> 00:15:22.080
of problems and people are increasingly
finding this out. So for example in the
00:15:22.080 --> 00:15:28.970
United States, the IRS is
now using data
00:15:28.970 --> 00:15:33.860
profiles to find out who they should
audit. So I was talking recently to a girl
00:15:33.860 --> 00:15:37.560
and and she said: "Oh I recently tweeted
about... a negative tweet about the IRS," and
00:15:37.560 --> 00:15:39.800
she immediately grabbed her phone to
delete it,
00:15:39.800 --> 00:15:44.600
when she realized that, you know, this
could now be used against her in a way.
00:15:44.600 --> 00:15:49.320
And that's the problem. Of course, we see all
kinds of other crazy examples that
00:15:49.320 --> 00:15:54.450
the wider public
is picking up on. So we now have
00:15:54.450 --> 00:15:58.610
algorithms that can find out if you're gay
or not. And these things scare people and
00:15:58.610 --> 00:16:03.460
these things are something we have to
understand. So, chilling effects: that's what
00:16:03.460 --> 00:16:08.380
this boils down to. For me, more important
than these influences of these big
00:16:08.380 --> 00:16:12.850
companies and nation states is how people
themselves are experiencing these chilling
00:16:12.850 --> 00:16:17.860
effects, like you yourselves have as well.
That brings us back to social cooling. For
00:16:17.860 --> 00:16:23.810
me, social cooling is about these two
things combined: on the one hand, the
00:16:23.810 --> 00:16:28.720
increasing ability of agencies and groups
to influence you, and on the other hand, the
00:16:28.720 --> 00:16:33.240
increasing willingness of people
themselves to change their own behavior to
00:16:33.240 --> 00:16:38.680
proactively engage with this issue.
There are three long-term consequences
00:16:38.680 --> 00:16:44.120
that I want to dive into. The first is how
this affects the individual, the second is
00:16:44.120 --> 00:16:49.320
how it affects society, and the third is
how it affects the market. So let's look
00:16:49.320 --> 00:16:55.350
at the individual. Here we've seen there's a
rising culture of self-censorship. It
00:16:55.350 --> 00:16:58.970
started for me with an article that I read in
the New York Times, where a student was saying:
00:16:58.970 --> 00:17:02.430
"Well we're very very reserved." She's
going to do things like spring break.
00:17:02.430 --> 00:17:05.819
I said: "Well you don't have to defend
yourself later," so you don't do it.
00:17:05.819 --> 00:17:08.859
And what she's talking about, she's
talking about doing crazy things, you
00:17:08.859 --> 00:17:12.410
know, letting go, having fun. She's
worried that the next day it'll be on
00:17:12.410 --> 00:17:15.839
Facebook.
So what's happening here is that you do
00:17:15.839 --> 00:17:18.190
have all kinds of freedoms: You have the
freedom to look up things, you have the
00:17:18.190 --> 00:17:22.290
freedom to say things, but you're
hesitating to use it. And that's really
00:17:22.290 --> 00:17:28.460
insidious. That has an effect on a wider
society and here we really see the
00:17:28.460 --> 00:17:34.260
societal value of privacy. Because in
society often minority values later become
00:17:34.260 --> 00:17:40.559
majority values. An example is weed.
I'm from... I'm from the Netherlands and
00:17:40.559 --> 00:17:45.059
there you see, you know, at first it's
something that you just don't do and it's
00:17:45.059 --> 00:17:48.800
you know a bit of a "uhh", but then "Oh, maybe
yeah, you should... you should try it as well," and
00:17:48.800 --> 00:17:53.020
people try it and slowly under the surface
of the society, people change their minds
00:17:53.020 --> 00:17:55.980
about these things. And then, after a while
it's like, you know, "What are we still
00:17:55.980 --> 00:17:59.300
worried about?"
The same pattern happens, of
00:17:59.300 --> 00:18:03.070
course with way bigger things like this:
Martin Luther King: "I must honestly say to
00:18:03.070 --> 00:18:10.470
you that I never intend to adjust myself
to racial segregation and discrimination."
00:18:12.410 --> 00:18:14.100
TS: This is the same pattern
that's happening
00:18:14.100 --> 00:18:17.680
for all kinds of things that change
in society, and that's what privacy is so
00:18:17.680 --> 00:18:20.370
important for, and that's why it's so
important that people have the ability to
00:18:20.370 --> 00:18:22.880
look things up and to change their minds
and to talk with each other without
00:18:22.880 --> 00:18:27.930
feeling so watched all the time.
The third thing is how this impacts the
00:18:27.930 --> 00:18:33.770
market. Here we see very much the rise of
a culture of risk avoidance. An example
00:18:33.770 --> 00:18:36.580
here is that in
1995 already, doctors in New York were
00:18:36.580 --> 00:18:43.510
given scores, and what happened was that
the doctors who tried to help advanced-
00:18:43.510 --> 00:18:47.010
stage cancer patients, complex patients, who
tried to do the difficult
00:18:47.010 --> 00:18:52.380
operations, got a low score, because these
people more often died, while doctors that
00:18:52.380 --> 00:18:56.800
didn't lift a finger and didn't try to
help got a high score. Because, well, people
00:18:56.800 --> 00:19:00.780
didn't die. So you see here that these
systems bring all kinds of
00:19:00.780 --> 00:19:04.670
perverse incentives. You know, they
lower the willingness for everybody
00:19:04.670 --> 00:19:07.760
to take a risk and in some areas of
society we really like people to take
00:19:07.760 --> 00:19:15.150
risks - like entrepreneurs, or doctors.
So on the whole, you could say that
00:19:15.150 --> 00:19:19.010
this, what we're seeing here, is some kind
of trickle-down risk aversion, where
00:19:19.010 --> 00:19:23.130
the way that
companies and governments want to
00:19:23.130 --> 00:19:27.440
manage risk, that's trickling down to us.
And we, of course, want them to like
00:19:27.440 --> 00:19:30.309
us, we want to have a job, we want to have
insurance, and then we increasingly start
00:19:30.309 --> 00:19:36.830
to think "Oh, maybe I should not do this." It's
a subtle effect. So how do we deal with
00:19:36.830 --> 00:19:39.960
this?
Well, together. I think this is a really
00:19:39.960 --> 00:19:43.490
big problem. I think this is such a big
problem that it can't be managed by
00:19:43.490 --> 00:19:47.460
just some hackers or nerds building
something, or by politicians, making a law.
00:19:47.460 --> 00:19:52.640
This is really a society-wide problem.
So I want to talk about all these groups
00:19:52.640 --> 00:19:57.920
that should get into this: the public,
politicians, business, and us.
00:19:57.920 --> 00:20:03.270
So the public. I think we have to talk
about and maybe extend the metaphor of the
00:20:03.270 --> 00:20:06.890
cloud and say we have to learn to see the
stars behind the cloud. Alright, that's one
00:20:06.890 --> 00:20:11.679
way that we could... that's a narrative we
could use. I really like to use humor to
00:20:11.679 --> 00:20:17.050
explain this to a wider audience, so for
example, last year I helped develop an
00:20:17.050 --> 00:20:22.240
exhibition about
dubious devices and one of the devices
00:20:22.240 --> 00:20:25.010
there was called "Taste your status"
which was a coffee machine that gave you
00:20:25.010 --> 00:20:30.210
coffee based on your area code. So if you
live in a good area code, you get nice coffee.
00:20:30.210 --> 00:20:33.630
You live in a bad area code, you get bad
coffee.
00:20:33.630 --> 00:20:36.960
music
laughter
00:20:36.960 --> 00:20:41.820
applause
I could go into it, but... oftentimes
00:20:41.820 --> 00:20:44.130
you can use humor to explain these things
to a wider audience. I really like that
00:20:44.130 --> 00:20:47.670
method, that approach.
We've got a long way to go though. I mean,
00:20:47.670 --> 00:20:50.150
if we look at, you know, how long
it took for us to understand global
00:20:50.150 --> 00:20:53.150
warming, to really, you know, come to a stage
where most people understand what it is
00:20:53.150 --> 00:20:58.260
and care about it - except Donald Trump.
Well, with data we've really got a long way to
00:20:58.260 --> 00:21:01.890
go, we're really at the beginning of
understanding this issue like this.
00:21:01.890 --> 00:21:07.970
Okay, so the second group that has to
really wake up is politicians. And they have
00:21:07.970 --> 00:21:10.910
to understand that this is really about the
balance of power. This is really about
00:21:10.910 --> 00:21:16.380
power. And if you permit me, I'll go into
the big picture a little bit, as a media
00:21:16.380 --> 00:21:21.750
theorist. So this is Gilles Deleuze.
He's a French philosopher, and he explained
00:21:21.750 --> 00:21:26.000
in his work something that I find really
useful. He said you have two systems of
00:21:26.000 --> 00:21:30.152
control in society, and one is the
institutional one - and that's the one we
00:21:30.152 --> 00:21:32.940
all know.
You know, the judicial system:
00:21:32.940 --> 00:21:38.140
you're free to do what you want, but when
you cross a line, you break a law, the
00:21:38.140 --> 00:21:40.610
police get you, you go before a judge, you
go to prison. That's the system we
00:21:40.610 --> 00:21:43.010
understand.
But he says there's another system, which
00:21:43.010 --> 00:21:46.840
is the social system. This is a social
pressure system, and this for a long time
00:21:46.850 --> 00:21:49.670
wasn't really designed.
But now, increasingly, we are able to do
00:21:49.670 --> 00:21:53.480
that. So this is the system where you
perform suboptimal behavior, and then that
00:21:53.480 --> 00:21:58.240
gets measured and judged, and then you get
subtly nudged in the right direction. And
00:21:58.240 --> 00:22:01.130
there's some very important differences
between these two systems. The institutional
00:22:01.130 --> 00:22:05.010
system, you know, has this idea that
you're a free citizen who makes up your
00:22:05.010 --> 00:22:10.500
own mind, while the social system
is working all the time,
00:22:10.500 --> 00:22:13.280
constantly; it doesn't matter if you're
guilty or innocent, it's always trying to
00:22:13.280 --> 00:22:19.020
push you. The old system, the institutional
system, is very much about punishment: if
00:22:19.020 --> 00:22:21.020
you
break the rules, you get punished, but
00:22:21.020 --> 00:22:23.570
people sometimes don't really care about
punishment; sometimes it's cool to get
00:22:23.570 --> 00:22:27.670
punished. But the social system uses
something way more powerful, which is the
00:22:27.670 --> 00:22:33.890
fear of exclusion. We are social animals
and we really care to belong to a group.
00:22:33.890 --> 00:22:37.350
The other difference is that it's very
important that the institutional system is
00:22:37.350 --> 00:22:40.920
accountable, you know, democratically, to us,
while the social system at the moment is
00:22:40.920 --> 00:22:45.220
really, really invisible: these
algorithms, how they work, where the data is
00:22:45.220 --> 00:22:48.720
going - it's very hard to understand. And of
course that's exactly what China loves so
00:22:48.720 --> 00:22:52.530
much about it, right? You can
stand in front of a tank, but you can't
00:22:52.530 --> 00:22:56.690
really stand in front of the cloud. So
yeah. This also helps me
00:22:56.690 --> 00:23:00.660
to understand when people say "I have
nothing to hide." I really understand that,
00:23:00.660 --> 00:23:03.410
because when people say "I have nothing to
hide," what they're saying is "I have nothing
00:23:03.410 --> 00:23:06.340
to hide from the old system, from the
classic system, from the institutional
00:23:06.340 --> 00:23:09.460
system." They're saying "I want to help the
police, I trust our
00:23:09.460 --> 00:23:13.382
government, I trust our institutions," and that's
actually really a positive thing to say.
00:23:13.382 --> 00:23:16.980
The thing is, they don't really see the other
part of the system: how increasingly there
00:23:16.980 --> 00:23:21.970
are parts that are not in your control,
that are not democratically checked, and that's
00:23:21.970 --> 00:23:28.499
really a problem. So the third group that
I think we have to wake up is business:
00:23:28.499 --> 00:23:32.560
business has to see that this is not so
much a problem, perhaps, but that it could
00:23:32.560 --> 00:23:36.110
be an opportunity. I'm still
looking for a metaphor here, but perhaps,
00:23:36.110 --> 00:23:40.350
if we, you know, again compare this issue
to global warming, we could say that we need
00:23:40.350 --> 00:23:44.930
something like ecological food, but for data.
But I don't know what that's gonna
look like or how we're gonna explain that;
look like or how we're gonna explain that
maybe we have to talk about fast food
versus fast data versus ecological data,
versus fast data versus ecological data
but we need a metaphor here. Of course
00:23:54.040 --> 00:24:07.750
laws are also really helpful. So we might
get things like this. I'm actually working
00:24:07.750 --> 00:24:18.600
on this, it's funny. Or, if things get really
out of hand, we might get here, right?
00:24:18.600 --> 00:24:23.020
So luckily we see that in Europe
the politicians are awake and are
00:24:23.020 --> 00:24:25.630
really trying to push this market. I think
that's really great, so I think in the
00:24:25.630 --> 00:24:28.679
future we'll get to a moment where people
say "Well, I prefer European smart products,"
00:24:28.679 --> 00:24:33.320
for example. I think that's a good thing, I
think this is really positive. Finally, I
00:24:33.320 --> 00:24:36.930
want to get to all of us, to what each of us
can do. I think here, again, there's a
00:24:36.930 --> 00:24:40.412
parallel to global warming, where at its
core it's not so much about the new
00:24:40.412 --> 00:24:43.770
technology and all the issues, it's about
a new mindset, a new way of looking at the
00:24:43.770 --> 00:24:48.140
world. And here I think we have to stop
saying that we have nothing to hide, for
00:24:48.140 --> 00:24:52.460
example. If I've learned anything in the
past years of understanding and researching
privacy and this big data trade market, it's that
privacy and this big trade data market is
privacy is the right to be imperfect. All
00:24:58.200 --> 00:25:01.390
right: increasingly, there's pressure to be
the perfect citizen, to be the perfect
00:25:01.390 --> 00:25:05.840
consumer, and privacy is a way of getting
out of that. So this is how I would
00:25:05.840 --> 00:25:09.470
reframe privacy: it's not just about
which bits and bytes go where, but it's
00:25:09.470 --> 00:25:13.040
about, you know, the human right to be
imperfect, because of course we are human, we are
00:25:13.040 --> 00:25:17.440
all imperfect. Sometimes when I talk at
a technology conference, people say "Well,
00:25:17.440 --> 00:25:21.500
privacy was just a phase." You know, it's
like ebb and flow: we had it and
00:25:21.500 --> 00:25:25.980
it's gonna go away again. That's crazy, you
know; you don't say women's rights were
00:25:25.980 --> 00:25:30.710
just a phase, we had them for a while and
they're gonna go again. Right? And of course
00:25:30.710 --> 00:25:34.341
Edward Snowden explains it way better. He
says arguing that you don't care about the
00:25:34.341 --> 00:25:37.200
right to privacy because you have nothing
to hide is no different than saying you
00:25:37.200 --> 00:25:40.930
don't care about free speech because you
have nothing to say. What an eloquent
00:25:40.930 --> 00:25:47.210
system admin. So I think what we have to
strive for here is that we develop a
00:25:47.210 --> 00:25:50.799
more nuanced understanding of all these
issues. I think we have to get away from
00:25:50.799 --> 00:25:54.330
this idea that more data is better, that
data is automatically progress. No, it's
00:25:54.330 --> 00:25:58.940
not: data is a trade-off. For example, for
the individual, more data might mean less
00:25:58.940 --> 00:26:04.381
psychological security, less willingness
to share, less willingness to try things. For
00:26:04.381 --> 00:26:09.110
a country, it might mean less autonomy for
citizens - and citizens need their own
00:26:09.110 --> 00:26:12.120
autonomy: they need to know what's going on,
they need to be able to vote in their own
00:26:12.120 --> 00:26:17.050
autonomous way and decide what they
want. In business, you could say more data
00:26:17.050 --> 00:26:21.140
might lead to less creativity, right? Less
willingness to share new ideas, to come up
00:26:21.140 --> 00:26:29.510
with new ideas - that's again an issue
there. So, in conclusion: social cooling is
00:26:29.510 --> 00:26:32.460
a way of understanding these issues or a
way of framing these issues that I think
00:26:32.460 --> 00:26:37.030
could be useful for us. That could help us
understand and engage with these issues.
00:26:37.030 --> 00:26:41.620
And yes, social cooling is an alarm, it's
alarmist - it is. We're trying to say: this
00:26:41.620 --> 00:26:45.940
is the problem and we have to deal with
it. But it's also really about hope. All
00:26:45.940 --> 00:26:50.020
right. I trust not so much in technology; I
trust in us, in people - that we can fix this
00:26:50.020 --> 00:26:53.530
once we understand the issue, in the same
way that when we understood the problem
00:26:53.530 --> 00:26:56.460
with global warming, we started to deal
with it. It's
00:26:56.460 --> 00:27:00.030
gonna be slow progress, but we're doing it,
and we can do the same thing with data.
00:27:00.030 --> 00:27:06.360
It'll take a while but we'll get there.
And finally this is about starting to
00:27:06.360 --> 00:27:10.350
understand the difference between shallow
optimism and deep optimism. All right,
00:27:10.350 --> 00:27:13.290
oftentimes the technology sector is like "Cool,
new technology! We're going to fix
00:27:13.290 --> 00:27:16.570
this by creating an app," and for me that
"we have to be
00:27:16.570 --> 00:27:20.880
optimistic", that's very shallow optimism
the TEDx make optimism. Like true optimism
00:27:20.880 --> 00:27:24.480
recognizes that each technology comes with
a downside, and we have to recognize
00:27:24.480 --> 00:27:28.420
that it's not a problem
to point out these problems; it's a good
00:27:28.420 --> 00:27:32.049
thing, because once you understand the problems
you can deal with them and, you know, come
00:27:32.049 --> 00:27:37.150
up with better solutions. If we don't
change this mindset, then we might
00:27:37.150 --> 00:27:41.370
create a world where we're all more well-
behaved, but perhaps also a little bit
00:27:41.370 --> 00:27:59.750
less human. Thank you.
Applause
00:27:59.750 --> 00:28:04.440
H: Thank you, Tijmen.
TS: You are welcome.
00:28:04.440 --> 00:28:13.080
Applause
H: We still have five more minutes, we'll
00:28:13.080 --> 00:28:18.179
take some questions if you like. First,
microphone number 2.
00:28:18.179 --> 00:28:23.170
Microphone 2 (M2): Hello, thanks, that was
a really interesting talk. I have a
00:28:23.170 --> 00:28:29.790
question that I hope will work, it's a bit
complicated. There's a project called Indy
00:28:29.790 --> 00:28:34.230
by a foundation called the Sovrin
Foundation - do you know about it? Okay, very
00:28:34.230 --> 00:28:38.720
great, perfect. So, just to quickly
explain: these people want to create an
00:28:38.720 --> 00:28:42.669
identity layer that will be self-sovereign,
which means people can reveal what they
00:28:42.669 --> 00:28:47.530
want about themselves only when they want,
but it is one unique identity on the entire
00:28:47.530 --> 00:28:51.750
internet. So that can potentially be very
liberating, because you control all your
00:28:51.750 --> 00:28:57.130
identity and individual data. But at the
same time it could be used to enable
00:28:57.130 --> 00:29:00.330
something like the personal scores we were
shown earlier on. So that made me think about
00:29:00.330 --> 00:29:02.720
that and I wanted to know if you had an
opinion on this.
00:29:02.720 --> 00:29:09.059
TS: Yes, well, um, the first thing I think
about is that, as I tried to explain, you see
00:29:09.059 --> 00:29:11.440
a lot of initiatives that have tried to be
about: "Ooh, you have to control your own
00:29:11.440 --> 00:29:15.490
data". But that's really missing the point
that it's no longer really about your data
00:29:15.490 --> 00:29:19.410
it's about this derived data. And of course
it can help to manage what you share -
00:29:19.410 --> 00:29:23.600
you know, then they can't derive anything
from it. But I see too little of that
awareness. Second of all, this is very much
awareness. Second of all this is very much
for me an example of what nerds and
00:29:29.360 --> 00:29:32.480
technologies are really good at it's like:
"oh we've got a social problem let's
00:29:32.480 --> 00:29:36.299
create a technology app and then we'll fix
it". Well what I'm trying to explain is
00:29:36.299 --> 00:29:39.690
that this is such a big problem that we
cannot fix this with just one group alone
00:29:39.690 --> 00:29:42.320
- not the politicians, not the designers,
not the nerds. This is something that we
have to really get together, you know, and
have to really get together you know grab
fix together, because this is such a
fundamental issue, right? The idea that
fundamental issue right. The idea that
risk is a problem, that we want to manage
risk, is so deeply ingrained in people,
risk is such so deeply ingrained in people
you know - this stuff, based in fear, is
fundamental and it's everywhere - so it's
fundamental and it's everywhere so it's
not enough for one group to try to fix
that; it's something that we have to come
that it's something that we have to come
to grips with together.
00:30:02.450 --> 00:30:06.390
M2: Thanks a lot.
H: Ok, the signal angel has a
00:30:06.390 --> 00:30:09.730
question from the internet, I think.
Signal Angel (SigA): Yes, BarkingSheep
00:30:09.730 --> 00:30:13.160
is asking: "do you think there's a
relationship between self-censorship and
00:30:13.160 --> 00:30:16.980
echo chambers in a sense that people
become afraid to challenge their own
00:30:16.980 --> 00:30:22.799
beliefs and thus isolate themselves in
groups with the same ideology?".
00:30:22.799 --> 00:30:27.830
TS: That's a... that's a... there's a really
big answer to that one.
00:30:27.830 --> 00:30:31.440
pauses
TS: Actually, I was e-mailing Vint Cerf,
00:30:31.440 --> 00:30:35.610
and miraculously he responded, and he
said what you really have to look for is
00:30:35.610 --> 00:30:39.480
this: not just the reputation economy, but
also the attention economy and how they're
00:30:39.480 --> 00:30:45.280
linked. So for a while I've been looking
for that link, and there's a lot to
00:30:45.280 --> 00:30:49.990
say there and there definitely is a link.
What I think is important to understand, to
00:30:49.990 --> 00:30:53.540
get nuance here, is that I'm not saying
that everybody will become really well-
00:30:53.540 --> 00:30:59.700
behaved, gray bookworm people. The
thing is that what this situation is
00:30:59.700 --> 00:31:03.230
creating is that we're all becoming
theater players, playing an identity
00:31:03.230 --> 00:31:06.230
more and more, because we're watched more
of the time. And for some people that
00:31:06.230 --> 00:31:10.660
might mean that, you know... I think
most people will be more conservative and
00:31:10.660 --> 00:31:15.400
more careful, while some people will go really
all out, and they'll enjoy the stage! You
00:31:15.400 --> 00:31:18.850
know? We have those people as well, and I
think those people could really benefit
00:31:18.850 --> 00:31:23.800
and the attention economy could
really, you know, give them a lot of
00:31:23.800 --> 00:31:26.859
attention through that. So I think
there's a link there. I could
00:31:26.859 --> 00:31:29.549
go on more, but I think that's where I am
for now.
00:31:29.549 --> 00:31:33.819
H: Okay, we're short on time, we'll take,
I'm sorry, one more question. Microphone number
00:31:33.819 --> 00:31:36.850
one?
Microphone 1 (M1): So, the, I think the
00:31:36.850 --> 00:31:38.710
audience you're talking about, ...
H: Louder, please.
00:31:38.710 --> 00:31:44.350
M1: The audience you're talking to
here is already very aware, but I'm asking
00:31:44.350 --> 00:31:50.210
for, like tactics, or your tips, to spread
your message and to talk to people that
00:31:50.210 --> 00:31:55.440
are in this, they say: "Uh, I don't care
they can surveil me.", like what's, what's
00:31:55.440 --> 00:32:01.430
your approach, like in a practical way?
How do you actually do this?
00:32:01.430 --> 00:32:08.230
TS: Yeah, so, I'm really glad to be here
because I am, yes, I am a nerd, but I'm
00:32:08.230 --> 00:32:11.539
also a philosopher, a thinker, you know,
and that means that
00:32:11.539 --> 00:32:15.820
for me what I work with is not just
Arduinos, but words and ideas. As I think
00:32:15.820 --> 00:32:18.860
I've been trying to show, those can be really
powerful, like a word can be a really
00:32:18.860 --> 00:32:29.229
powerful way to frame a debate or engage
people. So, I haven't yet found a way to
00:32:29.229 --> 00:32:33.030
shorten all this. Like, I was making the joke
that I can tell you in one sentence what
00:32:33.030 --> 00:32:36.500
privacy is and why it matters, but I have
to give a whole talk before that, all
00:32:36.500 --> 00:32:39.530
right? Privacy is the right to be imperfect,
but in order to understand that you have
00:32:39.530 --> 00:32:42.710
to understand the rise of the reputation
economy, and how it affects your chances
00:32:42.710 --> 00:32:47.061
in life. The fun thing is, that, that that
will happen by itself: people will
become more aware of it, they will run
become more aware of that, they will run
into these problems. They will not get a
00:32:50.700 --> 00:32:55.660
job or they might get other issues, and
then they will start to see the problem.
00:32:55.660 --> 00:32:59.150
And so my quest is not so much to help
people understand it, but to help them
00:32:59.150 --> 00:33:02.980
understand it before they run into the
wall, right? That's usually how society at
00:33:02.980 --> 00:33:06.530
the moment deals with technology problems.
It's like "Oh we'll, we'll, oh ... Oh?
00:33:06.530 --> 00:33:10.820
it's a problem? Oh well, now we'll try to
fix it." Well, I believe you can really
00:33:10.820 --> 00:33:15.020
see these problems come way earlier, and I
think the humanities, where I come from,
00:33:15.020 --> 00:33:19.630
are really helpful in that - you know,
like, Deleuze was really,
00:33:19.630 --> 00:33:28.040
really clearly explaining what the problem
was in 1995. So yeah, I
00:33:28.040 --> 00:33:32.530
don't have a short way of explaining, you
know, why privacy matters, but I think
00:33:32.530 --> 00:33:38.220
it'll become easier over time as people
start to really feel these pressures.
00:33:38.220 --> 00:33:42.530
H: Sorry, thank you very much for the
question. I think we all should go out and
00:33:42.530 --> 00:33:49.320
spread the message. This talk is over, I'm
awfully sorry. When you people leave,
00:33:49.320 --> 00:33:51.620
please take your bottles, and your cups,
...
00:33:51.620 --> 00:33:54.190
applause
H: ... and all your junk, and thank you
00:33:54.190 --> 00:34:05.500
very much again Tijmen Schep!
applause
00:34:05.500 --> 00:34:09.899
music
00:34:09.899 --> 00:34:27.000
subtitles created by c3subtitles.de
in the year 2017. Join, and help us!