Herald (H): Yeah. Welcome to our next
talk, Social Cooling. You know, people say
"I have no problem with surveillance. I
have nothing to hide," but then, you know,
maybe the neighbors and maybe this and
maybe that. So, tonight we're going to
hear Tijmen Schep who's from Holland. He's
a privacy designer and a freelance
security researcher, and he's gonna give a
talk about how digital surveillance
changes the way we interact socially. So,
please, let's have a hand for Tijmen Schep!
Applause
Tijmen Schep (TS): Hi everyone. Really
cool that you're all here and really happy
to talk here. It's really an honor. My
name is Tijmen Schep and I am a technology
critic. And that means that it's my job to
not believe [audio cuts out] tells us and
that's really a lot of fun. [audio cuts
out] is, how do I get a wider audience
involved in understanding technology and
the issues that are arising from
technology? Because I believe that change
comes when the public demands it. I think
that's really one of the important things
when change happens. And for me as a
technology critic, words are very
much how I hack the system, how I try to
hack this world. And so, tonight I'm going
to talk to you about one of these words
that I think could help us. Framing the
issue is half the battle. [audio cuts out]
and frame the problem: if we can explain
what the problem is in a certain frame,
one that, you know, makes certain positions
already visible, that's really half the
battle won. So, that frame is social
cooling. But before I go into it, I want
to ask you a question. Who here recognizes
this? You're on Facebook or some other
social site, and you see a link, and you
think "Oh, I could [audio cuts
out] listen [audio cuts out] could click
on this, but it might look bad. It might
be remembered by someone. Some agency
might remember it, and I could click on
it, but I'm hesitating to click."
Microphone buzzing
laughter
TS: That better? Can everyone hear me now?
Audience: No.
TS: No. Okay, that... yeah. Should I start
again? Okay. So, you're on Facebook, and
you're thinking "Oh, that's an interesting
link. I could click on that," but you're
hesitating because maybe someone's gonna
remember
that. And that might come back to me
later, and who here recognizes that
feeling? So, pretty much almost everybody.
And that's increasingly what I find, when
I talk about the issue, that people really
start to recognize this. And I think a
word we could use to describe that is
"Click Fear." This hesitation, it could be
click fear. And you're not alone.
Increasingly, we find that, research
points that this is a wide problem, that
people are hesitating to click some of the
links. For example, after the Snowden
revelations, people were less likely to
research issues about terrorism and other
things on Wikipedia because they thought
"Well, maybe the NSA wouldn't like it if I
[audio cuts out] that. Okay, not gonna
look it up." The same goes for visits to
Google. So this is a pattern that research
is pointing to. And it's not very strange, of
course. I mean, we all understand that if
you feel you're being
watched, you change your behavior. It's a
very logical thing that we all understand.
And I believe that technology is really
amplifying this effect. I think that's
something that we really have to come to
grips with. And that's why I think social
cooling could be useful with that. Social
cooling describes, in a way, how in an
increasingly digital world, where our
lives are increasingly digitized, it
becomes easier to feel this pressure, to
feel these normative effects of these
systems. And you very much see that,
because increasingly your data is being
turned into thousands of scores by data
brokers and other companies, and those
scores are increasingly influencing your
chances in life. And this
is creating an engine of oppression, an
engine of change that we have to
understand. And the fun thing is that in a
way this idea is really being helped by
Silicon Valley, who for a long time has
said "Data is the new gold," but they've
recently, in the last five years, changed
that narrative. Now they're saying "Data
is the new oil," and that's really funny,
because if data is the new oil, then
immediately you get the question "Wait,
oil gave us global warming, so then, what
does data give us?" And I believe that if
oil leads to global warming, then data
could lead to social cooling. That could
be the word that we use for these negative
effects of big data. In order to really
understand this, and go into it, we have
to look at three things. First, we're
going to talk about the reputation
economy, how that system works. Second
chapter, we're going to look at behavior
change, how it is influencing us and
changing our behavior. And finally, so as
not to let you go home depressed, I'm gonna
talk about how we can deal with this. So, first:
The reputation economy. Already we've seen
today that China is building this new
system, the social credit system. It's a
system that will give every citizen in
China a score that basically represents
how well-behaved they are. And it
will influence your ability to get a job,
a loan, a visa and even a date. For
example, the current version of the
system, Sesame Credit, one of the early
prototypes, already gives everybody who
wants one a score, and it's also connected
to the largest dating website in China.
So you can kind of find out: "This person
that I'm dating, what kind of person is
this? Is this someone who's, you know,
well viewed by Chinese society?"
This is where it gets really heinous for
me, because until now you could say "Well,
these reputation systems are fair: if
you're a good person, you get a higher
score; if you're a bad person, you get a
lower score." But it's not that simple. I
mean, your friends' score influences your
score, and your score influences your
friends' score, and that's where you
really start to see how complex social
pressures arise, and where we can see the
effects of data stratification, where
people are starting to think "Hey, who are
my friends, and who should I be friends
with?"
You could think "That only happens in
China. Those Chinese people are, you know,
different." But the exact same thing is
happening here in the West, except we're
letting the market build it.
I'll give you an example. This is a
company called "deemly" - a Danish company
- and this is their video for their
service.
Video narrator (VN): ... renting
apartments from others, and she loves to
swap trendy clothes and dresses. She's
looking to capture her first lift from a
RideShare app, but has no previous reviews
to help support her.
Video background voices: Awww.
VN: Luckily, she's just joined deemly,
where her positive feedback from the other
sites appears as a deemly score, helping
her to win a RideShare in no time. Deemly
is free to join and supports users across
many platforms, helping you to share and
benefit from the great reputation you've
earned. Imagine the power of using your
deemly score alongside your CV for a job
application...
TS: Like in China.
VN: ... perhaps to help get a bank loan...
TS: Like...
VN: or even to link to from your dating
profile.
TS: Like in China!
VN: Sign up now at deemly.co. Deemly:
better your sharing.
Applause
TS: Thanks. There is a difference, though.
The funny thing here is that it's highly
invisible to us.
The Chinese government is very open about
what they're building, but here we are
very blind to what's going on. Mostly,
when we talk about these things, then
we're talking about these systems that
give us a very clear rating, like Airbnb,
Uber, and of course the Chinese system.
The thing is, most of these systems are
invisible to us. There's a huge market of
data brokers who are, you know, not
visible to you, because you are not the
customer. You are the product. And these
data brokers, well, what they do is, they
gather as much data as possible about you.
And that's not all. They then create up to
eight thousand scores about you. In the
United States, these companies have up to
8,000 scores, and in Europe it's a little
less, around 600. These are scores about
things like your IQ, your psychological
profile, your gullibility, your religion,
your estimated life span. 8,000 of these
different things about you. And how does
that work? Well, it works by machine
learning. So, machine learning algorithms
can find patterns in society that we
really cannot anticipate ourselves. For example, let's
say you're a diabetic, and, well, let's
say this data broker company has a mailing
list, or has an app, that diabetic
patients use. And they also have the data
of these diabetic patients about what they
do on Facebook. Well, there you can start
to see correlations. So, if diabetic
patients more often like gangster-rap and
pottery on Facebook, well, then you could
deduce that if you also like gangster-rap
or pottery on Facebook, you are perhaps
also more likely to have or get diabetes.
It is highly unscientific, but this is how
the system works.
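To make that concrete, here is a toy version of the correlation logic just described; the likes, rates, and scoring rule are invented, and real data brokers use far more elaborate (but, as said, still unscientific) models:

from collections import Counter

# Likes of known diabetic patients vs. the general population (made-up data).
diabetic_likes = [{"gangster rap", "pottery"}, {"pottery", "cats"},
                  {"gangster rap", "cooking"}]
general_likes = [{"cats", "football"}, {"cooking", "travel"},
                 {"football", "travel"}, {"pottery", "football"}]

def like_rates(profiles):
    """Fraction of profiles in a group that contains each like."""
    counts = Counter(like for profile in profiles for like in profile)
    return {like: n / len(profiles) for like, n in counts.items()}

def condition_score(profile, patient_rates, base_rates, floor=0.01):
    """Crude score: how much more common this person's likes are among patients."""
    score = 1.0
    for like in profile:
        score *= patient_rates.get(like, floor) / base_rates.get(like, floor)
    return score

patient_rates, base_rates = like_rates(diabetic_likes), like_rates(general_likes)
# A stranger who likes gangster rap and pottery gets flagged as "likely diabetic".
print(condition_score({"gangster rap", "pottery"}, patient_rates, base_rates))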
And this is an example of how that works
with just your Facebook likes:
Woman in the video: ... [accura]cy was lowest,
about 60%, when it came to predicting whether a
user's parents were still together when
they were 21. People whose parents
divorced before they were 21 tended to
like statements about relationships. Drug
users were ID'd with about 65% accuracy.
Smokers with 73%, and drinkers with 70%.
Sexual orientation was also easier to
distinguish among men. 88% right there.
For women, it was about 75%. Gender, by
the way, race, religion, and political
views, were predicted with high accuracy
as well. For instance: White versus black:
95%.
TS: So, the important thing to understand
here is that this isn't really about your
data anymore. Like, oftentimes when we
talk about data protection, we talk about
"Oh, I want to keep control of my data."
But this is their data: data that
they deduce, that they derive from your
data. These are opinions about you. And
these things, you know, make it so that
even though you never filled in a
psychological test, they still have one of
you. A great example of how that's used is
a company called Cambridge Analytica. This
company has created detailed profiles
about us through what they call
psychographics and I'll let them explain
it themselves.
Man in the video: By having hundreds and
hundreds of thousands of Americans
undertake this survey, we were able to
form a
model to predict the personality of every
single adult in the United States of
America. If you know the personality of
the people you're targeting you can nuance
your messaging to resonate more
effectively with those key audience
groups. So, for a highly neurotic and
conscientious audience, you're going to
need a message that is rational and
fear-based, or emotionally based. In this case,
the threat of a burglary, and the
insurance policy of a gun is very
persuasive. And we can see where these
people are on the map. If we wanted to
drill down further, we could resolve the
data to an individual level, where we have
somewhere close to four or five thousand
data points on every adult in the United
States.
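As a toy illustration of that last targeting step: given a personality profile (itself derived data, as sketched earlier), pick the ad variant matching it. The traits, thresholds, and ad copy below are invented for illustration, not Cambridge Analytica's actual rules:

# Hypothetical sketch of psychographic message selection.
AD_VARIANTS = {
    "fear_based": "Protect your family. Don't wait until it's too late.",
    "tradition_based": "A tradition passed from father to son.",
}

def pick_message(profile):
    """Route a neurotic, conscientious profile to the rational, fear-based ad."""
    if profile["neuroticism"] > 0.7 and profile["conscientiousness"] > 0.6:
        return AD_VARIANTS["fear_based"]
    return AD_VARIANTS["tradition_based"]

voter = {"neuroticism": 0.8, "conscientiousness": 0.7}  # derived scores
print(pick_message(voter))  # this profile gets the fear-based message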
TS: So, yeah. This is the company that
worked both for the Brexit campaign and
for the Trump campaign. Of course, a
little after the Trump campaign, the data
was leaked: data on 200 million Americans.
And increasingly, you can see
this data described as "modeled voter
ethnicities and religions." So, this is
this derived data. You might think that
when you go online and use Facebook and
use all these services, that advertisers
are paying for you. That's a common
misperception. That's not really the case.
What's really going on is that, according
to FTC research, the majority of the
money made in this data broker market is
made from risk management. So, in a way,
you could say that it's not really
marketers that are paying for you; it's
your bank. It's insurers. It's your
employer. It's governments. These kinds of
organizations are the ones who buy these
profiles the most, more than the others.
Of course, the promise of big data
is that you can then manage risk. Big data
is the idea that with data you can
understand things and then manage them.
So what the real innovation is in this big
data world, this data economy, is the
democratization of the background check.
That's really the core of this market: now
anyone can find out everything about
everyone. In the past, perhaps only your
bank could know your credit score, but now
your greengrocer knows your psychological
profile. That's a new level of what's
going on here. And it's not only
invisible, it's also huge: according to
the same FTC research, this market was
already worth 150 billion dollars in 2015.
So, it's invisible, it's
huge and hardly anyone knows about it. But
that's probably going to change. And that
brings us to the second part: Behavioral
change. We already see the first part of
this, how behavioral change is happening
through these systems: through outside
influence. And we've talked a lot about
this at this conference. For example, we
see how Facebook and advertisers try to do
that. We've also seen how China is doing
it, trying to influence you. Russia
has recently tried to use Facebook to
influence the elections and of course
companies like Cambridge Analytica try to
do the same thing.
And here you can have a debate on, you
know, to what extent are they really
influencing us, but I think that's not
actually the most interesting question.
What interests me most of all is how we
are doing it ourselves, how we are
creating new forms of self-censorship and
proactively anticipating these systems.
Because once you realize that this is
really about risk management, that this is
about banks and employers trying to
understand you, you start to understand
that this will go beyond click fear, if
you remember that. When people find out,
this will be about not getting a job, for
example. It'll be about getting really
expensive insurance. It'll be about all
these kinds of problems, and people are
increasingly
finding this out. So for example in the
United States, the IRS is now using data
profiles to find out who they should
audit. I was talking recently to a girl,
and she said: "Oh, I recently tweeted a
negative tweet about the IRS," and she
immediately grabbed her phone to delete
it, when she realized that, you know, this
could now be used against her.
And that's the problem. Of course, we see
all kinds of other crazy examples that the
wider public is picking up on, like the
fact that we now have algorithms that can
find out if you're gay or not. These
things scare people, and these things are
something we have to understand. So,
chilling effects: that's what this boils
down to. For me, more important than these
influences of big companies and nation
states is how people themselves are
experiencing these chilling effects, like
you yourselves have as well.
That brings us back to social cooling. For
me, social cooling is about these two
things combined: on the one hand, this
increasing ability of agents and groups to
influence you, and on the other hand, the
increasing willingness of people
themselves to change their own behavior,
to proactively anticipate these systems.
There are three long-term consequences
that I want to dive into. The first is how
this affects the individual, the second is
how it affects society, and the third is
how it affects the market. So let's look
at the individual. Here we've seen there's
a rising culture of self-censorship. It
started for me with an article that I read
in the New York Times, where a student was
saying: "Well, we're very, very reserved."
She's not going to do things like spring
break, she said, "so you don't have to
defend yourself later." And what she's
talking about is doing crazy things, you
know, letting go, having fun. She's
worried that the next day it'll be on
Facebook.
So what's happening here is that you do
have all kinds of freedoms: You have the
freedom to look up things, you have the
freedom to say things, but you're
hesitating to use them. And that's really
insidious. That has an effect on wider
society, and here we really see the
societal value of privacy. Because in
society often minority values later become
majority values. An example is weed. I'm
from the Netherlands, and there you see,
you know, at first it's something that you
just don't do, it's a bit of a "uhh", but
then it's "Oh, maybe you should try it as
well," and people try it, and slowly,
under the surface of society, people
change their minds about these things. And
then after a while it's like, you know,
"What were we still worried about?"
The same pattern happens, of course, with
way bigger things, like this:
Martin Luther King: "I must honestly say to
you that I never intend to adjust myself
to racial segregation and discrimination."
TS: This is the same pattern that's
happening for all kinds of things that
change in society, and that's what privacy
is so important for. That's why it's so
important that people have the ability to
look things up and to change their minds
and to talk with each other without
feeling so watched all the time.
The third thing is how this impacts the
market. Here we see very much the rise of
a culture of risk avoidance. An example
here is that already in 1995, doctors in
New York were given scores, and what
happened was that the doctors who tried to
help advanced-stage cancer patients,
complex patients, who tried to do
difficult operations, got a low score,
because those patients more often died,
while doctors who didn't lift a finger and
didn't try to help got a high score,
because, well, their patients didn't die.
So you see here that these systems bring
all kinds of perverse incentives. They
lower everybody's willingness to take a
risk, and in some areas of society we
really want people to take risks, like
entrepreneurs and doctors.
So on the whole you could say that what
we're seeing here is some kind of
trickle-down risk aversion, where the way
that companies and governments want to
manage risk is trickling down to us. We of
course want them to like us, we want to
have a job, we want to have insurance, and
so we increasingly start to think "Oh,
maybe I should not do this." It's
a subtle effect. So how do we deal with
this?
Well, together. I think this is a really
big problem. I think this is such a big
problem that it can't be managed by just
some hackers or nerds building something,
or by politicians making a law. This is
really a society-wide problem.
So I want to talk about all these groups
that should get into this: the public,
politicians, business, and us.
So, the public. I think we have to talk
about, and maybe extend, the metaphor of
the cloud, and say we have to learn to see
the stars behind the cloud. That's one
narrative we could use. I really like to
use humor to explain this to a wider
audience. For example, last year I helped
develop an exhibit about dubious devices,
and one of the devices
there was called "Taste your status"
which was a coffee machine that gave you
coffee based on your area code. So if you
live in a good area code, you get nice coffee.
You live in a bad area code, you get bad
coffee.
music
laugher
applause
I won't go into it, but oftentimes you can
use humor to explain these things to a
wider audience. I really like that method,
that approach.
We've got a long way to go, though. I
mean, if we look at how long it took for
us to understand global warming, to
really, you know, come to a stage where
most people understand what it is and care
about it (except Donald Trump), well, with
data we've really got a long way to go;
we're really at the beginning of
understanding this issue.
Okay, so the second group that has to
really wake up is politicians. And they have
to understand that this is really about the
balance of power. This is really about
power. And if you permit me, I'll go into
the big picture a little bit, as a media
theorist. So, this is Gilles Deleuze. He's
a French philosopher, and he explained in
his work something that I find really
useful. He said you have two systems of
control in society, and one is the
institutional one. That's the one we all
know: the judicial system. You're free to
do what you want, but when you cross a
line, you cross a law, the police get you,
you get charged, and you go to prison.
That's the system we understand.
But he says there's another system, which
is the social system. This is a social
pressure system, and for a long time this
wasn't really designed. But now,
increasingly, we are able to do that. This
is the system where you perform suboptimal
behavior, that gets measured and judged,
and then you get subtly nudged in the
"right" direction. And there are some very
important differences between these two
systems. The institutional system, you
know, has this idea that you're a free
citizen who makes up your own mind, while
the social system is working all the time,
constantly; it doesn't matter if you're
guilty or innocent, it's always trying to
push you. The old system, the
institutional system, is very much about
punishment: if you break the rules, you
get punished. But people sometimes don't
really care about punishment; sometimes
it's even cool to get punished. The social
system uses something way more powerful,
which is the fear of exclusion. We are
social animals, and we really care about
belonging to a group. The other difference
is that, very importantly, the
institutional system is accountable, you
know, democratically, to us, while the
social system at the moment is really,
really invisible: these algorithms, how
they work, where the data is going, it's
all very hard to understand. And of course
that's exactly what China loves so much
about it, right? You can stand in front of
a tank, but you can't really stand in
front of the cloud.
So, yeah. That also helps me to understand
when people say "I have nothing to hide."
I really understand that, because when
people say "I have nothing to hide," what
they're saying is "I have nothing to hide
from the old system, from the classic
system, from the institutional system."
They're saying "I want to help the police,
I trust our government, I trust our
institutions," and that's actually a
really positive thing to say. The thing
is, they don't really see the other part
of the system, how increasingly there are
parts that are not in your control, that
are not democratically checked, and that's
really a problem. So the third group that
I think we have to wake up is business.
Business has to see that this is not so
much a problem, perhaps, but that it could
be an opportunity. I'm still looking for a
metaphor here, but perhaps, if we again
compare this issue to global warming, we
could say that we need something like
ecological food, but for data. I don't
know yet what that's gonna look like or
how we're gonna explain it; maybe we have
to talk about fast food versus fast data
versus ecological data. But we need a
metaphor here. Of course, laws are also
really helpful. So we might get things
like this (I'm actually working on this,
it's funny), or, if things get really out
of hand, we might get here, right?
So, luckily, we see that in Europe the
politicians are awake and are really
trying to push this market. I think that's
really great. So I think in the future
we'll get to a moment where people say
"Well, I prefer European smart products,"
for example. I think that's a good thing;
I think this is really positive. Finally, I
want to get to all of us, to what each of
us can do. I think here, again, there's a
parallel to global warming, where at its
core it's not so much about the new
technology and all the issues; it's about
a new mindset, a new way of looking at the
world. And here I think we have to stop
saying that we have nothing to hide, for
example. If I've learned anything in the
past years of understanding and
researching privacy and this big data
trade market, it's that privacy is the
right to be imperfect. Increasingly,
there's pressure to be the perfect
citizen, to be the perfect consumer, and
privacy is a way of getting out of that.
So this is how I would reframe privacy:
it's not just about which bits and bytes
go where, it's about, you know, the human
right to be imperfect, because of course
we are human, we are
all imperfect. Sometimes when I talk at
technology conferences, people say "Well,
privacy was just a phase." You know, like
ebb and flow: we had it, and it's gonna go
away again. That's crazy, you know. You
don't say women's rights were just a
phase, we had them for a while and they're
gonna go again. Right? And of course
Edward Snowden explains it way better. He
says arguing that you don't care about the
right to privacy because you have nothing
to hide is no different than saying you
don't care about free speech because you
have nothing to say. What an eloquent
system admin. So I think what we have to
strive for here is that we develop a more
nuanced understanding of all these issues.
I think we have to get away from this idea
that more data is better, that data is
automatically progress. No, it's not: data
is a trade-off. For example, for the
individual, more data might mean less
psychological security, less willingness
to share, less willingness to try things.
For a country, it might mean less autonomy
for citizens, and citizens need their own
autonomy: they need to know what's going
on, they need to be able to vote in their
own autonomous way and decide what they
want. In business, you could say more data
might lead to less creativity, right, less
willingness to share new ideas, to come up
with new ideas. That's again an issue
there. So, in conclusion, social cooling is
a way of understanding these issues, or a
way of framing these issues, that I think
could be useful for us, that could help us
understand and engage with these issues.
And yes, social cooling is an alarm; it's
alarmist, it is. We're trying to say this
is the problem and we have to deal with
it. But it's also really about hope. I
trust not so much in technology; I trust
in us, in people, that we can fix this
once we understand the issue, in the same
way that when we understood the problem
with global warming, we started to deal
with it. It's slow progress, but we're
doing it, and we can do the same thing
with data. It'll take a while, but we'll
get there.
And finally, this is about starting to
understand the difference between shallow
optimism and deep optimism. Oftentimes the
technology sector says "Right, cool new
technology, and we're going to fix this by
creating an app," and for me that "we have
to be optimistic" is very shallow
optimism, the TEDx kind of optimism. True
optimism recognizes that each technology
comes with a downside, and it recognizes
that it's not a problem to point out these
problems; it's a good thing, because once
you understand the problems, you can deal
with them and, you know, come up with
better solutions. If we don't change this
mindset, then we might create a world
where we're all more well-behaved, but
perhaps also a little bit less human.
Thank you.
Applause
H: Thank you, Tijmen.
TS: You are welcome.
Applause
H: We still have five more minutes, so
we'll take some questions if you like.
First, microphone number 2.
Microphone 2 (M2): Hello, thanks, that was
a really interesting talk. I have a
question that I hope will work; it's a bit
complicated. There's a project called
Indy, by a foundation called the Sovrin
Foundation. Do you know about it? Okay,
very great, perfect. So, just to quickly
explain: these people want to create an
identity layer that will be
self-sovereign, which means people can
reveal what they want about themselves
only when they want, but it's one unique
identity on the entire internet. That can
potentially be very liberating, because
you control all your identity and
individual data, but at the same time it
could be used to enable something like the
personal scores we were shown earlier on.
So that made me think, and I wanted to
know if you had an opinion on this.
TS: Yes, well, the first thing I think
about is that, as I tried to explain, a
lot of initiatives have tried to be about
"Ooh, you have to control your own data."
But that's really missing the point: it's
no longer really about your data, it's
about this derived data. Of course, it can
help to manage what you share, you know,
so they can't derive anything from it, but
I see too little of that awareness. Second
of all, this is very much, for me, an
example of what nerds and technologists
are really good at: "Oh, we've got a
social problem, let's create a technology,
an app, and then we'll fix it." Well, what
I'm trying to explain is that this is such
a big problem that we cannot fix it with
just one group alone: not the politicians,
not the designers, not the nerds. This is
something that we have to really get
together on, fix together, because this is
such a fundamental issue. The idea that
risk is a problem, that we want to manage
risk, is so deeply ingrained in people, so
based in fear, so fundamental, and it's
everywhere, that it's not enough for one
group to try to fix it. It's something
that we have to come to grips with
together.
M2: Thanks a lot.
H: Okay, the Signal Angel has a question
from the internet, I think.
Signal Angel (SigA): Yes, BarkingSheep is
asking: "Do you think there's a
relationship between self-censorship and
echo chambers, in the sense that people
become afraid to challenge their own
beliefs and thus isolate themselves in
groups with the same ideology?"
TS: That's... that's a really big answer
to that one.
pauses
TS: Actually, I was e-mailing Vint Cerf,
and miraculously he responded, and he said
what you really have to look at is not
just the reputation economy but also the
attention economy, and how they're linked.
So for a while I've been looking for that
link, and there's a lot to say there, and
there definitely is a link. I think what's
important to understand, to get nuance
here, is that I'm not saying that
everybody will become really well-behaved,
gray bookworm people. The thing is, what
this situation is creating is that we're
all becoming theater players, all playing
an identity more and more, because we're
watched more of the time. And for some
people that might mean that they're... I
think most people will be more
conservative and more careful, but some
people will go really all out, and they'll
enjoy the stage! You know? We have those
people as well, and I think those people
could really benefit, and the attention
economy could really, you know, give them
a lot of attention through that. So I
think there's a link there, but I could go
on more; I think that's it for now, as far
as I'm aware.
H: Okay, we're short on time. I'm sorry,
we'll take one more question. Number one?
Microphone 1 (M1): So, I think the
audience you're talking about, ...
H: Louder, please.
M1: The audience you're talking to here is
already very aware, but I'm asking for,
like, tactics, or your tips, to spread
your message and to talk to people who
say: "Uh, I don't care, they can surveil
me." What's your approach, like, in a
practical way? How do you actually do
this?
TS: Yeah. So, I'm really glad to be here,
because I am, yes, I am a nerd, but I'm
also a philosopher, a thinker, you know,
and that means that what I work with is
not just Arduinos, but words and ideas.
Those, I've been trying to show, can be
really powerful: a word can be a really
powerful way to frame a debate or engage
people. So, I haven't yet found a way to
make all this really short. Like, I was
making a joke that I can tell you in one
sentence what privacy is and why it
matters, but I have to give a whole talk
before that, right? Privacy is the right
to be imperfect, but in order to
understand that, you have to understand
the rise of the reputation economy and how
it affects your chances in life. The funny
thing is that that will happen by itself,
that people will become more aware of it:
they will run into these problems. They
will not get a job, or they might get
other issues, and then they will start to
see the problem. And so my question is not
so much how to help people understand it,
but how to help them understand it before
they run into the wall, right? That's how
society at the moment usually deals with
technology problems. It's like "Oh,
we'll... oh? It's a problem? Oh well, now
we'll try to fix it." Well, I believe you
can really see these problems coming way
earlier, and I think the humanities coming
around is really helpful in that. Like
Deleuze, who was already really, really
clearly explaining what the problem was in
1995. So, yeah, I don't have a short way
of explaining, you know, why privacy
matters, but I think it'll become easier
over time as people start to really feel
these pressures.
H: Sorry, thank you very much for the
question. I think we all should go out and
spread the message. This talk is over, I'm
awfully sorry. When you people leave,
please take your bottles, and your cups,
...
applause
H: ... and all your junk, and thank you
very much again Tijmen Schep!
applause
music
subtitles created by c3subtitles.de
in the year 2017. Join, and help us!