0:00:06.440,0:00:18.430
35C3 Intro music
0:00:19.490,0:00:25.046
Herald Angel: Here at the Congress, we not[br]only talk about technology, we also talk
0:00:25.046,0:00:30.990
about social and ethical responsibility.[br]About how we can change the world for
0:00:30.990,0:00:36.190
good. The Good Technology Collective[br]supports the development guidelines...
0:00:36.190,0:00:41.289
sorry, it supports the development process[br]of new technology with ethical engineering
0:00:41.289,0:00:47.829
guidelines that offer a practical way to[br]take ethics and social impact into account.
0:00:47.829,0:00:54.235
Yannick Leretaille - and I hope this was[br]okay - will tell you more about it.
0:00:54.235,0:00:57.820
Please welcome on stage with a very warm applause[br]Yann Leretaille.
0:00:57.820,0:01:02.649
applause
0:01:02.649,0:01:07.060
Yannick Leretaille: Hi, thanks for the[br]introduction. So before we start, can you
0:01:07.060,0:01:11.360
kind of show me your hand if you, like,[br]work in tech building products as
0:01:11.360,0:01:17.720
designers, engineers, coders, product[br]management? OK, so it's like 95 percent,
0:01:17.720,0:01:27.820
90 percent. Great. Yeah. So, today we kind[br]of try to answer the question: What is
0:01:27.820,0:01:33.560
good technology and how can we build[br]better technology? Before that, a few
0:01:33.560,0:01:39.810
words about me. So I am Yann. I'm French-[br]German. Kind of a hacker, around the CCC
0:01:39.810,0:01:44.140
for a long time, entrepreneur, like, co-[br]founder of a startup in Berlin. And I'm
0:01:44.140,0:01:48.110
also founding member of the Good[br]Technology Collective. The Good
0:01:48.110,0:01:55.920
Technology Collective was founded about a[br]year ago, or actually a bit over a year now,
0:01:55.920,0:02:02.250
by a very diverse expert council[br]and we kinda have like 3 areas of work.
0:02:02.250,0:02:07.660
The first one is trying to educate the[br]public about current issues with
0:02:07.660,0:02:12.190
technology, then to educate engineers to[br]build better technology, and then
0:02:12.190,0:02:19.980
long-term, hopefully, one day we'll be[br]able to work on legislation as well.
0:02:19.980,0:02:27.250
Here is a bit of what we've achieved so[br]far. We have, like, 27 council members now. We
0:02:27.250,0:02:30.870
have several media partnerships and[br]published around 20 articles, that's kind
0:02:30.870,0:02:36.260
of the public education part. Then we[br]organized or participated in roughly 15
0:02:36.260,0:02:44.780
events already. And we are now publishing[br]one standard, well, kind of today
0:02:44.780,0:02:49.459
actually, and[br]applause
0:02:49.459,0:02:54.290
and if you're interested in what we do,[br]then, yeah, sign up for the newsletter and
0:02:54.290,0:03:00.290
we keep you up to date and you can join[br]events. So as I said the Expert Council is
0:03:00.290,0:03:08.290
really, really diverse. We have everything[br]from people in academia, to people in
0:03:08.290,0:03:13.380
government, to technology makers, to[br]philosophers, all sorts, journalists.
0:03:13.380,0:03:22.260
And the reason that is the case is that a year[br]ago we kind of noticed that in our own
0:03:22.260,0:03:27.840
circles, like, as technology makers or[br]academics, we were all talking about a lot
0:03:27.840,0:03:33.209
of, kind of, developments in[br]technology, but no one was really kind of
0:03:33.209,0:03:37.470
getting together and looking at it from[br]all angles. And there have been a lot of
0:03:37.470,0:03:43.580
very weird and troublesome developments in[br]the last two years. I think we really
0:03:43.580,0:03:49.010
finally feel, you know, like, the impact of[br]filter bubbles. Something we have talked about
0:03:49.010,0:03:54.310
for like five years, but now it's like,[br]really like, you know, deciding over
0:03:54.310,0:04:01.290
elections and people become politically[br]radicalized and society is, kind of,
0:04:01.290,0:04:06.360
polarized more because they only see a[br]certain opinion. And we have situations
0:04:06.360,0:04:11.310
that we only knew, like, from science[br]fiction, just kind of, you know, pre-crime,
0:04:11.310,0:04:16.629
like, governments, kind of, overreaching[br]and trying to use machine learning to make
0:04:16.629,0:04:22.820
decisions on whether or not you should go[br]to jail. We have more and more machine
0:04:22.820,0:04:26.820
learning and big data and automation[br]going into basically every single aspect
0:04:26.820,0:04:31.620
of our lives, and not all of it has[br]been positive. You know, like, literally
0:04:31.620,0:04:36.510
everything from e-commerce to banking to[br]navigating to moving through the world now goes
0:04:36.510,0:04:45.889
through these interfaces, which present us[br]the data, and a slice of the world, at a time.
0:04:45.889,0:04:49.699
And then at the same time we have[br]really positive developments. Right? We have
0:04:49.699,0:04:54.180
things like this, you know, like space[br]travel, finally something's happening.
0:04:54.180,0:05:00.850
And we have huge advances in medicine. Maybe[br]soon we'll have, like, self-driving cars
0:05:00.850,0:05:10.870
and great renewable technology. And it kind[br]of begs the question: How can it be that
0:05:10.870,0:05:16.889
good and bad uses of technology are kind of[br]showing up at such an increasing rate and
0:05:16.889,0:05:25.130
at, like, such extremes, right? And[br]maybe the reason is just that everything
0:05:25.130,0:05:30.330
got so complicated, right? Data is[br]basically doubling every couple of years,
0:05:30.330,0:05:35.620
so no human can possibly process it anymore.[br]So we had to build more and more complex
0:05:35.620,0:05:40.979
algorithms to process it, connecting more[br]and more parts together. And no one really
0:05:40.979,0:05:46.020
understands it anymore, it seems.[br]And that leads to unintended consequences.
0:05:46.020,0:05:50.270
I've an example here: So, Google Photos –[br]this is actually only two years ago –
0:05:50.270,0:05:56.960
launched a classifier to automatically go[br]through all of your pictures and tell you
0:05:56.960,0:06:01.254
what it is. You could say "Show me the[br]picture of the bird in summer at this
0:06:01.254,0:06:06.330
location" and it would find it for you.[br]Kind of really cool technology, and they
0:06:06.330,0:06:11.180
released it to, like, a planetary user[br]base until someone figured out that people
0:06:11.180,0:06:17.090
of color were always marked as gorillas.[br]Of course it was a huge PR disaster, and
0:06:17.090,0:06:21.680
somehow no one found out about this before[br]it came out... But now the interesting thing
0:06:21.680,0:06:27.240
is: In two years they didn't even manage[br]to fix it! Their solution was to just
0:06:27.240,0:06:32.853
block all kinds of apes, so they're just[br]not found anymore. And that's how they
0:06:32.853,0:06:37.645
solved it, right? But if even Google can't[br]solve this... what does it mean?
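A minimal sketch of the kind of pre-release check this story argues for: measuring a classifier's error rate per group of people or content before shipping, instead of patching the problem afterwards by deleting a label. The groups, labels, numbers and threshold below are invented for illustration and are not Google's actual pipeline.

```python
# Hedged sketch: audit a label classifier's mistakes per group before release.
# Groups, labels, numbers and the threshold are invented for illustration;
# this is not Google's actual process.
from collections import defaultdict

def per_group_error_rates(samples):
    """samples: iterable of (group, true_label, predicted_label) tuples."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, truth, predicted in samples:
        totals[group] += 1
        if predicted != truth:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical evaluation set: overall accuracy looks fine, but one group
# is mislabeled far more often -- exactly the failure that should block a
# release rather than be "solved" by blocking the offending label later.
evaluation = (
    [("group_a", "person", "person")] * 980
    + [("group_a", "person", "bird")] * 20
    + [("group_b", "person", "person")] * 60
    + [("group_b", "person", "gorilla")] * 40
)

MAX_ACCEPTABLE_ERROR = 0.05  # arbitrary example threshold
for group, rate in sorted(per_group_error_rates(evaluation).items()):
    status = "FAIL" if rate > MAX_ACCEPTABLE_ERROR else "ok"
    print(f"{group}: error rate {rate:.1%} [{status}]")
```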
0:06:38.795,0:06:42.245
And then, at the same time, you know, [br]sometimes we seem to have, kind of,
0:06:42.245,0:06:44.130
[br]intended consequences?
0:06:45.640,0:06:49.650
I have an example... another example here:[br]Uber Greyball. I don't know if anyone
0:06:49.650,0:06:58.490
heard about it. So Uber was very eager to[br]change regulation and push the services
0:06:58.490,0:07:03.210
globally as much as possible, and kind of[br]started a fight with, you know, all the
0:07:03.210,0:07:06.539
taxi laws and regulation, and taxi[br]drivers in the various countries around the
0:07:06.539,0:07:11.880
world. And what they realized, of course,[br]is that they didn't really want people to
0:07:11.880,0:07:18.620
be able to, like, investigate what they[br]were doing or, like, finding individual
0:07:18.620,0:07:22.170
drivers. So they built this absolutely[br]massive operation which was like following
0:07:22.170,0:07:27.120
data in social media profiles, linking,[br]like, your credit card and location data
0:07:27.120,0:07:31.039
to find out if you were working for the[br]government. And if you did, you would just
0:07:31.039,0:07:36.680
never find a car. It would just not show[br]up, right? And that was clearly
0:07:36.680,0:07:41.307
intentional, all right. So at the same[br]time they were pushing, like, on, like,
0:07:41.307,0:07:46.490
the lobbying, political side to change[br]regulation, while heavily manipulating the
0:07:46.490,0:07:52.039
people that were pushing to change the[br]regulation, right? Which is really not a
0:07:52.039,0:07:54.754
very nice thing to do, I would say.
0:07:55.534,0:07:56.917
And...
0:07:58.417,0:08:02.711
The thing that I find kind of[br]worrisome about this,
0:08:02.711,0:08:08.207
no matter if it's intended or unintended,[br]is that it actually gets worse, right?
0:08:08.207,0:08:12.746
The more systems we[br]interconnect, the worse these consequences
0:08:12.746,0:08:17.300
can get. And I've an example here: So this[br]is a screenshot I took of Google Maps
0:08:17.300,0:08:24.100
yesterday and you notice there are, like,[br]certain locations... So they're kind of
0:08:24.100,0:08:28.160
highlighted on this map and I don't know[br]if you knew it but this map and the
0:08:28.160,0:08:32.510
locations that Google highlights look[br]different for every single person.
0:08:32.510,0:08:38.130
Actually, I went again and looked today[br]and it looked different again. So, Google
0:08:38.130,0:08:42.520
is already heavily filtering and kind of[br]highlighting certain places, like, maybe
0:08:42.520,0:08:48.850
this restaurant over there, if you can see[br]it. And I would say, like, from just
0:08:48.850,0:08:53.550
opening the map, it's not obvious to you[br]that it's doing that. Or that it's trying
0:08:53.550,0:08:59.130
to decide for you which place is[br]interesting for you. However, that's
0:08:59.130,0:09:05.620
probably not such a big issue. But the[br]same company, Google with Waymo, is also
0:09:05.620,0:09:11.360
developing this – and they just started[br]deploying them: self-driving cars. They're...
0:09:12.720,0:09:18.690
...still a good couple of years away from[br]actually making it a reality, but they are
0:09:18.690,0:09:22.547
really – in terms of, like, all the others[br]trying it at the moment – the farthest, I
0:09:22.547,0:09:28.915
would say, and in some cities they started[br]deploying self-driving cars. So now, just
0:09:28.915,0:09:33.892
think like 5, 10 years into the future[br]and you have signed up for your Google...
0:09:35.246,0:09:38.834
...self-driving car. Probably you don't [br]have your own car, right? And you go in
0:09:38.834,0:09:44.381
the car and it's like: "Hey, Yann, where[br]do you want to go?" Do you want to go to
0:09:44.381,0:09:49.770
work? Because, I mean, obviously that's where[br]I probably go most of the time. Do you
0:09:49.770,0:09:52.620
want to go to your favorite Asian[br]restaurant, like the one we just saw on the
0:09:52.620,0:09:56.700
map? Which is actually not my favorite,[br]but the first one I went to. So Google
0:09:56.700,0:10:00.300
just assumed it was. Do you want to go to[br]another Asian restaurant? Because,
0:10:00.300,0:10:06.960
obviously, that's all I like. And then[br]McDonald's. Because, everyone goes there.
0:10:06.960,0:10:11.420
And maybe the fifth entry is an[br]advertisement. And you would say: Well,
0:10:11.420,0:10:17.899
Yann, you know, that's still kind of fine,[br]but it's OK because I can still click on:
0:10:17.899,0:10:24.750
'No, I don't want these 5 options, give me,[br]like, the full map.' But now, we went back
0:10:24.750,0:10:31.050
here. So, even though you are seeing the[br]map, you're actually not seeing all
0:10:31.050,0:10:35.650
the choices, right? Google is actually[br]filtering for you where it thinks you want
0:10:35.650,0:10:42.910
to go. So now we have, you know, the car,[br]like, this symbol of mobility and freedom
0:10:42.910,0:10:49.790
that enabled so much change in our society,[br]actually reducing the part of
0:10:49.790,0:10:53.910
the world that you see. And, I[br]mean, these days they call it AI, I think
0:10:53.910,0:10:58.660
it's just machine learning, because these[br]machine learning algorithms all do pattern
0:10:58.660,0:11:04.730
matching and basically just can recognize[br]similarities. When you open the map and
0:11:04.730,0:11:09.330
you zoom in and you select a random place,[br]it would only suggest places to you where
0:11:09.330,0:11:14.730
other people have been before. So now the[br]restaurant that opened around the corner
0:11:14.730,0:11:19.010
you'll probably not even discover it[br]anymore. And no one will. And it will
0:11:19.010,0:11:23.080
probably close. And the only ones that[br]will stay are the ones that are already
0:11:23.080,0:11:32.410
established now. And all of that without[br]it being really obvious to anyone who
0:11:32.410,0:11:40.200
uses the technology, because it has become[br]kind of a black box.
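A small sketch of the kind of pattern matching described here, with invented data: if a recommender only ranks places by how many people have already been there, a brand-new place with no history can never be suggested, which is exactly the feedback loop being described.

```python
# Hedged sketch of a popularity-only recommender, as described in the talk:
# places are ranked purely by how many prior visits they have, so a place
# with no history is never suggested. All data here is invented.
visit_counts = {
    "big_fast_food_chain": 12000,
    "established_asian_restaurant": 5400,
    "new_corner_restaurant": 0,   # just opened, no visit history yet
}

def suggest(places, top_n=3):
    # Keep only places other people have already been to, most popular first.
    ranked = sorted(
        (name for name, count in places.items() if count > 0),
        key=lambda name: places[name],
        reverse=True,
    )
    return ranked[:top_n]

print(suggest(visit_counts))
# ['big_fast_food_chain', 'established_asian_restaurant']
# The new restaurant is filtered out before anyone can discover it.
```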
0:11:40.200,0:11:47.740
So, I do want self-driving cars, I really do.[br]I don't want a future like this, right? And if we
0:11:47.740,0:11:52.850
want to prevent that future, I think we[br]have to first ask a very simple question,
0:11:52.850,0:11:59.519
which is: Who is responsible for designing[br]these products? So, do you know the
0:11:59.519,0:12:01.519
answer?[br]audience: inaudible
0:12:01.519,0:12:05.200
Yann: Say it louder.[br]audience: We are.
0:12:05.200,0:12:10.220
Yann: Yeah, we are. Right. That's the really[br]frustrating thing about it, that it actually
0:12:10.220,0:12:15.490
comes back to us, right, as engineers and[br]developers. You know, we are always driven
0:12:15.490,0:12:20.230
by perfection. We want to create, like,[br]the perfect code that solves one problem
0:12:20.230,0:12:25.420
really, really nicely. You know, chasing the[br]next challenge over and over, trying to be
0:12:25.420,0:12:31.950
first. But we have to realize that at the[br]same time we are kind of working on
0:12:31.950,0:12:37.040
frontier technologies, right, on things,[br]technology, that are really kind of on the
0:12:37.040,0:12:42.230
edge of values and norms we have in[br]society. And if we are not careful and
0:12:42.230,0:12:46.450
just, like, focus on our small problem and[br]don't look at the big picture, then we
0:12:46.450,0:12:52.270
have no say on which side of the coin[br]the technology will fall. And probably it
0:12:52.270,0:12:58.680
will take a couple of years, and by that[br]time we have already moved on, I guess. So.
0:12:58.680,0:13:06.620
It's just that technology has become so[br]powerful and interconnected and impactful,
0:13:06.620,0:13:10.720
because we are not building stuff that[br]affects, like, 10 or 100 people
0:13:10.720,0:13:14.520
or a city but literally millions of[br]people, that we really have to take a step
0:13:14.520,0:13:20.950
back and not only look at the individual[br]problem as the challenge but also the big
0:13:20.950,0:13:27.120
picture. And I think if you want to do[br]that we have to start by asking the right
0:13:27.120,0:13:33.510
questions. And the first question of[br]course is: What is good technology? So,
0:13:33.510,0:13:39.250
that's also the name of the talk.[br]Unfortunately, I don't have a perfect
0:13:39.250,0:13:45.680
answer for that. And probably we will[br]never find a perfect answer for that. So,
0:13:45.680,0:13:53.060
what I would like to propose is to[br]establish some guidelines and engineering
0:13:53.060,0:13:57.980
processes that help us to build better[br]technology. To kind of ensure, the same way
0:13:57.980,0:14:03.959
that we have quality assurance and[br]project management systems and processes
0:14:03.959,0:14:09.480
in, like, kind of, the companies you work[br]at, that what we build
0:14:09.480,0:14:15.589
actually has a net positive outcome for[br]society. And we call it the Good
0:14:15.589,0:14:22.130
Technology Standard. We've kind of been[br]working on that over the last year, and we
0:14:22.130,0:14:26.840
really wanted to make it really practical.[br]And what we kind of realized is that if you
0:14:26.840,0:14:32.250
want to make it practical you have to make[br]it very easy to use and also mostly,
0:14:32.250,0:14:39.000
actually, which was surprising, just ask the[br]right questions. So, what is important
0:14:39.000,0:14:46.270
though, is that if you adopt the standard,[br]it has to be applied in all project phases. It has
0:14:46.270,0:14:50.200
to involve everyone. So, from, like, the[br]CTO to, like, the product managers to
0:14:50.200,0:14:55.820
actually legal. Today, legal has this[br]interesting role, where you develop
0:14:55.820,0:15:00.350
something and then you're like: Okay, now,[br]legal, make sure that we can actually ship it.
0:15:00.350,0:15:06.480
And that's what usually happens. And,[br]yeah, down to the individual engineer. And
0:15:06.480,0:15:09.940
if it's not applied globally and people[br]start making exceptions then of course it
0:15:09.940,0:15:17.970
won't be worth very much. Generally, we[br]kind of identified four main areas that we
0:15:17.970,0:15:22.730
think are important, kind of defining,[br]in an abstract way, if a product is
0:15:22.730,0:15:30.470
good. And the first one is empowerment. A[br]good product should empower its users. And
0:15:30.470,0:15:36.290
that's kind of a tricky thing. So, as[br]humans we have very limited decision
0:15:36.290,0:15:40.360
power. Right? And we are faced with, as I[br]said before, like, this huge amount of
0:15:40.360,0:15:46.339
data and choices. So it seems very natural[br]to build machines and interfaces that try
0:15:46.339,0:15:51.260
to make a lot of decisions for us. Like[br]the Google Maps one we saw before. But we
0:15:51.260,0:15:55.730
have to be careful because if we do that[br]too much then the machine ends up making
0:15:55.730,0:16:03.079
all decisions for us. So often, when you[br]develop something you should really ask
0:16:03.079,0:16:07.320
yourself, like, in the end if I take[br]everything together am I actually
0:16:07.320,0:16:12.880
empowering users, or am I taking[br]responsibility away from them? Do I
0:16:12.880,0:16:18.360
respect the individual's choice? When they[br]say: I don't want this, or they give you
0:16:18.360,0:16:23.570
their preference, do we actually respect[br]it or do we still try to, you know, just
0:16:23.570,0:16:29.639
figure out what is better for them. Do my[br]users actually feel like they benefit from
0:16:29.639,0:16:34.330
using the product? That's something, actually,[br]not a lot of people ask themselves,
0:16:34.330,0:16:39.720
because usually you think like in terms[br]of: Are you benefiting your company? And I
0:16:39.720,0:16:45.699
think what's really pressing in that[br]aspect: does it help the users, the humans
0:16:45.699,0:16:53.529
behind it, to grow in any way. If it helps[br]them to be more effective or faster or do
0:16:53.529,0:16:57.990
more things or be more relaxed or more[br]healthy, right, then it's probably positive.
0:16:57.990,0:17:02.100
But if you can't identify any of these,[br]then you really have to think about it.
0:17:02.100,0:17:08.789
And then, in terms of AI and machine[br]learning, are we actually kind of
0:17:08.789,0:17:17.049
impacting their own reasoning so that they[br]can't make proper decisions anymore. The
0:17:17.049,0:17:22.560
second one is Purposeful Product Design.[br]That one has been kind of a
0:17:22.560,0:17:26.789
pet peeve for me for a really long time.[br]So these days we have a lot of products
0:17:26.789,0:17:32.289
that are kind of like this. I don't have[br]anything specifically against Philips
0:17:32.289,0:17:37.639
Hue, but there seems to be, like, this[br]trend that is kind of, making smart
0:17:37.639,0:17:42.899
things, right? You take a product, put a[br]Wi-Fi chip on it, just slap it on there.
0:17:42.899,0:17:47.860
Label it "smart", and then you make tons[br]of profit, right? And a lot of these new
0:17:47.860,0:17:50.309
products we've been seeing around us,[br]like, everyone is saying, like, oh yeah,
0:17:50.309,0:17:55.119
we will have this great interconnected[br]feature, but most of them are actually not
0:17:55.119,0:17:58.489
changing the actual product, right, like,[br]the Wi-Fi connected washing machine today
0:17:58.489,0:18:03.320
is still a boring washing machine that[br]breaks down after two years. But it has
0:18:03.320,0:18:08.970
Wi-Fi, so you can see what it's doing when[br]you're in the park. And we think we should
0:18:08.970,0:18:16.000
really think more in terms of intelligent[br]design. How can we design it in the first
0:18:16.000,0:18:21.769
place so it's intelligent, not smart? So that[br]the different components interact in a
0:18:21.769,0:18:26.779
way that serves the purpose well. And[br]the kind of intelligent-by-design
0:18:26.779,0:18:33.950
philosophy is: when you start designing your[br]product, you kind of try to identify the
0:18:33.950,0:18:40.629
core purpose of it. And based on that, you[br]just use all the technologies available to
0:18:40.629,0:18:44.359
rebuild it from scratch. So, instead of[br]building a Wi-Fi-connected washing machine, you
0:18:44.359,0:18:47.450
would actually try to build a better[br]washing machine. And if it ends up having
0:18:47.450,0:18:51.489
Wi-Fi, then that's good, but it doesn't[br]have to. And along each step actually try
0:18:51.489,0:18:58.309
to ask yourself: Am I actually improving[br]washing machines here? Or am I just
0:18:58.309,0:19:06.279
creating another data point? And yeah, a[br]good example for that is, kind of, a
0:19:06.279,0:19:09.960
watch. Of course it's very old[br]technology, it was invented a long time
0:19:09.960,0:19:14.149
ago. But back when it was invented, it was[br]something you could have on your arm
0:19:14.149,0:19:19.510
or in your pocket in the beginning and it[br]was kind of a natural extension of
0:19:19.510,0:19:25.489
yourself, right, that kind of enhances[br]your senses because it's never there, you
0:19:25.489,0:19:29.850
don't really feel it. But when you need it[br]it's always there and then you can just
0:19:29.850,0:19:33.940
look at it and you know the time. And that[br]profoundly changed how, like, we humans
0:19:33.940,0:19:37.570
actually worked in society, because we[br]could now meet in the same place at the
0:19:37.570,0:19:42.779
same time. So, when you build a new[br]product try to ask yourself what is the
0:19:42.779,0:19:46.590
purpose of the product, who is it for.[br]Often I talk to people and they talk to me
0:19:46.590,0:19:50.750
for an hour about, like, literally the[br]details of how they solved the problem but
0:19:50.750,0:19:55.130
they can't tell me who their customer is.[br]Then does this product actually make
0:19:55.130,0:19:59.570
sense? Do I have features that[br]distract my users, that I maybe just don't
0:19:59.570,0:20:03.850
need. And can I find more intelligent[br]solutions by kind of thinking outside of
0:20:03.850,0:20:09.549
the box and focusing on the purpose of it.[br]And then of course what is the long term
0:20:09.549,0:20:12.950
product vision? Like, where do we want this[br]to go, this kind of technology I'm
0:20:12.950,0:20:20.090
developing, in the next years? The next one[br]is kind of Societal Impact, that goes
0:20:20.090,0:20:27.659
into what I talked about in the beginning[br]with all the negative consequences we have
0:20:27.659,0:20:30.820
seen. A lot of people these days don't[br]realize that even if you're, like, in a
0:20:30.820,0:20:34.571
small start up and you're working on, I[br]don't know, a technology, or robots, or
0:20:34.571,0:20:39.669
whatever. You don't know if your[br]algorithm, or your mechanism, or whatever
0:20:39.669,0:20:45.190
you build, will be used by 100 million[br]people in five years. Because this has
0:20:45.190,0:20:49.849
happened a lot, right? So, already when[br]starting to build it you have to think: If
0:20:49.849,0:20:54.220
this product would be used by 10 million,[br]100 million, maybe even a billion people, like
0:20:54.220,0:20:58.149
Facebook, would it have negative[br]consequences? Right, because then you get
0:20:58.149,0:21:03.470
completely different effects in society,[br]completely different engagement cycles and
0:21:03.470,0:21:09.279
so on. Then, are we taking advantage of[br]human weaknesses? So this is arguably
0:21:09.279,0:21:16.249
something that's specific to technology. A[br]lot of products these days kind of try to
0:21:16.249,0:21:19.789
hack your brain, because we understand[br]really well how, like, engagement works
0:21:19.789,0:21:24.870
and addiction. So a lot of things, like[br]social networks, actually have been
0:21:24.870,0:21:28.070
focusing, you know, and also built by[br]engineers, you know, trying to get a
0:21:28.070,0:21:34.509
little number from 0.1% to 0.2%, which can mean[br]that you just do extensive A/B testing,
0:21:34.509,0:21:38.460
and create an interface that no one can stop[br]looking at. You just continue scrolling,
0:21:38.460,0:21:42.340
right? You just continue, and two hours[br]have passed and you haven't actually
0:21:42.340,0:21:48.830
talked to anyone. And this attention[br]grabbing is kind of an issue and we can
0:21:48.830,0:21:53.590
see that Apple actually now implemented[br]screen time and they actually tell you how
0:21:53.590,0:21:56.700
much time you spend on your phone. So[br]there's definitely ways to build
0:21:56.700,0:22:01.830
technology that even helps you to get away[br]from these. And then for everything that
0:22:01.830,0:22:06.269
involves AI and machine learning, you[br]really have to take a really deep look at
0:22:06.269,0:22:11.090
your data sets and your algorithms because[br]it's very, very easy to build in biases
0:22:11.090,0:22:16.700
and discrimination. And again, if it's[br]applied to all of society, many people who
0:22:16.700,0:22:20.440
are less fortunate, or more fortunate, or[br]they're just different, you know they just
0:22:20.440,0:22:24.899
do different things, kind of fall out of[br]the grid and now suddenly they can't,
0:22:24.899,0:22:30.849
like, [unintelligible] anymore. Or use[br]Uber, or Airbnb, or just live a normal
0:22:30.849,0:22:35.440
life, or do financial transactions. And[br]then, kind of what I said in the
0:22:35.440,0:22:39.789
beginning, not only look at your product[br]but also, if you combine it with other
0:22:39.789,0:22:43.689
technologies that are upcoming, are there[br]certain combinations that are dangerous?
0:22:43.689,0:22:48.509
And for that I kind of recommend doing,[br]like, some kind of techno litmus test, to just
0:22:48.509,0:22:59.359
try to come up with the craziest scenario[br]that your technology could entail. And if
0:22:59.359,0:23:05.450
it's not too bad then, probably good. The[br]next thing is, kind of, sustainability. I
0:23:05.450,0:23:11.460
think in today's world it really should be[br]part of a good product, right. The first
0:23:11.460,0:23:16.700
question is of course kind of obvious. Are[br]we limiting product lifetime? Do we maybe
0:23:16.700,0:23:20.049
have planned obsolescence? Or do we[br]build something that is so dependent on so
0:23:20.049,0:23:23.950
many services, and we're only going to[br]support it for one year anyway, that
0:23:23.950,0:23:29.389
basically it will have to be thrown in the[br]trash afterwards. Maybe it would be
0:23:29.389,0:23:34.159
possible to add a standalone mode or a[br]very basic fallback feature so that at
0:23:34.159,0:23:37.659
least the product continues to work,[br]especially if we talk about things like home appliances.
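A rough sketch of such a basic fallback, with a hypothetical cloud endpoint: the appliance uses the online service while it exists, but keeps a hard-coded local default so the core function survives when the service is switched off.

```python
# Hedged sketch: an appliance controller that falls back to a basic local
# mode when its cloud service is unreachable or shut down. The endpoint
# and program names are hypothetical placeholders.
import urllib.error
import urllib.request

CLOUD_ENDPOINT = "https://example.invalid/api/wash-program"  # placeholder URL

def fetch_cloud_program(timeout_seconds=3):
    try:
        with urllib.request.urlopen(CLOUD_ENDPOINT, timeout=timeout_seconds) as response:
            return response.read().decode("utf-8").strip()
    except (urllib.error.URLError, OSError):
        return None  # service gone, network down, or device offline

def choose_program():
    program = fetch_cloud_program()
    if program:
        return program                     # "smart" path while the service exists
    return "standard_wash_60_degrees"      # standalone fallback: still a washing machine

print(choose_program())
```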
0:23:37.659,0:23:43.710
Then, what is the[br]environmental impact? A good example here
0:23:43.710,0:23:49.009
would be, you know, cryptocurrencies, which[br]are now using as much energy as certain
0:23:49.009,0:23:56.960
countries. And when you consider that, just[br]think: is there maybe an alternative solution
0:23:56.960,0:24:00.950
that doesn't have such a big impact? And[br]of course we are still in capitalism, it has
0:24:00.950,0:24:05.289
to be economically viable, but often there[br]are, and often it's again just really small
0:24:05.289,0:24:12.809
tweaks. Then of course: Which other[br]services are you working with? But for
0:24:12.809,0:24:17.879
example I would say, like, as European[br]companies, we're in Europe here, maybe try
0:24:17.879,0:24:22.049
to work mostly with suppliers from Europe,[br]right, because you know they follow GDPR
0:24:22.049,0:24:28.309
and strict rules, rather than, in a sense, the US.[br]Or check your supply chain if you build
0:24:28.309,0:24:32.919
hardware. And then for hardware[br]specifically, that's because,
0:24:32.919,0:24:37.979
like, we also do hardware in my company, I[br]always found that interesting. We're kind
0:24:37.979,0:24:41.710
of in a world where everyone tries to[br]save, like, the last little bit of money
0:24:41.710,0:24:46.460
out of every device that is built, and[br]often the difference between plastic
0:24:46.460,0:24:51.690
and metal screws is like half a cent, right.[br]And at that point it doesn't really change
0:24:51.690,0:24:57.039
your margins much. And maybe as an[br]engineer, you know, just say no and say:
0:24:57.039,0:25:00.620
You know, we don't have to do that. The[br]savings are too small to redesign
0:25:00.620,0:25:05.509
everything and it will impact upon our[br]quality so much that it just breaks
0:25:05.509,0:25:13.159
earlier. These are kind of the main four[br]points. I hope that makes sense. Then we
0:25:13.159,0:25:17.230
have two more, kind of, additional[br]checklists. The first one is data
0:25:17.230,0:25:24.490
collection. So really, especially,[br]like, in terms of IoT, you know,
0:25:24.490,0:25:29.419
everyone focuses on kind of collecting as[br]much data as possible without actually
0:25:29.419,0:25:33.629
having an application. And I think we[br]really have to start seeing that as a
0:25:33.629,0:25:39.889
liability. And instead try to really[br]define the application first, define which
0:25:39.889,0:25:44.279
data we need for it, and then really just[br]collect that. And we can start collecting
0:25:44.279,0:25:49.350
more data later on. And that can really[br]prevent a lot of these negative cycles we
0:25:49.350,0:25:53.029
have seen, from just having machine learning[br]algorithms run on it, kind of
0:25:53.029,0:25:59.349
unsupervised, and seeing what comes out.[br]Then, also kind of really interesting, I
0:25:59.349,0:26:02.820
found that, many times, like, a lot of[br]people are so fascinated by the amount of
0:26:02.820,0:26:08.649
data, right, just try to have as many data[br]points as possible. But very often you can
0:26:08.649,0:26:13.539
realize exactly the same application with a[br]fraction of data points. Because what you
0:26:13.539,0:26:17.879
really need is, like, trends. And that[br]usually also makes the product more efficient.
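A small sketch of that point, with invented readings: if the application only needs a trend, a coarse aggregate such as an hourly average answers the question with a fraction of the data points that per-minute (or per-person) logging would require.

```python
# Hedged sketch: derive the trend the application needs from coarse
# aggregates instead of keeping every raw data point. Readings invented.
from statistics import mean

# One reading per minute for a day (1440 points, slowly warming)...
raw_readings = [20.0 + i * 0.001 for i in range(24 * 60)]

# ...versus one hourly average: 24 points instead of 1440.
hourly_averages = [mean(raw_readings[h * 60:(h + 1) * 60]) for h in range(24)]

def simple_trend(values):
    # Positive if the second half is warmer than the first half.
    half = len(values) // 2
    return mean(values[half:]) - mean(values[:half])

print(len(raw_readings), "raw points vs", len(hourly_averages), "aggregates")
print("trend from raw points:", round(simple_trend(raw_readings), 3))
print("trend from aggregates:", round(simple_trend(hourly_averages), 3))
```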
0:26:17.879,0:26:24.230
Then, how privacy-intrusive is[br]the data we collect? Right. There's a big
0:26:24.230,0:26:27.619
difference between, let's say, the[br]temperature in this building and
0:26:27.619,0:26:32.070
everyone's individual movements here. And[br]if it is privacy intrusive then we should
0:26:32.070,0:26:35.759
really, really think hard if we want to[br]collect it. Because we don't know how it
0:26:35.759,0:26:44.389
might be used at a later point. And then,[br]are we actually collecting data without
0:26:44.389,0:26:47.610
people realizing it, right,[br]especially if we look at Facebook and
0:26:47.610,0:26:52.720
Google. They're collecting a lot of data[br]without really explicit consent. But of
0:26:52.720,0:26:58.659
course at some point you like all agreed[br]to the privacy policy. But it's often not
0:26:58.659,0:27:03.860
clear to you when and which data is[br]collected. And that's kind of dangerous
0:27:03.860,0:27:12.149
and, in the same way, if you kind of[br]build dark patterns into your app that
0:27:12.149,0:27:17.669
kind of fool you into sharing even more[br]data. I had, like, an example that someone
0:27:17.669,0:27:24.830
told me yesterday. I don't know if you know[br]Venmo, which is this American system where
0:27:24.830,0:27:27.381
you pay each other with your smartphone.[br]Basically to split the bill in a
0:27:27.381,0:27:32.570
restaurant. By default, all transactions[br]are public. So, like 200 million public
0:27:32.570,0:27:38.659
transactions which everyone can see,[br]including the description of it. So for
0:27:38.659,0:27:45.249
some of the more maybe not so legal[br]payments that was also very obvious,
0:27:45.249,0:27:50.720
right? And it's totally not obvious when[br]you use the app that that is happening. So
0:27:50.720,0:27:56.809
that's definitely a dark pattern that[br]they're employing here. And then the next
0:27:56.809,0:28:02.809
point is User Product Education and[br]Transparency. Is a user able to understand
0:28:02.809,0:28:08.320
how the product works? And, of course, we[br]can't really ever have a perfect
0:28:08.320,0:28:15.609
explanation of all the intricacies of the[br]technology. But these days for most people
0:28:15.609,0:28:20.889
almost all of the apps, the interfaces,[br]the underlying technology and tech, is
0:28:20.889,0:28:25.340
a complete black box, and no one is really[br]making an effort to explain it to them, while
0:28:25.340,0:28:30.460
most companies advertise it like this[br]magical thing. But that just leads to kind
0:28:30.460,0:28:35.950
of this immunization where you just look at[br]it and you don't even try to understand
0:28:35.950,0:28:46.399
it. I'm pretty sure that almost no one,[br]like, these days is still opening up a PC
0:28:46.399,0:28:50.029
and trying to look at the components,[br]right, because everything is a tablet and
0:28:50.029,0:28:56.639
it's integrated and it's sold to us like[br]this magical media consumption machine.
0:28:56.639,0:29:01.809
Then, are users informed when decisions[br]are made for them? So we had that in
0:29:01.809,0:29:07.950
Empowerment, that we should try to reduce[br]the amount of decisions we make for the
0:29:07.950,0:29:12.149
user. But sometimes, that's a good thing[br]to do. But then, is it transparently
0:29:12.149,0:29:17.789
communicated? I would be totally fine with[br]Google Maps filtering out for me the
0:29:17.789,0:29:21.629
points of interest if it would actually[br]tell me that it's doing that. And if I
0:29:21.629,0:29:25.779
could understand why it made that decision[br]and why it showed me this place. And maybe
0:29:25.779,0:29:29.589
also have a way to switch it off if I[br]want. But today we seem to kind of assume
0:29:29.589,0:29:34.039
that we know better than the people, that[br]we found the perfect algorithm
0:29:34.039,0:29:37.869
that has a perfect answer. So we don't[br]even have to explain how it works, right?
0:29:37.869,0:29:42.440
We just do it and people will be happy.[br]But then what we end up with is very negative
0:29:42.440,0:29:48.999
consequences. And then, that's more like a[br]marketing thing, how is it actually
0:29:48.999,0:29:54.820
advertised? I find it, for example, quite[br]worrisome that things like Siri and
0:29:54.820,0:30:00.210
Alexa and Google Home are, like, sold as[br]these magical AI machines that make your
0:30:00.210,0:30:03.499
life better, and are your personal[br]assistant. When in reality they are
0:30:03.499,0:30:10.119
actually still pretty dumb, pattern[br]matching. And that also creates a big
0:30:10.119,0:30:14.379
disconnect. Because now we have children[br]growing up who actually think that Alexa
0:30:14.379,0:30:20.529
is a person. And that's kind of dangerous.[br]And I think we should try to prevent that
0:30:20.529,0:30:27.450
because for these children, basically, it[br]kind of creates this veil and it's
0:30:27.450,0:30:33.340
humanized. And that's especially dangerous[br]if then the machine starts to make
0:30:33.340,0:30:37.359
decisions for them, and suggestions,[br]because they will take them as if a human
0:30:37.359,0:30:48.539
made them for them. So,[br]these are kind of the main areas. Of course
0:30:48.539,0:30:54.929
it's a bit more complicated. So we just[br]published the standard today in the first
0:30:54.929,0:31:00.990
draft version. And it's basically three[br]parts: a concise introduction, kind of the
0:31:00.990,0:31:04.809
questions and checklists that you just saw.[br]And then actually how to implement it in
0:31:04.809,0:31:10.239
your company, which processes to have, at[br]which point you basically should have
0:31:10.239,0:31:16.289
kind of a feature gate.
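A minimal sketch of what such a feature gate could look like in a release process: shipping is blocked until the assessment questions have been answered and someone has signed off. The questions and fields here are illustrative, not the standard's actual wording.

```python
# Hedged sketch of a release "feature gate": block shipping until the
# assessment questions are answered and signed off. The question list and
# fields are illustrative, not the GTC standard's actual wording.
from dataclasses import dataclass, field

@dataclass
class Assessment:
    feature: str
    answers: dict = field(default_factory=dict)        # question -> written answer
    signed_off_by: list = field(default_factory=list)  # names of reviewers

REQUIRED_QUESTIONS = [
    "Does the feature empower users or take decisions away from them?",
    "What happens if 100 million people use it?",
    "Which data does it collect, and is all of it needed?",
    "Is its behaviour transparently communicated to users?",
]

def gate_passes(assessment: Assessment) -> bool:
    unanswered = [q for q in REQUIRED_QUESTIONS
                  if not assessment.answers.get(q, "").strip()]
    if unanswered:
        print(f"BLOCKED: {assessment.feature}: {len(unanswered)} question(s) unanswered")
        return False
    if not assessment.signed_off_by:
        print(f"BLOCKED: {assessment.feature}: no sign-off recorded")
        return False
    print(f"OK to ship: {assessment.feature}")
    return True

gate_passes(Assessment(feature="trip suggestions v2"))  # blocked: nothing answered yet
```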
0:31:16.289,0:31:22.749
And I would ask everyone to go there, look at it, contribute,[br]share it with people. We hope that we'll have a final version ready kind
0:31:22.749,0:31:40.499
of in Q1 and that by then people can start[br]to implement it. Oh, yeah. So, even though
0:31:40.499,0:31:45.330
we have this standard, right, I want to[br]make it clear that having such a standard and
0:31:45.330,0:31:50.539
implementing it in your organization or[br]for yourself or your product is great. It
0:31:50.539,0:31:55.859
actually doesn't remove your[br]responsibility, right? This can only be
0:31:55.859,0:32:01.899
successful if we actually all accept that[br]we are responsible. Right? If today I
0:32:01.899,0:32:06.719
build a bridge as a structural engineer[br]and the bridge breaks down because I
0:32:06.719,0:32:10.479
miscalculated, I am responsible. And I[br]think, equally, we have to accept that if
0:32:10.479,0:32:18.589
we build technology like this we also have[br]to, kind of, assume that responsibility.
0:32:18.589,0:32:25.289
And before we kind of move to Q&A, I'd[br]like to kind of share this quote. This
0:32:25.289,0:32:30.839
is Chamath Palihapitiya, former Facebook[br]executive, from the really early times.
0:32:30.839,0:32:35.340
And also, around a year ago, when we[br]actually started the GTC, he said this at a
0:32:35.340,0:32:40.190
conference: "I feel tremendous guilt. I[br]think in the back, in the deep recesses
0:32:40.190,0:32:44.269
of our minds we knew something bad could[br]happen. But I think the way we defined it
0:32:44.269,0:32:48.490
is not like this. It is now literally at a[br]point where I think we have created
0:32:48.490,0:32:54.169
tools that are ripping apart the social[br]fabric of how society works." And
0:32:54.169,0:33:02.769
personally, and I hope the same for you, I[br]do not want to be that person that five
0:33:02.769,0:33:07.979
years down the line realizes that they[br]built that technology. So if there is one
0:33:07.979,0:33:14.190
take-away that you can take home from this[br]talk, then it's to just start asking yourself:
0:33:14.190,0:33:18.720
What is good technology, what does it mean[br]for you? What does it mean for the
0:33:18.720,0:33:24.779
products you build and what does it mean[br]for your organization? Thanks.
0:33:24.779,0:33:29.970
applause
0:33:29.970,0:33:37.739
Herald: Thank you. Yann Leretaille. Do we[br]have questions in the room? There are
0:33:37.739,0:33:44.539
microphones, microphones number 1, 2, 3,[br]4, 5. If you have a question please speak
0:33:44.539,0:33:49.269
loud into the microphone, as the people in[br]the stream want to hear you as well.
0:33:49.269,0:33:52.616
I think microphone number 1 was the fastest. [br]So please.
0:33:52.616,0:33:57.659
Question: Thank you for your talk. I just[br]want to make a short comment first and
0:33:57.659,0:34:01.750
then ask a question. I think this last[br]thing you mentioned about offering users
0:34:01.750,0:34:06.999
the option to have more control of the[br]interface: there is also the problem that
0:34:06.999,0:34:11.330
users don't want it. Because when you look[br]at the statistics of how people use online
0:34:11.330,0:34:17.250
web tools, only maybe 5 percent of them[br]actually use that option. So companies
0:34:17.250,0:34:22.260
remove them because for them it seems like[br]it's something not so efficient for user
0:34:22.260,0:34:26.409
experience. This was just one thing to[br]mention and maybe you can respond to that.
0:34:26.409,0:34:33.589
But what I wanted to ask you was, that all[br]these principles that you presented, they
0:34:33.589,0:34:40.079
seem to be very sound and interesting and[br]good. We can all accept them as
0:34:40.079,0:34:45.329
developers. But how would you propose to[br]actually sell them to companies. Because
0:34:45.329,0:34:50.090
if you adopt a principle like this as an[br]individual based on your ideology or the
0:34:50.090,0:34:53.680
way that you think, okay, it's great it[br]will work, but how would you convince a
0:34:53.680,0:34:58.590
company which is driven by profits to[br]adopt these practices? Have you thought of
0:34:58.590,0:35:04.589
this and what's your idea about this?[br]Thank you.
0:35:04.589,0:35:11.310
Yann: Yeah. Maybe to the first part.[br]First, that giving people choice is
0:35:11.310,0:35:16.760
something that people do not want and[br]that's why companies removed it. I think
0:35:16.760,0:35:21.970
if you look at the development process[br]it's basically like a huge cycle of
0:35:21.970,0:35:26.359
optimization and user testing geared[br]towards a very specific goal, right, which
0:35:26.359,0:35:31.060
is usually set by leadership, which is,[br]like, bringing engagement up or increasing the
0:35:31.060,0:35:37.670
user count by 200 percent. So I would say[br]the goals were, or are today, mostly
0:35:37.670,0:35:41.849
misaligned. And that's why we end up with[br]interfaces that look a very certain way,
0:35:41.849,0:35:46.049
right? If we set the goals[br]differently, and I mean that's why we have
0:35:46.049,0:35:51.130
like UI and UX research. I'm very sure we[br]can find ways to build interfaces that are
0:35:51.130,0:35:59.370
just different. And still engaging, but[br]also give that choice. To the second
0:35:59.370,0:36:06.289
question. I mean it's kind of interesting.[br]So I wouldn't expect a company like Google
0:36:06.289,0:36:10.730
to implement something like this, because[br]it's a bit against their interests at
0:36:10.730,0:36:16.359
that point, probably. But I've met a lot of,[br]like, also high-level executives already,
0:36:16.359,0:36:23.250
who were actually very aware of kind of[br]the issues of technology that they built.
0:36:23.250,0:36:28.480
And there is definitely interest there,[br]also more on, like, the industrial side, and so
0:36:28.480,0:36:34.250
on, especially, it seems, around self-driving[br]cars, to actually adopt that. And in the
0:36:34.250,0:36:39.530
end I think, you know, if everyone[br]actually demands it, then there's a pretty
0:36:39.530,0:36:44.069
high probability that it might actually[br]happen. Especially, as workers in the tech
0:36:44.069,0:36:50.760
field, we are quite flexible in the[br]selection of our employer. So I think if
0:36:50.760,0:36:56.340
you give it some time, that's definitely[br]something that's very possible. The second
0:36:56.340,0:37:01.930
aspect is that, actually, if we looked at[br]something like Facebook, I think they
0:37:01.930,0:37:08.730
overdid it. They, say, optimized it so far and[br]pushed the engagement machine, kind of
0:37:08.730,0:37:13.150
triggering, like, your brain cells to[br]never stop being on the site and keep
0:37:13.150,0:37:17.960
scrolling, that people got too much of it.[br]And now they're leaving the platform in
0:37:17.960,0:37:21.950
droves. And of course Facebook would not[br]go down, they own all these other social
0:37:21.950,0:37:27.420
networks. But for the product itself, as[br]you can see, long term it's not even
0:37:27.420,0:37:33.519
necessarily a positive business outcome.[br]And everything we are advertising here
0:37:33.519,0:37:38.610
still allows for very profitable businesses,[br]right, just tweaking the right screws.
0:37:38.610,0:37:42.730
Herald: Thank you. We have a question from[br]the interweb.
0:37:42.730,0:37:48.480
Signal Angel: Yes. Fly asks a question
0:37:48.480,0:37:54.700
that goes into a similar direction. In[br]recent months we had numerous reports
0:37:54.700,0:37:59.040
about social media executives forbidding[br]their children to use the products they
0:37:59.040,0:38:05.289
create at work. I think these people know[br]that their products are made addictive
0:38:05.289,0:38:11.079
deliberately. Do you think your work is[br]somewhat superfluous because big companies
0:38:11.079,0:38:16.400
are doing the opposite on purpose?[br]Yann: Right. I think that's why you have
0:38:16.400,0:38:23.119
to draw the line between intentional and[br]unintentional. If we go to intentional
0:38:23.119,0:38:27.220
things like what Uber did and so on. At[br]some point it should probably become a
0:38:27.220,0:38:32.289
legal issue. Unfortunately we are not[br]there yet and usually regulation is kind
0:38:32.289,0:38:39.019
of lagging way behind. So I think for now[br]we should focus on, you know, the more
0:38:39.019,0:38:45.190
unintentional consequences, of which there[br]are plenty, and kind of appeal to the
0:38:45.190,0:38:52.329
good in humans.[br]Herald: Okay. Microphone number 2 please.
0:38:52.329,0:38:59.619
Q: Thank you for sharing your ideas about[br]educating the engineer. What about
0:38:59.619,0:39:05.440
educating the customer, the consumer who[br]purchases the product.
0:39:05.440,0:39:12.390
Yann: Yeah. So that's a really valid[br]point. Right. As I said I think
0:39:12.390,0:39:19.609
[unintelligible] like part of your product[br]development. And the way you build a
0:39:19.609,0:39:24.619
product should also be how you educate[br]your users on how it works. Generally, we
0:39:24.619,0:39:30.940
have a really big kind of technology[br]illiteracy problem. Things have been
0:39:30.940,0:39:34.990
moving so fast in the last year that most[br]people haven't really caught up and they
0:39:34.990,0:39:39.849
just don't understand things anymore. And[br]I think again that's like a shared
0:39:39.849,0:39:44.500
responsibility, right? You can't just do[br]that in the tech field. You have to talk
0:39:44.500,0:39:48.670
to your relatives, to people. That's why[br]we're doing, like, this series of articles
0:39:48.670,0:39:54.860
and media partnerships to kind of explain[br]and make these things transparent. One
0:39:54.860,0:40:01.510
thing we just started working on is a[br]children's book. Because for children,
0:40:01.510,0:40:06.510
like, the entire world just exists of[br]these shiny glass surfaces and they don't
0:40:06.510,0:40:11.180
understand at all what is happening. But[br]it's also primetime to explain to them,
0:40:11.180,0:40:15.420
like, really simple machine learning[br]algorithms. How they work, how like,
0:40:15.420,0:40:19.280
filter bubbles work, how decisions are[br]made. And if you understand that from an
0:40:19.280,0:40:24.740
early age on, then maybe you'll be able to[br]deal with what is happening in a way
0:40:24.740,0:40:31.730
better, more educated way. But I do think[br]that is a very long process, and so the sooner
0:40:31.730,0:40:37.349
we start and the more work we invest in[br]that, the earlier people will be better
0:40:37.349,0:40:40.730
educated.[br]Herald: Thank you. Microphone number 1
0:40:40.730,0:40:45.090
please.[br]Q: Thanks for sharing your insights. I
0:40:45.090,0:40:50.829
feel like, while you presented these rules[br]along with their meaning, the specific
0:40:50.829,0:40:55.710
selection might seem a bit arbitrary. And[br]for my personal acceptance and willingness
0:40:55.710,0:41:01.840
to implement them it would be interesting[br]to know the reasoning, besides common
0:41:01.840,0:41:07.280
sense, that justifies this specific[br]selection of rules. So, it would be
0:41:07.280,0:41:12.779
interesting to know if you looked at[br]examples from history, or if you just sat
0:41:12.779,0:41:19.299
down and discussed things, or if you just[br]grabbed some rules out of the air. And so
0:41:19.299,0:41:26.230
my question is: What influenced you for[br]the development of these specific rules?
0:41:26.230,0:41:34.130
Yann: It's a very complicated question. So,[br]how did we come up with this specific selection
0:41:34.130,0:41:39.470
of rules and also, like, the main building[br]blocks of what we think good
0:41:39.470,0:41:47.099
technology should be? Well, let's say first what[br]we didn't want to do, right. We didn't
0:41:47.099,0:41:51.119
want to create like a value framework and[br]say, like, this is good, this is bad,
0:41:51.119,0:41:55.290
don't do this kind of research or[br]technology. Because this would also be
0:41:55.290,0:41:59.960
outdated, it doesn't apply to everyone. We[br]probably couldn't even agree in the expert
0:41:59.960,0:42:05.300
council on that because it's very diverse.[br]Generally, we try to get everyone on the
0:42:05.300,0:42:12.200
table. And we talked about issues we had,[br]like, for example me as an entrepreneur, when
0:42:12.200,0:42:18.890
I was, like, developing products with[br]our own engineers. Issues we've seen in terms
0:42:18.890,0:42:26.790
of public perception. Issues we've seen,[br]like, on a more governmental level. Then
0:42:26.790,0:42:32.349
we also have, like, cryptologists in[br]there. So we looked at that as well and
0:42:32.349,0:42:42.509
then we made a really, really long list[br]and kind of started clustering it. And a
0:42:42.509,0:42:49.589
couple of things did get cut off. But[br]generally, based on the clustering, these
0:42:49.589,0:42:57.569
were kind of the main themes that we saw.[br]And again, it's really more of a tool for
0:42:57.569,0:43:03.680
yourselves as companies, developers,[br]designers and engineers to really
0:43:03.680,0:43:08.690
understand the impact and evaluate it. Right.[br]This is what these questions are
0:43:08.690,0:43:13.369
aimed at. And we think that for that they[br]do a very good job.
0:43:13.369,0:43:18.559
From microphone 1: Thank you.[br]Herald: Thank you. And I think. Microphone
0:43:18.559,0:43:22.359
number 2 has a question again.[br]Q: Hi. I was just wondering how you've
0:43:22.359,0:43:26.730
gone about engaging with other standards[br]bodies, that perhaps have a wider
0:43:26.730,0:43:32.540
representation. It looks, largely, from[br]your team and the council currently, that
0:43:32.540,0:43:36.740
there's not necessarily a lot of[br]engagement outside of Europe. So how do
0:43:36.740,0:43:41.540
you go about getting representation from[br]Asia. For example.
0:43:41.540,0:43:52.369
Yann: No, at the moment you're correct, the[br]GTC is mostly a European initiative. We
0:43:52.369,0:43:57.710
are in talks with other organizations who[br]work on similar issues and regularly
0:43:57.710,0:44:04.220
exchange ideas. But, yeah, we thought we[br]should probably start somewhere, and Europe
0:44:04.220,0:44:09.250
is actually a really good place to start.[br]Like a societal discourse about technology
0:44:09.250,0:44:14.049
and the impact it has, and also to have[br]change. But I think, compared to, for example,
0:44:14.049,0:44:19.599
things like Asia or the US,[br]where there is a very different perception of
0:44:19.599,0:44:25.029
privacy and technology and progress and[br]like the rights of the individual Europe
0:44:25.029,0:44:29.400
is actually a really good place to do[br]that. And we can also see things like GDPR
0:44:29.400,0:44:35.920
regulation, that actually, ... It's kind[br]of complicated. It's also kind of a big
0:44:35.920,0:44:40.790
step forward in terms of protecting the[br]individual from exactly these kind of
0:44:40.790,0:44:47.150
consequences. Of course though, long term[br]we would like to expand this globally.
0:44:47.150,0:44:52.640
Herald: Thank you. Microphone number 1[br]again.
0:44:52.640,0:44:57.270
Q: Hello. Just a short question. I[br]couldn't find a donate button on your
0:44:57.270,0:45:03.549
website. Do you accept donations? Is money[br]a problem? Like, do you need it?
0:45:03.549,0:45:12.960
Yann: Yes, we do need money. However it's[br]a bit complicated because we want to stay
0:45:12.960,0:45:19.750
as independent as possible. So we are not[br]accepting project related money. So you can't
0:45:19.750,0:45:22.470
like, say 'we want to do a certain research[br]project with you'; it has to be
0:45:22.470,0:45:29.800
unconditional. And the second thing we do[br]is for the events we organize. We usually
0:45:29.800,0:45:33.690
have sponsors that provide, like, venue[br]and food and logistics and things like
0:45:33.690,0:45:39.140
that. But that's only for the event.[br]And again, they can't, like, change the
0:45:39.140,0:45:44.249
program of it. So if you want to do that[br]you can come into contact with us. We
0:45:44.249,0:45:48.509
don't have a mechanism yet for individuals[br]to donate. We might add that.
0:45:48.509,0:45:54.109
Herald: Thank you. Did you think about[br]Patreon or something like that?
0:45:54.109,0:46:03.509
Yann: We thought about quite a few[br]options. Yeah, but it's actually not so
0:46:03.509,0:46:09.470
easy to not fall into the trap that,[br]like, other organizations in this space have, where,
0:46:09.470,0:46:15.190
like, Google at some point sweeps in and[br]it's like: Hey, do you want all this cash.
0:46:15.190,0:46:18.840
And then very quickly you have a big[br]conflict of interest. Even if you don't
0:46:18.840,0:46:25.660
want that to happen it starts happening.[br]Herald: Yeah right. Number 1 please.
0:46:25.660,0:46:32.730
Q: I was wondering how do you unite the[br]second and third points in your checklist.
0:46:32.730,0:46:37.960
Because the second one is intelligence by[br]design. The third one is to take into
0:46:37.960,0:46:43.080
account future technologies. But companies[br]do not want to push back their
0:46:43.080,0:46:48.519
technologies endlessly to take into[br]account future technologies. And on the
0:46:48.519,0:46:52.000
other hand they don't want to compromise[br]their own design too much.
0:46:52.000,0:47:00.160
Yann: Yeah. Okay. Okay. Got it. So you[br]were saying that if we always think through
0:47:00.160,0:47:04.109
these, like, future scenarios and the[br]worst case and everything and incorporate
0:47:04.109,0:47:07.869
every possible thing that might happen in[br]the future we might end up doing nothing
0:47:07.869,0:47:14.210
because everything looks horrible. For[br]that I would say, like, we are not like
0:47:14.210,0:47:21.079
technology haters. We are all from areas[br]working in tech. So of course the idea is
0:47:21.079,0:47:25.859
that you can just take a look at what is[br]there today and try to make an assessment
0:47:25.859,0:47:30.470
based on that. And the idea is, if you adopt[br]and meet the standard, that over
0:47:30.470,0:47:35.289
time, actually, when you add[br]new major features, you look back at your
0:47:35.289,0:47:40.079
assessment from before and see if it[br]changed. So the idea is you kind of create
0:47:40.079,0:47:46.819
a snapshot of how it is now. And this kind[br]of document that you end up with as part of
0:47:46.819,0:47:50.549
your documentation kind of evolves over[br]time as your product changes and the
0:47:50.549,0:47:57.390
technology around it changes as well.[br]Herald: Thank you. Microphone number 2.
0:47:57.390,0:48:02.789
Q: So thanks for the talk and especially[br]the effort. Just to echo back the
0:48:02.789,0:48:07.430
question that was asked a bit before on[br]starting with Europe. I do think it's a
0:48:07.430,0:48:14.010
good option. What I'm a little bit worried[br]about is that it might be the only option. It might
0:48:14.010,0:48:19.569
become irrelevant rather quickly because[br]it's easy to do, it's less hard to
0:48:19.569,0:48:26.220
implement. Maybe in Europe now. Okay. The[br]question is. It might work in Europe now
0:48:26.220,0:48:30.960
but if Europe doesn't have the same[br]economic power, it cannot weigh in as much
0:48:30.960,0:48:36.549
politically with let's say China or the US[br]in Silicon Valley. So will it still be
0:48:36.549,0:48:41.390
possible and relevant if the economic[br]balance shifts?
0:48:41.390,0:48:52.329
Yann: Yes, I mean we have to start[br]somewhere, right? Just saying "Oh,
0:48:52.329,0:48:59.040
the economic balance will shift anyway,[br]Google will invent the singularity, and that's
0:48:59.040,0:49:02.039
why we shouldn't do anything" is, I think,[br]one of the reasons why we actually got
0:49:02.039,0:49:07.730
here, why it kind of is this assumption[br]that there is like this really big picture
0:49:07.730,0:49:14.490
that is kind of working against us, so we[br]all do our small part to fulfill that
0:49:14.490,0:49:20.779
kind of evil vision by not doing anything.[br]I think we have to start somewhere and I
0:49:20.779,0:49:26.780
think for having operated for one year, we[br]have been actually quite successful so far
0:49:26.780,0:49:31.690
and we have made good progress. And I'm[br]totally looking forward to making it a bit
0:49:31.690,0:49:35.769
more global and to start traveling more. I[br]think we had, like, one event outside Europe
0:49:35.769,0:49:40.330
last year in the US and that will[br]definitely increase over time, and we're
0:49:40.330,0:49:46.450
also working on making kind of our[br]ambassadors more mobile and kind of expand
0:49:46.450,0:49:50.310
to other locations. So it's definitely on[br]the roadmap but it's not like yeah, just
0:49:50.310,0:49:54.030
staying here. But yeah, you have to start[br]somewhere and that's what we did.
0:49:54.030,0:50:01.809
Herald: Nice, thank you. Number 1 please.[br]Mic 1: Yeah. One thing I haven't found was how
0:50:01.809,0:50:08.420
all those general rules you formulated[br]fit into the more general rules of
0:50:08.420,0:50:16.390
society, like the constitutional rules.[br]Have you considered that and it's just not
0:50:16.390,0:50:25.319
clearly stated and will be stated, or did[br]you develop them more from the bottom up?
0:50:25.319,0:50:33.470
Yann: Yes, you are completely right. So we[br]are defining the process and the questions
0:50:33.470,0:50:39.330
to ask yourself, but we are actually not[br]defining a value framework. The reason for
0:50:39.330,0:50:42.809
that is that societies are different; as I[br]said, there are widely different
0:50:42.809,0:50:48.260
expectations towards technology, privacy,[br]how society should work, and so
0:50:48.260,0:50:53.799
on. The second one is that every[br]company is also different, right, every
0:50:53.799,0:50:58.240
company has their own company culture and[br]things they want to do and they don't want
0:50:58.240,0:51:04.640
to do. If I would say, for example, we[br]would have put in there "You should not
0:51:04.640,0:51:08.220
build weapons or something like that",[br]right, that would mean that all these
0:51:08.220,0:51:12.950
companies that work in that field couldn't[br]try to adopt it. And while I don't want
0:51:12.950,0:51:17.029
them to build weapons maybe in their value[br]framework that's OK and we don't want to
0:51:17.029,0:51:21.069
impose that, right. That's why I said in[br]the beginning we actually, we're called
0:51:21.069,0:51:24.730
the Good Technology Collective, we are not[br]defining what it is and I think that's
0:51:24.730,0:51:28.780
really important. We are not trying to[br]impose our opinion here. We want others to
0:51:28.780,0:51:33.750
decide for themselves what is good, and[br]then support them and guide them in
0:51:33.750,0:51:36.299
building products that they believe are[br]good.
0:51:36.299,0:51:44.599
Herald: Thank you. Number two .[br]Mic 2: Hello, thanks for sharing. As
0:51:44.599,0:51:51.710
engineers we always want users to spend[br]more time using our product, right? But
0:51:51.710,0:51:58.990
I'm working at a mobile game company. Yep.[br]We are making, we are making a world where
0:51:58.990,0:52:05.539
users love our product. So we want users to[br]spend more time in our game, so we may
0:52:05.539,0:52:13.510
make a lot of money, yeah. But when users[br]spend time playing our game they may lose
0:52:13.510,0:52:19.549
something. Yeah. You know. So how do we[br]think about the balance in a mobile
0:52:19.549,0:52:24.910
game?[br]Yann: Hmm. It's a really difficult
0:52:24.910,0:52:32.470
question. So the question was like[br]specifically for mobile gaming. Where's
0:52:32.470,0:52:38.490
kind of the balance between trying to[br]engage people more and, yeah, basically
0:52:38.490,0:52:44.510
making them addicted and having them spend[br]all their money, I guess. I personally
0:52:44.510,0:52:53.880
would say it's about intent, right? It's[br]totally fine to have a business model where
0:52:53.880,0:52:58.119
you make money with a game. I mean that's[br]kind of good and people do want
0:52:58.119,0:53:08.750
entertainment. But if you actively use,[br]like, research in how, like, you know,
0:53:08.750,0:53:14.750
like, the brain actually works and how it[br]gets super engaged, and if you basically
0:53:14.750,0:53:18.540
build in, like, gamification and[br]lotteries, which a lot of, I think, have
0:53:18.540,0:53:21.829
done, where basically your game becomes a[br]slot machine, right, you always want to
0:53:21.829,0:53:28.270
see the next opening of a crate[br]and see what you got. Kind of making it a
0:53:28.270,0:53:32.651
luck based game, actually. I think if you[br]go too far into that direction, at some
0:53:32.651,0:53:36.280
point you cross the line. Where that line[br]is you have to decide yourself, right,
0:53:36.280,0:53:40.060
some of it can be good game[br]dynamics, but there are definitely some games
0:53:40.060,0:53:44.700
out there where, I would say, there is quite good reason[br]to say that they pushed it quite
0:53:44.700,0:53:48.099
a bit too far. And if you actually look at[br]how they did it, because they wrote about
0:53:48.099,0:53:52.730
it, they actually did use very modern[br]research and very extensive testing to
0:53:52.730,0:53:58.180
really find out these, all these patterns[br]that make you addicted. And then it's not
0:53:58.180,0:54:02.260
much better than an actual slot machine.[br]And that probably we don't want.
0:54:02.260,0:54:08.140
Herald: So it's also an ethical question[br]for each and every one of us, right?
0:54:08.140,0:54:10.750
Yann: Yes.[br]Herald: I think there is a light and I
0:54:10.750,0:54:13.500
think this light means the interwebs has a[br]question.
0:54:13.500,0:54:21.589
Signal angel: There's another question[br]from ploy about practical usage, I guess.
0:54:21.589,0:54:25.199
Are you putting your guidelines to work in[br]your company? You said you're an
0:54:25.199,0:54:29.880
entrepreneur.[br]Yann: That's a great question. Yes, we
0:54:29.880,0:54:37.569
will. So we kind of just completed them,[br]and there was kind of a lot of work to get
0:54:37.569,0:54:41.740
there. Once they are finished and released[br]we will definitely be one of the first
0:54:41.740,0:54:47.910
adopters.[br]Herald: Nice. And with this I think we're
0:54:47.910,0:54:50.440
done for today.[br]Yann: Perfect.
0:54:50.440,0:54:54.049
Herald: Yann, people, warm applause!
0:54:54.049,0:54:55.549
applause
0:54:55.549,0:54:57.049
postroll music
0:54:57.049,0:55:19.000
subtitles created by c3subtitles.de[br]in the year 2020. Join, and help us!