1
00:00:00,000 --> 00:00:18,100
35C3 preroll music
2
00:00:18,100 --> 00:00:22,690
Herald Angel: The next talk will be given
by Régine Debatty and she will be talking
3
00:00:22,690 --> 00:00:31,510
about the good, the strange and the ugly
in art, science and technology in 2018. Big
4
00:00:31,510 --> 00:00:34,397
applause to Régine
5
00:00:34,397 --> 00:00:36,117
Applause
6
00:00:36,117 --> 00:00:41,070
Sound from video
7
00:00:41,070 --> 00:00:47,610
Régine: This was not supposed to
happen, I am sorry. Hello everyone. So my
8
00:00:47,610 --> 00:00:55,371
name is Régine Debatty. I'm an art critic
and blogger. Since 2004 I've been writing
9
00:00:55,371 --> 00:01:00,470
"we make money not art" which is a blog
that looks at the way artists, designers
10
00:01:00,470 --> 00:01:06,720
and hackers are using science and
technology in a critical and socially
11
00:01:06,720 --> 00:01:15,140
engaged way. So a few weeks ago some of
the organizers of the Congress e-mailed me
12
00:01:15,140 --> 00:01:20,540
and they asked me if I could give a kind
of overview of the most exciting and
13
00:01:20,540 --> 00:01:27,770
interesting works in art and technology in
2018. And on Monday when I started working
14
00:01:27,770 --> 00:01:33,520
on my slides I was really on a mission to
deliver, and I had the intention
15
00:01:33,520 --> 00:01:40,810
to give an overview of the best of art
and tech in 2018. But somehow things got
16
00:01:40,810 --> 00:01:46,440
out of hand. I started inserting works
that are a bit older, and also I could see
17
00:01:46,440 --> 00:01:52,830
a kind of narrative that emerged. So the result is going to be a
18
00:01:52,830 --> 00:01:58,500
presentation where I will still show some
of the works I found very interesting over
19
00:01:58,500 --> 00:02:04,120
this past year. But they're going to be
embedded into a broader narrative. And this
20
00:02:04,120 --> 00:02:09,709
narrative is going to look at the way
artists nowadays are using science and
21
00:02:09,709 --> 00:02:16,110
technology to really explore and expand
the boundaries and the limits of the human
22
00:02:16,110 --> 00:02:23,090
body. But they also use science and technology
to challenge intelligence, or at least the
23
00:02:23,090 --> 00:02:31,410
understanding we have of human
intelligence. So Exhibit 1 is "Face
24
00:02:31,410 --> 00:02:37,650
Detected". It's actually I wouldn't say it
was one of my favorites work from 2018
25
00:02:37,650 --> 00:02:42,350
but it's definitely by one of my favorite
artists. So that's why it's in
26
00:02:42,350 --> 00:02:48,280
there. So the setting is quite easy to
understand. Dries Depoorter, that's the name of the artist,
27
00:02:48,280 --> 00:02:53,420
had a solo show a couple of months
ago at MU artspace in Eindhoven and
28
00:02:53,420 --> 00:03:01,550
installed these blocks of clay on
tables and then invited artists, sculptors,
29
00:03:01,550 --> 00:03:08,260
to come and do portraits of people and
start shaping their head. And of course
30
00:03:08,260 --> 00:03:13,900
you know the process was followed by the
audience but it was also
31
00:03:13,900 --> 00:03:20,850
followed by face recognition software. And
as soon as the software had detected a
32
00:03:20,850 --> 00:03:27,970
face in the sculpture, the artist would
receive a message saying: Stop, we
33
00:03:27,970 --> 00:03:32,951
have detected a human face. And what is
interesting is that
34
00:03:32,951 --> 00:03:34,781
laughter
35
00:03:34,781 --> 00:03:40,151
there are several - I mean, the results differ -
36
00:03:40,151 --> 00:03:44,660
according to the face recognition system
37
00:03:44,660 --> 00:03:48,370
- because they work with different
parameters and they've been programmed by
38
00:03:48,370 --> 00:03:54,060
different programmers, so they're not
all the same - some
39
00:03:54,060 --> 00:03:58,440
are faster than others to recognize a face,
some have different parameters than
40
00:03:58,440 --> 00:04:05,290
others. So that's why the results are
always so different. But the reason why I
41
00:04:05,290 --> 00:04:10,460
found this project worth mentioning
is that it shows the increasingly big
42
00:04:10,460 --> 00:04:15,370
space taken by what is called artificial
intelligence - you can call it algorithms,
43
00:04:15,370 --> 00:04:24,990
big data or software. It shows the
increasingly big role that systems and
44
00:04:24,990 --> 00:04:31,180
machines are taking in culture and the
place they're taking in spheres that
45
00:04:31,180 --> 00:04:36,770
until not so long ago we thought were
really the appanage and exclusivity of
46
00:04:36,770 --> 00:04:43,880
humans. So for example creativity,
innovation, imagination and art. And
47
00:04:43,880 --> 00:04:47,800
nowadays it's pretty banal, I mean no one is
going to be surprised if you're listening
48
00:04:47,800 --> 00:04:52,090
to music and someone tells you: Oh,
this music has been composed
49
00:04:52,090 --> 00:04:57,399
actually by an algorithm or this is a
poem or a book that's been written
50
00:04:57,399 --> 00:05:02,250
by a machine. I mean it's becoming
almost mainstream. And this is happening
51
00:05:02,250 --> 00:05:09,739
also in the world of visual art. And we've
had a few examples this year. If you look
52
00:05:09,739 --> 00:05:15,159
at the newspapers, the mainstream
newspapers, we regularly had news
53
00:05:15,159 --> 00:05:20,720
saying oh, this algorithm has made this
masterpiece. And one of the most recent
54
00:05:20,720 --> 00:05:27,570
examples was this stunning portrait of a
gentleman that made the headlines because
55
00:05:27,570 --> 00:05:34,310
it was the first portrait that was made by
an algorithm that was sold at Christie's
56
00:05:34,310 --> 00:05:40,080
And Christie's is a famous auction house. And
it also made the headlines not only
57
00:05:40,080 --> 00:05:44,979
because it was the first one but because it sold for a surprisingly high price, like
58
00:05:44,979 --> 00:05:49,479
something like forty times the original
estimate. This painting made by an
59
00:05:49,479 --> 00:05:55,470
algorithm created by a Paris based
collective. So it sold for almost half a
60
00:05:55,470 --> 00:06:03,520
million dollars. And if I have to be honest
with you, I cannot think of any work that's
61
00:06:03,520 --> 00:06:10,499
been mostly done by an algorithm that has
really impressed me and moved me. But it
62
00:06:10,499 --> 00:06:16,529
seems that I am in the minority, because
you might have heard of these
63
00:06:16,529 --> 00:06:24,990
scientific experiments where computer
scientists at Rutgers University in the USA
64
00:06:24,990 --> 00:06:31,679
made this program that could
just make up and invent abstract
65
00:06:31,679 --> 00:06:38,340
paintings. And then they asked people like
human beings to have a look at the
66
00:06:38,340 --> 00:06:43,710
paintings made by the computer. But they
mixed them - like I said, when they were showing them
67
00:06:43,710 --> 00:06:48,839
to people they mixed them with paintings that
had really been done by human beings and
68
00:06:48,839 --> 00:06:55,039
had been exhibited at the famous art fair Art
Basel. And they asked them how they reacted
69
00:06:55,039 --> 00:07:00,250
and which one according to them had
been done by a human. And it turned out
70
00:07:00,250 --> 00:07:04,849
that people responded fairly well to the
paintings that had been done by this
71
00:07:04,849 --> 00:07:10,740
computer system. They tended to prefer
them. They found them - and this is odd - they
72
00:07:10,740 --> 00:07:16,479
described them as being more communicative and
more inspiring. So you know maybe one day
73
00:07:16,479 --> 00:07:22,729
I will be impressed but so far I would say
I would not be worried if I were an
74
00:07:22,729 --> 00:07:26,999
artist. I'm not going to say to artists: oh
you know, very soon machines are going to
75
00:07:26,999 --> 00:07:30,960
take your job. Well maybe if you make that
kind of painting yeah maybe you should be
76
00:07:30,960 --> 00:07:37,960
worried. But not the kind of artists I'm
interested in.
77
00:07:37,960 --> 00:07:45,819
The artists I'm interested in, I really trust
them to keep on challenging computer
78
00:07:45,819 --> 00:07:51,289
systems. I trust them to explore and
collaborate with them and use them to
79
00:07:51,289 --> 00:07:58,139
expand their artistic range. And most of
all I trust them to keep on sabotaging
80
00:07:58,139 --> 00:08:04,509
and hacking and subverting these computer
systems. And so yeah I'm not worried at
81
00:08:04,509 --> 00:08:10,639
all for artists, but maybe I should be
worried for art critics. Because a few years
82
00:08:10,639 --> 00:08:17,050
ago, in 2006 - and in
artificial intelligence time that's really
83
00:08:17,050 --> 00:08:26,120
really old. So in 2006 at the Palais de Tokyo
in Paris an artist was showing a program
84
00:08:26,120 --> 00:08:32,690
that automatically generated texts
of art criticism. And I'm sorry, but
85
00:08:32,690 --> 00:08:37,490
the example is in French. But if you
read French, and if you know the kind of
86
00:08:37,490 --> 00:08:44,860
press releases and "blah" they give you that
had been written by art critique. It is
87
00:08:44,860 --> 00:08:49,779
really that kind of thing - it works. It is
so incredible, you know, it's the kind of
88
00:08:49,779 --> 00:09:00,210
inflated rhetoric and use of vague terms
and also constantly referencing some fancy
89
00:09:00,210 --> 00:09:04,630
French philosopher. So it really really
worked. So personally I would be more
90
00:09:04,630 --> 00:09:12,810
worried for my job than for the jobs of the
artists. Anyway, this is another work I
91
00:09:12,810 --> 00:09:18,070
really liked in 2018, and it's one of
these works that challenge and play with and
92
00:09:18,070 --> 00:09:25,700
try to subvert artificial intelligence, and
in particular the so-called intelligent
93
00:09:25,700 --> 00:09:33,780
home assistants such as Alexa or Siri. So
you may know the artists: Mediengruppe
94
00:09:33,780 --> 00:09:40,700
Bitnik. They collaborated with a DJ and
electronic music composer called Low Jack
95
00:09:40,700 --> 00:09:46,420
and they made an album to be played
exclusively for Alexa and Siri and all these
96
00:09:46,420 --> 00:09:55,570
kinds of devices. And we'll listen to a
track in a moment. And what the
97
00:09:55,570 --> 00:10:00,580
music does is, it tries to interact with and
make Alexa react. So it asks
98
00:10:00,580 --> 00:10:06,440
questions to Alexa. It gives orders to
Alexa or Siri or whatever it's called. And
99
00:10:06,440 --> 00:10:13,510
it's also trying to express and convey and
communicate to these devices the kind of
100
00:10:13,510 --> 00:10:19,450
anxieties and unease and doubts that
the artists have, and some of us have,
101
00:10:19,450 --> 00:10:24,570
around these so-called intelligent home
assistants. Whether they're frightened
102
00:10:24,570 --> 00:10:31,060
about the encroachment on privacy and
the unreliability, or the fact that they are
103
00:10:31,060 --> 00:10:37,120
called smart, so that we kind of lower our
guard and implicitly start to trust
104
00:10:37,120 --> 00:10:45,310
them. So I'm going to make you listen to
one of the tracks. There are three of them
105
00:10:45,310 --> 00:10:51,250
- at least they sent me three of them.
Played music track: Hello!
106
00:10:51,250 --> 00:10:56,511
music
107
00:10:56,511 --> 00:11:04,500
High speed intelligent personal assistants all
listening to everything I say. I brought
108
00:11:04,500 --> 00:11:13,731
you into my home. You scary. Alexa, Prime's
new in time is now and all forever.
109
00:11:13,731 --> 00:11:21,240
music
110
00:11:21,240 --> 00:11:29,560
Alexa, I used to bark commands at you. Now I say
"please" and "thank you". Never mess with
111
00:11:29,560 --> 00:11:38,020
someone who knows your secrets. Siri,
there is no way far enough to hide. Ok,
112
00:11:38,020 --> 00:11:44,640
Google - are you laughing at me? You used
to answer that. Now, you disengage. Never
113
00:11:44,640 --> 00:11:54,320
enough digital street smarts to outsmart
Alexa. Alexa? Alexa? Hey, Alexiety, I want
114
00:11:54,320 --> 00:12:01,280
my control back. I want my home back.
Siri, delete my calender. Siri, hold my
115
00:12:01,280 --> 00:12:08,700
calls. Siri, delete my contacts. Ok,
Google, what's the loneliest number? Ok,
116
00:12:08,700 --> 00:12:16,492
Google, stop! Ok, Google, stop! Ok,
Google. Hello! Please, stop. Alexa, help!
117
00:12:16,492 --> 00:12:24,240
Alexa, disable Uber. Alexa, disable
lighting. Alexa, disable Amazon music
118
00:12:24,240 --> 00:12:31,680
unlimited. Alexa, quit! Hey, Siri!
Activate airplane mode. If you need me,
119
00:12:31,680 --> 00:12:40,473
I'll be offline, in the real world.
R: And that's it for this one.
120
00:12:40,473 --> 00:12:43,010
Next track playing
121
00:12:43,010 --> 00:12:51,830
R: Oh, my god, what is this? That is
why...? laughing My boyfriend is
122
00:12:51,830 --> 00:12:59,810
watching, he's going to love this so much.
Yeah, let's be serious with a bit of art
123
00:12:59,810 --> 00:13:07,080
and tech, shall we? So, there are two
reasons why this work is interesting. The
124
00:13:07,080 --> 00:13:11,290
first one is that you don't really need to
have one of these devices at home to have
125
00:13:11,290 --> 00:13:16,250
fun because it's been actually
conceived to be played as loud as
126
00:13:16,250 --> 00:13:22,890
possible. So, you don't need to
have one of these devices, but you
127
00:13:22,890 --> 00:13:27,479
can put the music very loud, open the
window and hopefully it is going to
128
00:13:27,479 --> 00:13:33,270
interact with the devices of your
neighbors and mess a bit with their
129
00:13:33,270 --> 00:13:36,830
domesticity. The other thing that is
130
00:13:36,830 --> 00:13:42,810
interesting is that of course Alexa has
been designed to appear as close to a
131
00:13:42,810 --> 00:13:48,380
human as possible. So each time the record
asks some question, the answer is going
132
00:13:48,380 --> 00:13:51,931
to change slightly, because that's the way
it's been programmed. And also because
133
00:13:51,931 --> 00:14:00,175
each time there is an update in the
program, the answers might change. And also, as I
134
00:14:00,175 --> 00:14:05,936
said, the artists made this record to
communicate their doubts and anxieties
135
00:14:05,936 --> 00:14:11,499
around these smart systems, because there are a
few problems around these systems. One of
136
00:14:11,499 --> 00:14:17,100
them is as I'm sure you know
it's about surveillance because they come
137
00:14:17,100 --> 00:14:21,930
with corporate systems of surveillance
that are embedded into the hardware and
138
00:14:21,930 --> 00:14:26,560
the software. And we may not realize it
until, you know, sometimes -
139
00:14:26,560 --> 00:14:31,460
one day we open the pages of a newspaper.
This is a story that appeared on The
140
00:14:31,460 --> 00:14:39,630
Verge a few days ago and it's the story of
a German citizen who e-mailed Amazon and
141
00:14:39,630 --> 00:14:44,430
said: look Amazon, I want to know all the
data you've collected about me over these
142
00:14:44,430 --> 00:14:48,220
past few years. And then he received
the data, except that he received the
143
00:14:48,220 --> 00:14:54,540
wrong data. He received 1700 Alexa voice
recordings but he doesn't have an Alexa.
144
00:14:54,540 --> 00:15:00,240
These were not his data. So he wrote back
to Amazon and said: "Oh look, I think there
145
00:15:00,240 --> 00:15:07,640
is a problem there". Amazon never answered
back. So what did the guy do? He went to a
146
00:15:07,640 --> 00:15:13,190
German newspaper and the reporters managed
to analyze the data. And they managed to
147
00:15:13,190 --> 00:15:18,650
identify who the original owner of the
data was and who his girlfriend is and who
148
00:15:18,650 --> 00:15:24,620
his circle of friends is. And you know if
you couple all the data from Alexa with
149
00:15:24,620 --> 00:15:28,890
what is already available for everyone to
see on Facebook and other forms of social
150
00:15:28,890 --> 00:15:34,710
media, you get kind of a scary and worrying
picture so you understand why artists and
151
00:15:34,710 --> 00:15:42,380
other people might be worried and anxious
around Alexa. And so they're called smart
152
00:15:42,380 --> 00:15:51,250
or intelligent devices, but sometimes they
give us advice that may not be, shall I
153
00:15:51,250 --> 00:15:57,229
say, judicious and smart, such as
the one you might have heard of: "Alexa's
154
00:15:57,229 --> 00:16:05,520
advice to kill your foster parents". And
also they don't necessarily react to the
155
00:16:05,520 --> 00:16:11,290
person they're supposed to obey, so that
almost every month there is a story
156
00:16:11,290 --> 00:16:16,570
in the newspaper. So this is the story of
Rocco the parrot. And Rocco, when
157
00:16:16,570 --> 00:16:23,070
he's alone at home and his owner has left,
he sings - he imitates very well.
158
00:16:23,070 --> 00:16:28,440
There are videos of Rocco imitating the
voice of Alexa, it's really uncanny. But
159
00:16:28,440 --> 00:16:34,880
he also places orders - his owner doesn't
want him to, but, for example, he added to the
160
00:16:34,880 --> 00:16:39,990
Amazon shopping basket of his owner
broccoli and watermelons and kites and ice
161
00:16:39,990 --> 00:16:47,460
cream. And there are stories like that all
the time in the newspaper, when the TV
162
00:16:47,460 --> 00:16:52,210
presenter talks about, say, "Okay Google
something or Alexa do something", and if
163
00:16:52,210 --> 00:16:58,120
people are watching the TV and Alexa is in
the room Alexa is going to perform this
164
00:16:58,120 --> 00:17:09,080
action or order some goods online. So
as you see this so-called smart system
165
00:17:09,080 --> 00:17:14,250
isn't smart enough to differentiate
between the actual human being that's in
166
00:17:14,250 --> 00:17:19,620
the room that they should obey or
answer to, and the voices that come
167
00:17:19,620 --> 00:17:24,819
through the radio or the TV or the voices
of the pet. So if on the one hand
168
00:17:24,819 --> 00:17:30,261
intelligent machines can be confused, I
have the feeling that as human beings
169
00:17:30,261 --> 00:17:35,790
we are confused when it comes to, you
know, as soon as you throw in the arena
170
00:17:35,790 --> 00:17:41,250
the words artificial intelligence, most
people, not you I'm sure, but most people
171
00:17:41,250 --> 00:17:48,090
are ready to believe the hype. And a good
example of that, I would say, is you know
172
00:17:48,090 --> 00:17:53,760
we keep reading how in China they have
this super sophisticated and highly
173
00:17:53,760 --> 00:17:59,600
efficient networks of CCTV surveillance,
and how they have face recognition systems
174
00:17:59,600 --> 00:18:07,230
that are really really high-end and
highly sophisticated. So that is the
175
00:18:07,230 --> 00:18:11,090
theory but the reality is that the
technology has limitations and also
176
00:18:11,090 --> 00:18:17,640
limitations linked to bureaucracy, and some
of the data in the files hasn't
177
00:18:17,640 --> 00:18:24,120
been digitized. So sometimes the work of
the police is not aided by sophisticated
178
00:18:24,120 --> 00:18:28,610
CCTV and their face recognition software,
but just consists of going through old
179
00:18:28,610 --> 00:18:34,780
paper files and photos. And yet, maybe
it doesn't matter if the
180
00:18:34,780 --> 00:18:39,640
system is not fully functional, because in
the case of surveillance what matters is
181
00:18:39,640 --> 00:18:45,270
that we think that we might at some point
be under the gaze of surveillance. That
182
00:18:45,270 --> 00:18:54,720
might be enough to keep us docile and
compliant. Also, we are being told that
183
00:18:54,720 --> 00:18:58,150
very soon - we just have to wait for them to
be launched on the market - we're going to
184
00:18:58,150 --> 00:19:02,890
have some fantastic autonomous cars that are
going to drive us seamlessly from one part
185
00:19:02,890 --> 00:19:10,140
of the town to the other one. And in
reality, we are still training them, you
186
00:19:10,140 --> 00:19:16,730
and I, when we get these pop up images and
we are asked by the system to identify all
187
00:19:16,730 --> 00:19:21,110
the squares where there is a shop window,
where there is a traffic sign or where
188
00:19:21,110 --> 00:19:27,140
there is another vehicle. Where I'm going
with this is that there are still a lot
189
00:19:27,140 --> 00:19:32,490
of human beings, a lot of human cogs,
behind the machines, because artificial
190
00:19:32,490 --> 00:19:38,360
intelligence is nothing without data, and
data is pretty useless if it's not been
191
00:19:38,360 --> 00:19:44,330
tagged. And so you see the emergence of
data tagging factories in China, and it's
192
00:19:44,330 --> 00:19:49,270
actually the new face of outsourcing in
China, where you have people who spend the
193
00:19:49,270 --> 00:19:55,540
whole day behind computer screens adding
dots and descriptions to images and
194
00:19:55,540 --> 00:20:02,559
photos. And so artificial intelligence, which
was supposed to, you know, free us from
195
00:20:02,559 --> 00:20:10,482
very monotonous, mind-numbing and boring
work is still generating for us a lot of
196
00:20:10,482 --> 00:20:16,760
this boring work. And these invisible
people behind artificial intelligence are
197
00:20:16,760 --> 00:20:22,830
what Amazon, with its Amazon Mechanical
Turk, calls artificial
198
00:20:22,830 --> 00:20:29,430
artificial intelligence. Some call it
fauxtomation. And some of the characteristics of
199
00:20:29,430 --> 00:20:34,980
this hidden human labor of the
digital factory are, one, the
200
00:20:34,980 --> 00:20:39,790
hyper-fragmentation of the work:
because the work is divided into
201
00:20:39,790 --> 00:20:46,080
small chunks, it loses a lot of its
prestige, a lot of its value, so people
202
00:20:46,080 --> 00:20:50,851
are underpaid and what is a bit
frightening is that this kind of work
203
00:20:50,851 --> 00:20:58,430
culture is actually creeping into the
world of creative people and freelancing
204
00:20:58,430 --> 00:21:03,940
and you have platforms such as this one,
called Fiverr. It's a
205
00:21:03,940 --> 00:21:08,890
crowdsourcing platform where you can buy
the services of a person who is going to
206
00:21:08,890 --> 00:21:13,570
design for you the logo of your new
company, or a birthday card, or a business
207
00:21:13,570 --> 00:21:20,280
card, or do translation for you or shoot a
testimonial for your products and there is
208
00:21:20,280 --> 00:21:24,840
one example there of someone who advertises
that they will design for you two
209
00:21:24,840 --> 00:21:32,340
magnificent logos in 48 hours. And the
price is set at five dollars, that's
210
00:21:32,340 --> 00:21:38,850
very low. And the characteristic
of this kind of company, of this kind of
211
00:21:38,850 --> 00:21:46,580
crowdsourced so-called services, is that
they ruthlessly rely on precarity. And
212
00:21:46,580 --> 00:21:53,809
Fiverr recently had these ads, where
they were basically glorifying the idea of
213
00:21:53,809 --> 00:22:02,820
working yourself to death. And then this.
Well, I'm moving on to the third work I really
214
00:22:02,820 --> 00:22:08,930
liked in 2018. It was released a few months
ago. It's a work by Michael Mandiberg and
215
00:22:08,930 --> 00:22:17,040
he wanted to make a portrait of the hidden
human being behind the digital factory.
216
00:22:17,040 --> 00:22:23,920
And to do so he really wanted to recreate,
or to modernize, the iconic film by Charlie
217
00:22:23,920 --> 00:22:30,010
Chaplin, Modern Times. It's a film from
1936, which was, you know, the seventh
218
00:22:30,010 --> 00:22:34,760
year of the Great Depression. People
were struggling, being unemployed,
219
00:22:34,760 --> 00:22:43,216
starving and having difficulty getting through
life. And Michael Mandiberg made an
220
00:22:43,216 --> 00:22:49,670
update of this film, by... First of all he
wrote the code and the code cut the film
221
00:22:49,670 --> 00:22:55,650
of Charlie Chaplin into very short clips
of a few seconds and then Michael
222
00:22:55,650 --> 00:22:59,966
Mandiberg contacted people. He
found people on this platform,
223
00:22:59,966 --> 00:23:04,880
Fiverr and he asked them, he gave
each of them a short clip. And
224
00:23:04,880 --> 00:23:12,250
he asked them to recreate it as well
as they could. And that's what they did.
225
00:23:12,250 --> 00:23:17,620
And Michael Mandiberg - I'm going to show
an extract - yeah, you will see, he
226
00:23:17,620 --> 00:23:22,390
obviously had no control over the location, he
had no control over the production
227
00:23:22,390 --> 00:23:31,260
choices. So hopefully I'm going to find
the correct video and not a fitness video.
228
00:23:31,260 --> 00:23:32,910
So here's an extract.
229
00:23:32,910 --> 00:25:19,836
music plays
230
00:25:19,836 --> 00:25:29,170
R: I think you've got the point here. Good.
So with this work Michael Mandiberg tried
231
00:25:29,170 --> 00:25:34,760
to do a kind of portrait of this hidden
digital factory and of its human cogs.
232
00:25:34,760 --> 00:25:39,850
But the portrait is done by the
freelancers, by the workers themselves and
233
00:25:39,850 --> 00:25:45,580
you can see the result is kind of
messy and chaotic, geographically dispersed
234
00:25:45,580 --> 00:25:56,174
and hyper-fragmented - these are the
characteristics of the digital factory. So
235
00:25:56,174 --> 00:26:03,049
carrying on with the work that is hidden,
I'd like to speak briefly about social
236
00:26:03,049 --> 00:26:09,220
media content moderators. Maybe you know
this story, because in
237
00:26:09,220 --> 00:26:15,159
Germany, I was listening to an interview
with a woman who had worked as a social
238
00:26:15,159 --> 00:26:23,549
media content moderator for Facebook in
Germany and she was explaining what kind
239
00:26:23,549 --> 00:26:29,150
of work she was doing. It doesn't happen
very often that you get this kind of testimony, because
240
00:26:29,150 --> 00:26:33,310
usually the workers have to sign a
nondisclosure agreement. But she was
241
00:26:33,310 --> 00:26:39,640
explaining how difficult the work is,
because every single day, each of the
242
00:26:39,640 --> 00:26:47,030
moderators has to go through 1300 tickets, so
1300 cases of content that have been
243
00:26:47,030 --> 00:26:52,820
flagged as inappropriate or disturbing,
either automatically or by other users.
244
00:26:52,820 --> 00:26:58,690
So you know that doesn't leave a lot of
time to have a thorough analysis.
245
00:26:58,690 --> 00:27:05,890
And indeed she was explaining that human
reflection is absolutely not encouraged and
246
00:27:05,890 --> 00:27:10,810
that in the whole process of
judging, you have only a few seconds to
247
00:27:10,810 --> 00:27:14,950
judge whether or not the content has
to be blocked or can stay on
248
00:27:14,950 --> 00:27:19,640
Facebook. And the whole process of
reflection is actually reduced to a series
249
00:27:19,640 --> 00:27:25,420
of automatisms. And the interesting part
for me was when she was explaining that at
250
00:27:25,420 --> 00:27:30,170
night she was still dreaming of her work. But
where some of her co-workers were dreaming
251
00:27:30,170 --> 00:27:36,860
of the kind of horrible content they had to
moderate, she was explaining how she was
252
00:27:36,860 --> 00:27:42,920
actually dreaming of doing the same
automatic gesture over and over again. And
253
00:27:42,920 --> 00:27:49,309
that's how you realize that all these
big companies, they wish they could replace
254
00:27:49,309 --> 00:27:54,360
workers by machines, by algorithm, by
robots. But some of the tasks
255
00:27:54,360 --> 00:27:59,020
unfortunately the robots or the
algorithms cannot do yet. So they have
256
00:27:59,020 --> 00:28:05,900
to rely on human beings, with all
their unreliability and propensity
257
00:28:05,900 --> 00:28:13,270
to contest and rebel, and maybe
laziness, and all their human defects. So
258
00:28:13,270 --> 00:28:19,140
the solution they found was to. Sorry?
Okay I thought someone had asked me
259
00:28:19,140 --> 00:28:26,240
something. Where was I? So the solution,
yeah, the solution they found was to reduce
260
00:28:26,240 --> 00:28:31,850
these people to machines and automate
their work as much as possible. And so
261
00:28:31,850 --> 00:28:38,440
that's the work of a Facebook content
moderator. And just as an aside, as you
262
00:28:38,440 --> 00:28:44,710
can see, their work ambience and
environment is different from the fancy
263
00:28:44,710 --> 00:28:50,890
and glamorous image we see in architecture
magazines each time one of these big media
264
00:28:50,890 --> 00:28:55,970
companies opens up headquarters
somewhere in the world. But anyway, to come
265
00:28:55,970 --> 00:29:00,960
back to the work of Facebook content
moderator I think the way the work is
266
00:29:00,960 --> 00:29:10,920
automated can be compared to the way the
work of the employees in Amazon warehouses
267
00:29:10,920 --> 00:29:19,419
is being automated. As you know,
they have to wear this kind of mini-
268
00:29:19,419 --> 00:29:25,060
computer on their wrist that identifies
them, that tracks them, that gives them all
269
00:29:25,060 --> 00:29:34,020
the... that monitors them, measures their
productivity in real time. And also, like, it
270
00:29:34,020 --> 00:29:40,480
allows Amazon to really remote control
them and so they work in a pretty
271
00:29:40,480 --> 00:29:46,940
alienating environment. Sometimes patents from
Amazon pop up online, and I feel this
272
00:29:46,940 --> 00:29:52,540
one is particularly creepy, because they
were thinking of, you know,
273
00:29:52,540 --> 00:29:58,500
making this wrist computer even
more sophisticated and using haptic
274
00:29:58,500 --> 00:30:02,830
feedback so that they could direct and
remote control even more precisely the
275
00:30:02,830 --> 00:30:10,580
hands of the workers. So you see there is
this big push, in certain contexts,
276
00:30:10,580 --> 00:30:15,419
towards turning humans into robots. And
you know we're always mentioning Amazon,
277
00:30:15,419 --> 00:30:22,160
but this is the kind of work culture that
is being adopted elsewhere - in France, but
278
00:30:22,160 --> 00:30:28,840
also in other European countries. There
are services such as Auchan Drive. So
279
00:30:28,840 --> 00:30:32,280
if you're a customer and you don't
want to go to the supermarket to do your
280
00:30:32,280 --> 00:30:36,960
groceries, you do your groceries online
and 10 minutes later you arrive on the
281
00:30:36,960 --> 00:30:42,530
parking lot of your supermarket and
someone will be there with your grocery
282
00:30:42,530 --> 00:30:49,100
bags ready. And I was listening to
testimonies by people who used to work
283
00:30:49,100 --> 00:30:53,940
at Auchan Drive, and as you can see they
all wear the same microcomputer
284
00:30:53,940 --> 00:30:59,140
that is also precisely monitoring all
their movements. And the thing that I found
285
00:30:59,140 --> 00:31:03,390
most disturbing was when they were
explaining that, when they get
286
00:31:03,390 --> 00:31:08,220
the list of items they have to pick up,
they don't get, you know, a list that says
287
00:31:08,220 --> 00:31:13,660
you have to take three bananas and two
packs of soy milk. No they just get
288
00:31:13,660 --> 00:31:20,000
numbers: one that says go to this aisle,
another number that says this is
289
00:31:20,000 --> 00:31:24,780
the shelf where you have to take the item
and the number corresponding to the
290
00:31:24,780 --> 00:31:29,530
precise item on the shelf. And they are
also subjected to very strict rules of
291
00:31:29,530 --> 00:31:35,929
being as fast as possible and I shouldn't
laugh, but they were explaining the kind
292
00:31:35,929 --> 00:31:41,390
of vengeance their boss would take
if they rebelled. And if they are
293
00:31:41,390 --> 00:31:45,790
particularly rebellious, they send them
into the frozen food section, so they spend the day in
294
00:31:45,790 --> 00:31:54,580
the cold. And now I'd like to have a short
look with you at people who also want to
295
00:31:54,580 --> 00:31:59,330
be closer to machines and robots, but
according to their own terms - they want
296
00:31:59,330 --> 00:32:05,540
to have total control over it, and it's
something they do voluntarily. It's the
297
00:32:05,540 --> 00:32:10,610
community of so-called grinders - these
people who don't hesitate to slice open
298
00:32:10,610 --> 00:32:15,940
their bodies to insert chips and
devices and magnets into them, to either
299
00:32:15,940 --> 00:32:22,150
facilitate their life or expand their
range of perception. And actually the
300
00:32:22,150 --> 00:32:27,919
first time I heard about them was at
the Chaos Communication Congress back a
301
00:32:27,919 --> 00:32:36,950
long time ago, in 2006, when the journalist
Quinn Norton had a magnet installed under
302
00:32:36,950 --> 00:32:41,750
her finger and she could sense the
electromagnetic fields. I found that very
303
00:32:41,750 --> 00:32:47,950
fascinating. But as you can see, the
aesthetic is pretty rough and gritty.
304
00:32:47,950 --> 00:32:52,670
It's a vision of a world to
come and of a communion with the machine
305
00:32:52,670 --> 00:32:58,789
that is aesthetically very
different from the one sold to us by
306
00:32:58,789 --> 00:33:04,290
Silicon Valley. But I find it equally
fascinating how they take control of the
307
00:33:04,290 --> 00:33:12,580
way they want to be closer to
technology. Yes. So in the third part of
308
00:33:12,580 --> 00:33:17,390
my talk, the first part was about this
confusion we have between the intelligence
309
00:33:17,390 --> 00:33:22,910
of machines and human intelligence. The
second part was about the robotisation of
310
00:33:22,910 --> 00:33:28,059
the human body. And in the third and
last part I would like to have a look
311
00:33:28,059 --> 00:33:35,560
at another type of hybridisation of
bodies, but this time not between the human body
312
00:33:35,560 --> 00:33:44,470
and the machine, but between the human body and other
animal species. We have actually
313
00:33:44,470 --> 00:33:51,000
been tinkering with our bodies for
decades, since the 60s when women started
314
00:33:51,000 --> 00:33:58,740
taking the contraceptive pill. And also
people who want to change
315
00:33:58,740 --> 00:34:04,450
gender, they also tinker with their bodies.
And the first transgender celebrity, I
316
00:34:04,450 --> 00:34:11,099
recently found out, was George Jorgensen
Jr., who started adult life as a G.I.
317
00:34:11,099 --> 00:34:18,099
during World War 2 and ended it as a
female entertainer in the US. And while
318
00:34:18,099 --> 00:34:22,089
assembling the slides I could not resist
adding this one because I saw it a couple
319
00:34:22,089 --> 00:34:28,119
of weeks ago in an exhibition - the World
Press Photo Exhibition. And it's a photo
320
00:34:28,119 --> 00:34:34,080
that was taken in Thailand, and Thailand is
actually one of the top
321
00:34:34,080 --> 00:34:38,630
tourist destinations for medicine. People
fly to Thailand because you can get
322
00:34:38,630 --> 00:34:47,300
very good health care and pretty
affordable surgery. And one of the hot
323
00:34:47,300 --> 00:34:53,680
niches in surgery for this kind of tourist
is actually gender reassignment surgery.
324
00:34:53,680 --> 00:35:01,390
So in this photo you have, the operation
is finished and the surgeon is showing to
325
00:35:01,390 --> 00:35:10,820
the patient her new vagina. And
before undergoing this surgery the
326
00:35:10,820 --> 00:35:16,119
patient of course had to go through a
course of hormones - had to take a
327
00:35:16,119 --> 00:35:22,210
series of hormones, and one of them is
estrogen. And estrogen is the primary
328
00:35:22,210 --> 00:35:28,670
female sex hormone, the hormone
responsible for the feminization of the
329
00:35:28,670 --> 00:35:36,600
body and also sometimes of the emotions.
And, if you are a guy in the room, you're
330
00:35:36,600 --> 00:35:39,680
probably thinking, well I'm going to leave
the room because I don't care about
331
00:35:39,680 --> 00:35:43,440
estrogen. What does it have to do with me?
And I'm going to spend the next few
332
00:35:43,440 --> 00:35:47,260
minutes trying to convince you that you
should care about estrogen.
333
00:35:47,260 --> 00:35:53,160
Because even fish care about estrogen, it
turns out. Or at least biologists do. You
334
00:35:53,160 --> 00:35:57,380
might have heard a few years ago that
biologists started to get very worried
335
00:35:57,380 --> 00:36:04,160
because they discovered that some fish
populations, especially the male ones, were
336
00:36:04,160 --> 00:36:11,110
getting feminized, that their sexual organs
were sometimes both male and female, and
337
00:36:11,110 --> 00:36:16,080
they were worried about it and they
discovered that it was the result of
338
00:36:16,080 --> 00:36:21,500
contact with xenoestrogens. And
xenoestrogens are everywhere in the
339
00:36:21,500 --> 00:36:27,619
environment, and they affect not only fish
but all animal species, including the human
340
00:36:27,619 --> 00:36:32,340
species. So when I say that they are
everywhere in the environment, it's really
341
00:36:32,340 --> 00:36:41,410
everywhere. - Only 10 minutes left! - They are
very often the result of industrial
342
00:36:41,410 --> 00:36:46,430
processes, so for example you can find them
in recycled plastic, in lacquers, in beauty
343
00:36:46,430 --> 00:36:54,050
products, in products you use to clean the
house, in weed killers, insecticides, in
344
00:36:54,050 --> 00:36:58,020
PVC, they are pretty much everywhere.
Sometimes they are released by
345
00:36:58,020 --> 00:37:04,490
industries into the water stream. And I
wanted to show you some abstract art again.
346
00:37:04,490 --> 00:37:08,410
It's a work by Juliette Bonneviot, and she
was really interested in these
347
00:37:08,410 --> 00:37:12,750
xenoestrogens. So she looked around her in
the environment where they were present
348
00:37:12,750 --> 00:37:19,400
and she took some of these
xenoestrogen sources I've just mentioned,
349
00:37:19,400 --> 00:37:24,910
and she separated them according to colors,
and then she crushed them and turned them into
350
00:37:24,910 --> 00:37:31,080
powder. She then mixed them with PVC and
silicone, which also contain xenoestrogens,
351
00:37:31,080 --> 00:37:36,400
and then poured them onto canvas. And then
you have this kind of portrait of the
352
00:37:36,400 --> 00:37:43,670
silent colonizing chemicals that are
modifying our bodies. And the kinds of things
353
00:37:43,670 --> 00:37:48,150
they can do to the human body have already
been observed. So they're linked to higher
354
00:37:48,150 --> 00:37:56,200
cases of cancer in both men and women,
lowering of IQ, fertility problems,
355
00:37:56,200 --> 00:38:05,280
early onset of puberty, etc. So we
should really be more vigilant around
356
00:38:05,280 --> 00:38:13,800
xenoestrogens, and in the next few minutes
I'd like to mention briefly two projects
357
00:38:13,800 --> 00:38:20,940
where women artists explore how
their hormones, or more generally their female
358
00:38:20,940 --> 00:38:29,690
bodies, can be used to have other kinds of
relationships with non-human animals.
359
00:38:29,690 --> 00:38:35,900
So the first is a project by Ai Hasegawa.
She was in her early 30s and she had this
360
00:38:35,900 --> 00:38:41,140
dilemma. She thought maybe, maybe it's
time to have a child but do I really want
361
00:38:41,140 --> 00:38:46,530
to have a child because it's a lot of
responsibility and there is already
362
00:38:46,530 --> 00:38:50,540
overpopulation. But on the other hand if I
don't have a child it means that all these
363
00:38:50,540 --> 00:38:55,500
painful periods will have been in
vain. She was also faced with another
364
00:38:55,500 --> 00:39:01,280
dilemma, which is that she really likes to eat
endangered species, and in particular
365
00:39:01,280 --> 00:39:07,400
dolphin and shark and whale. So on the
one hand she liked eating them, and on the
366
00:39:07,400 --> 00:39:13,280
other hand she realized that she was
contributing to their disappearance and
367
00:39:13,280 --> 00:39:18,360
to their extinction. So she found
this solution - and this is a speculative
368
00:39:18,360 --> 00:39:24,190
project - but she saw it as a solution:
maybe she could give birth to an
369
00:39:24,190 --> 00:39:28,660
endangered species, maybe she could
become the surrogate mother of a dolphin.
370
00:39:28,660 --> 00:39:36,820
So using her belly as a kind of incubator
for another animal. And to do this she
371
00:39:36,820 --> 00:39:41,520
consulted with an expert in synthetic
biology and an expert in obstetrics, who
372
00:39:41,520 --> 00:39:48,830
told her that technically it's possible,
that she would just have to resort to
373
00:39:48,830 --> 00:39:55,180
synthetic biology to modify the placenta
of the baby. She would not have to modify
374
00:39:55,180 --> 00:39:59,680
her own body, she just had to modify
the placenta of the baby, to be sure
375
00:39:59,680 --> 00:40:04,750
that all the nutrients and the hormones
and the oxygen and everything is
376
00:40:04,750 --> 00:40:08,660
communicated between the mother and the
baby - except the antibodies, because they
377
00:40:08,660 --> 00:40:13,490
could damage the baby. So in her scenario,
if everything goes well, she gives
378
00:40:13,490 --> 00:40:20,800
birth to a lovely healthy baby
Maui dolphin. And she gives it the
379
00:40:20,800 --> 00:40:27,609
antibodies through fatty milk containing
these antibodies, and she sees it grow. And
380
00:40:27,609 --> 00:40:32,760
once it's an adult it could be released
with a chip under its skin. And then
381
00:40:32,760 --> 00:40:38,110
once it's fished, she can buy it and
eat it and have it again inside her own
382
00:40:38,110 --> 00:40:44,470
body. Well, when I first heard about the
project, I can tell you I was not
383
00:40:44,470 --> 00:40:48,390
laughing. I thought it was really really
revolting. But the more I thought about
384
00:40:48,390 --> 00:40:54,820
it, the smarter this project appeared,
because I understood that sadly it doesn't make
385
00:40:54,820 --> 00:40:59,410
a lot of sense to give birth to yet
another human being that is going to put
386
00:40:59,410 --> 00:41:04,790
strain on this planet. So why not dedicate
your reproductive system to the
387
00:41:04,790 --> 00:41:11,620
environment and use it to help save
endangered species? And I have five minutes
388
00:41:11,620 --> 00:41:17,859
left so I have to make some really really
drastic choices. I wanted to talk about
389
00:41:17,859 --> 00:41:24,010
Maja Smrekar. I'm going to do it. So she
had a series of projects, where she is
390
00:41:24,010 --> 00:41:31,040
exploring the co-evolution of dogs and
humans. And to make it very very brief, one
391
00:41:31,040 --> 00:41:36,630
of her performances consisted in spending
four months in a flat in Berlin with her
392
00:41:36,630 --> 00:41:41,540
two dogs: one of them an adult, Byron,
and the other one a puppy called Ada.
393
00:41:41,540 --> 00:41:46,510
And she tricked her body by some
physiological training and also by eating
394
00:41:46,510 --> 00:41:52,820
certain foods and using a breast pump. She
tricked her body into stimulating the hormones
395
00:41:52,820 --> 00:41:56,869
that make the body of a woman lactate
even when she isn't pregnant, so that she
396
00:41:56,869 --> 00:42:03,410
could breastfeed her puppy. And these two
projects, I thought, are really
397
00:42:03,410 --> 00:42:08,859
interesting, because they show the power of
the female body. They show that what
398
00:42:08,859 --> 00:42:14,140
prevents women from having this kind of
trans-species motherhood isn't the
399
00:42:14,140 --> 00:42:19,960
technology. It's just society
and the kind of limits that our culture
400
00:42:19,960 --> 00:42:26,720
puts on what a woman can or should do
with her body. And I also like these
401
00:42:26,720 --> 00:42:32,190
projects, because they kind of question our
anthropocentrism and our tendency to think
402
00:42:32,190 --> 00:42:37,859
that all the resources of the world are made
for us and not for other living creatures.
403
00:42:37,859 --> 00:42:43,470
So just as a parenthesis: if you're
interested in estrogen and xenoestrogens,
404
00:42:43,470 --> 00:42:49,860
I would recommend that you have
a look at the talk that Mary Maggic gave last
405
00:42:49,860 --> 00:42:56,020
year at the Chaos Communication Congress,
where she was talking about estrogen and
406
00:42:56,020 --> 00:43:01,900
its history and the links
to her work as a biohacker. So now my
407
00:43:01,900 --> 00:43:06,480
conclusion... maybe you're already wondering
why is she talking to us about artificial
408
00:43:06,480 --> 00:43:12,965
intelligence and transspecies motherhood?
What does it have to do with each other? I
409
00:43:12,965 --> 00:43:19,349
would say: a lot! Because with the
digital world, sometimes
410
00:43:19,349 --> 00:43:24,490
we tend to forget that behind it
there is a material world. To have
411
00:43:24,490 --> 00:43:29,210
artificial intelligence, you need
the infrastructure, you need the devices,
412
00:43:29,210 --> 00:43:35,109
you need the server farms, you need spaces
to manage this data. Physical spaces.
413
00:43:35,109 --> 00:43:40,540
And I would say it's a bit like the human
brain. The human brain is, in dimensions,
414
00:43:40,540 --> 00:43:47,040
pretty small, but apparently it eats
up a fifth of all the energy that our body
415
00:43:47,040 --> 00:43:53,560
consumes. And so that means that artificial
intelligence needs a very
416
00:43:53,560 --> 00:43:59,810
heavy infrastructure: energy-hungry
server farms. And I'm sure you've seen all
417
00:43:59,810 --> 00:44:04,816
the PR stunts and the press releases of
Amazon, Google, Facebook, etc. promising
418
00:44:04,816 --> 00:44:09,450
you that they're transitioning, that
they're going to use green energy. But the
419
00:44:09,450 --> 00:44:16,241
energy is still not green and they're still
using a lot of fossil fuels to power
420
00:44:16,241 --> 00:44:22,210
their server farms and to make the devices
that we use. We still need minerals that
421
00:44:22,210 --> 00:44:25,930
have to be dug up from the ground,
sometimes in really horrible conditions.
422
00:44:25,930 --> 00:44:31,400
This is a coltan mine in the Democratic
Republic of the Congo. The minerals we
423
00:44:31,400 --> 00:44:36,890
use, well, they are not infinite. And
unless - and it's not for tomorrow - unless we
424
00:44:36,890 --> 00:44:42,210
go and get these minerals from asteroids
or from the deep sea, we are going to get
425
00:44:42,210 --> 00:44:47,109
into trouble very very fast. And I don't
have the time for this one, but trust me,
426
00:44:47,109 --> 00:44:52,849
the resources are not infinite. And then,
refining them and producing them, that's very
427
00:44:52,849 --> 00:44:59,900
very damaging for the environment. Yes. So
I have the feeling sometimes that we are a
428
00:44:59,900 --> 00:45:06,750
bit like the golfers in this image, that
when I go to some, not all of them, but
429
00:45:06,750 --> 00:45:12,050
some tech conferences or art and tech
conferences, I have the feeling that we
430
00:45:12,050 --> 00:45:17,190
are kind of complacent and that
our vision of the future may be a bit
431
00:45:17,190 --> 00:45:21,710
narrow-minded. And maybe that's
normal. I guess it's like, if you go to
432
00:45:21,710 --> 00:45:25,800
someone who's working a fancy job in
Silicon Valley and you ask him: "What's
433
00:45:25,800 --> 00:45:29,800
your vision of tomorrow? What's your
vision of the future?" And then you ask
434
00:45:29,800 --> 00:45:34,060
the same question to someone else, for
example an activist for Greenpeace, then
435
00:45:34,060 --> 00:45:39,280
you get a completely different
perspective, a very different answer about
436
00:45:39,280 --> 00:45:47,380
what the future might be. So sometimes I'm
wondering also, if we are not too obsessed
437
00:45:47,380 --> 00:45:51,765
with technology and too obsessed with
what I would call techno fixes. This
438
00:45:51,765 --> 00:45:57,450
tendency we have to see a problem and to
think, if we throw more technology on top
439
00:45:57,450 --> 00:46:01,870
of it we are going to solve it. Even if
the problem has been created in the first
440
00:46:01,870 --> 00:46:06,460
place by technology. So, that's why we get
extremely excited and I do get excited
441
00:46:06,460 --> 00:46:11,073
about the prospect that maybe we
will get a baby mammoth that will be
442
00:46:11,073 --> 00:46:17,140
resurrected. And at the same time
we don't take care of the species that are
443
00:46:17,140 --> 00:46:21,790
going extinct, you know, every single
day, every 24 hours. There are something
444
00:46:21,790 --> 00:46:28,210
like between 150 and 200 types of plants,
animals, insects that disappear and that
445
00:46:28,210 --> 00:46:34,470
we'll never see again. Every single day
they disappear around the world. And yet
446
00:46:34,470 --> 00:46:38,070
we still get excited about the idea of
resurrecting the baby mammoth, the
447
00:46:38,070 --> 00:46:44,530
passenger pigeon or the dodo. We still create
and breed creatures so that we can exploit
448
00:46:44,530 --> 00:46:50,060
them even better. Should we be looking
forward to the lab-grown meat that is
449
00:46:50,060 --> 00:46:55,900
promised to us? I mean, we are told it is going
to be cruelty-free and guilt-free, when
450
00:46:55,900 --> 00:47:02,320
in reality it is not totally guilt-
free and cruelty-free. And I don't think that
451
00:47:02,320 --> 00:47:07,490
it is the best solution to solve the
horrible impact that the meat industry is
452
00:47:07,490 --> 00:47:12,790
having on the environment, on our health
and on the well-being of animals. I mean
453
00:47:12,790 --> 00:47:17,940
there is a solution. It's not the sexy
one. It's not a techie one. It's to
454
00:47:17,940 --> 00:47:25,020
adopt a plain plant-based diet. And I
managed to do a bit of vegan propaganda
455
00:47:25,020 --> 00:47:34,090
here. - Ah, there are a few vegans in the
room! - Should we get
456
00:47:34,090 --> 00:47:42,800
excited because someone in Japan has made
some tiny drones with, on top of them,
457
00:47:42,800 --> 00:47:47,580
horse hair, that are used to pollinate
flowers? Because everywhere around the
458
00:47:47,580 --> 00:47:54,200
world the population of bees is collapsing
and that is very bad for our food system.
459
00:47:54,200 --> 00:48:01,250
So should we, should we use geo-
engineering to solve our climate troubles?
460
00:48:01,250 --> 00:48:06,260
And I will end with this slide of what
is for me, you know, the
461
00:48:06,260 --> 00:48:12,329
service, which is really the
embodiment of techno fixes. So, you
462
00:48:12,329 --> 00:48:20,460
probably know that California has gone
through some very bad periods of dry
463
00:48:20,460 --> 00:48:25,560
weather, so rich people wake up and they
see that the grass on their magnificent
464
00:48:25,560 --> 00:48:30,140
lawn is yellow instead of being green. So,
now in California you have services where
465
00:48:30,140 --> 00:48:34,109
you can call someone and they will just,
you know, fix the problem by painting the
466
00:48:34,109 --> 00:48:39,450
grass green. So there you have it and,
you know, fuck you Anthropocene and global
467
00:48:39,450 --> 00:48:45,868
warming, my lawn is green!
468
00:48:45,868 --> 00:48:52,650
Applause
469
00:48:52,650 --> 00:48:57,250
Okay, so this is why I wanted to
bring all these really contrasting visions
470
00:48:57,250 --> 00:49:02,109
together because we might have different
visions of the future but at some point
471
00:49:02,109 --> 00:49:08,770
they will have to dialogue because we have
only one humanity and one planet. And I'm
472
00:49:08,770 --> 00:49:15,890
very very bad at conclusions. So I
wrote it down. So the artists whose work
473
00:49:15,890 --> 00:49:22,670
made 2018 a truly exciting year
for me are not the artists who showcase the
474
00:49:22,670 --> 00:49:28,230
magic and the wonders of science and
technology. They are the artists who bring
475
00:49:28,230 --> 00:49:33,619
to the table realistic, complex and
sometimes also difficult conversations
476
00:49:33,619 --> 00:49:39,040
about the future, whether it is the future
of technology, the future of humans, or the
477
00:49:39,040 --> 00:49:43,830
future of other forms of life and other
forms of intelligence. They are the
478
00:49:43,830 --> 00:49:48,369
artists who try and establish a dialogue
between the organic, the digital and the
479
00:49:48,369 --> 00:49:53,510
mineral because they're all part of our
world and we cannot go on pretending that
480
00:49:53,510 --> 00:49:57,518
they don't affect each other. Thank you so
much.
481
00:49:57,518 --> 00:50:14,617
Applause
482
00:50:14,617 --> 00:50:20,010
Herald: Thank you Régine for the very
interesting, at times a bit confusing, talk
483
00:50:20,010 --> 00:50:29,849
- I'm still thinking about the
dolphin part, but anyway. By the way,
484
00:50:29,849 --> 00:50:34,490
this grass painting thing is maybe
something that I can apply to
485
00:50:34,490 --> 00:50:39,970
my house. Okay. We have questions at
microphone 2, I guess. So let's start
486
00:50:39,970 --> 00:50:42,970
there.
Question: Hi. I have a question on a
487
00:50:42,970 --> 00:50:48,810
particular part at the beginning you
talked about AI in arts and you mentioned
488
00:50:48,810 --> 00:50:57,429
that there are now AI programs that draw
pictures or write texts, but have you heard
489
00:50:57,429 --> 00:51:04,810
about AI developing ideas for say an art
installation?
490
00:51:04,810 --> 00:51:11,570
Régine: Yes, as a matter of fact, I think
tonight... I mean, if you look at the program
491
00:51:11,570 --> 00:51:17,720
- and maybe tonight or tomorrow
night - there is a talk by Maria and
492
00:51:17,720 --> 00:51:23,960
Nicolas. They're two artists and I think
that the title might be Disnovation,
493
00:51:23,960 --> 00:51:28,010
and I think you might like what
they present. I don't know what they're
494
00:51:28,010 --> 00:51:33,150
going to present, but what I know is that
one of their works... I forget the name... if
495
00:51:33,150 --> 00:51:38,540
they're in the room that would be
fantastic. But they had a project where
496
00:51:38,540 --> 00:51:45,339
they have a bot. It's on Twitter and it's
going through some blogs and newspapers
497
00:51:45,339 --> 00:51:51,769
about creativity and also about
technology and using all these data to
498
00:51:51,769 --> 00:51:59,839
generate really crazy, stupid titles for
installations. And then the artists
499
00:51:59,839 --> 00:52:07,339
challenged other artists to take these
tweets, which are really crazy, and make an
500
00:52:07,339 --> 00:52:14,020
installation out of them. So that's
a kind of tricky way of AI being used
501
00:52:14,020 --> 00:52:18,390
for installations, I'm sure. Like
right now I cannot think of
502
00:52:18,390 --> 00:52:24,270
anything else but I mean if you
want you can write me and when my brain
503
00:52:24,270 --> 00:52:34,081
is switched back on, probably
I'll have other ideas.
504
00:52:34,081 --> 00:52:42,579
Herald: OK. Any more questions? I don't
see any... Ah, there, over there.
505
00:52:42,579 --> 00:52:48,650
Microphone 4 please.
Question: Yeah. I was wondering if, well,
506
00:52:48,650 --> 00:52:54,310
it's probably more certain that
we're developing into a more, let's say, posthuman race
507
00:52:54,310 --> 00:53:01,369
because we simply have to, due to climate
change. There are also developments right
508
00:53:01,369 --> 00:53:08,359
now where in the relatively short term we
would go to Mars. And in a sort of sense, do
509
00:53:08,359 --> 00:53:16,160
we need to fit the human race for
possibly multiple planets with modern
510
00:53:16,160 --> 00:53:21,787
human modification.
Régine: Okay, I didn't understand the
511
00:53:21,787 --> 00:53:26,710
question.
Herald: So, please repeat.
512
00:53:26,710 --> 00:53:35,099
Question: So in general we're going towards a
human... a posthuman race that definitely...
513
00:53:35,099 --> 00:53:43,010
we're not able to survive on this planet
for that long anymore. Really optimistic.
514
00:53:43,010 --> 00:53:52,750
I am vegan, so yay. And we have
some new developments going on, so that we'll
515
00:53:52,750 --> 00:53:58,110
be able to go to Mars relatively soon.
Régine: I don't believe it. Who is going
516
00:53:58,110 --> 00:54:02,200
to want to go to that planet like
seriously, like it's going to be like
517
00:54:02,200 --> 00:54:07,570
Australia: we are going to send prisoners
and criminals there. Who wants to be like...
518
00:54:07,570 --> 00:54:15,803
Like, come on. Yeah. Anyway, now I
see what you mean. I think I'm kind of
519
00:54:15,803 --> 00:54:20,589
more optimistic about the future
than you are. I think we can still survive
520
00:54:20,589 --> 00:54:26,599
on this planet even if we get very
numerous. We just have to find another
521
00:54:26,599 --> 00:54:30,500
way to consume and take care of each
other. But maybe this is my feminine side
522
00:54:30,500 --> 00:54:37,630
talking.
Question: Maybe in general without a lot
523
00:54:37,630 --> 00:54:45,030
of modification to the human
beings it's simply not possible.
524
00:54:45,030 --> 00:54:53,070
Certainly. I think that's a common ground.
And yeah I sort of wish we didn't need a
525
00:54:53,070 --> 00:54:59,290
planet B but I think we do.
Régine: Well, I hope I'll be dead when that
526
00:54:59,290 --> 00:55:06,179
comes. That's my philosophy sometimes.
Okay.
527
00:55:06,179 --> 00:55:14,500
Herald: I don't see any more questions so
let's thank the speaker again. Thank you.
528
00:55:14,500 --> 00:55:18,144
Applause
529
00:55:18,144 --> 00:55:31,670
35C3 postroll music
530
00:55:31,670 --> 00:55:41,000
Subtitles created by c3subtitles.de
in the year 2020. Join, and help us!