35C3 preroll music

Herald Angel: The next talk will be given by Régine Debatty, and she will be talking about the good, the strange and the ugly in art, science and technology in 2018. A big round of applause for Régine!

Applause

Sound from video

Régine: This was not supposed to happen, I am sorry. Hello everyone. My name is Régine Debatty. I'm an art critic and blogger. Since 2004 I've been writing "we make money not art", a blog that looks at the way artists, designers and hackers use science and technology in a critical and socially engaged way. A few weeks ago some of the organizers of the Congress emailed me and asked whether I could give a kind of overview of the most exciting and interesting works in art and technology in 2018. And on Monday, when I started working on my slides, I was really on a mission to deliver, fully intending to give an overview of the best of art and tech in 2018. But somehow things got out of hand. I started inserting works that are a bit older, and I could see a kind of narrative emerging. So the result is a presentation where I will still show some of the works I found very interesting over this past year, but they are embedded in a broader narrative. And this narrative looks at the way artists nowadays are using science and technology to explore and expand the boundaries and limits of the human body, but also to challenge intelligence, or at least the understanding we have of human intelligence.

So, Exhibit 1: it's "Face Detected". I wouldn't say it was one of my favourite works from 2018, but it's definitely by one of my favourite artists, and that's why it's in there. The setting is quite easy to understand. Dries Depoorter, that's the name of the artist, had a solo show a couple of months ago at a new art space in Andover. He installed blocks of clay on tables and invited sculptors to come and do portraits of people, gradually shaping their heads. And of course the process was followed by the audience, but it was also followed by face recognition software. As soon as the software had detected a face in the sculpture, the artist would receive a message saying: "Stop, we have detected a human face."
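The trigger described here is easy to prototype with off-the-shelf tools. What follows is only a minimal sketch, assuming OpenCV's bundled Haar cascade and a webcam pointed at the clay head; the software actually used in the show isn't documented in the talk, so every name below is illustrative:

```python
# Minimal sketch: watch a camera feed and flag the moment a face is detected.
# Assumes OpenCV's bundled Haar cascade; the artwork's actual software is unknown.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # webcam pointed at the sculpture
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # scaleFactor and minNeighbors are exactly the kind of per-system
    # parameters that make one detector "see" a face sooner than another.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        print("Stop, we have detected a human face.")
        break
cap.release()
```

Lowering minNeighbors, for instance, makes the detector fire earlier on a half-finished head, which is one plausible reason why, as described next, different systems stopped different sculptors at different moments.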
And what is interesting is that -

laughter

- there are several... I mean, the results vary according to the face recognition system, because they work with different parameters and they've been programmed by different programmers. Some are faster than others at recognizing a face; some have different parameters than others. That's why the results are always so different. But the reason I found this project worth mentioning is that it shows the increasingly big role that what we call artificial intelligence (you can call it algorithms, big data or software) is playing: the increasingly big role that these systems and machines are taking in culture, and the place they're taking in spheres that until not so long ago we thought were really the preserve and exclusive domain of humans. For example creativity, innovation, imagination and art. Nowadays it's pretty banal; no one is going to be surprised if you're listening to music and someone tells you it was actually composed by an algorithm, or that this is a poem or a book written by a machine. It's becoming almost mainstream. And this is happening in the world of visual art as well. We've had a few examples this year: if you looked at mainstream newspapers, we regularly had news saying this algorithm has made a masterpiece. One of the most recent examples was this stunning portrait of a gentleman that made the headlines because it was the first portrait made by an algorithm to be sold at Christie's, the famous auction house. It also made the headlines not only because it was the first one, but because it sold for a surprisingly high price, something like forty times the original estimate. This painting, made by an algorithm created by a Paris-based collective, sold for almost half a million dollars. And if I have to be honest with you, I cannot think of any work that's been mostly done by an algorithm that has really impressed or moved me. But it seems that I am in the minority, because you might have heard of these scientific experiments where computer scientists at Rutgers University in the USA made a program that could invent abstract paintings. They then asked people, human beings, to have a look at the paintings made by the computer, but they mixed them with paintings that had really been done by human beings and exhibited at the famous art fair Art Basel.
And they asked them how they reacted, and which ones, according to them, had been done by a human. It turned out that people responded fairly well to the paintings done by the computer system. They tended to prefer them: they described them as more communicative and more inspiring. So, you know, maybe one day I will be impressed, but so far I would not be worried if I were an artist. I'm not going to tell artists that machines are going to take their jobs very soon. Well, if you make that kind of painting, then maybe you should be worried. But the kind of artists I'm interested in, I really trust them to keep on challenging computer systems. I trust them to explore and collaborate with these systems and use them to expand their artistic range. And most of all I trust them to keep on sabotaging, hacking and subverting them. So no, I'm not worried at all for artists, but maybe I should be worried for art critics. Because a few years ago, in 2006, which even in artificial intelligence terms is really, really old, an artist at the Palais de Tokyo in Paris was showing a program that automatically generated texts in the style of art critics. I'm sorry, the example is in French. But if you read French, and if you know the kind of press releases and "blah" written by art critics that galleries hand you, you'll see that it really works. It is so incredible: the inflated rhetoric, the vague terms, the constant references to some fancy French philosopher. It really, really worked. So personally I would be more worried for my job than for the job of the artist.

Anyway, this is another work I really liked in 2018, and it's one of those works that challenge, play with and try to subvert artificial intelligence, in particular the so-called intelligent home assistants, Alexa or Siri. You may know the artists: !Mediengruppe Bitnik. They collaborated with a DJ and electronic music composer called Low Jack, and they made an album to be played exclusively for Alexa, Siri and all these kinds of devices. We'll listen to a track in a moment. What the music does is try to interact with Alexa and make it react. It asks Alexa questions, it gives orders to Alexa or Siri or whatever it's called. And it also tries to express and communicate to these devices the kinds of anxieties, unease and doubts that the artists, and some of us, have around these so-called intelligent home assistants.
Whether it's that they're frightened by the encroachment on privacy, by their unreliability, or by the fact that they are called "smart", so that we kind of lower our guard and implicitly start to trust them. So I'm going to let you listen to one of the tracks. There are three of them - at least, they sent me three of them.

Played music track: Hello!

music

High speed intelligent personal assistants, all listening to everything I say. I brought you into my home. You scary. Alexa, Prime's new in time is now and all forever.

music

Alexa, I used to bark commands at you. Now I say "please" and "thank you". Never mess with someone who knows your secrets. Siri, there is no way far enough to hide. Ok, Google - are you laughing at me? You used to answer that. Now, you disengage. Never enough digital street smarts to outsmart Alexa. Alexa? Alexa? Hey, Alexiety, I want my control back. I want my home back. Siri, delete my calendar. Siri, hold my calls. Siri, delete my contacts. Ok, Google, what's the loneliest number? Ok, Google, stop! Ok, Google, stop! Ok, Google. Hello! Please, stop. Alexa, help! Alexa, disable Uber. Alexa, disable lighting. Alexa, disable Amazon Music Unlimited. Alexa, quit! Hey, Siri! Activate airplane mode. If you need me, I'll be offline, in the real world.

Régine: And that's it for this one.

Next track playing

Régine: Oh, my god, what is this? Why is that...? laughing My boyfriend is watching, he's going to love this so much. Yeah, let's be serious with a bit of art and tech, shall we? So, there are two reasons why this work is interesting. The first one is that you don't really need to have one of these devices at home to have fun, because the album has actually been conceived to be played as loud as possible. So you don't need to own one of these devices: you can put the music on very loud, open the window, and hopefully it is going to interact with the devices of your neighbours and mess a bit with their domesticity. The other interesting thing is that Alexa has of course been designed to appear as close to a human as possible. So each time the record asks a question, the answer is going to change slightly, because of the way it's been programmed, and also because every update to the software may change the answers. And, as I said, the artists made this record to communicate their doubts and anxieties around these smart systems, because there are a few problems with them. One of them, as I'm sure you know, is surveillance, because they come with corporate systems of surveillance embedded into the hardware
and the software. And we may not realize it until, one day, we open the pages of a newspaper. This is a story that appeared on The Verge a few days ago: the story of a German citizen who emailed Amazon and said, look, Amazon, I want to know all the data you've collected about me over these past few years. He received the data, except that he received the wrong data: 1,700 Alexa voice recordings, but he doesn't own an Alexa. These were not his data. So he wrote back to Amazon and said, "Look, I think there is a problem there." Amazon never answered. So what did the guy do? He went to a German newspaper, and the reporters managed to analyze the data. They managed to identify who the original owner of the data was, who his girlfriend is, and who his circle of friends are. And if you couple all the data from Alexa with what is already available for everyone to see on Facebook and other forms of social media, you get a rather scary and worrying picture. So you understand why artists and other people might be worried and anxious around Alexa. They're called smart or intelligent devices, but sometimes the advice they give us is not what I would call judicious and smart, such as the one you might have heard of: Alexa's advice to kill your foster parents. They also don't necessarily react only to the person they're supposed to obey, so that almost every month there is a story in the newspaper. This is the story of Rocco the parrot. When Rocco is alone at home and his owner has left, he sings and imitates. There are videos of Rocco imitating the voice of Alexa, he does it very well, it's really funny. But he also gives Alexa orders, whether his owner wants it or not: for example, he added broccoli and watermelons and kites and ice cream to his owner's Amazon shopping basket. And there are stories like that in the newspapers all the time: a TV presenter says "Okay Google, do something" or "Alexa, do something", and if people are watching the TV and Alexa is in the room, Alexa is going to perform the action or order some goods online. So as you can see, these so-called smart systems aren't smart enough to differentiate between the actual human being in the room, whom they should obey or answer to, and voices that come through the radio or the TV, or the voice of a pet.
So if, on the one hand, intelligent machines can be confused, I have the feeling that as human beings we are confused too: as soon as you throw the words "artificial intelligence" into the arena, most people - not you, I'm sure, but most people - are ready to believe the hype. A good example of that, I would say, is how we keep reading that China has this super sophisticated and highly efficient network of CCTV surveillance, with face recognition systems that are really high-end and highly sophisticated. That is the theory, but the reality is that the technology has limitations, and there are also limitations linked to bureaucracy: some of the data in the files has never been digitized. So sometimes the work of the police is not aided by sophisticated CCTV and face recognition software, but simply consists of going through old paper files and photos. And yet, maybe it doesn't matter that the system is not fully functional, because in the case of surveillance what matters is that we think we might, at some point, be under its gaze; that might be enough to keep us docile and compliant.

We are also being told that very soon, we just have to wait for them to be launched on the market, we're going to have fantastic autonomous cars that will drive us seamlessly from one part of town to another. In reality, we are still training them, you and I, when we get those pop-up images and the system asks us to identify all the squares containing a shop window, a traffic sign or another vehicle. Where I'm going with this is that there are still a lot of human beings, a lot of human cogs, behind the machines, because artificial intelligence is nothing without data, and data is pretty useless if it hasn't been tagged. So you see the emergence of data-tagging factories in China, which are actually the new face of outsourcing in China, where people spend the whole day behind computer screens adding dots and descriptions to images and photos. And so artificial intelligence, which was supposed to free us from very monotonous, mind-numbing and boring work, is still generating a lot of that boring work for us. And these invisible people behind artificial intelligence are what Amazon, with its Amazon Mechanical Turk platform, calls "artificial artificial intelligence".
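To make the tagging work concrete: the "dots and descriptions" those workers add usually end up as structured annotations attached to each image, which is what the models are actually trained on. A hypothetical record might look like the sketch below; the file name and field names are purely illustrative, not any specific platform's format:

```python
# Hypothetical sketch of image-tagging output: one worker-drawn bounding box
# per object, serialized as JSON for later model training. All names illustrative.
import json

annotation = {
    "image": "street_0001.jpg",  # illustrative file name
    "objects": [
        {"label": "traffic_sign", "bbox": [412, 88, 36, 36]},    # x, y, width, height
        {"label": "vehicle",      "bbox": [120, 210, 240, 130]},
    ],
}
print(json.dumps(annotation, indent=2))
```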
Some call it "fauxtomation". And one of the characteristics of this hidden human labor of the digital factory is the hyper-fragmentation of the work: because the work is divided into small chunks, it loses a lot of its prestige, a lot of its value, so people are underpaid. And what is a bit frightening is that this kind of work culture is actually creeping into the world of creative people and freelancing. You have platforms such as this one, called Fiverr. It's also a crowdsourcing platform, where you can buy the services of a person who will design the logo of your new company, or a birthday card, or a business card, or do a translation for you, or shoot a testimonial for your product. There is one example there: someone advertising that they will design two magnificent logos for you within 48 hours. And the price is set at five dollars, which is very low. The characteristic of this kind of company, of this scale of crowdsourced so-called services, is that they ruthlessly rely on precarity. And Fiverr recently ran ads that were basically glorifying the idea of working yourself to death.

And then this. Well, I'm moving on to the third work I really liked in 2018. It was released a few months ago. It's a work by Michael Mandiberg, and he wanted to make a portrait of the hidden human beings behind the digital factory. To do so, he wanted to recreate, or modernize, the iconic film by Charlie Chaplin, Modern Times. It's a film from 1936, the seventh year of the Great Depression: people were struggling, unemployed, starving, having difficulty getting through life. And Michael Mandiberg made an update of this film. First of all, he wrote code that cut Charlie Chaplin's film into very short clips of a few seconds. Then he contacted people he found on Fiverr, gave each of them a short clip, and asked them to recreate it as well as they could. And that's what they did. I'm going to show an extract. As you will see, he obviously had no control over the locations, and no control over the production choices. So hopefully I'm going to find the correct video and not a fitness video. Here's an extract.

music plays

I think you've got the point here.
Good. So with this work Michael Mandiberg tried to make a kind of portrait of this hidden digital factory and of its human cogs. But the portrait is made by the freelancers, by the workers themselves, and as you can see the result is kind of messy and chaotic, geographically dispersed and hyper-fragmented, which are precisely the characteristics of the digital factory.

So, carrying on with this world of hidden labor, I'd like to speak briefly about social media content moderators. Maybe you know this story, because it happened in Germany. I was listening to an interview with a woman who had worked as a social media content moderator for Facebook in Germany, and she was explaining what kind of work she was doing. It doesn't happen very often that you get such a testimony, because usually the workers have to sign a non-disclosure agreement. But she was explaining how difficult the work is, because every single day each moderator has to go through 1,300 tickets: 1,300 cases of content that have been flagged as inappropriate or disturbing, either automatically or by other users. So, you know, that doesn't leave a lot of time for a thorough analysis. And indeed she explained that human reflection is absolutely not encouraged, and that in the whole process of judging you have only a few seconds to decide whether the content has to be blocked or can stay on Facebook. The whole process of reflection is actually reduced to a series of automatisms. And the interesting part for me was when she explained that at night she was still dreaming of her work: where some of her co-workers dreamt of the kind of horrible content they had to moderate, she was dreaming of performing the same automatic gesture over and over again. And that's when you realize that all these big companies wish they could replace workers with machines, with algorithms, with robots. But some of the tasks, unfortunately for them, the robots and the algorithms cannot do yet. So they have to rely on human beings, with all their unreliability, their propensity to contest and rebel, maybe their laziness, and all their other human defects. So the solution they found was to... Sorry? Okay, I thought someone had asked me something. Where was I? So the solution they found was to reduce these people to machines and automate their work as much as possible. And that's the work of a Facebook content moderator.
And, as is apparent, you can see how their work ambience and environment differ from the fancy, glamorous images we see in architecture magazines each time one of these big media companies opens a new headquarters somewhere in the world. But anyway, to come back to the work of Facebook content moderators: I think the way their work is automated can be compared to the way the work of employees in Amazon warehouses is being automated. As you know, they have to wear a kind of mini-computer on their wrist that identifies them, tracks them, monitors them and measures their productivity in real time. It also allows Amazon to really remote-control them, so they work in a pretty alienating environment. Sometimes patents pop up online, and this one from Amazon I find particularly creepy: they were thinking of making this wrist computer even more sophisticated by using haptic feedback, so that they could direct and remote-control the hands of the worker even more precisely. So you see, there is this big push, in certain contexts, towards turning humans into robots. And, you know, we're always mentioning Amazon, but this kind of work culture is being adopted elsewhere, in France but also in other European countries. There are services such as Auchan Drive: if you're a customer and you don't want to go to the supermarket to do your groceries, you do your groceries online, and ten minutes later you arrive in the parking lot of your supermarket and someone is there with your grocery bags ready. And I was listening to testimonies of people who used to work at Auchan Drive, and as you can see, they all wear the same kind of microcomputer, which also precisely monitors all their movements. The thing I found most disturbing was when they explained that when they get the list of items they have to pick up, they don't get a list saying "take three bananas and two packs of soy milk". No, they just get numbers: a number that says go to this aisle, another number for the shelf where the item sits, and a number corresponding to the precise item on the shelf. They are also subject to very strict rules about being as fast as possible. And I shouldn't laugh, but they were describing the kind of vengeance their boss would take if they rebelled: if they are particularly rebellious, they are sent to the frozen food section, so they spend the day in the cold.
And now I'd like to take a short look with you at people who also want to be closer to machines and robots, but on their own terms: they want total control over the process, and it's something they do voluntarily. It's the community of so-called grinders. These are people who don't hesitate to slice open their bodies to insert chips, devices and magnets, either to make their lives easier or to expand their range of perception. And actually, the first time I heard about them was at the Chaos Communication Congress, a long time ago in 2006, when the journalist Quinn Norton, who'd had a magnet installed under her finger, described how she could sense electromagnetic fields. I found that very fascinating. But as you can see, the aesthetic is pretty rough and gritty. It's a vision of a world to come, of a communion with the machine, that is aesthetically completely different from the one sold by Silicon Valley. But I find it equally fascinating how they take control of the way they want to be closer to technology.

Yes. So, the third part of my talk. The first part was about the confusion we have between the intelligence of machines and human intelligence. The second part was about the robotization of the human body. And in the third and last part, I would like to look at another type of hybridization of bodies, this time not between the human body and the machine, but between the human body and other animal species. We have actually been tinkering with our bodies for decades, since the 60s, when people started taking contraceptive pills. People who want to change gender also tinker with their bodies, and the first transgender celebrity, I recently found out, was George Jorgensen Jr., who started adult life as a G.I. during World War 2 and ended it as a female entertainer in the US. And while assembling the slides I could not resist adding this one, because I saw it a couple of weeks ago in an exhibition, the World Press Photo exhibition. It's a photo that was taken in Taiwan, and Taiwan is actually one of the top destinations for medical tourism: people fly to Taiwan because you can get very good health care and pretty affordable surgery there. One of the hot niches in surgery for these tourists is gender reassignment surgery. So in this photo, the operation is finished and the surgeon is showing the patient her new vagina. And before undergoing this surgery, the patient of course had to go through a course of hormones.
One of these hormones is estrogen. Estrogen is the primary female sex hormone; it's the hormone responsible for the feminization of the body, and sometimes of the emotions too. And if you are a guy in the room, you're probably thinking: well, I'm going to leave the room, because I don't care about estrogen. What does it have to do with me? I'm going to spend the next few minutes trying to convince you that you should care about estrogen. Because even fish care about estrogen, it turns out. Or at least the biologists do. You might have heard that a few years ago biologists started to get very worried, because they discovered that some fish populations, especially the male ones, were getting feminized: their sexual organs were sometimes both male and female. They looked into it and discovered that it was the result of contact with xenoestrogens. And xenoestrogens are everywhere in the environment, and they affect not only fish but all animal species, including the human species. When I say they are everywhere in the environment, it's really everywhere - only 10 minutes left! - They are very often the result of industrial processes, so for example you can find them in recycled plastic, in lacquers, in beauty products, in household cleaning products, in weed killers, in insecticides, in PVC. They are pretty much everywhere. Sometimes they are released by industry straight into the waterways. And I wanted to show you some abstract art again. This is a work by Juliette Bonneviot, and she was really interested in these xenoestrogens. So she looked at where they were present in the environment around her, took some of the xenoestrogen sources I've just mentioned, separated them according to colour, crushed them and turned them into powder, then mixed them with PVC and silicone, which also contain xenoestrogens, and poured them onto canvas. So you have this kind of portrait of the silent colonizing chemicals that are modifying our bodies. And the kinds of things they can do to the human body have already been observed: they're linked to higher rates of cancer in both men and women, lowered IQ, fertility problems, early onset of puberty, etc. So we should really be more vigilant around xenoestrogens. And in the next few minutes I'd like to briefly mention two projects in which women artists explore how their hormones, or more generally their female bodies, can be used to build other kinds of relationships with non-human animals.

So the first is a project by Ai Hasegawa. She was in her early 30s and she had a dilemma. She thought: maybe it's time to have a child. But do I really want a child? It's a lot of responsibility, and there is already overpopulation. On the other hand, if I don't have a child, all these painful periods will have been in vain. She was also faced with another dilemma: she really likes to eat endangered species, in particular dolphin and shark and whale. So on the one hand she liked eating them, and on the other hand she realized that she was contributing to their disappearance and extinction. So she found this solution - and this is a speculative project - maybe she could give birth to an endangered species. Maybe she could become the surrogate mother of a dolphin, using her belly as a kind of incubator for another animal. To do this she consulted an expert in synthetic biology and an expert in obstetrics, who told her that technically it's possible: she would just have to resort to synthetic biology to modify the placenta of the baby. She would not have to modify her own body, just the placenta of the baby, to make sure that all the nutrients and the hormones and the oxygen, everything, is exchanged between the mother and the baby, except the antibodies, because those could damage the fetus. So in the scenario, if everything goes well, she gives birth to a lovely, healthy baby Maui dolphin. She gives it the antibodies through breast milk containing these antibodies, and she sees it grow. Once it's an adult, it could be released with a chip under its skin, and then, once it's been fished, she can buy it and eat it and have it again inside her own body. Well, when I first heard about the project, I can tell you I was not laughing. I found it really, really revolting. But the more I thought about it, the smarter this project seemed to me, because, sadly, I understand that it doesn't make a lot of sense to give birth to yet another human being who is going to put a strain on this planet. So why not dedicate your reproductive system to the environment and use it to help save endangered species?

And I have five minutes left, so I have to make some really, really drastic choices. I wanted to talk about Maja Smrekar. I'm going to do it. So she has a series of projects exploring the co-evolution of dogs and humans, and to make it very, very brief: one of her performances consisted of spending four months in a flat in Berlin with her two dogs.
One of them is an adult called Byron, and the other one is a puppy called Ada. And through physiological training, eating certain foods and using a breast pump, she tricked her body into producing the hormone that makes a woman's body lactate even without a pregnancy, so that she could breastfeed her puppy. These two projects, I thought, are really interesting because they show the power of the female body. They show that what prevents women from having this kind of trans-species motherhood isn't the technology: it's society, and the kind of limits our culture puts on what a woman can or should do with her body. And I also like these projects because they question our anthropocentrism and our tendency to think that all the resources of the world are made for us and not for other living creatures. Just as an aside: if you're interested in estrogen and xenoestrogens, I would recommend you have a look at the talk Mary Maggic gave last year at the Chaos Communication Congress, where she talked about estrogen, its history, and the links to her work as a biohacker.

So now, my conclusion. Maybe you're already wondering: why is she talking to us about artificial intelligence and trans-species motherhood? What do they have to do with each other? I would say: a lot! Because we sometimes tend to forget that behind the digital world there is a material world. To have artificial intelligence, you need the infrastructure, you need the devices, you need the server farms, you need physical spaces to manage all this data. And I would say it's a bit like the human brain: the human brain is pretty small in its dimensions, but apparently it eats up a fifth of all the energy our body consumes. In the same way, artificial intelligence needs a very heavy infrastructure and energy-hungry server farms. And I'm sure you've seen all the PR stunts and press releases from Amazon, Google, Facebook, etc., promising that they're transitioning to green energy. But the energy is still not green, and they're still using a lot of fossil fuel to power their server farms. To make the devices we use, we still need minerals that have to be dug up from the ground, sometimes in really horrible conditions. This is a coltan mine in the Democratic Republic of the Congo. And the minerals we use are not infinite: unless we go and get them from asteroids or from the deep sea, and that's not for tomorrow,
we are going to get into trouble very, very fast. I don't have the time for this one, but trust me, the resources are not infinite. And then, refining and processing them is very, very damaging for the environment. Yes. So I sometimes have the feeling that we are a bit like the golfers in this image. When I go to some tech conferences or art-and-tech conferences - not all of them - I have the feeling that we are kind of complacent, and that our vision of the future may be a bit narrow-minded. And maybe that's normal. I guess if you go to someone with a fancy job in Silicon Valley and ask them, "What's your vision of tomorrow? What's your vision of the future?", and then you ask the same question of someone else, for example an activist for Greenpeace, you will get a completely different perspective, a very different answer about what the future might be. So sometimes I also wonder whether we are not too obsessed with technology, too obsessed with what I would call techno-fixes: this tendency we have to see a problem and think that if we throw more technology on top of it, we are going to solve it. Even if the problem was created in the first place by technology. That's why we get extremely excited - and I do get excited - about the prospect of a baby mammoth being resurrected, while at the same time we don't take care of the species that are going extinct every single day. Every 24 hours, something like 150 to 200 species of plants, animals and insects disappear, never to be seen again. Every single day, all around the world. And yet we still get excited about the idea of resurrecting the baby mammoth, the passenger pigeon or the dodo, and we still create and breed creatures so that we can exploit them even better. Should we be looking forward to the lab-grown meat we are being promised? We are told it's going to be cruelty-free and guilt-free, when in reality it is not totally guilt-free and cruelty-free. And I don't think it's the best solution to the horrible impact that the meat industry is having on the environment, on our health and on the well-being of animals. I mean, there is a solution. It's not the sexy one, it's not the techy one: it's to adopt a plain plant-based diet. And there, I managed to do a bit of vegan propaganda here. Should we get excited - ah, there are a few vegans in the room.
Should we get excited because someone in Japan has made tiny drones topped with horse hair that are used to pollinate flowers, because all around the world the population of bees is collapsing, which is very bad for our food system? Should we use geo-engineering to solve our climate troubles? And I'll end with this slide of what is, for me, really the embodiment of the techno-fix. You probably know that California has gone through some very bad periods of dry weather. So rich people wake up and see that the grass on their magnificent lawn is yellow instead of green. And now in California you have services where you can call someone and they will just fix the problem by painting the grass green. So there you have it - fuck you, Anthropocene and global warming, my lawn is green!

applause

Okay, so this is why I wanted to bring all these really contrasting visions together: because we might have different visions of the future, but at some point they will have to enter into dialogue, because we have only one humanity and one planet. And I'm very, very bad at conclusions, so I wrote it down. The artists whose work made 2018 a truly exciting year for me are not the artists who showcase the magic and the wonders of science and technology. They are the artists who bring to the table realistic, complex and sometimes also difficult conversations about the future, whether it is the future of technology, the future of humankind, or the future of other forms of life and other forms of intelligence. They are the artists who try to establish a dialogue between the organic, the digital and the mineral, because they are all part of our world and we cannot go on pretending that they don't affect each other. Thank you so much.

Applause

Herald: Thank you, Régine, for the very interesting - and at times a bit confusing - talk. I'm still thinking about the dolphin part, but anyway. By the way, this grass painting thing, maybe it's something I can apply to my house. Okay. We have questions at microphone 2, I guess. So let's start there.

Question: Hi. I have a question about a particular part at the beginning. You talked about AI in art, and you mentioned that there are now AI programs that draw pictures or write texts. But have you heard about AI developing ideas for, say, an art installation?

Régine: Yes, as a matter of fact, I think tonight...
I mean, if you look at the program: tonight or tomorrow night there is a talk by Maria and Nicolas. They are two artists, and I think the title of the talk might be "Disnovation". I think you might like what they present. I don't know what they're going to present, but what I know is that one of their works - I forget the name; if they're in the room, that would be fantastic - is a bot on Twitter that goes through blogs and newspapers about creativity and also about technology, and uses all this data to generate really crazy, stupid titles for installations. And then the artists challenge other artists to take these tweets, which are really crazy, and make an installation out of them. So that's a kind of tricky way of AI being used for installations. Right now I cannot think of anything else, but if you want, you can write to me, and when my brain is switched back on, I'll probably have other ideas.

Herald: OK. Any more questions? I don't see any... Ah, there, over there. Microphone 4, please.

Question: Yeah, I was wondering - there's probably a case that we're developing towards a more, let's say, post-human race, simply because we have to, due to climate change. There are also developments right now where, in the relatively short term, we would go to Mars. So, in a sense, do we need to adapt the human race for possibly multiple planets, with human modification?

Régine: Okay, I didn't understand the question.

Herald: So, please repeat.

Question: So, in general, as a human race, we're definitely not able to survive on this planet for that much longer. Really optimistic, I know. I am vegan, so yay. And we have some new developments going on, so we'll be able to go to Mars relatively soon.

Régine: I don't believe it. Who is going to want to go to that planet? Like, seriously, it's going to be like Australia: we are going to send prisoners and criminals there. Who wants that? Come on. Yeah - anyway, now I see what you mean. I think I'm more optimistic about the future than you are. I think we can still survive on this planet, even if we get very numerous. We just have to find another way to consume and to take care of each other. But maybe that's my feminine side talking.

Question: Maybe, in general, without a lot of modification to human beings it's simply not possible. Certainly.
I think that's common ground. And yeah, I sort of wish we didn't need a planet B, but I think we do.

Régine: Well, I hope I'll be dead when that comes. That's my philosophy sometimes. Okay.

Herald: I don't see any more questions, so let's thank the speaker again. Thank you.

Applause

35C3 postroll music

Subtitles created by c3subtitles.de in the year 2020. Join, and help us!