
35C3 - The good, the strange and the ugly in 2018 art & tech

  • 0:00 - 0:18
    35C3 preroll music
  • 0:18 - 0:23
    Herald Angel: The next talk will be given
    by Régine Debatty and she will be talking
  • 0:23 - 0:32
    about the good, the strange and the ugly
    in art, science and technology in 2018. Big
  • 0:32 - 0:34
    applause to Régine
  • 0:34 - 0:36
    Applause
  • 0:36 - 0:41
    Sound from video
  • 0:41 - 0:48
    Régine: This was not supposed to
    happen, I am sorry. Hello everyone. So my
  • 0:48 - 0:55
    name is Régine Debatty. I'm an art critic
    and blogger. Since 2004 I've been writing
  • 0:55 - 1:00
    "we make money not art" which is a blog
    that looks at the way artists, designers
  • 1:00 - 1:07
    and hackers are using science and
    technology in a critical and socially
  • 1:07 - 1:15
    engaged way. So a few weeks ago some of
    the organizers of the Congress e-mailed me
  • 1:15 - 1:21
    and they asked me if I could give a kind
    of overview of the most exciting and
  • 1:21 - 1:28
    interesting works in art and technology in
    2018. And on Monday when I started working
  • 1:28 - 1:34
    on my slides I was really on a mission to
    deliver, and I had the intention
  • 1:34 - 1:41
    to give an overview of the best of art
    and tech in 2018. But somehow things got
  • 1:41 - 1:46
    out of hand. I started inserting works
    that are a bit older, and I could also see
  • 1:46 - 1:53
    a kind of narrative that emerged. So the result is going to be a
  • 1:53 - 1:58
    presentation where I will still show some
    of the works I found very interesting over
  • 1:58 - 2:04
    this past year. But they're going to be
    embedded into a broader narrative. And this
  • 2:04 - 2:10
    narrative is going to look at the way
    artists nowadays are using science and
  • 2:10 - 2:16
    technology to really explore and expand
    the boundaries and the limits of the human
  • 2:16 - 2:23
    body. But also they use science and technology
    to challenge intelligence or at least the
  • 2:23 - 2:31
    understanding we have of human
    intelligence. So Exhibit 1 it's "Face
  • 2:31 - 2:38
    Detected". Actually, I wouldn't say it
    was one of my favorite works from 2018
  • 2:38 - 2:42
    but it's definitely by one of my favorite
    artists. So that's why it's in
  • 2:42 - 2:48
    there. So the setting is quite easy to
    understand. Thérise Deporter, that's the name of the artist,
  • 2:48 - 2:53
    had a solo show a couple of months
    ago at the New Art Space in Andover and
  • 2:53 - 3:02
    installed these blocks of clay on
    tables and then invited sculptors
  • 3:02 - 3:08
    to come and do portraits of people,
    shaping their heads. And of course
  • 3:08 - 3:14
    you know the process was followed by the
    audience but it was also
  • 3:14 - 3:21
    followed by face recognition software. And
    as soon as the software had detected a
  • 3:21 - 3:28
    face in the sculpture, the artist would
    receive a message that said: Stop, we
  • 3:28 - 3:33
    have detected a human face. And what is
    interesting is that
  • 3:33 - 3:35
    laughter
  • 3:35 - 3:40
    the results differ
  • 3:40 - 3:45
    according to the face recognition system
  • 3:45 - 3:48
    - because they work with different
    parameters and they've been programmed by
  • 3:48 - 3:54
    different programmers, so they don't
    all behave the same - some
  • 3:54 - 3:58
    are faster than others to recognize a face,
    some have different parameters than
  • 3:58 - 4:05
    others. So that's why the results are
    always so different. But the reason why I
  • 4:05 - 4:10
    found this project worth mentioning
    is because it shows the increasingly big
  • 4:10 - 4:15
    role of what is called artificial
    intelligence. You can call it algorithms,
  • 4:15 - 4:25
    big data or software. It shows the
    increasingly big role that systems and
  • 4:25 - 4:31
    machines are taking in culture and the
    place they're taking in spheres that
  • 4:31 - 4:37
    until not so long ago we thought were
    really the preserve and exclusivity of
  • 4:37 - 4:44
    humans. So for example creativity,
    innovation, imagination and art. And
  • 4:44 - 4:48
    nowadays it's pretty banal, I mean no one is
    going to be surprised if you're listening
  • 4:48 - 4:52
    to music and someone tells you: Oh,
    this music has actually been composed
  • 4:52 - 4:57
    by an algorithm, or this is a
    poem or a book that's been written
  • 4:57 - 5:02
    by a machine. I mean it's becoming
    almost mainstream. And this is happening
  • 5:02 - 5:10
    also in the world of visual art. And we've
    had a few examples this year. If you look
  • 5:10 - 5:15
    at the newspapers, the mainstream
    newspapers, we regularly had news
  • 5:15 - 5:21
    saying: oh, this algorithm has made this
    masterpiece. And one of the most recent
  • 5:21 - 5:28
    examples was this stunning portrait of a
    gentleman that made the headlines because
  • 5:28 - 5:34
    it was the first portrait that was made by
    an algorithm that was sold at Christie's
  • 5:34 - 5:40
    And Christie's is a famous auction house. And
    it also made the headlines not only
  • 5:40 - 5:45
    because it was the first one, but also because it sold for a surprisingly high price,
  • 5:45 - 5:49
    something like forty times the original
    estimate. This painting, made by an
  • 5:49 - 5:55
    algorithm created by a Paris-based
    collective, sold for almost half a
  • 5:55 - 6:04
    million dollars. And if I have to be honest
    with you I cannot think of any work that's
  • 6:04 - 6:10
    been mostly done by an algorithm that has
    really impressed me and moved me. But it
  • 6:10 - 6:17
    seems that I am in the minority because
    you might have heard of this. These
  • 6:17 - 6:25
    scientific experiments by computer
    scientists at Rutgers University in the USA.
  • 6:25 - 6:32
    They made this program that could
    just make and invent abstract
  • 6:32 - 6:38
    paintings. And then they asked people,
    human beings, to have a look at the
  • 6:38 - 6:44
    paintings made by the computer. But they
    mixed them in: they were showing them
  • 6:44 - 6:49
    to people mixed with paintings that
    had really been done by human beings and
  • 6:49 - 6:55
    had been exhibited at the famous art fair Art
    Basel. And they asked them how they reacted
  • 6:55 - 7:00
    and which ones, according to them, had
    been done by a human. And it turned out
  • 7:00 - 7:05
    that people responded fairly well to the
    paintings that had been done by this
  • 7:05 - 7:11
    computer system. They tended to prefer
    them. Oddly enough, they
  • 7:11 - 7:16
    described them as being more communicative and
    more inspiring. So you know maybe one day
  • 7:16 - 7:23
    I will be impressed but so far I would say
    I would not be worried if I were an
  • 7:23 - 7:27
    artist. I'm not going to say to artists:
    you know, very soon machines are going to
  • 7:27 - 7:31
    take your job. Well maybe if you make that
    kind of painting yeah maybe you should be
  • 7:31 - 7:38
    worried. But the kind of artists I'm
    interested in...
  • 7:38 - 7:46
    The artists I'm interested in, I really trust
    them to keep on challenging computer
  • 7:46 - 7:51
    systems. I trust them to explore and
    collaborate with them and use them to
  • 7:51 - 7:58
    expand their artistic range. And most of
    all I trust them to keep on sabotaging
  • 7:58 - 8:05
    and hacking and subverting these computer
    systems. And so yeah I'm not worried at
  • 8:05 - 8:11
    all for artists but maybe I should be
    worried for art critics. Because a few years
  • 8:11 - 8:17
    ago, in 2006, which even in
    artificial intelligence time is really
  • 8:17 - 8:26
    really old. So in 2006 at the Palais de Tokyo
    in Paris an artist was showing a program
  • 8:26 - 8:33
    that automatically generated texts
    written by art critics, and I'm sorry but
  • 8:33 - 8:37
    the example is in French. But if you
    read French, and if you are used to the kind of
  • 8:37 - 8:45
    press releases and "blah" they give you that
    have been written by art critics, it is
  • 8:45 - 8:50
    really the kind of thing that works. It is
    so incredible, you know, it's the kind of
  • 8:50 - 9:00
    inflated rhetoric and use of vague terms
    and also constantly referencing some fancy
  • 9:00 - 9:05
    French philosophers. So it really really
    worked. So personally I would be more
  • 9:05 - 9:13
    worried for my job than for the job of the
    artist. Anyway this is another work I
  • 9:13 - 9:18
    really liked in 2018, and it's one of
    these works that challenge and play with and
  • 9:18 - 9:26
    try to subvert artificial intelligence,
    in particular the so-called intelligent
  • 9:26 - 9:34
    home assistants, such as Alexa or Siri. So
    you may know the artists, it's !Mediengruppe
  • 9:34 - 9:41
    Bitnik. They collaborated with a DJ and
    electronic music composer called Low Jack
  • 9:41 - 9:46
    and they made an album to be played
    exclusively to Alexa and Siri and all these
  • 9:46 - 9:56
    kinds of devices. And we'll listen to a
    track in a moment. And what the
  • 9:56 - 10:01
    music does is, it tries to interact with
    Alexa and make it react. So it asks
  • 10:01 - 10:06
    questions to Alexa. It gives orders to
    Alexa or Siri or whatever it's called. And
  • 10:06 - 10:14
    it's also trying to express and convey and
    communicate to these devices the kind of
  • 10:14 - 10:19
    anxieties and unease and doubts that
    the artists have, and some of us have,
  • 10:19 - 10:25
    around these so-called intelligent home
    assistants. Whether they're frightened
  • 10:25 - 10:31
    about the encroachment on privacy, the
    unreliability, or the fact that they are
  • 10:31 - 10:37
    called smart, so that we kind of lower our
    guard and implicitly start to trust
  • 10:37 - 10:45
    them. So I'm going to make you listen to
    one of the tracks. There are three of them
  • 10:45 - 10:51
    - at least they sent me three of them.
    Played music track: Hello!
  • 10:51 - 10:57
    music
  • 10:57 - 11:04
    High-speed intelligent personal assistants, all
    listening to everything I say. I brought
  • 11:04 - 11:14
    you into my home. You scary. Alexa, Prime's
    new in time is now and all forever.
  • 11:14 - 11:21
    music
  • 11:21 - 11:30
    Alexa, I used to bark commands at you. Now I say
    "please" and "thank you". Never mess with
  • 11:30 - 11:38
    someone who knows your secrets. Siri,
    there is no way far enough to hide. Ok,
  • 11:38 - 11:45
    Google - are you laughing at me? You used
    to answer that. Now, you disengage. Never
  • 11:45 - 11:54
    enough digital street smarts to outsmart
    Alexa. Alexa? Alexa? Hey, Alexiety, I want
  • 11:54 - 12:01
    my control back. I want my home back.
    Siri, delete my calendar. Siri, hold my
  • 12:01 - 12:09
    calls. Siri, delete my contacts. Ok,
    Google, what's the loneliest number? Ok,
  • 12:09 - 12:16
    Google, stop! Ok, Google, stop! Ok,
    Google. Hello! Please, stop. Alexa, help!
  • 12:16 - 12:24
    Alexa, disable Uber. Alexa, disable
    lighting. Alexa, disable Amazon music
  • 12:24 - 12:32
    unlimited. Alexa, quit! Hey, Siri!
    Activate airplane mode. If you need me,
  • 12:32 - 12:40
    I'll be offline, in the real world.
    R: And that's it for this one.
  • 12:40 - 12:43
    Next track playing
  • 12:43 - 12:52
    R: Oh, my god, what is this? That is
    why...? laughing My boyfriend is
  • 12:52 - 13:00
    watching, he's going to love this so much.
    Yeah, let's be serious with a bit of art
  • 13:00 - 13:07
    and tech, shall we? So, there are two
    reasons why this work is interesting. The
  • 13:07 - 13:11
    first one is that you don't really need to
    have one of these devices at home to have
  • 13:11 - 13:16
    fun because it's been actually
    conceived to be played as loud as
  • 13:16 - 13:23
    possible. So, you don't need to
    have one of these devices, but you
  • 13:23 - 13:27
    can put the music very loud, open the
    window and hopefully it is going to
  • 13:27 - 13:33
    interact with the devices of your
    neighbors and mess a bit with their
  • 13:33 - 13:37
    domesticity. The other thing that is
  • 13:37 - 13:43
    interesting is that of course Alexa has
    been designed to appear as close to a
  • 13:43 - 13:48
    human as possible. So each time the record
    is asking a question, the answer is going
  • 13:48 - 13:52
    to change slightly because of the way
    it's been programmed. And also because
  • 13:52 - 14:00
    each time there is an update to the
    program, the answers might change. And also as I
  • 14:00 - 14:06
    said the artist made this record to
    communicate their doubts and anxieties
  • 14:06 - 14:11
    around these smart systems, because there are a
    few problems with these systems. One of
  • 14:11 - 14:17
    them, as I'm sure you know,
    is surveillance, because they come
  • 14:17 - 14:22
    with corporate systems of surveillance
    that are embedded into the hardware. And
  • 14:22 - 14:27
    the software. And we may not realize it
    until, you know, sometimes -
  • 14:27 - 14:31
    one day we open the pages of a newspaper
    and there is a story like this one that appeared on The
  • 14:31 - 14:40
    Verge a few days ago and it's the story of
    a German citizen who e-mailed Amazon and
  • 14:40 - 14:44
    said: look Amazon, I want to know all the
    data you've collected about me over these
  • 14:44 - 14:48
    past few years. And then he received
    the data, except that he received the
  • 14:48 - 14:55
    wrong data. He received 1700 Alexa voice
    recordings but he doesn't have an Alexa.
  • 14:55 - 15:00
    These were not his data. So he wrote back
    to Amazon and said: "Oh look, I think there
  • 15:00 - 15:08
    is a problem there". Amazon never answered
    back. So what did the guy do? He went to a
  • 15:08 - 15:13
    German newspaper and the reporters managed
    to analyze the data. And they managed to
  • 15:13 - 15:19
    identify who the original owner of the
    data was and who his girlfriend is and who
  • 15:19 - 15:25
    his circle of friends is. And you know if
    you couple all the data from Alexa with
  • 15:25 - 15:29
    what is already available for everyone to
    see on Facebook and other forms of social
  • 15:29 - 15:35
    media, you get kind of a scary and worrying
    picture so you understand why artists and
  • 15:35 - 15:42
    other people might be worried and anxious
    around Alexa. And so they're called smart
  • 15:42 - 15:51
    or intelligent devices, but sometimes they
    give us advice that may not be, I
  • 15:51 - 15:57
    would say, judicious and smart, such as
    the one you might have heard of "Alexa's
  • 15:57 - 16:06
    advice to kill your foster parents". And
    also they don't necessarily react to the
  • 16:06 - 16:11
    person they're supposed to obey. So,
    like almost every month, there is a story
  • 16:11 - 16:17
    in the newspaper. So this is the story of
    Rocco the parrot. And really, when Rocco
  • 16:17 - 16:23
    is alone at home and its owner has left,
    it sings - it imitates very well.
  • 16:23 - 16:28
    There are videos of Rocco imitating the
    voice of Alexa, it's really uncanny. But
  • 16:28 - 16:35
    it also places orders, whether its owner
    wants it to or not: for example, it added to the
  • 16:35 - 16:40
    Amazon shopping basket of its owner
    broccoli and watermelons and kites and ice
  • 16:40 - 16:47
    cream. And there are stories like that all
    the time in the newspaper, when the TV
  • 16:47 - 16:52
    presenter says something like "Okay Google,
    do something" or "Alexa, do something", and if
  • 16:52 - 16:58
    people are watching the TV and Alexa is in
    the room Alexa is going to perform this
  • 16:58 - 17:09
    action or order some goods online. So
    as you see, these so-called smart systems
  • 17:09 - 17:14
    aren't smart enough to differentiate
    between the actual human being that's in
  • 17:14 - 17:20
    the room, whom they should obey or
    should answer to, and voices that come
  • 17:20 - 17:25
    through the radio or the TV, or the voice
    of a pet. So if on the one hand
  • 17:25 - 17:30
    intelligent machines can be confused, I
    have the feeling that also as human beings
  • 17:30 - 17:36
    we are confused too: you
    know, as soon as you throw into the arena
  • 17:36 - 17:41
    the words artificial intelligence, most
    people, not you I'm sure, but most people
  • 17:41 - 17:48
    are ready to believe the hype. And a good
    example of that, I would say, is you know
  • 17:48 - 17:54
    we keep reading how in China they have
    this super sophisticated and highly
  • 17:54 - 18:00
    efficient network of CCTV surveillance.
    And how they have face recognition systems
  • 18:00 - 18:07
    that are really high-end and
    highly sophisticated. So that is the
  • 18:07 - 18:11
    theory but the reality is that the
    technology has limitations and also
  • 18:11 - 18:18
    limitations linked to bureaucracy, and some
    of the data and files have not
  • 18:18 - 18:24
    been digitized. So sometimes the work of
    the police is not aided by sophisticated
  • 18:24 - 18:29
    CCTV and their face recognition software,
    but just consists of going through old
  • 18:29 - 18:35
    paper files and photos. And yet, maybe
    that doesn't matter, you know, if the
  • 18:35 - 18:40
    system is not fully functional, because in
    the case of surveillance what matters is
  • 18:40 - 18:45
    that we think that we might at some point
    be under the gaze of surveillance; that
  • 18:45 - 18:55
    might be enough to keep us docile and
    compliant. Also, we are being told that
  • 18:55 - 18:58
    very soon - we just have to wait for them to
    be launched on the market - we're going to
  • 18:58 - 19:03
    have some fantastic autonomous cars that are
    going to drive us seamlessly from one part
  • 19:03 - 19:10
    of the town to the other one. And in
    reality, we are still training them, you
  • 19:10 - 19:17
    and I, when we get these pop-up images and
    we are asked by the system to identify all
  • 19:17 - 19:21
    the squares where there is a shop window,
    where there is a traffic sign or where
  • 19:21 - 19:27
    there is another vehicle. Where I want to
    go with this is that there are still a lot
  • 19:27 - 19:32
    of human beings, a lot of human cogs,
    behind the machines, because artificial
  • 19:32 - 19:38
    intelligence is nothing without data, and
    data is pretty useless if it hasn't been
  • 19:38 - 19:44
    tagged. And so you see the emergence of
    data tagging factories in China, and it's
  • 19:44 - 19:49
    actually the new face of outsourcing in
    China, where you have people who spend the
  • 19:49 - 19:56
    whole day behind a computer screen adding
    dots and descriptions to images and
  • 19:56 - 20:03
    photos. And so artificial intelligence, which
    was supposed to, you know, free us from
  • 20:03 - 20:10
    very monotonous, mind-numbing and boring
    work is still generating for us a lot of
  • 20:10 - 20:17
    this boring work. And these invisible
    people behind artificial intelligence are
  • 20:17 - 20:23
    what Amazon, through its Amazon Mechanical
    Turk platform, calls
  • 20:23 - 20:29
    "artificial artificial intelligence". And
    some of the characteristics of
  • 20:29 - 20:35
    this hidden human labor for the
    digital factory is, one, the
  • 20:35 - 20:40
    hyper-fragmentation of the work:
    because the work is divided into
  • 20:40 - 20:46
    small chunks, it loses a lot of its
    prestige, a lot of its value, so people
  • 20:46 - 20:51
    are underpaid and what is a bit
    frightening is that this kind of work
  • 20:51 - 20:58
    culture is actually creeping into the
    world of creative people and freelancing
  • 20:58 - 21:04
    and you have platforms such as this one,
    called Fiverr, which is a
  • 21:04 - 21:09
    crowdsourcing platform where you can buy
    the services of a person who is going to
  • 21:09 - 21:14
    design for you the logo of your new
    company, or a birthday card, or a business
  • 21:14 - 21:20
    card, or do translation for you or shoot a
    testimonial for your products and there is
  • 21:20 - 21:25
    one example there of someone who advertises
    that they will design for you two
  • 21:25 - 21:32
    magnificent logos in 48 hours. And the
    price is set at five dollars, which is
  • 21:32 - 21:39
    very low - and the characteristic
    of this kind of company, of these so-called
  • 21:39 - 21:47
    crowdsourcing services, is that
    they ruthlessly rely on precarity. And
  • 21:47 - 21:54
    Fiverr recently had these ads, where
    they were basically glorifying the idea of
  • 21:54 - 22:03
    working yourself to death. And then this.
    Well, I'm coming to the third work I really
  • 22:03 - 22:09
    liked in 2018. It was released a few months
    ago. It's a work by Michael Mandiberg and
  • 22:09 - 22:17
    he wanted to make a portrait of the hidden
    human being behind the digital factory.
  • 22:17 - 22:24
    And to do so he really wanted to recreate,
    or to modernize, the iconic film by Charlie
  • 22:24 - 22:30
    Chaplin, Modern Times. It's a film from
    1936, which was, you know, the seventh
  • 22:30 - 22:35
    year of the Great Depression. People
    were struggling, being unemployed,
  • 22:35 - 22:43
    starving and having difficulty getting through
    life. And Michael Mandiberg made an
  • 22:43 - 22:50
    update of this film, by... First of all he
    wrote the code and the code cut the film
  • 22:50 - 22:56
    of Charlie Chaplin into very short clips
    of a few seconds and then Michael
  • 22:56 - 23:00
    Mandiberg contacted people. He
    found people on this platform,
  • 23:00 - 23:05
    Fiverr, and he gave
    each of them a short clip. And
  • 23:05 - 23:12
    he asked them to recreate it as well
    as they could. And that's what they did.
  • 23:12 - 23:18
    And Michael Mandiberg, I'm going to show
    an extract. Yeah, you will see, he
  • 23:18 - 23:22
    obviously had no control over the locations, he
    had no control over the production
  • 23:22 - 23:31
    choices. So hopefully I'm going to find
    the correct video and not a fitness video.
  • 23:31 - 23:33
    So here's an extract.
  • 23:33 - 25:20
    music plays
  • 25:20 - 25:29
    I think you've got the point here. Good.
    So with this work Michael Mandiberg tried
  • 25:29 - 25:35
    to do a kind of portrait of this hidden
    digital factory and of its human cogs.
  • 25:35 - 25:40
    But the portrait is done by the
    freelancers, by the workers themselves and
  • 25:40 - 25:46
    you can see the result is kind of
    messy and chaotic, geographically dispersed
  • 25:46 - 25:56
    and hyper-fragmented, which are the very
    characteristics of the digital factory. So
  • 25:56 - 26:03
    carrying on with the world that is hidden,
    I'd like to speak briefly about social
  • 26:03 - 26:09
    media content moderators. Maybe you know
    this story. In
  • 26:09 - 26:15
    Germany, I was listening to an interview
    with a woman who had worked as a social
  • 26:15 - 26:24
    media content moderator for Facebook in
    Germany and she was explaining what kind
  • 26:24 - 26:29
    of work she was doing. It doesn't happen
    very often that you get such testimonies, because
  • 26:29 - 26:33
    usually the workers have to sign a
    nondisclosure agreement. But she was
  • 26:33 - 26:40
    explaining how difficult the work is,
    because every single day each of the
  • 26:40 - 26:47
    moderators has to go through 1300 tickets. So
    1300 cases of content that have been
  • 26:47 - 26:53
    flagged as inappropriate or disturbing,
    either automatically or by other users.
  • 26:53 - 26:59
    So you know that doesn't leave a lot of
    time to have a thorough analysis.
  • 26:59 - 27:06
    And indeed she was explaining that human
    reflection is absolutely not encouraged and
  • 27:06 - 27:11
    that in the whole process of
    judging, you have only a few seconds to
  • 27:11 - 27:15
    judge whether or not the content has
    to be blocked or can stay on
  • 27:15 - 27:20
    Facebook. And the whole process of
    reflection is actually reduced to a series
  • 27:20 - 27:25
    of automatisms. And the interesting part
    for me was when she was explaining that at
  • 27:25 - 27:30
    night she was still dreaming of her work. But
    whereas some of her co-workers were dreaming
  • 27:30 - 27:37
    of the kind of horrible content they had to
    moderate, she was explaining how she was
  • 27:37 - 27:43
    actually dreaming of doing the same
    automatic gesture over and over again. And
  • 27:43 - 27:49
    that's how you realize that all these
    big companies, they wish they could replace
  • 27:49 - 27:54
    workers with machines, with algorithms, with
    robots. But some of the tasks
  • 27:54 - 27:59
    unfortunately, the robots or the
    algorithms cannot do yet. So they have
  • 27:59 - 28:06
    to rely on human beings, with all
    their unreliability and propensity
  • 28:06 - 28:13
    to contest and rebel, and maybe
    laziness and all their human defects. So
  • 28:13 - 28:19
    the solution they found was to. Sorry?
    Okay I thought someone had asked me
  • 28:19 - 28:26
    something. Where was I? So the solution,
    yeah, the solution they found was to reduce
  • 28:26 - 28:32
    these people to machines and automate
    their work as much as possible. And so
  • 28:32 - 28:38
    that's the work of a Facebook content
    moderator. And, as is apparent, you
  • 28:38 - 28:45
    can see how their work ambience and
    environment is different from the fancy
  • 28:45 - 28:51
    and glamorous image we see in architecture
    magazines, each time one of these big media
  • 28:51 - 28:56
    companies opens up headquarters
    somewhere in the world. But anyway to come
  • 28:56 - 29:01
    back to the work of Facebook content
    moderator I think the way the work is
  • 29:01 - 29:11
    automated can be compared to the way the
    work of the employees in Amazon warehouses
  • 29:11 - 29:19
    is being automated. As you know
    they have to wear this kind of mini-
  • 29:19 - 29:25
    computer on their wrist that identifies
    them, that tracks them, that
  • 29:25 - 29:34
    monitors them and measures their
    productivity in real time. And it also
  • 29:34 - 29:40
    allows Amazon to really remote control
    them and so they work in a pretty
  • 29:40 - 29:47
    alienating environment. Sometimes patents
    pop up online, and this one from Amazon I feel
  • 29:47 - 29:53
    is particularly creepy, because they
    were thinking of, you know,
  • 29:53 - 29:58
    making this wrist computer even
    more sophisticated and using haptic
  • 29:58 - 30:03
    feedback so that they could direct and
    remote control even more precisely the
  • 30:03 - 30:11
    hands of the workers. So you see there is,
    in certain contexts, this big push
  • 30:11 - 30:15
    towards turning humans into robots. And
    you know we're always mentioning Amazon,
  • 30:15 - 30:22
    but this is the kind of work culture that
    is being adopted elsewhere: in France, but
  • 30:22 - 30:29
    also in other European countries. There
    are services such as the Auchan Drive. So
  • 30:29 - 30:32
    if you're a customer and you don't
    want to go to the supermarket to do your
  • 30:32 - 30:37
    groceries, you do your groceries online
    and 10 minutes later you arrive at the
  • 30:37 - 30:43
    parking lot of your supermarket and
    someone will be there with your grocery
  • 30:43 - 30:49
    bags ready. And I was listening to
    testimonies by people who used to work
  • 30:49 - 30:54
    at Auchan Drive, and as you can see they
    all wear the same microcomputer
  • 30:54 - 30:59
    that is also monitoring precisely all
    their movements. And the thing that I found
  • 30:59 - 31:03
    most disturbing was when they were
    explaining that, when they get
  • 31:03 - 31:08
    the list of items they have to pick up,
    they don't get, you know, a list that says
  • 31:08 - 31:14
    you have to take three bananas and two
    packs of soy milk. No, they just get
  • 31:14 - 31:20
    numbers: a number that says go to this aisle,
    another number that says this is
  • 31:20 - 31:25
    the shelf where you have to take the item
    and the number corresponding to the
  • 31:25 - 31:30
    precise item on the shelf. And they also
    are subjected to very strict rules of
  • 31:30 - 31:36
    being as fast as possible and I shouldn't
    laugh, but they were explaining the kind
  • 31:36 - 31:41
    of vengeance their boss would take if they
    rebelled. And if they are
  • 31:41 - 31:46
    particularly rebellious, they are sent
    to the frozen food section, so they spend the day in
  • 31:46 - 31:55
    the cold. And I'd like to have a short
    look with you at people who also want to
  • 31:55 - 31:59
    be closer to machines and robots, but
    on their own terms: they want
  • 31:59 - 32:06
    to have total control over it, and it's
    something they do voluntarily. It's the
  • 32:06 - 32:11
    community of so-called grinders: these are
    people who don't hesitate to slice open
  • 32:11 - 32:16
    their bodies to insert chips and
    devices and magnets into them, to either
  • 32:16 - 32:22
    facilitate their life or expand their
    range of perception. And actually the
  • 32:22 - 32:28
    first time I heard about them was at
    the Chaos Communication Congress back a
  • 32:28 - 32:37
    long time ago, in 2006, when the journalist
    Quinn Norton had a magnet installed under
  • 32:37 - 32:42
    her fingertip and she could sense
    electromagnetic fields. I found that very
  • 32:42 - 32:48
    fascinating. But as you can see the
    aesthetic is pretty rough and gritty.
  • 32:48 - 32:53
    It's a vision of a world to
    come and of a communion with the machine
  • 32:53 - 32:59
    that is aesthetically very
    different from the one sold by
  • 32:59 - 33:04
    Silicon Valley. But I find it equally
    fascinating how they take control of the
  • 33:04 - 33:13
    way they want to be closer to
    technology. Yes. So, on to the third part of
  • 33:13 - 33:17
    my talk: the first part was about this
    confusion we have between the intelligence
  • 33:17 - 33:23
    of machine and human intelligence. The
    second part was about the robotisation of
  • 33:23 - 33:28
    the human body. And the third and
    last part, I would like to have a look
  • 33:28 - 33:36
    at another type of hybridisation of
    bodies, but this time not between the human body
  • 33:36 - 33:44
    and the machine, but between the human body and other
    animal species. So we may have actually
  • 33:44 - 33:51
    been tinkering with our bodies for
    decades, since the 60s when women started
  • 33:51 - 33:59
taking the contraceptive pill, and also
people who want to change
  • 33:59 - 34:04
gender also tinker with their bodies.
And the first transgender celebrity, I
  • 34:04 - 34:11
recently found out, was George Jorgensen
Jr., who started his adult life as a G.I.
  • 34:11 - 34:18
    during World War 2 and ended his life as a
    female entertainer in the US. And while
  • 34:18 - 34:22
    assembling the slides I could not resist
    adding this one because I saw it a couple
  • 34:22 - 34:28
    of weeks ago in an exhibition - the World
    Press Photo Exhibition. And it's a photo
  • 34:28 - 34:34
that was taken in Taiwan. And Taiwan is
actually one of the top
  • 34:34 - 34:39
destinations for medical tourism. People
fly to Taiwan because you can get
  • 34:39 - 34:47
very good health care and pretty
affordable surgery. And one of the hot
  • 34:47 - 34:54
niches in surgery for these tourists
is actually gender reassignment surgery.
  • 34:54 - 35:01
So in this photo the operation
is finished and the surgeon is showing
  • 35:01 - 35:11
    the patient her new vagina. And
    before undergoing this surgery the
  • 35:11 - 35:16
patient of course had to take a
  • 35:16 - 35:22
series of hormones, and one of them is
estrogen. And estrogen is the primary
  • 35:22 - 35:29
female sex hormone, the hormone
responsible for the feminization of the
  • 35:29 - 35:37
body and sometimes also of the emotions.
And if you are a guy in the room you're
  • 35:37 - 35:40
    probably thinking, well I'm going to leave
    the room because I don't care about
  • 35:40 - 35:43
    estrogen. What does it have to do with me?
    And I'm going to spend the next few
  • 35:43 - 35:47
minutes trying to convince you that
you should care about estrogen.
  • 35:47 - 35:53
Because it turns out even fish care about
estrogen. At least biologists do. You
  • 35:53 - 35:57
might have heard a few years ago that
biologists started to get very worried
  • 35:57 - 36:04
because they discovered that some fish
populations, especially the males, were
  • 36:04 - 36:11
getting feminized: their sexual organs
were sometimes both male and female. And
  • 36:11 - 36:16
    they were worried about it and they
    discovered that it was the result of
  • 36:16 - 36:22
contact with xenoestrogens. And
xenoestrogens are everywhere in the
  • 36:22 - 36:28
environment, and they affect not only fish
but all animal species, including the human
  • 36:28 - 36:32
    species. So when I say that they are
    everywhere in the environment, it's really
  • 36:32 - 36:41
everywhere - only 10 minutes left! - They are
very often the result of industrial
  • 36:41 - 36:46
processes, so for example you can find them
in recycled plastic, in lacquers, in beauty
  • 36:46 - 36:54
products, in products you use to clean the
house, in weed killers, in insecticides, in
  • 36:54 - 36:58
    PVC, they are pretty much everywhere.
    Sometimes they are released by
  • 36:58 - 37:04
industries into the water stream. And I
wanted to show you some abstract art again.
  • 37:04 - 37:08
    It's a work by Juliette Bonneviot, and she
    was really interested in these
  • 37:08 - 37:13
    xenoestrogens. So she looked around her in
    the environment where they were present
  • 37:13 - 37:19
and she took some of these
xenoestrogen sources I've just mentioned
  • 37:19 - 37:25
and she separated them according to colors
and then she crushed them and turned them into
  • 37:25 - 37:31
powder. She then mixed them with PVC and
silicone, which also contain xenoestrogens,
  • 37:31 - 37:36
and poured them onto canvas. And then
you have this kind of portrait of the
  • 37:36 - 37:44
silent colonizing chemicals that are
modifying our bodies. And the kinds of things
  • 37:44 - 37:48
    they can do to the human body have already
    been observed. So they're linked to higher
  • 37:48 - 37:56
rates of cancer in both men and women,
lowered IQ, fertility problems,
  • 37:56 - 38:05
    early onset of puberty, etc. So we
    should really be more vigilant around
  • 38:05 - 38:14
xenoestrogens. And in the next few minutes
I'd like to briefly mention two projects
  • 38:14 - 38:21
where women artists explore how
their hormones, and more generally their female
  • 38:21 - 38:30
bodies, can be used to have other kinds of
relationships with non-human animals.
  • 38:30 - 38:36
So the first is a project by Ai Hasegawa.
She was in her early 30s and she had this
  • 38:36 - 38:41
    dilemma. She thought maybe, maybe it's
    time to have a child but do I really want
  • 38:41 - 38:47
    to have a child because it's a lot of
    responsibility and there is already
  • 38:47 - 38:51
    overpopulation. But on the other hand if I
    don't have a child it means that all these
  • 38:51 - 38:56
painful periods would have been endured in
vain. She was also faced with another
  • 38:56 - 39:01
dilemma: she really likes to eat
endangered species, in particular
  • 39:01 - 39:07
dolphin, shark and whale. So on the
one hand she liked eating them. And on the
  • 39:07 - 39:13
    other hand she realized that she was
    contributing to their disappearance and
  • 39:13 - 39:18
to their extinction. So she found
this solution - and this is a speculative
  • 39:18 - 39:24
project - but she saw it as a solution:
maybe she could give birth to an
  • 39:24 - 39:29
endangered species. So maybe she could
become the surrogate mother of a dolphin.
  • 39:29 - 39:37
She would use her belly as a kind of incubator
for another animal. And to do this she
  • 39:37 - 39:42
consulted with an expert in synthetic
biology and an expert in obstetrics, who
  • 39:42 - 39:49
    told her that technically it's possible,
    that she would just have to resort to
  • 39:49 - 39:55
    synthetic biology to modify the placenta
    of the baby. She would not have to modify
  • 39:55 - 40:00
her own body, just the placenta
of the baby, to be sure
  • 40:00 - 40:05
that all the nutrients and the hormones
and the oxygen and everything are
  • 40:05 - 40:09
    communicated between the mother and the
    baby except the antibodies because they
  • 40:09 - 40:13
could damage the baby's body. So in her scenario,
if everything goes well, she gives
  • 40:13 - 40:21
birth to a lovely healthy baby
Maui dolphin. And she gives it the
  • 40:21 - 40:28
antibodies through milk containing
these antibodies, and she sees it grow. And
  • 40:28 - 40:33
once it's an adult it could be released
with a chip under its skin. And then
  • 40:33 - 40:38
once it's fished she can buy it and
eat it and have it again inside her own
  • 40:38 - 40:44
    body. Well, when I first heard about the
    project, I can tell you I was not
  • 40:44 - 40:48
laughing. I thought it was really
revolting. But the more I thought about
  • 40:48 - 40:55
it, the smarter I found this project,
because, sadly, it doesn't make
  • 40:55 - 40:59
    a lot of sense to give birth to yet
    another human being that is going to put
  • 40:59 - 41:05
    strain on this planet. So why not dedicate
    your reproductive system to the
  • 41:05 - 41:12
environment and use it to help save
endangered species. And I have five minutes
  • 41:12 - 41:18
    left so I have to make some really really
    drastic choices. I wanted to talk about
  • 41:18 - 41:24
Maja Smrekar, and I'm going to do it. So she
has a series of projects where she is
  • 41:24 - 41:31
exploring the co-evolution of dogs and
humans. To make it very brief: one
  • 41:31 - 41:37
    of her performances consisted in spending
    four months in a flat in Berlin with her
  • 41:37 - 41:42
two dogs. One of them is an adult called Byron
and the other one is a puppy called Ada.
  • 41:42 - 41:47
    And she tricked her body by some
    physiological training and also by eating
  • 41:47 - 41:53
certain foods and by using a breast pump,
to stimulate the hormones
  • 41:53 - 41:57
that make a woman's body lactate
even though she wasn't pregnant, so that she
  • 41:57 - 42:03
could breastfeed her puppy. And these two
projects are, I think, really
  • 42:03 - 42:09
interesting, because they show the power of
the female body. They show that what
  • 42:09 - 42:14
prevents women from having this kind of
transspecies motherhood isn't
  • 42:14 - 42:20
technology. It's just society
and the kind of limits that our culture is
  • 42:20 - 42:27
putting on what a woman can or should do
with her body. And I also like this
  • 42:27 - 42:32
project, because it kind of questions our
anthropocentrism and our tendency to think
  • 42:32 - 42:38
    that all the resources of the world are made
    for us and not for other living creatures.
  • 42:38 - 42:43
So, just as an aside: if you're
interested in estrogen and xenoestrogens,
  • 42:43 - 42:50
I would recommend that you have
a look at the talk that Mary Maggic gave last
  • 42:50 - 42:56
year at the Chaos Communication Congress,
where she was talking about estrogen and
  • 42:56 - 43:02
then linked its history
to her work as a biohacker. So now my
  • 43:02 - 43:06
    conclusion... maybe you're already wondering
    why is she talking to us about artificial
  • 43:06 - 43:13
    intelligence and transspecies motherhood?
    What does it have to do with each other? I
  • 43:13 - 43:19
would say: a lot! Because sometimes we
tend to forget that behind the digital world
  • 43:19 - 43:24
    there is a material world; that to have
    artificial intelligence
  • 43:24 - 43:29
    you need the infrastructure,
    you need the devices,
  • 43:29 - 43:35
you need the server farms, you need
physical spaces to manage all this data.
  • 43:35 - 43:41
    And I would say it's a bit like the human
brain. The human brain is, in its dimensions,
  • 43:41 - 43:47
pretty small, but apparently it eats
up a fifth of all the energy that our body
  • 43:47 - 43:54
consumes. And that means that artificial
intelligence needs a very
  • 43:54 - 44:00
heavy infrastructure, energy-hungry
server farms. And I'm sure you've seen all
  • 44:00 - 44:05
    the PR stunts and the press releases of
    Amazon, Google, Facebook, etc. promising
  • 44:05 - 44:09
you that they're transitioning, that
they're going to use green energy. But the
  • 44:09 - 44:16
energy is still not green and they're still
using a lot of fossil fuel to power
  • 44:16 - 44:22
    their server farms. And to make the devices
    that we use, we still need minerals that
  • 44:22 - 44:26
    have to be dug up from the ground,
    sometimes in really horrible conditions.
  • 44:26 - 44:31
    This is the coltan mine in the Democratic
    Republic of the Congo. The minerals we
  • 44:31 - 44:37
use, well, they are not infinite. And
unless we go and get these minerals from
  • 44:37 - 44:42
    asteroids or the deep sea - and that's not
    for tomorrow - we are going to get
  • 44:42 - 44:47
    into trouble very very fast and I don't
    have the time for this one. But trust me
  • 44:47 - 44:53
the resources are not infinite. And
refining and producing them is very
  • 44:53 - 45:00
    very damaging for the environment. Yes. So
    I have the feeling sometimes that we are a
  • 45:00 - 45:07
bit like the golfers in this image. When
I go to some - not all of them, but
  • 45:07 - 45:12
    some - tech conferences or art and tech
    conferences, I have the feeling that we
  • 45:12 - 45:17
are kind of complacent and that our
vision of the future may be a bit
  • 45:17 - 45:22
narrow-minded. And maybe that's
normal. I guess it's like, if you go to
  • 45:22 - 45:26
someone who has a fancy job in
Silicon Valley and you ask him: "What's
  • 45:26 - 45:30
    your vision of tomorrow? What's your
    vision of the future?" And then you ask
  • 45:30 - 45:34
the same question to someone else, for
example an activist for Greenpeace, then
  • 45:34 - 45:39
you get a completely different
perspective, a very different answer about
  • 45:39 - 45:47
    what the future might be. So sometimes I'm
    wondering also, if we are not too obsessed
  • 45:47 - 45:52
with technology, too obsessed with
what I would call the techno-fix. This
  • 45:52 - 45:57
    tendency we have to see a problem and to
    think, if we throw more technology on top
  • 45:57 - 46:02
    of it we are going to solve it. Even if
    the problem has been created in the first
  • 46:02 - 46:06
    place by technology. So, that's why we get
    extremely excited and I do get excited
  • 46:06 - 46:11
about the prospect that maybe one day we
will get a baby mammoth that will be
  • 46:11 - 46:17
    resurrected. And at the same time
    we don't take care of the species that are
  • 46:17 - 46:22
going extinct, you know, every single
day, every 24 hours. There are something
  • 46:22 - 46:28
like between 150 and 200 species of plants,
animals and insects that disappear and that
  • 46:28 - 46:34
    we'll never see again. Every single day
    they disappear around the world. And yet
  • 46:34 - 46:38
    we still get excited about the idea of
    resurrecting the baby mammoth, the
  • 46:38 - 46:45
passenger pigeon or the dodo, and we still
create and breed creatures so that we can exploit
  • 46:45 - 46:50
them even better. Should we be looking
forward to the lab-grown meat that we are
  • 46:50 - 46:56
promised? I mean, we are told it is going
to be cruelty-free and guilt-free, when
  • 46:56 - 47:02
    in reality it is not totally guilt-free
    and cruelty-free. And I don't think that
  • 47:02 - 47:07
it is the best solution to solve the
horrible impact that the meat industry is
  • 47:07 - 47:13
    having on the environment, on our health
    and on the well-being of animals. I mean
  • 47:13 - 47:18
there is a solution. It's not the sexy
one, it's not a techy one. It's to
  • 47:18 - 47:25
adopt a plain plant-based diet. And I
managed to do a bit of vegan propaganda
  • 47:25 - 47:34
here. Ah, I hear there are a few
vegans in the room. Should we get
  • 47:34 - 47:43
    excited, because someone in Japan has made
some tiny drones with horsehair on top
  • 47:43 - 47:48
    that are used to pollinate
    flowers, because everywhere around the
  • 47:48 - 47:54
    world the population of bees is collapsing
    and that is very bad for our food system.
  • 47:54 - 48:01
So should we, should we use
geoengineering to solve our climate troubles?
  • 48:01 - 48:06
And I'd like to end with this
slide of a
  • 48:06 - 48:12
    service which is, for me, really the
    embodiment of the techno-fix. So, you
  • 48:12 - 48:20
probably know that California has gone
through some very bad periods of dry
  • 48:20 - 48:26
    weather, so rich people wake up and they
    see that the grass on their magnificent
  • 48:26 - 48:30
    lawn is yellow instead of being green. So,
    now in California you have services where
  • 48:30 - 48:34
    you can call someone and they will just,
    you know, fix the problem by painting the
  • 48:34 - 48:39
    grass in green. So there you have it and,
    you know, fuck you Anthropocene and global
  • 48:39 - 48:46
    warming, my lawn is green!
  • 48:46 - 48:53
    applause
  • 48:53 - 48:57
Okay, so this is why I wanted to
bring all these really contrasting visions
  • 48:57 - 49:02
    together because we might have different
    visions of the future but at some point
  • 49:02 - 49:09
they will have to enter into dialogue, because
we have only one humanity and one planet. And I'm
  • 49:09 - 49:16
very bad at conclusions, so I
wrote this one down. So the artists whose work
  • 49:16 - 49:23
made 2018 a truly exciting year
for me are not the artists who showcase the
  • 49:23 - 49:28
    magic and the wonders of science and
    technology. They are the artists who bring
  • 49:28 - 49:34
to the table realistic, complex and
sometimes also difficult conversations
  • 49:34 - 49:39
about the future, whether it is the future
of technology, the future of humans, or the
  • 49:39 - 49:44
    future of other forms of life and other
    forms of intelligence. They are the
  • 49:44 - 49:48
    artists who try and establish a dialogue
    between the organic, the digital and the
  • 49:48 - 49:54
    mineral because they're all part of our
    world and we cannot go on pretending that
  • 49:54 - 49:58
    they don't affect each other. Thank you so
    much.
  • 49:58 - 50:15
    Applause
  • 50:15 - 50:20
Herald: Thank you Régine for the very
interesting, at times a bit confusing talk
  • 50:20 - 50:30
- I'm still thinking about the
dolphin part, but anyway. By the way,
  • 50:30 - 50:34
this grass painting thing, maybe
it's something that I can apply to
  • 50:34 - 50:40
    my house. Okay. We have questions at
    microphone 2, I guess. So let's start
  • 50:40 - 50:43
    there.
    Question: Hi. I have a question on a
  • 50:43 - 50:49
    particular part at the beginning you
    talked about AI in arts and you mentioned
  • 50:49 - 50:57
that there are now AI programs that draw
pictures or write texts, but have you heard
  • 50:57 - 51:05
    about AI developing ideas for say an art
    installation?
  • 51:05 - 51:12
Régine: Yes, as a matter of fact. If
you look at the programme,
  • 51:12 - 51:18
    maybe tonight or tomorrow
    night there is a talk by Maria and
  • 51:18 - 51:24
Nicolas. They are two artists and I think
that the title might be Disnovation
  • 51:24 - 51:28
    and I think you might like what
    they present. I don't know what they're
  • 51:28 - 51:33
    going to present but what I know is that
one of their works... I forget the name... if
they're in the room that would be
    they're in the room that would be
    fantastic. But they had a project where
  • 51:39 - 51:45
they have a bot. It's on Twitter and it's
going through some blogs and newspapers
  • 51:45 - 51:52
    about creativity and also about
    technology and using all these data to
  • 51:52 - 52:00
generate really crazy, stupid titles for
installations. And then the artists
  • 52:00 - 52:07
challenge other artists to take these
tweets, which are really crazy, and make an
  • 52:07 - 52:14
installation out of them. So that's
kind of a tricky way of AI being used
  • 52:14 - 52:18
for installations, I'm sure.
Right now I cannot think of
  • 52:18 - 52:24
anything else, but if you
want you can write to me and when my brain
  • 52:24 - 52:34
is switched back on, I'll
probably have other ideas.
  • 52:34 - 52:43
Herald: OK. Any more questions? I don't
see any... Ah, over there.
  • 52:43 - 52:49
    Microphone 4 please.
    Question: Yeah. I was wondering if, well.
  • 52:49 - 52:54
it's probably fairly certain that we're
developing towards a more, let's say, posthuman race,
  • 52:54 - 53:01
    because we simply have to, due to climate
    change. There are also developments right
  • 53:01 - 53:08
    now that mean that, relatively short term, we
    would go to Mars. So in a sort of sense, do
  • 53:08 - 53:16
    we need to fit the human race for
    possibly multiple planets, with modern
  • 53:16 - 53:22
    human modification?
    Régine: Okay, I didn't understand the
  • 53:22 - 53:27
    question.
    Herald: So, please repeat.
  • 53:27 - 53:35
Question: So in general we're heading towards
a human race that definitely...
  • 53:35 - 53:43
    we're not able to survive on this planet
    for that long anymore. Really optimistic.
  • 53:43 - 53:53
I am vegan, so yay. And we have
some new developments going on that mean we'll
  • 53:53 - 53:58
    be able to go to Mars relatively soon.
    Régine: I don't believe it. Who is going
  • 53:58 - 54:02
    to want to go to that planet like
    seriously, like it's going to be like
  • 54:02 - 54:08
Australia: we are going to send prisoners
and criminals there. Who wants that?
  • 54:08 - 54:16
Like, come on. Yeah. Anyway, now I
see what you mean. I think I'm kind of
  • 54:16 - 54:21
    more optimistic about the future
    than you are. I think we can still survive
  • 54:21 - 54:27
    on this planet even if we get very
    numerous. We just have to find another
  • 54:27 - 54:30
    way to consume and take care of each
other. But maybe this is my feminine side
  • 54:30 - 54:38
    talking.
    Question: Maybe in general without a lot
  • 54:38 - 54:45
of modification to human
beings it's simply not possible.
  • 54:45 - 54:53
Certainly. I think that's common ground.
And yeah, I sort of wish we didn't need a
  • 54:53 - 54:59
    planet B but I think we do.
Régine: Well, I hope I'll be dead when that
  • 54:59 - 55:06
    comes. That's my philosophy sometimes.
    Okay.
  • 55:06 - 55:14
    Herald: I don't see any more questions so
    let's thank the speaker again. Thank you.
  • 55:14 - 55:18
    Applause
  • 55:18 - 55:32
    35C3 postroll music
  • 55:32 - 55:41
    Subtitles created by c3subtitles.de
    in the year 2020. Join, and help us!