
Dilemas Éticos y Políticos de la Computación en la Nube (Ethical and Political Dilemmas of Cloud Computing) - Dr. Javier Bustamante Donas

  • 0:07 - 0:11
    Thank you very much. I’m pleased to be here with you.
  • 0:12 - 0:17
    We have until 2:15, approximately.
  • 0:17 - 0:22
    Later at 3, I’m giving a Chinese Thinking class at the Faculty of Philosophy.
  • 0:22 - 0:25
    A “ni hao” thing.
  • 0:29 - 0:36
    At the Faculty of Philosophy, I have the pleasure of teaching a class on Science, Technology and Society,
  • 0:36 - 0:39
    which is very close to the philosophy of technology.
  • 0:39 - 0:44
    Also another on Political Philosophy, where I introduce plenty of elements of participatory democracy,
  • 0:44 - 0:51
    of how the Internet transforms the boundaries of the political and ethical spheres.
  • 0:51 - 1:00
    And a wonderful class on Chinese Thought where I have the luck of discussing the fundamental paradigms
  • 1:00 - 1:08
    of ancient Chinese thought in the time of the Warring States, Chun, Zhou, and it’s a time
  • 1:08 - 1:16
    in which the thought of Lao-Tse emerges, Confucius’ as well, and other thinkers
  • 1:16 - 1:19
    like Mozi, Gongsun Long, Han Feizi, etc.
  • 1:19 - 1:24
    All the basics of Chinese thought come
    from that time, and it’s quite interesting
  • 1:24 - 1:29
    because, normally, the nexus between politics
    and technology is forgotten.
  • 1:29 - 1:33
    And, nowadays, we’re seeing how technology is totally extending
  • 1:33 - 1:36
    the range of possibilities of the political ambit.
  • 1:37 - 1:40
    In fact, it’s usually said that
    it’s the ideas that shape the world.
  • 1:40 - 1:47
    Well, today we can see that it’s
    the artifacts, the devices, and
  • 1:48 - 1:52
    the unofficial uses that people give them
    that really constitute
  • 1:52 - 1:56
    a very powerful engine of social change.
  • 1:57 - 2:02
    Note that even the very political ambit
    is built through technology.
  • 2:04 - 2:05
    Do we have any Italians here or not?
  • 2:06 - 2:11
    Anyone from Italy with an Erasmus
    scholarship or so?
  • 2:11 - 2:12
    No one in here?
  • 2:13 - 2:18
    Who can, please, tell me
    when, before the Italian unification
  • 2:19 - 2:25
    of the 19th century…? People travelled
    on horseback between cities.
  • 2:26 - 2:33
    The Italians felt Venetian, Milanese,
    Turinese, Bolognese, Roman, Sicilian…
  • 2:34 - 2:35
    But there was no consciousness of Italy.
  • 2:36 - 2:40
    What is it that, suddenly,
    makes the thought of being Italian appear,
  • 2:40 - 2:43
    the consciousness of Italian identity?
    Who’d like to tell me?
  • 2:44 - 2:47
    Silence in the crowd.
  • 2:47 - 2:48
    Anyone?
  • 2:49 - 2:51
    The invention of the train.
  • 2:52 - 2:56
    When the first railway tracks appear in Italy
    and when the distances
  • 2:56 - 2:58
    that had to be covered in a matter of days before
    can be covered in hours,
  • 2:59 - 3:02
    people realize the transformation
    of the physical space.
  • 3:02 - 3:05
    How directly the political identity,
  • 3:05 - 3:07
    the identity that one feels in oneself,
  • 3:08 - 3:09
    is transformed through technology.
  • 3:10 - 3:11
    Aristotle said so.
  • 3:11 - 3:16
    Aristotle, who’s one of the founders
    of our democratic thinking,
  • 3:16 - 3:21
    although people who speak of this don’t usually
    know that the Athenian democracy
  • 3:21 - 3:26
    only extended to ten
    or fifteen thousand people.
  • 3:26 - 3:30
    Only men, who did not rely on a salary
  • 3:31 - 3:35
    but on their income, and were free to
    practice the “dolce far niente”,
  • 3:35 - 3:40
    the bios theoretikos, the theoretical life,
    poiesis, poetry,
  • 3:40 - 3:44
    philosophy, general thinking, etc.
  • 3:44 - 3:48
    So, Aristotle said that democracy
    can’t go beyond
  • 3:48 - 3:55
    the walls of the polis,
    because it can’t reach further
  • 3:55 - 3:59
    than the orators’ voices.
    When there’s no dialogue,
  • 3:59 - 4:01
    there’s no possible democracy,
    and he was right, so
  • 4:01 - 4:05
    the walls of the polis were
    physical until now and
  • 4:05 - 4:09
    no more than one city could be
    democratic in those terms.
  • 4:09 - 4:12
    I mean, there couldn’t be
    a participatory democracy
  • 4:12 - 4:15
    if there was no possibility of dialogue.
  • 4:16 - 4:20
    What do the Internet and the Information and
    Communication Technology allow in general?
  • 4:21 - 4:22
    That everyone speaks and listens.
  • 4:23 - 4:27
    Those walls are directly brought down.
    When there’s a student protest
  • 4:27 - 4:35
    in Iran or when there’s a strike and
    a later protest by Buddhist monks
  • 4:35 - 4:40
    in Myanmar, there’s always someone recording
    with a phone which is streaming live
  • 4:40 - 4:44
    those events to a global audience.
  • 4:44 - 4:49
    So, what about it? People who listen,
    see, observe, take part
  • 4:49 - 4:51
    in that information become protagonists.
  • 4:52 - 4:54
    Things start happening live.
  • 4:54 - 5:00
    And it almost sounds like a joke that there’s
    a Ministry of Foreign Affairs.
  • 5:01 - 5:05
    What are foreign affairs?
    Every single thing out there
  • 5:05 - 5:09
    directly affects us, we’re now
    living in a time
  • 5:09 - 5:13
    of an election that is fundamental for
    Spain, which is the election
  • 5:13 - 5:16
    for president in the United States.
    I think that we should all vote,
  • 5:17 - 5:21
    because, deep down, it’s going to be someone
    ruling over our heads,
  • 5:21 - 5:25
    to put it like that, right?
    Last night, a very interesting debate took place
  • 5:25 - 5:31
    between Mrs. Clinton and Mr. Trump,
    if the latter can be called a mister,
  • 5:31 - 5:37
    and it was viewed live,
    especially for those who wanted to get up
  • 5:37 - 5:39
    at 2:30 AM to follow it, right?
  • 5:40 - 5:43
    Anyway, inside this matter
    I would like to talk to you about
  • 5:43 - 5:47
    the ethical and political dilemmas brought
    by one of the most interesting models
  • 5:47 - 5:50
    of computing that exists today,
    which is the model of cloud computing.
  • 5:50 - 5:55
    I forgot to say that I had
    the luck of doing
  • 5:55 - 5:58
    a PhD in Philosophy, in Education Sciences,
    at the Complutense.
  • 5:58 - 6:01
    I also hold a master’s degree in Computer Science
    from the Universidad Pontificia de Salamanca.
  • 6:02 - 6:06
    I did a Science, Technology and Society PhD
    at the University of Rensselaer,
  • 6:06 - 6:08
    Rensselaer Polytechnic Institute, New York,
  • 6:08 - 6:13
    where they have the best department
    of science, technology and society in the world,
  • 6:13 - 6:17
    headed by a tremendously interesting
    person called Langdon Winner,
  • 6:17 - 6:20
    who was my thesis director and has
    become my best friend.
  • 6:20 - 6:26
    And, apart from being an exceptional
    person in many fields, he’s a person
  • 6:26 - 6:32
    whom the Senate and the Congress, in
    a joint hearing, invited to help define
  • 6:32 - 6:36
    the American legislation on the social
    control of nanotechnology.
  • 6:36 - 6:39
    Therefore, I come from a field
    where differences between science and arts
  • 6:40 - 6:43
    don’t exist at all, where you can’t say
    one is on the technical side
  • 6:44 - 6:47
    or on a humanistic side,
    and where those two sides necessarily
  • 6:47 - 6:51
    have to have a meeting point,
    or else we’ll hardly get to do anything interesting.
  • 6:51 - 6:54
    But well, since the subject of this conference
    is very concrete and peculiar,
  • 6:54 - 7:00
    I’d like to stick to it, and as I said,
    I’d like to talk about the ethical and political
  • 7:00 - 7:02
    dilemmas in the cloud computing,
  • 7:02 - 7:06
    along with the big data, one of the most
    powerful paradigms of today.
  • 7:06 - 7:10
    And let’s do something else too:
    if anyone at any time
  • 7:10 - 7:13
    would like to raise a hand, I won’t
    be bothered at all, all the contrary.
  • 7:13 - 7:16
    The more interactive this is, the better.
  • 7:16 - 7:21
    I’ll try to go fast to have
    some time later for you
  • 7:21 - 7:25
    to show me your concerns, or your protests,
    or your divergent opinions,
  • 7:26 - 7:28
    for there’s nothing more
    interesting than that, alright?
  • 7:28 - 7:29
    Then let’s begin.
  • 7:30 - 7:36
    What I’m about to discuss comes from
    a research project by MINECO,
  • 7:37 - 7:39
    titled “Science, Technology and Society.
  • 7:39 - 7:45
    Ethical and Political Dilemmas in the Model of
    Cloud Computing as a New Sociotechnical Paradigm“.
  • 7:45 - 7:47
    And the reference to this
    innovation project is here.
  • 7:48 - 7:50
    This is important to mention,
  • 7:51 - 7:53
    since at the end of the three
    and a half years, I’m asked, as director
  • 7:53 - 7:56
    of this research project,
    for a report of everything I do,
  • 7:56 - 7:59
    so if I don’t mention this,
    it’s all for nothing, isn’t it?
  • 7:59 - 8:03
    But the important thing is that, at least,
    there was funding to have
  • 8:03 - 8:06
    an interdisciplinary team that
    can handle these matters.
  • 8:06 - 8:09
    If we start from a definition of cloud computing,
  • 8:09 - 8:15
    there’s one created by the National Institute of
    Standards and Technology of the United States government
  • 8:15 - 8:18
    which is really simple:
    “Cloud computing is a model
  • 8:18 - 8:34
    that enables a whole set of computing
    resources such as networks, servers,
  • 8:34 - 8:40
    applications, storage, services, etc.,
    to be available directly
  • 8:41 - 8:47
    on demand to users, and to be
    provisioned and released
  • 8:47 - 8:53
    with minimal effort and without
    significant interaction between
  • 8:54 - 8:57
    the provider and the user themselves”.
  • 8:57 - 9:00
    It has five specific features.
  • 9:00 - 9:09
    The first one is that the services will
    be provided transparently
  • 9:09 - 9:12
    to the user, without requiring
    any kind of direct interaction.
  • 9:13 - 9:19
    Then, it’s very important that
    the computers or devices
  • 9:19 - 9:24
    from which these applications can be launched
    are what we call “thin clients",
  • 9:24 - 9:29
    I mean, computers with very little power
    because the processing ability is not
  • 9:29 - 9:31
    on the user’s side, but on the cloud’s.
  • 9:31 - 9:36
    So tablets, smartphones, netbooks,
    etc., can be used.
  • 9:36 - 9:45
    After that, there will be a set of resources
    which will always be available
  • 9:45 - 9:49
    dynamically for the users,
    in terms of memory, storage,
  • 9:49 - 9:52
    bandwidth, virtual machines, etc.
  • 9:53 - 10:02
    Also, there will be a very notable
    elasticity since every single user
  • 10:02 - 10:06
    will experience the resources in use
    as if they were only available
  • 10:06 - 10:11
    for him or her, that gives a feeling
    of unlimited resources
  • 10:11 - 10:14
    and a system scalability:
    the more users that are in,
  • 10:14 - 10:17
    if the system is well designed,
    each one will probably have
  • 10:17 - 10:25
    a unique experience, as if they were the only
    user using those resources.
  • 10:26 - 10:30
    And then, there’s the possibility of a clear
    monitoring of the resources,
  • 10:31 - 10:35
    which is particularly important,
    for both users and providers.
  • 10:36 - 10:42
    There are three types of service models;
    one of them is applications as a service,
  • 10:42 - 10:48
    another one is platforms. In the applications
    model we find what we usually know,
  • 10:48 - 10:54
    services such as Gmail. By the way, here’s
    a small aside, and it’s the fact that,
  • 10:54 - 10:58
    as we’ll later see, there’s a series of
    issues with privacy and with
  • 10:58 - 11:02
    the integrity of emails
    through a service like Gmail.
  • 11:03 - 11:06
    And it’s stunning to me
    that the Complutense University
  • 11:06 - 11:09
    has outsourced all
    of its email network.
  • 11:09 - 11:15
    Which means that all the information that goes
    among researchers in our university,
  • 11:15 - 11:19
    including matters of potential
    patents, is being monitored
  • 11:19 - 11:23
    by a private enterprise with strong
    ties to the American government.
  • 11:24 - 11:26
    It’s not an opinion,
    it’s a matter of fact.
  • 11:27 - 11:33
    And then, platforms as a service
    offer tools and
  • 11:33 - 11:40
    a programming environment that allow
    customization of the applications.
  • 11:40 - 11:46
    Microsoft Windows Azure and Google App Engine
    are two examples where control
  • 11:46 - 11:50
    extends to the application level but not
    to the physical infrastructure.
  • 11:52 - 11:57
    Then, we have a last version
    of the models of cloud computing
  • 11:57 - 11:59
    which is infrastructure as a service.
  • 12:00 - 12:04
    Here the final users do have
    access to the processing,
  • 12:04 - 12:10
    the network resources, etc... and
    they can configure said resources
  • 12:10 - 12:15
    inside their operating systems
    and use them as they please.
  • 12:16 - 12:22
    Here you have examples such as Rackspace,
    Amazon Elastic Compute Cloud, etc.
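
    As an illustration of what “infrastructure as a service” means in practice, here is a minimal sketch using Amazon EC2 through the boto3 Python library (the AMI ID, instance type and region are placeholders, and valid AWS credentials are assumed): the user programmatically requests and configures raw compute capacity, while the physical servers remain on the provider’s side.

        # Minimal illustrative sketch of infrastructure as a service (IaaS):
        # requesting a virtual machine from Amazon EC2 with the boto3 library.
        # The AMI ID and instance type are placeholders; valid AWS credentials
        # are assumed to be configured in the environment.

        import boto3

        ec2 = boto3.client("ec2", region_name="eu-west-1")

        response = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",  # placeholder machine image
            InstanceType="t2.micro",          # a small, "thin" instance
            MinCount=1,
            MaxCount=1,
        )

        instance_id = response["Instances"][0]["InstanceId"]
        print("Launched instance:", instance_id)
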
  • 12:22 - 12:30
    When we talk about cloud computing,
    I’d like to emphasize that it’s not just
  • 12:30 - 12:34
    a technical development, but it involves
    a whole huge change of mentality
  • 12:34 - 12:39
    in the way of living and understanding
    how we process our information.
  • 12:39 - 12:44
    Of course, this has many advantages,
    such as that one can focus more
  • 12:45 - 12:50
    on the creative aspects, not on
    the infrastructure for capturing,
  • 12:50 - 12:53
    processing, or using information,
    disseminating it, etc.
  • 12:53 - 12:59
    But it also presents serious ethical and political
    dilemmas which must be taken into account.
  • 13:00 - 13:04
    A lot of them, in my opinion,
    don’t have so much to do with the contexts of use,
  • 13:05 - 13:08
    with the way they are used, but with
    details belonging to the design
  • 13:09 - 13:12
    of cloud computing. And the key
    question here would be if
  • 13:12 - 13:18
    cloud computing can be understood as
    an inherently political technology.
  • 13:18 - 13:22
    That's a very, very interesting
    matter we can see now,
  • 13:22 - 13:26
    because the history of computing
    shows how technical decisions
  • 13:26 - 13:33
    always or often have some
    important social consequences.
  • 13:33 - 13:36
    I mean, the sphere of technology
    is not opaque at all,
  • 13:37 - 13:40
    in general it’s not opaque to daily life,
  • 13:40 - 13:47
    but innovation is much more than
    a resource that merely optimizes efficiency.
  • 13:47 - 13:54
    We have so many examples of it...
    Believing in the neutrality of technical decisions
  • 13:54 - 14:00
    or thinking that nowadays there are aspects
    of human life that remain isolated
  • 14:00 - 14:03
    from technical development is quite hard.
  • 14:03 - 14:07
    Could anyone in here tell me any aspect of
    our reality, of this activity,
  • 14:07 - 14:13
    that technology, as an instrument or as a metaphor,
    hasn’t altered substantially
  • 14:13 - 14:16
    or isn’t affecting? Can anyone
    think of any such sphere?
  • 14:17 - 14:21
    There are, but not many, though.
    I can't think of any right now,
  • 14:21 - 14:27
    but, can you come up with any aspect where
    technology doesn't play an important role?
  • 14:31 - 14:32
    Yes, please.
  • 14:32 - 14:43
    (PUPIL) The ambit of the countryside,
    agriculture, livestock...
  • 14:44 - 14:51
    Look, the most trolled enterprises in
    the whole world of technology,
  • 14:52 - 14:56
    enterprises like Monsanto, are
    precisely related to agricultural production,
  • 14:57 - 15:05
    to the creation of fertilizers, etc...
    which have automatically transformed
  • 15:05 - 15:07
    the whole agricultural world.
  • 15:07 - 15:12
    Nowadays there’s even a huge controversy
    with those enterprises that are producing
  • 15:12 - 15:18
    what they call ‘killer seeds’, I mean,
    seeds that can be sown by the farmer
  • 15:18 - 15:25
    but are no longer fertile a year later,
    therefore they’re no use for a new harvest,
  • 15:25 - 15:29
    so the farmer depends all the time
    on the enterprise
  • 15:29 - 15:33
    to keep sowing his fields.
  • 15:33 - 15:39
    In fact, there’s a label that serves as
    a mantra in the agricultural sphere,
  • 15:39 - 15:44
    which is the Green Revolution.
    It involves the use
  • 15:45 - 15:53
    and general promotion of transgenic
    organisms to enhance productivity
  • 15:53 - 15:58
    in the name of eradicating starvation in the world.
    The only problem is that, applying
  • 15:58 - 16:03
    a basic precautionary principle, we still don’t
    have much experience on what the consequences
  • 16:03 - 16:08
    of the technological transformation
    of agriculture will be.
  • 16:09 - 16:13
    I can only think of one that affects
    a huge part of the people here
  • 16:13 - 16:18
    and it’s about, at the end of the 20th century,
    Western men having 90% less
  • 16:18 - 16:22
    sperm than Western men from
    the beginning of the 20th century.
  • 16:22 - 16:29
    I mean, the ones from the end of the 20th century have 90% less
    than those from the end of the 19th and beginning of the 20th,
  • 16:29 - 16:31
    because of the diet,
    among other things.
  • 16:31 - 16:35
    If you can tell me something natural
    that you can buy at a supermarket
  • 16:35 - 16:38
    I would encourage your teachers to add
    one more point to all of your subjects.
  • 16:39 - 16:42
    Go buy at a supermarket and
    tell me what’s natural there.
  • 16:42 - 16:46
    (PUPIL) Unless it’s an organic supermarket...
  • 16:47 - 16:50
    Okay… I can say something about wheat.
  • 16:51 - 16:56
    The wheat we eat nowadays is already
    completely different from the wheat
  • 16:57 - 17:00
    that ceased to exist in the 40s and 50s,
  • 17:00 - 17:03
    because taller ears of wheat were bred
    for the combine harvesters
  • 17:03 - 17:10
    to reap them properly and from that point,
    96 or 97% of all the wheat in the world
  • 17:10 - 17:16
    is already transgenic, it’s already transformed
    from the original varieties.
  • 17:16 - 17:19
    Yes, anything else, please?
  • 17:20 - 17:23
    (PUPIL) Philosophy.
  • 17:23 - 17:25
    Wow, well, philosophy…
  • 17:27 - 17:35
    You’ve actually said something quite beautiful,
    and philosophy in many cases
  • 17:35 - 17:37
    is still unaware of the subject of technology.
  • 17:38 - 17:41
    When they say “Man,
    the essence of man, man”.
  • 17:42 - 17:45
    Can anyone tell me what
    man is, outside of science?
  • 17:45 - 17:48
    If you know what a human being is...
  • 17:48 - 17:52
    What’s a human being when at the Beijing
    Olympics they started to
  • 17:52 - 17:58
    introduce a committee for
    genetic doping to detect
  • 17:58 - 18:04
    the inclusion of monkey genes
    so the high-jumpers had
  • 18:04 - 18:06
    more explosiveness in the jump?
  • 18:06 - 18:12
    Or dolphin genes so the skin of the
    swimmers was more hydrodynamic?
  • 18:12 - 18:21
    When the result of an athletic
    event is totally transformed,
  • 18:21 - 18:25
    or endurance is enhanced through
    substances like erythropoietin
  • 18:25 - 18:30
    or tetrahydrogestrinone... I mean,
    doping is so highly developed,
  • 18:30 - 18:33
    the human being can be transformed
    in such an incredible way
  • 18:33 - 18:36
    and also when we see the difference
  • 18:36 - 18:40
    between a “Pan troglodytes” chimpanzee
    and a human being...
  • 18:40 - 18:45
    Who’d like to tell me what it is?
    It’s minimal: 46 chromosomes for us,
  • 18:46 - 18:47
    48 chromosomes for them.
  • 18:47 - 18:51
    Take two acrocentric chromosomes,
    turn them into a metacentric one,
  • 18:51 - 18:56
    and we already have patterns for
    possible transformation, artificial evolution,
  • 18:56 - 18:58
    from chimpanzee to human
    being or vice versa.
  • 18:59 - 19:02
    I mean, the essence of the human being
    is completely touched
  • 19:02 - 19:07
    and defined by science,
    and philosophy is totally unaware of it.
  • 19:07 - 19:10
    A philosophy
    that doesn’t talk with science
  • 19:10 - 19:15
    is just as blind as any computer science
    that’s not in touch with the outside world.
  • 19:15 - 19:21
    Therefore, every concept of free
    software, free knowledge, etc...
  • 19:21 - 19:25
    has so much strength, because, until this moment,
    we’ve conceived science
  • 19:25 - 19:29
    as a centralized knowledge and not peripheral,
    thinking that specialists are
  • 19:30 - 19:32
    in the center of the system.
    However, nowadays,
  • 19:32 - 19:35
    we begin to see that so-called
    “hidden innovation”, I mean,
  • 19:35 - 19:37
    the innovation that is not
    brought up in universities,
  • 19:37 - 19:42
    research institutes, etc.,
    has a key role today.
  • 19:42 - 19:49
    How the so-called "power users", or
    "early adopters", people who adopt
  • 19:49 - 19:53
    a technology early
    and give new uses to it,
  • 19:53 - 19:58
    often produce new development patterns
    that seemed totally unthinkable until now.
  • 19:58 - 20:01
    And the problem with free
    software and knowledge,
  • 20:01 - 20:05
    the problem they pose to the system lies in the metaphor
  • 20:05 - 20:09
    they entail because, somehow,
    if one gets used to sharing
  • 20:09 - 20:16
    we’re already talking about
    a whole new mental model
  • 20:16 - 20:22
    where synergy opposes competition.
    Where everything not given is a loss,
  • 20:22 - 20:26
    and where wealth doesn’t have anything to do
    anymore with gold, with something you
  • 20:26 - 20:30
    keep under your bed, but it has more
    to do with lettuces
  • 20:30 - 20:32
    and tomatoes, I mean,
    perishable products
  • 20:32 - 20:37
    which at some point no longer have value
    if not watered or kept
  • 20:37 - 20:42
    or taken care of, etc... So,
    there’s also another interesting paradigm.
  • 20:42 - 20:45
    Our economy was based on
    the law of supply and demand
  • 20:46 - 20:51
    through which the scarcest things
    have more value because few people
  • 20:51 - 20:56
    have access to them. So, if anyone owns
    a Rolls-Royce Silver Ghost
  • 20:56 - 21:02
    from the 30s, it’s very valuable, not because
    it’s a great car, but because it’s a very rare car,
  • 21:02 - 21:08
    there are very few models in the world.
    The fewer there are, the more valuable they are.
  • 21:08 - 21:13
    There’s a movie, “The Collector“, where
    a philatelist who collects rare stamps
  • 21:13 - 21:18
    finds out he has a very rare
    one and finds another person
  • 21:18 - 21:21
    who has another copy of the same stamp.
  • 21:21 - 21:25
    What does he do when he has both stamps in his hand,
    when he gets to buy the second stamp?
  • 21:25 - 21:30
    He tears one of them apart, because one unique stamp is
    more valuable than the sum of two rare stamps.
  • 21:30 - 21:34
    I mean, the fewer people have access
    to something, the more valuable it is.
  • 21:35 - 21:39
    However, if we talk about emails
    or operating systems...
  • 21:39 - 21:42
    If someone has an operating
    system that nobody uses
  • 21:42 - 21:44
    or that just one person uses,
    is it very valuable or not?
  • 21:45 - 21:48
    If someone has an email account and is the only one
    with access to the Internet,
  • 21:48 - 21:50
    is it very valuable or not?
    Not valuable at all.
  • 21:50 - 21:53
    When does it start being valuable?
    The moment more people have it.
  • 21:53 - 21:56
    When is it very valuable?
    When everyone has it.
  • 21:56 - 22:00
    So, when we talk about
    an economy of the information,
  • 22:00 - 22:03
    of cloud computing, when we
    work on the cloud and when
  • 22:03 - 22:08
    we work in a network, the total value of the network
    will rise with the square of the number of users,
  • 22:08 - 22:13
    in other words, the total value of the network
    will increase as a quadratic function.
  • 22:13 - 22:19
    While in the physical world, the fewer
    people that have access to something,
  • 22:19 - 22:21
    the more valuable it is.
    Alright?
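
    A worked sketch of the quadratic claim above, assuming every one of the n users can connect to every other user: the number of possible pairwise connections is n(n-1)/2, so the total value V of the network grows roughly as the square of n (the idea popularly known as Metcalfe’s law):

        V(n) \propto \frac{n(n-1)}{2} \approx \frac{n^{2}}{2}

    Doubling the number of users thus roughly quadruples the total value of the network, whereas a scarce physical good gains value precisely because fewer people hold it.
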
  • 22:22 - 22:32
    Technology can embody models of life,
    ideological forms
  • 22:32 - 22:37
    and also ways of dissolving power or of resolving
    controversies regarding power.
  • 22:37 - 22:41
    My teacher, Langdon Winner,
    has a beautiful article
  • 22:41 - 22:43
    titled
    “Do artifacts have politics?”
  • 22:43 - 22:47
    in which he presents many cases,
    and one of them is Robert Moses’.
  • 22:47 - 22:54
    Robert Moses was the great architect
    of New York who in the 30s and 40s
  • 22:54 - 22:58
    worked on the public works
    that gave this city the role
  • 22:58 - 23:01
    of business metropolis it has kept until now.
  • 23:02 - 23:06
    And among his most acclaimed works
    are over 200 overpasses
  • 23:06 - 23:15
    which lead to Jones Beach Park,
    in Long Island,
  • 23:15 - 23:19
    which is like a green lung for the
    city, like Casa de Campo
  • 23:19 - 23:23
    in that time. He built them in the 30s,
    they’re overpasses that cross
  • 23:23 - 23:26
    the big highways such as Palisades Parkway
  • 23:26 - 23:30
    that lead to Long Island.
    So under those overpasses
  • 23:30 - 23:37
    there’s a height of 2.5 meters that
    contrasts with the huge width
  • 23:37 - 23:41
    of the multiple lane highways
    in the United States.
  • 23:41 - 23:45
    Then, when they tried to understand why
  • 23:45 - 23:50
    this man had built overpasses of
    just 2.5 meters, they started
  • 23:50 - 23:52
    with the esthetic and architectural guidelines.
  • 23:52 - 23:57
    The evolution of European art nouveau,
    of matter qualities,
  • 23:57 - 24:02
    the volumetric dialogue…
    And Langdon Winner says in this article:
  • 24:02 - 24:06
    “None of that is true, it needs
    to be seen in a social environment”.
  • 24:06 - 24:10
    In that time, Robert Moses was
    quite racist and snobbish,
  • 24:10 - 24:15
    but, in any case, he couldn’t hang
    a sign in the parks of New York
  • 24:15 - 24:19
    that said “Black and Hispanic people
    are not welcome in these parks".
  • 24:19 - 24:23
    However, when overpasses with a height
    of 2.5 meters are built,
  • 24:23 - 24:28
    what can pass below them?
    Automobiles, right?
  • 24:28 - 24:29
    But, what cannot pass?
  • 24:30 - 24:31
    (PUPIL) Buses.
  • 24:32 - 24:36
    Then if you technically prevent buses from
    crossing under those overpasses,
  • 24:36 - 24:39
    the access to those parks
    is radically limited.
  • 24:39 - 24:45
    So technology was acting as
    a way of social discrimination
  • 24:45 - 24:49
    so that only Black and Hispanic people
    with their own cars were
  • 24:49 - 24:52
    able to access those parks.
    Everyone knows that Black people
  • 24:52 - 24:57
    who had their own car in the 30s,
    who have a lot of money
  • 24:57 - 25:01
    and are very famous, whiten before
    our eyes in a metaphysical way.
  • 25:01 - 25:05
    Just like the transformation that Michael Jackson
    went through but in a natural way.
  • 25:05 - 25:14
    If we continue, we’ll see that until now
    we’ve gone from centralized models,
  • 25:14 - 25:22
    from the famous mainframes,
    the IBM S/360 and S/370 from the 70s,
  • 25:22 - 25:31
    to distributed computing, citizen web
    2.0, web 3.0, the Internet of things, etc...
  • 25:31 - 25:35
    We’ve come from models where
    intelligence lay in the center
  • 25:35 - 25:40
    to a distributed intelligence.
    From time to time,
  • 25:40 - 25:44
    a new sociotechnical paradigm arises.
    There are tablets, smartphones,
  • 25:45 - 25:52
    phablets now, virtual reality,
    and I hope that by the end of this year
  • 25:52 - 25:58
    we already have the first models of
    Oculus Rift and PlayStation 4 goggles
  • 25:58 - 26:01
    out on sale.
    The big question is whether all of this
  • 26:01 - 26:07
    has anything to do with
    a wider cultural context.
  • 26:07 - 26:12
    There are official uses, but under that,
    non-official uses given by
  • 26:12 - 26:18
    the users themselves, and this is what I was
    previously calling “hidden innovation”.
  • 26:18 - 26:24
    If we talk about the dilemmas
    that cloud computing brings along,
  • 26:24 - 26:29
    there are three fundamental risks that
    should be noted, and a fourth one shown there.
  • 26:29 - 26:35
    First off, we have the privacy issues,
  • 26:35 - 26:39
    with the loss of control over access
    to the information that we
  • 26:39 - 26:42
    store on the cloud.
    Think about the messages that go through Gmail,
  • 26:42 - 26:47
    about the photos on
    Picasa or Instagram, etc.
  • 26:48 - 26:52
    Intellectual property issues, due to
  • 26:52 - 26:56
    delocalization of servers
    and multiple legislations.
  • 26:56 - 27:00
    Cloud computing is ubiquitous,
    however, servers are
  • 27:00 - 27:04
    in certain countries and it’s the law
    of those countries that they must follow.
  • 27:04 - 27:11
    Therefore, if a server is in China,
    in the United States, in a data haven
  • 27:11 - 27:17
    or in the European Union, it will obey
    very different legislations about protection
  • 27:17 - 27:22
    of personal data.
    And then there are problems
  • 27:23 - 27:27
    related to the inherent nature
    of cloud computing.
  • 27:27 - 27:32
    Technical and political decisions
    on CC implementation
  • 27:33 - 27:36
    are, as we’ll see, interdependent.
  • 27:36 - 27:40
    There’s a very interesting concept from
    de Bruin and Floridi, which is the concept of
  • 27:40 - 27:52
    interlucency, of transparency in some way.
    Floridi defends that what is fundamental
  • 27:52 - 27:57
    for an ethical use of cloud computing
    is that there exist
  • 27:57 - 28:01
    an epistemic virtue that creates
    the shared knowledge
  • 28:01 - 28:04
    necessary for the user
    to know what he or she is agreeing to.
  • 28:05 - 28:10
    I mean, there’s a duty from
    the provider’s side to inform
  • 28:10 - 28:15
    of the conditions of use of CC
  • 28:15 - 28:19
    and of the pros and cons for
    the users, in a way that
  • 28:20 - 28:23
    the user may decide what
    he/she really wants to do.
  • 28:23 - 28:27
    Truth is I don’t really
    share this point of view for a reason,
  • 28:27 - 28:32
    because we’re often seeing that the terms
    of agreement from the computer programs
  • 28:32 - 28:36
    are like the leaflets that come with medication.
  • 28:36 - 28:42
    Is there anyone in here who reads
    the full contract of a program, of Windows
  • 28:42 - 28:47
    or of an operating system, of Apple,
    when you click “I agree”?
  • 28:47 - 28:53
    Yes? You do read it?
    You are the first one in the world I’ve met.
  • 28:53 - 28:55
    (LAUGHTER)
  • 28:56 - 29:03
    They had an experiment at a bar, in
    the United States, some time ago,
  • 29:03 - 29:09
    not long ago, in which they allowed
    the use of WiFi after giving
  • 29:09 - 29:13
    a contract and answering “I agree“,
    and that contract involved the sale
  • 29:13 - 29:17
    of your children, your home
    and something else.
  • 29:17 - 29:21
    And people fell directly for it,
    no one reads those contracts.
  • 29:21 - 29:27
    Deep down, I think that before having
    proper information about
  • 29:27 - 29:30
    what we gain or what we lose,
    we need to have a deeper education
  • 29:30 - 29:35
    to know what society we want
    and what values are those
  • 29:35 - 29:39
    we want to serve with our own work.
  • 29:39 - 29:46
    When we talk about these ethical dilemmas,
    I’ll mention ten in particular,
  • 29:46 - 29:49
    and I’ll go fast so we have
    time later to talk about it.
  • 29:49 - 29:56
    First, there’s already a new
    factor of digital divide.
  • 29:56 - 30:00
    If society was in the past divided
    between the rich and the poor,
  • 30:00 - 30:04
    and now between inforich and infopoor,
    we should also take into account that
  • 30:04 - 30:10
    in cloud computing any
    infrastructural weakness
  • 30:10 - 30:12
    will greatly affect its implementation.
  • 30:12 - 30:14
    I mean, the world can be
    divided into two categories:
  • 30:15 - 30:17
    first-class places where
    we have fast Internet access
  • 30:17 - 30:20
    and second-class places where
    there’s no service.
  • 30:20 - 30:24
    In the second, there will never
    be informational equality
  • 30:24 - 30:29
    because one won’t be able
    to have the processing ability,
  • 30:29 - 30:33
    somehow, we not only become
    slaves to our devices
  • 30:33 - 30:37
    but slaves to what we would
    call "imperfect places”,
  • 30:37 - 30:41
    those places where one cannot
    use all of the cloud computing services.
  • 30:42 - 30:49
    Reliability becomes an absolutely
    critical element, given that our data
  • 30:49 - 30:53
    is on the cloud, like
    what happened to Yahoo.
  • 30:53 - 30:58
    Not long ago 50 million passwords were stolen.
  • 30:58 - 31:02
    I think that’s the number, right?
    In a single strike.
  • 31:03 - 31:06
    If they steal something from your
    computer, it is limited.
  • 31:07 - 31:12
    However, any theft,
    any fragility of vital subsystems
  • 31:12 - 31:18
    becomes a big problem. If you’ve noticed,
    we’re going back to a centralized model,
  • 31:18 - 31:23
    where intelligence once again is part of the
    core, of the central nucleus of the system.
  • 31:23 - 31:28
    All of this of course creates a collective perception
    of vulnerability, and it’s very interesting.
  • 31:28 - 31:33
    This happens, for example,
    to the energy models.
  • 31:33 - 31:39
    It’s not the same to choose a centralized
    energy model like nuclear plants,
  • 31:39 - 31:43
    big dams, etc., or another
    distributed with green energies.
  • 31:43 - 31:47
    It’s not just a technical decision,
    it’s a political decision.
  • 31:47 - 31:50
    If our energy is based on uranium
    or on plutonium,
  • 31:50 - 31:57
    it’ll be justified for a government to state
    “We are forced to tap your
  • 31:57 - 32:03
    telephone lines and your electronic communications
    for the sake of the security of society”,
  • 32:03 - 32:08
    “We have to torture terrorists in prisons
    in the face of the threat of a possible attack
  • 32:08 - 32:13
    by crashing an airplane into a nuclear plant", etc...
  • 32:13 - 32:18
    But well, look: if the energy model
    was distributed and not centralized,
  • 32:18 - 32:24
    I can’t picture Osama bin Laden with
    a baseball bat breaking solar panels
  • 32:24 - 32:30
    on the roofs, I can’t.
    So, if there wasn’t that
  • 32:30 - 32:34
    centralized energy model, there wouldn’t be
    a collective perception of vulnerability either.
  • 32:34 - 32:43
    And the perception of vulnerability is
    what leads us to accept the undesirable as inevitable,
  • 32:43 - 32:47
    it’s a kind of policy which Naomi Klein,
    the great American intellectual,
  • 32:48 - 32:53
    denounces in the so-called “Shock Doctrine":
    that on many occasions we are presented with
  • 32:54 - 32:57
    the measures we do not wish to adopt
    as inevitable measures.
  • 32:57 - 33:01
    Earlier, we were talking about
    the Faculty of Philosophy and I’ll give
  • 33:01 - 33:04
    you one small and fun example.
    The Faculty of Philosophy had never had
  • 33:04 - 33:07
    closed doors in its classrooms for many years,
  • 33:07 - 33:11
    however, there comes a time when
    the deans are interested in having
  • 33:11 - 33:15
    control over the classrooms,
    so, to be able to keep people out,
  • 33:16 - 33:18
    they thought of adding locks to the doors.
  • 33:18 - 33:23
    But the students, their representatives,
    would have complained at any moment
  • 33:23 - 33:25
    if that had happened.
    What are the solutions?
  • 33:26 - 33:29
    Install projectors.
    They put video projectors inside the classrooms,
  • 33:29 - 33:31
    without a security cage and mounted low,
  • 33:31 - 33:34
    so that many of them could be stolen.
  • 33:34 - 33:36
    The moment they’re stolen, locks are added
  • 33:37 - 33:39
    to the doors, but not to lock people out,
  • 33:39 - 33:41
    but to protect the university’s property.
  • 33:41 - 33:44
    So look how the best way
  • 33:44 - 33:45
    to get a problem out of the
    way is to redefine it.
  • 33:45 - 33:48
    A political problem is redefined
    as a technical problem.
  • 33:48 - 33:52
    Same thing happens with a government.
    If a government has a strong police,
  • 33:52 - 33:57
    a harsh system for social control,
    the best thing to do is to create
  • 33:57 - 34:03
    a very vulnerable technology, very likely to be
    attacked, and very centralized, very concentrated.
  • 34:03 - 34:09
    In that moment, we’ll have to choose
    whether to protect society and its liberties
  • 34:09 - 34:12
    or to protect our own infrastructures.
  • 34:13 - 34:17
    Therefore, uninterrupted functioning
    is an absolute must.
  • 34:17 - 34:22
    Nothing can be done offline and,
    obviously, the risk of
  • 34:22 - 34:28
    a cyberterrorist attack is bigger
    in a society whose services
  • 34:28 - 34:32
    are centralized and vulnerable.
    In the end, we’ll have to decide
  • 34:32 - 34:34
    what to protect:
    society or the cloud itself.
  • 34:34 - 34:40
    The Internet will need a more
    democratic architecture to prevent
  • 34:40 - 34:44
    cloud computing from turning
    into a new form of social control.
  • 34:44 - 34:49
    I don’t wanna take long on this,
    about BGP,
  • 34:49 - 34:53
    but as you know, the Border
    Gateway Protocol is the way
  • 34:53 - 34:56
    the Internet determines the flow
    of data packets between domains.
  • 34:56 - 34:59
    And it is not very flexible
    and not very smart
  • 34:59 - 35:04
    when it comes to choosing a path that goes
    through the least possible number of autonomous systems.
  • 35:04 - 35:12
    A more intelligent and decentralized
    multi-path routing will be necessary
  • 35:12 - 35:18
    for cloud computing not to turn
    into a new form of social control.
  • 35:18 - 35:23
    This has examples like the viral
    networks of Walter Lippmann or like
  • 35:23 - 35:28
    the whole Open Spectrum matter.
    I mean, the non-assignment of
  • 35:28 - 35:32
    radio or microwave frequencies,
    but a smart sharing
  • 35:33 - 35:35
    of the whole spectrum.
    But that would be another matter.
  • 35:35 - 35:39
    What we said before, too,
    the delocalization of information
  • 35:39 - 35:45
    and the extraterritoriality of laws.
    CC services are
  • 35:45 - 35:50
    by definition, ubiquitous.
    However, communication
  • 35:50 - 35:53
    infrastructures and servers
    belong to countries
  • 35:53 - 35:58
    and are subject to national regulations.
    The legal framework is very confusing nowadays.
  • 35:58 - 36:03
    And there are risks for privacy
    and integrity of personal information.
  • 36:04 - 36:08
    CC implies massive data traffic,
  • 36:08 - 36:14
    data that flows out
    of our firewalls,
  • 36:14 - 36:19
    so by having systems of shared
    storage, shared channels,
  • 36:19 - 36:23
    shared resources,
    virtualization of the same data
  • 36:23 - 36:27
    in different operating systems
    at the same time,
  • 36:27 - 36:30
    there’s always a major
    vulnerability factor.
  • 36:30 - 36:34
    And we shouldn’t forget that the user
    is ultimately responsible
  • 36:34 - 36:37
    for any violation of the law.
  • 36:37 - 36:42
    Also talking about the
    challenges to social empowerment
  • 36:42 - 36:45
    and hidden innovation,
  • 36:45 - 36:51
    we must think that having local
    tools and, above all, free software
  • 36:51 - 36:54
    allows us to be the
    owners of what we’re doing.
  • 36:54 - 37:02
    However, inside all this
    set of services in CC,
  • 37:02 - 37:07
    the user does not need any
    specific skills to work on it
  • 37:07 - 37:10
    and intelligence is on the side
    of the cloud and not on the users’.
  • 37:11 - 37:14
    What about its technical autonomy?
  • 37:14 - 37:19
    If there’s a shutdown, if a service
    like FileFactory closes,
  • 37:19 - 37:23
    what about the people?
    If we forget to use
  • 37:23 - 37:28
    our local tools and let
    image storage services be
  • 37:28 - 37:34
    the ones that process and exchange our images, etc.,
    there comes a point at which we lose
  • 37:34 - 37:41
    an important part of autonomy, which
    Richard Stallman denounces in this quote:
  • 37:41 - 38:00
    He says: “One reason you should
    not use Web applications
  • 38:00 - 38:04
    to do your computing is that you lose control.
    It’s just as bad as
  • 38:04 - 38:07
    using a proprietary program".
  • 38:08 - 38:17
    He encourages us to do our
    own computing with copies of
  • 38:17 - 38:21
    a freedom-respecting program.
  • 38:22 - 38:27
    Otherwise, one is putty in the hands
    of whoever developed that software.
  • 38:27 - 38:32
    Threats to network neutrality
    are also important
  • 38:32 - 38:37
    since the thin clients, I mean,
    the not-so-powerful computers
  • 38:37 - 38:43
    that launch these services,
    are enough to be able to
  • 38:43 - 38:46
    enjoy them.
    The key matter is
  • 38:47 - 38:51
    the quality of network access,
    so the access quality
  • 38:51 - 38:55
    becomes a critical requirement and also
    there’ll be more and more enterprises
  • 38:55 - 38:59
    interested in being able to give
    people who are willing to pay more
  • 38:59 - 39:02
    a privileged access to the network,
    hence that much pressure
  • 39:03 - 39:04
    for that neutrality to be broken.
  • 39:06 - 39:09
    And there’s the need for a
    greater decentralization of the Internet.
  • 39:12 - 39:19
    When the Internet is used to further
    centralize computing power,
  • 39:19 - 39:25
    the pendulum seems to be swinging away
    from individual autonomy,
  • 39:26 - 39:29
    and towards more concentrated power
  • 39:30 - 39:32
    in fewer hands.
    We should ask ourselves
  • 39:32 - 39:34
    if this is within
    the Internet’s philosophy.
  • 39:34 - 39:39
    CC will be, from this point of view,
  • 39:39 - 39:43
    an inherently political technology.
    Once fundamental decisions are taken,
  • 39:44 - 39:52
    changing the direction of its social
    impact will be extremely difficult.
  • 39:52 - 39:54
    Just like with Robert Moses’ overpasses.
  • 39:54 - 39:58
    Once those were built,
    there was no way for the buses
  • 39:58 - 40:00
    to cross them to get to the parks.
  • 40:01 - 40:04
    So putting rails and flowers
  • 40:04 - 40:08
    won’t change the system much.
    Then the important thing is to be able
  • 40:08 - 40:11
    to do a kind of social futurology prior
  • 40:11 - 40:13
    to the actual uses of CC.
  • 40:13 - 40:20
    From this point of view, since
    it will be affecting even more areas
  • 40:20 - 40:26
    of our own activity,
    be it jobs, be it
  • 40:26 - 40:31
    our leisure time for example, Pokémon GO is
    a good example of an on-cloud game,
  • 40:31 - 40:38
    but as we’ll see as well, autonomous cars,
    driverless, by Apple
  • 40:38 - 40:44
    and Google are becoming elements
    that use factors
  • 40:44 - 40:47
    of CC such as geolocation and also
  • 40:48 - 40:52
    ethical norms, as we’ll see,
    to know how a car should react
  • 40:52 - 40:56
    in a risk situation, such
    as the possibility of having
  • 40:56 - 40:59
    many people in the road
    ahead and the car has to swerve.
  • 40:59 - 41:03
    What will the car do? Run them over or
    throw the driver and the family down a cliff?
  • 41:04 - 41:06
    We’ll see more about it later.
  • 41:06 - 41:14
    Corporate agents, the enterprises,
    become much more vulnerable
  • 41:15 - 41:18
    to the influence of central states
  • 41:18 - 41:24
    because of national security matters,
    law enforcement, war on terrorism,
  • 41:24 - 41:28
    defense of national values,
    protection of free trade, etc.,
  • 41:28 - 41:32
    with the enterprises ending up as
  • 41:32 - 41:37
    some kind of cultural and political
    battering rams in this war.
  • 41:37 - 41:45
    Google, Microsoft and Apple
    admitted that they accepted
  • 41:45 - 41:50
    subsidies from the US Government
    in exchange for
  • 41:50 - 41:56
    user and content identification
    and for general access without
  • 41:56 - 42:01
    an explicit judicial order to files downloaded
  • 42:02 - 42:03
    by users to their systems.
  • 42:04 - 42:08
    I’ll quote two sentences
    which are very curious
  • 42:08 - 42:14
    where a Google attorney says that:
    “Users from any
  • 42:14 - 42:16
    email system who exchange information
  • 42:16 - 42:20
    and emails with a Gmail user
  • 42:20 - 42:27
    should have no legitimate
    expectation of privacy ever",
  • 42:27 - 42:30
    I mean, they shouldn’t be surprised
    that their emails are being monitored.
  • 42:31 - 42:38
    And because of this he states:
    “Just as a sender
  • 42:38 - 42:43
    of a letter to a business colleague
    cannot be surprised
  • 42:44 - 42:49
    that the recipient’s assistant opens
    the letter, people who use
  • 42:49 - 42:52
    Web-based email today,
  • 42:53 - 42:56
    cannot be surprised if their
    emails are processed by
  • 42:56 - 43:01
    the recipient’s email provider.
  • 43:01 - 43:10
    And, of course, a person should
    not have legitimate expectation of
  • 43:10 - 43:17
    privacy for their information that they
    voluntarily turn over to third parties”.
  • 43:17 - 43:22
    It’s quite curious, because when
    I give a letter to the mail service
  • 43:23 - 43:26
    I’m not expecting the postman
    to open it, just to deliver it.
  • 43:26 - 43:30
    I mean, I’m not giving him a
    blank check, the question is:
  • 43:30 - 43:33
    why am I giving it to Gmail?
    And of course when I check my mail
  • 43:33 - 43:36
    from my Complutense teacher account,
  • 43:36 - 43:38
    I suddenly notice it says
    “Featured messages”,
  • 43:39 - 43:41
    “Important people who sent you information".
  • 43:41 - 43:45
    I say: why do you people have to check
    if those people are important to me?
  • 43:46 - 43:51
    Of course, that entails
    information scanning.
  • 43:51 - 43:57
    Okay, what I said before.
    Sending an email
  • 43:57 - 44:01
    is like giving a letter to the mail service.
    The expectations are that the service delivers
  • 44:01 - 44:05
    that letter, but not that
    it opens it and reads it.
  • 44:05 - 44:10
    Similarly, when I send an email I expect it
    to be delivered to the intended
  • 44:10 - 44:16
    recipient with a Gmail account.
    But, why would I expect
  • 44:16 - 44:20
    its content to be intercepted
    or read by the provider?
  • 44:21 - 44:25
    And, lastly, we have
    the called "function creep",
  • 44:26 - 44:31
    I mean, the unauthorized spreading of
    information for different purposes,
  • 44:31 - 44:36
    and personal data collected
    for a particular and well-defined purpose
  • 44:36 - 44:41
    can be added to other data,
    mixed, combined and
  • 44:41 - 44:46
    produce a result that
    is unrelated to the original purposes.
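
    A purely illustrative sketch of the “function creep” just described, in Python with pandas (all the datasets, column names and the “risk score” are invented for the example): two datasets, each collected for a narrow, legitimate purpose, are merged to infer something neither was collected for.

        # Two datasets, each collected for a well-defined purpose, are combined
        # to derive a profile unrelated to either original purpose.
        # All data and column names here are invented.
        import pandas as pd

        # Collected only to bill gym memberships
        gym = pd.DataFrame({
            "user_id": [1, 2, 3],
            "gym_visits_per_week": [0, 4, 1],
        })

        # Collected only to process online grocery orders
        groceries = pd.DataFrame({
            "user_id": [1, 2, 3],
            "weekly_snack_spend_eur": [35.0, 5.0, 20.0],
        })

        # Merged, the data supports a "health risk score" no user consented to
        profile = pd.merge(gym, groceries, on="user_id")
        profile["risk_score"] = (
            profile["weekly_snack_spend_eur"] / (1 + profile["gym_visits_per_week"])
        )
        print(profile)
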
  • 44:46 - 44:51
    The case of Instagram,
    in 2012 it announced
  • 44:51 - 44:57
    that its advertisers could freely make use
    of any uploaded personal picture
  • 44:57 - 45:01
    without economic compensation
    or notification to the users.
  • 45:01 - 45:04
    A big scandal erupted
  • 45:04 - 45:09
    and they backed out, but anyway this
    is an example of how with just
  • 45:09 - 45:12
    an ownership change of a company
    or with an American company
  • 45:13 - 45:16
    being sold to another country outside
    the US or the European Union
  • 45:17 - 45:21
    and with another type of legislation,
    the laws that the information stored
  • 45:21 - 45:24
    on the servers conforms to
  • 45:24 - 45:30
    change completely.
    There are other dilemmas as well
  • 45:30 - 45:35
    that have to do with such fun
    things as cloud robotics,
  • 45:35 - 45:39
    autonomous cars,
    drone programming,
  • 45:40 - 45:47
    who’s responsible for
    the accidental shots by a drone
  • 45:47 - 45:52
    that is, by definition, automatic?
    Automated weapons are also
  • 45:52 - 45:58
    subject to the dilemma and of course
    they take the dilemma of the morality, of the ethics,
  • 45:58 - 46:02
    of robots to another level, more complex
    than what we had until now.
  • 46:02 - 46:08
    Car makers are facing the
    problem of which algorithm
  • 46:08 - 46:14
    they should implement to protect the passengers
    in case of possibly running someone over.
  • 46:14 - 46:17
    Let’s say you’re driving
    on a mountain road
  • 46:18 - 46:23
    and there are five people on the road.
    Your autonomous car notices there are
  • 46:23 - 46:28
    five people and so to prevent the greater harm
    it throws itself down the cliff.
  • 46:28 - 46:32
    The question is:
    who would buy that car?
  • 46:32 - 46:36
    But if you ask people how an autonomous
    car should be programmed,
  • 46:36 - 46:39
    the answer will be:
    “aiming for the greater possible good".
  • 46:39 - 46:43
    It’s the utilitarian concept.
    This takes us to the famous dilemma
  • 46:43 - 46:50
    of the trolley problem which
    Philippa Foot introduced in 1967,
  • 46:51 - 46:53
    it’s a very fun matter.
    Let’s say you’re on a trolley
  • 46:54 - 47:00
    with no brakes and all of a sudden
    you see ahead of you a certain number
  • 47:00 - 47:03
    of workers, let’s say five,
    and you’re going to run them over.
  • 47:03 - 47:12
    However, if you change track,
    there’s a fork in the tracks,
  • 47:12 - 47:17
    you kill just one person.
    What would you do in such a case?
  • 47:17 - 47:20
    Divert to kill one person
    or run those five over?
  • 47:20 - 47:25
    And the answers are
    obviously very complicated.
  • 47:25 - 47:28
    You can say: “Well, five is okay, but
  • 47:28 - 47:32
    what if there are five on one side
    and a little kid on the other?"
  • 47:33 - 47:36
    How do we react in this case?
    There’s also another version where
  • 47:36 - 47:38
    there’s a very, very fat person
    who we know, if we push him,
  • 47:39 - 47:42
    will make the trolley derail
    and not kill anybody.
  • 47:42 - 47:47
    Should we throw the fat man down? This looks
    like something by Mike Myers, right?
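
    A minimal sketch, in Python, of the purely utilitarian policy the speaker describes (the function, names and harm figures are illustrative assumptions, not any car maker’s actual algorithm): the car simply picks whichever manoeuvre minimizes total expected harm, counting every person equally, which is exactly what makes the “who would buy that car?” question so uncomfortable.

        # Illustrative sketch of a utilitarian choice between manoeuvres.
        # Not a real manufacturer's algorithm; all numbers are assumptions.
        from dataclasses import dataclass

        @dataclass
        class Manoeuvre:
            name: str
            pedestrians_harmed: int  # expected harm to people outside the car
            occupants_harmed: int    # expected harm to the car's occupants

        def utilitarian_choice(options):
            """Pick the manoeuvre with the smallest total expected harm,
            counting every person equally ('the greater possible good')."""
            return min(options, key=lambda m: m.pedestrians_harmed + m.occupants_harmed)

        options = [
            Manoeuvre("stay on the road", pedestrians_harmed=5, occupants_harmed=0),
            Manoeuvre("swerve off the cliff", pedestrians_harmed=0, occupants_harmed=2),
        ]
        print(utilitarian_choice(options).name)  # -> "swerve off the cliff"
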
  • 47:51 - 47:55
    So, here we have a very
    important problem to know how
  • 47:55 - 47:59
    to react. Also, when we talk
    about on-cloud robotics
  • 47:59 - 48:04
    look at the extraordinary
    ability given by the fact that
  • 48:04 - 48:08
    the programming of the robots
    or automatic devices,
  • 48:08 - 48:12
    be it weapons, robots,
    autonomous cars, etc.,
  • 48:12 - 48:17
    can be done live from
    the cloud itself, therefore
  • 48:17 - 48:21
    the changes are much
    more effective and efficient.
  • 48:21 - 48:27
    Good, we’re ending here
    with the CC matter
  • 48:27 - 48:30
    as a way of life.
    According to Langdon Winner,
  • 48:30 - 48:34
    technologies are forms of life
    because they reflect our interests,
  • 48:35 - 48:44
    desires and embody the goals of people,
  • 48:45 - 48:47
    the goals we promote and develop.
  • 48:47 - 48:52
    In a world of global consciousness it is also necessary
    that the global technical infrastructure
  • 48:52 - 49:00
    have a certain ethical element and, of course,
  • 49:00 - 49:05
    CC is very compatible with
    that definition of privacy,
  • 49:05 - 49:13
    for example, held by young people now,
    for whom privacy is not something to defend
  • 49:13 - 49:21
    in an absolute way, but relatively.
    And it’s also important that it be counterbalanced
  • 49:21 - 49:27
    with the right of every youngster
    to be able to live-stream his/her life.
  • 49:27 - 49:31
    So it’s not just about the right
    for privacy of data
  • 49:32 - 49:35
    but the right of being able
    to share with my friends
  • 49:35 - 49:37
    all the data which
    may be important to me.
  • 49:43 - 49:48
    We also need a critical approach
    to problems like these,
  • 49:48 - 49:53
    since a more advanced technology
    doesn’t just mean a better society.
  • 49:53 - 49:58
    “Technological determinism" would mean that,
    given the advance of the Internet,
  • 49:58 - 50:02
    everything is getting democratized.
    Given the advance of the
  • 50:02 - 50:04
    Information Technologies and
    the drop of prices by Moore’s law,
  • 50:05 - 50:08
    everyone will be able to enjoy
    by the trickle-down effect
  • 50:08 - 50:11
    what was very expensive at a certain time,
    now becomes cheaper and everyone
  • 50:11 - 50:13
    has access to it.
    Well, it’s proven that
  • 50:14 - 50:19
    this is not so simple.
    Technology opens some democratizer doors
  • 50:19 - 50:25
    but not necessarily achieves
    this without any other element
  • 50:25 - 50:30
    if a new form of construction
    is not created, a new social order.
  • 50:31 - 50:37
    When technologies take over
    society, they form
  • 50:37 - 50:42
    a new sociotechnical system which sometimes
    even replaces the very constitutions,
  • 50:42 - 50:45
    because they tell us how to
    distribute power, authority,
  • 50:45 - 50:49
    freedom and justice. We should ask
    ourselves if that’s somehow the case
  • 50:49 - 50:54
    of cloud computing.
    CC will allow
  • 50:54 - 50:59
    companies and agencies to function
  • 50:59 - 51:04
    at a much larger scale,
    providing global solutions.
  • 51:04 - 51:11
    The question is whether we will eventually
    eliminate cultural differences
  • 51:11 - 51:16
    and local solutions in the name of a
    unified, standardized market-society.
  • 51:16 - 51:21
    The rationality of CC
  • 51:22 - 51:26
    also will impose new forms
    of hierarchical reorganization,
  • 51:26 - 51:31
    new actors will rule
    and information, in the form of big data,
  • 51:31 - 51:35
    will be power more than ever.
    Maybe the perfect paradigm
  • 51:35 - 51:38
    lies in the combination
    of the flexibility given by CC
  • 51:38 - 51:43
    with the ability of data processing
    given by data mining and big data in general.
  • 51:43 - 51:50
    In this digital ecosystem,
    we’ll have to see if CC
  • 51:50 - 51:55
    will prey on other models
    of non-global business, right?
  • 51:55 - 52:02
    Since we can work on a much
    larger scale with the same flexibility
  • 52:02 - 52:07
    of the local models, for example,
    who buys at their local store
  • 52:07 - 52:09
    instead of doing it on Amazon?
  • 52:09 - 52:15
    And there will also be a bigger
    concentration of political power on
  • 52:15 - 52:21
    CC companies, which will have a
    powerful voice in regulation
  • 52:21 - 52:27
    or deregulation of the market,
    controlling the political institutions
  • 52:27 - 52:33
    that should control them, then
    there could be a shift
  • 52:33 - 52:38
    of political power leading to
    a society more vulnerable to corporate
  • 52:38 - 52:39
    and business interests.
  • 52:39 - 52:46
    Alright, it’s 2:00, only a
    minute to uh... to finish.
  • 52:46 - 52:48
    Thanks for your patience.
  • 52:48 - 52:52
    As I said before, those who call
    themselves "generation X" or "millennials",
  • 52:52 - 52:56
    I mean, people who were born
    after the year 2000,
  • 52:56 - 53:01
    do not care very much about who owns
  • 53:01 - 53:09
    a personal picture of them
    stored on a cloud service
  • 53:10 - 53:15
    but care more about the freedom
    to share it and show it at will.
  • 53:15 - 53:19
    Then, any process of
    identity construction emerges
  • 53:19 - 53:23
    from the "mediascape".
  • 53:23 - 53:31
    When one used to build
    his/her identity in a traditional way,
  • 53:31 - 53:35
    he/she didn’t carry it around;
    it was something that was shared.
  • 53:35 - 53:39
    I mean, the neighborhood one lived in,
    the language spoken, the nation
  • 53:39 - 53:47
    one lived in, etc. But today,
    the soundtrack of our life is
  • 53:47 - 53:51
    made of what we listen to, read,
  • 53:51 - 53:55
    and put in our MP3 player.
    So, look at the Walkman metaphor itself.
  • 53:55 - 53:58
    When Sony invented the Walkman,
    which was the first
  • 53:58 - 54:02
    battery-powered portable cassette player,
    and one could carry it with earphones
  • 54:02 - 54:06
    on the street, they invented
    much more than a device.
  • 54:07 - 54:09
    The concept of walkman is also
    “the man that walks” and
  • 54:09 - 54:14
    it’s the person that builds his/her
    own identity through this walk,
  • 54:14 - 54:19
    so what one carries inside their ears,
    their brain,
  • 54:20 - 54:23
    what one listens to through
    a Walkman, initially, and now
  • 54:23 - 54:30
    an MP3 or MP4 player or whatever,
    becomes their true landscape.
  • 54:30 - 54:35
    In other words, the landscape of one’s life is
    the "soundscape" or the "mediascape".
  • 54:35 - 54:40
    If we want to know what’s going on with a person,
    what his/her habits and values are...
  • 54:40 - 54:44
    it’s much more important to know
    what’s on the MP3 player rather
  • 54:44 - 54:47
    than knowing where he/she lives,
    walks and goes around.
  • 54:47 - 54:51
    So it’s a mobile construction
    of the identity.
  • 54:51 - 54:57
    And in the end, I’d just like to highlight
    a funny aspect, which is the discussion that
  • 54:57 - 55:04
    Weaver makes about "god terms
    and evil terms".
  • 55:04 - 55:09
    I mean, the very best terms
    and the very worst terms.
  • 55:09 - 55:14
    So a "god term", I mean,
    a super positive term,
  • 55:14 - 55:18
    are those that, from
    the moment they’re spoken, leave everyone
  • 55:18 - 55:23
    in awe, so
    they play a huge role
  • 55:23 - 55:28
    in setting the science
    agenda, in countries like
  • 55:28 - 55:32
    the US. The concept of the last
    frontier has been, for example,
  • 55:32 - 55:37
    a constant reference in American politics,
    when they talked about
  • 55:37 - 55:44
    "the conquest of the West, the last frontier".
    Even in Star Trek: "Space, the final frontier".
  • 55:44 - 55:50
    And it’s always an appeal
    to an achievable dream.
  • 55:51 - 55:56
    When talking about CC,
    if the data is on the cloud
  • 55:56 - 55:59
    it'll be so close to God
    and if it's really in the sky,
  • 55:59 - 56:03
    who's going to worry about it, right?
    There's a certain funny rhetoric
  • 56:03 - 56:08
    in that, but Weaver and
    Langdon Winner say that language
  • 56:08 - 56:12
    is a reflection of a society,
    so we have to be
  • 56:12 - 56:17
    very careful that, from a corporate context,
    certain terms are not redefined,
  • 56:18 - 56:23
    such as freedom, rights,
    privacy, information ownership, etc.
  • 56:23 - 56:29
    We're closing with five short conclusions.
    First, cloud computing
  • 56:29 - 56:34
    is a revolutionary development in I.T.
    and it is
  • 56:34 - 56:39
    a sociotechnical paradigm,
    a whole system of rationality,
  • 56:39 - 56:42
    of social and territorial organization, etc.
  • 56:42 - 56:50
    Second, there is strong financial
    pressure to invest
  • 56:51 - 56:55
    in critical infrastructures.
    In any case, it doesn't work
  • 56:56 - 57:00
    if the territory is not wired.
    Third, it can be interpreted
  • 57:00 - 57:06
    as an inherently political
    technology in the strong sense.
  • 57:06 - 57:11
    I mean, certain technologies require
    a certain set of conditions
  • 57:12 - 57:14
    to properly function.
    Look at this.
  • 57:14 - 57:18
    There's an article about technology...
    about computing, well, I'm kidding,
  • 57:18 - 57:26
    a really remarkable one, written by Engels in
    1872, and it's titled "On Authority".
  • 57:26 - 57:32
    A tremendously interesting article
    in which Engels, the famous socialist author,
  • 57:32 - 57:40
    argues against anarchists.
    So, Engels says in this article:
  • 57:40 - 57:48
    "Anarchists say that if deep down
    the workers take control of
  • 57:48 - 57:52
    a mass production factory,
    the workers will be freed
  • 57:52 - 57:56
    and people will live better".
    However, what Engels argues
  • 57:56 - 58:00
    is that it's not the entrepreneur
    who oppresses the workers,
  • 58:00 - 58:04
    but what he calls
    "the authority of steam".
  • 58:04 - 58:10
    He says then: "When we mass produce,
    when we produce through
  • 58:10 - 58:15
    technology, we dominate nature through
    it but, at the same time,
  • 58:15 - 58:19
    we become slaves to ourselves because,
    if one doesn't follow the pace set
  • 58:19 - 58:24
    by the production line, things
    won't be properly made, they won't pass
  • 58:24 - 58:26
    the quality test,
    and it's as if nothing had been made".
  • 58:26 - 58:31
    Then, Engels says: "Honestly, it's not
    always the will of some people
  • 58:31 - 58:35
    that oppresses others, but every
    technological model, at a certain moment,
  • 58:35 - 58:41
    brings certain rules of the game
    from which one can hardly escape".
  • 58:41 - 58:48
    Because of this, notice that there's even
    a deterministic version of Marx in the first volume
  • 58:48 - 58:53
    of the book "Capital", where he states:
    "The plough is to the farming society
  • 58:53 - 58:55
    the same as the steam engine is
    to industrial society".
  • 58:56 - 58:59
    In other version he says: "The plough is
    to the farming society the same
  • 58:59 - 59:01
    as the mechanical loom is
    to industrial society".
  • 59:01 - 59:04
    I mean, when there are ploughs
    the organization is feudal.
  • 59:05 - 59:07
    When there's the Internet, the organization
    can't be feudal,
  • 59:08 - 59:11
    it can't be centralized,
    it can't be as simple as before.
  • 59:15 - 59:22
    Finally, we don't have to wait
    for the implementation of a technology
  • 59:22 - 59:28
    to forecast its impact on society.
    These impacts, above all,
  • 59:28 - 59:31
    are due to its nature.
    The last point is, to me,
  • 59:31 - 59:34
    the most interesting, but it is subject
    to discussion, obviously.
  • 59:34 - 59:38
    CC is in fact a political constitution.
  • 59:38 - 59:43
    When people get used to having
    their data not in their phones,
  • 59:43 - 59:47
    but on the cloud, the mentality
    will have changed. And just like
  • 59:47 - 59:54
    the way that a person's mentality
    changes when they have
  • 59:54 - 59:58
    their own energy at home, when they have
    their photovoltaic panel or
  • 59:59 - 60:05
    their production of bituminous
    shale or biomass or compost...
  • 60:05 - 60:11
    When they have their wind
    turbines in windy regions,
  • 60:11 - 60:15
    for example, the Canary Islands are great
    for having a wind turbine at home,
  • 60:15 - 60:17
    it makes you practically self-sufficient.
    When people have
  • 60:17 - 60:20
    informational autonomy, energy
    autonomy, they often also ask
  • 60:21 - 60:23
    for a certain political autonomy.
    However, when our
  • 60:24 - 60:28
    energy and information systems, etc.,
    are centralized,
  • 60:28 - 60:32
    they also promote a more
    centralized mentality, more competitive
  • 60:33 - 60:36
    and more selfish. Then, the discussion
    on cloud computing,
  • 60:36 - 60:41
    and this is the final sentence, doesn't
    have to do just with technical design,
  • 60:41 - 60:45
    but with which values
    we are willing to support
  • 60:45 - 60:47
    and with the kind of
    society we want to be.
  • 60:48 - 60:51
    Thank you so much, it's been
    a pleasure being with you.
Title:
Dilemas Éticos y Políticos de la Computación en la Nube - Dr. Javier Bustamante Donas
Description:

Lecture given as part of the lecture series for the course Ética, Legislación y Profesión.

Video Language:
Spanish
Duration:
01:16:59
