Thank you very much. I'm pleased to be here with you. We have until approximately 2:15; later, at 3, I'm giving a Chinese Thought class at the Faculty of Philosophy. A "ni hao" kind of thing. At the Faculty of Philosophy I have the pleasure of teaching a course on Science, Technology and Society, which is very close to the philosophy of technology; another on Political Philosophy, where I introduce plenty of elements of participatory democracy and of how the Internet transforms the boundaries of the political and ethical sphere; and a wonderful course on Chinese Thought, where I have the luck of discussing the fundamental paradigms of ancient Chinese thought in the Warring States period, in the Spring and Autumn era of the Zhou dynasty, the time in which the thought of Lao-Tse emerges, Confucius' as well, and that of other thinkers like Mozi, Gongsun Long, Han Feizi, etc. All the foundations of Chinese thought come from that time, and it's quite interesting because, normally, the links between politics and technology are forgotten. And nowadays we're seeing how technology is totally extending the range of possibilities of the political sphere. In fact, it's usually said that ideas are what shape the world. Well, today we can see that it's artifacts, devices and the unofficial uses people give them that make up a very powerful engine of social change. Note that even the political sphere itself is built through technology. Do we have any Italians here or not? Anyone from Italy with an Erasmus scholarship or so? No one? Who can tell me, please: before the Italian unification of the 19th century, people travelled on horseback between cities. Italians felt Venetian, Milanese, Turinese, Bolognese, Roman, Sicilian… but there was no consciousness of Italy. What is it that suddenly makes the idea of being Italian appear, the consciousness of an Italian identity? Who'd like to tell me? Silence in the crowd. Anyone? The invention of the train. When the first iron tracks appear in Italy, and distances that used to take days can be covered in hours, people experience the transformation of physical space: how directly political identity, the identity one feels as one's own, is transformed through technology. Aristotle said as much. Aristotle, who is one of the founders of our democratic thinking, although people who speak of this don't usually know that Athenian democracy only reached ten or fifteen thousand people. Only men, who lived not on a salary but on their income and were free to practice the "dolce far niente", the bios theoretikos, the theoretical life, poiesis, poetry, philosophy, thinking in general, etc. So, Aristotle said that democracy can't go beyond the walls of the polis, because it can't reach further than the orator's voice. When there's no dialogue, there's no possible democracy, and he was right: the walls of the polis were physical until now, and no more than one city could be democratic in those terms. I mean, there couldn't be a participatory democracy if there was no possibility of dialogue. What do the Internet and information and communication technologies in general allow? That everyone speaks and listens. Those walls are directly brought down. When there's a student protest in Iran, or when there's a strike and then a protest by Buddhist monks in Myanmar, there's always someone recording with a phone, streaming those events live to a global audience. So, what comes of that?
People who listen, see, observe, and take part in that information become protagonists. Things start happening live. And it almost sounds like a joke that there's a Ministry of Foreign Affairs. What are foreign affairs? Every single thing out there directly affects us. We're now living through an election that is fundamental for Spain: the presidential election in the United States. I think we should all get to vote, because, deep down, it's going to be someone ruling over our heads, to put it that way, right? Last night a very interesting debate took place between Mrs. Clinton and Mr. Trump, if the latter can be called a mister, and it was watched live, at least by those willing to get up at 2:30 AM to follow it, right? Anyway, within this subject I would like to talk to you about the ethical and political dilemmas brought by one of the most interesting models of computing that exists today, which is the model of cloud computing. I forgot to say that I had the luck of doing a PhD in Philosophy, in Education Sciences, at the Complutense. I also have a master's degree in Computer Science from the Universidad Pontificia of Salamanca. And I did a PhD in Science, Technology and Society at Rensselaer Polytechnic Institute, in New York, which has the best department of science, technology and society in the world, headed by a tremendously interesting person called Langdon Winner, who was my thesis director and has become my best friend. Apart from being an exceptional person in many fields, he is someone whom the Senate and Congress invited, in a group hearing, to help define the American law on the social control of nanotechnology. So I come from a field where the divide between sciences and humanities doesn't exist at all, where you can't say you are only on the technical side or only on the humanistic side, and where those two sides necessarily have to meet, or else we'll hardly get to do anything interesting. But since the subject of this conference is very concrete and specific, I'd like to stick to it and, as I said, talk about the ethical and political dilemmas of cloud computing, along with big data, one of the most powerful paradigms of today. And let's do something else too: if anyone at any time would like to raise a hand, it won't bother me at all, quite the contrary. The more interactive this is, the better. I'll try to go fast so we have some time later for you to share your concerns, your protests, or your divergent opinions, for there's nothing more interesting than that, alright? Then let's begin. What I'm about to discuss comes from a MINECO research project titled "Science, Technology and Society: Ethical and Political Dilemmas in the Model of Cloud Computing as a New Sociotechnical Paradigm", and the reference to this project is shown here. This is important to mention since, at the end of the three and a half years, I'm asked, as director of this research project, for a report on everything I do, so if I don't mention this, it's all for nothing, isn't it? But the important thing is that, at least, there was funding for an interdisciplinary team that can handle these matters. If we start from a definition of cloud computing, there's one from the United States government's National Institute of Standards and Technology which is really simple: cloud computing is a model that enables a whole set of computing resources, such as networks, servers, storage, applications, services, etc.,
to be available directly, on demand, for users, and to be provisioned and released with minimal effort and without significant interaction between the provider and the user. It has five specific features. The first one is that the services are provided in a way that is transparent to the user, without any kind of direct interaction. Then, very importantly, the computers or devices from which these applications can be launched are called "thin clients", that is, computers with very little power, because the processing capacity is not on the user's side but on the cloud's; so tablets, smartphones, netbooks, etc., can be used. After that, there is a pool of resources which is always dynamically available to users, in terms of memory, storage, bandwidth, virtual machines, etc. There is also a very notable elasticity, since every single user experiences the resources in use as if they were available only for him or her, which gives a feeling of unlimited resources and of system scalability: however many users join, if the system is well designed, each one will have the experience of being the only user of those resources. And then there's the possibility of clear monitoring of the resources, which is particularly important for both users and providers. There are three types of service models: one is applications as a service, another is platforms, and the applications one is what we usually know, services such as Gmail. By the way, here's a small aside: as we'll see later, there's a series of issues with privacy and with the integrity of emails in a service like Gmail, and it's stunning to me that the Complutense University has outsourced its entire email system. That means that all the information circulating among researchers at our university, including matters of potential patents, is being monitored by a private enterprise with strong ties to the American government. It's not an opinion, it's a matter of fact. Then, platform services offer tools and a programming environment that allow the applications to be customized. Microsoft Windows Azure and Google App Engine are two examples where control sits at the level of the application but not of the physical infrastructure. Finally, we have a last version of the cloud computing models, which is infrastructure as a service. Here the final users do have access to the processing power, the network resources, etc., and they can configure those resources within their own operating systems and use them as they please. Here you have examples such as Rackspace, Amazon Elastic Compute Cloud, etc. (the three models are summarized in a small sketch below). When we talk about cloud computing, I'd like to emphasize that it's not just a technical development; it involves a huge change of mentality in the way we live and understand how we process our information. Of course, this has many advantages, such as the fact that one can focus more on the creative aspects and not on the infrastructure for capturing, processing, using and spreading information. But it also presents serious ethical and political dilemmas which must be taken into account. A lot of them, in my opinion, don't have so much to do with the contexts of use, with the way these services are used, but with features belonging to the very design of cloud computing.
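As a rough aid to the three service models just described, here is a minimal sketch of my own (not part of the talk; the dictionary layout and field names are purely illustrative) of how control is divided between user and provider in each model:

```python
# Minimal sketch of the three service models described above.
# The structure and names are illustrative only.
SERVICE_MODELS = {
    "SaaS": {   # ready-made applications, e.g. Gmail
        "user_controls": ["application settings"],
        "provider_controls": ["application", "runtime", "OS", "servers", "network"],
        "examples": ["Gmail"],
    },
    "PaaS": {   # you deploy your own code on a managed platform
        "user_controls": ["application code", "application configuration"],
        "provider_controls": ["runtime", "OS", "servers", "network"],
        "examples": ["Microsoft Windows Azure", "Google App Engine"],
    },
    "IaaS": {   # you manage the operating system and resources yourself
        "user_controls": ["operating system", "storage", "deployed software"],
        "provider_controls": ["physical servers", "data centers", "network fabric"],
        "examples": ["Amazon Elastic Compute Cloud", "Rackspace"],
    },
}

if __name__ == "__main__":
    for model, detail in SERVICE_MODELS.items():
        print(f"{model}: the user controls {', '.join(detail['user_controls'])}")
```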
And the key question here would be whether cloud computing can be understood as an inherently political technology. That's a very, very interesting matter, as we can see now, because the history of computing shows how technical decisions always, or at least often, have important social consequences. I mean, the sphere of technology is not at all opaque to daily life, and innovation is much more than a resource that optimizes efficiency. We have so many examples of it. Believing in the neutrality of technical decisions, or thinking that nowadays there are aspects of human life that remain isolated from technical development, is quite hard. Could anyone here tell me any aspect of our reality, of our activity, which technology, as an instrument or as a metaphor, hasn't altered substantially or isn't affecting? Can anyone think of any sphere? There are some, but not many. I can't think of any right now, but can you come up with any aspect where technology doesn't play an important role? Yes, please. (PUPIL) The countryside, agriculture, livestock... Look, the most attacked companies in the whole world of technology, companies like Monsanto, are precisely related to agricultural production, to the creation of fertilizers, etc., which have automatically transformed the whole agricultural world. Nowadays there's even a huge controversy over companies that are producing what are often called "terminator seeds", I mean, seeds that can be sown by the farmer but are no longer fertile a year later, so they're of no use for a new harvest, and the farmer depends on the company all the time to keep sowing his fields. In fact, there's a label that serves as a mantra in the agricultural sphere, which is the Green Revolution. It involves the use and general promotion of transgenic organisms to enhance productivity in the name of erasing starvation in the world. The only problem is that, applying a basic precautionary principle, we still don't have much experience of what the consequences of the technological transformation of agriculture will be. I can only think of one that affects a huge part of the people here: at the end of the 20th century, Western men had 90% less sperm than Western men from the beginning of the 20th century. I mean, those from the end of the 20th century had 90% less than those from the end of the 19th and beginning of the 20th, because of diet, among other things. If you can tell me something natural that you can buy at a supermarket, I would encourage your teachers to add one more point to all of your subjects. Go shopping at a supermarket and tell me what's natural there. (PUPIL) Unless it's an organic supermarket... Okay… I can say something about wheat. The wheat we eat nowadays is already completely different from the wheat that ceased to exist in the 40s and 50s, because taller ears of wheat were bred so the combine harvesters could reap them properly, and from that point on, 96 or 97% of all the wheat in the world is already transgenic, already transformed from the original varieties. Yes, anything else, please? (PUPIL) Philosophy. Wow, well, philosophy… You've actually said something quite beautiful, because philosophy in many cases is still unaware of the subject of technology. When they say "Man, the essence of man, man"... Can anyone tell me what man is, apart from science? If you know what a human being is...
What's a human being when, at the Beijing Olympics, they started to introduce a committee on genetic doping to detect the insertion of monkey genes so that high jumpers would have more explosiveness in the jump? Or dolphin genes so that swimmers' skin would be more hydrodynamic? When the result of an athletic event, or endurance capacity, can be totally transformed through substances like erythropoietin or tetrahydrogestrinone? I mean, doping is so highly developed, the human being can be transformed in such an incredible way, and also, when we see the difference between a Pan troglodytes chimpanzee and a human being... who'd like to tell me what it is? Minimal: 46 chromosomes for us, 48 chromosomes for them. Take two acrocentric chromosomes, turn them into a metacentric one, and we already have patterns for a possible transformation, an artificial evolution, from chimpanzee to human being or vice versa. I mean, the essence of the human being is completely touched and defined by science, and philosophy is largely unaware of it. A philosophy that doesn't talk with science is just as blind as a computer science that's not in touch with the outside world. That's why the whole idea of free software, free knowledge, etc., has so much strength: because, until now, we've conceived science as a centralized knowledge, not a peripheral one, thinking that specialists are at the center of the system. However, nowadays we begin to see that so-called "hidden innovation", I mean, the innovation that is not produced in universities, research institutes, etc., plays a key role today; how so-called "power users", or "early adopters", people who adopt a technology early and give it new uses, often open development paths that seemed totally unthinkable until then. And the problem that free software and free knowledge pose to the system lies in the metaphor they entail, because, somehow, if one gets used to sharing, we're already talking about a whole new mental model where synergy is opposed to competition. Where everything not given away is lost, and where wealth no longer has anything to do with gold, with something you keep under your bed, but more to do with lettuces and tomatoes, I mean, perishable products which at some point lose their value if they are not watered, kept, taken care of, etc. So, there's also another interesting paradigm. Our economy was based on the law of supply and demand, through which the scarcest thing has the most value because few people have access to it. So, if anyone owns a Rolls-Royce Silver Ghost from the 30s, it's very valuable, not because it's a great car, but because it's a very rare car; there are very few in the world. The fewer there are, the more valuable they are. There's a movie, "The Collector", where a philatelist who collects rare stamps finds out he has a very limited one and then finds another person who has another copy of the same stamp. What does he do when he has both stamps in his hand, when he manages to buy the second stamp? He tears one of them apart, because one unique stamp is more valuable than the sum of two rare stamps. I mean, the fewer people have access to something, the more valuable it is. However, if we talk about emails or operating systems... If someone has an operating system that nobody else uses, or that only one person uses, is it worth much or little? If someone has an email account and is the only person with access to the Internet, is it worth much or little? Nothing at all. When does it start being valuable?
The moment more people have it. When is it very valuable? When everyone has it. So, when we talk about an economy of information, about cloud computing, when we work on the cloud and we work in a network, the total value of the network rises with the square of the number of users; in other words, the total value of the network increases as a quadratic function (there's a small numerical sketch of this below). While in the physical world, the fewer people have access to something, the more valuable it is. Alright? Technology can embody models of life, ideological forms, and also ways of dissolving power or resolving controversies regarding power. My teacher, Langdon Winner, has a beautiful article titled "Do artifacts have politics?" in which he presents many cases, and one of them is Robert Moses'. Robert Moses was the great builder of New York who, in the 30s and 40s, carried out the public works that gave the city the role of business metropolis it still has today. Among his most acclaimed works are the more than 200 overpasses on the way to Jones Beach park, on Long Island, which is like a green lung for the city, like the Casa de Campo of its time. He built them in the 30s; they're overpasses that cross the big parkways, such as the Palisades Parkway, leading out to Long Island. Under those overpasses there's a clearance of 2.5 meters, which contrasts with the huge width of multiple-lane highways in the United States. When people tried to understand why this man had built overpasses of just 2.5 meters, they started with aesthetic and architectural explanations: the evolution of European art nouveau, the qualities of the materials, the volumetric dialogue… And Langdon Winner says in this article: "None of that is true; it needs to be seen in its social environment." At that time Robert Moses was quite racist and snobbish, but in any case he couldn't hang a sign in the parks of New York saying "Black and Hispanic people are not welcome in these parks." However, when overpasses with a clearance of 2.5 meters are built, what can pass below them? Automobiles, right? But what cannot pass? (PUPIL) Buses. So if you technically prevent buses from passing under those overpasses, access to those parks is radically limited. Technology was acting as a means of social discrimination, so that only Black and Hispanic people with their own cars could reach those parks. And everyone knows that Black people who had their own car in the 30s, and have a lot of money and are very famous, whiten before our eyes in a metaphysical way, just like the transformation Michael Jackson went through, but naturally. If we continue, we'll see that until now we've gone from centralized models, from the famous mainframes, the IBM S/360 and S/370 of the 60s and 70s, to distributed computing, the citizen web 2.0, web 3.0, the Internet of things, etc. We've gone from models where intelligence lay at the center to distributed intelligence. From time to time, a new sociotechnical paradigm arises. Now there are tablets, smartphones, phablets, virtual reality, and I hope that by the end of this year we already have the first Oculus Rift and PlayStation VR goggles on sale. The big question is whether all of this has anything to do with a wider cultural context. There are official uses, but beneath them there are unofficial uses created by the users themselves, and this is what I was earlier calling "hidden innovation".
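As a small numerical aside before moving on to the dilemmas: the point made above about network value growing with the square of the number of users can be sketched like this (a toy illustration of my own, not from the talk; the constants and function names are arbitrary), contrasting the scarcity logic of the stamp with the network logic of email:

```python
# Toy comparison of the two logics of value discussed above.
def scarcity_value(copies_in_existence: int, base: float = 1_000_000.0) -> float:
    """Physical-world logic: the fewer copies exist, the more each one is worth."""
    return base / copies_in_existence

def network_value(users: int, value_per_link: float = 0.01) -> float:
    """Network logic: each user can reach every other user, roughly n**2 links."""
    return value_per_link * users * users

if __name__ == "__main__":
    for copies in (1, 2, 100):
        print(f"rare stamp, {copies} copies in existence: {scarcity_value(copies):,.0f}")
    for users in (1, 1_000, 1_000_000):
        print(f"email network, {users:,} users: {network_value(users):,.0f}")
```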
If we talk about the dilemmas that cloud computing brings with it, there are three fundamental risks that should be noted, plus a fourth one shown there. First, there are the privacy issues, with the loss of control over access to the information we store on the cloud. Think about the messages that go through Gmail, the photos on Picasa or Instagram, etc. Then there are intellectual property issues, due to the delocalization of servers and the multiplicity of legislations. Cloud computing is ubiquitous; however, servers sit in specific countries, and it is those countries' laws they must follow. Therefore, if a server is in China, in the United States, in a data haven or in the European Union, it will obey very different legislations on the protection of personal data. And then there are problems related to the inherently political nature of cloud computing: technical and political decisions on CC implementation are, as we'll see, interdependent. There's a very interesting concept from de Bruin and Floridi, the concept of interlucency, of transparency in a way. They defend that the fundamental requirement for an ethical use of cloud computing is the existence of an epistemic virtue that creates the shared knowledge users need in order to know what they are agreeing to. I mean, there's a duty on the provider's side to inform about the conditions of use of CC and about the pros and cons for the users, so that the user may decide what he or she really wants to do. The truth is I don't really share this point of view, for one reason: we often see that the terms of agreement of computer programs are like the leaflets that come with medication. Is there anyone here who reads the full contract of a program, of Windows or of an Apple operating system, when you click "I agree"? Yes? You do read it? You are the first one in the world I've met. (LAUGHTER) There was an experiment at a bar in the United States, not long ago, in which they allowed the use of the WiFi after showing a contract and having people answer "I agree", and that contract involved the sale of your children, your home and a few other things. And people fell for it directly; no one reads those contracts. Deep down, I think that, before proper information about what we gain or what we lose, we need a deeper education about what society we want and which values we want to serve with our own work. When we talk about these ethical dilemmas, I'll mention ten in particular, and I'll go fast so we have time later to talk about them. First, there's already a new factor of digital divide. If society used to be divided between rich and poor, and now between info-rich and info-poor, we should also take into account that in cloud computing any infrastructural weakness will greatly affect its implantation. I mean, the world can be divided into two categories: first-class places, where there's fast Internet access, and second-class places, where there's no service. In the latter there will never be informational equality, because one won't have the processing capacity; somehow, we not only become slaves to our devices but slaves to what we could call "imperfect places", those places where one cannot use all of the cloud computing services. Reliability becomes an absolutely critical element, given that our data is on the cloud, like what happened to Yahoo: not long ago 50 million passwords were stolen, I think that's the number, right? In a single strike. If something is stolen from your own computer, the damage is limited. However, any theft, any fragility of vital subsystems, becomes a big problem.
If you've noticed, we're going back to a centralized model, where intelligence is once again part of the core, the central nucleus of the system. All of this, of course, creates a collective perception of vulnerability, and that's very interesting. The same thing happens with energy models. It's not the same to choose a centralized energy model, with nuclear plants, big dams, etc., as a distributed one based on green energies. It's not just a technical decision, it's a political decision. If our energy is based on uranium or on plutonium, a government will feel justified in stating "We are forced to tap your telephone lines and your electronic communications for the sake of the security of society", "We have to torture terrorists in prisons in the face of the threat of a possible attack, of an airplane being crashed into a nuclear plant", etc. But look: if the energy model were distributed and not centralized, I can't picture Osama bin Laden with a baseball bat smashing solar panels on rooftops, I can't. So, if there weren't that centralized energy model, there wouldn't be that collective perception of vulnerability either. And the perception of vulnerability is what leads us to accept the undesirable as inevitable; it's the kind of policy that Naomi Klein, the great Canadian intellectual, denounces in what she calls "The Shock Doctrine": on many occasions, measures we do not wish to adopt are presented to us as inevitable. Earlier we were talking about the Faculty of Philosophy, and I'll give you one small, fun example. For many years the Faculty of Philosophy had never had locked doors on its classrooms; however, there comes a time when the deans are interested in having control over the classrooms, so, to be able to lock people out, they think of adding locks to the doors. But the students, through their representatives, would have complained immediately if that had happened. What's the solution? Install projectors. They put projectors inside the classrooms, without security cages and mounted low, so many of them got stolen. The moment they're stolen, locks are added to the doors, not to lock people out, but to protect the university's property. So look how the best way to get a problem out of the way is to redefine it: a political problem is redefined as a technical problem. The same thing happens with a government. If a government wants a strong police force, a harsh system of social control, the best thing it can do is to create a very sensitive technology, very likely to be attacked and very centralized, very concentrated. At that moment, we'll have to choose whether to protect society and its liberties or to protect our own infrastructures. Therefore, uninterrupted functioning becomes an absolute must: nothing can be done offline and, obviously, the risk of a cyberterrorist attack is bigger in a society whose services are centralized and sensitive. In the end, we'll have to decide what to protect: society or the cloud itself. The Internet will need a more democratic architecture to prevent cloud computing from turning into a new form of social control. I don't want to dwell on this, on BGP, but as you know, the Border Gateway Protocol is the Internet's way of determining the flow of data packets between domains. And it is not very flexible and not very smart when it comes to choosing a path that goes through the smallest possible number of autonomous systems.
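To make that point a bit more concrete, here is a minimal sketch of my own (not real routing code; the AS numbers and function names are invented) of route selection that simply prefers the shortest AS path, contrasted with the multi-path alternative mentioned next:

```python
# Toy illustration of shortest-AS-path route selection versus a multi-path alternative.
from typing import Dict, List

# Hypothetical routes to the same destination, each described by its AS path.
advertised_routes: Dict[str, List[int]] = {
    "via_provider_A": [64512, 64520, 64530],         # crosses 3 autonomous systems
    "via_provider_B": [64512, 64521, 64531, 64540],  # crosses 4 autonomous systems
    "via_provider_C": [64513, 64530],                # crosses 2 autonomous systems
}

def single_best_route(routes: Dict[str, List[int]]) -> str:
    """Pick the one route with the shortest AS path, as plain BGP-style selection tends to."""
    return min(routes, key=lambda name: len(routes[name]))

def multipath_routes(routes: Dict[str, List[int]], k: int = 2) -> List[str]:
    """A multi-path alternative: keep the k shortest routes and share traffic among them."""
    return sorted(routes, key=lambda name: len(routes[name]))[:k]

if __name__ == "__main__":
    print("single best route:", single_best_route(advertised_routes))
    print("multi-path set:   ", multipath_routes(advertised_routes))
```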
More intelligent, decentralized, multi-path routing will be necessary if cloud computing is not to turn into a new form of social control. Examples of this are the viral networks of Walter Lippmann or the whole Open Spectrum issue, I mean, not assigning radio or microwave frequencies but sharing the whole spectrum intelligently. But that would be another matter. There's also what we said before: the delocalization of information and the extraterritoriality of laws. CC services are, by definition, ubiquitous; however, communication infrastructures and servers belong to countries and are subject to national regulations. The legal framework is very confusing nowadays. And there are risks to the privacy and integrity of personal information. CC implies massive data traffic, and that data flows outside our firewalls, so with systems of shared storage, shared channels, shared resources, and virtualization of the same data in different operating systems at the same time, there's always a major vulnerability factor. And we shouldn't forget that the user is ultimately responsible for any violation of the law. Also, talking about the challenges to social empowerment and hidden innovation, we must remember that having local tools and, above all, free software allows us to be the owners of what we're doing. However, within this whole set of CC services, the user doesn't need any specific skills to work with them, and intelligence is on the cloud's side, not on the user's. What about technical autonomy? If there's a shutdown, if a service like FileFactory closes, what happens to people? If we stop using our local tools and let image storage services be the ones that process and exchange our images, etc., there comes a point at which we lose an important part of our autonomy, which Richard Stallman denounces in this quote: "One reason you should not use Web applications to do your computing is that you lose control. It's just as bad as using a proprietary program." He encourages us to do our own computing with copies of a freedom-respecting program; otherwise, one is putty in the hands of whoever developed that software. Threats to network neutrality are also important, since thin clients, I mean, the not-so-powerful computers from which these services are launched, are enough to enjoy them. The key issue is the quality of network access, so access quality becomes a critical requirement, and there will be more and more companies interested in giving people who are willing to pay more a privileged access to the network, hence so much pressure for that neutrality to be broken. And there's the need for greater decentralization of the Internet. When the Internet is used to further centralize computing power, the pendulum seems to be swinging away from individual autonomy and towards more concentrated power in fewer hands. We should ask ourselves whether this is within the Internet's philosophy. CC will be, from this point of view, an inherently political technology. Once the fundamental decisions are taken, changing the sign of its social impact will be extremely difficult, just like with Robert Moses' overpasses: once those were built, there was no way for the buses to pass under them to get to the parks, so adding railings and flowers wouldn't change the system much. So the important thing is to be able to do a social futurology prior to the actual uses of CC.
From this point of view, CC will affect ever more areas of our activity, be it our jobs or our leisure time. Pokémon GO, for example, is a good example of an on-cloud game, but, as we'll also see, driverless autonomous cars by Apple and Google are becoming elements that use CC features such as geolocation, and also ethical rules, as we'll see, to decide how a car should react in a risk situation, such as when there are many people on the road ahead and we have to swerve. What will the car do? Run them over, or throw the driver and the family down a cliff? We'll see more about that later. Corporate agents, the companies, become much more vulnerable to the influence of central states, for reasons of national security, law enforcement, the war on terrorism, the defense of national values, the protection of free trade, etc., to the point that companies become a kind of cultural and political battering ram in this war. Google, Microsoft and Apple admitted that they accepted subsidies from the US government in exchange for identifying users and content and for giving general access, without an explicit judicial order, to files downloaded by users to their systems. I'll quote two very curious sentences in which a Google attorney says that "users of any email system who exchange information and mail with a Gmail user should have no legitimate expectation of privacy whatsoever", I mean, they shouldn't be surprised that their emails are being monitored. And because of this he states: "Just as a sender of a letter to a business colleague cannot be surprised that the recipient's assistant opens the letter, people who use Web-based email today cannot be surprised if their emails are processed by the recipient's email provider." And, of course, a person should not have a legitimate expectation of privacy regarding information that they voluntarily turn over to third parties. It's quite curious, because when I hand a letter to the mail service I'm not expecting the postman to open it, just to deliver it. I mean, I'm not giving him a blank check; the question is: why am I giving one to Gmail? And of course, when I check my mail from my Complutense teacher account, I suddenly notice it says "Featured messages", "Important people who sent you information". And I say: why do you people have to check whether those people are important to me? Of course, that entails scanning my information. Okay, it's what I said before: sending an email is like giving a letter to the mail service. The expectation is that the service delivers that letter, not that it opens it and reads it. Similarly, when I send an email I expect it to be delivered to the intended recipient with a Gmail account; why would I expect its content to be intercepted or read by the provider? And, lastly, we have so-called "function creep", I mean, the unauthorized use of information for different purposes: personal data collected for a particular, well-defined purpose can be added to other data, mixed, combined, and used to obtain results that are unrelated to the original purposes.
Take the case of Instagram: in 2012 it announced that its advertisers could freely make use of any uploaded personal picture without economic compensation or notification to the users. A big scandal broke out and they backed down, but in any case this is an example of how, with just a change in the ownership of a company, or with an American company being sold to another country outside the US or the European Union, with another type of legislation, the laws that the information inside the servers must conform to change completely. There are other dilemmas as well that have to do with such fun things as cloud robotics, autonomous cars, the programming of drones: who's responsible for the accidental shots fired by a drone that is, by definition, automatic? Automated weapons are also subject to this dilemma and, of course, they take the question of the morality and ethics of robots to a level more complex than anything we had until now. Car makers are facing the problem of which algorithm they should use to protect the passengers in the event of a possible collision. Let's say you're driving on a mountain road and there are five people on the road. Your autonomous car notices there are five people and, to prevent the greater harm, it throws itself down the cliff. The question is: who would buy that car? But if you ask people how an autonomous car should be programmed, the answer will be: "aiming for the greatest possible good". It's the utilitarian approach (there's a small sketch of this rule at the end of this passage). This takes us to the famous trolley problem, which Philippa Foot introduced in 1967; it's a very fun matter. Let's say you're on a trolley with no brakes and all of a sudden you see ahead of you a certain number of workers, let's say five, and you're going to run them over. However, if you switch tracks, because there's a fork, you kill just one person. What would you do in that case? Divert and kill one person, or run the five over? And the answers are obviously very complicated. You can say: "Well, five is okay, but what if there are five on one side and a little kid on the other?" How do we react then? There's also another version where there's a very, very fat person who, we know, if we push him off, will make the trolley derail and nobody will be killed. We must throw down the fat man; this sounds like something by Mike Myers, right? So here we have a very important problem of knowing how to react. Also, when we talk about on-cloud robotics, look at the extraordinary capability given by the fact that the programming of robots or automatic devices, be they weapons, robots, autonomous cars, etc., can be done live from the cloud itself, so changes are much more effective and efficient. Good, we're ending here with the matter of CC as a way of life. According to Langdon Winner, technologies are forms of life because they reflect our interests and desires and embody people's goals, the goals we promote and develop. In a world of global consciousness, the global technical infrastructure also needs to have a certain ethical component, and of course CC is very compatible with the notion of privacy held by young people today, for whom privacy is not something to defend in an absolute way, but relatively. And it must be counterbalanced with every youngster's right to be able to live-stream his or her life. So it's not just about the right to data privacy, but the right to share with my friends all the data which may be important to me.
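Here is the small sketch mentioned above of the purely utilitarian rule, "aim for the greatest possible good" (a toy example of my own, not any manufacturer's algorithm; the class, values and names are invented). It shows why such a rule can decide to sacrifice the car's own occupants:

```python
# Toy model of a purely utilitarian maneuver choice: minimize total expected harm.
from dataclasses import dataclass
from typing import List

@dataclass
class Maneuver:
    name: str
    pedestrians_harmed: int
    occupants_harmed: int

    @property
    def total_harm(self) -> int:
        # The utilitarian rule counts everyone equally, occupants included.
        return self.pedestrians_harmed + self.occupants_harmed

def utilitarian_choice(options: List[Maneuver]) -> Maneuver:
    """Pick the maneuver with the least total harm, whoever bears it."""
    return min(options, key=lambda m: m.total_harm)

if __name__ == "__main__":
    options = [
        Maneuver("stay on the road", pedestrians_harmed=5, occupants_harmed=0),
        Maneuver("swerve off the cliff", pedestrians_harmed=0, occupants_harmed=2),
    ]
    print("the utilitarian car chooses:", utilitarian_choice(options).name)
    # -> "swerve off the cliff": exactly the car the speaker asks who would buy.
```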
We also need a critical approach to problems like these, since more advanced technology doesn't automatically mean a better society. "Technological determinism" would have it that, given the advance of the Internet, everything is getting democratized: given the advance of information technologies and the drop in prices driven by Moore's law, everyone will eventually be able to enjoy, through a trickle-down effect, what was very expensive at one time and now becomes cheaper and accessible to everyone. Well, it's been shown that this is not so simple. Technology opens some democratizing doors, but it doesn't necessarily achieve this on its own, without any other element, if a new form of social construction, a new social order, is not created. When technologies take over society, they form a new sociotechnical system which sometimes even replaces constitutions themselves, because they tell us how to distribute power, authority, freedom and justice. We should ask ourselves whether that is somehow the case with cloud computing. CC will allow companies and agencies to function at a much larger scale, providing global solutions. The question is whether we will eventually eliminate cultural differences and local solutions in favor of a unified, standardized market-society. The rationality of CC will also impose new forms of hierarchical reorganization; new actors will rule, and information, as big data, will be power more than ever. Maybe the perfect paradigm lies in combining the flexibility given by CC with the data-processing capacity given by data mining and big data in general. In this digital ecosystem, we'll have to see whether CC will prey on other, non-global business models, right? Since we can work at a much larger scale with the same flexibility as local models, who, for example, buys at their local store instead of doing it on Amazon? And there will also be a bigger concentration of political power in CC companies, which will have a powerful voice in the regulation or deregulation of the market, controlling the political institutions that should control them; there could then be a shift of political power leading to a society more vulnerable to corporate interests. Alright, it's 2:00, only a minute to, uh... to finish. Thanks for your patience. As I said before, those who call themselves "generation X" or "millennials", I mean, people who were born after the year 2000, do not care very much about who owns a personal picture of them stored on a cloud service, but care more about the freedom to share it and show it at will. So any process of identity construction emerges from the "mediascape". When one built one's identity in the traditional way, one didn't carry it around; it was what was shared: the neighborhood one lived in, the language one spoke, the nation one lived in, etc. But today the soundtrack of our life is made of what we listen to, read, put on our MP3 player. So look at the walkman metaphor itself. When Sony invented the walkman, which was the first portable battery-powered cassette player one could carry on the street with earphones, they invented much more than a device. The concept of the walkman is also "the man who walks", the person who builds his or her own identity through that walk; so what one carries in one's ears, in one's brain, what one listens to through a walkman, initially, and now through an MP3 or MP4 player or whatever, becomes one's true landscape. In other words, one's life landscape is the "soundscape" or the "mediascape".
If we want to know what's going on with a person, what their habits and values are, it's much more important to know what's on their MP3 player than to know where they live, walk and move around. So it's a mobile construction of identity. And to finish, I'd just like to highlight a fun aspect, which is the discussion Weaver makes of "god terms" and "devil terms", I mean, the best terms and the worst terms. A "god term", a super positive term, is one of those that leave everyone in awe the moment they're spoken, so they play a huge role in setting the science agenda in countries like the US. The concept of the last frontier, for example, has been a constant reference in American politics, when they talked about the conquest of the West, the last frontier. Even in Star Trek: space, the final frontier. And it's always an appeal to an achievable dream. When talking about CC, if the data is on the cloud it'll be that much closer to God, and if it's really up in the sky, who's going to worry about it, right? There's a certain funny rhetoric in that, but Weaver and Langdon Winner say that language reflects a society, so we have to be very careful that certain terms, such as freedom, rights, privacy, information ownership, etc., are not redefined from a corporate context. We're closing with five short conclusions. First, cloud computing is a revolutionary development in IT, and it is a sociotechnical paradigm: a whole rationality, a form of social and territorial organization, etc. Second, there is strong financial pressure to invest in critical infrastructures; it doesn't work at all if the territory is not wired. Third, it can be interpreted as an inherently political technology in the strong sense, I mean, certain technologies require a certain set of conditions to function properly. Look at this: there's an article about technology... about computing, well, I'm kidding, but a really sharp one, written by Engels in 1872 and titled "On Authority". A tremendously interesting article in which Engels, the famous socialist author, argues against the anarchists. Engels says in this article: the anarchists say that if, deep down, the workers take control of a mass-production factory, the worker will be freed and people will live better. However, what Engels argues is that it's not the factory owner who oppresses the workers, but what he calls "the authority of the steam". He says: when we mass-produce, when we produce through technology, we dominate nature through it but, at the same time, we become slaves to ourselves, because if one doesn't follow the pace set by the production line, things won't be properly made, they won't pass the quality test, and it's as if nothing had been made. Then Engels says: honestly, it's not always the will of some people that oppresses others; every technological model, at a certain moment, brings certain rules of the game that one can hardly escape. Because of this, even Marx (notice that there's a deterministic version of him in the first volume of "Capital") states: "The plough is to agrarian society what the steam engine is to industrial society." In another version he says: "The plough is to agrarian society what the mechanical loom is to industrial society." I mean, where there are ploughs, the organization is feudal. Where there's Internet, the organization can't be feudal, it can't be centralized, it can't be as simple as before.
Finally, we don't have to wait for a technology to be implemented in order to forecast its impact on society; those impacts are due, above all, to its nature. The last point is, to me, the most interesting, though it's obviously open to discussion: CC is, in fact, a political constitution. When people get used to having their data not on their phones but on the cloud, their mentality will have changed. Just as a person's mentality changes when they have their own energy at home, when they have their photovoltaic panel or their own production from bituminous shales or biomass or compost, or when they have their wind turbines in windy lands (the Canary Islands, for example, are great for having a wind turbine at home, it makes you practically self-sufficient). When people have informational autonomy and energy autonomy, they often also ask for a certain political autonomy. However, when our energy and information systems, etc., are centralized, they also promote a more centralized mentality, more competitive and more selfish. So the discussion on cloud computing, and this is the final sentence, doesn't have to do only with technical design, but with which values we are willing to support and with the kind of society we want to be. Thank you so much, it's been a pleasure being with you.