0:00:00.000,0:00:14.990
34c3 intro
0:00:14.990,0:00:23.740
Herald: the next talk is Marloes de Valk,[br]she's an artist and writer from the
0:00:23.740,0:00:31.289
Netherlands and she's working with lots of[br]different materials and media and at the
0:00:31.289,0:00:40.490
moment she's doing an 8-bit game, so the[br]topic is "why do we anthropomorphize
0:00:40.490,0:00:48.460
computers and dehumanize ourselves in the[br]process?" and we have a mumble, which is
0:00:48.460,0:00:54.240
doing the translation, the talk is in[br]English and we will translate into French
0:00:54.240,0:01:01.930
and German.[br]Okay, give a big applause for Marloes!
0:01:01.930,0:01:09.670
applause
0:01:09.670,0:01:13.740
Marloes: Thank you and thank you all for
0:01:13.740,0:01:20.860
coming, my name is Marloes de Valk and I'm[br]going to talk about anthropomorphization
0:01:20.860,0:01:27.450
and I will approach this as a survival[br]strategy, see how it works and if it is
0:01:27.450,0:01:34.490
effective. And when I'm speaking of big[br]data, which is an umbrella term, my focus
0:01:34.490,0:01:38.940
will be on the socio-technical aspect of[br]the phenomenon, the assumptions and
0:01:38.940,0:01:44.090
beliefs surrounding Big Data and on[br]research using data exhaust or found data
0:01:44.090,0:01:49.770
such as status updates on social media, web[br]searches and credit card payments.
0:01:55.750,0:02:08.300
Oh and now my slides are frozen. Oh my[br]gosh.
0:02:09.460,0:02:11.280
Audience: Have you tried turning it off and on again?
0:02:11.280,0:02:15.880
Marloes: laughs[br]I will in a moment. Gosh, it's
0:02:15.880,0:02:25.730
completely frozen... I'm very sorry,[br]technical staff I have to exit, if I can.
0:02:25.730,0:02:40.730
I can't. Help! I have to get rid of[br]something I think, should we just kill it?
0:02:40.730,0:02:55.280
That's so stupid yeah.
0:02:55.280,0:02:58.280
But they're gonna have a coffee soon and then it's gonna
0:02:58.280,0:03:33.070
Yes, force quit... I think I know[br]what the problem is. I'm sorry it's, it's
0:03:33.070,0:04:09.940
really not working. All right let's see if[br]we're back.
0:04:09.940,0:04:18.870
Okay, okay so sorry for the interruption.
0:04:18.870,0:04:24.150
I wanted to start by letting Silicon[br]Valley itself tell a story about
0:04:24.150,0:04:30.930
technology, really, sorry about the[br]interruption. So: Silicon Valley propaganda.
0:04:30.930,0:04:35.330
During our lifetime, we're about to see[br]the transformation of the human race, it's
0:04:35.330,0:04:39.400
really something that blows my mind every[br]time I think about it. People have no idea
0:04:39.400,0:04:42.800
how fast the world is changing and I want[br]to give you a sense of that because it
0:04:42.800,0:04:47.650
fills me with awe and with an[br]extraordinary sense of responsibility. I
0:04:47.650,0:04:52.120
want to give you a sense of why now is[br]different, why this decade, the next decade,
0:04:52.120,0:04:56.370
is not interesting times[br]but THE most extraordinary times ever in
0:04:56.370,0:05:00.520
human history and they truly are. What[br]we're talking about here is the notion
0:05:00.520,0:05:04.670
that faster cheaper computing power which[br]is almost like a force of nature, is
0:05:04.670,0:05:09.350
driving a whole slew of technologies,[br]technology being the force that takes what
0:05:09.350,0:05:14.090
used to be scarce and makes it abundant.[br]That is why we're heading towards this
0:05:14.090,0:05:19.630
extraordinary age of abundance. The future[br]will not take care of itself as we know
0:05:19.630,0:05:23.590
the world looks to America for progress[br]and America looks to California and if you
0:05:23.590,0:05:28.070
ask most Californians where they get their[br]progress they'll point towards the bay,
0:05:28.070,0:05:33.210
but here at the bay there is no place left[br]to point, so we have to create solutions
0:05:33.210,0:05:37.840
and my goal is to simplify complexity,[br]take Internet technology and cross it with
0:05:37.840,0:05:43.020
an old industry and magic and progress and[br]big things can happen. I really think
0:05:43.020,0:05:46.990
there are two fundamental paths for[br]humans, one path is we stay on earth
0:05:46.990,0:05:51.650
forever, or some eventual extinction event[br]wipes us out, I don't have a doomsday
0:05:51.650,0:05:56.009
prophecy but history suggests some[br]doomsday event will happen. The
0:05:56.009,0:06:00.930
alternative is becoming a spacefaring and[br]multiplanetary species and it will be like
0:06:00.930,0:06:06.670
really fun to go, you'll have a great[br]time. We will settle on Mars and we should,
0:06:06.670,0:06:10.630
because it's cool. When it comes to space[br]I see it as my job to build infrastructure
0:06:10.630,0:06:14.480
the hard way. I'm using my resources to[br]put in that infrastructure so that the
0:06:14.480,0:06:19.150
next generation of people can have a[br]dynamic entrepreneurial solar system as
0:06:19.150,0:06:23.000
interesting as we see on the internet[br]today. We want the population to keep
0:06:23.000,0:06:29.430
growing on this planet, we want to keep[br]using more energy per capita. Death
0:06:29.430,0:06:33.889
makes me very angry, probably the most[br]extreme form of inequality is between
0:06:33.889,0:06:38.180
people who are alive and people who are[br]dead. I have the idea that aging is
0:06:38.180,0:06:42.050
plastic, that it's encoded and if[br]something is encoded you can crack the
0:06:42.050,0:06:46.390
code if you can crack the code you can[br]hack the code and thermodynamically there
0:06:46.390,0:06:51.470
should be no reason we can't defer entropy[br]indefinitely. We can end aging forever.
0:06:51.470,0:06:54.510
This is not about[br]Silicon Valley billionaires
0:06:54.510,0:06:57.199
living forever off the blood of young[br]people.
0:06:57.199,0:07:02.020
It's about a Star Trek future where no one[br]dies of preventable diseases where life is
0:07:02.020,0:07:05.770
fair. Health technology is becoming an[br]information technology, where we can read
0:07:05.770,0:07:10.539
and edit our own genomes. Clearly it is[br]possible through technology to make death
0:07:10.539,0:07:16.880
optional. Yes, our bodies are information[br]processing systems. We can enable human
0:07:16.880,0:07:22.350
transformations that would rival Marvel[br]Comics: super muscularity, ultra endurance,
0:07:22.350,0:07:26.880
super radiation resistance, you could have[br]people living on the moons of Jupiter,
0:07:26.880,0:07:30.259
who'd be modified in this way and they[br]could physically harvest energy from the
0:07:30.259,0:07:35.199
gamma rays they were exposed to. Form a[br]culture connected with the ideology of the
0:07:35.199,0:07:39.080
future, promoting technical progress,[br]artificial intellects, multi-body
0:07:39.080,0:07:44.419
immortality and cyborgization. We are at[br]the beginning of the beginning, the first
0:07:44.419,0:07:49.250
hour of day one. There have never been[br]more opportunities; the greatest products
0:07:49.250,0:07:54.830
of the next 25 years have not been[br]invented yet. You are not too late.
0:07:54.830,0:07:59.700
We're going to take over the world, one[br]robot at a time. It's gonna be an AI that
0:07:59.700,0:08:03.740
is able to source, create, solve and answer[br]just what is your desire. I mean this is
0:08:03.740,0:08:09.380
an almost godlike view of the future. AI[br]is gonna be magic. Especially in the
0:08:09.380,0:08:14.310
digital manufacturing world, what is going[br]to be created will effectively be a god,
0:08:14.310,0:08:17.810
the idea needs to spread before the[br]technology, the church is how we spread
0:08:17.810,0:08:21.690
the word, the gospel. If you believe in[br]it, start a conversation with someone else
0:08:21.690,0:08:26.280
and help them understand the same things.[br]Computers are going to take over from
0:08:26.280,0:08:29.940
humans, no question, but when I got that[br]thinking in my head about if I'm going to
0:08:29.940,0:08:33.599
be treated in the future as a pet to these[br]smart machines, well I'm gonna treat my
0:08:33.599,0:08:38.068
own pet dog really nice, but in the end we[br]may just have created the species that is
0:08:38.068,0:08:42.688
above us. Chaining it isn't gonna be the[br]solution as it will be stronger than any
0:08:42.688,0:08:48.149
chain we could put on. The existential risk[br]that is associated with AI: we will not be
0:08:48.149,0:08:51.180
able to beat[br]AI, so then as the saying goes if you
0:08:51.180,0:08:54.879
can't beat them, join them.[br]History has shown us we aren't gonna win
0:08:54.879,0:08:58.879
this war by changing human behavior but[br]maybe we can build systems that are so
0:08:58.879,0:09:03.149
locked down, that humans lose the ability[br]to make dumb mistakes until we gain the
0:09:03.149,0:09:08.190
ability to upgrade the human brain, it's[br]the only way. Let's stop pretending we can
0:09:08.190,0:09:11.929
hold back the development of intelligence[br]when there are clear massive short-term
0:09:11.929,0:09:16.069
economic benefits to those who develop it[br]and instead understand the future and have
0:09:16.069,0:09:20.611
it treat us like a beloved elder who[br]created it. As a company, one of our
0:09:20.611,0:09:24.119
greatest cultural strengths is accepting[br]the fact that if you're gonna invent,
0:09:24.119,0:09:29.019
you're gonna disrupt. Progress is[br]happening because there is economic
0:09:29.019,0:09:32.629
advantage to having machines work for you[br]and solve problems for you. People are
0:09:32.629,0:09:38.929
chasing that. AI, the term has become more[br]of a broad, almost marketing driven term
0:09:38.929,0:09:42.439
and I'm probably okay with that. What[br]matters is what people think of when they
0:09:42.439,0:09:47.339
hear of this. We are in a deadly race[br]between politics and technology, the fate
0:09:47.339,0:09:51.420
of our world may depend on the effort of a[br]single person who builds or propagates the
0:09:51.420,0:09:57.749
machinery of freedom, that makes the world[br]safe for capitalism.
0:09:57.749,0:10:04.470
These were all quotes. Every single one.[br]Not only Silicon Valley CEOs speak of
0:10:04.470,0:10:07.889
technology in mysterious ways, let's see[br]some examples from the media.
0:10:07.889,0:10:12.079
On artificial intelligence regulation:[br]"let's not regulate mathematics", a headline
0:10:12.079,0:10:16.779
from import.io from May 2016 about the[br]European General Data Protection
0:10:16.779,0:10:22.319
Regulation, and the article concludes:[br]autonomous cars should be regulated as
0:10:22.319,0:10:26.129
cars, they should safely deliver users to[br]their destinations in the real world and
0:10:26.129,0:10:31.320
overall reduce the number of accidents.[br]How they achieve this is irrelevant. With
0:10:31.320,0:10:34.299
enough data the numbers speak for[br]themselves which comes from the super
0:10:34.299,0:10:40.240
famous article "The end of theory" from[br]Chris Anderson in Wired magazine 2008.
0:10:40.240,0:10:45.309
"Google creates an AI that can teach[br]itself to be better than humans" headline
0:10:45.309,0:10:48.549
from "The Independent". The[br]article continues: the company's AI
0:10:48.549,0:10:52.639
division DeepMind has unveiled "AlphaGo[br]Zero", an extremely advanced system that
0:10:52.639,0:10:58.160
managed to accumulate thousands of years[br]of human knowledge within days. Microsoft
0:10:58.160,0:11:02.850
apologizing for their teen chatbot gone[br]Nazi, stating it wasn't their fault. "We're
0:11:02.850,0:11:06.199
deeply sorry for the unintended and[br]hurtful tweets from Tay which do not
0:11:06.199,0:11:13.699
represent who we are or what we stand for[br]nor how we design Tay" and then the PC
0:11:13.699,0:11:18.829
world article "AI just 3d printed a brand[br]new Rembrandt and it's shockingly good",
0:11:18.829,0:11:21.550
the subtitle reads[br]"the next Rembrandt project used data and
0:11:21.550,0:11:26.860
deep learning to produce uncanny results".[br]Advertising firm J. Walter Thompson
0:11:26.860,0:11:32.049
unveiled a 3d printed painting called "the[br]next Rembrandt" based on 346 paintings of
0:11:32.049,0:11:36.100
the old master, not just PC world, but[br]many more articles touted similar titles
0:11:36.100,0:11:41.209
presenting the painting to the public, as[br]if it were made by a computer, a 3d
0:11:41.209,0:11:45.179
printer, AI and deep learning. It is clear[br]though, that the computer programmers who
0:11:45.179,0:11:48.765
worked on the project are not computers[br]and neither are the people who tagged the
0:11:48.765,0:11:53.509
346 Rembrandt paintings by hand. The[br]painting was made by a team of programmers
0:11:53.509,0:11:58.589
and researchers and it took them 18 months[br]to do. So what is communicated through
0:11:58.589,0:12:03.130
these messages is that the computer did[br]it, yet there is no strong AI, as in
0:12:03.130,0:12:07.029
consciousness in machines at this moment,[br]only very clever automation, meaning it
0:12:07.029,0:12:11.570
was really us. We comprehend the role and[br]function of non-human actors rationally,
0:12:11.570,0:12:15.680
but still intuitively approach them[br]differently. We anthropomorphize and
0:12:15.680,0:12:19.160
stories about the intelligent things[br]machines can do reinforce the belief in
0:12:19.160,0:12:26.620
the human-like agency of machines. So why[br]do we do it?
0:12:26.620,0:12:28.019
I'd like to think of this as two survival
0:12:28.019,0:12:32.680
strategies that found each other in big[br]data and AI discourse. George Zarkadakis
0:12:32.680,0:12:37.209
in the book "in our own image" describes[br]the root of anthropomorphization, during
0:12:37.209,0:12:40.610
the evolution of the modern mind humans[br]acquired and developed general-purpose
0:12:40.610,0:12:44.259
language, through social language and this[br]first social language was a way of
0:12:44.259,0:12:48.999
grooming, of creating social cohesion.[br]We gained theory of mind, the belief that
0:12:48.999,0:12:52.959
other people have thoughts, desires,[br]intentions and feelings of their own -
0:12:52.959,0:12:56.579
Empathy. And this led to the describing of[br]the world in social terms, perceiving
0:12:56.579,0:13:01.259
everything around us as agents possessing[br]mind, including the nonhuman, when hunting
0:13:01.259,0:13:05.600
anthropomorphizing animals had a great[br]advantage because you could strategize,
0:13:05.600,0:13:12.190
predict their movements. They show through[br]multiple experiment- Oh, Reeves and Nass
0:13:12.190,0:13:16.040
were picking up on this[br]anthropomorphization and they show through
0:13:16.040,0:13:20.220
multiple experiments that we haven't[br]changed that much. Through multiple
0:13:20.220,0:13:24.309
experiments they show how people treat[br]computers, television and new media like
0:13:24.309,0:13:28.249
real people and places. Even though test[br]subjects were completely unaware of it,
0:13:28.249,0:13:33.189
they responded to computers as they would[br]to people: being polite, cooperative,
0:13:33.189,0:13:37.009
attributing personality characteristics[br]such as aggressiveness, humor, expertise
0:13:37.009,0:13:41.339
and even gender. Meaning we haven't[br]evolved that much, we still do it.
0:13:41.339,0:13:44.799
Microsoft unfortunately misinterpreted[br]their research and developed the innocent
0:13:44.799,0:13:49.589
yet much hated Clippy the paperclip,[br]appearing one year later in Office 97.
0:13:49.589,0:13:54.920
This survival strategy found its way into[br]another one. The Oracle. Survival through
0:13:54.920,0:13:58.480
predicting events.[br]The second strategy is trying to predict
0:13:58.480,0:14:03.049
the future, to steer events in our favor,[br]in order to avoid disaster. The fear of
0:14:03.049,0:14:06.989
death has inspired us throughout the ages[br]to try and predict the future and it has
0:14:06.989,0:14:10.480
led us to consult Oracles and to create[br]a new one.
0:14:10.480,0:14:13.679
Because we cannot predict the future in[br]the midst of life's many insecurities, we
0:14:13.679,0:14:18.000
desperately crave the feeling of being in[br]control over our destiny, we have
0:14:18.000,0:14:23.269
developed ways to calm our anxiety, to[br]comfort ourselves and what we do is we
0:14:23.269,0:14:27.420
obfuscate the human hand in the generation[br]of messages that require an objective or
0:14:23.269,0:14:27.420
authoritative feel. Although disputed, it is[br]commonly believed that the Delphic Oracle
0:14:31.660,0:14:36.079
delivered messages from her god Apollo in[br]a state of trance induced by intoxicating
0:14:36.079,0:14:40.949
vapors arising from the chasm over which[br]she was seated. Possessed by her god, the
0:14:40.949,0:14:45.209
Oracle spoke ecstatically and[br]spontaneously. Priests of the temple then
0:14:45.209,0:14:50.360
translated her gibberish into the[br]prophecies the seekers of advice were
0:14:50.360,0:14:56.220
sent home with. And Apollo had spoken.[br]Nowadays we turn to data for advice. The
0:14:56.220,0:15:00.121
Oracle of Big Data functions in a similar[br]way to the Oracle of Delphi. Algorithms
0:15:00.121,0:15:04.750
programmed by humans are fed data and[br]consequently spit out numbers that are
0:15:04.750,0:15:07.949
then translated and interpreted by[br]researchers into the prophecies the
0:15:07.949,0:15:12.230
seekers of advice are sent home with. The[br]bigger the data set, the more accurate the
0:15:12.230,0:15:16.290
results. Data has spoken.[br]We are brought closer to the truth, to
0:15:16.290,0:15:22.060
reality as it is, unmediated by us,[br]subjective, biased and error-prone humans.
0:15:22.060,0:15:26.529
This Oracle inspires great hope. It's a[br]utopia, and this is best put into words in
0:15:26.529,0:15:29.600
the article "The end of theory" by[br]Anderson where he states that with enough
0:15:29.600,0:15:34.859
data the numbers can speak for themselves.[br]We can forget about taxonomy, ontology,
0:15:34.859,0:15:39.459
psychology, who knows why people do what[br]they do. The point is they do it and we
0:15:39.459,0:15:43.229
can track and measure it with[br]unprecedented fidelity, with enough data
0:15:43.229,0:15:48.149
the numbers speak for themselves. This[br]Oracle is of course embraced with great
0:15:48.149,0:15:52.649
enthusiasm by database and storage[br]businesses as shown here in an Oracle
0:15:52.649,0:15:58.660
presentation slide. High Five! And getting[br]it right one out of ten times and using
0:15:58.660,0:16:02.730
the one success story to strengthen the[br]belief in big data superpowers happens a
0:16:02.730,0:16:06.889
lot in the media, a peculiar example is[br]the story on Motherboard about how
0:16:06.889,0:16:10.579
"Cambridge Analytica" helped Trump win the[br]elections by psychologically profiling the
0:16:10.579,0:16:14.809
entire American population and using[br]targeted Facebook ads to influence the
0:16:14.809,0:16:20.209
results of the election. This story evokes[br]the idea that they know more about you
0:16:20.209,0:16:26.790
than your own mother. The article reads[br]"more likes could even surpass what a
0:16:26.790,0:16:31.019
person thought they knew about themselves"[br]and although this form of manipulation is
0:16:31.019,0:16:36.160
seriously scary and very undemocratic, as[br]Cathy O'Neil, author of "Weapons
0:16:36.160,0:16:41.879
of Math Destruction", notes: "don't[br]believe the hype".
0:16:41.879,0:16:45.459
It wasn't just Trump, everyone was doing it.[br]Hillary was using the Groundwork, a
0:16:45.459,0:16:49.380
startup funded by Google's Eric Schmidt.[br]Obama used the Groundwork too, but the
0:16:49.380,0:16:53.089
groundwork somehow comes across a lot more[br]cute compared to Cambridge analytica,
0:16:53.089,0:16:56.279
funded by billionaire Robert Mercer, who[br]is also heavily invested in alt-right
0:16:56.279,0:17:00.759
media outlet Breitbart, which describes[br]itself as a killing machine waging the war
0:17:00.759,0:17:05.869
for the West. He also donated Cambridge[br]Analytica's services to the Brexit campaign.
0:17:05.869,0:17:08.949
The Motherboard article and many others[br]describing the incredibly detailed
0:17:08.949,0:17:13.019
knowledge Cambridge Analytica has on[br]American citizens were amazing advertising
0:17:13.019,0:17:17.659
for the company, but most of all a warning[br]sign that applying big data research to
0:17:17.659,0:17:21.539
elections creates a very undemocratic[br]asymmetry in available information and
0:17:21.539,0:17:27.559
undermines the notion of an informed[br]citizenry. danah boyd and Kate Crawford
0:17:27.559,0:17:31.309
described the beliefs attached to big data[br]as a mythology: "the widespread belief
0:17:31.309,0:17:34.880
that large datasets offer a higher form of[br]intelligence and knowledge that can
0:17:34.880,0:17:39.120
generate insights that were previously[br]impossible, with the aura of truth,
0:17:39.120,0:17:43.610
objectivity and accuracy".[br]The deconstruction of this myth was
0:17:43.610,0:17:48.080
attempted as early as 1984 in "A[br]Spreadsheet Way of Knowledge". Steven Levy
0:17:48.080,0:17:51.929
describes how the authoritative look of a[br]spreadsheet and the fact that it was done
0:17:51.929,0:17:55.330
by a computer has a strong persuasive[br]effect on people, leading to the
0:17:55.330,0:18:01.990
acceptance of the proposed model of[br]reality as gospel. He says fortunately few
0:18:01.990,0:18:05.390
would argue that all relations between[br]people can be quantified and manipulated
0:18:05.390,0:18:09.559
by formulas. Of human behavior, no[br]faultless assumptions, and so no perfect
0:18:09.559,0:18:14.679
model can be made. Tim Harford also refers[br]to faith when he describes four
0:18:14.679,0:18:19.980
assumptions underlying Big Data research.[br]The first, uncanny accuracy, is easy to
0:18:19.980,0:18:24.299
overrate if we simply ignore false[br]positives. Oh sorry. The claim that
0:18:24.299,0:18:27.320
causation has been knocked off its[br]pedestal is fine if we are making
0:18:27.320,0:18:31.539
predictions in a stable environment,[br]but not if the world is changing. If you
0:18:31.539,0:18:35.000
do not understand why things correlate,[br]you cannot know what might break down this
0:18:35.000,0:18:38.620
correlation either.[br]The promise that sampling bias does not
0:18:38.620,0:18:43.320
matter in such large data sets is simply[br]not true, there is lots of bias in data
0:18:43.320,0:18:47.950
sets. As for the idea that with enough[br]data the numbers speak for themselves:
0:18:47.950,0:18:53.720
that seems hopelessly naive in data sets[br]where spurious patterns vastly outnumber
0:18:53.720,0:18:58.299
genuine discoveries. This last point is[br]described by Nassim Nicholas Taleb, who writes
0:18:58.299,0:19:02.700
that big data research has brought cherry-[br]picking to an industrial level. David Leinweber
0:19:02.700,0:19:08.140
in a 2007 paper demonstrated that data[br]mining techniques could show a strong but
0:19:08.140,0:19:13.549
spurious relationship between the changes[br]in the S&P 500 stock index and butter
0:19:13.549,0:19:18.740
production in Bangladesh. What is strange[br]about this mythology, that large data sets
0:19:18.740,0:19:23.419
offer some higher form of intelligence,[br]is that it is paradoxical: it attributes human
0:19:23.419,0:19:26.730
qualities to something, while at the same[br]time considering it to be more objective
0:19:26.730,0:19:32.070
and more accurate than humans, but these[br]beliefs can exist side by side. Consulting
0:19:32.070,0:19:36.259
this Oracle uncritically has quite far-[br]reaching implications.
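The point about spurious patterns vastly outnumbering genuine discoveries, illustrated above by the S&P 500 versus butter production in Bangladesh, can be sketched with a minimal simulation. This is not Leinweber's actual study; all data here is synthetic random noise, and the point is only that cherry-picking the best of many unrelated series always yields an impressive-looking "signal":

```python
import random
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)  # make the demo deterministic

# A "target" series: 20 yearly returns of an imaginary stock index.
target = [random.gauss(0, 1) for _ in range(20)]

# 1000 candidate series, all pure noise, none actually related to the target
# (think: butter production figures for 1000 imaginary countries).
candidates = [[random.gauss(0, 1) for _ in range(20)] for _ in range(1000)]

correlations = [pearson(c, target) for c in candidates]

# On average the candidates are uncorrelated with the target...
print(f"mean correlation: {statistics.fmean(correlations):+.3f}")
# ...but cherry-picking the single best one yields a strong spurious "signal".
print(f"best correlation: {max(correlations, key=abs):+.3f}")
```

With enough candidate series, some will correlate strongly with any target purely by chance, which is why found-data research without a hypothesis keeps "discovering" relationships that break down as soon as they are acted on.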
0:19:36.259,0:19:40.490
For one it dehumanizes humans by asserting[br]that human involvement through hypothesis
0:19:40.490,0:19:46.950
and interpretation is unreliable and only[br]by removing ourselves from the equation,
0:19:46.950,0:19:51.100
can we finally see the world as it is.[br]The practical consequence of this dynamic
0:19:51.100,0:19:55.450
is that it is no longer possible to argue[br]with the outcome of big data analysis
0:19:55.450,0:19:59.770
because first of all it's supposedly bias[br]free, interpretation free, you can't
0:19:59.770,0:20:04.210
question it, you cannot check if it is[br]bias free because the algorithms governing
0:20:04.210,0:20:09.160
the analysis are often completely opaque.[br]This becomes painful when you find
0:20:09.160,0:20:13.159
yourself in the wrong category of a social[br]sorting algorithm guiding real-world
0:20:13.159,0:20:18.280
decisions on insurance, mortgage, work,[br]border checks, scholarships and so on.
0:20:18.280,0:20:23.919
Exclusion from certain privileges is only[br]the most optimistic scenario, so it is not
0:20:23.919,0:20:28.179
as effective as we might hope. It has a[br]dehumanizing dark side
0:20:28.179,0:20:30.910
So why do we[br]believe? How did we become so infatuated
0:20:30.910,0:20:35.200
with information? Our idea about[br]information changed radically in the
0:20:35.200,0:20:39.830
previous century, from a small statement of[br]fact to the essence of man's inner life,
0:20:39.830,0:20:43.870
and this shift started with the advent of[br]cybernetics and information theory in the
0:20:43.870,0:20:48.200
40s and 50s where information was suddenly[br]seen as a means to control a system, any
0:20:48.200,0:20:53.429
system, be it mechanical, physical,[br]biological, cognitive or social. Here you
0:20:53.429,0:20:58.410
see Norbert Wiener's moth, a machine he[br]built as part of a public relations stunt
0:20:58.410,0:21:03.370
financed by Life magazine. The photos with[br]him and his moth were unfortunately never
0:21:03.370,0:21:07.470
published, because according to Life's[br]editors, it didn't illustrate the human
0:21:07.470,0:21:12.540
characteristics of computers very well.[br]Norbert Wiener in "The Human Use of Human
0:21:12.540,0:21:15.090
Beings" wrote:[br]"to live effectively is to live with
0:21:15.090,0:21:19.340
adequate information, thus communication[br]and control belong to the essence of man's
0:21:19.340,0:21:23.539
inner life, even as they belong to his[br]life in society" and almost
0:21:23.539,0:21:27.570
simultaneously, Shannon published "A[br]Mathematical Theory of Communication", a
0:21:27.570,0:21:31.720
theory of signals transmitted over[br]distance. John Durham Peters, in "Speaking
0:21:31.720,0:21:36.389
into the Air", describes how over time this[br]information theory got reinterpreted by
0:21:36.389,0:21:41.040
social scientists who mistook signal for[br]significance.
0:21:41.040,0:21:45.299
Orit Halpern in "Beautiful Data" describes[br]how Alan Turing and Bertrand Russell had
0:21:45.299,0:21:49.659
proved conclusively in struggling with the[br]Entscheidungsproblem that many analytic
0:21:49.659,0:21:54.090
functions could not be logically[br]represented or mechanically executed and
0:21:54.090,0:21:58.470
therefore machines were not human minds.[br]She asks the very important question of
0:21:58.470,0:22:03.320
why we have forgotten this history and why[br]we still regularly equate reason with
0:22:03.320,0:22:08.031
rationality. Having forgotten this, ten[br]years later in '58, artificial
0:22:08.031,0:22:11.929
intelligence research began comparing[br]computers and humans. Simon and Newell
0:22:11.929,0:22:15.740
wrote: the programmed computer and human[br]problem solver are both species belonging
0:22:15.740,0:22:19.220
to the genus 'information processing[br]system'.
0:22:19.220,0:22:23.390
In the 80s, information was granted an[br]even more powerful status: that of
0:22:23.390,0:22:27.669
commodity. Like it or not, information has[br]finally surpassed material goods as our
0:22:27.669,0:22:41.009
basic resource. Bon appetit! How did we[br]become so infatuated with information?
0:22:41.009,0:22:50.890
Hey sorry sighs yeah, this is an image[br]of a medieval drawing where the humors,
0:22:50.890,0:22:57.009
the liquids in the body, were seen as the[br]essence of our intelligence and the
0:22:50.890,0:22:57.009
functioning of our system - a metaphor for[br]our intelligence. By the 1500s, automata
0:23:02.610,0:23:06.190
powered by springs and gears had been[br]devised, inspiring leading thinkers such
0:23:06.190,0:23:11.669
as Rene Descartes to assert that humans[br]are complex machines. The mind or soul was
0:23:11.669,0:23:15.190
immaterial, completely separated from the[br]body - only able to interact with the body
0:23:15.190,0:23:18.909
through the pineal gland, which he[br]considered the seat of the soul.
0:23:18.909,0:23:22.760
And we still do it, the brain is commonly[br]compared to a computer with the role of
0:23:22.760,0:23:26.809
physical hardware played by the brain, and[br]our thoughts serving as software. The brain
0:23:26.809,0:23:31.769
as information processor: it is a metaphor[br]that is sometimes mistaken for reality.
0:23:31.769,0:23:36.150
Because of this the belief in the Oracle[br]of big data is not such a great leap.
0:23:36.150,0:23:39.440
Information is the essence of[br]consciousness in this view. We've come
0:23:39.440,0:23:45.169
full circle: we see machines as human-like[br]and view ourselves as machines. So does it
0:23:45.169,0:23:48.309
work? We started out with two survival[br]strategies: predicting the behavior of
0:23:48.309,0:23:52.090
others through anthropomorphizing and[br]trying to predict the future through
0:23:52.090,0:23:55.710
oracles. The first has helped us survive[br]in the past, allows us to be empathic
0:23:55.710,0:24:00.440
towards others - human and non-human. The[br]second has comforted us throughout the
0:24:00.440,0:24:03.769
ages, creating the idea of control of[br]being able to predict and prevent
0:24:03.769,0:24:07.809
disaster. So how are they working for us[br]today?
0:24:07.809,0:24:11.809
We definitely have reasons to be concerned[br]with the sword of Damocles hanging over
0:24:11.809,0:24:16.260
our heads: global warming setting in[br]motion a chain of catastrophes threatening
0:24:16.260,0:24:20.910
our survival, facing the inevitable death[br]of capitalism's myth of eternal growth as
0:24:20.910,0:24:26.970
Earth's resources run out, we are in a[br]bit of a pickle. Seeing our consciousness
0:24:26.970,0:24:31.210
as separate from our bodies, like software[br]and hardware, offers some comforting
0:24:31.210,0:24:34.190
options.[br]One option is that since human
0:24:34.190,0:24:37.820
consciousness is so similar to computer[br]software, it can be transferred to a
0:24:37.820,0:24:42.940
computer. Ray Kurzweil for example[br]believes that it will soon be possible to
0:24:42.940,0:24:48.160
download human minds to a computer, with[br]immortality as a result. "Alliance to
0:24:48.160,0:24:52.279
Rescue Civilization" by Burrows and[br]Shapiro is a project that aims to back up
0:24:52.279,0:24:56.110
human civilization in a lunar facility.[br]The project artificially separates the
0:24:56.110,0:25:01.000
hardware of the planet, with its oceans and[br]soils, from the data of human civilization.
0:25:01.000,0:25:04.289
And last but not least, the most explicit[br]and radical separation as well as the
0:25:04.289,0:25:08.609
least optimistic outlook on our future:[br]Elon Musk's SpaceX plan to colonize
0:25:08.609,0:25:13.009
Mars, presented in September last year.[br]The goal of the presentation being to make
0:25:13.009,0:25:18.470
living on Mars seem possible within our[br]lifetime. Possible - and fun.
0:25:18.470,0:25:22.460
A less extreme version of these attempts[br]to escape doom is the belief that with so much
0:25:22.460,0:25:26.409
data at our fingertips and clever[br]scientists, we will figure out a way to solve
0:25:26.409,0:25:30.419
our problems. Soon we'll laugh at our[br]panic over global warming safely aboard
0:25:30.419,0:25:34.790
our CO2 vacuum cleaners. With this belief[br]we don't have to change our lives, our
0:25:34.790,0:25:40.360
economies, our politics. We can carry on[br]without making radical changes. Is this
0:25:40.360,0:25:46.239
apathy warranted? What is happening while[br]we are filling up the world's hard disks?
0:25:46.239,0:25:50.009
Well, information is never disembodied, it[br]always needs a carrier and the minerals
0:25:50.009,0:25:54.019
used in the technology hosting our data[br]come from conflict zones, resulting in
0:25:54.019,0:25:58.549
slavery and ecocide, as for instance in[br]the coltan and cassiterite mines in Congo,
0:25:58.549,0:26:02.679
or the gold mines in Ghana. Minerals used in[br]technology hosting our data come from
0:26:02.679,0:26:07.399
unregulated zones leading to extreme[br]pollution, as here in the black sludge
0:26:07.399,0:26:15.850
lake in Baotou in China. EU waste is[br]exported to unregulated zones, and server
0:26:15.850,0:26:20.070
farms emit as much CO2 as[br]the global aviation industry. Our data
0:26:20.070,0:26:23.860
cannot be separated from the physical, and[br]its physical side is not so pretty.
0:26:23.860,0:26:27.789
And what is happening is that the earth is[br]getting warmer and climate research is not
0:26:27.789,0:26:32.039
based on Twitter feeds but on[br]measurements, yet somehow has largely been
0:26:32.039,0:26:36.539
ignored for decades. Scientific consensus[br]was reached in the 80s, and if you compare
0:26:36.539,0:26:40.070
the dangerously slow response to this, to[br]the response given to the threat of
0:26:40.070,0:26:44.789
terrorism, which has rapidly led to new[br]laws, even new presidents. This shows how
0:26:44.789,0:26:48.799
stories, metaphors, and mythologies in the[br]world of social beings have more impact
0:26:48.799,0:26:52.740
than scientific facts. And how threats[br]that require drastic changes to the status
0:26:52.740,0:26:59.730
quo are willfully ignored.[br]So does this survival strategy work? This
0:26:59.730,0:27:02.990
mythology, this belief in taking ourselves[br]out of the equation, to bring us closer to
0:27:02.990,0:27:07.260
truth, to reality as it is, separating[br]ourselves from that which we observe,
0:27:07.260,0:27:12.590
blinds us to the trouble we are in, and to[br]our true nature as an embodied intelligence,
0:27:12.590,0:27:17.620
not a brain in a jar, an organism[br]completely intertwined with its
0:27:17.620,0:27:21.240
environment, its existence completely[br]dependent on the survival of the organisms
0:27:21.240,0:27:25.679
it shares this planet with. We can't help[br]but anthropomorphize, to approach
0:27:25.679,0:27:30.390
everything around us as part of our social[br]sphere with minds and agencies. And that
0:27:30.390,0:27:35.320
is fine, it makes us human. It allows us[br]to study the world around us with empathy.
0:27:35.320,0:27:39.090
The most important thing is that the[br]metaphor is not mistaken for reality. The
0:27:39.090,0:27:45.009
computer creating, thinking, memorizing,[br]writing, reading, learning, understanding,
0:27:45.009,0:27:50.169
and people being hard-wired, stuck in a[br]loop, unable to compute, interfacing with,
0:27:50.169,0:27:55.710
and reprogramming ourselves - those[br]metaphors are so embedded in our culture.
0:27:55.710,0:27:59.159
You can only hope to create awareness[br]about them. If there is more awareness
0:27:59.159,0:28:02.480
about the misleading descriptions of[br]machines as human-like and humans as
0:28:02.480,0:28:06.820
machine-like and all of reality as an[br]information process, it is more likely
0:28:06.820,0:28:10.139
that there will be less blind enchantment[br]with certain technology, and more
0:28:10.139,0:28:14.249
questions asked about its[br]purpose and demands.
0:28:14.249,0:28:18.980
There is no strong AI... yet, only very[br]clever automation. At this moment in
0:28:18.980,0:28:23.419
history machines are proxies for human[br]agendas and ideologies. There are many
0:28:23.419,0:28:32.679
issues that need addressing. As Kate[br]Crawford and Meredith Whittaker point out
0:28:32.679,0:28:37.369
in the AI Now report, recent examples of[br]AI deployments such as during the US
0:28:37.369,0:28:41.190
elections and Brexit, or Facebook[br]revealing teenagers' emotional states to
0:28:41.190,0:28:45.539
advertisers looking to target depressed[br]teens, show how the interests of those
0:28:45.539,0:28:49.039
deploying advanced data systems can[br]overshadow the public interest, acting in
0:28:49.039,0:28:53.080
ways contrary to individual autonomy and[br]collective welfare, often without this
0:28:53.080,0:28:58.630
being visible at all to those affected.[br]The report points to many - I highly
0:28:58.630,0:29:04.769
recommend reading it - and here are a few[br]concerns. Concerns about social safety
0:29:04.769,0:29:08.290
nets and human resource distribution when[br]the dynamics of labor and employment
0:29:08.290,0:29:12.691
change. Workers most likely to be affected[br]are women and minorities. Automated
0:29:12.691,0:29:16.899
decision-making systems are often unseen[br]and there are few established means to
0:29:16.899,0:29:20.460
assess their fairness, to contest and[br]rectify wrong or harmful decisions or
0:29:20.460,0:29:29.590
impacts.[br]Those directly impacted.... Sorry,
0:29:29.590,0:29:35.570
automated... no, sorry... I'm lost...[br]Those directly impacted by the deployment of
0:29:35.570,0:29:49.909
AI systems rarely have a role in designing[br]them. sighs And to assess their
0:29:49.909,0:29:54.070
fairness, to contest and rectify wrong and[br]harmful decisions or impacts... lacks...
0:29:54.070,0:29:58.960
lack of methods for measuring and assessing[br]social and economic impacts... nah, let's
0:29:58.960,0:30:09.240
keep scrolling back... In any case, there[br]is a great chance of "like me" bias because
0:30:09.240,0:30:17.299
of the uniformity of those[br]developing these systems. Seeing the
0:30:17.299,0:30:20.889
oracle we've constructed for what it is[br]means to stop comforting
0:30:20.889,0:30:25.269
ourselves, to ask questions. A quote from[br]"Superintelligence: The Idea That Eats
0:30:25.269,0:30:29.800
Smart People" by Maciej Cegłowski: the[br]pressing ethical questions in machine
0:30:29.800,0:30:33.430
learning are not about machines becoming[br]self-aware and taking over the world, but
0:30:33.430,0:30:37.910
about how people can exploit other people,[br]or, through carelessness,
0:30:37.910,0:30:43.070
introduce immoral behavior into automated[br]systems. Instead of waiting for the nerd
0:30:43.070,0:30:46.860
rapture, or for Elon Musk to whisk us off[br]the planet, it is important to come to
0:30:46.860,0:30:50.750
terms with a more modest perception of[br]ourselves and our machines. Facing the
0:30:50.750,0:30:56.590
ethical repercussions of the systems we[br]are putting in place. Having the real
0:30:56.590,0:31:02.320
discussion, not the one we hope for, but[br]the hard one that requires actual change
0:31:02.320,0:31:06.540
and a new mythology. One that works not[br]only for us, but for all those, human and
0:31:06.540,0:31:11.800
non-human, that we share the planet with.[br]Thank you. That's it.
0:31:11.800,0:31:24.170
applause
0:31:24.170,0:31:30.209
Herald Angel: Thank you, Marloes. Are there[br]any questions? Like, you would have one
0:31:30.209,0:31:36.689
minute. laughs Okay. So, thank you[br]again. Give her a big applause again,
0:31:36.689,0:31:39.469
thank you.
0:31:39.469,0:31:43.976
applause
0:31:43.976,0:31:49.958
34c3 outro
0:31:49.958,0:32:04.901
subtitles created by c3subtitles.de[br]in the year 2018. Join, and help us!