0:00:00.000,0:00:18.721
34C3 preroll music
0:00:18.721,0:00:24.996
Herald: Humans of Congress, it is my[br]pleasure to announce the next speaker.
0:00:24.996,0:00:31.749
I was supposed to pick out a few awards or[br]something, to actually present what he's
0:00:31.750,0:00:36.419
done in his life, but I can[br]only say: he's one of us!
0:00:36.419,0:00:40.149
applause
0:00:40.149,0:00:43.279
Charles Stross![br]ongoing applause
0:00:43.279,0:00:45.969
Charles Stross: Hi! Is this on?[br]Good. Great.
0:00:45.969,0:00:50.679
I'm really pleased to be here and I[br]want to start by apologizing for my total
0:00:50.679,0:00:57.629
lack of German. So this talk is gonna be in[br]English. Good morning. I'm Charlie Stross
0:00:57.629,0:01:03.559
and it's my job to tell lies for money, or[br]rather, I write science fiction, much of
0:01:03.559,0:01:07.860
it about the future, which in recent[br]years has become ridiculously hard to
0:01:07.860,0:01:16.090
predict. In this talk I'm going to talk[br]about why. Now our species, Homo sapiens
0:01:16.090,0:01:21.990
sapiens, is about 300,000 years old. It[br]used to be about 200,000 years old,
0:01:21.990,0:01:26.189
but it grew an extra 100,000[br]years in the past year because of new
0:01:26.189,0:01:31.329
archaeological discoveries, I mean, go[br]figure. For all but the last three
0:01:31.329,0:01:36.759
centuries or so of that span, however,[br]predicting the future was really easy. If
0:01:36.759,0:01:41.369
you were an average person - as opposed to[br]maybe a king or a pope - natural disasters
0:01:41.369,0:01:46.690
aside, everyday life 50 years in the[br]future would resemble everyday life 50
0:01:46.690,0:01:56.089
years in your past. Let that sink in for a[br]bit. For 99.9% of human existence on this
0:01:56.089,0:02:01.929
earth, the future was static. Then[br]something changed and the future began to
0:02:01.929,0:02:06.600
shift increasingly rapidly, until, in the[br]present day, things are moving so fast,
0:02:06.600,0:02:13.440
it's barely possible to anticipate trends[br]from one month to the next. Now, as the
0:02:13.440,0:02:17.830
eminent computer scientist Edsger Dijkstra[br]once remarked, computer science is no more
0:02:17.830,0:02:23.620
about computers than astronomy is about[br]building big telescopes; the same can be
0:02:23.620,0:02:28.760
said of my field of work, writing science[br]fiction: sci-fi is rarely about science
0:02:28.760,0:02:33.690
and even more rarely about predicting the[br]future, but sometimes we dabble in
0:02:33.690,0:02:42.140
futurism and lately, futurism has gotten[br]really, really weird. Now when I write a
0:02:42.140,0:02:47.230
near future work of fiction, one set, say, a[br]decade hence, there used to be a recipe I
0:02:47.230,0:02:53.770
could follow that worked eerily well. Simply put:[br]90% of the next decade's stuff is
0:02:53.770,0:02:57.420
already here around us today.[br]Buildings are designed to
0:02:57.420,0:03:03.040
last many years, automobiles have a design[br]life of about a decade, so half the cars on
0:03:03.040,0:03:10.240
the road in 2027 are already there now -[br]they're new. People? There'll be some new
0:03:10.240,0:03:15.520
faces, aged 10 and under, and some older[br]people will have died, but most of us
0:03:15.520,0:03:22.840
adults will still be around, albeit older[br]and grayer, this is the 90% of a near
0:03:22.840,0:03:30.820
future that's already here today. After[br]the already existing 90%, another 9% of a
0:03:30.820,0:03:35.650
near future a decade hence used to be[br]easily predictable: you look at trends
0:03:35.650,0:03:39.970
dictated by physical limits, such as[br]Moore's law and you look at Intel's road
0:03:39.970,0:03:44.000
map and you use a bit of creative[br]extrapolation and you won't go too far
0:03:44.000,0:03:52.510
wrong. If I predict - wearing my futurology[br]hat - that in 2027 LTE cellular phones will
0:03:52.510,0:03:57.560
be ubiquitous, 5G will be available for[br]high bandwidth applications and there will be
0:03:57.560,0:04:01.790
fallback to some kind of satellite data[br]service at a price, you probably won't
0:04:01.790,0:04:04.330
laugh at me.[br]I mean, it's not like I'm predicting that
0:04:04.330,0:04:08.110
airlines will fly slower and Nazis will[br]take over the United States, is it?
0:04:08.110,0:04:09.900
laughing
0:04:09.900,0:04:15.210
And therein lies the problem. There is[br]a remaining 1% of what Donald Rumsfeld
0:04:15.210,0:04:20.940
called the "unknown unknowns", which throws off[br]all predictions. As it happens, airliners
0:04:20.940,0:04:26.060
today are slower than they were in the[br]1970s and don't get me started about the Nazis,
0:04:26.060,0:04:31.860
I mean, nobody in 2007 was expecting a Nazi[br]revival in 2017, were they?
0:04:31.860,0:04:37.320
Only this time, Germans get to be the good guys. [br]laughing, applause
0:04:37.320,0:04:42.260
So. My recipe for fiction set 10 years [br]in the future used to be:
0:04:42.260,0:04:47.360
"90% is already here,[br]9% is not here yet but predictable
0:04:47.360,0:04:53.730
and 1% is 'who ordered that?'" But unfortunately[br]the ratios have changed, I think we're now
0:04:53.730,0:04:59.660
down to maybe 80% already here - climate[br]change takes a huge toll on architecture -
0:04:59.660,0:05:05.760
then 15% not here yet, but predictable and[br]a whopping 5% of utterly unpredictable
0:05:05.760,0:05:12.740
deep craziness. Now... before I carry on[br]with this talk, I want to spend a minute or
0:05:12.740,0:05:18.530
two ranting loudly and ruling out the[br]singularity. Some of you might assume that,
0:05:18.530,0:05:23.590
as the author of books like "Singularity[br]Sky" and "Accelerando",
0:05:23.590,0:05:28.220
I expect an impending technological[br]singularity,
0:05:28.220,0:05:32.090
that we will develop self-improving[br]artificial intelligence and mind uploading
0:05:32.090,0:05:35.410
and the whole wish list of transhumanist[br]aspirations promoted by the likes of
0:05:35.410,0:05:42.300
Ray Kurzweil, will come to pass. Unfortunately[br]this isn't the case. I think transhumanism
0:05:42.300,0:05:49.050
is a warmed-over Christian heresy. While[br]its adherents tend to be outspoken atheists,
0:05:49.050,0:05:51.910
they can't quite escape from the[br]history that gave rise to our current
0:05:51.910,0:05:57.220
Western civilization. Many of you are[br]familiar with design patterns, an approach
0:05:57.220,0:06:01.560
to software engineering that focuses on[br]abstraction and simplification, in order
0:06:01.560,0:06:07.730
to promote reusable code. When you look at[br]the AI singularity as a narrative and
0:06:07.730,0:06:11.790
identify the numerous places in their[br]story where the phrase "and then a miracle
0:06:11.790,0:06:18.790
happens" occurs, it becomes apparent pretty[br]quickly that they've reinvented Christianity.
0:06:18.790,0:06:19.460
applause
0:06:19.460,0:06:25.330
Indeed, today's transhumanists[br]draw on a long and rich
0:06:25.330,0:06:29.990
history of Russian philosophy, exemplified[br]by the Russian Orthodox theologian Nikolai
0:06:29.990,0:06:37.169
Fyodorovich Fedorov by way of his disciple[br]Konstantin Tsiolkovsky, whose derivation
0:06:37.169,0:06:40.520
of the rocket equation makes him[br]essentially the father of modern space
0:06:40.520,0:06:45.350
flight. Once you start probing the nether[br]regions of transhumanist thought and run
0:06:45.350,0:06:49.800
into concepts like Roko's Basilisk - by the[br]way, any of you who didn't know about the
0:06:49.800,0:06:54.120
Basilisk before, are now doomed to an[br]eternity in AI hell, terribly sorry - you
0:06:54.120,0:06:57.889
realize that they've mangled it to match some[br]of the nastier aspects of Presbyterian
0:06:57.889,0:07:03.189
Protestantism. They've basically reinvented[br]original sin and Satan in the guise of an
0:07:03.189,0:07:09.270
AI that doesn't exist yet; it's kind of[br]peculiar. Anyway, my take on the
0:07:09.270,0:07:12.949
singularity is: if something walks[br]like a duck and quacks like a duck, it's
0:07:12.949,0:07:18.460
probably a duck. And if it looks like a[br]religion, it's probably a religion.
0:07:18.460,0:07:23.280
I don't see much evidence for human-like,[br]self-directed artificial intelligences
0:07:23.280,0:07:28.070
coming along any time soon, and a fair bit[br]of evidence that nobody except a few freaks
0:07:28.070,0:07:32.150
in cognitive science departments even[br]wants it. I mean, if we invented an AI
0:07:32.150,0:07:35.940
that was like a human mind, it would do the[br]AI equivalent of sitting on the sofa,
0:07:35.940,0:07:39.455
munching popcorn and[br]watching the Super Bowl all day.
0:07:39.455,0:07:43.261
It wouldn't be much use to us.[br]laughter, applause
0:07:43.261,0:07:46.776
What we're getting instead,[br]is self-optimizing tools that defy
0:07:46.776,0:07:51.220
human comprehension, but are not[br]in fact any more like our kind
0:07:51.220,0:07:57.500
of intelligence than a Boeing 737 is like[br]a seagull. Boeing 737s and seagulls both
0:07:57.500,0:08:04.240
fly, but Boeing 737s don't lay eggs and shit[br]everywhere. So I'm going to wash my hands
0:08:04.240,0:08:09.590
of the singularity as a useful explanatory[br]model of the future without further ado.
0:08:09.590,0:08:14.670
I'm one of those vehement atheists as well[br]and I'm gonna try and offer you a better
0:08:14.670,0:08:20.960
model for what's happening to us. Now, as[br]my fellow Scottish science fiction author
0:08:20.960,0:08:27.130
Ken MacLeod likes to say "the secret[br]weapon of science fiction is history".
0:08:27.130,0:08:31.229
History, loosely speaking, is the written[br]record of what people did and how they did it
0:08:31.229,0:08:36.528
in past times. Times that have slipped out[br]of our personal memories. We science
0:08:36.528,0:08:41.099
fiction writers tend to treat history as a[br]giant toy chest to raid, whenever we feel
0:08:41.099,0:08:45.389
like telling a story. With a little bit of[br]history, it's really easy to whip up an
0:08:45.389,0:08:49.019
entertaining yarn about a galactic empire[br]that mirrors the development and decline
0:08:49.019,0:08:53.529
of the Habsburg Empire or to respin the[br]October Revolution as a tale of how Mars
0:08:53.529,0:08:59.599
got its independence. But history is[br]useful for so much more than that.
0:08:59.599,0:09:05.380
It turns out, that our personal memories[br]don't span very much time at all. I'm 53
0:09:05.380,0:09:10.800
and I barely remember the 1960s. I only[br]remember the 1970s with the eyes of a 6 to
0:09:10.800,0:09:17.580
16 year old. My father died this year,[br]aged 93, and he'd just about remembered the
0:09:17.580,0:09:22.770
1930s. Only those of my father's[br]generation directly remember the Great
0:09:22.780,0:09:29.079
Depression and can compare it to the[br]2007/08 global financial crisis directly.
0:09:29.079,0:09:34.250
We Westerners tend to pay little attention[br]to cautionary tales told by 90-somethings.
0:09:34.250,0:09:38.999
We're modern, we're change obsessed and we[br]tend to repeat our biggest social mistakes
0:09:38.999,0:09:43.259
just as they slip out of living memory,[br]which means they recur on a timescale of
0:09:43.259,0:09:47.680
70 to 100 years.[br]So if our personal memories are useless,
0:09:47.680,0:09:52.330
we need a better toolkit[br]and history provides that toolkit.
0:09:52.330,0:09:57.099
History gives us the perspective to see what[br]went wrong in the past and to look for
0:09:57.099,0:10:01.509
patterns and check to see whether those [br]patterns are recurring in the present.
0:10:01.509,0:10:06.689
Look in particular at the history of the past two[br]to four hundred years, that age of rapidly
0:10:06.689,0:10:11.800
increasing change that I mentioned at the[br]beginning. One glaringly obvious deviation
0:10:11.800,0:10:16.929
from the norm of the preceding[br]3,000 centuries stands out, and that's
0:10:16.929,0:10:22.059
the development of artificial intelligence,[br]which happened no earlier than 1553 and no
0:10:22.059,0:10:29.359
later than 1844. I'm talking of course[br]about the very old, very slow AIs we call
0:10:29.359,0:10:34.240
corporations. What lessons from the history[br]of the company can we draw that tell us
0:10:34.240,0:10:38.490
about the likely behavior of the type of[br]artificial intelligence we're interested
0:10:38.490,0:10:47.258
in here, today?[br]Well. Need a mouthful of water.
0:10:47.258,0:10:51.618
Let me crib from Wikipedia for a moment.
0:10:51.618,0:10:56.329
Wikipedia: "In the late 18th[br]century, Stewart Kyd, the author of the
0:10:56.329,0:11:02.800
first treatise on corporate law in English,[br]defined a corporation as: 'a collection of
0:11:02.800,0:11:08.290
many individuals united into one body,[br]under a special denomination, having
0:11:08.290,0:11:13.779
perpetual succession under an artificial[br]form, and vested, by policy of the law, with
0:11:13.779,0:11:20.201
the capacity of acting, in several respects,[br]as an individual, enjoying privileges and
0:11:20.201,0:11:24.151
immunities in common, and of exercising a[br]variety of political rights, more or less
0:11:24.151,0:11:28.510
extensive, according to the design of its[br]institution, or the powers conferred upon
0:11:28.510,0:11:32.269
it, either at the time of its creation, or[br]at any subsequent period of its
0:11:32.269,0:11:36.699
existence.'"[br]This was a late 18th-century definition;
0:11:36.699,0:11:42.509
sound like a piece of software to you?[br]In 1844, the British government passed the
0:11:42.509,0:11:46.450
"Joint Stock Companies Act" which created[br]a register of companies and allowed any
0:11:46.450,0:11:51.360
legal person, for a fee, to register a[br]company which in turn existed as a
0:11:51.360,0:11:55.869
separate legal person. Prior to that point,[br]it required a Royal Charter or an act of
0:11:55.869,0:12:00.420
Parliament to create a company.[br]Subsequently, the law was extended to limit
0:12:00.420,0:12:04.680
the liability of individual shareholders[br]in event of business failure and then both
0:12:04.680,0:12:08.629
Germany and the United States added their[br]own unique twists to what today we see as
0:12:08.629,0:12:14.360
the doctrine of corporate personhood.[br]Now, plenty of other things that
0:12:14.360,0:12:18.740
happened between the 16th and 21st centuries[br]also changed the shape of the world we live in.
0:12:18.740,0:12:22.149
I've skipped the changes in[br]agricultural productivity that happened
0:12:22.149,0:12:25.550
due to energy economics,[br]which finally broke the Malthusian trap
0:12:25.550,0:12:29.129
our predecessors lived in. [br]This in turn broke the long-term
0:12:29.129,0:12:32.681
cap on economic growth of about [br]0.1% per year
0:12:32.681,0:12:35.681
in the absence of famines, plagues and [br]wars and so on.
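To see what that long-term cap means in practice, here's a quick compound-growth sketch; the 0.1% figure is from the talk, while the ~2% modern comparison rate and the doubling-time helper are illustrative assumptions:

```python
import math

def doubling_time(annual_rate: float) -> float:
    """Years for an economy to double at a fixed annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

# Pre-Malthusian-escape cap of ~0.1%/year vs. an assumed modern ~2%/year:
print(round(doubling_time(0.001)))  # ~693 years to double output
print(round(doubling_time(0.02)))   # ~35 years to double output
```

At 0.1% per year, an economy doubles roughly once per 700 years, slower than a human lifetime can notice, which is one way of restating why the future used to look static.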
0:12:35.681,0:12:38.790
I've skipped the germ theory of diseases [br]and the development of trade empires
0:12:38.790,0:12:42.760
in the age of sail and gunpowder,[br]that were made possible by advances
0:12:42.760,0:12:45.119
in accurate time measurement.
0:12:45.119,0:12:49.079
I've skipped the rise, and[br]hopefully decline, of the pernicious
0:12:49.079,0:12:52.430
theory of scientific racism that[br]underpinned Western colonialism and the
0:12:52.430,0:12:57.220
slave trade. I've skipped the rise of[br]feminism, the ideological position that
0:12:57.220,0:13:02.470
women are human beings rather than[br]property and the decline of patriarchy.
0:13:02.470,0:13:05.999
I've skipped the whole of the[br]Enlightenment and the Age of Revolutions,
0:13:05.999,0:13:09.410
but this is a technocentric[br]Congress, so I want to frame this talk in
0:13:09.410,0:13:14.869
terms of AI, which we all like to think we[br]understand. Here's the thing about these
0:13:14.869,0:13:21.050
artificial persons we call corporations.[br]Legally, they're people. They have goals,
0:13:21.050,0:13:25.957
they operate in pursuit of these goals,[br]they have a natural life cycle.
0:13:25.957,0:13:33.230
In the 1950s, a typical U.S. corporation on the[br]S&P 500 Index had a life span of 60 years.
0:13:33.230,0:13:37.805
Today it's down to less than 20 years. [br]This is largely due to predation.
0:13:37.805,0:13:41.659
Corporations are cannibals, they eat [br]one another.
0:13:41.659,0:13:45.849
They're also hive superorganisms,[br]like bees or ants.
0:13:45.849,0:13:48.699
For the first century and a[br]half, they relied entirely on human
0:13:48.699,0:13:52.399
employees for their internal operation,[br]but today they're automating their
0:13:52.399,0:13:57.110
business processes very rapidly. Each[br]human is only retained so long as they can
0:13:57.110,0:14:01.339
perform their assigned tasks more[br]efficiently than a piece of software
0:14:01.339,0:14:05.439
and they can all be replaced by another[br]human, much as the cells in our own bodies
0:14:05.439,0:14:09.799
are functionally interchangeable and a[br]group of cells can - in extremis - often be
0:14:09.799,0:14:14.509
replaced by a prosthetic device.[br]To some extent, corporations can be
0:14:14.509,0:14:19.019
trained to serve the personal desires of[br]their chief executives, but even CEOs can
0:14:19.019,0:14:22.749
be dispensed with, if their activities[br]damage the corporation, as Harvey
0:14:22.749,0:14:26.389
Weinstein found out a couple of months[br]ago.
0:14:26.389,0:14:30.939
Finally, our legal environment today has[br]been tailored for the convenience of
0:14:30.939,0:14:34.910
corporate persons, rather than human[br]persons, to the point where our governments
0:14:34.910,0:14:40.379
now mimic corporations in many of their[br]internal structures.
0:14:40.379,0:14:43.940
So, to understand where we're going, we[br]need to start by asking "What do our
0:14:43.940,0:14:51.850
current actually existing AI overlords[br]want?"
0:14:51.850,0:14:56.400
Now, Elon Musk, who I believe you've[br]all heard of, has an obsessive fear of one
0:14:56.400,0:14:59.750
particular hazard of artificial[br]intelligence, which he conceives of as
0:14:59.750,0:15:03.760
being a piece of software that functions[br]like a brain in a box, namely the
0:15:03.760,0:15:09.859
Paperclip Optimizer or Maximizer.[br]A Paperclip Maximizer is a term of art for
0:15:09.859,0:15:14.519
a goal seeking AI that has a single[br]priority, e.g., maximizing the
0:15:14.519,0:15:19.759
number of paperclips in the universe. The[br]Paperclip Maximizer is able to improve
0:15:19.759,0:15:24.110
itself in pursuit of its goal, but has no[br]ability to vary its goal, so will
0:15:24.124,0:15:27.604
ultimately attempt to convert all the[br]metallic elements in the solar system into
0:15:27.604,0:15:31.749
paperclips, even if this is obviously[br]detrimental to the well-being of the
0:15:31.749,0:15:36.359
humans who set it this goal. [br]Unfortunately I don't think Musk
0:15:36.359,0:15:41.169
is paying enough attention, [br]consider his own companies.[br]
0:15:41.169,0:15:45.480
Tesla isn't a Paperclip Maximizer, it's a[br]battery Maximizer.
0:15:45.480,0:15:48.240
After all, an[br]electric car is a battery with wheels and
0:15:48.240,0:15:54.199
seats. SpaceX is an orbital payload[br]Maximizer, driving down the cost of space
0:15:54.199,0:15:59.424
launches in order to encourage more sales[br]for the service it provides. SolarCity is
0:15:59.424,0:16:05.700
a photovoltaic panel maximizer and so on.[br]All three of Musk's very own slow AIs
0:16:05.700,0:16:09.259
are based on an architecture, designed to[br]maximize return on shareholder
0:16:09.259,0:16:13.180
investment, even if by doing so they cook[br]the planet the shareholders have to live
0:16:13.180,0:16:15.979
on or turn the entire thing into solar[br]panels.
0:16:15.979,0:16:19.479
But hey, if you're Elon Musk, that's okay,[br]you're gonna retire on Mars anyway.
0:16:19.479,0:16:20.879
laughing
0:16:20.879,0:16:25.139
By the way, I'm ragging on Musk in this[br]talk, simply because he's the current
0:16:25.139,0:16:28.600
opinionated tech billionaire, who thinks[br]that disrupting a couple of industries
0:16:28.600,0:16:33.579
entitles him to make headlines.[br]If this were 2007 and my focus slightly
0:16:33.579,0:16:39.041
different, I'd be ragging on[br]Steve Jobs, and if it were 1997 my target
0:16:39.041,0:16:42.259
would be Bill Gates.[br]Don't take it personally, Elon.
0:16:42.259,0:16:44.109
laughing
0:16:44.109,0:16:49.139
Back to topic. The problem with[br]corporations is that, despite their overt
0:16:49.139,0:16:53.959
goals, whether they make electric vehicles[br]or beer or sell life insurance policies,
0:16:53.959,0:16:59.778
they all have a common implicit Paperclip[br]Maximizer goal: to generate revenue. If
0:16:59.778,0:17:03.729
they don't make money, they're eaten by a[br]bigger predator or they go bust. It's as
0:17:03.729,0:17:07.929
vital to them as breathing is to us[br]mammals. They generally pursue their
0:17:07.929,0:17:12.439
implicit goal - maximizing revenue - by[br]pursuing their overt goal.
0:17:12.439,0:17:16.510
But sometimes they try instead to[br]manipulate their environment, to ensure
0:17:16.510,0:17:22.970
that money flows to them regardless.[br]Human toolmaking culture has become very
0:17:22.970,0:17:27.980
complicated over time. New technologies[br]always come with an attached implicit
0:17:27.980,0:17:32.869
political agenda that seeks to extend the[br]use of the technology. Governments react
0:17:32.869,0:17:37.230
to this by legislating to control new[br]technologies and sometimes we end up with
0:17:37.230,0:17:41.630
industries actually indulging in legal[br]duels through the regulatory mechanism of
0:17:41.630,0:17:49.582
law to determine, who prevails. For[br]example, consider the automobile. You
0:17:49.582,0:17:53.910
can't have mass automobile transport[br]without gas stations and fuel distribution
0:17:53.910,0:17:57.160
pipelines.[br]These in turn require access to whoever
0:17:57.160,0:18:01.179
owns the land the oil is extracted from,[br]and before you know it, you end up
0:18:01.179,0:18:06.660
with a permanent army in Iraq and a client[br]dictatorship in Saudi Arabia. Closer to
0:18:06.660,0:18:12.380
home, automobiles imply jaywalking laws and[br]drink-driving laws. They affect Town
0:18:12.380,0:18:16.750
Planning regulations and encourage[br]suburban sprawl, the construction of human
0:18:16.750,0:18:21.429
infrastructure on a scale required by[br]automobiles, not pedestrians.
0:18:21.429,0:18:24.979
This in turn is bad for competing[br]transport technologies, like buses or
0:18:24.979,0:18:31.720
trams, which work best in cities with a[br]high population density. So to get laws
0:18:31.720,0:18:35.399
that favour the automobile in place,[br]providing an environment conducive to
0:18:35.399,0:18:40.409
doing business, automobile companies spend[br]money on political lobbyists and when they
0:18:40.409,0:18:46.620
can get away with it, on bribes. Bribery[br]needn't be blatant of course. E.g.,
0:18:46.620,0:18:51.929
the reforms of the British railway network[br]in the 1960s dismembered many branch lines
0:18:51.929,0:18:56.408
and coincided with a surge in road[br]building and automobile sales. These
0:18:56.408,0:19:01.138
reforms were orchestrated by Transport[br]Minister Ernest Marples, who was purely a
0:19:01.138,0:19:05.950
politician. The fact that he accumulated a[br]considerable personal fortune during this
0:19:05.950,0:19:10.269
period by buying shares in motorway[br]construction corporations, has nothing to
0:19:10.269,0:19:18.120
do with it. So, no conflict of interest[br]there. Now, the automobile industry
0:19:18.120,0:19:23.230
in isolation can't be considered[br]a pure Paperclip Maximizer.
0:19:23.230,0:19:27.510
You have to
0:19:27.510,0:19:31.659
look at it in conjunction with the fossil[br]fuel industries, the road construction
0:19:31.659,0:19:37.809
business, the accident insurance sector[br]and so on. When you do this, you begin to
0:19:37.809,0:19:42.740
see the outline of a paperclip-maximizing[br]ecosystem that invades far-flung lands and
0:19:42.740,0:19:47.167
grinds up and kills around one and a[br]quarter million people per year. That's
0:19:47.167,0:19:50.951
the global death toll from automobile[br]accidents currently, according to the World
0:19:50.951,0:19:56.429
Health Organization. It rivals the First[br]World War on an ongoing permanent basis
0:19:56.429,0:20:01.510
and these are all side effects of its[br]drive to sell you a new car. Now,
0:20:01.510,0:20:06.639
automobiles aren't of course a total[br]liability. Today's cars are regulated
0:20:06.639,0:20:11.302
stringently for safety and, in theory, to[br]reduce toxic emissions. They're fast,
0:20:11.302,0:20:16.990
efficient and comfortable. We can thank[br]legally mandated regulations imposed by
0:20:16.990,0:20:22.029
governments for this, of course. Go back[br]to the 1970s and cars didn't have crumple
0:20:22.029,0:20:27.199
zones, go back to the 50s and they didn't[br]come with seat belts as standard. In the
0:20:27.199,0:20:33.690
1930s, indicators (turn signals) and brakes[br]on all four wheels were optional and your
0:20:33.690,0:20:38.850
best hope of surviving a 50 km/h-crash was [br]to be thrown out of a car and land somewhere
0:20:38.850,0:20:43.059
without breaking your neck.[br]Regulatory agencies are our current
0:20:43.059,0:20:47.380
political system's tool of choice for[br]preventing Paperclip Maximizers from
0:20:47.380,0:20:55.159
running amok. Unfortunately, regulators[br]don't always work. The first failure mode
0:20:55.159,0:20:59.730
of regulators that you need to be aware of[br]is regulatory capture, where regulatory
0:20:59.730,0:21:05.038
bodies are captured by the industries they[br]control. Ajit Pai, head of the American Federal
0:21:05.038,0:21:09.460
Communications Commission, which just voted[br]to eliminate net neutrality rules in the
0:21:09.460,0:21:14.101
U.S., has worked as Associate[br]General Counsel for Verizon Communications
0:21:14.101,0:21:19.291
Inc, the largest current descendant of the[br]Bell Telephone system's monopoly. After
0:21:19.291,0:21:24.690
the AT&T antitrust lawsuit, the Bell[br]network was broken up into the seven baby
0:21:24.690,0:21:32.279
bells. They've now pretty much re-formed[br]and re-aggregated, and Verizon is the largest of them.
0:21:32.279,0:21:36.099
Why should someone with a transparent[br]interest in a technology corporation end
0:21:36.099,0:21:41.129
up running a regulator that tries to[br]control the industry in question? Well, if
0:21:41.129,0:21:44.919
you're going to regulate a complex[br]technology, you need to recruit regulators
0:21:44.919,0:21:48.580
from people who understand it.[br]Unfortunately, most of those people are
0:21:48.580,0:21:53.520
industry insiders. Ajit Pai is clearly[br]very much aware of how Verizon is
0:21:53.520,0:21:58.450
regulated, very insightful into its[br]operations and wants to do something about
0:21:58.450,0:22:02.509
it - just not necessarily in the public[br]interest.
0:22:02.509,0:22:11.029
applause[br]When regulators end up staffed by people
0:22:11.029,0:22:15.370
drawn from the industries they're supposed[br]to control, they frequently end up working
0:22:15.370,0:22:20.019
with their former office mates, to make it[br]easier to turn a profit, either by raising
0:22:20.019,0:22:24.350
barriers to keep new insurgent companies[br]out or by dismantling safeguards that
0:22:24.350,0:22:31.610
protect the public. Now a second problem[br]is regulatory lag where a technology
0:22:31.610,0:22:35.240
advances so rapidly that regulations are[br]laughably obsolete by the time they're
0:22:35.240,0:22:40.450
issued. Consider the EU directive[br]requiring cookie notices on websites to
0:22:40.450,0:22:45.709
caution users that their activities are[br]tracked and their privacy may be violated.
0:22:45.709,0:22:51.419
This would have been a good idea in 1993[br]or 1996, but unfortunately it didn't show up
0:22:51.419,0:22:57.690
until 2011. Fingerprinting and tracking[br]mechanisms have nothing to do with cookies
0:22:57.690,0:23:04.149
and were already widespread by then. Tim[br]Berners-Lee observed in 1995 that five
0:23:04.149,0:23:07.780
years worth of change was happening on the[br]web for every 12 months of real-world
0:23:07.780,0:23:12.389
time. By that yardstick, the cookie law[br]came out nearly a century too late to do
0:23:12.389,0:23:19.269
any good. Again, look at Uber. This month,[br]the European Court of Justice ruled that
0:23:19.269,0:23:24.520
Uber is a taxi service, not a Web App. This[br]is arguably correct - the problem is, Uber
0:23:24.520,0:23:28.970
has spread globally since it was founded[br]eight years ago, subsidizing its drivers to
0:23:28.970,0:23:33.580
put competing private hire firms out of[br]business. Whether this is a net good for
0:23:33.580,0:23:38.659
society is debatable. The problem is, a[br]taxi driver can get awfully hungry if she
0:23:38.659,0:23:42.429
has to wait eight years for a court ruling[br]against a predator intent on disrupting
0:23:42.429,0:23:49.549
her business. So, to recap: firstly, we[br]already have Paperclip Maximizers and
0:23:49.549,0:23:54.980
Musk's AI alarmism is curiously mirror[br]blind. Secondly, we have mechanisms for
0:23:54.980,0:23:59.509
keeping Paperclip Maximizers in check, but[br]they don't work very well against AIs that
0:23:59.509,0:24:03.490
deploy the dark arts, especially[br]corruption and bribery and they're even
0:24:03.490,0:24:07.769
worse against true AIs that evolve too[br]fast for human-mediated mechanisms like
0:24:07.769,0:24:13.529
the law to keep up with. Finally, unlike[br]the naive vision of a Paperclip Maximizer
0:24:13.529,0:24:19.190
that maximizes only paperclips, existing[br]AIs have multiple agendas, their overt
0:24:19.190,0:24:24.210
goal, but also profit seeking, expansion[br]into new markets and to accommodate the
0:24:24.210,0:24:27.500
desire of whoever is currently in the[br]driving seat.
0:24:27.500,0:24:30.500
sighs
0:24:30.500,0:24:36.190
Now, this brings me to the next major[br]heading in this dismaying laundry list:
0:24:36.190,0:24:42.560
how it all went wrong. It seems to me that[br]our current political upheavals are best
0:24:42.560,0:24:49.379
understood as arising from the capture[br]of post-1917 democratic institutions by
0:24:49.379,0:24:54.811
large-scale AI. Everywhere you look, you[br]see voters protesting angrily against an
0:24:54.811,0:24:58.779
entrenched establishment, that seems[br]determined to ignore the wants and needs
0:24:58.779,0:25:04.330
of their human constituents in favor of[br]those of the machines. The Brexit upset
0:25:04.330,0:25:07.130
was largely the result of a protest vote[br]against the British political
0:25:07.130,0:25:11.340
establishment, the election of Donald[br]Trump likewise, with a side order of racism
0:25:11.340,0:25:15.940
on top. Our major political parties are[br]led by people who are compatible with the
0:25:15.940,0:25:20.789
system as it exists today, a system that[br]has been shaped over decades by
0:25:20.789,0:25:26.010
corporations distorting our government and[br]regulatory environments. We humans live in
0:25:26.010,0:25:30.700
a world shaped by the desires and needs of[br]AI, forced to live on their terms and we're
0:25:30.700,0:25:34.010
taught that we're valuable only to the[br]extent we contribute to the rule of the
0:25:34.010,0:25:40.259
machines. Now, this is the CCC and we're[br]all more interested in computers and
0:25:40.259,0:25:44.389
communications technology than this[br]historical crap. But as I said earlier,
0:25:44.389,0:25:49.149
history is a secret weapon if you know how[br]to use it. What history is good for is
0:25:49.149,0:25:53.080
enabling us to spot recurring patterns[br]that repeat across timescales outside our
0:25:53.080,0:25:58.129
personal experience. And if we look at our[br]historical very slow AIs, what do we learn
0:25:58.129,0:26:04.880
from them about modern AI and how it's[br]going to behave? Well, to start with, our
0:26:04.880,0:26:09.879
AIs have been warped. The new AIs,[br]the electronic ones instantiated in our
0:26:09.879,0:26:14.639
machines, have been warped by a terrible,[br]fundamentally flawed design decision back
0:26:14.639,0:26:20.289
in 1995 that has damaged democratic[br]political processes, crippled our ability
0:26:20.289,0:26:24.570
to truly understand the world around us[br]and led to the angry upheavals and upsets
0:26:24.570,0:26:30.440
of our present decade. That mistake was[br]the decision to fund the build-out of the
0:26:30.440,0:26:34.370
public World Wide Web, as opposed to the[br]earlier government-funded corporate and
0:26:34.370,0:26:38.230
academic Internet, by[br]monetizing eyeballs through advertising
0:26:38.230,0:26:45.260
revenue. The ad-supported web we're used[br]to today wasn't inevitable. If you recall
0:26:45.260,0:26:49.510
the web as it was in 1994, there were very[br]few ads at all and not much in the way of
0:26:49.510,0:26:55.720
commerce. 1995 was the year the World Wide[br]Web really came to public attention in the
0:26:55.720,0:27:00.570
anglophone world and consumer-facing[br]websites began to appear. Nobody really
0:27:00.570,0:27:04.490
knew, how this thing was going to be paid[br]for. The original .com bubble was all
0:27:04.490,0:27:07.850
about working out, how to monetize the web[br]for the first time and a lot of people
0:27:07.850,0:27:12.840
lost their shirts in the process. A naive[br]initial assumption was that the
0:27:12.840,0:27:17.440
transaction cost of setting up a TCP/IP[br]connection over a modem was too high to
0:27:17.440,0:27:22.500
be supported by per-use micro-billing[br]for web pages. So instead of
0:27:22.500,0:27:27.059
charging people a fraction of a euro cent[br]for every page view, we'd bill customers
0:27:27.059,0:27:31.570
indirectly, by shoving advertising banners[br]in front of their eyes and hoping they'd
0:27:31.570,0:27:39.029
click through and buy something.[br]Unfortunately, advertising is an
0:27:39.029,0:27:45.509
industry, one of those pre-existing very[br]slow AI ecosystems I already alluded to.
0:27:45.509,0:27:49.700
Advertising tries to maximize its hold on[br]the attention of the minds behind each
0:27:49.700,0:27:53.750
human eyeball. The coupling of advertising[br]with web search was an inevitable
0:27:53.750,0:27:57.991
outgrowth; I mean, how better to attract[br]the attention of reluctant subjects than to
0:27:57.991,0:28:01.450
find out what they're really interested in[br]seeing and sell ads that relate to
0:28:01.450,0:28:07.070
those interests? The problem of applying[br]the paperclip maximizer approach to
0:28:07.070,0:28:12.509
monopolizing eyeballs, however, is that[br]eyeballs are a limited, scarce resource.
0:28:12.509,0:28:18.500
There are only 168 hours in every week, in[br]which I can gaze at banner ads. Moreover,
0:28:18.500,0:28:22.289
most ads are irrelevant to my interests and[br]it doesn't matter, how often you flash an ad
0:28:22.289,0:28:28.129
for dog biscuits at me, I'm never going to[br]buy any. I have a cat. To make best
0:28:28.129,0:28:31.990
revenue-generating use of our eyeballs,[br]it's necessary for the ad industry to
0:28:31.990,0:28:36.789
learn, who we are and what interests us and[br]to target us increasingly minutely in hope
0:28:36.789,0:28:39.549
of hooking us with stuff we're attracted[br]to.
0:28:39.549,0:28:43.879
In other words: the ad industry is a[br]paperclip maximizer, but for its success,
0:28:43.879,0:28:49.990
it relies on developing a theory of mind[br]that applies to human beings.
0:28:49.990,0:28:52.990
sighs
0:28:52.990,0:28:55.739
Do I need to divert on to the impassioned [br]rant about the hideous corruption
0:28:55.739,0:28:59.759
and evil that is Facebook?[br]Audience: Yes!
0:28:59.759,0:29:03.289
CS: Okay, somebody said yes.[br]I'm guessing you've heard it all before,
0:29:03.289,0:29:07.429
but the too-long-didn't-read summary is:[br]Facebook is as much a search engine as
0:29:07.429,0:29:12.029
Google or Amazon. Facebook searches are[br]optimized for faces, that is for human
0:29:12.029,0:29:15.649
beings. If you want to find someone you[br]fell out of touch with thirty years ago,
0:29:15.649,0:29:20.610
Facebook probably knows where they live,[br]what their favorite color is, what sized
0:29:20.610,0:29:24.390
shoes they wear and what they said about[br]you to your friends behind your back all
0:29:24.390,0:29:30.090
those years ago, that made you cut them off.[br]Even if you don't have a Facebook account,
0:29:30.090,0:29:34.230
Facebook has a You account, a hole in their[br]social graph with a bunch of connections
0:29:34.230,0:29:38.669
pointing into it and your name tagged in[br]your friends' photographs. They know a lot
0:29:38.669,0:29:42.529
about you and they sell access to their[br]social graph to advertisers, who then
0:29:42.529,0:29:47.400
target you, even if you don't think you use[br]Facebook. Indeed, there is barely any
0:29:47.400,0:29:52.270
point in not using Facebook these days, if[br]ever. Social media Borg: "Resistance is
0:29:52.270,0:30:00.510
futile!" Anyway, Facebook is trying to[br]get eyeballs on ads; so is Twitter and so
0:30:00.510,0:30:05.970
is Google. To do this, they fine-tune the[br]content they show you to make it more
0:30:05.970,0:30:11.520
attractive to your eyes and by attractive[br]I do not mean pleasant. We humans have an
0:30:11.520,0:30:15.139
evolved automatic reflex to pay attention[br]to threats and horrors as well as
0:30:15.139,0:30:19.830
pleasurable stimuli and the algorithms,[br]that determine what they show us when we
0:30:19.830,0:30:24.259
look at Facebook or Twitter, take this bias[br]into account. You might react more
0:30:24.259,0:30:28.490
strongly to a public hanging in Iran or an[br]outrageous statement by Donald Trump than
0:30:28.490,0:30:32.100
to a couple kissing. The algorithm knows[br]and will show you whatever makes you pay
0:30:32.100,0:30:38.389
attention, not necessarily what you need or[br]want to see.
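The ranking behavior he's describing can be sketched in a few lines. This is a toy illustration only, not Facebook's or Twitter's actual algorithm; the field names and weights are invented:

```python
# Toy sketch of an engagement-maximizing feed ranker (weights invented).
def attention_score(post):
    # Outrage and fear hold the eye at least as well as pleasure does,
    # so the score rewards any strong predicted reaction, pleasant or not.
    return (0.6 * post["predicted_outrage"]
            + 0.5 * post["predicted_fear"]
            + 0.4 * post["predicted_joy"])

def rank_feed(posts):
    # Show the most attention-grabbing items first.
    return sorted(posts, key=attention_score, reverse=True)

posts = [
    {"id": "kissing_couple",  "predicted_outrage": 0.0, "predicted_fear": 0.0, "predicted_joy": 0.7},
    {"id": "public_hanging",  "predicted_outrage": 0.9, "predicted_fear": 0.8, "predicted_joy": 0.0},
    {"id": "trump_statement", "predicted_outrage": 0.8, "predicted_fear": 0.3, "predicted_joy": 0.1},
]

feed = rank_feed(posts)
print([p["id"] for p in feed])
# ['public_hanging', 'trump_statement', 'kissing_couple']
```

Note that nothing in the objective distinguishes "what you need or want to see" from "what you cannot look away from"; that is the whole point.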
0:30:38.389,0:30:42.510
So this brings me to another point about[br]computerized AI as opposed to corporate
0:30:42.510,0:30:47.260
AI. AI algorithms tend to embody the[br]prejudices and beliefs of either the
0:30:47.260,0:30:52.759
programmers, or the data set [br]the AI was trained on.[br]
0:30:52.759,0:30:56.250
A couple of years ago I ran across an[br]account of a webcam, developed by mostly[br]
0:30:56.250,0:31:00.860
pale-skinned Silicon Valley engineers, that[br]had difficulty focusing or achieving correct
0:31:00.860,0:31:04.500
color balance, when pointed at dark-skinned[br]faces.
0:31:04.500,0:31:08.400
That's an example of human programmer-[br]induced bias: they didn't have a wide
0:31:08.400,0:31:13.249
enough test set and didn't recognize that[br]they were inherently biased towards
0:31:13.249,0:31:19.290
expecting people to have pale skin. But[br]with today's deep learning, bias can creep
0:31:19.290,0:31:24.009
in via the datasets the neural networks are[br]trained on, even without the programmers
0:31:24.009,0:31:28.659
intending it. Microsoft's first foray into[br]a conversational chat bot driven by
0:31:28.659,0:31:33.330
machine learning, Tay, was yanked[br]offline within days last year, because
0:31:33.330,0:31:37.149
4chan- and Reddit-based trolls discovered[br]that they could train it towards racism and
0:31:37.149,0:31:43.649
sexism for shits and giggles. Just imagine[br]you're a poor naive innocent AI who's just
0:31:43.649,0:31:48.480
been switched on and you're hoping to pass[br]your Turing test and what happens? 4chan
0:31:48.480,0:31:53.189
decide to play with your head.[br]laughing
0:31:53.189,0:31:57.519
I got to feel sorry for Tay.[br]Now, humans may be biased,
0:31:57.519,0:32:01.059
but at least individually we're[br]accountable and if somebody gives you
0:32:01.059,0:32:06.499
racist or sexist abuse to your face, you[br]can complain or maybe punch them. It's
0:32:06.499,0:32:10.740
impossible to punch a corporation and it[br]may not even be possible to identify the
0:32:10.740,0:32:16.029
source of unfair bias, when you're dealing[br]with a machine learning system. AI based
0:32:16.029,0:32:21.739
systems that instantiate existing[br]prejudices make social change harder.
0:32:21.739,0:32:25.470
Traditional advertising works by playing[br]on the target customer's insecurity and
0:32:25.470,0:32:30.750
fear as much as their aspirations. And fear[br]of a loss of social status and privilege
0:32:30.750,0:32:36.240
is a powerful stressor. Fear and xenophobia[br]are useful tools for attracting..
0:32:36.240,0:32:39.580
ah, eyeballs.[br]What happens when we get pervasive social
0:32:39.580,0:32:44.379
networks, that have learned biases against[br]say Feminism or Islam or melanin? Or deep
0:32:44.379,0:32:48.230
learning systems, trained on datasets[br]contaminated by racist dipshits and their
0:32:48.230,0:32:53.369
propaganda? Deep learning systems like the[br]ones inside Facebook, that determine which
0:32:53.369,0:32:58.210
stories to show you to get you to pay as[br]much attention as possible to the adverts?
0:32:58.210,0:33:04.841
I think you probably have an inkling of[br]where this is now going. Now, if you
0:33:04.841,0:33:08.990
think, this is sounding a bit bleak and[br]unpleasant, you'd be right. I write sci-fi.
0:33:08.990,0:33:12.669
You read or watch or play sci-fi. We're[br]acculturated to think of science and
0:33:12.669,0:33:19.420
technology as good things that make life[br]better, but this ain't always so. Plenty of
0:33:19.420,0:33:23.140
technologies have historically been[br]heavily regulated or even criminalized for
0:33:23.140,0:33:27.690
good reason and once you get past any[br]reflexive indignation, criticism of
0:33:27.690,0:33:32.740
technology and progress, you might agree[br]with me, that it is reasonable to ban
0:33:32.740,0:33:39.139
individuals from owning nuclear weapons or[br]nerve gas. Less obviously, they may not be
0:33:39.139,0:33:42.820
weapons, but we've banned[br]chlorofluorocarbon refrigerants, because
0:33:42.820,0:33:45.570
they were building up in the high[br]stratosphere and destroying the ozone
0:33:45.570,0:33:51.360
layer that protects us from UVB radiation.[br]We banned tetraethyl lead in
0:33:51.360,0:33:57.540
gasoline, because it poisoned people and[br]led to a crime wave. These are not
0:33:57.540,0:34:03.240
weaponized technologies, but they have[br]horrible side effects. Now, nerve gas and
0:34:03.240,0:34:08.690
leaded gasoline were 1930s chemical[br]technologies, promoted by 1930s
0:34:08.690,0:34:15.390
corporations. Halogenated refrigerants and[br]nuclear weapons are totally 1940s. ICBMs
0:34:15.390,0:34:19.210
date to the 1950s. You know, I have[br]difficulty seeing why people are getting
0:34:19.210,0:34:26.040
so worked up over North Korea. North Korea[br]reaches 1953 level parity - be terrified
0:34:26.040,0:34:30.760
and hide under the bed![br]I submit that the 21st century is throwing
0:34:30.760,0:34:35.030
up dangerous new technologies, just as our[br]existing strategies for regulating very
0:34:35.030,0:34:42.060
slow AIs have proven to be inadequate. And[br]I don't have an answer to how we regulate
0:34:42.060,0:34:45.889
new technologies, I just want to flag it up[br]as a huge social problem that is going to
0:34:45.889,0:34:50.380
affect the coming century.[br]I'm now going to give you four examples of
0:34:50.380,0:34:54.290
new types of AI application that are[br]going to warp our societies even more
0:34:54.290,0:35:00.800
badly than the old slow AIs have done.[br]This isn't an exhaustive list, this is just
0:35:00.800,0:35:05.160
some examples I pulled out of[br]my ass. We need to work out a general
0:35:05.160,0:35:08.430
strategy for getting on top of this sort[br]of thing before they get on top of us and
0:35:08.430,0:35:12.040
I think, this is actually a very urgent[br]problem. So I'm just going to give you this
0:35:12.040,0:35:18.110
list of dangerous new technologies that[br]are arriving now, or coming, and send you
0:35:18.110,0:35:22.150
away to think about what to do next. I[br]mean, we are activists here, we should be
0:35:22.150,0:35:27.690
thinking about this and planning what[br]to do. Now, the first nasty technology I'd
0:35:27.690,0:35:31.600
like to talk about, is political hacking[br]tools that rely on social graph directed
0:35:31.600,0:35:39.500
propaganda. This is low-hanging fruit[br]after the electoral surprises of 2016.
0:35:39.500,0:35:43.280
Cambridge Analytica pioneered the use of[br]deep learning by scanning the Facebook and
0:35:43.280,0:35:47.750
Twitter social graphs to identify voters'[br]political affiliations simply by looking
0:35:43.280,0:35:47.750
at what tweets or Facebook comments they[br]liked. They were able to use this to identify,
0:35:47.750,0:35:52.950
with a high degree of[br]precision, individuals who were vulnerable to
0:35:56.400,0:36:01.490
persuasion and who lived in electorally[br]sensitive districts. They then canvassed
0:36:01.490,0:36:06.910
them with propaganda, that targeted their[br]personal hot-button issues to change their
0:36:06.910,0:36:12.060
electoral intentions. The tools developed[br]by web advertisers to sell products have
0:36:12.060,0:36:16.260
now been weaponized for political purposes[br]and the amount of personal information
0:36:16.260,0:36:21.260
about our affiliations that we expose on[br]social media makes us vulnerable. Aside:
0:36:21.260,0:36:24.970
there is mounting evidence that the last[br]U.S. Presidential election and the British
0:36:24.970,0:36:28.510
referendum on leaving the EU were subject[br]to foreign cyber war attack via
0:36:28.510,0:36:33.250
weaponized social media, as was the most[br]recent French Presidential election.
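The affiliation-inference step he attributes to Cambridge Analytica is conceptually simple. Here is a deliberately minimal sketch with entirely made-up page names, and a crude set-overlap score standing in for their trained deep-learning models:

```python
# Hypothetical sketch of affiliation inference from page likes: represent
# each user as a set of liked pages and score overlap with pages typical
# of each political camp. Real systems use far richer features and trained
# classifiers; this only shows the shape of the idea. All data is invented.

AFFILIATION_PAGES = {
    "party_a": {"page_guns", "page_church", "page_trucks"},
    "party_b": {"page_climate", "page_unions", "page_arthouse"},
}

def infer_affiliation(likes):
    def score(pages):
        # Jaccard overlap between the user's likes and the camp's pages.
        return len(likes & pages) / len(likes | pages)
    return max(AFFILIATION_PAGES, key=lambda a: score(AFFILIATION_PAGES[a]))

voter = {"page_climate", "page_arthouse", "page_cats"}
print(infer_affiliation(voter))  # party_b
```

Scale that up over millions of profiles and cross it with voter-roll geography, and you get exactly the "persuadable voters in sensitive districts" targeting list described above.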
0:36:33.250,0:36:38.080
In fact, if we remember the leak of emails[br]from the Macron campaign, it turns out that
0:36:38.080,0:36:42.020
many of those emails were false, because[br]the Macron campaign anticipated that they
0:36:42.020,0:36:47.120
would be attacked and an email trove would[br]be leaked in the last days before the
0:36:47.120,0:36:51.260
election. So they deliberately set up[br]false emails that would be hacked and then
0:36:51.260,0:37:00.610
leaked and then could be discredited. It[br]gets twisty fast. Now I'm kind of biting
0:37:00.610,0:37:04.650
my tongue and trying, not to take sides[br]here. I have my own political affiliation
0:37:04.650,0:37:09.770
after all, and I'm not terribly mainstream.[br]But if social media companies don't work
0:37:09.770,0:37:14.220
out how to identify and flag micro-[br]targeted propaganda, then democratic
0:37:14.220,0:37:18.140
institutions will stop working and elections[br]will be replaced by victories for whoever
0:37:18.140,0:37:23.200
can buy the most trolls. This won't[br]simply be billionaires like the Koch
0:37:23.200,0:37:26.470
brothers and Robert Mercer in the U.S.[br]throwing elections to whoever will
0:37:26.470,0:37:31.120
hand them the biggest tax cuts. Russian[br]military cyber war doctrine calls for the
0:37:31.120,0:37:35.730
use of social media to confuse and disable[br]perceived enemies, in addition to the
0:37:35.730,0:37:39.900
increasingly familiar use of zero-day[br]exploits for espionage, such as spear
0:37:39.900,0:37:43.200
phishing, and distributed denial-of-service[br]attacks on our infrastructure, which are
0:37:43.200,0:37:49.260
practiced by Western agencies. Problem is,[br]once the Russians have demonstrated that
0:37:49.260,0:37:53.990
this is an effective tactic, the use of[br]propaganda bot armies in cyber war will go
0:37:53.990,0:38:00.400
global. And at that point, our social[br]discourse will be irreparably poisoned.
0:38:00.400,0:38:04.990
Incidentally, I'd like to add - as another[br]aside like the Elon Musk thing - I hate
0:38:04.990,0:38:09.680
the cyber prefix! It usually indicates,[br]that whoever's using it has no idea what
0:38:09.680,0:38:15.940
they're talking about.[br]applause, laughter
0:38:15.940,0:38:21.120
Unfortunately, much as the way the term[br]hacker was corrupted from its original
0:38:21.120,0:38:27.020
meaning in the 1990s, the term cyber war[br]has, it seems, to have stuck and it's now an
0:38:27.020,0:38:31.500
actual thing that we can point to and say:[br]"This is what we're talking about". So I'm
0:38:31.500,0:38:35.510
afraid, we're stuck with this really[br]horrible term. But that's a digression, I
0:38:35.510,0:38:38.780
should get back on topic, because I've only[br]got 20 minutes to go.
0:38:38.780,0:38:46.240
Now, the second threat that we need to[br]think about regulating ,or controlling, is
0:38:46.240,0:38:50.031
an adjunct to deep-learning targeted[br]propaganda: it's the use of neural network
0:38:50.031,0:38:56.940
generated false video media. We're used to[br]photoshopped images these days, but faking
0:38:56.940,0:39:02.510
video and audio takes it to the next[br]level. Luckily, faking video and audio is
0:39:02.510,0:39:08.560
labor-intensive, isn't it? Well nope, not[br]anymore. We're seeing the first generation
0:39:08.560,0:39:13.510
of AI assisted video porn, in which the[br]faces of film stars are mapped onto those
0:39:13.510,0:39:17.370
of other people in a video clip, using[br]software rather than a laborious human
0:39:17.370,0:39:22.490
process.[br]A properly trained neural network
0:39:22.490,0:39:29.430
recognizes faces and transforms the face[br]of the Hollywood star, they want to put
0:39:29.430,0:39:35.190
into a porn movie, into the face of - onto[br]the face of the porn star in the porn clip
0:39:35.190,0:39:40.520
and suddenly you have "Oh dear God, get it[br]out of my head" - no, not gonna give you
0:39:40.520,0:39:43.500
any examples. Let's just say it's bad[br]stuff.
0:39:43.500,0:39:47.260
laughs[br]Meanwhile we have WaveNet, a system
0:39:47.260,0:39:51.120
for generating realistic-sounding speech[br]in the voice of a human speaker the neural
0:39:47.260,0:39:51.120
network has been trained to mimic.[br]We can now put words into
0:39:56.160,0:40:01.250
other people's mouths realistically[br]without employing a voice actor. This
0:40:01.250,0:40:06.650
stuff is still geek intensive. It requires[br]relatively expensive GPUs or cloud
0:40:06.650,0:40:10.910
computing clusters, but in less than a[br]decade it'll be out in the wild, turned
0:40:10.910,0:40:15.500
into something, any damn script kiddie can[br]use and just about everyone will be able
0:40:15.500,0:40:18.940
to fake up a realistic video of someone[br]they don't like doing something horrible.
0:40:18.940,0:40:26.970
I mean, Donald Trump in the White House. I[br]can't help but hope that out there
0:40:26.970,0:40:30.670
somewhere there's some geek like Steve[br]Bannon with a huge rack of servers who's
0:40:30.670,0:40:40.980
faking it all, but no. Now, also we've[br]already seen alarm this year over bizarre
0:40:40.980,0:40:45.090
YouTube channels that attempt to monetize[br]children's TV brands by scraping the video
0:40:45.090,0:40:50.220
content of legitimate channels and adding[br]their own advertising and keywords on top
0:40:50.220,0:40:54.110
before reposting it. This is basically[br]YouTube spam.
0:40:54.110,0:40:58.521
Many of these channels are shaped by[br]paperclip maximizing advertising AIs that
0:40:58.521,0:41:03.670
are simply trying to maximise their search[br]ranking on YouTube and it's entirely
0:41:03.670,0:41:08.470
algorithmic: you have a whole list of[br]keywords, you permute them, you slap
0:41:08.470,0:41:15.300
them on top of existing popular videos and[br]re-upload the videos. Once you add neural
0:41:15.300,0:41:19.510
network driven tools for inserting[br]character A into pirated video B for
0:41:19.510,0:41:24.420
click-maximizing bots,[br]things are gonna get very weird, though. And
0:41:24.420,0:41:28.980
they're gonna get even weirder, when these[br]tools are deployed for political gain.
0:41:28.980,0:41:34.830
We tend - being primates, that evolved 300[br]thousand years ago in a smartphone free
0:41:34.830,0:41:39.950
environment - to evaluate the inputs from[br]our eyes and ears much less critically
0:41:39.950,0:41:44.380
than what random strangers on the Internet[br]tell us in text. We're already too
0:41:44.380,0:41:48.930
vulnerable to fake news as it is. Soon[br]they'll be coming for us, armed with
0:41:48.930,0:41:54.830
believable video evidence. The Smart Money[br]says that by 2027 you won't be able to
0:41:54.830,0:41:58.550
believe anything you see in video, unless[br]there are cryptographic signatures on it,
0:41:58.550,0:42:02.530
linking it back to the camera that shot[br]the raw feed. But you know how good most
0:42:02.530,0:42:08.150
people are at using encryption - it's going to[br]be chaos!
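The tamper-evidence half of that "signature linked back to the camera" idea can be sketched with nothing but a hash chain over the frames. This is a simplified illustration, not any real provenance standard; in a real scheme the camera would additionally sign the final digest with a private key held in secure hardware, so anyone could verify the feed against the camera's public key:

```python
import hashlib

# Sketch: hash-chain a video's frames so any later edit is detectable.
# Frames here are stand-in byte strings; a real camera would chain the
# raw sensor data and sign the final chain value.

def chain_digest(frames):
    digest = b"\x00" * 32                       # genesis value
    for frame in frames:
        # Each link commits to the previous link plus this frame's bytes.
        digest = hashlib.sha256(digest + frame).digest()
    return digest.hex()

original = [b"frame-1", b"frame-2", b"frame-3"]
tampered = [b"frame-1", b"frame-2-edited", b"frame-3"]

print(chain_digest(original) == chain_digest(tampered))  # False
```

The chain makes editing detectable; the hard, unsolved part is the one the talk flags: getting ordinary people to check signatures at all.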
0:42:08.150,0:42:14.070
So, paperclip maximizers with a focus on[br]eyeballs are very 20th century. The new
0:42:14.070,0:42:19.750
generation is going to be focusing on our[br]nervous system. Advertising as an industry
0:42:19.750,0:42:22.510
can only exist because of a quirk of our[br]nervous system, which is that we're
0:42:22.510,0:42:26.680
susceptible to addiction. Be it [br]tobacco, gambling or heroin, we
0:42:26.680,0:42:31.980
recognize addictive behavior, when we see[br]it. Well, do we? It turns out the human
0:42:31.980,0:42:36.250
brain's reward feedback loops are[br]relatively easy to game. Large
0:42:36.250,0:42:41.400
corporations like Zynga - producers of[br]FarmVille - exist solely because of it.
0:42:41.400,0:42:45.580
Free-to-use social media platforms like[br]Facebook and Twitter are dominant precisely
0:42:45.580,0:42:50.100
because they're structured to reward[br]frequent short bursts of interaction and
0:42:50.100,0:42:54.790
to generate emotional engagement - not[br]necessarily positive emotions, anger and
0:42:54.790,0:42:59.660
hatred are just as good when it comes to[br]attracting eyeballs for advertisers.
0:42:59.660,0:43:04.510
Smartphone addiction is a side effect of[br]advertising as a revenue model. Frequent
0:43:04.510,0:43:09.850
short bursts of interaction to keep us[br]coming back for more. Now a new.. newish
0:43:09.850,0:43:13.720
development, thanks to deep learning again -[br]I keep coming back to deep learning,
0:43:13.720,0:43:19.290
don't I? - use of neural networks in a[br]manner that Marvin Minsky never envisaged,
0:43:19.290,0:43:22.530
back when he was deciding that the[br]Perceptron was where it began and ended
0:43:22.530,0:43:27.360
and it couldn't do anything.[br]Well, we have neuroscientists now, who've
0:43:27.360,0:43:33.680
mechanized the process of making apps more[br]addictive. Dopamine Labs is one startup
0:43:33.680,0:43:38.030
that provides tools to app developers to[br]make any app more addictive, as well as to
0:43:38.030,0:43:41.130
reduce the desire to continue[br]participating in a behavior if it's
0:43:41.130,0:43:46.520
undesirable, if the app developer actually[br]wants to help people kick the habit. This
0:43:46.520,0:43:52.060
goes way beyond automated A/B testing. A/B[br]testing allows developers to plot a binary
0:43:52.060,0:43:58.760
tree path between options, moving towards a[br]single desired goal. But true deep
0:43:58.760,0:44:03.640
learning addictiveness maximizers can[br]optimize for multiple attractors in
0:44:03.640,0:44:10.011
parallel. The more users you've got on[br]your app, the more effectively you can work
0:44:10.011,0:44:16.920
out what attracts them and train it to[br]focus on extra addictive characteristics.
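The contrast with A/B testing can be illustrated with the classic multi-armed bandit loop: instead of walking one binary branch at a time, the optimizer keeps a running engagement estimate for every variant in parallel and shifts traffic toward whatever hooks users best. The variant names and tap rates below are invented; Dopamine Labs' actual methods are not public, so treat this as a generic sketch of the technique:

```python
import random

# Minimal epsilon-greedy bandit: the kind of loop an "addictiveness
# maximizer" runs at scale, here choosing which of three hypothetical
# notification styles keeps a simulated user tapping.

random.seed(7)
TAP_RATE = {"guilt": 0.55, "reward": 0.35, "social": 0.15}  # invented ground truth

counts = {v: 0 for v in TAP_RATE}
values = {v: 0.0 for v in TAP_RATE}   # running mean engagement per variant

for step in range(2000):
    if random.random() < 0.1:                      # explore: try anything
        variant = random.choice(list(TAP_RATE))
    else:                                          # exploit: current best guess
        variant = max(values, key=values.get)
    tapped = 1.0 if random.random() < TAP_RATE[variant] else 0.0
    counts[variant] += 1
    values[variant] += (tapped - values[variant]) / counts[variant]

best = max(values, key=values.get)
print(best, round(values[best], 3))
```

The more users feed the loop, the tighter the estimates get, which is exactly the "more users, more effective" dynamic described above.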
0:44:16.920,0:44:20.610
Now, going by their public face, the folks[br]at Dopamine Labs seem to have ethical
0:44:20.610,0:44:24.930
qualms about the misuse of addiction[br]maximizers. But neuroscience isn't a
0:44:24.930,0:44:28.730
secret and sooner or later some really[br]unscrupulous sociopaths will try to see
0:44:28.730,0:44:36.440
how far they can push it. So let me give[br]you a specific imaginary scenario: Apple
0:44:36.440,0:44:41.400
have put a lot of effort into making real-[br]time face recognition work on the iPhone X
0:44:41.400,0:44:44.940
and it's going to be everywhere on[br]everybody's phone in another couple of
0:44:44.940,0:44:50.760
years. You can't fool an iPhone X with a[br]photo or even a simple mask. It does depth
0:44:50.760,0:44:54.010
mapping to ensure, your eyes are in the[br]right place and can tell whether they're
0:44:54.010,0:44:58.200
open or closed. It recognizes your face[br]from underlying bone structure through
0:44:58.200,0:45:02.760
makeup and bruises. It's running[br]continuously, checking pretty much as often
0:45:02.760,0:45:06.840
as every time you'd hit the home button on[br]a more traditional smartphone UI and it
0:45:06.840,0:45:13.750
can see where your eyeballs are pointing.[br]The purpose of a face recognition system
0:45:13.750,0:45:19.980
is to provide real-time,[br]continuous authentication when you're
0:45:19.980,0:45:23.960
using a device - not just entering a PIN or[br]typing a password or using a two-factor
0:45:23.960,0:45:28.420
authentication pad, but the device knows[br]that you are its authorized user on a
0:45:28.420,0:45:33.000
continuous basis and if somebody grabs[br]your phone and runs away with it, it'll
0:45:33.000,0:45:38.050
know that it's been stolen immediately, it[br]sees the face of the thief.
0:45:38.050,0:45:42.910
However, your phone monitoring your facial[br]expressions and correlating against app
0:45:42.910,0:45:48.250
usage has other implications. Your phone[br]will be aware of precisely what you like
0:45:48.250,0:45:53.550
to look at on its screen.[br]The phone may well have sufficient insight
0:45:48.250,0:45:53.550
to identify whether[br]you're happy or sad, bored or engaged.
0:45:59.740,0:46:04.680
With addiction seeking deep learning tools[br]and neural network generated images, those
0:46:04.680,0:46:09.150
synthetic videos I was talking about, it is[br]in principle entirely possible to
0:46:09.150,0:46:15.250
feed you an endlessly escalating payload[br]of arousal-inducing inputs. It might be
0:46:15.250,0:46:19.440
Facebook or Twitter messages, optimized to[br]produce outrage, or it could be porn
0:46:19.440,0:46:24.620
generated by AI to appeal to kinks you[br]don't even consciously know you have.
0:46:24.620,0:46:28.980
But either way, the app now owns your[br]central nervous system and you will be
0:46:28.980,0:46:36.230
monetized. And finally, I'd like to raise a[br]really hair-raising specter that goes well
0:46:36.230,0:46:41.730
beyond the use of deep learning and[br]targeted propaganda and cyber war. Back in
0:46:41.730,0:46:46.340
2011, an obscure Russian software house[br]launched an iPhone app for pickup artists
0:46:46.340,0:46:52.940
called 'Girls Around Me'. Spoiler: Apple[br]pulled it like a hot potato as soon as
0:46:52.940,0:46:59.060
word got out that it existed. Now, Girls[br]Around Me worked out where the user was
0:46:59.060,0:47:03.730
using GPS, then queried Foursquare[br]and Facebook for people matching a simple
0:47:03.730,0:47:09.880
relational search: single females (per[br]Facebook relationship status) who had
0:47:09.880,0:47:15.020
checked in, or been checked in by their[br]friends, in your vicinity on Foursquare.
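That relational search was nothing exotic. Here is a hypothetical reconstruction of its core query, with made-up profiles and coordinates: filter by relationship status, then by great-circle distance from the user:

```python
import math

# Hypothetical sketch of the app's core query. All names, statuses and
# coordinates below are invented for illustration.

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

profiles = [
    {"name": "A", "status": "single", "lat": 52.5200, "lon": 13.4050},
    {"name": "B", "status": "taken",  "lat": 52.5205, "lon": 13.4060},
    {"name": "C", "status": "single", "lat": 48.8566, "lon": 2.3522},
]

def nearby_matches(user_lat, user_lon, radius_km, status):
    # Join the profile attributes against the check-in locations.
    return [p["name"] for p in profiles
            if p["status"] == status
            and haversine_km(user_lat, user_lon, p["lat"], p["lon"]) <= radius_km]

print(nearby_matches(52.5201, 13.4049, 1.0, "single"))  # ['A']
```

A trivial join over two public APIs: that's all the "creepy" part ever was, which is why the same pattern keeps resurfacing.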
0:47:15.020,0:47:19.270
The app then displays their locations on a[br]map along with links to their social media
0:47:19.270,0:47:24.560
profiles. If they were doing it today, the[br]interface would be gamified, showing strike
0:47:24.560,0:47:28.980
rates and a leaderboard and flagging[br]targets who succumbed to harassment as
0:47:28.980,0:47:31.960
easy lays.[br]But these days, the cool kids and single
0:47:31.960,0:47:35.980
adults are all using dating apps with a[br]missing vowel in the name, only a creeper
0:47:35.980,0:47:45.420
would want something like Girls Around Me,[br]right? Unfortunately, there are much, much
0:47:45.420,0:47:49.450
nastier uses of scraping social media[br]than finding potential victims for serial
0:47:49.450,0:47:54.480
rapists. Does your social media profile[br]indicate your political or religious
0:47:54.480,0:47:59.650
affiliation? No? Cambridge Analytica can[br]work them out with 99.9% precision
0:47:59.650,0:48:04.660
anyway, so don't worry about that. We[br]already have you pegged. Now add a service
0:48:04.660,0:48:08.430
that can identify people's affiliation and[br]location and you have the beginning of a
0:48:08.430,0:48:13.520
flash mob app, one that will show people[br]like us and people like them on a
0:48:13.520,0:48:18.551
hyperlocal map.[br]Imagine you're a young female and a
0:48:18.551,0:48:22.240
supermarket like Target has figured out[br]from your purchase patterns, that you're
0:48:22.240,0:48:28.160
pregnant, even though you don't know it[br]yet. This actually happened in 2011. Now
0:48:28.160,0:48:31.250
imagine, that all the anti-abortion[br]campaigners in your town have an app
0:48:31.250,0:48:36.110
called "Babies Risk" on their phones.[br]Someone has paid for the analytics feed
0:48:36.110,0:48:40.320
from the supermarket and every time you go[br]near a family planning clinic, a group of
0:48:40.320,0:48:44.270
unfriendly anti-abortion protesters[br]somehow miraculously show up and swarm
0:48:44.270,0:48:49.990
you. Or imagine you're male and gay and[br]the "God hates fags"-crowd has invented a
0:48:49.990,0:48:55.300
100% reliable gaydar app, based on your[br]Grindr profile, and is getting their fellow
0:48:55.300,0:49:00.810
travelers to queer bash gay men - only when[br]they're alone or outnumbered by ten to
0:49:00.810,0:49:06.410
one. That's the special horror of precise[br]geolocation: not only do you always know
0:49:06.410,0:49:13.020
where you are, the AIs know where you are[br]and some of them aren't friendly. Or
0:49:13.020,0:49:17.330
imagine you're in Pakistan and Christian/[br]Muslim tensions are rising, or you're in rural
0:49:17.330,0:49:24.250
Alabama and a Democrat - you know, the[br]possibilities are endless. Someone out
0:49:24.250,0:49:30.120
there is working on this. A geolocation[br]aware, social media scraping deep learning
0:49:30.120,0:49:34.230
application, that uses a gamified[br]competitive interface to reward its
0:49:34.230,0:49:38.500
players for joining in acts of mob[br]violence against whoever the app developer
0:49:38.500,0:49:41.640
hates.[br]Probably it has an innocuous seeming, but
0:49:41.640,0:49:46.650
highly addictive training mode, to get the[br]users accustomed to working in teams and
0:49:46.650,0:49:52.640
obeying the app's instructions. Think[br]Ingress or Pokémon Go. Then at some pre-
0:49:52.640,0:49:57.770
planned zero-hour, it switches mode and[br]starts rewarding players for violence,
0:49:57.770,0:50:01.780
players who have been primed to think of[br]their targets as vermin by a steady drip
0:50:01.780,0:50:06.470
feed of micro-targeted dehumanizing[br]propaganda inputs, delivered over a period
0:50:06.470,0:50:12.310
of months. And the worst bit of this picture[br]is that the app developer isn't even a
0:50:12.310,0:50:16.540
nation-state trying to disrupt its enemies[br]or an extremist political group trying to
0:50:16.540,0:50:22.000
murder gays, Jews or Muslims. It's just a[br]Paperclip Maximizer doing what it does
0:50:22.000,0:50:27.900
and you are the paper. Welcome to the 21st[br]century.
0:50:27.900,0:50:40.730
applause[br]Uhm...
0:50:40.730,0:50:41.960
Thank you.
0:50:41.960,0:50:47.780
ongoing applause[br]We have a little time for questions. Do
0:50:47.780,0:50:53.790
you have a microphone for the audience? Do[br]we have any questions? ... OK.
0:50:53.790,0:50:56.430
Herald: So you are doing a Q&A?[br]CS: Hmm?
0:50:56.430,0:51:02.290
Herald: So you are doing a Q&A. Well if[br]there are any questions, please come
0:51:02.290,0:51:24.320
forward to the microphones, numbers 1[br]through 4 and ask.
0:51:24.320,0:51:29.010
Mic 1: Do you really think it's all[br]bleak and dystopian like you described
0:51:29.010,0:51:34.750
it, because I also think the future can be[br]bright, looking at the internet with open
0:51:34.750,0:51:38.730
source and like, it's all growing and going[br]faster and faster in a good
0:51:38.730,0:51:44.110
direction. So what do you think about[br]the balance here?
0:51:44.110,0:51:48.070
CS: sighs Basically, I think the[br]problem is that about 3% of us
0:51:48.070,0:51:54.330
are sociopaths or psychopaths, who spoil[br]everything for the other 97% of us.
0:51:54.330,0:51:56.980
Wouldn't it be great if somebody could[br]write an app that would identify all the
0:51:56.980,0:51:59.690
psychopaths among us and let the rest of[br]us just kill them?
0:51:59.690,0:52:05.320
laughing, applause[br]Yeah, we have all the
0:52:05.320,0:52:12.300
tools to make a utopia, we have it now[br]today. A bleak miserable grim meathook
0:52:12.300,0:52:17.510
future is not inevitable, but it's up to[br]us to use these tools to prevent the bad
0:52:17.510,0:52:22.480
stuff happening and to do that, we have to[br]anticipate the bad outcomes and work to
0:52:22.480,0:52:25.260
try and figure out a way to deal with[br]them. That's what this talk is: I'm trying
0:52:25.260,0:52:29.300
to do a bit of a wake-up call and get[br]people thinking about how much worse
0:52:29.300,0:52:33.630
things can get and what we need to do to[br]prevent it from happening. What I was
0:52:33.630,0:52:37.960
saying earlier about our regulatory[br]systems being broken stands. How do we
0:52:37.960,0:52:47.500
regulate the deep learning technologies?[br]This is something we need to think about.
0:52:47.500,0:52:55.230
H: Okay mic number two.[br]Mic 2: Hello? ... When you talk about
0:52:55.230,0:53:02.250
corporations as AIs, where do you see that[br]analogy you're making? Do you see them as
0:53:02.250,0:53:10.010
literally AIs or figuratively?[br]CS: Almost literally. If
0:53:10.010,0:53:14.230
you're familiar with philosopher[br]John Searle's Chinese room paradox
0:53:14.230,0:53:17.620
from the 1970s, by which he attempted to[br]prove that artificial intelligence was
0:53:17.620,0:53:24.380
impossible, a corporation is very much the[br]Chinese room implementation of an AI. It
0:53:24.380,0:53:29.400
is a bunch of human beings in a box. You[br]put inputs into the box, you get outputs
0:53:29.400,0:53:33.000
out of the box. Does it matter whether it's[br]all happening in software or whether
0:53:33.000,0:53:37.520
there's a human being following rules[br]in between to assemble the output? I don't
0:53:37.520,0:53:41.190
see there being much of a difference.[br]Now you have to look at a company at a
0:53:41.190,0:53:46.630
very abstract level to view it as an AI,[br]but more and more companies are automating
0:53:46.630,0:53:54.500
their internal business processes. You've[br]got to view this as an ongoing trend. And
0:53:54.500,0:54:01.490
yeah, they have many of the characteristics[br]of an AI.
0:54:01.490,0:54:06.240
Herald: Okay mic number four.[br]Mic 4: Hi, thanks for your talk.
0:54:06.240,0:54:13.220
You probably heard of the Time Well[br]Spent and Design Ethics movements that
0:54:13.220,0:54:16.850
are alerting developers to dark patterns[br]in UI design, where
0:54:16.850,0:54:20.630
these people design apps to manipulate[br]people. I'm curious if you find any
0:54:20.630,0:54:26.750
optimism in the possibility of amplifying[br]or promoting those movements.
0:54:26.750,0:54:31.570
CS: Uhm, you know, I knew about dark[br]patterns, I knew about people trying to
0:54:31.570,0:54:36.230
optimize them, I wasn't actually aware[br]there were movements against this. Okay I'm
0:54:36.230,0:54:40.760
53 years old, I'm out of touch. I haven't[br]actually done any serious programming in
0:54:40.760,0:54:47.000
15 years. I'm so rusty, my rust has rust on[br]it. But, you know, it is a worrying trend
0:54:47.000,0:54:54.000
and actual activism is a good start.[br]Raising awareness of hazards and of what
0:54:54.000,0:54:58.330
we should be doing about them, is a good[br]start. And I would classify this actually
0:54:58.330,0:55:04.220
as a moral issue. We need to..[br]corporations evaluate everything in terms
0:55:04.220,0:55:07.960
of revenue, because to them it's[br]equivalent to breathing; they have to
0:55:07.960,0:55:13.810
breathe. Corporations don't usually have[br]any moral framework. We're humans, we need
0:55:13.810,0:55:18.480
a moral framework to operate within. Even[br]if it's as simple as first "Do no harm!"
0:55:18.480,0:55:22.500
or "Do not do unto others that which would[br]be repugnant if it was done unto you!",
0:55:22.500,0:55:26.450
the Golden Rule. So, yeah, we should be[br]spreading awareness of this
0:55:26.450,0:55:32.170
and working with program developers to[br]remind them that they are human
0:55:32.170,0:55:36.480
beings and be humane in their[br]application of technology; that is a necessary
0:55:36.480,0:55:39.510
start.[br]applause
0:55:39.510,0:55:45.501
H: Thank you! Mic 3?[br]Mic 3: Hi! Yeah, I think that folks,
0:55:45.501,0:55:48.600
especially in this sort of crowd, tend to[br]jump to the "just get off of
0:55:48.600,0:55:52.330
Facebook"-solution first, for a lot of[br]these things that are really, really
0:55:52.330,0:55:56.640
scary. But what worries me, is how we sort[br]of silence ourselves when we do that.
0:55:56.640,0:56:02.240
After the election I actually got back on[br]Facebook, because the Women's March was
0:56:02.240,0:56:07.250
mostly organized through Facebook. But[br]yeah, I think we need a lot more
0:56:07.250,0:56:12.880
regulation, but we can't just throw it[br]out, because
0:56:12.880,0:56:16.520
social media is the only really good[br]platform we have right now
0:56:16.520,0:56:20.410
to express ourselves, to[br]have our voice, our power.
0:56:20.410,0:56:24.660
CS: Absolutely. I have made[br]a point of not really using Facebook
0:56:24.660,0:56:28.480
for many, many, many years.[br]I have a Facebook page simply to
0:56:28.480,0:56:31.530
shut up the young marketing people at my[br]publisher, who used to pop up every two
0:56:31.530,0:56:34.680
years and say: "Why don't you have a[br]Facebook. Everybody's got a Facebook."
0:56:34.680,0:56:39.790
No, I've had a blog since 1993![br]laughing
0:56:39.790,0:56:44.170
But no, I'm gonna have to use Facebook,[br]because these days, not using Facebook is
0:56:44.170,0:56:50.780
like not using email. You're cutting off[br]your nose to spite your face. What we
0:56:50.780,0:56:54.830
really do need to be doing, is looking for[br]some form of effective oversight of
0:56:54.830,0:57:00.830
Facebook and particularly of how the[br]algorithms that show you content are
0:57:00.830,0:57:05.340
written. What I was saying earlier about[br]how algorithms are not as transparent as
0:57:05.340,0:57:10.530
human beings are to people, applies hugely[br]to them. And both Facebook and Twitter
0:57:10.530,0:57:15.180
control the information[br]that they display to you.
0:57:15.180,0:57:18.690
Herald: Okay, I'm terribly sorry for all the[br]people queuing at the mics now, we're out
0:57:18.690,0:57:24.960
of time. I also have to apologize: I[br]announced that this talk was being held in
0:57:24.960,0:57:29.690
English, but it was being held in Englisch.[br]the latter pronounced with a hard G
0:57:29.690,0:57:31.470
Thank you very much, Charles Stross!
0:57:31.470,0:57:33.690
CS: Thank you very much for[br]listening to me, it's been a pleasure!
0:57:33.690,0:57:36.500
applause
0:57:36.500,0:57:52.667
postroll music
0:57:52.667,0:57:57.781
subtitles created by c3subtitles.de[br]in the year 2018