OEB 2015 - Opening Plenary - Cory Doctorow
-
0:00 - 0:01(Cory Doctorow) Thank you very much
-
0:01 - 0:05So I'd like to start with something of a
benediction or permission. -
0:05 - 0:08I am one of nature's fast talkers
-
0:08 - 0:11and many of you are not
native English speakers, or -
0:11 - 0:13maybe not accustomed
to my harsh Canadian accent -
0:13 - 0:16in addition I've just come in
from Australia -
0:16 - 0:19and so like many of you I am horribly
jetlagged and have drunk enough coffee -
0:19 - 0:21this morning to kill a rhino.
-
0:22 - 0:24When I used to be at the United Nations
-
0:24 - 0:27I was known as the scourge of the
simultaneous translation corps -
0:27 - 0:30I would stand up and speak
as slowly as I could -
0:30 - 0:32and turn around, and there they
would be in their booths doing this -
0:32 - 0:35(laughter)
When I start to speak too fast, -
0:35 - 0:38this is the universal symbol --
my wife invented it -- -
0:38 - 0:42for "Cory, you are talking too fast".
Please, don't be shy. -
0:42 - 0:46So, I'm a parent, like many of you
and, like I'm sure all of you -
0:46 - 0:49who are parents, parenting kicks my ass
all the time. -
0:50 - 0:55And there are many regrets I have
about the mere seven and a half years -
0:55 - 0:57that I've been a parent
but none are so keenly felt -
0:58 - 1:01as my regrets over what's happened
when I've been wandering -
1:01 - 1:05around the house and seen my
daughter working on something -
1:05 - 1:09that was beyond her abilities, that was
right at the edge of what she could do -
1:09 - 1:13and where she was doing something
that she didn't have competence in yet -
1:13 - 1:17and you know it's that amazing thing
to see that frowning concentration, -
1:17 - 1:20tongue stuck out: as a parent, your
heart swells with pride -
1:20 - 1:21and you can't help but go over
-
1:21 - 1:24and sort of peer over their shoulder
to see what they are doing -
1:24 - 1:27and those of you who are parents know
what happens when you look too closely -
1:28 - 1:30at someone who is working
beyond the edge of their competence. -
1:31 - 1:33They go back to doing something
they're already good at. -
1:33 - 1:35You interrupt a moment
of genuine learning -
1:35 - 1:38and you replace it with
a kind of embarrassment -
1:39 - 1:42about what you're good at
and what you're not. -
1:43 - 1:48So, it matters a lot that our schools are
increasingly surveilled environments, -
1:48 - 1:52environments in which everything that
our kids do is watched and recorded. -
1:53 - 1:56Because when you do that, you interfere
with those moments of real learning. -
1:57 - 2:01Our ability to do things that we are not
good at yet, that we are not proud of yet, -
2:01 - 2:04is negatively impacted
by that kind of scrutiny. -
2:04 - 2:06And that scrutiny comes
from a strange place. -
2:07 - 2:11We have decided that there are
some programmatic means -
by which we can find all the web pages
children shouldn't look at -
2:14 - 2:18and we will filter our networks
to be sure that they don't see them. -
2:19 - 2:22Anyone who has ever paid attention
knows that this doesn't work. -
2:22 - 2:26There are more web pages
that kids shouldn't look at -
2:26 - 2:29than can ever be cataloged,
and any attempt to catalog them -
2:29 - 2:32will always catch pages that kids
should be looking at. -
2:32 - 2:35Any of you who have ever taught
a unit on reproductive health -
2:35 - 2:38know the frustration of trying
to get around a school network filter. -
2:39 - 2:42Now, this is done in the name of
digital protection -
2:43 - 2:46but it flies in the face of digital
literacy and of real learning. -
2:47 - 2:50Because the only way to stop kids
from looking at web pages -
2:50 - 2:52they shouldn't be looking at
-
2:52 - 2:56is to take all of the clicks that they
make, all of the messages that they send, -
2:56 - 3:00all of their online activity
and offshore it to a firm -
3:00 - 3:04that has some nonsensically arrived-at
list of the bad pages. -
3:04 - 3:08And so, what we are doing is that we're
exfiltrating all of our students' data -
3:08 - 3:10to unknown third parties.
-
3:10 - 3:13Now, most of these firms,
their primary business isn't -
3:13 - 3:14serving the education sector.
-
3:14 - 3:16Most of them service
the government sector. -
3:17 - 3:21They primarily service governments in
repressive autocratic regimes. -
3:21 - 3:24They help them ensure that
their citizens aren't looking at -
3:24 - 3:25Amnesty International web pages.
-
3:26 - 3:29They repackage those tools
and sell them to our educators. -
3:30 - 3:33So we are offshoring our children's clicks
to war criminals. -
3:34 - 3:37And what our kids do, now,
is they just get around it, -
3:37 - 3:39because it's not hard to get around it.
-
3:39 - 3:44You know, never underestimate the power
of a kid who is time-rich and cash-poor -
3:44 - 3:46to get around our
technological blockades. -
3:48 - 3:51But when they do this, they don't acquire
the kind of digital literacy -
3:51 - 3:54that we want them to have, they don't
acquire real digital agency -
3:54 - 3:58and moreover, they risk exclusion
and in extreme cases, -
3:58 - 3:59they risk criminal prosecution.
-
4:00 - 4:04So what if instead, those of us who are
trapped in this system of teaching kids -
4:04 - 4:08where we're required to subject them
to this kind of surveillance -
4:08 - 4:10that flies in the face
of their real learning, -
4:10 - 4:13what if instead, we invented
curricular units -
4:13 - 4:16that made them real first class
digital citizens, -
4:16 - 4:20in charge of trying to influence
real digital problems? -
4:20 - 4:23Like what if we said to them:
"We want you to catalog the web pages -
4:23 - 4:25that this vendor lets through
that you shouldn't be seeing. -
4:25 - 4:29We want you to catalog those pages that
you should be seeing, that are blocked. -
4:29 - 4:32We want you to go and interview
every teacher in the school -
4:32 - 4:35about all those lesson plans that were
carefully laid out before lunch -
4:35 - 4:38with a video and a web page,
and over lunch, -
the unaccountable, distant censor
blocked these critical resources -
4:41 - 4:45and left them handing out photocopied
worksheets in the afternoon -
4:45 - 4:47instead of the unit they prepared.
-
4:47 - 4:51We want you to learn how to file Freedom
of Information Act requests -
4:51 - 4:53and find out what your
school authority is spending -
4:53 - 4:56to censor your internet access
and surveil your activity. -
4:56 - 5:00We want you to learn to use the internet
to research these companies -
5:00 - 5:04and we want you to present this
to your parent-teacher association, -
5:04 - 5:07to your school authority,
to your local newspaper." -
5:07 - 5:09Because that's the kind
of digital literacy -
5:09 - 5:11that makes kids into first-class
digital citizens, -
5:11 - 5:16that prepares them for a future
in which they can participate fully -
5:16 - 5:18in a world that's changing.
-
5:19 - 5:23Kids are the beta-testers
of the surveillance state. -
5:23 - 5:27The path of surveillance technology
starts with prisoners, -
5:27 - 5:30moves to asylum seekers,
people in mental institutions -
5:30 - 5:34and then to its first non-incarcerated
population: children -
5:34 - 5:37and then moves to blue-collar workers,
government workers -
5:37 - 5:38and white-collar workers.
-
5:38 - 5:42And so, what we do to kids today
is what we did to prisoners yesterday -
5:42 - 5:44and what we're going to be doing
to you tomorrow. -
5:44 - 5:47And so it matters, what we teach our kids.
-
5:47 - 5:51If you want to see where this goes, this
is a kid named Blake Robbins -
5:51 - 5:55and he attended Lower Merion High School
in Lower Merion, Pennsylvania -
5:55 - 5:56outside of Philadelphia.
-
5:56 - 6:00It's the most affluent public school
district in America, so affluent -
6:00 - 6:03that all the kids were issued MacBooks
at the start of the year -
6:03 - 6:05and they had to do their homework on
their MacBooks, -
6:05 - 6:08and bring them to school every day
and bring them home every night. -
6:08 - 6:11And the MacBooks had been fitted with
Laptop Theft Recovery Software, -
6:11 - 6:16which is a fancy word for a rootkit, that
let the school administration -
6:16 - 6:20covertly operate the cameras
and microphones on these computers -
6:20 - 6:23and harvest files off
of their hard drives -
6:24 - 6:26view all their clicks, and so on.
-
6:26 - 6:31Now Blake Robbins found out
that the software existed -
6:31 - 6:34and how it was being used
because he and the head teacher -
6:34 - 6:37had been knocking heads for years,
since he first got into the school, -
6:37 - 6:40and one day, the head teacher
summoned him to his office -
6:40 - 6:41and said: "Blake, I've got you now."
-
6:42 - 6:45and handed him a print-out of Blake
in his bedroom the night before, -
6:46 - 6:49taking what looked like a pill,
and he said: "You're taking drugs." -
6:49 - 6:54And Blake Robbins said: "That's a candy,
it's a Mike and Ike's candy, I take them -
6:54 - 6:56when I -- I eat them when I'm studying.
-
6:56 - 6:58How did you get a picture
of me in my bedroom?" -
6:59 - 7:03This head teacher had taken
over 6000 photos of Blake Robbins: -
7:03 - 7:06awake and asleep, dressed and undressed,
in the presence of his family. -
7:07 - 7:10And in the ensuing lawsuit, the school
settled for a large amount of money -
7:10 - 7:12and promised that
they wouldn't do it again -
7:13 - 7:16without informing the students
that it was going on. -
7:16 - 7:19And increasingly, the practice is now
-
7:19 - 7:22that school administrations hand out
laptops, because they're getting cheaper, -
7:23 - 7:25with exactly the same kind of software,
-
7:25 - 7:28but they let the students know and
they find that that works even better
7:28 - 7:30at curbing the students' behavior,
-
7:30 - 7:33because the students know that
they're always on camera. -
7:34 - 7:38Now, the surveillance state is moving
from kids to the rest of the world. -
7:38 - 7:40It's metastasizing.
-
7:40 - 7:44Our devices are increasingly designed
to treat us as attackers, -
7:44 - 7:47as suspicious parties
who can't be trusted -
7:47 - 7:51because our devices' job is to do things
that we don't want them to do. -
7:51 - 7:54Now that's not because the vendors
who make our technology -
7:54 - 7:56want to spy on us necessarily,
-
7:56 - 8:01but they want to take
the ink-jet printer business model -
8:01 - 8:04and bring it into every other realm
of the world. -
8:04 - 8:09So the ink-jet printer business model
is where you sell someone a device -
8:09 - 8:12and then you get a continuing
revenue stream from that device -
8:12 - 8:16by making sure that competitors can't make
consumables or parts -
8:16 - 8:19or additional features
or plugins for that device, -
8:19 - 8:22without paying rent
to the original manufacturer. -
8:22 - 8:26And that allows you to maintain
monopoly margins on your devices. -
8:26 - 8:31Now, in 1998, the American government
passed a law called -
8:31 - 8:32the Digital Millennium Copyright Act,
-
8:32 - 8:35in 2001 the European Union
introduced its own version, -
8:35 - 8:37the European Union Copyright Directive.
-
8:38 - 8:40And these two laws, along with laws
all around the world, -
8:40 - 8:46in Australia, Canada and elsewhere,
these laws prohibit removing digital locks -
8:46 - 8:49that are used to restrict
access to copyrighted works -
8:49 - 8:52and they were originally envisioned as a way
of making sure that Europeans didn't -
8:52 - 8:55bring cheap DVDs in from America,
-
8:55 - 8:58or making sure that Australians didn't
import cheap DVDs from China. -
8:58 - 9:04And so you have a digital work, a DVD,
and it has a lock on it and to unlock it, -
9:04 - 9:05you have to buy an authorized player
-
9:05 - 9:07and the player checks to make sure
you are in region -
9:07 - 9:10and making your own player
that doesn't make that check -
9:10 - 9:12is illegal because you'd have
to remove the digital lock. -
9:12 - 9:14And that was the original intent,
-
9:14 - 9:19it was to allow high rents to be
maintained on removable media, -
9:19 - 9:21DVDs and other entertainment content.
-
9:21 - 9:24But it very quickly spread
into new realms. -
9:25 - 9:28So, for example, auto manufacturers
now lock up -
9:28 - 9:31all of their cars' telemetry
with digital locks. -
9:31 - 9:33If you're a mechanic
and want to fix a car, -
9:33 - 9:37you have to get a reader
from the manufacturer -
9:37 - 9:40to make sure that you can
see the telemetry -
9:40 - 9:43and then know what parts to order
and how to fix it. -
9:43 - 9:46And in order to get this reader,
you have to promise the manufacturer -
9:46 - 9:50that you will only buy parts
from that manufacturer -
9:50 - 9:51and not from third parties.
-
9:51 - 9:54So the manufacturers can keep
the repair costs high -
9:54 - 9:57and get a secondary revenue stream
out of the cars. -
9:57 - 10:05This year, the Chrysler Corporation filed
comments with the US Copyright Office, -
10:05 - 10:08to say that they believed that
this was the right way to do it -
10:08 - 10:10and that it should be a felony,
punishable by 5 years in prison -
and a $500,000 fine, -
-
10:12 - 10:16to change the locks on a car that you own,
so that you can choose who fixes it. -
10:17 - 10:20It turned out that when they advertised
-
10:20 - 10:22-- well, where is my slide here?
Oh, there we go -- -
10:22 - 10:25when they advertised that
it wasn't your father's Oldsmobile, -
10:26 - 10:29they weren't speaking metaphorically,
they literally meant -
10:29 - 10:30that even though your father
bought the Oldsmobile, -
10:30 - 10:33it remained their property in perpetuity.
-
10:34 - 10:36And it's not just cars,
it's every kind of device, -
10:36 - 10:39because every kind of device today
has a computer in it. -
10:40 - 10:43The John Deere Company, the world's leading seller of heavy equipment
-
10:43 - 10:45and agricultural equipment technologies,
-
10:46 - 10:49they now view their tractors as
information gathering platforms -
10:49 - 10:51and they view the people who use them
-
10:51 - 10:56as the kind of inconvenient gut flora
of their ecosystem. -
10:56 - 10:58So if you are a farmer
and you own a John Deere tractor, -
10:58 - 11:03when you drive it around your fields,
the torque sensors on the wheels -
11:03 - 11:08conduct a centimeter-accurate soil
density survey of your agricultural land. -
11:08 - 11:11That would be extremely useful to you
when you're planting your seed -
11:11 - 11:13but that data is not available to you
-
unless you remove the digital lock
from your John Deere tractor -
11:15 - 11:18which again, is against the law
everywhere in the world. -
11:18 - 11:20Instead, in order to get that data
-
11:20 - 11:23you have to buy a bundle with seeds
from Monsanto, -
who are John Deere's seed partners. -
-
11:25 - 11:29John Deere then takes this data that they
aggregate across whole regions -
11:29 - 11:31and they use it to gain insight
into regional crop yields -
11:31 - 11:33that they use to play the futures market.
-
11:34 - 11:37John Deere's tractors are really just
a way of gathering information -
11:37 - 11:39and the farmers are secondary to it.
-
11:39 - 11:42Just because you own it
doesn't mean it's yours. -
11:42 - 11:45And it's not just the computers
that we put our bodies into -
11:46 - 11:47that have this business model.
-
11:47 - 11:49It's the computers that we put
inside of our bodies. -
11:50 - 11:51If you're someone who is diabetic
-
11:51 - 11:55and you're fitted with a continuous
glucose-measuring insulin pump, -
11:55 - 11:58that insulin pump is designed
with a digital lock -
11:58 - 12:02that makes sure that your doctor
can only use the manufacturer's software -
12:02 - 12:04to read the data coming off of it
-
12:04 - 12:07and that software is resold
on a rolling annual license -
12:07 - 12:10and it can't be just bought outright.
-
12:10 - 12:12And the digital locks are also
used to make sure -
12:12 - 12:14that you only buy the insulin
that vendors approved -
12:14 - 12:17and not generic insulin
that might be cheaper. -
12:17 - 12:21We've literally turned human beings
into ink-jet printers. -
12:21 - 12:28Now, this has really deep implications
beyond the economic implications. -
12:28 - 12:31Because the rules that prohibit
breaking these digital locks -
12:31 - 12:35also prohibit telling people
about flaws that programmers made -
12:35 - 12:37because if you know about a flaw
that a programmer made, -
12:38 - 12:40you can use it to break the digital lock.
-
12:40 - 12:44And that means that the errors,
the vulnerabilities, -
12:44 - 12:50the mistakes in our devices, they fester
in them, they go on and on and on -
12:50 - 12:54and our devices become these long-lived
reservoirs of digital pathogens. -
12:54 - 12:56And we've seen how that plays out.
-
12:56 - 12:59One of the reasons that Volkswagen
was able to get away -
12:59 - 13:01with their Diesel cheating for so long
-
13:01 - 13:04is because no one could independently
audit their firmware. -
13:05 - 13:07It's happening all over the place.
-
13:08 - 13:11You may have seen --
you may have seen this summer -
13:11 - 13:15that Chrysler had to recall
1.4 million Jeeps -
13:15 - 13:19because it turned out that they could be
remotely controlled over the internet -
13:19 - 13:22while driving down a motorway
and have their brakes and steering -
13:22 - 13:26commandeered by anyone, anywhere
in the world, over the internet. -
13:27 - 13:31We only have one methodology
for determining whether security works -
13:31 - 13:34and that's to subject it
to public scrutiny, -
13:34 - 13:38to allow for other people to see
what assumptions you've made. -
13:38 - 13:40Anyone can design a security system
-
that he himself can't think
of a way of breaking, -
13:43 - 13:45but all that means is that you've
designed a security system -
13:45 - 13:47that works against people
who are stupider than you. -
13:48 - 13:50And in this regard, security
is no different -
13:50 - 13:52from any other kind of knowledge creation.
-
13:52 - 13:55You know, before we had
contemporary science and scholarship, -
13:55 - 13:58we had something that looked
a lot like it, called alchemy. -
13:58 - 14:02And for 500 years, alchemists kept
what they thought they knew a secret. -
14:02 - 14:06And that meant that every alchemist
was capable of falling prey -
14:06 - 14:13to that most urgent of human frailties,
which is our ability to fool ourselves. -
14:13 - 14:16And so, every alchemist discovered
for himself in the hardest way possible -
14:16 - 14:19that drinking mercury was a bad idea.
-
14:19 - 14:22We call that 500-year period the Dark Ages
-
14:22 - 14:25and we call the moment at which
they started publishing -
14:25 - 14:28and subjecting themselves
to adversarial peer review, -
14:28 - 14:31which is when your friends tell you
about the mistakes that you've made -
14:31 - 14:33and your enemies call you an idiot
for having made them, -
14:33 - 14:36we call that moment the Enlightenment.
-
14:37 - 14:40Now, this has profound implications.
-
14:40 - 14:46The restriction of our ability to alter
the security of our devices -
14:46 - 14:49matters for our surveillance society, -
-
14:49 - 14:52for our ability to be free people
in society. -
14:52 - 14:56At the height of the GDR, in 1989,
-
14:56 - 15:01the STASI had one snitch for every
60 people in East Germany, -
15:01 - 15:03in order to surveil the entire country.
-
15:04 - 15:07A couple of decades later, we found out
through Edward Snowden -
15:07 - 15:09that the NSA was spying
on everybody in the world. -
15:10 - 15:13And the ratio of people who work
at the NSA to people they are spying on -
15:13 - 15:15is more like 1 in 10,000.
-
15:15 - 15:18They've achieved a two and a half
order of magnitude -
15:18 - 15:20productivity gain in surveillance.
-
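As a quick arithmetic check on that figure, a minimal sketch in Python using only the two ratios quoted above (the computation gives about 2.2 orders of magnitude; the rounding up to "two and a half" is the talk's):

    # Watchers-to-watched ratios quoted in the talk.
    from math import log10

    stasi_ratio = 60      # one snitch per 60 East Germans (1989)
    nsa_ratio = 10_000    # roughly one NSA worker per 10,000 people

    gain = nsa_ratio / stasi_ratio  # ~167x more coverage per watcher
    print("gain: ~%.0fx = %.1f orders of magnitude" % (gain, log10(gain)))
    # -> gain: ~167x = 2.2 orders of magnitude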
15:20 - 15:23And the way that they got there
is in part by the fact that -
15:23 - 15:26we use devices that
we're not allowed to alter, -
15:26 - 15:28that are designed to treat us as attackers
-
15:28 - 15:31and that gather an enormous
amount of information on us. -
15:32 - 15:34If the government told you that you're
required to carry around -
15:34 - 15:38a small electronic rectangle that
recorded all of your social relationships, -
15:38 - 15:39all of your movements,
-
15:39 - 15:42all of your transient thoughts that
you made known or ever looked up, -
15:43 - 15:46and would make that
available to the state, -
15:46 - 15:48and you would have to pay for it,
you would revolt. -
15:49 - 15:52But the phone companies have
managed to convince us, -
15:52 - 15:54along with the mobile vendors,
-
15:54 - 15:57that we should foot the bill
for our own surveillance. -
15:57 - 15:59It's a bit like during
the Cultural Revolution, -
15:59 - 16:01where, after your family members
were executed, -
16:01 - 16:03they sent you a bill for the bullet.
-
16:05 - 16:11So, this has big implications, as I said,
for where we go as a society. -
16:11 - 16:15Because just as our kids have
a hard time functioning -
16:15 - 16:17in the presence of surveillance,
and learning, -
16:17 - 16:19and advancing their own knowledge,
-
16:19 - 16:21we as a society have a hard time
progressing -
16:21 - 16:23in the presence of surveillance.
-
16:23 - 16:29In our own living memory, people who are
today thought of as normal and right -
16:29 - 16:31were doing something that
a generation ago -
16:31 - 16:34would have been illegal
and landed them in jail. -
16:34 - 16:35For example, you probably know someone
-
16:35 - 16:38who's married to a partner
of the same sex. -
16:38 - 16:41If you live in America, you may know
someone who takes medical marijuana, -
16:41 - 16:43or if you live in the Netherlands.
-
16:43 - 16:48And not that long ago, people
who undertook these activities -
16:48 - 16:49could have gone to jail for them,
-
16:49 - 16:52could have faced enormous
social exclusion for them. -
16:52 - 16:56The way that we got from there to here
was by having a private zone, -
16:56 - 16:58a place where people weren't surveilled,
-
16:58 - 17:01in which they could advance
their interests and ideas, -
17:01 - 17:04do things that were thought of as
socially unacceptable -
17:04 - 17:06and slowly change our social attitudes.
-
17:07 - 17:09And unless you think
that in 50 years, -
17:09 - 17:13your grandchildren will sit around
the Christmas table, in 2065, and say: -
17:13 - 17:15"How was it, grandma,
how was it, grandpa, -
17:15 - 17:17that in 2015, you got it all right,
-
17:17 - 17:20and we haven't had
any social changes since then?" -
17:20 - 17:22Then you have to ask yourself
how, in a world -
17:22 - 17:25in which we are all
under continuous surveillance, -
17:25 - 17:27we are going to find a way
to improve this. -
17:28 - 17:30So, our kids need ICT literacy,
-
17:30 - 17:35but ICT literacy isn't just typing skills
or learning how to use PowerPoint. -
17:35 - 17:37It's learning how to think critically
-
17:37 - 17:40about how they relate
to the means of information, -
17:40 - 17:43about whether they are its masters
or servants. -
17:44 - 17:47Our networks are not
the most important issue that we have. -
17:47 - 17:52There are much more important issues
in society and in the world today. -
17:52 - 17:55The future of the internet is
way less important -
17:55 - 17:58than the future of our climate,
the future of gender equity, -
17:58 - 18:00the future of racial equity,
-
18:00 - 18:03the future of the wage gap
and the wealth gap in the world, -
18:03 - 18:07but every one of those fights is going
to be fought and won or lost -
18:07 - 18:11on the internet:
it's our most foundational fight. -
18:11 - 18:15So, computers
can make us more free -
18:16 - 18:18or they can take away our freedom.
-
18:18 - 18:21It all comes down to how we regulate them
and how we use them. -
18:21 - 18:25And it's our job, as people who are
training the next generation, -
18:25 - 18:29and whose next generation
is beta-testing -
18:29 - 18:31the surveillance technology
that will be coming to us, -
18:31 - 18:35it's our job to teach them to seize
the means of information, -
18:35 - 18:39to make themselves self-determining
in the way that they use their networks -
18:39 - 18:43and to find ways to show them
how to be critical and how to be smart -
18:43 - 18:45and how to be, above all, subversive
-
18:45 - 18:48and how to use the technology around them.
Thank you. -
18:49 - 18:57(Applause)
-
18:57 - 18:59(Moderator) Cory, thank you very much
indeed. -
18:59 - 19:00(Doctorow) Thank you
(Moderator) And I've got a bundle of -
19:00 - 19:05points which you've stimulated
from many in the audience, which sent -- -
19:05 - 19:07(Doctorow) I'm shocked to hear that
that was at all controversial, -
19:07 - 19:08but go on.
(Moderator) I didn't say "controversial", -
19:08 - 19:11you stimulated thinking, which is great.
(Doctorow laughs) -
19:11 - 19:16But a lot of them resonate around
violation of secrecy and security. -
19:17 - 19:21And this, for example,
from Annika Burgess -
19:21 - 19:23"Is there a way for students
to protect themselves -
19:23 - 19:27from privacy violations by institutions
they are supposed to trust." -
19:27 - 19:30I think this is probably a question
for Ian Goldin as well, as -
19:30 - 19:34someone who is a senior figure
in a major university, but -
19:34 - 19:37this issue of privacy violations and trust.
-
19:37 - 19:40(Doctorow) Well, I think that computers
have a curious dual nature. -
19:40 - 19:44So on the one hand, they do expose us
to an enormous amount of scrutiny, -
19:44 - 19:45depending on how they are configured.
-
19:46 - 19:49But on the other hand, computers
have brought new powers to us -
19:49 - 19:52that are literally new
on the face of the world, right? -
19:52 - 19:56We have never had a reality in which
normal people could have secrets -
19:56 - 19:58from powerful people.
-
19:58 - 20:01But with the computer in your pocket,
with that, [shows a smartphone] -
20:01 - 20:03you can encrypt a message so thoroughly
-
20:03 - 20:06that if every hydrogen atom in the
universe were turned into a computer -
20:06 - 20:09and it did nothing until
the heat death of the universe, -
20:09 - 20:10but try to guess what your key was,
-
20:10 - 20:13we would run out of universe
before we ran out of possible keys. -
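A back-of-the-envelope sketch of that claim, in Python. All the numbers here are generous round-figure assumptions, not from the talk: roughly 10^80 hydrogen atoms in the observable universe, roughly 10^100 years to heat death, a billion guesses per second per atom, and a hypothetical 4096-bit key:

    # Sketch: can a universe of hydrogen-atom computers brute-force a key?
    ATOMS = 10**80                 # assumed atoms in observable universe
    SECONDS = 3 * 10**107          # ~10^100 years until heat death
    GUESSES_PER_SECOND = 10**9     # per atom-computer, assumed

    total_guesses = ATOMS * SECONDS * GUESSES_PER_SECOND  # ~3 x 10^196
    keyspace = 2**4096                                    # ~10^1233 keys

    # Orders of magnitude via digit counts (avoids float overflow).
    print("guesses available: ~10^%d" % (len(str(total_guesses)) - 1))
    print("keys to try:       ~10^%d" % (len(str(keyspace)) - 1))
    print("universe runs out first:", total_guesses < keyspace)  # True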
20:14 - 20:16So, computers do give us
the power to have secrets. -
20:17 - 20:21The problem is that institutions
prohibit the use of technology -
20:21 - 20:24that allows you to take back
your own privacy. -
20:24 - 20:26It's funny, right? Because we take kids
and we say to them: -
20:26 - 20:29"Your privacy is like your virginity:
-
20:29 - 20:31once you've lost it,
you'll never get it back. -
20:32 - 20:34Watch out what you're
putting on Facebook" -
20:34 - 20:36-- and I think they should watch
what they're putting on Facebook, -
20:36 - 20:40I'm a Facebook vegan, I don't even --
I don't use it, but we say: -
20:40 - 20:42"Watch what you're putting on Facebook,
-
20:42 - 20:45don't send out dirty pictures
of yourself on SnapChat." -
20:45 - 20:46All good advice.
-
20:46 - 20:50But we do it while we are taking away
all the private information -
20:50 - 20:54that they have, all of their privacy
and all of their agency. -
20:54 - 20:56You know, if a parent says to a kid:
-
20:56 - 20:58"You mustn't smoke
because you'll get sick" -
20:58 - 21:00and the parent says it
while lighting a new cigarette -
21:00 - 21:03off the one that she's just put down
in the ashtray, -
21:03 - 21:06the kid knows that what you're doing
matters more than what you're saying. -
21:06 - 21:07(Moderator) The point is deficit of trust.
-
21:07 - 21:09It builds on the kind of work that
David has been doing as well, -
21:09 - 21:12this deficit of trust and privacy.
-
21:12 - 21:13And there is another point here:
-
21:13 - 21:15"Is the battle for privacy already lost?
-
21:15 - 21:19Are we already too comfortable
with giving away our data?" -
21:19 - 21:20(Doctorow) No, I don't think so at all.
-
21:20 - 21:23In fact, I think that if anything,
we've reached -
21:23 - 21:25peak indifference to surveillance, right?
-
21:25 - 21:29Surveillance is not over,
not by a long shot. -
21:29 - 21:31There will be more surveillance
before there is less. -
21:32 - 21:34But there'll never be fewer people
who care about surveillance -
21:34 - 21:36than there are today,
-
21:36 - 21:39because, as privacy advocates,
we spectacularly failed, -
21:39 - 21:42over the last 20 years, to get people
to take privacy seriously -
21:43 - 21:46and now we have firms that are
frankly incompetent, -
21:46 - 21:50retaining huge amounts of our personally
identifying sensitive information -
21:50 - 21:53and those firms are leaking
that information at speed. -
21:53 - 21:58So, it started this year with things like
(phone rings) -- is that me? that's me! -- -
21:58 - 22:01It started this year with things like
Ashley Madison -
22:01 - 22:02(Moderator) Do you want to take it?
-
22:02 - 22:05(Doctorow) No no, that was my timer going
off telling me I've had my 22 minutes -
22:05 - 22:06(Moderator) It may be someone who
doesn't trust my.... -
22:06 - 22:08(Doctorow) No, that was --
that was my 22 minutes. -
22:08 - 22:14So, it started with Ashley Madison,
the Office of Personnel Management, -
22:14 - 22:18everyone who ever applied for security
clearance in America -
22:18 - 22:21had all of their sensitive information,
everything you had to give them -
22:21 - 22:23about why you shouldn't have
security clearance, -
22:23 - 22:25everything that's potentially
compromising about you, -
22:25 - 22:29all of it exfiltrated by what's believed
to have been a Chinese spy ring. -
22:30 - 22:32Something like
one in twenty Americans now, -
22:32 - 22:35have had their data captured and
exfiltrated from the United States. -
22:36 - 22:40This week, VTech, the largest electronic
toy manufacturer in the world, -
22:40 - 22:44leaked the personal information of
at least five million children, -
22:44 - 22:47including potentially photos that
they took with their electronic toys, -
22:47 - 22:52as well as their parents' information,
their home addresses, their passwords, -
22:52 - 22:54their parents' passwords
and their password hints. -
22:54 - 22:57So every couple of weeks,
from now on, -
22:57 - 23:01a couple of million people
are going to show up -
23:01 - 23:01at the door of people who
care about privacy and say: -
23:01 - 23:03"You were right all along,
what do we do?" -
23:04 - 23:06And the challenge is to give them
something useful -
23:06 - 23:08they can do about privacy.
-
23:08 - 23:09And there are steps that
you can take personally. -
23:10 - 23:13If you go to the Surveillance Self-Defense kit
-
23:13 - 23:15at the Electronic Frontier Foundation's
website, -
23:15 - 23:17you'll find a set of tools you can use.
-
23:17 - 23:20But ultimately, it's not an individual
choice, it's a social one. -
23:20 - 23:24Privacy is a team sport, because if you
keep your information private -
23:24 - 23:27and you send it to someone else
who is in your social network, -
23:27 - 23:30who doesn't keep it private, well, then
it leaks out their back door. -
23:30 - 23:35And so we need social movements
to improve our privacy. -
23:35 - 23:37We also need better tools.
-
23:37 - 23:41It's really clear that our privacy tools
have fallen short of the mark -
23:41 - 23:44in terms of being accessible
to normal people. -
23:44 - 23:45(Moderator) Cory --
(Doctorow) I sense you want -
23:45 - 23:46to say something: go on
(Moderator) Yes. -
23:46 - 23:49Yes, but what I want to do is keep pushing
this down the route of education as well. -
23:49 - 23:53There are two or three questions here
which are very specific about that. -
23:54 - 23:56Could you tell us more about
how we help students -
23:56 - 23:57become digital citizens?
-
23:58 - 24:02It links in important ways to the
German and Nordic pre-digital concept -
24:02 - 24:04of Bildung -
-- that's from Gertrud Fullend -- -
24:04 - 24:08and if you could give your daughter
just one piece of advice for living -
24:08 - 24:11in an age of digital surveillance,
what would it be? -
24:12 - 24:15(Doctorow) So, the one piece of advice
I would give her is -
24:15 - 24:18"Don't settle for anything less
than total control -
24:18 - 24:19of the means of information.
-
24:20 - 24:23Anytime someone tells you that
the computer says you must do something, -
24:23 - 24:27you have to ask yourself why the computer
isn't doing what you want it to do. -
24:27 - 24:29Who is in charge here?"
-
24:30 - 24:31-- Thank you --
-
24:31 - 24:35In terms of how we improve
digital citizenship, -
24:35 - 24:38I think we are at odds with
our institutions in large part here -
24:38 - 24:42and I think that our institutions
have decreed that privacy -
24:42 - 24:44is a luxury we can't afford
to give our students -
24:44 - 24:46because if we do, they might do
something we wouldn't like. -
24:46 - 24:51And so, as teachers, the only thing that
we can do without risking our jobs, -
24:51 - 24:54without risking our students' education
and their ability to stay in school -
24:54 - 24:57is to teach them how to do judo,
to teach them -
24:57 - 25:01how to ask critical questions,
to gather data, -
25:01 - 25:03to demand evidence-based policy.
-
25:03 - 25:05And ultimately, I actually think
that's better. -
25:05 - 25:07I think that teaching kids
how to evade firewalls -
25:07 - 25:10is way less interesting than
teaching them how to identify -
25:10 - 25:12who is providing that firewall,
-
25:12 - 25:15who made the decision
to buy that firewall, -
25:15 - 25:16how much they're spending,
-
25:16 - 25:19and what the process is
for getting that person fired. -
25:19 - 25:22That's a much more interesting
piece of digital citizenship -
25:23 - 25:24than any piece of firewall hacking.
-
25:24 - 25:26(Moderator) But Cory, you're telling
kids to get hold of their information, -
25:26 - 25:29but picking up on your analogy
with the auto -
25:29 - 25:33where there's very clear lock-in,
so much which has to be done -
25:33 - 25:37has to be done through the manufacturers
-- Martin Siebel picking up --
25:37 - 25:40How busy is the hacker community
with breaking digital locks? -
25:40 - 25:43In other words, digital locks
are being put on so much, -
25:44 - 25:47can the next generation, can the kids
work out where these locks are, -
25:47 - 25:49to keep control of their information?
-
25:49 - 25:52(Doctorow) So, getting the digital locks
off of our devices -
25:52 - 25:56is not just a matter of technology,
it's also a matter of policy. -
25:56 - 25:59Laws like the EUCD and new treaty
instruments like TTIP -
25:59 - 26:01and the Trans-Pacific Partnership
-
26:01 - 26:05specify that these digital locks
can't be removed legally, -
26:05 - 26:09that firms can use them to extract
monopoly ransom, to harm our security. -
26:09 - 26:13And these treaties will make it impossible
to remove these locks legally. -
26:13 - 26:16It's not enough for there to be
a demi-monde, in which -
26:16 - 26:19people who are technologically savvy
know how to break their iPhones. -
26:19 - 26:23The problem with that is that it produces
this world in which you can't know -
26:23 - 26:28whether the software that you're using
has anything bad lurking in it, right? -
26:28 - 26:33To say to people who have iPhones that
they want reconfigured -
26:33 - 26:35to accept software from parties that
aren't approved by Apple, -
26:35 - 26:39"well, all you need to do is find a random
piece of software on the internet -
26:39 - 26:42that claims it will allow you to do it,
put it on your phone, -
26:42 - 26:45and then run this software on it"
is like saying to people: -
26:45 - 26:48"Well, all you need to do is just
find some heroin lying around -
26:48 - 26:51and put it in your arm: I'm sure
it'll work out fine." Right? -
26:52 - 26:56The ways in which a phone that is
compromised can harm you -
26:56 - 26:58are really without limits, right?
-
26:58 - 27:01If you think about this, this is a
rectangle with a camera and a microphone -
27:01 - 27:03that you take into the bedroom
and the toilet. -
27:03 - 27:06and the only way you know whether
that camera or microphone are on or off -
27:06 - 27:10is if the software is doing
what it says it's doing, right? -
27:10 - 27:13So, I think that we have this policy
dimension that's way more important -
27:13 - 27:15than the technical dimension,
-
27:15 - 27:17and that the only way we're going to
change that policy -
27:17 - 27:20is by making kids know what they can't do
-
27:20 - 27:23and then making them know
why they're not allowed to do it, -
27:23 - 27:26and then set them loose.
(Moderator) Two final points then, -
27:26 - 27:28building on that: Ian ...
-
27:28 - 27:30but what do you think of recent laws,
say, in the Netherlands, -
27:30 - 27:34allowing police to spy on criminals
and terrorists with webcams -
27:34 - 27:36on their own laptops?
-
27:36 - 27:39In other words, the extension implicit
in this question is, -
27:39 - 27:43what about keeping an eye on others,
including the next generation coming up, -
27:43 - 27:46and this from Peter Andreesen (check),
which is slightly connected: -
27:46 - 27:49But when the predominant management
tool in our business is still based -
27:49 - 27:55on the assumption of control, how do we
change that psychology and that mindset? -
27:55 - 27:58And David is nodding at those views,
you can see it at the back. -
27:59 - 28:02(Doctorow) You know, there's not an
operating system or suite of applications -
28:02 - 28:06that bad guys use, and another set
that the rest of us use. -
28:06 - 28:09And so, when our security services
discover vulnerabilities -
28:09 - 28:13in the tools that we rely on, and then
hoard those vulnerabilities, -
28:13 - 28:16so that they can weaponize them
to attack criminals or terrorists -
28:16 - 28:18or whoever they're after,
-
28:19 - 28:23they ensure that those vulnerabilities
remain unpatched on our devices too. -
28:23 - 28:25So that the people who are
our adversaries, -
28:25 - 28:26the criminals who want to attack us,
-
28:26 - 28:30the spies who want to exfiltrate
our corporate secrets to their countries, -
28:31 - 28:36the griefers, the autocratic regimes
who use those vulnerabilities -
28:36 - 28:37to spy on their population --
-
28:37 - 28:40at Electronic Frontier Foundation,
we have a client from Ethiopia, -
28:40 - 28:42Mr. Kidane, who is a dissident journalist
-
28:42 - 28:46-- Ethiopia imprisons more journalists
than any other country in the world -- -
28:46 - 28:50he fled to America, where the Ethiopian
government hacked his computer -
28:50 - 28:53with a bug in Skype that they had
discovered and not reported. -
28:53 - 28:56They'd bought the tools to do this from
Italy, from a company called Hacking Team, -
28:56 - 29:00they hacked his computer, they found out
who his colleagues were in Ethiopia -
29:00 - 29:02and they arrested them and
subjected them to torture. -
29:03 - 29:07So, when our security services take a bug
and weaponize it -
29:07 - 29:11so they can attack their adversaries,
they leave that bug intact -
29:11 - 29:13so that our adversaries can attack us.
-
29:13 - 29:16You know, this is -- I'm giving a talk
later today, -
29:16 - 29:20no matter what side you're on in the war
on general-purpose computing, -
29:20 - 29:21you're losing,
-
29:21 - 29:24because all we have right now
is attack and not defense. -
29:24 - 29:28Our security services are so concerned
with making sure that their job is easy -
29:28 - 29:31when they attack their adversaries,
that they are neglecting the fact -
29:31 - 29:34that they are making their adversaries'
job easy in attacking us. -
29:34 - 29:36If this were a football match,
-
29:36 - 29:39the score would be tied
400 to 400 after 10 minutes. -
29:39 - 29:41(Moderator) Cory,
I've got to stop you there. -
29:41 - 29:43Thank you very much indeed.
(Doctorow) Thank you very much. -
29:43 - 29:46(Applause)
- Title:
- OEB 2015 - Opening Plenary - Cory Doctorow
- Description:
- Cory Doctorow - Writer, Blogger, Activist - USA
The Opening Plenary session of OEB 2015 looked at the challenges of modernity and identified how people, organisations, institutions and societies can make technology and knowledge work together to accelerate the shift to a new age of opportunity.
More info: http://bit.ly/1lugQWX
- Video Language:
- English
- Duration:
- 29:46