(Cory Doctorow) Thank you very much
So I'd like to start with something of a
benediction or permission.
I am one of nature's fast talkers
and many of you are not
native English speakers, or
maybe not accustomed
to my harsh Canadian accent
in addition I've just come in
from Australia
and so like many of you I am horribly
jetlagged and have drunk enough coffee
this morning to kill a rhino.
When I used to be at the United Nations
I was known as the scourge of the
simultaneous translation corps
I would stand up and speak
as slowly as I could
and turn around, and there they
would be in their booths doing this
(laughter)
When I start to speak too fast,
this is the universal symbol --
my wife invented it --
for "Cory, you are talking too fast".
Please, don't be shy.
So, I'm a parent, like many of you,
and like, I'm sure, all of you
who are parents, parenting kicks my ass
all the time.
And there are many regrets I have
about the mere seven and half years
that I've been a parent
but none are so keenly felt
as my regrets over what's happened
when I've been wandering
around the house and seen my
daughter working on something
that was beyond her abilities, that was
right at the edge of what she could do
and where she was doing something
that she didn't have competence in yet
and you know it's that amazing thing
to see that frowning concentration,
tongue stuck out: as a parent, your
heart swells with pride
and you can't help but go over
and sort of peer over their shoulder
to see what they are doing
and those of you who are parents know
what happens when you look too closely
at someone who is working
beyond the edge of their competence.
They go back to doing something
they're already good at.
You interrupt a moment
of genuine learning
and you replace it with
a kind of embarrassment
about what you're good at
and what you're not.
So, it matters a lot that our schools are
increasingly surveilled environments,
environments in which everything that
our kids do is watched and recorded.
Because when you do that, you interfere
with those moments of real learning.
Our ability to do things that we are not
good at yet, that we are not proud of yet,
is negatively impacted
by that kind of scrutiny.
And that scrutiny comes
from a strange place.
We have decided that there are
some programmatic means
by which we can find all the web pages
children shouldn't look at
and we will filter our networks
to be sure that they don't see them.
Anyone who has ever paid attention
knows that this doesn't work.
There are more web pages
that kids shouldn't look at
than can ever be cataloged,
and any attempt to catalog them
will always catch pages that kids
should be looking at.
Any of you who have ever taught
a unit on reproductive health
know the frustration of trying
to get around a school network's filters.
Now, this is done in the name of
digital protection
but it flies in the face of digital
literacy and of real learning.
Because the only way to stop kids
from looking at web pages
they shouldn't be looking at
is to take all of the clicks that they
make, all of the messages that they send,
all of their online activity
and offshore it to a firm
that has some nonsensically arrived-at
list of the bad pages.
And so, what we are doing is
exfiltrating all of our students' data
to unknown third parties.
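To make the mechanics concrete, here is a minimal sketch, with hypothetical URLs and category names, of what such a filter does on every click; nothing here is any vendor's actual product:

```python
# Toy sketch of a category-based school filter: every lookup sends the
# student's click to the classification service and applies an opaque
# list of "bad" categories that nobody outside the vendor can audit.

BLOCKED_CATEGORIES = {"adult", "drugs", "proxy-avoidance"}

# Stand-in for the vendor's remote catalog; in a real deployment this
# lookup itself ships the click off to the third party.
TOY_CATALOG = {
    "example.com/reproductive-health": "adult",   # over-blocking in action
    "example.com/lesson-video": "education",
}

def classify(url: str) -> str:
    return TOY_CATALOG.get(url, "uncategorized")

def allow(url: str) -> bool:
    return classify(url) not in BLOCKED_CATEGORIES

print(allow("example.com/reproductive-health"))  # False: a lesson blocked
```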
Now, most of these firms,
their primary business isn't
serving the education sector.
Most of them serve
the government sector,
primarily governments in
repressive, autocratic regimes.
They help them ensure that
their citizens aren't looking at
Amnesty International web pages.
They repackage those tools
and sell them to our educators.
So we are offshoring our children's clicks
to war criminals.
And what our kids do, now,
is they just get around it,
because it's not hard to get around it.
You know, never underestimate the power
of a kid who is time-rich and cash-poor
to get around our
technological blockades.
But when they do this, they don't acquire
the kind of digital literacy
that we want them to have, they don't
acquire real digital agency
and moreover, they risk exclusion
and in extreme cases,
they risk criminal prosecution.
So what if instead, those of us who are
trapped in this system of teaching kids
where we're required to subject them
to this kind of surveillance
that flies in the face
of their real learning,
what if instead, we invented
curricular units
that made them real first-class
digital citizens,
in charge of trying to influence
real digital problems?
Like what if we said to them:
"We want you to catalog the web pages
that this vendor lets through
that you shouldn't be seeing.
We want you to catalog those pages that
you should be seeing, that are blocked.
We want you to go and interview
every teacher in the school
about all those lesson plans that were
carefully laid out before lunch
with a video and a web page,
and over lunch,
the unaccountable distant censor
blocked these critical resources
and left them handing out photocopied
worksheets in the afternoon
instead of the unit they prepared.
We want you to learn how to file Freedom
of Information Act requests
and find out what your
school authority is spending
to censor your internet access
and surveil your activity.
We want you to learn to use the internet
to research these companies
and we want you to present this
to your parent-teacher association,
to your school authority,
to your local newspaper."
Because that's the kind
of digital literacy
that makes kids into first-class
digital citizens,
that prepares them for a future
in which they can participate fully
in a world that's changing.
Kids are the beta-testers
of the surveillance state.
The path of surveillance technology
starts with prisoners,
moves to asylum seekers,
people in mental institutions
and then to its first non-incarcerated
population: children
and then moves to blue-collar workers,
government workers
and white-collar workers.
And so, what we do to kids today
is what we did to prisoners yesterday
and what we're going to be doing
to you tomorrow.
And so it matters, what we teach our kids.
If you want to see where this goes, this
is a kid named Blake Robbins
and he attended Lower Merion High School
in Lower Merion, Pennsylvania,
outside of Philadelphia.
It's the most affluent public school
district in America, so affluent
that all the kids were issued Macbooks
at the start of the year
and they had to do their homework on
their Macbooks,
and bring them to school every day
and bring them home every night.
And the Macbooks had been fitted with
Laptop Theft Recovery Software,
which is a fancy word for a rootkit, that
let the school administration
covertly operate the cameras
and microphones on these computers
and harvest files off
of their hard drives,
view all their clicks, and so on.
Now Blake Robbins found out
that the software existed
and how it was being used
because he and the head teacher
had been knocking heads for years,
since he first got into the school,
and one day, the head teacher
summoned him to his office
and said: "Blake, I've got you now."
and handed him a print-out of Blake
in his bedroom the night before,
taking what looked like a pill,
and said: "You're taking drugs."
And Blake Robbins said: "That's a candy,
it's a Mike and Ike candy, I take them --
I eat them when I'm studying.
How did you get a picture
of me in my bedroom?"
This head teacher had taken
over 6000 photos of Blake Robbins:
awake and asleep, dressed and undressed,
in the presence of his family.
And in the ensuing lawsuit, the school
settled for a large amount of money
and promised that
they wouldn't do it again
without informing the students
that it was going on.
And increasingly, the practice is now
that school administrations hand out
laptops, because they're getting cheaper,
with exactly the same kind of software,
but they let the students know, and they
find that that works even better
at curbing the students' behavior,
because the students know that
they're always on camera.
Now, the surveillance state is moving
from kids to the rest of the world.
It's metastasizing.
Our devices are increasingly designed
to treat us as attackers,
as suspicious parties
who can't be trusted
because our devices' job is to do things
that we don't want them to do.
Now that's not because the vendors
who make our technology
want to spy on us necessarily,
but they want to take
the ink-jet printer business model
and bring it into every other realm
of the world.
So the ink-jet printer business model
is where you sell someone a device
and then you get a continuing
revenue stream from that device
by making sure that competitors can't make
consumables or parts
or additional features
or plugins for that device,
without paying rent
to the original manufacturer.
And that allows you to maintain
monopoly margins on your devices.
Now, in 1998, the American government
passed a law called
the Digital Millennium Copyright Act,
in 2001 the European Union
introduced its own version,
the European Union Copyright Directive.
And these two laws, along with laws
all around the world,
in Australia, Canada and elsewhere,
prohibit removing the digital locks
that are used to restrict
access to copyrighted works
and they were originally envisioned as a way
of making sure that Europeans didn't
bring cheap DVDs in from America,
or making sure that Australians didn't
import cheap DVDs from China.
And so you have a digital work, a DVD,
and it has a lock on it and to unlock it,
you have to buy an authorized player
and the player checks to make sure
you are in the right region
and making your own player
that doesn't make that check
is illegal because you'd have
to remove the digital lock.
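A minimal sketch of what that check amounts to (the region numbers follow the real DVD conventions, but the code is purely illustrative, not any player's firmware):

```python
# Illustrative region check: an "authorized player" refuses discs sold
# for other markets. Building a player that skips this check means
# removing the digital lock, which the DMCA and EUCD prohibit.
PLAYER_REGION = 2  # e.g. a player sold in Europe

def can_play(disc_region: int) -> bool:
    # Region 0 discs are sold as region-free; anything else must match.
    return disc_region in (0, PLAYER_REGION)

print(can_play(1))  # a Region 1 (North America) disc: False
```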
And that was the original intent,
it was to allow high prices to be
maintained on removable media,
DVDs and other entertainment content.
But it very quickly spread
into new realms.
So, for example, auto manufacturers now
lock up all of their cars' telemetry
with digital locks.
If you're a mechanic
and want to fix a car,
you have to get a reader
from the manufacturer
to make sure that you can
see the telemetry
and know what parts to order
and how to fix it.
And in order to get this reader,
you have to promise the manufacturer
that you will only buy parts
from that manufacturer
and not from third parties.
So the manufacturers can keep
the repair costs high
and get a secondary revenue stream
out of the cars.
This year, the Chrysler corporation filed
comments with the US Copyright Office,
to say that they believed that
this was the right way to do it
and that it should be a felony,
punishable by 5 years in prison
and a $500'000 fine,
to change the locks on a car that you own,
so that you can choose who fixes it.
It turned out that when they advertised
-- well, where is my slide here?
Oh, there we go --
when they advertised that
it wasn't your father's Oldsmobile,
they weren't speaking metaphorically,
they really meant
that even though your father
bought the Oldsmobile,
it remained their property in perpetuity.
And it's not just cars,
it's every kind of device,
because every kind of device today
has a computer in it.
The John Deere Company, the world's leading seller of heavy equipment
and agricultural equipment technologies,
they now view their tractors as
information gathering platforms
and they view the people who use them
as the kind of inconvenient gut flora
of their ecosystem.
So if you are a farmer
and you own a John Deere tractor,
when you drive it around your fields,
the torque sensors on the wheels
conduct a centimeter-accurate soil
density survey of your agricultural land.
That would be extremely useful to you
when you're planting your seeds
but that data is not available to you
unless you remove the digital lock
from your John Deere tractor
which again, is against the law
everywhere in the world.
Instead, in order to get that data
you have to buy a bundle with seeds
from Monsanto,
who are John Deere's seed partners.
John Deere then takes this data that they
aggregate across whole regions
and they use it to gain insight
into regional crop yields
that they use to play the futures market.
John Deere's tractors are really just
a way of gathering information
and the farmers are secondary to it.
Just because you own it
doesn't mean it's yours.
And it's not just the computers
that we put our bodies into
that have this business model.
It's the computers that we put
inside of our bodies.
If you're someone who is diabetic
and you're fitted with a continuous
glucose-measuring insulin pump,
that insulin pump is designed
with a digital lock
that makes sure that your doctor
can only use the manufacturer's software
to read the data coming off of it
and that software is resold
on a rolling annual license
and it can't be just bought outright.
And the digital locks are also
used to make sure
that you only buy the insulin
that the vendor approves
and not generic insulin
that might be cheaper.
We've literally turned human beings
into ink-jet printers.
Now, this has really deep implications
beyond the economic implications.
Because the rules that prohibit
breaking these digital locks
also prohibit telling people
about flaws that programmers made
because if you know about a flaw
that a programmer made,
you can use it to break the digital lock.
And that means that the errors,
the vulnerabilities,
the mistakes in our devices, they fester
in them, they go on and on and on
and our devices become these long-lived
reservoirs of digital pathogens.
And we've seen how that plays out.
One of the reasons that Volkswagen
was able to get away
with their Diesel cheating for so long
is because no one could independently
audit their firmware.
It's happening all over the place.
You may have seen --
you may have seen this summer
that Chrysler had to recall
1.4 million Jeeps
because it turned out that they could be
remotely controlled over the internet
while driving down a motorway
and have their brakes and steering
commandeered by anyone, anywhere
in the world, over the internet.
We only have one methodology
for determining whether security works
and that's to submit it
to public scrutiny,
to allow for other people to see
what assumptions you've made.
Anyone can design a security system
that he himself can't think
of a way of breaking,
but all that means is that you've
designed a security system
that works against people
who are stupider than you.
And in this regard, security
is no different
from any other kind of knowledge creation.
You know, before we had
contemporary science and scholarship,
we had something that looked
a lot like it, called alchemy.
And for 500 years, alchemists kept
what they thought they knew a secret.
And that meant that every alchemist
was capable of falling prey
to that most urgent of human frailties,
which is our ability to fool ourselves.
And so, every alchemist discovered
for himself in the hardest way possible
that drinking mercury was a bad idea.
We call that 500-year period the Dark Ages
and we call the moment at which
they started publishing
and submitting themselves
to adversarial peer review,
which is when your friends tell you
about the mistakes that you've made
and your enemies call you an idiot
for having made them,
we call that moment the Enlightenment.
Now, this has profound implications.
The restriction of our ability to alter
the security of our devices
matters for our surveillance society,
for our ability to be free people
in society.
At the height of the GDR, in 1989,
the Stasi had one snitch for every
60 people in East Germany,
in order to surveil the entire country.
A couple of decades later, we found out
through Edward Snowden
that the NSA was spying
on everybody in the world.
And the ratio of people who work
at the NSA to people they are spying on
is more like 1 in 10'000.
They've achieved a productivity gain
in surveillance of two and a half
orders of magnitude.
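Taking the talk's round numbers at face value, the gain is easy to check on the back of an envelope (the exact order of magnitude depends on which population estimates you plug in):

```python
import math

# One Stasi snitch per 60 East Germans, versus roughly one NSA employee
# per 10,000 people surveilled: both ratios as quoted in the talk.
stasi_coverage = 60        # people watched per watcher, GDR, 1989
nsa_coverage = 10_000      # people watched per watcher, Snowden-era estimate

gain = nsa_coverage / stasi_coverage
print(f"~{gain:.0f}x more coverage per watcher")            # ~167x
print(f"~{math.log10(gain):.1f} orders of magnitude gain")  # ~2.2
```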
And the way that they got there
is in part by the fact that
we use devices that
we're not allowed to alter,
that are designed to treat us as attackers
and that gather an enormous
amount of information on us.
If the government told you that you were
required to carry around
a small electronic rectangle that
recorded all of your social relationships,
all of your movements,
all of the transient thoughts that
you made known or ever looked up,
and that would make all of that
available to the state,
and that you would have to pay for it,
you would revolt.
But the phone companies have
managed to convince us,
along with the mobile phone vendors,
that we should foot the bill
for our own surveillance.
It's a bit like during
the Cultural Revolution,
where, after your family members
were executed,
they sent you a bill for the bullet.
So, this has big implications, as I said,
for where we go as a society.
Because just as our kids have
a hard time functioning
in the presence of surveillance,
and learning,
and advancing their own knowledge,
we as a society have a hard time
progressing
in the presence of surveillance.
In our own living memory, people who are
today thought of as normal and right
were doing things that,
a generation ago,
would have been illegal
and would have landed them in jail.
For example, you probably know someone
who's married to a partner
of the same sex.
If you live in America, or in
the Netherlands, you may know
someone who takes medical marijuana.
And not that long ago, people
who undertook these activities
could have gone to jail for them,
could have faced enormous
social exclusion for them.
The way that we got from there to here
was by having a private zone,
a place where people weren't surveilled,
in which they could advance
their interests and ideas,
do things that were thought of as
socially unacceptable
and slowly change our social attitudes.
And unless you believe
that in 50 years,
your grandchildren will sit around
the Christmas table, in 2065, and say:
"How was it, grandma,
how was it, grandpa,
that in 2015, you got it all right,
and we haven't had
any social changes since then?"
you have to ask yourself
how, in a world
in which we are all
under continuous surveillance,
we are going to find a way
to improve this.
So, our kids need ICT literacy,
but ICT literacy isn't just typing skills
or learning how to use PowerPoint.
It's learning how to think critically
about how they relate
to the means of information,
about whether they are its masters
or servants.
Our networks are not
the most important issue that we have.
There are much more important issues
in society and in the world today.
The future of the internet is
way less important
than the future of our climate,
the future of gender equity,
the future of racial equity,
the future of the wage gap
and the wealth gap in the world,
but every one of those fights is going
to be fought and won or lost
on the internet:
it's our most foundational fight.
So, computers
can make us more free
or they can take away our freedom.
It all comes down to how we regulate them
and how we use them.
And it's our job, as people who are
training the next generation,
and whose next generation
is beta-testing
the surveillance technology
that will be coming to us,
it's our job to teach them to seize
the means of information,
to make themselves self-determined
in the way that they use their networks
and to find ways to show them
how to be critical and how to be smart
and how to be, above all, subversive
and how to use the technology around them.
Thank you. (18:49)
(Applause)
(Moderator) Cory, thank you very much
indeed.
(Doctorow) Thank you
(Moderator) And I've got a bundle
of points which you've stimulated
from many in the audience, which sent --
(Doctorow) I'm shocked to hear that
that was at all controversial,
but go on.
(Moderator) I didn't say "controversial",
you stimulated thinking, which is great.
But a lot of them resonate around
violation of secrecy and security.
And this, for example,
from Anneke Burgess:
"Is there a way for students
to protect themselves
from privacy violations by institutions
they are supposed to trust?"
I think this is probably a question
for William Golding as well,
someone who is a senior figure
in a major university, but
this issue of privacy violations and trust.
(Doctorow) Well, I think that computers
have a curious dual nature.
So on the one hand, they do expose us
to an enormous amount of scrutiny,
depending on how they are configured.
But on the other hand, computers
have brought new powers to us
that are literally new
on the face of the world, right?
We have never had a reality in which
normal people could have secrets
from powerful people.
But with the computer in your pocket,
with that,
you can encrypt a message so thoroughly
that if every hydrogen atom in the
universe were turned into a computer
and did nothing but try to guess
what your key was, until
the heat death of the universe,
we would run out of universe
before we ran out of possible keys.
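For sufficiently long keys, that claim survives even absurdly generous assumptions; a back-of-envelope sketch (the cosmological numbers are rough, commonly cited estimates, not figures from the talk):

```python
import math

# Give the attacker every advantage: ~10**80 hydrogen atoms in the
# observable universe, ~10**100 years until heat death, and each atom
# testing a trillion keys per second.
ATOMS = 10**80
SECONDS = 10**100 * 3.15e7        # years converted to seconds
GUESSES_PER_SECOND = 10**12

total_guesses = ATOMS * SECONDS * GUESSES_PER_SECOND
print(f"guesses possible before heat death: ~10^{math.log10(total_guesses):.0f}")

# Any key longer than this many bits has more possible values than the
# universe has guesses left in it.
print(f"break-even key length: ~{math.ceil(math.log2(total_guesses))} bits")
```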
So, computers do give us
the power to have secrets.
The problem is that institutions
prohibit the use of technologies
that allow you to take back
your own privacy.
It's funny, right? Because we take kids
and we say to them:
"Your privacy is like your virginity:
once you've lost it,
you'll never get it back.
Watch out what you're
putting on Facebook"
-- and I think they should watch
what they're putting on Facebook,
I'm a Facebook vegan, I don't even --
I don't use it, but we say:
"Watch what you're putting on Facebook,
don't send out dirty pictures
of yourself on SnapChat."
All good advice.
But we do it while we are taking away
all the private information
that they have, all of their privacy
and all of their agency.
You know, if a parent says to a kid:
"You mustn't smoke
because you'll get sick"
and the parent says it
while lighting a new cigarette
off the one that she's just put down
in the ashtray,
the kid knows that what you're doing
matters more than what you're saying.
(Moderator) The point is a deficit of trust.
It builds on the kind of work that
David has been doing as well,
this deficit of trust and privacy.
And there is another point here:
"Is the battle for privacy already lost?
Are we already too comfortable
with giving away our data?"
(Doctorow) No, I don't think so at all.
In fact, I think that if anything,
we've reached
peak indifference to surveillance, right? (21:25)