(Cory Doctorow) Thank you very much
So I'd like to start with something of a
benediction or permission.
I am one of nature's fast talkers
and many of you are not
native English speakers, or
maybe not accustomed
to my harsh Canadian accent
in addition I've just come in
from Australia
and so like many of you I am horribly
jetlagged and have drunk enough coffee
this morning to kill a rhino.
When I used to be at the United Nations
I was known as the scourge of the
simultaneous translation corps
I would stand up and speak
as slowly as I could
and turn around, and there they
would be in their booths doing this
(laughter)
When I start to speak too fast,
this is the universal symbol --
my wife invented it --
for "Cory, you are talking too fast".
Please, don't be shy.
So, I'm a parent, like many of you
and, like I'm sure all of you
who are parents, parenting kicks my ass
all the time.
And there are many regrets I have
about the mere seven and half years
that I've been a parent
but none are so keenly felt
as my regrets over what's happened
when I've been wandering
around the house and seen my
daughter working on something
that was beyond her abilities, that was
right at the edge of what she could do
and where she was doing something
that she didn't have competence in yet
and you know it's that amazing thing
to see that frowning concentration,
tongue stuck out: as a parent, your
heart swells with pride
and you can't help but go over
and sort of peer over their shoulder
at what they are doing
and those of you who are parents know
what happens when you look too closely
at someone who is working
beyond the edge of their competence.
They go back to doing something
they're already good at.
You interrupt a moment
of genuine learning
and you replace it with
a kind of embarrassment
about what you're good at
and what you're not.
So, it matters a lot that our schools are
increasingly surveilled environments,
environments in which everything that
our kids do is watched and recorded.
Because when you do that, you interfere
with those moments of real learning.
Our ability to do things that we are not
good at yet, that we are not proud of yet,
is negatively impacted
by that kind of scrutiny.
And that scrutiny comes
from a strange place.
We have decided that there are
some programmatic means
by which we can find all the web pages
children shouldn't look at
and we will filter our networks
to be sure that they don't see them.
Anyone who has ever paid attention
knows that this doesn't work.
There are more web pages
that kids shouldn't look at
than can ever be cataloged,
and any attempt to catalog them
will always catch pages that kids
should be able to look at.
Any of you who have ever taught
a unit on reproductive health
know the frustration of trying
to get around a school network filter.
Now, this is done in the name of
digital protection
but it flies in the face of digital
literacy and of real learning.
Because the only way to stop kids
from looking at web pages
they shouldn't be looking at
is to take all of the clicks that they
make, all of the messages that they send,
all of their online activity
and offshore it to a firm
that has some nonsensically arrived-at
list of the bad pages.
And so, what we are doing is that we're
exfiltrating all of our students' data
to unknown third parties.
Now, most of these firms,
their primary business isn't
serving the education sector.
Most of them service
the government sector.
They primarily service governments in
repressive, autocratic regimes.
They help them make sure that
their citizens aren't looking at
Amnesty International web pages.
They repackage those tools
and sell them to our educators.
So we are offshoring our children's clicks
to war criminals.
And what our kids do, we know,
is they just get around it,
because it's not hard to get around it.
You know, never underestimate the power
of a kid who is time-rich and cash-poor
to get around our
technological blockades.
But when they do this, they don't acquire
the kind of digital literacy
that we want them to have, they don't
acquire real digital agency
and moreover, they risk exclusion
and in extreme cases,
they risk criminal prosecution.
So what if instead, those of us who are
trapped in this system of teaching kids
where we're required to subject them
to this kind of surveillance
that flies in the face
of their real learning,
what if instead, we invented
curricular units
that made them real first-class
digital citizens,
in charge of trying to influence
real digital problems?
Like what if we said to them:
"We want you to catalog the web pages
that this vendor lets through
that you shouldn't be seeing.
We want you to catalog those pages that
you should be seeing, that are blocked.
We want you to go and interview
every teacher in the school
about all those lesson plans that were
carefully laid out before lunch
with a video and a web page,
and over lunch,
the unaccountable distant center
blocked these critical resources
and left them handing out photocopied
worksheets in the afternoon
instead of the unit they prepared.
We want you to learn how to file Freedom
of Information Act requests
and find out what your
school authority is spending
to censor your internet access
and surveil your activity.
We want you to learn to use the internet
to research these companies and
we want you to present this
to your parent-teacher association,
to your school authority,
to your local newspaper."
Because that's the kind
of digital literacy
that makes kids into first-class
digital citizens,
that prepares them for a future
in which they can participate fully
in a world that's changing.
Kids are the beta-testers
of the surveillance state.
The path of surveillance technology
starts with prisoners,
moves to asylum seekers,
people in mental institutions
and then to its first non-incarcerated
population: children
and then moves to blue-collar workers,
government workers
and white-collar workers.
And so, what we do to kids today
is what we did to prisoners yesterday
and what we're going to be doing
to you tomorrow.
And so it matters, what we teach our kids.
If you want to see where this goes, this
is a kid named Blake Robbins
and he attended Lower Merion High School
in Lower Merion, Pennsylvania,
outside of Philadelphia.
It's the most affluent school district
in America, so affluent
that all the kids were issued MacBooks
at the start of the year
and they had to do their homework on
their MacBooks,
they had to bring them to school every day
and bring them home every night.
And the MacBooks had been fitted with
Laptop Theft Recovery Software,
which is a fancy word for a rootkit, that
let the school administration
covertly operate the cameras
and microphones on these computers
and harvest files off
of their hard drives,
view all their clicks, and so on.
Now Blake Robbins found out
that the software existed
and how it was being used
because he and the head teacher
had been knocking heads for years,
since he first got into the school,
and one day, the head teacher
summoned him to his office
and said: "Blake, I've got you now."
and handed him a print-out of Blake
in his bedroom the night before,
taking what looked like a pill,
and said: "You're taking drugs."
And Blake Robbins said: "That's a candy,
it's a Mike and Ike candy, I take them --
I eat them when I'm studying.
How did you get a picture
of me in my bedroom?"
This head teacher had taken
over 6000 photos of Blake Robbins:
awake and asleep, dressed and undressed,
in the presence of his family.
And in the ensuing lawsuit, the school
settled for a large amount of money
and promised that
they wouldn't do it again
without informing the students
that it was going on.
And increasingly, the practice is now
that school administrations hand out
laptops, because they're getting cheaper,
with exactly the same kind of software,
but they let the students know, and they
find that that works even better
at curbing the students' behavior,
because the students know that
they're always on camera.
Now, the surveillance state is moving
from kids to the rest of the world.
It's metastasizing.
Our devices are increasingly designed
to treat us as attackers,
as suspicious parties
who can't be trusted
because our devices' job is to do things
that we don't want them to do.
Now that's not because the vendors
who make our technology
want to spy on us necessarily,
but they want to take
the ink-jet printer business model
and bring it into every other realm
of the world.
So the ink-jet printer business model
is where you sell someone a device
and then you get a continuing
revenue stream from that device
by making sure that competitors can't make
consumables or parts
or additional features
or plugins for that device,
without paying rent
to the original manufacturer.
And that allows you to maintain
monopoly margins on your devices.
Now, in 1998, the American government
passed a law called
the Digital Millennium Copyright Act,
in 2001 the European Union
introduced its own version,
the European Union Copyright Directive.
And these two laws, along with laws
all around the world,
in Australia, Canada and elsewhere,
prohibit removing digital locks
that are used to restrict
access to copyrighted works
and they were originally envisioned as a way
of making sure that Europeans didn't
bring cheap DVDs in from America,
or making sure that Australians didn't
import cheap DVDs from China.
And so you have a digital work, a DVD,
and it has a lock on it and to unlock it,
you have to buy an authorized player
and the player checks to make sure
you are in the right region
and making your own player
that doesn't make that check
is illegal because you'd have
to remove the digital lock.
And that was the original intent,
it was to allow high prices to be
maintained on removable media,
DVDs and other entertainment content.
But it very quickly spread
into new realms.
So, for example, auto manufacturers now
lock up all of their cars' telemetry
with digital locks.
If you're a mechanic
and want to fix a car,
you have to get a reader
from the manufacturer
to make sure that you can
see the telemetry
and know what parts to order
and how to fix it.
And in order to get this reader,
you have to promise the manufacturer
that you will only buy parts
from that manufacturer
and not from third parties.
So the manufacturers can keep
the repair costs high
and get a secondary revenue stream
out of the cars.
This year, the Chrysler corporation filed
comments with the US Copyright Office,
to say that they believed that
this was the right way to do it
and that it should be a felony,
punishable by 5 years in prison
and a $500,000 fine,
to change the locks on a car that you own,
so that you can choose who fixes it.
It turned out that when they advertised
-- well, where is my slide here?
Oh, there we go --
when they advertised that
it wasn't your father's Oldsmobile,
they weren't speaking metaphorically,
they really meant
that even though your father
bought the Oldsmobile,
it remained their property in perpetuity.
And it's not just cars,
it's every kind of device,
because every kind of device today
has a computer in it.
The John Deere Company, the world's
leading seller of heavy equipment
and agricultural equipment technologies,
they now view their tractors as
information gathering platforms
and they view the people who use them
as the kind of inconvenient gut flora
of their ecosystem.
So if you are a farmer
and you own a John Deere tractor,
when you drive it around your fields,
the torque sensors on the wheels
conduct a centimeter-accurate soil
density survey of your agricultural land.
That would be extremely useful to you
when you're planting your seeds
but that data is not available to you
unless you remove the digital lock
from your John Deere tractor
which again, is against the law
everywhere in the world.
Instead, in order to get that data
you have to buy a bundle with seeds
from Monsanto,
who are John Deere's seed partners.
John Deere then takes this data that they
aggregate across whole regions
and they use it to gain insight
into regional crop yields,
which they use to play the futures market.
John Deere's tractors are really just
a way of gathering information
and the farmers are secondary to it.
Just because you own it
doesn't mean it's yours.
And it's not just the computers
that we put our bodies into
that have this business model.
It's the computers that we put
inside of our bodies.
If you're someone who is diabetic
and you're fitted with a continuous
glucose-measuring insulin pump,
that insulin pump is designed
with a digital lock
that makes sure that your doctor
can only use the manufacturer's software
to read the data coming off of it
and that software is resold
on a rolling annual license
and it can't be just bought outright.
And the digital locks are also
used to make sure
that you only buy the insulin
that the vendor approves
and not generic insulin
that might be cheaper.
We've literally turned human beings
into ink-jet printers.