WEBVTT
00:00:00.000 --> 00:00:19.462
36C3 Preroll music
00:00:19.462 --> 00:00:23.945
Herald: The two speakers that are on stage
today are both anthropologists, and they
00:00:23.945 --> 00:00:32.847
are both experts on hacking culture. Today
they also launched a website, HackCur.io,
00:00:32.847 --> 00:00:37.633
which is also the name of the talk,
'hack_curio: Decoding the Cultures of
00:00:37.633 --> 00:00:44.471
Hacking, One Video at a Time'. I welcome
Gabriella, a.k.a. Biella Coleman and
00:00:44.471 --> 00:00:46.282
Paula Bialski.
00:00:46.282 --> 00:00:50.823
Applause
00:00:50.823 --> 00:00:56.285
Paula Bialski: Hello, hello. Yes, good
evening, CCC. It's so lovely to be here. We
00:00:56.285 --> 00:01:01.747
are super excited to stand before you here
today and present a project we've been
00:01:01.747 --> 00:01:04.316
working on for the past year or so.
00:01:04.316 --> 00:01:06.899
Biella: It would not have been
finished if it were not for
00:01:06.899 --> 00:01:09.751
this talk. Paula: Exactly. Biella: So thank you.
00:01:09.751 --> 00:01:12.031
Paula: Exactly.
Thanks for forcing us to stand before you
00:01:12.031 --> 00:01:17.133
and get away from our desks. Have a
drink, some wine, have some 11:30 PM
00:01:17.133 --> 00:01:21.205
discussion with you and there's no
better place to launch the project that
00:01:21.205 --> 00:01:25.467
we're gonna show you than at the CCC. So
we're super excited to be here. Let's
00:01:25.467 --> 00:01:29.349
start with the very basics. What is
hack_curio? What is it that you guys are
00:01:29.349 --> 00:01:35.421
gonna see in the next hour or so?
Hack_curio is a website featuring
00:01:35.421 --> 00:01:40.723
short video clips all related to computer
hackers. Now a bit of background. My
00:01:40.723 --> 00:01:45.395
name is Paula Bialski and I am a
sociologist. I'm an ethnographer of hacker
00:01:45.395 --> 00:01:49.297
cultures. I study corporate hacker
developers. And for those of you who don't
00:01:49.297 --> 00:01:51.719
know Biella Coleman.
Biella: I'm an anthropologist.
00:01:51.719 --> 00:01:55.939
I also study computer hackers. And we,
along with Chris Kelty, have helped to put
00:01:55.939 --> 00:01:59.821
this website together.
Paula: Exactly. And in
00:01:59.821 --> 00:02:04.393
the past year, we've decided to come
together and bring all sorts of clips from
00:02:04.393 --> 00:02:11.485
public talks, from documentaries, from
Hollywood films, memes, advertising, all
00:02:11.485 --> 00:02:16.197
sorts of sources. We've brought together
these videos, which come with
00:02:16.197 --> 00:02:22.109
short descriptions by authors, by
scholars, by journalists, by people who
00:02:22.109 --> 00:02:26.929
know something about hacker cultures. And
we brought that together all in one place.
00:02:26.929 --> 00:02:32.663
So call it a museum, call it a compendium,
call it a website. And it's a place
00:02:32.663 --> 00:02:38.465
that really pays homage to you guys,
because hackers come in all shapes and
00:02:38.465 --> 00:02:41.797
sizes. What it means to hack might mean
something to you, but might mean something
00:02:41.797 --> 00:02:46.199
very different to someone else. And as
anthropologists, we think it's very
00:02:46.199 --> 00:02:50.211
important to represent a certain culture
in a certain way. It's not just hackers
00:02:50.211 --> 00:02:54.268
in hoodies. It's a really diverse culture.
So we're going to talk about that today.
00:02:54.268 --> 00:02:58.119
Biella: All right. So like, how did this project
come into being? Like, why are we here?
00:02:58.119 --> 00:03:02.717
Why did we spend the last year doing this?
Well, you know, first of all, it wasn't
00:03:02.717 --> 00:03:08.849
created. I didn't create it because I had
this idea in mind. It was created because
00:03:08.849 --> 00:03:14.731
I started to collect videos for a reason.
I'm a professor, and twice a week I stand
00:03:14.731 --> 00:03:20.533
in front of students who are on the
Internet, on Facebook, maybe buying shoes.
00:03:20.533 --> 00:03:25.964
And it's really hard to get their
attention. And you know, what I found was that
00:03:25.964 --> 00:03:31.367
using videos in class was an amazing way
to get them off Facebook and paying
00:03:31.367 --> 00:03:36.628
attention to me. Right. So over
the years, I just collected a lot of
00:03:36.628 --> 00:03:40.660
videos. Right. Video after video after
video after video. And at a certain
00:03:40.660 --> 00:03:44.139
point, I was like, you know, I have this
private collection, semi private
00:03:44.139 --> 00:03:49.976
collection I use in class. Why don't I
transform it into a public resource and
00:03:49.976 --> 00:03:55.877
more so as someone who studied hackers for
many years, why don't I kind of make it
00:03:55.877 --> 00:04:01.572
into a collaborative project? Why don't I
tap into the kind of expertise that exists
00:04:01.572 --> 00:04:05.758
among hackers and journalists and
researchers and academics and draw them
00:04:05.758 --> 00:04:11.595
in? And so I decided to do that.
About a year and a half ago, I
00:04:11.595 --> 00:04:16.648
brought together a couple of other people
like Paula, Chris Kelty, who's another
00:04:16.648 --> 00:04:21.904
curator. And I said, like, let's get this
going. So when we were kind of fashioning
00:04:21.904 --> 00:04:26.159
the project, we were also thinking like,
what are we trying to do with this project?
00:04:26.159 --> 00:04:31.847
Right. You're not my students. I don't see
you twice a week. And so we came up with
00:04:31.847 --> 00:04:35.006
some goals and we don't know if we're
gonna achieve these goals. The site
00:04:35.006 --> 00:04:39.228
literally is going live like right now.
But this is what we're trying to do with the
00:04:39.228 --> 00:04:42.677
project. We're trying to chip away at
simplistic conceptions and stereotypes of
00:04:42.677 --> 00:04:47.705
hackers. We know these exist. Can we chip
away at them? Right. We want to offer new
00:04:47.705 --> 00:04:52.870
perspectives on what hackers have actually
done and what they do. A really important
00:04:52.870 --> 00:04:56.601
thing which Paula has already kind of
mentioned, is to showcase the diversity of
00:04:56.601 --> 00:05:00.616
hacking. Right. People who do blockchain
and free software and security. And
00:05:00.616 --> 00:05:04.295
there are similarities, but there are
also differences. Let's try to show
00:05:04.295 --> 00:05:09.272
this. And while this is not an archive,
this is not the Internet Archive. We are
00:05:09.272 --> 00:05:13.794
trying to kind of preserve bits and bytes
of hacker history. So these are the four
00:05:13.794 --> 00:05:20.724
goals. And we do feel that video, right,
is a nice medium, a mechanism to achieve
00:05:20.724 --> 00:05:26.783
these four goals. It's persuasive, it's
compelling, it's memorable. It's fun. Like
00:05:26.783 --> 00:05:31.525
we like to waste time at work on videos.
Right. So we're like, hey, let's add a
00:05:31.525 --> 00:05:37.012
little persuasive punch to text. And this
is why we decided to do it this way.
00:05:37.012 --> 00:05:40.827
Paula: Exactly. So what happens when you click on
the site today and how is it organized? We
00:05:40.827 --> 00:05:47.004
want to show you a little bit of the
actual architecture of the site itself.
00:05:47.004 --> 00:05:53.939
When you click on the website,
you see certain categories;
00:05:53.939 --> 00:05:58.814
we've grouped the videos into different
categories because as you say, there's a
00:05:58.814 --> 00:06:03.036
huge diversity. So you can see here,
Biella is lovely here, pointing out the
00:06:03.036 --> 00:06:07.884
beautiful categories. We've got anti-
security hackers, blockchain hackers.
00:06:07.884 --> 00:06:12.760
We've got free and open source software,
we've got phreaking, we've got hacker
00:06:12.760 --> 00:06:17.724
depictions. You can look at all sorts of
different categories. You go onto
00:06:17.724 --> 00:06:23.554
a category page and then you have
a blurb about what this subculture of
00:06:23.554 --> 00:06:28.348
hacking is all about, or what the
aim is, what the theme is. And
00:06:28.348 --> 00:06:32.308
then you have all sorts of little videos
that last maybe 30 seconds, maybe a few
00:06:32.308 --> 00:06:38.249
minutes. You would
00:06:38.249 --> 00:06:42.100
look at the video and then underneath you would have
a little bit of a blurb. It's not an
essay. It's not a book. It's not some
00:06:42.100 --> 00:06:45.935
boring academic text. It's supposed to be
funny. It's supposed to be for your
00:06:45.935 --> 00:06:50.201
grandmothers to read. It's supposed to be
actually accessible and understandable.
00:06:50.201 --> 00:06:54.829
Right. So you have the video and the
actual text itself. Right. So this is how
00:06:54.829 --> 00:06:59.214
it looks. And here is a
sample of our content itself. What do we
00:06:59.214 --> 00:07:04.209
have? We've got 42 entries at the moment
which we've collected from, as I said,
00:07:04.209 --> 00:07:09.822
various academics and
authors. And by the end of 2020, we would
00:07:09.822 --> 00:07:15.690
love to have around 100 entries and
we'd try to publish around 50 or 20
00:07:15.690 --> 00:07:21.689
entries. Biella: after that. Because it's really
brutally hard to edit academics. Paula: Exactly.
00:07:21.689 --> 00:07:26.689
Exactly. And so, here's what you'll
find. These are just some examples. We'll
00:07:26.689 --> 00:07:30.878
really get into some of the videos in just
a moment. But for example, you would look
00:07:30.878 --> 00:07:35.437
at hackers and engineers humming at the
Internet Engineering Task Force, or you
00:07:35.437 --> 00:07:40.109
look at an entry that's about the
programming legend, of course, Grace Hopper
00:07:40.109 --> 00:07:44.087
being interviewed by a clueless David
Letterman. Maybe you guys have seen this
00:07:44.087 --> 00:07:49.291
video, a blockchain ad that people see
and ask, is this real? Is this
00:07:49.291 --> 00:07:53.644
kind of wacky ad a parody? And when
you watch it, you have to know that this
00:07:53.644 --> 00:07:58.766
is actually real. The actor Robert Redford
showing off his mad social engineering
00:07:58.766 --> 00:08:04.244
skills with the help of cakes and balloons
or how to make sense of why Algerian hacker
00:08:04.244 --> 00:08:08.827
Hamza Bendelladj, arrested by the U.S.
government, smiles, and how many people from
00:08:08.827 --> 00:08:13.509
Algeria understand his grin. So this kind
of diversity shows what
00:08:13.509 --> 00:08:19.434
hacking is really all about. Biella: But
we're here to get the video party started.
00:08:19.434 --> 00:08:22.597
Paula: Exactly right.
From audience: Exactly!
00:08:22.597 --> 00:08:27.771
Paula: Finally. Fine. Let's get started. Yeah.
Biella: With a little background. Paula: Exactly. Exactly.
00:08:27.771 --> 00:08:32.090
OK. So, who is going to start?
You start. You start. Biella: All
00:08:32.090 --> 00:08:35.024
right. So we thought it would be a good
idea to start with phone phreaking,
00:08:35.024 --> 00:08:39.978
because phone phreaking really developed
at the same time as, if not kind of before,
00:08:39.978 --> 00:08:45.233
computer hacking. And we're going to show
Joybubbles, Joe Engressia, who is, you
00:08:45.233 --> 00:08:49.181
know, often considered to be the
grandfather of phone phreaking. So let's
00:08:49.181 --> 00:08:53.528
go to a video.
Text from Video Speaker1: In the
00:08:53.528 --> 00:08:57.463
days when calls went through the
operators, phreaking wasn't possible. But as
00:08:57.463 --> 00:09:01.705
human switchboards were replaced by
mechanical systems, different noises were
00:09:01.705 --> 00:09:04.389
used to trigger the switches.
00:09:04.389 --> 00:09:09.435
Whistling
00:09:09.435 --> 00:09:12.602
If you had
perfect pitch like blind phone phreak Joe
00:09:12.602 --> 00:09:23.208
Engressia, you could whistle calls through
the network. Joe: Let's see if I make it this time. This is really hard
00:09:23.208 --> 00:09:29.688
to do, it sounded like all the tones were present,
so the phone should be ringing the bell now. Okay.
00:09:29.688 --> 00:09:31.958
I'll get the phone, it just takes a little while...
Speaker1: He even showed
00:09:31.958 --> 00:09:37.263
off his skills to the local media. Speaker2: From his one
phone to a town in Illinois and back
00:09:37.263 --> 00:09:42.920
to his other phone, a thousand-mile phone
call by whistling. Joe Engressia....
00:09:42.920 --> 00:09:46.581
Biella: Right? Very cool, right? So Joe Engressia
00:09:46.581 --> 00:09:52.120
is featured. And Joan Donovan, who is like
a mad researcher at Harvard University,
00:09:52.120 --> 00:09:56.796
wrote a really awesome entry about him.
Now, of course, she emphasizes things like,
00:09:56.796 --> 00:10:01.461
you know, while hacking is often tied to
computers, it can be tied to any system
00:10:01.461 --> 00:10:07.484
that you could understand, improve, fix,
undermine. And the phreakers really showed
00:10:07.484 --> 00:10:11.954
that. Right. And of course, the history of
phone phreaking is about blind kids. Not
00:10:11.954 --> 00:10:16.390
everyone who was a phreak was blind, but
many of them were. They met each other in
00:10:16.390 --> 00:10:22.289
camp and kind of exchanged information.
And that was one of the ways in which
00:10:22.289 --> 00:10:26.092
phone phreaking grew. Phone phreaking
really grew as well when a big article in
00:10:26.092 --> 00:10:33.001
1971 was published by Ron Rosenbaum in
Esquire magazine. Who has read that
00:10:33.001 --> 00:10:38.167
article? Anyone? It's incredible. We
mentioned it, I think, in a piece. Check
00:10:38.167 --> 00:10:43.995
it out. Phreaking freaking exploded after
that article. The spelling of freak
00:10:43.995 --> 00:10:49.637
changed from f-r-e-a-k to p-h-r-e-a-k
because of that article. Phreaking also
00:10:49.637 --> 00:10:54.820
grew when blue boxes were created. Right.
This is also something that Joan writes
00:10:54.820 --> 00:10:59.774
about in her entry. One of the cool things
that Joan writes about and then I'm going
00:10:59.774 --> 00:11:05.524
to turn it over to Paula again is that
some phreaks trained birds, OK, to freaking
00:11:05.524 --> 00:11:10.405
phreak. Let's just leave it at that,
because that's pretty cool. All right.
00:11:10.405 --> 00:11:14.035
Paula: OK. Are you guys ready now to
cringe? We need a little bit of a
00:11:14.035 --> 00:11:18.600
cringing moment as well. So without
further ado, this is Steve Ballmer, who
00:11:18.600 --> 00:11:22.844
would like to do some dancing. Biella:
From Microsoft, in case you don't know.
00:11:22.844 --> 00:11:49.633
Music
00:11:49.633 --> 00:11:51.752
Paula: OK. Yeah, that's right. Biella:
I just want to say one little
00:11:51.752 --> 00:11:56.715
thing. There's a remix of
this with goats screaming like, look it
00:11:56.715 --> 00:12:03.297
up. It's awesome. Paula: Exactly. But why do we show
Steve Ballmer, sort of the godfather,
00:12:03.297 --> 00:12:08.728
exactly, kind of an anti-hacker of
sorts? I myself have worked
00:12:08.728 --> 00:12:13.538
among a corporate culture of software
developers, who aren't hackers per se. But if
00:12:13.538 --> 00:12:18.030
you think of a figure like Steve Ballmer,
a lot of you guys who perhaps identify
00:12:18.030 --> 00:12:22.255
yourself as hackers, you have day jobs,
you go to work and you have to make some
00:12:22.255 --> 00:12:26.451
money in order to live and do work on your
own projects. And you often have to face
00:12:26.451 --> 00:12:31.366
sort of mini Steve Ballmers at work. And
this is a quote from my own entry
00:12:31.366 --> 00:12:35.530
that I did right next to this video: even
if Ballmer's unbridled display
00:12:35.530 --> 00:12:39.758
of exuberance is exceptional, many
software developers will have to deal with
00:12:39.758 --> 00:12:44.265
Mini Steve Ballmers every day. Biella:
We are sorry that you do. But if you do, you do.
00:12:44.265 --> 00:12:47.261
Paula: Exactly. And this exuberance is
00:12:47.261 --> 00:12:51.577
all about these sort of slogans of win
big, save the world while building
00:12:51.577 --> 00:12:56.271
technology, be awesome, be true, whatever
your corporate slogan is. And
00:12:56.271 --> 00:13:01.655
there is, I think, a way in which
software developers and sort of the hackers
00:13:01.655 --> 00:13:06.199
that work in their day jobs challenge this
sort of really intense exuberance of
00:13:06.199 --> 00:13:11.403
wearing your corporate T-shirt and smiling
every day: you hack your
00:13:11.403 --> 00:13:16.247
daily projects, you work on your own
private projects on the side. You actually
00:13:16.247 --> 00:13:22.723
do have many acts of resistance in a way
to this kind of loud, massive exuberance.
00:13:22.723 --> 00:13:27.957
And I talk about these sort of sideline
mini-hacks that happen in everyday
00:13:27.957 --> 00:13:30.632
corporate culture.
Biella: Check out her entry. It's really
00:13:30.632 --> 00:13:35.666
funny. All right. So now we're going to
hacktivists. So who here has heard of
00:13:35.666 --> 00:13:40.597
Phineas Fisher? All right. Awesome. Just
in case, for those who are watching the
00:13:40.597 --> 00:13:45.193
video now or later, I'm going to give a
little bit of background. But I love this
00:13:45.193 --> 00:13:50.597
video about Phineas Fisher because he's
explained what he or the group has done,
00:13:50.597 --> 00:13:54.458
but he also does kind of a very clever
media hack. So for those that don't know
00:13:54.458 --> 00:14:01.415
who Phineas Fisher is: he or the group is a
hacktivist that claims to be inspired by
00:14:01.415 --> 00:14:06.347
Anonymous and Jeremy Hammond. He's
hacked into various corporations from
00:14:06.347 --> 00:14:11.611
FinFisher to Hacking Team. And what he did
was take documents, take e-mail and then
00:14:11.611 --> 00:14:16.033
publish them. And these were important in
ways that I'll talk about in a moment.
00:14:16.033 --> 00:14:23.687
He's donated, I think, stolen bitcoins to
the Rojava government. And this fall,
00:14:23.687 --> 00:14:30.042
He published a manifesto kind of calling
for public interest hacking and claims he
00:14:30.042 --> 00:14:34.761
would give one hundred thousand dollars to
anyone who does this. So now I'm going to
00:14:34.761 --> 00:14:39.524
show the first and I believe only
interview that he has done. And he did
00:14:39.524 --> 00:14:49.529
this with VICE News a couple of years ago.
Video starts
00:14:49.529 --> 00:14:54.621
Let's do this. These are the exact words
from our live text exchange, voiced by one
00:14:54.621 --> 00:14:57.907
of my colleagues.
Colleague: So why did you hack Hacking
00:14:57.907 --> 00:15:01.167
Team?
Phineas Fisher (puppet): Well, I just read the Citizen Lab
00:15:01.167 --> 00:15:06.453
reports on FinFisher and Hacking Team and
thought, that's fucked up. And I hacked
00:15:06.453 --> 00:15:08.594
them.
Colleague: What was the goal in leaking the
00:15:08.594 --> 00:15:14.343
Hacking Team data? Were you trying to stop them?
Phineas Fisher (puppet): For the lulz. I don't really
00:15:14.343 --> 00:15:18.459
expect leaking data to stop a company, but
hopefully it can at least set them back a bit
00:15:18.459 --> 00:15:23.035
and give some breathing room to the people
being targeted with their software.
00:15:23.035 --> 00:15:25.944
Video ends
Biella: OK, so this does not yet exist on
00:15:25.944 --> 00:15:29.974
Hack_Curio. I have to write the
entry, but because I was so busy getting
00:15:29.974 --> 00:15:33.128
the rest of the site ready, I haven't
done it, but it will happen in the next
00:15:33.128 --> 00:15:37.602
few weeks. But what I love about this
video is, first of all, he's like hacking
00:15:37.602 --> 00:15:44.771
media representations. Right? I mean, even
when awesome journalists like Motherboard
00:15:44.771 --> 00:15:52.046
publish on hackers or other kind of
entities, they still kind of use a masked
00:15:52.046 --> 00:15:56.264
hacker image. Even once, they published about
Phineas Fisher and they put like a mask on him.
00:15:56.264 --> 00:16:00.791
And it's like hackers have faces, like they
don't need a mask. Right. And there is
00:16:00.791 --> 00:16:04.666
this sense where there's always a
kind of demonic, masked figure. And he was
00:16:04.666 --> 00:16:09.141
like, OK, I'll do this interview, but you
have to represent me as like a lovable
00:16:09.141 --> 00:16:15.034
Muppet-like figure. Right? So he's there
hacking the media. But what's also really
00:16:15.034 --> 00:16:20.859
interesting in it, and if you watch the full
video, it's kind of amazing, is that, you
00:16:20.859 --> 00:16:24.675
know, he kind of claims, oh, I didn't have
much of an effect, I don't think I could do
00:16:24.675 --> 00:16:29.053
anything. But in fact, first of all, the
information that was released really
00:16:29.053 --> 00:16:33.704
reaffirms what people suspected. For
example, in the case of Hacking Team,
00:16:33.704 --> 00:16:41.308
which was selling problematic exploit
spyware to dictatorial regimes. We really
00:16:41.308 --> 00:16:46.414
got a confirmation that this was
happening. And in fact, eventually Hacking
00:16:46.414 --> 00:16:51.464
Team even lost its license. Right. This
was like a direct effect from what
00:16:51.464 --> 00:16:56.911
Phineas Fisher did. So really, it's a
kind of amazing video that showcases what
00:16:56.911 --> 00:17:01.465
he was doing, his reasoning, and then there was
a performance, literally, a puppet that
00:17:01.465 --> 00:17:07.397
hacked the media. OK, so now we're going
to rewind a little bit and go back in
00:17:07.397 --> 00:17:13.624
time. So a lot of hackers care about
cryptography. Right? And ever since the
00:17:13.624 --> 00:17:21.822
cypherpunks. And since that period, there
have been projects from Tor to Signal that
00:17:21.822 --> 00:17:27.050
have enabled cryptography. That has been
really important for human rights
00:17:27.050 --> 00:17:34.425
activists and others. But one of the
great, great kind of encryption projects
00:17:34.425 --> 00:17:39.451
came from this fellow, Tim Jenkin. Who
here in the room has heard of Tim Jenkin?
00:17:39.451 --> 00:17:45.315
OK. This is amazing. This is why we're
doing kind of hack_curio. So Tim Jenkin is
00:17:45.315 --> 00:17:53.111
from South Africa. And beginning in 1988,
secret messages were sent and received
00:17:53.111 --> 00:17:58.313
regularly across South African borders
using an encrypted telematics system
00:17:58.313 --> 00:18:03.519
assembled during the final years
of the South African liberation struggle
00:18:03.519 --> 00:18:08.923
and Tim Jenkin, along with Ronnie
Press, who has since passed away, created
00:18:08.923 --> 00:18:13.348
the system. And Tim Jenkin was kind of
like a phone phreak. And that was one of
00:18:13.348 --> 00:18:17.569
the reasons, like he was good at working
with phones. And what was amazing about
00:18:17.569 --> 00:18:22.856
this system, which was part of Operation
Vula, was that it allowed people in South
00:18:22.856 --> 00:18:30.985
Africa to communicate with leaders in
exile - in London. Right? And Tim Jenkin
00:18:30.985 --> 00:18:36.040
created this system. I'm going to show a
video about it in a moment. And Sophie
00:18:36.040 --> 00:18:41.591
Toupin has written a terrific entry. The
reason why we have him with the key there
00:18:41.591 --> 00:18:46.180
was that like, you know, the South African
apartheid government did not really like
00:18:46.180 --> 00:18:52.247
Tim Jenkin, so they threw him in jail.
Well, a lot of hackers lockpick. He
00:18:52.247 --> 00:18:59.918
actually created 10 wooden keys secretly
in the wood shop and broke out of jail.
00:18:59.918 --> 00:19:04.324
I mean, talk about like taking lock
picking to like another sort of level. All
00:19:04.324 --> 00:19:10.161
right. So let's listen and see the video
about this incredible program.
00:19:10.161 --> 00:19:12.787
Video starts
Tim Jenkin: After we sent in the first
00:19:12.787 --> 00:19:18.102
computer, we expected things to start
immediately, but it actually took a couple
00:19:18.102 --> 00:19:24.101
of weeks. And then suddenly one day I was
sitting at my desk and the telephone
00:19:24.101 --> 00:19:29.831
answering machine suddenly started
roaring and I thought, this must be the
00:19:29.831 --> 00:19:33.831
wrong number or something. But then, sure
enough, I heard the distinctive tone of
00:19:33.831 --> 00:19:38.039
the messages and I could hear this thing
coming through the tape. Modem 14.5k
00:19:38.039 --> 00:19:42.764
sound Word, and word, and word. And
then it stopped and I loaded the message
00:19:42.764 --> 00:19:48.419
onto my computer. In fact, it was a report
from Mac. And sure enough, there was our
00:19:48.419 --> 00:19:53.196
first message. Absolutely perfect. sound of a
00:19:53.196 --> 00:19:58.321
printer working
Video ends
00:19:58.321 --> 00:20:05.051
Biella: Ah, fax machine. OK. So this is from the
entry by Sophie Toupin, who is writing a
00:20:05.051 --> 00:20:08.913
dissertation on this topic. The
international hacker community has since
00:20:08.913 --> 00:20:12.681
taken notice of Tim Jenkin and the Vula
encrypted communication system that
00:20:12.681 --> 00:20:17.504
embodies so many qualities often
associated with an
00:20:17.504 --> 00:20:23.038
exceptional hack: elegant, clever, usable
and pragmatic. Right? Jenkin has been
00:20:23.038 --> 00:20:28.650
invited to speak at the Berlin Logan
Symposium in 2016 and to lock picking
00:20:28.650 --> 00:20:34.573
communities in the Netherlands and the
United States. In 2018 the RSA Security
00:20:34.573 --> 00:20:40.611
Conference gave Jenkin the first award for
excellence in humanitarian service. So
00:20:40.611 --> 00:20:46.258
just like one last thing, this is a good
reminder that histories of computer
00:20:46.258 --> 00:20:51.679
hacking are often skewed. They often
actually start with the United States,
00:20:51.679 --> 00:20:57.348
when, for example, in Europe with the CCC,
that story has been told in bits and pieces,
00:20:57.348 --> 00:21:04.330
but deserves a much larger
showcase. And actually this example also
00:21:04.330 --> 00:21:10.050
shows that, for example, the history of
encryption when it comes to communication
00:21:10.050 --> 00:21:14.833
didn't even necessarily start in the
United States. Right? And so it's really,
00:21:14.833 --> 00:21:19.169
really important to kind of showcase these
histories that haven't been told
00:21:19.169 --> 00:21:21.379
elsewhere.
Paula: So maybe by now you're kind of
00:21:21.379 --> 00:21:27.347
getting the fact that we see hacking as
a diverse practice. Hackers as a diverse
00:21:27.347 --> 00:21:31.981
group of people who do different things.
And at the moment we're going I want to
00:21:31.981 --> 00:21:38.054
come back to the ways in which hackers
challenge power through challenging really
00:21:38.054 --> 00:21:43.196
the very stereotype of what gender means
and challenging, really gender politics.
00:21:43.196 --> 00:21:48.693
And we'll start to turn to this topic by
looking at an entry that a woman named
00:21:48.693 --> 00:21:54.284
Christina Dunbar-Hester has done on a
woman named Naomi Ceder. And some of you
00:21:54.284 --> 00:22:00.418
probably know Naomi Ceder. This is part of
her entry. And she wrote: Naomi Ceder is a
00:22:00.418 --> 00:22:04.355
programmer and core participant in the
Python programming language community. As a
00:22:04.355 --> 00:22:08.677
trans-identified person, Ceder grappled
with whether she would have to give up
00:22:08.677 --> 00:22:13.352
everything in order to transition and
whether the community would accept her for
00:22:13.352 --> 00:22:19.187
doing so. So let's watch a clip of the
video and let's see how Naomi Ceder challenged that.
00:22:19.187 --> 00:22:26.087
Biella: I think she gave this talk at
PyCon, the Python Open Source Developer
00:22:26.087 --> 00:22:30.077
conference, and it's a really incredible
talk. I really encourage you to watch the
00:22:30.077 --> 00:22:33.480
whole talk. But this is the question. This
is the moment where she's like, do I have
00:22:33.480 --> 00:22:37.232
to leave the community or can I transition
in the community?
00:22:37.232 --> 00:22:39.632
Paula: Exactly. So let's watch a tiny
clip.
00:22:39.632 --> 00:22:42.826
clip starts
Naomi Ceder: I decided that to do that would probably
00:22:42.826 --> 00:22:47.928
mean giving up everything. Remember, back
at 13, I had absorbed this into my brain
00:22:47.928 --> 00:22:51.536
that the only way you were going to get
out of this was to basically leave
00:22:51.536 --> 00:22:56.577
everything. And this was a very painful
thing to think about. But like a lot of
00:22:56.577 --> 00:22:59.709
trans people, I had come to the point
where even if I lost everything, that was
00:22:59.709 --> 00:23:08.744
fine. So I started to think about other
alternatives here. I had toyed with the
00:23:08.744 --> 00:23:12.640
idea of doing the education summit as a
farewell thing to the community. I would
00:23:12.640 --> 00:23:16.615
do it and then disappear, go into the
witness protection program. The
00:23:16.615 --> 00:23:20.754
only problem was I actually started
accelerating the pace of my transition
00:23:20.754 --> 00:23:26.118
because, well, it was just such a freaking
relief to start moving in that direction
00:23:26.118 --> 00:23:31.145
that that wouldn't work. So I actually
thought about what was for me, harking back
00:23:31.145 --> 00:23:36.892
to Laverne Cox, a very revolutionary idea.
What if I just did it and was open about
00:23:36.892 --> 00:23:42.856
it? First thing, I looked at codes of
conduct. I looked for specifics. What
00:23:42.856 --> 00:23:49.137
happens to me if there is a problem? If I
am harassed? This was important to me.
00:23:49.137 --> 00:23:54.417
The other thing I did was I started telling a
few people, Jesse Noller, Avi Alaska, some
00:23:54.417 --> 00:23:58.461
people I would work with at PyCon, and they
were all pretty cool with the idea. And
00:23:58.461 --> 00:24:01.821
the more I talked about it, the more I
decided that I would go ahead and take
00:24:01.821 --> 00:24:06.329
that chance. So I did. I started by
teaching at some Python workshops for
00:24:06.329 --> 00:24:12.230
women. I spoke at some conferences. We
went to PyCon. It was good. The education
00:24:12.230 --> 00:24:16.029
summit was fine. Okay. Some of the people
I worked with in organizing it were a
00:24:16.029 --> 00:24:19.917
little bit confused when the names on the
emails changed. I apologize, but in
00:24:19.917 --> 00:24:25.533
general it went pretty well. In fact, the
more open I was, the easier it was. It
00:24:25.533 --> 00:24:29.429
was easier for me because I didn't have to worry
about being outed. And it was easier for
00:24:29.429 --> 00:24:33.272
other people because they certainly knew
what to expect. The other interesting
00:24:33.272 --> 00:24:37.297
sidelight is that when I told people they
sometimes felt an obligation to share some
00:24:37.297 --> 00:24:41.728
deep, dark secret about themselves, like I
kind of trumped them and they had to answer
00:24:41.728 --> 00:24:50.267
back. So my takeaway here is that we talk
a lot about diversity and that's real. So
00:24:50.267 --> 00:24:56.818
we should be ending on this point, except
that I'm a contrarian in my old age.
00:24:56.818 --> 00:25:02.816
So it is not quite all rainbows and
unicorns or as you might put it, this is
00:25:02.816 --> 00:25:09.441
kind of common in social justice circles
right now. We don't get a cookie.
00:25:09.441 --> 00:25:11.621
Video ends
Paula: All right. And yeah, yeah,
00:25:11.621 --> 00:25:13.940
Paula and Biella are applauding
Biella: She's a very powerful speaker.
00:25:13.940 --> 00:25:19.246
Paula: Exactly. And I guess we could also
say that, as the next slide shows,
00:25:19.246 --> 00:25:23.997
after the entry by Christina Dunbar-
Hester, Naomi Ceder actually gave a response
00:25:23.997 --> 00:25:27.242
to this entry, which we've also published.
This is something we also want to do: have
00:25:27.242 --> 00:25:32.353
a discussion between the entries and some of
the responses to them.
00:25:32.353 --> 00:25:35.188
Biella: So we actually wanted to quote it
in full.
00:25:35.188 --> 00:25:40.771
Paula: Yeah, exactly. So perhaps let's
read, let's read this section from
00:25:40.771 --> 00:25:47.416
the response of Naomi Ceder. PyCon itself
has continued to evolve into an ever more
00:25:47.416 --> 00:25:51.436
diverse place with an ever stronger
representation of queer folks, people of
00:25:51.436 --> 00:25:55.511
color, people who speak different
languages, etc. Codes of conduct are
00:25:55.511 --> 00:26:00.615
nearly universal these days, and more
often than not, communities insist that
00:26:00.615 --> 00:26:05.317
they be well crafted and meaningful and
backed up by real enforcement. Even in
00:26:05.317 --> 00:26:10.168
these retrograde times of official attacks
on the rights of so many groups, we have
00:26:10.168 --> 00:26:15.110
come a long way. But just as I said five
years ago, it's still not all rainbows and
00:26:15.110 --> 00:26:20.113
unicorns. Too many groups throughout the
open source world globally are making only
00:26:20.113 --> 00:26:24.469
token efforts to foster inclusion. And in
my opinion, too many members of privileged
00:26:24.469 --> 00:26:29.204
groups tend to focus on superficial or
cosmetic changes rather than addressing
00:26:29.204 --> 00:26:33.794
the underlying fundamental issues
marginalized groups face. It doesn't take
00:26:33.794 --> 00:26:39.412
a bit away from how far we've come to also
acknowledge how much we still have to do.
00:26:39.412 --> 00:26:43.503
Naomi Ceder. So with this part we really wanted
to discuss the way in which
00:26:43.503 --> 00:26:48.382
hacking is also a practice of challenging
power, challenging stereotypes and
00:26:48.382 --> 00:26:52.276
challenging really gender norms in many
ways. All right, let's move on.
00:26:52.276 --> 00:26:57.083
Biella: All right. So the final frontier.
We have three more videos to show before
00:26:57.083 --> 00:27:03.517
we get to the Q and A. All videos
relate to geopolitics and hacking. You
00:27:03.517 --> 00:27:08.482
know, hacking has always been political in
some fashion, if for no other reason than
00:27:08.482 --> 00:27:12.658
sometimes laws are challenged. You're
doing what you're doing, something
00:27:12.658 --> 00:27:16.520
that someone doesn't want you to do.
Right. But there's only been certain
00:27:16.520 --> 00:27:22.151
moments, where nation states have been
interested in hacking or there have been
00:27:22.151 --> 00:27:26.755
sort of ways in which nation states have
used hacking. For example, recently in
00:27:26.755 --> 00:27:32.699
order to kind of engage in international
politics. So we're going to kind of focus
00:27:32.699 --> 00:27:39.838
on these last, the last three videos will
focus on these issues. We're at the CCC.
00:27:39.838 --> 00:27:44.667
So of course, I wanted to show a video
related to CCC. Unfortunately, I don't have
00:27:44.667 --> 00:27:52.619
one related to the German CCC. Please do
send good videos related to the CCC to me.
00:27:52.619 --> 00:27:59.665
But I am going to show one related to the
FCCC established in Lyon by Jean-Bernard
00:27:59.665 --> 00:28:05.926
Condat. So do people know what the F
stands for? All right. What
00:28:05.926 --> 00:28:08.419
does it stand for?
One Auditor: French? Biella: French.
00:28:08.419 --> 00:28:16.442
OK. Once you see the video. Oh no.
Hold on. You will also see that it stands
00:28:16.442 --> 00:28:23.583
for fake and fuck as well, because
basically the French chapter of the CCC
00:28:23.583 --> 00:28:30.070
was established in part to try to entrap
hackers in order to kind of work for the
00:28:30.070 --> 00:28:35.529
French government. It's a fascinating
story that's been told in bits and pieces
00:28:35.529 --> 00:28:39.098
and I'm going to say a little bit more
about it. But now I'm going to show a clip
00:28:39.098 --> 00:28:43.928
from a French documentary that kind of,
you know, charts a little bit of that
00:28:43.928 --> 00:28:47.025
history. It's in French with subtitles.
00:28:47.025 --> 00:31:42.793
Video is in progress
00:31:42.793 --> 00:31:50.829
Biella: OK. So pretty incredible, right?
And this story has been told in bits and
00:31:50.829 --> 00:31:54.489
pieces by French journalists. I'm
working with another French journalist to
00:31:54.489 --> 00:32:00.823
try to kind of uncover the fuller history,
as well as tell the story of kind of American
00:32:00.823 --> 00:32:04.616
and European hackers who did not get
recruited by intelligence, but who
00:32:04.616 --> 00:32:09.244
nevertheless came from the underground,
because they were breaking into systems,
00:32:09.244 --> 00:32:13.661
not maliciously, but they learned a lot
and they had really valuable knowledge
00:32:13.661 --> 00:32:21.661
that no one else had. I mean, it's kind of
really incredible, right? And, you know,
00:32:21.661 --> 00:32:25.658
this history, whether it's just that the
transformation of the underground into
00:32:25.658 --> 00:32:31.065
security hackers or in the case of France,
where some portion of people were tapped
00:32:31.065 --> 00:32:37.907
to work for intelligence, informally or
formally, with pressure, right, has yet to
00:32:37.907 --> 00:32:43.199
be written. And there's many remarkable
elements about this. But basically, I do
00:32:43.199 --> 00:32:48.271
think it's remarkable that it's a bunch of
kind of amateurs who just were obsessed
00:32:48.271 --> 00:32:53.139
with networks who were the ones
holding the special knowledge
00:32:53.139 --> 00:32:57.732
that was needed by corporations
and intelligence in order to start
00:32:57.732 --> 00:33:02.635
securing systems. Right. The other kind of
really interesting thing is that some of
00:33:02.635 --> 00:33:09.935
the best underground non-malicious hacker
crews were European. TESO, which had a lot
00:33:09.935 --> 00:33:18.297
of Austrian and German members. ADM, which
is from France, was considered to be the
00:33:18.297 --> 00:33:23.239
best at exploit writing. Right. So the
entry, which I'm going to write with a
00:33:23.239 --> 00:33:26.764
French journalist is going to reflect on
this. And this is actually a big project
00:33:26.764 --> 00:33:32.253
that I'm working on as well. So I'll have
more to say about it later. All right. So
00:33:32.253 --> 00:33:35.741
going from the past to the present.
Paula: Exactly. And I guess we couldn't
00:33:35.741 --> 00:33:41.351
talk about politics and hacking without
talking about Trump, talking about Putin.
00:33:41.351 --> 00:33:45.915
A slew of politicians, as we know, in
recent years have used the hacker for their
00:33:45.915 --> 00:33:51.255
own political discourse, for some kind of
political gain. And this next video
00:33:51.255 --> 00:33:57.818
will show us just that. This is our hacker
depictions section. It was posted by a
00:33:57.818 --> 00:34:01.733
scholar named Marietta Brezovich. So
without further ado, let's listen to the
00:34:01.733 --> 00:34:04.681
way in which Putin sees the hacker.
00:34:04.681 --> 00:35:06.412
Video is in Progress
00:35:06.412 --> 00:35:15.008
Paula: So I don't know if Putin was reading
a Russian Hacker for the night. Biella: best image
00:35:15.008 --> 00:35:19.663
of the night. Possibly. I don't know.
Paula: We weren't sure if Putin is reading
00:35:19.663 --> 00:35:23.929
Paul Graham's Hackers & Painters on the
toilet or some of his other Hacker
00:35:23.929 --> 00:35:26.449
cultures literature. But it seems like
he's getting something right. Right. We
00:35:26.449 --> 00:35:30.196
kind of think, hey, you kind of got it.
It's not hackers actually.
00:35:30.196 --> 00:35:32.932
Biella: Well, except for one part.
Paula: Exactly. That's what we want to
00:35:32.932 --> 00:35:37.288
say. In some ways, yes. It's true. The
hackers are artistic and creative, etc.
00:35:37.288 --> 00:35:38.960
Biella: They just don't wake up early
in the morning.
00:35:38.960 --> 00:35:41.747
Paula: Exactly. Maybe they don't wake up
early in the morning. But what's
00:35:41.747 --> 00:35:46.065
important, I think in here and this is
also what Brezovich points out in her
00:35:46.065 --> 00:35:51.566
entry, is that he uses this, of course,
for his political gain to show that he is
00:35:51.566 --> 00:35:55.630
not influencing any hackers or any
technologists, who maybe identify as
00:35:55.630 --> 00:36:02.041
hackers or not. He's not influencing them.
And because they are so free and artistic
00:36:02.041 --> 00:36:06.422
and sort of living in their sort of creative
world, they're beyond his control.
00:36:06.422 --> 00:36:10.451
Right? So partially it's true. But
partially he's employing this to
00:36:10.451 --> 00:36:15.485
make a political statement about his
non-involvement with any sort of role.
00:36:15.485 --> 00:36:18.321
Biella: And what's interesting is all
evidence points to the fact that the
00:36:18.321 --> 00:36:23.203
technologists who did the hacking just
work at intelligence organizations.
00:36:23.203 --> 00:36:25.127
Paula: Exactly.
Biella: All right. So we just have one more
00:36:25.127 --> 00:36:31.195
video and we'll end on a positive note.
Right? A lot of stuff around hackers is
00:36:31.195 --> 00:36:34.389
sometimes depressing, especially when it
comes to the law. They get arrested, they
00:36:34.389 --> 00:36:36.985
get thrown in jail. They commit suicide.
Paula: True.
00:36:36.985 --> 00:36:40.826
Biella: And so we want to showcase a video
that covers British and Finnish hacker
00:36:40.826 --> 00:36:45.071
Lauri Love, who's presented here at the
CCC. Some of you may know that he faced
00:36:45.071 --> 00:36:50.127
extradition to the United States due to
his alleged involvement with an Anonymous
00:36:50.127 --> 00:36:57.885
operation called #OpLastResort, which was
kind of in support of Aaron Swartz, who
00:36:57.885 --> 00:37:01.912
had committed suicide when he was facing
many criminal charges. And we'll watch a
00:37:01.912 --> 00:37:04.710
clip where parliamentarians and others
debate his case.
00:37:04.710 --> 00:37:08.299
Video proceeds
00:37:08.299 --> 00:37:10.615
A young man with Asperger's syndrome
00:37:10.615 --> 00:37:16.617
awaits extradition to the United States,
facing charges of computer hacking and is
00:37:16.617 --> 00:37:21.327
then likely to kill himself. It sounds
familiar. He's not, of course, Gary
00:37:21.327 --> 00:37:25.913
McKinnon, who is saved by the prime
minister. But Lauri Love, who faces, in
00:37:25.913 --> 00:37:30.212
a fact, a death sentence. So when the
prime minister introduced the form above
00:37:30.212 --> 00:37:34.409
to, in her words, provide greater
safeguards for individuals, surely she
00:37:34.409 --> 00:37:40.738
expensed it to protect the vulnerable,
like Gary McKinnon, like Lauri Love. The
00:37:40.738 --> 00:37:44.715
honorable gentleman. My honorable friend
obviously campaigned long and hard for
00:37:44.715 --> 00:37:48.768
Gary McKinnon. And obviously I took that
decision because at that time it was a
00:37:48.768 --> 00:37:52.949
decision for the home secretary to decide
whether there was a human rights case for
00:37:52.949 --> 00:37:57.453
an individual not to be extradited. We
subsequently changed the legal position on
00:37:57.453 --> 00:38:01.148
that. So this is now a matter for the
courts. There are certain parameters that
00:38:01.148 --> 00:38:04.481
the courts look at in terms of the
extradition decision, and that is then
00:38:04.481 --> 00:38:07.848
passed to the home secretary. But it is
for the courts to determine the human
00:38:07.848 --> 00:38:12.702
rights aspects of any case that comes
forward. It was right, I think, to
00:38:12.702 --> 00:38:19.519
introduce the forum bar, to make sure that
there was that challenge for cases here in
00:38:19.519 --> 00:38:22.930
the United Kingdom, as to whether they
should be held here in the United Kingdom.
00:38:22.930 --> 00:38:26.496
But the legal process is very clear and
the home secretary is part of that legal
00:38:26.496 --> 00:38:32.000
process.
Biella: OK, so the author of the entry,
00:38:32.000 --> 00:38:38.612
Naomi Colvin, is right there in front.
And she has a great sentence which
00:38:38.612 --> 00:38:42.631
says: in Lauri Love, the U.S. had
definitively chosen the wrong target.
00:38:42.631 --> 00:38:47.609
Principled, passionate and articulate,
certainly more articulate than Theresa May
00:38:47.609 --> 00:38:53.267
herself in the clip which accompanies this
article, Love versus USA would be one for
00:38:53.267 --> 00:38:58.176
the underdog. And it was Love one. He's
not being extradited. And in part, it was
00:38:58.176 --> 00:39:05.107
also because Naomi Colvin was part of the
team that stopped it. So let's thank Naomi
00:39:05.107 --> 00:39:15.849
as well. Applause And it's just
really important to document some of the
00:39:15.849 --> 00:39:20.513
wins every once in a while. So do check.
Check that out. So we are now going to
00:39:20.513 --> 00:39:24.826
wrap up so that there's going to be 10
minutes for Q and A, but a few final
00:39:24.826 --> 00:39:31.309
reflections about this project.
Paula: So I think these videos show actual
00:39:31.309 --> 00:39:35.835
hackers and hacking, and at a more meta
level demonstrate how hackers have
00:39:35.835 --> 00:39:41.362
become central to our popular imagination.
How hackers and hacking are one medium to
00:39:41.362 --> 00:39:44.843
think through digital cultures, to think
through politics. I mean, we care about
00:39:44.843 --> 00:39:49.107
culture. We care about representing
digging deep, looking at various angles of
00:39:49.107 --> 00:39:52.630
a certain culture. And I think that's
where I see the purpose
00:39:52.630 --> 00:39:57.511
of Biella's and mine, and Chris's, and our friends'
project: we really want to take
00:39:57.511 --> 00:40:02.059
the work that we've been doing and really
pay tribute to this really huge, diverse
00:40:02.059 --> 00:40:06.868
community that that you are.
Biella: On a more practical level being a
00:40:06.868 --> 00:40:13.323
little less meta. We do hope that people
assign hack_curio entries in their
00:40:13.323 --> 00:40:16.756
courses. You could use them in high
school. You can use them in college
00:40:16.756 --> 00:40:20.840
classes. You know, heck, you know, maybe
you could even use them in middle school,
00:40:20.840 --> 00:40:26.455
elementary. I don't know if that will
work. But get it out there. And also for
00:40:26.455 --> 00:40:30.112
some of you, I think it will be fun to
look at different tidbits of hacker
00:40:30.112 --> 00:40:35.405
history. And when you're at home for the
holidays before you come to the CCC and
00:40:35.405 --> 00:40:41.051
you're like, man, my parents don't really
understand what I do. So you could fire up
00:40:41.051 --> 00:40:44.707
a video that kind of represents what you
do and fire up another video that
00:40:44.707 --> 00:40:49.855
represents what you don't do.
Paula: And have a discussion. Haven't
00:40:49.855 --> 00:40:51.740
especially.
Biella: And so this is our last slide.
00:40:51.740 --> 00:40:58.594
What next? The site is live. Share it. Our
Twitter address is up there. We consider
00:40:58.594 --> 00:41:03.834
this a soft launch. We have 42 entries,
but we'll get some feedback and tweak
00:41:03.834 --> 00:41:09.237
things, send video suggestions, spread the
word. And to end before Q and A. We just
00:41:09.237 --> 00:41:13.476
really want to thank the CCC. We want to
thank Lisa for having us here. This is
00:41:13.476 --> 00:41:18.830
really an amazing place to launch. And we
also want to thank everyone who made this
00:41:18.830 --> 00:41:24.242
possible from funding to the authors to
the entire hack_curio team. So thank you
00:41:24.242 --> 00:41:39.390
so much. And we're here for a little Q &
A. Applause
00:41:39.390 --> 00:41:43.158
Herald: Thanks a lot for this beautiful
talk. We are now open for the
00:41:43.158 --> 00:41:46.738
question mics. If there are any questions
from the audience, please just step up to
00:41:46.738 --> 00:41:58.816
one of the mics. Paula: Don't be shy. Herald: Nobody is
more interested in hacking culture? Are
00:41:58.816 --> 00:42:00.962
you overwhelmed?
Paula: Someone.
00:42:00.962 --> 00:42:03.621
Herald: Yeah. There's someone on mic 1.
Please.
00:42:03.621 --> 00:42:07.552
Mic1: Thank you for this talk and for
the energy that was in your talk. It was just
00:42:07.552 --> 00:42:15.453
amazing! I have one question to ask. What
were, like, the most
00:42:15.453 --> 00:42:20.299
surprising moments for you in this, like,
research journey?
00:42:20.299 --> 00:42:28.318
Paula: OK, that's a good question.
Biella: I mean. In terms of the
00:42:28.318 --> 00:42:35.760
project, you know, collaborating with
others and building a Web site is very
00:42:35.760 --> 00:42:39.681
different than what academics often do,
where we do often have to rely on
00:42:39.681 --> 00:42:44.183
ourselves and we get feedback. You know
what I mean? And I think it does give a
00:42:44.183 --> 00:42:50.623
sense of the really beautiful relations
that form, where you go back and forth
00:42:50.623 --> 00:42:55.411
with an author, with a web developer. You
know, it really does give you a sense of
00:42:55.411 --> 00:43:00.182
the deep social ties that we do have as
academics. But I think it's much, much
00:43:00.182 --> 00:43:06.552
deeper with hackers. That's one thing. But
I do think I mean, I am frustrated as an
00:43:06.552 --> 00:43:11.857
academic, where a lot of people do have
very, very, very narrow conceptions of
00:43:11.857 --> 00:43:16.241
hackers. It's not a perfect world. And
there's a lot which, you know, we can
00:43:16.241 --> 00:43:21.745
change. It is very clear also that as
academics, we weren't necessarily changing
00:43:21.745 --> 00:43:28.519
perceptions so much. And this project was
an effort to finally do that. It's like
00:43:28.519 --> 00:43:32.785
see them like stop listening or reading
just my words, because obviously that's
00:43:32.785 --> 00:43:37.590
not really changing that, you know, so
come see some of the videos. Yeah.
00:43:37.590 --> 00:43:42.279
Paula: Yeah. And I guess for me, I also
mean, if you work in your own little
00:43:42.279 --> 00:43:46.692
bubble and you work in your own little
corner, just in any type of science, you
00:43:46.692 --> 00:43:50.403
don't see as much of what's going on out
there. And for me, the whole
00:43:50.403 --> 00:43:55.084
definition of what it is to hack what a
hacker actually is, you start really
00:43:55.084 --> 00:43:59.939
opening your eyes when you see, wow,
there's fifty, a hundred other scholars out
00:43:59.939 --> 00:44:03.843
there that actually think that a
hacker is this or hackers are that. And I
00:44:03.843 --> 00:44:07.588
think for me, that opened my eyes up
really saying, hey, well, this is what you
00:44:07.588 --> 00:44:14.286
think it means. So interesting, you know.
Herald: Thank you. Now a question from
00:44:14.286 --> 00:44:16.854
mic 2, too, please.
Mic2: Hi, thank you for the talk. It was
00:44:16.854 --> 00:44:22.061
very enlightening. I have two questions.
The first one would be, could you tell us
00:44:22.061 --> 00:44:29.137
maybe a bit more about the server and
infrastructure you are using or are you just
00:44:29.137 --> 00:44:34.642
linking YouTube videos? And the second one
would be, how would you envision future
00:44:34.642 --> 00:44:39.724
engagement with students? Because I'm
teaching a course for computer scientists
00:44:39.724 --> 00:44:44.773
undergrads. And we did something
similar around movies and descriptions
00:44:44.773 --> 00:44:50.444
that they have to make around hacker
movies. And they don't really learn how to
00:44:50.444 --> 00:44:56.217
reflect on social issues a lot in their
studies. So I wonder how this could
00:44:56.217 --> 00:45:02.363
be integrated into the platform and how
you could engage students further?
00:45:02.363 --> 00:45:05.953
Biella: So great questions. I mean, first
of all, for the Web site, it runs on
00:45:05.953 --> 00:45:13.625
WordPress. It just seemed like an easy way
to, like, hack it up for this sort of
00:45:13.625 --> 00:45:19.039
thing. And we hired actually a master
student from my department at McGill
00:45:19.039 --> 00:45:27.993
University. Thanks to all. You're awesome.
And then we're hosting the videos on Vimeo
00:45:27.993 --> 00:45:31.559
and they come from all sorts of different
places. That's actually not the best or
00:45:31.559 --> 00:45:36.896
the most ideal solution. And so far as
like, you know, who knows if Vimeo is
00:45:36.896 --> 00:45:41.585
going to exist in 15 years? Right.
Internet Archive. We looked into them and
00:45:41.585 --> 00:45:46.050
they were kind of like psyched about it,
that it was going to be slower to deliver
00:45:46.050 --> 00:45:54.028
the video. Right? So maybe if the project
grows, we can at a certain point host our
00:45:54.028 --> 00:46:00.005
own videos. Right? But like we'll have to
sort of graduate there at the next level.
00:46:00.005 --> 00:46:05.912
The entries are all going to be Creative
Commons, and we're using clips where we then
00:46:05.912 --> 00:46:11.543
cite the entire clip and where it came
from. We consider this fair use, for
00:46:11.543 --> 00:46:16.147
those that may be wondering. And so we'll
see how that goes.
00:46:16.147 --> 00:46:19.647
Paula: And for the second, I guess I could
take the second question. Whenever, I
00:46:19.647 --> 00:46:23.467
mean, my students are not... they're digital
media students. They're not from computing
00:46:23.467 --> 00:46:29.079
science. But if you ever even try to touch
on something around culture or
00:46:29.079 --> 00:46:34.030
something, maybe real social science,
always, I think, ask how power really...
how do these people relate to power?
how these people relate to power? How did
they relate to critique? How do they use
00:46:37.892 --> 00:46:41.814
these tools to critique something? And I
think all of these videos and maybe even
00:46:41.814 --> 00:46:46.411
the videos that your students chose, if
they just asked that question, whether
00:46:46.411 --> 00:46:49.915
they're studying computing science,
whether they're studying geography or
00:46:49.915 --> 00:46:54.176
whatever it is, if they look at it from a
form of power and how it's contested, I
00:46:54.176 --> 00:46:58.479
think that that's a way in which they they
really can engage into a certain topic
00:46:58.479 --> 00:47:04.593
really deeply. That's cool. There's a nice
little text by Foucault called
00:47:04.593 --> 00:47:09.158
'What is Critique?'. That's it. I use it for
my students that are maybe not cultural
00:47:09.158 --> 00:47:13.798
studies students or whatever. And it's
a nice little text to start with.
00:47:13.798 --> 00:47:16.223
Herald: Thank you. One more question from
00:47:16.223 --> 00:47:19.763
mic 2, please.
Mic2: So thank you again. And I wanted
00:47:19.763 --> 00:47:25.618
to ask you, because I looked at the videos
on the site and I see a lot of stories of
00:47:25.618 --> 00:47:30.779
single people and I'm quite surprised to
find very few stories of communities
00:47:30.779 --> 00:47:36.733
and showcases of hacker spaces. And a lot
of researchers I've spoke about are
00:47:36.733 --> 00:47:42.492
actually focusing on like how communities
work. So was there any conscious decision
00:47:42.492 --> 00:47:48.200
that you want to tell singular
people, singular person stories instead of
00:47:48.200 --> 00:47:53.334
like communities?
Biella: First of all, it's a great piece
00:47:53.334 --> 00:47:57.056
of feedback because I mean, one of the
things as an anthropologist that I've
00:47:57.056 --> 00:48:01.969
always loved about the hacker world is on
the one hand, you know, people often talk
00:48:01.969 --> 00:48:07.514
about rights that are tied to notions
of individualism. But hacking is so
00:48:07.514 --> 00:48:12.829
collectivist. Right. I mean, look at the
CCC. I mean, you can't have a better
00:48:12.829 --> 00:48:17.900
example of a kind of ritual, collective,
effervescent experience, hacker spaces.
00:48:17.900 --> 00:48:23.059
Right. So I do think it's really important
to try to showcase that. And we do, we do
00:48:23.059 --> 00:48:29.295
have videos around hacker spaces and
they're being written up like the authors
00:48:29.295 --> 00:48:33.806
are writing about them now. But if that's
not coming through the sites, we actually
00:48:33.806 --> 00:48:39.869
need to write. But it does show, I mean,
one of the problems with video, and
00:48:39.869 --> 00:48:44.104
we will reflect on this, is that on the one
hand, while you could put a face to
00:48:44.104 --> 00:48:48.980
hacking, which is great, it's like it's
not the hooded person, video has its own
00:48:48.980 --> 00:48:54.308
limits. Right? Often it's an individual.
It's often what journalists are interested
00:48:54.308 --> 00:48:59.113
in. And we also have to make sure that
this isn't the whole of hacking and also
00:48:59.113 --> 00:49:05.194
at times use the video to tell a different
story than what the video is showing. So I
00:49:05.194 --> 00:49:08.632
think that's a great comment. And we're
going to keep that in mind because to me,
00:49:08.632 --> 00:49:13.026
the collectivist, community part of
hacking is one of the most amazing parts
00:49:13.026 --> 00:49:16.315
that never makes it into kind of
mainstream representation.
00:49:16.315 --> 00:49:18.591
Paula: That's right. Thank you.
Herald: Thank you.
00:49:18.591 --> 00:49:21.458
Herald: Then we have a question from the
00:49:21.458 --> 00:49:29.491
Internet. First Internet.
Biella: Internet. Tell us. Talk to us.
00:49:29.491 --> 00:49:35.746
Signal Angel: The question from the Internet is:
when covering international scenes, do zines
00:49:35.746 --> 00:49:42.125
like Phrack magazine get used as source material?
Biella: Is Phrack magazine a source?
00:49:42.125 --> 00:49:45.105
Signal Angel: Yeah.
Biella: Yeah. I mean, Phrack magazine.
00:49:45.105 --> 00:49:52.341
Remember the video that I showed around
the fake French CCC? That is a larger
00:49:52.341 --> 00:49:59.292
project around how parts of the
underground went pro and started doing
00:49:59.292 --> 00:50:06.196
security work. And Phrack is amazing. I
mean, Phrack tells so much of that story.
00:50:06.196 --> 00:50:10.495
And what is also so interesting, because
I've done like almost 26 interviews, in-
00:50:10.495 --> 00:50:15.752
depth interviews on this. And like
you'd expect in many hacker circles,
00:50:15.752 --> 00:50:20.514
there's a lot of diversity of opinions.
And the one thing that people agree on was
00:50:20.514 --> 00:50:25.349
that like Phrack was awesome, technically.
And it brought very different types of
00:50:25.349 --> 00:50:32.171
people together. You know, Phrack hasn't
come up in the video because it's one of
00:50:32.171 --> 00:50:37.487
these things that hasn't been documented.
Right? So much in documentaries or film.
00:50:37.487 --> 00:50:41.158
And again, it points to that problem,
which is on the one hand, we're trying to
00:50:41.158 --> 00:50:45.888
show the faces of hacking. But we also
have to make very, very clear, that
00:50:45.888 --> 00:50:50.816
there's certain parts of hacker history
that don't exist in video and don't take
00:50:50.816 --> 00:50:56.683
this as the definitive sort of word or
record.
00:50:56.683 --> 00:50:59.682
Herald: Now the question from microphone
2, please.
00:50:59.682 --> 00:51:06.119
Mic2: Hi, I'm... I was wondering,
whether you plan to expand your
00:51:06.119 --> 00:51:15.106
categories, if I didn't miss anything,
to something, for example, as in my PhD:
00:51:06.119 --> 00:51:15.106
examples of hacking connected with biology,
genetics and digital fabrication,
00:51:21.581 --> 00:51:23.716
neuro-hacking and so on.
Biella: Ja.
00:51:23.716 --> 00:51:30.873
Mic2: I hear that the CCC does a track
dedicated to science that I think is
00:51:30.873 --> 00:51:35.343
somehow related. Thanks.
Biella: Great. Yeah. So if I recall
00:51:35.343 --> 00:51:40.945
correctly, I think we have 11 categories
and we absolutely are expanding and like
00:51:40.945 --> 00:51:45.209
bio hacking is one that we want to include
because actually, you know, hackers are
00:51:45.209 --> 00:51:49.767
like creating insulin in the context of
the United States, where insulin is
00:51:49.767 --> 00:51:54.607
ridiculously expensive, like some of the
most important hacking I think is
00:51:54.607 --> 00:52:00.291
happening. So we're absolutely going to
expand by a handful. We also don't want to
00:52:00.291 --> 00:52:07.341
go much more beyond 15 or 18. And one of
the ways that we're also then handling
00:52:07.341 --> 00:52:14.114
that is that each entry comes with tags
and then there's gonna be other groupings
00:52:14.114 --> 00:52:20.697
around tags. But it's certainly I mean,
what you've seen is live. It's alive, it's
00:52:20.697 --> 00:52:25.223
alive, but it's also very much beta, you
know.
00:52:25.223 --> 00:52:29.114
Paula: And if you've also written
on this topic and you have an interesting
00:52:29.114 --> 00:52:33.970
video, please email us, send it over. We'd
be really interested to hear about your
00:52:33.970 --> 00:52:38.705
research. Yeah. Yeah.
Herald: And then we have another question
00:52:38.705 --> 00:52:40.955
on mic 1, please.
Mic1: Thank you. Thank you.
00:52:40.955 --> 00:52:47.085
My question is for Biella.
And it's about: would you say that the
00:52:47.085 --> 00:52:55.361
work you've done on Anonymous affected the way
you engage with working with video
00:52:55.361 --> 00:53:03.689
after going deep into seeing, how
Anonymous uses video as a medium to engage
00:53:03.689 --> 00:53:07.417
with the public as compared to other
activist groups who are way less
00:53:07.417 --> 00:53:10.549
successful in that?
Biella: That's great. I mean, that is
00:53:10.549 --> 00:53:15.838
definitely, you know, I on the one hand
always use video in my class. And it's not
00:53:15.838 --> 00:53:20.010
just like hackers. You know, if I'm
talking about Martin Luther King and
00:53:20.010 --> 00:53:26.819
something he said, I will show a video of
what he said. Because having me repeat it
00:53:26.819 --> 00:53:32.661
versus having MLK on the screen is a lot
more persuasive. And we are in a moment
00:53:32.661 --> 00:53:37.045
where truth is not winning the game and we
have to think about our game of
00:53:37.045 --> 00:53:41.142
persuasion. Right? That's just this is a
kind of side project. But you're
00:53:41.142 --> 00:53:46.154
absolutely right. It was also Anonymous
who used so many videos. Right. In a
00:53:46.154 --> 00:53:50.510
period where, sure, others had to use
videos. But it was groups like, for
00:53:50.510 --> 00:53:57.325
example, Indymedia who's turned 20 this
year, who took videos of the world around
00:53:57.325 --> 00:54:02.524
us, whereas Anonymous created videos as a
means for persuasion. And it was very
00:54:02.524 --> 00:54:08.526
powerful at the time. And I am... I am
inspired to think about how we can think
00:54:08.526 --> 00:54:15.762
about persuasive mediums in all contexts
in order to get our message out. Because
00:54:15.762 --> 00:54:21.937
again, we're not always winning in this
regard. Truth can never speak on its own,
00:54:21.937 --> 00:54:28.733
right? And we always need adjuncts and
adjuvants in order to get truth's message
00:54:28.733 --> 00:54:33.294
out there. And certainly it was Anonymous,
in part, that helped me see the
00:54:33.294 --> 00:54:40.001
importance of video in a new way. So
I'm really glad you mentioned that.
00:54:40.001 --> 00:54:44.435
Herald: Thank you. And then we have
another question from the Internet.
00:54:44.435 --> 00:54:48.311
Signal-engel: Yeah, and the next question
from the Internet is: how will you select
00:54:48.311 --> 00:54:53.514
the right curators for the entries
and how do they decide how they are
00:54:53.514 --> 00:54:58.851
presented and contextualized?
Biella: All right. So, I mean, I've been
00:54:58.851 --> 00:55:05.350
working on hacker cultures since 1998.
Paula: My journey has been a
00:55:05.350 --> 00:55:08.703
little bit shorter, but also about 10
years or so.
00:55:08.703 --> 00:55:14.396
Biella: Yeah. And so I know a lot of
people working on different topics. And
00:55:14.396 --> 00:55:20.281
for the first round, we invited people.
And it wasn't just academics. I've
00:55:20.281 --> 00:55:24.270
gotten journalists, and hackers are writing
some entries as well. But it's just
00:55:24.270 --> 00:55:28.885
a little bit harder to get
them to turn in their entries. But
00:55:28.885 --> 00:55:33.221
hopefully they will, because, again, it's
not just who's been credentialed to
00:55:33.221 --> 00:55:39.864
talk about a topic. It's who knows about a
topic, who has something to say and who's
00:55:39.864 --> 00:55:45.037
willing to go through the editing process.
Because while journalists generally don't
00:55:45.037 --> 00:55:48.813
have to go through multiple edits because
you all just really know how to write for
00:55:48.813 --> 00:55:55.084
the public, everyone else actually does
struggle a little bit. And we do really
00:55:55.084 --> 00:56:00.119
try to get the entries written in such a
way that we're presuming, you know,
00:56:00.119 --> 00:56:06.370
nothing about hackers or the video. It's
not always easy, then, to write an entry
00:56:06.370 --> 00:56:12.045
that starts from that low
level. And then in terms of the
00:56:12.045 --> 00:56:19.431
contextualization, that's where we have
three editors and curators. And I would
00:56:19.431 --> 00:56:26.010
actually even say four, because of our final
editor, Matt Goerzen. He was an M.A.
00:56:26.010 --> 00:56:30.554
student under me. He's doing a big project
on security and hacking with me at Data &
00:56:30.554 --> 00:56:35.760
Society. He knows a ton. And it's
precisely having many eyeballs on one
00:56:35.760 --> 00:56:43.047
entry that allows us to hopefully
contextualize it properly. But, you know,
00:56:43.047 --> 00:56:47.816
again, if something seems off, people
should email us. And again, we're also
00:56:47.816 --> 00:56:53.271
open to responses from the community as
well; we already have one response, from
00:56:53.271 --> 00:56:58.305
Naomi. But, you know, perhaps that will
grow into something larger.
00:56:58.305 --> 00:57:02.528
Paula: So when you ask why it's us
that are curating, who's curating, really,
00:57:02.528 --> 00:57:07.161
it's just the three of us that are doing
this. And what kind of speaking position are
00:57:07.161 --> 00:57:10.398
we coming from? I mean, we're
anthropologists of hacker cultures. What
00:57:10.398 --> 00:57:14.706
does that mean? Maybe for you guys it
doesn't mean much, or it means a lot. But
00:57:14.706 --> 00:57:18.607
really, we've studied you guys for a
long time.
00:57:18.607 --> 00:57:22.993
Biella: Yes. But it's also cool
because, well, except for Paula,
00:57:22.993 --> 00:57:29.213
I mean, Chris and I have tenure,
and that may mean nothing to you all. But,
00:57:29.213 --> 00:57:34.937
you know, hackers care about freedom and
free speech, and tenure allows you to be
00:57:34.937 --> 00:57:36.956
free.
Paula: I have tenure now.
00:57:36.956 --> 00:57:41.247
Biella: Oh, you do? Sweet. We are all free
to kind of do what we want in interesting
00:57:41.247 --> 00:57:45.642
ways. And again, we're trying to
experiment with mediums that go a little
00:57:45.642 --> 00:57:50.402
bit beyond the academic journal, which I'm
totally behind. I think there are really
00:57:50.402 --> 00:57:53.165
good things about the academic journal. I
think there are really good things about the
00:57:53.165 --> 00:58:00.216
book. But we have the freedom to
experiment with new mediums. And so
00:58:00.216 --> 00:58:04.688
hopefully this new medium will
reach different types of publics in a
00:58:04.688 --> 00:58:12.429
way that academic journal articles
will never reach.
00:58:12.429 --> 00:58:17.909
Herald: Are there any more questions?
Paula: Party. Party.
00:58:17.909 --> 00:58:21.475
Herald: It doesn't look like it. So I
would like to invite you for another round
00:58:21.475 --> 00:58:22.664
of applause for Biella and Paula.
00:58:22.664 --> 00:58:23.523
Applause
00:58:23.523 --> 00:58:26.075
Biella and Paula: Thank you guys, thank
you so much.
00:58:26.075 --> 00:58:34.648
36C3 Postroll music
00:58:34.648 --> 00:58:44.259
Subtitles created by c3subtitles.de
in the year 2020. Join, and help us!