The nightmare videos of children's YouTube -- and what's wrong with the internet today
0:01 - 0:02 I'm James.
0:02 - 0:04 I'm a writer and artist,
0:04 - 0:06 and I make work about technology.
0:06 - 0:10 I do things like draw life-size outlines of military drones
0:10 - 0:12 in city streets around the world,
0:12 - 0:15 so that people can start to think and get their heads around
0:15 - 0:19 these really quite hard-to-see and hard-to-think-about technologies.
0:19 - 0:23 I make things like neural networks that predict the results of elections
0:23 - 0:25 based on weather reports,
0:25 - 0:26 because I'm intrigued about
0:26 - 0:30 what the actual possibilities of these weird new technologies are.
0:31 - 0:34 Last year, I built my own self-driving car.
0:34 - 0:36 But because I don't really trust technology,
0:36 - 0:38 I also designed a trap for it.
0:39 - 0:40 (Laughter)
0:40 - 0:44 And I do these things mostly because I find them completely fascinating,
0:44 - 0:47 but also because I think when we talk about technology,
0:47 - 0:49 we're largely talking about ourselves
0:49 - 0:52 and the way that we understand the world.
0:52 - 0:54 So here's a story about technology.
0:56 - 0:58 This is a "surprise egg" video.
0:58 - 1:02 It's basically a video of someone opening up loads of chocolate eggs
1:02 - 1:04 and showing the toys inside to the viewer.
1:04 - 1:07 That's it. That's all it does for seven long minutes.
1:07 - 1:10 And I want you to notice two things about this.
1:11 - 1:15 First of all, this video has 30 million views.
1:15 - 1:16 (Laughter)
1:16 - 1:18 And the other thing is,
1:18 - 1:21 it comes from a channel that has 6.3 million subscribers,
1:21 - 1:24 that has a total of eight billion views,
1:24 - 1:27 and it's all just more videos like this --
1:28 - 1:32 30 million people watching a guy opening up these eggs.
1:32 - 1:37 It sounds pretty weird, but if you search for "surprise eggs" on YouTube,
1:37 - 1:40 it'll tell you there's 10 million of these videos,
1:40 - 1:42 and I think that's an undercount.
1:42 - 1:44 I think there's way, way more of these.
1:44 - 1:46 If you keep searching, they're endless.
1:46 - 1:48 There's millions and millions of these videos
1:48 - 1:52 in increasingly baroque combinations of brands and materials,
1:52 - 1:56 and there's more and more of them being uploaded every single day.
1:56 - 1:59 Like, this is a strange world. Right?
1:59 - 2:03 But the thing is, it's not adults who are watching these videos.
2:03 - 2:06 It's kids, small children.
2:06 - 2:08 These videos are like crack for little kids.
2:08 - 2:10 There's something about the repetition,
2:10 - 2:12 the constant little dopamine hit of the reveal,
2:12 - 2:14 that completely hooks them in.
2:14 - 2:19 And little kids watch these videos over and over and over again,
2:19 - 2:21 and they do it for hours and hours and hours.
2:21 - 2:24 And if you try and take the screen away from them,
2:24 - 2:26 they'll scream and scream and scream.
2:26 - 2:27 If you don't believe me --
2:27 - 2:29 and I've already seen people in the audience nodding --
2:29 - 2:33 if you don't believe me, find someone with small children and ask them,
2:33 - 2:35 and they'll know about the surprise egg videos.
2:35 - 2:37 So this is where we start.
2:37 - 2:41 It's 2018, and someone, or lots of people,
2:41 - 2:45 are using the same mechanism that, like, Facebook and Instagram are using
2:45 - 2:47 to get you to keep checking that app,
2:47 - 2:51 and they're using it on YouTube to hack the brains of very small children
2:51 - 2:53 in return for advertising revenue.
2:54 - 2:56 At least, I hope that's what they're doing.
2:56 - 2:58 I hope that's what they're doing it for,
2:58 - 3:04 because there's easier ways of making ad revenue on YouTube.
3:04 - 3:06 You can just make stuff up or steal stuff.
3:06 - 3:09 So if you search for really popular kids' cartoons
3:09 - 3:10 like "Peppa Pig" or "Paw Patrol,"
3:10 - 3:14 you'll find there's millions and millions of these online as well.
3:14 - 3:17 Of course, most of them aren't posted by the original content creators.
3:17 - 3:20 They come from loads and loads of different random accounts,
3:20 - 3:22 and it's impossible to know who's posting them
3:22 - 3:24 or what their motives might be.
3:24 - 3:26 Does that sound kind of familiar?
3:26 - 3:28 Because it's exactly the same mechanism
3:28 - 3:31 that's happening across most of our digital services,
3:31 - 3:34 where it's impossible to know where this information is coming from.
3:34 - 3:36 It's basically fake news for kids,
3:36 - 3:38 and we're training them from birth
3:38 - 3:41 to click on the very first link that comes along,
3:41 - 3:43 regardless of what the source is.
3:43 - 3:45 That doesn't seem like a terribly good idea.
3:46 - 3:49 Here's another thing that's really big on kids' YouTube.
3:49 - 3:51 This is called the "Finger Family Song."
3:51 - 3:53 I just heard someone groan in the audience.
3:53 - 3:55 This is the "Finger Family Song."
3:55 - 3:57 This is the very first one I could find.
3:57 - 4:00 It's from 2007, and it only has 200,000 views,
4:00 - 4:02 which is, like, nothing in this game.
4:02 - 4:04 But it has this insanely earwormy tune,
4:04 - 4:06 which I'm not going to play to you,
4:06 - 4:08 because it will sear itself into your brain
4:08 - 4:11 in the same way that it seared itself into mine,
4:11 - 4:12 and I'm not going to do that to you.
4:12 - 4:14 But like the surprise eggs,
4:14 - 4:16 it's got inside kids' heads
4:16 - 4:18 and addicted them to it.
4:18 - 4:20 So within a few years, these finger family videos
4:20 - 4:21 start appearing everywhere,
4:21 - 4:24 and you get versions in different languages
4:24 - 4:26 with popular kids' cartoons using food
4:26 - 4:28 or, frankly, using whatever kind of animation elements
4:28 - 4:31 you seem to have lying around.
4:31 - 4:36 And once again, there are millions and millions and millions of these videos
4:36 - 4:40 available online in all of these kind of insane combinations.
4:40 - 4:42 And the more time you start to spend with them,
4:42 - 4:46 the crazier and crazier you start to feel that you might be.
4:46 - 4:49 And that's where I kind of launched into this,
4:49 - 4:53 that feeling of deep strangeness and deep lack of understanding
4:53 - 4:57 of how this thing was constructed that seems to be presented around me.
4:57 - 5:00 Because it's impossible to know where these things are coming from.
5:00 - 5:01 Like, who is making them?
5:01 - 5:04 Some of them appear to be made by teams of professional animators.
5:05 - 5:07 Some of them are just randomly assembled by software.
5:07 - 5:12 Some of them are quite wholesome-looking young kids' entertainers.
5:12 - 5:13 And some of them are from people
5:13 - 5:16 who really clearly shouldn't be around children at all.
5:16 - 5:18 (Laughter)
5:19 - 5:24 And once again, this impossibility of figuring out who's making this stuff --
5:24 - 5:25 like, is this a bot?
5:25 - 5:27 Is this a person? Is this a troll?
5:28 - 5:30 What does it mean that we can't tell the difference
5:30 - 5:31 between these things anymore?
5:32 - 5:36 And again, doesn't that uncertainty feel kind of familiar right now?
5:38 - 5:41 So the main way people get views on their videos --
5:41 - 5:42 and remember, views mean money --
5:42 - 5:47 is that they stuff the titles of these videos with these popular terms.
5:47 - 5:49 So you take, like, "surprise eggs"
5:49 - 5:51 and then you add "Paw Patrol," "Easter egg,"
5:51 - 5:52 or whatever these things are,
5:52 - 5:55 all of these words from other popular videos into your title,
5:55 - 5:58 until you end up with this kind of meaningless mash of language
5:58 - 6:01 that doesn't make sense to humans at all.
6:01 - 6:04 Because of course it's only really tiny kids who are watching your video,
6:04 - 6:06 and what the hell do they know?
6:06 - 6:09 Your real audience for this stuff is software.
6:09 - 6:11 It's the algorithms.
6:11 - 6:12 It's the software that YouTube uses
6:12 - 6:15 to select which videos are like other videos,
6:15 - 6:17 to make them popular, to make them recommended.
6:17 - 6:21 And that's why you end up with this kind of completely meaningless mash,
6:21 - 6:23 both of title and of content.
6:24 - 6:26 But the thing is, you have to remember,
6:26 - 6:30 there really are still people within this algorithmically optimized system,
6:30 - 6:33 people who are kind of increasingly forced to act out
6:33 - 6:36 these increasingly bizarre combinations of words,
6:36 - 6:41 like a desperate improvisation artist responding to the combined screams
6:41 - 6:44 of a million toddlers at once.
6:45 - 6:48 There are real people trapped within these systems,
6:48 - 6:52 and that's the other deeply strange thing about this algorithmically driven culture,
6:52 - 6:53 because even if you're human,
6:53 - 6:55 you have to end up behaving like a machine
6:55 - 6:57 just to survive.
6:57 - 6:59 And also, on the other side of the screen,
6:59 - 7:02 there still are these little kids watching this stuff,
7:02 - 7:06 stuck, their full attention grabbed by these weird mechanisms.
7:07 - 7:10 And most of these kids are too small to even use a website.
7:10 - 7:13 They're just kind of hammering on the screen with their little hands.
7:13 - 7:14 And so there's autoplay,
7:14 - 7:18 where it just keeps playing these videos over and over and over in a loop,
7:18 - 7:20 endlessly for hours and hours at a time.
7:20 - 7:23 And there's so much weirdness in the system now
7:23 - 7:26 that autoplay takes you to some pretty strange places.
7:26 - 7:28 This is how, within a dozen steps,
7:28 - 7:31 you can go from a cute video of a counting train
7:31 - 7:34 to masturbating Mickey Mouse.
7:35 - 7:37 Yeah. I'm sorry about that.
7:37 - 7:39 This does get worse.
7:39 - 7:40 This is what happens
7:40 - 7:43 when all of these different keywords,
7:43 - 7:45 all these different pieces of attention,
7:45 - 7:48 this desperate generation of content,
7:48 - 7:51 all comes together into a single place.
7:52 - 7:56 This is where all those deeply weird keywords come home to roost.
7:56 - 7:59 You cross-breed the finger family video
7:59 - 8:01 with some live-action superhero stuff,
8:01 - 8:04 you add in some weird, trollish in-jokes or something,
8:04 - 8:08 and suddenly, you come to a very weird place indeed.
8:08 - 8:10 The stuff that tends to upset parents
8:10 - 8:13 is the stuff that has kind of violent or sexual content, right?
8:13 - 8:16 Children's cartoons getting assaulted,
8:16 - 8:18 getting killed,
8:18 - 8:21 weird pranks that actually genuinely terrify children.
8:21 - 8:25 What you have is software pulling in all of these different influences
8:25 - 8:28 to automatically generate kids' worst nightmares.
8:28 - 8:31 And this stuff really, really does affect small children.
8:31 - 8:34 Parents report their children being traumatized,
8:34 - 8:35 becoming afraid of the dark,
8:35 - 8:38 becoming afraid of their favorite cartoon characters.
8:39 - 8:42 If you take one thing away from this, it's that if you have small children,
8:42 - 8:44 keep them the hell away from YouTube.
8:45 - 8:49 (Applause)
8:51 - 8:54 But the other thing, the thing that really gets to me about this,
8:54 - 8:58 is that I'm not sure we even really understand how we got to this point.
8:59 - 9:02 We've taken all of this influence, all of these things,
9:02 - 9:05 and munged them together in a way that no one really intended.
9:05 - 9:08 And yet, this is also the way that we're building the entire world.
9:08 - 9:10 We're taking all of this data,
9:10 - 9:11 a lot of it bad data,
9:11 - 9:14 a lot of historical data full of prejudice,
9:14 - 9:17 full of all of our worst impulses of history,
9:17 - 9:19 and we're building that into huge data sets
9:19 - 9:21 and then we're automating it.
9:21 - 9:24 And we're munging it together into things like credit reports,
9:24 - 9:26 into insurance premiums,
9:26 - 9:29 into things like predictive policing systems,
9:29 - 9:30 into sentencing guidelines.
9:30 - 9:33 This is the way we're actually constructing the world today
9:33 - 9:34 out of this data.
9:34 - 9:36 And I don't know what's worse,
9:36 - 9:39 that we built a system that seems to be entirely optimized
9:39 - 9:42 for the absolute worst aspects of human behavior,
9:42 - 9:45 or that we seem to have done it by accident,
9:45 - 9:47 without even realizing that we were doing it,
9:47 - 9:50 because we didn't really understand the systems that we were building,
9:50 - 9:54 and we didn't really understand how to do anything differently with it.
9:55 - 9:58 There's a couple of things I think that really seem to be driving this
9:58 - 9:59 most fully on YouTube,
9:59 - 10:01 and the first of those is advertising,
10:01 - 10:04 which is the monetization of attention
10:04 - 10:07 without any real other variables at work,
10:07 - 10:11 any care for the people who are actually developing this content,
10:11 - 10:15 the centralization of the power, the separation of those things.
10:15 - 10:18 And I think however you feel about the use of advertising
10:18 - 10:19 to kind of support stuff,
10:19 - 10:22 the sight of grown men in diapers rolling around in the sand
10:22 - 10:25 in the hope that an algorithm that they don't really understand
10:25 - 10:27 will give them money for it
10:27 - 10:29 suggests that this probably isn't the thing
10:29 - 10:31 that we should be basing our society and culture upon,
10:31 - 10:33 and the way in which we should be funding it.
10:34 - 10:37 And the other thing that's kind of the major driver of this is automation,
10:37 - 10:39 which is the deployment of all of this technology
10:39 - 10:42 as soon as it arrives, without any kind of oversight,
10:42 - 10:43 and then once it's out there,
10:43 - 10:47 kind of throwing up our hands and going, "Hey, it's not us, it's the technology."
10:47 - 10:49 Like, "We're not involved in it."
10:49 - 10:51 That's not really good enough,
10:51 - 10:53 because this stuff isn't just algorithmically governed,
10:53 - 10:56 it's also algorithmically policed.
10:56 - 10:59 When YouTube first started to pay attention to this,
10:59 - 11:01 the first thing they said they'd do about it
11:01 - 11:04 was that they'd deploy better machine learning algorithms
11:04 - 11:05 to moderate the content.
11:05 - 11:09 Well, machine learning, as any expert in it will tell you,
11:09 - 11:10 is basically what we've started to call
11:10 - 11:13 software that we don't really understand how it works.
11:13 - 11:17 And I think we have enough of that already.
11:17 - 11:20 We shouldn't be leaving this stuff up to AI to decide
11:20 - 11:22 what's appropriate or not,
11:22 - 11:23 because we know what happens.
11:23 - 11:25 It'll start censoring other things.
11:25 - 11:26 It'll start censoring queer content.
11:27 - 11:29 It'll start censoring legitimate public speech.
11:29 - 11:31 What's allowed in these discourses,
11:31 - 11:34 it shouldn't be something that's left up to unaccountable systems.
11:34 - 11:37 It's part of a discussion all of us should be having.
11:37 - 11:38 But I'd leave a reminder
11:38 - 11:41 that the alternative isn't very pleasant, either.
11:41 - 11:42 YouTube also announced recently
11:42 - 11:45 that they're going to release a version of their kids' app
11:45 - 11:48 that would be entirely moderated by humans.
11:48 - 11:52 Facebook -- Zuckerberg said much the same thing at Congress,
11:52 - 11:55 when pressed about how they were going to moderate their stuff.
11:55 - 11:57 He said they'd have humans doing it.
11:57 - 11:58 And what that really means is,
11:58 - 12:01 instead of having toddlers being the first person to see this stuff,
12:01 - 12:04 you're going to have underpaid, precarious contract workers
12:04 - 12:06 without proper mental health support
12:06 - 12:07 being damaged by it as well.
12:07 - 12:08 (Laughter)
12:08 - 12:11 And I think we can all do quite a lot better than that.
12:11 - 12:13 (Applause)
12:14 - 12:19 The thought, I think, that brings those two things together, really, for me,
12:19 - 12:20 is agency.
12:20 - 12:23 It's like, how much do we really understand -- by agency, I mean:
12:23 - 12:28 how we know how to act in our own best interests.
12:28 - 12:30 Which -- it's almost impossible to do
12:30 - 12:33 in these systems that we don't really fully understand.
12:33 - 12:36 Inequality of power always leads to violence.
12:36 - 12:38 And we can see inside these systems
12:38 - 12:40 that inequality of understanding does the same thing.
12:41 - 12:44 If there's one thing that we can do to start to improve these systems,
12:44 - 12:47 it's to make them more legible to the people who use them,
12:47 - 12:49 so that all of us have a common understanding
12:49 - 12:51 of what's actually going on here.
12:52 - 12:55 The thing, though, I think most about these systems
12:55 - 12:59 is that this isn't, as I hope I've explained, really about YouTube.
12:59 - 13:00 It's about everything.
13:00 - 13:03 These issues of accountability and agency,
13:03 - 13:05 of opacity and complexity,
13:05 - 13:08 of the violence and exploitation that inherently results
13:08 - 13:11 from the concentration of power in a few hands --
13:11 - 13:13 these are much, much larger issues.
13:14 - 13:18 And they're issues not just of YouTube and not just of technology in general,
13:18 - 13:19 and they're not even new.
13:19 - 13:21 They've been with us for ages.
13:21 - 13:25 But we finally built this system, this global system, the internet,
13:25 - 13:28 that's actually showing them to us in this extraordinary way,
13:28 - 13:30 making them undeniable.
13:30 - 13:33 Technology has this extraordinary capacity
13:33 - 13:37 to both instantiate and continue
13:37 - 13:41 all of our most extraordinary, often hidden desires and biases
13:41 - 13:43 and encode them into the world,
13:43 - 13:46 but it also writes them down so that we can see them,
13:46 - 13:50 so that we can't pretend they don't exist anymore.
13:50 - 13:54 We need to stop thinking about technology as a solution to all of our problems,
13:54 - 13:58 but think of it as a guide to what those problems actually are,
13:58 - 14:00 so we can start thinking about them properly
14:00 - 14:02 and start to address them.
14:02 - 14:03 Thank you very much.
14:03 - 14:08 (Applause)
14:10 - 14:11 Thank you.
14:11 - 14:14 (Applause)
14:17 - 14:20 Helen Walters: James, thank you for coming and giving us that talk.
14:20 - 14:21 So it's interesting:
14:21 - 14:25 when you think about the films where the robotic overlords take over,
14:25 - 14:28 it's all a bit more glamorous than what you're describing.
14:28 - 14:32 But I wonder -- in those films, you have the resistance mounting.
14:32 - 14:35 Is there a resistance mounting towards this stuff?
14:35 - 14:39 Do you see any positive signs, green shoots of resistance?
14:41 - 14:43 James Bridle: I don't know about direct resistance,
14:43 - 14:45 because I think this stuff is super long-term.
14:45 - 14:48 I think it's baked into culture in really deep ways.
14:48 - 14:50 A friend of mine, Eleanor Saitta, always says
14:50 - 14:54 that any technological problems of sufficient scale and scope
14:54 - 14:56 are political problems first of all.
14:56 - 14:59 So all of these things we're working to address within this
14:59 - 15:02 are not going to be addressed just by building the technology better,
15:02 - 15:05 but actually by changing the society that's producing these technologies.
15:05 - 15:08 So no, right now, I think we've got a hell of a long way to go.
15:09 - 15:10 But as I said, I think by unpacking them,
15:11 - 15:13 by explaining them, by talking about them super honestly,
15:13 - 15:16 we can actually start to at least begin that process.
15:16 - 15:19 HW: And so when you talk about legibility and digital literacy,
15:19 - 15:21 I find it difficult to imagine
15:21 - 15:25 that we need to place the burden of digital literacy on users themselves.
15:25 - 15:29 But whose responsibility is education in this new world?
15:29 - 15:33 JB: Again, I think this responsibility is kind of up to all of us,
15:33 - 15:36 that everything we do, everything we build, everything we make,
15:36 - 15:40 needs to be made in a consensual discussion
15:40 - 15:42 with everyone who's using it;
15:42 - 15:46 that we're not building systems intended to trick and surprise people
15:46 - 15:48 into doing the right thing,
15:48 - 15:52 but that they're actually involved in every step in educating them,
15:52 - 15:54 because each of these systems is educational.
15:54 - 15:57 That's what I'm hopeful about, about even this really grim stuff,
15:57 - 15:59 that if you can take it and look at it properly,
15:59 - 16:01 it's actually in itself a piece of education
16:01 - 16:05 that allows you to start seeing how complex systems come together and work
16:05 - 16:09 and maybe be able to apply that knowledge elsewhere in the world.
16:09 - 16:11 HW: James, it's such an important discussion,
16:11 - 16:14 and I know many people here are really open and prepared to have it,
16:14 - 16:16 so thanks for starting off our morning.
16:16 - 16:17 JB: Thanks very much. Cheers.
16:17 - 16:19 (Applause)
- Title: The nightmare videos of children's YouTube -- and what's wrong with the internet today
- Speaker: James Bridle
- Description: Writer and artist James Bridle uncovers a dark, strange corner of the internet, where unknown people or groups on YouTube hack the brains of young children in return for advertising revenue. From "surprise egg" reveals and the "Finger Family Song" to algorithmically created mashups of familiar cartoon characters in violent situations, these videos exploit and terrify young minds -- and they tell us something about where our increasingly data-driven world is headed. "We need to stop thinking about technology as a solution to all of our problems, but think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them," Bridle says.
- Video Language: English
- Team: closed TED
- Project: TEDTalks
- Duration: 16:32