
The nightmare videos of children's YouTube -- and what's wrong with the internet today

  • 0:01 - 0:02
    I'm James.
  • 0:02 - 0:04
    I'm a writer and artist,
  • 0:04 - 0:06
    and I make work about technology.
  • 0:06 - 0:10
    I do things like draw life-size outlines
    of military drones
  • 0:10 - 0:12
    in city streets around the world,
  • 0:12 - 0:15
    so that people can start to think
    and get their heads around
  • 0:15 - 0:19
    these really quite hard-to-see
    and hard-to-think-about technologies.
  • 0:19 - 0:23
    I make things like neural networks
    that predict the results of elections
  • 0:23 - 0:25
    based on weather reports,
  • 0:25 - 0:26
    because I'm intrigued about
  • 0:26 - 0:30
    what the actual possibilities
    of these weird new technologies are.
  • 0:31 - 0:34
    Last year, I built
    my own self-driving car.
  • 0:34 - 0:36
    But because I don't
    really trust technology,
  • 0:36 - 0:38
    I also designed a trap for it.
  • 0:39 - 0:40
    (Laughter)
  • 0:40 - 0:44
    And I do these things mostly because
    I find them completely fascinating,
  • 0:44 - 0:47
    but also because I think
    when we talk about technology,
  • 0:47 - 0:49
    we're largely talking about ourselves
  • 0:49 - 0:52
    and the way that we understand the world.
  • 0:52 - 0:54
    So here's a story about technology.
  • 0:56 - 0:58
    This is a "surprise egg" video.
  • 0:58 - 1:02
    It's basically a video of someone
    opening up loads of chocolate eggs
  • 1:02 - 1:04
    and showing the toys inside to the viewer.
  • 1:04 - 1:07
    That's it. That's all it does
    for seven long minutes.
  • 1:07 - 1:10
    And I want you to notice
    two things about this.
  • 1:11 - 1:15
    First of all, this video
    has 30 million views.
  • 1:15 - 1:16
    (Laughter)
  • 1:16 - 1:18
    And the other thing is,
  • 1:18 - 1:21
    it comes from a channel
    that has 6.3 million subscribers,
  • 1:21 - 1:24
    that has a total of 8 billion views,
  • 1:24 - 1:27
    and it's all just more videos like this --
  • 1:28 - 1:32
    30 million people watching a guy
    opening up these eggs.
  • 1:32 - 1:37
    It sounds pretty weird, but if you search
    for "surprise eggs" on YouTube,
  • 1:37 - 1:40
    it'll tell you there's
    10 million of these videos,
  • 1:40 - 1:42
    and I think that's an undercount.
  • 1:42 - 1:44
    I think there's way, way more of these.
  • 1:44 - 1:46
    If you keep searching, they're endless.
  • 1:46 - 1:48
    There's millions and millions
    of these videos
  • 1:48 - 1:52
    in increasingly baroque combinations
    of brands and materials,
  • 1:52 - 1:56
    and there's more and more of them
    being uploaded every single day.
  • 1:56 - 1:59
    Like, this is a strange world. Right?
  • 1:59 - 2:03
    But the thing is, it's not adults
    who are watching these videos.
  • 2:03 - 2:06
    It's kids, small children.
  • 2:06 - 2:08
    These videos are
    like crack for little kids.
  • 2:08 - 2:10
    There's something about the repetition,
  • 2:10 - 2:12
    the constant little
    dopamine hit of the reveal,
  • 2:12 - 2:14
    that completely hooks them in.
  • 2:14 - 2:19
    And little kids watch these videos
    over and over and over again,
  • 2:19 - 2:21
    and they do it for hours
    and hours and hours.
  • 2:21 - 2:24
    And if you try and take
    the screen away from them,
  • 2:24 - 2:26
    they'll scream and scream and scream.
  • 2:26 - 2:27
    If you don't believe me --
  • 2:27 - 2:29
    and I've already seen people
    in the audience nodding --
  • 2:29 - 2:33
    if you don't believe me, find someone
    with small children and ask them,
  • 2:33 - 2:35
    and they'll know about
    the surprise egg videos.
  • 2:35 - 2:37
    So this is where we start.
  • 2:37 - 2:41
    It's 2018, and someone, or lots of people,
  • 2:41 - 2:45
    are using the same mechanism that, like,
    Facebook and Instagram are using
  • 2:45 - 2:47
    to get you to keep checking that app,
  • 2:47 - 2:51
    and they're using it on YouTube
    to hack the brains of very small children
  • 2:51 - 2:53
    in return for advertising revenue.
  • 2:54 - 2:56
    At least, I hope
    that's what they're doing.
  • 2:56 - 2:58
    I hope that's what they're doing it for,
  • 2:58 - 3:04
    because there's easier ways
    of making ad revenue on YouTube.
  • 3:04 - 3:06
    You can just make stuff up or steal stuff.
  • 3:06 - 3:09
    So if you search for really
    popular kids' cartoons
  • 3:09 - 3:10
    like "Pepper Pig" or "Paw Patrol,"
  • 3:10 - 3:14
    you'll find there's millions and millions
    of these online as well.
  • 3:14 - 3:17
    Of course, most of them aren't posted
    by the original content creators.
  • 3:17 - 3:20
    They come from loads and loads
    of different random accounts,
  • 3:20 - 3:22
    and it's impossible to know
    who's posting them
  • 3:22 - 3:24
    or what their motives might be.
  • 3:24 - 3:26
    Does that sound kind of familiar?
  • 3:26 - 3:28
    Because it's exactly the same mechanism
  • 3:28 - 3:31
    that's happening across most
    of our digital services,
  • 3:31 - 3:34
    where it's impossible to know
    where this information is coming from.
  • 3:34 - 3:36
    It's basically fake news for kids,
  • 3:36 - 3:38
    and we're training them from birth
  • 3:38 - 3:41
    to click on the very first link
    that comes along,
  • 3:41 - 3:43
    regardless of what the source is.
  • 3:43 - 3:45
    That doesn't seem like
    a terribly good idea.
  • 3:46 - 3:49
    Here's another thing
    that's really big on kids' YouTube.
  • 3:49 - 3:51
    This is called the "Finger Family Song."
  • 3:51 - 3:53
    I just heard someone groan
    in the audience.
  • 3:53 - 3:55
    This is the "Finger Family Song."
  • 3:55 - 3:57
    This is the very first one I could find.
  • 3:57 - 4:00
    It's from 2007, and it only has
    200,000 views,
  • 4:00 - 4:02
    which is, like, nothing in this game.
  • 4:02 - 4:04
    But it has this insanely earwormy tune,
  • 4:04 - 4:06
    which I'm not going to play to you,
  • 4:06 - 4:08
    because it will sear itself
    into your brain
  • 4:08 - 4:11
    in the same way that
    it seared itself into mine,
  • 4:11 - 4:12
    and I'm not going to do that to you.
  • 4:12 - 4:14
    But like the surprise eggs,
  • 4:14 - 4:16
    it's got inside kids' heads
  • 4:16 - 4:18
    and addicted them to it.
  • 4:18 - 4:20
    So within a few years,
    these finger family videos
  • 4:20 - 4:21
    start appearing everywhere,
  • 4:21 - 4:24
    and you get versions
    in different languages
  • 4:24 - 4:26
    with popular kids' cartoons using food
  • 4:26 - 4:28
    or frankly, using whatever kind
    of animation elements
  • 4:28 - 4:31
    you seem to have lying around.
  • 4:31 - 4:36
    And once again, there are millions
    and millions and millions of these videos
  • 4:36 - 4:40
    available online in all of these
    kind of insane combinations.
  • 4:40 - 4:42
    And the more time
    you start to spend with them,
  • 4:42 - 4:46
    the crazier and crazier
    you start to feel that you might be.
  • 4:46 - 4:49
    And that's where I
    kind of launched into this,
  • 4:49 - 4:53
    that feeling of deep strangeness
    and deep lack of understanding
  • 4:53 - 4:57
    of how this thing was constructed
    that seems to be presented around me.
  • 4:57 - 5:00
    Because it's impossible to know
    where these things are coming from.
  • 5:00 - 5:01
    Like, who is making them?
  • 5:01 - 5:04
    Some of them appear to be made
    by teams of professional animators.
  • 5:05 - 5:07
    Some of them are just randomly
    assembled by software.
  • 5:07 - 5:12
    Some of them are quite wholesome-looking
    young kids' entertainers.
  • 5:12 - 5:13
    And some of them are from people
  • 5:13 - 5:16
    who really clearly
    shouldn't be around children at all.
  • 5:16 - 5:18
    (Laughter)
  • 5:19 - 5:24
    And once again, this impossibility
    of figuring out who's making this stuff --
  • 5:24 - 5:25
    like, is this a bot?
  • 5:25 - 5:27
    Is this a person? Is this a troll?
  • 5:28 - 5:30
    What does it mean
    that we can't tell the difference
  • 5:30 - 5:31
    between these things anymore?
  • 5:32 - 5:36
    And again, doesn't that uncertainty
    feel kind of familiar right now?
  • 5:38 - 5:41
    So the main way people get views
    on their videos --
  • 5:41 - 5:42
    and remember, views mean money --
  • 5:42 - 5:47
    is that they stuff the titles
    of these videos with these popular terms.
  • 5:47 - 5:49
    So you take, like, "surprise eggs"
  • 5:49 - 5:51
    and then you add
    "Paw Patrol," "Easter egg,"
  • 5:51 - 5:52
    or whatever these things are,
  • 5:52 - 5:55
    all of these words from other
    popular videos into your title,
  • 5:55 - 5:58
    until you end up with this kind of
    meaningless mash of language
  • 5:58 - 6:01
    that doesn't make sense to humans at all.
  • 6:01 - 6:04
    Because of course it's only really
    tiny kids who are watching your video,
  • 6:04 - 6:06
    and what the hell do they know?
  • 6:06 - 6:09
    Your real audience
    for this stuff is software.
  • 6:09 - 6:11
    It's the algorithms.
  • 6:11 - 6:12
    It's the software that YouTube uses
  • 6:12 - 6:15
    to select which videos
    are like other videos,
  • 6:15 - 6:17
    to make them popular,
    to make them recommended.
  • 6:17 - 6:21
    And that's why you end up with this
    kind of completely meaningless mash,
  • 6:21 - 6:23
    both of title and of content.
  • 6:24 - 6:26
    But the thing is, you have to remember,
  • 6:26 - 6:30
    there really are still people within
    this algorithmically optimized system,
  • 6:30 - 6:33
    people who are kind
    of increasingly forced to act out
  • 6:33 - 6:36
    these increasingly bizarre
    combinations of words,
  • 6:36 - 6:41
    like a desperate improvisation artist
    responding to the combined screams
  • 6:41 - 6:44
    of a million toddlers at once.
  • 6:45 - 6:48
    There are real people
    trapped within these systems,
  • 6:48 - 6:52
    and that's the other deeply strange thing
    about this algorithmically driven culture,
  • 6:52 - 6:53
    because even if you're human,
  • 6:53 - 6:55
    you have to end up behaving like a machine
  • 6:55 - 6:57
    just to survive.
  • 6:57 - 6:59
    And also, on the other side of the screen,
  • 6:59 - 7:02
    there still are these little kids
    watching this stuff,
  • 7:02 - 7:06
    stuck, their full attention grabbed
    by these weird mechanisms.
  • 7:07 - 7:10
    And most of these kids are too small
    to even use a website.
  • 7:10 - 7:13
    They're just kind of hammering
    on the screen with their little hands.
  • 7:13 - 7:14
    And so there's autoplay,
  • 7:14 - 7:18
    where it just keeps playing these videos
    over and over and over in a loop,
  • 7:18 - 7:20
    endlessly for hours and hours at a time.
  • 7:20 - 7:23
    And there's so much weirdness
    in the system now
  • 7:23 - 7:26
    that autoplay takes you
    to some pretty strange places.
  • 7:26 - 7:28
    This is how, within a dozen steps,
  • 7:28 - 7:31
    you can go from a cute video
    of a counting train
  • 7:31 - 7:34
    to masturbating Mickey Mouse.
  • 7:35 - 7:37
    Yeah. I'm sorry about that.
  • 7:37 - 7:39
    This does get worse.
  • 7:39 - 7:40
    This is what happens
  • 7:40 - 7:43
    when all of these different keywords,
  • 7:43 - 7:45
    all these different pieces of attention,
  • 7:45 - 7:48
    this desperate generation of content,
  • 7:48 - 7:51
    all comes together into a single place.
  • 7:52 - 7:56
    This is where all those deeply weird
    keywords come home to roost.
  • 7:56 - 7:59
    You cross-breed the finger family video
  • 7:59 - 8:01
    with some live-action superhero stuff,
  • 8:01 - 8:04
    you add in some weird,
    trollish in-jokes or something,
  • 8:04 - 8:08
    and suddenly, you come
    to a very weird place indeed.
  • 8:08 - 8:10
    The stuff that tends to upset parents
  • 8:10 - 8:13
    is the stuff that has kind of violent
    or sexual content, right?
  • 8:13 - 8:16
    Children's cartoons getting assaulted,
  • 8:16 - 8:18
    getting killed,
  • 8:18 - 8:21
    weird pranks that actually
    genuinely terrify children.
  • 8:21 - 8:25
    What you have is software pulling in
    all of these different influences
  • 8:25 - 8:28
    to automatically generate
    kids' worst nightmares.
  • 8:28 - 8:31
    And this stuff really, really
    does affect small children.
  • 8:31 - 8:34
    Parents report their children
    being traumatized,
  • 8:34 - 8:35
    becoming afraid of the dark,
  • 8:35 - 8:38
    becoming afraid of their favorite
    cartoon characters.
  • 8:39 - 8:42
    If you take one thing away from this,
    it's that if you have small children,
  • 8:42 - 8:44
    keep them the hell away from YouTube.
  • 8:45 - 8:49
    (Applause)
  • 8:51 - 8:54
    But the other thing, the thing
    that really gets to me about this,
  • 8:54 - 8:58
    is that I'm not sure we even really
    understand how we got to this point.
  • 8:59 - 9:02
    We've taken all of this influence,
    all of these things,
  • 9:02 - 9:05
    and munged them together in a way
    that no one really intended.
  • 9:05 - 9:08
    And yet, this is also the way
    that we're building the entire world.
  • 9:08 - 9:10
    We're taking all of this data,
  • 9:10 - 9:11
    a lot of it bad data,
  • 9:11 - 9:14
    a lot of historical data
    full of prejudice,
  • 9:14 - 9:17
    full of all of our worst
    impulses of history,
  • 9:17 - 9:19
    and we're building that
    into huge data sets
  • 9:19 - 9:21
    and then we're automating it.
  • 9:21 - 9:24
    And we're munging it together
    into things like credit reports,
  • 9:24 - 9:26
    into insurance premiums,
  • 9:26 - 9:29
    into things like predictive
    policing systems,
  • 9:29 - 9:30
    into sentencing guidelines.
  • 9:30 - 9:33
    This is the way we're actually
    constructing the world today
  • 9:33 - 9:34
    out of this data.
  • 9:34 - 9:36
    And I don't know what's worse,
  • 9:36 - 9:39
    that we built a system
    that seems to be entirely optimized
  • 9:39 - 9:42
    for the absolute worst aspects
    of human behavior,
  • 9:42 - 9:45
    or that we seem
    to have done it by accident,
  • 9:45 - 9:47
    without even realizing
    that we were doing it,
  • 9:47 - 9:50
    because we didn't really understand
    the systems that we were building,
  • 9:50 - 9:54
    and we didn't really understand
    how to do anything differently with it.
  • 9:55 - 9:58
    There's a couple of things I think
    that really seem to be driving this
  • 9:58 - 9:59
    most fully on YouTube,
  • 9:59 - 10:01
    and the first of those is advertising,
  • 10:01 - 10:04
    which is the monetization of attention
  • 10:04 - 10:07
    without any real other variables at work,
  • 10:07 - 10:11
    any care for the people who are
    actually developing this content,
  • 10:11 - 10:15
    the centralization of the power,
    the separation of those things.
  • 10:15 - 10:18
    And I think however you feel
    about the use of advertising
  • 10:18 - 10:19
    to kind of support stuff,
  • 10:19 - 10:22
    the sight of grown men in diapers
    rolling around in the sand
  • 10:22 - 10:25
    in the hope that an algorithm
    that they don't really understand
  • 10:25 - 10:27
    will give them money for it
  • 10:27 - 10:29
    suggests that this
    probably isn't the thing
  • 10:29 - 10:31
    that we should be basing
    our society and culture upon,
  • 10:31 - 10:33
    and the way in which
    we should be funding it.
  • 10:34 - 10:37
    And the other thing that's kind of
    the major driver of this is automation,
  • 10:37 - 10:39
    which is the deployment
    of all of this technology
  • 10:39 - 10:42
    as soon as it arrives,
    without any kind of oversight,
  • 10:42 - 10:43
    and then once it's out there,
  • 10:43 - 10:47
    kind of throwing up our hands and going,
    "Hey, it's not us, it's the technology,"
  • 10:47 - 10:49
    like, "We're not involved in it."
  • 10:49 - 10:51
    That's not really good enough,
  • 10:51 - 10:53
    because this stuff isn't
    just algorithmically governed,
  • 10:53 - 10:56
    it's also algorithmically policed.
  • 10:56 - 10:59
    When YouTube first started
    to pay attention to this,
  • 10:59 - 11:01
    the first thing they said
    they'd do about it
  • 11:01 - 11:04
    was that they'd deploy
    better machine learning algorithms
  • 11:04 - 11:05
    to moderate the content.
  • 11:05 - 11:09
    Well, machine learning,
    as any expert in it will tell you,
  • 11:09 - 11:10
    is basically what we've started to call
  • 11:10 - 11:13
    software that we don't really
    understand how it works.
  • 11:13 - 11:17
    And I think we have
    enough of that already.
  • 11:17 - 11:20
    We shouldn't be leaving
    this stuff up to AI to decide
  • 11:20 - 11:22
    what's appropriate or not,
  • 11:22 - 11:23
    because we know what happens.
  • 11:23 - 11:25
    It'll start censoring other things.
  • 11:25 - 11:26
    It'll start censoring queer content.
  • 11:27 - 11:29
    It'll start censoring
    legitimate public speech.
  • 11:29 - 11:31
    What's allowed in these discourses,
  • 11:31 - 11:34
    it shouldn't be something
    that's left up to unaccountable systems.
  • 11:34 - 11:37
    It's part of a discussion
    all of us should be having.
  • 11:37 - 11:38
    But I'd leave a reminder
  • 11:38 - 11:41
    that the alternative isn't
    very pleasant, either.
  • 11:41 - 11:42
    YouTube also announced recently
  • 11:42 - 11:45
    that they're going to release
    a version of their kids' app
  • 11:45 - 11:48
    that would be entirely
    moderated by humans.
  • 11:48 - 11:52
    Facebook -- Zuckerberg said
    much the same thing at Congress,
  • 11:52 - 11:55
    when pressed about how they
    were going to moderate their stuff.
  • 11:55 - 11:57
    He said they'd have humans doing it.
  • 11:57 - 11:58
    And what that really means is,
  • 11:58 - 12:01
    instead of having toddlers being
    the first person to see this stuff,
  • 12:01 - 12:04
    you're going to have underpaid,
    precarious contract workers
  • 12:04 - 12:06
    without proper mental health support
  • 12:06 - 12:07
    being damaged by it as well.
  • 12:07 - 12:08
    (Laughter)
  • 12:08 - 12:11
    And I think we can all do
    quite a lot better than that.
  • 12:11 - 12:13
    (Applause)
  • 12:14 - 12:19
    The thought, I think, that brings those
    two things together, really, for me,
  • 12:19 - 12:20
    is agency.
  • 12:20 - 12:23
    It's like, how much do we really
    understand -- by agency, I mean:
  • 12:23 - 12:28
    How will we know how to act
    in our own best interests?
  • 12:28 - 12:30
    Which -- it's almost impossible to do
  • 12:30 - 12:33
    in these systems that we don't
    really fully understand.
  • 12:33 - 12:36
    Inequality of power
    always leads to violence.
  • 12:36 - 12:38
    And we can see inside these systems
  • 12:38 - 12:40
    that inequality of understanding
    does the same thing.
  • 12:41 - 12:44
    If there's one thing that we can do
    to start to improve these systems,
  • 12:44 - 12:47
    it's to make them more legible
    to the people who use them,
  • 12:47 - 12:49
    so that all of us have
    a common understanding
  • 12:49 - 12:51
    of what's actually going on here.
  • 12:52 - 12:55
    The thing, though, I think
    most about these systems
  • 12:55 - 12:59
    is that this isn't, as I hope
    I've explained, really about YouTube.
  • 12:59 - 13:00
    It's about everything.
  • 13:00 - 13:03
    These issues of accountability and agency,
  • 13:03 - 13:05
    of opacity and complexity,
  • 13:05 - 13:08
    of the violence and exploitation
    that inherently results
  • 13:08 - 13:11
    from the concentration
    of power in a few hands --
  • 13:11 - 13:13
    these are much, much larger issues.
  • 13:14 - 13:18
    And they're issues not just of YouTube
    and not just of technology in general,
  • 13:18 - 13:19
    and they're not even new.
  • 13:19 - 13:21
    They've been with us for ages.
  • 13:21 - 13:25
    But we finally built this system,
    this global system, the internet,
  • 13:25 - 13:28
    that's actually showing them to us
    in this extraordinary way,
  • 13:28 - 13:30
    making them undeniable.
  • 13:30 - 13:33
    Technology has this extraordinary capacity
  • 13:33 - 13:37
    to both instantiate and continue
  • 13:37 - 13:41
    all of our most extraordinary,
    often hidden desires and biases
  • 13:41 - 13:43
    and encode them into the world,
  • 13:43 - 13:46
    but it also writes them down
    so that we can see them,
  • 13:46 - 13:50
    so that we can't pretend
    they don't exist anymore.
  • 13:50 - 13:54
    We need to stop thinking about technology
    as a solution to all of our problems,
  • 13:54 - 13:58
    but think of it as a guide
    to what those problems actually are,
  • 13:58 - 14:00
    so we can start thinking
    about them properly
  • 14:00 - 14:02
    and start to address them.
  • 14:02 - 14:03
    Thank you very much.
  • 14:03 - 14:08
    (Applause)
  • 14:10 - 14:11
    Thank you.
  • 14:11 - 14:14
    (Applause)
  • 14:17 - 14:20
    Helen Walters: James, thank you
    for coming and giving us that talk.
  • 14:20 - 14:21
    So it's interesting:
  • 14:21 - 14:25
    when you think about the films where
    the robotic overlords take over,
  • 14:25 - 14:28
    it's all a bit more glamorous
    than what you're describing.
  • 14:28 - 14:32
    But I wonder -- in those films,
    you have the resistance mounting.
  • 14:32 - 14:35
    Is there a resistance mounting
    towards this stuff?
  • 14:35 - 14:39
    Do you see any positive signs,
    green shoots of resistance?
  • 14:41 - 14:43
    James Bridle: I don't know
    about direct resistance,
  • 14:43 - 14:45
    because I think this stuff
    is super long-term.
  • 14:45 - 14:48
    I think it's baked into culture
    in really deep ways.
  • 14:48 - 14:50
    A friend of mine,
    Eleanor Saitta, always says
  • 14:50 - 14:54
    that any technological problems
    of sufficient scale and scope
  • 14:54 - 14:56
    are political problems first of all.
  • 14:56 - 14:59
    So all of these things we're working
    to address within this
  • 14:59 - 15:02
    are not going to be addressed
    just by building the technology better,
  • 15:02 - 15:05
    but actually by changing the society
    that's producing these technologies.
  • 15:05 - 15:08
    So no, right now, I think we've got
    a hell of a long way to go.
  • 15:09 - 15:10
    But as I said, I think by unpacking them,
  • 15:11 - 15:13
    by explaining them, by talking
    about them super-honestly,
  • 15:13 - 15:16
    we can actually start
    to at least begin that process.
  • 15:16 - 15:19
    HW: And so when you talk about
    legibility and digital literacy,
  • 15:19 - 15:21
    I find it difficult to imagine
  • 15:21 - 15:25
    that we need to place the burden
    of digital literacy on users themselves.
  • 15:25 - 15:29
    But whose responsibility
    is education in this new world?
  • 15:29 - 15:33
    JB: Again, I think this responsibility
    is kind of up to all of us,
  • 15:33 - 15:36
    that everything we do,
    everything we build, everything we make,
  • 15:36 - 15:40
    needs to be made
    in a consensual discussion
  • 15:40 - 15:42
    with everyone who's using it;
  • 15:42 - 15:46
    that we're not building systems
    intended to trick and surprise people
  • 15:46 - 15:48
    into doing the right thing,
  • 15:48 - 15:52
    but that they're actually involved
    in every step in educating them,
  • 15:52 - 15:54
    because each of these systems
    is educational.
  • 15:54 - 15:57
    That's what I'm hopeful about,
    about even this really grim stuff,
  • 15:57 - 15:59
    that if you can take it
    and look at it properly,
  • 15:59 - 16:01
    it's actually in itself
    a piece of education
  • 16:01 - 16:05
    that allows you to start seeing
    how complex systems come together and work
  • 16:05 - 16:09
    and maybe be able to apply
    that knowledge elsewhere in the world.
  • 16:09 - 16:11
    HW: James, it's such
    an important discussion,
  • 16:11 - 16:14
    and I know many people here
    are really open and prepared to have it,
  • 16:14 - 16:16
    so thanks for starting off our morning.
  • 16:16 - 16:17
    JB: Thanks very much. Cheers.
  • 16:17 - 16:19
    (Applause)
Title:
The nightmare videos of children's YouTube -- and what's wrong with the internet today
Speaker:
James Bridle
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
16:32
