  • 0:00 - 0:06
    but ever since these companies started
  • 0:03 - 0:11
    amassing our data the clock has been
  • 0:06 - 0:13
    ticking in 2008 we showed you how you
  • 0:11 - 0:16
    and your friends' Facebook data could be
  • 0:13 - 0:21
    accessed by a rogue Facebook application
  • 0:16 - 0:23
    without consent in 2011 a researcher
  • 0:21 - 0:26
    called Michal Kosinski warned that if
  • 0:23 - 0:29
    computers could analyze enough Facebook
  • 0:26 - 0:31
    data from enough people they could spot
  • 0:29 - 0:34
    connections between the way you act
  • 0:31 - 0:39
    online and your personality traits the
  • 0:34 - 0:39
    type of person you are what's really
  • 0:39 - 0:44
    world-changing about those algorithms
  • 0:41 - 0:46
    it's that they can take your music
  • 0:44 - 0:49
    preferences or your book preferences and
  • 0:46 - 0:51
    extract from this seemingly innocent
  • 0:49 - 0:53
    information very accurate predictions
  • 0:51 - 0:56
    about your religiosity leadership
  • 0:53 - 0:59
    potential political views personality
  • 0:56 - 1:01
    and so on by having hundreds and
  • 0:59 - 1:03
    hundreds of thousands of Americans
  • 1:01 - 1:05
    undertake this survey we were able to
  • 1:03 - 1:07
    form a model to predict the personality
  • 1:05 - 1:11
    of every single adult in the United
  • 1:07 - 1:13
    States of America by 2016 Alexander Nix
  • 1:11 - 1:16
    was explaining how Cambridge Analytica
  • 1:13 - 1:18
    could use this kind of research to find
  • 1:16 - 1:21
    people of different personality types
  • 1:18 - 1:24
    and target them with specific messages
  • 1:21 - 1:26
    that might influence their behavior if
  • 1:24 - 1:28
    you know the personality of the
  • 1:26 - 1:30
    people you're targeting you can nuance
  • 1:28 - 1:33
    your messaging to resonate more
  • 1:30 - 1:35
    effectively with those key audience
  • 1:33 - 1:38
    groups because it's personality that
  • 1:35 - 1:41
    drives behavior and behavior that
  • 1:38 - 1:43
    obviously influences how you vote soon
  • 1:41 - 1:46
    afterwards these techniques were used by
  • 1:43 - 1:49
    two political campaigns that would rock
  • 1:46 - 1:49
    the world
  • 1:50 - 1:56
    yes your likes and dislikes your
  • 1:53 - 1:59
    comments and posts your personal data
  • 1:56 - 2:03
    they are valuable but it's what they say
  • 1:59 - 2:07
    about you as a person that's where the
  • 2:03 - 2:09
    real power lies no one knows exactly how
  • 2:07 - 2:12
    much these techniques actually
  • 2:09 - 2:14
    contributed to the results of the votes
  • 2:12 - 2:16
    one of the first researchers to ask the
  • 2:14 - 2:18
    question was Paul-Olivier Dehaye
  • 2:16 - 2:20
    he worked on an article at the end of
  • 2:18 - 2:22
    2016 that investigated what was
  • 2:20 - 2:24
    happening and this week he was here in
  • 2:22 - 2:28
    London to give evidence to MPs about the
  • 2:24 - 2:30
    latest revelations sitting alongside him
  • 2:28 - 2:32
    in the Commons Select Committee was
  • 2:30 - 2:35
    Cambridge Analytica whistleblower
  • 2:32 - 2:39
    Christopher Wylie and straight after the
  • 2:35 - 2:40
    session Paul sat down with me this isn't
  • 2:39 - 2:42
    just about Facebook and this isn't just
  • 2:40 - 2:44
    about Cambridge Analytica is it
  • 2:42 - 2:45
    this kind of data collection and analysis
  • 2:44 - 2:47
    has been going on for a long time
  • 2:45 - 2:50
    and it's being done by lots of people
  • 2:47 - 2:54
    right so it's in two ways it's not just
  • 2:50 - 2:55
    about those companies Facebook enables a
  • 2:54 - 2:59
    lot more companies than just Cambridge
  • 2:55 - 3:01
    Analytica to suck out data in similar
  • 2:59 - 3:03
    ways so that's the first thing and then
  • 3:01 - 3:05
    Facebook is just one player in a big
  • 3:03 - 3:09
    ecosystem of online advertising online
  • 3:05 - 3:11
    profiling some of the companies you have
  • 3:09 - 3:14
    heard of but some of them you just have
  • 3:11 - 3:15
    no relationship with even if you fully
  • 3:14 - 3:17
    understand the terms and conditions that
  • 3:15 - 3:20
    you're agreeing to about what data
  • 3:17 - 3:22
    you're sharing I don't think anyone
  • 3:20 - 3:25
    really understood what can be inferred
  • 3:22 - 3:28
    from the data so not the list of your
  • 3:25 - 3:29
    friends not your likes and dislikes but
  • 3:28 - 3:31
    the things that you've never talked
  • 3:29 - 3:33
    about that now they can tell from your
  • 3:31 - 3:35
    digital footprint yeah it's really hard
  • 3:33 - 3:37
    to understand the inference power of
  • 3:35 - 3:39
    this data what can be deduced from it
  • 3:37 - 3:42
    that's true how people make decisions
  • 3:39 - 3:43
    basically whether they think about
  • 3:42 - 3:46
    the issue but before making a decision
  • 3:43 - 3:48
    or not another way to say this is that
  • 3:46 - 3:51
    they were trying to find gullible people
  • 3:48 - 3:53
    so if you are able to do that you can
  • 3:51 - 3:56
    just make them you know buy into
  • 3:53 - 4:01
    anything into any content it's easy to
  • 3:56 - 4:02
    believe that Facebook managed to swing
  • 4:01 - 4:05
    the u.s. election
  • 4:02 - 4:09
    to swing Brexit was it only people on
  • 4:05 - 4:11
    Facebook who saw these new ads that were
  • 4:09 - 4:13
    targeted to them and then went out and
  • 4:11 - 4:15
    possibly changed their votes is that
  • 4:13 - 4:17
    what we're talking about or are we
  • 4:15 - 4:19
    talking about Facebook just being used
  • 4:17 - 4:22
    as a research tool that could then be
  • 4:19 - 4:24
    applied to the wider community in
  • 4:22 - 4:26
    many ways whether people were
  • 4:24 - 4:28
    individually convinced to vote
  • 4:26 - 4:30
    differently I don't personally believe
  • 4:28 - 4:32
    that's how it happens what I believe is
  • 4:30 - 4:35
    that Facebook itself could be
  • 4:32 - 4:37
    manipulated using those techniques to
  • 4:35 - 4:40
    make some content go viral that would
  • 4:37 - 4:42
    affect public discourse so that would
  • 4:40 - 4:45
    steer the conversation basically and if
  • 4:42 - 4:47
    you're able to do this more or less you
  • 4:45 - 4:49
    know automated more or less in a
  • 4:47 - 4:51
    repetitive fashion then you've partly
  • 4:49 - 4:52
    already won the election because you're
  • 4:51 - 4:55
    steering the conversation around the
  • 4:52 - 4:57
    election and that's precisely the point
  • 4:55 - 4:58
    that Hillary Clinton has been making
  • 4:57 - 5:00
    again and again about Cambridge Analytica
  • 4:58 - 5:02
    is their ability to steer the
  • 5:00 - 5:09
    conversation on specific topics like
  • 5:02 - 5:10
    emails and that had an impact the
  • 5:09 - 5:13
    fact that some content was reshared
  • 5:10 - 5:15
    widely during the election had an impact
  • 5:13 - 5:17
    on editorial decisions made by classic
  • 5:15 - 5:19
    media more established media which in
  • 5:17 - 5:24
    turn had an impact on you know other
  • 5:19 - 5:26
    people's opinions Paul says that even
  • 5:24 - 5:28
    though Facebook and Google have recently
  • 5:26 - 5:30
    allowed us to download everything that
  • 5:28 - 5:34
    they have on us it's not really
  • 5:30 - 5:36
    everything so Facebook can collect data
  • 5:34 - 5:36
    of people who don't have Facebook
  • 5:36 - 5:40
    accounts
  • 5:36 - 5:42
    yeah it's called shadow profiles yeah so
  • 5:40 - 5:45
    that practice for instance has been
  • 5:42 - 5:48
    forbidden in Belgium where I'm from then
  • 5:45 - 5:51
    even people who do have an account are
  • 5:48 - 5:53
    being tracked all over the web and all of
  • 5:51 - 5:55
    that same information is collected about
  • 5:53 - 5:57
    them why can't they see it why can't
  • 5:55 - 6:00
    they see all the web pages that Facebook
  • 5:57 - 6:02
    knows they visited making that
  • 6:00 - 6:05
    transparent will have a very dramatic
  • 6:02 - 6:08
    effect I think in making people aware of
  • 6:05 - 6:11
    how much tracking goes on do you think
  • 6:08 - 6:14
    that UK or EU regulation is strong
  • 6:11 - 6:17
    enough when it comes to protecting
  • 6:14 - 6:19
    our data that's part of what I
  • 6:17 - 6:20
    wanted to say in the committee we have
  • 6:19 - 6:23
    very strong regulations around personal
  • 6:20 - 6:24
    data that are going to get stronger but
  • 6:23 - 6:27
    it's completely useless and actually
  • 6:24 - 6:29
    worse than not having them if we're not
  • 6:27 - 6:31
    going to enforce them they need to be
  • 6:29 - 6:33
    enforced that's the critical point where
  • 6:31 - 6:35
    currently things are failing why are
  • 6:33 - 6:37
    they not being enforced because the
  • 6:35 - 6:39
    regulators currently see their role as
  • 6:37 - 6:41
    balancing commercial interests with
  • 6:39 - 6:43
    democratic interests around oversight of
  • 6:41 - 6:46
    personal data and that balancing they've
  • 6:43 - 6:48
    done so far was simply wrong
  • 6:46 - 6:51
    too much on the side of commercial
  • 6:48 - 6:55
    interests and not enough on the
  • 6:51 - 6:58
    counterbalances Facebook's
  • 6:55 - 7:00
    reputation and its wealth has taken a
  • 6:58 - 7:03
    massive hit in the last couple of weeks
  • 7:00 - 7:06
    with 80 billion dollars being wiped off
  • 7:03 - 7:08
    its value so can the recently announced
  • 7:06 - 7:12
    new privacy tools help to restore
  • 7:08 - 7:15
    confidence or is this the end for Facebook
  • 7:12 - 7:17
    Facebook can still adapt their ways they
  • 7:15 - 7:19
    can still change they will have to
  • 7:17 - 7:22
    anyway because of the regulation that's
  • 7:19 - 7:24
    coming into force it's an opportunity to
  • 7:22 - 7:26
    redeem themselves if you want to say it
  • 7:24 - 7:26
    that way
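
The survey-to-model approach Kosinski and Nix describe above (fit a model on people who answered a personality questionnaire, then predict the traits of everyone else from their page likes, and pick a message variant to match) can be illustrated with a minimal Python sketch. Everything in it is hypothetical: the page names, scores, threshold, and the choice of a simple ridge regression are invented for illustration and are not the actual research code or the Cambridge Analytica pipeline.

# Illustrative sketch only -- not the actual Kosinski research code or the
# Cambridge Analytica pipeline. All pages, scores, and thresholds are invented.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import Ridge

# Hypothetical survey data: each respondent's Facebook page likes (binary
# features) paired with a self-reported "openness" score from a questionnaire.
survey = [
    ({"likes_philosophy_page": 1, "likes_jazz_page": 1}, 78),
    ({"likes_country_music_page": 1, "likes_nascar_page": 1}, 41),
    ({"likes_philosophy_page": 1, "likes_nascar_page": 1}, 63),
]

# Turn the like dictionaries into a sparse feature matrix.
vectorizer = DictVectorizer(sparse=True)
X = vectorizer.fit_transform([likes for likes, _ in survey])
y = [score for _, score in survey]

# Fit a simple regularised regression from likes to the trait score.
model = Ridge(alpha=1.0).fit(X, y)

# Apply the fitted model to someone who never took the survey, then pick a
# message variant keyed to the predicted trait -- the targeting step Nix describes.
new_user_likes = {"likes_jazz_page": 1}
predicted_openness = model.predict(vectorizer.transform([new_user_likes]))[0]
ad_variant = "novelty-framed ad" if predicted_openness > 60 else "tradition-framed ad"
print(round(predicted_openness, 1), ad_variant)

In the work the programme describes, the feature matrix would presumably cover millions of users and tens of thousands of likes, with separate models per trait (religiosity, political views, personality, and so on), but the train-on-survey-takers, predict-for-everyone-else structure is the same.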