[0:00] But ever since these companies started amassing our data, the clock has been ticking. In 2008 we showed you how you and your friends' Facebook data could be accessed by a rogue Facebook application without consent. In 2011 a researcher called Michal Kosinski warned that if computers could analyze enough Facebook data from enough people, they could spot connections between the way you act online and your personality traits, the type of person you are.

[0:34] What's really world-changing about those algorithms is that they can take your music preferences or your book preferences and extract, from this seemingly innocent information, very accurate predictions about your religiosity, leadership potential, political views, personality and so on.

[0:56] By having hundreds and hundreds of thousands of Americans undertake this survey, we were able to form a model to predict the personality of every single adult in the United States of America.
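One way to build the kind of model described here, roughly in line with how this research is usually summarised, is to compress a large users-by-likes matrix into a few latent dimensions and then regress survey-derived trait scores on them. The sketch below is a minimal illustration of that general idea only, not the actual pipeline; the like matrix and the "openness" scores are synthetic stand-ins invented for the example.

```python
# Minimal, illustrative sketch of a likes-to-personality model (synthetic data only).
# Assumed setup: `likes` is a binary users x pages matrix, `openness` holds
# survey-derived trait scores for the same users. Neither is real data.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 5_000, 2_000
likes = (rng.random((n_users, n_pages)) < 0.02).astype(float)  # which pages each user "liked"
openness = rng.normal(size=n_users)                            # stand-in questionnaire scores

# Compress the sparse like matrix into a few hundred latent dimensions,
# then regress the trait score on those dimensions.
latent = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)
X_train, X_test, y_train, y_test = train_test_split(latent, openness, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("Held-out R^2:", model.score(X_test, y_test))  # near zero here, since the data is random
```

With real likes and real questionnaire answers, the held-out score is what would tell you how much of a trait the digital footprint actually predicts.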
[1:07] By 2016, Alexander Nix was explaining how Cambridge Analytica could use this kind of research to find people of different personality types and target them with specific messages that might influence their behavior.

[1:21] If you know the personality of the people you're targeting, you can nuance your messaging to resonate more effectively with those key audience groups, because it's personality that drives behavior, and behavior that obviously influences how you vote.
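As a rough illustration of the targeting step Nix describes: once each person has predicted trait scores, a campaign could key its message variants to whichever trait scores highest. The traits, scores and ad copy below are invented for the example and are not from any real campaign.

```python
# Illustrative sketch of personality-keyed message selection (invented traits and copy).
MESSAGES = {
    "neuroticism": "Keep your family safe - here's how.",
    "openness": "Imagine what we could build together.",
    "conscientiousness": "A clear, costed plan you can rely on.",
}

def pick_message(trait_scores: dict) -> str:
    """Return the ad variant keyed to the person's highest predicted trait."""
    dominant = max(trait_scores, key=trait_scores.get)
    return MESSAGES.get(dominant, "A message for everyone.")

# Example: a user whose strongest predicted trait is neuroticism gets the fear-framed ad.
print(pick_message({"neuroticism": 0.8, "openness": 0.3, "conscientiousness": 0.5}))
```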
[1:38] Soon afterwards, these techniques were used by two political campaigns that would rock the world.
[1:50] Yes, your likes and dislikes, your comments and posts, your personal data: they are valuable. But it's what they say about you as a person, that's where the real power lies.

[2:03] No one knows exactly how much these techniques actually contributed to the results of the votes. One of the first researchers to ask the question was Paul-Olivier Dehaye. He worked on an article at the end of 2016 that investigated what was happening, and this week he was here in London to give evidence to MPs about the latest revelations. Sitting alongside him in the Commons Select Committee was Cambridge Analytica whistleblower Christopher Wylie, and straight after the session Paul sat down with me.
[2:35] This isn't just about Facebook, and this isn't just about Cambridge Analytica, is it? This kind of data collection and analysis has been going on for a long time, and it's being done by lots of people.

[2:47] Right, so in two ways it's not just about those companies. Facebook enables a lot more companies than just Cambridge Analytica to suck out data in similar ways, so that's the first thing. And then Facebook is just one player in a big ecosystem of online advertising and online profiling. Some of the companies you have heard of, but some of them you just have no relationship with.

[3:11] Even if you fully understand the terms and conditions that you're agreeing to about what data you're sharing, I don't think anyone really understood what can be inferred from the data. So not the list of your friends, not your likes and dislikes, but the things that you've never talked about, that now they can tell from your digital footprint.

[3:31] Yeah, it's really hard to understand the inference power of this data, what can be deduced from it that's true: how people make decisions, basically whether they think about the issue before making a decision or not. Another way to say this is that they were trying to find gullible people. So if you are able to do that, you can just make them buy into anything, into any content.
[3:53] It's easy to believe that Facebook managed to swing the US election, to swing Brexit. Was it only people on Facebook who saw these new ads that were targeted to them, and then went out and possibly changed their votes? Is that what we're talking about, or are we talking about Facebook just being used as a research tool that could then be applied to the wider community?

[4:19] In many ways. Whether people were individually convinced to vote differently, I don't personally believe that's how it happened. What I believe is that Facebook itself could be manipulated, using those techniques, to make some content go viral that would affect public discourse, so that would steer the conversation, basically. And if you're able to do this more or less automated, in a repetitive fashion, then you've partly already won the election, because you're steering the conversation around the election. And that's precisely the point that Hillary Clinton has been making again and again about Cambridge Analytica: their ability to steer the conversation on specific topics, like emails. And that had an impact. The fact that some content was reshared widely during the election had an impact on editorial decisions made by classic media, more established media, which in turn had an impact on other people's opinions.
[5:19] Paul says that even though Facebook and Google have recently allowed us to download everything that they have on us, it's not really everything.

[5:30] So Facebook can collect data on people who don't have Facebook accounts?

[5:36] Yeah, it's called shadow profiles. That practice, for instance, has been forbidden in Belgium, where I'm from. And even people who do have an account are being tracked all over the web, and all of that same information is collected about them. Why can't they see it? Why can't they see all the web pages that Facebook knows they visited? Making that transparent will have a very dramatic effect, I think, in making people aware of how much tracking goes on.
[6:05] Do you think that UK or EU regulation is strong enough when it comes to protecting our data?

[6:14] That's part of what I wanted to say in the committee. We have very strong regulations around personal data that are going to get stronger, but it's completely useless, and actually worse than not having them, if we're not going to enforce them. They need to be enforced, that's the critical point. Currently, things are failing.

[6:31] Why are they not being enforced?

[6:33] Because the regulators currently see their role as balancing commercial interests with democratic interests around oversight of personal data, and the balancing they've done so far was simply wrong: too much on the side of commercial interests and not enough on the counterbalancing side.

[6:51] Facebook's reputation and its wealth have taken a massive hit in the last couple of weeks, with 80 billion dollars being wiped off its value. So can the recently announced new privacy tools help to restore confidence? Is this the end for Facebook?

[7:12] Facebook can still adapt their ways, they can still change. They will have to anyway, because of the regulation that's coming into force. It's an opportunity to redeem themselves, if you want to say it that way.