How tech companies deceive you into giving up your data and privacy
Do you remember how, when you were a child, you probably had a favorite toy that was a constant companion, like Christopher Robin had Winnie the Pooh, and your imagination fueled endless adventures? What could be more innocent than that?

Well, let me introduce you to my friend Cayla. Cayla was voted toy of the year in countries around the world. She connects to the internet and uses speech recognition technology to answer your child's questions and respond just like a friend. But the power doesn't lie with your child's imagination. It actually lies with the company harvesting masses of personal information while your family is innocently chatting away in the safety of their home -- a dangerously false sense of security.

This case sounded alarm bells for me, as it is my job to protect consumers' rights in my country. And with billions of devices such as cars, energy meters and even vacuum cleaners expected to come online by 2020, we thought this was a case worth investigating further. Because what was Cayla doing with all the interesting things she was learning? Did she have another friend she was loyal to and shared her information with? Yes, you guessed right. She did.

In order to play with Cayla, you need to download an app to access all her features. Parents must consent to the terms being changed without notice. The recordings of the child, her friends and family can be used for targeted advertising. And all this information can be shared with unnamed third parties. Enough? Not quite. Anyone with a smartphone can connect to Cayla within a certain distance.

When we confronted the company that made and programmed Cayla, they issued a series of statements saying that one had to be an IT expert in order to breach the security. Shall we fact-check that statement and live hack Cayla together?

Here she is. Cayla is equipped with a Bluetooth device which can transmit up to 60 feet, a bit less if there's a wall between. That means I, or any stranger, can connect to the doll while being outside the room where Cayla and her friends are. And to illustrate this, I'm going to turn Cayla on now. Let's see, one, two, three. There. She's on. And I asked a colleague to stand outside with his smartphone, and he's connected, and to make this a bit creepier ...

(Laughter)

let's see what kids could hear Cayla say in the safety of their room.

Man: Hi. My name is Cayla. What is yours?

Finn Myrstad: Uh, Finn.

Man: Is your mom close by?

FM: Uh, no, she's in the store.

Man: Ah. Do you want to come out and play with me?

FM: That's a great idea.

Man: Ah, great.

FM: I'm going to turn Cayla off now.

(Laughter)

We needed no password, nor did we have to circumvent any other type of security, to do this.
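
As an aside for technically minded readers: here is a minimal Python sketch of what "connecting without a password" can look like in practice, using the PyBluez library. The device-name check and the RFCOMM channel number are illustrative assumptions, not details from the talk; the point is simply that nothing in this flow asks for a PIN, pairing confirmation, or any other credential.

```python
# Hypothetical sketch: discovering and connecting to a nearby Bluetooth
# device that requires no authentication. Uses PyBluez (pip install pybluez).
# The name "Cayla" and RFCOMM channel 1 are assumptions for illustration.
import bluetooth

# Scan for discoverable devices in radio range (roughly up to 60 feet).
nearby = bluetooth.discover_devices(duration=8, lookup_names=True)

for addr, name in nearby:
    if "cayla" in name.lower():
        print(f"Found {name} at {addr}; connecting...")
        sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
        # No PIN, pairing prompt, or password exchange happens here.
        sock.connect((addr, 1))
        print("Connected.")
        sock.close()
```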

We published a report in 20 countries around the world, exposing this significant security flaw and many other problematic issues.

So what happened? Cayla was banned in Germany, taken off the shelves by Amazon and Wal-Mart, and she's now peacefully resting at the German Spy Museum in Berlin.

(Laughter)

However, Cayla was also for sale in stores around the world for more than a year after we published our report. What we uncovered is that there are few rules to protect us, and the ones we have are not being properly enforced. We need to get the security and privacy of these devices right before they enter the market, because what is the point of locking a house with a key if anyone can enter it through a connected device?

You may well think, "This will not happen to me. I will just stay away from these flawed devices." But that won't keep you safe, because simply by connecting to the internet, you are put in an impossible take-it-or-leave-it position. Let me show you.

Like most of you, I have dozens of apps on my phone, and used properly, they can make our lives easier, more convenient and maybe even healthier. But have we been lulled into a false sense of security?

It starts simply by ticking a box. Yes, we say, I've read the terms. But have you really read the terms? Are you sure they didn't look too long, and your phone was running out of battery, and the last time you tried they were impossible to understand, and you needed to use the service now? And now, the power imbalance is established, because we have agreed to our personal information being gathered and used on a scale we could never imagine.

This is why my colleagues and I decided to take a deeper look at this. We set out to read the terms of popular apps on an average phone. And to show the world how unrealistic it is to expect consumers to actually read the terms, we printed them, more than 900 pages, and sat down in our office and read them out loud ourselves, streaming the experiment live on our websites. As you can see, it took quite a long time. It took us 31 hours, 49 minutes and 11 seconds to read the terms on an average phone. That is longer than a movie marathon of the "Harry Potter" movies and the "Godfather" movies combined.

(Laughter)

And reading is one thing. Understanding is another story. That would have taken us much, much longer. And this is a real problem, because companies have argued for 20 to 30 years against regulating the internet better, because users have consented to the terms and conditions. As we've shown with this experiment, achieving informed consent is close to impossible.

Do you think it's fair to put the burden of responsibility on the consumer? I don't. I think we should demand less take-it-or-leave-it and more understandable terms before we agree to them.

(Applause)

Thank you.

Now, I would like to tell you a story about love. Some of the world's most popular apps are dating apps, an industry now worth more than, or close to, three billion dollars a year. And of course, we're OK sharing our intimate details with our other half. But who else is snooping, saving and sharing our information while we are baring our souls?

My team and I decided to investigate this. And in order to understand the issue from all angles and to truly do a thorough job, I realized I had to download one of the world's most popular dating apps myself. So I went home to my wife ...

(Laughter)

who I had just married. "Is it OK if I establish a profile on a very popular dating app for purely scientific purposes?"

(Laughter)

This is what we found. Hidden behind the main menu was a preticked box that gave the dating company access to all my personal pictures on Facebook, in my case more than 2,000 of them, and some were quite personal. And to make matters worse, when we read the terms and conditions, we discovered the following, and I'm going to need to take out my reading glasses for this one. And I'm going to read it for you, because this is complicated. All right.

"By posting content" -- and content refers to your pictures, chat and other interactions in the dating service -- "as a part of the service, you automatically grant to the company, its affiliates, licensees and successors an irrevocable" -- which means you can't change your mind -- "perpetual" -- which means forever -- "nonexclusive, transferrable, sublicensable, fully paid-up, worldwide right and license to use, copy, store, perform, display, reproduce, record, play, adapt, modify and distribute the content, prepare derivative works of the content, or incorporate the content into other works and grant and authorize sublicenses of the foregoing in any media now known or hereafter created."

That basically means that all your dating history and everything related to it can be used for any purpose for all time. Just imagine your children seeing your sassy dating photos in a birth control ad 20 years from now.

But seriously, though --

(Laughter)

what might these commercial practices mean to you? For example, financial loss: based on your web browsing history, algorithms might decide whether you will get a mortgage or not. Subconscious manipulation: companies can analyze your emotions based on your photos and chats, targeting you with ads when you are at your most vulnerable. Discrimination: a fitness app can sell your data to a health insurance company, preventing you from getting coverage in the future. All of this is happening in the world today.

But of course, not all uses of data are malign. Some are just flawed or need more work, and some are truly great. And there is some good news as well. The dating companies changed their policies globally after we filed a legal complaint.

But organizations such as mine that fight for consumers' rights can't be everywhere. Nor can consumers fix this on their own, because if we know that something innocent we said will come back to haunt us, we will stop speaking. If we know that we are being watched and monitored, we will change our behavior. And if we can't control who has our data and how it is being used, we have lost control of our lives.

The stories I have told you today are not random examples. They are everywhere, and they are a sign that things need to change. And how can we achieve that change? Well, companies need to realize that by prioritizing privacy and security, they can build trust and loyalty with their users. Governments must create a safer internet by ensuring enforcement and up-to-date rules. And us, the citizens? We can use our voice to remind the world that technology can only truly benefit society if it respects basic rights.

Thank you so much.

(Applause)

Title: How tech companies deceive you into giving up your data and privacy
Speaker: Finn Lützow-Holm Myrstad
Description: Have you ever actually read the terms and conditions for the apps you use? Finn Myrstad and his team at the Norwegian Consumer Council have, and it took them nearly a day and a half to read the terms of all the apps on an average phone. In a talk about the alarming ways tech companies deceive their users, Myrstad shares insights about the personal information you've agreed to let companies collect -- and how they use your data at a scale you could never imagine.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 12:12