
How tech companies deceive you into giving up your data and privacy

  • 0:01 - 0:03
    Do you remember when you were a child?
  • 0:03 - 0:07
    You probably had a favorite toy
    that was a constant companion,
  • 0:07 - 0:10
    like Christopher Robin
    had Winnie the Pooh,
  • 0:10 - 0:13
    and your imagination
    fueled endless adventures.
  • 0:14 - 0:16
    What could be more innocent than that?
  • 0:17 - 0:22
    Well, let me introduce you
    to my friend Cayla.
  • 0:23 - 0:26
    Cayla was voted toy of the year
    in countries around the world.
  • 0:26 - 0:30
    She connects to the internet
    and uses speech recognition technology
  • 0:30 - 0:32
    to answer your child's questions,
  • 0:32 - 0:34
    and respond just like a friend.
  • 0:35 - 0:39
    But the power doesn't lie
    with your child's imagination.
  • 0:39 - 0:43
    It actually lies with the company
    harvesting masses of personal information
  • 0:43 - 0:49
    while your family is innocently
    chatting away in the safety of their home,
  • 0:49 - 0:51
    a dangerously false sense of security.
  • 0:53 - 0:55
    This case sounded alarm bells for me,
  • 0:56 - 0:59
    as it is my job to protect
    consumers' rights in my country.
  • 1:00 - 1:03
    And with billions of devices such as cars,
  • 1:03 - 1:08
    energy meters, and even vacuum cleaners
    expected to come online by 2020,
  • 1:08 - 1:12
    we thought this was a case
    worth investigating further,
  • 1:12 - 1:14
    because what was Cayla doing
  • 1:14 - 1:17
    with all the interesting things
    she was learning?
  • 1:17 - 1:21
    Did she have another friend she was
    loyal to and shared her information with?
  • 1:22 - 1:24
    Yes, you guessed right. She did.
  • 1:24 - 1:27
    In order to play with Cayla,
  • 1:27 - 1:30
    you need to download an app
    to access all her features.
  • 1:30 - 1:34
    Parents must consent to the terms
    being changed without notice.
  • 1:35 - 1:39
    The recordings of the child,
    her friends and family,
  • 1:39 - 1:41
    can be used for targeted advertising.
  • 1:42 - 1:47
    And all this information can be shared
    with unnamed third parties.
  • 1:48 - 1:50
    Enough? Not quite.
  • 1:51 - 1:55
    Anyone with a smartphone
    can connect to Cayla
  • 1:55 - 1:57
    within a certain distance.
  • 1:58 - 2:02
    When we confronted the company
    that made and programmed Cayla,
  • 2:02 - 2:04
    they issued a series of statements
  • 2:04 - 2:09
    claiming that one had to be an IT expert
    in order to breach the security.
  • 2:10 - 2:14
    Shall we fact-check that statement
    and live-hack Cayla together?
  • 2:18 - 2:19
    Here she is.
  • 2:20 - 2:24
    Cayla is equipped with a Bluetooth device
  • 2:24 - 2:26
    which can transmit up to 60 feet,
  • 2:26 - 2:28
    a bit less if there's a wall in between.
  • 2:28 - 2:34
    That means I, or any stranger,
    can connect to the doll
  • 2:34 - 2:38
    while being outside the room
    where Cayla and her friends are.
  • 2:38 - 2:40
    And to illustrate this,
  • 2:40 - 2:42
    I'm going to turn Cayla on now.
  • 2:42 - 2:44
    Let's see, one, two, three.
  • 2:45 - 2:47
    There. She's on. And I'll ask a colleague
  • 2:47 - 2:49
    to stand outside with his smartphone,
  • 2:49 - 2:50
    and he's connected,
  • 2:51 - 2:53
    and to make this a bit creepier ...
  • 2:53 - 2:57
    (Laughter)
  • 2:58 - 3:02
    let's see what kids could hear Cayla say
    in the safety of their room.
  • 3:04 - 3:07
    Man: Hi. My name is Cayla. What is yours?
  • 3:07 - 3:08
    Finn Myrstad: Uh, Finn.
  • 3:09 - 3:10
    Man: Is your mom close by?
  • 3:10 - 3:12
    FM: Uh, no, she's in the store.
  • 3:13 - 3:15
    Man: Ah. Do you want
    to come out and play with me?
  • 3:15 - 3:17
    FM: That's a great idea.
  • 3:18 - 3:19
    Man: Ah. Great.
  • 3:20 - 3:23
    FM: I'm going to turn Cayla off now.
  • 3:23 - 3:24
    (Laughter)
  • 3:27 - 3:30
    To do this, we needed no password,
  • 3:30 - 3:33
    nor did we have to circumvent
    any other type of security.
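
(To make concrete how low the bar was: the sketch below is an editorial illustration, not the tool used on stage. It shows how any nearby Bluetooth LE device that advertises itself can be discovered with a few lines of Python and the open-source "bleak" library. Cayla reportedly paired like an ordinary Bluetooth headset, so a classic-Bluetooth scan would be the closer analogue; the devices printed here are simply whatever happens to be in range.)

```python
# A minimal discovery sketch, assuming a Bluetooth LE scan as an
# analogue for Cayla's unauthenticated Bluetooth pairing.
# Requires: pip install bleak
import asyncio
from bleak import BleakScanner

async def scan_nearby() -> None:
    # Discover every advertising BLE device in range (10-second scan).
    devices = await BleakScanner.discover(timeout=10.0)
    for device in devices:
        # An unsecured device announces itself to anyone listening;
        # no password or pairing secret is needed to see it.
        print(f"{device.address}  {device.name or '<unnamed>'}")

if __name__ == "__main__":
    asyncio.run(scan_nearby())
```
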
  • 3:34 - 3:38
    We published a report
    in 20 countries around the world,
  • 3:38 - 3:41
    exposing this significant security flaw
  • 3:41 - 3:43
    and many other problematic issues.
  • 3:44 - 3:45
    So what happened?
  • 3:46 - 3:47
    Cayla was banned in Germany,
  • 3:48 - 3:52
    taken off the shelves
    by Amazon and Wal-Mart,
  • 3:52 - 3:55
    and she's now peacefully resting
  • 3:55 - 3:58
    at the German Spy Museum in Berlin.
  • 3:58 - 4:01
    (Laughter)
  • 4:01 - 4:05
    However, Cayla was still for sale
    in stores around the world
  • 4:05 - 4:09
    for more than a year
    after we published our report.
  • 4:09 - 4:13
    What we uncovered is that
    there are few rules to protect us
  • 4:13 - 4:17
    and the ones we have
    are not being properly enforced.
  • 4:18 - 4:22
    We need to get the security
    and privacy of these devices right
  • 4:22 - 4:25
    before they enter the market,
  • 4:25 - 4:29
    because what is the point
    of locking a house with a key
  • 4:29 - 4:32
    if anyone can enter it
    through a connected device?
  • 4:34 - 4:37
    You may well think,
    "This will not happen to me.
  • 4:37 - 4:40
    I will just stay away
    from these flawed devices."
  • 4:41 - 4:43
    But that won't keep you safe,
  • 4:43 - 4:46
    because simply by
    connecting to the internet,
  • 4:46 - 4:50
    you are put in an impossible
    take-it-or-leave-it position.
  • 4:50 - 4:52
    Let me show you.
  • 4:52 - 4:55
    Like most of you,
    I have dozens of apps on my phone,
  • 4:56 - 4:58
    and used properly,
    they can make our lives easier,
  • 4:58 - 5:01
    more convenient, and maybe even healthier.
  • 5:02 - 5:05
    But have we been lulled
    into a false sense of security?
  • 5:07 - 5:09
    It starts simply by ticking a box.
  • 5:10 - 5:12
    Yes, we say,
  • 5:12 - 5:13
    I've read the terms.
  • 5:15 - 5:18
    But have you really read the terms?
  • 5:19 - 5:21
    Or did they look too long,
  • 5:22 - 5:24
    was your phone running out of battery,
  • 5:24 - 5:27
    were they impossible to understand
    the last time you tried,
  • 5:27 - 5:29
    and did you need to use the service right away?
  • 5:30 - 5:33
    And now, the power
    imbalance is established,
  • 5:34 - 5:37
    because we have agreed
    to our personal information
  • 5:37 - 5:40
    being gathered and used
    on a scale we could never imagine.
  • 5:42 - 5:45
    This is why my colleagues and I
    decided to take a deeper look at this.
  • 5:45 - 5:49
    We set out to read the terms
  • 5:49 - 5:51
    of popular apps on an average phone.
  • 5:51 - 5:55
    And to show the world
    how unrealistic it is
  • 5:55 - 5:58
    to expect consumers
    to actually read the terms,
  • 5:58 - 6:00
    we printed them,
  • 6:00 - 6:02
    more than 900 pages,
  • 6:03 - 6:06
    and sat down in our office
    and read them out loud ourselves,
  • 6:08 - 6:10
    streaming the experiment
    live on our websites.
  • 6:10 - 6:13
    As you can see, it took quite a long time.
  • 6:13 - 6:17
    It took us 31 hours,
    49 minutes, and 11 seconds
  • 6:17 - 6:20
    to read the terms on an average phone.
  • 6:20 - 6:24
    That is longer than a movie marathon
    of the "Harry Potter" movies
  • 6:24 - 6:27
    and the "Godfather" movies combined.
  • 6:27 - 6:28
    (Laughter)
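
(As a rough sanity check of that comparison: the approximate theatrical runtimes below are my own estimates, not figures from the talk, but they show the arithmetic works out.)

```python
# Approximate theatrical runtimes in minutes (editorial estimates).
harry_potter = [152, 161, 142, 157, 138, 153, 146, 130]  # 8 films
godfather = [175, 202, 162]                              # the trilogy

marathon_hours = (sum(harry_potter) + sum(godfather)) / 60
reading_hours = 31 + 49 / 60 + 11 / 3600  # 31:49:11 from the talk

print(f"movie marathon: ~{marathon_hours:.1f} hours")  # ~28.6
print(f"reading terms:  ~{reading_hours:.1f} hours")   # ~31.8
print("claim holds:", reading_hours > marathon_hours)  # True
```
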
  • 6:30 - 6:32
    And reading is one thing.
  • 6:32 - 6:34
    Understanding is another story.
  • 6:34 - 6:37
    That would have taken us
    much, much longer.
  • 6:37 - 6:39
    And this is a real problem,
  • 6:39 - 6:42
    because companies have argued
    for 20 to 30 years
  • 6:42 - 6:45
    against regulating the internet better,
  • 6:45 - 6:48
    on the grounds that users have consented
    to the terms and conditions.
  • 6:51 - 6:52
    As we've shown with this experiment,
  • 6:53 - 6:55
    achieving informed consent
    is close to impossible.
  • 6:57 - 7:01
    Do you think it's fair to put the burden
    of responsibility on the consumer?
  • 7:02 - 7:04
    I don't.
  • 7:04 - 7:07
    I think we should demand
    less take-it-or-leave-it
  • 7:07 - 7:10
    and more understandable terms
    before we agree to them.
  • 7:10 - 7:12
    (Applause)
  • 7:12 - 7:13
    Thank you.
  • 7:16 - 7:21
    Now, I would like to tell you
    a story about love.
  • 7:22 - 7:26
    Some of the world's
    most popular apps are dating apps,
  • 7:26 - 7:30
    an industry now worth close to
    three billion dollars a year.
  • 7:31 - 7:35
    And of course, we're OK
    sharing our intimate details
  • 7:35 - 7:37
    with our other half.
  • 7:37 - 7:39
    But who else is snooping,
  • 7:39 - 7:42
    saving and sharing our information
  • 7:42 - 7:44
    while we are baring our souls?
  • 7:45 - 7:47
    My team and I decided to investigate this,
  • 7:49 - 7:52
    and in order to understand
    the issue from all angles
  • 7:52 - 7:54
    and to truly do a thorough job,
  • 7:55 - 7:57
    I realized I had to download
  • 7:57 - 8:01
    one of the world's
    most popular dating apps myself.
  • 8:02 - 8:05
    So I went home to my wife ...
  • 8:05 - 8:07
    (Laughter)
  • 8:07 - 8:08
    who I had just married.
  • 8:08 - 8:13
    "Is it OK if I establish a profile
    on a very popular dating app
  • 8:13 - 8:15
    for purely scientific purposes?"
  • 8:15 - 8:17
    (Laughter)
  • 8:17 - 8:18
    This is what we found.
  • 8:18 - 8:22
    Hidden behind the main menu
    was a preticked box
  • 8:22 - 8:28
    that gave the dating company access
    to all my personal pictures on Facebook,
  • 8:28 - 8:31
    in my case more than 2,000 of them,
  • 8:31 - 8:33
    and some were quite personal.
  • 8:34 - 8:37
    And to make matters worse,
  • 8:37 - 8:39
    when we read the terms and conditions,
  • 8:39 - 8:40
    we discovered the following,
  • 8:40 - 8:43
    and I'm going to need to take out
    my reading glasses for this one.
  • 8:44 - 8:47
    And I'm going to read it for you,
    because this is complicated.
  • 8:47 - 8:49
    All right.
  • 8:49 - 8:51
    "By posting content" --
  • 8:51 - 8:53
    and content refers to your pictures, chat
  • 8:53 - 8:55
    and other interactions
    in the dating service --
  • 8:55 - 8:56
    "as a part of the service,
  • 8:57 - 8:58
    you automatically grant to the company,
  • 8:59 - 9:01
    its affiliates, licensees and successors
  • 9:01 - 9:04
    an irrevocable" -- which means
    you can't change your mind --
  • 9:04 - 9:07
    "perpetual" -- which means forever --
  • 9:07 - 9:10
    "nonexclusive, transferrable,
    sublicensable, fully paid-up,
  • 9:10 - 9:13
    worldwide right and license
    to use, copy, store, perform,
  • 9:13 - 9:14
    display, reproduce, record,
  • 9:14 - 9:16
    play, adapt, modify
    and distribute the content,
  • 9:16 - 9:18
    prepare derivative works of the content,
  • 9:18 - 9:20
    or incorporate the content
    into other works
  • 9:20 - 9:23
    and grant and authorize sublicenses
    of the foregoing in any media
  • 9:23 - 9:25
    now known or hereafter created."
  • 9:29 - 9:32
    That basically means
    that all your dating history
  • 9:32 - 9:38
    and everything related to it
    can be used for any purpose for all time.
  • 9:39 - 9:43
    Just imagine your children
    seeing your sassy dating photos
  • 9:44 - 9:46
    in a birth control ad 20 years from now.
  • 9:48 - 9:50
    But seriously, though --
  • 9:50 - 9:51
    (Laughter)
  • 9:53 - 9:55
    what might these commercial
    practices mean to you?
  • 9:56 - 9:59
    For example, financial loss:
  • 9:59 - 10:01
    based on your web browsing history,
  • 10:01 - 10:04
    algorithms might decide
    whether you will get a mortgage or not.
  • 10:05 - 10:06
    Subconscious manipulation:
  • 10:08 - 10:11
    companies can analyze your emotions
    based on your photos and chats,
  • 10:11 - 10:15
    targeting you with ads
    when you are at your most vulnerable.
  • 10:15 - 10:16
    Discrimination:
  • 10:16 - 10:19
    a fitness app can sell your data
    to a health insurance company,
  • 10:19 - 10:22
    preventing you from getting
    coverage in the future.
  • 10:22 - 10:25
    All of this is happening
    in the world today.
  • 10:26 - 10:29
    But of course, not all uses
    of data are malign.
  • 10:29 - 10:31
    Some are just flawed or need more work,
  • 10:31 - 10:33
    and some are truly great.
  • 10:36 - 10:39
    And there is some good news as well.
  • 10:39 - 10:43
    The dating companies
    changed their policies globally
  • 10:43 - 10:44
    after we filed a legal complaint.
  • 10:46 - 10:48
    But organizations such as mine
  • 10:48 - 10:51
    that fight for consumers' rights
    can't be everywhere.
  • 10:51 - 10:54
    Nor can consumers fix this on their own,
  • 10:54 - 10:58
    because if we know
    that something innocent we said
  • 10:58 - 10:59
    will come back to haunt us,
  • 10:59 - 11:01
    we will stop speaking.
  • 11:01 - 11:04
    If we know that we are being
    watched and monitored,
  • 11:04 - 11:06
    we will change our behavior.
  • 11:07 - 11:10
    And if we can't control who has our data
    and how it is being used,
  • 11:10 - 11:12
    we have lost control of our lives.
  • 11:14 - 11:18
    The stories I have told you today
    are not random examples.
  • 11:18 - 11:20
    They are everywhere
  • 11:20 - 11:23
    and they are a sign
    that things need to change.
  • 11:23 - 11:25
    And how can we achieve that change?
  • 11:25 - 11:30
    Well, companies need to realize
    that by prioritizing privacy and security,
  • 11:30 - 11:33
    they can build trust
    and loyalty with their users.
  • 11:35 - 11:38
    Governments must create a safer internet
  • 11:38 - 11:41
    by ensuring enforcement
    and up-to-date rules.
  • 11:41 - 11:44
    And us, the citizens?
  • 11:44 - 11:45
    We can use our voice
  • 11:45 - 11:51
    to remind the world that technology
    can only truly benefit society
  • 11:51 - 11:53
    if it respects basic rights.
  • 11:54 - 11:55
    Thank you so much.
  • 11:55 - 11:59
    (Applause)