You can’t care for patients; you’re not human! | Maneesh Juneja | TEDxPorto

  • 0:11 - 0:14
    When it comes to health care today,
  • 0:14 - 0:19
    what's natural is that
    it's a deeply human, personal experience.
  • 0:19 - 0:24
    Kindness, empathy, compassion,
    eye contact, human touch -
  • 0:24 - 0:26
    when we're feeling sick,
    when we're unwell,
  • 0:27 - 0:29
    when we need help and reassurance,
  • 0:29 - 0:33
    we turn to human doctors,
    human healthcare professionals.
  • 0:34 - 0:36
    That's what's natural right now.
  • 0:36 - 0:40
    However, I want to tell you a story
  • 0:40 - 0:43
    about something that happened
    to me and my family
  • 0:43 - 0:46
    which made me rethink
    what's going to be natural in health care
  • 0:46 - 0:49
    and whether humans
    would always be at the center
  • 0:49 - 0:51
    of the healthcare experience.
  • 0:52 - 0:55
    So last year, in England,
  • 0:55 - 0:57
    my mother called a service
    on the telephone
  • 0:57 - 0:59
    provided by the healthcare system,
  • 0:59 - 1:02
    where you can phone up,
    you describe your symptoms,
  • 1:02 - 1:05
    and they tell you whether to go
    to the emergency room,
  • 1:05 - 1:07
    whether to go to urgent care
  • 1:07 - 1:10
    or whether it's something
    you can take care of at home
  • 1:10 - 1:12
    using over-the-counter
    medicines, for example.
  • 1:13 - 1:16
    So, in the middle of the night, she called up,
  • 1:16 - 1:18
    and they took her through
    the list of questions,
  • 1:18 - 1:20
    and they're rushing her through the list.
  • 1:20 - 1:21
    She's in her seventies.
  • 1:21 - 1:24
    She's sick, she's confused,
    it's the middle of the night.
  • 1:24 - 1:25
    And she was telling them,
  • 1:25 - 1:28
    "Look, I'm really old, I need time
    to think about my answers."
  • 1:28 - 1:29
    And they're rushing her,
  • 1:29 - 1:32
    and she felt that the entire experience,
  • 1:32 - 1:33
    despite there being
  • 1:33 - 1:36
    a human being
    on the end of the telephone
  • 1:36 - 1:37
    in the healthcare system,
  • 1:37 - 1:44
    it was missing compassion,
    empathy, kindness.
  • 1:44 - 1:47
    Now, contrast that with a few weeks later.
  • 1:47 - 1:51
    I was also unwell, and I decided
    to use the same service.
  • 1:51 - 1:54
    But this time, there's a version
    of that telephone hotline
  • 1:54 - 1:56
    but provided by an app -
    without any human being.
  • 1:57 - 1:59
    And I put in my symptoms,
  • 1:59 - 2:03
    and the end result was it advised me
    I can take care of myself at home;
  • 2:03 - 2:04
    nothing to worry about,
  • 2:04 - 2:06
    "You don't need to go to a facility."
  • 2:06 - 2:08
    But what I found was really interesting
  • 2:08 - 2:10
    was compared to my mother's experience
  • 2:10 - 2:13
    of the human being on the telephone
    rushing her through the questions
  • 2:13 - 2:18
    with this machine, using this app,
    I had time to think about my answers.
  • 2:18 - 2:20
    I had time to respond.
  • 2:20 - 2:21
    I wasn't rushed.
  • 2:21 - 2:25
    And it really felt like
    a compassionate experience.
  • 2:26 - 2:28
    And it made me think:
  • 2:28 - 2:29
    Are we at a point
  • 2:29 - 2:34
    where one day we might have
    machines that can care for us,
  • 2:34 - 2:39
    machines that can do
    what humans currently do today?
  • 2:41 - 2:43
    When we think about the greater challenge,
  • 2:43 - 2:46
    we currently have 7.5 billion
    people on this planet.
  • 2:46 - 2:49
    Many of them don't have access
    to any healthcare facilities,
  • 2:49 - 2:53
    and even if they do have access,
    it's often unaffordable
  • 2:53 - 2:54
    or there are long queues.
  • 2:54 - 2:56
    And if you think,
    by the end of this century,
  • 2:56 - 3:00
    it's forecasted that we're going to go
    from 7.5 billion to 11 billion people -
  • 3:00 - 3:04
    how are we going to provide
    safe and affordable health care
  • 3:04 - 3:07
    to those 11 billion people
    by the end of this century?
  • 3:08 - 3:10
    And on top of that, there's another trend,
  • 3:10 - 3:13
    that healthcare systems are trying
    to reposition themselves, saying,
  • 3:13 - 3:15
    "Instead of just coming
    to us when you're sick,
  • 3:15 - 3:18
    what can we do to help
    prevent you from getting a disease?
  • 3:18 - 3:21
    How can we help you stay healthy?"
  • 3:21 - 3:22
    So self-care
  • 3:22 - 3:27
    and taking more responsibility
    for our own health on a daily basis.
  • 3:27 - 3:30
    And that's where AI is coming in:
    Artificial Intelligence.
  • 3:31 - 3:35
    So you'll hear stories;
    influential people saying,
  • 3:35 - 3:37
    "We've got to be really scared about AI
  • 3:37 - 3:40
    because if it develops
    and we let it develop,
  • 3:40 - 3:42
    it's going to end up ruling us
  • 3:42 - 3:44
    and we're going to become
    slaves to these machines."
  • 3:45 - 3:47
    But I believe, especially in health care,
  • 3:47 - 3:49
    what if we can develop AI
  • 3:49 - 3:52
    that's compassionate,
    that's kind, that's empathetic,
  • 3:52 - 3:55
    and it can fill this gap
    in helping 11 billion people
  • 3:55 - 3:57
    in the future at the end of this century
  • 3:57 - 4:01
    have access to health care
    whenever they want it, at the right time?
  • 4:03 - 4:04
    I want to share my experiences.
  • 4:04 - 4:06
    I've been trying a number of chatbots.
  • 4:06 - 4:09
    You may have tried chatbots
    for customer service on a website.
  • 4:10 - 4:12
    And this one is called Wysa -
  • 4:12 - 4:14
    it's developed in India -
  • 4:14 - 4:17
    a compassionate AI chatbot.
  • 4:17 - 4:19
    And what I found when I was using it
  • 4:19 - 4:21
    is that it actually feels
  • 4:21 - 4:24
    like you're talking or you're chatting
    to a real human being.
  • 4:24 - 4:29
    And the founder of this chatbot,
    Wysa, is a lady called Jo Aggarwal -
  • 4:29 - 4:30
    based in India -
  • 4:30 - 4:35
    and in her TEDx Talk last year,
    she talked about what changed for her:
  • 4:35 - 4:38
    that after some users
    had been using it,
  • 4:38 - 4:40
    she got a message from one of the users,
  • 4:40 - 4:47
    a 13-year-old girl who said
    that she had suicidal thoughts
  • 4:47 - 4:54
    and what helped her cling on to life,
    at that point, was Wysa,
  • 4:54 - 4:57
    this app on her phone.
  • 4:58 - 4:59
    Think about it.
  • 4:59 - 5:01
    Imagine the people out there
    around the world.
  • 5:02 - 5:04
    They don't feel like
    they have friends or family
  • 5:05 - 5:09
    or even a psychiatrist or a psychologist
    to turn to in that moment.
  • 5:09 - 5:13
    What if they could turn
    to a machine that cares?
  • 5:13 - 5:15
    Is that a bad thing
    just because it's not natural?
  • 5:15 - 5:19
    Think of how many lives could be saved
    when it comes to mental health.
  • 5:22 - 5:24
    You've got the rise of smart speakers
  • 5:24 - 5:26
    from Google, from Amazon,
    even from Apple now.
  • 5:26 - 5:29
    So children now growing up
  • 5:29 - 5:32
    are finding it perfectly
    normal and natural
  • 5:32 - 5:34
    to have a conversation
    with a smart speaker
  • 5:34 - 5:39
    and ask it the news, ask it the weather,
    possibly even get information on health.
  • 5:39 - 5:42
    And then, if you think about
    the potential of this:
  • 5:42 - 5:45
    researchers are already working
    on algorithms behind the scenes
  • 5:45 - 5:48
    that will take this voice data
    while you're having a conversation,
  • 5:49 - 5:54
    and it could even potentially detect
    conditions like Alzheimer's and Parkinson's
  • 5:54 - 5:56
    just from your voice data.
  • 5:56 - 5:59
    Think how powerful
    that could be on a global basis,
  • 5:59 - 6:01
    just from these speakers as they evolve.
  • 6:04 - 6:06
    It's even coming to toothbrushes.
  • 6:06 - 6:09
    So this is me,
    brushing my teeth in London
  • 6:09 - 6:12
    with the world's first toothbrush
    which is powered by AI.
  • 6:12 - 6:17
    It has sensors in it that monitor
    my brushing frequency, my habits,
  • 6:17 - 6:18
    how much pressure I'm applying.
  • 6:19 - 6:21
    But what surprised me
  • 6:21 - 6:26
    was that this toothbrush
    actually cares about my sleep as well.
  • 6:26 - 6:29
    So it noticed that I was brushing my teeth
    consistently late at night.
  • 6:29 - 6:32
    And it says, "Hey, maybe you want
    to go to bed earlier
  • 6:32 - 6:33
    and have a good night's sleep."
  • 6:33 - 6:35
    How would you feel about this world
  • 6:35 - 6:38
    where even your toothbrush
    is now caring for you?
  • 6:38 - 6:40
    Not just about your teeth
    but about your sleep.
  • 6:40 - 6:44
    What next? It's going to tell me
    what pizza I should eat tomorrow?
  • 6:44 - 6:46
    (Laughter)
  • 6:48 - 6:53
    So potentially in the future,
    we might have machines that care for us
  • 6:53 - 6:59
    because they can recognize and respond
    to our emotions in real time.
  • 6:59 - 7:02
    This is what researchers
    out there are working on.
  • 7:02 - 7:04
    They have this vision of machines
  • 7:04 - 7:08
    that are as compassionate
    and kind and empathetic
  • 7:08 - 7:10
    as any of us as humans.
  • 7:10 - 7:15
    So it might not feel artificial anymore
  • 7:15 - 7:20
    to deal with a machine
    in these scenarios in health care.
  • 7:20 - 7:23
    So there's a company
    out in the US, Affectiva.
  • 7:23 - 7:26
    They're working on emotion AI
    for automotive, for example.
  • 7:27 - 7:29
    So picture this scenario:
  • 7:29 - 7:30
    you're stuck in rush-hour traffic,
  • 7:30 - 7:31
    here in Porto,
  • 7:31 - 7:34
    you're feeling a bit sad,
    it's been a long day,
  • 7:34 - 7:35
    you're stressed out.
  • 7:35 - 7:37
    How would you feel if the car could detect
  • 7:37 - 7:40
    that you're feeling sad
    from your expressions on your face
  • 7:40 - 7:43
    and how you're looking
    and how you're feeling?
  • 7:43 - 7:44
    And then, maybe suggest,
  • 7:44 - 7:46
    "Oh, maybe I'll play you
    some music to cheer you up,
  • 7:46 - 7:50
    or maybe I'll change
    the fresh-air setting in the car."
  • 7:50 - 7:54
    Or maybe if it can detect
    that you're feeling tired, it says,
  • 7:54 - 7:56
    "Pull over at the next rest area."
  • 7:56 - 7:59
    So think about even these cars
    that potentially will care for us,
  • 7:59 - 8:01
    these machines.
  • 8:01 - 8:04
    How do we feel about
    more and more machines caring for us?
  • 8:05 - 8:06
    [What do people want?]
  • 8:06 - 8:09
    And it's important to understand
    what people want.
  • 8:09 - 8:11
    It's not just about
    technology people saying,
  • 8:11 - 8:14
    "We can develop this. Let's roll this out.
    It's great for humanity."
  • 8:14 - 8:15
    Let's ask people.
  • 8:16 - 8:19
    So there was a survey done that asked
    people in a number of countries,
  • 8:20 - 8:21
    "How willing would you be
  • 8:21 - 8:26
    to engage with AI and robotics
    for your healthcare needs?"
  • 8:26 - 8:29
    They asked a number of people
    in a number of countries.
  • 8:29 - 8:34
    The country where people were most willing
    to engage
  • 8:34 - 8:38
    with AI and robotics
    for their healthcare needs
  • 8:38 - 8:40
    was Nigeria, in Africa.
  • 8:40 - 8:45
    94% of people surveyed in Nigeria said,
    "Yes, I'd be willing to engage with AI."
  • 8:45 - 8:48
    So when you think about
    where are the places in the world
  • 8:48 - 8:52
    that are going to try these new advances
    in terms of caring machines,
  • 8:52 - 8:54
    where is it going to hit first?
  • 8:54 - 8:56
    Who's going to be most open to it?
  • 8:56 - 9:00
    It could be countries we don't associate
    with innovation and technology.
  • 9:00 - 9:02
    And in fact, in that same survey
  • 9:02 - 9:04
    in the UK only 39% of the people
  • 9:04 - 9:08
    said they'd be willing to engage with AI
    and robotics for healthcare needs.
  • 9:10 - 9:12
    And then, what about
    when we might prefer -
  • 9:12 - 9:15
    even if you've got a human being
    available in health care -
  • 9:15 - 9:20
    when might you prefer
    to deal with a machine?
  • 9:20 - 9:23
    So there's another survey,
    this one done in 2016,
  • 9:23 - 9:25
    they asked smartphone users
    around the world.
  • 9:26 - 9:31
    And they found that actually 29%
    of those people surveyed
  • 9:31 - 9:33
    said for sensitive matters,
  • 9:33 - 9:38
    they'd prefer to deal with an AI
    medical adviser rather than a human being.
  • 9:38 - 9:39
    Think about that.
  • 9:39 - 9:44
    What could drive you to prefer
    a machine over a human being?
  • 9:45 - 9:48
    I was thinking, maybe it's mental health,
    maybe it's sexual health.
  • 9:48 - 9:51
    How many of you have had
    an embarrassing condition,
  • 9:51 - 9:55
    and it's either delayed you seeking help
    or you haven't looked for help at all
  • 9:55 - 9:57
    because it's just too embarrassing?
  • 9:57 - 9:59
    You don't want another human being
  • 9:59 - 10:01
    to judge you because
    you have that condition.
  • 10:03 - 10:06
    But there are tough decisions ahead;
    it's not going to be easy.
  • 10:06 - 10:07
    Again, like I said,
  • 10:07 - 10:10
    it's not a case of just
    this technology is here, let's use it.
  • 10:10 - 10:12
    We have to be very, very cautious
  • 10:12 - 10:15
    because, again, this is our health,
    this is our well-being.
  • 10:17 - 10:20
    And there was a video for a robot
    that's come out of London.
  • 10:20 - 10:23
    And in this video,
    their promotional video,
  • 10:23 - 10:27
    it showed somebody reaching
    for a midnight snack.
  • 10:27 - 10:29
    Many of us have had midnight snacks.
  • 10:29 - 10:32
    We're not supposed to -
    it's bad for our health - but we do.
  • 10:32 - 10:35
    And in this video,
    this robot that was in the kitchen
  • 10:35 - 10:37
    noticed that they're reaching
    for their midnight snack
  • 10:37 - 10:39
    and prompted, as they opened the door,
  • 10:39 - 10:42
    "Hey, sure you want to have
    that burger or that cake?
  • 10:42 - 10:46
    That's an extra two hours of running
    in the morning to burn it off."
  • 10:46 - 10:49
    And that person then decides
    to follow the robot's advice
  • 10:50 - 10:52
    and not have that midnight snack.
  • 10:52 - 10:55
    How do you feel about a world
    where you'll stay healthy,
  • 10:56 - 10:59
    you'll be prevented from having
    so many diseases
  • 10:59 - 11:03
    because these machines
    will limit our choices on a daily basis,
  • 11:03 - 11:06
    whether it's in the car or the office,
    at home, wherever we are?
  • 11:06 - 11:09
    How do you feel about your freedom
    being affected there
  • 11:09 - 11:11
    with machines that care?
  • 11:12 - 11:16
    And then, underpinning this revolution
    is going to be data about us, day to day -
  • 11:16 - 11:19
    our preferences,
    our behaviors, our habits.
  • 11:19 - 11:23
    And think about the recent scandal
    with Facebook and Cambridge Analytica,
  • 11:23 - 11:25
    and then privacy and trust.
  • 11:25 - 11:27
    Who's going to be having access
    to all this data?
  • 11:27 - 11:29
    Is it a health insurer?
  • 11:29 - 11:31
    Are people going
    to be using it to penalize us
  • 11:31 - 11:34
    because we're making decisions
    that are not healthy?
  • 11:34 - 11:37
    There's so much to think about
    on this road ahead.
  • 11:38 - 11:40
    And then errors as well.
  • 11:40 - 11:45
    So the possibility is that, actually,
    if we turn to machines that care,
  • 11:45 - 11:47
    maybe they're not going
    to make mistakes
  • 11:47 - 11:49
    like human healthcare professionals do;
  • 11:49 - 11:51
    they're not going
    to give us the wrong drug,
  • 11:51 - 11:53
    they're not going to misdiagnose us.
  • 11:53 - 11:57
    But equally, these digital systems
    that are going to care for us
  • 11:57 - 11:58
    are not perfect either.
  • 11:58 - 12:02
    So just this week, I was trying something
    from Microsoft called Cortana.
  • 12:02 - 12:04
    It's a general-purpose
    assistant on my phone.
  • 12:04 - 12:06
    I said, "I'm bleeding,"
    and it says, "Okay."
  • 12:06 - 12:09
    I said, "I'm bleeding to death,"
    and it says, "Interesting."
  • 12:09 - 12:10
    (Laughter)
  • 12:10 - 12:13
    So we have to be really
    careful in this journey:
  • 12:13 - 12:15
    who we trust, what we trust.
  • 12:16 - 12:18
    And accountability is really important.
  • 12:18 - 12:20
    So today, if a doctor makes a mistake,
  • 12:20 - 12:22
    we can go and say,
  • 12:22 - 12:24
    "Hey, you made a mistake,
    you gave me the wrong medicine."
  • 12:25 - 12:27
    Who are we going to hold accountable
  • 12:27 - 12:29
    when these algorithms,
    these robots, these chatbots -
  • 12:29 - 12:31
    these machines that care for us -
  • 12:31 - 12:32
    get something wrong?
  • 12:32 - 12:34
    How do we even know
    when they've made a mistake?
  • 12:34 - 12:36
    There's millions of lines of code.
  • 12:36 - 12:37
    Who's going to regulate them?
  • 12:39 - 12:41
    Another thing I was thinking about
  • 12:41 - 12:47
    is whether it could potentially make
    the world more unequal.
  • 12:47 - 12:50
    Could it be that, in the future,
  • 12:50 - 12:53
    only the very rich have access
    to human healthcare professionals?
  • 12:53 - 12:56
    For the rest of us,
    we all have access to health care,
  • 12:56 - 12:59
    but it's via a machine
    or a type of machine.
  • 13:00 - 13:02
    Think of Japan as well.
  • 13:02 - 13:05
    Now, in some ways
    they're ahead of the curve.
  • 13:05 - 13:07
    They've got an aging population,
    a shrinking workforce,
  • 13:08 - 13:12
    and they're also leading the world
    when it comes to research of care robots,
  • 13:12 - 13:13
    especially for the elderly.
  • 13:13 - 13:16
    And I met a lady from Japan
    a few years ago; she said,
  • 13:16 - 13:21
    "I'd rather be cared for by a robot nurse
    than a foreign nurse." Why?
  • 13:21 - 13:24
    Because they don't want
    to open themselves to immigration.
  • 13:25 - 13:27
    So what does that mean here in Europe?
  • 13:27 - 13:28
    You've got Brexit.
  • 13:28 - 13:31
    Part of the drive behind Brexit
    was a fear of immigration.
  • 13:31 - 13:32
    We want to reduce immigration.
  • 13:32 - 13:36
    So does that mean here in Europe
    we're going to end up in a world
  • 13:36 - 13:40
    where we do have these machines
    that care for us
  • 13:40 - 13:42
    because, suddenly,
    we're afraid of foreigners?
  • 13:42 - 13:44
    We won't have enough people to care.
  • 13:44 - 13:47
    Then what about loneliness,
    an epidemic of loneliness?
  • 13:47 - 13:49
    So in the UK it was suggested
  • 13:49 - 13:52
    that we give these smart speakers
    to the elderly to combat loneliness.
  • 13:52 - 13:54
    Is this the kind of world
    we want to build,
  • 13:54 - 13:58
    when we have a connection
    to a machine rather than to each other?
  • 13:58 - 14:00
    Is that really the way forwards?
  • 14:00 - 14:02
    So we have to have balance.
  • 14:03 - 14:06
    We can't be too afraid;
    we can't be too excited.
  • 14:06 - 14:09
    We have to take a middle path
    when it comes to this future.
  • 14:11 - 14:13
    But if it's done well, if we do it right,
  • 14:13 - 14:16
    if we really involve everybody
    in the conversation,
  • 14:16 - 14:19
    I believe that, potentially,
    machines that care
  • 14:19 - 14:23
    could increase the humanity
    in health care around the world.
  • 14:23 - 14:27
    Think about those 11 billion people
    by the end of this century.
  • 14:27 - 14:30
    So at what point in the century
    will it become natural
  • 14:30 - 14:33
    that when we're sick,
    when we want health advice,
  • 14:33 - 14:36
    we turn to a machine
    rather than a human being?
  • 14:37 - 14:40
    And, finally, I want to ask you:
  • 14:40 - 14:45
    Are you ready for a world
    where we have machines that care?
  • 14:46 - 14:47
    Thank you.
  • 14:47 - 14:49
    (Applause)
Description:

When it comes to improving the health of the world, many eyes are looking at recent advancements in Artificial Intelligence (AI) with hope. Some even believe that AI will solve every problem in health care. Our homes will become smart enough to know that we haven’t been active enough and will instruct the fridge to suggest healthy recipes personalized to our nutritional goals. Our self-driving cars will track our health during the commute to the office and schedule doctor’s appointments. We will grow old surrounded by robot companions who can take care of our every need, never feeling tired, never complaining, but anticipating our needs in advance. Technology companies, large and small, are looking at health care and wondering, “How can my algorithm make this better, cheaper and faster?”

Humans are inconsistent, make mistakes, and don’t always smile at you if they are having a bad day. Children born today, growing up with digital assistants, may be the first generation that believes intelligent machines can "care" just as well as, if not better than, any human doctor or nurse.

Health care is about humans; it involves listening, empathy, and compassion. Machines can’t care for us; only humans can. However, what if some patients prefer interacting with a machine rather than a human? Should we be holding back progress because we don’t understand this?

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx

Video Language:
English
Team:
closed TED
Project:
TEDxTalks
Duration:
15:00
