
How to spot a liar

  • 0:00 - 0:05
    Okay, now I don't want to alarm anybody in this room,
  • 0:05 - 0:07
    but it's just come to my attention
  • 0:07 - 0:09
    that the person to your right is a liar.
  • 0:09 - 0:11
    (Laughter)
  • 0:11 - 0:14
    Also, the person to your left is a liar.
  • 0:14 - 0:17
    Also the person sitting in your very seats is a liar.
  • 0:17 - 0:19
    We're all liars.
  • 0:19 - 0:21
    What I'm going to do today
  • 0:21 - 0:24
    is I'm going to show you what the research says about why we're all liars,
  • 0:24 - 0:26
    how you can become a liespotter
  • 0:26 - 0:29
    and why you might want to go the extra mile
  • 0:29 - 0:32
    and go from liespotting to truth seeking,
  • 0:32 - 0:34
    and ultimately to trust building.
  • 0:34 - 0:37
    Now speaking of trust,
  • 0:37 - 0:40
    ever since I wrote this book, "Liespotting,"
  • 0:40 - 0:43
    no one wants to meet me in person anymore, no, no, no, no, no.
  • 0:43 - 0:46
    They say, "It's okay, we'll email you."
  • 0:46 - 0:48
    (Laughter)
  • 0:48 - 0:52
    I can't even get a coffee date at Starbucks.
  • 0:52 - 0:54
    My husband's like, "Honey, deception?
  • 0:54 - 0:57
    Maybe you could have focused on cooking. How about French cooking?"
  • 0:57 - 0:59
    So before I get started, what I'm going to do
  • 0:59 - 1:02
    is I'm going to clarify my goal for you,
  • 1:02 - 1:04
    which is not to teach a game of Gotcha.
  • 1:04 - 1:06
    Liespotters aren't those nitpicky kids,
  • 1:06 - 1:09
    those kids in the back of the room that are shouting, "Gotcha! Gotcha!
  • 1:09 - 1:12
    Your eyebrow twitched. You flared your nostril.
  • 1:12 - 1:15
    I watch that TV show 'Lie To Me.' I know you're lying."
  • 1:15 - 1:17
    No, liespotters are armed
  • 1:17 - 1:20
    with scientific knowledge of how to spot deception.
  • 1:20 - 1:22
    They use it to get to the truth,
  • 1:22 - 1:24
and they do what mature leaders do every day;
  • 1:24 - 1:27
    they have difficult conversations with difficult people,
  • 1:27 - 1:29
    sometimes during very difficult times.
  • 1:29 - 1:31
    And they start up that path
  • 1:31 - 1:33
    by accepting a core proposition,
  • 1:33 - 1:35
    and that proposition is the following:
  • 1:35 - 1:38
    Lying is a cooperative act.
  • 1:38 - 1:42
    Think about it, a lie has no power whatsoever by its mere utterance.
  • 1:42 - 1:44
    Its power emerges
  • 1:44 - 1:46
    when someone else agrees to believe the lie.
  • 1:46 - 1:48
    So I know it may sound like tough love,
  • 1:48 - 1:52
    but look, if at some point you got lied to,
  • 1:52 - 1:54
    it's because you agreed to get lied to.
  • 1:54 - 1:57
    Truth number one about lying: Lying's a cooperative act.
  • 1:57 - 1:59
    Now not all lies are harmful.
  • 1:59 - 2:02
    Sometimes we're willing participants in deception
  • 2:02 - 2:05
    for the sake of social dignity,
  • 2:05 - 2:08
    maybe to keep a secret that should be kept secret, secret.
  • 2:08 - 2:10
    We say, "Nice song."
  • 2:10 - 2:13
    "Honey, you don't look fat in that, no."
  • 2:13 - 2:15
Or we say, favorite of the digerati,
  • 2:15 - 2:18
    "You know, I just fished that email out of my spam folder.
  • 2:18 - 2:21
    So sorry."
  • 2:21 - 2:24
    But there are times when we are unwilling participants in deception.
  • 2:24 - 2:27
    And that can have dramatic costs for us.
  • 2:27 - 2:30
    Last year saw 997 billion dollars
  • 2:30 - 2:34
    in corporate fraud alone in the United States.
  • 2:34 - 2:36
    That's an eyelash under a trillion dollars.
  • 2:36 - 2:38
    That's seven percent of revenues.
  • 2:38 - 2:40
    Deception can cost billions.
  • 2:40 - 2:43
    Think Enron, Madoff, the mortgage crisis.
  • 2:43 - 2:46
    Or in the case of double agents and traitors,
  • 2:46 - 2:48
    like Robert Hanssen or Aldrich Ames,
  • 2:48 - 2:50
    lies can betray our country,
  • 2:50 - 2:53
    they can compromise our security, they can undermine democracy,
  • 2:53 - 2:56
    they can cause the deaths of those that defend us.
  • 2:56 - 2:59
    Deception is actually serious business.
  • 2:59 - 3:01
    This con man, Henry Oberlander,
  • 3:01 - 3:03
    he was such an effective con man
  • 3:03 - 3:05
    British authorities say
  • 3:05 - 3:08
    he could have undermined the entire banking system of the Western world.
  • 3:08 - 3:10
    And you can't find this guy on Google; you can't find him anywhere.
  • 3:10 - 3:13
    He was interviewed once, and he said the following.
  • 3:13 - 3:15
    He said, "Look, I've got one rule."
  • 3:15 - 3:18
    And this was Henry's rule, he said,
  • 3:18 - 3:20
    "Look, everyone is willing to give you something.
  • 3:20 - 3:23
    They're ready to give you something for whatever it is they're hungry for."
  • 3:23 - 3:25
    And that's the crux of it.
  • 3:25 - 3:27
    If you don't want to be deceived, you have to know,
  • 3:27 - 3:29
    what is it that you're hungry for?
  • 3:29 - 3:32
    And we all kind of hate to admit it.
  • 3:32 - 3:35
    We wish we were better husbands, better wives,
  • 3:35 - 3:37
    smarter, more powerful,
  • 3:37 - 3:39
    taller, richer --
  • 3:39 - 3:41
    the list goes on.
  • 3:41 - 3:43
    Lying is an attempt to bridge that gap,
  • 3:43 - 3:45
    to connect our wishes and our fantasies
  • 3:45 - 3:48
    about who we wish we were, how we wish we could be,
  • 3:48 - 3:51
    with what we're really like.
  • 3:51 - 3:54
    And boy are we willing to fill in those gaps in our lives with lies.
  • 3:54 - 3:57
    On a given day, studies show that you may be lied to
  • 3:57 - 3:59
    anywhere from 10 to 200 times.
  • 3:59 - 4:02
    Now granted, many of those are white lies.
  • 4:02 - 4:04
    But in another study,
  • 4:04 - 4:06
    it showed that strangers lied three times
  • 4:06 - 4:08
    within the first 10 minutes of meeting each other.
  • 4:08 - 4:10
    (Laughter)
  • 4:10 - 4:13
    Now when we first hear this data, we recoil.
  • 4:13 - 4:15
    We can't believe how prevalent lying is.
  • 4:15 - 4:17
    We're essentially against lying.
  • 4:17 - 4:19
    But if you look more closely,
  • 4:19 - 4:21
    the plot actually thickens.
  • 4:21 - 4:24
    We lie more to strangers than we lie to coworkers.
  • 4:24 - 4:28
    Extroverts lie more than introverts.
  • 4:28 - 4:31
    Men lie eight times more about themselves
  • 4:31 - 4:33
    than they do other people.
  • 4:33 - 4:36
    Women lie more to protect other people.
  • 4:36 - 4:39
    If you're an average married couple,
  • 4:39 - 4:41
    you're going to lie to your spouse
  • 4:41 - 4:43
    in one out of every 10 interactions.
  • 4:43 - 4:45
    Now you may think that's bad.
  • 4:45 - 4:47
If you're unmarried, that number drops to three.
  • 4:47 - 4:49
    Lying's complex.
  • 4:49 - 4:52
    It's woven into the fabric of our daily and our business lives.
  • 4:52 - 4:54
    We're deeply ambivalent about the truth.
  • 4:54 - 4:56
    We parse it out on an as-needed basis,
  • 4:56 - 4:58
    sometimes for very good reasons,
  • 4:58 - 5:01
    other times just because we don't understand the gaps in our lives.
  • 5:01 - 5:03
    That's truth number two about lying.
  • 5:03 - 5:05
    We're against lying,
  • 5:05 - 5:07
    but we're covertly for it
  • 5:07 - 5:09
    in ways that our society has sanctioned
  • 5:09 - 5:11
    for centuries and centuries and centuries.
  • 5:11 - 5:13
    It's as old as breathing.
  • 5:13 - 5:15
    It's part of our culture, it's part of our history.
  • 5:15 - 5:18
    Think Dante, Shakespeare,
  • 5:18 - 5:21
    the Bible, News of the World.
  • 5:21 - 5:23
    (Laughter)
  • 5:23 - 5:25
    Lying has evolutionary value to us as a species.
  • 5:25 - 5:27
    Researchers have long known
  • 5:27 - 5:29
    that the more intelligent the species,
  • 5:29 - 5:31
    the larger the neocortex,
  • 5:31 - 5:33
    the more likely it is to be deceptive.
  • 5:33 - 5:35
    Now you might remember Koko.
  • 5:35 - 5:38
    Does anybody remember Koko the gorilla who was taught sign language?
  • 5:38 - 5:41
    Koko was taught to communicate via sign language.
  • 5:41 - 5:43
    Here's Koko with her kitten.
  • 5:43 - 5:46
    It's her cute little, fluffy pet kitten.
  • 5:46 - 5:48
    Koko once blamed her pet kitten
  • 5:48 - 5:50
    for ripping a sink out of the wall.
  • 5:50 - 5:52
    (Laughter)
  • 5:52 - 5:54
    We're hardwired to become leaders of the pack.
  • 5:54 - 5:56
It starts really, really early.
  • 5:56 - 5:58
    How early?
  • 5:58 - 6:00
    Well babies will fake a cry,
  • 6:00 - 6:02
    pause, wait to see who's coming
  • 6:02 - 6:04
    and then go right back to crying.
  • 6:04 - 6:06
    One-year-olds learn concealment.
  • 6:06 - 6:08
    (Laughter)
  • 6:08 - 6:10
    Two-year-olds bluff.
  • 6:10 - 6:12
    Five-year-olds lie outright.
  • 6:12 - 6:14
    They manipulate via flattery.
  • 6:14 - 6:17
Nine-year-olds, masters of the cover-up.
  • 6:17 - 6:19
    By the time you enter college,
  • 6:19 - 6:22
    you're going to lie to your mom in one out of every five interactions.
  • 6:22 - 6:25
    By the time we enter this work world and we're breadwinners,
  • 6:25 - 6:27
    we enter a world that is just cluttered
  • 6:27 - 6:29
    with spam, fake digital friends,
  • 6:29 - 6:31
    partisan media,
  • 6:31 - 6:33
    ingenious identity thieves,
  • 6:33 - 6:35
    world-class Ponzi schemers,
  • 6:35 - 6:37
    a deception epidemic --
  • 6:37 - 6:39
    in short, what one author calls
  • 6:39 - 6:42
    a post-truth society.
  • 6:42 - 6:44
    It's been very confusing
  • 6:44 - 6:47
    for a long time now.
  • 6:48 - 6:50
    What do you do?
  • 6:50 - 6:52
    Well there are steps we can take
  • 6:52 - 6:54
    to navigate our way through the morass.
  • 6:54 - 6:57
    Trained liespotters get to the truth 90 percent of the time.
  • 6:57 - 7:00
    The rest of us, we're only 54 percent accurate.
  • 7:00 - 7:02
    Why is it so easy to learn?
  • 7:02 - 7:05
    There are good liars and there are bad liars. There are no real original liars.
  • 7:05 - 7:08
    We all make the same mistakes. We all use the same techniques.
  • 7:08 - 7:10
    So what I'm going to do
  • 7:10 - 7:12
    is I'm going to show you two patterns of deception.
  • 7:12 - 7:15
    And then we're going to look at the hot spots and see if we can find them ourselves.
  • 7:15 - 7:18
    We're going to start with speech.
  • 7:18 - 7:20
    (Video) Bill Clinton: I want you to listen to me.
  • 7:20 - 7:22
    I'm going to say this again.
  • 7:22 - 7:25
    I did not have sexual relations
  • 7:25 - 7:29
    with that woman, Miss Lewinsky.
  • 7:29 - 7:31
    I never told anybody to lie,
  • 7:31 - 7:33
    not a single time, never.
  • 7:33 - 7:36
    And these allegations are false.
  • 7:36 - 7:38
    And I need to go back to work for the American people.
  • 7:38 - 7:40
    Thank you.
  • 7:43 - 7:46
    Pamela Meyer: Okay, what were the telltale signs?
  • 7:46 - 7:50
    Well first we heard what's known as a non-contracted denial.
  • 7:50 - 7:53
    Studies show that people who are overdetermined in their denial
  • 7:53 - 7:56
    will resort to formal rather than informal language.
  • 7:56 - 7:59
    We also heard distancing language: "that woman."
  • 7:59 - 8:01
    We know that liars will unconsciously distance themselves
  • 8:01 - 8:03
    from their subject
  • 8:03 - 8:06
    using language as their tool.
  • 8:06 - 8:09
    Now if Bill Clinton had said, "Well, to tell you the truth ... "
  • 8:09 - 8:11
    or Richard Nixon's favorite, "In all candor ... "
  • 8:11 - 8:13
    he would have been a dead giveaway
  • 8:13 - 8:15
for any liespotter that knows
  • 8:15 - 8:18
    that qualifying language, as it's called, qualifying language like that,
  • 8:18 - 8:20
    further discredits the subject.
  • 8:20 - 8:23
    Now if he had repeated the question in its entirety,
  • 8:23 - 8:27
    or if he had peppered his account with a little too much detail --
  • 8:27 - 8:29
    and we're all really glad he didn't do that --
  • 8:29 - 8:31
    he would have further discredited himself.
  • 8:31 - 8:33
    Freud had it right.
  • 8:33 - 8:36
    Freud said, look, there's much more to it than speech:
  • 8:36 - 8:39
    "No mortal can keep a secret.
  • 8:39 - 8:42
    If his lips are silent, he chatters with his fingertips."
  • 8:42 - 8:45
    And we all do it no matter how powerful you are.
  • 8:45 - 8:47
    We all chatter with our fingertips.
  • 8:47 - 8:50
    I'm going to show you Dominique Strauss-Kahn with Obama
  • 8:50 - 8:53
    who's chattering with his fingertips.
  • 8:53 - 8:56
    (Laughter)
  • 8:56 - 8:59
    Now this brings us to our next pattern,
  • 8:59 - 9:02
    which is body language.
  • 9:02 - 9:05
    With body language, here's what you've got to do.
  • 9:05 - 9:08
    You've really got to just throw your assumptions out the door.
  • 9:08 - 9:10
    Let the science temper your knowledge a little bit.
  • 9:10 - 9:13
    Because we think liars fidget all the time.
  • 9:13 - 9:16
    Well guess what, they're known to freeze their upper bodies when they're lying.
  • 9:16 - 9:19
    We think liars won't look you in the eyes.
  • 9:19 - 9:21
    Well guess what, they look you in the eyes a little too much
  • 9:21 - 9:23
    just to compensate for that myth.
  • 9:23 - 9:25
    We think warmth and smiles
  • 9:25 - 9:27
    convey honesty, sincerity.
  • 9:27 - 9:29
    But a trained liespotter
  • 9:29 - 9:31
    can spot a fake smile a mile away.
  • 9:31 - 9:34
    Can you all spot the fake smile here?
  • 9:35 - 9:37
    You can consciously contract
  • 9:37 - 9:40
    the muscles in your cheeks.
  • 9:40 - 9:43
    But the real smile's in the eyes, the crow's feet of the eyes.
  • 9:43 - 9:45
    They cannot be consciously contracted,
  • 9:45 - 9:47
    especially if you overdid the Botox.
  • 9:47 - 9:50
    Don't overdo the Botox; nobody will think you're honest.
  • 9:50 - 9:52
    Now we're going to look at the hot spots.
  • 9:52 - 9:54
    Can you tell what's happening in a conversation?
  • 9:54 - 9:57
    Can you start to find the hot spots
  • 9:57 - 9:59
    to see the discrepancies
  • 9:59 - 10:01
    between someone's words and someone's actions?
  • 10:01 - 10:03
    Now I know it seems really obvious,
  • 10:03 - 10:05
    but when you're having a conversation
  • 10:05 - 10:08
    with someone you suspect of deception,
  • 10:08 - 10:11
    attitude is by far the most overlooked but telling of indicators.
  • 10:11 - 10:13
    An honest person is going to be cooperative.
  • 10:13 - 10:15
    They're going to show they're on your side.
  • 10:15 - 10:17
    They're going to be enthusiastic.
  • 10:17 - 10:19
    They're going to be willing and helpful to getting you to the truth.
  • 10:19 - 10:22
    They're going to be willing to brainstorm, name suspects,
  • 10:22 - 10:24
    provide details.
  • 10:24 - 10:26
    They're going to say, "Hey,
  • 10:26 - 10:29
    maybe it was those guys in payroll that forged those checks."
  • 10:29 - 10:32
    They're going to be infuriated if they sense they're wrongly accused
  • 10:32 - 10:34
    throughout the entire course of the interview, not just in flashes;
  • 10:34 - 10:37
    they'll be infuriated throughout the entire course of the interview.
  • 10:37 - 10:39
    And if you ask someone honest
  • 10:39 - 10:42
    what should happen to whomever did forge those checks,
  • 10:42 - 10:44
    an honest person is much more likely
  • 10:44 - 10:48
    to recommend strict rather than lenient punishment.
  • 10:48 - 10:50
    Now let's say you're having that exact same conversation
  • 10:50 - 10:52
    with someone deceptive.
  • 10:52 - 10:54
    That person may be withdrawn,
  • 10:54 - 10:56
    look down, lower their voice,
  • 10:56 - 10:58
    pause, be kind of herky-jerky.
  • 10:58 - 11:00
    Ask a deceptive person to tell their story,
  • 11:00 - 11:03
    they're going to pepper it with way too much detail
  • 11:03 - 11:06
    in all kinds of irrelevant places.
  • 11:06 - 11:09
    And then they're going to tell their story in strict chronological order.
  • 11:09 - 11:11
    And what a trained interrogator does
  • 11:11 - 11:13
    is they come in and in very subtle ways
  • 11:13 - 11:15
    over the course of several hours,
  • 11:15 - 11:18
    they will ask that person to tell that story backwards,
  • 11:18 - 11:20
    and then they'll watch them squirm,
  • 11:20 - 11:23
    and track which questions produce the highest volume of deceptive tells.
  • 11:23 - 11:26
    Why do they do that? Well we all do the same thing.
  • 11:26 - 11:28
    We rehearse our words,
  • 11:28 - 11:30
    but we rarely rehearse our gestures.
  • 11:30 - 11:32
    We say "yes," we shake our heads "no."
  • 11:32 - 11:35
    We tell very convincing stories, we slightly shrug our shoulders.
  • 11:35 - 11:37
    We commit terrible crimes,
  • 11:37 - 11:40
    and we smile at the delight in getting away with it.
  • 11:40 - 11:43
    Now that smile is known in the trade as "duping delight."
  • 11:43 - 11:46
    And we're going to see that in several videos moving forward,
  • 11:46 - 11:48
    but we're going to start -- for those of you who don't know him,
  • 11:48 - 11:51
    this is presidential candidate John Edwards
  • 11:51 - 11:54
    who shocked America by fathering a child out of wedlock.
  • 11:54 - 11:57
    We're going to see him talk about getting a paternity test.
  • 11:57 - 11:59
    See now if you can spot him
  • 11:59 - 12:01
    saying, "yes" while shaking his head "no,"
  • 12:01 - 12:03
    slightly shrugging his shoulders.
  • 12:03 - 12:05
    (Video) John Edwards: I'd be happy to participate in one.
  • 12:05 - 12:08
    I know that it's not possible that this child could be mine,
  • 12:08 - 12:10
    because of the timing of events.
  • 12:10 - 12:12
    So I know it's not possible.
  • 12:12 - 12:14
    Happy to take a paternity test,
  • 12:14 - 12:16
    and would love to see it happen.
  • 12:16 - 12:19
    Interviewer: Are you going to do that soon? Is there somebody --
  • 12:19 - 12:22
    JE: Well, I'm only one side. I'm only one side of the test.
  • 12:22 - 12:25
    But I'm happy to participate in one.
  • 12:25 - 12:27
    PM: Okay, those head shakes are much easier to spot
  • 12:27 - 12:29
    once you know to look for them.
  • 12:29 - 12:31
There are going to be times
  • 12:31 - 12:33
    when someone makes one expression
  • 12:33 - 12:36
    while masking another that just kind of leaks through in a flash.
  • 12:37 - 12:39
    Murderers are known to leak sadness.
  • 12:39 - 12:41
    Your new joint venture partner might shake your hand,
  • 12:41 - 12:43
    celebrate, go out to dinner with you
  • 12:43 - 12:46
    and then leak an expression of anger.
  • 12:46 - 12:49
    And we're not all going to become facial expression experts overnight here,
  • 12:49 - 12:52
    but there's one I can teach you that's very dangerous, and it's easy to learn,
  • 12:52 - 12:55
    and that's the expression of contempt.
  • 12:55 - 12:58
    Now with anger, you've got two people on an even playing field.
  • 12:58 - 13:00
    It's still somewhat of a healthy relationship.
  • 13:00 - 13:02
    But when anger turns to contempt,
  • 13:02 - 13:04
    you've been dismissed.
  • 13:04 - 13:06
    It's associated with moral superiority.
  • 13:06 - 13:09
    And for that reason, it's very, very hard to recover from.
  • 13:09 - 13:11
    Here's what it looks like.
  • 13:11 - 13:13
    It's marked by one lip corner
  • 13:13 - 13:15
    pulled up and in.
  • 13:15 - 13:18
    It's the only asymmetrical expression.
  • 13:18 - 13:20
    And in the presence of contempt,
  • 13:20 - 13:22
    whether or not deception follows --
  • 13:22 - 13:24
    and it doesn't always follow --
  • 13:24 - 13:26
    look the other way, go the other direction,
  • 13:26 - 13:28
    reconsider the deal,
  • 13:28 - 13:32
    say, "No thank you. I'm not coming up for just one more nightcap. Thank you."
  • 13:32 - 13:34
    Science has surfaced
  • 13:34 - 13:36
    many, many more indicators.
  • 13:36 - 13:38
    We know, for example,
  • 13:38 - 13:40
    we know liars will shift their blink rate,
  • 13:40 - 13:42
    point their feet towards an exit.
  • 13:42 - 13:44
    They will take barrier objects
  • 13:44 - 13:47
    and put them between themselves and the person that is interviewing them.
  • 13:47 - 13:49
    They'll alter their vocal tone,
  • 13:49 - 13:52
    often making their vocal tone much lower.
  • 13:52 - 13:54
    Now here's the deal.
  • 13:54 - 13:57
    These behaviors are just behaviors.
  • 13:57 - 13:59
    They're not proof of deception.
  • 13:59 - 14:01
    They're red flags.
  • 14:01 - 14:03
    We're human beings.
  • 14:03 - 14:06
    We make deceptive flailing gestures all over the place all day long.
  • 14:06 - 14:08
    They don't mean anything in and of themselves.
  • 14:08 - 14:11
    But when you see clusters of them, that's your signal.
  • 14:11 - 14:14
    Look, listen, probe, ask some hard questions,
  • 14:14 - 14:17
    get out of that very comfortable mode of knowing,
  • 14:17 - 14:20
    walk into curiosity mode, ask more questions,
  • 14:20 - 14:23
    have a little dignity, treat the person you're talking to with rapport.
  • 14:23 - 14:26
    Don't try to be like those folks on "Law & Order" and those other TV shows
  • 14:26 - 14:28
    that pummel their subjects into submission.
  • 14:28 - 14:31
    Don't be too aggressive, it doesn't work.
  • 14:31 - 14:33
    Now we've talked a little bit
  • 14:33 - 14:35
    about how to talk to someone who's lying
  • 14:35 - 14:37
    and how to spot a lie.
  • 14:37 - 14:40
    And as I promised, we're now going to look at what the truth looks like.
  • 14:40 - 14:42
    But I'm going to show you two videos,
  • 14:42 - 14:45
    two mothers -- one is lying, one is telling the truth.
  • 14:45 - 14:47
    And these were surfaced
  • 14:47 - 14:49
    by researcher David Matsumoto in California.
  • 14:49 - 14:51
    And I think they're an excellent example
  • 14:51 - 14:53
    of what the truth looks like.
  • 14:53 - 14:55
    This mother, Diane Downs,
  • 14:55 - 14:57
    shot her kids at close range,
  • 14:57 - 14:59
    drove them to the hospital
  • 14:59 - 15:01
    while they bled all over the car,
  • 15:01 - 15:03
    claimed a scraggy-haired stranger did it.
  • 15:03 - 15:05
    And you'll see when you see the video,
  • 15:05 - 15:07
    she can't even pretend to be an agonizing mother.
  • 15:07 - 15:09
    What you want to look for here
  • 15:09 - 15:11
    is an incredible discrepancy
  • 15:11 - 15:13
    between horrific events that she describes
  • 15:13 - 15:15
    and her very, very cool demeanor.
  • 15:15 - 15:18
    And if you look closely, you'll see duping delight throughout this video.
  • 15:18 - 15:20
    (Video) Diane Downs: At night when I close my eyes,
  • 15:20 - 15:23
    I can see Christie reaching her hand out to me while I'm driving,
  • 15:23 - 15:26
    and the blood just kept coming out of her mouth.
  • 15:26 - 15:28
    And that -- maybe it'll fade too with time --
  • 15:28 - 15:30
    but I don't think so.
  • 15:30 - 15:33
    That bothers me the most.
  • 15:40 - 15:42
    PM: Now I'm going to show you a video
  • 15:42 - 15:44
    of an actual grieving mother, Erin Runnion,
  • 15:44 - 15:48
    confronting her daughter's murderer and torturer in court.
  • 15:48 - 15:50
    Here you're going to see no false emotion,
  • 15:50 - 15:53
    just the authentic expression of a mother's agony.
  • 15:53 - 15:55
    (Video) Erin Runnion: I wrote this statement on the third anniversary
  • 15:55 - 15:57
    of the night you took my baby,
  • 15:57 - 15:59
    and you hurt her,
  • 15:59 - 16:01
    and you crushed her,
  • 16:01 - 16:05
    you terrified her until her heart stopped.
  • 16:05 - 16:08
    And she fought, and I know she fought you.
  • 16:08 - 16:10
    But I know she looked at you
  • 16:10 - 16:12
    with those amazing brown eyes,
  • 16:12 - 16:15
    and you still wanted to kill her.
  • 16:15 - 16:17
    And I don't understand it,
  • 16:17 - 16:20
    and I never will.
  • 16:20 - 16:24
    PM: Okay, there's no doubting the veracity of those emotions.
  • 16:24 - 16:27
    Now the technology around what the truth looks like
  • 16:27 - 16:30
    is progressing on, the science of it.
  • 16:30 - 16:32
    We know for example
  • 16:32 - 16:35
    that we now have specialized eye trackers and infrared brain scans,
  • 16:35 - 16:38
MRIs that can decode the signals that our bodies send out
  • 16:38 - 16:40
    when we're trying to be deceptive.
  • 16:40 - 16:43
    And these technologies are going to be marketed to all of us
  • 16:43 - 16:45
    as panaceas for deceit,
  • 16:45 - 16:48
    and they will prove incredibly useful some day.
  • 16:48 - 16:50
    But you've got to ask yourself in the meantime:
  • 16:50 - 16:52
    Who do you want on your side of the meeting,
  • 16:52 - 16:55
    someone who's trained in getting to the truth
  • 16:55 - 16:57
    or some guy who's going to drag a 400-pound electroencephalogram
  • 16:57 - 16:59
    through the door?
  • 16:59 - 17:03
    Liespotters rely on human tools.
  • 17:03 - 17:05
    They know, as someone once said,
  • 17:05 - 17:07
    "Character's who you are in the dark."
  • 17:07 - 17:09
    And what's kind of interesting
  • 17:09 - 17:11
    is that today we have so little darkness.
  • 17:11 - 17:14
    Our world is lit up 24 hours a day.
  • 17:14 - 17:16
    It's transparent
  • 17:16 - 17:18
    with blogs and social networks
  • 17:18 - 17:20
    broadcasting the buzz of a whole new generation of people
  • 17:20 - 17:23
    that have made a choice to live their lives in public.
  • 17:23 - 17:27
    It's a much more noisy world.
  • 17:27 - 17:29
    So one challenge we have
  • 17:29 - 17:31
    is to remember,
  • 17:31 - 17:34
    oversharing, that's not honesty.
  • 17:34 - 17:36
    Our manic tweeting and texting
  • 17:36 - 17:38
    can blind us to the fact
  • 17:38 - 17:41
    that the subtleties of human decency -- character integrity --
  • 17:41 - 17:44
    that's still what matters, that's always what's going to matter.
  • 17:44 - 17:46
    So in this much noisier world,
  • 17:46 - 17:48
    it might make sense for us
  • 17:48 - 17:50
    to be just a little bit more explicit
  • 17:50 - 17:53
    about our moral code.
  • 17:53 - 17:55
    When you combine the science of recognizing deception
  • 17:55 - 17:57
    with the art of looking, listening,
  • 17:57 - 18:00
    you exempt yourself from collaborating in a lie.
  • 18:00 - 18:02
    You start up that path
  • 18:02 - 18:04
    of being just a little bit more explicit,
  • 18:04 - 18:06
    because you signal to everyone around you,
  • 18:06 - 18:09
    you say, "Hey, my world, our world,
  • 18:09 - 18:11
    it's going to be an honest one.
  • 18:11 - 18:13
    My world is going to be one where truth is strengthened
  • 18:13 - 18:16
    and falsehood is recognized and marginalized."
  • 18:16 - 18:18
    And when you do that,
  • 18:18 - 18:21
    the ground around you starts to shift just a little bit.
  • 18:21 - 18:24
    And that's the truth. Thank you.
  • 18:24 - 18:29
    (Applause)
Title:
How to spot a liar
Speaker:
Pamela Meyer
Description:

On any given day we're lied to from 10 to 200 times, and the clues to detect those lies can be subtle and counter-intuitive. Pamela Meyer, author of Liespotting, shows the manners and "hotspots" used by those trained to recognize deception -- and she argues honesty is a value worth preserving.

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
18:30