Okay, now I don't want to alarm anybody in this room, but it's just come to my attention that the person to your right is a liar.

(Laughter)

Also, the person to your left is a liar. Also, the person sitting in your very seats is a liar. We're all liars. What I'm going to do today is I'm going to show you what the research says about why we're all liars, how you can become a liespotter and why you might want to go the extra mile and go from liespotting to truth seeking, and ultimately to trust building.

Now, speaking of trust, ever since I wrote this book, "Liespotting," no one wants to meet me in person anymore, no, no, no, no, no. They say, "It's okay, we'll email you."

(Laughter)

I can't even get a coffee date at Starbucks. My husband's like, "Honey, deception? Maybe you could have focused on cooking. How about French cooking?"

So before I get started, what I'm going to do is I'm going to clarify my goal for you, which is not to teach a game of Gotcha. Liespotters aren't those nitpicky kids, those kids in the back of the room that are shouting, "Gotcha! Gotcha! Your eyebrow twitched. You flared your nostril. I watch that TV show 'Lie To Me.' I know you're lying." No, liespotters are armed with scientific knowledge of how to spot deception. They use it to get to the truth, and they do what mature leaders do every day; they have difficult conversations with difficult people, sometimes during very difficult times. And they start up that path by accepting a core proposition, and that proposition is the following: Lying is a cooperative act.

Think about it, a lie has no power whatsoever by its mere utterance. Its power emerges when someone else agrees to believe the lie. So I know it may sound like tough love, but look, if at some point you got lied to, it's because you agreed to get lied to. Truth number one about lying: Lying's a cooperative act.

Now not all lies are harmful. Sometimes we're willing participants in deception for the sake of social dignity, maybe to keep a secret that should be kept secret, secret. We say, "Nice song." "Honey, you don't look fat in that, no." Or we say, favorite of the digerati, "You know, I just fished that email out of my Spam folder. So sorry."
But there are times when we are unwilling participants in deception. And that can have dramatic costs for us. Last year saw 997 billion dollars in corporate fraud alone in the United States. That's an eyelash under a trillion dollars. That's seven percent of revenues. Deception can cost billions. Think Enron, Madoff, the mortgage crisis. Or in the case of double agents and traitors, like Robert Hanssen or Aldrich Ames, lies can betray our country, they can compromise our security, they can undermine democracy, they can cause the deaths of those that defend us.

Deception is actually serious business. This con man, Henry Oberlander, he was such an effective con man, British authorities say he could have undermined the entire banking system of the Western world. And you can't find this guy on Google; you can't find him anywhere. He was interviewed once, and he said the following. He said, "Look, I've got one rule." And this was Henry's rule, he said, "Look, everyone is willing to give you something. They're ready to give you something for whatever it is they're hungry for." And that's the crux of it. If you don't want to be deceived, you have to know, what is it that you're hungry for?

And we all kind of hate to admit it. We wish we were better husbands, better wives, smarter, more powerful, taller, richer -- the list goes on. Lying is an attempt to bridge that gap, to connect our wishes and our fantasies about who we wish we were, how we wish we could be, with what we're really like. And boy are we willing to fill in those gaps in our lives with lies.

On a given day, studies show that you may be lied to anywhere from 10 to 200 times. Now granted, many of those are white lies. But another study showed that strangers lied three times within the first 10 minutes of meeting each other.

(Laughter)

Now when we first hear this data, we recoil. We can't believe how prevalent lying is. We're essentially against lying. But if you look more closely, the plot actually thickens. We lie more to strangers than we lie to coworkers. Extroverts lie more than introverts. Men lie eight times more about themselves than they do about other people. Women lie more to protect other people.
If you're an average married couple, you're going to lie to your spouse in one out of every 10 interactions. Now, you may think that's bad. If you're unmarried, that number drops to three.

Lying's complex. It's woven into the fabric of our daily and our business lives. We're deeply ambivalent about the truth. We parse it out on an as-needed basis, sometimes for very good reasons, other times just because we don't understand the gaps in our lives. That's truth number two about lying. We're against lying, but we're covertly for it in ways that our society has sanctioned for centuries and centuries and centuries. It's as old as breathing. It's part of our culture, it's part of our history. Think Dante, Shakespeare, the Bible, News of the World.

(Laughter)

Lying has evolutionary value to us as a species. Researchers have long known that the more intelligent the species, the larger the neocortex, the more likely it is to be deceptive. Now you might remember Koko. Does anybody remember Koko the gorilla who was taught sign language? Koko was taught to communicate via sign language. Here's Koko with her kitten. It's her cute little, fluffy pet kitten. Koko once blamed her pet kitten for ripping a sink out of the wall.

(Laughter)

We're hardwired to become leaders of the pack. It starts really, really early. How early? Well babies will fake a cry, pause, wait to see who's coming and then go right back to crying. One-year-olds learn concealment.

(Laughter)

Two-year-olds bluff. Five-year-olds lie outright. They manipulate via flattery. Nine-year-olds, masters of the cover-up. By the time you enter college, you're going to lie to your mom in one out of every five interactions. By the time we enter this work world and we're breadwinners, we enter a world that is just cluttered with Spam, fake digital friends, partisan media, ingenious identity thieves, world-class Ponzi schemers, a deception epidemic -- in short, what one author calls a post-truth society. It's been very confusing for a long time now.

What do you do? Well, there are steps we can take to navigate our way through the morass. Trained liespotters get to the truth 90 percent of the time.
The rest of us, we're only 54 percent accurate. Why is it so easy to learn? There are good liars and bad liars. There are no real original liars. We all make the same mistakes. We all use the same techniques. So what I'm going to do is I'm going to show you two patterns of deception. And then we're going to look at the hot spots and see if we can find them ourselves. We're going to start with speech.

(Video) Bill Clinton: I want you to listen to me. I'm going to say this again. I did not have sexual relations with that woman, Miss Lewinsky. I never told anybody to lie, not a single time, never. And these allegations are false. And I need to go back to work for the American people. Thank you.

(Applause)

Pamela Meyer: Okay, what were the telltale signs? Well first we heard what's known as a non-contracted denial. Studies show that people who are overdetermined in their denial will resort to formal rather than informal language. We also heard distancing language: "that woman." We know that liars will unconsciously distance themselves from their subject, using language as their tool.

Now if Bill Clinton had said, "Well, to tell you the truth..." or Richard Nixon's favorite, "In all candor..." it would have been a dead giveaway for any liespotter who knows that qualifying language, as it's called, qualifying language like that, further discredits the subject. Now if he had repeated the question in its entirety, or if he had peppered his account with a little too much detail -- and we're all really glad he didn't do that -- he would have further discredited himself.

Freud had it right. Freud said, look, there's much more to it than speech: "No mortal can keep a secret. If his lips are silent, he chatters with his fingertips." And we all do it no matter how powerful we are. We all chatter with our fingertips. I'm going to show you Dominique Strauss-Kahn with Obama who's chattering with his fingertips.

(Laughter)

Now this brings us to our next pattern, which is body language. With body language, here's what you've got to do. You've really got to just throw your assumptions out the door. Let the science temper your knowledge a little bit. Because we think liars fidget all the time.
0:09:13.000,0:09:16.762 Well guess what, they're known to freeze[br]their upper bodies when they're lying. 0:09:16.786,0:09:18.976 We think liars won't look you in the eyes. 0:09:19.000,0:09:21.876 Well guess what, they look[br]you in the eyes a little too much 0:09:21.900,0:09:23.476 just to compensate for that myth. 0:09:23.500,0:09:26.976 We think warmth and smiles[br]convey honesty, sincerity. 0:09:27.000,0:09:30.976 But a trained liespotter[br]can spot a fake smile a mile away. 0:09:31.000,0:09:34.000 Can you all spot the fake smile here? 0:09:35.000,0:09:39.976 You can consciously contract[br]the muscles in your cheeks. 0:09:40.000,0:09:42.976 But the real smile's in the eyes,[br]the crow's feet of the eyes. 0:09:43.000,0:09:44.976 They cannot be consciously contracted, 0:09:45.000,0:09:46.976 especially if you overdid the Botox. 0:09:47.000,0:09:49.976 Don't overdo the Botox;[br]nobody will think you're honest. 0:09:50.000,0:09:51.976 Now we're going to look at the hot spots. 0:09:52.000,0:09:54.286 Can you tell what's happening[br]in a conversation? 0:09:54.310,0:09:56.976 Can you start to find the hot spots 0:09:57.000,0:09:58.976 to see the discrepancies 0:09:59.000,0:10:01.191 between someone's words[br]and someone's actions? 0:10:01.215,0:10:02.976 Now, I know it seems really obvious, 0:10:03.000,0:10:07.976 but when you're having a conversation[br]with someone you suspect of deception, 0:10:08.000,0:10:11.076 attitude is by far the most overlooked[br]but telling of indicators. 0:10:11.100,0:10:13.296 An honest person[br]is going to be cooperative. 0:10:13.320,0:10:15.368 They're going to show[br]they're on your side. 0:10:15.392,0:10:16.976 They're going to be enthusiastic. 0:10:17.000,0:10:20.276 They're going to be willing and helpful[br]to getting you to the truth. 0:10:20.300,0:10:23.076 They're going to be willing[br]to brainstorm, name suspects, 0:10:23.100,0:10:24.376 provide details. 0:10:24.400,0:10:25.976 They're going to say, 0:10:26.000,0:10:29.176 "Hey, maybe it was those guys in payroll[br]that forged those checks." 0:10:29.200,0:10:32.476 They're going to be infuriated[br]if they sense they're wrongly accused 0:10:32.500,0:10:35.676 throughout the entire course[br]of the interview, not just in flashes; 0:10:35.700,0:10:38.939 they'll be infuriated throughout[br]the entire course of the interview. 0:10:38.963,0:10:40.376 And if you ask someone honest 0:10:40.400,0:10:42.976 what should happen[br]to whomever did forge those checks, 0:10:43.000,0:10:44.776 an honest person is much more likely 0:10:44.800,0:10:47.976 to recommend strict rather[br]than lenient punishment. 0:10:48.000,0:10:50.667 Now let's say you're having[br]that exact same conversation 0:10:50.691,0:10:51.976 with someone deceptive. 0:10:52.000,0:10:53.976 That person may be withdrawn, 0:10:54.000,0:10:55.976 look down, lower their voice, 0:10:56.000,0:10:57.976 pause, be kind of herky-jerky. 0:10:58.000,0:11:00.048 Ask a deceptive person[br]to tell their story, 0:11:00.072,0:11:02.976 they're going to pepper it[br]with way too much detail 0:11:03.000,0:11:05.976 in all kinds of irrelevant places. 0:11:06.000,0:11:09.476 And then they're going to tell their story[br]in strict chronological order. 
And what a trained interrogator does is they come in and in very subtle ways over the course of several hours, they will ask that person to tell that story backwards, and then they'll watch them squirm, and track which questions produce the highest volume of deceptive tells. Why do they do that? Well, we all do the same thing. We rehearse our words, but we rarely rehearse our gestures. We say "yes," we shake our heads "no." We tell very convincing stories, we slightly shrug our shoulders. We commit terrible crimes, and we smile at the delight in getting away with it. Now, that smile is known in the trade as "duping delight."

And we're going to see that in several videos moving forward, but we're going to start -- for those of you who don't know him, this is presidential candidate John Edwards who shocked America by fathering a child out of wedlock. We're going to see him talk about getting a paternity test. See now if you can spot him saying, "yes" while shaking his head "no," slightly shrugging his shoulders.

(Video) John Edwards: I'd be happy to participate in one. I know that it's not possible that this child could be mine, because of the timing of events. So I know it's not possible. Happy to take a paternity test, and would love to see it happen.

Interviewer: Are you going to do that soon? Is there somebody --

JE: Well, I'm only one side. I'm only one side of the test. But I'm happy to participate in one.

PM: Okay, those head shakes are much easier to spot once you know to look for them. There are going to be times when someone makes one expression while masking another that just kind of leaks through in a flash. Murderers are known to leak sadness. Your new joint venture partner might shake your hand, celebrate, go out to dinner with you and then leak an expression of anger.

And we're not all going to become facial expression experts overnight here, but there's one I can teach you that's very dangerous and it's easy to learn, and that's the expression of contempt. Now with anger, you've got two people on an even playing field. It's still somewhat of a healthy relationship. But when anger turns to contempt, you've been dismissed. It's associated with moral superiority. And for that reason, it's very, very hard to recover from. Here's what it looks like. It's marked by one lip corner pulled up and in.
It's the only asymmetrical expression. And in the presence of contempt, whether or not deception follows -- and it doesn't always follow -- look the other way, go the other direction, reconsider the deal, say, "No thank you. I'm not coming up for just one more nightcap. Thank you."

Science has surfaced many, many more indicators. We know, for example, that liars will shift their blink rate, point their feet towards an exit. They will take barrier objects and put them between themselves and the person that is interviewing them. They'll alter their vocal tone, often making their vocal tone much lower.

Now here's the deal. These behaviors are just behaviors. They're not proof of deception. They're red flags. We're human beings. We make deceptive flailing gestures all over the place all day long. They don't mean anything in and of themselves. But when you see clusters of them, that's your signal. Look, listen, probe, ask some hard questions, get out of that very comfortable mode of knowing, walk into curiosity mode, ask more questions, have a little dignity, treat the person you're talking to with rapport. Don't try to be like those folks on "Law & Order" and those other TV shows that pummel their subjects into submission. Don't be too aggressive, it doesn't work.

Now, we've talked a little bit about how to talk to someone who's lying and how to spot a lie. And as I promised, we're now going to look at what the truth looks like. I'm going to show you two videos, two mothers -- one is lying, one is telling the truth. And these were surfaced by researcher David Matsumoto in California. And I think they're an excellent example of what the truth looks like.

This mother, Diane Downs, shot her kids at close range, drove them to the hospital while they bled all over the car, claimed a scraggy-haired stranger did it. And you'll see when you see the video, she can't even pretend to be an agonizing mother. What you want to look for here is an incredible discrepancy between horrific events that she describes and her very, very cool demeanor. And if you look closely, you'll see duping delight throughout this video.

(Video) Diane Downs: At night when I close my eyes, I can see Christie reaching her hand out to me while I'm driving, and the blood just kept coming out of her mouth.
And that -- maybe it'll fade too with time -- but I don't think so. That bothers me the most.

PM: Now I'm going to show you a video of an actual grieving mother, Erin Runnion, confronting her daughter's murderer and torturer in court. Here you're going to see no false emotion, just the authentic expression of a mother's agony.

(Video) Erin Runnion: I wrote this statement on the third anniversary of the night you took my baby, and you hurt her, and you crushed her, you terrified her until her heart stopped. And she fought, and I know she fought you. But I know she looked at you with those amazing brown eyes, and you still wanted to kill her. And I don't understand it, and I never will.

PM: Okay, there's no doubting the veracity of those emotions. Now the technology around what the truth looks like is progressing, the science of it. We know, for example, that we now have specialized eye trackers and infrared brain scans, MRIs that can decode the signals that our bodies send out when we're trying to be deceptive. And these technologies are going to be marketed to all of us as panaceas for deceit, and they will prove incredibly useful some day. But you've got to ask yourself in the meantime: Who do you want on your side of the meeting, someone who's trained in getting to the truth or some guy who's going to drag a 400-pound electroencephalogram through the door?

Liespotters rely on human tools. They know, as someone once said, "Character's who you are in the dark." And what's kind of interesting is that today, we have so little darkness. Our world is lit up 24 hours a day. It's transparent with blogs and social networks broadcasting the buzz of a whole new generation of people that have made a choice to live their lives in public. It's a much more noisy world. So one challenge we have is to remember, oversharing, that's not honesty. Our manic tweeting and texting can blind us to the fact that the subtleties of human decency -- character integrity -- that's still what matters, that's always what's going to matter. So in this much noisier world, it might make sense for us to be just a little bit more explicit about our moral code.
When you combine the science of recognizing deception with the art of looking, listening, you exempt yourself from collaborating in a lie. You start up that path of being just a little bit more explicit, because you signal to everyone around you, you say, "Hey, my world, our world, it's going to be an honest one. My world is going to be one where truth is strengthened and falsehood is recognized and marginalized." And when you do that, the ground around you starts to shift just a little bit.

And that's the truth. Thank you.

(Applause)