1 00:00:00,025 --> 00:00:04,976 Okay, now I don't want to alarm anybody in this room, 2 00:00:05,000 --> 00:00:08,976 but it's just come to my attention that the person to your right is a liar. 3 00:00:09,000 --> 00:00:10,976 (Laughter) 4 00:00:11,000 --> 00:00:13,976 Also, the person to your left is a liar. 5 00:00:14,000 --> 00:00:16,976 Also, the person sitting in your very seats is a liar. 6 00:00:17,000 --> 00:00:18,976 We're all liars. 7 00:00:19,000 --> 00:00:20,976 What I'm going to do today 8 00:00:21,000 --> 00:00:24,576 is I'm going to show you what the research says about why we're all liars, 9 00:00:24,600 --> 00:00:26,176 how you can become a liespotter 10 00:00:26,200 --> 00:00:28,976 and why you might want to go the extra mile 11 00:00:29,000 --> 00:00:31,976 and go from liespotting to truth seeking, 12 00:00:32,000 --> 00:00:33,976 and ultimately to trust building. 13 00:00:34,000 --> 00:00:36,976 Now, speaking of trust, 14 00:00:37,000 --> 00:00:39,976 ever since I wrote this book, "Liespotting," 15 00:00:40,000 --> 00:00:42,976 no one wants to meet me in person anymore, no, no, no, no, no. 16 00:00:43,000 --> 00:00:45,976 They say, "It's okay, we'll email you." 17 00:00:46,000 --> 00:00:47,976 (Laughter) 18 00:00:48,000 --> 00:00:51,976 I can't even get a coffee date at Starbucks. 19 00:00:52,000 --> 00:00:53,976 My husband's like, "Honey, deception? 20 00:00:54,000 --> 00:00:57,191 Maybe you could have focused on cooking. How about French cooking?" 21 00:00:57,215 --> 00:00:59,358 So before I get started, what I'm going to do 22 00:00:59,382 --> 00:01:01,976 is I'm going to clarify my goal for you, 23 00:01:02,000 --> 00:01:03,976 which is not to teach a game of Gotcha. 24 00:01:04,000 --> 00:01:05,976 Liespotters aren't those nitpicky kids, 25 00:01:06,000 --> 00:01:09,334 those kids in the back of the room that are shouting, "Gotcha! Gotcha! 26 00:01:09,358 --> 00:01:11,976 Your eyebrow twitched. You flared your nostril. 27 00:01:12,000 --> 00:01:14,976 I watch that TV show 'Lie To Me.' I know you're lying." 28 00:01:15,000 --> 00:01:16,976 No, liespotters are armed 29 00:01:17,000 --> 00:01:19,976 with scientific knowledge of how to spot deception. 30 00:01:20,000 --> 00:01:21,976 They use it to get to the truth, 31 00:01:22,000 --> 00:01:24,096 and they do what mature leaders do every day; 32 00:01:24,120 --> 00:01:26,976 they have difficult conversations with difficult people, 33 00:01:27,000 --> 00:01:28,976 sometimes during very difficult times. 34 00:01:29,000 --> 00:01:32,976 And they start up that path by accepting a core proposition, 35 00:01:33,000 --> 00:01:34,976 and that proposition is the following: 36 00:01:35,000 --> 00:01:37,468 Lying is a cooperative act. 37 00:01:38,777 --> 00:01:41,976 Think about it, a lie has no power whatsoever by its mere utterance. 38 00:01:42,000 --> 00:01:43,976 Its power emerges 39 00:01:44,000 --> 00:01:46,096 when someone else agrees to believe the lie. 40 00:01:46,120 --> 00:01:48,096 So I know it may sound like tough love, 41 00:01:48,120 --> 00:01:51,976 but look, if at some point you got lied to, 42 00:01:52,000 --> 00:01:53,976 it's because you agreed to get lied to. 43 00:01:54,000 --> 00:01:56,976 Truth number one about lying: Lying's a cooperative act. 44 00:01:57,000 --> 00:01:58,976 Now not all lies are harmful.
45 00:01:59,000 --> 00:02:01,976 Sometimes we're willing participants in deception 46 00:02:02,000 --> 00:02:04,976 for the sake of social dignity, 47 00:02:05,000 --> 00:02:07,976 maybe to keep a secret that should be kept secret, secret. 48 00:02:08,000 --> 00:02:09,976 We say, "Nice song." 49 00:02:10,000 --> 00:02:12,976 "Honey, you don't look fat in that, no." 50 00:02:13,000 --> 00:02:14,976 Or we say, favorite of the digerati, 51 00:02:15,000 --> 00:02:17,976 "You know, I just fished that email out of my Spam folder. 52 00:02:18,000 --> 00:02:20,976 So sorry." 53 00:02:21,000 --> 00:02:24,239 But there are times when we are unwilling participants in deception. 54 00:02:24,263 --> 00:02:26,976 And that can have dramatic costs for us. 55 00:02:27,000 --> 00:02:29,976 Last year saw 997 billion dollars 56 00:02:30,000 --> 00:02:33,579 in corporate fraud alone in the United States. 57 00:02:34,333 --> 00:02:36,348 That's an eyelash under a trillion dollars. 58 00:02:36,372 --> 00:02:37,976 That's seven percent of revenues. 59 00:02:38,000 --> 00:02:39,976 Deception can cost billions. 60 00:02:40,000 --> 00:02:42,976 Think Enron, Madoff, the mortgage crisis. 61 00:02:43,000 --> 00:02:45,976 Or in the case of double agents and traitors, 62 00:02:46,000 --> 00:02:47,976 like Robert Hanssen or Aldrich Ames, 63 00:02:48,000 --> 00:02:49,976 lies can betray our country, 64 00:02:50,000 --> 00:02:53,000 they can compromise our security, they can undermine democracy, 65 00:02:53,024 --> 00:02:55,976 they can cause the deaths of those that defend us. 66 00:02:56,000 --> 00:02:58,976 Deception is actually serious business. 67 00:02:59,000 --> 00:03:02,976 This con man, Henry Oberlander, he was such an effective con man, 68 00:03:03,000 --> 00:03:04,976 British authorities say 69 00:03:05,000 --> 00:03:08,476 he could have undermined the entire banking system of the Western world. 70 00:03:08,500 --> 00:03:11,676 And you can't find this guy on Google; you can't find him anywhere. 71 00:03:11,700 --> 00:03:14,176 He was interviewed once, and he said the following. 72 00:03:14,200 --> 00:03:15,876 He said, "Look, I've got one rule." 73 00:03:15,900 --> 00:03:17,976 And this was Henry's rule, he said, 74 00:03:18,000 --> 00:03:20,376 "Look, everyone is willing to give you something. 75 00:03:20,400 --> 00:03:23,972 They're ready to give you something for whatever it is they're hungry for." 76 00:03:23,996 --> 00:03:25,376 And that's the crux of it. 77 00:03:25,400 --> 00:03:27,829 If you don't want to be deceived, you have to know, 78 00:03:27,853 --> 00:03:29,529 what is it that you're hungry for? 79 00:03:29,553 --> 00:03:31,976 And we all kind of hate to admit it. 80 00:03:32,000 --> 00:03:34,976 We wish we were better husbands, better wives, 81 00:03:35,000 --> 00:03:38,976 smarter, more powerful, taller, richer -- 82 00:03:39,000 --> 00:03:40,976 the list goes on. 83 00:03:41,000 --> 00:03:42,976 Lying is an attempt to bridge that gap, 84 00:03:43,000 --> 00:03:44,976 to connect our wishes and our fantasies 85 00:03:45,000 --> 00:03:47,976 about who we wish we were, how we wish we could be, 86 00:03:48,000 --> 00:03:50,976 with what we're really like. 87 00:03:51,000 --> 00:03:54,239 And boy are we willing to fill in those gaps in our lives with lies. 88 00:03:54,263 --> 00:03:56,976 On a given day, studies show that you may be lied to 89 00:03:57,000 --> 00:03:58,976 anywhere from 10 to 200 times. 90 00:03:59,000 --> 00:04:01,976 Now granted, many of those are white lies.
91 00:04:02,000 --> 00:04:03,976 But in another study, 92 00:04:04,000 --> 00:04:05,976 it showed that strangers lied three times 93 00:04:06,000 --> 00:04:08,381 within the first 10 minutes of meeting each other. 94 00:04:08,405 --> 00:04:09,976 (Laughter) 95 00:04:10,000 --> 00:04:12,976 Now when we first hear this data, we recoil. 96 00:04:13,000 --> 00:04:14,976 We can't believe how prevalent lying is. 97 00:04:15,000 --> 00:04:16,976 We're essentially against lying. 98 00:04:17,000 --> 00:04:20,976 But if you look more closely, the plot actually thickens. 99 00:04:21,000 --> 00:04:23,976 We lie more to strangers than we lie to coworkers. 100 00:04:24,000 --> 00:04:27,976 Extroverts lie more than introverts. 101 00:04:28,000 --> 00:04:32,976 Men lie eight times more about themselves than they do other people. 102 00:04:33,000 --> 00:04:35,976 Women lie more to protect other people. 103 00:04:36,000 --> 00:04:38,976 If you're an average married couple, 104 00:04:39,000 --> 00:04:42,976 you're going to lie to your spouse in one out of every 10 interactions. 105 00:04:43,000 --> 00:04:44,976 Now, you may think that's bad. 106 00:04:45,000 --> 00:04:47,286 If you're unmarried, that number drops to three. 107 00:04:47,310 --> 00:04:48,976 Lying's complex. 108 00:04:49,000 --> 00:04:52,000 It's woven into the fabric of our daily and our business lives. 109 00:04:52,024 --> 00:04:53,976 We're deeply ambivalent about the truth. 110 00:04:54,000 --> 00:04:55,976 We parse it out on an as-needed basis, 111 00:04:56,000 --> 00:04:57,976 sometimes for very good reasons, 112 00:04:58,000 --> 00:05:01,191 other times just because we don't understand the gaps in our lives. 113 00:05:01,215 --> 00:05:02,976 That's truth number two about lying. 114 00:05:03,000 --> 00:05:04,976 We're against lying, 115 00:05:05,000 --> 00:05:06,976 but we're covertly for it 116 00:05:07,000 --> 00:05:11,000 in ways that our society has sanctioned for centuries and centuries and centuries. 117 00:05:11,024 --> 00:05:12,976 It's as old as breathing. 118 00:05:13,000 --> 00:05:15,429 It's part of our culture, it's part of our history. 119 00:05:15,453 --> 00:05:20,976 Think Dante, Shakespeare, the Bible, News of the World. 120 00:05:21,000 --> 00:05:22,976 (Laughter) 121 00:05:23,000 --> 00:05:25,286 Lying has evolutionary value to us as a species. 122 00:05:25,310 --> 00:05:28,976 Researchers have long known that the more intelligent the species, 123 00:05:29,000 --> 00:05:30,976 the larger the neocortex, 124 00:05:31,000 --> 00:05:32,976 the more likely it is to be deceptive. 125 00:05:33,000 --> 00:05:34,976 Now you might remember Koko. 126 00:05:35,000 --> 00:05:38,239 Does anybody remember Koko the gorilla who was taught sign language? 127 00:05:38,263 --> 00:05:40,976 Koko was taught to communicate via sign language. 128 00:05:41,000 --> 00:05:42,976 Here's Koko with her kitten. 129 00:05:43,000 --> 00:05:45,976 It's her cute little, fluffy pet kitten. 130 00:05:46,000 --> 00:05:49,976 Koko once blamed her pet kitten for ripping a sink out of the wall. 131 00:05:50,000 --> 00:05:51,976 (Laughter) 132 00:05:52,000 --> 00:05:54,191 We're hardwired to become leaders of the pack. 133 00:05:54,215 --> 00:05:55,976 It starts really, really early. 134 00:05:56,000 --> 00:05:57,976 How early? 135 00:05:58,000 --> 00:05:59,976 Well babies will fake a cry, 136 00:06:00,000 --> 00:06:01,976 pause, wait to see who's coming 137 00:06:02,000 --> 00:06:03,976 and then go right back to crying.
138 00:06:04,000 --> 00:06:05,976 One-year-olds learn concealment. 139 00:06:06,000 --> 00:06:07,976 (Laughter) 140 00:06:08,000 --> 00:06:09,976 Two-year-olds bluff. 141 00:06:10,000 --> 00:06:11,976 Five-year-olds lie outright. 142 00:06:12,000 --> 00:06:13,976 They manipulate via flattery. 143 00:06:14,000 --> 00:06:16,976 Nine-year-olds, masters of the cover-up. 144 00:06:17,000 --> 00:06:18,976 By the time you enter college, 145 00:06:19,000 --> 00:06:22,376 you're going to lie to your mom in one out of every five interactions. 146 00:06:22,400 --> 00:06:25,276 By the time we enter this work world and we're breadwinners, 147 00:06:25,300 --> 00:06:28,976 we enter a world that is just cluttered with Spam, fake digital friends, 148 00:06:29,000 --> 00:06:30,976 partisan media, 149 00:06:31,000 --> 00:06:32,976 ingenious identity thieves, 150 00:06:33,000 --> 00:06:34,976 world-class Ponzi schemers, 151 00:06:35,000 --> 00:06:36,976 a deception epidemic -- 152 00:06:37,000 --> 00:06:41,977 in short, what one author calls a post-truth society. 153 00:06:42,001 --> 00:06:45,966 It's been very confusing for a long time now. 154 00:06:48,665 --> 00:06:49,976 What do you do? 155 00:06:50,000 --> 00:06:53,976 Well, there are steps we can take to navigate our way through the morass. 156 00:06:54,000 --> 00:06:56,976 Trained liespotters get to the truth 90 percent of the time. 157 00:06:57,000 --> 00:06:59,976 The rest of us, we're only 54 percent accurate. 158 00:07:00,000 --> 00:07:01,976 Why is it so easy to learn? 159 00:07:02,000 --> 00:07:04,215 There are good liars and bad liars. 160 00:07:04,239 --> 00:07:05,876 There are no real original liars. 161 00:07:05,900 --> 00:07:08,876 We all make the same mistakes. We all use the same techniques. 162 00:07:08,900 --> 00:07:12,529 So what I'm going to do is I'm going to show you two patterns of deception. 163 00:07:12,553 --> 00:07:14,752 And then we're going to look at the hot spots 164 00:07:14,776 --> 00:07:16,629 and see if we can find them ourselves. 165 00:07:16,653 --> 00:07:18,276 We're going to start with speech. 166 00:07:18,300 --> 00:07:20,634 (Video) Bill Clinton: I want you to listen to me. 167 00:07:20,658 --> 00:07:22,076 I'm going to say this again. 168 00:07:22,100 --> 00:07:28,976 I did not have sexual relations with that woman, Miss Lewinsky. 169 00:07:29,000 --> 00:07:32,976 I never told anybody to lie, not a single time, never. 170 00:07:33,000 --> 00:07:35,976 And these allegations are false. 171 00:07:36,000 --> 00:07:38,572 And I need to go back to work for the American people. 172 00:07:38,596 --> 00:07:40,572 Thank you. 173 00:07:40,596 --> 00:07:42,108 (Applause) 174 00:07:43,000 --> 00:07:45,976 Pamela Meyer: Okay, what were the telltale signs? 175 00:07:46,000 --> 00:07:49,976 Well first we heard what's known as a non-contracted denial. 176 00:07:50,000 --> 00:07:53,000 Studies show that people who are overdetermined in their denial 177 00:07:53,024 --> 00:07:55,976 will resort to formal rather than informal language. 178 00:07:56,000 --> 00:07:58,976 We also heard distancing language: "that woman." 179 00:07:59,000 --> 00:08:01,715 We know that liars will unconsciously distance themselves 180 00:08:01,739 --> 00:08:02,976 from their subject, 181 00:08:03,000 --> 00:08:05,976 using language as their tool. 182 00:08:06,000 --> 00:08:09,048 Now if Bill Clinton had said, "Well, to tell you the truth..." 183 00:08:09,072 --> 00:08:11,263 or Richard Nixon's favorite, "In all candor..." 
184 00:08:11,287 --> 00:08:12,976 he would have been a dead giveaway 185 00:08:13,000 --> 00:08:14,976 for any liespotter that knows 186 00:08:15,000 --> 00:08:18,429 that qualifying language, as it's called, qualifying language like that, 187 00:08:18,453 --> 00:08:19,976 further discredits the subject. 188 00:08:20,000 --> 00:08:22,976 Now if he had repeated the question in its entirety, 189 00:08:23,000 --> 00:08:26,976 or if he had peppered his account with a little too much detail -- 190 00:08:27,000 --> 00:08:29,191 and we're all really glad he didn't do that -- 191 00:08:29,215 --> 00:08:31,215 he would have further discredited himself. 192 00:08:31,239 --> 00:08:32,976 Freud had it right. 193 00:08:33,000 --> 00:08:35,976 Freud said, look, there's much more to it than speech: 194 00:08:36,000 --> 00:08:38,976 "No mortal can keep a secret. 195 00:08:39,000 --> 00:08:41,976 If his lips are silent, he chatters with his fingertips." 196 00:08:42,000 --> 00:08:44,976 And we all do it no matter how powerful you are. 197 00:08:45,000 --> 00:08:46,976 We all chatter with our fingertips. 198 00:08:47,000 --> 00:08:49,976 I'm going to show you Dominique Strauss-Kahn with Obama 199 00:08:50,000 --> 00:08:52,976 who's chattering with his fingertips. 200 00:08:53,000 --> 00:08:55,976 (Laughter) 201 00:08:56,000 --> 00:09:01,976 Now this brings us to our next pattern, which is body language. 202 00:09:02,000 --> 00:09:04,976 With body language, here's what you've got to do. 203 00:09:05,000 --> 00:09:07,976 You've really got to just throw your assumptions out the door. 204 00:09:08,000 --> 00:09:10,429 Let the science temper your knowledge a little bit. 205 00:09:10,453 --> 00:09:12,976 Because we think liars fidget all the time. 206 00:09:13,000 --> 00:09:16,762 Well guess what, they're known to freeze their upper bodies when they're lying. 207 00:09:16,786 --> 00:09:18,976 We think liars won't look you in the eyes. 208 00:09:19,000 --> 00:09:21,876 Well guess what, they look you in the eyes a little too much 209 00:09:21,900 --> 00:09:23,476 just to compensate for that myth. 210 00:09:23,500 --> 00:09:26,976 We think warmth and smiles convey honesty, sincerity. 211 00:09:27,000 --> 00:09:30,976 But a trained liespotter can spot a fake smile a mile away. 212 00:09:31,000 --> 00:09:34,000 Can you all spot the fake smile here? 213 00:09:35,000 --> 00:09:39,976 You can consciously contract the muscles in your cheeks. 214 00:09:40,000 --> 00:09:42,976 But the real smile's in the eyes, the crow's feet of the eyes. 215 00:09:43,000 --> 00:09:44,976 They cannot be consciously contracted, 216 00:09:45,000 --> 00:09:46,976 especially if you overdid the Botox. 217 00:09:47,000 --> 00:09:49,976 Don't overdo the Botox; nobody will think you're honest. 218 00:09:50,000 --> 00:09:51,976 Now we're going to look at the hot spots. 219 00:09:52,000 --> 00:09:54,286 Can you tell what's happening in a conversation? 220 00:09:54,310 --> 00:09:56,976 Can you start to find the hot spots 221 00:09:57,000 --> 00:09:58,976 to see the discrepancies 222 00:09:59,000 --> 00:10:01,191 between someone's words and someone's actions? 223 00:10:01,215 --> 00:10:02,976 Now, I know it seems really obvious, 224 00:10:03,000 --> 00:10:07,976 but when you're having a conversation with someone you suspect of deception, 225 00:10:08,000 --> 00:10:11,076 attitude is by far the most overlooked but telling of indicators. 226 00:10:11,100 --> 00:10:13,296 An honest person is going to be cooperative.
227 00:10:13,320 --> 00:10:15,368 They're going to show they're on your side. 228 00:10:15,392 --> 00:10:16,976 They're going to be enthusiastic. 229 00:10:17,000 --> 00:10:20,276 They're going to be willing and helpful to getting you to the truth. 230 00:10:20,300 --> 00:10:23,076 They're going to be willing to brainstorm, name suspects, 231 00:10:23,100 --> 00:10:24,376 provide details. 232 00:10:24,400 --> 00:10:25,976 They're going to say, 233 00:10:26,000 --> 00:10:29,176 "Hey, maybe it was those guys in payroll that forged those checks." 234 00:10:29,200 --> 00:10:32,476 They're going to be infuriated if they sense they're wrongly accused 235 00:10:32,500 --> 00:10:35,676 throughout the entire course of the interview, not just in flashes; 236 00:10:35,700 --> 00:10:38,939 they'll be infuriated throughout the entire course of the interview. 237 00:10:38,963 --> 00:10:40,376 And if you ask someone honest 238 00:10:40,400 --> 00:10:42,976 what should happen to whomever did forge those checks, 239 00:10:43,000 --> 00:10:44,776 an honest person is much more likely 240 00:10:44,800 --> 00:10:47,976 to recommend strict rather than lenient punishment. 241 00:10:48,000 --> 00:10:50,667 Now let's say you're having that exact same conversation 242 00:10:50,691 --> 00:10:51,976 with someone deceptive. 243 00:10:52,000 --> 00:10:53,976 That person may be withdrawn, 244 00:10:54,000 --> 00:10:55,976 look down, lower their voice, 245 00:10:56,000 --> 00:10:57,976 pause, be kind of herky-jerky. 246 00:10:58,000 --> 00:11:00,048 Ask a deceptive person to tell their story, 247 00:11:00,072 --> 00:11:02,976 they're going to pepper it with way too much detail 248 00:11:03,000 --> 00:11:05,976 in all kinds of irrelevant places. 249 00:11:06,000 --> 00:11:09,476 And then they're going to tell their story in strict chronological order. 250 00:11:09,500 --> 00:11:11,276 And what a trained interrogator does 251 00:11:11,300 --> 00:11:14,976 is they come in and in very subtle ways over the course of several hours, 252 00:11:15,000 --> 00:11:17,976 they will ask that person to tell that story backwards, 253 00:11:18,000 --> 00:11:19,976 and then they'll watch them squirm, 254 00:11:20,000 --> 00:11:23,429 and track which questions produce the highest volume of deceptive tells. 255 00:11:23,453 --> 00:11:25,976 Why do they do that? Well, we all do the same thing. 256 00:11:26,000 --> 00:11:27,976 We rehearse our words, 257 00:11:28,000 --> 00:11:29,976 but we rarely rehearse our gestures. 258 00:11:30,000 --> 00:11:31,976 We say "yes," we shake our heads "no." 259 00:11:32,000 --> 00:11:35,096 We tell very convincing stories, we slightly shrug our shoulders. 260 00:11:35,120 --> 00:11:36,976 We commit terrible crimes, 261 00:11:37,000 --> 00:11:39,976 and we smile at the delight in getting away with it. 262 00:11:40,000 --> 00:11:42,976 Now, that smile is known in the trade as "duping delight." 263 00:11:43,000 --> 00:11:45,976 And we're going to see that in several videos moving forward, 264 00:11:46,000 --> 00:11:49,076 but we're going to start -- for those of you who don't know him, 265 00:11:49,100 --> 00:11:51,176 this is presidential candidate John Edwards 266 00:11:51,200 --> 00:11:53,976 who shocked America by fathering a child out of wedlock. 267 00:11:54,000 --> 00:11:56,976 We're going to see him talk about getting a paternity test. 
268 00:11:57,000 --> 00:12:01,000 See now if you can spot him saying, "yes" while shaking his head "no," 269 00:12:01,024 --> 00:12:02,976 slightly shrugging his shoulders. 270 00:12:03,000 --> 00:12:05,676 (Video) John Edwards: I'd be happy to participate in one. 271 00:12:05,700 --> 00:12:08,576 I know that it's not possible that this child could be mine, 272 00:12:08,600 --> 00:12:10,176 because of the timing of events. 273 00:12:10,200 --> 00:12:11,976 So I know it's not possible. 274 00:12:12,000 --> 00:12:15,976 Happy to take a paternity test, and would love to see it happen. 275 00:12:16,000 --> 00:12:19,048 Interviewer: Are you going to do that soon? Is there somebody -- 276 00:12:19,072 --> 00:12:21,976 JE: Well, I'm only one side. I'm only one side of the test. 277 00:12:22,000 --> 00:12:24,604 But I'm happy to participate in one. 278 00:12:25,795 --> 00:12:28,229 PM: Okay, those head shakes are much easier to spot 279 00:12:28,253 --> 00:12:29,776 once you know to look for them. 280 00:12:29,800 --> 00:12:33,776 There're going to be times when someone makes one expression 281 00:12:33,800 --> 00:12:36,976 while masking another that just kind of leaks through in a flash. 282 00:12:37,000 --> 00:12:38,976 Murderers are known to leak sadness. 283 00:12:39,000 --> 00:12:41,576 Your new joint venture partner might shake your hand, 284 00:12:41,600 --> 00:12:45,976 celebrate, go out to dinner with you and then leak an expression of anger. 285 00:12:46,000 --> 00:12:49,576 And we're not all going to become facial expression experts overnight here, 286 00:12:49,600 --> 00:12:52,078 but there's one I can teach you that's very dangerous 287 00:12:52,102 --> 00:12:53,315 and it's easy to learn, 288 00:12:53,339 --> 00:12:55,176 and that's the expression of contempt. 289 00:12:55,200 --> 00:12:58,200 Now with anger, you've got two people on an even playing field. 290 00:12:58,224 --> 00:13:00,415 It's still somewhat of a healthy relationship. 291 00:13:00,439 --> 00:13:03,976 But when anger turns to contempt, you've been dismissed. 292 00:13:04,000 --> 00:13:05,976 It's associated with moral superiority. 293 00:13:06,000 --> 00:13:08,976 And for that reason, it's very, very hard to recover from. 294 00:13:09,000 --> 00:13:10,976 Here's what it looks like. 295 00:13:11,000 --> 00:13:14,976 It's marked by one lip corner pulled up and in. 296 00:13:15,000 --> 00:13:17,976 It's the only asymmetrical expression. 297 00:13:18,000 --> 00:13:21,976 And in the presence of contempt, whether or not deception follows -- 298 00:13:22,000 --> 00:13:23,976 and it doesn't always follow -- 299 00:13:24,000 --> 00:13:26,048 look the other way, go the other direction, 300 00:13:26,072 --> 00:13:27,976 reconsider the deal, 301 00:13:28,000 --> 00:13:31,976 say, "No thank you. I'm not coming up for just one more nightcap. Thank you." 302 00:13:32,000 --> 00:13:35,976 Science has surfaced many, many more indicators. 303 00:13:36,000 --> 00:13:37,976 We know, for example, 304 00:13:38,000 --> 00:13:40,000 we know liars will shift their blink rate, 305 00:13:40,024 --> 00:13:41,976 point their feet towards an exit. 306 00:13:42,000 --> 00:13:43,976 They will take barrier objects 307 00:13:44,000 --> 00:13:47,429 and put them between themselves and the person that is interviewing them. 308 00:13:47,453 --> 00:13:48,976 They'll alter their vocal tone, 309 00:13:49,000 --> 00:13:51,976 often making their vocal tone much lower. 310 00:13:52,000 --> 00:13:53,976 Now here's the deal. 
311 00:13:54,000 --> 00:13:56,976 These behaviors are just behaviors. 312 00:13:57,000 --> 00:13:58,976 They're not proof of deception. 313 00:13:59,000 --> 00:14:00,976 They're red flags. 314 00:14:01,000 --> 00:14:02,976 We're human beings. 315 00:14:03,000 --> 00:14:06,276 We make deceptive flailing gestures all over the place all day long. 316 00:14:06,300 --> 00:14:08,491 They don't mean anything in and of themselves. 317 00:14:08,515 --> 00:14:11,076 But when you see clusters of them, that's your signal. 318 00:14:11,100 --> 00:14:13,976 Look, listen, probe, ask some hard questions, 319 00:14:14,000 --> 00:14:16,976 get out of that very comfortable mode of knowing, 320 00:14:17,000 --> 00:14:19,976 walk into curiosity mode, ask more questions, 321 00:14:20,000 --> 00:14:23,376 have a little dignity, treat the person you're talking to with rapport. 322 00:14:23,400 --> 00:14:26,976 Don't try to be like those folks on "Law & Order" and those other TV shows 323 00:14:27,000 --> 00:14:29,048 that pummel their subjects into submission. 324 00:14:29,072 --> 00:14:31,076 Don't be too aggressive, it doesn't work. 325 00:14:32,119 --> 00:14:35,476 Now, we've talked a little bit about how to talk to someone who's lying 326 00:14:35,500 --> 00:14:36,976 and how to spot a lie. 327 00:14:37,000 --> 00:14:40,476 And as I promised, we're now going to look at what the truth looks like. 328 00:14:40,500 --> 00:14:42,276 But I'm going to show you two videos, 329 00:14:42,300 --> 00:14:44,976 two mothers -- one is lying, one is telling the truth. 330 00:14:45,000 --> 00:14:49,096 And these were surfaced by researcher David Matsumoto in California. 331 00:14:49,120 --> 00:14:52,976 And I think they're an excellent example of what the truth looks like. 332 00:14:53,000 --> 00:14:54,976 This mother, Diane Downs, 333 00:14:55,000 --> 00:14:56,976 shot her kids at close range, 334 00:14:57,000 --> 00:15:00,976 drove them to the hospital while they bled all over the car, 335 00:15:01,000 --> 00:15:02,976 claimed a scraggy-haired stranger did it. 336 00:15:03,000 --> 00:15:04,976 And you'll see when you see the video, 337 00:15:05,000 --> 00:15:07,334 she can't even pretend to be an agonizing mother. 338 00:15:07,358 --> 00:15:10,976 What you want to look for here is an incredible discrepancy 339 00:15:11,000 --> 00:15:14,976 between horrific events that she describes and her very, very cool demeanor. 340 00:15:15,000 --> 00:15:18,476 And if you look closely, you'll see duping delight throughout this video. 341 00:15:18,500 --> 00:15:20,976 (Video) Diane Downs: At night when I close my eyes, 342 00:15:21,000 --> 00:15:24,096 I can see Christie reaching her hand out to me while I'm driving, 343 00:15:24,120 --> 00:15:26,376 and the blood just kept coming out of her mouth. 344 00:15:26,400 --> 00:15:28,543 And that -- maybe it'll fade too with time -- 345 00:15:28,567 --> 00:15:29,976 but I don't think so. 346 00:15:30,000 --> 00:15:33,000 That bothers me the most. 347 00:15:40,000 --> 00:15:41,976 PM: Now I'm going to show you a video 348 00:15:42,000 --> 00:15:44,048 of an actual grieving mother, Erin Runnion, 349 00:15:44,072 --> 00:15:47,976 confronting her daughter's murderer and torturer in court. 350 00:15:48,000 --> 00:15:50,000 Here you're going to see no false emotion, 351 00:15:50,024 --> 00:15:52,976 just the authentic expression of a mother's agony. 
352 00:15:53,000 --> 00:15:55,091 (Video) Erin Runnion: I wrote this statement 353 00:15:55,115 --> 00:15:57,776 on the third anniversary of the night you took my baby, 354 00:15:57,800 --> 00:15:59,076 and you hurt her, 355 00:15:59,100 --> 00:16:00,976 and you crushed her, 356 00:16:01,000 --> 00:16:04,976 you terrified her until her heart stopped. 357 00:16:05,000 --> 00:16:07,976 And she fought, and I know she fought you. 358 00:16:08,000 --> 00:16:11,976 But I know she looked at you with those amazing brown eyes, 359 00:16:12,000 --> 00:16:14,976 and you still wanted to kill her. 360 00:16:15,000 --> 00:16:16,976 And I don't understand it, 361 00:16:17,000 --> 00:16:18,547 and I never will. 362 00:16:20,650 --> 00:16:23,976 PM: Okay, there's no doubting the veracity of those emotions. 363 00:16:24,000 --> 00:16:26,976 Now the technology around what the truth looks like 364 00:16:27,000 --> 00:16:29,976 is progressing on, the science of it. 365 00:16:30,000 --> 00:16:31,976 We know, for example, 366 00:16:32,000 --> 00:16:35,143 that we now have specialized eye trackers and infrared brain scans, 367 00:16:35,167 --> 00:16:37,976 MRIs that can decode the signals that our bodies send out 368 00:16:38,000 --> 00:16:39,976 when we're trying to be deceptive. 369 00:16:40,000 --> 00:16:42,976 And these technologies are going to be marketed to all of us 370 00:16:43,000 --> 00:16:44,976 as panaceas for deceit, 371 00:16:45,000 --> 00:16:47,976 and they will prove incredibly useful some day. 372 00:16:48,000 --> 00:16:50,239 But you've got to ask yourself in the meantime: 373 00:16:50,263 --> 00:16:52,359 Who do you want on your side of the meeting, 374 00:16:52,383 --> 00:16:54,976 someone who's trained in getting to the truth 375 00:16:55,000 --> 00:16:58,076 or some guy who's going to drag a 400-pound electroencephalogram 376 00:16:58,100 --> 00:16:59,376 through the door? 377 00:16:59,400 --> 00:17:02,976 Liespotters rely on human tools. 378 00:17:03,000 --> 00:17:04,976 They know, as someone once said, 379 00:17:05,000 --> 00:17:06,976 "Character's who you are in the dark." 380 00:17:07,000 --> 00:17:10,976 And what's kind of interesting is that today, we have so little darkness. 381 00:17:11,000 --> 00:17:13,976 Our world is lit up 24 hours a day. 382 00:17:14,000 --> 00:17:17,976 It's transparent with blogs and social networks 383 00:17:18,000 --> 00:17:20,676 broadcasting the buzz of a whole new generation of people 384 00:17:20,700 --> 00:17:23,276 that have made a choice to live their lives in public. 385 00:17:23,300 --> 00:17:26,976 It's a much more noisy world. 386 00:17:27,000 --> 00:17:30,976 So one challenge we have is to remember, 387 00:17:31,000 --> 00:17:33,976 oversharing, that's not honesty. 388 00:17:34,000 --> 00:17:37,976 Our manic tweeting and texting can blind us 389 00:17:38,000 --> 00:17:41,576 to the fact that the subtleties of human decency -- character integrity -- 390 00:17:41,600 --> 00:17:44,648 that's still what matters, that's always what's going to matter. 391 00:17:44,672 --> 00:17:46,176 So in this much noisier world, 392 00:17:46,200 --> 00:17:47,976 it might make sense for us 393 00:17:48,000 --> 00:17:52,976 to be just a little bit more explicit about our moral code. 394 00:17:53,000 --> 00:17:55,576 When you combine the science of recognizing deception 395 00:17:55,600 --> 00:17:57,276 with the art of looking, listening, 396 00:17:57,300 --> 00:17:59,976 you exempt yourself from collaborating in a lie.
397 00:18:00,000 --> 00:18:03,976 You start up that path of being just a little bit more explicit, 398 00:18:04,000 --> 00:18:06,000 because you signal to everyone around you, 399 00:18:06,024 --> 00:18:10,976 you say, "Hey, my world, our world, it's going to be an honest one. 400 00:18:11,000 --> 00:18:13,620 My world is going to be one where truth is strengthened 401 00:18:13,644 --> 00:18:15,976 and falsehood is recognized and marginalized." 402 00:18:16,000 --> 00:18:17,976 And when you do that, 403 00:18:18,000 --> 00:18:20,976 the ground around you starts to shift just a little bit. 404 00:18:21,000 --> 00:18:23,976 And that's the truth. Thank you. 405 00:18:24,000 --> 00:18:29,000 (Applause)