Truth and deception. Lying and telling the truth, and knowing the difference between the two. It's something that I think about a lot. And it's not that I'm sort of a cynical, morbid old guy who thinks a lot about lying, but I do, and I do because it's important.

(Laughter)

It's important for us on an interpersonal level. We have conversations with people; we're often concerned whether they're being truthful with us or not. It's important for us as a society. It's important for us as a world. When lies get told to societies, and they're believed, it has a special name; it's called propaganda. And when propaganda happens, some really bad things can happen.

And so I want to talk about that topic tonight from the point of view of science. And the science is really interesting, because it's got some interesting contradictions and, I think, some surprising findings in it. But I'd like to make this real for you with a couple of stories, just to begin with.

The first one is about a young man named Jeffrey Deskovic. You can see Jeffrey here; he's about 16 years old, living the life of the average American 15- and 16-year-old.
He was in high school, everything was going fairly well, and in the fall of 1989, a tragedy befell his high school and one of his classmates. A classmate named Angela went out on a photographic project and went missing. She was gone for a couple of days. And then her body was found, and she'd been brutally raped and murdered.

Jeffrey took this kind of hard. He knew Angela; they were not boyfriend and girlfriend, but they knew each other. Angela had been nice to him; she'd helped him with his homework. And he was deeply affected by her loss. He went to the funeral. He wrote a letter that he put on the casket. He broke down and cried during the funeral.

This was in a small town in New York. The police were very interested in solving the crime, and as they often do, they went to the funeral. They thought Jeffrey reacted just too much, and they became interested in Jeffrey. They interviewed Jeffrey a couple of times, but they also interviewed a lot of other people. And for some reason, they just got very interested in Jeffrey. One morning, they came to his school, and unbeknownst to his parents, they took Jeffrey, and they asked him if he'd come and take a polygraph test.
He thought he was going to go down to the local police station, but he was taken some distance away. He was introduced to a polygraph examiner. And really, there was no intention of running a real polygraph test that day. They wanted to interrogate him.

The interrogation lasted six hours. And at the end of six hours, Jeffrey was reduced to tears. He was on the floor in the fetal position. The polygraph examiner, who had been with him for those six hours, gave up, and two other detectives came in. And they did what you hear called the "good cop." So they were now the helpful police. And they told Jeffrey, "If you'll just be honest with us, we'll get you some treatment; you won't have to go to jail." And within about an hour, he had confessed to having raped and killed Angela.

But they were lying to him. They didn't take him to treatment. They arrested him and charged him with murder. He almost immediately recanted the confession and said he only did that because he thought that was the only way he was going to get away from the police. He eventually went to trial, was convicted, and spent the next 16 years in jail.

The Innocence Project got involved. They discovered that there was still DNA from the victim that the rapist had left behind.
After a long court battle, they got permission to test that against the national database, which now existed, 16 years later. And a hit was found, a positive hit, on the contributor of the semen. And Jeffrey was released as a wrongfully convicted person.

He sued and ended up with quite a bit of money. And today, Jeffrey is actually doing quite well. He took some of that money and created a foundation to help wrongfully convicted people. And right now he's in law school, going to become a lawyer to represent wrongfully convicted people.

But here's a young man who told the truth to the police, and he wasn't believed. Then he lied to the police, because he gave a false confession, and he was believed. Then he went to trial, and 12 people sat in judgment over him, and he said that he was innocent, and they didn't believe him. And he went to jail.

That would be tragedy enough if it were just Jeffrey. But it's not just Jeffrey. From the Innocence Project, we know that there have been quite a number of wrongfully convicted people. About one out of four of them wrongfully confessed to a crime they didn't commit. So this is not an unusual, isolated event.

The other event I want to mention is 9/11.
And 9/11 is of interest because those 19 terrorists who came to the United States to attack us, every one of them was interviewed by agents of the United States Government. Every one of them, interviewed at least three times. They were interviewed at the embassy where they applied for a visa - probably interviewed more than once, given where they came from. They were interviewed by Customs when they came to the border of the United States to come inside the United States. And they were also interviewed by Immigration. Each of them lied three times. We didn't catch even one of them.

Think what a different world we'd live in today if we'd caught even one of them early on and talked to them.

So, that's my interest, and this is what I do. I'm a psychological scientist. I study truthfulness and deception and how you tell the difference. And that science, I think, is really interesting. And I want to give credit to some people, because this is not just my work; this is the work of a group of people who do research in this area. And I want to just give some of them credit upfront: Charles Bond, Bella DePaulo, Maria Hartwig, David Raskin, Aldert Vrij. And if you're interested, those are names to go look for to read more.
And the science has some interesting things in it. One is that we have interesting attitudes about lying. If you ask people, "What makes up good character? What do you want to see in people to think that they're a good person and have good character?" near the top of that list is always "sincere, honest, truthful." And at the bottom of that list - what's bad character? - the number one answer is "liar." We don't like liars. At least we say we don't like liars.

But if you go and you look at people's behavior, you find out something really funny, really interesting, that doesn't fit. Because if you look at people's behavior, they lie frequently. If you look at conversations that last at least 10 minutes, the data say we lie in about one out of every four of those. We lie to about one of every three people we have a conversation with. When we talk to our significant others, our spouses, we lie in about one out of ten of those conversations.

Hmm.

(Laughter)

Now, all that lying might not be a problem if we were good at detecting lies. And actually, if you ask people, most people think they're pretty good at detecting lies, particularly in people they know well, like their children or their significant others.
But again, when you look at the data, that's not true. And I wish I could tell you that there is some magic bullet, some magic finding from the research. Goodness knows there are people out there who will tell you that, because they teach seminars, and they sell books, and they train police officers. But what the research says is that if you look at the ability to have a conversation with someone and detect whether they're telling you the truth or not, your accuracy is about 54 percent. If you flip a coin, it's 50 percent. That's how bad it is.

And it turns out there just really isn't much to find. You know, I can ask the question, "Why? Why are we so bad at this?" And I think there are two reasons. One has to do with our motivations. There's a very large psychological literature about how we make decisions, and I think it's relevant here. I don't have time to tell you all the details of that research, but I can capture it for you from some literature.

There was a fantasy book that was very popular about 10 years ago, written by a fellow named Terry Goodkind; it's called Wizard's First Rule.
And the wizard's first rule is this: with a little bit of motivation, almost anybody can be led to believe almost anything, either because (1) they want to believe it's true, or (2) they're afraid that it's true.

I want you to think about that for a little while and just process it. Because, you know, that captures an awful lot of psychological research. So that's one problem we have: our motives get in the way of how we interpret the data that we see.

The other is that it turns out it is just really hard to do this. There is very little in the body language, in the voice, and in the face that gives away liars. And the reason for that is that the very things that make the liar nervous - they're afraid they're going to get caught - do cause changes; people do get anxious about that, and they do alter their behavior. But the truthful person - think about Jeffrey - the truthful person who is talking to people who he thinks don't believe him has that same fear: the fear of not being believed when you're truthful. And the body language you give off is exactly the same.

So, where does that leave us? Well, science hasn't given up. And one approach is technology.
I've been involved in one of those technologies for most of my adult life, and that's polygraph testing. I was trained as a polygraph examiner and have done that work. And although there's a lot of mythology about the polygraph, what the science says is this: properly conducted polygraph tests can be rather accurate. In the lab, we can easily get test accuracies up around 90 percent, and there are field studies that have replicated that.

But there's a problem with the polygraph. One is it requires a skilled examiner to run the test. And a test takes two hours. So you need an instrument, an examiner, and two hours. Most of the time, we don't have that; it's really expensive in that regard.

The other problem is that there are police agencies, like the one that Jeffrey Deskovic got involved with, that use the polygraph as a pretense for interrogation. And the polygraph is very dangerous in that setting. Recent research has reported that if you look at the FBI - and the FBI is an agency that uses the polygraph as a pretense to interrogate - if you are actually innocent and you agree to take a polygraph from the FBI, your chance of not being interrogated is only 20 percent. Four out of five innocent people get interrogated.

There are other technologies.
There are some new ones that look at the central nervous system, such as EEG and fMRI. And again, they involve expensive equipment. They've looked interesting in the laboratory, but they haven't made it out into the field.

There's also a new test, called an ocular-motor detection of deception test, that looks at the eyes. The pupils of our eyes do get smaller and larger as we process, and it is harder to lie than it is to tell the truth. Those tests are around, and there's some research on them that looks promising. We're waiting for more data to come in, and we will need to see if they'll move out into the field and be usable.

The final area, though, comes back to this: the problem with all those technologies is that they're expensive. You're not going to use them if you want to talk to somebody who works for you, or you want to find out whether your spouse is telling you the truth or not.

And so, are there ways that we can improve interpersonal detection of deception, given that reading body language and faces doesn't work? There are, because what the research says is that what people say is way more important than how they say it.
So, there's an interview technique that's been developed - it's a forensic interview technique - and it's really pretty much common sense. But, of course, common sense often isn't common.

And all this technique really involves is letting the person you're interested in assessing tell their story. It's called a free narrative. So, you literally do just that: "Tell me what happened." And then you have to stop talking. A lot of us find that really hard to do. That's not how we have conversations with people; conversations usually go back and forth. But in this case, you want to just ask them to tell you their story, and then stop talking, and listen.

Listen carefully; take notes. If you have facts - things you know to be true - don't tell them about those. If they contradict them, don't confront them. Just listen until they finish. And then, when they finish, you strategically use your evidence. It's called "the strategic use of evidence." Isn't that surprising?

(Laughter)

So the person finishes their story. You've been listening carefully. You know X happened, but they never mentioned X. And so you go, "Well, I'm confused. I know that at this time X happened, but you didn't talk about that.
Can you tell me why?" And for the innocent person, the most likely answer is that they just forgot. Because when we tell stories about things we have experienced, and they're true stories, we often forget to tell the details; we often have to go back and fix it. Think about talking to a friend about something that happened to you just recently, and you'll know that that's true. We do that all the time.

For the innocent person, that new evidence is easy to incorporate into their story, because it's in fact true. Their story is true. "Oh, I just forgot to tell you that. This is blah blah ..." And they fill it in.

For the liar, it's much more difficult. Because the liar - you know, the first rule of lying is you've got to keep the lie straight. And so they've now told you their whole story, and now you've given them a new piece of evidence. They've got to work a lot harder to put that back in. And if you have more pieces of evidence, you strategically introduce them one at a time. And what we find is that with people who are outright lying, their stories eventually collapse, because they just can't keep up with the influx of new evidence.

So, I want to end with two ideas.
One is that if you have to talk to somebody and try to make an assessment about whether they're being truthful or not: listen. Listen carefully to what they have to say, and remember that what they say is far more important than how they say it, because even innocent people can look very nervous.

And the other one is: don't believe anything just because you want to believe that it's true, or because you're afraid that it's true.

Our relationships, our lives, our country, the world, in fact, may well depend upon that.

Thank you very much.

(Applause)