1 00:00:00,501 --> 00:00:04,428 Okay, let's have a look at risk management in practice 2 00:00:04,457 --> 00:00:08,474 And what I want to do is to start with some basic concepts 3 00:00:08,485 --> 00:00:14,015 then focus on TWO difficult areas in the risk process 4 00:00:14,242 --> 00:00:19,164 So, I guess if I asked you to define the word 'risk' 5 00:00:19,174 --> 00:00:22,954 you would have some idea of what it meant 6 00:00:22,967 --> 00:00:26,476 We might not have a formal definition that we could quote, 7 00:00:26,476 --> 00:00:30,253 but we all have something in our minds when we hear the word 'risk' 8 00:00:30,274 --> 00:00:33,972 This is what we think, and maybe you think of things like this 9 00:00:34,234 --> 00:00:38,534 Maybe you feel like this little guy, facing some big ugly challenge 10 00:00:38,534 --> 00:00:41,791 that you know is just going to squash you flat. 11 00:00:42,066 --> 00:00:43,766 Maybe you feel like this guy. 12 00:00:44,024 --> 00:00:46,489 This is a real job in North Korea, 13 00:00:47,768 --> 00:00:51,367 and his job is to hold the target for other people to shoot at 14 00:00:51,500 --> 00:00:53,979 Sometimes project managers have the target here 15 00:00:54,292 --> 00:00:56,911 We feel like everybody is shooting at us in our job 16 00:00:57,757 --> 00:01:01,987 Or maybe you just know there's something nasty out there, waiting to get you 17 00:01:02,470 --> 00:01:05,700 And maybe that's what you think of when you think of the word 'risk' 18 00:01:06,361 --> 00:01:09,574 Well that's partly true but it's not the whole truth. 19 00:01:10,193 --> 00:01:13,625 Risk is not the same as uncertainty. 20 00:01:14,220 --> 00:01:16,963 Risk is related to uncertainty but they're different. 21 00:01:17,670 --> 00:01:23,826 So all risks are uncertain but not all uncertainties are risks. 
22 00:01:24,653 --> 00:01:27,589 If you have a risk register or a risk list, 23 00:01:27,942 --> 00:01:31,435 you don't have a million items in it, or you shouldn't. 24 00:01:31,962 --> 00:01:34,512 You probably don't even have a thousand items in it, 25 00:01:34,512 --> 00:01:35,788 you have a smaller number. 26 00:01:36,714 --> 00:01:40,007 Although there are millions of uncertainties in the world. 27 00:01:40,400 --> 00:01:44,313 So how do we decide which uncertainties we're going to call 'risk'? 28 00:01:44,677 --> 00:01:47,280 And write them down and put them in our risk register 29 00:01:47,548 --> 00:01:50,057 and decide to do something about them. 30 00:01:50,483 --> 00:01:56,362 Clearly 'risk' is a subset of uncertainties, but which subset? 31 00:01:56,942 --> 00:01:58,133 How do you know? 32 00:01:58,798 --> 00:02:02,948 I think it's very simple to separate risk and uncertainty. 33 00:02:03,199 --> 00:02:05,274 And I use three English words, 34 00:02:05,425 --> 00:02:10,019 these words here: "risk is uncertainty that matters." 35 00:02:11,503 --> 00:02:14,783 Because most of the uncertainties in the world don't matter. 36 00:02:15,564 --> 00:02:19,014 We don't care if it's going to rain in London tomorrow afternoon. 37 00:02:19,400 --> 00:02:23,780 It might, it might not. It's irrelevant, it doesn't matter. 38 00:02:24,498 --> 00:02:26,948 We don't care what the exchange rate will be 39 00:02:26,948 --> 00:02:30,703 between the Russian Ruble and the Chinese Yuan in 2020. 40 00:02:30,703 --> 00:02:32,387 It doesn't matter to us. 41 00:02:32,888 --> 00:02:35,118 But there are things on our projects, 42 00:02:35,427 --> 00:02:37,117 and things in our families, 43 00:02:37,259 --> 00:02:38,859 and things in our country, 44 00:02:38,979 --> 00:02:41,446 which are uncertain and which do matter to us. 45 00:02:42,195 --> 00:02:45,338 If it's an uncertainty that matters, it's a risk.
46 00:02:46,188 --> 00:02:49,991 So here's another question, how do you know what matters? 47 00:02:50,751 --> 00:02:53,396 In your projects, what are the things that matter? 48 00:02:54,077 --> 00:02:57,875 The things that matter in our projects are our objectives. 49 00:02:58,532 --> 00:03:02,216 So we must always connect uncertainty with objectives, 50 00:03:02,991 --> 00:03:05,631 in order to find the risks. 51 00:03:06,005 --> 00:03:08,355 And if we look at some definitions of risk, 52 00:03:08,361 --> 00:03:11,405 this is the ISO standard that I mentioned, 53 00:03:11,445 --> 00:03:13,796 it connects those words very simply; 54 00:03:13,796 --> 00:03:17,559 Risk is the effect of uncertainty on objectives. 55 00:03:18,451 --> 00:03:21,201 And we might look at another definition from the UK, 56 00:03:21,387 --> 00:03:23,916 from our association for project management, 57 00:03:24,144 --> 00:03:28,134 it says the same thing that risk is an uncertain event 58 00:03:28,152 --> 00:03:32,212 or a set of circumstances, which is uncertain, 59 00:03:32,212 --> 00:03:35,372 but it matters because should it occur, 60 00:03:35,372 --> 00:03:38,587 it will have an effect on achievement of objectives. 61 00:03:38,603 --> 00:03:40,553 Uncertainty that matters. 62 00:03:40,867 --> 00:03:44,317 So we should be looking in our risk register for two things: 63 00:03:44,703 --> 00:03:48,803 "Is it uncertain?" We don't want problems in our risk register. 64 00:03:49,183 --> 00:03:52,113 We don't want issues in the risk register. 65 00:03:52,113 --> 00:03:55,020 We don't want constraints or requirements. 66 00:03:55,291 --> 00:03:59,631 These things are certain, what we want is uncertainties, 67 00:03:59,721 --> 00:04:02,241 something that might happen or might not happen. 68 00:04:03,076 --> 00:04:06,766 But the other important question for our risk register is 69 00:04:06,766 --> 00:04:08,366 "Does it matter?" 
70 00:04:08,439 --> 00:04:11,739 Which objective would be affected if this thing happened? 71 00:04:13,304 --> 00:04:15,810 And then when we want to see how big the risk is, 72 00:04:16,160 --> 00:04:18,322 we can ask those two questions: 73 00:04:18,338 --> 00:04:19,768 "How uncertain is it, 74 00:04:19,985 --> 00:04:22,095 and how much does it matter?" 75 00:04:22,132 --> 00:04:24,502 And that will tell us how big the risk is. 76 00:04:24,564 --> 00:04:27,044 So, this idea of uncertainty that matters 77 00:04:27,084 --> 00:04:30,536 then develops into something which is useful 78 00:04:30,618 --> 00:04:33,317 by linking uncertainty to our objectives. 79 00:04:34,767 --> 00:04:37,148 So, we have two dimensions of 'risk': 80 00:04:37,409 --> 00:04:39,658 we have an uncertainty dimension and we 81 00:04:39,658 --> 00:04:42,093 have a dimension that affects our objectives. 82 00:04:43,249 --> 00:04:47,419 In projects, we call this probability and impact. 83 00:04:47,449 --> 00:04:49,502 We could call them other things, 84 00:04:49,502 --> 00:04:51,233 there are other English 85 00:04:51,233 --> 00:04:52,767 words we could use, but these 86 00:04:52,767 --> 00:04:54,553 are the ones we use most often. 87 00:04:54,702 --> 00:04:57,732 And I would like to ask you, with this picture of the mouse: 88 00:04:59,632 --> 00:05:04,647 what effect matters to the mouse? 89 00:05:05,874 --> 00:05:09,315 So first of all, clearly, he is in an uncertain situation here. 90 00:05:09,845 --> 00:05:12,193 And he's seen some risks. 91 00:05:12,457 --> 00:05:15,359 His objective is to get the cheese and stay alive. 92 00:05:15,938 --> 00:05:18,817 And so, one of the risks he has identified is a bad thing 93 00:05:18,903 --> 00:05:21,353 that might happen: he might be killed or injured.
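The two questions above, "how uncertain is it?" and "how much does it matter?", can be sketched numerically. This is a minimal illustration, not the speaker's method: the 0-to-1 probability scale and 0-to-10 impact scale are my assumptions.

```python
# Sketch of sizing a risk by its two dimensions.
# Assumed scales (not from the talk): probability 0-1, impact 0-10;
# their product gives a comparable "how big is the risk" score.

def risk_score(probability: float, impact: float) -> float:
    """Combine 'how uncertain is it?' with 'how much does it matter?'."""
    if not 0.0 <= probability <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    return probability * impact

# A moderately likely, high-impact risk outranks a near-certain, minor one.
print(risk_score(0.4, 8.0))   # 3.2
print(risk_score(0.9, 1.0))   # 0.9
```

The same score works for opportunities if impact is scored as a benefit rather than a loss.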
94 00:05:22,177 --> 00:05:24,517 And so, he has been a good project manager, 95 00:05:24,517 --> 00:05:27,032 he has put his little helmet on, and he is preparing 96 00:05:27,152 --> 00:05:32,051 so that it doesn't happen to him. So, he doesn't get killed or injured. 97 00:05:32,051 --> 00:05:32,821 Very good. 98 00:05:33,690 --> 00:05:36,560 And there are things in our projects, that if they happened 99 00:05:36,560 --> 00:05:37,875 would kill or injure us. 100 00:05:37,875 --> 00:05:39,218 They would waste time, 101 00:05:39,218 --> 00:05:41,566 waste money, damage reputation, 102 00:05:41,566 --> 00:05:42,986 destroy performance, 103 00:05:43,202 --> 00:05:45,732 maybe even injure real people. 104 00:05:46,230 --> 00:05:49,880 And as project managers we have to see those things and stop them happening. 105 00:05:49,957 --> 00:05:51,792 Protect ourselves in advance. 106 00:05:51,857 --> 00:05:52,867 Avoid them. 107 00:05:54,240 --> 00:05:57,870 Are there any other uncertainties that matter for the mouse? 108 00:05:59,637 --> 00:06:01,327 Well there is... 109 00:06:01,327 --> 00:06:02,275 the cheese. 110 00:06:02,678 --> 00:06:05,558 There's an uncertainty here which matters a great deal. 111 00:06:05,558 --> 00:06:07,887 "Will I get the cheese out of the trap?" 112 00:06:08,733 --> 00:06:10,429 He might, or he might not. 113 00:06:10,967 --> 00:06:14,121 And if he doesn't get the cheese out of the trap, he's failed 114 00:06:14,961 --> 00:06:17,471 So he has two uncertainties to manage, 115 00:06:17,471 --> 00:06:20,133 one of them is bad - he might be killed or injured - 116 00:06:20,419 --> 00:06:22,639 the other is good - he might get the cheese. 117 00:06:23,129 --> 00:06:24,616 And what he has to do, 118 00:06:24,968 --> 00:06:28,958 what he has to do is to manage both of these at the same time. 119 00:06:29,232 --> 00:06:32,159 And as project managers, we have to do the same thing. 
120 00:06:32,788 --> 00:06:36,128 And also we have to do it in the best possible way - 121 00:06:36,128 --> 00:06:40,546 sometimes there's a better way to get the cheese without being killed or injured. 122 00:06:41,428 --> 00:06:44,518 In our projects, we have to stop the bad things happening, 123 00:06:44,889 --> 00:06:47,789 but we also have to get the cheese out of our projects. 124 00:06:49,116 --> 00:06:52,116 "So what does 'cheese' mean, in your project?" 125 00:06:52,116 --> 00:06:54,302 "What is the 'cheese' in your project?" 126 00:06:55,160 --> 00:06:56,640 'Cheese' means value. 127 00:06:56,806 --> 00:06:58,445 'Cheese' means benefits. 128 00:06:58,644 --> 00:07:01,868 'Cheese' means products and services that people want and need. 129 00:07:02,170 --> 00:07:04,290 'Cheese' means customer satisfaction. 130 00:07:04,371 --> 00:07:06,802 'Cheese' is the good stuff that we're trying to get 131 00:07:06,802 --> 00:07:08,465 out of our difficult projects. 132 00:07:08,872 --> 00:07:11,962 And if we don't do anything bad - 133 00:07:11,962 --> 00:07:16,170 we don't waste time, we don't waste money, we don't damage reputation - 134 00:07:16,170 --> 00:07:17,954 but we don't create value, 135 00:07:18,278 --> 00:07:19,338 we've failed. 136 00:07:19,683 --> 00:07:23,183 If the mouse didn't die but he didn't get the cheese, he failed. 137 00:07:23,950 --> 00:07:28,353 If we create benefits, but we waste time and waste money and destroy reputation, 138 00:07:28,353 --> 00:07:29,369 we've failed. 139 00:07:30,015 --> 00:07:32,565 And if the mouse gets the cheese and he's killed, 140 00:07:32,724 --> 00:07:33,784 he's failed. 141 00:07:33,784 --> 00:07:36,076 So we have to do both of these things. 142 00:07:36,353 --> 00:07:39,231 And when we think about risk and think about impact, 143 00:07:39,468 --> 00:07:41,896 there are two kinds of impact that matter. 144 00:07:42,525 --> 00:07:45,175 Bad ones, and good ones. 
145 00:07:45,488 --> 00:07:48,125 Uncertainties that could hurt the project, 146 00:07:48,678 --> 00:07:51,568 and uncertainties that could help the project. 147 00:07:51,766 --> 00:07:56,364 Both of these matter and both of these need to be managed. 148 00:07:56,867 --> 00:07:59,078 And we have another word for those. 149 00:07:59,305 --> 00:08:03,791 So, here's the definition of risk from the Project Management Institute, the PMI, 150 00:08:04,007 --> 00:08:05,685 from the PMBOK Guide. 151 00:08:05,993 --> 00:08:08,093 It's the same as the others that we've seen: 152 00:08:08,093 --> 00:08:12,379 an uncertain event or condition that, if it occurs, affects an objective. 153 00:08:13,222 --> 00:08:18,534 But PMI knows about the mouse. PMI knows about the cheese and the traps, 154 00:08:18,721 --> 00:08:22,277 and has added three words to the definition of risk here. 155 00:08:23,233 --> 00:08:26,204 It's not the words 'cheese' and 'traps'. 156 00:08:26,432 --> 00:08:29,301 It's the words 'positive or negative'. 157 00:08:30,067 --> 00:08:33,984 What this tells us is that there are good risks, as well as bad risks. 158 00:08:34,544 --> 00:08:37,997 And we heard that in one of our keynote speeches, earlier this morning. 159 00:08:38,507 --> 00:08:42,907 In the uncertain situation that this country faces going forward 160 00:08:42,958 --> 00:08:46,058 with all the changes that there have been, there are threats. 161 00:08:46,058 --> 00:08:48,061 There are things that could go wrong. 162 00:08:48,061 --> 00:08:50,226 And you need to see those and address them. 163 00:08:50,407 --> 00:08:53,188 But there are also opportunities. 164 00:08:53,248 --> 00:08:56,235 Uncertain things that might happen that could be good. 165 00:08:56,918 --> 00:08:59,346 And we also need to see those things, 166 00:08:59,561 --> 00:09:02,699 and to try and proactively make them happen.
167 00:09:03,249 --> 00:09:05,376 And that is equally true in our projects, 168 00:09:05,376 --> 00:09:06,938 in our personal lives, 169 00:09:06,938 --> 00:09:09,308 and also at the national level. 170 00:09:09,778 --> 00:09:13,578 And I'll be talking about some of those things later on this afternoon. 171 00:09:14,717 --> 00:09:19,161 So, PMI has this definition. The other standards have something very similar. 172 00:09:19,541 --> 00:09:21,448 The ISO standard, at the bottom here, 173 00:09:21,448 --> 00:09:25,619 says 'risk is the effect of uncertainty on objectives.' 174 00:09:26,536 --> 00:09:29,252 Note, the effect can be positive or negative. 175 00:09:31,009 --> 00:09:34,726 And the APM, the Association for Project Management in the UK, says the same thing. 176 00:09:35,426 --> 00:09:39,954 So we have this new idea, that risk is a double-sided concept. 177 00:09:41,090 --> 00:09:43,959 And it's the same impression with the word you have for risk: 178 00:09:44,329 --> 00:09:48,002 we mostly think of bad things, but it could be used for good things 179 00:09:48,002 --> 00:09:49,597 as well. Isn't that right? 180 00:09:49,597 --> 00:09:51,384 It's an uncertain word. 181 00:09:51,875 --> 00:09:55,031 And there are good risks as well as bad risks. 182 00:09:55,991 --> 00:09:59,364 So in our project risk management process, 183 00:09:59,574 --> 00:10:03,181 we should be looking out for the traps and avoiding them 184 00:10:03,181 --> 00:10:06,004 and protecting ourselves and preventing them happening. 185 00:10:06,004 --> 00:10:09,147 But we should also be looking out for the cheese 186 00:10:09,147 --> 00:10:11,549 and chasing it, and making it happen proactively, 187 00:10:11,549 --> 00:10:14,897 so we get the maximum benefit for the minimum cost. 188 00:10:15,957 --> 00:10:19,375 That's why risk management is so important to 189 00:10:19,712 --> 00:10:22,676 project success: because it affects our objectives.
190 00:10:23,817 --> 00:10:27,426 It gives us the best possible chance to achieve our goals. 191 00:10:28,570 --> 00:10:30,290 So how do we do that? 192 00:10:30,621 --> 00:10:33,104 If we think about the risk management process, 193 00:10:33,539 --> 00:10:35,899 the process has to do a number of things. 194 00:10:36,524 --> 00:10:39,564 If risk is uncertainty that affects objectives, 195 00:10:39,633 --> 00:10:41,753 we have to know what our objectives are. 196 00:10:42,076 --> 00:10:44,416 Then, we have to identify the uncertainties. 197 00:10:45,099 --> 00:10:48,459 The uncertainties that would matter to those objectives. 198 00:10:48,618 --> 00:10:53,058 And remember that they could be good or bad, threats and opportunities. 199 00:10:53,897 --> 00:10:56,747 That gives us a long list of uncertainties that matter, 200 00:10:56,777 --> 00:10:58,477 but they don't all matter the same. 201 00:10:59,218 --> 00:11:03,528 So the next thing we have to do is to prioritize, and ask the question 202 00:11:04,287 --> 00:11:06,837 "How uncertain, and how much does it matter?" 203 00:11:07,053 --> 00:11:10,133 Then we get a prioritized list of risks. 204 00:11:10,210 --> 00:11:14,390 We know which are the worst threats and the best opportunities, 205 00:11:14,811 --> 00:11:17,061 so that we can do something about them. 206 00:11:17,508 --> 00:11:19,384 Then we plan how to respond. 207 00:11:19,411 --> 00:11:23,294 We think about what would be appropriate to stop the bad thing happening 208 00:11:23,344 --> 00:11:24,974 and to make the good thing happen. 209 00:11:25,583 --> 00:11:28,920 And having decided, we do it of course. 210 00:11:29,851 --> 00:11:33,641 And then, because risk is constantly changing, we need to come back and do it again, 211 00:11:33,873 --> 00:11:35,923 and see what has changed. 212 00:11:36,518 --> 00:11:42,048 We could express this process as a number of questions that it's important to ask, 213 00:11:42,184 --> 00:11:45,604 and keep on asking about our project.
214 00:11:46,337 --> 00:11:49,897 In fact, you can use these questions for anything. 215 00:11:50,106 --> 00:11:54,076 You could use these questions for your next career move. 216 00:11:54,524 --> 00:11:58,914 You could use these questions for deciding about your pension. 217 00:11:59,141 --> 00:12:03,691 You could use these questions to decide how to bring up your children 218 00:12:04,323 --> 00:12:09,183 or to decide on how to invest the nation's wealth. 219 00:12:09,996 --> 00:12:11,759 These are the questions: 220 00:12:11,799 --> 00:12:15,409 "What are we trying to achieve?" That's setting objectives. 221 00:12:15,979 --> 00:12:18,439 Then, "what could affect us in achieving that?" 222 00:12:18,534 --> 00:12:20,574 That's identifying risks. 223 00:12:20,915 --> 00:12:24,455 Then, "when we have a list of risks, which are the most important ones?" 224 00:12:24,742 --> 00:12:27,392 That's prioritizing, that's assessing the risks. 225 00:12:27,674 --> 00:12:29,814 Then, "what could we do about it?" 226 00:12:30,242 --> 00:12:34,402 Planning our responses and doing it, implementing the responses. 227 00:12:34,846 --> 00:12:39,166 And then, "did it work, and what's changed?" That's reviewing the risks. 228 00:12:39,306 --> 00:12:43,766 So if we look at a risk management process, we could link each step in the 229 00:12:43,800 --> 00:12:46,930 process to one of these questions. 230 00:12:47,093 --> 00:12:49,773 And this is why risk management is so easy, 231 00:12:49,876 --> 00:12:56,116 because all we're doing is asking and answering obvious questions. 232 00:12:56,297 --> 00:13:01,117 Anybody who's doing anything important will ask these questions: 233 00:13:01,320 --> 00:13:03,820 "What am I trying to do?" "What could affect me?" 234 00:13:03,896 --> 00:13:06,462 "Which are the big ones?" "What shall I do about it?" 235 00:13:06,473 --> 00:13:08,983 "Did that work?" "Now what?"
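The "which are the most important ones?" step can be sketched as a simple sort over the two dimensions discussed earlier. This is an illustration only: the risks, probabilities, impacts, and scales below are hypothetical, not from the talk.

```python
# Illustrative sketch of prioritizing a risk list (hypothetical data).
# Assumed scales: probability 0-1, impact 0-10. Sorting by their
# product puts the worst threats (or best opportunities) first.

risks = [
    {"name": "contractor may not deliver on time", "probability": 0.5, "impact": 7},
    {"name": "interest rates might go down",       "probability": 0.3, "impact": 2},
    {"name": "weather might be better than usual", "probability": 0.6, "impact": 4},
]

prioritized = sorted(risks, key=lambda r: r["probability"] * r["impact"],
                     reverse=True)

for r in prioritized:
    print(f'{r["probability"] * r["impact"]:4.1f}  {r["name"]}')
```

The same list, re-scored at each review, answers "did it work, and what's changed?".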
236 00:13:09,703 --> 00:13:14,383 And you could ask those questions every Monday morning when you drive to work, 237 00:13:14,402 --> 00:13:16,032 or every Saturday morning. 238 00:13:16,032 --> 00:13:17,786 You can ask the questions, say: 239 00:13:17,791 --> 00:13:20,831 "What am I trying to achieve today?" "This week?" 240 00:13:21,286 --> 00:13:23,636 "What could affect me and which are the big ones?" 241 00:13:23,743 --> 00:13:24,835 "What shall I do?" 242 00:13:24,871 --> 00:13:30,351 We can manage risk on a very simple basis, or we can use this as the structure for 243 00:13:30,394 --> 00:13:35,114 a risk process which is much more complex, which involves lots of meetings, 244 00:13:35,139 --> 00:13:39,079 and lots of stakeholder groups and lots of analysis and statistics. 245 00:13:39,098 --> 00:13:41,188 It's the same questions. 246 00:13:41,909 --> 00:13:45,306 So I would like you to remember two important things. 247 00:13:45,319 --> 00:13:49,229 One is, risk is uncertainty that matters. 248 00:13:49,554 --> 00:13:53,854 And secondly, these questions, these six questions. 249 00:13:54,535 --> 00:13:58,235 Because that's the heart, that's the basis of managing risk, 250 00:13:58,245 --> 00:14:00,761 and it really is very, very easy. 251 00:14:01,402 --> 00:14:06,197 Now, in the time that we have, I want to focus on just two parts of this process, 252 00:14:06,351 --> 00:14:10,591 and then give us the opportunity to try out some of these things. 253 00:14:10,618 --> 00:14:14,308 The identification step is clearly very, very important, 254 00:14:14,344 --> 00:14:18,494 because if we don't identify the risks, we can't manage them. 255 00:14:18,977 --> 00:14:21,617 And then planning responses. 256 00:14:21,880 --> 00:14:26,310 Understanding how we can deal with the uncertainties that we've identified. 257 00:14:26,590 --> 00:14:30,190 So, let's think about these things: identifying risks. 258 00:14:30,363 --> 00:14:32,493 How do we find all of the risks?
259 00:14:33,496 --> 00:14:35,016 Well, you can't. 260 00:14:35,016 --> 00:14:38,782 You can't find all of the risks because there are risks that arrive 261 00:14:38,803 --> 00:14:40,563 that we hadn't seen before. 262 00:14:40,563 --> 00:14:44,386 There are emergent risks, new risks, different risks, 263 00:14:44,607 --> 00:14:48,907 and I'll be talking about those later this afternoon in my speech. 264 00:14:49,088 --> 00:14:54,848 What we want to find are the knowable risks: the risks that we could find. 265 00:14:55,249 --> 00:14:58,609 We don't want somebody on our project team who knows a risk 266 00:14:58,609 --> 00:15:00,192 and they're not telling anybody. 267 00:15:00,235 --> 00:15:04,525 So this process is about exposing the uncertainties that matter, 268 00:15:04,533 --> 00:15:06,773 finding them so we can do something about them. 269 00:15:06,888 --> 00:15:08,708 And there are lots of techniques: 270 00:15:08,708 --> 00:15:11,724 brainstorming, workshops, checklists, 271 00:15:11,724 --> 00:15:15,144 testing our assumptions and so on. 272 00:15:15,398 --> 00:15:18,028 But I would like to answer a bigger question, 273 00:15:18,262 --> 00:15:20,439 a different question from techniques. 274 00:15:21,479 --> 00:15:24,173 And it's the question, "are we finding the real risks?" 275 00:15:25,393 --> 00:15:29,787 When you go to a risk workshop and you write things in your risk register, 276 00:15:30,007 --> 00:15:33,724 are they really the uncertainties that matter for your project? 277 00:15:34,434 --> 00:15:39,874 Are these really the things that could drive you off track or really help you? 278 00:15:40,249 --> 00:15:42,419 Or are they just the obvious things? 279 00:15:42,419 --> 00:15:45,954 All projects have problems with requirements, 280 00:15:45,954 --> 00:15:49,541 with resources, with testing. These are things that 281 00:15:49,541 --> 00:15:52,625 always come up, and we have processes to deal with them.
282 00:15:53,733 --> 00:15:55,754 But are they the real risks? 283 00:15:56,034 --> 00:15:59,014 I would like to suggest to you that often in our risk registers 284 00:15:59,014 --> 00:16:02,018 we confuse real risks with other things. 285 00:16:03,021 --> 00:16:07,725 Often, we confuse risks with their causes: where does the risk come from? 286 00:16:08,555 --> 00:16:13,125 Or we confuse risks with their effects: what do they do if they happen? 287 00:16:14,005 --> 00:16:16,525 But risks are uncertainties that matter. 288 00:16:17,565 --> 00:16:20,405 They are not causes or effects. 289 00:16:20,470 --> 00:16:22,767 So causes are things that are true. 290 00:16:23,252 --> 00:16:25,743 It is true that the project is difficult, 291 00:16:26,237 --> 00:16:29,370 it is true that we do not have enough people on the project, 292 00:16:29,730 --> 00:16:33,233 it is true that the customer hasn't signed the contract yet. 293 00:16:33,553 --> 00:16:36,631 These are not risks, they are facts. 294 00:16:36,786 --> 00:16:38,426 They might be issues. 295 00:16:38,426 --> 00:16:42,164 They might be problems, but they are not risks because they are not uncertain. 296 00:16:42,607 --> 00:16:45,437 And a lot of people write these things in their risk registers. 297 00:16:45,437 --> 00:16:47,589 "We don't have enough time for this project." 298 00:16:47,589 --> 00:16:48,819 "It's a risk!" 299 00:16:48,819 --> 00:16:51,360 No, it's a problem. 300 00:16:52,236 --> 00:16:55,164 Sometimes we confuse risks with their effects. 301 00:16:55,335 --> 00:16:58,635 There could be an accident, we could be late. 302 00:16:58,835 --> 00:17:02,204 Those are not risks either, they are the effects of risks. 303 00:17:02,204 --> 00:17:06,941 How do you manage "we could be late"? If you're late, it's too late. 304 00:17:07,264 --> 00:17:10,547 What we want to know is, why might you be late? 305 00:17:10,777 --> 00:17:15,230 What unplanned thing could happen that would result in you being late?
306 00:17:15,542 --> 00:17:19,439 So, risks sit between causes and effects. 307 00:17:19,939 --> 00:17:23,673 We can't manage causes because they're here now, they're facts. 308 00:17:24,193 --> 00:17:27,339 We don't want to manage effects because they may never happen. 309 00:17:27,799 --> 00:17:30,857 What we can manage is risks that sit in the middle 310 00:17:30,857 --> 00:17:33,322 because they haven't happened yet. 311 00:17:34,282 --> 00:17:38,028 So, risk management has to separate risks from 312 00:17:38,028 --> 00:17:41,258 their causes and risks from their effects. 313 00:17:42,108 --> 00:17:46,090 And here's what I find, looking at hundreds of risk registers all around the world. 314 00:17:46,920 --> 00:17:51,758 I've worked in 48 different countries, every continent, every culture. 315 00:17:52,108 --> 00:17:56,480 Uh, not the Antarctic, it's too cold. Um, but nearly every continent. 316 00:17:56,670 --> 00:18:02,360 And over half of the stuff in risk registers is causes or effects. 317 00:18:02,690 --> 00:18:03,707 Over half. 318 00:18:04,233 --> 00:18:07,063 So the things we are trying to manage in the risk register 319 00:18:07,103 --> 00:18:10,486 are not risks, and then people are surprised that it doesn't work. 320 00:18:11,506 --> 00:18:16,153 So how do we separate cause, risk, and effect? Here is a little test. 321 00:18:17,013 --> 00:18:19,838 And these statements are written in your notes. 322 00:18:20,028 --> 00:18:22,299 Or you can just think as we go. 323 00:18:22,299 --> 00:18:26,080 Each of these statements, and they are all very simple, is one of these things. 324 00:18:26,120 --> 00:18:28,436 A cause is something that is true today. 325 00:18:28,996 --> 00:18:31,730 A risk is an uncertainty that might, or might not, happen. 326 00:18:32,220 --> 00:18:35,120 The effect is why it matters to our objectives. 327 00:18:36,167 --> 00:18:38,557 Okay? So you have to think what these are.
328 00:18:38,987 --> 00:18:41,806 The project is based in a third-world country. 329 00:18:41,906 --> 00:18:44,055 Cause? Risk? Or effect? What do you think? 330 00:18:44,695 --> 00:18:46,139 Cause! Very good. 331 00:18:46,309 --> 00:18:50,130 So, this is a fact; there might be uncertainties that come out of this fact. 332 00:18:50,340 --> 00:18:54,924 So we may not get the resources we need, there may be security concerns. 333 00:18:55,364 --> 00:18:59,907 We may not get paid. These are uncertainties that come from this fact. 334 00:19:01,107 --> 00:19:03,516 Interest rates might go down. 335 00:19:03,816 --> 00:19:04,816 It's a risk. 336 00:19:04,826 --> 00:19:07,094 Or they could stay the same or they could go up. 337 00:19:07,094 --> 00:19:09,692 And we could go over budget. 338 00:19:10,013 --> 00:19:11,302 It's an effect. 339 00:19:11,332 --> 00:19:13,622 So, a million things could take you over budget, 340 00:19:13,622 --> 00:19:15,393 maybe interest rates are one of them. 341 00:19:15,393 --> 00:19:17,200 Okay? They were easy. How about this? 342 00:19:17,200 --> 00:19:19,796 The weather might be better than usual. 343 00:19:20,056 --> 00:19:21,756 It's a risk; it could also be the same or worse. 344 00:19:22,806 --> 00:19:26,270 It would be a bad thing if you were selling umbrellas. 345 00:19:27,250 --> 00:19:30,290 It would be a good thing if you were selling ice cream. 346 00:19:31,220 --> 00:19:32,987 It depends what your project is. 347 00:19:34,007 --> 00:19:36,454 Um, I'm allergic to prawns. 348 00:19:38,234 --> 00:19:39,920 It's a cause, it's a fact. 349 00:19:39,920 --> 00:19:44,353 What is the risk that comes from this fact, this cause? 350 00:19:47,818 --> 00:19:49,846 You think maybe I could be sick? 351 00:19:50,236 --> 00:19:53,180 I could have a reaction. I could be very ill. I could die. 352 00:19:54,640 --> 00:19:57,948 All of those things are effects. Aren't they?
353 00:19:58,578 --> 00:20:00,864 But if something happens that I didn't plan, 354 00:20:00,864 --> 00:20:03,911 because I am allergic, something might happen that makes me sick. 355 00:20:04,621 --> 00:20:06,532 What's the something? 356 00:20:07,342 --> 00:20:09,264 I might eat prawns without knowing. 357 00:20:09,614 --> 00:20:14,421 So then I check: are there prawns in this? You know, I avoid things with prawns in them. 358 00:20:14,741 --> 00:20:18,247 I manage the risk and not the effect. And not the cause. 359 00:20:18,354 --> 00:20:21,684 Okay: we have got to use a new technique, an unproven technique. 360 00:20:23,144 --> 00:20:25,725 It's a fact, it's a requirement, we have to do it. 361 00:20:25,725 --> 00:20:28,843 We might introduce design errors, but it just is a fact, 362 00:20:28,843 --> 00:20:30,593 a requirement of our project. 363 00:20:30,593 --> 00:20:33,375 "The contractor may not deliver on time" is a risk. 364 00:20:34,155 --> 00:20:36,005 Um, this is going too fast. 365 00:20:36,365 --> 00:20:38,517 It might not work for some reason. 366 00:20:39,197 --> 00:20:40,857 You saw the color, it's an effect. 367 00:20:40,857 --> 00:20:44,759 Okay, I will go more slowly. Uh, we don't have enough people. 368 00:20:47,039 --> 00:20:50,942 It's a cause, yes. And lastly, there's a risk that we'll be late. 369 00:20:53,781 --> 00:20:57,229 Hmm...mm. It's an effect, is it? 370 00:20:58,069 --> 00:21:01,112 Because we want to know: what is the risk that we'll be late? 371 00:21:01,502 --> 00:21:03,133 Being late is an effect. 372 00:21:03,423 --> 00:21:08,619 So apart from the prawns, all of the blue and green things are things we see in risk registers: 373 00:21:09,743 --> 00:21:16,568 the project environment, new technology, lack of resources; or going over budget, 374 00:21:16,568 --> 00:21:20,890 lack of performance, delivering late. These are not risks. 375 00:21:20,980 --> 00:21:22,610 These are causes or effects.
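The little test above can be written down as data. This sketch simply encodes the classifications given in the talk; the code structure and the filtering step are my illustration, not the speaker's tooling.

```python
# The talk's cause/risk/effect test, encoded as data. The labels are
# the answers given in the talk; only the "risk" entries belong in a
# risk register (causes are facts, effects are why the risks matter).

statements = {
    "The project is based in a third-world country": "cause",
    "Interest rates might go down":                  "risk",
    "We could go over budget":                       "effect",
    "The weather might be better than usual":        "risk",
    "I'm allergic to prawns":                        "cause",
    "We have to use a new, unproven technique":      "cause",
    "The contractor may not deliver on time":        "risk",
    "It might not work for some reason":             "effect",
    "We don't have enough people":                   "cause",
    "There's a risk that we'll be late":             "effect",
}

register = [s for s, kind in statements.items() if kind == "risk"]
print(register)
```

Run against a real register, this kind of triage is what reveals that over half of the entries are causes or effects.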
376 00:21:23,549 --> 00:21:28,020 And if we looked at a real risk register (and this is written in your notes for you, 377 00:21:28,019 --> 00:21:31,116 if you want to do this afterward), we could do another exercise. 378 00:21:32,016 --> 00:21:35,891 In fact, the next page of the notes, if you turn over the page, 379 00:21:35,891 --> 00:21:38,511 has these written a bit larger for you. 380 00:21:39,010 --> 00:21:40,495 English only, I'm afraid. 381 00:21:40,588 --> 00:21:42,758 We'll have to do something about that. 382 00:21:43,588 --> 00:21:47,808 Um. You could just try this little exercise on a real risk register. 383 00:21:49,153 --> 00:21:52,889 This is one of my clients; I asked them for their top 10 risks. 384 00:21:52,889 --> 00:21:54,132 This is what they gave me. 385 00:21:54,132 --> 00:21:57,059 They're not risks. They're all sorts of things mixed up. 386 00:21:58,496 --> 00:22:01,476 Really, you should do this on your risk register. 387 00:22:02,355 --> 00:22:06,205 But let me show you what happened when I did this on their risk register. 388 00:22:06,699 --> 00:22:09,411 I found there was a whole mixture of things. 389 00:22:10,033 --> 00:22:14,333 So, the current hardware is not fast enough to support testing. That's a fact. 390 00:22:15,333 --> 00:22:16,163 It's a cause. 391 00:22:17,019 --> 00:22:20,669 This means that we may be unable to test performance 392 00:22:20,669 --> 00:22:22,862 until production hardware is used. 393 00:22:23,521 --> 00:22:24,711 That's the risk. 394 00:22:25,518 --> 00:22:27,767 So we have two things in this statement. 395 00:22:28,749 --> 00:22:31,130 The next one down is just a fact. 396 00:22:31,467 --> 00:22:35,467 A number of usability issues have been identified by the supplier. 397 00:22:36,273 --> 00:22:37,872 Okay, so what? 398 00:22:38,638 --> 00:22:40,218 What difference does that make? 399 00:22:40,977 --> 00:22:44,560 Let me color-code this for you. Just to be slightly friendly.
400 00:22:44,560 --> 00:22:45,330 Umm. 401 00:22:45,330 --> 00:22:49,102 But you will have to do it on your own if you want to try the complete exercise. 402 00:22:49,523 --> 00:22:53,863 Umm. There is a whole range of different things in this so-called risk register. 403 00:22:54,543 --> 00:22:57,481 And I would expect that yours is the same: 404 00:22:57,916 --> 00:23:02,192 that you'll have things in your risk register that are just pure facts, 405 00:23:02,473 --> 00:23:05,824 or things that are a mixture of risks and other things. 406 00:23:06,997 --> 00:23:10,997 Now, there are two in this list that I think are particularly interesting. 407 00:23:11,444 --> 00:23:14,645 It's this one and this one. They have all three colors in them, 408 00:23:15,708 --> 00:23:18,207 because they have a cause and a risk and an effect. 409 00:23:19,518 --> 00:23:23,518 So, let's take this one. "The team does not have a documented design 410 00:23:24,109 --> 00:23:26,438 for this function." That's a fact. 411 00:23:26,944 --> 00:23:27,734 So what? 412 00:23:28,042 --> 00:23:30,659 Well, there's the risk that the architecture 413 00:23:30,659 --> 00:23:33,769 may not support the required functionality. 414 00:23:33,769 --> 00:23:37,467 That might happen because we don't have a documented design. 415 00:23:38,177 --> 00:23:39,534 Why do we care about that? 416 00:23:39,534 --> 00:23:43,543 If that happens, it results in the requirements not being met, 417 00:23:43,608 --> 00:23:45,648 or a higher number of defects. 418 00:23:45,648 --> 00:23:49,161 That hits our performance objective and our quality objective. 419 00:23:49,958 --> 00:23:53,633 So, now we have three things: we know what the risk is. 420 00:23:54,344 --> 00:23:57,784 The risk is that the architecture might not support the functionality. 421 00:23:58,875 --> 00:24:02,498 We know why that's happening, because we don't have a documented design.
422 00:24:02,498 --> 00:24:06,768 And we know how it could affect the project, in not meeting the requirements, 423 00:24:07,226 --> 00:24:09,326 or delivering defects. 424 00:24:10,261 --> 00:24:12,329 Those are really useful things to know. 425 00:24:13,011 --> 00:24:17,961 And it would be helpful if every risk description had those three things in it. 426 00:24:18,257 --> 00:24:22,076 And so what we recommend is a structured description of risk 427 00:24:22,076 --> 00:24:24,014 that has three parts to it, 428 00:24:24,789 --> 00:24:31,389 that says: "as a result of" some fact, a cause, an uncertainty might occur. 429 00:24:31,886 --> 00:24:33,516 It might not, but it might. 430 00:24:33,855 --> 00:24:35,825 And if it did, it would be a risk. 431 00:24:37,614 --> 00:24:40,623 And if that thing actually happened, it would lead to 432 00:24:40,880 --> 00:24:42,647 an effect on the objectives. 433 00:24:43,252 --> 00:24:48,816 And we recommend, and PMI recommends, and the ISO standard recommends, 434 00:24:49,367 --> 00:24:51,267 and best practice guidelines recommend, 435 00:24:51,377 --> 00:24:53,967 that you describe your risk in these three stages: 436 00:24:53,967 --> 00:24:57,967 what do we know, what uncertainty does that give us, and why does it matter? 437 00:24:59,118 --> 00:25:02,568 And then we can use it to help us manage the risk. 438 00:25:03,769 --> 00:25:08,201 In English, we have definite words to describe facts: 439 00:25:08,201 --> 00:25:11,832 this is true, this has happened, this does occur. 440 00:25:13,088 --> 00:25:17,343 We have uncertain words to describe the risk: it might or it might not; 441 00:25:17,343 --> 00:25:18,523 it's possible. 442 00:25:19,284 --> 00:25:23,134 And then we have conditional words that say this would follow 443 00:25:23,134 --> 00:25:24,724 if the risk occurred. 444 00:25:25,481 --> 00:25:27,631 Maybe your language is a little different.
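The three-part description above ("as a result of <cause>, <risk> may occur, leading to <effect on objectives>") can be sketched as a tiny data structure. This is a hypothetical illustration of the idea, not something from the talk; the class name, field names, and example wording are mine.

```python
from dataclasses import dataclass

@dataclass
class RiskStatement:
    """One risk register entry split into the three parts of the talk:
    a definite cause, an uncertain risk, and a conditional effect."""
    cause: str   # a fact: "this is true, this has happened"
    risk: str    # an uncertainty: "this might or might not occur"
    effect: str  # conditional: "this would follow if the risk occurred"

    def describe(self) -> str:
        # Render the structured three-stage description as one sentence.
        return (f"As a result of {self.cause}, {self.risk} may occur, "
                f"which would lead to {self.effect}.")

# The documented-design example from the talk, rephrased to fit the template:
entry = RiskStatement(
    cause="the team not having a documented design for this function",
    risk="an architecture that does not support the required functionality",
    effect="requirements not being met, or a higher number of defects",
)
print(entry.describe())
```

Keeping the three parts as separate fields (rather than one free-text string) is what makes the later steps possible: responses can target the cause, the probability of the risk, or the size of the effect individually.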
445 00:25:27,631 --> 00:25:30,547 But we can use the language to help us, perhaps. 446 00:25:31,465 --> 00:25:33,355 So one of the things I'd like us to try, 447 00:25:33,444 --> 00:25:36,224 in the short exercise we're going to do in a moment, 448 00:25:36,784 --> 00:25:41,474 is to try describing risks in that three-part way. 449 00:25:41,579 --> 00:25:42,759 What do we know? 450 00:25:43,134 --> 00:25:45,029 What uncertainty does it give us? 451 00:25:45,337 --> 00:25:47,507 And why does that matter to our objectives? 452 00:25:47,857 --> 00:25:50,933 And I would recommend that you try that for your own 453 00:25:50,933 --> 00:25:55,113 real risk register on your project, and see what difference it makes. 454 00:25:56,072 --> 00:25:57,532 You might be surprised. 455 00:25:58,184 --> 00:26:02,554 Now, let's think about the next question. 456 00:26:02,554 --> 00:26:04,094 Well, there is another question, 457 00:26:04,094 --> 00:26:05,494 "How do we prioritize them?" 458 00:26:05,494 --> 00:26:08,015 But the one I want to focus on is, "What could we do 459 00:26:08,015 --> 00:26:09,938 about the risks that we've identified?" 460 00:26:09,938 --> 00:26:12,278 Planning risk responses. 461 00:26:12,781 --> 00:26:14,790 Here are the questions we need to ask. 462 00:26:15,268 --> 00:26:18,018 What we're going to do depends on the risk: 463 00:26:18,489 --> 00:26:19,779 how manageable it is; 464 00:26:20,626 --> 00:26:23,841 how bad or good it might be if we left it alone, 465 00:26:24,057 --> 00:26:25,677 the impact severity; 466 00:26:26,227 --> 00:26:29,575 whether we have the people and the equipment and the skills 467 00:26:29,689 --> 00:26:30,719 to deal with it, 468 00:26:30,931 --> 00:26:32,581 the resource availability; 469 00:26:33,234 --> 00:26:34,674 and cost effectiveness. 470 00:26:34,864 --> 00:26:37,614 Can we spend a small amount to save a big amount? 471 00:26:38,486 --> 00:26:42,156 We don't want to spend a big amount to save a small amount.
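The cost-effectiveness question ("spend a small amount to save a big amount") can be sketched as a simple expected-value check. This is my own minimal sketch, not a method from the talk: the function name is invented, and it assumes exposure is probability times impact, ignoring the other selection criteria (manageability, resource availability).

```python
def response_is_cost_effective(probability: float, impact: float,
                               residual_probability: float,
                               residual_impact: float,
                               response_cost: float) -> bool:
    """Return True if the response costs less than the exposure it removes."""
    exposure_before = probability * impact                    # expected loss if we do nothing
    exposure_after = residual_probability * residual_impact   # expected loss after responding
    saving = exposure_before - exposure_after                 # the "big amount" we hope to save
    return response_cost < saving

# Spending 5 to cut a 50% chance of a 100-unit loss down to a 10% chance:
print(response_is_cost_effective(0.5, 100, 0.1, 100, 5))    # worthwhile: 5 < 40
# Spending 20 to cut the same risk only down to a 40% chance:
print(response_is_cost_effective(0.5, 100, 0.4, 100, 20))   # not worthwhile: 20 > 10
```

The same check works unchanged for opportunities if impact is treated as a gain rather than a loss, which anticipates the threat/opportunity symmetry discussed later in the talk.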
472 00:26:42,892 --> 00:26:44,812 And the next important question is, 473 00:26:44,848 --> 00:26:46,248 "Who is going to do this?" 474 00:26:47,681 --> 00:26:50,531 What could we do to deal with a risk? 475 00:26:51,787 --> 00:26:54,127 Often, people think of four things. 476 00:26:54,127 --> 00:26:57,733 Four different types of things we could do to address 477 00:26:57,733 --> 00:26:59,209 uncertainties that matter. 478 00:26:59,690 --> 00:27:02,231 And each of these has a name. 479 00:27:02,804 --> 00:27:05,884 It's a strategy. A strategy to focus our planning. 480 00:27:06,804 --> 00:27:08,734 To focus our thinking. 481 00:27:08,734 --> 00:27:13,689 And then, once we've focused our thinking with a strategy, we can develop tactics 482 00:27:13,740 --> 00:27:15,693 to address each individual risk. 483 00:27:15,783 --> 00:27:18,403 So, what are the four things that most people think of? 484 00:27:18,752 --> 00:27:20,842 The first is risk avoidance. 485 00:27:21,237 --> 00:27:25,237 Is there something we can do to kill the risk, to remove it altogether? 486 00:27:26,491 --> 00:27:29,471 The second is something we call risk transfer. 487 00:27:30,181 --> 00:27:31,601 Can we give it away? 488 00:27:31,721 --> 00:27:34,011 Can we get somebody else to take it away for us? 489 00:27:35,726 --> 00:27:38,346 The third is what we call risk reduction. 490 00:27:38,851 --> 00:27:41,521 Some people call this risk mitigation. 491 00:27:41,880 --> 00:27:44,290 And here, we're trying to make the risk smaller 492 00:27:44,746 --> 00:27:46,096 so that we could accept it. 493 00:27:47,879 --> 00:27:52,139 And the fourth response, after avoid, transfer, or reduce, 494 00:27:52,958 --> 00:27:54,788 is the one that everyone forgets. 495 00:27:55,640 --> 00:27:57,770 They think if we can't do anything about it, 496 00:27:58,101 --> 00:28:01,091 we just have to hope and pray and wonder and wait.
497 00:28:02,633 --> 00:28:03,925 The other response is 498 00:28:04,289 --> 00:28:05,619 to take the risk. 499 00:28:05,812 --> 00:28:08,012 We call that risk acceptance: 500 00:28:08,224 --> 00:28:12,444 to recognize we're taking this risk, to include it in our baseline, 501 00:28:12,478 --> 00:28:14,559 and to monitor it very carefully. 502 00:28:15,405 --> 00:28:21,967 So, you might see those four options as quite a good set of response strategies. 503 00:28:22,640 --> 00:28:23,896 But there's a problem. 504 00:28:24,969 --> 00:28:28,689 The problem is, all these things only work for bad risks. 505 00:28:29,604 --> 00:28:31,614 What about opportunities? 506 00:28:32,088 --> 00:28:35,788 We don't want to avoid, or give away, or make smaller, 507 00:28:35,905 --> 00:28:37,445 opportunities. 508 00:28:37,656 --> 00:28:41,784 So, how do you respond if you find a good thing that might happen 509 00:28:41,784 --> 00:28:43,004 on your project? 510 00:28:43,609 --> 00:28:45,919 Do you just wait and see and hope? 511 00:28:46,641 --> 00:28:49,621 Or is there something active that we could do? 512 00:28:50,815 --> 00:28:54,895 Fortunately, there are four response strategies for opportunities 513 00:28:55,392 --> 00:28:59,623 that match the four response strategies for threats. 514 00:29:00,236 --> 00:29:01,932 So, here are the bad ones. 515 00:29:01,932 --> 00:29:03,319 Avoid a bad thing. 516 00:29:03,702 --> 00:29:05,282 Give it to someone to take away. 517 00:29:05,362 --> 00:29:06,648 Make it smaller. 518 00:29:06,740 --> 00:29:08,041 Or take the risk. 519 00:29:09,262 --> 00:29:11,552 But it's not those actions themselves; it's what we're trying to achieve: 520 00:29:12,208 --> 00:29:13,718 to remove the uncertainty, 521 00:29:13,938 --> 00:29:15,655 to get somebody else to help, 522 00:29:16,174 --> 00:29:18,014 to change the size of the risk, 523 00:29:18,228 --> 00:29:20,643 or to include it in our project plan.
524 00:29:21,654 --> 00:29:23,964 We could do all of those four things 525 00:29:24,230 --> 00:29:25,500 for opportunities. 526 00:29:26,008 --> 00:29:28,078 How do you eliminate uncertainty 527 00:29:28,383 --> 00:29:29,713 from an opportunity? 528 00:29:30,964 --> 00:29:32,114 You capture it. 529 00:29:32,520 --> 00:29:33,830 You take up a strategy 530 00:29:34,436 --> 00:29:36,456 which makes it definitely happen. 531 00:29:37,474 --> 00:29:39,864 In English, we call this "exploit". 532 00:29:40,034 --> 00:29:42,174 Exploit is the counterpart of avoid. 533 00:29:42,556 --> 00:29:45,206 For avoid, you make the probability zero: 534 00:29:45,667 --> 00:29:46,856 it can't happen. 535 00:29:47,811 --> 00:29:50,201 For a threat, it's avoid; 536 00:29:50,447 --> 00:29:52,107 for an opportunity, exploit: 537 00:29:52,422 --> 00:29:54,932 you make the probability 100%. 538 00:29:55,579 --> 00:29:57,839 It will happen. It must happen. 539 00:29:58,116 --> 00:29:59,826 So they're aggressive strategies. 540 00:30:00,005 --> 00:30:02,970 You kill the threat; you capture the opportunity. 541 00:30:03,687 --> 00:30:05,127 It's the same kind of thing. 542 00:30:05,852 --> 00:30:09,932 What could we do, instead of giving away, transferring, a threat? 543 00:30:10,301 --> 00:30:12,811 We want to involve somebody else to help us. 544 00:30:13,384 --> 00:30:15,174 We could share the opportunity. 545 00:30:15,217 --> 00:30:17,857 We could ask them to come into our project 546 00:30:17,857 --> 00:30:21,032 and be involved with us in a joint venture 547 00:30:21,085 --> 00:30:23,772 or a subcontract, or a partnership, 548 00:30:23,847 --> 00:30:27,639 where they help us to achieve this uncertainty that would help us all, 549 00:30:27,689 --> 00:30:31,123 and we give them some part of the benefit; we share the opportunity. 550 00:30:31,168 --> 00:30:35,008 How could we change the size of an opportunity? 551 00:30:35,008 --> 00:30:38,608 We don't want to reduce it, we want to enhance it.
552 00:30:38,639 --> 00:30:41,649 We want to grow it, we want to make it more likely 553 00:30:41,649 --> 00:30:45,981 and give it a bigger impact. It's the same idea, but the other way around, for the opportunity. 554 00:30:47,595 --> 00:30:50,925 And the last one: if we can't do these active things, we could just 555 00:30:50,998 --> 00:30:53,738 accept an opportunity and wait and see what happens, 556 00:30:53,898 --> 00:30:57,898 but monitor it very closely, if there's nothing else we could do. 557 00:30:58,541 --> 00:31:03,731 So what this slide tells us is that there's an equal variety 558 00:31:03,940 --> 00:31:09,480 of potential response types that we can choose between for our opportunities, 559 00:31:09,480 --> 00:31:12,880 equal to the number that we have for threats. 560 00:31:12,951 --> 00:31:16,831 You see, the secret to thinking about opportunities 561 00:31:16,938 --> 00:31:21,518 is to recognize that an opportunity is the same as a threat. 562 00:31:21,625 --> 00:31:25,775 The only difference is the sign of the impact. 563 00:31:28,206 --> 00:31:32,586 So a threat has a negative impact; an opportunity has a positive impact. 564 00:31:33,535 --> 00:31:35,695 Apart from that, they're the same. 565 00:31:36,374 --> 00:31:38,846 They are both uncertainties that matter. 566 00:31:38,936 --> 00:31:41,506 They are both things that might or might not happen 567 00:31:41,581 --> 00:31:43,151 but could affect our objectives. 568 99:59:59,999 --> 99:59:59,999 They can both be managed practically. 569 99:59:59,999 --> 99:59:59,999 They both make a difference to the chances of succeeding on our project. 570 99:59:59,999 --> 99:59:59,999 And that's why risk management should manage threats and opportunities together 571 99:59:59,999 --> 99:59:59,999 in a single process, 572 99:59:59,999 --> 99:59:59,999 because they are the same things. 573 99:59:59,999 --> 99:59:59,999 And that may be new thinking to some of you.
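The symmetry described above (an opportunity is a threat with the opposite sign of impact, and each intent has a paired strategy name) can be sketched as a small lookup table. This is my own illustration; the intent phrases are paraphrased from the talk, and the table structure is an assumption, not part of it.

```python
# Each row is one intent from the talk; the strategy name depends only on
# whether the impact is negative (threat) or positive (opportunity).
STRATEGIES = {
    "remove the uncertainty":      {"threat": "avoid",    "opportunity": "exploit"},
    "get somebody else to help":   {"threat": "transfer", "opportunity": "share"},
    "change the size of the risk": {"threat": "reduce",   "opportunity": "enhance"},
    "include it in the baseline":  {"threat": "accept",   "opportunity": "accept"},
}

def strategy_for(intent: str, impact: float) -> str:
    """Pick the strategy name from the intent and the sign of the impact."""
    kind = "threat" if impact < 0 else "opportunity"
    return STRATEGIES[intent][kind]

print(strategy_for("remove the uncertainty", -1.0))  # avoid: probability -> 0
print(strategy_for("remove the uncertainty", +1.0))  # exploit: probability -> 100%
```

One table covering both columns mirrors the talk's conclusion: threats and opportunities belong in a single process, because apart from the sign of the impact they are handled the same way.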
574 99:59:59,999 --> 99:59:59,999 Those of you who always think of risk as a big ugly thing waiting to squash you, 575 99:59:59,999 --> 99:59:59,999 or the unknown thing in the future that's going to hurt you. 576 99:59:59,999 --> 99:59:59,999 There are some like that, and we need to stop them happening to protect ourselves. 577 99:59:59,999 --> 99:59:59,999 But there are also some very good things out there which might happen, 578 99:59:59,999 --> 99:59:59,999 which we need to see, and we need to chase them 579 99:59:59,999 --> 99:59:59,999 and make them happen, so that our projects can be more successful. 580 99:59:59,999 --> 99:59:59,999 There is another response strategy that we might try, 581 99:59:59,999 --> 99:59:59,999 which is really not recommended: just pretending that there are no risks, 582 99:59:59,999 --> 99:59:59,999 hiding away and saying maybe it will never happen. 583 99:59:59,999 --> 99:59:59,999 We really don't recommend this at all. 584 99:59:59,999 --> 99:59:59,999 In fact, it's probably true that ostriches don't bury their heads in the sand. 585 99:59:59,999 --> 99:59:59,999 It's just the way that the sand dunes kind of look; it's just a pretend story. 586 99:59:59,999 --> 99:59:59,999 And sadly for project managers, it's not a pretend story. 587 99:59:59,999 --> 99:59:59,999 We hide our heads and we pretend there are no risks, 588 99:59:59,999 --> 99:59:59,999 and then we get surprised. 589 99:59:59,999 --> 99:59:59,999 Well, we can do better than that. 590 99:59:59,999 --> 99:59:59,999 So, let me just finish here just to say we do need to do something about this.