Okay, let's have a look at risk management in practice. What I want to do is to start with some basic concepts, and then focus on two difficult areas in the risk process.

So, I guess if I asked you to define the word 'risk', you would have some idea of what it meant. We might not have a formal definition that we could quote, but we all have something in our minds when we hear the word 'risk'. Maybe you think of things like this. Maybe you feel like this little guy, facing some big ugly challenge that you know is just going to squash you flat. Maybe you feel like this guy. This is a real job in North Korea, and his job is to hold the target for other people to shoot at. Sometimes project managers hold the target; we feel like everybody is shooting at us in our job. Or maybe you just know there's something nasty out there, waiting to get you. And maybe that's what you think of when you think of the word 'risk'.

Well, that's partly true, but it's not the whole truth. Risk is not the same as uncertainty. Risk is related to uncertainty, but they're different. All risks are uncertain, but not all uncertainties are risks. If you have a risk register or a risk list, you don't have a million items in it, or you shouldn't. You probably don't even have a thousand items in it; you have a smaller number, even though there are millions of uncertainties in the world. So how do we decide which uncertainties we're going to call 'risks', write them down, put them in our risk register, and decide to do something about? Clearly risks are a subset of uncertainties, but which subset? How do you know?

I think it's very simple to separate risk and uncertainty, and I use three English words: risk is 'uncertainty that matters'. Because most of the uncertainties in the world don't matter. We don't care if it's going to rain in London tomorrow afternoon. It might, it might not; it's irrelevant, it doesn't matter. We don't care what the exchange rate between the Russian ruble and the Chinese yuan will be in 2020. It doesn't matter to us. But there are things on our projects, and things in our families, and things in our country, which are uncertain and which do matter to us. If it's an uncertainty that matters, it's a risk.

So here's another question: how do you know what matters?
In your projects, what are the things that matter? The things that matter in our projects are our objectives. So we must always connect uncertainty with objectives in order to find the risks.

And if we look at some definitions of risk, the ISO standard that I mentioned connects those words very simply: risk is the effect of uncertainty on objectives. We might look at another definition from the UK, from our Association for Project Management. It says the same thing: risk is an uncertain event or set of circumstances, which is uncertain, but which matters because, should it occur, it will have an effect on the achievement of objectives. Uncertainty that matters.

So we should be looking for two things in our risk register. The first is: 'Is it uncertain?' We don't want problems in our risk register. We don't want issues in the risk register. We don't want constraints or requirements. These things are certain; what we want is uncertainties, something that might happen or might not happen. The other important question for our risk register is: 'Does it matter?' Which objective would be affected if this thing happened?

And then, when we want to see how big the risk is, we can ask those two questions: 'How uncertain is it, and how much does it matter?' That will tell us how big the risk is. So this idea of uncertainty that matters develops into something useful by linking uncertainty to our objectives.

We have two dimensions of risk: an uncertainty dimension, and a dimension of effect on our objectives. In projects, we call these probability and impact. We could call them other things; there are other English words we could use, but these are the ones we use most often.
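As a minimal sketch of those two dimensions in Python (illustrative only; the field names, scales, and scoring rule here are my assumptions, not taken from any of the standards quoted in this talk):

    # Each register entry answers the two questions:
    # "How uncertain is it?" (probability) and "How much does it matter?" (impact).
    from dataclasses import dataclass

    @dataclass
    class Risk:
        description: str
        probability: float  # 0.0 to 1.0: how uncertain is it?
        impact: int         # e.g. 1 (minor) to 5 (severe): how much does it matter?

        @property
        def score(self) -> float:
            # "How big is the risk?" = how uncertain x how much it matters
            return self.probability * self.impact

    register = [
        Risk("Contractor may not deliver on time", probability=0.3, impact=4),
        Risk("Key supplier may raise prices", probability=0.6, impact=2),
    ]
    for r in sorted(register, key=lambda r: r.score, reverse=True):
        print(f"{r.score:4.1f}  {r.description}")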
Now, I would like to ask you, with this picture of the mouse: what effect matters to the mouse? First of all, clearly, he is in an uncertain situation here, and he's seen some risks. His objective is to get the cheese and stay alive. So one of the risks he has identified is a bad thing that might happen: he might be killed or injured. And he has been a good project manager: he has put his little helmet on, and he is preparing so that it doesn't happen to him, so that he doesn't get killed or injured. Very good.

And there are things in our projects that, if they happened, would kill or injure us. They would waste time, waste money, damage reputation, destroy performance, maybe even injure real people. As project managers we have to see those things and stop them happening. Protect ourselves in advance. Avoid them.

Are there any other uncertainties that matter for the mouse? Well, there is... the cheese. There's an uncertainty here which matters a great deal: 'Will I get the cheese out of the trap?' He might, or he might not. And if he doesn't get the cheese out of the trap, he's failed. So he has two uncertainties to manage. One of them is bad: he might be killed or injured. The other is good: he might get the cheese. And what he has to do is to manage both of these at the same time. As project managers, we have to do the same thing, and we have to do it in the best possible way; sometimes there's a better way to get the cheese without being killed or injured. In our projects, we have to stop the bad things happening, but we also have to get the cheese out of our projects.

So what does 'cheese' mean in your project? What is the 'cheese' in your project? 'Cheese' means value. 'Cheese' means benefits. 'Cheese' means products and services that people want and need. 'Cheese' means customer satisfaction. 'Cheese' is the good stuff that we're trying to get out of our difficult projects. And if we don't do anything bad (we don't waste time, we don't waste money, we don't damage reputation) but we don't create value, we've failed. If the mouse didn't die but didn't get the cheese, he failed. If we create benefits but waste time, waste money and destroy reputation, we've failed. And if the mouse gets the cheese and he's killed, he's failed. So we have to do both of these things.

When we think about risk and think about impact, there are two kinds of impact that matter: bad ones and good ones. Uncertainties that could hurt the project, and uncertainties that could help the project. Both of these matter, and both of these need to be managed. And we have another word for those.
So, here's the definition of risk from the Project Management Institute, the PMI, from the PMBOK Guide. It's the same as the others that we've seen: an uncertain event or condition that, if it occurs, affects an objective. But PMI knows about the mouse. PMI knows about the cheese and the traps, and has added three words to the definition of risk here. Not the words 'cheese' and 'traps': the words 'positive or negative'. What this tells us is that there are good risks as well as bad risks.

And we heard that in one of our keynote speeches earlier this morning. In the uncertain situation that this country faces going forward, with all the changes that there have been, there are threats, things that could go wrong, and you need to see those and address them. But there are also opportunities: uncertain things that might happen that could be good. We need to see those things too, and try to make them happen proactively. That is equally true in our projects, in our personal lives, and at the national level. I'll be talking about some of those things later this afternoon.

So PMI has this definition, and the other standards have something very similar. The ISO standard, at the bottom here, says risk is the effect of uncertainty on objectives, and notes that the effect can be positive or negative. And the APM, the Association for Project Management in the UK, says the same thing. So we have this new idea: risk is a double-sided concept. And it's the same with the word you have for risk in your language: we mostly think of bad things, but it could be used for good things as well, couldn't it? It's an uncertain word. There are good risks as well as bad risks.

So in our project risk management process, we should be looking out for the traps, avoiding them, protecting ourselves, and preventing them from happening. But we should also be looking out for the cheese, chasing it, and making it happen proactively, so that we get the maximum benefit for the minimum cost. That's why risk management is so important to project success: because it affects our objectives. It gives us the best possible chance to achieve our goals.

So how do we do that? If we think about the risk management process, the process has to do a number of things.
If risk is uncertainty that affects objectives, we have to know what our objectives are. Then we have to identify the uncertainties, the ones that would matter to those objectives. And remember that they could be good or bad: threats and opportunities. That gives us a long list of uncertainties that matter, but they don't all matter the same. So the next thing we have to do is to prioritize, and ask: 'How uncertain, and how much does it matter?' Then we get a prioritized list of risks. We know which are the worst threats and the best opportunities, so that we can do something about them. Then we plan how to respond: we think about what would be appropriate to stop the bad thing happening and to make the good thing happen. And having decided, we do it, of course. And then, because risk is constantly changing, we need to come back and do it all again, and see what has changed.

We could express this process as a number of questions that it's important to ask, and keep on asking, about our project. In fact, you can use these questions for anything. You could use them for your next career move. You could use them for deciding about your pension. You could use them to decide how to bring up your children, or how to invest the nation's wealth. These are the questions:

'What are we trying to achieve?' That's setting objectives.
'What could affect us in achieving that?' That's identifying risks.
'When we have a list of risks, which are the most important ones?' That's prioritizing, assessing the risks.
'What could we do about it?' That's planning our responses, and then doing it, implementing the responses.
And then, 'Did it work, and what's changed?' That's reviewing the risks.

So if we look at a risk management process, we can link each step in the process to one of these questions. And this is why risk management is so easy: all we're doing is asking and answering obvious questions. Anybody who's doing anything important will ask these questions: 'What am I trying to do?' 'What could affect me?' 'Which are the big ones?' 'What shall I do about it?' 'Did that work?' 'Now what?'
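As a minimal sketch, the six questions line up with the process steps like this (purely illustrative Python; the wording of the step labels is mine, not from any standard):

    # The six questions of the talk, mapped to the steps of the risk process.
    QUESTIONS = [
        ("What am I trying to do?",   "set objectives"),
        ("What could affect me?",     "identify risks"),
        ("Which are the big ones?",   "prioritize and assess"),
        ("What shall I do about it?", "plan and implement responses"),
        ("Did that work?",            "review"),
        ("Now what?",                 "repeat the cycle"),
    ]

    for question, step in QUESTIONS:
        print(f"{step:29} <- {question}")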
You could ask those questions every Monday morning when you drive to work, or every Saturday morning. You can ask: 'What am I trying to achieve today, or this week?' 'What could affect me, and which are the big ones?' 'What shall I do?' We can manage risk on a very simple basis, or we can use this as the structure for a risk process which is much more complex, which involves lots of meetings and stakeholder groups and lots of analysis and statistics. It's the same questions.

So I would like you to remember two important things. One: risk is uncertainty that matters. And two: these six questions. Because that's the heart, that's the basis, of managing risk, and it really is very, very easy.

Now, in the time that we have, I want to focus on just two parts of this process, and then give us the opportunity to try out some of these things. First, the identification step, which is clearly very, very important, because if we don't identify the risks, we can't manage them. And then planning responses: understanding how we can deal with the uncertainties that we've identified.

So, let's think about identifying risks. How do we find all of the risks? Well, you can't. You can't find all of the risks, because there are risks that arrive that we hadn't seen before. There are emergent risks, new risks, different risks, and I'll be talking about those later this afternoon in my speech. What we want to find are the knowable risks, the risks that we could find. We don't want somebody on our project team who knows a risk and isn't telling anybody. So this process is about exposing the uncertainties that matter, finding them so we can do something about them. And there are lots of techniques: brainstorming, workshops, checklists, testing our assumptions, and so on.

But I would like to ask a bigger question, a different question from techniques. It's this: are we finding the real risks? When you go to a risk workshop and you write things in your risk register, are they really the uncertainties that matter for your project? Are they really the things that could drive you off track, or really help you? Or are they just the obvious things? All projects have problems with requirements, with resources, with testing. These things always come up, and we have processes to deal with them. But are they the real risks? I would like to suggest to you that often, in our risk registers, we confuse real risks with other things.
Often we confuse risks with their causes: where does the risk come from? Or we confuse risks with their effects: what would they do if they happened? But risks are uncertainties that matter. They are not causes or effects.

Causes are things that are true. It is true that the project is difficult. It is true that we do not have enough people on the project. It is true that the customer hasn't signed the contract yet. These are not risks; they are facts. They might be issues, they might be problems, but they are not risks, because they are not uncertain. And a lot of people write these things in their risk register: 'We don't have enough time for this project. It's a risk!' No, it's a problem.

Sometimes we confuse risks with their effects. 'There could be an accident.' 'We could be late.' Those are not risks either; they are the effects of risks. How do you manage 'we could be late'? If you're late, it's too late. What we want to know is: why might you be late? What unplanned thing could happen that would result in you being late?

So risks sit between causes and effects. We can't manage causes, because they're here now; they're facts. We don't want to manage effects, because they may never happen. What we can manage is the risks that sit in the middle, because they haven't happened yet. So risk management has to separate risks from their causes, and risks from their effects.

And looking at hundreds of risk registers all around the world (I've worked in 48 different countries, every culture, and nearly every continent; not Antarctica, it's too cold), I find that over half of the stuff in risk registers is causes or effects. Over half. So the things we are trying to manage in the risk register are not risks, and then people are surprised that it doesn't work.

So how do we separate cause, risk, and effect? Here is a little test. These statements are written in your notes, or you can just think as we go. Each of these statements (they are all very simple) is one of these three things. A cause is something that is true today. A risk is an uncertainty that might or might not happen. An effect is why it matters to our objectives. Okay? So you have to decide what each one is.

'The project is based in a third-world country.' Cause, risk, or effect? What do you think? Cause! Very good.
So this is a fact, and there might be uncertainties that come out of this fact. We may not get the resources we need; there may be security concerns; we may not get paid. Those are uncertainties that come from this fact.

'Interest rates might go down.' It's a risk: they could stay the same, or they could go up. 'We could go over budget.' It's an effect: a million things could take you over budget, and maybe interest rates are one of them. Okay, those were easy. How about this? 'The weather might be better than usual.' It's a risk; it could equally be the same or worse. It would be a bad thing if you were selling umbrellas. It would be a good thing if you were selling ice cream. It depends what your project is.

'I'm allergic to prawns.' It's a cause; it's a fact. What is the risk that comes from this fact, this cause? You think maybe I could be sick? I could have a reaction, I could be very ill, I could die. All of those things are effects, aren't they? But something might happen that I didn't plan which, because I am allergic, makes me sick. What's the something? I might eat prawns without knowing. So I check: are there prawns in this? I avoid things with prawns in them. I manage the risk, not the effect, and not the cause.

'We have to use a new technique, an unproven technique.' It's a fact, a requirement of our project; we have to do it. We might introduce design errors as a result, but the statement itself is just a fact. 'The contractor may not deliver on time.' That's a risk. 'It might not work for some reason.' You saw the colour; it's an effect. This is going too fast; okay, I will go more slowly. 'We don't have enough people.' It's a cause, yes. And lastly: 'There's a risk that we'll be late.' Hmm. It's an effect, isn't it? Because we want to know what is the risk that would make us late. Being late is an effect.

So apart from the prawns, all of the blue and green things are what we see in risk registers: the project environment, new technology, lack of resources, going over budget, lack of performance, delivering late. These are not risks. These are causes or effects.
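To picture the outcome of that little test, here's a sketch in the same illustrative Python as before (the tagging follows the answers just given; only genuine uncertainties should survive into the register):

    # Audit a register: keep only the entries that are genuinely uncertain.
    entries = [
        ("The project is based in a third-world country", "cause"),
        ("Interest rates might go down",                  "risk"),
        ("We could go over budget",                       "effect"),
        ("The weather might be better than usual",        "risk"),
        ("We don't have enough people",                   "cause"),
        ("There's a risk that we'll be late",             "effect"),
    ]
    real_risks = [text for text, kind in entries if kind == "risk"]
    print(real_risks)  # causes and effects do not belong in the risk register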
And we can look at a real risk register; this is written in your notes for you if you want to do it afterward, as another exercise. In fact, the next page of the notes, if you turn over the page, has these written a bit larger for you. English only, I'm afraid; we'll have to do something about that. You can try this little exercise on a real risk register.

This one came from one of my clients. I asked them for their top 10 risks, and this is what they gave me. They're not risks; they're all sorts of things mixed up. Really, you should do this on your own risk register, but let me show you what happened when I did it on theirs. I found there was a whole mixture of things.

'The current hardware is not fast enough to support testing.' That's a fact; it's a cause. 'This means that we may be unable to test performance until production hardware is used.' That's the risk. So we have two things in this one statement. The next one down is just a fact: 'A number of usability issues have been identified by the supplier.' Okay, so what? What difference does that make?

Let me colour-code this for you, just to be slightly friendly, though you will have to do it on your own if you want to try the complete exercise. There is a whole range of different things in this so-called risk register, and I would expect that yours is the same: you'll have things in your risk register that are just pure facts, or things that are a mixture of risks and other things.

Now, there are two in this list that I think are particularly interesting: this one and this one. They have all three colours in them, because they contain a cause and a risk and an effect. Let's take this one. 'The team does not have a documented design for this function.' That's a fact. So what? Well, there's the risk that the architecture may not support the required functionality. That might happen because we don't have a documented design. And why do we care? If it happens, it results in the requirements not being met, or a higher number of defects, which hits our performance objective and our quality objective.

So now we have three things. We know what the risk is: the architecture might not support the functionality. We know why that's a possibility: because we don't have a documented design.
And we know how it could affect the project: by not meeting the requirements, or by delivering defects. Those are really useful things to know, and it would be helpful if every risk description had all three of them in it. So what we recommend is a structured description of risk that has three parts. It says: as a result of some fact, a cause, an uncertainty might occur. It might not, but it might; that's the risk. And if that thing actually happened, it would lead to an effect on the objectives.

We recommend, and PMI recommends, and the ISO standard recommends, and best-practice guidelines recommend, that you describe your risks in these three stages: What do we know? What uncertainty does that give us? And why does it matter? Then we can use that description to help us manage the risk.

In English, we have definite words to describe facts: this is true, this has happened, this does occur. We have uncertain words to describe the risk: it might, or it might not; it's possible. And then we have conditional words that say what would follow if the risk occurred. Maybe your language is a little different, but we can use the language to help us.

So one of the things I'd like us to try, in the short exercise we're going to do in a moment, is describing risks in that three-part way. What do we know? What uncertainty does it give us? And why does that matter to our objectives? And I would recommend that you try that on the real risk register for your own project, and see what difference it makes. You might be surprised.
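As a minimal sketch of that three-part description (illustrative Python again; the class and field names are my own, and the example wording paraphrases the documented-design risk above):

    # Force every register entry to separate cause, risk and effect.
    from dataclasses import dataclass

    @dataclass
    class RiskStatement:
        cause: str   # definite: something that is true today
        risk: str    # uncertain: might or might not occur
        effect: str  # conditional: what it would do to our objectives

        def describe(self) -> str:
            return (f"As a result of {self.cause}, {self.risk}, "
                    f"which would lead to {self.effect}.")

    entry = RiskStatement(
        cause="having no documented design for this function",
        risk="the architecture may fail to support the required functionality",
        effect="requirements not being met, or a higher number of defects",
    )
    print(entry.describe())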
Now let's think about the next question. Well, there is another question, 'How do we prioritize them?', but the one I want to focus on is: 'What could we do about the risks that we've identified?' Planning risk responses.

Here are the things we need to weigh when deciding what to do about a risk: how manageable it is; how bad or good it might be if we left it alone, the severity of the impact; whether we have the people and the equipment and the skills to deal with it, the availability of resources; and cost-effectiveness. Can we spend a small amount to save a big amount? We don't want to spend a big amount to save a small amount. And then the next important question: who is going to do this?

So what could we do to deal with a risk? Often, people think of four things, four different types of action we could take to address uncertainties that matter. Each of these has a name. It's a strategy: a strategy to focus our planning, to focus our thinking. And once we've focused our thinking with a strategy, we can develop tactics to address each individual risk.

So what are the four things that most people think of? The first is risk avoidance: is there something we can do to kill the risk, to remove it altogether? The second is something we call risk transfer: can we give it away, can we get somebody else to take it away for us? The third is what we call risk reduction; some people call it risk mitigation. Here we're trying to make the risk smaller, so that we can accept it. And the fourth response, after avoid, transfer, and reduce, is the one that everyone forgets. People think that if we can't do anything about a risk, we just have to hope and pray and wonder and wait. No: the other response is to take the risk. We call that risk acceptance: to recognize that we're taking this risk, to include it in our baseline, and to monitor it very carefully.

So you might see those four options as quite a good set of response strategies. But there's a problem. The problem is that all of these things only work for bad risks. What about opportunities? We don't want to avoid, or give away, or shrink our opportunities. So how do you respond if you find a good thing that might happen on your project? Do you just wait and see and hope? Or is there something active we could do?

Fortunately, there are four response strategies for opportunities that match the four response strategies for threats. Here are the ones for bad risks: avoid the bad thing; give it to somebody else to take away; make it smaller; or take the risk. But it's not those actions themselves that matter; it's what each one achieves: to remove the uncertainty; to get somebody else to help; to change the size of the risk; or to include it in our project plan. And we can do all four of those things for opportunities.
How do you eliminate the uncertainty from an opportunity? You capture it. You adopt a strategy which makes it definitely happen. In English, we call this 'exploit'. Exploit is the counterpart of avoid: to avoid a threat, you make the probability zero, so it can't happen; to exploit an opportunity, you make the probability one hundred percent, so it will happen, it must happen. These are the aggressive strategies: you kill the threat, you capture the opportunity. It's the same kind of thing.

What could we do instead of giving a threat away, transferring it? We want to involve somebody else to help us, so we share the opportunity. We could ask them to come into our project and be involved with us in a joint venture, or a subcontract, or a partnership, where they help us to achieve this uncertain thing that would help us all, and we give them some part of the benefit. We share the opportunity.

How could we change the size of an opportunity? We don't want to reduce it; we want to enhance it. We want to grow it, to make it more likely and bigger in impact. It's the same idea as reduction, but the other way around.

And the last one: if we can't do any of these active things, we can simply accept an opportunity, wait and see what happens, and monitor it very closely, if there's nothing else we can do.

So what this slide tells us is that there is an equal variety of potential response strategies to choose between for our opportunities, equal to the number we have for threats. You see, the secret of thinking about opportunities is to recognize that an opportunity is the same as a threat. The only difference is the sign of the impact. A threat has a negative impact; an opportunity has a positive impact. Apart from that, they're the same. They are both uncertainties that matter. They are both things that might or might not happen, but that could affect our objectives. They can both be managed practically. They both make a difference to our chances of succeeding on the project. And that's why risk management should manage threats and opportunities together, in a single process: because they are the same things.
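A compact way to picture that pairing, as one more illustrative Python sketch (the table and the sign convention for impact are my own framing of what the talk describes):

    # Matching strategies share the same intent; only the sign of the impact differs.
    STRATEGY_PAIRS = {
        # intent:                        (threat,     opportunity)
        "remove the uncertainty":        ("avoid",    "exploit"),  # P -> 0% vs P -> 100%
        "get somebody else to help":     ("transfer", "share"),
        "change the size of the risk":   ("reduce",   "enhance"),
        "include it in the plan":        ("accept",   "accept"),
    }

    def strategies(impact: float) -> list[str]:
        # Negative impact means a threat; positive impact means an opportunity.
        column = 0 if impact < 0 else 1
        return [pair[column] for pair in STRATEGY_PAIRS.values()]

    print(strategies(-3.0))  # ['avoid', 'transfer', 'reduce', 'accept']
    print(strategies(+3.0))  # ['exploit', 'share', 'enhance', 'accept']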
And that may be new thinking to some of you, those of you who always think of risk as the big ugly thing waiting to squash you, or the unknown thing in the future that's going to hurt you. There are some risks like that, and we need to stop them happening, to protect ourselves. But there are also some very good things out there which might happen, which we need to see, and chase, and make happen, so that our projects can be more successful.

There is one other response strategy that we might try, which is really not recommended: just pretending that there are no risks, hiding away, and saying maybe it will never happen. We really don't recommend this at all. In fact, it's probably true that ostriches don't bury their heads in the sand; it's just the way the sand dunes look, and the story is a pretend one. Sadly for project managers, it's not a pretend story: we hide our heads, we pretend there are no risks, and then we get surprised. Well, we can do better than that.

So, let me just finish here by saying that we do need to do something about this.