Okay, let's have a look at risk management in practice. What I want to do is to start with some basic concepts, then focus on two difficult areas in the risk process.

So, I guess if I asked you to define the word 'risk', you would have some idea of what it meant. We might not have a formal definition that we could quote, but we all have something in our minds when we hear the word 'risk'. This is what we think, and maybe you think of things like this. Maybe you feel like this little guy, facing some big ugly challenge that you know is just going to squash you flat. Maybe you feel like this guy. This is a real job in North Korea, and his job is to hold the target for other people to shoot at. Sometimes project managers hold the target here. We feel like everybody is shooting at us in our job. Or maybe you just know there's something nasty out there, waiting to get you. And maybe that's what you think of when you think of the word 'risk'.

Well, that's partly true, but it's not the whole truth. Risk is not the same as uncertainty. Risk is related to uncertainty, but they're different. So all risks are uncertain, but not all uncertainties are risks. If you have a risk register or a risk list, you don't have a million items in it, or you shouldn't. You probably don't even have a thousand items in it; you have a smaller number.
And yet there are millions of uncertainties in the world. So how do we decide which uncertainties we're going to call 'risk', write them down, put them in our risk register, and decide to do something about? Clearly 'risk' is a subset of uncertainty, but which subset? How do you know?

I think it's very simple to separate risk and uncertainty, and I use three English words: risk is 'uncertainty that matters'. Because most of the uncertainties in the world don't matter. We don't care if it's going to rain in London tomorrow afternoon. It might, it might not; it's irrelevant, it doesn't matter. We don't care what the exchange rate between the Russian ruble and the Chinese yuan will be in 2020. It doesn't matter to us. But there are things on our projects, and things in our families, and things in our country, which are uncertain and which do matter to us. If it's an uncertainty that matters, it's a risk.

So here's another question: how do you know what matters? In your projects, what are the things that matter? The things that matter in our projects are our objectives. So we must always connect uncertainty with objectives in order to find the risks.
And if we look at some definitions of risk, this is the ISO standard that I mentioned, and it connects those words very simply: risk is the effect of uncertainty on objectives. And we might look at another definition from the UK, from our Association for Project Management. It says the same thing: risk is an uncertain event or set of circumstances, which is uncertain, but it matters because, should it occur, it will have an effect on the achievement of objectives. Uncertainty that matters.

So we should be looking in our risk register for two things. First: "Is it uncertain?" We don't want problems in our risk register. We don't want issues in the risk register. We don't want constraints or requirements. These things are certain; what we want is uncertainties, something that might happen or might not happen. But the other important question for our risk register is: "Does it matter?" Which objective would be affected if this thing happened?

And then when we want to see how big the risk is, we can ask those two questions: "How uncertain is it, and how much does it matter?" And that will tell us how big the risk is. So this idea of uncertainty that matters then develops into something which is useful, by linking uncertainty to our objectives.
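Those two sizing questions ("How uncertain is it?" and "How much does it matter?") can be sketched as a tiny risk register. This is only an illustrative sketch: the 1-5 scales, the example risks, and the multiplicative score are my assumptions, not something the talk or the ISO standard prescribes.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One uncertainty that matters, sized by the two questions."""
    description: str
    probability: int  # "How uncertain is it?"  1 (unlikely) .. 5 (almost certain)
    impact: int       # "How much does it matter?" 1 (minor) .. 5 (threatens an objective)

    def score(self) -> int:
        # One common (assumed) convention: multiply the two answers.
        return self.probability * self.impact

register = [
    Risk("Key supplier may not deliver on time", probability=3, impact=4),
    Risk("New regulation may simplify approvals", probability=2, impact=3),
]

# Prioritise: the biggest uncertainty-that-matters comes first.
for risk in sorted(register, key=Risk.score, reverse=True):
    print(risk.score(), risk.description)
```

The point of the structure is that neither field alone defines the risk: a near-certain triviality and an impossible catastrophe both score low.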
So we have two dimensions of risk: we have an uncertainty dimension, and we have a dimension that affects our objectives. In projects, we call these probability and impact. We could call them other things; there are other English words we could use, but these are the ones we use most often.

And I would like to ask you, with this picture of the mouse: what effect matters to the mouse? First of all, clearly, he is in an uncertain situation here, and he's seen some risks. His objective is to get the cheese and stay alive. So one of the risks he has identified is a bad thing that might happen: he might be killed or injured. And he has been a good project manager: he has put his little helmet on, and he is preparing so that it doesn't happen to him, so he doesn't get killed or injured. Very good.

And there are things in our projects that, if they happened, would kill or injure us. They would waste time, waste money, damage reputation, destroy performance, maybe even injure real people. And as project managers we have to see those things and stop them happening. Protect ourselves in advance. Avoid them.

Are there any other uncertainties that matter for the mouse? Well, there is...
...the cheese. There's an uncertainty here which matters a great deal: "Will I get the cheese out of the trap?" He might, or he might not. And if he doesn't get the cheese out of the trap, he's failed.

So he has two uncertainties to manage. One of them is bad: he might be killed or injured. The other is good: he might get the cheese. And what he has to do is manage both of these at the same time. As project managers, we have to do the same thing. And we have to do it in the best possible way; sometimes there's a better way to get the cheese without being killed or injured. In our projects, we have to stop the bad things happening, but we also have to get the cheese out of our projects.

So what does 'cheese' mean in your project? What is the 'cheese' in your project? 'Cheese' means value. 'Cheese' means benefits. 'Cheese' means products and services that people want and need. 'Cheese' means customer satisfaction. 'Cheese' is the good stuff that we're trying to get out of our difficult projects. And if we don't do anything bad (we don't waste time, we don't waste money, we don't damage reputation) but we don't create value, we've failed. If the mouse didn't die but he didn't get the cheese, he failed.
If we create benefits, but we waste time and waste money and destroy reputation, we've failed. And if the mouse gets the cheese and he's killed, he's failed. So we have to do both of these things.

And when we think about risk and think about impact, there are two kinds of impact that matter: bad ones, and good ones. Uncertainties that could hurt the project, and uncertainties that could help the project. Both of these matter, and both of these need to be managed. And we have another word for those.

So here's the definition of risk from the Project Management Institute, the PMI, from the PMBOK Guide. It's the same as the others that we've seen: an uncertain event or condition that, if it occurs, affects an objective. But PMI knows about the mouse. PMI knows about the cheese and the traps, and has added three words to the definition of risk here. It's not the words 'cheese' and 'traps'. It's the words 'positive or negative'. What this tells us is that there are good risks as well as bad risks.

And we heard that in one of our keynote speeches earlier this morning. In the uncertain situation that this country faces going forward, with all the changes that there have been, there are threats. There are things that could go wrong. And you need to see those and address them.
But there are also opportunities: uncertain things that might happen that could be good. We also need to see those things, and try to proactively make them happen. And that is equally true in our projects, in our personal lives, and also at the national level. I'll be talking about some of those things later on this afternoon.

So PMI has this definition, and the other standards have something very similar. The ISO standard, at the bottom here, says risk is the effect of uncertainty on objectives, and notes that the effect can be positive or negative. And the APM, the Association for Project Management in the UK, says the same thing. So we have this new idea: risk is a double-sided concept. It's the same with the word you have for risk in your language: we mostly think of bad things, but it could be used for good things as well. Isn't that right? It's an uncertain word. And there are good risks as well as bad risks.

So in our project risk management process, we should be looking out for the traps, avoiding them, protecting ourselves, and preventing them from happening. But we should also be looking out for the cheese, chasing it, and making it happen proactively, so we get the maximum benefit for the minimum cost.
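The double-sided idea means one register can hold both kinds of uncertainty that matters. Here is a minimal sketch of that; the enum, the field names, and the mouse-themed entries are illustrative assumptions, not an API from PMI, ISO, or APM.

```python
from enum import Enum

class RiskType(Enum):
    THREAT = "could hurt the project"        # negative impact: avoid the trap
    OPPORTUNITY = "could help the project"   # positive impact: chase the cheese

# One register, both sides of the double-sided concept.
register = [
    ("Trap snaps shut on the mouse", RiskType.THREAT),
    ("Cheese comes free of the trap", RiskType.OPPORTUNITY),
]

for description, kind in register:
    # Threats we try to prevent; opportunities we try to make happen.
    action = "avoid or reduce" if kind is RiskType.THREAT else "pursue or enhance"
    print(f"{description}: {kind.value}, so {action} it")
```

The design point is simply that threats and opportunities live in the same list and go through the same process, differing only in the direction of their impact and the responses they call for.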
That's why risk management is so important to project success: because it affects our objectives. It gives us the best possible chance to achieve our goals.

So how do we do that? If we think about the risk management process, the process has to do a number of things. If risk is uncertainty that affects objectives, we have to know what our objectives are. Then we have to identify the uncertainties, the uncertainties that would matter to those objectives. And remember that they could be good or bad: threats and opportunities. That gives us a long list of uncertainties that matter, but they don't all matter the same. So the next thing we have to do is prioritize, and ask the question: "How uncertain, and how much does it matter?" Then we get a prioritized list of risks. We know which are the worst threats and the best opportunities, so that we can do something about them. Then we plan how to respond. We think about what would be appropriate to stop the bad thing happening and to make the good thing happen. And having decided, we do it, of course. And then, because risk is constantly changing, we need to come back and do it again, and see what has changed.

We could express this process as a number of questions that it's important to ask, and keep on asking, about our project.
In fact, you can use these questions for anything. You could use these questions for your next career move. You could use these questions for deciding about your pension. You could use these questions to decide how to bring up your children, or to decide how to invest the nation's wealth.

These are the questions. "What are we trying to achieve?" That's setting objectives. Then, "What could affect us in achieving that?" That's identifying risks. Then, when we have a list of risks, "Which are the most important ones?" That's prioritizing, that's assessing the risks. Then, "What could we do about it?" That's planning our responses, and doing it, implementing the responses. And then, "Did it work, and what's changed?" That's reviewing the risk.

So if we look at a risk management process, we can link each step in the process to one of these questions. And this is why risk management is so easy: because all we're doing is asking and answering obvious questions. Anybody who's doing anything important will ask these questions: "What am I trying to do?" "What could affect me?" "Which are the big ones?" "What shall I do about it?" "Did that work?" "Now what?" And you could ask those questions every Monday morning when you drive to work, or every Saturday morning.
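The six questions above map one-to-one onto steps of the process, and that mapping can be written down directly. The wording of the questions is from the talk; holding them in an ordered dict and walking them in a loop is just an illustrative way to show the repeating cycle.

```python
# The talk's six questions, each paired with the process step it drives.
RISK_QUESTIONS = {
    "What am I trying to do?":      "Set objectives",
    "What could affect me?":        "Identify risks",
    "Which are the big ones?":      "Assess and prioritize risks",
    "What shall I do about it?":    "Plan responses",
    "Did that work?":               "Implement responses and review",
    "Now what?":                    "See what has changed, then repeat",
}

def review_cycle() -> None:
    """Walk the questions in order, as you might every Monday morning."""
    for question, step in RISK_QUESTIONS.items():
        print(f"{step}: {question}")

review_cycle()
```

Whether the process is a five-minute Monday-morning habit or a formal workshop with statistics, the loop body is the same: the scale changes, the questions do not.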
You can ask the questions: "What am I trying to achieve today? This week?" "What could affect me, and which are the big ones?" "What shall I do?" We can manage risk on a very simple basis, or we can use this as the structure for a risk process which is much more complex, which involves lots of meetings, and lots of stakeholder groups, and lots of analysis and statistics. It's the same questions.

So I would like you to remember two important things. One is: risk is uncertainty that matters. And secondly, these questions, these six questions. Because that's the heart, that's the basis of managing risk, and it really is very, very easy.

Now, in the time that we have, I want to focus on just two parts of this process, and then give us the opportunity to try out some of these things. The identification step is clearly very, very important, because if we don't identify the risks, we can't manage them. And then planning responses: understanding how we can deal with the uncertainties that we've identified.

So let's think about these things: identifying risks. How do we find all of the risks? Well, you can't. You can't find all of the risks, because there are risks that arrive that we hadn't seen before.
There are emergent risks, new risks, different risks, and I'll be talking about those later this afternoon in my speech. What we want to find are the knowable risks: the risks that we could find. We don't want somebody on our project team who knows a risk and isn't telling anybody. So this process is about exposing the uncertainties that matter, finding them so we can do something about them. And there are lots of techniques: brainstorming, workshops, checklists, testing our assumptions, and so on.

But I would like to answer a bigger question, a different question from techniques. It's the question: "Are we finding the real risks?" When you go to a risk workshop and you write things in your risk register, are they really the uncertainties that matter for your project? Are these really the things that could drive you off track, or really help you? Or are they just the obvious things? All projects have problems with requirements, with resources, with testing. These are things that always come up, and we have processes to deal with them. But are they the real risks?

I would like to suggest to you that often, in our risk registers, we confuse real risks with other things. Often we confuse risks with their causes: where does the risk come from? Or we confuse risks with their effects: what do they do if they happen?
But risks are uncertainties that matter. They are not causes or effects. Causes are things that are true. It is true that the project is difficult. It is true that we do not have enough people on the project. It is true that the customer hasn't signed the contract yet. These are not risks; they are facts. They might be issues. They might be problems. But they are not risks, because they are not uncertain. And a lot of people write these things in their risk register. "We don't have enough time for this project." "It's a risk!" No, it's a problem.

Sometimes we confuse risks with their effects. There could be an accident; we could be late. Those are not risks either; they are the effects of risks. How do you manage "we could be late"? If you're late, it's too late. What we want to know is: why might you be late? What unplanned thing could happen that would result in you being late?

So risks sit between causes and effects. We can't manage causes, because they're here now; they are facts. We don't want to manage effects, because they may never happen. What we can manage is the risks that sit in the middle, because they haven't happened yet. So risk management has to separate risks from their causes, and risks from their effects. And I find this, looking at hundreds of risk registers all around the world.
I've worked in 48 different countries, every continent, every culture. Well, not the Antarctic, it's too cold, but nearly every continent. And over half of the stuff in risk registers is causes or effects. Over half. So the things we are trying to manage in the risk register are not risks, and then people are surprised that it doesn't work.

So how do we separate cause, risk, and effect? Here is a little test. These statements are written in your notes, or you can just think as we go. Each of these statements, and they are all very simple, is one of these things. A cause is something that is true today. A risk is an uncertainty that might or might not happen. The effect is why it matters to our objectives. Okay? So you have to think what these are.

"The project is based in a third-world country." Cause, risk, or effect? What do you think? Cause! Very good. This is a fact, and there might be uncertainties that come out of this fact. We may not get the resources we need; there may be security concerns; we may not get paid. These are uncertainties that come from this fact.

"Interest rates might go down." It's a risk. They could stay the same, or they could go up.

"We could go over budget." It's an effect.
A million things could take you over budget; maybe interest rates are one of them. Okay? Those were easy. How about this? "The weather might be better than usual." It's a risk; it could be the same or worse. It would be a bad thing if you were selling umbrellas. It would be a good thing if you were selling ice cream. It depends what your project is.

"I'm allergic to prawns." It's a cause, it's a fact. What is the risk that comes from this fact, this cause? You think maybe I could be sick? I could have a reaction, I could be very ill, I could die. All of those things are effects, aren't they? But something might happen that I didn't plan: because I am allergic, something might happen that makes me sick. What's the something? I might eat prawns without knowing. So then I check: are there prawns in this? I avoid things with prawns in them. I manage the risk, not the effect, and not the cause.

Okay: "We have to use a new technique, an unproven technique." It's a fact, it's a requirement; we have to do it. We might introduce design errors, but it is just a fact, a requirement of our project.

"The contractor may not deliver on time." That's a risk. Hmm, this is going too fast.

"It might not work for some reason." You saw the colour: it's an effect.
Okay, I will go more slowly. "We don't have enough people." It's a cause, yes. And lastly, "There is a risk we will be late." Hmm, it's an effect, isn't it? Because we want to know: what is the risk that we'll be late? Being late is an effect. So, apart from the prawns, all of these blue and green things are what we see in risk registers.
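One common way to keep the three parts separate in a register entry is a three-part statement: a fact, the uncertainty it gives rise to, and the effect on an objective. This template ("because of / may occur / leading to") is a widely used risk-description convention; it is not quoted from the talk, and the prawn example fields are my own rendering of the speaker's story.

```python
from dataclasses import dataclass

@dataclass
class RiskStatement:
    cause: str   # a fact that is true today (cannot be "managed away")
    risk: str    # an uncertainty that might or might not happen (manageable)
    effect: str  # why it matters: the impact on an objective if it happens

    def __str__(self) -> str:
        # Three-part template keeps facts and impacts out of the risk slot.
        return (f"Because {self.cause}, {self.risk}, "
                f"which would lead to {self.effect}.")

prawns = RiskStatement(
    cause="I am allergic to prawns",
    risk="I might eat prawns without knowing",
    effect="a reaction that makes me seriously ill",
)
print(prawns)
```

Writing entries this way makes the register self-checking: if the middle slot holds something certain, it is a cause or an issue; if it holds something that only happens after the risk occurs, it is an effect.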