11-08 Markov Chain Question 2 Solution

  • 0:00 - 0:03
    [Thrun] And again the solution follows directly from the state diagram over here.
  • 0:03 - 0:07
    In the beginning we do know we're in state A
  • 0:07 - 0:09
    and the chance of remaining in A is 0.5.
  • 0:09 - 0:13
    This is the 0.5 over here. We can just read this off.
  • 0:13 - 0:19
    For the next state we find ourselves with a 0.5 chance to be in A
  • 0:19 - 0:21
    and 0.5 chance to be in B.
  • 0:21 - 0:24
    If we're in B, we transition with certainty to A.
  • 0:24 - 0:26
    That's because of the 1 over here.
  • 0:26 - 0:31
    But if we're in A, we stay in A with a 0.5 chance. So you put this together.
  • 0:31 - 0:36
    0.5 probability being in A times 0.5 probability of remaining in A
  • 0:36 - 0:41
    plus 0.5 probability to be in B times 1 probability to transition to A.
  • 0:41 - 0:44
    That gives us 0.75.
  • 0:44 - 0:52
    Following the same logic but now we're in A with 0.75 times a 0.5 probability
  • 0:52 - 0:58
    of staying in A plus 0.25 in B, which is 1 minus 0.75,
  • 0:58 - 1:06
    and the transition certainty back to A of 1, we get 0.625.
  • 1:06 - 1:11
    So now you should be able to take a Markov chain and compute by hand
  • 1:11 - 1:16
    or write a piece of software to compute the probabilities of future states.
  • 1:16 -
    You will be able to predict something. That's really exciting.
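
A minimal sketch of the hand computation described above, not part of the lecture itself. It assumes the two-state chain from the diagram: from A we stay in A with probability 0.5 and move to B with probability 0.5; from B we return to A with certainty. The function name `step` is illustrative.

```python
def step(p_a, p_b):
    """One Markov update for this chain:
    new P(A) = P(A) * 0.5 + P(B) * 1.0   (stay in A, or return from B)
    new P(B) = P(A) * 0.5                (move from A to B)
    """
    return p_a * 0.5 + p_b * 1.0, p_a * 0.5

# We start in A with certainty.
p_a, p_b = 1.0, 0.0
for t in range(1, 4):
    p_a, p_b = step(p_a, p_b)
    print(f"after step {t}: P(A) = {p_a}, P(B) = {p_b}")

# P(A) prints 0.5, then 0.75, then 0.625, matching the values in the transcript.
```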
Title:
11-08 Markov Chain Question 2 Solution
Description:

Unit 11 08.mp4 Markov Chain Answer 2.mp4

Team:
Udacity
Project:
CS271 - Intro to Artificial Intelligence
Duration:
01:20