Forward and Backward Algorithms in a Hidden Markov Model (HMM)
Model Setup:
- States (N = 2): Rainy (S1), Sunny (S2)
- Observations (M = 2): Walk, Shop
- Observation sequence (T = 3): O = [Walk, Shop, Walk]
HMM Parameters:
- Initial Probabilities (pi): pi = [0.6, 0.4]
- Transition Probabilities (A):
A = [[0.7, 0.3],
[0.4, 0.6]]
- Emission Probabilities (B):
B = [[P(Walk|Rainy) = 0.1, P(Shop|Rainy) = 0.4],
[P(Walk|Sunny) = 0.6, P(Shop|Sunny) = 0.3]]
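To make the worked example easy to check, the parameters above can be written out in Python (a minimal sketch using NumPy; the variable names and the index encoding of the observations are my own choices, not part of the original example):

import numpy as np

pi = np.array([0.6, 0.4])     # initial distribution over [Rainy, Sunny]
A = np.array([[0.7, 0.3],     # transition matrix: row = current state, column = next state
              [0.4, 0.6]])
B = np.array([[0.1, 0.4],     # emission matrix: row = state, column = observation [Walk, Shop]
              [0.6, 0.3]])
obs = [0, 1, 0]               # O = [Walk, Shop, Walk] encoded as column indices into B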
Forward Algorithm:
1. Initialization (t=1, observation=Walk):
alpha1(1) = 0.6 * 0.1 = 0.06
alpha1(2) = 0.4 * 0.6 = 0.24
2. Induction (t=2, observation=Shop):
alpha2(1) = (0.06*0.7 + 0.24*0.4) * 0.4 = 0.0552
alpha2(2) = (0.06*0.3 + 0.24*0.6) * 0.3 = 0.0486
3. Induction (t=3, observation=Walk):
alpha3(1) = (0.0552*0.7 + 0.0486*0.4) * 0.1 = 0.005808
alpha3(2) = (0.0552*0.3 + 0.0486*0.6) * 0.6 = 0.027432
4. Termination:
P(O|lambda) = alpha3(1) + alpha3(2) = 0.005808 + 0.027432 = 0.03324
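The same recursion can be run end to end with a short forward pass (a sketch assuming the NumPy arrays pi, A, B, obs defined above; the function name forward is illustrative):

def forward(pi, A, B, obs):
    # alpha[t, i]: joint probability of the first t+1 observations and being in state i at that time
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                      # initialization
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]  # induction
    return alpha, alpha[-1].sum()                     # termination: P(O | lambda)

alpha, p_forward = forward(pi, A, B, obs)
print(alpha)      # rows reproduce alpha1, alpha2, alpha3 above
print(p_forward)  # 0.03324 (up to floating-point rounding)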
Backward Algorithm:
1. Initialization (t=3):
beta3(1) = beta3(2) = 1
2. Induction (t=2, using the observation at t=3, Walk):
beta2(1) = 0.7*0.1 + 0.3*0.6 = 0.25
beta2(2) = 0.4*0.1 + 0.6*0.6 = 0.4
3. Induction (t=1, using the observation at t=2, Shop):
beta1(1) = 0.7*0.4*0.25 + 0.3*0.3*0.4 = 0.106
beta1(2) = 0.4*0.4*0.25 + 0.6*0.3*0.4 = 0.112
4. Termination:
P(O|lambda) = pi(1)*P(Walk|Rainy)*beta1(1) + pi(2)*P(Walk|Sunny)*beta1(2)
            = 0.6*0.1*0.106 + 0.4*0.6*0.112 = 0.00636 + 0.02688 = 0.03324
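The backward pass can be sketched in the same style (again assuming the arrays defined earlier; the function name backward is illustrative):

def backward(pi, A, B, obs):
    # beta[t, i]: probability of the observations after time t, given state i at time t
    T, N = len(obs), len(pi)
    beta = np.ones((T, N))                              # initialization: beta_T(i) = 1
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])  # induction
    p = np.sum(pi * B[:, obs[0]] * beta[0])             # termination
    return beta, p

beta, p_backward = backward(pi, A, B, obs)
print(beta)        # rows reproduce beta1, beta2, beta3 above
print(p_backward)  # 0.03324 (up to floating-point rounding)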
Final Result:
P(O|lambda) = 0.03324, obtained identically by both the forward and the backward algorithm.
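As a further consistency check (assuming the alpha and beta arrays returned by the sketches above), summing alpha_t(i)*beta_t(i) over the states gives the same P(O|lambda) at every time step, not only at the two terminations:

for t in range(len(obs)):
    print(t + 1, np.sum(alpha[t] * beta[t]))  # each line is P(O|lambda) = 0.03324, up to rounding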