
---
title: Hidden Markov Model Training
date: 2021-01-05T11:12:40+00:00
draft: false
---

*Markov model state topology*

The second piece of coursework for my speech & audio processing & recognition module focused on the machine learning side of the field. One of the main methods we learnt about was the Hidden Markov Model and how to train it; this coursework was a test of that theory. My submission achieved 98%.

*Re-estimated probability density function outputs after training*

The provided spec for the model included the entry, exit and transition probabilities, the parameters for each state's Gaussian output function and the observations used for training.
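That spec can be sketched as a small container, something like the following (the numbers here are purely illustrative, not the coursework's actual values):

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class HMMSpec:
    """The pieces of the provided spec (values below are illustrative)."""
    pi: np.ndarray         # entry probability for each state
    A: np.ndarray          # state-to-state transition probabilities
    exit_p: np.ndarray     # exit probability from each state
    means: np.ndarray      # mean of each state's Gaussian output function
    variances: np.ndarray  # variance of each state's Gaussian output function
    obs: np.ndarray        # observations used for training


spec = HMMSpec(
    pi=np.array([0.5, 0.5]),
    A=np.array([[0.7, 0.2],     # each row plus its exit probability sums to 1
                [0.2, 0.7]]),
    exit_p=np.array([0.1, 0.1]),
    means=np.array([0.0, 1.0]),
    variances=np.array([1.0, 0.5]),
    obs=np.array([0.2, 0.9, 1.1, 0.1]),
)
```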

From here, the coursework tested the ability to calculate and analyse various aspects of the model, including forward, backward, occupation and transition likelihoods. A single iteration of Baum-Welch-based training was completed, resulting in a new set of transition probabilities and output function parameters.
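The forward and backward passes at the heart of this can be sketched roughly as below, assuming a model with one Gaussian output function per state (standard textbook recursions, not the coursework's own code):

```python
import numpy as np


def gaussian_pdf(x, mean, var):
    """Likelihood of observation x under a state's Gaussian output function."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)


def forward_backward(obs, pi, A, means, variances):
    """Compute forward (alpha) and backward (beta) likelihoods.

    pi : entry probabilities, shape (N,)
    A  : transition probabilities, shape (N, N)
    """
    T, N = len(obs), len(pi)

    # Emission likelihood of each observation under each state, shape (T, N)
    b = np.array([[gaussian_pdf(o, means[j], variances[j]) for j in range(N)]
                  for o in obs])

    # Forward pass: alpha[t, j] = P(o_1..o_t, state_t = j)
    alpha = np.zeros((T, N))
    alpha[0] = pi * b[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * b[t]

    # Backward pass: beta[t, j] = P(o_{t+1}..o_T | state_t = j)
    beta = np.ones((T, N))
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (b[t + 1] * beta[t + 1])

    return alpha, beta
```

A useful sanity check is that the total sequence likelihood comes out the same whether it is read off the end of the forward pass or from the forward-backward product at any single time step.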

*Occupation likelihoods: the probability of being in each state at each time step*

The above graph presents the occupation likelihood of each state at each time step, the joint probability formed from the forward and backward likelihoods. From here, it looks like the observations were generated by state 2 for three time steps, before swapping to state 1 for four time steps and changing back to state 2 for the last one.
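That joint quantity is quick to compute once the forward and backward passes are done; a minimal sketch, assuming `alpha` and `beta` come from a standard forward-backward pass:

```python
import numpy as np


def occupation_likelihoods(alpha, beta):
    """Occupation likelihood gamma[t, j]: probability of being in state j
    at time t, given the full observation sequence.

    Formed from the joint forward-backward product alpha_t(j) * beta_t(j),
    normalised by the total sequence likelihood so that each time step's
    state probabilities sum to 1.
    """
    joint = alpha * beta
    return joint / joint.sum(axis=1, keepdims=True)


# The most likely state at each time step is then gamma.argmax(axis=1),
# which is how a plot like the one above is read off.
```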

GitHub

Read the report here.