Athabasca University

Section 2: Hidden Markov Models (HMMs)

Commentary

Section Goals

  • To introduce Hidden Markov Models (HMMs), building on the concepts, representation, and principles presented in the previous section.

Learning Objectives

Learning Objective 1

  • Outline how matrix computations are used to simplify HMM algorithms.
  • Design a smoothing algorithm for HMMs based on a matrix representation.
  • Explore HMM-related algorithms from freely downloadable online resources.
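The matrix formulation named in the objectives above can be sketched in a few lines of NumPy. The sketch below uses the two-state umbrella world from AIMA (Rain/NoRain hidden states, umbrella observations) with the textbook's example probabilities; it is an illustrative sketch, not the textbook's code.

```python
import numpy as np

# Umbrella world (AIMA Ch. 15): hidden state = Rain/NoRain,
# evidence = umbrella seen / not seen. Values match the textbook's example.
T = np.array([[0.7, 0.3],          # transition model P(X_t | X_{t-1}),
              [0.3, 0.7]])         #   rows indexed by the previous state
# Diagonal observation matrices: O[e] has P(e | state) on the diagonal.
O = {True:  np.diag([0.9, 0.2]),   # umbrella observed
     False: np.diag([0.1, 0.8])}   # umbrella not observed

def forward_backward(evidence, prior=np.array([0.5, 0.5])):
    """Smoothing as matrix products: f_t ∝ O_t T^T f_{t-1}; b_k = T O_{k+1} b_{k+1}."""
    n = len(evidence)
    f = [prior]
    for e in evidence:                 # forward pass
        v = O[e] @ T.T @ f[-1]
        f.append(v / v.sum())
    b = np.ones(2)
    smoothed = [None] * n
    for t in range(n - 1, -1, -1):     # backward pass
        s = f[t + 1] * b
        smoothed[t] = s / s.sum()
        b = T @ O[evidence[t]] @ b
    return smoothed

probs = forward_backward([True, True])
# smoothed P(Rain_1 | umbrella on days 1 and 2) ≈ 0.883, as in the textbook
```

Writing the forward and backward messages as matrix-vector products is exactly the simplification the first objective refers to: each recursion step collapses into one multiplication by the transition matrix and one by a diagonal observation matrix.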

Objective Readings

Required readings:

  • Hidden Markov Models (see Section 15.3 of AIMA3ed)
  • Sánchez, D., Tentori, M., and Favela, J. (2008). Activity recognition for the smart hospital. IEEE Intelligent Systems, 23(2), 50-57. DOI: 10.1109/MIS.2008.18

Supplemental Readings

Rabiner, L. R. (1989). A tutorial on Hidden Markov Models and selected applications in speech recognition. Proceedings of the IEEE, 77(2), 257-286.

Cappé, O., Moulines, E., and Rydén, T. (2005). Inference in Hidden Markov Models. Springer. ISBN 0-387-40264-0.

Objective Questions

  • What makes Hidden Markov Models more useful in many AI applications than plain Markov processes?
  • How would you explain the "physical meaning" of each element of an HMM's structure? (You can answer this question based on examples you are familiar with.)
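One way to make the "physical meaning" question concrete is to write the standard HMM triple λ = (A, B, π) down explicitly and annotate each part. The two-state weather model below is an illustrative assumption, not taken from the readings.

```python
import numpy as np

rng = np.random.default_rng(0)

# The classic HMM triple λ = (A, B, π), annotated with its physical meaning.
# The numbers are illustrative (a two-state weather model).
A  = np.array([[0.7, 0.3],    # transition model: how the hidden world evolves;
               [0.3, 0.7]])   #   A[i, j] = P(state j at t | state i at t-1)
B  = np.array([[0.9, 0.1],    # sensor model: how hidden states produce evidence;
               [0.2, 0.8]])   #   B[i, k] = P(observation k | state i)
pi = np.array([0.5, 0.5])     # prior: belief about the state before any evidence

def simulate(n):
    """Sample a hidden-state path and the observation sequence it emits."""
    states, obs = [], []
    s = rng.choice(2, p=pi)
    for _ in range(n):
        states.append(s)
        obs.append(rng.choice(2, p=B[s]))
        s = rng.choice(2, p=A[s])
    return states, obs
```

Simulating from the model makes the "hidden" part tangible: an observer sees only the `obs` sequence, and inference is the task of recovering beliefs about `states` from it.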

Objective Activities

  • Explore and test HMM-related algorithms from freely downloadable online resources, and report your findings to the course conference for discussion.
  • Explore the source code of the following smoothing algorithm, which is downloadable from the textbook's website.
    • Fixed-Lag-Smoothing
  • Compare different implementations of HMMs, which you can find on the course-related websites or elsewhere on the internet, possibly written in other programming languages. Vote for the best one in the course conference.
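As a starting point for the Fixed-Lag-Smoothing activity, here is a simplified sketch in Python. Note that it is not the textbook's constant-time algorithm, which updates a matrix product incrementally: this version simply re-runs a short backward pass over a sliding window of the last `lag` observations at each step. The umbrella-world matrices are illustrative assumptions.

```python
import numpy as np
from collections import deque

# Umbrella-world parameters (illustrative, matching the AIMA example values).
T = np.array([[0.7, 0.3], [0.3, 0.7]])          # transition model
O = {True:  np.diag([0.9, 0.2]),                # observation matrices
     False: np.diag([0.1, 0.8])}

class FixedLagSmoother:
    """Estimate P(X_{t-lag} | e_{1:t}) online, one observation at a time."""

    def __init__(self, lag, prior=np.array([0.5, 0.5])):
        self.lag = lag
        self.f = prior          # forward message at the start of the window
        self.window = deque()   # most recent evidence, at most `lag`+1 items

    def step(self, e):
        """Absorb evidence e; return the smoothed estimate for t - lag, or None."""
        self.window.append(e)
        if len(self.window) <= self.lag:
            return None          # not enough evidence to look `lag` steps back yet
        # advance the forward message past the oldest evidence in the window
        e_old = self.window.popleft()
        v = O[e_old] @ T.T @ self.f
        self.f = v / v.sum()
        # backward pass over the evidence remaining in the window
        b = np.ones(2)
        for ev in reversed(self.window):
            b = T @ O[ev] @ b
        s = self.f * b
        return s / s.sum()
```

Comparing this sliding-window version with the textbook's incremental-matrix version is a useful exercise: both return the same estimates, but the downloadable Fixed-Lag-Smoothing avoids recomputing the backward message from scratch.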

Updated November 17 2015 by FST Course Production Staff