Section 2: Exact and approximate inference in Bayesian networks

Commentary

Section Goals

  • To introduce exact inference in Bayesian networks based on enumeration and variable elimination.
  • To introduce randomized sampling algorithms, also called Monte Carlo algorithms, for approximate inference in Bayesian networks.

Learning Objectives

Learning Objective 1

  • Outline how to perform exact inference in Bayesian networks by enumeration (see the sketch after this list).
  • Explain the variable elimination algorithm, focusing on how it performs exact inference.
  • Explain the following concepts or terms:
    • Query variable
    • Evidence variable
    • Hidden variable
    • Variable elimination
    • Causal rule
    • Monte Carlo
    • Rejection sampling
    • Likelihood weighting
    • Markov chain Monte Carlo (MCMC)
    • Transition probability
    • Stationary distribution
    • Gibbs sampler
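
As a companion to the enumeration objective above, the following is a minimal Python sketch of inference by enumeration, in the spirit of the ENUMERATION-ASK procedure covered in the readings. The network structure, its probability values, and all names used here (network, ORDER, prob, enumerate_all, enumeration_ask) are illustrative assumptions for this sketch, not material taken from the textbook.

# Minimal inference-by-enumeration sketch on a small illustrative network:
# Cloudy -> Sprinkler, Cloudy -> Rain, {Sprinkler, Rain} -> WetGrass.
# Each CPT maps a tuple of parent values to P(variable = True | parents).
network = {
    "Cloudy":    {"parents": [],                    "cpt": {(): 0.5}},
    "Sprinkler": {"parents": ["Cloudy"],            "cpt": {(True,): 0.1, (False,): 0.5}},
    "Rain":      {"parents": ["Cloudy"],            "cpt": {(True,): 0.8, (False,): 0.2}},
    "WetGrass":  {"parents": ["Sprinkler", "Rain"], "cpt": {(True, True): 0.99, (True, False): 0.90,
                                                            (False, True): 0.90, (False, False): 0.0}},
}
ORDER = ["Cloudy", "Sprinkler", "Rain", "WetGrass"]  # a topological ordering

def prob(var, value, assignment):
    """P(var = value | parents(var)) under the current partial assignment."""
    parent_values = tuple(assignment[p] for p in network[var]["parents"])
    p_true = network[var]["cpt"][parent_values]
    return p_true if value else 1.0 - p_true

def enumerate_all(variables, assignment):
    """Sum over every unassigned variable, multiplying in each conditional probability."""
    if not variables:
        return 1.0
    first, rest = variables[0], variables[1:]
    if first in assignment:
        return prob(first, assignment[first], assignment) * enumerate_all(rest, assignment)
    total = 0.0
    for value in (True, False):
        extended = dict(assignment, **{first: value})
        total += prob(first, value, extended) * enumerate_all(rest, extended)
    return total

def enumeration_ask(query, evidence):
    """Return the normalized distribution P(query | evidence) over {True, False}."""
    unnormalized = {value: enumerate_all(ORDER, dict(evidence, **{query: value}))
                    for value in (True, False)}
    total = sum(unnormalized.values())
    return {value: p / total for value, p in unnormalized.items()}

# Example query: P(Rain | WetGrass = true)
print(enumeration_ask("Rain", {"WetGrass": True}))

Variable elimination computes the same posterior but avoids the repeated subcomputations visible in enumerate_all: it sums out hidden variables one at a time and stores the intermediate results as factors that are reused rather than recomputed.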

Objective Readings

Required readings:

Reading topics:

Exact and Approximate Inference in Bayesian Networks (see Sections 14.4–14.5 of AIMA3ed)

Objective Questions

  • How does likelihood weighting avoid the inefficiency of rejection sampling? (Compare with the sketch after this list.)
  • Why does MCMC work?
  • What are the differences between the common types of graphical models?
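
As a starting point for the first question, here is a minimal Python sketch that contrasts rejection sampling with likelihood weighting. The small chain network, its probability values, and the function names are assumptions made for this sketch, not taken from the readings.

import random

# Illustrative chain network: Cloudy -> Rain -> WetGrass.
# Each entry is (parent list, CPT mapping parent values to P(variable = True | parents)).
network = {
    "Cloudy":   ([],         {(): 0.5}),
    "Rain":     (["Cloudy"], {(True,): 0.8, (False,): 0.2}),
    "WetGrass": (["Rain"],   {(True,): 0.9, (False,): 0.1}),
}
ORDER = ["Cloudy", "Rain", "WetGrass"]  # a topological ordering

def p_true(var, sample):
    parents, cpt = network[var]
    return cpt[tuple(sample[p] for p in parents)]

def rejection_sampling(query, evidence, n):
    """Estimate P(query = True | evidence) by keeping only samples that agree with the evidence."""
    counts = {True: 0, False: 0}
    for _ in range(n):
        sample = {}
        for var in ORDER:
            sample[var] = random.random() < p_true(var, sample)
        # Samples inconsistent with the evidence are thrown away, which is
        # wasteful when the evidence itself is improbable.
        if all(sample[e] == v for e, v in evidence.items()):
            counts[sample[query]] += 1
    accepted = counts[True] + counts[False]
    return counts[True] / accepted if accepted else float("nan")

def likelihood_weighting(query, evidence, n):
    """Estimate P(query = True | evidence) by fixing evidence variables and weighting samples."""
    weights = {True: 0.0, False: 0.0}
    for _ in range(n):
        sample, w = {}, 1.0
        for var in ORDER:
            if var in evidence:
                sample[var] = evidence[var]
                p = p_true(var, sample)
                w *= p if evidence[var] else 1.0 - p
            else:
                sample[var] = random.random() < p_true(var, sample)
        weights[sample[query]] += w  # every sample contributes; none are rejected
    return weights[True] / (weights[True] + weights[False])

print(rejection_sampling("Rain", {"WetGrass": True}, 20000))
print(likelihood_weighting("Rain", {"WetGrass": True}, 20000))

In this sketch, rejection sampling discards every sample that contradicts the evidence, so the fraction of usable samples shrinks as the evidence becomes less likely; likelihood weighting instead clamps the evidence variables and down-weights each sample by the evidence likelihood, so no samples are wasted.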

Objective Activities

  • Explore the following probabilistic reasoning algorithms related to this section from the textbook's website.
    • Rejection-Sampling
    • Likelihood-Weighting
    • Gibbs-Ask (a Gibbs sampling sketch follows this list)
  • Complete Exercise 14.7 of AIMA3ed.
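
To complement the Gibbs-Ask activity, here is a minimal Python sketch of Gibbs sampling (a Markov chain Monte Carlo method) on a small illustrative network. The network, its probability values, the burn-in scheme, and the helper names are assumptions for this sketch rather than the textbook's pseudocode.

import random

# Illustrative network: Cloudy -> Sprinkler, Cloudy -> Rain, {Sprinkler, Rain} -> WetGrass.
# Each entry is (parent list, CPT mapping parent values to P(variable = True | parents)).
network = {
    "Cloudy":    ([],                    {(): 0.5}),
    "Sprinkler": (["Cloudy"],            {(True,): 0.1, (False,): 0.5}),
    "Rain":      (["Cloudy"],            {(True,): 0.8, (False,): 0.2}),
    "WetGrass":  (["Sprinkler", "Rain"], {(True, True): 0.99, (True, False): 0.90,
                                          (False, True): 0.90, (False, False): 0.0}),
}
CHILDREN = {"Cloudy": ["Sprinkler", "Rain"], "Sprinkler": ["WetGrass"],
            "Rain": ["WetGrass"], "WetGrass": []}  # children of each variable

def cond_prob(var, value, state):
    """P(var = value | parents(var)) under the current full state."""
    parents, cpt = network[var]
    p = cpt[tuple(state[parent] for parent in parents)]
    return p if value else 1.0 - p

def blanket_prob(var, value, state):
    """Unnormalized P(var = value | Markov blanket): own CPT entry times the children's CPT entries."""
    state = dict(state, **{var: value})
    result = cond_prob(var, value, state)
    for child in CHILDREN[var]:
        result *= cond_prob(child, state[child], state)
    return result

def gibbs_ask(query, evidence, n, burn_in=1000):
    """Estimate P(query = True | evidence) by resampling nonevidence variables one at a time."""
    nonevidence = [v for v in network if v not in evidence]
    state = dict(evidence)
    for var in nonevidence:                 # arbitrary initial state for nonevidence variables
        state[var] = random.random() < 0.5
    counts = {True: 0, False: 0}
    for i in range(n + burn_in):
        for var in nonevidence:             # resample each variable given its Markov blanket
            p_t = blanket_prob(var, True, state)
            p_f = blanket_prob(var, False, state)
            state[var] = random.random() < p_t / (p_t + p_f)
        if i >= burn_in:                    # discard early samples before the chain settles
            counts[state[query]] += 1
    return counts[True] / (counts[True] + counts[False])

print(gibbs_ask("Rain", {"Sprinkler": True, "WetGrass": True}, 20000))

Each step of the sketch resamples one nonevidence variable conditioned on its Markov blanket, so the resulting Markov chain has the posterior distribution given the evidence as its stationary distribution; the returned fraction therefore converges to the query probability as the number of samples grows.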

Updated November 17, 2015, by FST Course Production Staff