Different random initial HMMs


(Note: the videos below are provided in both QuickTime and MPEG formats; however, the QuickTime videos tend to be higher quality and are therefore recommended. If you do not have QuickTime installed, you can download the free player (Mac/Windows) from Apple.)
Below, we illustrate the convergence behavior of the Baum-Welch algorithm on the "god" training set of the speech recognition system discussed in class. The animation shows how three HMMs, each started from a different random initial parameter setting, converge to different locally maximal solutions over the same training set.
HMM convergence animation (400 x 180)

(The three HMMs are trained on the same data, but are started from different initial random parameter settings. The first frame corresponds to one iteration of the Baum-Welch algorithm.)

(high quality, QuickTime, 2.7 MB)

(lower quality, MPEG, 300 KB)
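
To make the experiment concrete, below is a minimal sketch of the same idea: train a discrete-observation HMM with Baum-Welch several times, each run starting from a different random initialization, and compare the log-likelihoods the runs converge to. This is only an illustrative sketch, not the course code: the baum_welch function, the toy random symbol sequence, and the state/symbol counts are all assumptions made for the example, whereas the actual system is trained on the "god" speech data.

import numpy as np

def baum_welch(obs, n_states, n_symbols, n_iter=50, seed=0):
    """Train a discrete-observation HMM with Baum-Welch from a random start.

    Illustrative sketch only (not the course implementation). obs is a 1-D
    integer array of observation symbols; returns the re-estimated parameters
    and the per-iteration log-likelihoods.
    """
    rng = np.random.default_rng(seed)
    # Random row-stochastic initial parameters (the "random initial HMM")
    A = rng.random((n_states, n_states));  A /= A.sum(axis=1, keepdims=True)
    B = rng.random((n_states, n_symbols)); B /= B.sum(axis=1, keepdims=True)
    pi = rng.random(n_states);             pi /= pi.sum()

    T = len(obs)
    log_likelihoods = []
    for _ in range(n_iter):
        # Forward pass with per-step scaling to avoid underflow
        alpha = np.zeros((T, n_states))
        scale = np.zeros(T)
        alpha[0] = pi * B[:, obs[0]]
        scale[0] = alpha[0].sum(); alpha[0] /= scale[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            scale[t] = alpha[t].sum(); alpha[t] /= scale[t]
        log_likelihoods.append(np.log(scale).sum())  # log P(obs | model)

        # Backward pass using the same scaling constants
        beta = np.zeros((T, n_states))
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]

        # E-step: state occupancy and transition posteriors
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)
        xi_sum = np.zeros((n_states, n_states))
        for t in range(T - 1):
            xi = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :]
            xi_sum += xi / xi.sum()

        # M-step: re-estimate pi, A, B from the posteriors
        pi = gamma[0]
        A = xi_sum / gamma[:-1].sum(axis=0)[:, None]
        for k in range(n_symbols):
            B[:, k] = gamma[obs == k, :].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]

    return A, B, pi, log_likelihoods


if __name__ == "__main__":
    # Toy stand-in for a real training set: a random symbol sequence.
    data = np.random.default_rng(42).integers(0, 8, size=400)
    # Three different random initializations, analogous to the animation above.
    for seed in range(3):
        *_, lls = baum_welch(data, n_states=5, n_symbols=8, seed=seed)
        print(f"initialization {seed}: final log-likelihood = {lls[-1]:.2f}")

Running a sketch like this typically prints three different final log-likelihoods, mirroring the local-maximum behavior shown in the animation above.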

Last updated October 21, 2003 by Michael C. Nechyba