Newsgroups: comp.speech
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!uunet!in1.uu.net!news.ssd.intel.com!ornews.intel.com!chnews!ennews!trcsun3!deisher
From: deisher@trcsun3.eas.asu.edu (Michael E. Deisher)
Subject: Baum-Welch algorithm convergence
Keywords: HMM training Baum-Welch EM convergence rate
Message-ID: <D9tC1I.8u3@ennews.eas.asu.edu>
Sender: news@ennews.eas.asu.edu (USENET News System)
Organization: Arizona State University
Date: Wed, 7 Jun 1995 17:17:41 GMT
X-Nntp-Posting-Host: enws125.eas.asu.edu
Lines: 19

Hi.  I'm using the Baum-Welch (EM) algorithm to train a
continuous-Gaussian-mixture-density HMM (isn't everybody? :-)  In
practice, it converges very fast (in less than 10 iterations) for a
variety of model sizes/topologies.
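[For anyone who wants to reproduce the effect: below is a minimal sketch, not Mike's actual setup. His models are continuous Gaussian-mixture HMMs; this toy uses a small *discrete*-emission HMM on made-up data, in plain Python, so the EM update structure is visible. It runs scaled forward-backward re-estimation for 10 iterations and records the log-likelihood each time; EM guarantees that sequence never decreases, and on toy data it typically flattens out within a handful of iterations, consistent with what Mike reports.]

```python
# Hypothetical sketch (not the poster's code): Baum-Welch for a small
# discrete-emission HMM, tracking per-iteration log-likelihood so the
# convergence rate can be inspected.  A continuous Gaussian-mixture HMM
# would replace the emission table B with mixture densities, but the
# E-step/M-step structure is the same.
import math, random

random.seed(0)
N, M = 2, 3                                       # states, output symbols
obs = [random.randrange(M) for _ in range(200)]   # toy observation sequence

def rand_dist(k):
    """A random strictly-positive probability vector of length k."""
    w = [random.random() + 0.1 for _ in range(k)]
    s = sum(w)
    return [x / s for x in w]

pi = rand_dist(N)                    # initial-state distribution
A  = [rand_dist(N) for _ in range(N)]  # transition matrix (row-stochastic)
B  = [rand_dist(M) for _ in range(N)]  # emission table (row-stochastic)

def baum_welch_iter(pi, A, B, obs):
    """One EM re-estimation pass; returns new parameters and the
    log-likelihood of obs under the *old* parameters."""
    T = len(obs)
    # Forward pass with per-step scaling (avoids underflow for long T)
    alpha = [[0.0] * N for _ in range(T)]
    scale = [0.0] * T
    for i in range(N):
        alpha[0][i] = pi[i] * B[i][obs[0]]
    scale[0] = sum(alpha[0])
    alpha[0] = [a / scale[0] for a in alpha[0]]
    for t in range(1, T):
        for j in range(N):
            alpha[t][j] = B[j][obs[t]] * sum(alpha[t-1][i] * A[i][j]
                                             for i in range(N))
        scale[t] = sum(alpha[t])
        alpha[t] = [a / scale[t] for a in alpha[t]]
    loglik = sum(math.log(c) for c in scale)
    # Backward pass, scaled consistently with the forward pass
    beta = [[1.0] * N for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for i in range(N):
            beta[t][i] = sum(A[i][j] * B[j][obs[t+1]] * beta[t+1][j]
                             for j in range(N)) / scale[t+1]
    # E-step: state posteriors gamma_t(i)
    gamma = [[alpha[t][i] * beta[t][i] for i in range(N)] for t in range(T)]
    for t in range(T):
        s = sum(gamma[t])
        gamma[t] = [g / s for g in gamma[t]]
    # M-step: re-estimate pi, A, B from expected counts
    new_pi = gamma[0][:]
    new_A = [[0.0] * N for _ in range(N)]
    for i in range(N):
        denom = sum(gamma[t][i] for t in range(T - 1))
        for j in range(N):
            num = sum(alpha[t][i] * A[i][j] * B[j][obs[t+1]]
                      * beta[t+1][j] / scale[t+1] for t in range(T - 1))
            new_A[i][j] = num / denom
    new_B = [[0.0] * M for _ in range(N)]
    for i in range(N):
        denom = sum(gamma[t][i] for t in range(T))
        for k in range(M):
            new_B[i][k] = sum(gamma[t][i] for t in range(T)
                              if obs[t] == k) / denom
    return new_pi, new_A, new_B, loglik

logliks = []
for _ in range(10):
    pi, A, B, ll = baum_welch_iter(pi, A, B, obs)
    logliks.append(ll)
# EM guarantees the log-likelihood is non-decreasing across iterations
assert all(b >= a - 1e-6 for a, b in zip(logliks, logliks[1:]))
print("log-likelihood per iteration:", ["%.2f" % l for l in logliks])
```

[The scaling constants double as the log-likelihood, which is the usual way to monitor convergence without underflow. This doesn't answer the *why* of the question, of course; it just gives a self-contained testbed for watching the rate.]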

Are there any good references that address the rate of convergence of
this algorithm, perhaps showing analytically why it is (typically) so
fast?  If not, do any of you have intuition as to why this happens?

--Mike

 ==============================================================================
  |  Mike Deisher                                  Arizona State University  |
  |  deisher@dspsun.eas.asu.edu          Telecommunications Research Center  |
  |  voice:  (602) 965-0396                    Signal Processing Laboratory  |
  |  fax:    (602) 965-8325                           Tempe, AZ  85287-7206  |
 ==============================================================================
                     If Murphy's Law can go wrong it will.

