Newsgroups: comp.ai
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!news.starnet.net!wupost!newspump.wustl.edu!news.ecn.bgu.edu!siemens!princeton!tucson.Princeton.EDU!ijchang
From: ijchang@tucson.Princeton.EDU (Isaac Joel Chang)
Subject: Hidden Markov Models
Message-ID: <1995Jan10.022348.2386@Princeton.EDU>
Originator: news@hedgehog.Princeton.EDU
Sender: news@Princeton.EDU (USENET News System)
Nntp-Posting-Host: tucson.princeton.edu
Reply-To: ijchang@tucson.Princeton.EDU (Isaac Joel Chang)
Organization: Princeton University
Date: Tue, 10 Jan 1995 02:23:48 GMT
Lines: 15

I'm working on implementing learning algorithms for HMMs.  The particular 
automaton I'm looking at now has epsilon (null-output) edges, and the Baum-Welch
algorithm (a.k.a. forward-backward, an instance of expectation-maximization)
chokes when having to deal with them.  I know I can convert the
non-deterministic automaton to a deterministic one via the power-set
construction, but I'd rather avoid the exponential blow-up in states if I can.
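The kind of fix I have in mind looks something like this -- a minimal sketch
of a forward pass that absorbs epsilon edges directly, rather than
determinizing first.  This is my own sketch, not anything from the
literature: it assumes the epsilon subgraph is acyclic and that states are
numbered in topological order w.r.t. null edges (eps_trans[i][j] == 0 unless
i < j), so one left-to-right sweep closes all null paths within a time step.
All names are illustrative.

```python
def forward(n_states, emit_trans, eps_trans, emit_prob, obs, start):
    """Forward probabilities for an HMM with null (epsilon) transitions.

    emit_trans[i][j] -- prob. of an i->j transition that consumes a symbol
    eps_trans[i][j]  -- prob. of an i->j null transition (zero unless i < j,
                        i.e. the epsilon subgraph is acyclic and topologically
                        ordered -- an assumption of this sketch)
    emit_prob[j]     -- dict: prob. that a symbol is emitted on entering j
    obs              -- observation sequence
    start[i]         -- initial state probabilities
    """
    # alpha[i]: prob. of reaching state i having consumed the symbols so far
    alpha = list(start)
    # close epsilon paths before the first symbol is consumed
    for i in range(n_states):
        for j in range(i + 1, n_states):
            alpha[j] += alpha[i] * eps_trans[i][j]
    for o in obs:
        new = [0.0] * n_states
        # emitting transitions consume exactly one symbol
        for i in range(n_states):
            for j in range(n_states):
                new[j] += alpha[i] * emit_trans[i][j] * emit_prob[j].get(o, 0.0)
        # then propagate mass along null edges within the same time step
        for i in range(n_states):
            for j in range(i + 1, n_states):
                new[j] += new[i] * eps_trans[i][j]
        alpha = new
    return alpha
```

The backward pass would be symmetric (epsilon closure swept in reverse
topological order), and the re-estimation counts would have to credit null
transitions separately.  If the epsilon subgraph has cycles, the closure is
no longer a single sweep -- you'd have to solve a small linear system per
time step instead.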

I've been told there's a modification of Baum-Welch that can handle epsilon
edges, but I haven't been able to find it.  I've mostly been using the text 
"Hidden Markov Models for Speech Recognition" by Huang, Ariki and Jack, but I 
can't find it in there.  Any pointers to papers, texts, etc. would be greatly 
appreciated.
