Newsgroups: comp.speech
Path: lyra.csx.cam.ac.uk!warwick!pipex!howland.reston.ans.net!agate!msuinfo!uchinews!iitmax!thssdxn
From: thssdxn@iitmax.iit.edu (Dibyendu Nandy)
Subject: Re: "Markov Sources" language models
Message-ID: <1994Apr13.231019.24306@iitmax.iit.edu>
Organization: Illinois Institute of Technology, Chicago
References: <2obhtt$doo@lyra.csx.cam.ac.uk>
Date: Wed, 13 Apr 94 23:10:19 GMT
Lines: 31

In article <2obhtt$doo@lyra.csx.cam.ac.uk> rjp1006@eng.cam.ac.uk (R.J. Pocock) writes:
>I am investigating alternatives to the standard N-gram language model, and have
>heard mention of "Markov Sources", which were described as word sequences 
>generated by a fully connected hidden MM. 
>
>I haven't been able to find out any more than this. Can anybody help ? I would
>particularly appreciate pointers to references.
>
>cheers,
>Rob Pocock
> 
>

The following is an excellent tutorial:
Levinson, S.E., Rabiner, L.R., and Sondhi, M.M.,
"An Introduction to the Application of the Theory of Probabilistic
Functions of a Markov Process to Automatic Speech Recognition",
The Bell System Technical Journal, Vol. 62, No. 4, April 1983.
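
For intuition, a "Markov source" in this sense is just an HMM whose hidden
states emit words: you walk the (fully connected) state chain and sample a
word from each state's output distribution. Here is a minimal sketch; the
states, transition probabilities, and toy vocabulary are my own made-up
illustration, not anything from the paper above:

```python
import random

random.seed(0)

# Fully connected: every state can follow every state
# (all transition probabilities are nonzero).
trans = {
    "A": {"A": 0.3, "B": 0.7},
    "B": {"A": 0.8, "B": 0.2},
}

# Per-state emission distributions over words (toy vocabulary).
emit = {
    "A": {"speech": 0.5, "model": 0.5},
    "B": {"recognizes": 0.6, "generates": 0.4},
}

def sample(dist):
    """Draw one key from a {key: probability} dict."""
    r = random.random()
    total = 0.0
    for k, p in dist.items():
        total += p
        if r < total:
            return k
    return k  # guard against floating-point round-off

def generate(n, state="A"):
    """Emit n words; the state sequence itself stays hidden."""
    words = []
    for _ in range(n):
        words.append(sample(emit[state]))
        state = sample(trans[state])
    return words

print(" ".join(generate(6)))
```

An observer sees only the word sequence, never the states, which is what
distinguishes this from an ordinary N-gram model over the words themselves.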

regards
Dibyendu
-- 
302 Seigel Hall	Ph# (312)567-3407
Department of ECE, IIT, Chicago, Il 60616.
Internet: dnandy@hezi.ece.iit.edu	dnandy@chitra.ece.iit.edu
	  thssdxn@iitmax.acc.iit.edu 	nanddib@karl.acc.iit.edu
	 	nanddib@elof.acc.iit.edu (NeXTmail)
------------------------------------------------------------------------
Fortune for the day : 
Good night to spend with family, but avoid arguments with your mate's
new lover.
