Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!pipex!uunet!timbuk.cray.com!walter.cray.com!mwd
From: mwd@cray.com (Mark Dalton)
Subject: Re: What is "simulated annealing"?
Message-ID: <1994Dec26.142911.10058@walter.cray.com>
Lines: 23
Nntp-Posting-Host: pajarito.cray.com
X-Newsreader: TIN [version 1.2 021193BETA PL3-CRIb]
References: <3bfskb$quh@newssvr.cacd.rockwell.com> <19941118130823.BKAMP@pi0192.kub.nl> <3cbn3q$s3s@cantaloupe.srv.cs.cmu.edu>
Date: 26 Dec 94 14:29:11 CST

Scott Fahlman (sef@CS.CMU.EDU) wrote:
: In article <3c3ddq$4u8@newssvr.cacd.rockwell.com> trhope@cacd.rockwell.com (THOMAS R HOPE) writes:
:    If I start with a slow learning rate, and just leave it there, will it
:    eventually converge or will it fall into a local minimum?  
: It will eventually converge, but might take a VERY long time.  The
: point of annealing is that the higher temperatures allow you to find
: some good neighborhood in a reasonable time, and then you cool down
: to find a good minimum within that neighborhood.
: -- Scott

There is also Adaptive Simulated Annealing (ASA).  You can get more
information about it at:
	http://alumni.caltech.edu/~ingber/
	or
	ftp://ftp.alumni.caltech.edu/pub/ingber/

	The source code and papers are available at either site.

Mark 
------
Mark Dalton
Cray Research, Inc.
mwd@cray.com
