Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!uunet!in1.uu.net!olivea!charnel.ecst.csuchico.edu!csusac!csus.edu!news.ucdavis.edu!sunnyboy.water.ca.gov!sunnyboy!nsandhu
From: nsandhu@venice.water.ca.gov (Nicky Sandhu)
Subject: Re: testing network with SNNS
In-Reply-To: borisov@cs.wmich.edu's message of Sat, 11 Mar 1995 18:57:20 GMT
Message-ID: <NSANDHU.95Mar13102049@grizzly.water.ca.gov>
Sender: news@sunnyboy.water.ca.gov
Organization: Calif. Dept. of Water Resources
References: <1995Mar11.185720.24658@sol.cs.wmich.edu>
Date: Mon, 13 Mar 1995 18:20:49 GMT
Lines: 18

>>>>> Regarding testing network with SNNS; borisov@cs.wmich.edu (Slava Borisov) adds:

Slava> Hi
Slava> I've got a problem with SNNS which should have a very simple
Slava> solution, I think.
	It is.
Slava> Suppose I trained my network with 25 patterns. How can I
Slava> see what will be produced if I feed a network with a pattern,
Slava> different from those 25 ? In other words, how can I test
Slava> the network's behavior on 'unseen' patterns ?
	First create a second pattern file containing your test patterns.
Then load that pattern file along with the trained network. The remote
panel must be open and must show the current pattern set to be the
test pattern set. Open the file panel, click on the result file
button, give the result file a name, and save it. This writes the
network's output for the test patterns to the result file. You can
also check the network's performance with the analyzer menu button.
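	For reference, a test pattern file uses the same format as a
training pattern file. A minimal sketch for a net with 3 inputs and 1
output might look like the following (the exact header lines can vary
between SNNS versions, so it is safest to copy the header from your
existing training pattern file and just change the counts and values):

	SNNS pattern definition file V3.2
	generated at Mon Mar 13 10:00:00 1995

	No. of patterns : 2
	No. of input units : 3
	No. of output units : 1

	# Input pattern 1:
	0 1 0
	# Output pattern 1:
	1
	# Input pattern 2:
	1 0 1
	# Output pattern 2:
	0

The output values here are only placeholders; for truly unseen
patterns you can put dummy target values in the output slots, since
you only care about what the network produces in the result file.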
