Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!europa.eng.gtefsd.com!howland.reston.ans.net!cs.utexas.edu!utnut!wave.scar!93funkst
From: 93funkst@scar.utoronto.ca (FUNK  STEVEN LESLIE,,Student Account)
Subject: Hopfield Storage Capacity
Message-ID: <Cxvr13.Gyo@wave.scar.utoronto.ca>
Sender: usenet@wave.scar.utoronto.ca
Nntp-Posting-Host: wave.scar.utoronto.ca
Reply-To: 93funkst@wave.scar.utoronto.ca
Organization: University of Toronto - Scarborough College
References: <1994Oct17.230541.9773@Princeton.EDU>
Date: Tue, 18 Oct 1994 17:53:26 GMT
Lines: 9

Hi,

	I've been rereading Hopfield's 1982 article, and I'm a little confused.  It seems he tested a 30-unit system with 5 stored patterns and found that it settled into a nominal state 85% of the time.  But he describes the system as having 10 nominal states: the original patterns and their inverses.  Does this mean that a 30-unit system storing 5 patterns recalls one of the original 5 patterns only about 43% of the time (roughly half of 85%, if settlings split evenly between patterns and inverses)?

	On the same page there's a frequency distribution of the number of errors.  According to this, a 100-unit system can store only about 5 patterns and recall them completely error-free; 15 patterns in a 100-unit system shows a rather high error rate.  So it seems the Hopfield net has a really low storage capacity.  Further, the ability to correct/complete a pattern is limited by the storage of the original pattern together with its inverse.  Is all of this correct?  I was under the impression that the network's performance was much better.
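For reference, a minimal sketch (in modern NumPy; hypothetical parameter choices, not Hopfield's exact setup) of the kind of experiment described: store 5 random +/-1 patterns in a 30-unit net via the outer-product (Hebbian) rule, start from noisy versions, update asynchronously, and count how often the net settles into a "nominal" state, i.e. a stored pattern or its inverse.  The noise level and trial count here are assumptions; the 85% figure is Hopfield's, not something this sketch claims to reproduce exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, TRIALS, NOISE = 30, 5, 200, 5   # units, patterns, trials, bits flipped

# Random +/-1 patterns and Hebbian (outer-product) weights, zero diagonal.
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def settle(state, max_sweeps=50):
    """Asynchronous threshold updates until no unit changes (or sweep limit)."""
    state = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(N):
            new = 1 if W[i] @ state >= 0 else -1
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:
            break
    return state

# Count settlings into a nominal state (a stored pattern or its inverse).
nominal = 0
for _ in range(TRIALS):
    start = patterns[rng.integers(P)].copy()
    flip = rng.choice(N, size=NOISE, replace=False)
    start[flip] *= -1                     # corrupt NOISE of the N bits
    final = settle(start)
    if any((final == q).all() or (final == -q).all() for q in patterns):
        nominal += 1

print(f"settled in a nominal state in {nominal}/{TRIALS} trials")
```

Counting a settled inverse separately from a settled original pattern would answer the 43% question directly: just split the final `if` into two counters.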

Steven 
93funkst@wave.scar.utoronto.ca

