Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!europa.eng.gtefsd.com!howland.reston.ans.net!cs.utexas.edu!utnut!wave.scar!93funkst
From: 93funkst@scar.utoronto.ca (FUNK  STEVEN LESLIE,,Student Account)
Subject: Re: Maximum Storage Capacity
Message-ID: <Cx88qL.4u3@wave.scar.utoronto.ca>
Sender: usenet@wave.scar.utoronto.ca
Nntp-Posting-Host: wave.scar.utoronto.ca
Reply-To: 93funkst@scar.utoronto.ca
Organization: University of Toronto - Scarborough College
References: <lvh.781380034@news.charm.net>
Date: Thu, 6 Oct 1994 01:13:32 GMT
Lines: 31

In article <lvh.781380034@news.charm.net>, lvh@charm.net (Larrie V. Hutton) writes:
> 93funkst@scar.utoronto.ca (FUNK  STEVEN LESLIE,,Student Account) writes:
> 
> >Hi,
> >	Well, about a week ago I asked if people could give me some idea of what the maximum storage capacity of a neural network would be.  I got no response, so I'm going to try again with a slightly different approach.  Here's the challenge:  I've got 10 randomly generated patterns stored in a 16 unit memory, consistently with success on 98% of the attempts.  Is there anyone out there that can beat this?  If so let me know.  If you can come close, let me know.  If you think this is a ridiculous claim, and that I've done the best that anybody can, let me know.  Thanks
> 
> >I hope this gets me some response
> >Steven
> 
> How are you getting a 98% success rate with only 10 patterns?
> 
> Something doesn't compute.


Well, it's just a hypothetical case.  The criterion that I set down (if you've got a better one, please let me know) is to succeed in storing X number of memories in 98% of the attempts.  I figured that some small degree of flexibility might be nice, since randomly generated patterns might sooner or later come up with an unusually difficult set to learn.  So in the case described above, I'm looking at storing 10 randomly generated 16-bit patterns in the memory.  The criterion is that all 10 patterns be stable in 98% of the trials, where a trial is one attempt to encode a set of 10 stable patterns in the memory.

Being somewhat minimaphobic, I'm fooling around with the idea of storing memories in a kind of suspended state within a dynamic system.  It's difficult to explain, but if I don't find any (more) bugs in the program that I'm using to test the idea, then the system might be able to beat the whole linear separability thing.

My original question (perhaps phrased vaguely the first time) is basically just: what is the maximum storage capacity of a network?  Of various kinds of networks?  Under various learning rules, or under other conditions?  I thought that throwing a number out might help to stimulate some conversation, and it did.  Thanks for the question; I hope I've cleared it up, and I'll be anxiously awaiting a response.  Thanks.
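Just to make the criterion concrete, here is a rough sketch of the stability test on a plain Hopfield net with the standard Hebbian (outer-product) rule.  This is only the textbook reference point, not the dynamic-system scheme I mentioned, and the code is a sketch (I'm assuming bipolar patterns and a one-step synchronous stability check), so take the details with a grain of salt:

```python
import numpy as np

def trial(n_units=16, n_patterns=10, rng=None):
    """One trial: store random +/-1 patterns with the Hebb rule,
    return True iff every stored pattern is a fixed point."""
    if rng is None:
        rng = np.random.default_rng()
    pats = rng.choice([-1, 1], size=(n_patterns, n_units))
    # Outer-product (Hebbian) weights, with self-connections zeroed.
    W = pats.T @ pats / n_units
    np.fill_diagonal(W, 0)
    # A pattern counts as stable if one synchronous update of every
    # unit leaves it unchanged (ties, sign == 0, count as unstable).
    return all(np.array_equal(np.sign(W @ p), p) for p in pats)

rng = np.random.default_rng(0)
trials = 1000
successes = sum(trial(rng=rng) for _ in range(trials))
print(f"all-stable rate: {successes / trials:.1%}")
```

The standard rule saturates around 0.138N patterns (only about 2 for a 16 unit net), so this baseline essentially never holds all 10 patterns stable, which is exactly why a scheme that meets the 98% criterion at 10 patterns would be interesting.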

Steven
93funkst@wave.scar.utoronto.ca


