Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!news.duke.edu!convex!cs.utexas.edu!utnut!wave.scar!93funkst
From: 93funkst@scar.utoronto.ca (FUNK  STEVEN LESLIE,,Student Account)
Subject: Re: Maximum Storage Capacity
Message-ID: <Cx89HC.5HG@wave.scar.utoronto.ca>
Sender: usenet@wave.scar.utoronto.ca
Nntp-Posting-Host: wave.scar.utoronto.ca
Reply-To: 93funkst@scar.utoronto.ca
Organization: University of Toronto - Scarborough College
References: <36vbtj$44n@scapa.cs.ualberta.ca>
Date: Thu, 6 Oct 1994 01:29:36 GMT
Lines: 49

In article <36vbtj$44n@scapa.cs.ualberta.ca>, arms@cs.ualberta.ca (Bill Armstrong) writes:
> >93funkst@scar.utoronto.ca (FUNK  STEVEN LESLIE,,Student Account) writes:
> 
> >> Well, about a week ago I asked if people could give me some idea of
> >> what the maximum storage capacity of a neural network would be.  I got
> >> no response, so I'm going to try again with a slightly different
> >> approach.  Here's the challenge: I've got 10 randomly generated
> >> patterns stored in a 16 unit memory, consistently with success on 98%
> >> of the attempts.  Is there anyone out there that can beat this?  If so
> >> let me know.  If you can come close, let me know.  If you think this
> >> is a ridiculous claim, and you've done the best that anybody can, let
> >> me know.
> 
> >>I hope this gets me some response
> 
> The answer is there is no limit.  Kolmogorov's theorem says you can
> fit any continuous function to any degree of precision as long as you
> take a large enough net.  Your claim is not ridiculous, it's just that
> sometimes it's hard for the system to absorb the information.  With enough
> nodes and a general interconnection scheme with enough levels, you can
> store however much information you want.
> 
> Before we all start competing let's get the problem straight.  How
> about n boolean vectors of b bits, randomly generated, and the n
> boolean responses are randomly assigned.  Or do you want to have
> uniformly distributed reals in [0,1]?  What is classed as a correct
> response if the problem involves reals (i.e. floats)?  What constraints
> do you want on the net?
> 
> --
> ***************************************************
> Prof. William W. Armstrong, Computing Science Dept.
> University of Alberta; Edmonton, Alberta, Canada T6G 2H1
> arms@cs.ualberta.ca Tel(403)492 2374 FAX 492 1071


Okay,

	Well, let's (for the sake of discussion) say that we're going to be given a memory of 64 bits to work with.  The bits are binary and the patterns are all randomly generated, with no tweaking for orthogonality or any of that stuff.  As you say, it's a good idea to define the problem first, so let's try the conditions I've just outlined: a binary system with a 64-bit memory.  That means the maximum number of memories possible (in the best of all possible worlds) would be the full set of 2^64 patterns.

	Now I could be wrong, but I read somewhere that linear separability constrains the number of stable memories to 2N.  What I'm wondering is: under the best possible conditions (please define them), what is the maximum storage capacity of said system?  I've heard from several different sources that about 14% of the system size in units (0.138N is the figure usually quoted) is the limit for a Hopfield net.  What about a Boltzmann machine, or some other kind of network?  Thanks for helping to clear this up.
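For what it's worth, the Hopfield figure is easy to check empirically.  Here's a small sketch in Python/NumPy (my own illustration, not anything from the thread; the network size, pattern count, and seed are arbitrary choices) that stores random bipolar patterns with the Hebbian outer-product rule and counts how many of them are stable fixed points of the recall dynamics:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64   # units, one per bit of "memory"
P = 8    # patterns to store; P/N = 0.125, just under the 0.138N limit

# Random bipolar (+1/-1) patterns, no tweaking for orthogonality.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product learning rule, zero self-connections.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Synchronously update until a fixed point (or the step limit)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1          # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# A stored pattern counts as "stable" if recall returns it unchanged.
stable = sum(np.array_equal(recall(p.copy()), p) for p in patterns)
print(f"{stable}/{P} stored patterns are stable")
```

Cranking P well past 0.138*N should make the stable count fall off sharply, which is the capacity limit in action.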

Steven
93funkst@wave.scar.utoronto.ca

PS:  From a theoretical/mathematical point of view, must all memories be stored in minima?  I'm kind of a minimaphobic, so I've been fooling around with the idea of suspending memories in a dynamic space.  I'm sorry if this, like everything else, seems vague, but it's been a long summer, with GREs on Friday.

 

