Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!yeshua.marcam.com!zip.eecs.umich.edu!newsxfer.itd.umich.edu!gatech!psuvax1!news.ecn.bgu.edu!siemens!princeton!handel!jtoth
From: jtoth@handel.Princeton.EDU (Gabor J.Toth)
Subject: Re: Hopfield Storage Capacity
Message-ID: <1994Oct19.231519.3819@Princeton.EDU>
Originator: news@hedgehog.Princeton.EDU
Sender: news@Princeton.EDU (USENET News System)
Nntp-Posting-Host: handel.princeton.edu
Organization: Princeton University
References: <1994Oct17.230541.9773@Princeton.EDU> <Cxvr13.Gyo@wave.scar.utoronto.ca> <mike.782505122@motion>
Date: Wed, 19 Oct 1994 23:15:19 GMT
Lines: 12

In article <mike.782505122@motion>, Mike Dawson <mike@psych.ualberta.ca> 
wrote:

>In an intro book to neural nets written by some physicists (unfortunately
>I don't have it with me as I write), I've seen proofs that a Hopfield
>net can only hold about .14 M patterns accurately, where M is the number
>of processors in the net.

You may be thinking of the book by Tamas Geszti: his book does contain
the proof, and he is indeed a physicist. Unfortunately, I do not have
the reference either...
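Incidentally, the 0.14*M figure (the capacity alpha_c ~= 0.138 derived
by the statistical-mechanics people) is easy to check empirically with a
toy simulation. A minimal sketch of my own (not from any book) -- store
random +/-1 patterns with the standard Hebb rule and count how many are
exact fixed points of the synchronous update:

```python
# Empirical check of the ~0.14*M Hopfield capacity (a sketch, not a proof).
# Patterns are random +/-1 vectors stored with the Hebbian rule
#   W_ij = (1/N) * sum_mu x_i^mu x_j^mu,   W_ii = 0.
import random

def train(patterns, n):
    """Build the Hebbian weight matrix for a list of +/-1 patterns."""
    w = [[0.0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += x[i] * x[j] / n
    return w

def is_fixed_point(w, x, n):
    """A stored pattern is recalled exactly iff sign(W x) == x."""
    for i in range(n):
        h = sum(w[i][j] * x[j] for j in range(n))
        if (1 if h >= 0 else -1) != x[i]:
            return False
    return True

def stable_fraction(n, p, seed=0):
    """Fraction of p random patterns that are exact fixed points."""
    rng = random.Random(seed)
    pats = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(p)]
    w = train(pats, n)
    return sum(is_fixed_point(w, x, n) for x in pats) / p

if __name__ == "__main__":
    n = 100
    # p = 5 is well below 0.14*n, p = 40 well above it; the stable
    # fraction should drop sharply between the two loads.
    print("p=5 :", stable_fraction(n, 5))
    print("p=40:", stable_fraction(n, 40))
```

With M = 100 units you should see essentially all patterns stable at
p = 5, and most of them corrupted by crosstalk noise at p = 40. (Note
this only tests for exact fixed points; near the capacity limit the
recalled patterns are still close to the stored ones, just not perfect.)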

