From newshub.ccs.yorku.ca!torn!cs.utexas.edu!zaphod.mps.ohio-state.edu!pacific.mps.ohio-state.edu!linac!mp.cs.niu.edu!rickert Wed Aug 12 16:52:11 EDT 1992
Article 6550 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!zaphod.mps.ohio-state.edu!pacific.mps.ohio-state.edu!linac!mp.cs.niu.edu!rickert
From: rickert@mp.cs.niu.edu (Neil Rickert)
Subject: Re: Memory and store/retrieve.
Message-ID: <1992Aug3.153944.14859@mp.cs.niu.edu>
Organization: Northern Illinois University
References: <BILL.92Jul31195028@ca3.nsma.arizona.edu> <1992Aug1.132812.12457@mp.cs.niu.edu> <BILL.92Aug2121827@ca3.nsma.arizona.edu>
Date: Mon, 3 Aug 1992 15:39:44 GMT
Lines: 14

In article <BILL.92Aug2121827@ca3.nsma.arizona.edu> bill@nsma.arizona.edu (Bill Skaggs) writes:

>The real question, I think, is whether the brain uses attractor neural
>networks (such as, for example, Hopfield nets) for memory.  If it
>does, then creation of a new attractor is naturally described as a
>"storage" operation, and convergence to an existing attractor is
>naturally described as "retrieval".

  I don't think it is that simple.  If a "fact" is represented by a
single attractor, your statement makes sense.  But what seems to us to
be a single fact may actually be represented as thousands of smaller
"factoids", and their attractors might not all be created at the same
time.
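
The store/retrieve picture under discussion can be made concrete with a
minimal Hopfield-style sketch (an illustration of the mechanism, not a
claim about the brain; the pattern choices and sizes here are arbitrary).
Each call to add_pattern carves a new attractor into the weights --
"storage" -- and attractors can be added at different times, as with the
separately-created factoids suggested above.  retrieve is convergence
from a noisy cue to the nearest attractor.

```python
import numpy as np

def add_pattern(W, p):
    """Hebbian outer-product update: creates a new attractor at p."""
    W = W + np.outer(p, p)
    np.fill_diagonal(W, 0)   # no self-connections
    return W

def retrieve(W, cue, steps=20):
    """Synchronous sign updates until a fixed point (attractor) is reached."""
    s = cue.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

# Two orthogonal +/-1 patterns, stored at different times.
a = np.array([1] * 8 + [-1] * 8)
b = np.tile([1, -1], 8)

W = np.zeros((16, 16))
W = add_pattern(W, a)        # first attractor created now...
W = add_pattern(W, b)        # ...second one added later

cue = a.copy()
cue[:2] = -cue[:2]           # corrupt two bits of pattern a
print(np.array_equal(retrieve(W, cue), a))   # converges back to a
```

With orthogonal patterns the corrupted cue falls back into the attractor
for a in one update; crosstalk between attractors is what limits how many
such factoids a single network of this kind can hold.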
