Newsgroups: comp.ai.genetic,comp.ai.fuzzy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!satisfied.apocalypse.org!news.mathworks.com!uhog.mit.edu!news.media.mit.edu!minsky
From: minsky@media.mit.edu (Marvin Minsky)
Subject: Re: Financial Neural Networks Home Page
Message-ID: <1995Mar14.223357.10794@news.media.mit.edu>
Sender: news@news.media.mit.edu (USENET News System)
Cc: minsky
Organization: MIT Media Laboratory
References: <3itdu2$fon@money.eng.warwick.ac.uk> <3iu8a7$kou@ixnews2.ix.netcom.com> <ssmith.19.0019FB13@mit.edu>
Date: Tue, 14 Mar 1995 22:33:57 GMT
Lines: 24
Xref: glinda.oz.cs.cmu.edu comp.ai.genetic:5224 comp.ai.fuzzy:4205

In article <ssmith.19.0019FB13@mit.edu> ssmith@mit.edu (Scott Smith) writes:
>In article <3iu8a7$kou@ixnews2.ix.netcom.com> mpole@ix.netcom.com (Mark 
>Polansky) writes:
>>I'd be very interested.  Neural networks seems to be fairly new, so I 
>
>Umm, NN's have been around QUITE a while...  Adaline was invented around 1959, 
>Perceptrons where from around 1969.  But unfortunately this technology wasn't 
>accepted until much much later.  Also, I'm not sure when Back Propagation was 
>created, but I think it is much more recently.

Freud proposed some analog-with-threshold network ideas around 1895.
McCulloch and Pitts 1943 was the most significant early paper, but
there were a number of others published in that era, in Bulletin of
Mathematical Biophysics.  I designed some NN learning machines around
1949 and built the first working one in 1951; it is described in my
1953 thesis (Princeton), "Neural-Analog Networks and the Brain-Model
Problem." The most ambitious theory was that of Donald Hebb, in
"The Organization of Behavior", 1949.  In the early 1950s there
were a variety of other NN models, including some simulated on
computers, but none of them learned enough to be interesting until
Rosenblatt's work beginning in 1957 and published in 1959.



