From newshub.ccs.yorku.ca!torn!cs.utexas.edu!qt.cs.utexas.edu!yale.edu!jvnc.net!darwin.sura.net!haven.umd.edu!uunet!trwacs!erwin Thu Oct  8 10:10:14 EDT 1992
Article 7026 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!qt.cs.utexas.edu!yale.edu!jvnc.net!darwin.sura.net!haven.umd.edu!uunet!trwacs!erwin
From: erwin@trwacs.fp.trw.com (Harry Erwin)
Newsgroups: comp.ai.philosophy
Subject: Re: Grounding
Message-ID: <725@trwacs.fp.trw.com>
Date: 24 Sep 92 13:25:26 GMT
References: <19qphvINN7au@darkstar.UCSC.EDU>
Organization: TRW Systems Division, Fairfax VA
Lines: 15

Freeman's work is apropos. Mammalian brains are semantically grounded, to
the extent that the neocortex appears to download directions to the sense
organs (or at least the corresponding nuclei) to monitor specific semantic
objects and report back. Hence even the sense organs appear to operate at
a semantic level. Raw qualia, carrying no semantic meaning, appear to be
sent to the brain only when no matching object is available. When that
happens, the brain performs an orienting action to classify the novel
inputs and integrate them into its ongoing world model.
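The loop sketched above can be caricatured in a few lines of Python. This is a toy illustration only, not Freeman's model or anyone's implementation; every name in it (the monitored-object table, the `sense` and `orient` functions, the string "signatures") is invented for the example:

```python
# Toy model of the described loop: the "cortex" downloads a table of
# semantic objects for the sense organ to monitor; the sense organ reports
# matches at a semantic level and forwards raw input only when nothing
# matches, which triggers an orienting/classification step.
# All names and data here are invented for illustration.

def sense(raw_input, monitored_objects):
    """Return a semantic report if the input matches a monitored object;
    otherwise pass the raw input upward unlabelled."""
    for obj, signature in monitored_objects.items():
        if signature in raw_input:
            return ("semantic", obj)
    return ("raw", raw_input)

def orient(raw_input, world_model, monitored_objects):
    """Classify a novel input and fold it into the ongoing world model,
    so that future occurrences are handled semantically."""
    new_obj = f"novel-{len(world_model)}"
    world_model[new_obj] = raw_input
    monitored_objects[new_obj] = raw_input  # monitor it from now on
    return new_obj

world_model = {}
monitored = {"face": "eyes+nose"}   # "downloaded" to the sense organ

reports = []
for stimulus in ["eyes+nose", "striped-fur", "striped-fur"]:
    kind, payload = sense(stimulus, monitored)
    if kind == "raw":               # no semantic match: orient instead
        payload = orient(payload, world_model, monitored)
    reports.append((kind, payload))
    print(kind, payload)
```

Note how the second appearance of the unfamiliar stimulus is reported semantically: once the orienting step has integrated it into the world model, the "sense organ" itself handles it at the semantic level.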


-- 
Harry Erwin
Internet: erwin@trwacs.fp.trw.com