From newshub.ccs.yorku.ca!torn!utcsri!rutgers!sun-barr!olivea!uunet!trwacs!erwin Thu Oct  8 10:10:40 EDT 1992
Article 7064 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!utcsri!rutgers!sun-barr!olivea!uunet!trwacs!erwin
From: erwin@trwacs.fp.trw.com (Harry Erwin)
Newsgroups: comp.ai.philosophy
Subject: Re: Brain and Mind (was: Logic and God)
Summary: A few comments on consciousness
Message-ID: <739@trwacs.fp.trw.com>
Date: 29 Sep 92 17:12:31 GMT
References: <1992Sep17.181358.1828@Princeton.EDU> <1992Sep28.164828.2122@meteor.wisc.edu>
Followup-To: comp.ai.philosophy
Organization: TRW Systems Division, Fairfax VA
Lines: 43


I'm going to post a position on this argument to give people a target to
shoot at.

I suspect that most chordates are conscious, in the sense that their
brains operate semantic models, and use deviations from those models, to
assess external stimuli.

The dividing line between _self_ consciousness and lack of same appears to
lie between the apes and the monkeys. 

Hence consciousness is a primitive aspect of how we function and might be
regarded as "axiomatic."

Where do I stand on Searle's Chinese Room argument? I need to tell a
story. Most high schools teach students Latin by training them in a
decoding algorithm that allows them to "unpack" Latin literature. My twins
commented that they didn't "understand" Latin until they tried to speak it
and later to think in it. This implies to me that "understanding" is not
algorithmic but rather reflects the acquisition of a generative model for
speaking the language. This is similar to how a fish predicts where its
neighbors will be in a school--it models their behavior using a generative
model. Hence in that sense, I suspect fish can learn Chinese more readily
than a robot can.
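
The fish analogy can be made concrete. Here is a minimal sketch in Python
(my own illustration; nothing in it comes from the original post) of a
"generative model" in the weakest possible sense: predicting a neighbor's
next position by modeling its motion, rather than looking the answer up in
a table. The constant-velocity assumption is just a placeholder for
whatever model a real fish's nervous system runs.

```python
def predict_next(positions):
    """Predict a neighbor's next position from its recent track.

    A toy generative model: assume constant velocity, so the
    prediction is simply  next = last + (last - previous).
    `positions` is a list of (x, y) samples, most recent last.
    """
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    return (x1 + (x1 - x0), y1 + (y1 - y0))

# A neighbor swimming steadily right and slightly upward:
track = [(0.0, 0.0), (1.0, 0.2), (2.0, 0.4)]
print(predict_next(track))  # roughly (3.0, 0.6)
```

The point of the sketch is the contrast: a decoding algorithm maps inputs
to outputs by rule, while even this trivial model *produces* behavior it
has never seen, which is closer to what "understanding" is claimed to be.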

Cheers,
-- 
Harry Erwin
Internet: erwin@trwacs.fp.trw.com