Newsgroups: comp.ai,comp.ai.nat-lang,sci.cognitive
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!oitnews.harvard.edu!yale!zip.eecs.umich.edu!newshost.marcam.com!news.mathworks.com!gatech!darwin.sura.net!ms!ma.cs.wm.edu!gottliej
From: gottliej@ma.cs.wm.edu (Jeremy F. Gottlieb)
Subject: Re: Announce: Book excerpt available
Message-ID: <1995Jul21.173749.14506@cs.wm.edu>
Followup-To: comp.ai,comp.ai.nat-lang,sci.cognitive
Sender: news@cs.wm.edu (News System)
Nntp-Posting-Host: ma.cs.wm.edu
Organization: College of William & Mary, founded 1693
X-Newsreader: TIN [version 1.2 PL2]
References: <3u6j1n$jtq@chuangtsu.acns.carleton.edu> <3u8n0v$cqa@newsbf02.news.aol.com> <3uh1g5$lvn@sword.eng.pyramid.com> <3uhodl$ds5@clarknet.clark.net>
Date: Fri, 21 Jul 1995 17:37:49 GMT
Lines: 43
Xref: glinda.oz.cs.cmu.edu comp.ai:31746 comp.ai.nat-lang:3637 sci.cognitive:8461

Noble (nlv@clark.net) wrote:
: In article <3uh1g5$lvn@sword.eng.pyramid.com>, kudzu@pyramid.com says...
: >
: >
: >Computers don't think, they merely *think* they do.  As for people,
: >the same is probably true...
: >
: If "they" (computers) *think* they can think, then aren't "they" thinking? 
: :)

: BTW, pardon me if the first "they" in your sentence does not refer to 
: computers.  I was just dying to write what I wrote!
: --
: Noble
: nlv@clark.net

What I think kudzu was trying to say is (and try to follow me here):
	Computers only "think" that they are "thinking" because their
internal program tells them that they're thinking, i.e. because people
instilled this inner "sense" of the ability to "think" into the computer.
Thus, what I gather his argument to be is that the same is true of
people, and so our "thinking" is one giant internal program.

	To some extent, this must be true. Our ability to cogitate
obviously is dependent upon the three-some-odd pounds of mush in our
skulls. However, what kudzu is (I think) trying to say is that a
computer will never truly "think" unless we tell it that it does, and
thus it isn't "thinking" for itself and never will.

	My argument against that would go as follows: Every day when
we meet people, we make the assumption that they are self-aware and
that they "think" in the same way we do (you know what I mean). If we
can make a computer (robot) that is human-like enough to fool people
into making a similar assumption about it (or at least to acknowledge
the possibility, if we can't make that realistic a robot), couldn't
that computer then be said to be endowed with the same cognitive gifts
we are, regardless of how it came to achieve them?
--
Jeremy Gottlieb				Parallel Computing Lackey
gottliej@cs.wm.edu			William & Mary
gottliej@mathcs.carleton.edu		Dork: Carleton College
"Without C, we'd have BASI, OBOL, and PASAL."
http://www.mathcs.carleton.edu/students/gottliej/gottliej.html
