From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!batcomputer!cornell!rochester!kodak!ispd-newsserver!psinntp!norton!brian Mon Mar  9 18:34:46 EST 1992
Article 4217 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!batcomputer!cornell!rochester!kodak!ispd-newsserver!psinntp!norton!brian
From: brian@norton.com (Brian Yoder)
Subject: Re: Strong AI and panpsychism
Message-ID: <1992Mar03.061159.16651@norton.com>
Organization: Symantec / Peter Norton
References: <1992Feb27.221933.1168@psych.toronto.edu>
Date: Tue, 03 Mar 1992 06:11:59 GMT

michael@psych.toronto.edu (Michael Gemar) writes:
> In article <1992Feb27.023234.49@norton.com> brian@norton.com (Brian Yoder) writes:
> >bill@NSMA.AriZonA.EdU (Bill Skaggs) writes:
> >>   Proposed definition:  An object is "intelligent" if it implements
> >> some sufficiently sophisticated set of programs.

> >On what basis would anyone define "intelligent" this way?  Is this related
> >to Searle's arguments about AI?
 
> This is the way in which *AI*, not Searle, defines intelligence.  If you
> don't like it, then you don't like AI...
 
Now that's a bold claim!  Why can't I define AI as "the study of man-made
systems that perceive reality and react on the basis of that knowledge in
a self-generated, goal-directed manner"?  Such a description would encompass
the systems that people want to build, and it does not suffer from the myriad
problems Searle arrives at.  I should comment that I think Searle DOES reach
the proper conclusions given his starting point, but I don't buy his definition
of AI.  If I am not talking about AI, then what AM I talking about?
 
> >One of my criticisms of Searle is that he spends a lot of time talking about
> >"programs" and "instructions" rather than talking about information, perception,
> >consciousness, concepts, and the like. 
 
> Searle is discussing AI.  AI is composed of programs carrying out instructions.
 
Could you justify that definition?  It seems horribly tied to ONE WAY OF
BUILDING an intelligent system.  What Searle proves is that that method cannot
work, and I buy that much.  But how can you justify the claim that his
definitions are the correct ones?
 
> > He could just as easily say that 
> >"Humans have brains.  Brains are made of atoms.  Atoms cannot think.  Therefore
> >brains cannot think.".  
 
> But we *know* subjectively that brains *do* think.  We don't know the same for
> computers.

So what you are saying is that humans can think and we know that for a fact
(which I heartily agree with).  So tell me: if I could build a machine that
does what the brain does, and it didn't follow Searle's paradigm, would such
a machine fall under what you would call AI?  This seems like a very simple
distinction to me.
 
> >>   Now Putnam's argument, which I will not repeat, is that this
> >> seemingly natural definition is bad, because with such a broad
> >> notion of implementation it can be shown that every physical object
> >> (such as a rock) implements every program, or at least an enormous
> >> set of programs.  The conclusion is that some more restrictive
> >> notion of implementation is needed.   

> >I believe I have heard that argument (though it was some time ago).
> >As I remember, it was another case of horrible context jumping.  To say that
> >intelligence is a program, that programs are FSAs, that everything is an
> >FSA, and that therefore everything is intelligent is just ludicrous.
 
> That's a reasonable response.  It's the one I have.  Now, which premise
> do you want to discard?
 
The premise that AI is to be achieved through "instructions and states".
Perception (which Searle completely ignores, if memory serves), induction,
and goal-orientation are the primary things we need to discuss in terms of
intelligent systems.  Talking about intelligence in terms of instructions and
states is like talking about "melody" in terms of air pressure changes.  It
gets you nowhere, even though the two subjects are interrelated.
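For concreteness, the "instructions and states" picture being rejected here
is exactly the FSA formalism: nothing but a transition table mapping
(state, input) pairs to next states.  A minimal sketch in Python (the
function and state names are my own illustrations, not drawn from Putnam
or Searle):

```python
# An FSA is fully specified by a transition table: given a current
# state and an input symbol, look up the next state.  Putnam's point
# is that this notion of "implementing a program" is so thin that
# almost any physical system can be mapped onto some such table.

def run_fsa(transitions, start, inputs):
    """Run an FSA given as a {(state, symbol): next_state} dict."""
    state = start
    for symbol in inputs:
        state = transitions[(state, symbol)]
    return state

# A two-state parity checker: tracks whether an even or odd
# number of 1s has been seen so far.
parity = {
    ("even", 0): "even", ("even", 1): "odd",
    ("odd", 0): "odd",   ("odd", 1): "even",
}

print(run_fsa(parity, "even", [1, 0, 1, 1]))  # -> odd
```

Note that nothing in the table says anything about perception, goals, or
the world; that thinness is precisely what makes "intelligence is an FSA"
feel like describing melody as air pressure changes.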

--Brian
 
-- 
-- Brian K. Yoder (brian@norton.com) - Q: What do you get when you cross     --
-- Peter Norton Computing Group      -    Apple & IBM?                       --
-- Symantec Corporation              - A: IBM.                               --
--
