From newshub.ccs.yorku.ca!torn!cs.utexas.edu!uunet!mcsun!news.funet.fi!hydra!klaava!amnell Sat Oct 24 20:44:41 EDT 1992
Article 7358 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!uunet!mcsun!news.funet.fi!hydra!klaava!amnell
From: amnell@klaava.Helsinki.FI (Marko Amnell)
Newsgroups: comp.ai.philosophy
Subject: Re: Simulated Brain
Message-ID: <1992Oct21.222125.24753@klaava.Helsinki.FI>
Date: 21 Oct 92 22:21:25 GMT
References: <BwGKx3.5oJ@usenet.ucs.indiana.edu> <1992Oct21.101000.1131@klaava.Helsinki.FI> <1992Oct21.161515.7529@inesc.pt>
Organization: University of Helsinki
Lines: 19

In article <1992Oct21.161515.7529@inesc.pt> xarax@eniac.inesc.pt 
(Luis Antunes) writes:

>First we were talking about consciousness, now you talk about "full,
>healthy conscious state"... Let's be careful. If the child has an
>unhealthy conscious state then it certainly has a conscious state...

OK.  I was probably playing fast and loose with the word.  I think I've
made it clear that my purpose was to emphasize the importance of the
whole environment in the production of a person.  A machine would
presumably not have a history of interaction with other minds, and my
point was that such an artificial mind would also lack the full range
of mental and experiential states a human being has.


-- 
Marko Amnell
amnell@klaava.helsinki.fi
Graduate Student in Philosophy
