From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!IRO.UMontreal.CA!JSP.UMontreal.CA!u1795 Tue Jan 28 12:17:02 EST 1992
Article 3096 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!IRO.UMontreal.CA!JSP.UMontreal.CA!u1795
>From: u1795@JSP.UMontreal.CA (Zimmer Eric)
Newsgroups: comp.ai.philosophy
Subject: Re: Ethics and AI ( AI, software-pit-bulls, OR ...? )
Keywords: AI,ai,life,humanity,sillyness,mind,conciousness
Message-ID: <1992Jan24.052258.17027@jsp.umontreal.ca>
Date: 24 Jan 92 05:22:58 GMT
References: <knmip5INNob2@exodus.Eng.Sun.COM> <1992Jan21.154933.127@lrc.edu>
Sender: news@jsp.umontreal.ca (Administration de C news)
Organization: not an organization
Lines: 51

In article <1992Jan21.154933.127@lrc.edu> lehman_ds@lrc.edu writes:
>"understanding" is brought up is, ultimately, we will have to decide if
>the machine is a tool for man to do what he wishes with, or if the machine
>now becomes an entity unto itself, making the human rights activist do
>a double-take...  To say that the machine passes the Turing test and
>never understands, we have now said that the machine is a smart tool.
>To say that it has understanding, we now have a being that deserves
>"human" rights

Let's forget the Turing Test for a moment [to avoid endless discussions about
it :-) ] and suppose that this machine really IS able to think, "just like us".

Of course, we'll have all these discussions about "What is consciousness?",
or "Do we have a soul, and why shouldn't computers have one?".  What
would be so typical of the human race (excuse my contempt) would be to make up
another test (let's call it "the consciousness test"!), supposing they can, which
will decide whether or not "consciousness" is more than just a box processing
information.  Of course (if I may predict the results), the machine will
fail the test and we will conclude that it must be a "smart tool".  When
the test is given to a human being, and failed, and then given to thousands
of human beings and failed again, humans will think of themselves as tools
and humanists will have to argue that the test was wrong  :-]


To put it roughly:  What would make a human being, apart from religious
                    beliefs, consider himself more worthy of "living" than
                    a machine?

I think that my question, just by appearing on the net, is overrated. (!)

Some answers could be:
                       -snobbery;
                       -ecosystem: humans used to live in interaction
                                   with the "cycles" of nature, while
                                   machines may be a threat to the ecosystem;
                       -survival of the species.
 
It seems to me that the computers-and-ethics question only raises
questions asked too many times before, plus one more: why did I reply
to this post?  I guess I wanted to share my little story...

Anyway, if there is a point in all this waste of bandwidth, it is that
consciousness alone does not differentiate humans from machines.
The moral of it is that humanity may be way off in labelling life
as an "individualist" concept.


"Excuse me for the way I think,
 which I don't do anyway"                                      S.L.

"..excuse my english, I could have replied in french but..."


