From newshub.ccs.yorku.ca!torn!cs.utexas.edu!usc!rpi!uwm.edu!ogicse!das-news.harvard.edu!cantaloupe.srv.cs.cmu.edu!crabapple.srv.cs.cmu.edu!rudis Mon Aug 24 15:41:22 EDT 1992
Article 6658 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!usc!rpi!uwm.edu!ogicse!das-news.harvard.edu!cantaloupe.srv.cs.cmu.edu!crabapple.srv.cs.cmu.edu!rudis
From: rudis+@cs.cmu.edu (Rujith S DeSilva)
Newsgroups: comp.ai.philosophy
Subject: Re: Freewill, chaos and digital systems
Message-ID: <Bt9Kq2.CLy.1@cs.cmu.edu>
Date: 20 Aug 92 04:06:01 GMT
Article-I.D.: cs.Bt9Kq2.CLy.1
References: <Bt4xt1.MA0.1@cs.cmu.edu> <1992Aug19.210204.29868@mp.cs.niu.edu>
Sender: news@cs.cmu.edu (Usenet News System)
Organization: School of Computer Science, Carnegie Mellon
Lines: 45
Nntp-Posting-Host: probacto.soar.cs.cmu.edu

In article <1992Aug19.210204.29868@mp.cs.niu.edu> rickert@mp.cs.niu.edu (Neil
Rickert) writes:
>In article <Bt4xt1.MA0.1@cs.cmu.edu> rudis+@cs.cmu.edu (Rujith S DeSilva)
(that's me!) writes: 
>>(1) Does freewill arise solely through the mechanism of chaos?
>
>Usually we think of free will as the ability to make a decision, and
>have that decision affect what we do in the future.  That is, we make a
>choice, and can stick to that choice.

I prefer a more operational definition.  Let me suggest the following
different flavours of freewill.  I assume that the brain/mind/soul obeys the
`usual' physical laws.

(a) All your `decisions' can be predicted in advance, given sufficiently
advanced equipment.  In this scenario, I would say that one does not have
freewill.  Note that being able to predict some or most of your decisions is
insufficient.

(b) All your decisions can be predicted, given sufficient time on the advanced
equipment, but the computation may take longer than the decision itself, so not
all of them can be predicted in advance.  I'm not sure whether this is
significantly different from (a).

(c) Your decisions are pre-determined, but they are not predictable.  I think
chaos has to be invoked here.

(d) Your decisions are not pre-determined.  I think current quantum theory
(specifically, the absence of `hidden variables') allows this possibility, but
would like an expert opinion.
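
To illustrate what chaos buys in (c): a fully deterministic system can still be
unpredictable in practice, because any error in measuring the initial state
grows exponentially until prediction is worthless.  A minimal sketch using the
logistic map (my own toy illustration, not a claim about how brains work):

```python
# The logistic map x_{n+1} = r * x_n * (1 - x_n) is deterministic,
# yet at r = 4 two trajectories starting a hair apart soon disagree.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # perturb by one part in 10^10

# Early on the trajectories agree to many digits; by step 40 or so the
# initial error has doubled itself into a difference of order one.
early = max(abs(x - y) for x, y in zip(a[:6], b[:6]))
late = max(abs(x - y) for x, y in zip(a[40:], b[40:]))
print(early, late)
```

So a predictor of such a system would need the initial state to absurd
precision, even though nothing non-deterministic is going on.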

Have I left out any possibilities?  What do you believe is the true situation?
Personally, I don't have a clue, but I would not like it to be (a).  Actually,
(d) is not too bad.  It could be that most of our decisions are based on our
nature/nurture, with only the borderline cases being resolved randomly.
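
That nature/nurture-plus-randomness picture can be put as a toy decision
procedure (names, weights and the tie margin are all hypothetical, just to make
the idea concrete):

```python
import random

# Toy model of (d): choices follow deterministically from weights
# ("nature/nurture"), except when the top options score nearly equally,
# in which case an irreducibly random coin settles the borderline case.

def choose(options, weight, tie_margin=0.05, rng=random):
    scored = sorted(options, key=weight, reverse=True)
    best, runner_up = scored[0], scored[1]
    if weight(best) - weight(runner_up) < tie_margin:
        return rng.choice([best, runner_up])  # borderline: random
    return best  # clear-cut: determined by the weights

prefs = {'tea': 0.9, 'coffee': 0.3, 'water': 0.88}
print(choose(['tea', 'coffee', 'water'], prefs.get))
```

Here 'tea' versus 'water' is within the margin, so the outcome is random; with
a clear favourite the randomness never gets a vote.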

>Well, yes, your decision is determined in advance, but no, free will is not a
>sham, and life is not a charade.

In this case, as a previous poster pointed out, are we morally responsible for
our actions?  I think the concept of rehabilitation makes sense, but
punishment does not, except as a deterrent.  Actually, none of the four
flavours above supports the idea of punishment.

Rujith de Silva.
Carnegie Mellon.
