From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!wupost!uunet!mcsun!uknet!edcastle!aiai!jeff Tue Apr  7 23:24:24 EDT 1992
Article 4947 of comp.ai.philosophy:
Xref: newshub.ccs.yorku.ca comp.ai.philosophy:4947 sci.philosophy.tech:2515
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!wupost!uunet!mcsun!uknet!edcastle!aiai!jeff
From: jeff@aiai.ed.ac.uk (Jeff Dalton)
Newsgroups: comp.ai.philosophy,sci.philosophy.tech
Subject: Re: A rock implements every FSA
Message-ID: <6567@skye.ed.ac.uk>
Date: 6 Apr 92 18:10:33 GMT
References: <1992Apr3.180407.28679@bronze.ucs.indiana.edu> <1992Apr4.015204.10671@husc3.harvard.edu> <1992Apr4.175511.24556@bronze.ucs.indiana.edu>
Sender: news@aiai.ed.ac.uk
Organization: AIAI, University of Edinburgh, Scotland
Lines: 18

In article <1992Apr4.175511.24556@bronze.ucs.indiana.edu> chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:
>In article <1992Apr4.015204.10671@husc3.harvard.edu> zeleny@zariski.harvard.edu (Mikhail Zeleny) writes:
>
>>Again, the degree of necessity is the last, if not the least of your
>>problems (for, if functionalism is true, biological necessity is the same
>>as physical necessity); the first one has to do with giving an appropriate
>>semantics for your conditionals.  That your conflicting requirements simply
>>won't allow you to do: you can either stipulate trans-world identity
>>conditions for mind-brains, or for their properties, but not for both.
>>Choose the former, and you lose the conditionals; choose the latter, and
>>you lose personal identity.
>
>As I've made clear on a number of occasions, I subscribe to Parfit's
>treatment of personal identity, so that there need not be determinate
>facts about personal identity across worlds.

I'm not sure what the import of that is supposed to be.  Do you want
your other arguments to depend on Parfit's conclusions being right?
