From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Thu Feb 20 15:19:56 EST 1992
Article 3646 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Functionalist Theory of Qualia
Message-ID: <1992Feb11.181633.13862@psych.toronto.edu>
Organization: Department of Psychology, University of Toronto
References: <1992Feb7.203648.8033@cs.yale.edu> <1992Feb10.032900.27301@bronze.ucs.indiana.edu> <6558@pkmab.se>
Date: Tue, 11 Feb 1992 18:16:33 GMT

In article <6558@pkmab.se> ske@pkmab.se (Kristoffer Eriksson) writes:
>In article <1992Feb10.032900.27301@bronze.ucs.indiana.edu> chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:

> >This is where I disagree, obviously.  You and I both know that this
> >talk of "self-models" and so on is just a shorthand way of talking
> >about certain kinds of behavioural dispositions, complex mechanisms
> >of internal causation, and so on.  If one were to predict a priori
> >what it would feel like to be such a system, there'd be no reason
> >to suppose that it would feel like anything at all.
>
>Wrong. If your self-model says that you feel a certain way, then that IS how
>you feel. The a priori prediction of how it would feel to be such a system,
>is, by definition, that it would feel the way the self-model says that you
>would feel.
>

So how does my "self-model" *know* what things feel like?  My self-model may
be able to tell me that I am in a state called "pain," but how on earth
can it tell me what that state *feels* like?

- michael