Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!pipex!uknet!festival!edcogsci!jeff
From: jeff@aiai.ed.ac.uk (Jeff Dalton)
Subject: Re: What's innate? (Was Re: Artificial Neural Networks and Cognition)
Message-ID: <D2IEyp.2B2@cogsci.ed.ac.uk>
Sender: usenet@cogsci.ed.ac.uk (C News Software)
Nntp-Posting-Host: bute-alter.aiai.ed.ac.uk
Organization: AIAI, University of Edinburgh, Scotland
References: <3f6tsf$mpu@mp.cs.niu.edu> <jqbD2D90I.KqH@netcom.com> <3f7o1m$13g@mp.cs.niu.edu>
Date: Mon, 16 Jan 1995 17:52:01 GMT
Lines: 24

In article <3f7o1m$13g@mp.cs.niu.edu> rickert@cs.niu.edu (Neil Rickert) writes:
>In <jqbD2D90I.KqH@netcom.com> jqb@netcom.com (Jim Balter) writes:
>
>>One observation I would like to make is that there appear to be a set of
>>grammar rules that we have all learned not to violate, but that the
>>"approximate grammar" from which we actually generate utterances is a subset of
>>that.
>
>If I read you correctly, you are saying that the language a person
>will accept as correct for input is larger than the language that the
>same person will produce as output.  I would agree with that, and I
>suspect that Chomsky would also agree.  This may be something like
>his distinction between performance and competence.

Why do you think "the language a person will accept as correct for
input is larger than the language that the same person will produce
as output"?  Presumably you don't mean that what the person will
actually accept in a lifetime is larger than what they'll actually
produce (since that would depend on how much they talked as opposed
to listened).  But if you mean what they would potentially accept as
opposed to what they would potentially produce, why should the
former be larger than the latter?
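For what it's worth, the claim under discussion has a simple formal-language reading: the set of strings a speaker recognizes as grammatical properly contains the set of strings their "approximate grammar" ever generates.  A toy sketch (every grammar detail below is invented purely for illustration, not taken from anyone's actual proposal):

```python
import re

# Toy "acceptance grammar": any number of simple clauses joined by "and".
ACCEPTS = re.compile(
    r"^the (cat|dog) (sat|slept)( and the (cat|dog) (sat|slept))*$")

def accepts(sentence):
    """Recognizer: does this speaker judge the sentence grammatical?"""
    return ACCEPTS.match(sentence) is not None

def produce():
    """Toy "production grammar": this speaker only ever generates
    single clauses, never conjunctions -- a proper subset of what
    the recognizer above accepts."""
    return [f"the {noun} {verb}"
            for noun in ("cat", "dog")
            for verb in ("sat", "slept")]

produced = produce()
# Everything produced is accepted...
assert all(accepts(s) for s in produced)
# ...but some accepted sentences are never produced.
assert accepts("the cat sat and the dog slept")
assert "the cat sat and the dog slept" not in produced
```

On this reading the "larger" relation is just a subset claim between two formal languages, which is a different (and sharper) claim than anything about actual lifetime input and output.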

-- jeff
