Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!news.alpha.net!uwm.edu!vixen.cso.uiuc.edu!howland.reston.ans.net!news.sprintlink.net!news.dorsai.org!news.ilx.com!psinntp!psinntp!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Subject: Re: What's innate? (Was Re: Artificial Neural Networks and Cognition)
Message-ID: <1995Feb10.024824.2295@oracorp.com>
Organization: Odyssey Research Associates, Inc.
Date: Fri, 10 Feb 1995 02:48:24 GMT
Lines: 77

rickert@cs.niu.edu (Neil Rickert) writes:

>jeff@aiai.ed.ac.uk (Jeff Dalton) writes:

>>rickert@cs.niu.edu (Neil Rickert) writes:

>>>In other words, without a UG, the language itself evolves so as to
>>>become learnable without a UG.  Who is to say that the English
>>>language has not already evolved so as to become learnable without a
>>>UG?

>>But perhaps this is not actually possible.  Perhaps w/o a UG
>>creatures like us can't learn a sufficiently expressive language.

>That is conceivable, although I think it unlikely. My argument was
>mainly about whether the 'poverty of stimulus' argument proves
>anything. You are suggesting a different type of argument for UG.

No, he isn't. Look, the learnability of a language depends on many
factors, including: (1) How much data is available, (2) how complex
the language is, and (3) how suitable the learning strategy is for the
particular language being learned. Obviously, extremely simple
languages can be learned with only a small amount of data, but the
claim made in the poverty of stimulus argument is that natural
language is *not* that simple. Since children nevertheless learn it
from limited data, there must be a match between the learning
strategy they use and the language being learned.
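
To make that concrete, here is a toy sketch in Python (my own
illustration, not anybody's actual model of acquisition). The
learner's bias is the small set of candidate grammars it will
consider at all; each "grammar" is just a membership test on strings:

GRAMMARS = {
    "a^n b^n":      lambda s: s == "a" * (len(s) // 2) + "b" * (len(s) // 2),
    "(ab)^n":       lambda s: s == "ab" * (len(s) // 2) and len(s) % 2 == 0,
    "a's then b's": lambda s: "ba" not in s,
    "anything":     lambda s: True,
}

def consistent(grammars, data):
    # Keep only the grammars that accept every sentence seen so far.
    return [name for name, accepts in grammars.items()
            if all(accepts(s) for s in data)]

print(consistent(GRAMMARS, ["ab"]))
# all four candidates survive
print(consistent(GRAMMARS, ["ab", "aabb"]))
# ['a^n b^n', "a's then b's", 'anything'] -- one eliminated

With only four candidates, two sentences already do real work. A
learner with no such bias faces infinitely many grammars consistent
with any finite sample, and positive data alone will never pick one
out. That is the shape of the poverty of stimulus point.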

In another article, Neil says:

>The effect is that language will evolve so as to mainly contain
>features which are easy to learn, given the particular learning
>biases. Since these features are easy to learn, we would expect
>children to learn them with relatively little stimulus. Thus there
>would be evidence to support a 'poverty of stimulus' argument.

Right. And Chomsky's claim is that what makes a language easy to
learn, given built-in human learning biases, is UG.

In yet another article, Neil says:

>My comments about music are, of course, ridiculous.  But what they
>emphasize is that talk of setting parameters is meaningless jargon.
>It makes sense to talk of parameter setting if relatively few
>parameters are to be set, and each setting controls a great deal.
>But if there are many parameters, each controlling relatively little,
>then the parameter setting analogy is a gross distortion.
>In particular, if there are so many parameter choices as to take care of
>every dialect of every language, then the parameters must be like the
>individual binary digits on a computer tape.

I think this is a pretty good criterion, but by that criterion, your
conclusion is completely wrong. The variations in natural languages
are nothing like the variations in the individual bits on a computer
tape. The number of variable parameters *is* very small compared with
the complexity of natural language. One linguist, Igor Melcuk at the
University of Montreal, has developed a framework for natural
language, called Meaning-Text Theory, that he claims captures every
known natural language. Of course, there is no way to tell how much
of the commonality of natural languages is due to innate facts about
human brains, and how much is due to the fact that all languages presumably
developed from some common ancestor language.
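
To see the difference concretely, here is another toy Python sketch
(again mine; this is not Chomsky's actual parameter inventory, and it
has nothing to do with Melcuk's framework). Two binary parameters
each fix a global property of the grammar, and together they yield
four typologically distinct word orders:

def linearize(subject, verb, obj, head_initial, subject_first):
    # Each parameter constrains every sentence the grammar generates,
    # not one construction -- the sense in which "each setting
    # controls a great deal".
    vp = [verb, obj] if head_initial else [obj, verb]
    words = [subject] + vp if subject_first else vp + [subject]
    return " ".join(words)

for hi in (True, False):
    for sf in (True, False):
        print(linearize("Kim", "sees", "Lee", hi, sf))
# Kim sees Lee    (SVO, English-like)
# sees Lee Kim    (VOS)
# Kim Lee sees    (SOV, Japanese-like)
# Lee sees Kim    (OVS)

Two switches, four word-order types. That is nothing like one bit on
a tape per dialect-specific fact.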

Daryl McCullough
ORA Corp.
Ithaca, NY
