Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!math.ohio-state.edu!sdd.hp.com!hplabs!hplntx!curry
From: curry@hpl.hp.com (Bo Curry)
Subject: Re: What's innate? (Was Re: Artificial Neural Networks and Cognition
Sender: news@hpl.hp.com (HPLabs Usenet Login)
Message-ID: <D3nuKv.4Kw@hpl.hp.com>
Date: Wed, 8 Feb 1995 02:50:54 GMT
References: <3gtu3i$rf3@mp.cs.niu.edu> <3guoku$bci@mp.cs.niu.edu> <D3LG9D.G18@hpl.hp.com> <3h69pv$en3@mp.cs.niu.edu> <D3LuE8.3tq@spss.com> <3h6k27$l59@mp.cs.niu.edu> <D3nBrM.Is2@hpl.hp.com> <3h8ssp$qji@mp.cs.niu.edu>
Nntp-Posting-Host: saiph.hpl.hp.com
Organization: Hewlett-Packard Laboratories, Palo Alto, CA
X-Newsreader: TIN [version 1.2 PL2]
Lines: 52

Neil Rickert (rickert@cs.niu.edu) wrote:
: >: It makes sense to talk of parameter setting if a relatively few
: >: parameters are to be set, and each setting controls a great deal.
: >: But if there are many parameters, each controlling relatively little,
: >: then the parameter setting analogy is a gross distortion.  In
: >: particular, if there are so many parameter choices as to take care of
: >: every dialect of every language, then the parameters must be like the
: >: individual binary digits on a computer tape.

: In <D3nBrM.Is2@hpl.hp.com> curry@hpl.hp.com (Bo Curry) writes:
: >According to Chomsky's theory, this is false. There are, indeed,
: >very many settings, but they do not cover the space of possibility.
: >Your final statement begs the question.

Neil Rickert (rickert@cs.niu.edu) wrote:
: Come on now.  The binary digits on a computer tape don't cover the
: space of possibilities, either, since they can only take the
: values 0 or 1, and no other value is allowed.

I thought you were referring to the message on the tape, which is
composed of (all possible, by implication) combinations of the
"parameter settings". The "parameters" may indeed be chunked at a
very low level, or they may not (Chomsky is, so far as I can tell,
agnostic on this) - the point is that not all messages are allowed.
So, if the parameters are indeed analogous to single bits, then
there must be fairly strong interdependencies among them - in which
case they are not independent parameters at all. Such a description
of a learning system seems (at the least) useless.
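To make the combinatorial point concrete, here is a toy sketch (mine,
purely illustrative - nothing here is from Chomsky): n truly independent
binary "parameters" would generate all 2^n settings, while any
interdependency among them admits only a fraction of that space.

```python
from itertools import product

n = 4  # four toy binary "parameters"
all_settings = list(product([0, 1], repeat=n))  # 2**4 = 16 combinations

# A toy interdependency: parameter 1 may be "on" only if parameter 0 is.
# Any such constraint shrinks the space below 2**n, which is just to say
# the parameters are no longer independent bits.
allowed = [s for s in all_settings if not (s[1] == 1 and s[0] == 0)]

print(len(all_settings))  # 16
print(len(allowed))       # 12
```

The particular constraint is arbitrary; the point is only that "not all
messages are allowed" entails dependencies of this kind.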

: >: Chomsky argues that language is not learned by inductive methods.
: >: But, after all, each induction merely sets a parameter.  There is no
: >: contradiction between learning by inductive methods and the setting
: >: of parameters.

: >Chomsky argues no such thing.

: Try: "Language and Learning: The Debate between Jean Piaget and Noam
: Chomsky," Harvard University Press, 1980.

OK. But it seems silly on the face of it - how could anything be
*learned* other than by inductive methods? Or do we mean something
different by "inductive methods"?

: >However you tune a piano, any piece you play upon it will sound like
: >a piano, and not like a trombone. That's all the UG implies.

: If that is all you are claiming, it can be explained by our
: all having similar shaped mouths, larynxes, etc.

That explains commonalities in phonemes, but not in grammars.

Bo
