Newsgroups: sci.lang
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!newsfeed.internetmci.com!tank.news.pipex.net!pipex!uknet!newsfeed.ed.ac.uk!edcogsci!steve
From: steve@cogsci.ed.ac.uk (Steve Finch)
Subject: Re: Chomksy, Significance, and Current Trends
Message-ID: <DD5CLH.2nJ@cogsci.ed.ac.uk>
Organization: Centre for Cognitive Science, Edinburgh, UK
References: <4084i9$dml@newsbf02.news.aol.com>
Date: Fri, 11 Aug 1995 12:41:21 GMT
Lines: 113

alsosprach@aol.com (AlsoSprach) writes:

>At one point (late 60's-mid 70's), it appeared that Chomsky's generative
>grammar would have sweeping applications in many fields from music theory
>to sociology.  Then the furor died down.  Was that a good thing? Why did
>it happen? Should/can linguistics ever regain that prominence in the
>social sciences? What are the current trends?

>Personally, I think Chomsky's obsession with the logical form of his
>theoretical framework has simply placed undue restraints on all programs
>of linguistic study.  

Well, nothing of real practical value to someone interested, for
example, in getting computers to process language seems to have come
from the "semantics as logic" school, in which logical form is the
step that translates from language.  Rather, it seems to have misled
people into studying esoteric examples from the logical fringes of
language while ignoring (or trivialising) huge areas of what must be
analysed if we are to perform useful tasks in computationally applied
research such as translation, message extraction, information
abstraction and retrieval from natural language, speech recognition,
and so on.

So much is this the case that very few researchers in any of the above
areas (who should be natural clients for scientific research on the
nature of language) now use the results of Chomsky's paradigm in their
research.  This should worry anyone who claims to be doing scientific
research into the nature of language, since these are the fields one
would hope the scientific study of language (linguistics) could
illuminate (or at least offer a few leads on).

>It would be like mathematicians telling physicists
>to ignore certain phenomena because the phenomena cannot yet be described
>mathematically in a way that is satisfactory to the mathematicians. 

One can make the argument that linguistics is to these applied
language areas as physics is to engineering, rather than as
mathematics is to physics.  Nevertheless, physicists are interested in
many of the same areas as engineers, and advances in physics have
led to new engineering areas and practices.  The same cannot be said
of linguistics and applied linguistics, where the former is
increasingly ignored by the latter.

>So until Chomsky and his cohorts are able to develop what they consider an
>appropriate framework, all sorts of fundamental aspects of language are
>ignored.  

Indeed.  Here are some small questions of crucial importance to
applied linguists that are either ignored by "real" linguists or for
which the analysis of "real" linguists is largely inapplicable.


1.  Why does our use of words satisfy Zipf's law, and how does the
HSPM (the human sentence processing mechanism) make language satisfy
it?  Why do the forms of syntactic construction we use also satisfy a
Zipf-Mandelbrot type law?

2.  Why are our utterances usually interpreted unambiguously, and how
does the HSPM generate such unambiguous sentences (even in the face of
noise cf. 7 below)?  What in the nature of language allows such a
property for the HSPM?

3.  Why are many of the traditional linguistic categories in language
apparent statistically, and of what importance is this observation for
acquisition and processing?

4.  Why do we NOT generate many forms of utterance which traditional
linguistics would lead us to believe are "well-formed"?  Why does the
HSPM have a preference for certain sentence forms, and is this
consistent across language users?

5.  What processes might lead us to acquire knowledge of how to use
words from hearing/reading them once only?  What is needed from
language in order to be able to do this?

6.  How do we understand novel semi-lexical constructions
(e.g. noun-noun compounds) so easily?  What is the nature of
compositionality in such cases?

7.  How is the HSPM robust in the face of some violations of syntactic
structure/slips of the tongue/noise in communication?  What in the
nature of language allows such properties of the HSPM?

8.  In the light of Q1--7, where does deciding sentence validity fit
in?  Why is it possible (and in what sense is it justified) to claim
that a natural language can be described by a grammar?
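
Q1's rank-frequency claim is easy to make concrete.  A minimal sketch
(the toy corpus and the Zipf-Mandelbrot parameters below are purely
illustrative, not fitted to any real data):

```python
from collections import Counter

def rank_frequency(tokens):
    """Return (rank, word, count) triples, most frequent word first."""
    counts = Counter(tokens)
    return [(r, w, c) for r, (w, c) in enumerate(counts.most_common(), 1)]

def zipf_mandelbrot(rank, C=1.0, b=2.7, a=1.0):
    """Predicted relative frequency under f(r) = C / (rank + b)**a."""
    return C / (rank + b) ** a

# Toy corpus -- far too small to test the law, but enough to show the
# rank-frequency table that the law is a claim about.
tokens = "the cat sat on the mat and the dog sat on the log".split()
for rank, word, count in rank_frequency(tokens):
    print(rank, word, count, zipf_mandelbrot(rank))
```

Zipf's law is the special case b = 0, a = 1: frequency inversely
proportional to rank.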
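
The ambiguity Q2 takes for granted can also be made concrete.  The toy
CNF grammar below is my own illustrative fragment, not anyone's
published analysis; a CKY-style counter finds two parses for the
classic PP-attachment sentence, even though hearers normally settle on
one reading without noticing the other:

```python
from collections import defaultdict

lexicon = {
    "I": {"NP"}, "saw": {"V"}, "the": {"Det"},
    "man": {"N"}, "telescope": {"N"}, "with": {"P"},
}
rules = [  # binary rules A -> B C, in Chomsky normal form
    ("S", "NP", "VP"), ("VP", "V", "NP"), ("VP", "VP", "PP"),
    ("NP", "Det", "N"), ("NP", "NP", "PP"), ("PP", "P", "NP"),
]

def count_parses(words):
    n = len(words)
    # chart[i][j][A] = number of distinct parses of words[i:j] rooted in A
    chart = [[defaultdict(int) for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        for cat in lexicon[w]:
            chart[i][i + 1][cat] = 1
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for a, b, c in rules:
                    if chart[i][k][b] and chart[k][j][c]:
                        chart[i][j][a] += chart[i][k][b] * chart[k][j][c]
    return chart[0][n]["S"]

print(count_parses("I saw the man with the telescope".split()))  # -> 2
```

The two counts correspond to the PP attaching to "saw" or to "the
man"; any account of why the HSPM reliably picks one must appeal to
something beyond the grammar itself.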


I believe these form a cluster of questions (at various levels) of
enormous interest to researchers on the applied side of language,
questions which, if answered, would throw light not only on the
nature of natural language and the HSPM, but on most cognitive tasks.
I also believe the answers to these questions will be universal
across languages.

The competence/performance distinction has for too long been used to
isolate Q8 from Q1--7, even though Q1--7 are far from independent of
the "nature of language".  And yet Q8 is the question of least direct
relevance to those interested in the engineering side of linguistics.

Just to keep a potentially interesting discussion going,

Steve.

------------------------------------------------------------------------------
When you steal from one person, it's called Plagiarism-
When you steal from many, it's research.                       - Wilson Mizner
------------------------------------------------------------------------------
Steven Finch                              | University of Edinburgh
Phone: +44 131 650 4656                   | Language Technology Group
					  | Human Communication Research Centre
email: S.Finch@ed.ac.uk                   | 2 Buccleuch Place
                                          | Edinburgh            EH8 9LW
