Editor’s Note

This special issue of The Classic consists of five articles written by undergraduate students in Dr. Vera Lee-Schoenfeld’s Generative Syntax (LING 3150W), a course offered by UGA’s Department of Linguistics that introduces the scientific study of the human language faculty [i]. To orient ourselves before diving into these essays, let’s take a moment to consider what we know about human language. This consideration, in turn, will lead us to some of the initial questions that form the foundation of linguistic theory.


Everybody speaks a language. Every human being at every point in our species’ history has—barring pathology—successfully acquired at least one natural language from the community they live in. Anyone who has ever raised a child or grown up with young siblings can attest to the incredible rate at which kids go from having seemingly no language at all to a veritable cornucopia of verbal behavior. On the face of it, this may seem intuitive. Of course kids learn language quickly and easily; what else do they have going on, after all? But this turns out to be far more surprising than it might appear at first glance. When we consider just what a child is required to learn about a natural language, the task appears rather daunting.


In a few short years, a child must acquire the ability to produce and comprehend a potentially infinite number of expressions. This tremendous task was put most lucidly nearly two centuries ago by Wilhelm von Humboldt in his 1836 introduction to general linguistics, wherein he describes language as a system that “makes infinite use of finite means”. That is, with a finite set of words, a human who has acquired a natural language can produce and comprehend an infinite number of valid sentences; our limitations lie only in our physicality, bounded as we are in mental processing, memory, and a finite lifespan.

This fact about natural language tells us that what we acquire must be a system of rules. It wouldn’t do simply to learn a list of the possible phrases and expressions of a language, since we cannot list an infinite number of such elements. A native speaker’s competence must therefore consist in a finite procedure that generates those infinite expressions. Linguists call the procedure we acquire a generative grammar. When we zoom in on the nature of the linguistic expressions produced by such a grammar, we find some further facts that help us understand what this rule system is expected to generate.
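Before zooming in, it may help to see Humboldt’s “infinite use of finite means” made concrete. What follows is a minimal sketch of a toy generative grammar, written in Python; the rules and the tiny vocabulary are invented purely for illustration and make no claim about the actual grammar of English or any other language.

```python
import random

# A toy phrase-structure grammar: a finite set of rewrite rules.
# (The rules and words here are invented for illustration only.)
# Because NP can contain a PP, which in turn contains another NP,
# the grammar is recursive and has no longest sentence.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"], ["Det", "N", "PP"]],
    "VP":  [["V", "NP"]],
    "PP":  [["P", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["dog"], ["cat"], ["yard"]],
    "V":   [["chased"], ["saw"]],
    "P":   [["near"], ["behind"]],
}

def generate(symbol="S"):
    """Rewrite a symbol by randomly choosing one of its rules."""
    if symbol not in RULES:        # a word (terminal): emit it as-is
        return [symbol]
    expansion = random.choice(RULES[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))
    return words

if __name__ == "__main__":
    for _ in range(3):
        print(" ".join(generate()))
```

A handful of rules and nine words already license sentences like “the dog chased a cat near the yard behind a dog” and, in principle, strings of any length; this is the sense in which a finite procedure yields infinite output.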

Any linguistic expression is a pairing of sounds or signs with a specific meaning. A native speaker of English knows, by virtue of having acquired an English grammar, that (1a) does not mean the same thing as (1b), but that (1a) and (1c) are basically synonymous.

(1)      

a. Dogs chase cats.
b. Cats chase dogs.
c. Cats are chased by dogs.

Generative grammars, then, are procedures for pairing a specific symbolic profile (a sequence of sounds or gestures) with a specific meaning, and they do this over an infinite number of possible meanings. From the above, we can deduce two basic questions that linguists must grapple with, and it is these two questions that motivate much research into the nature of our linguistic abilities.

First, while it is obvious, given these basic assumptions, that what we acquire must be some sort of rule system, what is not so obvious is what the rules of a particular language must look like. We can formulate our first research question:

How should we describe the rules of natural language grammars and the ways those grammars can vary from one another?

Much of the project of modern linguistics is the examination of particular languages with the aim of providing clear empirical descriptions of those languages’ grammatical rule systems.

Second, while it is clear that there is nothing specific about you and me that makes us English speakers, other than the random circumstance of being born in an English-speaking environment, there certainly is something specific about humans that makes us able to acquire natural languages in the first place. That is, we have an innate faculty for learning grammars. This straightforward deduction does not, however, tell us what our innate faculty looks like. And so we turn to the second research question:

What is the fine structure of our innate faculty for learning the grammar of a natural language?

While the definitive answer to this second question is far from settled, linguists have arrived at a few concrete generalizations. Chief among these is the understanding that, although languages appear to vary from one another in radical ways, careful examination shows that all natural languages are cut from the same innate cloth. And although we cannot judge ahead of time what the fine structure of our language faculty looks like, or the full range of dimensions along which it can vary, we have an increasingly sharp picture of which properties are important and which properties natural languages simply cannot have.


When we begin to examine these questions seriously, we run into innumerable new ones, and although research into generative grammar over the past sixty years has secured satisfying explanations for many of them, there are many directions new research can take us. The five essays that make up this special issue of The Classic are initial investigations into the structure of specific natural languages. Each one examines a fragment of a particular human language and, using the methodological tools of generative syntax, puts forth a model grammar that may be used to formulate new hypotheses about the fine structure of our language faculty. Crucially, each discussion considers data from a language that has manifold surface differences from English. After describing the rules of each of these languages, the essays emphasize that the ways in which these languages vary are actually quite restricted. Each contribution can be taken as the starting point of further investigation into the fine structure of our innate ability for language.

—Jonathan Crum

Doctoral Student, UGA Department of Linguistics

Writing Intensive Program Teaching Assistant


[i] Dr. Lee-Schoenfeld thanks Dr. Jorge Hankamer at UC Santa Cruz for sharing his “Syntactic Structures” materials with her when she was his teaching assistant in 2002. Her LING 3150W course is based directly on Dr. Hankamer’s LING 55.