The Primacy of Grammar Part 3
Keeping to Fodor's conceptions of syntax and semantics, much of Fodor's recent work (Fodor 1994, 1998) may be viewed as a defense of Old Testament semantics, the study of language-world connections, against any other form of semantics such as those involving conceptual roles, exemplars, prototypes, and the like (Murphy 2002): call them "New Testament semantics." If Fodor is right in his rejection of New Testament semantics, and if Chomsky is right in rejecting Old Testament semantics, no intelligible concept of semantics survives outside the internalist concept of language use proposed in biolinguistics (Bilgrami and Rovane 2005). Beyond biolinguistics, vast gaps of understanding surround studies on language and related mental aspects of the world, even when we set aside the various dimensions of the unification problem.
We now have some idea of the respects in which biolinguistics is isolated from the rest of human inquiry, including other inquiries on language.
The twin facts of the isolation and the scientific character of biolinguistics raise the possibility that biolinguistics may have identified a new aspect of the world. I assume that we talk (legitimately) of an aspect of the world only in connection with a scientific theory of an advanced character with the usual features of abstract postulation, formalization, depth of explanation, power of prediction, departure from common sense, and so on. This is what Black's notion of a "body of doctrines" implies, in my opinion. Chemical, optical, and electrical count as bodies of doctrines because, in each case, there is a cluster of scientific theories that give a unified account of a range of processes and events that they cover: the broader and more varied the range, the more significant the appellation "body of doctrines." With the exception of rare occasions of unification, science typically proceeds under these separate heads, extending our understanding of the aspect of the world it already covers. Thus, not every advance in science results in the identification of a new aspect of the world.
It follows that, since biolinguistics is a science, it extends our understanding of some aspect of the world. However, since it is isolated from the rest of science, the aspect of the world it covers does not fall under the existing heads; therefore, biolinguistics has identified a new aspect of the world. We need to make (metaphysical) sense of the puzzling idea that the object of biolinguistics stands alone in the rest of the world.
The obvious first step to that end is to form some conception of how biolinguistics works. As noted, biolinguistics attempts to solve a specific version of Plato's problem with explicit articulation of the computational principles involved in the mind-internal aspects of language, including aspects of language use that fall in this domain. We have to see how exactly meaningful solutions to Plato's problem are reached within these restrictions.
2 Linguistic Theory I

I provided a general historical overview of the generative enterprise in the last chapter; hence, I will skip discussion of the early phases of the enterprise (see Boeckx 2006, chapter 2). The current review thus goes straight into the principles-and-parameters (P&P) framework, developed in the early 1980s and generally viewed as a watershed in the short history of the field. The P&P framework essentially consists of two phases: an earlier phase known as Government-Binding Theory (G-B) and the current Minimalist Program (MP). I have organized the discussion of these phases as follows.
Except for some brief remarks on MP near the end, I will concentrate on G-B in this chapter because most of the traditional issues concerning the notions of grammar, language, and meaning discussed in the next two chapters (chapters 3 and 4) can be addressed with G-B in hand. Once I have done so, I will return to MP in chapter 5 to show that it takes these issues to a different plane altogether. Moreover, although possible, it is difficult to describe the P&P framework directly with MP, just as it is difficult to describe the theory of relativity without grounding the discussion first in Newtonian theory (Piattelli-Palmarini 2001, 3). This is reinforced by the fact that, as we will see in section 5.2, each principle postulated in MP derives in one way or another from G-B itself (Boeckx 2006, chapter 3; also Hornstein and Grohmann 2006). Finally, a crucial discussion in chapter 5 will require a comparative study of the principles of G-B and MP; to that end, a discussion of G-B is needed in any case.
I begin the discussion with a classic philosophical problem in this area, the scope problem raised by Bertrand Russell, to show how contemporary linguistics offers a marvelous solution to this problem. This is not the way Chomsky and other linguists would like to introduce their discipline.1 Needless to say, the basic scientific program throughout is to solve Plato's problem in the domain of language. Solution of classical philosophical problems from within the same explanatory goals, then, ought to be viewed as a bonus; it also helps prepare the ground for questioning nongrammatical approaches to language.
2.1 Russell's Scope Problem

In an epoch-making paper, the philosopher Bertrand Russell (1905) raised the following problem. We know that France had ceased to be a monarchy at a certain point in history. Then what should be the truth value of sentence (1) uttered after that time?
(1) The king of France is wise.
Since there is no king of France, the sentence cannot be true. Can it be false? The trouble is that if (1) is false, then sentence (2), other things being equal, ought to be true.
(2) The king of France is not wise.
How can (2) be true if there is no king of France? Whose lack of wisdom is asserted here? After rejecting a number of obvious and, in my opinion, fairly plausible options, Russell suggested that the notation of (first-order) quantification be used to rewrite (1) as (3),

(3) (∃x)(Kx & (∀y)(Ky → x = y) & Wx)

which means informally that there is a king of France, x, and that if anyone else, y, is a king of France then x and y must be one and the same; and also that x is wise. Russell thought that (3) captured the meaning of (1). Clearly, (3) will be false if there is no king of France. Then Russell suggested an ingenious move for (2). Thinking of not as the familiar negation operator (¬), Russell argued that there are two places in (3) where the operator can occur, giving rise to (4) and (5).
(4) (∃x)(Kx & (∀y)(Ky → x = y) & ¬Wx)
(5) ¬(∃x)(Kx & (∀y)(Ky → x = y) & Wx)

It is clear that (4) also will be false if there is no king of France so that the uneasy issue of "who-is-unwise" does not arise. When (3) is false, (5) indeed is true but it is not true of someone; hence, the problem disappears.
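To see concretely why the move works, here is a minimal sketch, not from the text, that evaluates the truth conditions of (3), (4), and (5) over a toy domain in which no one is king of France; the individuals and the extension of "wise" are invented purely for illustration.

```python
# Toy model: no individual satisfies Kx (king of France) after the monarchy ends.
domain = ["a", "b", "c"]     # hypothetical individuals
king_of_france = set()       # extension of Kx: empty
wise = {"c"}                 # extension of Wx: chosen arbitrarily

def unique_king(x):
    """Kx & (for all y)(Ky -> x = y): x is king, and any king is identical to x."""
    return x in king_of_france and all(y == x for y in king_of_france)

# (3): (exists x)(Kx & (for all y)(Ky -> x = y) & Wx)
three = any(unique_king(x) and x in wise for x in domain)
# (4): narrow-scope negation -- (exists x)(Kx & ... & not Wx)
four = any(unique_king(x) and x not in wise for x in domain)
# (5): wide-scope negation -- not (exists x)(Kx & ... & Wx)
five = not three

print(three, four, five)  # False False True: only the wide-scope reading (5) comes out true
```

On this picture the apparent paradox dissolves: (5) is true without being true of anyone.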
Many authors, including myself, have held that, for definite descriptions, such a solution does not work and is perhaps not even required (Strawson 1950, 1961; Hawkins 1978; Heim 1982; Mukherji 1987, 1995, etc.). I postpone a discussion of this aspect of Russell's theory of definite descriptions (see section 3.5.3). Nevertheless, notwithstanding the merits of the specific solution, Russell's analysis of the problem brought out a general feature of languages.
The heart of Russell's solution was the observation that (2), a sentence of English, is structurally ambiguous. Example (2) is not ambiguous because some word(s) occurring in (2) are ambiguous; thus the ambiguity of (2) is quite different from the ambiguity of, say, The kite is flying high, where kite could mean either a bird or an artifact. According to Russell, (4) and (5) represent disambiguated versions of (2) and, if his analysis is correct, we know the exact source of this ambiguity. The ambiguity lies in the relationship between the quantifier (∃x) and the negation operator (¬): in (4), the operator is inside the scope of the quantifier; in (5), the relation is reversed. Statements (4) and (5) bring out these facts formally by specifying the location/position of these items in each case. For example, in (4), the negation operator is said to have a "narrow" scope since it occurs to the right of the quantifier; the quantifier in (4) has a "wide" scope since it occurs to the left of the operator. Likewise for (5). This is the sense in which (2) is structurally ambiguous.
Several interesting points emerge. First, the surface or phonetic form of (2) conceals the ambiguity. Second, the ambiguity affects, as we saw, how (2) is to be semantically interpreted, that is, whether (2) is true or false.
Third, the semantic ambiguity can be traced to structural ambiguity in a suitable canonical notation. All of this led Russell to distinguish between the surface form of a sentence such as (2), that is, how the sentence looks and sounds, and its logical form(s) such as (4) and (5), that is, how the sentence is to be interpreted. Clearly, the distinction obtains for any sentence in any language even if a particular sentence is not structurally ambiguous: the (unique) logical form of the sentence tells us why. I have ignored what Russell took to be the central feature of the distinction, namely, that the unit The king of France is no longer represented either in (4) or in (5). This led Russell to conclude that The king of France is semantically insignificant; it is an "incomplete symbol." It is interesting that, in philosophical circles, this conclusion still generates a lot of heat (Buchanan and Ostertag 2005).
Perhaps the most interesting point of the logical form (5) is that here the negation operator "¬" occurs at the front of the sentence whereas not occurred near the end of (2). Given that (5) represents a (canonical) interpretation of (2), it follows that the lexical item not has not been interpreted where it has been sounded. In that sense, the interpretation of not is displaced, that is, phonetic and semantic interpretations are assigned at a "distance." Russell's analysis gives an explicit account of a specific displacement because of the specificity of his interests. Yet, perhaps fortuitously, Russell put his finger on a much wider phenomenon.
Scope distinctions, whether or not they involve the particular operator and the quantifier mentioned above, can now be generally viewed as examples of displacement. Consider (6).
(6) Every boy danced with a girl.
Example (6) is clearly ambiguous between (i) every boy found at least one girl to dance with, and (ii) a girl is such that every boy danced with her.
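A minimal sketch, with a situation invented purely for illustration, shows how the two readings of (6) come apart in truth conditions.

```python
# Toy situation: each boy danced with a different girl, so reading (i) holds but (ii) fails.
boys = ["boy1", "boy2"]
girls = ["girl1", "girl2"]
danced_with = {("boy1", "girl1"), ("boy2", "girl2")}

# Reading (i): every boy danced with some girl or other  (every > a)
reading_i = all(any((b, g) in danced_with for g in girls) for b in boys)
# Reading (ii): some one girl is such that every boy danced with her  (a > every)
reading_ii = any(all((b, g) in danced_with for b in boys) for g in girls)

print(reading_i, reading_ii)  # True False: the two scopings describe different situations
```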
In fact, the phenomenon is even wider. Consider an active-pa.s.sive pair such as (7) and (8).
(7) Bill has read the book.
(8) The book has been read by Bill.
The expressions the book and Bill continue to have the same interpretations (object and agent, respectively, of the action of reading) in (7) and (8), although they occupy very different positions in these sentences. Such examples give rise to the even more general idea that sound-meaning connections in natural languages are typically indirect in that one cannot read off the meaning of a sentence from its phonetic form; hence the need for canonical representation of meaning. In that sense, Russell opened our eyes to a fundamental feature of natural languages.
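The point can be pictured with a rough sketch (the attribute labels are my own, not the theory's notation): (7) and (8) differ in surface order yet plausibly share a single predicate-argument structure.

```python
# Surface forms differ; the hypothesized predicate-argument structure does not.
active = {
    "surface": ["Bill", "has", "read", "the book"],
    "arguments": {"predicate": "read", "agent": "Bill", "theme": "the book"},
}
passive = {
    "surface": ["the book", "has", "been", "read", "by", "Bill"],
    "arguments": {"predicate": "read", "agent": "Bill", "theme": "the book"},
}

assert active["arguments"] == passive["arguments"]   # same semantic interpretation
assert active["surface"] != passive["surface"]       # different phonetic forms
```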
This is perhaps the place to list some of the classic examples that have occupied linguists over the years to show just how widespread the phenomenon really is. Each of these displays various complex and indirect relationships between phonetic and semantic interpretation, thereby bolstering the argument from the poverty of stimulus: the stimulus properties of the datum supply insufficient evidence for the child to decide how a sentence is to be semantically interpreted.
(9) Flying planes can be dangerous.
(10) Shooting of the hunters disturbed Mary.
(11) The troops stopped drinking in the village.
(12) John is easy to please.
(13) John is eager to please.
(14) John is too stubborn to talk to Bill.
(15) John is too stubborn to talk to.
Examples (9), (10), and (11) are structurally ambiguous in various ways.
For example, (9) might mean that it is dangerous to fly planes, or that planes that fly can be dangerous; in (10), what disturbed Mary could be either that hunters themselves were getting shot or the fact that hunters were shooting, say, birds; a similar analysis accompanies interpretations of (11). As the paraphrases show, these examples have similarities with scope ambiguities. The pairs (12)/(13) and (14)/(15), on the other hand, require major differences in underlying semantic interpretations despite very close similarities in their phonetic shapes. In (12), John is the Object of please, while in (13) some arbitrary individual(s) is the Object of please; similarly for (14) and (15).
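For instance, the contrast between (12) and (13) can be displayed schematically (the labels are illustrative, not the theory's own notation): nearly identical surfaces map onto reversed argument structures.

```python
# (12) John is easy to please:  someone (arbitrary) pleases John.
# (13) John is eager to please: John pleases someone (arbitrary).
easy_to_please  = {"predicate": "please", "agent": "arbitrary", "object": "John"}
eager_to_please = {"predicate": "please", "agent": "John", "object": "arbitrary"}

assert easy_to_please != eager_to_please  # close sounds, distant meanings
```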
How good is Russell's own analysis of the general phenomenon even if we ignore the specific theory of descriptions? The trouble is that canonical forms such as (4) and (5) are central to Russell's approach, and it is not at all clear what is accomplished by these forms. As I viewed the matter, displacement is clearly a natural phenomenon demanding a principled account. For various reasons, it is rather doubtful whether Russell has given such an account.
First, it is not clear in what sense, say, (4) and (5) give an account of (2). The notation of logical theory itself belongs to an artificial language.
So what Russell has done in (4) and (5) is to capture two of his intuitive interpretations of (2) in two sentences of this artificial language. In effect, this exercise has, at best, the same force as two Hindi sentences displaying the ambiguity of (2). In fact, for those who understand Hindi, the latter exercise is likely to be preferred over Russell's "analysis" since Hindi speakers can depend on their own linguistic intuitions that are naturally associated with their knowledge of Hindi. This point is exemplified by the English paraphrases (i) and (ii) of (6). Clearly, these paraphrases are informal displays of the phenomenon; the question will be begged if they are thought to give an account of the interpretation(s) of (6).
The logical notation, on the other hand, is a tool created by the logician and, hence, it is not associated with any linguistic intuitions independently of the intuitions of the native speaker of English. The fact that we "see" that (2), for instance, has the same meaning as (4) is not because we have some independent knowledge of the meaning of (4), but because we simply (and intuitively) associate (4) with our knowledge of the meaning of (2). In other words, if we are asked to explain the meaning of (4), the most we can do is to say that it has one of the meanings of (2). So if the task is to explain what knowledge of meaning of (2) we have, then it is not accomplished by writing (4) down. We return to this issue in chapter 3, after witnessing the explanatory power of an alternative framework in this chapter.
Second, the logical notation does not even suffice as an adequate notational scheme. Even if we grant, say, that (4) and (5) give an account of (2) in some (yet to be clarified) sense, the account ought to begin with the structural features of (2) and lead systematically to the structural features of (4) and (5). This will be one way of "rationally reconstructing" the native speaker's knowledge of (2), notwithstanding the lack of explanatory force of this exercise. Any procedure that establishes a structural link between (2) and (4) cannot begin unless we have a proper syntactic characterization of (2). To have that characterization is to already have a canonical notation. Of course, once we have that characterization with a syntactic theory, we may use its resources to "plug in" logical formalism as systematically as we can. But then we need independent justification for the duplication of the effort, as in much work in formal semantics, as we will see.
Thus we want linguistic theory to solve two major problems simultaneously: the unique linguistic version of Plato's problem, and the scope problem as an instance of the general displacement problem. This project will take us to the end of this chapter. After completing the project, we return to a more detailed examination of the relations between logical and grammatical theories in the next chapter.
I have just raised, and will continue to raise, a variety of objections against the use of logical theory to explain the workings of natural languages.2 None of this is meant as an objection to logical theory itself. I hold the development of mathematical logic through the work of Gottlob Frege, Bertrand Russell, Alfred Tarski, Kurt Gödel, Alonzo Church, Alan Turing, and a host of others, as one of the most significant achievements in the history of human thought. As an ardent student of logic, I cannot fail to admire the beauty of its constructions, its metaproofs, and a series of surprising results on the character of formal systems culminating in Kurt Gödel's mind-boggling work. But praise is due where it belongs.
The basic objection I am raising is simple and, to my mind, pretty obvious. Almost all of the work in mathematical logic originated with an abstract and intuitive characterization of a small list of words from natural language: & for and, ¬ for not, ∀ for all, ∃ for some, → for if, (variable) x for it, and so forth. It was a stupendous feat to construct extremely complex systems and proof strategies from such a small basis.
One of the natural requirements for these constructions was to keep the basic formalism, including a scheme of interpretation, under strict control so that the vagaries of the basis do not contaminate the resulting constructions. It is not surprising, therefore, that the characterization of the basis not only required a prior intuitive understanding of certain English words; it reflected the meanings of these words only partially to enable the logician to reach a "core" meaning to which the logical concerns of deducibility, validity, and so on can be systematically attached. As Peter Strawson (1952, 148) described the logical enterprise, "the rules of the system" of logic ensured that the constants of the system formed "neatly systematic relations to one another" that the words of English do not have.
By the same token, we cannot expect expressions of logic to give an account of the meanings of these English words. A very different form of inquiry is needed to explain the workings of natural language. In pursuing it, no doubt, lessons from the rest of human inquiry, including logical theory, will be drawn on. In fact, the basic explanatory format of linguistic theory, namely, the computational-representational framework, is adopted directly from a specialized branch of mathematical logic known as "computability theory" advanced by Gödel, Church, Turing, and others. But not surprisingly, the domains of application of mathematical logic and linguistic theory differ sharply.
2.2 Principles and Parameters

I now turn to the treatment of the scope problem in generative grammar.3 According to Chomsky (2000d), Universal Grammar (UG) postulates the following provisions of the faculty of language (FL) that enter into the acquisition of language:

A. A set of features
B. Principles for assembling features into lexical items
C. Operations that apply successively to form syntactic objects of greater complexity

CS, the computational system of a language, incorporates (C) in that it integrates lexical information to form linguistic expressions ⟨PF, LF⟩ at the interfaces where language interacts with other cognitive systems of the mind. Although there has been significant progress in recent decades on principles of lexical organization (Pustejovsky 1995; Jackendoff 2002; also Pinker 1995a for a popular review), linguistic theory has been primarily concerned with the properties of CS. In what follows, therefore, I will also concentrate on CS to review the character of the principles contained there.
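As a crude sketch of the architecture just described (my own illustration, not the theory's machinery), CS can be pictured as a function from assembled lexical items to a paired phonetic and semantic representation; the lexical entries below are invented for the example.

```python
def computational_system(lexical_items):
    """Toy stand-in for CS: map assembled lexical items to a <PF, LF> pair."""
    pf = " ".join(item["sound"] for item in lexical_items)   # crude phonetic form
    lf = tuple(item["meaning"] for item in lexical_items)    # crude logical form
    return pf, lf

lexicon = [{"sound": "john", "meaning": "JOHN"},
           {"sound": "sleeps", "meaning": "SLEEP(x)"}]
print(computational_system(lexicon))  # ('john sleeps', ('JOHN', 'SLEEP(x)'))
```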
The basic features of the P&P framework (Chomsky 1981) can be brought out as follows. We may think of four kinds of rules and principles that a linguistic theory may postulate. First, the formulation of some rules may be tied to specific languages; call them "language-specific rules" (LSR): relative clauses in Japanese, passivization in Hindi, and so on. Second, some rules may refer to specific constructions without referring to specific languages; call them "construction-specific rules" (CSR): NP-preposing, VP → V NP, and the like. (I am introducing this group for expository purposes. In practice, these rules often refer to language typologies; for example, VP → V NP holds only for head-first languages. It does not affect the discussion that follows.) Third, we may have rules that refer neither to specific languages nor to specific constructions, but to general linguistic categories; call them "general linguistic principles" (GLP): a lexical item may have a θ-role just in case it has Case, an anaphor must be bound in a local domain, there is a head parameter, and the like. Finally, we may have rules that simply signal combinatorial principles and general principles of interpretation without any specific mention of linguistic categories; call them "purely computational principles" (PCP): all elements in a structure must be interpretable, the shorter of two converging derivations is valid, and so on. I discuss this taxonomy from a different direction in section 5.2.
The remarkable thing about current linguistic theory is that there is a real possibility that rules of the first two kinds, namely, LSR and CSR, may be totally absent from linguistic theory. The P&P framework made this vast abstraction possible. In slightly different terms than mine, Chomsky (1991a, 23–24) brings out the basic features of a P&P theory as follows. Consider two properties that descriptive statements about languages might have: a statement may be language-particular or language-invariant [±lp, where lp ≈ LSR], or it could be construction-particular or construction-invariant [±cp, where cp ≈ CSR]. Then, according to Chomsky, a P&P theory contains only general principles of language that are [−lp] and [−cp], and a specification of parameters that is [+lp] and [−cp]. Traditional grammatical constructions such as active-passive, interrogative, and the like are "on a par with such notions as terrestrial animal or large molecule, but are not natural kinds." Once the parameters are set to mark off a particular language, the rest of the properties of the expressions of this language follow from the interaction of language-invariant principles: "the property [±cp] disappears."
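As a rough illustration of the parametric idea (this is my sketch, not the framework's formalism), a single head-direction parameter can fix surface order without any language- or construction-specific rule.

```python
def linearize_vp(verb, complement, head_first=True):
    """Order a verb and its complement from one parameter value plus invariant structure."""
    return [verb, complement] if head_first else [complement, verb]

# English-like setting vs. Japanese-like setting of the same parameter.
print(linearize_vp("read", "the book", head_first=True))    # ['read', 'the book']
print(linearize_vp("read", "the book", head_first=False))   # ['the book', 'read']
```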