Dynamics of Language An Introduction
Preface
For the whole of the last half-century, all of us studying language from a formal
perspective have assumed that knowledge of language is different from the tasks
of speaking and understanding. This idea was first introduced by Saussure at
the turn of the century, and later recast by Chomsky (1965) as the distinction
between competence and performance. On both these views knowledge of language
is some capacity identifiable through abstraction away from the use of language in
communication; and languages, hence language itself, have to be studied entirely
independently of the way they are used, because explaining the phenomena of
speaking and understanding a language has to make reference to this use-neutral
knowledge which underpins them. During the course of the century, there were
some dissenters, but, since no alternative was ever formally articulated, by and
large, this view held total sway.
This book takes a different tack: it continues the task set in hand by Kempson
et al. (2001) of arguing to the contrary that the common-sense intuition is correct
and that knowledge of a language consists in being able to use it in speaking and
understanding. What we argue is that defining a formal model of how interpretation
is built up across a sequence of words relative to some context in which it might
be uttered is all that is needed to explain structural properties of language. The
dynamics of how interpretation is built up just is the syntax of a language system.
The capacity for language is thus having in place the architecture which enables
us to produce and understand language input in context. What is central to the
explanation, through every twist and turn of the formal intricacies, is the faithful
reflection of the dynamics of processing a string: bit by bit, each unit of information
that constitutes the next input picks up on what is presented before and adds to it
to provide the context for the next addition to be processed.
Given this commitment, such a model might look like a new variant of use-
theories of meaning, a continuation of the later Wittgenstein's admonition that all
there is to meaning lies in usage, and nothing more need be said. However,
out of this earlier work, a resolutely anti-formalist stance developed, so that this
tradition is in danger of being anti-theory, dismissing the need for any formal expla-
nation. What we are arguing, however, is that there is room for views in between
these extremes. Our proposal is that properly formal theories of language can be
articulated in terms of the dynamics of how a language enables interpretation to
be progressively built up. Against the use-theory tradition, we are committed
to the same methodology as previous grammar-writers in the tradition of theoretical
linguistics: that a grammar must define what it means for an expression of the
language to be well-formed, and that it must couch such explanations in terms of
distributional properties of the expressions of the language. Indeed, just like anyone
else in this tradition, we present a formal framework for articulating such struc-
tural properties of language. In this respect, the structure of the book is within
the tradition of formal linguistic modelling: we introduce the formal framework; we
apply it to individual analyses in individual languages to show that it provides a
reasonable way of addressing standard structural properties of that language; and
from there we enlarge the horizon to show how it can be used to articulate gen-
eral properties across languages. In particular we show how it is competitive in
addressing a whole range of phenomena that are extremely puzzling seen from the
perspective of other frameworks. In all these respects, the methodology is familiar.
However, there is a very different twist, because the formal framework we present
has an attribute which grammars of the past century, with a few honourable excep-
tions, have not had. The formalism which we present as the basis for articulating
grammars of natural language is able to define processes of growth of information
across sequences of expressions; and we argue that natural-language grammars, by
definition, reflect the dynamics of real time processing in context. In essence, we
take syntax to be a process, the process by which information is extracted from
strings of words uttered in some context. This stance not only leads to different
analyses of individual phenomena, and a commitment to very different forms of
explanation; but it opens up a whole new debate about the nature of language,
the formal properties of natural-language models, and the relation of language to
explanations of the way linguistic ability is nested within a more general cognitive
perspective.
The formal framework of Dynamic Syntax was set out in Kempson et al. (2001),
but in writing this book we have had different objectives. First, our task has been
to convey to a general linguistic audience, with a minimum of formal apparatus, the
substance of that formal system, without diminishing the content of the explana-
tions. Secondly, as linguists, we set ourselves the task of applying the formal system
defined to as broad an array of linguistic puzzles as possible. On the one hand, we
have covered the kind of data other books might cover – problems displayed by
English. On the other hand, we have shown how the very same system can be used
to develop a detailed account of Japanese as a prototypical verb-final language
showing that from the perspective we set out, Japanese looks just as natural as any
other language. This is a good result: verb-final languages comprise about half the
world’s languages; yet, to all other current theoretical frameworks, they present a
cluster of properties that remain deeply puzzling. Indeed, on looking back from this
new perspective in which they fall so naturally into the same type of explanation,
it is at the very least surprising that the community of linguists should have been
content for so long with theoretical frameworks that found these languages such a
challenge. We have also looked in some detail at the intricacies of the agreement
system in Swahili, one of a language family – Bantu – which seems to pose quite
different problems yet again from either of these two languages. Having set out a
range of detailed analyses, and a demonstration of how each fits into general ty-
pologies for language structure, the book then closes with reflecting on the novel
perspective which it opens up, showing how old questions get new answers, and
new challenges emerge. Amongst these is the demonstration that with the gap be-
tween a formal competence model and a formal performance model being so much
narrowed, we can fruitfully address challenges set by psycholinguists. One of these
is the challenge to provide a basis for modelling dialogue, the core phenomenon of
language use; and we argue that the Dynamic Syntax framework provides a good
basis for modelling the free and easy interchange between the tasks of speaking and
understanding, displayed in dialogue.
In writing such a book, it is always more fruitful to write as a community, and,
first of all we would like to take the opportunity of thanking each other for having
got ourselves to a point where we none of us feel we would have got to without the
other. As one might expect from such collaboration, the book has spawned other
publications during the process of its completion. Parts of chapter 3 were written up
as Kempson (2003), Kempson and Meyer-Viol (2004), parts of chapters 4-5 as Cann
et al. (2003), Cann et al. (2004), Cann et al. (2005), Kempson, Kiaer and Cann
(forthcoming), Kempson, Cann and Kiaer (2004). Part of chapter 6 is written up as
Kempson and Kiaer (2004), Kempson (2005), parts of chapter 7 as Marten (2000,
2003, 2005), parts of chapter 8 as Cann (forthcoming a, b). The implementation of

Chapter 1: The Point of Departure
time, but it does not take much scratching at the surface before the problems start
to emerge.
Within the perspective which the current methodology imposes, there are two
central properties displayed by all languages; and these pose a recognised major
challenge for theoretical explanations of language. On the one hand there is the
compositionality problem of how words and what they are taken to mean can be
combined into sentences across an indefinite array of complexity. We have various
means of saying the same thing, using words in a variety of orders:
(1.1) Tomorrow I must see Bill.
(1.2) Bill I must see tomorrow.
And any one of these sequences can always be embedded at arbitrary depths of
complexity:
(1.3) You insisted I think that Harry I must interview today and Bill I must see
tomorrow.
(1.4) The friend of mine who told me that Sue was insisting that tomorrow I
must see Bill was lying.
On the other hand, there is the problem of context-dependence. Pronouns are
a familiar case: they have to be understood by picking up their interpretation
from some other term in some sense recoverable from the context in which they
are uttered. However, the phenomenon is much, much more general. Almost every
expression in language displays a dependence on the context in which the expression
might occur. The effect is that any one sentence can be taken to express a vast
number of different interpretations. Even our trivial examples (1.1)-(1.2) differ
according to who the speaker is, who Bill is, and when the sentence is uttered. The first
challenge, then, to put it another way, is to be able to state the interaction between
order of words and interpretation within a sentence. The second challenge is to
explain how the interpretation of words may be related to what has gone before
them, either within the sentence itself or in some previous sentence.
The types of solution for addressing these problems are almost never challenged.
The first problem, right across theoretical frameworks, is taken to require a syntactic
solution. The second, equally uniformly, is taken to present a semantic problem
only. These challenges are then taken up by different breeds of theoretical linguist:
syntactician on the one hand, semanticist (or pragmaticist) on the other. Given
this separation, we might reasonably expect that two-way feeding relations between
the phenomena would be precluded. Indeed, systematic interaction is virtually
inexpressible given many of the solutions that are proposed. Yet such expectation
would be entirely misplaced. There is systematic interaction between the two phe-
nomena, as we shall shortly see, with context-dependent expressions feeding into
structural processes in a rich assortment of ways.
The significance of this interaction is, we believe, not sufficiently recognised. The
phenomena themselves have not gone unnoticed, but their significance has. Such
interactions as are identified are generally analysed as an exclusively syntactic phe-
nomenon, with those expressions that display interaction with syntactic processes,
characteristically pronouns, analysed as having distinct forms, one of which is sub-
ject to syntactic explanation, the other semantic. We shall see lots of these during
the course of this book. The result is that the phenomena may be characterised by
the various formalisms, but only by invoking suitable forms of ambiguity; but the
underlying significance of the particular forms of interaction displayed has gone
almost entirely uncommented upon (though exceptions are now beginning to emerge:
Boeckx (2003), Asudeh (2004)).
What we shall argue to the contrary is that what these systematic interactions
show is that the problem of compositionality and the problem of context-dependence
are not independent: syntactic and semantic aspects of compositionality, together
with the problem of context dependence, have to be addressed together. Making
this move, however, as we shall demonstrate, involves abandoning the method-
ological assumption of use-neutral grammar formalisms in favour of a grammar
formalism which directly reflects the time-linearity and context-dependent growth
of information governing natural language parsing.1 Intrinsic properties defining
language, we shall argue, reflect the way language is used in context so directly that
the structural properties of individual languages can all be explained in terms of
growth of structure relative to context. We shall argue accordingly that parsing is
the basic task for which the language faculty is defined; and that both syntactic and
semantic explanations need to be rethought in terms of the dynamics of language
processing.
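The idea of growth of structure relative to context can be pictured, in a deliberately crude way, as follows. This is not the Dynamic Syntax formalism itself, merely a sketch of what time-linear, monotonic growth of partial information means; the slot names and the subject-verb-object assumption are our own.

```python
# Each word monotonically updates a partial representation, and each
# partial state is the context against which the next word is processed.
# The three-slot "structure" is a crude illustrative assumption.

def process(words):
    state = {"subject": None, "predicate": None, "object": None}
    history = []
    for word in words:
        if state["subject"] is None:
            state["subject"] = word      # the first expression fills the subject
        elif state["predicate"] is None:
            state["predicate"] = word    # the next fills the predicate
        else:
            state["object"] = word       # and the next the object
        history.append(dict(state))      # snapshot of each partial state
    return history

for step in process(["John", "upset", "Mary"]):
    print(step)
# Each printed state is richer than the last: information grows
# monotonically, word by word, with no stage thrown away.
```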
Setting out the argument is the burden of the entire book, together with the
presentation of the framework of Dynamic Syntax (Kempson et al. 2001). In
this first chapter, by way of preliminary, we set out these two challenges in some
detail, provide some preliminary justification for the stance we take, and give some
indication of the type of analysis we shall provide.
for natural languages that, in defining well-formedness of strings in terms of possible continuations,
also induce strings on a left-right basis (Hausser 1989, 1998).
noun phrase to yield a sentence, or, as a closer rendering of what the rules actually
determine, a sentence may be made up of an NP and a VP, and a VP can be made
up of a Verb and an NP. By a very general convention, any realisation of some
application of these rules gets represented as a tree structure into which the words
are taken to fit:2
(1.5)
     S → NP VP        (S = sentence; NP = noun phrase)
     VP → V NP        (VP = verb phrase; V = verb)

              S
             / \
           NP   VP
           |   /  \
         John  V    NP
               |     |
             upset  Mary
As the starting point for articulating a semantics for such structured strings,
we might assume that there are “semantic” rules which determine how meanings
for sentences are built up on the basis of the meanings of their words and such
structural arrangements. So we might express what it is that (1.5) means on the
basis of first knowing that upsetting is a relation between individuals – something
that people can do to us – that the term Mary refers to some individual bearing
that name, John to some individual called by the name ‘John’. What knowing
the meaning of the sentence then involves, is having the capacity to combine these
pieces of information together following the structure of the sentence to yield the
assertion that John upset Mary (not the other way round). Somewhat more for-
mally, providing the semantic characterisation of a sentence involves specifying the
conditions which determine the truth of any utterance of that sentence; and, as part
of this programme, the meaning of a word is defined as the systematic contribution
made by its interpretation within the sentences in which it occurs. The semantic
rules that guarantee this rely on an independently articulated syntax. So, for (1.5),
following the syntactic structure, the meaning assigned to the verb upset applies as
a function to the individual denoted by the name Mary to yield the property of up-
setting Mary, which in its turn applies to the individual denoted by the name John
in subject position to yield the proposition that John upset Mary at some time in
the past, relative to whatever is taken to be the time of utterance.3 The example we
have given is trivial. But the underlying assumptions about design of the grammar
are extremely widespread. This pattern of defining a syntax that induces structure
over the strings of the language, with an attendant truth-conditional semantics
that is defined in tandem with the defined syntactic operations, has been a guid-
ing methodology for the last thirty odd years, even though individual frameworks
disagree over the detailed realisation of such formalisations.
2 In some frameworks, S corresponds to IP or CP (Inflection Phrase or Complementiser
Phrase). For this kind of syntax-semantics interaction, based on ideas taken from
Generalized Phrase Structure Grammar, see Gazdar et al. (1985).
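The function-application story just described can be made concrete with a small sketch. The encoding below is our own illustration of the jigsaw view, not part of any framework under discussion; the primed names are stand-ins for model-theoretic objects.

```python
# Word meanings as functions: a transitive verb denotes a function from
# object meanings to a function from subject meanings to a proposition.
# The primed strings are illustrative stand-ins for denotations.

john, mary = "john'", "mary'"

def upset(obj):
    # Applied to an object, the verb yields a property
    # (here, the property of upsetting that object) ...
    return lambda subj: f"upset'({subj}, {obj})"

# ... following the tree in (1.5): V applies to the object NP to give
# the VP meaning, which applies to the subject NP to give the S meaning.
property_of_upsetting_mary = upset(mary)
proposition = property_of_upsetting_mary(john)

print(proposition)  # upset'(john', mary')
```

Reversing the order of application would yield upset'(mary', john') instead, which is why the structure, and not just the word meanings, fixes who upset whom.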
The particular toy grammar we have given here does not allow for recursion.
But on the assumption that we introduce appropriate syntactic rules to allow for
recursion, this model will have the advantage of being able to reflect the infinitely
recursive property for both syntax and semantics, given the definition of semantic
modes of combination running on the back of the defined syntactic rules.4 As a
reflection of the systematicity of this syntax-semantics relation, we might dub this
view “the jigsaw view of language”: each word and its meaning add together with
its neighbouring words and their meaning to give bigger and bigger pieces which fit
together to form a sentence plus meaning pair for a sentence when all these various
parts are combined together. Knowledge of language, then, constitutes knowledge
of this set of principles; and, accordingly, models of language use will have to take
such models as their point of departure.
fully appropriate mnemonics were coined for labelling distinct syntactic constructions - island
constraints, the Right Roof Constraint (Ross 1967), Crossover phenomena (Postal 1970), etc.
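To see how a single recursive rule yields the infinitely recursive property, consider a sketch in which the toy grammar is extended with a sentence-embedding verb. The lexicon and the rule VP → Vs S below are our own illustrative assumptions, not rules proposed in the text.

```python
# Toy recogniser for: S -> NP VP;  VP -> V NP | Vs S.
# The second VP rule is recursive, so the grammar licenses
# unboundedly deep embeddings like "John thinks Mary thinks ...".

LEXICON = {
    "John": "NP", "Mary": "NP",
    "upset": "V",      # plain transitive verb
    "thinks": "Vs",    # sentence-embedding verb (the recursive case)
}

def is_S(words):
    """S -> NP VP: an NP word followed by a well-formed VP."""
    return len(words) >= 2 and LEXICON.get(words[0]) == "NP" \
        and is_VP(words[1:])

def is_VP(words):
    """VP -> V NP  |  Vs S (recursion back to S)."""
    if len(words) == 2 and LEXICON.get(words[0]) == "V" \
            and LEXICON.get(words[1]) == "NP":
        return True
    return bool(words) and LEXICON.get(words[0]) == "Vs" \
        and is_S(words[1:])

print(is_S("John upset Mary".split()))                          # True
print(is_S("John thinks Mary thinks John upset Mary".split()))  # True
print(is_S("upset Mary John".split()))                          # False
```

A matching semantic clause for the Vs rule would run on the back of the recursion in exactly the same way, which is the sense in which semantic combination can be defined in tandem with the syntactic rules.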
The subject of obvious, which we might expect on the pattern of (1.7) to occur
before the predicate word obvious, is the clause that I am wrong (this is what
is obvious); but all we have in subject position is the word it as some kind of
promissory note, itself giving much less information than we need, merely acting as
a wait-and-see device.
To add to the puzzle, there are restrictions on these correlations which do not
seem to be reducible to any semantic considerations. So while establishing a connec-
tion between a left dislocated expression and some position in a string indefinitely
far from this initial position is possible across certain clause boundaries, it is not
across others. Thus, (1.9) is well-formed where each clause is the complement of
some verb, while (1.10)-(1.11) are not. In the latter cases, this is because the
clause inside which the dislocated item is to be interpreted is a
relative clause.
(1.9) That book of Mary’s, Tom tells me he is certain you can summarise
without difficulty.
(1.10) *That book of Mary’s, Tom tells me he has met the author who wrote.
(1.11) *The book which Tom told me he had met the author who wrote was very
boring.
Left dislocation, though subject to constraints, nevertheless may involve the
indefinite “removal” of an expression from its interpretation site. Analogous “re-
moval” to some right-periphery position is very much more restricted, however.
Dislocation from the point at which the expression has to be interpreted can only
be within the domain provided by the clause in which the expression is contained,
and not any further. This is shown by (1.12)-(1.15):
(1.12) It was obvious I was wrong.
(1.13) That it was obvious I was wrong was unfortunate.
(1.14) It was unfortunate that it was obvious I was wrong.
(1.15) *That it was obvious was unfortunate I was wrong.
(1.12) shows that a right peripheral clausal string like I was wrong can be associated
with an expletive pronoun, it, in subject position, providing the content of that
pronoun. (1.13) shows that this form of construction can be contained within
others. (1.14) shows that this process can be repeated, so that the main subject
pronoun can be provided with its interpretation by the string that it was obvious I
was wrong.
What is completely impossible, however, is the establishment of a correlation
between a right-dislocated clausal string such as I was wrong and an expletive it
which is the subject, not of the matrix clause, but of the clausal subject, That it was obvious.
As (1.16) illustrates, this is the structure exhibited by (1.15), which is completely
ungrammatical: any meaning for the sentence as a whole is totally unrecoverable.
That it might mean the same as (1.13)-(1.14) is impossible to envisage. This is
despite the fact that the clausal sequences are all arguments which, as noted above,
freely allow leftward movement.
(1.16) *[ That iti was obvious ] was unfortunate [I was wrong]i .
Such right peripheral displacements are said to be subject to a “Right Roof Con-
straint” (Ross 1967), imposing a strict locality restriction on how far the clausal
string can be removed to the right from its corresponding pronoun, limiting this to
a single, containing clause.
It was the existence of such apparently semantically blind restrictions that led
in the 70’s to agreement across otherwise warring theoretical factions that syntactic
and semantic explanations could not be expressed in the same terms;6 and follow-
ing this agreement, a number of grammar formalisms were articulated (Gazdar et
al. 1985, Bresnan 2001, Sag and Wasow 1999), all of which presupposed the
independence of the theoretical vocabulary and formal statements to be made for syntactic
and semantic generalisations.7 More idiomatically, as Chomsky has expressed it
(Chomsky 1995), natural languages seem in some sense not to be “perfect”, and
this in itself is a challenge which linguists have to address. A core property of
natural languages lies in these irreducibly syntactic properties - those which, in
minimalist terms, cannot be reduced to properties of either the semantic or phonol-
ogy interface; and it is these above all which Chomsky pinpoints as evidence of the
uniquely defining structural properties of natural language.
(Montague 1974) in which syntax and semantics were expressed directly in tandem, following
earlier articulation of categorial grammar formalisms in which this is true by definition (Lambek
1961, Bar-Hillel 1964 and many others since). Though this assumption was agreed to be
unsustainable, given in particular the existence of strong island constraints such as
non-extractability from complex structures to the left periphery (e.g. the Complex NP
Constraint) and non-extractability from complex structures to the right periphery
(the Right Roof Constraint), the methodology
of striving to establish denotational underpinnings for structural properties of language continues
to this day, attempting in so far as possible to preserve the strongest form of compositionality
of meaning for natural language expressions, by invoking lexical and structural ambiguity for all
departures from it.
7 Categorial grammar formalisms preserve the Montague methodology in its purest form, defin-
ing a mapping from natural-language string onto model-theoretically defined denotations without
any essential invocation of a level of representation, using an array of lexical type assignments
and operations on these types which determine compositionality of meaning despite variability in
word order (Ranta 1994, Morrill 1994, Steedman 1996, 2000, Moortgat 1988). Lexical Functional
Grammar (LFG) is the only framework which systematically abandons any form of composition-
ality defined directly on the string of expressions, for c-structure (the only level defined in terms of
a tree-structure configuration inhabited by the string of words) is motivated not by denotational
considerations, but solely by distribution (Bresnan 2001, Dalrymple 2001).
b. Though John and Mary adored each other, he married Sue. The only
time they subsequently met, he upset her so badly she was glad he
had married Sue, not her.
Despite the fact that he upset her is in much the same context in (1.18.a,b), never-
theless in (1.18.a) her means ‘Sue’ and in (1.18.b) it means ‘Mary’, due to the use of
they in (1.18.b) and the assertion that whoever they are had only met once subse-
quent to the marriage, which rules out interpreting her as picking out Sue (at least
given conventional marriages). The problem is that the assumption that semantics
is given by articulating the contribution each word makes to the conditions under
which the containing sentence is true would seem to lead to the assertion that all
sentences containing pronouns are systematically ambiguous: the conditions under
which John upset Mary are not the same as the conditions under which John upset
Sue, ‘Mary’ and ‘Sue’ being the discrete interpretations associated with the pro-
noun her. Ambiguity defined in terms of truth conditions holds when an expression
makes more than one contribution to the truth conditions of the sentence in which
it is contained. The word bank is an obvious example, any one use of the word in
a sentence systematically providing more than one set of truth conditions for the
containing sentence. Pronouns seem, problematically, to display the same behaviour;
yet we do not want to be led to the conclusion that a pronoun is an ambiguous
word. For this leaves us no further forward in expressing what it is that constitutes
the meaning of an anaphoric expression: it simply lists the different interpretations
it can have, and there are too many for this to be plausible.
There is a more general problem than this. Pronouns stand in different types of
relation to the denotational content they can be assigned in context, making it look
as though the ambiguity analysis of pronouns is unavoidable in principle. In (1.19),
the pronoun is construed as a bound variable, that is, as bound by
the quantifier every girl, its interpretation dependent entirely on what
range of girls every is taken to range over:
(1.19) I told every girl that she had done well.
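On this construal, (1.19) receives a standard first-order rendering in which she is translated as a variable bound by the universal quantifier (the predicate shorthand is our own):

```latex
\forall x\,\big(\mathit{girl}(x) \rightarrow \mathit{tell}(\mathit{i},\, x,\, \mathit{done\text{-}well}(x))\big)
```

Because x is bound, the pronoun's interpretation varies with whatever every girl is taken to range over, rather than picking out any fixed individual.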
But this is not an appropriate analysis of (1.20). In (1.20), the pronoun is gener-
ally said to be a coreferring pronoun, both it and its antecedent having the same
interpretation, denoting the same individual:
(1.20) Edwina came in. She was sick.
But this form of analysis will not do for the pronoun she in (1.21) - the pronoun
has to be construed as ‘the woman I helped over the road’:
(1.21) I helped an old woman over the road. She thanked me.
But the pronoun in (1.21) is not functioning as a bound-variable, as in (1.19), either.
Different from either of these, the construal of the pronoun in (1.21) involves some
kind of computation over the entire preceding clause.
The particular significance of the differences between (1.19)-(1.21) is that, on
the one hand, the problem does not seem to be reducible to a syntactic problem.
This is not a problem that can be solved by analysing the pronoun as a stand-in
device for the expression that provides its antecedent, since (1.19) and (1.22) mean
entirely different things:
(1.22) I told every girl that every girl had done well.
Similarly, (1.21) and (1.23) mean different things:
(1.23) I helped an old woman over the road. An old woman thanked me.
Deciding on the most appropriate analysis for this type of pronoun, as the problem
is standardly described, has led to an immense amount of
debate; and pronouns that give rise to this form of construal in such contexts are
currently distinguished from other uses by the term E-type pronouns.8
Given the different types of truth conditions which a pronoun can contribute
to its containing sentence, analysing the meaning of a pronoun in terms of its
contributions to the truth conditions of the sentences in which it occurs seems
bound to involve diversity, hence ambiguity, of the pronoun itself. Yet there is
good reason to think that any solution which freely invokes lexical ambiguity cannot
be right. Not only does this phenomenon of multiple variable-type, referential-type
and E-type interpretations extend systematically across pronouns in all languages;
it also extends systematically across other anaphoric expressions. Here is a range
of paired examples from English, showing the availability of bound-variable and
E-type interpretations for both the definite article and both forms of demonstrative.
(1.24) Every house I have put on the market I have checked, to make sure the
house will not be hard to sell.
(1.25) Most students were there. The entire group got drunk.
(1.26) Every day I drink any wine, I know that later that day I shall have a
migraine.
(1.27) Most people who came early left well before a few people got drunk. That
group were no problem.
(1.28) Every time I don’t take my pills, I think that this time I am better and
will not need them.
(1.29) I went to a party at which several of my friends were playing dominos.
This group were very quiet, though the musicians got very drunk.
The availability of such types of construal also extends to tense, with (1.30) indi-
cating a bound-variable form of construal for the present tense, (1.31) indicating a
referential form of construal, (1.32) an E-type form of construal:
(1.30) Whenever the dog goes out, she pees.
(1.31) The dog went out. She didn’t pee.
(1.32) I saw each student for an hour. I was very patient despite the fact that I
had a headache.
In (1.32), the construal is again E-type, for the understanding of the time interval
contributed by was in I was very patient is over the whole time it took to see the
students one after the other.
There is also verb-phrase (VP) anaphora, and nominal anaphora:
(1.33) Even though I write my own songs, most of my friends don’t.
(1.34) Even though I like to sing John’s songs, that particular one isn’t much
good.
8 ‘E’ for existence, so we understand, and not for Gareth Evans, who coined the term (Evans
1980).
And this is still not much more than the tip of the iceberg. There are also a number
of different forms of nominal and verbal ellipsis, where the required interpretation
isn’t expressed by words at all, but is picked up directly from the context:
(1.35) Whenever I sing my own songs, Sue refuses to.
(1.36) Even though I like to sing John’s recent songs, not one is as good as most
he had written while a student.
(1.37) She can sing some of my songs, but not John’s.
(1.38) Question: What does Lloyds promote? Answer: Its own interests.
(1.39) What Lloyds promotes is its own interests.
(1.40) She can sing some of my songs, but I haven’t decided which.
Though VP ellipsis, as in (1.33) and (1.35), has been taken to be a broadly unitary
phenomenon since Dalrymple et al. (1991), VP ellipsis and nominal ellipsis (as
in (1.34), (1.36)-(1.37)) are invariably distinguished, with different forms of input
structure. So too are fragmentary answers to questions (1.38), and the correspond-
ing so-called pseudo-cleft structures (1.39), for which a recent analysis has proposed
that the final expression (and correspondingly the fragment answer) is really a full
sentential form ‘Lloyds promotes its own interests’ in which all but the final expres-
sion is deleted (Schlenker 2003). There is so-called sluicing (1.40), generally also
analysed as a form of sentential ellipsis in which only the wh expression remains
(Levin 1982, Chung et al. 1995, Merchant 2001).
The list goes on and on from there. For each major construction, distinct forms
of interpretation are seen to be associated with subtly distinct structural restric-
tions. As we shall see in chapters 3 and 6, there are a number of different types
of relative clause construal, suggesting at least rather different forms of relativis-
ing element, supposedly needing quite distinct forms of analysis. And in chapter
8, we shall confront the problem of be for which at least five different forms have
been posited: presentational, existential, predicative and equative be, plus the
auxiliary. Despite the temptation to posit discrete structures coinciding with discrete
forms of interpretation, the consequences of any decision simply to invoke ambi-
guity have to be faced, as part of the challenge of confronting the phenomenon of
context-dependence in general. Any such decision is no more than an evasion of the
challenge of providing an explanation of what intrinsic contribution the expression
or structure brings to the interpretation process, which at one and the same time
licenses and yet limits the flexibility of its interpretation.
The alternative conclusion stares one in the face, begging to be drawn. The
phenomenon of context dependence is general. It dominates the problem of pro-
viding an adequate account of natural language construal; indeed it constitutes its
very heart. This is not a phenomenon that can be swept aside as a mere ambiguity
of the expressions that display it.9 Furthermore, once we grant that the meaning
of words contained in a sentence is not sufficient to establish the interpretation of
what is conveyed in an utterance, we have to articulate meanings for expressions
that are systematically weaker than the interpretation they may be assigned, with
a definition of the forms of update that take place during the process of building up
such interpretations.10 So what we shall be seeking to provide is a formal characterisation of an expression's lexical meaning in terms of some specification which is
only the input to an interpretation process that is sensitive to the context in which
the expression is uttered.
9 See Carston (2002) for a demonstration that it also infuses conceptual content, with the
expressions that are in some sense long are moved to the right periphery of a string. However,
this has sometimes led to bizarre conclusions, e.g. that variability in constituent order known as
scrambling is a PF phenomenon, outside core syntax (see Karimi (2003) for different representative
views about scrambling). Indeed, since all dislocation processes are to some extent context-
dependent, this line of thought logically leads to the conclusion that no such phenomena should
be characterised as part of core grammar.
12 Despite the fact that linear precedence is encoded in LFG at the level of c-structure, it is
uniformly agreed that the substantial core of linguistic generalisations has to be expressed at
f-structure, which retains no record of linear order. In Head-Driven Phrase Structure Grammar
(HPSG, Sag et al. 2003), discussions of linear order have tended to be concerned with capturing
generalisations about word orders within particular languages and linear precedence rules are
in general treated independently of the principal syntactic operations on feature structures (e.g.
Uszkoreit 1987, Sag and Wasow 1999), although there are exceptions (Kathol 1995, 2000, Reape
1993). See also Hoffman (1995), Baldridge (2002) who set out a Combinatory Categorial Grammar
(CCG) characterisation of local scrambling in free word order languages in terms of a set-theoretic
type specification which reflects the view that the grammar simply fails to dictate local NP-
ordering relative to the verb.
syntactic primitive is subtly shifting in recent minimalist work, without concern that
this might be undercutting basic theoretical assumptions. This move started with
Kayne (1994), who observed that the mapping of dominance structures onto linear
strings imposes an asymmetry on the structures to be licensed, with such constraints
as ‘specifier’ precedes ‘head’, ‘head’ precedes ‘complement’, a view which has been
extremely influential despite the fact that these constraints are highly problematic
to sustain in the face of verb-final languages, imposing a large number of movement
processes in any characterisation of those languages (Simpson and Bhattacharya
2003).13 Nevertheless we take it as no coincidence that the increasing interest in
issues of linearity has also led within minimalism to a view which directly correlates
the grammar formalism with the dynamics of language performance (Phillips 1996,
2003), in a view not dissimilar to our own.
There is a major problem caused by the disavowal of explanations that make
reference to the dynamics of processing order: it creates a tension between the
characterisation of how words are put together to form strings (syntax), and char-
acterisation of interpretation of such strings (semantics). Though syntax is assumed
to provide the input for semantic characterisation (at some LF interface in min-
imalist terms), the remit of semantics is assumed to be an explanation of how
information is built up in context to yield a new context for what follows, a task
which has to be fulfilled despite the fact that syntactic processes are not defined
in terms that support this remit. This creates two problems, one formal, one em-
pirical, the empirical evidence showing that the formal stance adopted cannot be
right. First, we face a design problem for the language model: it is far from obvious
how to fit syntax and semantics together as distinct but correlated components of a
formal system that is designed to reflect what a speaker/hearer knows independent
of any application of that knowledge, given that one of these components, but
apparently not the other, has to be defined in dynamical update terms with essential
reference to context. On the empirical side, the facts simply deny the sharp separa-
bility of anaphora construal and the articulation of structural restrictions that this
separation of considerations of structure and context-dependence might lead one
to expect. To the contrary, there is pervasive inter-dependence of anaphora and
structural processes, as we shall spend much of this book showing. All that can be
done to preserve the encapsulation of the grammar formalism and its separation
from more general concepts of context-dependence and the dynamics of language
processing in context is to assume a bifurcation between anaphoric processes which
display interaction with structural processes, and those which appear to depend,
in some sense more loosely, on the larger dialogue context. But this is difficult to
reconcile with the fact that any one pronoun can be used in both ways.14 The result
for such types of solution (Aoun and Li 2003) is a lack of integrated explanation
of any single anaphora/ellipsis process. The formalism merely defines a subset of
cases to which some arbitrarily delimited anaphoric process applies, generally one
that can be reduced to some binding mechanism independently articulated in the
grammar (see Morrill (1994), Hepple (1990) for categorial grammar examples).15
13 We are grateful to Öystein Nilsen for informing us of recent research moves in the direction
pronouns.
15 Despite the emphasis in Discourse Representation Theory of providing a framework in which a
unitary account of anaphora can be expressed (Kamp and Reyle 1993), the same bifurcation occurs
if the version of Discourse Representation Theory presupposes an attendant grammar formalism
which assigns co-indexing between anaphoric and antecedent expressions for grammar-internal
processes, for the DRS construction algorithm which dictates construal of pronouns through an
identification of variables will apply only to supposedly discourse-general processes, excluding
1.1. COMPOSITIONALITY AND RECURSION IN NATURAL LANGUAGE 13
Any language can be used to illustrate this. In this chapter, we stick to English.
We start with the resumptive pronoun strategy in English where the pronoun serves
to pinpoint the position in the relative-clause construal at which some reflex of the
head noun has to be seen as contributing to the interpretation of the relative (the
pronoun it in (1.41)):16
(1.41) There was this owl which it had got its foot caught in the goal netting.
The problem that examples like these display is that, on standard assumptions,
the requisite interpretation of the pronoun is the result of a structural process, but
this process is characterised independently of, and logically prior to, the process of
assigning semantic interpretation to the assigned structure. The characterisation is
in terms of some form of co-dependency of elements in a long-distance dependency
structure.17 Whatever the particular mechanisms of the particular formalism, it
forces an analysis of this use of anaphoric expression of an entirely different sort
from the more general phenomenon of context-dependence. One possible way to
avoid the theoretical consequences which these data apparently present is to label
the problematic subcases as a discrete lexical property of some itemised form of
pronoun, possibly phonologically weakened. This is an instance of the ambiguity
tactic very generally adopted in the face of problematic data. But this
tactic really is not well motivated, as all other definite and demonstrative NPs
can be used in the same way where there is no possible analysis in terms of some
phonologically weakened form, and the goal of analysing resumptive pronouns in
a way that integrates them with the more general phenomenon of anaphora is a
recognised challenge (Boeckx 2003, Asudeh 2004):18
(1.42) I watched this woman, who the idiot had got herself into a dreadful
muddle trying to sort out her papers in the middle of the conference hall.
(1.43) This afternoon I’m interviewing a mature student, who this woman is
asking for extra time, and we may not be able to avoid giving it to her.
The other alternative is to exclude the data altogether. Asudeh (2004), for
example, analyses these as ungrammatical (as does Boeckx (2003)), requiring an
additional production mechanism that licenses local strings and determines their
acceptability to the speaker.19 Yet, this common occurrence of optional resumptive
pronouns in relative clauses that are nevertheless deemed to be substandard is not
peculiar to English. It is displayed in other European languages - Italian, Spanish,
Portuguese at least. In contrast, strikingly, resumptive pronouns are not usable
at all in German. But this makes a pure production analysis seem unlikely: how
could the requisite production strategy not be applicable in speaking German? This
problem should not be dismissed as just a one-off production error associated with
pronouns captured by the grammar from its remit. In this connection, it should be noted
that the Dynamic Predicate Logic (DPL) characterisation of anaphora (Groenendijk and Stokhof
1991) presumes on a prior indexing of pronouns provided by some attendant grammar formalism,
so there is no formal reflex of the underspecification intrinsic to anaphoric expressions.
16 This is a strategy which is used much more freely in other languages. See chapter 4, and Cann
et al. forthcoming. The example was uttered spontaneously by the second author, 13.08.04.
17 This is achieved in various ways in the different frameworks: feature sharing in HPSG, term
of a general characterisation of anaphora, but his analysis involves positing a morphologically null
operator which has the effect of removing the pronominal properties of the resumptive, thereby
ensuring its function as a gap needing binding by the associated gap-binding wh-operator.
19 It is notable, in this connection, that these are often reported as possible to parse, but with the
observation that “I wouldn’t say it that way”, an observation which is at odds with his analysis.
there could be some interaction between order and binding. The theory is, however, problematic
in requiring the explosion of functional categories to allow anything like the full range of word
order phenomena found even in a single language.
(1.49) a. The report on our project Sue has warned us will be first on next
meeting’s agenda.
b. *That it was obvious was unfortunate I was wrong.
These and other challenges we shall take up in due course.
Before getting into any of the details, we now set this putative shift in per-
spective against the larger psychological background. It is no coincidence that the
concept of underspecification that we espouse provides the tool which will bring
anaphora construal and long-distance dependency together. We shall find that
underspecification of interpretation assigned to some signal is intrinsic to all sig-
nals processed by the cognitive system, a phenomenon much broader than some
language-internal property. And once we grant the need for a general cognitive
explanation of this phenomenon, it opens up the way for articulating in detail the
particular forms of underspecified input plus update which natural languages make
available, against a background in which the phenomenon of language is grounded
directly in a general cognitive framework.
ing different sub-tasks, there has been a recent, rapid growth of interest in inferential approaches,
using Bayesian models of probabilistic reasoning (Knill and Richards 1996).
1.2. INTERPRETING THE WORLD AROUND US: THE CONSTRUCTION OF REPRESENTATIONS
for many people to accept that what they see is a symbolic interpretation of the
world – it all seems so like the ‘real thing’. But in fact we have no direct knowledge
of objects in the world” – a view which goes back at least to Helmholtz (1925), who
argued that perception is unconscious inference.
These examples highlight a very important aspect of this process which Fodor
did not lay much emphasis on. This is the systematic gap between the information
provided by the stimulus itself and the information humans recover from it. We
are all always adding to what it is we have literally “seen” in virtue of what else we
know that is available to us at the time. But if signals do not themselves determine
the interpretation that we impose on them, because the retinal image itself fails to
uniquely determine the information to be recovered, what is it that determines how
signals are interpreted?
on its own cannot explain how choices about which information to attend to can
be made. Most information probably interacts in some way with what we
believe already, so in practice it is not possible to process
all incoming information and check for potential contextual effects. So Sperber
and Wilson propose that maximisation of contextual effects is counter-balanced by
processing cost. Mental activity involves “cost”: thinking, information retrieval
from long-term memory, and deriving conclusions are all activities which need cognitive
resources. These resources have to be allocated so as to derive maximally relevant
information with justified cognitive effort. This Principle of Relevance is then taken
to underpin how all input to the cognitive system is enriched to determine a specific
interpretation in context, whatever the form of input.
In applying such considerations to language construal, Sperber and Wilson
(1995) argue that there is one additional consideration: the signal provided is a
manifest display of the speaker having an intention to communicate, and this in
itself provides extra information. In virtue of this display of intention, the hearer
is justified in spending cognitive effort on processing a communicated message, and
furthermore minimal cognitive effort, because she can assume that the form of the
utterance used is selected as the most relevant one possible in the given situation.25
From this perspective, the puzzle presented by pronouns and elliptical structures is
just part of what is a quite general cognitive process and is not some idiosyncrasy
of natural language. All human processing involves constructing a representation
which we take, following Fodor, to be the interpretation provided by some given
signal. The choice as to which interpretation to construct from a signal is dictated
by the very general cognitive considerations encapsulated in a constraint such as
the principle of relevance. We can see this from the examples given earlier, e.g.
(1.18.a,b), repeated below.
(1.18) a. Though John and Mary adored each other, he married Sue.
Whenever he upset her subsequently, she would remind him that it
was Mary he should have married.
b. Though John and Mary adored each other, he married Sue. The only
time they subsequently met, he upset her so badly she was glad he
had married Sue, not her.
We interpret he upset her in (1.18.a) as ‘John upset Sue’ because this is the only
choice consistent with what the speaker has said in the clause that follows. Such an
interpretation, easily constructed in the provided context, gives us inferences which
are informative in that context in a way that the competing interpretation is not.
In (1.18.b), the same string has to be interpreted as ‘John upset Mary’ for similar
reasons, given the difference in the following clause. Least effort alone is not enough:
in these cases it yields two competing interpretations. But neither is maximisation
of inferences alone. It is the two together that determine the particular interpretation
assigned in context. That is, we make decisions swiftly, with least effort scanning
the smallest structure for a term which could provide an interpretation the speaker
could have intended, with the specific context determining what that choice can
be.
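The interplay of least effort and contextual effect just described can be given a schematic rendering. The following Python sketch is our own toy model, not Sperber and Wilson's formal system: candidates are scanned most-recent-first (least effort), and the first one whose choice is consistent with the surrounding discourse (yields a contextual effect without contradiction) wins.

```python
# A toy model (ours, with invented names) of relevance-constrained pronoun
# resolution: a least-effort scan over recent candidates, filtered by a
# consistency check standing in for contextual effect.

def resolve(pronoun_gender, candidates, consistent):
    """candidates: (name, gender) pairs, most recent last.
    consistent: a predicate encoding what the following clause rules in or out."""
    for name, gender in reversed(candidates):      # least-effort scan, recent first
        if gender == pronoun_gender and consistent(name):
            return name
    return None                                    # no relevant interpretation found

candidates = [("Mary", "f"), ("Sue", "f")]

# (1.18a): "... it was Mary he should have married" rules out 'her' = Mary,
# so 'he upset her' is read as 'John upset Sue'.
print(resolve("f", candidates, lambda n: n != "Mary"))   # Sue

# (1.18b): "... he had married Sue, not her" rules out 'her' = Sue,
# so the same string is read as 'John upset Mary'.
print(resolve("f", candidates, lambda n: n != "Sue"))    # Mary
```

Neither the scan order nor the filter alone determines the result; it is their combination, as in the informal account above, that fixes the interpretation assigned in context.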
The insight that anaphora and elliptical processing are interpreted by the same
general pragmatic principles that drive all interpretive procedures is not uncontro-
exactly the new information interacts with old information. Beliefs might be strengthened or
contradicted, or the new information might provide a premise to derive a conclusion which would
not have followed from the initial information state.
25 This involves a weaker principle which Sperber and Wilson call The Communicative Principle
of Relevance.
1.3. COMPETENCE AND PERFORMANCE: TOWARDS A NEW VIEW 21
versial. Prior to Sperber and Wilson’s work, linguists and psychologists had very
generally assumed that the free processing of language in context, what we label
pragmatics, was different in kind from the process of assigning interpretation to
expressions such as pronouns, which was taken to form part of establishing the
content directly expressed in uttering a sentence. In Gricean pragmatics in partic-
ular, a sharp division is drawn between the propositional content associated with
an utterance, and further more indirect pragmatic effects that might follow from
the fact of its being uttered, with only the latter being constrained by the so-called
maxims of conversation.26 Part of Sperber and Wilson’s contribution has been to
demonstrate to the contrary that the way we assign interpretation to words such
as pronouns is subject to the very same constraints as determine the much richer
array of interpretational effects such as metaphor, irony, and so on.27
models of pragmatics. However, there is the puzzle of the various forms of inter-
action between anaphora construal and structural processes waiting to be solved.
If, as already suggested, we take on the challenge of solving this puzzle by mod-
elling discontinuity effects and anaphoric processes as forms of underspecification
which undergo processes of update, then we can no longer assume that all linguistic
processes happen first and pragmatically driven processes operate solely on their
output. Rather, what is needed is a model which articulates not merely some req-
uisite set of logical structures, but also a process of growth of such structures given
natural-language input, with an assumption that general constraints, such as that of
optimal relevance, can determine how individual choices might be made during any
one such growth process which the model licenses. So the resulting model does not
have the same property of encapsulation as assumed by Sperber and Wilson, along
with many others. The twist in the tale is that a formal account of the individual
time-linear steps whereby structure corresponding to interpretation can be built
up turns out to have just the right flexibility for capturing structural properties of
language without being vacuously general.
Of course, we do not expect anyone to believe this stance without argumenta-
tion; and the purpose of the rest of this book is to provide this. Having set out
the Dynamic Syntax framework, we shall look first at long-distance dependency,
anaphora construal and their various forms of interaction, an account which will
include accounts of all the major types of relative clause formation, and the major
structures that have been distinguished as left and right periphery effects, such as,
Hanging Topic Left Dislocation, Clitic Left Dislocation, expletives, pronoun dou-
bling and Right Node Raising. We shall then turn to looking at Japanese, which,
as a prototypical verb-final language, is notoriously problematic for current for-
malisms: these languages at best are assigned analyses which make parsing them
seem irreducibly non-incremental, despite the fact that there is ample evidence that
the parsing of Japanese is just as incremental as parsing any other language (Inoue
and Fodor 1995, Kamide and Mitchell 1999, Fodor and Hirose 2003, Aoshima et
al. 2004).28 Chapters 7 and 8 pursue the enterprise to show how co-ordination and
agreement patterns in Bantu are constrained by time linearity and how the con-
cept of underspecification can be used to articulate a uniform account of a range
of copular clauses in English. In contrast to standard and static approaches, we
shall see that the time-linear perspective of Dynamic Syntax allows an entirely nat-
ural set of analyses of these phenomena using the same concepts of tree growth,
while preserving incrementality of parsing. In addressing these and other current
puzzles, our claim in each case will be that the model we set out will be uniformly
simpler than other more conventional models that are designed not to reflect the
time-linearity of natural language processing.
As a result of this shift in perspective, the competence-performance distinction
looks very different. Though there remains a distinction between the linguistic-
competence model and a general theory of performance, the articulation of that
competence model is no longer disconnected from the articulation of the latter.
To the contrary, the competence model is developed on the assumption that it
provides the architecture within which the choice mechanisms of performance have
to be implemented. The claim that the system is parsing-oriented is nonetheless a
far cry from making the claim that no abstraction from the data of performance is
needed. We are not providing a performance model, despite the concern with the
dynamics of language processing. What we are claiming is that the intrinsic
patterns displayed in natural language are best explained in terms of
the dynamics of how interpretation is built up in real time, and the model itself
defines a concept of tree growth that does indeed reflect the dynamics of processing
linguistic input in time, with the left periphery of a sentence string presumed to
provide information that has to be used first in developing the emergent structural
configuration.
28 See Phillips (2003) and Schneider (1999) for attempts to devise parsing formalisms which
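The shape of such a left-to-right, monotonic growth process can be sketched in a few lines of Python. This is emphatically our own simplification, not the Dynamic Syntax formalism: the lexicon, the role labels, and the flat dictionary representation are invented for illustration. What the sketch preserves is only the architectural claim: each word extends a partial structure, the left-peripheral word is integrated first, and nothing once built is retracted.

```python
# A highly simplified sketch (ours, over a toy fragment) of monotonic,
# left-to-right structure growth: each word extends a partial representation,
# and additions are never retracted or overwritten.

LEXICON = {
    "john": ("subject", "John'"),
    "mary": ("object", "Mary'"),
    "upset": ("predicate", "Upset'"),
}

def parse_incrementally(words):
    tree = {}                           # partial structure, grows monotonically
    states = [dict(tree)]               # record each intermediate stage
    for w in words:
        role, content = LEXICON[w.lower()]
        assert role not in tree         # monotonicity: additions only
        tree[role] = content
        states.append(dict(tree))
    return states

# The left-peripheral word is integrated before the verb even arrives:
for step in parse_incrementally(["John", "upset", "Mary"]):
    print(step)
```

The intermediate states are what carry the explanatory weight in the chapters to come: structural generalisations are stated over stages of this growth process, not over a single finished configuration.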
new structure can be added to the overall system without effects elsewhere, it is
hard to see what linguistic phenomena this richness in expressive power excludes.
While minimalist approaches to syntax still maintain a rich array of grammatical
devices, the aim of restricting the vocabulary for expressing linguistic properties
nevertheless has important implications for linguistic theory, not least with respect
to simplifying the relation between knowledge of language and language use.
Dynamic Syntax also differs from Discourse Representation Theory (DRT: Kamp
and Reyle 1993), though it shares with DRT the commitment to articulating a level
of representation not inhabited by the string of words as an essential intermediate
step in characterising compositionality of denotational content. The difference be-
tween the two systems lies in the fact that, in DS, this form of representation also
constitutes the basis for syntax: there is no other level of structure with indepen-
dently defined primitives over which syntactic generalisations are expressed. DRT,
on the other hand, simply takes as input to the interpretation algorithm whatever
syntactic formalism is selected as appropriate, defining a construction algorithm
mapping such syntactic structures onto discourse representation structures, in this
respect falling into the same family of formalisms as LFG.
There are however some real similarities between the formal notations which
these theories use and Dynamic Syntax, a matter to which we will return; and the
arguments in favour of DS over either of these are mainly in terms of simplicity,
and the ability to articulate more principled explanations. The Right Roof Con-
straint is one such case. This is the constraint alluded to earlier in introducing
the syntax-semantics mismatch, which applies to rightward dependencies but not
leftward ones, enforcing a strict locality of dependencies at the right-periphery of
a clause. This asymmetry is unexpected in all frameworks in which structural gen-
eralisations depend solely on structural configuration: there is no reason in such
frameworks to expect asymmetry between left and right peripheral effects. How-
ever, as chapter 5 will show, this constraint emerges as a direct consequence of what
it means to be establishing forms of construal in the latter stages of the construc-
tion process, rather than in the early stages. Given the richness of vocabulary of
HPSG and LFG, the more restricted set of facts can doubtless be described by for-
mal specification of the structures over which such dependencies are articulated,29
but the underlying explanation of this asymmetry eludes such static formalisms.
So despite the heavy machinery that the frameworks make available, we shall see
that a simpler architectural assumption can lead to simpler individual analyses. It
might seem that in only defining one type of structural representation, Dynamic
Syntax would be less expressive than multi-level theories, but, as we shall see, DS
is nonetheless able to articulate a rich array of variation in linguistic distribution
by the ability to make reference to intermediate stages in the construction process.
At the other extreme lies Categorial Grammar, which explains natural lan-
guage phenomena solely in terms of mappings from phonological sequences onto
denotational contents without any essential reference to a level of representation
at all. This is the pure Montague approach to natural languages following the for-
mal language pattern (see section 1.1.1). The DS framework shares with Categorial
grammar formalisms the assumption that syntactic categories reflect semantic com-
binatorial operations, as reflected in type theory, and there are other similarities too.
However, DS has the edge over these formalisms in at least two respects. Categorial
grammar formalisms fail altogether to take up the challenge of reflecting the wealth
of interaction between anaphora and structural processes, and in general no unitary
characterisation of anaphora is provided (see Hepple (1990), Morrill (1994), Steed-
29 See Dipper (2003) for an LFG characterisation of German which provides such a stipulation.
1.4. CODA: PARSING AS THE BASIC ACTIVITY 25
man (1996, 2000), though for a notable exception see Ranta (1994)). It might be
argued that there is a concept of delay in semantic combinatorics expressed through
the type-lifting of an NP expression to a functor from VP contents to sentence con-
tents, which might be taken as equivalent in effect to structural underspecification
plus update, hence not requiring additional mechanisms to express; but as we shall
see, the left-peripheral placement of expressions is only one of several forms of
underspecification plus update. There is a problem in principle in expressing un-
derspecification of content in categorial grammars, as their explicit commitment to
the view that grammars are logics leaves no room for expressing such systematic
underspecification and enrichment. It is a consequence of this commitment that
verb-final languages, which display a high degree of context-dependence, constitute
a major challenge for categorial grammar formalisms.30
Of all the extant formalisms, the DS formalism is perhaps closest to LTAG,
Lexicalized Tree Adjoining Grammar (Joshi and Kulick 1997). LTAG projects typed
structures directly from lexical expressions over which type-deduction is defined
(as in categorial grammar formalisms and Dynamic Syntax). Long-distance dis-
continuity effects are characterised by processes of tree-adjunction, mapping pairs
of trees into a single composite tree. In particular, the long-distance dependency
of left-peripheral expressions is defined by splitting apart some lexically defined
tree-structure and injecting into it a further tree corresponding to all the non-local
material intervening between the left-peripheral expression and its site of depen-
dency. This is unlike the DS system, in which the process of tree growth is defined
to apply in a way that strictly follows the dynamics of parsing. Indeed, in be-
ing intrinsically dynamic, the DS system is unlike all these systems, despite many
similarities at the level of tools of description, for the time-linear incrementality of
natural-language construal is built into the formalism itself. The simplicity that we
gain, as we shall set out case by case, lies in the fact that the concepts of under-
specification are twinned with a monotonic tree growth mechanism that is defined
over the left-right sequence of words.
of arguments, any one of which may be optional if recoverable from context. Multi-set typing has
been proposed to offset this problem in part (Hoffman 1996, Baldridge 2002), but this is little more
than a notational economy (see Kiaer in preparation for detailed evaluation of the difficulties facing
categorial grammar formalisms in addressing linguistic distributions in verb-final languages).
text, with one additional constraint: the structures being constructed must match
some intended content. Consider the case of answering a question:
(1.50) A: What shall we eat?
B: Curry.
Speaker A sets out a structure by providing a string for B to process where her
request is for a value for the place-holding wh expression. Once B has established
this by parsing the string, all she has to do in replying is to provide this value. She
does not have to make any distinct hypothesis as to the structure A, her hearer,
has as context with which to interpret that answer: she “knows” the context A has,
because she herself has just parsed the string A said. All she has to do in choosing
the word curry is to check that the concept it expresses, rather than, say, apples,
matches the content she has in mind when put into the structure set up by A’s
question. One could of course analyse ellipsis as involving a high-level production
strategy quite independent of parsing: first some thought is constructed by some
process, say the thought 'B wants to eat curry'; then the words to express it are
selected on the basis of a strategic decision as to what A has as their immediate
context; and only then is it decided which of those words need to be said, and in
what order. If this were the
appropriate account of production, then the correlation of parsing and production,
as presumed in the informal account just given, would break down. However,
in dialogue, where context dependence is extensive, there is little evidence that
such high level judgements are a necessary part of production, as we shall see in
chapter 9. To the contrary, there is increasing evidence that use of context at all
levels of production choice is a means of cutting to the minimum such high level
decisions (Pickering and Garrod 2004). The whole point of being able to rely on
information which one has just parsed oneself and which therefore by definition is
part of the private cognitive context that one brings to bear in any cognitive task,
is that making choices that are got from one’s own context-store will completely
side-step the need to search through one’s general store of words; and this makes
the production task much easier. In short, like hearers, speakers rely on their own
immediate context, but with the additional constraint that all choices made have
to be checked for consistency with what they have in mind to convey.
Finally, as long as we adopt even the outlines of Relevance Theory, we are committed
to the primacy of understanding, for this explains inferential abilities in
communication as resulting from cognitive abilities relevant for processing information,
hence from interpretation rather than production. Sperber and Wilson derive
communicative behaviour, as expressed in the communicative principle of relevance,
from general cognitive behaviour, namely from relevance-driven processing as
embodied in the cognitive principle of relevance and the definition of relevance. In other
words, our ability to assess and choose information in linguistic communication is
a reflex of our ability to handle information in general. Indeed, general concepts of
information processing, though with language-particular instantiations, will turn
out to be central to every single analysis of syntactic phenomena we provide, for
concepts of underspecification and monotonic growth of representation will be at
the heart of them all. In all cases, we shall argue, it is these which directly make
available simpler, more intuitive accounts of linguistic phenomena. Moreover, the
account will have the bonus of providing formal expression to what might otherwise
be dismissed as naive functionalism. So the challenge we shall be setting out is this:
take the dynamics of the parsing process seriously, and you get a grammar that is
simpler both at the level of individual analyses and at the level of the grammar
architecture.
Chapter 2

The Dynamics of Interpretation
We have a two-fold challenge ahead. The first is to set out a model of how
interpretation is recovered in context. The second is to establish why this constitutes
the basis for syntactic explanations. As we saw in chapter 1, the heart of the
explanation is our commitment to reflecting the way humans can manipulate partial
information and systematically map one piece of partial information into another
in language processing, using each piece of information provided as part of the
context for processing each subsequent input. The challenge is to use these,
intrinsically dynamic, concepts to replace analyses which depend on a discrete syntactic
vocabulary, involving processes such as movement, feature passing, etc. It is in this
respect, above all, that this formalism will depart from all other widely adopted
grammar formalisms.

In this chapter, we set out the basic framework, beginning with a sketch of the
process of building representations of content and subsequently developing the
concepts and technical apparatus. The discussion will be kept as informal as possible,
but more formal material is introduced so that readers can get a feel for the formal
basis of the theory.1
1 Full details of the formal characterisation of the system can be found in Kempson et al. (2001).
2 We shall use text-capitals for technical terms, in particular for rules defined in the theory.
[Figure 2.1: two phrase-structure trees over the string Hilary upset Joan, one with NP/VP/IP labels and one with DP structure. (2.2): the corresponding tree of semantic structure, its nodes decorated with the concepts Hilary′, Upset′ and Joan′.]
So what is the difference between these trees? In the first place, the tree in (2.2)
contains no information about word order. There is no claim here that English is
verb-final, not even with respect to some hypothetical ‘deep structure’. Instead,
the tree represents the semantic structure of the propositional content expressed
by the string Hilary upset Joan so that what labels the nodes in the tree are
compositionally derived concepts, expressed in some lambda calculus, just as in
certain versions of categorial grammar (Morrill 1994, Carpenter 1998, etc.). The
tree thus reflects a jigsaw view of how we can entertain complex concepts (Fodor
1998), but notably not a jigsaw view about words.3 The trees in Figure 2.1, on
the other hand, reflect putative properties of words in strings that define structure
over those strings. So, VP labels a node that consists of a word that is a verb plus
another word that functions (simultaneously) as a noun and a noun phrase (and
possibly a determiner and determiner phrase). The syntactic structure determined
(or projected) by the words exists independently of the words themselves and is
constrained by independently defined rules or principles, again stated over strings
of words.
3 To maintain a distinction between words and the concepts they express, when we refer to
words we will use italic script, and when we refer to the concept we will use non-italic script, an
initial capital letter and a following prime. The proper name John thus expresses the concept
John′ . We also use italic script occasionally for emphasis, as here, but we assume it will always
be obvious in context which is intended.
2.1 A Sketch of the Process
[Tree-growth display: successive partial trees from the parse, decorated with ?Ty(t) and Fo(Hilary′).]
Cresswell (1985) for arguments that support such a view with respect to the objects of verbs of
propositional attitude.
5 It should be noted here that we are not in this book probing the internal structure of concepts
themselves. So we have nothing to say about the concept Sing′ or Love′ ; and indeed nothing to
say about the relation between the word loves and the concept which it expresses, other than that
it expresses the concept Love′ . This is a huge and complex topic. At the two extremes, some
have argued for definitions associated with words (e.g. Bierwisch 1969), others for a one-to-one
word-concept correspondence (Fodor 1981), with various intermediate positions (e.g. Pustejovsky
1995, Rappaport and Levin 1998). Yet others have noted that concepts vary according to the
particular context in which they are used, so there is a context-dependency to the use of words
to express concepts much like the context-dependence in using pronouns (Carston 2002). For the
purposes of this book, we adopt the view of Fodor (1981) that words express primitive concepts,
with a word and the concept it expresses being in one-to-one correspondence (see also Marten
(2002b)).
2.2 The Tools of Dynamic Syntax
The formula is just one of several labels that can decorate a node. In addition
to the formula label, we also have a label that gives information about the type
of the formula expression. The type of an expression is its semantic category,
associating an expression with a particular sort of denotation. So, an expression of
propositional type t denotes a truth value; one of type e is a term that denotes some
entity. Functor types are represented as conditional statements, so that an expression
of type e → t expresses a (one-place) predicate, since when it combines with a
term (of type e) it yields a proposition (of type t), and denotes a set (see any number
of introductory books on formal semantics, e.g. Chierchia and McConnell-Ginet
(1990), Cann (1993), Gamut (1991), Carpenter (1998)). Although most theories
of types assume a recursive definition of types, yielding an infinite set of types,
Dynamic Syntax makes use of only a small, predefined set consisting of three basic
types e, t, cn,6 and a restricted set of functors based on these to provide sufficient
structure to account for the number and types of arguments of verbal and nominal
predicates. There is also a type cn → e that is assigned to quantifiers. Type-raising
operations and the higher-order types associated with Montague and Categorial
Grammar play no part in the grammar formalism, since concepts of underspecification
and update replace those of type-lifting and composition of functions, as we
shall shortly see. The table in (2.4) lists the most common types used in this book,
with a description and examples.7
(2.4) Types in DS

Type                      Description                 Example expression
Ty(e)                     Individual term             Fo(Mary′), Fo(ε, x, Student′(x))
Ty(t)                     Proposition                 Fo(Sing′(John′)), Fo(Upset′(Hilary′)(Joan′))
Ty(e → t)                 (1-place) Predicate         Fo(Upset′(Hilary′)), Fo(Run′)
Ty(e → (e → t))           (2-place) Predicate         Fo(Upset′), Fo(Give′(John′))
Ty(e → (e → (e → t)))     (3-place) Predicate         Fo(Give′), Fo(Put′)
Ty(t → (e → t))           (Proposition) Predicate     Fo(Believe′), Fo(Say′)
Ty(cn)                    Nominal                     Fo(x, Student′(x)), Fo(y, Father′(John′)(y))
Ty(cn → e)                Quantifier                  Fo(λP.ε, P)
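The type regime just described can be made concrete in a few lines. The following is a minimal sketch of the basic types, of functor types represented as pairs, and of modus ponens over types; the helper names (`fn`, `apply_type`) are ours and not part of the DS formalism itself.

```python
# A minimal sketch of the DS type inventory: basic types e, t, cn; functor
# types X -> Y represented as Python pairs; modus ponens over types.
# Helper names are illustrative, not DS notation.

E, T, CN = "e", "t", "cn"        # the three basic types

def fn(a, b):
    """The functor type a -> b."""
    return (a, b)

def apply_type(functor, argument):
    """Modus ponens over types: combining X -> Y with X yields Y."""
    if isinstance(functor, tuple) and functor[0] == argument:
        return functor[1]
    return None                  # the types do not combine

two_place = fn(E, fn(E, T))            # Ty(e -> (e -> t)), e.g. Fo(Upset')
one_place = apply_type(two_place, E)   # Ty(e -> t), a one-place predicate
prop      = apply_type(one_place, E)   # Ty(t), a proposition
```

Only this small, predefined set of functor types is ever used; nothing corresponding to type raising is defined.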
Trees, therefore, display nodes decorated not only with formulae but also their
associated types, as in (2.5). In fact, all information holding at, or annotating, a
tree node is stated as a declarative unit, or a tree node description. Declarative
Units (DUs) consist of consistent sets of labels expressing a range of different sorts
of information.
(2.5) Representation of content of Eve likes Mary
[Tree display: terminal nodes decorated with Ty(e), Fo(Mary′) and Ty(e → (e → t)), Fo(Like′), combining to decorate the mother nodes.]
6 The latter is the type assigned to common nouns, whose formula consists of an ordered
pair of a variable plus a propositional formula in which that variable occurs free. See chapter 3.
7 All quantifying expressions are analysed as terms of type e; for example, the term
ε, x, Student′(x) listed is the analogue in these terms of the existential quantification ∃, x, Student′(x).
This tree displays how information from the functor nodes combines with
information from the argument nodes to give the complex formula value at the mother
node. As in Categorial Grammar, application of modus ponens over type values is
paralleled by function application over formula values.8
We assume that the parse of Eve likes gives rise to the tree in (2.6) through mechanisms that we shall see below.
8 However, in this notation, DS types are conditional types without any implication for the order
of natural language expressions (unlike the “forward slash” and “backward slash” of Categorial
Grammars).
9 Requirements may be modal in form in which case the specified goal may be achieved by an
(2.6) [Tree display: the partial tree resulting from parsing Eve likes, its rootnode decorated ?Ty(t), with outstanding requirements ?Ty(e → t) and ?Ty(e), and the pointer, ♦, at the ?Ty(e) node.]
At this stage, there are three outstanding requirements: ?Ty(t), ?Ty(e → t), and
?Ty(e). The pointer symbol, ♦, shows that, following the processing of Eve likes,
the node under development is the internal argument node of the predicate, as
determined by the lexical actions associated with parsing the verb likes (see below).
The current task state is thus ?Ty(e), a requirement to construct a term. If, in
this situation, information from the lexicon provides an expression of Ty(e) (such
as by a parse of Harriet), it can be introduced into the tree at that node, since
it matches, and so satisfies, the current requirement. However, if the next word
is sings, its associated predicate Fo(Sing′) cannot be introduced into the tree even
though its type, Ty(e → t), matches one of the requirements in the tree. This is
because the pointer is not at the node at which this requirement holds. No update
can, therefore, be provided by the word sings, and the sequence of actions induced
by parsing the verb cannot lead to a completed logical form. The parse must be
aborted and, hence, the sequence of words *Eve likes sings is not a grammatical one.
The pointer thus gives important information about the current state of a parse,
and the theory of how the pointer moves will play a significant role in the analyses
to be presented in later chapters of the book. As you can see from this sketch
of a simple example, the notion of grammaticality rests not merely on whether a
certain parse leaves requirements unsatisfied, but also on where the pointer is at
any particular point in a parse.
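The role of the pointer in ruling out *Eve likes sings can be illustrated with a toy parse state. The representation below, a dictionary of requirements plus a pointer with illustrative node names of our own, is a drastic simplification, but it captures the point that a word's trigger must match the requirement at the pointed node, not just any requirement in the tree.

```python
# A toy parse state after 'Eve likes'. Node names and the string encoding
# of types are our own simplifications, not DS notation.

state = {
    "pointer": "object",                     # the internal argument node
    "requirements": {"root": "?Ty(t)",
                     "predicate": "?Ty(e -> t)",
                     "object": "?Ty(e)"},
}
lexicon = {"Harriet": "Ty(e)", "sings": "Ty(e -> t)"}

def can_parse(word, state):
    """The lexical trigger: the word's type must match the requirement
    holding at the node currently under development."""
    needed = state["requirements"][state["pointer"]]
    return "?" + lexicon[word] == needed

can_parse("Harriet", state)   # True: ?Ty(e) holds at the pointed node
can_parse("sings", state)     # False: ?Ty(e -> t) holds, but elsewhere
```

Because the type of sings matches a requirement only at a node the pointer is not at, no update is licensed and the parse aborts.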
[Tree display of node addresses: the root with daughters 00 and 01; 010 and 011 below 01; 0110 and 0111 below 011.]
By convention, the left daughter of a node n is assigned the index n0 and
the right daughter is assigned the index n1. This information may form part of
a Declarative Unit (the information collected at a node) and is expressed by the
predicate Tn (tree node), which takes as value some such index, as illustrated in the tree
in (2.8).
(2.8) Tree-locations from a parse of John sings
[Tree display: the root Tn(0), its argument daughter Tn(00) and its functor daughter Tn(01), each node labelled with its Tn address.]
In this tree, the left daughter is decorated with an argument formula, while the right
daughter is decorated with a formula which is a functor that applies to that argument
to yield the decoration at the mother. As a convention, this is a pattern
we shall retain throughout: arguments as left daughters, and functors as right
daughters.
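The addressing convention can be stated as two one-line helpers; `daughters` and `mother` are our own names for this sketch.

```python
# The tree-node addressing convention: daughters of node n are n0
# (argument, left) and n1 (functor, right); the mother of a node is
# obtained by dropping its final digit.

def daughters(n):
    return n + "0", n + "1"

def mother(n):
    return n[:-1] or None      # the root, Tn(0), has no mother

daughters("0")    # ("00", "01")
daughters("01")   # ("010", "011")
mother("010")     # "01"
```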
So far our vocabulary can only describe individual nodes. However, we can
describe the relations between tree nodes if we add modal operators to the language of
description. The relations between tree nodes can then be described by modal
statements which provide a means to state that some information holds at a daughter
or at a mother node and, more importantly, as we shall later see, a means to express
requirements that need to be satisfied at some node other than the current node.
There are two basic modalities, one corresponding to the daughter relation,
⟨↓⟩ ‘down’, and one corresponding to the mother relation, ⟨↑⟩ ‘up’. These can
be used with or without a numerical subscript, depending on whether it is important
to distinguish between left (argument) and right (functor) branches. Hence,
⟨↓1⟩ refers to the functor daughter, while ⟨↓0⟩ refers to the argument daughter node
(and similarly for the mother relation, ⟨↑1⟩, ⟨↑0⟩, though in this case there is only ever
a single mother). As these symbols form part of a modal logic, an expression like
⟨↓0⟩Ty(e) at node n means ‘there exists an argument daughter that node n
immediately dominates which is decorated by the label Ty(e)’ (i.e. node n has a term
as its argument daughter).

Furthermore, modality operators can be iterated, e.g. ⟨↓⟩⟨↓⟩, ⟨↓⟩⟨↑⟩, etc. This
provides a means of identifying, from one node in a tree, that some property holds
of some other node.
(2.9)  Tn(0), Q
       ├─ Tn(00), P
       └─ Tn(01), P → Q
            ├─ Tn(010), R
            └─ Tn(011), R → (P → Q)
It is thus possible to describe the whole of the tree in (2.9) from any node within it.
Hence, the statements in (2.10) are all true of this tree from the node with tree-node
address 010, i.e. the one decorated by R.
(2.10) a. ⟨↑0⟩P → Q
          at my mother, P → Q holds
       b. ⟨↑0⟩⟨↓1⟩R → (P → Q)
          at my mother’s functor daughter, R → (P → Q) holds
       c. ⟨↑0⟩⟨↑1⟩Q
          at my mother’s mother, Q holds
       d. ⟨↑0⟩⟨↑1⟩⟨↓0⟩P
          at my mother’s mother’s argument daughter, P holds
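The modal statements in (2.10) can be checked mechanically against the tree in (2.9). The following toy evaluator uses an encoding of our own: "d0"/"d1" stand for ⟨↓0⟩/⟨↓1⟩, and any "u…" step for the mother relation, which on the addressing convention is simply dropping the final digit.

```python
# A toy checker for the modal tree-description language, run on the tree
# in (2.9): nodes keyed by their Tn addresses.

tree = {"0": "Q", "00": "P", "01": "P -> Q",
        "010": "R", "011": "R -> (P -> Q)"}

def resolve(address, path):
    """Follow a sequence of modal steps from a node address."""
    for step in path:
        if step == "d0":
            address += "0"
        elif step == "d1":
            address += "1"
        else:                        # "u0"/"u1": up to the unique mother
            address = address[:-1]
    return address

def holds(address, path, label):
    return tree.get(resolve(address, path)) == label

# The statements in (2.10), viewed from node 010 (the node decorated by R):
holds("010", ["u0"], "P -> Q")                # <up0> P -> Q
holds("010", ["u0", "d1"], "R -> (P -> Q)")   # <up0><down1> R -> (P -> Q)
holds("010", ["u0", "u1"], "Q")               # <up0><up1> Q
holds("010", ["u0", "u1", "d0"], "P")         # <up0><up1><down0> P
```

All four calls return True, reflecting the fact that the same tree can be truthfully described from the viewpoint of any of its nodes.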
The tree may be described in many different ways using the modal operators,
depending on the ‘viewpoint’, the node from which the description is made. Another
is shown in (2.11) which takes the topnode as the point of reference and provides
a complete description of the tree. In this description, each node is described in
terms of the information that holds at that node.
(2.11) a. Tn(0), Q
          at node 0, Q holds
       b. ⟨↓0⟩P
          at my argument daughter, P holds
       c. ⟨↓1⟩P → Q
          at my functor daughter, P → Q holds
       d. ⟨↓1⟩⟨↓0⟩R
          at the argument daughter of my functor daughter, R holds
       e. ⟨↓1⟩⟨↓1⟩R → (P → Q)
          at the functor daughter of my functor daughter, R → (P → Q) holds
As we shall see shortly, the use of this logic is crucial for many analyses in the
system. It allows, for example, a specification of the lexical actions associated with
parsing some word to construct nodes within a tree, or to annotate the current or
other nodes with some information, as we shall see below. The modal operators
also come into play with respect to parsing particular words to check aspects of the
local linguistic context (the partial tree constructed so far) to ensure that the parse
is well-formed. As an example, one way of expressing case relations is to articulate
them as a constraint licensing the decoration of the current node only if its mother
bears a certain kind of decoration. Hence, one might define a subject pronoun in
English, such as they, as licensed just in case the node that dominates the current
node carries a propositional requirement, ⟨↑0⟩?Ty(t), i.e. the current node is a subject
node. Such a constraint is satisfied in the first tree in (2.12) (showing part of the
analysis of They sing) but not in the second (disallowing *Kim likes they).
(2.12) Proper and improper contexts for parsing they
[Tree displays: in the first (OK), the node to be decorated is the argument daughter of Tn(0), ?Ty(t), as in They sing; in the second (NOT OK), it is the object argument inside the predicate, as in *Kim likes they.]
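The subject restriction on they reduces, on the address-based encoding used in these sketches, to a one-line check. The representation below is our own simplification: nodes keyed by address, with sets of labels as values.

```python
# The licensing condition <up0>?Ty(t) on English 'they' from (2.12):
# 'they' may only decorate the argument daughter of a node requiring
# type t, i.e. a subject node.

ok_tree  = {"0": {"?Ty(t)"}, "00": set(), "01": {"?Ty(e -> t)"}}
bad_tree = {"0": {"?Ty(t)"}, "01": {"?Ty(e -> t)"},
            "010": set(), "011": set()}

def licenses_they(tree, node):
    if not node.endswith("0"):
        return False                          # not an argument daughter
    return "?Ty(t)" in tree.get(node[:-1], set())

licenses_they(ok_tree, "00")     # True: the subject node of They sing
licenses_they(bad_tree, "010")   # False: the object node in *Kim likes they
```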
The modal operators may also be used in conjunction with the notion of requirement
to constrain further development of the tree. So while an annotation such as
⟨↓0⟩Fo(α) indicates that the formula value α holds at my argument daughter (and
is therefore only true if, within the tree as currently constructed, Fo(α) actually
does decorate that node), one such as ?⟨↓0⟩Fo(α) merely states that at some point
in constructing the current tree, Fo(α) must come to decorate my argument daughter. This
gives us another way to express case: as providing a filter on the output tree.
So we might also express nominative case as providing a decoration of the form
?⟨↑0⟩Ty(t).10
Notice that in this second characterisation of case the ‘?’ precedes the
modal operator, so this formulation expresses the constraint that, in order to achieve
a well-formed result, the current node must come to be immediately dominated by a node
which has the decoration Ty(t). This is of course not achieved at any stage prior to
the final goal; hence it is a filter on the output tree. As we can already see, this will
give us flexibility with respect to the analysis of case; individual languages,
or indeed individual case specifications, may differ in the restrictions they impose.
Quite generally, we shall see that such modal requirements are very important and
will be used extensively throughout this book.
10 We recognise that this sort of specification of case as just involving conditions on tree position
(i.e. of argument status) is insufficient as a complete characterisation of case in languages that
have well-articulated systems of morphological case. It is, however, sufficient for our limited
purposes and will be used to provide the basis for a theory of constructive case in chapter 6 with
respect to scrambling in Japanese.
2.3 Constructing Trees
b.
{. . . φ . . . ♦ . . . }
{. . . ψ . . . ♦ . . . }
As already noted, a pair of transition rules together drive the process of growth
by breaking down goals into subgoals. The rule of Introduction is effectively an
inference from an initial goal to one in which two subgoals are added in the form
of requiring the tree to grow and be annotated by information that together can
satisfy the original goal. Specifically, the rule unpacks a requirement to find an
expression of one type into requirements to have daughters which are decorated by
expressions of other types which can be combined through functional application to
give an expression of the appropriate type. In other words, the rule adds to a node
with a requirement to be decorated with an expression of type X, requirements to
have daughters decorated with expressions of type Y on one node and type Y → X
on the other. This is formally defined in (2.14) in terms of tree descriptions and
shown in terms of tree growth in (2.15).11 Notice that the tree in (2.15) has not
actually grown the daughter nodes: it has merely acquired requirements to have
such nodes. So the tree, as in the input, consists of only one node.
(2.14) Introduction (Rule)
{. . . {. . . ?Ty(Y ) . . . ♦} . . . }
{. . . {. . . ?Ty(Y ), ?⟨↓0⟩Ty(X), ?⟨↓1⟩Ty(X → Y ), . . . ♦} . . . }
The outermost brackets in such tree descriptions allow the rule to apply at any
point in the tree-construction process, and will be a general property of the rules
unless explicitly defined otherwise.
The second rule, called Prediction, introduces the required nodes and decorates
them with requirements to be decorated by expressions of the required type.
As before, we give the formal definition in terms of tree descriptions (2.16), followed
by the same information shown in terms of tree growth (2.17).12
(2.16) Prediction (Rule)
{. . . {Tn(n), . . . , ?⟨↓0⟩Ty(Y ), ?⟨↓1⟩Ty(Y → X), . . . ♦} . . . }
{. . . {Tn(n), . . . } , {⟨↑0⟩Tn(n), ?Ty(Y ), ♦} , {⟨↑1⟩Tn(n), ?Ty(Y → X)} . . . }
as the first of the two nodes to be expanded. Although this is not necessary from the point of
view of the grammatical formalism, we take this move to be universal and a reflection of the fact
that (for example) subjects are typologically more frequently found before their verbs than after
them. We will see below and throughout the book how other word orders may be achieved.
[(2.17): tree-growth display for Prediction, in which the mother grows two daughters decorated ?Ty(Y ), ♦ and ?Ty(Y → X).]
The effect of these rather complex-looking rules is, in fact, best illustrated by a
single step of tree growth, as shown in (2.20), where a tree grows from just
one node decorated with a requirement of type t to a new tree with two new nodes
decorated with requirements to find expressions of types e and e → t.13 Although we
will continue to present the transition rules in their more formal form in this chapter,
in order to familiarise the reader with the technical apparatus, we will illustrate the
effects of all such rules using figures that show tree growth directly. In later chapters,
although we will continue to state the formal definitions of transition rules, readers
are encouraged to rely on the more immediately comprehensible illustrations of tree
growth.
(2.20) Introduction and Prediction of Subject and Predicate

       ?Ty(t), ?⟨↓0⟩Ty(e), ?⟨↓1⟩Ty(e → t), ♦

   ↦

       ?Ty(t), ?⟨↓0⟩Ty(e), ?⟨↓1⟩Ty(e → t)
       ├─ ?Ty(e), ♦
       └─ ?Ty(e → t)
One property of these rules should be noted immediately: all rules are optional, so
the system is constraint-based and in this respect like HPSG and LFG and unlike
minimalism.14
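Introduction and Prediction can be rendered as operations on the address-keyed node table used in the earlier sketches. The function names and label strings below are our own, not DS notation; the sequence reproduces the single step of growth in (2.20).

```python
# The transitions Introduction (2.14) and Prediction as tree-growth
# operations over a table of nodes keyed by address.

def introduction(tree, pointer, X, Y):
    """Add to a node requiring type X the requirements to have daughters
    of type Y and of type Y -> X."""
    tree[pointer] |= {f"?<d0>Ty({Y})", f"?<d1>Ty({Y} -> {X})"}

def prediction(tree, pointer, X, Y):
    """Build both daughters, decorated with the announced type
    requirements; the pointer moves to the argument daughter."""
    tree[pointer + "0"] = {f"?Ty({Y})"}
    tree[pointer + "1"] = {f"?Ty({Y} -> {X})"}
    return pointer + "0"

tree = {"0": {"?Ty(t)"}}             # the initial requirement: a proposition
introduction(tree, "0", "t", "e")
pointer = prediction(tree, "0", "t", "e")
# As in (2.20): node "00" now carries ?Ty(e), node "01" carries ?Ty(e -> t),
# and the pointer sits at the subject node "00".
```

Since both rules are optional, nothing forces this step; it is simply licensed whenever its input description is matched.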
13 In fact, it is solely this variant of Introduction that we shall use, narrowing down the set of
For example, an expression like a proper name is of Ty(e) and requires that
there be a current requirement ?Ty(e) at the stage at which the lexical entry is
scanned. This provides the appropriate trigger. The THEN statement lists the
particular actions which are performed if the condition in the IF statement is met.
In the case of a proper name, this can be taken to be a simple annotation of the
current node with type and formula information, using the predicate put(). Finally,
the only other possibility is to abort the parse. The lexical entry for a name like
Hilary is thus as shown in (2.22), and its effect is to induce the transition shown in
(2.23) from the output tree in (2.20) above.15
(2.22) Hilary
       IF   ?Ty(e)                           Trigger
       THEN put(Ty(e), Fo(Hilary′), [↓]⊥)    Annotation
       ELSE Abort                            Failure
15 This view of proper names as having no internal structure will be revised in Chapter 3, when
we incorporate an account of quantification.
(2.23) Parsing Hilary

       ?Ty(t)
       ├─ ?Ty(e), ♦
       └─ ?Ty(e → t)

   ↦

       ?Ty(t)
       ├─ ?Ty(e), Ty(e), Fo(Hilary′), [↓]⊥, ♦
       └─ ?Ty(e → t)
The one strange decoration in this sequence of actions is [↓]⊥, what we call
“the bottom restriction”. This annotation expresses the fact that the word is used to
decorate a terminal node in the tree. Formally it reads: “necessarily below the
current node (for every node the current node immediately dominates), the falsum
holds”, i.e. the current node has, and will have, no daughters with any properties at
all. Intuitively, this is the reflection in this system of the fact that words provide the minimal
parts from which interpretation is built up. So we reflect a compositionality-of-meaning
principle like anyone else; but the information projected by words may be
considerably more than merely providing some concept.
Within lexical entries, the order of the action predicates is important. For
example, put(. . . ) before make(. . . ) means ‘put information at the current node, then
build a new node’, while make(. . . ) before put(. . . ) means ‘build a new node and
put some information there’. This matters, for example, in parsing verbs in
English. We take the parsing of verbs in English to be triggered by a context in
which there is a predicate requirement, ?Ty(e → t).16 The actions induced by
parsing finite verbs involve at minimum the annotation of the propositional node
with tense information and the annotation of the predicate node. This is all that
happens with intransitive verbs, as shown in the lexical entry for danced in
(2.24).
(2.24) danced
       IF   ?Ty(e → t)                         Predicate trigger
       THEN go(⟨↑1⟩?Ty(t));                    Go to propositional node
            put(Tns(PAST));                    Add tense information
            go(⟨↓1⟩?Ty(e → t));                Go to predicate node
            put(Ty(e → t), Fo(Dance′), [↓]⊥)   Add content
       ELSE Abort
There are a number of things to notice about the information in this lexical
entry. In the first place, the condition for the introduction of the information from
danced is that the current task state is ?Ty(e → t). Then there is movement from
that node up to the immediately dominating propositional node, given by the
instruction go(⟨↑1⟩?Ty(t)), ‘go up to the immediately dominating node decorated by
a propositional requirement’. This node is then annotated with tense information,
which we have represented simplistically in terms of a label Tns with simple values
PAST or PRES(ENT) (for English).17 Notice that this means of encoding tense
obviates the need for ‘percolation’ or ‘copying’ devices, as required in HPSG and
other frameworks, to ensure that information introduced by a word gets to the
place where it is to be interpreted. This is done entirely through the dynamics
16 Differences in the trigger for classes of expression is one of the ways in which cross-linguistic
better account would include proper semantic information manipulating indices so that the tense
label does not look like a simple syntactic label. See also chapter 3 where the account of tense is
slightly expanded in conjunction with the discussion of quantification.
of parsing the verb.18 The pointer then returns to the open predicate node, which
is annotated with type and formula information (and the bottom restriction), as
we saw with parsing Hilary.
Notice that the order in which the action predicates occur is important, reflecting
the dynamic nature of the analysis. If the action put(Tns(PAST)) preceded
the action go(⟨↑1⟩?Ty(t)), the predicate node would be decorated with the tense
information rather than the propositional node. Similarly, the ordering of put(Ty(e →
t), Fo(Dance′), [↓]⊥) before go(⟨↓1⟩?Ty(e → t)) would fail to satisfy the propositional
requirement on the node and so lead to a failure of the parse.
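The IF/THEN/ELSE action format lends itself to direct simulation. Below is a toy rendering of the entry for danced in (2.24), under the same simplified tree encoding as in the earlier sketches; in particular, the tense decoration lands on the propositional node precisely because the 'go up' step precedes the put, the ordering point just made.

```python
# A toy rendering of the lexical entry for 'danced' (2.24): nodes keyed by
# address strings, label sets as values. Label spellings are ours.

def parse_danced(tree, pointer):
    if "?Ty(e -> t)" not in tree[pointer]:         # IF   ?Ty(e -> t)
        raise ValueError("Abort")                  # ELSE Abort
    tree[pointer[:-1]].add("Tns(PAST)")            # go(<up1>?Ty(t)); put(Tns(PAST))
    tree[pointer] |= {"Ty(e -> t)", "Fo(Dance')",  # go back down; put content,
                      "[down]_falsum"}             # including the bottom restriction
    return tree

tree = {"0": {"?Ty(t)"},
        "00": {"Ty(e)", "Fo(John')"},
        "01": {"?Ty(e -> t)"}}
parse_danced(tree, "01")   # Tns(PAST) now decorates the propositional node 0
```

Running the actions with the pointer anywhere other than a ?Ty(e → t) node simply aborts, mirroring the ELSE clause.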
So intransitive verbs add information about tense and supply a one-place
predicate. Transitive verbs, however, not only add tense but also create new nodes: a
two-place predicate node, which they annotate with type and formula values, and
a node for the internal argument, decorated with a type e requirement. This is
illustrated in the lexical entry for upset in (2.25).
(2.25) upset
       IF   ?Ty(e → t)                              Predicate trigger
       THEN go(⟨↑1⟩?Ty(t));                         Go to propositional node
            put(Tns(PAST));                         Tense information
            go(⟨↓1⟩?Ty(e → t));                     Go to predicate node
            make(⟨↓1⟩);                             Make functor node
            go(⟨↓1⟩);                               Go to functor node
            put(Fo(Upset′), Ty(e → (e → t)), [↓]⊥); Annotation
            go(⟨↑1⟩);                               Go to mother node
            make(⟨↓0⟩);                             Make argument node
            go(⟨↓0⟩);                               Go to argument node
            put(?Ty(e))                             Annotation
       ELSE Abort
The condition for the introduction of the information from upset is that the
current task state is ?Ty(e → t). If this condition is met, a new functor node is built
and annotated with the formula and type values specified and, following the return
to the mother node, a new daughter node is built with a requirement for a formula
of type e. To be fully explicit, the decoration Fo(Upset′) should be given as
Fo(λyλx[Upset′(x, y)]), with the λ-operators indicating the number and type of
arguments with which the predicate Upset′ has to combine, and the order in which this
functor will combine with them.19
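The node-building actions of upset can be sketched in the same style; the make-then-go-then-put sequences are modelled here simply by writing the decorations straight onto the newly created addresses (an encoding of our own, as before).

```python
# Sketch of the actions of transitive 'upset' (2.25): tense on the
# propositional node, a new two-place predicate node, and a new internal
# argument node carrying ?Ty(e), where the pointer ends up.

def parse_upset(tree, pointer):
    if "?Ty(e -> t)" not in tree[pointer]:          # IF ?Ty(e -> t), ELSE Abort
        raise ValueError("Abort")
    tree[pointer[:-1]].add("Tns(PAST)")             # tense on the ?Ty(t) node
    tree[pointer + "1"] = {"Ty(e -> (e -> t))",     # make/go/put: functor node
                           "Fo(Upset')", "[down]_falsum"}
    tree[pointer + "0"] = {"?Ty(e)"}                # make/go/put: argument node
    return pointer + "0"                            # pointer at the open node

tree = {"0": {"?Ty(t)"},
        "00": {"Ty(e)", "Fo(Hilary')"},
        "01": {"?Ty(e -> t)"}}
pointer = parse_upset(tree, "01")   # pointer = "010", as in (2.26)
```

Parsing Joan then simply satisfies the ?Ty(e) requirement at node 010, as in (2.27).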
18 It might be objected that tense information should be generalised, since otherwise one might expect
different verbs within the same language to behave differently with respect to such inflectional
matters. Possibly, in morphologically regular constructions, the phonological information provided
by the consonant cluster indicates a separate lexical specification for the suffix, an analysis
of phonological clustering advocated within Government Phonology (Kaye 1995). In fact, it is
relatively easy to structure the lexicon within Dynamic Syntax as is done in HPSG (Sag and
Wasow 1999, etc.) so that past tense (for example) can be stated generally as an instruction to
go up to the mother node and decorate that with the past tense label:

tense-past  IF   Ty(e → t)
            THEN go(⟨↑1⟩); put(Tns(PAST)); go(⟨↓1⟩)
            content

where content stands for the basic actions induced by all forms of the verb, in the case of dance merely the
decoration of the predicate node with the appropriate formula, type and bottom restriction. There
are differences between HPSG accounts and what is necessary in the current dynamic framework,
but we do not pursue these refinements in this book. What cannot differ between lexical
specifications is the specification of tense necessary for the appropriate semantics to be given, here specified
as a decoration on the type-t-requiring node as a promissory note for a formal characterisation.
See also chapter 3, and chapter 6, where we see that the past-tense morpheme of Japanese plays
a particular role in determining the way in which interpretation is incrementally built up.
19 The λ operator is an abstraction operator which constructs functions from some input to a
To illustrate the effect of the parse of a transitive verb, (2.26) shows the transition
from the output tree in (2.23) to a tree with the pointer at the open predicate
node, which triggers the parse of the verb upset to give the second tree.
(2.26) Parsing Hilary upset

       ?Ty(t)
       ├─ Ty(e), Fo(Hilary′), [↓]⊥
       └─ ?Ty(e → t), ♦

   ↦

       ?Ty(t)
       ├─ Ty(e), Fo(Hilary′), [↓]⊥
       └─ ?Ty(e → t)
            ├─ ?Ty(e), ♦
            └─ Ty(e → (e → t)), Fo(Upset′), [↓]⊥
With the pointer again at an open Ty(e) node, it is possible to parse another proper
noun, say Joan, to yield the tree shown in (2.27).
(2.27) Parsing Hilary upset Joan

       ?Ty(t)
       ├─ Ty(e), Fo(Hilary′), [↓]⊥
       └─ ?Ty(e → t)
            ├─ Ty(e), Fo(Joan′), ♦
            └─ Ty(e → (e → t)), Fo(Upset′), [↓]⊥
The first rule provides a means for stating that requirements have been fulfilled:
(2.28) Thinning
{. . . {. . . , φ, . . . , ?φ, . . . , ♦} . . . }
{. . . {. . . , φ, . . . , ♦} . . . }
All this rule does is simplify the information accumulated at a node (a Declarative
Unit, or DU): if, at the current node, a DU holds which includes both a fact and the
requirement to fulfil this fact, the requirement is deleted and the pointer remains at
the current node. Thinning is the only rule that removes decorations rather than
adding them. Hence, in (2.23), there is a further step before the pointer moves to
the predicate node: getting rid of the type requirement, as shown in (2.29).20
(2.29) Parsing Hilary with thinning

       ?Ty(t)
       ├─ ?Ty(e), Ty(e), Fo(Hilary′), [↓]⊥, ♦
       └─ ?Ty(e → t)

   ↦

       ?Ty(t)
       ├─ Ty(e), Fo(Hilary′), [↓]⊥, ♦
       └─ ?Ty(e → t)
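On the set-of-labels encoding used in these sketches, Thinning as defined in (2.28) is a one-line operation on a Declarative Unit.

```python
# Thinning (2.28): delete any requirement ?phi whose body phi already
# holds at the node. The DU is encoded (as elsewhere in these sketches)
# as a set of label strings.

def thinning(du):
    return {lab for lab in du
            if not (lab.startswith("?") and lab[1:] in du)}

thinning({"Ty(e)", "?Ty(e)", "Fo(Hilary')"})   # drops ?Ty(e), as in (2.29)
thinning({"?Ty(t)"})                           # an unfulfilled requirement survives
```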
We also need transition rules that move the pointer on from a type-complete
node, and by so doing satisfy the modal requirement imposed by Introduction.
In Kempson et al. (2001), this is done by a rule called Completion (which can be
regarded as the inverse of Prediction), which moves the pointer up from a daughter
to a mother and, crucially, annotates the mother node with the information that
it indeed has a daughter with certain properties. This latter move has the effect
of satisfying the modal requirement introduced by Introduction. The rule of
Completion is given in (2.30), and the effect in terms of tree growth is shown in
(2.31). Hence, Completion states that if at a daughter node some information
holds which includes an established type, and the daughter is the current node,
then the mother node may become the current node. In the tree display, we use
a ternary branching format to emphasise the neutrality of Completion between
functor and argument daughters.
(2.30) Completion (Rule):21

{. . . {Tn(n), . . . }, {⟨↑⟩Tn(n), . . . , Ty(X), ♦} . . . }
{. . . {Tn(n), . . . , ⟨↓⟩Ty(X), ♦}, {⟨↑⟩Tn(n), . . . , Ty(X)} . . . }
20 In general, however, we will not show the final transition determined by thinning, assuming
alise this to cover the return of the pointer having constructed and decorated an unfixed node
(symbolised by the ∗ notation).
Notice how this formulation of completion brings out how the use of modal
statements reflects the perspective of the node in the tree at which they hold. So in
the rule as formulated, some daughter node is defined to have a mother node Tn(n)
above it: i.e. it itself is described as ⟨↑i⟩Tn(n). In the input state defined by the rule,
Ty(X) holds at the daughter node. The effect of the rule is to license the pointer
to shift up to the mother node and, from the perspective of the mother node, to
record the fact that ⟨↓i⟩Ty(X) holds. The latter annotation, as noted, satisfies the
requirement ?⟨↓i⟩Ty(X) written to the mother node by Introduction, and the
node can duly be thinned.
We also need a rule for moving the pointer down a tree which we call antic-
ipation and which moves the pointer from a mother to a daughter which has an
outstanding requirement.22 The rule is given in (2.32) and the effect on tree growth
is shown in (2.33).
(2.32) Anticipation (Rule):

{. . . {Tn(n), . . . , ♦}, {⟨↑⟩Tn(n), . . . , ?φ} . . . }
{. . . {Tn(n), . . . }, {⟨↑⟩Tn(n), . . . , ?φ, ♦} . . . }
The rules for pointer movement mean that the transitions in (2.26) are mediated
by the further transitions shown in (2.34): the first licensed by completion; the
second by thinning and the third by anticipation. Although these transitions
are formally necessary (and have an effect on licit derivations, as we shall see in
later chapters), we will, in general, ignore this sort of straightforward development,
assuming that the pointer always moves directly from a type-complete node to a
(sister) node hosting an open requirement.
(2.34) Parsing Hilary with completion and anticipation
a. After Thinning
   ?Ty(t), ?⟨↓0⟩Ty(e), ?⟨↓1⟩Ty(e → t)
   |– Ty(e), Fo(Hilary′), [↓]⊥, ♦
   |– ?Ty(e → t)
b. Completion
   ?Ty(t), ?⟨↓0⟩Ty(e), ⟨↓0⟩Ty(e), ?⟨↓1⟩Ty(e → t), ♦
   |– Ty(e), Fo(Hilary′), [↓]⊥
   |– ?Ty(e → t)
22 The term ‘anticipation’ being intended to convey the idea that the movement of the pointer
c. Thinning
   ?Ty(t), ⟨↓0⟩Ty(e), ?⟨↓1⟩Ty(e → t), ♦
   |– Ty(e), Fo(Hilary′), [↓]⊥
   |– ?Ty(e → t)
d. Anticipation
   ?Ty(t), ⟨↓0⟩Ty(e), ?⟨↓1⟩Ty(e → t)
   |– Ty(e), Fo(Hilary′), [↓]⊥
   |– ?Ty(e → t), ♦
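The two pointer-movement rules just illustrated can be sketched operationally. This is an illustrative model only, with our own conventions throughout: nodes are keyed by 0/1 addresses, decorations are strings, and "<down>" stands in for the modality ⟨↓⟩.

```python
# A toy model of Completion and Anticipation, not the formal definitions.
tree = {
    "0":  {"decorations": {"?Ty(t)"}, "mother": None},
    "00": {"decorations": {"Ty(e)", "Fo(Hilary')"}, "mother": "0"},
    "01": {"decorations": {"?Ty(e->t)"}, "mother": "0"},
}

def completion(tree, pointer):
    """Move the pointer from a type-complete daughter up to its mother,
    recording there that it has a daughter of that type."""
    node = tree[pointer]
    ty = next((d for d in node["decorations"] if d.startswith("Ty(")), None)
    if ty is not None and node["mother"] is not None:
        tree[node["mother"]]["decorations"].add("<down>" + ty)
        return node["mother"]
    return pointer

def anticipation(tree, pointer):
    """Move the pointer down to a daughter with an outstanding requirement."""
    for addr, node in tree.items():
        if node["mother"] == pointer and any(d.startswith("?") for d in node["decorations"]):
            return addr
    return pointer

pointer = completion(tree, "00")       # up to the root, which gains <down>Ty(e)
pointer = anticipation(tree, pointer)  # down to the predicate node "01"
```

The sketch mirrors (2.34b) and (2.34d): completion annotates the mother, anticipation selects a requirement-bearing daughter.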
Elimination (Rule):

{. . . {?Ty(X), . . . , ♦}, {Ty(Y), Fo(α)}, {Ty(Y → X), Fo(β)} . . . }
{. . . {?Ty(X), Ty(X), Fo(β(α)), . . . , ♦}, {Ty(Y), Fo(α)}, {Ty(Y → X), Fo(β)} . . . }
Notice that elimination does not introduce a new node, but only changes an-
notations holding at one node: if a given node immediately dominates an argument
and a functor daughter which are both annotated with a formula and a type value,
then the two type values can combine by modus ponens, with the corresponding
Formula expressions combined by function-application. For example, in complet-
ing the analysis of Hilary upset Joan from the output tree in (2.27) (shown as
the initial tree in (2.37)), completion licenses the movement of the pointer to the
open predicate node. Then elimination applies to satisfy that type requirement
as shown in the last of the partial trees in (2.37).
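The type/formula combination just described can be sketched as follows, a hedged illustration in which types are encoded as nested tuples of our own devising ('e' atomic, ('e', 't') for e → t):

```python
def eliminate(arg, fn):
    """Combine an argument daughter (type, formula) with a functor daughter:
    types combine by modus ponens, formulae by function application."""
    (arg_ty, arg_fo), (fn_ty, fn_fo) = arg, fn
    inp, out = fn_ty                       # functor type (Y, X), i.e. Y -> X
    assert inp == arg_ty, "types do not combine"
    return (out, f"{fn_fo}({arg_fo})")     # Ty(X), Fo(beta(alpha))

# completing Hilary upset Joan from the object node upwards:
upset = (("e", ("e", "t")), "Upset'")            # Ty(e -> (e -> t))
predicate = eliminate(("e", "Joan'"), upset)     # Ty(e -> t), Fo(Upset'(Joan'))
root = eliminate(("e", "Hilary'"), predicate)    # Ty(t), Fo(Upset'(Joan')(Hilary'))
```

Two applications yield the propositional formula, matching the bottom-up compilation of the completed tree.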
23 Note that if construction rules were stated in the same format as lexical actions (as they could
be), the condition could be stated straightforwardly as an instruction to abort the actions, just in
case a daughter has an unsatisfied requirement:
. . . IF ⟨↓⟩?φ
      THEN abort
      ELSE . . .
(2.37) Completing Hilary upset Joan
[Tree displays: from the output tree in (2.27), Completion moves the pointer from the object node up to the predicate node, annotating it ⟨↓0⟩Ty(e), ♦; Elimination then decorates that node with Ty(e → t), Fo(Upset′(Joan′)); the subject node Ty(e), Fo(Hilary′), [↓]⊥ is unaffected throughout.]
24 Notice that in the function-argument notation in the Formula language, the functor always
[Tree display: parsing John in John thinks that Hilary upset Joan: Introduction and Prediction yield the subject (?Ty(e), ♦) and predicate (?Ty(e → t)) requirements; John decorates the subject node with Ty(e), Fo(John′), [↓]⊥, and the pointer moves on to the predicate node ?Ty(e → t), ♦.]
The verb thinks is now parsed. This is associated with the lexical information
in (2.41) which shows present tense and the creation of an internal argument node
with a type t requirement. The effect of parsing the verb is shown in (2.42).
(2.41) thinks
IF   ?Ty(e → t)
THEN go(⟨↑1⟩?Ty(t));
     put(Tns(PRES));
     go(⟨↓1⟩?Ty(e → t));
     make(⟨↓1⟩); go(⟨↓1⟩);
     put(Ty(t → (e → t)), Fo(Think′), [↓]⊥);
     go(⟨↑1⟩); make(⟨↓0⟩); go(⟨↓0⟩);
     put(?Ty(t))
ELSE Abort
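The IF/THEN/ELSE format can be replayed with a toy interpreter of our own. The address arithmetic (appending or dropping 0/1 digits) merely approximates go and make, and the action list below is a simplified transcription of (2.41); none of this is the formal machinery.

```python
# A toy interpreter for lexical actions over 0/1 tree addresses (sketch only).
def run(tree, pointer, trigger, actions):
    if trigger not in tree[pointer]:
        return None                          # ELSE Abort
    for op, arg in actions:
        if op == "make":
            tree.setdefault(pointer + arg, set())   # create the daughter node
        elif op == "go":
            pointer = pointer[:-1] if arg == "up" else pointer + arg
        elif op == "put":
            tree[pointer].update(arg)
    return pointer

tree = {"0": {"?Ty(t)"}, "00": {"Ty(e)", "Fo(John')"}, "01": {"?Ty(e->t)"}}
thinks = [
    ("go", "up"), ("put", {"Tns(PRES)"}),          # tense on the Ty(t) node
    ("go", "1"),                                   # back to the predicate node
    ("make", "1"), ("go", "1"),
    ("put", {"Ty(t->(e->t))", "Fo(Think')"}),      # main functor
    ("go", "up"), ("make", "0"), ("go", "0"),
    ("put", {"?Ty(t)"}),                           # embedded proposition node
]
final = run(tree, "01", "?Ty(e->t)", thinks)       # pointer left at "010"
```

Parsing from any node lacking the trigger returns None, mirroring the Abort clause.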
2.3. CONSTRUCTING TREES 49
(2.42) Parsing John thinks
[Tree display: Tns(PRES) and ?Ty(t) at the top node; subject Ty(e), Fo(John′), [↓]⊥; predicate ?Ty(e → t) with functor daughter Ty(t → (e → t)), Fo(Think′), [↓]⊥ and argument daughter ?Ty(t), ♦.]
The effect of parsing the complementiser is shown in (2.44) and the parse of
the rest of the string proceeds exactly as discussed in the previous section, yielding
the tree in (2.45) up to the point at which the embedded proposition has been
completed. Notice that thinning applies twice to the internal propositional node:
once to eliminate the type requirement ?T y(t) and once to eliminate the requirement
for tense information ?∃x.T ns(x).
(2.45) Parsing John thinks that Hilary upset Joan
[Tree display: matrix subject Ty(e), Fo(John′), [↓]⊥; under Fo(Think′), the embedded propositional node completed as Ty(t), Fo(Upset′(Joan′)(Hilary′)), its type and tense requirements thinned.]
The completion of the tree then proceeds through steps of completion, elimina-
tion and thinning on the predicate and top propositional nodes to yield the final
output shown in (2.46).25
(2.46) Completing John thinks that Hilary upset Joan
  Tns(PRES), Fo(Think′(Upset′(Joan′)(Hilary′))(John′)), Ty(t), ♦
  |– Ty(e), Fo(John′), [↓]⊥
  |– Ty(e → t), Fo(Think′(Upset′(Joan′)(Hilary′)))
Notice in this connection that lexical specifications may do more than just dec-
orate the terminal nodes of a tree, but decorating the terminal nodes is essential
to what they do. So there is a sense in which the simple form of the traditional
principle of compositionality (that sentence meanings are composed of meanings of
words in a certain configuration) is retained. It is just that the information that a
word may convey may be more than just some concept: it may provide information
about what other concepts it may combine with; it may introduce other nodes in
the tree (e.g. the object argument for a verb); or, in the case of tensed verbs, the
word may project decorations on other nodes. On the other hand, a word like the
subordinating complementiser that does not decorate a terminal node; indeed, it
25 At this point, the simplistic account of tense we have provisionally adopted means that we
cannot express the compilation of tense as part of the propositional formula. Defining this involves
essential interaction with quantification, so this aspect has to wait until chapter 3.
2.4. LEFT DISLOCATION STRUCTURES 51
In words, this is the interpretation ‘John thinks that Hilary upset Joan’ in which
the formula itself is derived compositionally in a regular bottom-up way from the
terms introduced into the tree structure. The primary difference from the normal
syntax-semantics correspondence is that this structure has been introduced on a
left-to-right basis, rather than on a bottom-up basis as is standard.
[Tree display: a tree whose topnode is decorated Tn(0), Q and which satisfies ⟨↓∗⟩R, i.e. some node below the topnode is decorated R.]
There are four decorated nodes in this tree, but only three of them are in fixed
locations. The fourth is described as holding at ⟨↑∗⟩Tn(0), indicating that it holds at
some node within the tree along a sequence of daughter relations from the topnode
but without that sequence being further specified. In short, the only information
similar to the concept of ‘functional uncertainty’ of LFG (Kaplan and Zaenen 1989), though the
resulting analyses are distinct: see chapter 5 section 5.6.
27 Pronounced “star adjunction”, the name being intended to express the idea of a node which
The output tree in (2.52) provides an environment in which the proper name
Joan can be parsed. Completion can apply to the type-complete node because
⟨↑∗⟩ is one of the modalities that it refers to (see (2.30)), and the pointer moves
back up to the topnode.28 The parse of the string in (2.50) then proceeds through
Introduction and Prediction to provide subject and predicate requirements,
permitting the parse of Hilary upset, with the unfixed node remaining on hold, so
to speak, as shown in (2.53).29
(2.53) Parsing Joan, Hilary upset
a. Parsing Joan
   Tn(0), ?Ty(t)
   |– (unfixed) ⟨↑∗⟩Tn(0), Ty(e), Fo(Joan′), ?∃x.Tn(x)
[Tree displays b–d: Introduction and Prediction then add the subject node Tn(00), ?Ty(e), ♦ and the predicate node Tn(01), ?Ty(e → t); parsing Hilary upset decorates Tn(00) with Fo(Hilary′) and builds the internal argument node ?Ty(e), ♦, the unfixed node remaining on hold.]
28 This generalisation of completion is not a license to move the pointer up through arbitrary
fixed nodes in a tree. Kempson et al. (2001) make a distinction between a relation of ‘internally
dominate’, which applies to unfixed nodes only, and one of ‘externally dominate’ which is the
general dominate relation ranging over all dominated tree positions, fixed or unfixed. External
domination is shown using up and down arrows without the angle brackets. So, ↑∗ is the external
modality corresponding to the internal modality ⟨↑∗⟩. Only the latter is referred to in the rule
of completion so movement of the pointer upwards can only take place from a currently unfixed
node (decorated also by the treenode requirement ?∃x.T n(x)) or from a fixed argument or functor
daughter, but not from any other nodes.
29 In all trees that follow, the bottom restrictions imposed by lexical specifications will be omitted.
At the point reached in (2.53d), the pointer is at a node with an open type e
requirement. Since this matches the type of the unfixed node, the latter may merge
with that position.30 The output of this process is the unification of the information
on the unfixed node with that on the internal argument node. This ensures that
the open type requirement on the latter and the outstanding requirement on the
unfixed node to find a treenode address are both satisfied, as can be seen from
the DU in (2.54a). Thinning applies (over ?∃x.Tn(x) and Tn(010), and ?Ty(e)
and Ty(e)) to give (2.54b), where, because the specific modality ⟨↑0⟩⟨↑1⟩Tn(0) is
subsumed by the general one ⟨↑∗⟩Tn(0), only the former need be retained.
(2.54) a. {⟨↑∗⟩Tn(0), ⟨↑0⟩⟨↑1⟩Tn(0), ?∃x.Tn(x), Tn(010), ?Ty(e), Ty(e),
          Fo(Joan′), [↓]⊥, ♦}
       b. {⟨↑0⟩⟨↑1⟩Tn(0), Tn(010), Ty(e), Fo(Joan′), [↓]⊥, ♦}
This process is shown informally in (2.55) where the dashed arrow indicates the
merge process and the second tree shows the output. Notice that when compiled
and completed, the tree in (2.55b) is identical to the output tree for Hilary upset
Joan in (2.38). The informational differences between the two strings are thus not
encoded in the representation, as in many current theories of syntax, either as a
separate layer of grammatical information (Vallduví 1992) or in terms of functional
categories projected at the left periphery of the clause (Rizzi 1997, and others fol-
lowing him). Instead, the differences, we assume, derive from the different ways
that the final tree is established. With the left dislocated object, a term is pre-
sented to the hearer which provides an update for a propositional structure to be
established from the rest of the string – a type of focus effect (see Kempson et al.
(2004) for further discussion).31
30 The process of merge used in DS should not be confused with the entirely different process
of Merge in the Minimalist Program (Chomsky 1995). A better term in DS would be unify, but
we adhere to the term introduced in Kempson et al. (2001).
31 Note, in passing, how the process directly reflects an idea of focus as involving an isolated
term which provides an update to some, given, propositional form (Rooth 1985, Erteschik-Shir
1997), a perspective which we pursue somewhat further in chapter 4.
(2.55)
[Tree displays: the dashed arrow indicates the merge of the unfixed node (Ty(e), Fo(Joan′)) with the internal argument node; in the output tree, Tn(00), Ty(e), Fo(Hilary′) and the predicate node Tn(01) develop exactly as in parsing Hilary upset Joan.]
At this point, we need some reflection on the differences between the formal
apparatus and what the tree displays seem to be reflecting. With the unfixed node
appearing to the left of the main propositional tree attached to the topnode, it
looks from the tree displays as though it is somehow associated with some position on
the left periphery, analogous perhaps to some Topic or Focus projection. However,
this is merely an artefact of the use of tree diagrams to illustrate the concepts
used. Recall that trees give representations of meaning, not words and phrases,
so the representation of the node to the left or the right is immaterial. More
importantly, unfixed nodes do not inhabit any determinate position in the tree
and, technically, the information associated with the unfixed structure (i.e. the
declarative unit that describes that structure), is checked against each partial tree
as successive pairs of daughter nodes are introduced. So in effect, the information
from such nodes is passed down the tree, step by step, until a fixed position for
the node can be established. Merge may then take place at any stage where the
information on the unfixed node is compatible with that on the fixed position. The
development shown in (2.53) might thus be more accurately shown as (2.56), where
the information associated with the unfixed node is carried down the tree along
with the pointer. The information associated with the unfixed node is shown inside
the dashed boxes.
(2.56) Parsing Joan, Hilary upset
a. *Adjunction ↦
   Tn(0), ?Ty(t)   [dashed box: ⟨↑∗⟩Tn(0), ?Ty(e), ?∃x.Tn(x), ♦]
b. Parsing Joan
   Tn(0), ?Ty(t)   [dashed box: ⟨↑∗⟩Tn(0), Ty(e), Fo(Joan′), ?∃x.Tn(x), [↓]⊥, ♦]
Note that at each of these intermediate nodes, the decorations on the unfixed
node are checked for consistency with that intermediate node. And at any one
of these nodes Tn(a), the unfixed node can be described as ⟨↑∗⟩Tn(a) because,
according to this concept of ‘be dominated by’, the relation holds between two
nodes if there is any sequence of daughter relations between the two nodes in
question, including the empty one, hence if the property holds at that node.32
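This inclusive notion of domination can be checked directly over 0/1 tree addresses (an encoding of our own, not part of the formal system): with addresses, "dominates via zero or more daughter steps" is just the prefix relation.

```python
# 'a dominates b' via zero or more daughter steps, so every node dominates
# itself: with 0/1 addresses this is simply the string-prefix relation.
def dominates(a, b):
    return b.startswith(a)

checks = (dominates("0", "010"),    # strictly below
          dominates("010", "010"),  # the empty sequence of steps
          dominates("010", "0"))    # a mother is not dominated by a daughter
```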
At this point we need to be a bit more precise as to what the process of merge
involves. Quite simply, all it does is to unify two node descriptions, here referred
to as DU, DU′ (a pair of DUs) as indicated in the formal rule in (2.57).
(2.57) Merge
{. . . {. . . DU, DU′ . . . } . . . }
{. . . {. . . DU ⊔ DU′ . . . } . . . }
where ♦ ∈ DU′
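Read procedurally, the rule amounts to taking the union of the two node descriptions and then letting Thinning discharge any newly fulfilled requirements. The sketch below uses our string encoding of decorations; the treenode-address requirement ?∃x.Tn(x) would be discharged in the same way once Tn(010) supplies a value, which this simplified version does not model.

```python
# Merge as unification of two declarative units (set union), followed by the
# discharge of any requirement ?X whose fact X now holds -- a sketch only.
def merge(fixed, unfixed):
    combined = fixed | unfixed             # DU join DU'
    return {d for d in combined if not (d.startswith("?") and d[1:] in combined)}

fixed = {"Tn(010)", "?Ty(e)"}              # the internal argument node
unfixed = {"Ty(e)", "Fo(Joan')"}           # the unfixed node's description
result = merge(fixed, unfixed)             # ?Ty(e) discharged against Ty(e)
```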
The only constraints on this completely general rule are that: (a) the pointer is
one of the decorations on one of the DUs (the fixed node) and (b) that the two
32 In this way, the concept of the unfixed node shows some similarity with the SLASH passing
mechanisms of HPSG (Sag and Wasow 1999). A significant difference, however, is that, once
discharged through merge, no record remains of the fact that the left peripheral expression was
ever unfixed.
2.5 Anaphora
We have so far been considering the processing of sentences pretty much as though
in isolation. We now have to rectify this, and take up the challenge put out in
chapter 1 to provide a unitary characterisation of pronominal anaphora. To do this,
we assume that a pronoun gives rise to a place-holder for some logical expression
which has been constructed within the context of utterance. Antecedents, though
they may be given by previous words, cannot be the words themselves, as this
repeatedly gives the wrong result. Taking the antecedent of the pronoun in (2.58a)
to be the quantifying expression itself fails to predict the appropriate interpretation
of (2.58a), and wrongly predicts that it should have
the same interpretation as (2.58b).
(2.58) a. Every child thinks that he should get a prize.
b. Every child thinks that every child should get a prize.
Assuming the general stance that words provide lexical actions in building up rep-
resentations of content in context, in contrast, gets us on the right track. We can
thus say that the pronoun may pick out some logical term if that term is provided
in the discourse context, whether it is a full logical name or a variable introduced by
some quantifying expression, and so on. We consider only the simplest cases here,
leaving a discussion of the quantificational cases until the next chapter, but the
effect of such an assumption, together with the adoption of the epsilon calculus to
provide an account of quantification (see chapter 3) is to capture both the fact that
pronouns contribute very differently to interpretation depending on the antecedent
that they have, and that a pronoun is nevertheless not lexically ambiguous in the
sense of having a number of quite different interpretations defined in the lexicon.
To achieve the simple notion that pronouns pick out some logical term from the
discourse context, we again have recourse to underspecification, in this case to the
underspecification of content, rather than underspecification of position, as with
our account of left dislocation. So we extend the vocabulary of our Formula values
to allow for placeholders for specific values. These we call metavariables in the
logical language and represent as boldface capitals U, V and so on. A pronoun
then projects one such metavariable as the F o value given by its lexical actions. As
a metavariable is just a placeholder for some contentful value, it is associated with
a requirement to establish such a value, ?∃x.Fo(x), just as unfixed nodes with the
underspecified modality ⟨↑∗⟩Tn(n) are associated with a requirement to find a value
for their treenode label, ?∃x.Tn(x). Following the general pattern that all require-
ments have to be eliminated in any well-formed derivation, the formula requirement
ensures that metavariables will be replaced by a term in the F ormula language as
part of the construction process. Such replacement is established through a prag-
matically driven process of substitution which applies as part of this construction
process.
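The pragmatic process just described can be sketched as follows. The context list, the property sets, and the lookup order are all invented for illustration; real Substitution is constrained by much more than a presupposition check.

```python
# A schematic sketch of Substitution: the metavariable's value is supplied
# pragmatically from terms the context makes available.
context = [("John'", {"Male'"}), ("Mary'", {"Female'"})]   # from parsing (2.59)

def substitute(presupposition):
    """Return the first contextual term satisfying the pronoun's constraint,
    discharging ?Ex.Fo(x); None leaves the requirement outstanding."""
    for term, properties in context:
        if presupposition in properties:
            return term
    return None
```

For he, the presupposition Male′ selects John′; for her, Female′ selects Mary′.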
As an illustration, in processing an example such as (2.60), uttered in the context
of having just parsed an utterance of (2.59), we assume the steps of interpretation
in processing the subject and object expressions shown in (2.61).34
(2.59) John ignored Mary.
(2.60) He upset her.
34 The trees are shown schematically; and types and other requirements, once established, are
(2.61)
a. Parsing He
   ?Ty(t)
   |– Fo(U), Ty(e), ?∃x.Fo(x), ♦
   |– ?Ty(e → t)
b. Substitution
   ?Ty(t)
   |– Ty(e), Fo(John′)
   |– ?Ty(e → t), ♦
c. Parsing He upset
   Tns(Past), ?Ty(t)
   |– Ty(e), Fo(John′)
   |– ?Ty(e → t)
      |– ?Ty(e), ♦
      |– Ty(e → (e → t)), Fo(Upset′)
d. Parsing He upset her
   Tns(Past), ?Ty(t)
   |– Ty(e), Fo(John′)
   |– ?Ty(e → t)
      |– Fo(V), Ty(e), ?∃x.Fo(x), ♦
      |– Ty(e → (e → t)), Fo(Upset′)
e. Substitution
   Tns(Past), ?Ty(t)
   |– Ty(e), Fo(John′)
   |– ?Ty(e → t)
      |– Ty(e), Fo(Mary′), ♦
      |– Ty(e → (e → t)), Fo(Upset′)
It is important to notice that the metavariables are replaced at steps (2.61b) and
(2.61e), so there is no record thereafter of there having been a pronominal form
in the input. The tree resulting from the parse of John upset Mary and that of
He upset her, as uttered in the context of having just parsed (2.59), are effectively
identical, the only difference being the occurrence of metavariables redundantly
decorating the relevant nodes once Substitution has taken place in the construal
of the string containing the pronouns.
The lexical specification for the pronoun he can now be given. It must express
both the need for the pronoun to have a value established in context, and specify
whatever other constraints the pronoun imposes. So, for example, he is associ-
ated with a constraint on substitution that whatever term is substituted for the
metavariable projected by the pronoun is describable (in context) as having the
property M ale′ . This sort of constraint (a presupposition) we show as a subscript
on the metavariable.35 Additionally, the case of a pronoun constrains which node
in a tree the pronoun may decorate. For nominative pronouns in English, where the
morphological specification of case is very restricted, we take nominative case spec-
ification to take the form of an output filter: a requirement that the immediately
dominated node be decorated with type t. In English, of course, there is a further
constraint that nominative pronouns only appear in finite subject positions, so that
the actual constraint is shown as a requirement to be immediately dominated by a
propositional node with a tense decoration.36
(2.62) he
IF   ?Ty(e)
THEN put(Ty(e),                        Type statement
         Fo(U_Male′),                  Metavariable and Presupposition
         ?∃x.Fo(x),                    Formula Requirement
         ?⟨↑⟩(Ty(t) ∧ ∃x.Tns(x)),      Case Condition
         [↓]⊥)                         Bottom Restriction
ELSE Abort
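The entry in (2.62) can be rendered as a guarded action in the sketch style used earlier (string decorations; "Ex." for the existential, "<up>" for the modality, "U_Male" for the subscripted metavariable). This is our transcription for illustration, not the formal entry; Thinning would later remove the fulfilled ?Ty(e).

```python
def parse_he(node):
    """Run the lexical actions of 'he' on a node's decoration set."""
    if "?Ty(e)" not in node:
        return False                        # ELSE Abort
    node.update({
        "Ty(e)",                            # type statement
        "Fo(U_Male)",                       # metavariable with presupposition
        "?Ex.Fo(x)",                        # formula requirement
        "?<up>(Ty(t) & Ex.Tns(x))",         # case condition (output filter)
        "[down]bottom",                     # bottom restriction
    })
    return True
```

Applied at a node lacking the trigger ?Ty(e), the actions abort and the node is untouched.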
Other pronouns and other case-forms project analogous information. So, for
example, first and second person pronouns project presuppositions that whatever is
substituted for the metavariable must be the speaker or the hearer/addressee, re-
spectively (i.e. U_Speaker′, U_Hearer′). Notice that, just as with third person pronouns,
the metavariable projected by I/me or you will be replaced by some logical term,
picking out the speaker or hearer, and so the output propositional form ceases to
35 We do not, in this book, go into presuppositional effects or the formal analysis of such things,
3, there is more than one way to express case restrictions. This will be taken up in more detail in
chapter 6, where we shall see that in Japanese, nominative case specification is arguably different
from other case specifications.
[Display: the locality constraint on Substitution, excluding as substituends formulae decorating the local subject (SUBJ) node and other co-arguments of a predicate ?Ty(e → t).]
What this does is to exclude all local arguments as possible substituends for any
metavariable. Additionally, it also excludes as a potential substituend a formula
decorating an unfixed node. This is because, as we saw above, the unfixed node is
evaluated at each node along the path between the node where it is introduced and
the node where it merges. The decorations on the node, including Fo(α), are eval-
uated for whether they hold at that very node, and this entails ⟨↑0⟩⟨↑∗1⟩⟨↓0⟩Fo(α).
Hence, an unfixed node is always local in the sense intended. With this approach,
then, neither Hilary likes her nor Hilary, she likes can ever be associated with the
propositional formula Like′ (Hilary′ )(Hilary′ ). The tree-description vocabulary can
thus capture the type of generalisations that other frameworks rely on such as fa-
miliar concepts of locality (indeed rather more, since the characterisation of partial
trees is the central motivation for turning to a formal tree-description language).
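The co-argument exclusion can be sketched over 0/1 tree addresses. The address arithmetic below is our own encoding of the modality ⟨↑0⟩⟨↑∗1⟩⟨↓0⟩; actual Dynamic Syntax evaluates the modal statement against tree descriptions rather than computing addresses.

```python
# From a pronoun at an argument node, compute the co-argument reachable by
# <up0><up1*><down0>: up one argument step, up the functor spine, down to the
# subject of the local proposition. A sketch under our address encoding.
def co_arguments(addr):
    assert addr.endswith("0")      # pronouns decorate argument nodes
    local = addr[:-1]              # <up0>: up to the functor spine
    while local.endswith("1"):
        local = local[:-1]         # <up1*>: up to the local propositional node
    return {local + "0"} - {addr}  # <down0>: the subject of that proposition

# for her at the object node "010" of Hilary likes her, the formula at the
# subject node "00" is excluded as a substituend:
excluded = co_arguments("010")
```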
This substitution process is assumed to be defined over a context of structures
from which putative antecedents are selected. At this stage, we rely on a purely
intuitive concept of context, with context construed as an arbitrary sequence of
trees, including the partial tree under construction as the parsing process unfolds
incrementally (see chapter 9 for a more detailed discussion of context). So we
presume that the context relative to the parsing of John thinks that he is clever
at the stage of parsing the pronoun includes the partial tree under construction,
hence making available Fo(John′) as a putative antecedent. This is so because John
decorates the matrix subject, which is not a co-argument of the node decorated by
he: the modality between the latter node and the former is mediated by a second
argument node, ⟨↑0⟩⟨↑0⟩⟨↑1⟩⟨↓0⟩.38
in the most common circumstances. However, Substitution is not involved here but the lexical
actions associated with a reflexive identify a local formula and use that as a substitute as part of
the parsing process directly. A word like herself is thus associated with a lexical entry like:
herself
IF   ?Ty(e)
THEN IF ⟨↑0⟩?Ty(t)
     THEN Abort
     ELSE IF ⟨↑0⟩⟨↑∗1⟩⟨↓0⟩Fo(α)
          THEN put(Ty(e), Fo(α), [↓]⊥)
          ELSE Abort
ELSE Abort
See chapter 6 for a discussion of the anaphor zibunzisin in Japanese.
39 The terminology in this area is extremely confusing, with ‘topicalisation’ as one standard term
for long-distance dependencies that are characteristically used for the focus effect of isolating one
term as update to some remaining structure.
material. Such an effect can be seen in the clitic doubling structures evident in lan-
guages such as Modern Greek, as illustrated in (2.67). The phrasal expression ti
Maria we take to project internal structure (that of an iota expression, a position
we take up for all proper names from chapter 3), and hence as necessarily developing
the treenode decorated by the clitic tin.
(2.67) ti Maria tin ksero
the.ACC Maria her.ACC I.know
‘I know Maria.’
We go into such constructions in more detail later in the book, but essentially the
clitic tin in (2.67) projects a metavariable but does not decorate that node with
the bottom restriction. This allows the unfixed node decorated by ti Maria to
merge with that node, yielding a structure identical to that derived by parsing the
non-dislocated string ksero ti Maria.
The example in (2.67) also shows another common property of natural lan-
guages: pro-drop. This is the phenomenon in many languages of licensing the
occurrence of a verb without any independent expression providing the argument
for the predicate, presuming on its identification from context. The dissociation of
a metavariable from the bottom restriction illustrated by clitic doubling also
provides a straightforward way to analyse such languages.
As we have seen in this chapter, subject nodes in English are introduced by
the construction rules Introduction and Prediction while the lexical actions
associated with verbs are triggered by a predicate requirement, ?Ty(e → t). As
such, transitive verbs do not decorate their subject nodes and overt subjects are
thus necessary to guarantee the well-formedness of strings because, without one,
the pointer can never reach the triggering predicate node. However, certain lan-
guages are best treated as having the parsing of verbs triggered by a propositional
requirement, rather than a predicate node. So, for example, in a VSO language
like Modern Irish, we may analyse (non-copular) verbs as being triggered by a type
t requirement and providing a full propositional template with the pointer left at
the subject node for development next. Thus, the verb chonaic ‘saw’ in (2.68) may
be given the lexical entry in (2.69), which has the effect shown in (2.70). The
fact that the pointer is at the subject node allows the immediate parsing of the
first person pronoun mé in (2.68) and the pointer then travels down the tree using
anticipation (twice) to allow the internal object to be parsed.40
(2.68) Chonaic mé an cú
saw I the dog
‘I saw the dog.’
(2.69) chonaic
IF   ?Ty(t)
THEN put(Tns(PAST));                                   Tense
     make(⟨↓1⟩); go(⟨↓1⟩); put(?Ty(e → t));            Predicate Node
     make(⟨↓1⟩); go(⟨↓1⟩);
     put(Fo(Chon′), Ty(e → (e → t)), [↓]⊥);            Main Functor
     go(⟨↑1⟩); make(⟨↓0⟩); go(⟨↓0⟩); put(?Ty(e));      Internal Argument
     go(⟨↑0⟩⟨↑1⟩); make(⟨↓0⟩); go(⟨↓0⟩); put(?Ty(e))   Subject
ELSE Abort
40 Here and below, we represent the concepts expressed by content words in some language in
terms of primed versions of the stems of those words. So we expression the content of chonaic
in (2.69) as Chon′ , not in terms of their translations into English such as See′ . We do this in
order to sidestep the issue of whether lexicalised concepts can be realised fully in terms of simple
translation equivalents.
(2.70)
   ?Ty(t), Tns(PAST)
   |– ?Ty(e), ♦
   |– ?Ty(e → t)
      |– ?Ty(e)
      |– Ty(e → (e → t)), Fo(Chon′), [↓]⊥
There is, of course, much more to say about the analysis of Modern Irish, but this
brief sketch of the parsing of main verbs shows one of the ways in which word order
variation can be achieved within Dynamic Syntax: through the manipulation of
triggers (between propositional, main predicate and n-place predicate requirements)
and where the pointer remains after the lexical actions have occurred. It also
provides a means of accounting for subject pro drop languages such as Modern
Greek. While in Modern Irish the subject node is decorated by a type requirement
(at least for analytic forms such as chonaic), indicating the need for an overt subject,
in Modern Greek we may analyse verbs as decorating their subject nodes with
metavariables. Such metavariables, like those projected by pronouns in the way
we have seen, are associated with a formula requirement and so must be updated
with some contentful expression. This is achieved by lexical actions such as those
associated with the verb ksero ‘I know’ shown in (2.72). (2.73) shows the parse of
the sentence in (2.71) with substitution of the subject metavariable being made on
the assumption that Stavros utters the string.41 The resulting proposition is, as
expected:42 F o(Kser′ (Maria′ )(Stavros′ )).
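The contrast between the two verb types just drawn can be sketched minimally. The function name and the string decorations are our own conventions; the point is only that the Greek-style verb adds a substitutable metavariable to the subject node it projects, while the Irish-style verb leaves a bare requirement there.

```python
# Both verb types project a propositional template from ?Ty(t), but only the
# Greek-style verb decorates the subject node with a metavariable, so the
# overt subject can be dropped (pro-drop). A sketch only.
def subject_decorations(style):
    subject = {"?Ty(e)"}
    if style == "greek":
        subject |= {"Fo(U)", "?Ex.Fo(x)"}   # Substitution can complete this
    return subject

irish = subject_decorations("irish")   # bare ?Ty(e): an overt subject must follow
greek = subject_decorations("greek")   # context can supply e.g. Stavros'
```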
b. Parsing ksero:
   ?Ty(t)
   |– Fo(U), Ty(e), ?∃x.Fo(x)
   |– ?Ty(e → t)
      |– ?Ty(e)
      |– Ty(e → (e → t)), Fo(Kser′), [↓]⊥
c. Substitution:
   ?Ty(t)
   |– Fo(Stavros′), Ty(e)
   |– ?Ty(e → t)
      |– ?Ty(e), ♦
      |– Ty(e → (e → t)), Fo(Kser′), [↓]⊥
d. Parsing ti Maria:
   ?Ty(t)
   |– Fo(Stavros′), Ty(e)
   |– ?Ty(e → t)
      |– Fo(Maria′), Ty(e), ♦
      |– Ty(e → (e → t)), Fo(Kser′), [↓]⊥
internal structure of proper names which in chapter 3 will be rectified when they are treated as
terms which have internal structure, as a form of quantifying expression.
The important thing to notice about this description is that lines b and c, giv-
ing the descriptions of the initial and introduced subject nodes, describe the very
same node, the argument node immediately dominated by the top node. The full
description of that node should thus be that in (2.77a) with the two descriptions
unified, which itself reduces to (2.77b) once fulfilled requirements and uninformative
information (i.e. the metavariable) are removed.
(2.77) a. {⟨↑0⟩Tn(0), Ty(e), Fo(Stavros′), [↓]⊥, Fo(U), ?∃x.Fo(x), ♦}
       b. {⟨↑0⟩Tn(0), Ty(e), Fo(Stavros′), [↓]⊥, ♦}
The tree in (2.78b) is thus equivalent to that in (2.78a). This example thus shows
the usefulness of the tree description language which highlights directly the number
and type of nodes in a tree which may be obscured by apparently contradictory
sets of computational and lexical actions.
(2.78) Parsing o Stavros kseri with fixed initial term
a. Introduction and Prediction for subject and predicate:
   ?Ty(t)
   |– Ty(e), Fo(Stavros′), [↓]⊥, ♦
   |– ?Ty(e → t)
b. Parsing kseri
   ?Ty(t)
   |– Ty(e), Fo(Stavros′), [↓]⊥, Fo(U), ?∃x.Fo(x)
   |– ?Ty(e → t)
      |– ?Ty(e)
      |– Ty(e → (e → t)), Fo(Kser′), [↓]⊥
c. The result:
   ?Ty(t)
   |– Ty(e), Fo(Stavros′), [↓]⊥, ♦
   |– ?Ty(e → t)
      |– ?Ty(e)
      |– Ty(e → (e → t)), Fo(Kser′), [↓]⊥
an output is achieved and, as we shall see later in this book, different parsing
strategies (possibly signalled by variations in prosodic information) may give rise
to different pragmatic effects that may be exploited to convey non-truth conditional
information.
Before turning to a more detailed discussion of well-formedness and ungram-
maticality it is useful to note, in concluding this section, that the basis for our
account of linguistic variation is in differences in lexical actions associated with
classes of words. So, in Greek, unlike Irish and English, the parse of a verb induces
a structure in which the type-assignment of the subject node is already satisfied
by the projection of a metavariable, thus giving rise to pro-drop effects. In both
Greek and Irish, on the other hand, unlike English, the parse of a verb projects
a full propositional template, giving rise to VSO orders that are excluded in En-
glish.44 In the chapters that follow, we will use different sets of lexical actions for
verbs and pronouns along lines suggested in this section in accounting for different
grammatical properties, in particular in accounting for the interaction of dislocated
or scrambled expressions with anaphora.
[Tree displays: the target construal of John, Mary upset, with Fo(John′) as the object and Fo(Mary′) as the subject argument of Fo(Upset′), and the illicit construal with the two roles reversed.]
Given what the framework provides, why is it not possible for John to be construed
as subject and Mary as object in parsing John, Mary upset ? Several possible
derivations might spring to mind. For example, Mary might be taken to decorate
an object node which is introduced first on developing the predicate-requiring node
or, alternatively, to be taken to decorate an unfixed node introduced after the
subject node is established and decorated, which is then merged with the object
node. Why are these not licit alternatives to the parse that achieves the correct
result?
The answer is simple enough, though a bit laborious to establish. As so far
set out, there are two choices at the outset of the construction process. Either
*Adjunction applies and the word John is taken to decorate an unfixed node
with F o(John′ ); or Introduction and Prediction are applied and John is taken
to decorate the subject node. The former strategy gives us what we want, so we
can leave that on one side as unproblematic. Suppose to the contrary that John,
as the first word in the string, is taken to decorate the subject node as in (2.82).
46 This is, of course, not a problem specific to Dynamic Syntax, but is actually true of all major
theories of syntax.
2.6. WELL-FORMEDNESS AND UNGRAMMATICALITY 71
At this point, we immediately face a difficulty in parsing the second noun phrase
Mary, for once the subject node is parsed, the pointer goes back to the top node by Completion, and moves onto the predicate-requiring node via Anticipation.
*Adjunction cannot now apply: it is defined to apply only when the pointer
is at the node decorated with ?T y(t) (i.e. not from the predicate-requiring node).
*Adjunction also cannot apply more than once from any one node: this is ensured
by the requirement that it apply only when there are no other nodes in the (sub)tree
of which it is the top. This is the significance of the double brackets ({{) in the
input tree description condition in the definition of *Adjunction in (2.51):
As soon as either *Adjunction or Prediction has applied once, this condition will no longer hold. With *Adjunction
inapplicable, given the assumption that John has been taken to decorate a sub-
ject node, then the only option available is to develop the predicate goal.47 But
if Prediction indeed non-trivially applies yielding a node to be decorated with the
predicate requirement, ?T y(e → t), then the condition for carrying out the actions
of the name Mary cannot be applied – its trigger of ?T y(e) has not been provided
and no parse can proceed.
The actions of the verb could of course be carried out if the verb were next in the sequence, since this requirement is the appropriate trigger for its sequence of update actions. However, it is Mary that lies next in the sequence
of words. But with Introduction and Prediction only applying in English
at the level of expanding a propositional node, i.e. with ?T y(t) as the condition
on their application, the trigger for the actions provided by the word Mary is
not provided and the sequence aborts. Hence, the impossibility of building an
interpretation ‘John upset Mary’ from the sequence John, Mary upset. The only
way for a successful derivation to take place, once the decoration on the subject
node has been established, is for the update action of the verb to be carried out
first. And to enable this to take place, the verb has to be positioned before the
noun phrase to be construed as object.
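The trigger-based reasoning above can be rendered as a toy simulation: a lexical action fires only when its trigger requirement decorates the node currently under the pointer, and otherwise the parse aborts. This is a minimal sketch in Python; the class, function, and attribute names are our own illustrative inventions, not part of the Dynamic Syntax formalism.

```python
class Node:
    """A tree node carrying an outstanding requirement and, once parsed, a formula."""
    def __init__(self, requirement):
        self.requirement = requirement   # e.g. "?Ty(e)" or "?Ty(e->t)"
        self.formula = None

def run_action(node, trigger, formula):
    """IF the pointed node carries the trigger requirement, THEN decorate it;
    ELSE abort the parse."""
    if node.requirement != trigger:
        raise RuntimeError(f"Abort: trigger {trigger} not provided")
    node.formula = formula
    node.requirement = None

subject = Node("?Ty(e)")
predicate = Node("?Ty(e->t)")

# 'John' decorates the subject node: its trigger ?Ty(e) is satisfied there.
run_action(subject, "?Ty(e)", "John'")

# The pointer now sits at the predicate-requiring node (?Ty(e->t)).
# 'Mary' also needs the trigger ?Ty(e), so taking John as fixed subject
# in "John, Mary upset" aborts at this point:
try:
    run_action(predicate, "?Ty(e)", "Mary'")
except RuntimeError as err:
    print(err)   # the parse sequence aborts

# The verb 'upset', whose trigger is ?Ty(e->t), can apply instead,
# which is why the verb must precede the object noun phrase:
run_action(predicate, "?Ty(e->t)", "Upset'")
```

The one-line moral matches the text: once the subject is fixed, only an expression triggered by ?Ty(e → t), i.e. the verb, can come next.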
This explanation turns on what may seem to be an ad hoc stipulation that the
only form of Introduction and Prediction that is available in English is the
one that leads to the formation of subject and predicate nodes. If, instead of this
stipulation, Introduction and Prediction are taken to apply in the general,
type-neutral, forms given in the rules in (2.14) and (2.16), then they could apply
in principle at the predicate-requiring node to yield the (schematic) tree in (2.83)
with the pointer at the object node.48
47 In principle, another daughter node could be introduced, but any such move would be trivial,
as any such node being already introduced would immediately satisfy the requirement ?T y(e)
removing the requirement. By a very general principle of tree growth, such trivial updates are
debarred.
48 Recall that Prediction puts the pointer at the open argument node.
72 CHAPTER 2. THE DYNAMICS OF INTERPRETATION
inferences over these. We leave this aspect of context entirely to one side.
74 CHAPTER 3. RELATIVE CLAUSE CONSTRUAL
The context also, of course, includes the current tree under construction, allowing
substitution of some term, already constructed, for a metavariable projected later in
the parse process, as in (3.2), without any recourse to other trees in the context.
(3.2) Mary thought she was overworked.
In order to characterise relative clauses,2 we extend our notion of context to
allow one partial tree to be built which provides the context in which some other
structure is then built. The process of completing the first tree then takes place
in the context of having constructed the intermediate tree as a side-routine. In
this way, we allow the processor to build structures in tandem, which provide
contexts of interpretation for each other. This is what we propose happens in
relative clause construal, which we take as the prototypical case of such joint
structure-building. In this way, our analysis treats a major subpart of natural-
language recursion as the progressive projection of individually simple structures
which share a context. The methodology, as before, is to closely follow the dynamics
of time-linear processing, showing how complex interpretations can be established
through the use of underspecification interacting with tree growth processes.3
with some additions, in particular the discussion of non-restrictive relative clause construal.
4 We take up the analysis of restrictive relatives once we have introduced quantification: see
section 3.2.
3.1. LINKED STRUCTURES 75
• and then a subsequent completion of the main clause through a parse of the
verb smokes.
This is exactly the process we use to account for relative clause construal in
general. We take the process of projecting a second propositional structure as
involving the notion of a “link” between one completed node of type e in a partial
tree and the top-node of another tree decorated by a propositional requirement
(?T y(t)). The word who, which we treat as a relative pronoun, following Jespersen
(1927) and many another traditional grammarian, provides a copy of the formula
decorating the node from which the link is projected (the “head”). The parse of the
remaining string then proceeds in the way that we have seen for other left dislocation
structures in chapter 2, to give an output formula value F o(Like′ (John′ )(Sue′ ))
just as in a parse of the string John, Sue likes. The initial tree is then completed
through a parse of the verb to yield a propositional formula F o(Smoke′ (John′ )),
with the whole structure then being interpreted in terms of conjunction of the two
propositions decorating the two trees. We show the output structure schematically
in the figure in (3.5) where the “link” relation is shown by the thick black arrow.
(3.5) The representation of (3.3):
[tree display: matrix tree with root Ty(t), Fo(Smoke′(John′)), LINKed (shown by the thick black arrow) to a tree with root Ty(t), Fo(Like′(John′)(Sue′))]
L is a valid argument of the treenode predicate Tn in addition to 0 and 1, the address of a node which is inversely “LINKed” to a head with address Tn(a) being given as Tn(aL).
{. . . {Tn(a), Fo(α), Ty(e), ♦} . . . }   (the head)
{. . . {Tn(a), Fo(α), Ty(e)} . . . }, {⟨L−1⟩Tn(a), ?Ty(t), ?⟨↓∗⟩Fo(α), ♦}   (the head; the LINKed node, carrying the formula requirement)
Firstly, notice that, in requiring a copy of the formula of the head to be found
within the LINK structure, this rule encapsulates the idea that the latter tree is
constructed in the context provided by the first partial tree. The rule thus cannot
operate on a type-incomplete node and ensures that both structures share a term.
Secondly, the requirement to find the copy of the shared term (F o(John′ ) in this
case) is modal in form, indicating that the copy should be found somewhere in the
unfolding propositional tree but without specifying where in that tree it should occur. This, recall, is the means by which left-dislocation structures are characterised
as having an initially unfixed node. Expressed here as a requirement, it embodies
the restriction that somewhere in the ensuing construction of the tree from this
node, there must be a copy of the formula decorating the head. This restriction
is notably transparent, stated in this modal logic vocabulary. The structure thus
develops for English with an application of *Adjunction to provide an unfixed
node in anticipation of securing that copy, giving rise to the tree in (3.8).
(3.8) [tree display: main tree Tn(0), ?Ty(t), with subject Ty(e), Fo(John′) and predicate node ⟨↑0⟩Tn(0), ?Ty(e → t); the LINKed tree carries the requirement ?⟨↓∗⟩Fo(John′) and a newly introduced unfixed node]
This requirement need not be satisfied immediately: all that ?⟨↓∗⟩Fo(John′) requires is its satisfaction at some point before the LINKed tree is completed.
When we turn to other languages, this will be significant, as this combination of
requirement and modal statement provides a natural means of expressing a range
of non-local discontinuities between either the head or some clause-initial complementiser and a pronoun within that clause functioning resumptively. At no point
does there need to be adjacency in the structure between the term providing the
requisite formula as head and the term realising the requirement for a copy of it:
the modality in the formula specification captures all that is needed to express
long-distance dependencies of this form. In English, as it happens, the requirement
imposed by the construction of a LINK transition is met immediately, a restriction,
as noted, which is induced by the lexical actions of the relativisers who, which, and
that. These are defined as rule-governed anaphoric devices that decorate an unfixed
node within a LINKed structure, as shown by the lexical actions associated with
who given in (3.9).
(3.9) who_rel:
      IF    ?Ty(e), ?∃x.Tn(x), ⟨↑∗⟩⟨L−1⟩Fo(x)
      THEN  put(Fo(x), Ty(e), [↓]⊥)
      ELSE  Abort
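The IF/THEN/ELSE format of such lexical entries can be read procedurally. The following is a minimal sketch in Python; the dictionary encoding and the helper name are our own illustrative choices, not part of the formalism.

```python
def who_rel(node, linked_head_formula):
    """A sketch of the relativiser's conditional action: IF the pointed node is
    an unfixed, type-e-requiring node inside a LINKed structure whose head is
    decorated with Fo(x), THEN copy that formula onto the node (adding a bottom
    restriction); ELSE abort."""
    if (node.get("requirement") == "?Ty(e)"
            and node.get("unfixed")
            and linked_head_formula is not None):
        node.update({
            "Fo": linked_head_formula,   # copy of the head's formula
            "Ty": "e",
            "bottom_restriction": True,  # the [↓]⊥ decoration
            "requirement": None,         # ?Ty(e) is now satisfied
        })
        return node
    raise RuntimeError("Abort")

# Parsing 'who' at the unfixed node of a structure LINKed to a head
# decorated Fo(John'): the relativiser copies the head formula.
node = {"requirement": "?Ty(e)", "unfixed": True}
print(who_rel(node, "John'")["Fo"])
```

Unlike a personal pronoun, whose metavariable is resolved pragmatically, the value here is fully determined by the head of the LINK relation, which is what the `linked_head_formula` argument is standing in for.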
The context for parsing the word is complex in order to enforce its strict distribu-
tion. What the conditional action says is that if from an unfixed node, there is the
LINK relation defined to some node decorated by some formula F o(x), then a copy
of that formula should be provided for the current node. This action is notably like
that of a pronoun, even to the decoration of the node with a bottom restriction,
except that the value is fully determined as being that decorating the head from
which the LINK transition is defined. The effect of these actions in parsing John,
who is shown in (3.10), the copy requirement now being obviously satisfied.
(3.10) [tree display: main tree Tn(0), ?Ty(t), with subject Ty(e), Fo(John′) and predicate node ⟨↑0⟩Tn(0), ?Ty(e → t); LINKed tree with unfixed node decorated Ty(e), Fo(John′) and, as the parse continues, functor Ty(e → (e → t)), Fo(Like′) with object node ?Ty(e), ♦]
Once the merge step has taken place, the linked structure can be completed
by steps of Completion, Thinning, and Evaluation, all as before, until the top
of the LINKed tree is reached. At this point, however, we can get no further as
there is currently no rule to move the pointer back to the head node to allow a
completion of the parse. However, we may revise the rule of completion from the
last chapter as in (3.12) to move the pointer from a type-complete node across any
daughter or inverse LINK relation.
(3.12) Completion (Revised):
{. . . {Tn(n) . . . }, {⟨µ−1⟩Tn(n), . . . , Ty(X), . . . , ♦} . . . }
{. . . {Tn(n), . . . , ⟨µ⟩Ty(X), . . . , ♦}, {⟨µ−1⟩Tn(n), . . . , Ty(X), . . . } . . . }
where µ−1 ∈ {↑0, ↑1, ↑∗, L−1} and µ ∈ {↓0, ↓1, ↓∗, L}
With this small revision in place, the matrix proposition can be completed, essentially to yield the tree in (3.5).
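The pointer movement licensed by the revised rule can be sketched as follows. The tree encoding (a dictionary keyed by node name, with mother links as (relation, address) pairs) is our own illustration, not the book's formalism.

```python
# Revised Completion: once a node's type requirement is satisfied, the
# pointer may move up across a daughter relation or an inverse LINK relation.
MOVABLE = {"down0", "down1", "down*", "LINK"}   # mu in {↓0, ↓1, ↓*, L}

def completion(tree, pointer):
    """Move the pointer from a type-complete node to its mother, or to the
    node from which the LINK relation was projected; otherwise stay put."""
    node = tree[pointer]
    if node.get("Ty") is None:          # type requirement still outstanding
        return pointer
    relation, mother = node["mother"]   # (mu, address of mother/head node)
    if relation in MOVABLE:
        return mother
    return pointer

tree = {
    "root":   {"Ty": None, "mother": (None, None)},        # still ?Ty(t)
    "linked": {"Ty": "t", "mother": ("LINK", "root")},     # completed LINKed tree
}
# From the top of the completed LINKed tree, the pointer crosses the
# inverse LINK relation back to the head's tree:
print(completion(tree, "linked"))
```

The design point mirrors the text: without including `"LINK"` in the set of movable relations, the pointer would be stranded at the top of the LINKed tree and the matrix proposition could never be completed.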
There is, however, one further step to take and that is to provide a means of
interpreting such a tree. As noted above, non-restrictive relative clauses give rise
semantically to a conjoined propositional structure. We therefore introduce the first
in a series of what we call LINK evaluation rules, as given in (3.13). This rather
complex looking rule takes a completed propositional tree with treenode address
T n(a) that has embedded in it somewhere a propositional structure LINKed to
some node dominated by T n(a) and returns a conjunction of the two formula values
decorating the rootnodes of those propositional trees. We end up, therefore, with
the tree in (3.14).
(3.13) LINK Evaluation 1 (Non-restrictive construal):
[rule display and resulting tree (3.14): the LINKed tree has subject Ty(e), Fo(Sue′) and predicate Ty(e → t), Fo(Like′(John′)); the matrix rootnode is decorated with the conjunction of the two propositional formulae]
Notice what this account of relative clauses involves and how this differs from
most other accounts. In the first place, the whole account rests on two concepts of
underspecification, of position within a tree and of content, both of which are used
to characterise other phenomena. Secondly, and most importantly, the story of
relative clause construal is fully anaphoric, involving no concepts of quantification
or binding. Relative pronouns are analysed in the same way as we have analysed
personal pronouns: as expressions that have underspecified content. The difference between the two types of pronoun is that personal pronouns may get their content through a pragmatic process of substitution (fairly) freely from the context, while relative pronouns directly copy some term specifically identified in the
unfolding structure, i.e. that provided by the head. Because of this, and because
of the properties of merge which merely unifies the information associated with
two constructed treenodes, there is no record in the output that the string that
gives rise to the structure contains a relative pronoun. Indeed, as we shall see in later
chapters, the structure given in (3.14) is exactly the same as the output that would
result in a successful parse of (3.15), where the LINK structure is projected by the
conjunction and and he is construed as John.
(3.15) John, and Sue likes him, smokes.
We will be exploring the consequences of this account of relative clauses through-
out much of the rest of this book, but before we do so, it is worth noticing a further
property of the analysis, associated with the concept of LINK. This has to do with
our account of left dislocation. The modality introduced for unfixed nodes by *Adjunction is, as we have seen, ⟨↑∗⟩ (and its inverse ⟨↓∗⟩). This modality ranges over the closure of ‘dominated by’, ↑ (or ‘dominate’, ↓), relations. It does not range over LINK relations ⟨L−1⟩ or ⟨L⟩. This means that no unfixed node of this sort can be
merged with a position that is contained within some structure LINKed to the tree
into which it is introduced, as schematically shown in (3.16).
(3.16) Illicit merge:
[tree display: an unfixed node introduced under ∗Tn(a), ?Ty(t) cannot merge with any position inside a structure LINKed to that tree]
This provides us with a straightforward account of strong islands and precludes the
derivation of strings like those in (3.17), which exhibit violations of Ross (1967)’s
Complex NP Constraint.
(3.17) a. *Who did John, who likes, smoke?
b. *Sue, John, who likes, smokes.
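Using the book's Tn(a)/Tn(aL) addressing convention (under which a LINK step adds an L to a node's address), the island restriction can be sketched as a simple reachability check; the encoding below is our own illustration.

```python
def dominance_reachable(unfixed_root, candidate):
    """True iff candidate is reachable from unfixed_root by ↓-steps alone,
    i.e. its address extends the root's address without any 'L' step.
    The ⟨↑*⟩/⟨↓*⟩ modality is the closure of dominance only, so a path
    that crosses a LINK relation is not in its range and merge fails."""
    return (candidate.startswith(unfixed_root)
            and "L" not in candidate[len(unfixed_root):])

# Plain dominance: merge of the unfixed node is in principle possible.
print(dominance_reachable("0", "011"))
# The candidate position sits inside a LINKed structure (address 0L1):
# outside the modality's range, so merge is illicit -- a strong island.
print(dominance_reachable("0", "0L1"))
```

This is just the schematic content of (3.16): the Kleene-star modality never "sees" across a LINK transition, which is what precludes the Complex NP Constraint violations in (3.17).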
3.1.4 Crossover
In (3.18) the word he cannot be construed as picking out the same individual as
the expression John, while, in (3.19), it can:8
(3.18) John, who Sue is certain he said would be at home, is in the surgery.
(3.19) John, who Sue is certain said he would be at home, is in the surgery.
When first observed by Postal (1970), this phenomenon was called crossover to reflect the fact that, when word order variation is analysed in terms of moving expressions from one position to another, the wh expression is moved, “crossing over” a pronoun, an effect which debars interpretation of wh expression, pronoun and gap as co-referring.
In more recent work such as Postal (1993), this restriction is still classified as a
mystery, and remains a major challenge to any linguistic theory.
The problem is that the data have to be seen, as in the original Postal analysis,
as splitting into at least two different phenomena, called strong and weak crossover,
and each is further subdivided into two subcategories: ‘extended strong’ and ‘weakest’ crossover. In Lasnik and Stowell’s (1991) account, a discrete concept
of gap called a ‘null epithet’ is posited in addition to the array of empty categories
already posited at the time. There remains no satisfactory integrated explanation
within movement frameworks. Indeed it has become standard to treat these phenomena entirely separately, all requiring very different analyses, as though such a
proliferation of discrete categories is not problematic (see e.g. Boeckx (2003), de
Cat (2002), for recent discussion). There have been a number of relatively recent
analyses in non-movement frameworks attempting to reduce the plethora of facts
to a single phenomenon, but it remains very generally intransigent in any account
which fails to reflect the linearity intrinsic to the distribution.9
The extent of the puzzle can be traced directly to the static methodology of the
jigsaw view of language. The problem arises in that type of framework, because the
observed patterns are analysed exclusively in terms of the hierarchical relationship
between a pronoun and a posited empty position, ‘the gap’. All reference to the
dynamics of left-right processing is, by standard assumptions, debarred in syntactic
explanations of linguistic data (see chapter 1). According to this standard perspec-
tive, if pronoun and gap are both arguments and the pronoun c-commands the gap
(as (3.18) would standardly be analysed), an interpretation in which the pronoun
and wh-expression are taken to denote the same entity is not possible given binding
principle C. This is because that principle debars an empty category from being
interpreted as referring to the same entity as any c-commanding argument expres-
sion whatever, the gap supposedly having name-like properties which require it to
be free.10
8 We cite these here without any indexing indicating the licensed interpretation, contrary to nor-
mal methodology, as these sentences are fully well-formed. It is simply that some interpretations
are precluded.
9 See Hepple (1990), which requires an account of all cross-sentential anaphora as entirely
distinct; and Safir (1996), which the author himself notes fails to extend appropriately cross-
linguistically. Dalrymple et al. (2001), Kempson and Gabbay (1998), Kempson and Meyer-Viol
(2002), Shan and Barker (forthcoming), by contrast all invoke concepts of linearity as an integral
part of the explanation.
10 Incidentally, this observation of unique binding of the gap by the wh expression, which is
construed as an operator, guarantees that, despite the adoption of predicate-logic forms of binding
as the working metaphor for LF representations, nevertheless the “binding” of the gap by the
associated wh has to be distinct from a quantifier-variable binding operation because quantifiers
can bind an arbitrary number of variables, hence the specially defined concept of a “syntactic
operator” (Chomsky 1981, Koopman and Sportiche 1982).
This principle-C style of analysis will not apply if the pronoun precedes but does
not c-command the gap, because the filter will not apply. Crossover environments in
which the pronoun is contained in a complex determiner are accordingly analysed as
a distinct phenomenon “weakest crossover”, since under some circumstances these
are able to be construed as co-referring as in (3.20).11
(3.20) John_i, who_i Sue said his_i mother worries about e_i, has stopped working.
The surprise for this form of analysis, in the standard terminology, is that if the
pronoun c-commands the gap but the binder is complex, then judgements of grammaticality do not coincide with strong crossover cases as might be expected. Instead, they pattern with those observed of weak crossover in allowing co-referring
interpretations, despite their classification with the much stronger ‘strong crossover’
restriction, the so-called ‘extended strong crossover’ restriction (3.21):12
(3.21) John_i, whose_i mother_j he_i worries about e_j far too much, has stopped working.
The parallelism in the acceptability judgements of examples such as (3.20) and
(3.21) is entirely unexplained. It is problems such as these that led Lasnik and
Stowell to posit the “null epithet” as a further form of empty category needed for
what is itemised as ei in (3.20). This behaves, not like a name as the principle C
account would expect, but like a pronoun, hence subject to principle B. But positing
an additional type of empty category just adds to the puzzle (see Safir 1996, 1999).
Whatever the merits of the analysis in providing new theoretical constructs that
are compatible with the data, it is clear that this does not constitute anything like
an explanation of how anaphoric processes and long-distance dependency interact;
and indeed the phenomenon has never been addressed by syntacticians at this level
of generality.13
With a shift into a methodology which pays close attention to the way in which
interpretation is progressively built up within a given structural context, the account is, by comparison, strikingly simple. Using steps introduced in chapter 2 and
section 1 of this chapter, the emergent structure of (3.18), which we now take in
detail, can be seen as built up following a combination of computational and lexical
actions. The expression John is taken to decorate a fixed subject node, and, from
that, a LINKed structure is projected by LINK Adjunction which imposes the
requirement to find a copy of Fo(John′) in this structure, as we have seen. An unfixed node is constructed within that newly introduced structure via *Adjunction
which is duly decorated with that formula by the lexical actions associated with
parsing the relative pronoun. Subsequent parse steps then lead in succession to the
setting up of a subject node, and its decoration with F o(Sue′ ); the introduction of
a predicate-requiring node (decorated with ?Ty(e → t)); the construction of a functor node (decorated with Fo(Certain′)) and its attendant second-argument node
11 In this and the following examples, we revert to the standard methodology of co-indexing to
indicate the relevant possible interpretation. In particular, we use the trace notation without any
commitment to such entities to indicate the position in the interpretation process at which the
unfixed node associated with the parsing of who will be resolved.
12 The debate is often discussed in connection with wh-question data, but here we restrict our
attention to crossover phenomena in relatives. See Kempson et al. (2001) for a fuller discussion
which includes wh-questions.
13 Boeckx (2003) and Asudeh (2004) are recent advocates of an integrated account of what they
both identify as resumptive pronouns, in the Asudeh analysis purporting to nest this as a mere
sub-case of pronominal anaphora; but, as things turn out, many of the relevant phenomena, in
particular the use of resumptive pronouns in English, they both set aside as involving a distinct
form of pronoun, “an intrusive pronoun”, thereby denying the possibility of the complete account
that is claimed.
(decorated with ?T y(t)). This leads to the construction of the embedded subject
node, with the lexical actions of the pronoun providing a metavariable F o(U) of
type e as decoration. The structure at this point in the parse is shown in the tree
display in (3.22).
(3.22) Parsing John, who Sue is certain he:
[tree display: matrix tree Tn(0), ?Ty(t), with subject ⟨↑0⟩Tn(0), Ty(e), Fo(John′) and predicate node ?Ty(e → t); LINKed tree containing the unfixed node decorated Ty(e), Fo(John′), a subject Ty(e), Fo(Sue′), a predicate node ?Ty(e → t), and a functor Ty(t → (e → t)), Fo(Certain′) over a propositional node ?Ty(t), whose subject node carries the pronoun's metavariable Fo(U), ♦]
Given that all Kleene star relations include the empty relation, this description applies to any formula decoration that is at the node itself (i.e. ⟨↑0⟩⟨↓0⟩). So
substituting the formula F o(U) by F o(John′ ) as value is precluded, even though
the node from which some other copy of that formula has been established might
be in a suitably non-local relation (such as the matrix subject node in (3.22)).
Hence, merge is the only means of securing F o(John′ ) as the selected update
to the metavariable projected by the pronoun in the embedded subject position.
Yet this also turns out to be a precluded choice of strategy. This is because, if
merge is adopted at the point of having processed the pronoun, things go wrong
thereafter, because the unfixed node, being now fixed, is no longer available for
updating some later partial structure. In parsing John, who Sue is certain he
said would be at home, is in the surgery, the crunch point comes after parsing
the verb said. The word in the string that follows this is another verb requiring
(in English) a predicate goal to trigger its lexical actions. However, what follows
said is a propositional structure in which Introduction and Prediction apply to
derive subject and predicate goals, but with the pointer on the subject node. This
is decorated with the requirement ?T y(e). If no merge of the unfixed node has
previously occurred, then it can provide the necessary update, satisfying the type
requirement and the parse can continue. However, if the unfixed node has been
fixed earlier, there is no possible update to satisfy the type requirement and the
parse will abort.
Since substitution is precluded and merge of the unfixed node with the
embedded subject node in (3.22) leads inevitably to a failed parse, he in (3.18)
cannot be construed as being co-referential with John. The strong crossover effect
is thereby accounted for straightforwardly in terms of the interaction of
updating different types of underspecification: tree position and content.
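The argument just given can be rendered as a toy computation: the unfixed node is a one-shot resource, and consuming it early (by merging it with the pronoun's node, i.e. construing he as John) leaves nothing to satisfy the later subject requirement. The resource bookkeeping below is our own schematic illustration, not the authors' formalism.

```python
def parse_relative(merge_unfixed_at_pronoun):
    """Sketch of the strong-crossover derivation for
    'John, who Sue is certain he said would be at home, is in the surgery'."""
    unfixed_available = True
    if merge_unfixed_at_pronoun:
        # 'he' construed as John: merge fixes the unfixed node here,
        # so it is no longer available for any later update.
        unfixed_available = False
    # Later step: the subject node of 'would be at home' is introduced
    # with the requirement ?Ty(e) and needs an update.
    if not unfixed_available:
        return "abort"      # no possible update; the parse aborts
    return "success"        # the unfixed node supplies the subject

print(parse_relative(True))    # the crossover reading is excluded
print(parse_relative(False))   # 'he' picks out someone other than John
```

The two branches correspond exactly to the two construals of (3.18): only the reading on which the pronoun is not identified with the head survives to a completed parse.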
Notice how the difficulty here arises solely because the pronoun is construed as
picking out the doctor in question. If, however, he had been interpreted as ‘Tom’,
or ‘Dick’, or ‘Harry’, etc., indeed anything other than the name picking out the
indicated individual John, then there would be no problem at any juncture in the
parsing process. The unfixed node with a term picking out John would still be
described as unfixed at the point at which the embedded predicate is reached, and
so at this point, this unfixed node could be updated appropriately. Hence it is only
interpretations of the pronoun as anyone other than John that lead to a well-formed
result in (3.18).
In (3.19), no such problem arises. This is because the process which merges
the unfixed node with the first embedded subject node occurs before the pronoun
is processed. So when the interpretation process reaches that point, there is no
problem interpreting the pronoun. It can be interpreted as John because there is
no local node now decorated with F o(John′ ), because the information associated
with the unfixed node, now fixed, is no longer being passed down the tree. The
tree display in (3.23) illustrates this process schematically up to the point where
substitution applies to the metavariable (shown by the double uparrow, ⇑, as
regularly from now on).14
(3.23) Parsing John, who Sue is certain said he:
[tree display: as in (3.22), the matrix tree Tn(0), ?Ty(t) has subject ⟨↑0⟩Tn(0), Ty(e), Fo(John′); the LINKed tree contains subject Ty(e), Fo(Sue′) and functor Ty(t → (e → t)), Fo(Certain′) over a propositional node ?Ty(t); in the most embedded clause, beside the predicate requirement ?Ty(e → t), the pronoun's node carries Ty(e), Fo(U), ♦, with ⇑ Fo(John′) marking the substitution]
The answer to the question as to why the pronoun can be identified with the
wh relativiser in both (3.20) and (3.21) now reduces to the question of whether the
14 Notice that the figure is purely schematic, and is intended to show the main steps in the
derivation, up to the point at which the pronoun is parsed. It does not show strictly just the
structure at the point at which substitution takes place, since by this time the unfixed node
would be fixed properly in the relevant position.
node decorated by the pronoun is relevant to identifying the fixed tree position for
the unfixed node. The simple answer is that it is not, in either case. In (3.24), if the
pronoun, he, is identified as having a formula value identical to the wh expression,
whose, by virtue of being interpreted as a copy of the head, this has no consequences
for identifying the tree position of the unfixed node. This follows because the wh
expression projects merely a subterm for the formula decorating the unfixed node.
That is, construal of he as ‘John’ cannot be the result of establishing the tree
position for the node annotated by a formula constructed from whose mother.
(3.24) John_i, whose_i mother_j Sue says he_i worries about e_j far too much, is at the hospital.
Not only will consistency of node annotations rule out any application of merge;
so too will the terminal node restriction imposed by the pronoun. To similar effect,
in the case of weak crossover, no node internal to a determiner can provide an
update for the unfixed node projected by the relative pronoun who. In this case,
with the wh expression not being complex, the account turns on the analysis of
genitive pronouns such as his. These might be analysed as defining a complex
operator introducing a metavariable as part of its restrictor without assigning it
any independent node, in which case the terminal node restriction imposed by
the wh pronoun will fail to be met. Alternatively, his might be analysed along
with more complex genitive constructions as projecting a distinct linked structure
whose internal arguments cannot provide an update for a node unfixed within the
structure initiated by the relativising particle.15 Either way, as with extended
strong crossover effects, there will be no interaction between the construal of the
wh expression and the pronominal, and so an interpretation of the pronoun as
picking out the same individual as is picked out by the head is available:16
(3.25) John_i, who_i Sue said his_i mother was very worried about e_i, has stopped working.
The short version of this rather long story about crossover is that we get
crossover effects whenever there is interference between the updating of an unfixed node and the updating of the underspecified term provided by a pronoun. Whenever no such interference is possible, for whatever reason, the interpretation of
the pronoun is free. We thus have the beginnings of an account of crossover data
that spans strong, weak and extended strong crossover data without any grammar-
internal stipulations particular to these sub-cases: the different forms of interpre-
tation available emerge solely from the interaction of the process of building up
structure for the construal of the relative and the process of establishing a value
for a pronoun.
This kind of feeding relation between processes of construing strings displaying
long-distance dependency and the construal of a pronoun is extremely surprising
if the problem of pronoun construal is taken as different in kind from the problem
posed by long-distance dependency (the “imperfection” problem – see chapter 1).
15 See Kempson et al. (2001: 144-148).
16 The analysis of genitives as ‘implicit relatives’ is well attested in the literature (Baker 1996,
Kayne 1994, etc). Notice that this form of analysis provides a basis for characterising their island-
like status: dependencies are in general not available between a pair of terms, one external to some
structure projected from a genitive marker and one within that structure. An alternative is to
analyse genitives as locally unfixed with respect to the node introduced from a ?T y(e) decoration.
This characterisation of a genitive as decorating an unfixed node will prevent application of some
unfixed node as putative argument of a genitive imposed structure, as the appropriate tree update
will not be available (see chapter 6 for discussion of Local*Adjunction in connection with
Japanese).
3.2. QUANTIFIER PHRASES AND RESTRICTIVE RELATIVE CLAUSES 87
Indeed, this feeding relation can only be modelled directly if we assume that the
interpretation of both involves updating tree structures on a left to right basis
with decorations leading to constructed logical forms. So the explanation turns on
treating pronoun construal and long-distance dependency as two sides of the same
coin: both project aspects of underspecification in a structure building process.
It also provides our first case where explanation of a syntactic distribution has
invoked intermediate steps in the interpretation process. The distribution cannot be
defined over the input to the relative construal process (trivially, since the input is
on this assumption merely a single top-node requirement); but nor can it be defined
over the output structure. What is critical is the dynamics of how some interpre-
tation may be licensed only if a certain kind of choice is made at an intermediate
step.
The significance of this account is that there has not been any need to define
strong, weak, weakest, or extended strong crossover, as discrete sets of data, to
be analysed in terms of different structures requiring quite different analyses.17
This is quite unlike most analyses, in which weak crossover is uniformly seen as a
separate phenomenon from strong crossover, not even requiring a related basis of
explanation; and the generality of the underlying explanation as a consequence of
the feeding relation between two simple and independently motivated processes is
missed altogether.18
(3.26) [Tree display: a quantified noun phrase as a type e term, with a QUANTIFIER node of Ty(cn → e) and a restrictor node of Ty(cn)]
This tree shows how quantified noun phrases, although taken to project terms of
type e, are nevertheless associated with internal structure, indeed, more or less
the structure that one would expect on any standard account of quantified noun
phrases. Given our perspective that what is built by syntactic processes is semantic
structure for a resulting logical form, this may seem surprising. How can this be
what is wanted for quantifying expressions?
To see what is going on here, we need to start from the stance that others
adopt about quantification, and the internal structural properties of noun phrases,
in order to see to what extent the Dynamic Syntax account is different. The point
17 We have not discussed crossover in WH-questions: see Kempson et al. (2001: 213-216).
18 This is not to deny that the challenge of reducing the plethora of crossover types to a single phenomenon is recognised by some (Hepple 1990, Dalrymple et al. 2001), but without the dynamics of intermediate stages in the update process, no fully integrative approach is possible.
88 CHAPTER 3. RELATIVE CLAUSE CONSTRUAL
of departure for all formalisms is that predicate logic, in some sense to be made
precise, provides an appropriate underpinning for representing the semantic prop-
erties of natural language quantification. The existential and universal quantifiers
of predicate logic give the right truth conditions, at least for the analogous natural
language quantifiers, every and the indefinite singular a/some. However there is
an immediate problem, because, at least on the face of it, the syntax of predicate
logic and the syntax of natural languages are not alike. In predicate logic, the
quantifiers are propositional operators: they take open propositional formulae, and
bind variables in them to yield a closed propositional formula. Moreover, the open
formulae are complex and require different connectives depending on the quantifier
selected. So, the existential quantifier is associated with conjunction (∧) and the
universal with material implication (→). The predicate logic translations of (3.27a)
and (3.28a) are thus (3.27b) and (3.28b), respectively.
(3.27) a. Every student laughed.
b. ∀x(Student′ (x) → Laugh′ (x))
(3.28) a. A lecturer cried.
b. ∃x(Lecturer′(x) ∧ Cry′(x))
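For concreteness, the truth conditions of (3.27b) and (3.28b) can be checked against a toy model. The following Python sketch is purely illustrative: the domain and the extensions of Student′, Lecturer′, Laugh′ and Cry′ are invented.

```python
# Toy model (invented): individuals and predicate extensions.
domain = {"ann", "bob", "cara"}
student = {"ann", "bob"}        # extension of Student'
lecturer = {"cara"}             # extension of Lecturer'
laugh = {"ann", "bob"}          # extension of Laugh'
cry = {"cara"}                  # extension of Cry'

# (3.27b)  ∀x(Student'(x) → Laugh'(x)): every student laughed.
every_student_laughed = all((x not in student) or (x in laugh) for x in domain)

# (3.28b)  ∃x(Lecturer'(x) ∧ Cry'(x)): some lecturer cried.
a_lecturer_cried = any((x in lecturer) and (x in cry) for x in domain)

print(every_student_laughed, a_lecturer_cried)  # True True
```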
Considered in terms of its combinatorial properties, in other words its logical
type, the quantifier ∀ is accordingly of type t → t, an expression that maps proposi-
tional formulae into propositional formulae. This makes it structurally quite unlike
quantified expressions in natural languages, which fill argument positions just like
other NP expressions.
The move made in Montague semantics and all formal semantic formalisms fol-
lowing that tradition (Montague 1974, Dowty et al. 1981, Carpenter 1998, Morrill
1994, etc.) is to retain the insight that the predicate logic formula for a univer-
sally quantified statement expresses the right truth conditions for a sentence such
as Every student laughed, but recognise that the words of English are not mapped
onto the predicate logic formula directly. Montague’s proposed solution is to define
the types of every, student and laughed so as to make sure that the determiner
may combine first with the noun, and then with the predicate (as its argument) to
yield a propositional formula. Since student and laughed are (one-place) predicates
according to standard assumptions of predicate logic, this means defining every as
having a high type ((e → t) → ((e → t) → t)). When such a type combines with
a noun the result is an expression of type ((e → t) → t) which may combine with
a (one-place) predicate, expressed by some verb phrase, to yield a propositional
formula (of type t). The derivation of the formula in (3.27b) proceeds as follows:
Expression             Type                        Translation
every                  ((e → t) → ((e → t) → t))   λQλP.∀x.Q(x) → P(x)
student                (e → t)                     Student′
every student          ((e → t) → t)               (λQλP.∀x.Q(x) → P(x))(Student′)
                                                   ≡ λP.∀x.Student′(x) → P(x)
laughed                (e → t)                     Laugh′
every student laughed  t                           (λP.∀x.Student′(x) → P(x))(Laugh′)
                                                   ≡ ∀x.Student′(x) → Laugh′(x)
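The composition itself can be mimicked with Python functions standing in for the typed lambda terms; the domain and predicate extensions are invented, and the sketch illustrates only the type-driven order of combination, not any claim about implementation.

```python
# Invented toy domain and extensions.
domain = {"ann", "bob"}

student = lambda x: x in {"ann", "bob"}      # type e → t
laughed = lambda x: x in {"ann", "bob"}      # type e → t

# every : (e → t) → ((e → t) → t), i.e. λQ.λP.∀x(Q(x) → P(x)).
every = lambda Q: lambda P: all((not Q(x)) or P(x) for x in domain)

# The determiner combines first with the noun, then takes the
# verb-phrase predicate as its argument.
every_student = every(student)               # type (e → t) → t
print(every_student(laughed))                # True
```

Note that, exactly as in the text, the subject here is the functor and the verb phrase its argument.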
The details are not in fact important here.19
19 See the introductory semantics textbooks for discussion, e.g. Dowty et al. (1981) and Gamut (1991).
A significant consequence, however, is that the noun phrase subject in particular is no longer the last argument that the predicate combines with to yield a propositional formula. Instead, it is the
verb phrase which provides the argument: the functor/argument roles of subject
and predicate are flipped from what they might be expected to be. An advantage
of this move is that the semantics of universally quantified sentences of English can
be preserved as being that of the corresponding predicate logic formula, while pre-
serving the parallelism with syntax that all noun phrases can be analysed as having
a single type. However, its disadvantage is that the argument-predicate roles in the
case of the subject to some verb are different from those of all its other arguments
(unless the type of verb, and everything else accordingly is driven up to increased
levels of complexity of type specification).20 The general problem underpinning
this is the methodology it presumes, which might well be called “preparing for the
worst”. That is, the analysis is being driven by the example which displays the
need for greatest complexity.
However, there is an alternative point of departure for looking at NPs in natural
languages, and that is to look not at predicate-logic formulae, but at predicate-logic
proofs. In particular, given the goal of modelling the way in which information is
incrementally established in natural language, it is instructive to look at the clos-
est analogue to natural language reasoning. This is the natural deduction proof
method, in which valid inference is defined through the step-by-step process by
which some conclusion is established from a set of premises. Natural deduction
proofs for predicate logic all follow the same pattern: there are licensed steps for
getting rid of the quantifiers, replacing them with so-called arbitrary names sub-
stituted in place of the variables. The body of the proof is then carried out with
these names as illustrated in (3.29).21
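The arbitrary-name step can be given a computational sketch. The term representation and the helpers below (eliminate, fresh_name, substitute) are our own invention for illustration of quantifier elimination via arbitrary names; they are not part of the formal system.

```python
from itertools import count

_fresh = count()

def fresh_name():
    """Produce a fresh arbitrary name: a0, a1, ..."""
    return f"a{next(_fresh)}"

def substitute(term, var, name):
    """Replace every occurrence of var in term by name."""
    if term == var:
        return name
    if isinstance(term, tuple):
        return tuple(substitute(t, var, name) for t in term)
    return term

def eliminate(formula):
    """Strip a quantifier, substituting a fresh arbitrary name
    for its bound variable: (quant, var, body) -> body[name/var]."""
    quantifier, var, body = formula
    return substitute(body, var, fresh_name())

# ∀x(Student'(x) → Laugh'(x))  becomes  Student'(a0) → Laugh'(a0)
f = ("forall", "x", ("->", ("Student", "x"), ("Laugh", "x")))
print(eliminate(f))   # ('->', ('Student', 'a0'), ('Laugh', 'a0'))
```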
20 There have always been problems in extending the higher type ((e → t) → t) to non-subject noun phrases, as the predicate with which it is defined to combine is never available. The
problem of type-raising is further compounded if things like plurality are properly incorporated
in the system.
21 There is a regular objection to the move to use arbitrary names for natural-language se-
mantics. It is that quantifiers such as most are essentially generalized quantifiers, for which no
proof-theoretic account is available. Only the generalized quantifier analysis has the appropriate
generality to be applicable to all noun phrases. This, however, is not an appropriate criticism of the proposed analysis. In the DS system, it is the progressive compilation of the
logical form which requires all terms projected by noun phrases to be of type e. (This style of
natural deduction proof is due to Prawitz (1965). For an introduction to logic in terms of such
natural deduction, see Lemmon (1965).) The logical forms that result are then subject to a step
of evaluation, as we shall see shortly; and in this step, one might define a step onto a generalized
quantifier formula, e.g. in cases involving most. The system as it stands is restricted by the
expressive power of the selected calculus, but in principle the assumption that computation of
semantically relevant structure is via the working assumption of type e terms has more general
application.
it is this we are going to adopt (Hilbert and Bernays 1939). For a recent study, see Meyer-Viol
(1995).
of such terms and, in particular, the scopal relations between terms. The latter
involves collecting scope statements at a propositional node and then evaluating
the propositional formula of that node with respect to these. This is the topic of
section 3.3. Here we are concerned with the process of building up the logical forms
of the terms from the words.
As noted above, what we need as a projection of structure for a noun phrase is
a complex form of name, of type e but containing within its structure a quantifier
node to carry the information about the kind of quantification to be projected, a
node to be decorated with the restrictor predicate, Man′ , Book′ , etc., and another
node for a variable, itself also of type e, as set out in (3.26).
In the process of constructing such a tree, a determiner projects the variable-
binding term-creating operator, in the case of the indefinite, a promissory note for
an epsilon term (what this is we shall take up shortly), shown as a lambda term
binding a variable P of type cn. There is also a classification feature Indef(+).
This is shown in the lexical actions associated with the indefinite article in (3.30).
The tree growth associated with this is shown in (3.31) which begins a parse of A
lecturer cried:24
(3.30) a
     IF    ?Ty(e)
     THEN  put(Indef(+)); make(⟨↓1⟩); go(⟨↓1⟩);
           put(Fo(λP.(ε, P)), Ty(cn → e), [↓]⊥);
           go(⟨↑1⟩); make(⟨↓0⟩); go(⟨↓0⟩); put(?Ty(cn))
     ELSE  Abort
(3.31) Parsing a:
[Tree display: Tn(0), ?Ty(t) ↦ the same node expanded with a determiner node Fo(λP.(ε, P)), Ty(cn → e), [↓]⊥ and a ?Ty(cn) node holding the pointer (♦)]
distinguishes between indefinites which are relatively unconstrained, and non-indefinites, which
follow linear order more closely, see section 3.3. Such features are used as sparingly as possible
here and are, given our commitment to the hypothesis that syntactic processes create semantic
structures, merely promissory notes for a semantic characterisation that needs to be made explicit.
25 The sequence of actions that achieves the ‘freshput’ instruction is a rather primitive device
scanning the set of variables already associated with a scope statement, and aborting the lexical
action for each one that corresponds to such a variable. For a full account, see Kempson et al.
(2001) ch.7.
(3.32) lecturer
     IF    ?Ty(cn)
     THEN  go(⟨↑0⟩); put(?SC(x));
           go(⟨↓0⟩); make(⟨↓1⟩); go(⟨↓1⟩);
           put(Fo(λy.(y, Lecturer′(y))), Ty(e → cn), [↓]⊥);
           go(⟨↑1⟩); make(⟨↓0⟩); freshput(x, Fo(x)); put(Ty(e))
     ELSE  Abort
There are a number of things to notice about this complex of actions. In the first
place, it imposes on the dominating ?T y(e) node a new requirement ?SC(x) which
is satisfied just in case a scope statement is constructed at some point in a parse.
(We will discuss this aspect later in section 3.3.) Secondly, the content of the
common noun, lecturer, is shown as a complex lambda term λy(y, Lecturer′ (y)),
which binds the fresh variable that it introduces on the internal type e node. The
effect of parsing the first two words of A lecturer cried is thus as shown in (3.33).
(3.33) [Tree display: the ?Ty(e) subtree after parsing a lecturer, with determiner node Fo(λP.(ε, P)), Ty(cn → e), [↓]⊥ and the ?Ty(cn) subtree built by the actions in (3.32)]
Steps of Completion and Evaluation (see chapter 2) will apply from the
bottom of the tree in (3.33) to yield a Formula decoration at the cn node, of Fo(x, Lecturer′(x)), and again to yield the Formula decoration Fo(ε, x, Lecturer′(x))
at the top type e node. The verb can now be parsed and, assuming a step of scope
creation to be discussed below, the final output tree in our example is that given
in (3.34).
(3.34) Parsing A lecturer cried:
[Tree display: root Tn(0), S < x, Ty(t), Fo(Cry′(ε, x, Lecturer′(x))), Tns(PAST), ♦, with a subject subtree headed Fo(ε, x, Lecturer′(x)), Ty(e), containing the determiner node Fo(λP.(ε, P)), Ty(cn → e) and the cn node Ty(cn), Fo(x, Lecturer′(x))]
The formula, Fo(ε, x, Lecturer′(x)), is called an ‘epsilon’ term which stands
for some arbitrary ‘witness’ of the set denoted by the restrictor, here Lecturer′ .
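The idea of an epsilon term as an arbitrary witness can be sketched as a choice function over a toy domain. The domain, the extension of Lecturer′, and the helper epsilon are all invented for illustration.

```python
# Invented toy domain and restrictor extension.
domain = ["ann", "bob", "cara"]
lecturer = {"cara"}

def epsilon(restrictor):
    """Pick some witness of the set denoted by the restrictor
    (here, simply the first one found)."""
    for x in domain:
        if restrictor(x):
            return x
    return domain[0]   # if nothing satisfies it, any element will do

# (ε, x, Lecturer'(x)): some arbitrary witness of the lecturers.
witness = epsilon(lambda x: x in lecturer)
print(witness)         # cara
```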
Although we will discuss the evaluation of such terms in more detail later in this
[Tree fragment: a Ty(e → cn) node decorated Fo(λy.(y, John′(y))) with an internal Ty(e), Fo(x) node]
8. See also Kempson et al. (2001) where such a specification is taken to be a constraint on model-
theoretic evaluation of the resulting term. See also Cann (forthcoming b) for an interpretation of
such formulae as involving LINK structures.
29 Though in the case of names, with which we started, we simply presumed that the logical
the internal node equally to be available. This provides a natural basis for re-
strictive relative construal, since we have to hand a means of articulating further
propositional structures containing a copy of the variable.
Notice that our characterisation of the lexical actions associated with common
nouns, such as in (3.32), leaves the pointer on the node decorated by the variable
it introduces. There is nothing to prevent LINK Adjunction from applying at
this point to launch a propositional LINKed structure containing a requirement
to find a copy of this variable somewhere inside it. In parsing a string like that
in (3.39), there is a transition from the tree constructed after parsing a man and
before parsing who that gives the tree in (3.40).
(3.39) A man who Sue likes smokes.
(3.40) Parsing A man with LINK Adjunction:
[Tree display: Tn(0), ?Ty(t) with daughters ?Ty(e) and ?Ty(e → t); within the ?Ty(e) subtree, the determiner node Fo(λP.(ε, P)), Ty(cn → e), [↓]⊥ and the ?Ty(cn) subtree, from whose variable node a LINKed tree has been projected requiring a copy of the variable]
This rule of LINK Adjunction thus applies exactly as for non-restrictive forms
of construal. As before, what the rule does is to introduce a new LINKed structure
built from the node decorated by the variable as ‘head’, and at the top node of that
LINKed structure it puts a requirement for the occurrence of a copy of the very
variable projected by the noun.
There is then an intermediate step of *Adjunction, which provides the condition enabling the relative pronoun to supply the required copy (3.41), the latter being indifferent to what formula decorates the head. There is thus no need to define any ambiguity for the relative pronoun between restrictive and non-restrictive uses.
(3.41) [Tree display: as (3.40), but with *Adjunction having introduced an unfixed node inside the LINKed tree, which the relative pronoun decorates with a copy of the variable]
From this juncture on, the process of parsing the string is exactly as in simple
clauses. Nodes are introduced by the rules of Introduction and Prediction or
by lexical actions given by the verb, and a step of Merge duly takes place unifying
the unfixed node that had been decorated by the relative pronoun with some open node: in a parse of (3.39), the object node projected by parsing likes.
The only additional rule needed to complete a full parse of (3.39) is a second
rule of LINK evaluation for restrictive relative clauses that puts together the
information provided by the completed LINKed structure with the nominal predi-
cate to yield a complex restrictor for the variable at the cn node. The somewhat
complicated rule in (3.42) actually has a simple effect: it takes the propositional
formula (φ) decorating the LINK structure and conjoins it with part of the formula
decorating the Ty(e → cn) node (ψ) applied to the variable, x, decorating the internal type e node (ψ(x)). This yields a complex restrictor of the variable because
the formula, φ, derived from the LINK structure contains a copy of the variable.
The final step, then, is to ‘abstract’ the variable to give a well-formed common
noun representation F o(x, φ(x) ∧ ψ(x)).30
(3.42) LINK Evaluation 2 (Restrictive construal)
struals, because the resulting decoration is of a Ty(cn) node, and not a propositional one. Nevertheless, it involves conjunction of two propositions, one in the matrix tree and one derived from the LINKed structure, thus sharing common properties.
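The effect of LINK Evaluation 2 can be sketched as follows, with invented term representations: link_evaluate simply conjoins the LINKed proposition φ with the nominal predicate ψ applied to the shared variable, and pairs the result with that variable.

```python
def link_evaluate(var, phi, psi):
    """phi: the LINKed proposition as a function of the shared variable;
    psi: the Ty(e → cn) nominal predicate.
    Returns the pair (x, φ(x) ∧ ψ(x)), the complex restrictor."""
    return (var, ("and", phi(var), psi(var)))

# Invented representations for "a man who Sue likes":
man = lambda x: ("Man'", x)                                       # ψ
sue_likes = lambda x: ("Like'", x, ("iota", "z", ("Sue'", "z")))  # φ

restrictor = link_evaluate("x", sue_likes, man)
print(restrictor)
```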
The effect of this rule in a parse of the subject noun phrase A man who Sue likes
is shown in the tree display in (3.44) (with much irrelevant detail omitted). Once
this nominal restrictor is compiled, here as the complex conjunction, the compound
type cn formula can be combined with the quantifying formula to provide the
specified epsilon term: Fo(ε, x, Man′(x) ∧ Like′(x)(ι, z, Sue′(z))), an epsilon term
that picks a witness from the intersection of the set of men with the set of things Sue
likes, as expected. The rest of the parse of A man who Sue likes smokes continues
as expected, and we ultimately derive the propositional formula in (3.43).
(3.43) Fo(Smoke′(ε, x, Man′(x) ∧ Like′(x)(ι, z, Sue′(z))))
(3.44) [Tree display: parse of the subject A man who Sue likes, much irrelevant detail omitted; the matrix ?Ty(e) and ?Ty(e → t) nodes are shown, with the LINKed tree containing the copy Fo(x) and the predicate Fo(Like′)]
ing from the movement of the wh expression to some (Spec)-CP position. See Chomsky (1981)
and many references thereafter. In Minimalist analyses, this movement of the wh expression it-
self may not be necessary, since in some analyses, movement is restricted to feature-movement.
Nonetheless, the underlying variable-binding operator analysis of relative clauses is retained, as it
is in essence in all other generative frameworks that we are aware of (Sag et al. 2003, Dalrymple
2001, Morrill 1994).
while English allows restrictive relative clauses without any overt relativiser this is
not true of non-restrictive relative clauses.32
(3.45) a. The student Bill admires won the class prize.
b. *The student, Bill admires, won the class prize.
The availability of restrictive relative construal without any relative pronoun is very
simple to define. We have to allow the analogue of a null complementiser, a phono-
logical free-ride that is defined to allow the free creation of a copy at some unfixed
node (introduced by successive steps of LINK Adjunction and *Adjunction)
as long as the node which constitutes the head is itself immediately dominated by a
cn-requiring node. The lexical characterisation in (3.46) is identical to that of the
relative pronoun which except for the extra condition that the head formula be in-
ternal to the type e sub-structure, so even in this null complementiser construction,
we sustain an anaphoric style of analysis.
(3.46) ∅relcomp
     IF    ?Ty(e), ?∃x.Tn(x), ⟨↑∗⟩⟨L−1⟩Fo(x)
     THEN  IF    ⟨↑∗⟩⟨L−1⟩⟨↑0⟩?Ty(cn)
           THEN  put(Fo(x), Ty(e), [↓]⊥)
           ELSE  Abort
     ELSE  Abort
mismatch, given the display of semantic co-ordination properties, but syntactic subordination.
34 Relative clauses formed with that have an intermediate status. They are often reported to
[Tree display: the completed subject term at Tn(00), Fo(ε, x, Man′(x) ∧ Dislike′(x)(ι, y, Sue′(y))), sister to ?Ty(e → t), with internal nodes including Tn(000), Fo(λy.(y, Man′(y))) and Fo(x); a second LINKed tree is introduced from this higher Ty(e) node: ⟨L−1⟩Tn(00), ?Ty(t), ?⟨↓∗⟩Fo(ε, x, Man′(x) ∧ Dislike′(x)(ι, y, Sue′(y))), ♦]
As can be seen from this tree, the projection of a LINK structure that provides a non-restrictive construal involves building that structure from the higher of the two Ty(e) nodes in the term. Hence, this can only be introduced once this higher type e node has been completed and decorated with a Fo value. But this satisfaction of the type requirement on this node only takes place if the determiner node and cn node have been fully decorated, with their type requirements fulfilled. Of necessity, then, this step must follow any process which involves constructing and completing some LINKed structure introduced as a transition defined on some Ty(e) node dominated by a cn node and decorated by a variable.
35 Scope information is not specified here, but in fact there would be a scope statement associated with the type e term immediately prior to the introduction of this second LINKed tree. See section 3.3.
The pointer can never return to some daughter of the Ty(cn) node from the top Ty(e) node, so a relative clause can never be construed restrictively after some clause that has a non-restrictive construal. Indeed, even if it could, the process
of constructing a LINKed structure at that later stage would involve retracting the
decorations on the cn and higher type e nodes, in order to introduce the newly
added modification from this latterly introduced LINKed structure. But this is
independently precluded: all building up of interpretation is monotonic, involving
progressive enrichment only. The only decoration that can ever be removed from a
node is a requirement, once that has been satisfied – nothing else.
Given that this linear-order restriction emerges as a consequence of pointer
movement, and not from any idiosyncratic decoration on the individual nodes, we
would expect that it would apply equally to all noun phrases, indefinite and non-
indefinite alike. And indeed it does:
(3.49) a. Every interviewer you disliked, who I was on good terms with, liked
our cv’s.
b. *Every interviewer, who I was on good terms with, you disliked,
liked our cv’s.
Finally, contrary to a conventional implicature analysis, on the present analysis,
we expect that non-restrictively construed relative clauses can contain antecedents
for subsequent pronominals:36
(3.50) I saw a man, who ignored a friend of mine. When she hit him, he
continued to ignore her.
The data present no more of a problem than regular cross-sentential identification
of a pronoun with some epsilon term as antecedent, leading to the construction
of a new extended epsilon term in the logical form constructed for the sentence
containing the pronoun. As we would expect, this pattern may also occur within a
single sentence:
(3.51) A parrot, which was loudly singing a song, appeared to understand it.
So, given an analysis of all noun phrases as type e terms and the process of building up paired LINKed structures, the distinction between restrictive and non-
restrictive forms of construal is remarkably straightforward to reflect. There are
no hidden mysteries of a discrete level of structure, in some sense beyond logical
form, no structures which have to be assigned some unexpected status as filters
on content, and no radical difference in computation between the two forms of
construal.
definition a conventional implicature is a filter on content, and not part of it. Yet for the pronoun-
antecedent relation to be established, the content of the conventional implicature must have be-
come part of the context, requiring a specific accommodation process.
3.3. QUANTIFIER SCOPE AND ITS EFFECTS 101
which collects scope-relevant information during the construction process, and the
subsequent scope-evaluation process (see Kempson et al. (2001) ch.7 for details).
Scope information takes the form of statements of the form x < y, which is short-
hand for “a term with variable x has scope over term with variable y”. These
statements are collected at the local node requiring a formula of type t as they are
made available by words and their actions during the parsing process.37 Once a
propositional formula of type t is derived at the top node of some local tree, this
node will have a pair of (i) a scope statement of the form Scope(Si < x < . . . )
and (ii) a formula of type t. (i) and (ii) together will then be subject to a scope
evaluation algorithm which spells out what that pairing amounts to. We take this
lexically driven route because there is lexical variation in how much scope freedom
the terms projected have, which comes primarily from the determiners (but may
also come from verbs). We begin with a discussion of the major problem case:
the indefinites, noun phrases containing determiners such as a and some, all numeral expressions, and many.
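The collection-then-evaluation regime described above can be sketched as follows. The representation of scope statements and the linearisation routine are invented for illustration, and the linearisation assumes the statements arrive in chain form.

```python
# Invented encoding of scope statements "x < y": term with variable x
# takes scope over the term with variable y.
scope_statements = []

def add_scope(wider, narrower):
    """Collect a scope statement at the local type-t node."""
    scope_statements.append((wider, narrower))

# Parsing "Every student is reading a recent MIT thesis", with the
# indefinite taking narrow scope:
add_scope("S", "x")      # the index of evaluation outscopes the every-term x
add_scope("x", "y")      # the every-term x outscopes the indefinite's y

def scope_order(statements):
    """Linearise chain-form statements into an evaluation order."""
    order = []
    for wider, narrower in statements:
        for v in (wider, narrower):
            if v not in order:
                order.append(v)
    return order

print(scope_order(scope_statements))   # ['S', 'x', 'y']
```

Once the propositional formula of type t is complete, this ordering determines the sequence in which the terms are evaluated.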
This asymmetry between individual quantifiers is a puzzle for all standard ac-
counts of quantification. On the basis of the freedom with which indefinites can
be construed, it has been assumed that this will have to be a general process:
since quantifier scoping has to be defined over a propositional domain, it cannot be
treated as a lexical property. All quantifiers, accordingly, are said to be subject to
such scope variation, with a general process of either quantifier raising (in the syn-
tax, e.g. May (1985)) or quantifier storage (in the semantics, e.g. Cooper (1983)).
But this then leaves it a complete mystery why there should be variation
between individual quantifiers in the first place.
One response to this is to suggest that indefinites are ambiguous between a
regular quantifier and some kind of name, referring to a particular individual (or
group) taken to be picked out by the speaker. Such an analysis has the advantage
of leaving undisturbed a generalised quantifier analysis of a sub-class of indefinites,
analysing the others as a specialised naming device. On such an analysis, there
would be no need of a general process of quantifier-storage. However, as observed
by a number of people over the years (Farkas 1981, Cormack and Kempson 1991,
Abusch 1994, Winter 1997, Reinhart 1997), such an ambiguity account fails to
capture cases in which the indefinite can be construed as taking intermediate scope,
with scope over the clause or noun phrase within which the indefinite is contained
but nevertheless still interpreted as within the scope of some preceding quantifier:
(3.56) a. Every professor insisted every student of theirs read a recent MIT
thesis.
b. Each student has to come up with three arguments that show that
some condition proposed by Chomsky is wrong.
Each of the examples in (3.56) allows an interpretation, as one amongst several, in
which the indefinite in the embedded clause can be understood as taking broader
scope than the nearer of the two quantified expressions preceding it, but narrower
scope than the subject quantifier. For example (3.56a) allows as one possible inter-
pretation: ‘For each professor there is a recent MIT thesis that the professor insisted
that their students read’. So the scope relation is every professor < a recent MIT
thesis < every student: one MIT thesis per professor for all their students, but not
the same thesis. So for these cases, in any event, indefinite expressions must appar-
ently be analysed as expressions that function as quantifiers taking scope over some
arbitrarily larger domain. Analysing a sub-case of interpretations of indefinites as
name-like does not solve the problem.
If, however, we look at this phenomenon from a processing perspective, then
there is an alternative account in terms of interpreting the indefinite relative to
its context, which is naturally expressible only if we analyse quantified expressions
as a form of name, initially underspecified. We can get at this alternative way of
looking at the problem by asking how the ambiguity displayed by (3.56a) arises. How,
for example, does it differ from a simpler structure such as (3.57)?
(3.57) Every student is reading a recent MIT thesis.
One obvious answer is that (3.56a) has three quantified expressions, (3.57) only
two. The clue, we suggest, lies exactly in this.
The ambiguities in (3.57) are straightforward enough. Either the second quan-
tifier is interpreted relative to the first, or the first quantifier is interpreted relative
to the second. This is the standard way of describing this phenomenon. However,
there is another way of expressing this observation, by expressing the dependence
solely in terms of the indefinite. Either the expression a recent MIT thesis in (3.57)
ing the construal of individual members of the set quantified over independently of the island in
which the indefinite expression is contained. However, in (i), given the disambiguation provided
by the presented elaboration, wide scope potential for the indefinite seems possible ((ii) is Ruys’
example):
(i) If two of my sons ever help me do the washing up I buy them a beer by way of thanks. If they
both do, I get out the champagne.
(ii) If two of my uncles give me a house, I shall receive a fortune.
It might seem surprising that choice of scope should be said to parallel choice of pronoun construal, because cataphoric interpretations for pronouns are generally not available in English, yet scope-inverted interpretations surely are.39 We can interpret (3.61) as meaning that all the patients got interviewed, but not necessarily all by the same nurse, because of what we know about the burdens of the national health service:40
(3.61) A nurse interviewed every patient.
Yet he in (3.62) cannot be interpreted as picking up on its interpretation from the
following occurrence of John:
(3.62) He knows that John is clever.
This is certainly true; but this is not the end of the matter. There are the expletive
pronouns, which are interpreted in precisely a forward-looking manner:
(3.63) It is likely that I’m wrong.
The it in (3.63) is construed as identical to the interpretation assigned to the fol-
lowing clausal expression I’m wrong. In the literature, such pronouns are analysed
as completely different phenomena, the so-called expletive pronouns, unrelated to
their anaphoric counterparts. However, there is reason to think that they are nev-
ertheless but a subcase of the pronoun it, a matter we come back to later (chapters
5 and 8). Here it is sufficient to know that cataphoric processes of anaphora are
available subject to certain locality constraints.41
There is further evidence that allowing indefinites to depend on some quantifier
introduced later in a sequence is the right move to make. Analysing the apparent
reversal of scope of a pair of quantifiers when the first is indefinite and the second
non-indefinite provides a solution to what is otherwise a baffling form of variation that the scope construal of non-indefinites poses. The puzzle is that quantifiers such
as most appear to give rise to wide-scope construal when they follow an indefinite
noun phrase, but not when they follow a non-indefinite noun phrase. In (3.53), for
example, we observed that the expression most good scripts may take wide scope
over two examiners but nevertheless not wide scope over everyone:
(3.53) Everyone agreed that two examiners marked most good scripts.
Apart from that, however, the interpretation of (3.64) generally follows the linear order
in which the quantified expressions occur. The only natural interpretation of (3.64)
is as an assertion about all examiners that each marked most scripts:
(3.64) All examiners marked most scripts.
39 Apart from expletives, cataphoric effects may be expected if there are grounds for analysing the
node the pronoun decorates as unfixed, since there is a later stage at which the needed substitution
operation can apply. But this is not the operative consideration in simple SVO sequences.
40 These scope inverted interpretations are regularly said not to be preferred, but nevertheless
agreed to be available.
41 There are also cases of cataphora involving initial adjunct clauses, often generically interpreted
And, conversely, the only natural interpretation of (3.65) is an assertion about each
member of some larger proportion of scripts that all examiners checked it:
(3.65) Most scripts, all examiners marked.
If we analyse all quantified expressions as having a wide-scope potential, these
facts are mysterious, for sometimes quantified expressions containing most seem to
allow interpretations in which they take scope over other quantified expressions,
but sometimes not. On the other hand, the puzzle is resolved if we analyse all
scope inversion for pairs of quantified expressions as an idiosyncrasy of indefinites:
that they can be interpreted as taking scope narrower than some term, and that
in some contexts this can be constructed from an expression following them in
the sequence of words. Most itself provides no potential for being interpreted
outside the immediate structural context within which it is processed. It is only the
indefinite which provides an underspecified term, with a choice of scope dependency
that ranges over any term made available during the construction process.42
In any case, the analysis that is being proposed is not that indefinites and pro-
nouns are identical in their process of interpretation, so the expectation is not for
absolute parallelism in their mode of interpretation. The claim is simply, more
weakly, that both have a context-dependent element to their interpretation: for
pronouns, it is the context that provides the value of the metavariable which these
pronouns project, for indefinites, it is the context of the proposition under con-
struction that provides a discrete term relative to which the relative scope of the
indefinite can be identified.
choice go hand in hand in all languages. If the language is very sensitive to linearity in its choice
of pronouns, then so is its choice of scope construal for indefinites. For detailed demonstration of
this see Kempson and Meyer-Viol (2004) and Kempson et al. (2001).
106 CHAPTER 3. RELATIVE CLAUSE CONSTRUAL
different to the other requirements that we have seen because, although it is non-
modal in form, it is satisfied only when a scope statement is established on the
most local propositional node. It should, therefore, more accurately be written as a
modal requirement to establish a relation in the Scope predicate with the variable
introduced by the common noun:43
?SC(x) =def ?⟨↑0⟩⟨↑1*⟩∃y.Scope(y < x) ∨ Scope(x < y)
In general, we will continue to represent this complex requirement in its simpler
form, as ?SC(x). Once the information provided by the determiner and by the
common noun are put together, determiner-particular scope statements get added
to the local type-t-requiring node to satisfy this scope requirement. A sequence of
scope statements is thereby accumulated during the construction process at some
type node decorated by ?Ty(t).
In general, quantifier scope follows linear order, so a quantifier processed first
on a left-right basis is presumed to have scope over a quantifying term that follows
it, with each scope statement successively added to the end of the sequence and
fixing its scope relative to what precedes. Indefinites, however, provide a systematic
exception to this general principle, and are lexically defined to have a more open
characterisation of scope effects, subject to choice.44 We can now express the
parallelism between indefinites and pronouns by defining an indefinite as adding to
the set of scope statements a relatively weak scope restriction that it be interpreted
as taking narrow scope with respect to some term to be chosen, represented by the
meta-variable U:
U < x
The value of a metavariable U, like one projected by the lexical specification of a
pronoun, is established in the structural context in which it occurs. Though these statements
are collected locally, the value of the metavariable U itself is not structurally re-
stricted, so a term may get entered into a scope statement for which there is no
corresponding occurrence in the formula itself. As a result the scope requirement
?SC(x) may be satisfied; but evaluation of the effect of that particular scope de-
pendency cannot be resolved until higher in the tree. In all such cases, this scope
statement gets progressively lifted up the tree until it arrives at a node where the
locally compiled formula contains an occurrence of this selected term.
In determining the value for U in a scope statement, there is one difference from
regular anaphoric processing in that there can be no possible indexical construal,
where the value for U is taken from some external context. The term under con-
struction is a quantifying term and must thus ultimately be subject to evaluation as
part of the overall proposition being constructed, as we shall see below. However,
apart from the constraint that the choice of value to be made for the metavari-
able must range only over other terms used in the interpretation of the string, it is
otherwise structurally unrestricted. This gives us a basis for characterising the ar-
bitrarily wide scope properties of indefinites, if we make the additional assumption
that representations of time are also taken as a term in the language.
43 Notice the modality ⟨↑0⟩⟨↑1*⟩ which is more constrained than ⟨↑*⟩ in pointing to a node that
can be reached by going up one argument node and zero or more functor nodes. See the locality
defined for Substitution in chapter 2, (2.64) on page 62.
44 Another exception is when the terms decorate a node which is not yet fixed in the structure.
Some languages allow quantified expressions to decorate such nodes, others do not. Those that
license such left-peripheral occurrence will allow scope statements to be entered, subsequent to
the step of merge which assigns the unfixed node a definitive position in the tree. There is some
indication here of sensitivity to whether the process allows separation of the quantifying term
from that position across a propositional boundary. See chapter 7.
3.3. QUANTIFIER SCOPE AND ITS EFFECTS 107
Suppose that every formula of type t is of the form Si : ψ, with Si a term denot-
ing the time at which the formula ψ is said to hold.45 This is the one entry in the
scope statement which is assumed to be fixed independently. To reflect this, we also
modify the starting point ?Ty(t) so that it also contains one term in the attendant
scope statement, namely Si, the constructed index of evaluation. Morphological
tense (or adjuncts) then add predicates on Si restricting its construal, so that what
we have been writing as Tns(PAST) may be construed as Tns(Si < Snow). With
this in place, the relatively unrestricted range of scope effects for indefinites fol-
lows immediately. If the value selected for the first argument of the scope relation
is some variable associated with a discrete quantifying determiner, we get narrow
scope interpretations, if it is chosen to be a term of the form Si which is itself not
construed as taking narrow scope with respect to such quantifying expressions, then
the indefinite will be construed as able to take wide scope relative to those quanti-
fying expressions.46 In (3.66), there is only one choice, this being that the indefinite
is interpreted relative to Si , the term representing the index of evaluation:
(3.66) A student fainted.
Notice that the modes of interpretation for indefinites and pronouns do not project
identical forms of dependence on context. To start with, indefinites cannot be in-
terpreted as dependent on some term outside the process of construction. They are
in any case distinct forms of input: pronouns are not scope-inducing terms. Any
idiosyncrasy distinguishing them is thus easily expressible in the distinct sets of
lexical actions.47 By contrast the parallelism between anaphora and indefinites is
inexpressible on accounts of quantification in model-theoretic terms, as generalised
quantifiers. There is no room to express under-specification of a generalised quanti-
fier. So the very smoothness with which the anaphoric-style account of indefinites
and the analysis of all quantifying expressions as names goes together provides fur-
ther evidence for this analysis of the compilation of quantification structures in
type-e terms.48
To illustrate the process of scope construction, we will go through the relevant
steps in parsing (3.67) with narrow scope for the indefinite.
(3.67) A student read every book.
As we saw in section 3.2.1, parsing the determiner introduces the node for the
binding operator itself and the cn-requiring node, adding to the type-e-requiring
node the feature Indef (+). The actions associated with parsing a common noun
then introduce the two daughter nodes of the ?Ty(cn) node: a Ty(e → cn)
node decorated with the formula that expresses the concept associated with the
noun; and an internal type e node decorated with a fresh variable. It also adds
to the higher type-e-requiring node, as we have seen, the requirement for a scope
statement for that variable ?SC(x) interpreted as a modal requirement for the
variable to appear as part of the Scope predicate on some dominating node, without
45 Throughout this book, we make no attempt to formally address tense or mood construal. See
Perrett (2000) for an preliminary DS characterisation.
46 Unlike DRT, where wide scope effects for indefinites have been said to involve movement of
the discourse referent to the top box (Kamp and Reyle 1993), the assumption that scope of an
indefinite is a relatively unrestricted choice allows for intermediate interpretations in addition.
47 See Kempson et al. (2001) for full lexical specifications of the determiners every, a.
48 This lexicalised approach to projection of scope effects allows idiosyncratic specifications for
other determiners. For example, no should arguably be analysed as projecting a composite se-
quence of actions, the projection of a falsity operator at the closest type t node, and an epsilon
term defined to take narrow scope with respect to this operator. Since we do not address problems
of negation in this book, we do not define this here.
which the term cannot be completed. The decorations on the non-terminal nodes of
this subtree then get accumulated in the normal way by Completion, Thinning and Evaluation
and the result at this point is (3.68):
(3.68) Parsing A student:
[tree display: a type-e-requiring node with daughters Ty(cn), Fo(x, Student′(x)) and Ty(cn → e), Fo(λP.(ǫ, P))]
With the combination of properties accumulated at the higher T y(e) node through
parsing the determiner and the common noun, i.e. Indef(+) and ?SC(x), a set
of actions is now initiated that causes the pointer to move to the most locally
dominating node decorated with ?T y(t) and, because the term is marked as indef-
inite, adds the statement U < x to the Scope predicate at the top node to yield
Scope(Si , U < x).49
Because of the underspecification associated with the use of the metavariable,
the choice of which term the variable is dependent on is determined by pragmatic
factors. However, this choice is not as free as the analogous choice for pronouns:
what is being constructed is one of a number of terms all of which have to be inter-
preted relative to the particular propositional formula. So what the metavariable
ranges over is other fully determined terms available at the point in the construction
process at which the choice is made, where “fully determined”
means a term with a fixed scope-dependency relation. For a wide-scope reading
of the indefinite, its scope relation can be fixed relative to the index of evaluation,
Scope(Si < x), immediately upon processing the expression itself. However, like all
other processes, this substitution process is optional, and so the scope relation can
be left open at this point, allowing it to be determined after some later term is intro-
duced, when the pointer is able to get back to the node where the scope statement
is by licensed moves, i.e. by moves associated with Completion, Elimination
and Anticipation.
Despite the fact that the scope requirement ?SC(x) is not yet satisfied, the
provision of a type e specification and a formula value has sufficiently completed
the parse of the indefinite expression for the pointer to move on (Completion only
needs a specification of a type value and is blind to other requirements that may
sit on the node, unlike Elimination). So the pointer can move to the predicate
node and proceed with the parse of the verb phrase, i.e. read every book. This is
shown in (3.69), again up to the point at which elimination applies to yield the
universal term as formula at the higher of the two type e nodes (notice where the
pointer is in (3.69)):50
49 The details of how these scope actions are formulated are not going to be needed later, so these
we leave on one side. See Chapter 7 of Kempson et al. (2001) for formal details.
50 Recall that universal quantification is indicated by the tau (τ ) operator.
(3.69)
[tree display: subject subtree from A student with daughters Ty(cn), Fo(x, Student′(x)) and Ty(cn → e), Fo(λP.(ǫ, P)); a predicate node decorated Fo(Read′); object node Ty(e), ?SC(y), Indef(−), Fo(τ, y, Book′(y)), ♦]
Just like the indefinite, the created quantifying term has a feature determining
what kind of determiner it is, Indef (−), and also a scope requirement. The scope
requirement on this higher type-e node now needs to be satisfied, and this takes
place by the creation of a scope statement on the propositional node, thereby de-
termining the relation of the term binding the variable y to the remainder of
the formula under construction. As the term is non-indefinite, its associated scope
action simply adds the variable y to the last term that has been entered into the
scope statement with a fixed scope dependency relation. This is the reflection of
the fact that the scope of non-indefinites is determined strictly linearly. However,
here is where the incompleteness of the scope specification for the previously parsed
indefinite makes a difference. Had the choice been taken previously for the indef-
inite to have wide scope over any later introduced term, the term projected from
the indefinite would have been construction-wise complete (with no requirements
outstanding), and this would have given rise to the scope statements:
Scope(Si < x, x < y) (= Scope(Si < x < y))
However, in the derivation leading to the inverse scope interpretation, the scope
of the indefinite was left undetermined so the specification of that term is not yet
complete. This means that there is only one possibility for fixing the scope of the
universal term; and that is to enter the variable y projected by the expression every
book into the scope statement as dependent on the one term in the sequence of scope
statements that does have a fixed value, and that is Si , the temporal variable. So
the scope statement associated with the term τ, y, Book′ (y) is Si < y. With a spec-
ified scope statement now on the local propositional node, the requirement ?SC(y)
(interpreted in the way specified above as a local modal requirement over the Scope
predicate and the variable y) is now satisfied, completing the specification of all re-
quirements on this node. Since there are no requirements outstanding on the object
node, Elimination can apply to yield the predicate Fo(Read′(τ, y, Book′(y))).
Completion moves the pointer back onto the propositional node and the value of
the metavariable U in the scope statement at the top node can be satisfied through
being identified as y, since the term containing y is at this juncture one of those
with a fixed scope statement. With this sequence of actions, the effect is wider
scope for the universal:
Scope(Si < y, y < x) (= Scope(Si < y < x))
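The two outcomes just derived can be pictured with a small toy sketch (our own illustration, not part of the DS formalism): scope statements are held as ordered pairs, and the value chosen for the indefinite's metavariable U determines which sequence results.

```python
# Toy illustration (ours, not the DS formalism): scope statements as
# ordered pairs, with 'U' marking the indefinite's open dependency.
def resolve(statements, value):
    """Fix the metavariable U to `value` throughout a scope sequence."""
    return [(value if a == 'U' else a, b) for (a, b) in statements]

# Wide scope for the indefinite: U is fixed to Si on parsing "a student",
# and the universal is then appended after the last fixed term:
wide = resolve([('U', 'x'), ('x', 'y')], 'Si')
assert wide == [('Si', 'x'), ('x', 'y')]        # Scope(Si < x < y)

# Inverse scope: U left open, so "every book" can only depend on Si;
# U is then identified as y once that term is fixed:
inverse = resolve([('Si', 'y'), ('U', 'x')], 'y')
assert inverse == [('Si', 'y'), ('y', 'x')]     # Scope(Si < y < x)
```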
which all occurrences of the expression under evaluation are replaced by the con-
structed name. The name itself then has exactly the same internal structure as this
compound formula (reflecting the epsilon/predicate-logic equivalence), except that
the name is replaced by the variable and the whole is prefixed by the appropriate
operator.
The formal definition of this algorithm is given in (3.76).
(3.76) Scope Evaluation Algorithm:
Formulae of the form:
φ((ν1, x1, ψ1), . . . , (νn, xn, ψn))
are evaluated relative to a scope statement, discharging the narrowest-scope term first:
⟨Si < x1 < . . . < xn⟩ φ[(νn, xn, ψn)/xn]
↦ ⟨Si < x1 < . . . < xn−1⟩ fνn,xn,ψn(φ),
where for x occurring free in φ and Si a (temporal) index, the values
fνxψ(φ), for ν ∈ {ǫ, τ, ι}, and fSi(φ) are defined by:
a. fτxψ(φ) = (ψ[a/x] → φ[a/x])
where a = τ, x, (ψ → φ)
b. fǫxψ(φ) = (ψ[b/x] ∧ φ[b/x])
where b = ǫ, x, (ψ ∧ φ)
c. fιxψ(φ) = (ψ[c/x] ∧ φ[c/x])
where c = ι, x, (ψ ∧ φ)
d. fSi(φ) = (Si : φ)
In the case of the formula in (3.75), there is only a single possible scope statement
Si < x, so the algorithm proceeds in two steps. First we evaluate the epsilon
term (because x has narrower scope). By (3.76), we first construct (3.77b) from
(3.77a).
(3.77) a. ⟨Si < x⟩ Burn′(ǫ, x, Cake′(x))
b. ⟨Si⟩ fǫ,x,Cake′(x)(Burn′(x))
(3.76b) now applies to this to replace x in the argument position with the relevant
constructed term, at the same time conjoining this with the restrictor, again with its
instance of x replaced by the constructed term (3.78a) (= (3.75)). The evaluation
is completed by ‘discharging’ the index of evaluation to give the fully evaluated
propositional formula in (3.78b).
(3.78) a. ⟨Si⟩ Cake′(a) ∧ Burn′(a)
where a = (ǫ, x, Cake′(x) ∧ Burn′(x))
b. Si : Cake′(a) ∧ Burn′(a)
where a = (ǫ, x, Cake′(x) ∧ Burn′(x))
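To make these evaluation steps concrete, here is a minimal Python sketch of our reading of the algorithm in (3.76), covering epsilon and tau terms; the tuple encoding of formulae and the function names are illustrative assumptions of ours, not part of the DS formalism.

```python
# Minimal sketch (our own encoding, not the authors') of the Scope
# Evaluation Algorithm in (3.76), restricted to epsilon and tau terms.
# Formulae are nested tuples over variable names; quantified terms are
# triples (operator, variable, restrictor).

def subst(formula, var, term):
    """Replace every occurrence of `var` in `formula` by `term`."""
    if formula == var:
        return term
    if isinstance(formula, tuple):
        return tuple(subst(part, var, term) for part in formula)
    return formula

def evaluate(scope, formula, terms):
    """Discharge quantified terms from narrowest scope outward.
    `scope` is a list such as ['Si', 'x']; `terms` maps each variable
    to its (operator, variable, restrictor) triple."""
    while len(scope) > 1:                  # leave the index Si until last
        var = scope.pop()                  # narrowest-scope variable first
        op, v, psi = terms[var]
        conn = 'and' if op == 'eps' else 'imp'   # clause (b) vs clause (a)
        name = (op, v, (conn, psi, formula))     # the constructed name
        formula = (conn, subst(psi, v, name), subst(formula, v, name))
    return ('holds_at', scope[0], formula)       # clause (d): prefix Si

# (3.77)/(3.78): <Si < x> Burn'(eps, x, Cake'(x))
cake = ('Cake', 'x')
result = evaluate(['Si', 'x'], ('Burn', 'x'), {'x': ('eps', 'x', cake)})
a = ('eps', 'x', ('and', cake, ('Burn', 'x')))
assert result == ('holds_at', 'Si', ('and', ('Cake', a), ('Burn', a)))
```

The constructed name a replaces the variable in both restrictor and scope, matching (3.78); a tau term goes through the same steps with implication in place of conjunction.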
This example involves just one occurrence of the term under evaluation; but
in more complex cases the enriched term will replace all occurrences of the partial
term under evaluation. To see this, let us take a case of universal quantification
with a pronoun that is to be construed as bound by the quantifying expression.
A parse of the string in (3.79a) yields the propositional formula in (3.79b) with
two occurrences of the partial tau term, as it has been used as substituend of the
metavariable provided by the pronoun in the embedded clause.
form of discourse-licensed parenthetical, merely adding incidental information about the object
described. See Fabb (1990), Safir (1996) who analyse them as involving some post-LF level of
LF’, a level whose status is not well understood, either formally or conceptually.
55 The observation that there is some form of semantic distinction between the two types of
construal is currently analysed using the only other conventional means of characterising content,
viz as a type of conventional implicature (Chierchia and McConnell-Ginet 1990 and Potts 2001).
On this analysis, the content of a non-restrictive relative is a filter on the projection of the primary
content, but not part of its resulting truth-conditions, an analysis which does not provide a basis
for explaining the interaction of anaphora and non-restrictive relative construal in many of the
data listed in this section.
names:56
(3.84) a. Every referee, who I had personally selected, turned down my
research application.
b. Every parachutist, who the pilot had instructed meticulously, was
warned not to open his parachute too early.
c. Before I left, I managed to see most of my students, who I gave
something to read to discuss with me when I got back.
In these cases, it is the term under construction that is imposed as a requirement on
the linked structure, duly copied at the unfixed node by the actions of the relative
pronoun, and subsequently unified as an argument node in the predicate structure.
In (3.84b), for example, it is the tau term decorating the subject node that is
copied into the LINK structure, as shown schematically in (3.85).57
(3.85) Parsing (3.84b):
[tree display: topnode Si < x, ?Ty(t); subject node Tn(00), Ty(e), Fo(τ, x, Parachutist′(x)); predicate node ?Ty(e → t); in the LINKed structure, a node Fo(Instruct′), Ty(e → (e → t)) and a node ?Ty(e), ♦]
Notice how in this case, the scope statement Si < x is assigned at the top node prior
to the construction of the LINK transition, since the type e term is provided. It is
then the combination of the LINK evaluation rule and a subsequent single process
of scope evaluation which ensures that the construal of the linked structure forms
a conjunction in the consequent of the conditional associated with the evaluation
of the tau term. The resulting logical form of (3.84b) is given in (3.86a) with its
fully evaluated form in (3.86b).58
(3.86) a. ⟨Si < x⟩ (Warned′(τ, x, Parachutist′(x)) ∧ Instructed′(τ, x, Parachutist′(x)))
b. Si : Parachutist′(a) → (Warned′(a) ∧ Instructed′(a))
where a = τ, x, Parachutist′(x) → (Warned′(x) ∧ Instructed′(x))
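Spelled out, on our reading of clauses (a) and (d) of the Scope Evaluation Algorithm in (3.76), the step from (3.86a) to (3.86b) runs:

```latex
\begin{align*}
&\langle S_i < x\rangle\;
   \mathrm{Warned}'(\tau,x,\mathrm{Parachutist}'(x))
   \wedge \mathrm{Instructed}'(\tau,x,\mathrm{Parachutist}'(x))\\
&\quad\mapsto\; \langle S_i\rangle\;
   f_{\tau x\psi}\bigl(\mathrm{Warned}'(x)\wedge\mathrm{Instructed}'(x)\bigr),
   \qquad \psi = \mathrm{Parachutist}'(x)\\
&\quad=\; \langle S_i\rangle\;
   \mathrm{Parachutist}'(a)\rightarrow
   \bigl(\mathrm{Warned}'(a)\wedge\mathrm{Instructed}'(a)\bigr)\\
&\quad\mapsto\; S_i :\,
   \mathrm{Parachutist}'(a)\rightarrow
   \bigl(\mathrm{Warned}'(a)\wedge\mathrm{Instructed}'(a)\bigr)\\
&\qquad\text{where } a =
   \tau,x,\mathrm{Parachutist}'(x)\rightarrow
   \bigl(\mathrm{Warned}'(x)\wedge\mathrm{Instructed}'(x)\bigr)
\end{align*}
```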
56 On the assumption that wh-question words are place-holding devices (see Kempson et al.
2001), they will disallow non-restrictive modification, since the value to be assigned to the relative
pronoun must be some fixed Formula value, and not some open place-holder.
57 Much irrelevant structure is omitted from this tree display, and the contents of the pilot had
instructed meticulously and warned not to open his parachute too early are simplified to Instructed′ and Warned′, respectively,
for ease of exposition.
The LINK evaluation step for non-restrictive relative clauses thus has the effect
of a scope-extending device, despite being combined merely as the addition of a
conjoined formula. This is the result of having an incremental method of building
up scope dependencies as terms are constructed and an account of non-restrictive
relative construal as anaphorically inducing a copy of such terms into a LINKed
structure.
A further consequence of our analysis is that we expect that quantifiers in the
main clause should be able to bind pronouns in the relative clause (Safir 1996):
(3.87) a. Every nurse alerted the sister, who congratulated her on her prompt
reaction.
b. Every parrot sang a song, which it ruined.
In the processing of (3.87b), there are two terms under construction, (τ, x, Parrot′(x))
and (ǫ, y, Song′ (y)). The non-restrictive relative clause modifying the object thus
has a copy of the epsilon term within it. Nothing, however, precludes the tau term
from being identified as the antecedent to the pronoun it in that relative clause
structure, as illustrated (again schematically) in (3.88) where the substitution
step is shown by ⇑.
(3.88) A possible parse for Every parrot sang a song which it:
[tree display: topnode Tn(0), Si < x < y, ?Ty(t); subject node Ty(e), Fo(τ, x, Parrot′(x)); predicate node ?Ty(e → t); the substitution step ⇑ applies within the LINKed structure]
Hence, we predict the truth-conditional equivalence of the results (at least in the
singular indefinite case) despite the different procedures used to construct them.
We have, correctly, a processing ambiguity, but not one necessarily resulting in
denotational distinctiveness.
The two forms of subordination differ from co-ordination in that the latter is
constructed as two entirely independent structures, with only possible anaphoric
links between them. The restrictive form of construal involves subordination in the sense of the
content derived from the LINKed structure being compiled in as a compound re-
strictor on the domain of the variable quantified over. The non-restrictive form
of construal, as the intermediate case, involves subordination only in the sense that
the content derived from the LINKed structure is quantified over within a single
scope-binding domain, hence lacking the independence of quantification possible
with conjoined sentences.
3.4 Summary
Let us take stock of what we have achieved. We have an account of relative clauses
which provides a natural basis for distinguishing restrictive and non-restrictive rel-
ative clause construal, while nevertheless assuming a single rule for introducing the
structures which underpin these different types.59 Quantification is defined in terms
that allow maximum correspondence between the syntactic properties of quantified
noun phrases and the requisite scope-taking properties of quantifying expressions.
We have a preliminary account of crossover phenomena in relative clauses, which
shows how anaphoric and strict update processes can interact.60
59 Closely related to non-restrictive relative clause construal is the phenomenon of apposition:
of construal, as the data are said to be less clearcut with restrictive forms of construal. See
In all cases set out so far, the syntactic properties of the structures have been
shown to follow from the process of tree growth. Long-distance dependency phe-
nomena, crossover phenomena, and restrictive and non-restrictive relative forms of
clause construal, have all been characterised exclusively in terms of the structure
representing the final logical form, the input provided by lexical specifications, and
the process of goal-directed tree growth that is defined over a sequence of update
actions. The different syntactic and semantic properties of expressions are seen to
emerge from the dynamics of how logical structures are built up in a time-linear
parsing perspective.
So, though incomplete in many respects, we have the first detailed evidence
that the formalism set out can be seen not just as a tool for modelling the process
of interpretation, but as a putative grammar formalism. We now turn to seeing
how these three strategies of building linked structures, building unfixed nodes, and
decorating nodes with underspecified metavariables can interact in different ways
to yield cross-linguistic generalisations.
Kempson et al. (2001) for a full discussion of crossover phenomena across relative clauses and
questions, and Cann et al. (forthcoming) for discussion of the problem of how best to characterise
borderline judgements of grammaticality.
Chapter 4

Tree Growth and Language Typologies
account, then, is to determine which forms of update are particular to human lan-
guage, which are particular to the individual language, and which are a consequence
of general cognitive constraints.
Our overall aim in chapters 4 and 5 is to explain the interaction between
anaphoric construal and structure building processes, and left and right periph-
ery effects. The first step is relatively modest. We set out a preliminary partial
typology of restrictive relative clauses, focusing on head-initial relatives. We then
build on the partial parallelism between this and construal of left-peripheral expres-
sions to set out a typology of left-periphery effects, using the concepts of LINKed
structures, unfixed nodes and anaphora resolution. These areas have been subject
to intensive study, and, in this chapter, we aim merely to show that the DS tools
provide a natural basis for the gradient set of effects observable in left periphery con-
structions while retaining a unitary analysis of any individual pronoun. As things
turn out, however, the feeding relations between the different tree-update processes
provide the basis for a more fine-grained characterisation of the data both in relative
clause and left-periphery constructions than a two-way distinction between struc-
tural and anaphoric forms of dependency is able to express. The principles remain
simple and general even though the interactions between the various constraints on
their implementation may give rise to overlapping tree-growth processes, giving the
appearance of complexity.
Then in the following chapter, we turn to right-periphery effects which, expletive
pronouns aside, have been the subject of much less study; and we shall show that
the same concepts of anaphorically pairing LINKed structures and building unfixed
nodes can be applied to rather different effect to capture generalisations about how
interpretation is built up at later stages of the construal process. This is a more
surprising result, as many right-periphery effects are subject to the much tighter
Right Roof Constraint, which in other frameworks remains a poorly understood
constraint, requiring structure-specific stipulation.
From there we shall turn to head-final languages and use the same tools all
over again, both to characterise word order effects in Japanese, and to capture the
various forms of relative clause construal. Thereafter, we shall turn to the Bantu
languages and look at the puzzle of co-ordination and mis-matching agreement data,
where again we shall see the influence of the dynamics of processing. The result
will be a rounding out of the typology of relative clauses to determine predictions
of both the limits on the type of processes that natural languages make available,
and on the type of variations to expect within that landscape. Right across the
different types of languages, asymmetries between left and right periphery effects
will be explained as consequences of the dynamics of on-line processing.
languages. Modern Irish glas does not map on to Modern English green, since part of its spectrum
includes grey (of animals).
4.1. TOWARDS A RELATIVE CLAUSE TYPOLOGY 123
Arabic, all related to the definite article. In the main, the Arabic we shall use as illustration is
Egyptian Arabic.
What these actions stipulate is that in the presence of a terminal type e node, a
LINK transition is constructed onto a new topnode with two requirements: one of
?T y(t) and an additional requirement for a copy of the formula decorating the head
node from which the LINK relation was built. This specification is notably like
the computational action of LINK Introduction as defined for English, but here
defined as a lexical action, triggered by the complementiser ʔilli. However, there
are differences. Firstly, this sequence of actions is restricted to applying to nodes
decorated only by variables, since the condition on action lists the bottom restric-
tion as a necessary condition, so it will only trigger restrictive forms of construal.
Secondly, this complementiser is restricted to occurring with definite noun phrases
and we stipulate a definite environment.3 Thirdly, the requirement for the copy
involves a modality not yet encountered. This is shown by the operator ⟨D⟩ which
ranges over both daughter, ⟨↓⟩, and LINK relations, ⟨L⟩, while its inverse, ⟨U⟩,
conversely ranges over mother, ⟨↑⟩, and inverse LINK relations, ⟨L−1⟩. Because it
ranges over LINK relations this modality does not observe strong islands and thus
imposes no structural constraint whatever on where in the construction process the
copy head is to be found. Indeed, the required pronoun may occur across a relative
clause boundary, a classic island-restriction violation:4
(4.3) tʕarrafna ʕala l-muxrizh yalli laila sheefit l-masraḥiyye
met.1st.pl on the-director that Laila saw.3rd.sg.fem the-play
yalli huwwe ʔaxraž-a [Lebanese Arabic]
that he directed.3rd.sg.masc-it
‘We met the director that Laila saw the play that he directed it.’
The first consequence of defining ?p illi in this way is that, with no particular
lexical item defined in the language as projecting the required copy, the only way of
meeting it is to use the regular anaphoric process of the language, for this by defini-
tion involves a form which must be interpreted by some term taken from somewhere
else in the context. Hence, there must be some pronoun occurring at some point
in the subsequent string, and it must be interpreted as having an interpretation in
which its value is token-identical to that of the formula provided by the head node.
Any other interpretation of the pronoun (replacing the pronoun’s meta-variable
with some independently available term) will leave the LINKed structure with a
requirement outstanding, hence not well-formed. Any number of occurrences of a
pronoun may occur in the subsequent string: all that matters is that one of them be
interpreted as identical with the head from which the relative structure is induced.
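The effect of this unbounded modal requirement can be pictured with a toy check (our illustration, not the authors' formalism): the search for a pronoun construed as the head crosses every structural boundary, including relative clause islands.

```python
# Toy check (ours): node decorations are strings; a pronoun construed
# as the head is recorded as the head's formula. The search recurses
# through arbitrarily deep structure, so no island blocks it.
def requirement_met(head, structure):
    if isinstance(structure, list):
        return any(requirement_met(head, part) for part in structure)
    return structure == head

# Schematic rendering of (4.3): the resumptive pronoun sits inside a
# relative clause island, yet still satisfies the copy requirement.
tree = ['Laila', ['saw', ['the-play', ['he', ['directed', 'the-director']]]]]
assert requirement_met('the-director', tree)
assert not requirement_met('the-director', ['Laila', ['saw', 'the-play']])
```

If no pronoun in the LINKed structure is construed as the head, the requirement stays outstanding and the structure is not well-formed, as the text describes.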
The significance of this account is that, unlike other characterisations of resump-
tive pronouns, it makes use of an entirely general account of pronouns. No special
resumptive form for the pronoun needs to be posited, and there is no invocation of
any “intrusive” pronoun either: the modal requirement does all the work. Despite
the emptiness of the locality restriction that this particular form of requirement im-
poses (using the least constraining hDi operator), the requirement still has the effect
of enforcing application of the general process of substitution during the construc-
tion process. In order to meet the well-formedness condition that no requirements
remain unsatisfied in the resulting logical form, the only possible sequences of tran-
sitions, given the non-anaphoric properties of the relativiser ?p illi, will be ones in
which at least one anaphoric device is assigned a construal which allows the modal
3 Notice again that this remains to be given a proper semantic characterisation, although if
tree in Arabic following upon the transition ensured through the use of ʔilli is no more than
a superficial puzzle, given the analysis of Arabic as a subject pro-drop language, with verbs
introducing a full propositional structure from a trigger ?Ty(t), decorating the introduced subject
node with a metavariable. In cases in which an explicit subject expression is present in the string,
there may then be an application of *Adjunction, the subject value decorating the unfixed node
and merging with that given by the verb.
6 Demirdache (1991) and others have argued that there are crossover effects involving epithets.
However, it has been counter-argued by Aoun and Choueiri (2000) that such cases constitute a
Principle C effect. We leave this issue to one side.
4.1. TOWARDS A RELATIVE CLAUSE TYPOLOGY 127
7 See Kempson et al. (2001) for a detailed, similar, specification of the required lexical and
computational actions.
128 CHAPTER 4. TREE GROWTH AND LANGUAGE TYPOLOGIES
[Tree display in source: a node decorated Fo(U), Ty(e), ♦, with Fo(Darab′),
Ty(e → (e → t)) and Fo(x) decorating related nodes.]
It is notable in this regard that the same rule of LINK evaluation for restrictive
construals operates identically in both languages, yielding a construal of the noun
phrase in (4.5b) as (4.8).
(4.8) Fo(ε, x, Mudarris′(x) ∧ Darab′(x)(ι, z, Magdi′(z))).
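The compositional effect of this evaluation step can be given a small, purely string-based Python sketch (the function name and the ASCII stand-ins `eps`, `iota` and `&` for ε, ι and ∧ are ours, for illustration only): the formula projected by the LINKed tree is conjoined into the restrictor of the head's epsilon term.

```python
def link_evaluate(var, head_restrictor, linked_formula):
    """Toy rendering of LINK evaluation for restrictive construal:
    conjoin the LINKed tree's formula into the restrictor of the
    head's epsilon term, as in (4.8). 'eps', 'iota' and '&' stand in
    for the epsilon, iota and conjunction symbols."""
    return f"Fo(eps, {var}, {head_restrictor} & {linked_formula})"

print(link_evaluate("x", "Mudarris'(x)", "Darab'(x)(iota, z, Magdi'(z))"))
# -> Fo(eps, x, Mudarris'(x) & Darab'(x)(iota, z, Magdi'(z)))
```

Running the function on the restrictor and LINKed formula of (4.5b) reproduces the shape of (4.8): one epsilon term whose restrictor is the conjunction of the common-noun content and the relative-clause content.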
they are often characterised as rescue devices (Sells 1984, etc.), but, as these examples show
(all of which except (4.9b) are collected data), their occurrence is very far from being restricted
to such uses. Many of the examples quoted here are part of the collection made by Tami Kaplan
over a period of six years, and we are grateful to her for allowing us to use them. See Cann et al.
(forthcoming) for detailed discussion.
acceptable if said with heavy stress on the pronoun, with the implication of contrast
between this term and some other term, which is not made explicit:
(4.11) a. John, who had to be the one who said we did very little teaching,
was roundly condemned.
b. ?John, who he had to be the one who said we did very little
teaching, was roundly condemned.
c. John, who HE had to be the one who said we did very little teaching,
was roundly condemned.
But this is the extra implication which would warrant the use of a pronoun. Put
crudely, if you want to say something with contrastive implication, you need to
stress it, and you cannot stress silence. Under these circumstances, a pronoun
becomes fully acceptable, just as Relevance-theoretic assumptions lead us to expect.
Conversely, when subordinate structures have to be constructed, use of a pro-
noun seems more acceptable than in monoclausal sequences such as (4.9b) and
(4.10b); and, though it is difficult to evaluate what level of parsing difficulty would
warrant use of a pronoun, this might be taken to indicate a level of complexity in
parsing that makes the use of a pronoun more acceptable because of some clarifi-
catory function. This too is along the lines that Relevance Theory leads us to
expect, this time the motivation being the minimisation of the effort by which the
hearer is expected to establish the intended interpretation.9
Various other effects, both pragmatic and processual, may be involved in the
parsing of resumptive pronouns in English (see Cann et al. (forthcoming) for dis-
cussion). But this still leaves us with the puzzle of why speakers do not uniformly
judge them to be well-formed, if sentences such as these can indeed be accept-
ably used. We address the concepts of well-formedness and production in detail in
chapter 9, but in the meantime what is notable about these examples is that the
context which motivates a speaker’s use of such strings and the context which the
hearer is able to bring to bear in interpreting them may not be the same,
even though for both the string is processable. For example, a speaker may have a
particular entity in mind, hence in her own discourse context, for which the hearer
can only construct the appropriate term having gone through the parsing process,
so not having the analogous term in his own discourse context. As we shall see in
chapter 9, successful communication involves the parser and the producer building
structures that are in exact correspondence. This suggests that informants’ judge-
ments of lack of well-formedness are due to a recognition of mismatch between the
contexts assumed by speaker and hearer, hence an assessment that the conditions
for successful communication are not fulfilled.
Whatever the pragmatic/performance-based explanation of why these data are
peripheral in English, confirmation that the restriction on resumptive-pronoun use
in English is indeed system-external, and not a constraint to be encoded in the
grammar formalism, comes, surprisingly, from the construal of pronouns in subject
position in Arabic. Despite the obligatory use of pronouns in non-subject position,
pronouns in subject position are generally avoided. However, Arabic displays a
restricted freedom exactly parallel to that more generally available in
9 Though we do not address the problem in detail, the fact that definite NPs can be used
resumptively is also not unexpected, since definite NPs can be defined, as suggested in chapter 3,
as a complex form of anaphoric device (see chapter 8):
(i) John came in. The poor dear was upset.
(ii) That friend of yours who the idiot had to be the one to admit that our teaching loads were
low, was roundly condemned.
(iii) John, who Sue tells me the poor dear is suffering dreadfully from overwork, is still at the office.