The paper explores the syntax-semantics interface in linguistics, comparing natural languages to artificial languages to understand how syntactic structures correspond to semantic representations. It discusses the implications of these relationships for theoretical frameworks in understanding language comprehension and production, illustrating key concepts with examples and figures that outline the need for a nuanced understanding of grammatical and semantic interactions.
2008
There is a tendency in science to proceed from descriptive methods towards an adequate explanatory theory and then move beyond its conclusions. Our purpose is to discover the concepts of computational efficiency in natural language that exclude redundancy, and to investigate how these relate to more general principles. By developing the idea that linguistic structures possess the features of other biological systems, this article focuses on the third factor that enters into the growth of language in the individual. It is suggested that the core principles of grammar can be observed in nature itself. The Faculty of Language is an efficient mechanism designed for the continuation of movement in compliance with optimization requirements. To illustrate this, a functional explanation of syntactic Merge is offered in this work, and an attempt is made to identify some criteria that single out this particular computational system as species-specific.
P. Brandt und E. Fuss, Hgg., Form, Structure and …, 2006
Word order correlates with distinctions of information structure, and Günther Grewendorf's work has contributed much to our understanding of what the pertinent regularities are in German and in other languages such as Italian or Japanese; it has also shaped our understanding of how these regularities are linked to grammar. His early work constitutes one of the first concrete proposals for capturing the impact of informational distinctions on German word order. Sabel (1994, 1999) developed one of the most detailed models of word order variation in the middle field. Grewendorf has also contributed substantially to recent developments concerning the left periphery of clauses, based on the view that categories like focus and topic are directly represented in the syntax. Rizzi proposed that there are Topic and Focus heads situated in the higher functional layers of the clause, and that the specifiers of these heads host topic and focus phrases, respectively: if not already in the surface representation, then at least at LF. Rizzi's view has since been elaborated.
2014
This essay describes and comments on some of the key developments in the history of formal semantics and its relation to syntax, focusing on the period from the beginnings of Chomskyan generative grammar in the 1950s up until the mature period of formal semantics in the 1980s, with only a few remarks about earlier background and later developments. One crucial theme is the challenge of developing a compositional semantic theory, and how that challenge has taken on different forms with respect to different syntactic and semantic theories. The 'syntax-semantics interface' is a relatively young topic in the history of linguistics. In early Western linguistics, there was little syntax and essentially no semantics other than some non-formal lexical semantics. Formal syntax came first, with Zellig Harris's student Noam Chomsky revolutionizing the field of linguistics with his work on syntax. Chomsky shared the field's general skepticism about the possibility of semantics...
The linguistic landscape is littered with literally hundreds if not thousands of theories of syntax, many with no more than a handful of adherents, a smallish number with considerable numbers of practitioners; just a few theories dominate over the rest, and account for the majority of syntacticians. Syntactic theories are commonly grouped into two broad types, formal and functional (see §1.5). Formal theories of syntax focus on linguistic form, relegating meaning to a peripheral position. Functional theories by contrast tend to focus on the functions language serves, and the ways that syntax is organised to serve these functions; meaning plays a central role.
2013
Since its inception, generative grammar has pursued a 'divide and conquer' strategy with respect to the study of linguistic phenomena, inheriting from its predecessors in structuralist linguistics and traditional grammar the useful notion that different linguistic phenomena are amenable to modes of explanation which suggest the existence of clusters of linguistic phenomena, some related to sound, some to grammar, some to meaning, some to aspects of language use, and so on. To capture this, Chomsky (1955) formalized the notion of a 'linguistic level' and insisted on the relative autonomy of these levels. The differentiation of linguistic levels was far from new in itself, of course, but it takes on a new significance given Chomsky's mentalist perspective, delivering a view of the language faculty, and by extension cognition, as something structured, with differentiated subcomponents, including minimally a lexicon, and phonological, syntactic, and semantic components, each with their own structural characteristics, in stark contrast to the then-dominant behaviourist view of the mind as a unitary black box.

Naturally, a theory containing multiple distinct levels of representation raises immediate questions. How many levels are there, and what exactly do the different levels do? Is it an accident that the levels enumerated in the previous paragraph correspond so closely to the classical levels of linguistic analysis in structuralist and earlier linguistics? What can a level of linguistic representation look like? Equally, questions arise immediately about interactions between representations, now commonly discussed under the term linguistic interfaces: what does the output of each level look like, and how does it feed into the next level(s)?
2012
Abstracting away from the particulars of the various algebraic logics that Quine proposes as possible substitutes for a more traditional first-order predicate logic, three critical aspects of how complex formulas are built are relevant for our purposes: (i) conjunction of atomic formulas, (ii) identification of variables within conjuncts and across conjuncts, and (iii) selection of a variable for 'outside composition,' that is, selection of a variable within a formula which will be targeted by an operator that outscopes that formula. As we now show, all three features of Quine's reanalysis play an essential role in our model of the syntax/semantics interface of Oneida. We focus in this paper on three of the five major constructions we have identified in Oneida. For considerations of space, the description is informal or semi-formal, and we focus on the semantic effect of the syntactic combination, in particular on the conjunctive mode of semantic composition (see Koenig and Michelson 2012 f...
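The three formula-building operations described in the abstract can be made concrete with a toy encoding. The sketch below is my own illustration, not the paper's formalism; the predicates and variable names are invented, and a formula is simplified to a list of conjuncts.

```python
# Toy sketch (not the paper's formalism) of Quine-style formula building.
# A formula is a list of (predicate, variable-tuple) conjuncts.

def atom(pred, *variables):
    # An atomic formula: a predicate applied to variable names.
    return [(pred, variables)]

def conj(f, g):
    # (i) Conjunction of atomic formulas: pool the conjuncts.
    return f + g

def identify(f, v, w):
    # (ii) Identification of variables: rename w to v across all conjuncts.
    return [(p, tuple(v if x == w else x for x in xs)) for p, xs in f]

def select(f, v):
    # (iii) Selection of a variable for 'outside composition': expose v
    # so that an outscoping operator can target it.
    return (v, f)

# 'x saw y, and z ran', with the seer and the runner identified:
formula = identify(conj(atom("see", "x", "y"), atom("run", "z")), "x", "z")
# → [("see", ("x", "y")), ("run", ("x",))]
```

The conjunctive mode of composition mentioned in the abstract corresponds here to `conj` plus `identify`: combining two predications amounts to pooling their conjuncts and tying their variables together.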
Features: Perspectives on a key notion in linguistics, ed …, 2010
In some languages, such as Hebrew, definiteness is encoded as a morphosyntactic feature that not only contributes to the semantics but also plays a role in syntactic operations. In other languages, there is no evidence that definiteness as a feature is available to the syntactic component. In this paper I argue that differences in the range of interpretations of complex genitive constructions in Hebrew versus other languages show that there are two different strategies for constructing the meaning of complex nominals: one that relies on sharing a morphosyntactic definiteness feature (which is possible only in languages that have such a feature), and one that does not. Furthermore, it is argued that morphosyntactic definiteness in Hebrew is not a bivalent feature with two possible values, but a monovalent feature whose presence alternates with lack of specification, which accounts for various asymmetries between definiteness and indefiniteness.
The Sign of the V: Papers in Honour of Sten Vikner, 2019
In this paper it is shown that Danish syntactic constructions, such as accusative + infinitive, e.g. Hun så ham komme (She saw him come), accusative + to-infinitive, that-clauses and preposition + that-clauses, have their own type of meaning potential, exactly like lexical items, such as perception predicates: see, hear, control predicates: permit, offer, and mental NEG-raising predicates: think, hope. The types of meaning that syntactic constructions can have as predications are: state of affairs, proposition, illocution and fact. Both lexical items and syntactic constructions are polysemous and disambiguate each other when combined in a clause according to a general rule that may be stated similarly to the way that the rule for a lexical entry may. Some examples such as Hun bad ham komme (She asked him to come) and Hun lod ham begrave (She let him be buried) are identified and given an explanation.
One important aspect of teaching English syntax (to native and nonnative undergraduate students alike) involves the balance in the overall approach between facts and theory. We understand that one important goal of teaching English syntax to undergraduate students is to help students enhance their understanding of the structure of English in a systematic and scientific way. Basic knowledge of this kind is essential for students to move on to the next stages, in which they will be able to perform linguistic analyses of simple as well as complex English phenomena. This new introductory textbook has been developed with this goal in mind. The book focuses primarily on the descriptive facts of English syntax, presented in a way that encourages students to develop keen insights into the English data. It then proceeds with the basic, theoretical concepts of generative grammar from which students can develop abilities to think, reason, and analyze English sentences from linguistic points of view.
Acta Linguistica Academica, 2020
Proceedings of the first conference on European chapter of the Association for Computational Linguistics -, 1983
describes the structure and evaluation
Journal of Linguistics, 2010
This linguistic textbook by Peter Culicover provides a broad introductory overview of various topics in the study of syntax. Its major objectives are to show how natural language makes use of various syntactic and morphosyntactic devices, and to lay out the conceptual structures that correspond to particular aspects of linguistic form (xi). The title of the book is thought-provoking: rather than Introduction to syntax, Syntactic theory or the like, Culicover chose the title Natural language syntax, which can be interpreted twofold. First, it may be regarded as a direct response to recent research in syntax, especially the Minimalist program (Chomsky 1995), emphasizing that linguistic theory needs to place greater emphasis on accounting for actual language data rather than indulge in purely formalistic investigation. Another possible reading of the title is that the syntactic theory proposed in this book is natural, intuitive and simple, and that the title alludes to Culicover's previous book Simpler syntax (Culicover & Jackendoff 2005). In general, the discussions in this book are framed in theory-neutral terms, supplemented with comparisons between Culicover & Jackendoff's 'simpler syntax' (SS) approach and so-called mainstream generative grammar (MGG). Chapter 1, 'Overview', defines syntax as 'the system that governs the relationship between form and meaning in a language' (1); and states that the goal of linguistic theory is 'to understand what the properties of human languages are, and why they are that way' (3). Culicover compares the SS and MGG approaches for the treatment of displacement. Instead of pursuing a derivational analysis, as in MGG, SS directly specifies 'correspondence rules' between syntactic positions and meanings.
Chapter 2, 'Syntactic categories', provides a detailed description of various syntactic and morphosyntactic categories, and introduces the notational device of an attribute value matrix (also employed in Lexical Functional Grammar and Head-driven Phrase Structure Grammar), which displays the features and values of lexical items. To Culicover, a lexical item expresses the correspondences between morphosyntactic categories, semantic values and phonological representation, as exemplified in (1).
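For readers unfamiliar with the notation, an attribute value matrix can be mimicked as a nested mapping from attributes to values. The sketch below is a hypothetical illustration only; the attribute names and the lexical entry are invented, not taken from the book.

```python
# Hypothetical sketch of an attribute value matrix (AVM) as a nested dict,
# in the spirit of the notation used in LFG and HPSG. A lexical item pairs
# phonological, syntactic, and semantic information in one structure.
dog = {
    "PHON": "dog",                       # phonological representation
    "SYN": {"CAT": "N", "NUM": "sg"},    # morphosyntactic category/features
    "SEM": {"PRED": "DOG"},              # semantic value
}

def get_attr(avm, path):
    # Follow a path of attribute names down through a nested AVM.
    for attr in path:
        avm = avm[attr]
    return avm

print(get_attr(dog, ["SYN", "CAT"]))  # prints N
```

The point of the device is that one object simultaneously encodes the correspondences among form, category, and meaning, which is exactly what Culicover's lexical items are said to express.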
The Iterated Romance of Syntax, 2023
One of the greatest intellectual crimes to beset us in the 20th century was the premature death of the formalist program. The millennia-old dream of solving math was never realized, as our efforts fell short of our ambition. David Hilbert, with all his grace, his effulgent brilliance, his professional magnanimity, and his unflinching dedication, was left with only disgruntled disappointment. The formalist program was a noble effort, held aloft by the unyielding conviction and charisma of the foremost mathematicians of the age. This forlorn vignette is rendered at least somewhat emotionally digestible by the developments that followed. The dream that syntax is all became partially realized by the contributions of Haskell Curry, Alonzo Church, Stephen Kleene, Moses Schönfinkel and others. With their syntactic approach to mathematics, we capture the compositional beauty of infinity in our humble symbols. And thus, we compile the syntactic face of God.
Inference: International Review of Science
In the first essay for a new series on landmark texts, Anna Maria Di Sciullo revisits Noam Chomsky's classic Aspects of the Theory of Syntax. Published in 1965, Aspects was a revelation, presenting linguists with what at once became the Standard Theory. It elevated linguistics to a science, one that accepted the methods and the standards of the hard sciences themselves.
Proceedings of the 21st annual meeting on Association for Computational Linguistics -, 1983
A central goal of linguistic theory is to explain why natural languages are the way they are. It has often been supposed that computational considerations ought to play a role in this characterization, but rigorous arguments along these lines have been difficult to come by. In this paper we show how a key "axiom" of certain theories of grammar, Subjacency, can be explained by appealing to general restrictions on on-line parsing plus natural constraints on the rule-writing vocabulary of grammars. The explanation avoids the problems with Marcus' [1980] attempt to account for the same constraint. The argument is robust with respect to machine implementation, and thus avoids the problems that often arise when making detailed claims about parsing efficiency. It has the added virtue of unifying in the functional domain of parsing certain grammatically disparate phenomena, as well as making a strong claim about the way in which the grammar is actually embedded into an on-line sentence processor.
In this programmatic paper, we articulate a minimalist conception of linguistic composition, syntactic and semantic, with the aim of identifying fundamental operations invoked by the human faculty of language (HFL). On this view, all complex expressions are formed via the operation COMBINE(A, B). But this operation is not primitive: COMBINE(A, B) = LABEL[CONCATENATE(A, B)]. We take labeling to be a computationally simple but perhaps distinctively human operation that converts a mere concatenation of expressions, like A^B, into a more complex unit like [_A A^B], with the subscript indicating a copy of the dominant constituent. We discuss several virtues of this spare conception of syntax. With regard to semantics, we take instances of COMBINE(A, B) to be instructions to build concepts. More specifically, we claim that concatenation is an instruction to conjoin monadic concepts, while labeling provides a vehicle for invoking thematic concepts, as indicated by the relevant labels.
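The decomposition COMBINE(A, B) = LABEL[CONCATENATE(A, B)] can be rendered as a toy program. The sketch below is an illustrative assumption, not the authors' formalism: it represents expressions as strings or labeled pairs, and it simplifies by always treating the first element as the dominant constituent whose label is copied.

```python
# Toy rendering (an assumption, not the authors' formalism) of
# COMBINE(A, B) = LABEL[CONCATENATE(A, B)].

def concatenate(a, b):
    # Form the mere concatenation A^B as an unlabeled pair.
    return (a, b)

def label(pair):
    # Convert A^B into [_A A^B] by copying the label of the dominant
    # constituent onto the whole unit. Taking the first element to be
    # dominant is a simplifying assumption of this sketch.
    head = pair[0]
    lbl = head[0] if isinstance(head, tuple) else head
    return (lbl, pair)

def combine(a, b):
    # COMBINE is not primitive: it is LABEL applied to CONCATENATE.
    return label(concatenate(a, b))

vp = combine("see", combine("the", "dog"))
# vp carries the label of its dominant constituent, "see"
```

The two-step structure mirrors the paper's claim: concatenation alone yields an unlabeled pair (the unit that semantics interprets conjunctively), while labeling adds the copied head that makes further composition, and the invocation of thematic concepts, possible.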
2012
One of the crucial functions of any human language, such as English or Korean, is to convey various kinds of information, from the everyday to the highly academic. Language provides a means for us to describe how to cook, how to remove cherry stains, how to understand English grammar, or how to provide a convincing argument. We commonly consider certain properties of language to be essential features from which the basic study of linguistics starts.