1992, Proceedings of the 10th European Conference on Artificial Intelligence
Feature structures are partially specified, record-like structures which are employed in many recent grammar formalisms to represent linguistic objects of various kinds. Building on previous approaches to the logical representation of linguistic knowledge, this paper presents a logical language which is sufficiently expressive to allow for the encoding of recursive constraints on feature structures. A particular concern of this paper is to show how formulas of the logic can be used to capture the denotation of a grammar considered as a recursive definition of a class of linguistic objects. However, the logic may be of interest to researchers working in the area of general knowledge representation.
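The unification operation at the core of such formalisms can be pictured with a minimal sketch (not taken from the paper; typing, structure sharing, and reentrancy are deliberately omitted), treating feature structures as nested Python dicts:

```python
def unify(fs1, fs2):
    """Unify two feature structures represented as nested dicts.

    Atomic values must match exactly; dicts unify feature-by-feature.
    Returns the unified structure, or None on failure.
    """
    if not isinstance(fs1, dict) or not isinstance(fs2, dict):
        return fs1 if fs1 == fs2 else None
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result:
            sub = unify(result[feat], val)
            if sub is None:
                return None  # clash on a shared feature
            result[feat] = sub
        else:
            result[feat] = val
    return result

np = {"cat": "NP", "agr": {"num": "sg"}}
subj_constraint = {"agr": {"num": "sg", "per": 3}}
print(unify(np, subj_constraint))
# {'cat': 'NP', 'agr': {'num': 'sg', 'per': 3}}
```

Recursive constraints of the kind the paper encodes go beyond this sketch: they relate a structure to structures embedded arbitrarily deep within it, which is why a dedicated logic rather than plain unification is needed.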
1984
The design, implementation, and use of grammar formalisms for natural language have constituted a major branch of computational linguistics throughout its development. By viewing grammar formalisms as just a special case of computer languages, we can take advantage of the machinery of denotational semantics to provide a precise specification of their meaning. Using Dana Scott's domain theory, we elucidate the nature of the feature systems used in augmented phrase-structure grammar formalisms, in particular those of recent versions of generalized phrase structure grammar, lexical functional grammar and PATR-II, and provide a denotational semantics for a simple grammar formalism. We find that the mathematical structures developed for this purpose contain an operation of feature generalization, not available in those grammar formalisms, that can be used to give a partial account of the effect of coordination on syntactic features.
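Feature generalization, the operation mentioned above, is the dual of unification: it keeps only the information on which two structures agree. A rough sketch (plain dicts standing in for the domain-theoretic construction; the NP example is invented for illustration) shows how a coordinated phrase can receive just the features common to both conjuncts:

```python
def generalize(fs1, fs2):
    """Keep only the information two feature structures share
    (generalization, the dual of unification)."""
    if not isinstance(fs1, dict) or not isinstance(fs2, dict):
        return fs1 if fs1 == fs2 else None  # conflicting atoms contribute nothing
    out = {}
    for feat in fs1:
        if feat in fs2:
            shared = generalize(fs1[feat], fs2[feat])
            if shared is not None:
                out[feat] = shared
    return out

# Coordinating a singular and a plural NP: the conjunction keeps the
# shared category and person, but carries no number feature at all.
sg_np = {"cat": "NP", "agr": {"num": "sg", "per": 3}}
pl_np = {"cat": "NP", "agr": {"num": "pl", "per": 3}}
print(generalize(sg_np, pl_np))
# {'cat': 'NP', 'agr': {'per': 3}}
```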
2005
Minimal recursion semantics (MRS) is a framework for computational semantics that is suitable for parsing and generation and that can be implemented in typed feature structure formalisms. We discuss why, in general, a semantic representation with minimal structure is desirable and illustrate how a descriptively adequate representation with a nonrecursive structure may be achieved.
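The nonrecursive structure MRS advocates can be pictured with a small sketch (the predicate names, handle numbering, and constraint tuples below are illustrative, not the paper's exact inventory): instead of nesting one quantifier inside the other, a sentence is a flat bag of elementary predications plus handle constraints that leave quantifier scope underspecified:

```python
# "Every dog chases some cat" as a flat, MRS-style bag of elementary
# predications.  Scope is underspecified via qeq handle constraints
# rather than encoded by recursive nesting.
eps = [
    {"pred": "every_q", "label": "h1", "arg0": "x", "rstr": "h2", "body": "h3"},
    {"pred": "dog_n",   "label": "h4", "arg0": "x"},
    {"pred": "some_q",  "label": "h5", "arg0": "y", "rstr": "h6", "body": "h7"},
    {"pred": "cat_n",   "label": "h8", "arg0": "y"},
    {"pred": "chase_v", "label": "h9", "arg0": "e", "arg1": "x", "arg2": "y"},
]
# qeq: the hole is filled by the label, possibly with quantifiers between.
hcons = [("h2", "qeq", "h4"), ("h6", "qeq", "h8")]

preds = [ep["pred"] for ep in eps]
```

Both scope readings are recovered by plugging labels into holes consistently with `hcons`; the representation itself stays a flat list, which is what makes it convenient for both parsing and generation.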
Research on Language and Computation, 2005
This paper concerns grammatical phenomena sensitive to certain classes of nominal forms, i.e. those that encode different kinds of referential properties of the nominal. We propose a grammar component for defining and picking out such semantic classes of nominal forms within typed feature structure formalisms such as the one used in HPSG, thus aiming at standardizing the representation of such phenomena. The grammar component includes four semantic features associated with the discourse referent of a nominal, i.e. cognitive status, specificity, partitivity, and whether the nominal has a universal interpretation or not. The proposed grammar component reduces to an assumed minimum a relatively large set of features that have already been proposed in analyses of the kind of phenomena at focus here, and it is hypothesized that parts of the structure are likely to be shared among grammars for different languages.
2001
We develop a framework for formalizing semantic construction within grammars expressed in typed feature structure logics, including HPSG. The approach provides an alternative to the lambda calculus; it maintains much of the desirable flexibility of unification-based approaches to composition, while constraining the allowable operations in order to capture basic generalizations and improve maintainability.
2002
As part of a long-range project that aims at establishing database-theoretic semantics as a model of computational semantics, this presentation focuses on the development of a syntactic component for processing strings of words or sentences to construct semantic data structures. For design and modeling purposes, the present treatment will be restricted to the analysis of some problematic constructions of Korean involving semi-free word order, conjunction and temporal anchoring, and adnominal modification and antecedent binding. The present work relies heavily on Hausser's (1999, 2000) SLIM theory of language, which is based on surface compositionality, time-linearity, and two other conditions on natural language processing. Time-linear syntax for natural language has been shown to be conceptually simple and computationally efficient. The associated semantics is complex, however, because it must deal with situated language involving interacting multi-agents. Nevertheless, by processing input word strings in a time-linear mode, the syntax can incrementally construct the necessary semantic structures for relevant queries and valid inferences. The fragment of Korean syntax will be implemented in Malaga, a C-type implementation language that was enriched for both programming and debugging purposes and made particularly suitable for implementing Left-Associative Grammar. This presentation will show how the system of syntactic rules with constraining subrules processes Korean sentences in a step-by-step time-linear manner to incrementally construct semantic data structures that mainly specify relations with their argument, temporal, and binding structures.
Proceedings of the National Conference on …, 1980
We use an extended notion of functional relation here that includes surface syntactic relations, logical syntactic (or shallow case structure) relations, and relations useful for determining discourse structures such as primary focus.
Proceedings of the 24th Annual Meeting of the Association for Computational Linguistics, 1986
Consideration of the question of meaning in the framework of linguistics often requires an allusion to sets and other higher-order notions. The traditional approach to representing and reasoning about meaning in a computational setting has been to use knowledge representation systems that are either based on first-order logic or that use mechanisms whose formal justifications are to be provided after the fact. In this paper we shall consider the use of a higher-order logic for this task. We first present a version of definite clauses (positive Horn clauses) that is based on this logic. Predicate and function variables may occur in such clauses and the terms in the language are the typed λ-terms. Such term structures have a richness that may be exploited in representing meanings. We also describe a higher-order logic programming language, called λProlog, which represents programs as higher-order definite clauses and interprets them using a depth-first interpreter. A virtue of this language is that it is possible to write programs in it that integrate syntactic and semantic analyses into one computational paradigm. This is to be contrasted with the more common practice of using two entirely different computation paradigms, such as DCGs or ATNs for parsing and frames or semantic nets for semantic processing. We illustrate such an integration in this language by considering a simple example, and we claim that its use makes the task of providing formal justifications for the computations specified much more direct.
1993
Syntax/semantics interfaces using unification-based or feature-based formalisms are increasingly common in the computational linguistics literature.
2000
Building treebanks is a prerequisite for various experiments and research tasks in the area of NLP. Under a recently awarded grant, we are developing (i) a formal definition of a (dependency-based) tree, and (ii) a midsize treebank based on this definition. The annotated corpus is designed to have three layers: morphosyntactic (linear) tagging, syntactic dependency annotation, and tectogrammatical annotation. The project is being carried out jointly at the authors' Institutes.

1 The Current State and Motivation

Recent decades have seen a shift towards expressing linguistic knowledge in ways which allow its verification and processing by formal means. Tools originating in mathematics, logic and computer science have been applied to human language to model its structure and functioning. Various aspects of different languages are being described within formally defined frameworks proposed by a number of interacting linguistic theories. The proposals deal with various levels of ...
Journal of Language Modelling
Linguistic description and language modelling need to be formally sound and complete while still being supported by data. We present a linguistic framework that bridges such formal and descriptive requirements, based on the representation of syntactic information by means of local properties. This approach, called Property Grammars, provides a formal basis for the description of specific characteristics as well as entire constructions. In contrast with other formalisms, all information is represented at the same level (no property playing a more important role than another) and independently (any property being evaluable separately). As a consequence, a syntactic description, instead of a complete hierarchical structure (typically a tree), is a set of multiple relations between words. This characteristic is crucial when describing unrestricted data, including spoken language. We show in this paper how local properties can implement any kind of syntactic information and constitute a...
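The idea that every property is evaluable separately, with the description being a set of evaluation results rather than a tree, can be sketched as follows (the property names and categories are invented for the example, not the Property Grammars inventory):

```python
# Each property is an independently evaluable constraint over an
# ordered list of categories; a syntactic description is the set of
# (property, satisfied?) pairs rather than a single hierarchy.
def linearity(before, after):
    def check(cats):
        if before in cats and after in cats:
            return cats.index(before) < cats.index(after)
        return True  # vacuously satisfied when either is absent
    return check

def requirement(a, b):
    def check(cats):
        return b in cats if a in cats else True
    return check

def exclusion(a, b):
    def check(cats):
        return not (a in cats and b in cats)
    return check

np_properties = {
    "det < noun": linearity("det", "noun"),
    "det requires noun": requirement("det", "noun"),
    "pronoun excludes det": exclusion("pronoun", "det"),
}

cats = ["det", "adj", "noun"]
evaluation = {name: prop(cats) for name, prop in np_properties.items()}
print(evaluation)
```

Because no property depends on another, a partial or ill-formed utterance (as in spoken language) still receives a description: some properties simply come out unsatisfied instead of the whole parse failing.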
Proceedings of the 11th Conference on Computational Linguistics, 1986
Proceedings of the First Conference of the European Chapter of the Association for Computational Linguistics, 1983
Journal of Logic and Computation, 2007
Journal of Logic Programming, 1996
Proceedings of the 21st Annual Meeting of the Association …, 1983