2004
In this paper we study categories of theories and interpretations. In these categories, notions of sameness of theories, like synonymy, bi-interpretability and mutual interpretability, take the form of isomorphism. We study the usual notions, like monomorphism and product, in the various categories. We provide some examples to separate notions across categories. In contrast, we show that, in some cases, notions in different categories do coincide. E.g., we can, under such-and-such conditions, infer synonymy of two theories from their being equivalent in the sense of a coarser equivalence relation. We illustrate that the categories offer an appropriate framework for conceptual analysis of notions. For example, we provide a ‘coordinate-free’ explication of the notion of axiom scheme. Also we give a closer analysis of the object-language/meta-language distinction. Our basic category can be enriched with a form of 2-structure. We use this 2-structure to characterize a salient subclass of...
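As an illustrative aside (not part of the paper), the abstract's picture of interpretations as composable morphisms, with synonymy appearing as isomorphism, can be sketched in a few lines of Python. All names here (`Interp`, `synonymous_on`) are hypothetical:

```python
# Illustrative sketch only: theories as objects, interpretations as
# morphisms given by sentence translations. Names are hypothetical.

class Interp:
    """A morphism from theory `src` to theory `tgt`, carried by a
    translation on sentences."""
    def __init__(self, src, tgt, translate):
        self.src, self.tgt, self.translate = src, tgt, translate

    def __mul__(self, other):
        """Composition: (self * other) first applies `other`, then `self`."""
        assert other.tgt == self.src
        return Interp(other.src, self.tgt,
                      lambda s: self.translate(other.translate(s)))

def identity(theory):
    """The identity interpretation of a theory in itself."""
    return Interp(theory, theory, lambda s: s)

def synonymous_on(K, M, sample_sentences):
    """K : T -> U and M : U -> T witness synonymy (isomorphism) when the
    round trip M∘K acts as the identity (the other direction is analogous);
    checked here on finitely many sample sentences only."""
    return all((M * K).translate(s) == s for s in sample_sentences)
```

A pair of mutually inverse translations then witnesses isomorphism in this toy category, mirroring how synonymy is cashed out in the paper's setting.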
Several philosophers of science construe models of scientific theories as set-theoretic structures. Some of them moreover claim that models should not be construed as structures in the sense of model theory because the latter are language-dependent. I argue that if we are ready to construe models as set-theoretic structures (strict semantic view), we could equally well construe them as model-theoretic structures of higher-order logic (liberal semantic view). I show that every family of set-theoretic structures has an associated language of higher-order logic and a model-theoretic counterpart, unique up to signature isomorphism, which is able to serve the same purposes. This allows us to carry over every syntactic criterion of equivalence for theories in the sense of the liberal semantic view to theories in the sense of the strict semantic view. Taken together, these results suggest that the recent dispute about the semantic view and its relation to the syntactic view can be resolved.
Notre Dame Journal of Formal Logic, 2007
being 'inverse to each other' admits important variations. For example, Chang and Keisler [5, §A.31] specify one particular choice of axiomatisation of ZF; for this axiomatisation a weak form of interpretation-equivalence of 'ZF with infinity negated' and PA can be proved, but for stronger notions of interpretation-equivalence a different axiomatisation of ZF seems to be required.
JOHN CORCORAN. 1980. A note concerning definitional equivalence, History and Philosophy of Logic 1: 231–34. MR83j:01002. This paper defines certain purely syntactic homogeneous relations between formal, uninterpreted theories based on different formal languages that are sublanguages of a more extensive language. The fact that a theory is uninterpreted does not prevent it from having interpretations, nor does it preclude the theory from being presented as the set of true sentences of a given concrete interpretation. The defined relations are called ‘interpretability’, ‘mutual interpretability’, and ‘definitional equivalence’. Let T0s be the set of sentences true in the standard interpretation of the second-order language of arithmetic based on primitives ‘0’ and ‘s’, for the number zero and the one-place operation of successor. Thus the constant terms of T0s are the strings ‘0’, ‘s0’, ‘ss0’, etc. Under the standard interpretation the universe of discourse of T0s is the set of natural numbers: 0, 1, 2, 3, etc. Let T0s^—note bold font—be the set of sentences true in the standard interpretation of the second-order language of two-character string theory based on primitives ‘0’, ‘s’, and ‘^’, for the zero numeral, the successor symbol, and the two-place operation of concatenation. Thus the constant terms of T0s^ are ‘0’, ‘s’, ‘(s^0)’, ‘(0^s)’, ‘(0^(s^0))’, etc. Under the standard interpretation the universe of discourse of T0s^ is the set of strings composed of the two characters ‘0’ and ‘s’: the two strings of length one ‘0’, ‘s’; the four strings of length two ‘00’, ‘0s’, ‘s0’, ‘ss’; the eight strings of length three ‘000’, ‘00s’, ‘0s0’, ‘s00’, ‘0ss’, ‘s0s’, ‘ss0’, ‘sss’; the sixteen strings of length four; etc. As noted in this paper, the result in Corcoran et al. 1974 implies that T0s is definitionally equivalent to T0s^. This does not mean that numbers are definable in terms of character strings.
Nor does it mean that the successor operation on numbers is definable in terms of character strings and concatenation. A fortiori, definitional equivalence of two theories implies nothing as to whether the ontological status of the objects used to interpret one is prior to or dependent on that of the objects used to interpret the other. Admittedly, the terms ‘interpretability’, ‘mutual interpretability’, and ‘definitional equivalence’ may suggest otherwise to people who fail to notice that the three expressions refer to relations between uninterpreted theories. Perhaps these and related issues concerning primitive concepts and primitive objects, as opposed to primitive symbols, can be discussed in the session.
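The counting claim and the two standard interpretations described in the review can be checked mechanically. The sketch below (the function names are mine, not Corcoran's) enumerates the strings over {‘0’, ‘s’} and maps the constant terms of T0s to the numbers they denote:

```python
from itertools import product

# Illustrative check of the review's counting claim: over the alphabet
# {'0', 's'} there are 2**n strings of length n -- 2, 4, 8, 16, ...

def strings_of_length(n, alphabet=("0", "s")):
    return ["".join(p) for p in product(alphabet, repeat=n)]

# Constant terms of T0s have the form 's...s0'; the term with n
# occurrences of 's' denotes the natural number n.
def numeral(n):
    return "s" * n + "0"

def denotation(term):
    assert set(term) <= {"0", "s"} and term.endswith("0")
    return term.count("s")
```

So ‘ss0’ denotes 2, and the strings of length four indeed number sixteen, as the review says.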
1984
Abstract: We present a formal theory of abstract interpretation based on a new category theoretic formalism. This formalism allows one to derive a collecting semantics which preserves continuity of lifted functions and for which the lifting functor is itself continuous. The theory of abstract interpretation is then presented as an approximation of this collecting semantics.
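To make the pattern in this abstract concrete, here is a standard textbook instance of abstract interpretation (a sign analysis), not the paper's category-theoretic formalism: a concrete semantics, its pointwise lifting to a collecting semantics over sets of states, and a sound abstract counterpart:

```python
# Toy abstract interpretation: signs approximating the collecting
# semantics of squaring. Domains and names are standard textbook
# choices, not the paper's formalism.

def square(x):
    """Concrete semantics of one operation."""
    return x * x

def collect(f, states):
    """Collecting semantics: pointwise lifting of f to sets of states."""
    return {f(x) for x in states}

# Abstract domain of signs, with TOP meaning "any sign".
NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def alpha(states):
    """Abstraction: the best sign describing a non-empty set of numbers."""
    signs = {NEG if x < 0 else ZERO if x == 0 else POS for x in states}
    return signs.pop() if len(signs) == 1 else TOP

def abstract_square(sign):
    """Sound abstract counterpart of `square`: a square is never negative."""
    return {NEG: POS, ZERO: ZERO, POS: POS, TOP: TOP}[sign]
```

Soundness here means that `abstract_square(alpha(S))` over-approximates `alpha(collect(square, S))` for every set `S`, which is the shape of the approximation relation the abstract describes.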
In this paper we consider the classical logicism program restricted to first-order logic. The main result is a theorem giving necessary and sufficient conditions for a mathematical theory to be reducible to logic: those and only those theories which impose no restrictions on the size of their domains can be reduced to pure logic. Among such theories we can mention the elementary theory of groups, the theory of combinators (combinatory logic), the elementary theory of topoi and many others. It is interesting to note that the initial formulation of the problem of reducing mathematics to logic is insoluble in principle. As is well known, all theorems of logic are true in models with any number of elements. At the same time, many mathematical theories impose restrictions on the size of their models. For example, all models of arithmetic have an infinite number of elements. If arithmetic were reducible to logic, it would have finite models, including a one-element model. But this is impossible in view of the axiom 0 ≠ x′.
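The closing point can be verified by brute force. The sketch below (names are mine) searches all interpretations of ‘0’ and successor over a finite domain for one satisfying ∀x. 0 ≠ x′, and finds none when the domain has a single element:

```python
from itertools import product

# Brute-force check: the axiom "zero is not a successor" has no
# one-element model. Over a domain of size n there are n choices of
# zero and n**n candidate successor functions.

def axiom_holds(domain, zero, succ):
    """Does the structure satisfy  forall x. zero != succ(x) ?"""
    return all(succ[x] != zero for x in domain)

def has_model_of_size(n):
    domain = range(n)
    return any(axiom_holds(domain, zero, dict(zip(domain, s)))
               for zero in domain
               for s in product(domain, repeat=n))
```

In a one-element model the successor function must map the sole element to itself, i.e. to zero, so the axiom fails; with two elements a model exists (successor constantly the non-zero element).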
In this paper we study the combined structure of the relations of theory-extension and interpretability between theories, for the case of finitely axiomatised theories. We focus on two main questions. The first is the definability of salient notions in terms of the structure. We show, for example, that local tolerance, locally faithful interpretability and the finite model property are definable over the structure. The second question is how to think about 'good' properties of theories, which are independent of implementation details, versus 'bad' properties, which do depend on implementation details. Our degree structure is suitable for studying this contrast, since one of our basic relations, to wit theory-extension, is dependent on implementation details while the other relation, interpretability, is not. Nevertheless, we can define new good properties using bad ones. We introduce a new notion of sameness of theories, i-bisimilarity, which is second-order definable over our structure. We define a notion of goodness in terms of this relation, which we call being fine. We illustrate that some intuitively good properties, like being a complete theory, are not fine.
This paper aims to provide a new criterion of formal equivalence of theories that is suitable for being supplemented with preservation conditions concerning interpretation. I argue that both the internal structure of models and choices of morphisms between models are aspects of formalisms that are relevant when it comes to their interpretations. So an adequate criterion should take these aspects into account. The two most important criteria presently discussed in philosophy of science—generalised definitional equivalence (Morita equivalence) and categorical equivalence—are not optimal in this respect. Generalised definitional equivalence neglects choices of morphisms, whereas categorical equivalence neglects the internal structure of models. I put forward the new criterion of definable categorical equivalence, which takes into account both aspects and connects the category-theoretic approach and the definability-theoretic approach to formal equivalence.
Philosophical Studies, 2021
On a widespread naturalist view, the meanings of mathematical terms are determined, and can only be determined, by the way we use mathematical language—in particular, by the basic mathematical principles we’re disposed to accept. But it’s mysterious how this can be so, since, as is well known, minimally strong first-order theories are non-categorical and so are compatible with countless non-isomorphic interpretations. As for second-order theories: though they typically enjoy categoricity results—for instance, Dedekind’s categoricity theorem for second-order arithmetic and Zermelo’s quasi-categoricity theorem for second-order set theory—these results require full second-order logic. So appealing to these results seems only to push the problem back, since the principles of second-order logic are themselves non-categorical: those principles are compatible with restricted interpretations of the second-order quantifiers on which Dedekind’s and Zermelo’s results are no longer available. In this paper, we prov...
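The non-categoricity of first-order theories invoked in this abstract follows from the standard compactness argument, which can be sketched as follows (a textbook derivation, not the paper's own):

```latex
Add a fresh constant $c$ to the language of arithmetic and let
\[
  T \;=\; \mathrm{Th}(\mathbb{N}) \;\cup\; \{\, c > \underline{n} : n \in \mathbb{N} \,\},
\]
where $\underline{n}$ is the numeral for $n$. Every finite subset of $T$ is
satisfied in $\mathbb{N}$ by interpreting $c$ as a sufficiently large number,
so by compactness $T$ has a model $\mathcal{M}$. In $\mathcal{M}$ the element
$c^{\mathcal{M}}$ lies above every standard numeral, hence
$\mathcal{M} \not\cong \mathbb{N}$ even though
$\mathcal{M} \models \mathrm{Th}(\mathbb{N})$.
```

So even the complete first-order theory of the naturals admits non-isomorphic models, which is the precise sense in which use of first-order principles underdetermines interpretation.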
Journal of Computer and System Sciences, 1978
The equivalence between programs is an essential concept in the mathematical theory of computation and programming. That a program is correct, or that a program transformation (as considered by Burstall and Darlington [6]) preserves the computed function, can be formulated in terms of equivalence between programs. But this relation is very difficult to study for well-known theoretical reasons. Program schemes have been introduced to overcome these difficulties as much as possible. A program P is divided into a program scheme Φ (i.e., a program where the domain of computation and the base functions are left unspecified) and an interpretation I (i.e., the specification of a domain D, and a function f_I : D^k → D for each k-ary base function symbol). We denote by φ_I the function computed by Φ under I, i.e., the function computed by the program P. We consider pure-LISP-like recursive program schemes without assignments. The conditional if...then...else..., usually considered as a piece of control structure, for instance in [1], can be considered as a base function. Hence, in the corresponding schemes, it will be replaced by the 3-adic function symbol h(·, ·, ·). (Formal definitions are given in Section 2.) Since the function symbols can be interpreted by arbitrary functions of correct arity, the corresponding equivalence relation on schemes, namely Φ ≡ Φ′ iff φ_I = φ′_I for every interpretation I, is very restrictive and does not help very much for the study of interesting equivalences between real programs. (See Example 2.2 below.) In order to get more concrete results, we use the notion of a class of interpretations 𝒞; the associated equivalence between program schemes is then Φ ≡_𝒞 Φ′ iff φ_I = φ′_I for every I ∈ 𝒞.
Dialectica, 2005
Based on an idea of Ajdukiewicz, a method of equifunctionality is developed to provide a formal explication of the notion of sameness of use relative to some system of rules. Given this, a set-theoretic explication of Lauener's context-dependent conception of synonymy is introduced by looking at languages of propositional logic, and compared both with Ajdukiewicz's original conception and with Carnap's explication of synonymy based on his method of extension and intension. ** Wolfson College, Oxford, GB.
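For the special case where the "system of rules" is classical propositional semantics, sameness of use reduces to agreement under every valuation. The sketch below illustrates only this special case, not Lauener's context-dependent apparatus; the formula encoding is mine:

```python
from itertools import product

# Formulas as nested tuples: an atom is a string; compound formulas are
# ('not', A), ('and', A, B), ('or', A, B).

def value(formula, valuation):
    """Truth value of a formula under a valuation of its atoms."""
    if isinstance(formula, str):
        return valuation[formula]
    op, *args = formula
    vals = [value(a, valuation) for a in args]
    return {"not": lambda: not vals[0],
            "and": lambda: vals[0] and vals[1],
            "or":  lambda: vals[0] or vals[1]}[op]()

def synonymous(f, g, atoms):
    """Same use relative to classical semantics: identical truth value
    under every valuation of the given atoms."""
    return all(value(f, dict(zip(atoms, vs))) == value(g, dict(zip(atoms, vs)))
               for vs in product([True, False], repeat=len(atoms)))
```

For instance, the two sides of a De Morgan law come out synonymous in this sense, while a formula and its negation do not.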
Journal of Philosophical Logic, 2021
The Science of Meaning: Essays on the Metatheory of Natural Language Semantics (Oxford University Press, eds. Derek Ball and Brian Rabern)
Synthese, 2020
Logic, Meaning and Computation, 2001
To appear in J. of Logic and Computation, 2014
Lecture Notes in Computer Science, 2014
Logic Group Preprint Series, 2008
The Journal of Logic Programming, 1990
Annals of Pure and Applied Logic, 2004
Journal of Philosophical Logic, 2003
South America Journal of Logic, 2019
Computer Languages, Systems & Structures, 1993