2008, Journal of Algorithms
This paper develops connections between objective Bayesian epistemology (which holds that the strengths of an agent's beliefs should be representable by probabilities, should be calibrated with evidence of empirical probability, and should otherwise be equivocal) and probabilistic logic. After introducing objective Bayesian epistemology over propositional languages, the formalism is extended to handle predicate languages. A rather general probabilistic logic is formulated and then given a natural semantics in terms of objective Bayesian epistemology. The machinery of objective Bayesian nets and objective credal nets is introduced, and this machinery is applied to provide a calculus for probabilistic logic that meshes with the objective Bayesian semantics.
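The calibration-and-equivocation idea can be made concrete with a toy sketch (the atoms and numbers below are my own illustration, not from the paper): over a two-atom propositional language, a single calibration constraint P(a) = 0.8 leaves the maximum-entropy function uniform within the a-worlds and within the not-a-worlds, so the unconstrained atom b comes out equivocal.

```python
from itertools import product

# Worlds over atoms a, b: truth assignments (a, b).
worlds = list(product([True, False], repeat=2))

# Evidence: P(a) = 0.8. The maximally equivocal (maximum-entropy)
# function subject to this one constraint spreads mass uniformly
# among the a-worlds, and uniformly among the not-a-worlds.
p_a = 0.8
a_worlds = [w for w in worlds if w[0]]
not_a_worlds = [w for w in worlds if not w[0]]

P = {}
for w in a_worlds:
    P[w] = p_a / len(a_worlds)            # 0.8 / 2 = 0.4 each
for w in not_a_worlds:
    P[w] = (1 - p_a) / len(not_a_worlds)  # 0.2 / 2 = 0.1 each

# b is unconstrained by the evidence, so it comes out equivocal.
p_b = sum(pr for w, pr in P.items() if w[1])
print(round(p_b, 10))            # 0.5
print(round(sum(P.values()), 10))  # 1.0
```

For richer constraint sets the maximum-entropy function is found by optimisation rather than by this direct uniform split; the sketch only shows the equivocation norm at work in the simplest case.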
Advances in Soft Computing, 2008
This paper proposes a common framework for various probabilistic logics. It consists of a set of uncertain premises with probabilities attached to them. This raises the question of the strength of a conclusion, but without imposing a particular semantics, no general solution is possible. The paper discusses several possible semantics, examined from the perspective of probabilistic argumentation.
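The "strength of a conclusion" question can be illustrated with a small worked case (premises and numbers are my own, not from the paper): given premises P(A) = 0.8 and P(A -> B) = 0.9 (material conditional), the probability of the conclusion B is only pinned down to an interval, which here can be solved for directly over the four truth-value worlds.

```python
# Worlds: (A,B), (A,~B), (~A,B), (~A,~B) with masses p1..p4.
# Premises: P(A) = p1 + p2 = 0.8 and P(A -> B) = p1 + p3 + p4 = 0.9.
# Since the masses sum to 1, the second premise forces p2 = 0.1,
# hence p1 = 0.7 and p3 + p4 = 0.2.
p2 = 1 - 0.9
p1 = 0.8 - p2
slack = 1 - p1 - p2              # remaining mass for p3 + p4

# P(B) = p1 + p3, and p3 can range over [0, slack]:
lower, upper = p1, p1 + slack
print(round(lower, 10), round(upper, 10))  # 0.7 0.9
```

Different semantics then differ in what they do with this interval: report it whole, pick a distinguished point within it, or weigh the worlds argumentatively.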
2011
While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches to probabilistic logic fit into a simple unifying framework: logically complex evidence can be used to associate probability intervals or probabilities with sentences.
Synthese, 2008
Objective Bayesian probability is often defined over rather simple domains, e.g., finite event spaces or propositional languages. This paper investigates the extension of objective Bayesianism to first-order logical languages. It is argued that the objective Bayesian should choose a probability function, from all those that satisfy constraints imposed by background knowledge, that is closest to a particular frequency-induced probability function which generalises the λ = 0 function of Carnap's continuum of inductive methods.
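The λ = 0 end of Carnap's continuum mentioned above is the straight rule, which predicts using observed frequencies alone. A minimal sketch (standard two-category form of the continuum; the sample numbers are illustrative):

```python
def carnap(n_i, n, lam, k=2):
    """Carnap's continuum of inductive methods (k-category form):
    probability that the next individual falls in category i, given
    that n_i of n observed individuals did. lam = 0 gives the straight
    rule n_i / n, the frequency-driven end of the continuum; large lam
    keeps the prediction close to the equivocal value 1/k."""
    return (n_i + lam / k) / (n + lam)

print(carnap(7, 10, 0))            # straight rule: 0.7
print(round(carnap(7, 10, 2), 4))  # Laplace-style smoothing: 0.6667
```

The paper's proposal is not simply to adopt λ = 0 but to select, among the functions satisfying the evidential constraints, the one closest to a frequency-induced function generalising it.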
In Defence of Objective Bayesianism, 2010
I present a formalism that combines two methodologies: objective Bayesianism and Bayesian nets. According to objective Bayesianism, an agent's degrees of belief (i) ought to satisfy the axioms of probability, (ii) ought to satisfy constraints imposed by background knowledge, and (iii) should otherwise be as non-committal as possible (i.e. have maximum entropy). Bayesian nets offer an efficient way of representing and updating probability functions. An objective Bayesian net is a Bayesian net representation of the maximum entropy probability function. I show how objective Bayesian nets can be constructed, updated and combined, and how they can deal with cases in which the agent's background knowledge includes knowledge of qualitative influence relationships, e.g. causal influences. I then sketch a number of applications of the resulting formalism, showing how it can shed light on probability logic, causal modelling, logical reasoning, semantic reasoning, argumentation and recursive modelling.
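The efficiency that Bayesian nets contribute comes from factorising a joint probability function into local conditional distributions. A minimal sketch of that representational idea, with a two-node net A -> B and illustrative numbers of my own:

```python
# Minimal Bayesian net A -> B over Booleans (numbers are illustrative).
p_a = {True: 0.4, False: 0.6}
p_b_given_a = {True:  {True: 0.9, False: 0.1},
               False: {True: 0.2, False: 0.8}}

def joint(a, b):
    # The factorisation a Bayesian net encodes: P(a, b) = P(a) * P(b | a).
    return p_a[a] * p_b_given_a[a][b]

# Marginalise to recover P(B = true):
p_b = sum(joint(a, True) for a in (True, False))
print(round(p_b, 4))  # 0.4*0.9 + 0.6*0.2 = 0.48
```

An objective Bayesian net applies this same representation to the specific maximum-entropy function determined by the agent's background knowledge, so that inference can exploit the net's structure.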
2005
Uncertainty is a fundamental and irreducible aspect of our knowledge about the world. Until recently, classical first-order logic has reigned as the de facto standard logical foundation for artificial intelligence. The lack of a built-in, semantically grounded capability for reasoning under uncertainty renders classical first-order logic inadequate for many important classes of problems. General-purpose languages are beginning to emerge for which the fundamental logical basis is probability. Increasingly expressive probabilistic languages demand a theoretical foundation that fully integrates classical first-order logic and probability. In first-order Bayesian logic (FOBL), probability distributions are defined over interpretations of classical first-order axiom systems. Predicates and functions of a classical first-order theory correspond to random variables in the corresponding first-order Bayesian theory. This is a natural correspondence, given that random variables are formalized in mathematical statistics as measurable functions on a probability space. A formal system called Multi-Entity Bayesian Networks (MEBN) is presented for composing distributions on interpretations by instantiating and combining parameterized fragments of directed graphical models. A construction is given of a MEBN theory that assigns a non-zero probability to any satisfiable sentence in classical first-order logic. By conditioning this distribution on consistent sets of sentences, FOBL can represent a probability distribution over interpretations of any finitely axiomatizable first-order theory, as well as over interpretations of infinite axiom sets when a limiting distribution exists. FOBL is inherently open, having the ability to incorporate new axioms into existing theories, and to modify probabilities in the light of evidence.
Bayesian inference provides both a proof theory for combining prior knowledge with observations, and a learning theory for refining a representation as evidence accrues. The results of this paper provide a logical foundation for the rapidly evolving literature on first-order Bayesian knowledge representation, and point the way toward Bayesian languages suitable for general-purpose knowledge representation and computing. Because FOBL contains classical first-order logic as a deterministic subset, it is a natural candidate as a universal representation for integrating domain ontologies expressed in languages based on classical first-order logic or subsets thereof.
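The MEBN idea of instantiating parameterized fragments can be sketched in miniature (this is a hypothetical illustration of the instantiation step only, with my own predicate name, entities, and numbers; real MEBN fragments also carry parents, context constraints, and combining rules):

```python
# A parameterized fragment is a template that, applied to an entity,
# yields a random-variable node with its local distribution. A concrete
# network is assembled by instantiating the template over the domain.
def has_fever_fragment(person):
    # Template for the Boolean random variable HasFever(person),
    # sharing one local distribution across all instantiations.
    return (f"HasFever({person})", {True: 0.1, False: 0.9})

domain = ["alice", "bob"]
network = dict(has_fever_fragment(p) for p in domain)
print(sorted(network))  # one node per instantiated entity
```

Composing many such instantiated fragments, with edges supplied by the fragments' parent specifications, is what yields a full directed graphical model over first-order interpretations.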
2013
We present DISPONTE, a semantics for probabilistic ontologies that is based on the distribution semantics for probabilistic logic programs. In DISPONTE the axioms of a probabilistic ontology can be annotated with an epistemic or a statistical probability. The epistemic probability represents a degree of confidence in the axiom, while the statistical probability considers the populations to which the axiom is applied.
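The distribution semantics underlying DISPONTE can be sketched in a toy propositional form (axiom names and probabilities below are my own illustration; real DISPONTE works over description-logic axioms with a reasoner deciding entailment): each annotated axiom is independently included in or excluded from a world, and the probability of a query is the total mass of the worlds that entail it.

```python
from itertools import product

# Each axiom is in a world with its annotated probability, independently.
axioms = {"Bird(tweety)": 0.9, "Bird(x) -> Flies(x)": 0.8}

def prob(query_holds):
    total = 0.0
    names = list(axioms)
    for included in product([True, False], repeat=len(names)):
        world = {n for n, inc in zip(names, included) if inc}
        mass = 1.0
        for n, inc in zip(names, included):
            mass *= axioms[n] if inc else 1 - axioms[n]
        if query_holds(world):      # "entailment" check for this toy case
            total += mass
    return total

# Flies(tweety) follows only in worlds containing both axioms:
p = prob(lambda w: {"Bird(tweety)", "Bird(x) -> Flies(x)"} <= w)
print(round(p, 4))  # 0.9 * 0.8 = 0.72
```

Practical systems avoid the exponential enumeration shown here, typically by compiling the worlds into a compact representation such as a BDD.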
Entropy, 2015
Objective Bayesianism says that the strengths of one's beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism to inductive logic. We show that the maximum entropy principle can be motivated largely in terms of minimising worst-case expected loss.
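The loss-based motivation can be seen in a minimal numeric sketch (my own illustration over four worlds): under logarithmic loss, the worst case over worlds of the expected loss is minimised by the equivocal function, since any skew makes some world very expensive.

```python
import math

def worst_case_log_loss(P):
    # Worst case over worlds w of the log loss -log P(w) incurred
    # if w turns out to be actual.
    return max(-math.log(p) for p in P.values())

uniform = {w: 0.25 for w in "abcd"}
skewed = {"a": 0.7, "b": 0.1, "c": 0.1, "d": 0.1}

# The equivocal function has the smaller worst-case loss:
print(worst_case_log_loss(uniform) < worst_case_log_loss(skewed))  # True
```

With calibration constraints in play, the same minimax reasoning selects the maximum-entropy function among those satisfying the constraints, which is the unification the paper pursues.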
Bulletin of the Section of Logic, 2018
We define and investigate, from a logical point of view, a family of consequence relations defined in probabilistic terms. We call them relations of supporting, and write |≈w, where w is a probability function on a Boolean language: A |≈w B iff the fact that A is the case does not decrease the probability that B is the case. Finally, we examine the intersection of |≈w over all w, and give some of its formal properties.
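Under the natural conditional-probability reading, A |≈w B amounts to w(B | A) >= w(B). A minimal sketch with an illustrative probability function of my own over two atoms:

```python
# A probability function w on the four worlds over atoms A, B,
# each world written as the tuple of atoms it makes true.
w = {("A", "B"): 0.4, ("A",): 0.1, ("B",): 0.3, (): 0.2}

def p(event):
    # Probability that every atom in `event` is true.
    return sum(pr for world, pr in w.items() if event <= set(world))

def supports(a, b):
    # a |≈w b iff learning a does not lower the probability of b:
    # w(b | a) >= w(b).
    return p({a, b}) / p({a}) >= p({b})

print(supports("A", "B"))  # w(B|A) = 0.4/0.5 = 0.8 >= w(B) = 0.7 -> True
```

The relation studied in the paper then arises by quantifying over all such w, keeping only the supporting claims that hold regardless of the probability function chosen.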