2007, Arxiv preprint arXiv:0709.1149
Abstract: Ontological models are attempts to quantitatively describe the results of a probabilistic theory, such as Quantum Mechanics, in a framework exhibiting an explicit realism-based underpinning. Unlike either the well known quasi-probability ...
IEEE Intelligent Systems, 2000
Scientific theories that make predictions about observable quantities can be evaluated by their fit to existing data and can be used for predictions on new cases. Our goal is to publish such theories along with observational data and the ontologies needed to enable the inter-operation of the theories and the data. This paper is about designing ontologies that take into account the defining properties of classes. We show how a multi-dimensional design paradigm, based on Aristotelian definitions, is natural, can easily be represented in OWL, and can yield random variables that give structure to theories making probabilistic predictions. We show how such ontologies can serve as the basis for representing observational data and probabilistic theories in our primary application domain of geology.
2012
We present DISPONTE, a semantics for probabilistic ontologies that is based on the distribution semantics for probabilistic logic programs. In DISPONTE the axioms of a probabilistic ontology can be annotated with an epistemic or a statistical probability. The epistemic probability represents a degree of confidence in the axiom, while the statistical probability considers the populations to which the axiom is applied.
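The distribution semantics underlying DISPONTE can be illustrated with a toy sketch (the axioms, probabilities, and the hard-coded entailment check below are invented for illustration; DISPONTE itself works over description-logic reasoners): each probabilistic axiom is independently included in or excluded from a possible world, and the probability of a query is the sum, over the worlds that entail it, of the products of the choice probabilities.

```python
from itertools import product

# Toy probabilistic ontology: each axiom carries an epistemic probability
# of being included in a "world" (a deterministic sub-ontology).
axioms = {
    "Cat(tom)": 0.9,
    "Cat subclassOf Pet": 0.6,
}

def entails(world, query):
    # Hypothetical entailment check for this toy example only:
    # Pet(tom) follows exactly when both axioms are present.
    return "Cat(tom)" in world and "Cat subclassOf Pet" in world

def query_probability(axioms, query):
    """Distribution semantics: sum, over all worlds entailing the query,
    of the product of the probabilities of the inclusion/exclusion choices."""
    names = list(axioms)
    total = 0.0
    for choices in product([True, False], repeat=len(names)):
        world = {a for a, chosen in zip(names, choices) if chosen}
        p = 1.0
        for a, chosen in zip(names, choices):
            p *= axioms[a] if chosen else 1.0 - axioms[a]
        if entails(world, query):
            total += p
    return total

print(query_probability(axioms, "Pet(tom)"))  # 0.9 * 0.6 = 0.54
```

Here only the world containing both axioms entails the query, so its probability is the product of the two axiom probabilities.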
Synthese, 2019
In this paper we attempt to analyze the concept of quantum probability within quantum computation and quantum computational logic. While the subjectivist interpretation of quantum probability explains it as a reliable predictive tool for an agent in order to compute measurement outcomes, the objectivist interpretation understands quantum probability as providing reliable information of a real state of affairs. After discussing these different viewpoints we propose a particular objectivist interpretation grounded on the idea that the Born rule provides information about an intensive realm of reality. We then turn our attention to the way in which the subjectivist interpretation of probability is presently applied within both quantum computation and quantum computational logic. Taking as a standpoint our proposed intensive account of quantum probability we discuss the possibilities and advantages it might open for the modeling and development of both quantum computation and quantum computational logic.
Physics Essays, 2002
This contribution derives from a rather extensive study of the foundations of probability. We begin by critically discussing the two main models of the random event in probability theory and cast light on a number of incongruities. We conclude that the argument of probability is the critical knot of the foundations of probability and put forward a structure of levels for the partially determinate event. This structural model enables us to define probability and to reconcile its subjective and objective interpretations.
Lecture Notes in Computer Science, 2008
This chapter overviews work on semantic science. The idea is that, using rich ontologies, both observational data and theories that make (probabilistic) predictions on data are published for the purposes of improving or comparing the theories, and for making predictions in new cases. This paper concentrates on issues and progress in having machine-accessible scientific theories that can be used in this way. It presents the grand vision, describes issues that have arisen in building such systems for the geological domain (minerals exploration and geohazards), and sketches the formal foundations that underlie this vision. The aim is to get to the stage where any new scientific theory can be tested on all available data; any new data can be used to evaluate all existing theories that make predictions on those data; and anyone with a new case can use the best theories that make predictions on that case.
Logic Journal of the IGPL, 2018
A probabilistic propositional logic, endowed with an epistemic component for asserting (non-)compatibility of diagonalizable and bounded observables, is presented and illustrated for reasoning about the random results of projective measurements made on a given quantum state. Simultaneous measurements are assumed to imply that the underlying observables are compatible. A sound and weakly complete axiomatization is provided relying on the decidable first-order theory of real closed ordered fields. The proposed logic is proved to be a conservative extension of classical propositional logic.
Based on ideas from the quantum theory of open systems, we propose a consistent approach to the formulation of a logic of plausible propositions. To this end we associate with every plausible proposition a diagonal matrix of its likelihood and treat it as the density matrix of a relevant quantum system. We show that all logical connectives between plausible propositions can be represented as special positive-valued transformations of these matrices. We also demonstrate that these transformations can be realized in relevant composite quantum systems by quantum engineering methods. The proposed approach allows one not only to reproduce and generalize the results of well-known logical systems (Boolean, Łukasiewicz, and so on) but also to classify and analyze, from a unified point of view, various problems in psychophysics and the social sciences.
The paper is a brief summary of an invited talk given at the Discovery Science conference. The principal points are as follows: first, that probability theory forms the basis for connecting hypotheses and data; second, that the expressive power of the probability models used in scientific theory formation has expanded significantly; and finally, that still further expansion is required to tackle many problems of interest. This further expansion should combine probability theory with the expressive power of first-order logical languages. The paper sketches an approximate inference method for representation systems of this kind.
Four questions about Quantum Bayesianism and their answers by Ontology of Knowledge. Jean-Louis Boucon, issue 2023/12/01, 2023
The following article will attempt to highlight four questions which, in my opinion, are left unanswered (or overlooked) by QBism and to show the answers that the Ontology of Knowledge (OK) can provide. ● How does the subject come to exist for itself, individuated and persistent? ● From what common reality do world, mind, and meaning emerge? ● How does meaning emerge from the mathematical fact of probabilistic expectation? ● Is meaning animated by its own nature?
EPTCS, 2018
We present a simple categorical framework for the treatment of probabilistic theories, with the aim of reconciling the fields of Categorical Quantum Mechanics (CQM) and Operational Probabilistic Theories (OPTs). In recent years, both CQM and OPTs have found successful application to a number of areas in quantum foundations and information theory: they present many similarities, both in spirit and in formalism, but they remain separated by a number of subtle yet important differences. We attempt to bridge this gap, by adopting a minimal number of operationally motivated axioms which provide clean categorical foundations, in the style of CQM, for the treatment of the problems that OPTs are concerned with.
2013
The theory of probability is widely used in biomedical research for data analysis and modelling. In previous work the probabilities of the research hypotheses have been recorded as experimental metadata. The ontology HELO is designed to support probabilistic reasoning, and provides semantic descriptors for reporting on research that involves operations with probabilities. HELO explicitly links research statements such as hypotheses, models, laws, conclusions, etc. to the associated probabilities of these statements being true. HELO enables the explicit semantic representation and accurate recording of probabilities in hypotheses, as well as the inference methods used to generate and update those hypotheses. We demonstrate the utility of HELO on three worked examples: changes in the probability of the hypothesis that sirtuins regulate human life span; changes in the probability of hypotheses about gene functions in the S. cerevisiae aromatic amino acid pathway; and the use of active learning in drug design (quantitative structure activity relation learning), where a strategy for the selection of compounds with the highest probability of improving on the best known compound was used. HELO is open source and available at
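A hypothetical sketch of the kind of record HELO supports, linking a research statement to its probability and to the inference method that updates it (the field names, numbers, and update routine below are invented for illustration, not HELO's actual schema):

```python
# Hypothetical record linking a hypothesis to the probability of its truth,
# in the spirit of HELO's semantic descriptors (invented, not HELO's schema).
hypothesis = {
    "statement": "sirtuins regulate human life span",
    "probability": 0.2,  # current degree of belief that the statement is true
    "inference_method": "Bayesian update on new experimental evidence",
}

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior probability of a hypothesis given evidence,
    via Bayes' rule with the stated likelihoods."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Record the change in probability after observing evidence that is more
# likely under the hypothesis (0.8) than under its negation (0.3).
hypothesis["probability"] = bayes_update(hypothesis["probability"], 0.8, 0.3)
print(hypothesis["probability"])
```

Recording both the prior and the updated probability, together with the inference method, is what makes such changes in belief reportable as experimental metadata.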
Eprint Arxiv 1011 6331, 2010
In the following we revisit the frequency interpretation of probability of Richard von Mises, in order to bring the essential implicit notions into focus. Following von Mises, we argue that probability can only be defined for events that can be repeated in similar conditions, and that exhibit 'frequency stabilization'. The central idea of the present article is that the mentioned 'conditions' should be well-defined and 'partitioned'. More precisely, we will divide probabilistic systems into object, environment, and probing subsystem, and show that such partitioning allows one to solve a wide variety of classic paradoxes of probability theory. As a corollary, we arrive at the surprising conclusion that at least one central idea of the orthodox interpretation of quantum mechanics is a direct consequence of the meaning of probability. More precisely, the idea that the "observer influences the quantum system" is obvious if one realizes that quantum systems are probabilistic systems; it holds for all probabilistic systems, whether quantum or classical.
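The notion of 'frequency stabilization' can be illustrated with a small simulation (a sketch under the assumption of a fixed event probability p, which the article's well-defined 'conditions' are meant to guarantee):

```python
import random

def relative_frequency(trials, p=0.3, seed=42):
    """Simulate repeated trials of an event with fixed probability p under
    identical conditions; return the event's relative frequency."""
    rng = random.Random(seed)
    hits = sum(rng.random() < p for _ in range(trials))
    return hits / trials

# 'Frequency stabilization': the relative frequency settles near p
# as the number of repetitions grows.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

For events whose conditions are not fixed and partitioned in this way, no such stable p exists to converge to, which is the article's point of departure.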
2001
The acquisition and representation of basic experimental information under the probabilistic paradigm is analysed. The multinomial probability distribution is identified as governing all scientific data collection, at least in principle. For this distribution there exist unique random variables, whose standard deviation becomes asymptotically invariant of physical conditions. Representing all information by means of such random variables gives the quantum mechanical probability amplitude and a real alternative. For predictions, the linear evolution law (Schrödinger or Dirac equation) turns out to be the only way to extend the invariance property of the standard deviation to the predicted quantities. This indicates that quantum theory originates in the structure of gaining pure, probabilistic information, without any mechanical underpinning.
arXiv (Cornell University), 2009
The crucial but little-known fact is brought into evidence that, as Kolmogorov himself repeatedly claimed, the mathematical theory of probabilities cannot be applied to physical, factual probabilistic situations, because the factual concept of a probability distribution is not defined: it is nowhere specified how to construct factually, for a given physical random phenomenon, the specific numerical distribution of probabilities to be asserted on the universe of outcomes generated by that phenomenon; nor is it known what significance to associate with the assertion of the mere 'existence' of such a factual distribution of numerically defined probabilities. An algorithm of semantic integration of this factual, numerically defined distribution of probabilities is then constructed. This algorithm, developed inside a general method of relativized conceptualization, involves a sort of "quantification" of the factual concept of probability. This result, while it solves Kolmogorov's aporia, fully organizes the general classical concept of probability from both a factual and a syntactic standpoint. In particular, it appears that, while "randomness" can be considered a natural fact, the concepts of 'random phenomenon' and 'probabilistic situation' are factual-conceptual artifacts. As for quantum mechanical 'probabilities', it turns out, surprisingly, that in general they cannot be defined factually in an effective way.
It is argued that quantum mechanics does not have merely a predictive function like other physical theories; it consists in a formalisation of the conditions of possibility of any prediction bearing upon phenomena whose circumstances of detection are also conditions of production. This is enough to explain its probabilistic status and theoretical structure. Published in: Collapse, 8, 87-121, 2014
We discuss generalized probabilistic models in which states do not necessarily obey Kolmogorov's axioms of probability. We study the relationship between properties and probabilistic measures in this setting, and explore some possible interpretations of these measures. Keywords: quantum probability; generalized probabilistic models; interpretations of probability theory.
Physical Review A, 2002
In the Bayesian approach to probability theory, probability quantifies a degree of belief for a single trial, without any a priori connection to limiting frequencies. In this paper we show that, despite being prescribed by a fundamental law, probabilities for individual quantum systems can be understood within the Bayesian approach. We argue that the distinction between classical and quantum probabilities lies not in their definition, but in the nature of the information they encode. In the classical world, maximal information about a physical system is complete in the sense of providing definite answers for all possible questions that can be asked of the system. In the quantum world, maximal information is not complete and cannot be completed. Using this distinction, we show that any Bayesian probability assignment in quantum mechanics must have the form of the quantum probability rule, that maximal information about a quantum system leads to a unique quantum-state assignment, and that quantum theory provides a stronger connection between probability and measured frequency than can be justified classically. Finally we give a Bayesian formulation of quantum-state tomography.
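The quantum probability rule referred to here is the Born rule, p(i) = Tr(ρ Π_i). A minimal numerical sketch (the state and the measurement below are illustrative assumptions, not taken from the paper):

```python
def born_probability(rho, projector):
    """Born rule p(i) = Tr(rho P_i) for 2x2 matrices given as nested lists."""
    prod = [[sum(rho[i][k] * projector[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]
    return (prod[0][0] + prod[1][1]).real

# Illustrative state assignment: the pure state |+> = (|0> + |1>)/sqrt(2),
# written as a density matrix (an assumed example).
rho = [[0.5, 0.5],
       [0.5, 0.5]]

# Projectors for a measurement in the computational basis.
P0 = [[1, 0], [0, 0]]
P1 = [[0, 0], [0, 1]]

# A Bayesian agent's outcome probabilities given the state assignment rho.
print(born_probability(rho, P0), born_probability(rho, P1))  # 0.5 0.5
```

On the Bayesian reading defended in the paper, these numbers are degrees of belief for a single trial, fixed by the agent's state assignment rho rather than by limiting frequencies.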