2021
The information content of a source is defined in terms of the minimum number of bits needed to store the output of the source in a perfectly recoverable way. A similar definition can be given in the case of quantum sources, with qubits replacing bits. In these cases the information content can be quantified through Shannon's and von Neumann's entropy, respectively. Here we extend the definition of information content to operational probabilistic theories, and prove relevant properties such as subadditivity and the relation between purity and the information content of a state. We prove the consistency of the present notion of information content when applied to the classical and the quantum case. Finally, the relation with one of the notions of entropy that can be introduced in general probabilistic theories, the maximum accessible information, is given in terms of a lower bound.
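As a concrete point of reference (this is an illustration of standard definitions, not material from the paper), the two quantities the abstract appeals to, Shannon entropy for classical sources and von Neumann entropy for quantum sources, can be computed as follows; the distribution p and the density matrix rho are arbitrary examples.

import numpy as np

def shannon_entropy(p):
    # H(p) = -sum_i p_i log2 p_i : bits per symbol needed for a classical source
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def von_neumann_entropy(rho):
    # S(rho) = -tr(rho log2 rho) : qubits per state needed for a quantum source
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

p = [0.5, 0.25, 0.25]                         # classical source statistics (example)
rho = np.array([[0.75, 0.25], [0.25, 0.25]])  # qubit density matrix (example)
print(shannon_entropy(p))        # 1.5 bits
print(von_neumann_entropy(rho))  # about 0.60, less than 1 since rho is not maximally mixed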
Journal of Mathematical Physics, 2000
A method of representing probabilistic aspects of quantum systems is introduced by means of a density function on the space of pure quantum states. In particular, a maximum entropy argument allows us to obtain a natural density function that only reflects the information provided by the density matrix. This result is applied to derive the Shannon entropy of a quantum state. The information-theoretic quantum entropy thereby obtained is shown to have the desired concavity property, and to differ from the conventional von Neumann entropy. This is illustrated explicitly for a two-state system.
2004
Interest in information-theoretic derivations of the formalism of quantum theory has been growing since the early 1990s, thanks to the emergence of the field of quantum computation and to the return of epistemological questions to the research programs of many theoretical physicists. We propose a system of information-theoretic axioms from which we derive the formalism of quantum theory.
In this article, we discuss the formal structure of a generalized information theory based on the extension of the probability calculus of Kolmogorov to a (possibly) non-commutative setting. By studying this framework, we argue that quantum information can be considered as a particular case of a huge family of non-commutative extensions of its classical counterpart. In any conceivable information theory, the possibility of dealing with different kinds of information measures plays a key role. Here, we generalize a notion of state spectrum, allowing us to introduce a majorization relation and a new family of generalized entropic measures.
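The majorization relation invoked here is assumed to be the standard one on (generalized) spectra; as a minimal sketch under that assumption (not the authors' formal construction), p majorizes q exactly when every partial sum of the decreasingly ordered entries of p dominates the corresponding partial sum for q, with equal totals.

import numpy as np

def majorizes(p, q, tol=1e-12):
    # True if the probability vector p majorizes q (equal lengths assumed)
    p = np.sort(np.asarray(p, dtype=float))[::-1]
    q = np.sort(np.asarray(q, dtype=float))[::-1]
    if abs(p.sum() - q.sum()) > tol:
        return False
    return bool(np.all(np.cumsum(p) >= np.cumsum(q) - tol))

# A more sharply peaked spectrum majorizes a more mixed one, so any
# Schur-concave entropic measure assigns it a smaller entropy.
print(majorizes([0.7, 0.2, 0.1], [0.4, 0.35, 0.25]))  # True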
Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical stochastics (probability and statistics). On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics. Furthermore, concurrent advances in experimental techniques have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various possible types of measurements. This scenery is outlined.
† MaPhySto – Centre for Mathematical Physics and Stochastics, funded by the Danish National Research Foundation.
1. We use the term 'stochastics' in the modern sense of 'probability and statistics together'.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized as the distinctions of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional, and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qubits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates (cohered together in the pure superposition state being measured) that are distinguished by the measurement (decohered in the post-measurement mixed state). Both the classical and quantum versions of logical entropy have simple interpretations as "two-draw" probabilities. The conclusion is that quantum logical entropy is the simple and natural notion of information for a quantum information theory focusing on the distinguishing of quantum states.
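For orientation (these are the standard formulas behind the "two-draw" reading, not the theorems proved in the paper), the classical logical entropy of a distribution p is h(p) = 1 - sum_i p_i^2, the probability that two independent draws from p fall in distinct blocks, and the quantum version is obtained by replacing the sum with a trace:

import numpy as np

def logical_entropy(p):
    # h(p) = 1 - sum_i p_i**2 : probability that two independent draws differ
    p = np.asarray(p, dtype=float)
    return float(1.0 - np.sum(p ** 2))

def quantum_logical_entropy(rho):
    # h(rho) = 1 - tr(rho^2) : zero on pure states, maximal on the maximally mixed state
    return float(1.0 - np.trace(rho @ rho).real)

print(logical_entropy([0.5, 0.25, 0.25]))                        # 0.625
print(quantum_logical_entropy(np.array([[1., 0.], [0., 0.]])))   # 0.0 (pure state)
print(quantum_logical_entropy(np.eye(2) / 2))                    # 0.5 (maximally mixed qubit)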
This article consists of a very short introduction to classical and quantum information theory. Basic properties of the classical Shannon entropy and the quantum von Neumann entropy are described, along with related concepts such as classical and quantum relative entropy, conditional entropy, and mutual information. A few more detailed topics are considered in the quantum case.
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions ("dits") of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates (cohered together in the pure superposition state being measured) that are distinguished by the measurement (decohered in the post-measurement mixed state). Both the classical and quantum versions of logical entropy have simple interpretations as "two-draw" probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
We consider two entropic quantities, which we term measurement and mixing entropy. In classical and quantum theory, they are equal, being given by the Shannon and von Neumann entropies respectively; in general, however, they are very different. In particular, while measurement entropy is easily seen to be concave, mixing entropy need not be. In fact, as we show, mixing entropy is not concave whenever the state space is a non-simplicial polytope. Thus, the condition that measurement and mixing entropies coincide is a strong constraint on possible theories. We call theories with this property monoentropic. Measurement entropy is subadditive, but not in general strongly subadditive. Equivalently, if we define the mutual information between two systems A and B by the usual formula I(A:B) = H(A) + H(B) - H(AB), where H denotes the measurement entropy and AB is a non-signaling composite of A and B, then it can happen that I(A:BC) < I(A:B). This is relevant to information causality in the sense of Pawlowski et al.: we show that any monoentropic non-signaling theory in which measurement entropy is strongly subadditive, and also satisfies a version of the Holevo bound, is informationally causal; on the other hand, we observe that Popescu-Rohrlich boxes, which violate information causality, also violate strong subadditivity. We also explore the interplay between measurement and mixing entropy and various natural conditions on theories that arise in quantum axiomatics.
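In the special case where the theory is classical and H is the Shannon entropy, the formula I(A:B) = H(A) + H(B) - H(AB) quoted above reduces to the familiar computation below (an illustration of the formula only; the joint distribution is an arbitrary example, not one taken from the paper).

import numpy as np

def H(p):
    # Shannon entropy in bits of an arbitrary (joint) distribution
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution of two perfectly correlated bits
p_ab = np.array([[0.5, 0.0],
                 [0.0, 0.5]])
p_a = p_ab.sum(axis=1)   # marginal of A
p_b = p_ab.sum(axis=0)   # marginal of B
print(H(p_a) + H(p_b) - H(p_ab))   # I(A:B) = 1.0 bit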
2021
We use a novel form of quantum conditional probability to define new measures of quantum information in a dynamical context. We explore relationships between our new quantities and standard measures of quantum information such as von Neumann entropy. These quantities allow us to find new proofs of some standard results in quantum information theory, such as the concavity of von Neumann entropy and Holevo’s theorem. The existence of an underlying probability distribution helps to shed some light on the conceptual underpinnings of these results.
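The concavity statement mentioned here is the standard one, S(lam*rho + (1-lam)*sigma) >= lam*S(rho) + (1-lam)*S(sigma); a quick numerical spot-check of that inequality (a sketch with arbitrary example states, not the paper's new proof) looks as follows.

import numpy as np

def S(rho):
    # von Neumann entropy in bits
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

rho   = np.array([[0.9, 0.1], [0.1, 0.1]])
sigma = np.array([[0.3, 0.0], [0.0, 0.7]])
lam = 0.4
mixed = lam * rho + (1 - lam) * sigma
print(S(mixed) >= lam * S(rho) + (1 - lam) * S(sigma))  # True: the mixture's entropy dominates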