IEEE Transactions on Systems, Man, and Cybernetics, Part B, 2007
In this paper, we analyze from a geometric perspective the relations between belief and probability functions in the framework of the geometric approach to the theory of evidence. Starting from the case of binary domains, we identify and study three major geometric entities relating a generic belief function (b.f.) to the set of probabilities P: 1) the dual line connecting belief and plausibility functions; 2) the orthogonal complement of P; and 3) the simplex of consistent probabilities. Each of them is in turn associated with a different probability measure that depends on the original b.f. We focus in particular on the geometry and properties of the orthogonal projection of a b.f. onto P and of its intersection probability, provide their interpretations in terms of degrees of belief, and discuss their behavior with respect to affine combination.
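As a compact reference for the quantities this abstract manipulates, the following is a sketch in standard Dempster-Shafer notation; the intersection probability formula is quoted from the geometric-approach literature from memory and should be checked against the paper itself.

```latex
b(A) = \sum_{B \subseteq A} m(B), \qquad
pl(A) = 1 - b(A^c) = \sum_{B \cap A \neq \emptyset} m(B)

% the dual line: affine combinations of a b.f. and its plausibility
b_\lambda = (1 - \lambda)\, b + \lambda\, pl, \qquad \lambda \in [0, 1]

% the intersection probability associated with the dual line
p[b](x) = b(\{x\}) + \beta[b] \bigl( pl(\{x\}) - b(\{x\}) \bigr), \qquad
\beta[b] = \frac{1 - \sum_{y} b(\{y\})}{\sum_{y} \bigl( pl(\{y\}) - b(\{y\}) \bigr)}
```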
The study of the interplay between belief and probability can be posed in a geometric framework, in which belief and plausibility functions are represented as points of simplices in a Cartesian space. Probability approximations of belief functions form two homogeneous groups, which we call "affine" and "epistemic" families. In this paper we focus on the relative plausibility, belief, and uncertainty of singletons, i.e., the "epistemic" family. These form a coherent collection of probability transformations in terms of their behavior with respect to Dempster's rule of combination. We investigate here their geometry in both the space of all pseudo belief functions and the probability simplex, and compare it with that of the affine family. We provide sufficient conditions under which the probabilities of the two families coincide.
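For concreteness, here is a minimal Python sketch of the two best-known members of the "epistemic" family, the relative plausibility and relative belief of singletons. The function names and the encoding of mass functions as dicts keyed by frozensets are our own assumptions, not the paper's.

```python
def relative_plausibility(masses, frame):
    """Normalised plausibility of singletons: pl({x}) / sum_y pl({y})."""
    pl = {x: sum(v for A, v in masses.items() if x in A) for x in frame}
    total = sum(pl.values())
    return {x: v / total for x, v in pl.items()}

def relative_belief(masses, frame):
    """Normalised singleton masses; defined only if some singleton carries mass."""
    bel = {x: masses.get(frozenset({x}), 0.0) for x in frame}
    total = sum(bel.values())
    if total == 0:
        raise ValueError("relative belief undefined: no mass on singletons")
    return {x: v / total for x, v in bel.items()}

# Example on the binary frame {a, b}:
m = {frozenset({'a'}): 0.3, frozenset({'b'}): 0.1, frozenset({'a', 'b'}): 0.6}
print(relative_plausibility(m, ['a', 'b']))  # {'a': 0.5625, 'b': 0.4375}
print(relative_belief(m, ['a', 'b']))        # {'a': 0.75,   'b': 0.25}
```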
submitted to the International Journal of …, 2006
Artificial Intelligence: Foundations, Theory, and Algorithms, Springer Nature, 2021
The principal aim of this book is to introduce to the widest possible audience an original view of belief calculus and uncertainty theory. In this geometric approach to uncertainty, uncertainty measures can be seen as points of a suitably complex geometric space, and manipulated in that space, for example combined or conditioned.

In the chapters in Part I, Theories of Uncertainty, the author offers an extensive recapitulation of the state of the art in the mathematics of uncertainty. This part of the book contains the most comprehensive summary to date of the whole of belief theory, with Chap. 4 outlining for the first time, and in a logical order, all the steps of the reasoning chain associated with modelling uncertainty using belief functions, in an attempt to provide a self-contained manual for the working scientist. In addition, the book proposes in Chap. 5 what is possibly the most detailed compendium available of all theories of uncertainty.

Part II, The Geometry of Uncertainty, is the core of the book, as it introduces the author's own geometric approach to uncertainty theory, starting with the geometry of belief functions: Chap. 7 studies the geometry of the space of belief functions, or belief space, both as a simplex and in terms of its recursive bundle structure; Chap. 8 extends the analysis to Dempster's rule of combination, introducing the notion of a conditional subspace and outlining a simple geometric construction for Dempster's sum; Chap. 9 delves into the combinatorial properties of plausibility and commonality functions, as equivalent representations of the evidence carried by a belief function; Chap. 10 then begins to extend the applicability of the geometric approach to other uncertainty measures, focusing in particular on possibility measures (consonant belief functions) and the related notion of a consistent belief function.

The chapters in Part III, Geometric Interplays, are concerned with the interplay of uncertainty measures of different kinds, and the geometry of their relationships, with a particular focus on the approximation problem. Part IV, Geometric Reasoning, examines the application of the geometric approach to the various elements of the reasoning chain illustrated in Chap. 4, in particular conditioning and decision making. Part V concludes the book by outlining a future, complete statistical theory of random sets, sketching future extensions of the geometric approach, and identifying high-impact applications to climate change, machine learning and artificial intelligence.

The book is suitable for researchers in artificial intelligence, statistics, and applied science engaged with theories of uncertainty, and is supported by what is perhaps the most comprehensive bibliography on belief and uncertainty theory.
IEEE Transactions on Systems, Man, and Cybernetics, Part C, 2008
In this paper we propose a geometric approach to the theory of evidence based on convex geometric interpretations of its two key notions: the belief function and Dempster's sum. On one side, we analyze the geometry of belief functions as points of a polytope in a Cartesian space called the belief space, and discuss the intimate relationship between basic probability assignments and convex combination. On the other side, we study the global geometry of Dempster's rule by describing its action on those convex combinations. By proving that Dempster's sum and convex closure commute, we are able to depict the geometric structure of conditional subspaces, i.e., the sets of belief functions conditioned by a given belief function b. Natural applications of these geometric methods to classical problems such as probabilistic approximation and canonical decomposition are outlined.
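As a point of reference for the rule whose geometry is studied here, the following is a minimal Python sketch of Dempster's sum on a finite frame; the dict-of-frozensets encoding and the function name are our own choices, not the paper's.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over frozensets.
    Mass falling on the empty set (the conflict) is renormalised away."""
    combined, conflict = {}, 0.0
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            combined[C] = combined.get(C, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's sum is undefined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}
```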
2006
In this work we extend the geometric approach to the theory of evidence in order to study the geometric behavior of the two quantities inherently associated with a belief function, i.e., the plausibility and commonality functions. After introducing the analogues of the basic probability assignment for plausibilities and commonalities, we exploit them to understand the simplicial form of both the plausibility and commonality spaces. Guided by the intuition provided by the binary case, we prove the congruence of the belief, plausibility, and commonality spaces for both standard and unnormalized belief functions, and describe the pointwise geometry of these sum functions in terms of the rigid transformation mapping them onto each other. This leads us to conjecture that the D-S formalism may in fact be a geometric calculus along the lines of geometric probability, and opens the way to a wider application of discrete mathematics to subjective probability.
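The two "sum functions" in question are straightforward to compute from a mass assignment; a small Python sketch, using our own dict-of-frozensets encoding:

```python
def plausibility(masses, A):
    """pl(A): total mass of the subsets that intersect A."""
    return sum(v for B, v in masses.items() if B & A)

def commonality(masses, A):
    """q(A): total mass of the supersets of A."""
    return sum(v for B, v in masses.items() if A <= B)

# Example on the binary frame {a, b}:
m = {frozenset({'a'}): 0.3, frozenset({'b'}): 0.1, frozenset({'a', 'b'}): 0.6}
print(plausibility(m, frozenset({'a'})))  # 0.9
print(commonality(m, frozenset({'a'})))   # 0.9 (pl and q coincide on singletons)
```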
2021
Conditioning is crucial in applied science whenever inference involving time series is required. Belief calculus is an effective way of handling such inference in the presence of epistemic uncertainty; unfortunately, different approaches to conditioning in the belief function framework have been proposed in the past, leaving the matter somewhat unsettled. Inspired by the geometric approach to uncertainty, in this paper we propose an approach to the conditioning of belief functions based on geometrically projecting them onto the simplex associated with the conditioning event in the space of all belief functions. We show here that such a geometric approach to conditioning often produces simple results with straightforward interpretations in terms of degrees of belief. This raises the question of whether classical approaches, such as for instance Dempster's conditioning, can also be reduced to some form of distance minimisation in a suitable space. The study of families of combination rules generated by (geometric) conditioning rules appears to be the natural continuation of this line of research.
2011
In this paper we study the problem of conditioning a belief function (b.f.) b with respect to an event A by geometrically projecting such a belief function onto the simplex associated with A in the space of all belief functions. Defining geometric conditional b.f.s by minimizing Lp distances between b and the conditioning simplex in such a "belief" space (rather than in the "mass" space) produces complex results with less natural interpretations in terms of degrees of belief. The question of whether classical approaches, such as Dempster's conditioning, can themselves be reduced to some form of distance minimization remains open: the generation of families of combination rules by (geometric) conditioning appears to be the natural continuation of this line of research.
2005
In this paper we investigate the properties of the relative plausibility function, the probability built by normalizing the plausibilities of the singletons associated with a belief function. On the one hand, we stress how this probability is a perfect representative of the original belief function when combined with any arbitrary probability through Dempster's rule. This leads us to conjecture that this function should also be the solution of the probabilistic approximation problem, formulated naturally in terms of Dempster's rule. On the other hand, the geometric properties of relative plausibilities are studied in the context of the geometric approach to the theory of evidence, yielding a description of the representation property which suggests a sketch for the general proof of our conjecture.
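The representation property claimed here is easy to test numerically; a sketch reusing the dempster_combine and relative_plausibility functions from the snippets above (the example masses are our own):

```python
# Check: combining b with a Bayesian b.f. p via Dempster's rule gives the
# same result as combining the relative plausibility of singletons with p.
m = {frozenset({'a'}): 0.3, frozenset({'b'}): 0.1, frozenset({'a', 'b'}): 0.6}
rel_pl = {frozenset({x}): v
          for x, v in relative_plausibility(m, ['a', 'b']).items()}
p = {frozenset({'a'}): 0.4, frozenset({'b'}): 0.6}  # an arbitrary probability

print(dempster_combine(m, p))       # {'a'}: 0.4615..., {'b'}: 0.5384...
print(dempster_combine(rel_pl, p))  # identical, illustrating the property
```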
ANNALS OF MATHEMATICS AND ARTIFICIAL INTELLIGENCE, 2010
Probability transformations of belief functions can be classified into different families according to the operator they commute with. In particular, as they commute with Dempster's rule, the relative plausibility and relative belief transforms form one such "epistemic" family, and possess natural rationales within Shafer's formulation of the theory of evidence. However, the relative belief transform only exists when some mass is assigned to singletons. We show here that relative belief is only one member of a class of "relative mass" mappings, which can be interpreted as low-cost proxies for both the plausibility and pignistic transforms.
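Since the pignistic transform is one of the benchmarks these proxies are measured against, a short Python sketch of it may help; this is Smets' standard definition, with the encoding of the earlier snippets:

```python
def pignistic(masses, frame):
    """Smets' pignistic transform: BetP(x) = sum over A containing x of m(A)/|A|."""
    return {x: sum(v / len(A) for A, v in masses.items() if x in A)
            for x in frame}

m = {frozenset({'a'}): 0.3, frozenset({'b'}): 0.1, frozenset({'a', 'b'}): 0.6}
print(pignistic(m, ['a', 'b']))  # {'a': 0.6, 'b': 0.4}
```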
Proceedings of ISIPTA, 2003
In this paper we adopt the geometric approach to the theory of evidence to study the geometric counterparts of the plausibility functions, or upper probabilities. The computation of the coordinate change between the two natural reference frames in the belief space allows us to introduce the dual notion of basic plausibility assignment and understand its relation with the classical basic probability assignment. The convex shape of the plausibility space P is recovered in analogy to what was done for the belief space, and the pointwise geometric relation between a belief function and the corresponding plausibility vector is discussed. The orthogonal projection of an arbitrary belief function s onto the probabilistic subspace is computed and compared with other significant entities, such as the relative plausibility and mean probability vectors.
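The orthogonal projection mentioned here can be reproduced numerically as an equality-constrained least-squares problem in the belief space. The sketch below uses our own conventions (coordinates indexed by the nonempty proper subsets of the frame, projection onto the affine hull of the Bayesian simplex) and is only a numerical illustration, not the paper's closed-form construction:

```python
import numpy as np
from itertools import combinations

def proper_subsets(frame):
    """Nonempty proper subsets of the frame: the coordinates of the belief space."""
    return [frozenset(S) for r in range(1, len(frame))
            for S in combinations(frame, r)]

def belief_vector(masses, subsets):
    return np.array([sum(v for B, v in masses.items() if B <= A) for A in subsets])

def orthogonal_projection(masses, frame):
    """L2 projection of a belief vector onto the affine hull of the
    probability simplex, via the KKT system of the constrained problem.
    Note: for some inputs the solution may fall outside the simplex itself."""
    subsets = proper_subsets(frame)
    # Columns: belief vectors of the Dirac (categorical) probabilities.
    B = np.stack([belief_vector({frozenset({x}): 1.0}, subsets)
                  for x in frame], axis=1)
    v = belief_vector(masses, subsets)
    n = len(frame)
    # Minimise ||B p - v||^2 subject to sum(p) = 1.
    K = np.block([[2 * B.T @ B, np.ones((n, 1))],
                  [np.ones((1, n)), np.zeros((1, 1))]])
    rhs = np.concatenate([2 * B.T @ v, [1.0]])
    return dict(zip(frame, np.linalg.solve(K, rhs)[:n]))

m = {frozenset({'a'}): 0.3, frozenset({'b'}): 0.1, frozenset({'a', 'b'}): 0.6}
# On binary frames the projection coincides with the pignistic values:
print(orthogonal_projection(m, ['a', 'b']))  # {'a': 0.6, 'b': 0.4}
```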
arXiv preprint arXiv:1810.10341, 2018
In this book we argue that the fruitful interaction of computer vision and belief calculus is capable of stimulating significant advances in both fields. From a methodological point of view, novel theoretical results concerning the geometric and algebraic properties of belief functions as mathematical objects are illustrated and discussed in Part II, with a focus on both a prospective 'geometric approach' to uncertainty and an algebraic solution to the issue of conflicting evidence. In Part III we show how these theoretical developments arise from important computer vision problems (such as articulated object tracking, data association and object pose estimation) to which, in turn, the evidential formalism is able to provide interesting new solutions. Finally, some initial steps are taken towards a generalization of the notion of total probability to belief functions, with the aim of endowing the theory of evidence with a complete battery of estimation and inference tools, to the benefit of all scientists and practitioners.
Belief Functions: Theory and Applications, 2018
In this paper we build on previous work on the geometry of Dempster's rule to investigate the geometric behaviour of various other combination rules, including Yager's, Dubois', and disjunctive combination, starting from the case of binary frames of discernment. Believability measures for unnormalised belief functions are also considered. A research programme to complete this analysis is outlined.
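For readers unfamiliar with the alternatives to Dempster's rule listed here, a compact Python sketch of two of them follows (our own encoding and function names): Yager's rule reassigns the conflicting mass to the whole frame rather than normalising it away, while disjunctive combination operates on unions, so no conflict ever arises.

```python
from itertools import product

def yager_combine(m1, m2, frame):
    """Yager's rule: conflicting mass is reassigned to the whole frame."""
    theta, out = frozenset(frame), {}
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        C = (A & B) or theta  # empty intersection: mass goes to the frame
        out[C] = out.get(C, 0.0) + v1 * v2
    return out

def disjunctive_combine(m1, m2):
    """Disjunctive combination: masses meet on unions of focal elements."""
    out = {}
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        out[A | B] = out.get(A | B, 0.0) + v1 * v2
    return out
```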
Logics in Artificial Intelligence, 2008
2014
Computer vision is an ever-growing discipline whose ambitious goal is to endow machines with the intelligent visual skills humans and animals are provided with by Nature, allowing them to interact effortlessly with complex, dynamic environments. Designing automated visual recognition and sensing systems typically involves tackling a number of challenging tasks, and requires an impressive variety of sophisticated mathematical tools. In most cases, the knowledge a machine has of its surroundings is at best incomplete: missing data is a common problem, and visual cues are affected by imprecision. The need for a coherent mathematical 'language' for the description of uncertain models and measurements then naturally arises from the solution of computer vision problems.

The theory of evidence (sometimes referred to as 'evidential reasoning', 'belief theory' or 'Dempster-Shafer theory') is perhaps one of the most successful approaches to uncertainty modelling, being arguably the most straightforward and intuitive approach to a generalized probability theory. Emerging in the late Sixties from a profound criticism of the more classical Bayesian theory of inference and modelling of uncertainty, it has stimulated in the last few decades an extensive discussion of the epistemic nature of both subjective 'degrees of belief' and frequentist 'chances' or relative frequencies. More recently, a renewed interest in belief functions, the mathematical generalization of probabilities which are the object of study of the theory of evidence, has seen a blossoming of applications to a variety of fields of applied science.

In this book we are going to show how, indeed, the fruitful interaction of computer vision and evidential reasoning is able to stimulate a number of advances in both fields. From a methodological point of view, novel theoretical advances concerning the geometric and algebraic properties of belief functions as mathematical objects will be illustrated in some detail in Part II, with a focus on a prospective 'geometric approach' to uncertainty and an algebraic solution to the issue of conflicting evidence. In Part III we will illustrate how these new perspectives on the theory of belief functions arise from important computer vision problems, such as articulated object tracking, data association and object pose estimation, to which in turn the evidential formalism can give interesting new solutions. Finally, some initial steps towards a generalization of the notion of total probability to belief functions will be taken, with the aim of endowing the theory of evidence with a complete battery of estimation and inference tools, to the benefit of scientists and practitioners.
Conditioning is crucial in applied science whenever inference involving time series is required. Belief calculus is an effective way of handling such inference in the presence of uncertainty, but different approaches to conditioning in that framework have been proposed in the past, leaving the matter unsettled. We propose here an approach to the conditioning of belief functions based on geometrically projecting them onto the simplex associated with the conditioning event in the space of all belief functions. We show that such a geometric approach to conditioning often produces simple results with straightforward interpretations in terms of degrees of belief. The question of whether classical approaches, such as for instance Dempster's conditioning, can also be reduced to some form of distance minimization remains open: the study of families of combination rules generated by (geometric) conditioning rules appears to be the natural continuation of the presented research.
Proceedings of BELIEF 2010
"Conditioning is crucial in applied science when inference involving time series is involved. Belief calculus is an e ffective way of handling such inference in the presence of uncertainty, but di fferent approaches to conditioning in that framework have been proposed in the past, leaving the matter unsettled. We propose here an approach to the conditioning of belief functions based on geometrically projecting them onto the simplex associated with the conditioning event in the space of all belief functions. Two di fferent such simplices can be defi ned, as each belief function can be represented as either the vector of its basic probability values or the vector of its belief values. We show here that such a geometric approach to conditioning often produces simple results with straightforward interpretations in terms of degrees of belief. The question of whether classical approaches, such as for instance Dempster's conditioning, can also be reduced to some form of distance minimization remains open: the study of families of combination rules generated by (geometric) conditioning rules appears to be the natural prosecution of the presented research."
2000
We discuss the justifications of Bayesianism by Cox and Jaynes, and relate them to a recent critique by Halpern (JAIR, vol. 10, 1999, pp. 67–85). We show that a problem with Halpern's example is that a finite and natural refinement of the model leads to inconsistencies, and that the same is the case for every model in which rescaling to probability cannot be done. We also discuss other problems with the justifications and the assumptions usually made on the function F describing the plausibility of a conjunction. We note that the commonly postulated monotonicity condition should be strengthened to strict monotonicity before Cox's justification becomes convincing. On the other hand, we note that the commonly assumed regularity requirements on F (such as continuity) or on its domain (such as denseness) are unnecessary.
submitted to the International Journal of …, 2007