2010, Synthese
…
AI-generated Abstract
Contemporary models of belief and information have evolved to address the complexities of multi-agent scenarios, integrating belief dynamics and interactions. This special issue of Synthese highlights contributions ranging from single-agent belief revision to multi-agent belief ascription, presenting a table that outlines the landscape of belief dynamics and the key research areas explored, while also indicating important domains for future investigation.
In this article I want to discuss some philosophical problems one encounters when trying to model the dynamics of epistemic states. Apart from being of interest in themselves, I believe that solutions to these problems will be crucial for any attempt to use computers to handle changes of knowledge systems. Problems concerning knowledge representation and the updating of such representations have become the focus of much recent research in artificial intelligence (AI).
Artificial Intelligence, 1987
Several new logics for belief and knowledge are introduced and studied, all of which have the property that agents are not logically omniscient. In particular, in these logics, the set of beliefs of an agent does not necessarily contain all valid formulas. Thus, these logics are more suitable than traditional logics for modelling beliefs of humans (or machines) with limited reasoning capabilities. Our first logic is essentially an extension of Levesque's logic of implicit and explicit belief, which we extend to allow multiple agents and higher-level belief (i.e., beliefs about beliefs). Our second logic deals explicitly with "awareness," where, roughly speaking, it is necessary to be aware of a concept before one can have beliefs about it. Our third logic gives a model of "local reasoning," where an agent is viewed as a "society of minds," each with its own cluster of beliefs, which may contradict each other.
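A minimal sketch of how an awareness-style semantics of this kind typically works (a generic reconstruction for orientation, with assumed notation, not necessarily the exact clauses of the paper): explicit belief is implicit belief restricted to the formulas the agent is aware of.

```latex
% Generic awareness semantics (sketch; notation assumed, not taken from the paper).
% L_i: implicit belief, A_i: awareness, X_i: explicit belief of agent i.
% M is a Kripke model with accessibility relations R_i and awareness sets \mathcal{A}_i(w).
\begin{align*}
M, w \models L_i\varphi &\iff M, v \models \varphi \text{ for every } v \text{ with } (w, v) \in R_i \\
M, w \models A_i\varphi &\iff \varphi \in \mathcal{A}_i(w) \\
M, w \models X_i\varphi &\iff M, w \models L_i\varphi \text{ and } M, w \models A_i\varphi
\end{align*}
% An agent explicitly believes only what it both implicitly believes and is aware of,
% so valid formulas outside \mathcal{A}_i(w) need not be believed: no logical omniscience.
```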
The study of the mathematical aspects of belief formation, information processing and rational belief change is of central importance in a number of different fields, namely artificial intelligence, computer science, game theory, logic, philosophy and psychology. The area of belief change studies how a rational agent may maintain its beliefs about a possibly changing environment after obtaining or perceiving new information about the environment. This new information could include properties of the actual world, occurrences of events, and, in the case of multiple agents, actions performed by other agents, as well as the beliefs, preferences or actions (including communication acts) of other agents. Such agents could be acting and sensing in a dynamic world, coalescing information obtained from various sources, negotiating with other agents, or otherwise augmenting and revising their knowledge.

The most important question in Game Theory is how to rationally form beliefs about other players' behavior and how to rationally revise those beliefs in light of observed actions. Traditionally, Game Theory has relied mostly on probabilistic models of beliefs, although recent research has focused on qualitative aspects of belief change. A new branch of modal logic, called Dynamic Epistemic Logic, has emerged that investigates the effects of events that involve information being revealed to a group of agents in a variety of ways, such as through a public announcement or a private announcement.

In artificial intelligence, the relatively recent emergence of the field of cognitive robotics, which is concerned with endowing artificial agents with cognitive functions that involve reasoning about goals, actions, the states of other agents, collaboration and negotiation, etc., has given impetus to the development of computational operators for belief change and the identification of issues arising from concrete, evolving sets of knowledge. Another related new field of research, called Social Software, maintains that mathematical models developed to reason about the knowledge and beliefs of a group of agents can be used to deepen our understanding of social interaction and aid in the design of successful social institutions. Social Software is the formal study of social procedures, focusing on three aspects: (1) the logical and algorithmic structure of social procedures (the main contributors to this area are computer scientists), (2) knowledge and information (the main contributors to this area are logicians and philosophers), and (3) incentives (the main contributors are game theorists and economists).

The area of belief change is thus of interest to many research communities. To date, there has been limited interaction among these communities. The purpose of this 2-day workshop was to bring together researchers from these communities.
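As a pointer to the dynamic-epistemic machinery mentioned above, the truth clause for a truthful public announcement can be sketched in its usual textbook form (given here only for orientation, not quoted from the workshop report):

```latex
% Public announcement logic, standard semantics (illustrative).
% [!\psi]\varphi reads: "after a truthful public announcement of \psi, \varphi holds".
\begin{align*}
M, w \models [!\psi]\varphi &\iff \big( M, w \models \psi \;\Rightarrow\; M|_{\psi}, w \models \varphi \big)
\end{align*}
% where M|_{\psi} is M restricted to the worlds satisfying \psi:
% W' = \{ v \in W : M, v \models \psi \},  and  R_i' = R_i \cap (W' \times W').
% A public announcement thus models information revealed simultaneously to the whole group;
% private announcements are handled by more general event models.
```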
The British Journal for the Philosophy of Science, 1984
We provide an introduction to interactive belief systems from a qualitative and semantic point of view. Properties of belief hierarchies are formulated locally. Among the properties considered are "Common belief in no error" (which has been shown to have important game theoretic applications), "Negative introspection of common belief" (which plays a role in the epistemic foundations of correlated equilibrium), "Truth of common belief" and "Truth about common belief". The relationship between these properties is studied.
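For orientation, the notions in this list can be read against the standard global definitions of shared and common belief (the paper itself works with local formulations; the notation below is assumed, not the author's):

```latex
% Standard global readings (assumed notation; the paper states local versions).
\begin{align*}
E\varphi &:= \bigwedge_{i} B_i\varphi
  && \text{``everyone believes } \varphi\text{''} \\
B^{*}\varphi &\text{ holds iff } E\varphi,\; EE\varphi,\; EEE\varphi,\; \dots \text{ all hold}
  && \text{``common belief in } \varphi\text{''}
\end{align*}
% "Truth of common belief" is then the schema  B^{*}\varphi \to \varphi,
% while "common belief in no error" says it is common belief that no agent believes a falsehood.
```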
2010
We investigate a specific model of knowledge and beliefs and their dynamics. The model is inspired by public announcement logic and by the way that logic is used to approach puzzles concerning knowledge. In the model, epistemic considerations are based on ontology. The main notion that constitutes a bridge between these two disciplines is the notion of epistemic capacities.
Bulletin of Economic Research, 2002
The paper models information as possibilities consistent with signals received from the environment. Knowledge is obtained by reasoning about the signals received as well as those that might have been received but were not. The term 'knowledge' is used to refer to those beliefs that are obtained by reasoning about the available information, and nothing else. That is, one ought to be able to fully justify what one knows by means of the information that is available. The term 'belief' is used to refer to those beliefs that are based on information but not necessarily only on information. The author investigates the relationship between information, knowledge and belief, as well as the issue of updating knowledge and belief in response to changes in information.
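One way to make the signal-based picture concrete is the usual possibility-correspondence reading (a generic reconstruction with assumed notation, not the author's own): the information at a state consists of the states that would have produced the same signal, and knowledge is truth throughout that set.

```latex
% Generic reconstruction of the signal model (notation assumed for illustration).
% s(w): the signal received at state w.  The information available at w is
\begin{align*}
I(w) &= \{\, w' \in W : s(w') = s(w) \,\} \\
w \models K\varphi &\iff w' \models \varphi \text{ for every } w' \in I(w)
\end{align*}
% Knowledge is thus fully justified by the available information;
% belief must be consistent with I(w) but may draw on more than the information alone.
```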
Philosophical Studies, 2006
At first sight, the modern agenda of epistemology has little to do with logic. Topics include different definitions of knowledge, its basic formal properties, debates between externalist and internalist positions, and above all: perennial encounters with sceptics lurking behind every street corner, especially in the US. The entry 'Epistemology' in the Routledge Encyclopedia of Philosophy (Klein, 1993) and the anthology (Kim and Sosa, 2000) give an up-to-date impression of the field. Now, epistemic logic started as a contribution to epistemology, or at least a tool in its modus operandi, with the seminal book Knowledge and Belief (Hintikka, 1962, 2005). Formulas like K_i φ for "the agent i knows that φ" and B_i φ for "the agent i believes that φ" provided logical forms for stating and analyzing philosophical propositions and arguments. And more than that, their model-theoretic semantics in terms of ranges of alternatives provided an appealing extensional way of thinking about what agents know or believe in a given situation. In particular, on Hintikka's view, an agent knows those propositions which are true in all situations compatible with what she knows about the actual world, i.e., her current range of uncertainty.
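The last sentence is the familiar Kripke-style truth clause for knowledge, which in standard notation reads:

```latex
% Hintikka-style clause: agent i knows \varphi at w iff \varphi holds in every world
% compatible with what i knows at w (her current range of uncertainty).
\begin{align*}
M, w \models K_i\varphi &\iff M, v \models \varphi \text{ for every } v \text{ with } w \sim_i v
\end{align*}
% where \sim_i is agent i's epistemic accessibility (indistinguishability) relation.
```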
Logic, Epistemology, and the Unity of Science, 2016
In Social Epistemology and Epistemic Agency: De-Centralizing Epistemic Agency, ed. P. Reider (Rowman & Littlefield)
Dagstuhl Seminar Proceedings, 2009
Logic, Epistemology, and the Unity of Science
International Journal of Game Theory, 1999
Springer eBooks, 2003
The Journal of Symbolic Logic, 1985
ILLC Amsterdam. To appear in …, 2009