1996
Abstract We present a connectionist architecture that supports almost instantaneous deductive and abductive reasoning. The deduction algorithm responds in a few steps for single-rule queries and, in general, takes time linear in the number of rules in the query. The abduction algorithm produces an explanation in a few steps and the best explanation in time linear in the size of the assumption set.
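The paper's connectionist encoding is not reproduced here, but the underlying symbolic task can be sketched in a few lines of Python (the rule format and the function names `deduce` and `best_explanation` are illustrative, not the paper's mechanism): abductive explanation amounts to searching for the smallest assumption set whose deductive closure contains the observation.

```python
from itertools import combinations

def deduce(rules, facts):
    """Forward-chain Horn rules (body, head) to a fixpoint."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and set(body) <= derived:
                derived.add(head)
                changed = True
    return derived

def best_explanation(rules, assumables, observation):
    """Smallest assumption set whose deductive closure contains
    the observation; None if no subset explains it."""
    for k in range(len(assumables) + 1):
        for subset in combinations(assumables, k):
            if observation in deduce(rules, subset):
                return set(subset)
    return None
```

For example, with rules `[(("rain",), "wet"), (("wet",), "shoes_wet")]` and assumables `["rain"]`, querying `"shoes_wet"` returns `{"rain"}`. The naive subset search here is exponential; the point of the connectionist architecture in the abstract is precisely to avoid that cost.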
1998
Abstract Hybrid connectionist-symbolic systems have been the subject of much recent research in AI. By focusing on the implementation of high-level human cognitive processes (e.g., rule-based inference) on low-level, brain-like structures (e.g., neural networks), hybrid systems inherit both the efficiency of connectionism and the comprehensibility of symbolism. This paper presents the Basic Reasoning Applicator Implemented as a Neural Network (BRAINN).
Topoi, 2007
Abduction is or subsumes a process of inference. It entertains possible hypotheses and it chooses hypotheses for further scrutiny. There is a large literature on various aspects of non-symbolic, subconscious abduction. There is also a very active research community working on the symbolic (logical) characterisation of abduction, which typically treats it as a form of hypothetico-deductive reasoning. In this paper we start to bridge the gap between the symbolic and subsymbolic approaches to abduction. We are interested in benefiting from developments made by each community. In particular, we are interested in the ability of nonsymbolic systems (neural networks) to learn from experience using efficient algorithms and to perform massively parallel computations of alternative abductive explanations. At the same time, we would like to benefit from the rigour and semantic clarity of symbolic logic. We present two approaches to dealing with abduction in neural networks. One of them uses Connectionist Modal Logic and a translation of Horn clauses into modal clauses to come up with a neural network ensemble that computes abductive explanations in a top-down fashion. The other combines neural-symbolic systems and abductive logic programming and proposes a neural architecture which performs a more systematic, bottom-up computation of alternative abductive explanations. Both approaches employ standard neural network architectures which are already known to be highly effective in practical learning applications. Unlike previous work in the area, our aim is to promote the integration of reasoning and learning so that the neural network provides the machinery for cognitive computation, inductive learning and hypothetical reasoning, while logic provides the rigour and explanation capability of the systems, facilitating the interaction with the outside world.
Although it is left as future work to determine whether the structure of one of the proposed approaches is more amenable to learning than the other, we hope to have contributed to the development of the area by approaching it from the perspective of symbolic and sub-symbolic integration.
In this paper, we revisit the long-standing debate between the two main approaches to Artificial Intelligence: symbolic AI and connectionist AI. Defenders of connectionist AI argue that modelling at the level of neurons is necessary for knowledge representation, and dismiss symbolic AI for failing to do so. Symbolic AI, on the other hand, offers better models of knowledge representation and higher reasoning capabilities. We therefore investigate whether the connectionists are justified in their arguments: symbolic AI makes no attempt to consider knowledge representation and manipulation at the neuronal level, and it fails most of the time when the agents it considers are embodied in a real-world environment; connectionist approaches, meanwhile, take embodiment seriously, yet there are as yet no serious signs of how high-level reasoning might emerge from the low-level functioning of interconnected neurons.
Theoretical computer science, 2006
1992
Abstract Symbol manipulation as used in traditional Artificial Intelligence has been criticized by neural net researchers for being excessively inflexible and sequential. On the other hand, the application of neural net techniques to the types of high-level cognitive processing studied in traditional artificial intelligence presents major problems as well. We claim that a promising way out of this impasse is to build neural net models that accomplish massively parallel case-based reasoning.
… -symbolic integration: From …, 1997
An inference engine for a hybrid representation scheme based on neurules is presented. Neurules are a kind of hybrid rule that combines a symbolic representation (production rules) with a connectionist one (an adaline unit). The inference engine uses a connectionist technique, based on the 'firing potential', a measure of the firing tendency of a neurule, together with symbolic pattern matching. It is shown to be more efficient and natural than purely connectionist inference engines. Explanations of the 'how' type can be provided in the form of if-then symbolic rules.
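As a rough illustration of the adaline unit underlying a neurule, the Python sketch below evaluates a rule from a bias factor and per-condition significance factors. The firing-potential formula here is a hypothetical stand-in (a known-sum normalized against what unknown conditions could still contribute), not the exact definition from the neurule literature.

```python
def neurule_fires(bias, sig_factors, truth_values):
    """Adaline-style neurule evaluation: conditions take values
    +1 (true), -1 (false) or 0 (unknown); the rule fires when the
    weighted sum of condition values plus the bias exceeds zero."""
    s = bias + sum(sf * tv for sf, tv in zip(sig_factors, truth_values))
    return s > 0

def firing_potential(bias, sig_factors, truth_values):
    """Illustrative firing tendency (not the paper's exact formula):
    the known part of the weighted sum, normalized by that part plus
    what the still-unknown conditions could contribute; values close
    to 1 indicate a strong tendency to fire."""
    known = bias + sum(sf * tv for sf, tv in zip(sig_factors, truth_values))
    pending = sum(abs(sf) for sf, tv in zip(sig_factors, truth_values) if tv == 0)
    total = abs(known) + pending
    return known / total if total else 0.0
```

With bias -2 and significance factors [3, 2], the rule fires when both conditions are true (sum 3) but not when the second is false (sum -1); with the second unknown, the firing potential is 1/3.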
Artificial Intelligence, 1995
The paper presents a connectionist framework that is capable of representing and learning propositional knowledge. An extended version of propositional calculus is developed and is demonstrated to be useful for nonmonotonic reasoning, for dealing with conflicting beliefs and for coping with inconsistency generated by unreliable knowledge sources. Formulas of the extended calculus are proved to be equivalent in a very strong sense to symmetric networks (like Hopfield networks and Boltzmann machines), and efficient algorithms are given for translating back and forth between the two forms of knowledge representation. A fast learning procedure is presented that allows symmetric networks to learn representations of unknown logic formulas by looking at examples. A connectionist inference engine is then sketched whose knowledge is either compiled from a symbolic representation or learned inductively from training examples. Experiments with large-scale randomly generated formulas suggest that the parallel local search that is executed by the networks is extremely fast on average. Finally, it is shown that the extended logic can be used as a high-level specification language for connectionist networks, into which several recent symbolic systems may be mapped. The paper demonstrates how a rigorous bridge can be constructed that ties together the (sometimes opposing) connectionist and symbolic approaches.
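The equivalence between formulas and symmetric networks rests on viewing satisfaction as energy minimization: each clause becomes a weighted constraint, and the network's local search descends the energy landscape. The Python sketch below uses a greedy single-flip descent as an illustrative stand-in for Hopfield/Boltzmann dynamics (the clause encoding and function names are assumptions, not the paper's translation).

```python
def penalty(assignment, clauses):
    """Energy of an assignment: total weight of violated clauses.
    A clause is (weight, [(var, sign)]); sign True means a positive
    literal, and a clause is satisfied if any literal matches."""
    e = 0
    for weight, clause in clauses:
        if not any(assignment[v] == s for v, s in clause):
            e += weight
    return e

def local_search(variables, clauses, steps=100):
    """Asynchronous descent: repeatedly flip the single variable
    that most reduces the energy, until no flip helps."""
    a = {v: False for v in variables}
    for _ in range(steps):
        best_v, best_e = None, penalty(a, clauses)
        for v in variables:
            a[v] = not a[v]
            e = penalty(a, clauses)
            if e < best_e:
                best_v, best_e = v, e
            a[v] = not a[v]
        if best_v is None:
            break
        a[best_v] = not a[best_v]
    return a
```

For the formula (a or b) and (not a or b), encoded as two unit-weight clauses, the descent reaches a zero-energy assignment with b true. Greedy descent can get stuck in local minima; stochastic (Boltzmann-style) updates are what make the network version robust.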
Vietnam Journal of Computer Science, 2014
We present a simple, distributed reasoning system for first-order logic, which applies a connection calculus as its inference method. The calculus was proposed by Bibel as a generalization of other popular approaches, such as the tableau calculus and resolution-based inference. The system is constructed in the lean-deduction style and was inspired to some extent by the sequential reasoner leanCoP, implemented in Prolog. Our reasoner takes the form of a relational program in the Oz language. In this programming model, the computational strategy is a parameter of the program, in the form of a special object called a search engine. The same program can therefore be run in various ways, in particular in parallel on distributed machines. For this purpose, we use a parallel search engine available on the Mozart platform, a programming environment for Oz. We also describe the results of experiments estimating the speedup obtained by distributed processing.
Lecture Notes in Computer Science, 2012
For a long time, connectionist architectures have been criticized for having propositional fixation, lack of compositionality and, in general, for their weakness in representing sophisticated symbolic information and processing it. This work offers a novel approach that allows full integration of symbolic AI with the connectionist paradigm. We show how to encode and process relational knowledge using artificial neural networks (ANNs), such as Boltzmann Machines. The neural architecture uses a working memory (WM), consisting of pools of "binders", and a long-term synaptic memory that can store a large relational knowledge-base (KB). A compact variable binding mechanism is proposed which dynamically allocates ensembles of neurons when a query is clamped, retrieving KB items until a solution emerges in the WM. We illustrate the proposal through non-trivial predicate unification problems: knowledge items are only retrieved into the WM upon need, and unified, graph-like structures emerge at equilibrium as an activation pattern of the neural network. Our architecture is based on the fact that some attractor-based ANNs may be viewed as performing constraint satisfaction, where, at equilibrium, fixed-points maximally satisfy a set of weighted constraints. We show how to encode relational graphs as neural activation in WM and how to use constraints that are encoded in synapses, in order to retrieve and process such complex structures. Both procedural (the unification algorithm) and declarative knowledge (logic formulae) are first expressed as constraints and then used to generate (or learn) weighted synaptic connections. The architecture has no central control and is inherently robust to unit failures. Contrary to previous connectionist suggestions, this approach is expressive, compact, accurate, and goal directed. The mechanism is universal and has a simple underlying computational principle.
As such, it may be further adapted for applications that combine the advantages of both connectionist and traditional symbolic AI and may be used in modeling aspects of human reasoning.
Lecture Notes in Computer Science, 1996
Over the last years, several new approaches for modeling situations, actions, and causality within a deductive framework were proposed. These new approaches treat the facts about a situation as resources, which are consumed and produced by actions. In this paper we extend one of these approaches, viz. an equational logic approach, by reifying actions to become resources as well. Using the concept of a membrane, we show how abstractions and hierarchical planning can be modeled in such an equational logic. Moreover, we rigorously prove that the extended equational logic program can be mapped onto the so-called chemical abstract machine. As this machine is a model for parallel processes, this may lead to a parallel computational model for reasoning about situations, actions, and causality.
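The resource-oriented reading of facts and actions can be sketched as multiset rewriting, which is essentially what the chemical abstract machine formalizes: a "solution" of molecules, and reaction rules that consume reactants and produce products. The Python sketch below is an illustrative stand-in (the rule format and `react` are assumptions, not the paper's equational encoding).

```python
from collections import Counter

def react(solution, rules, max_steps=100):
    """Chemical-abstract-machine sketch: repeatedly apply the first
    rule whose reactants are all present in the solution, consuming
    the reactants and adding the products, until no rule applies."""
    sol = Counter(solution)
    for _ in range(max_steps):
        for reactants, products in rules:
            need = Counter(reactants)
            if all(sol[m] >= n for m, n in need.items()):
                sol -= need      # consume resources
                sol += Counter(products)
                break
        else:
            break  # no rule applicable: solution is inert
    return sol
```

An action then is just a molecule among the others: the rule `(["door_closed", "open_action"], ["door_open"])` consumes both the fact and the action token, capturing the resource view of causality.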
Knowledge-Based Systems, 1995
The relationship between symbolism and connectionism has been one of the major issues in recent artificial intelligence research. An increasing number of researchers on each side have tried to adopt the desirable characteristics of the other approach. A major open question in this field is the extent to which a connectionist architecture can accommodate basic concepts of symbolic inference, such as a dynamic variable binding mechanism and a rule and fact encoding mechanism involving n-ary predicates. One of the current leaders in this area is the connectionist rule-based system proposed by Shastri and Ajjanagadde. The paper demonstrates that the mechanism for variable binding which they advocate is fundamentally limited, and it shows how a reinterpretation of the primitive components and corresponding modifications of their system can extend the range of inference which can be supported. Our extension hinges on a basic structural modification of the network components and further modifications of the rule and fact encoding mechanism. These modifications allow the extended model to have more expressive power in dealing with symbolic knowledge, as in the unification of terms across many groups of unifying arguments.
2004
In this paper, we describe a model for reasoning using forward chaining for predicate logic rules and facts, with coarse-coded distributed representations for instantiated predicates in a connectionist framework. Distributed representations are known to offer good generalization, error correction and graceful degradation of performance under noise. The system supports complex rules involving multiple conjunctions. It solves the variable binding problem using coarse-coded distributed representations of instantiated predicates, without the need to decode them into localist representations. The system has performed forward reasoning successfully on the given reasoning task. Its generalization on unseen inputs and its fault tolerance under noise are studied and found to give good results.
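Coarse-coded distributed representations of instantiated predicates can be sketched as superpositions of sparse random codes. The role-shift binding below is a hypothetical stand-in for the paper's scheme (the code size, shift trick and function names are assumptions), meant only to show how different argument orders yield distinguishable activation patterns without decoding into localist units.

```python
import random

random.seed(0)
N, K = 64, 8  # code length and number of active units per symbol

_codes = {}

def code(symbol):
    """Coarse code: each symbol activates a fixed random set of K units."""
    if symbol not in _codes:
        _codes[symbol] = frozenset(random.sample(range(N), K))
    return _codes[symbol]

def encode(predicate, *fillers):
    """Hypothetical role-shift binding: the i-th argument's code is
    rotated by i before superposition, so likes(john, mary) and
    likes(mary, john) produce different overall patterns."""
    units = set(code(predicate))
    for i, f in enumerate(fillers, start=1):
        units |= {(u + i) % N for u in code(f)}
    return frozenset(units)

def similarity(x, y):
    """Jaccard overlap between two activation patterns."""
    return len(x & y) / len(x | y)
```

Identical instantiations give overlap 1.0, while swapping the arguments gives a strictly smaller overlap, which is the property a forward chainer needs in order to match rules against bound instances directly in the distributed code.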
2002
The paper is an attempt to summarize the previous works of the author on integrating deductive and abductive reasoning paradigms for solving the classification task. A two-tiered reasoning and learning architecture, in which Case-Based Reasoning (CBR) is used both as a corrective of the solutions inferred by a deductive reasoning system and as a method for accumulating and refining knowledge, is briefly described. As illustrative examples, applications of the approach to the case-based maintenance of rule-based systems and to the case-based refinement of neural networks are presented.
Doctor of Philosophy, Department of Computer Science. Major Professor: Not Listed. Symbolic knowledge representation and reasoning and deep learning are fundamentally different approaches to artificial intelligence with complementary capabilities. The former are transparent and data-efficient, but they are sensitive to noise and cannot be applied to non-symbolic domains where the data is ambiguous. The latter can learn complex tasks from examples and are robust to noise, but they are black boxes, require large amounts of (not necessarily easily obtained) data, and are slow to learn and prone to adversarial examples. Either paradigm excels at certain types of problems where the other performs poorly. In order to develop stronger AI systems, integrated neuro-symbolic systems that combine artificial neural networks and symbolic reasoning are being sought. In this context, one of the fundamental open problems is how to perform logic-based deductive reasoning over knowledge bases by means of...
ABSTRACT. The relation between logic and thought has long been controversial, but has recently influenced theorizing about the nature of mental processes in cognitive science. One prominent tradition argues that to explain the systematicity of thought we must posit syntactically structured representations inside the cognitive system which can be operated upon by structure sensitive rules similar to those employed in systems of natural deduction. I have argued elsewhere that the systematicity of human thought might better be explained as resulting from the fact that we have learned natural languages which are themselves syntactically structured. According to this view, symbols of natural language are external to the cognitive processing system and what the cognitive system must learn to do is produce and comprehend such symbols. In this paper I pursue that idea by arguing that ability in natural deduction itself may rely on pattern recognition abilities that enable us to operate on external symbols rather than encodings of rules that might be applied to internal representations. To support this suggestion, I present a series of experiments with connectionist networks that have been trained to construct simple natural deductions in sentential logic. These networks not only succeed in reconstructing the derivations on which they have been trained, but in constructing new derivations that are only similar to the ones on which they have been trained.
Lecture Notes in Computer Science, 1992
CHCL is a connectionist inference system for Horn logic which is based on the connection method and uses limited resources. This paper gives an overview of the system and its implementation.
2020
This paper compares the various conceptions of “real-time” in the context of AI, as different ways of taking the processing time into consideration when problems are solved. An architecture of real-time reasoning and learning is introduced, which is one aspect of the AGI system NARS. The basic idea is to form problem-solving processes flexibly and dynamically at run time by using inference rules as building blocks and incrementally self-organizing the system’s beliefs and skills, under the restriction of time requirements of the tasks. NARS is designed under the Assumption of Insufficient Knowledge and Resources, which leads to an inherent ability to deal with varying situations in a timely manner.
We present a multi-domain computational model for symbolic reasoning that was designed with the aim of matching human performance. The computational model is able to reason by deduction, induction, and abduction. It begins with an arbitrary theory in a given domain and gradually extends this theory as new regularities are learned from positive and negative examples. At the core of the computational model is a cognitive model with bounded cognitive resources. The combinatorial explosion problem, which frequently arises in inductive learning, is tackled by searching for solutions inside this cognitive model only. By way of example, we show that the computational model can learn elements of two different domains, namely arithmetic and English grammar.