2013, Logical Investigations
Kolmogorov complexity and the incompleteness theorems of Chaitin and Gödel are commonly considered relative to a fixed coding of objects and a standard notion of algorithm. In essence, they are independent of almost all properties of concrete theories, algorithms and codings. Results this stable and general ought to have deep methodological, philosophical and even theological consequences. Here we consider their abstract form and the speculations that can be derived from them, and partly from modern computer science, IT practice and physics. The main ones are the following: a new proof of Kant's Third Antinomy and of Parkinson's law of committee, relations to cooperative creative activity, multi-language programming, benevolence to others' views, the dilemma of deism versus atheism, and finally a methodological approach to theology.
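Kolmogorov complexity, the starting point of the abstract above, is uncomputable, but any real compressor bounds it from above. A minimal Python sketch of that idea (the function name, test strings and thresholds are my own illustrative assumptions, not from the paper):

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Length of a compressed encoding of `data`: a crude, computable
    upper bound on Kolmogorov complexity. K(x) itself is uncomputable,
    so a real compressor can only ever bound it from above."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500       # highly regular: a short description exists
noisy = os.urandom(1000)    # random bytes: incompressible with high probability

# The regular string compresses far below its length; the random one barely at all.
assert complexity_upper_bound(regular) < len(regular) // 10
assert complexity_upper_bound(noisy) > len(noisy) // 2
```

The gap between the two bounds is the compressor's estimate of how much "algorithmic regularity" each string contains; no algorithm can compute the true minimum description length, which is the core of the incompleteness results the abstract invokes.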
Journal of Scientific Exploration, 2002
SYNTHESIS PHILOSOPHICA, 1992
The author in this text offers a series of hypotheses regarding the manner in which Kant solved the fundamental problem of transcendental philosophy, namely, the problem of the possibility of synthetic judgements a priori. The task at hand is to determine the a priori conditions required for synthetic judgements to be presumed as either given or true. This, as the author himself indicates, entails an analysis of some of the major steps of Kant's philosophical method: the theory of categories, the metaphysical and transcendental exposition of judgements, the status of concepts (e.g. space and time), and the operations of pure reason. The author also offers an analysis of the theses of objectivity and ideality, as well as Kant's transcendental deduction. In the end, the author demonstrates that there is a circle in Kant's transcendental proofs, although not a vicious one.
2012
Wolfram's Principle of Computational Equivalence (PCE) implies that universal complexity abounds in nature. This paper comprises three sections. In the first section we consider the question why there are so many universal phenomena around. So, in a sense, we seek a driving force behind the PCE, if any. We postulate a principle GNS that we call the Generalized Natural Selection Principle, which together with the Church-Turing Thesis is seen to be equivalent to a weak version of PCE. In the second section we ask the question why we do not observe any phenomena that are complex but non-universal. We choose a cognitive setting to embark on this question and make some analogies with formal logic. In the third and final section we report on a case study where we see rich structures arise everywhere.
Remarks on Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason in Chaitin, Wittgenstein, Hofstadter, Wolpert, Doria, da Costa, Godel, Searle, Rodych, Berto, Floyd, Moyal-Sharrock and Yanofsky (2019), 2019
It is commonly thought that such topics as Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason are disparate scientific, physical or mathematical issues having little or nothing in common. I suggest that they are largely standard philosophical problems (i.e., language games) which were resolved by Wittgenstein over 80 years ago. Wittgenstein also demonstrated the fatal error in regarding mathematics or language or our behavior in general as a unitary coherent logical ‘system,’ rather than as a motley of pieces assembled by the random processes of natural selection. “Gödel shows us an unclarity in the concept of ‘mathematics’, which is indicated by the fact that mathematics is taken to be a system” and we can say (contra nearly everyone) that is all that Gödel and Chaitin show. Wittgenstein commented many times that ‘truth’ in math means axioms or the theorems derived from axioms, and ‘false’ means that one made a mistake in using the definitions, and this is utterly different from empirical matters where one applies a test. Wittgenstein often noted that to be acceptable as mathematics in the usual sense, it must be usable in other proofs and it must have real world applications, but neither is the case with Gödel’s Incompleteness. Since it cannot be proved in a consistent system (here Peano Arithmetic but a much wider arena for Chaitin), it cannot be used in proofs and, unlike all the ‘rest’ of PA, it cannot be used in the real world either.
As Rodych notes “…Wittgenstein holds that a formal calculus is only a mathematical calculus (i.e., a mathematical language-game) if it has an extra-systemic application in a system of contingent propositions (e.g., in ordinary counting and measuring or in physics) …” Another way to say this is that one needs a warrant to apply our normal use of words like ‘proof’, ‘proposition’, ‘true’, ‘incomplete’, ‘number’, and ‘mathematics’ to a result in the tangle of games created with ‘numbers’ and ‘plus’ and ‘minus’ signs etc., and with ‘Incompleteness’ this warrant is lacking. Rodych sums it up admirably. “On Wittgenstein’s account, there is no such thing as an incomplete mathematical calculus because ‘in mathematics, everything is algorithm [and syntax] and nothing is meaning [semantics]…” I make some brief remarks which note the similarities of these ‘mathematical’ issues to economics, physics, game theory, and decision theory. Those wishing further comments on philosophy and science from a Wittgensteinian two systems of thought viewpoint may consult my other writings -- Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019 3rd ed (2019), The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle 2nd ed (2019), Suicide by Democracy 4th ed (2019), The Logical Structure of Human Behavior (2019), The Logical Structure of Consciousness (2019), Understanding the Connections between Science, Philosophy, Psychology, Religion, Politics, and Economics and Suicidal Utopian Delusions in the 21st Century 5th ed (2019), Remarks on Impossibility, Incompleteness, Paraconsistency, Undecidability, Randomness, Computability, Paradox, Uncertainty and the Limits of Reason in Chaitin, Wittgenstein, Hofstadter, Wolpert, Doria, da Costa, Godel, Searle, Rodych, Berto, Floyd, Moyal-Sharrock and Yanofsky (2019), and The Logical Structure of Philosophy, Psychology, Sociology, Anthropology, Religion, Politics, Economics, Literature and History (2019).
arXiv preprint cs.CC/0604072, 2006
The science of complexity is based on a new way of thinking that stands in sharp contrast to the philosophy underlying Newtonian science, which is based on reductionism, determinism, and objective knowledge. This paper reviews the historical development of this new world view, focusing on its philosophical foundations. Determinism was challenged by quantum mechanics and chaos theory. Systems theory replaced reductionism by a scientifically based holism. Cybernetics and postmodern social science showed that knowledge is intrinsically subjective. These developments are being integrated under the header of "complexity science". Its central paradigm is the multi-agent system. Agents are intrinsically subjective and uncertain about their environment and future, but out of their local interactions, a global organization emerges. Although different philosophers, and in particular the postmodernists, have voiced similar ideas, the paradigm of complexity still needs to be fully assimilated by philosophy. This will throw a new light on old philosophical issues such as relativism, ethics and the role of the subject.
Emergence: Complexity and Organization, 2018
In this paper we argue that a rigorous understanding of the nature and implications of complexity reveals that the underlying assumptions that inform our understanding of complex phenomena are deeply related to general philosophical issues. We draw on a very specific philosophical interpretation of complexity, as informed by the work of Paul Cilliers and Edgar Morin. This interpretation of complexity, we argue, resonates with specific themes in post-structural philosophy in general, and deconstruction in particular. We argue that post-structural terms such as différance carry critical insights into furthering our understanding of complexity. The defining feature that distinguishes the account of complexity offered here from other contemporary theories of complexity is the notion of critique. The critical imperative that can be located in a philosophical interpretation of complexity exposes the limitations of totalising theories and subsequently calls for examining the normativity inherent in the knowledge claims that we make. The conjunction of complexity and post-structuralism inscribes a critical-emancipatory impetus into the complexity approach that is missing from other theories of complexity. We therefore argue for the importance of critical complexity against reductionist or restricted understandings of complexity.
Philosophical logic not only concerns the principles of logical thinking but also determines the fundamental meaning of the Logos itself for judgemental thinking activity. This research article is based on the relationship between transcendental logic and general logic in Kant's 'Critique of Pure Reason'. In that book Kant connotes the direct relationship of 'transcendental logic'. Kant thinks that the outcome of transcendental analytics can be substituted for conventional ontology, because the 'Critique of Pure Reason' has been considered 'the preliminary studies'. The intent of this article is to address the following problems: In what way are general logic and transcendental logic related? Which represents the earlier development? And how does transcendental logic have priority of function over general logic, given that the former is derived from the latter?
Russell's paradox shows that the set of all sets does not exist, which gives an example of a set-theoretical paradox. To resolve a version of this paradox, Russell proposed a theory of types, and the infinite type hierarchy then appeared in the resolution of the paradox. Victor gave a solution that removes the infinite type hierarchy by proposing his Duality Principle, which can solve not only Russell's paradox but also the Liar's paradox. In this paper, I contend the completeness of meanings in the Duality Principle. Firstly, in solving Russell's paradox, I assert that the complete meaning of a set should include Miscellaneous Sets such as A={a,{b}}, in addition to Pure Sets such as {a,b,c} and Sets of Sets such as {{a},{{b},c}}. This requires that the solution of Russell's paradox by the Duality Principle change its disjunction condition to three conditions: a set is either a Pure Set, or a Set of Sets, or a Miscellaneous Set. His original solution then cannot satisfy these new conditions of the Duality Principle. Secondly, in resolving the Liar's paradox by Victor's Duality Principle, I argue that the complete meaning of a statement should include, in addition to {a statement, a statement about statements}, also {a statement about (statements about statements)}. That is, the original meaning of a statement is incomplete; there are further meanings of a statement, and only together do they form a complete set of meanings. However, this pushes us back to Russell's infinite type hierarchy, which Victor tried to avoid. Therefore, I would rather give the meaning of a statement in the Liar's paradox as a Boolean function, which in turn leads to dialetheism.
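The three-way disjunction described in the abstract above (Pure Set, Set of Sets, Miscellaneous Set) can be sketched for finite sets. The representation is my own illustrative assumption: Python frozensets stand in for inner sets, and any other element counts as an urelement.

```python
def classify(s: frozenset) -> str:
    """Classify a finite set under the three-condition disjunction
    discussed above. Representation is an assumption: frozenset
    elements model inner sets; everything else models urelements."""
    has_inner_set = any(isinstance(e, frozenset) for e in s)
    has_urelement = any(not isinstance(e, frozenset) for e in s)
    if has_inner_set and has_urelement:
        return "miscellaneous set"   # e.g. {a, {b}}
    if has_inner_set:
        return "set of sets"         # e.g. {{a}, {{b}, c}}
    return "pure set"                # e.g. {a, b, c}

assert classify(frozenset({"a", "b", "c"})) == "pure set"
assert classify(frozenset({frozenset({"a"})})) == "set of sets"
assert classify(frozenset({"a", frozenset({"b"})})) == "miscellaneous set"
```

The point of the sketch is only that the disjunction is exhaustive and mutually exclusive at one level; it says nothing about the hierarchy of statements about statements, which is where the abstract argues the regress reappears.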
Philosophy, Human Nature and the Collapse of Civilization -- Articles and Reviews 2006-2017 3rd Ed 686p (2017)
I have read many recent discussions of the limits of computation and the universe as computer, hoping to find some comments on the amazing work of polymath physicist and decision theorist David Wolpert, but have not found a single citation, and so I present this very brief summary. Wolpert proved some stunning impossibility or incompleteness theorems (1992 to 2008 -- see arxiv.org) on the limits to inference (computation) that are so general they are independent of the device doing the computation, and even independent of the laws of physics, so they apply across computers, physics, and human behavior. They make use of Cantor's diagonalization, the liar paradox and worldlines to provide what may be the ultimate theorem in Turing Machine Theory, and seemingly provide insights into impossibility, incompleteness, the limits of computation, and the universe as computer, in all possible universes and all beings or mechanisms, generating, among other things, a non-quantum-mechanical uncertainty principle and a proof of monotheism. There are obvious connections to the classic work of Chaitin, Solomonoff, Kolmogorov and Wittgenstein and to the notion that no program (and thus no device) can generate a sequence (or device) with greater complexity than it possesses. One might say this body of work implies atheism since there cannot be any entity more complex than the physical universe, and from the Wittgensteinian viewpoint, ‘more complex’ is meaningless (has no conditions of satisfaction, i.e., truth-maker or test). Even a ‘God’ (i.e., a ‘device’ with limitless time/space and energy) cannot determine whether a given ‘number’ is ‘random’, nor find a certain way to show that a given ‘formula’, ‘theorem’ or ‘sentence’ or ‘device’ (all these being complex language games) is part of a particular ‘system’.
Those wishing a comprehensive up to date framework for human behavior from the modern two systems view may consult my book The Logical Structure of Philosophy, Psychology, Mind and Language in Ludwig Wittgenstein and John Searle 2nd ed (2019). Those interested in more of my writings may see Talking Monkeys--Philosophy, Psychology, Science, Religion and Politics on a Doomed Planet--Articles and Reviews 2006-2019 3rd ed (2019), The Logical Structure of Human Behavior (2019), and Suicidal Utopian Delusions in the 21st Century 4th ed (2019).
This is a review of Edgar Morin's book On Complexity.
SOME REMARKS ON THE LOGIC AND EPISTEMOLOGY OF COMPUTATION
The paper focuses on some logical and epistemological aspects of the notion of computation. The first part questions the Church-Turing thesis as a fundamental principle concerning the limits of computation and some of its consequences for Philosophy of Mind and Cognitive Science. The second part discusses one of the main presumptions of the traditional conception of computability, namely, its reliance on the absolute character of classical logic which is taken as an underlying framework.
2018
This paper gives a brief account of the life of Immanuel Kant, elaborates upon his theory of epistemology, in which he merges the traditions of rationalism and empiricism, and explains how he responds to the question put forth by David Hume, the famous Scot. The paper also explicates the ethical stand of Kant, his Deontological Ethics.
2004
It is reported that the 18th-century philosopher Immanuel Kant was wont to say that the business of philosophy is to answer three questions: What can I know? What ought I to do? What may I hope for? He considered that the answers to the second and third depend on the answer to the first (Catholic Encyclopaedia, 2003). This paper develops and explores knowledge of human subjectivity and organisation from a Complexity perspective, to answer Kant's first question, and expands its implications in relation to what we ought to do and what we may hope for if we were to adopt knowledge generated from this perspective. Fractality is the key feature of Complexity used here metaphorically to describe a new sense of human being that is neither solely subjective nor solely objective. Provocative propositions are made to stimulate exploration of the implications of this Complexity-informed notion of human subjectivity.
2018
Logicians are usually philosophically or mathematically minded. Why, then, would they be so interested in problems that belong to computer science, like the explication of the notions of algorithm, effective procedure, and suchlike? The reason for their interest is presumably this. Such problems are interdisciplinary, and modern mathematics, logic and analytic philosophy have much in common, going hand in hand. For instance, the classical decision problem (Entscheidungsproblem) was tremendously popular among logicians. Kurt Gödel, for one, worked on it. Thus I first provide in Section 1 a brief summary of Gödel’s famous incompleteness results. In the summary I use a current technical vernacular. That is, I use terms like ‘algorithm’, ‘effective procedure’, ‘recursive axiomatization’, etc. These terms were not used at the time when Gödel was pursuing his research on (un)decidability, because the study of these modern notions was triggered, inter alia, precisely by Gödel’s incompleteness results.
Carnap's mature philosophy of science is an attempt to dissolve the scientific realism debate altogether as a philosophical pseudo-question. His argument depends upon a logico-semantic thesis regarding the structure of a scientific theory, and more importantly, a meta-ontological thesis regarding the explication of existence claims. The latter commits Carnap to a distinction between the analytic and the synthetic, which was allegedly refuted by Quine. The contemporary philosophy of science has therefore sought to distance itself from logico-semantic considerations, and has pursued the scientific realism debate as an essentially epistemological thesis. I show however that one of the most prominent positions in this recent debate, van Fraassen's constructive empiricism, not only ends up in very close proximity to Carnap's attempted dissolution, but even provides the resources for extending and refining his programme. Rather than a historical footnote, Carnap's mature philosophy of science offers a live option in the current debate. 1. Introduction 2. Pseudo-Questions in the Philosophy of Science 3. The Analytic and the Synthetic 4. The New Empiricism and the New Epistemology 5. Conclusion: A Plea for Tolerance
Emergence: Complexity & Organization
World Complexity Science Academy Journal, 2020
Since their foundation, Complexity theories have been characterized by a wide multiplicity of interpretations, so that it is often hard to find even a commonly shared definition. This work provides a description of the main interpretations of Complexity and gives an account of the differences, in terms of heuristic potentialities, that have not permitted the assessment of a unique definition. The exploration of these differences will take place within a historical framework. The main idea of this work is that each different version of Complexity Theories is related to the social and cultural structure that characterizes a given epoch. More specifically, we will try to situate each specific version of CT in a given historical context and will highlight the social-epistemic needs (which change at each epoch) that that version is supposed to have met. We hope that this work will bring two categories of benefits: on one side it will highlight the different interpretations and help