1993, Developments in Language Theory
All science is founded on the assumption that the physical universe is ordered. Our aim is to challenge this hypothesis using arguments from algorithmic information theory.
This paper analyzes the case that the Universe is a consequence of the hypothesis of its own existence. The Universe is a result of an internal infinity and an external infinity: for an external observer of the Universe we never existed, but for an internal observer we exist for infinite time. This follows from the existence-possibility scrutinized in the paper “Why the cosmos exist?”. The result is that an asymmetry caused by time, not space, created a space-time Universe. The mathematical asymmetry then created the other attributes of the Universe, such as physical phenomena, which are projections of mathematics. The background of mathematics, combined with the results of physics, is translated as a code: mathematics, in combination with series that come from physical attributes, creates a code in the universe. If someone affects the code, it is therefore plausible to affect the universe. This code lies between the first layer, mathematics, and the third layer, physics; the code is the second layer. The strongest paradigm that such a code exists is the possibility found in biology, which is a result of random statistics rather than a result with a cause, and therefore not a physical phenomenon.
There are many attempts to explain the existence of the universe. Even the observable universe around us is filled with unknowns and numerous unexplained phenomena, let alone the unobservable universe. Every theory proposed merely tries to explain the observable universe around us, ignoring the possibility that there is an invisible universe which reveals itself as we find newer ways to observe it. We must first be able to abstract this universe, as we observe it, as just a representation of something else underneath that is projected as this world around us by our descriptive thoughts. Only then can we find an explanation that is inclusive of everything that the universe is. As long as we search only for the origin of these appearances of descriptive thoughts, we will merely dabble in physics, chemistry, mathematics and computation. It should be recognised that Wolfram introduced a very good concept: simple things recurse to form complex worlds. But without the correct transformation this remains simply a recognisable pattern. How this recognisable pattern goes on to become the complex appearance of reality that we observe around us is still up in the air. The ancient Sanskrit literature has an in-depth explanation of how this reality around us works and how it has come to be; it just needs the correct context in which to translate these texts in order to understand them. For a long time I wondered what would be the best way to explain the algorithm of the universe as seen by the ancients and explained in the Sanskrit literature. I could explain it by using translations of the Sanskrit verses in a certain order, but I find that they do not take the reader on the journey necessary to form the correct thoughts about the working and creation of this reality around us.
I thought I would write it as a description of astrology, i.e. jyothiSha, which, seen from the ancient view, is actually a science that tells us how to control the distribution of seeking that drives the formation of this reality around us, and hence to control this reality itself. But I find that the term has been badly mangled, with all sorts of meanings associated with it. Even where a very beautiful science is explained, it becomes very difficult to break these pre-conceived perceptions and get the reader to read objectively and understand. Finally, I have decided that the best way is to use the computers we have created and the AI algorithms we are trying to code as the vehicle for describing it, because this was my primary driver for wanting to understand the Sanskrit literature and to debug the working of the universe: the underlying curiosity to understand why the AI that we write does not seem to be even a “fake imitation” of the working of the universe around us; the curiosity to understand how this reality around me is so completely immersive, while the virtual reality that we create does not come anywhere near that immersive experience; the curiosity to understand whether it is possible for us to create hardware and software that comes even a little bit close to the kind of intelligence, knowledge and beauty in the algorithm of this reality around us. This has been my driver to decode and understand the Sanskrit literature, so I find this is the best way to describe the algorithm. In this book I explain the algorithms proposed in the Sanskrit literature for the working of this reality around us. I also compare these concepts to what we currently have in computers, and explain how they differ from what is explained in these books and why what we have in computers is not sufficient to create the AI that we want to create.
Further, I try to put down my thoughts on what can be done to create a system that can be used to implement an AI system.
Lecture Notes in Artificial Intelligence
There are writers in both metaphysics and algorithmic information theory (AIT) who seem to think that the latter could provide a formal theory of the former. This paper is intended as a step in that direction. It demonstrates how AIT might be used to define basic metaphysical notions such as object and property for a simple, idealized world. The extent to which these definitions capture intuitions about the metaphysics of the simple world, times the extent to which we think the simple world is analogous to our own, will determine a lower bound for basing a metaphysics for our world on AIT.
Essays on Scientific and Philosophical Understanding of Foundations of Information and Computation, 2011
We propose a test based on the theory of algorithmic complexity and an experimental evaluation of Levin's universal distribution to identify evidence in support of or in contravention of the claim that the world is algorithmic in nature. To this end we have undertaken a statistical comparison of the frequency distributions of data from physical sources on the one hand (repositories of information such as images, data stored on a hard drive, computer programs and DNA sequences) and the frequency distributions generated by purely algorithmic means on the other (by running abstract computing devices such as Turing machines, cellular automata and Post tag systems). Statistical correlations were found and their significance measured.
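The algorithmic side of the comparison described above can be sketched in a few lines: run the outputs of small abstract machines, tally the frequencies of short binary patterns, and inspect the resulting distribution. This is only an illustrative toy, with elementary cellular automata standing in for the paper's full battery of Turing machines, cellular automata and tag systems; it is not the authors' actual experiment:

```python
from collections import Counter

def eca_step(cells, rule):
    # One step of an elementary cellular automaton (Wolfram rule number),
    # with periodic boundary conditions.
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1)
                      | cells[(i + 1) % n])) & 1
            for i in range(n)]

def run_eca(rule, width=31, steps=16):
    cells = [0] * width
    cells[width // 2] = 1  # single-cell seed
    rows = []
    for _ in range(steps):
        rows.append(cells[:])
        cells = eca_step(cells, rule)
    return rows

K = 4  # length of the binary patterns we tally
counts = Counter()
for rule in range(256):  # all elementary CA rules
    for row in run_eca(rule):
        bits = ''.join(map(str, row))
        for i in range(len(bits) - K + 1):
            counts[bits[i:i + K]] += 1

total = sum(counts.values())
dist = {s: c / total for s, c in counts.items()}
# The empirical distribution is far from uniform: simple patterns dominate,
# qualitatively as a simplicity-biased (Levin-style) distribution predicts.
for s in sorted(dist, key=dist.get, reverse=True)[:4]:
    print(s, round(dist[s], 4))
```

In the paper the analogous empirical distribution is then rank-correlated against distributions extracted from physical data; here the point is only that purely algorithmic sources already produce a strongly non-uniform pattern frequency.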
Studies in Applied Philosophy, Epistemology and Rational Ethics, 2013
There are different aspects and spheres of natural and unconventional computation. In this paper, we analyze the methodological and philosophical implications of algorithmic issues in unconventional computation. We first describe how the classical algorithmic universe developed, and analyze why it became closed in the conventional approach to computation in particular and to information processing in general. We then explain how new models of algorithms constructed by different authors changed the classical algorithmic universe, opening it into the open world of algorithmic constellations and allowing higher flexibility and superior creativity. As Gödel's undecidability theorems demonstrate, the closed algorithmic universe restricts essential forms of human cognition, while the open algorithmic universe, and even more so the open world of algorithmic constellations, eliminates such restrictions and enables new perspectives for creativity and innovation.
2011
I will propose the notion that the universe is digital, not as a claim about what the universe is made of but rather about the way it unfolds. Central to the argument will be the concepts of symmetry breaking and algorithmic probability, which will be used as tools to compare the way patterns are distributed in our world to the way patterns are distributed in a simulated digital one. These concepts will provide a framework for a discussion of the informational nature of reality. I will argue that if the universe were analog, then the world would likely be random, making it largely incomprehensible. The digital model has, however, an inherent beauty in its imposition of an upper limit and in the convergence in computational power to a maximal level of sophistication. Even if deterministic, the fact that it is digital doesn't mean that the world is trivial or predictable, but rather that it is built up from operations that at the lowest scale are very simple but that at a higher scale look complex and even random, though only in appearance. (None of the views expressed herein necessarily reflects those held by my employers or by the institutions with which I am affiliated.)
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are fundamentally different. We indicate how recent developments within the theory allow one to formally distinguish between 'structural' (meaningful) and 'random' information as measured by the Kolmogorov structure function, which leads to a mathematical formalization of Occam's razor in inductive inference. We end by discussing some of the philosophical implications of the theory.
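Kolmogorov complexity itself is uncomputable, but a standard way to illustrate the 'structural' vs. 'random' distinction is to use a real-world compressor as a crude, computable upper bound on description length. The sketch below uses zlib (our choice for the illustration, not anything prescribed by the paper): a regular string compresses far below its length, while a pseudo-random one barely compresses at all.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    # Length of the zlib-compressed form: an upper bound (up to a constant)
    # on the length of a description of `data`.
    return len(zlib.compress(data, level=9))

structured = b'ab' * 500  # highly regular: a short description suffices
random.seed(0)
random_ish = bytes(random.randrange(256) for _ in range(1000))  # no exploitable structure

print(len(structured), '->', compressed_size(structured))   # compresses drastically
print(len(random_ish), '->', compressed_size(random_ish))   # stays near 1000 bytes
```

The gap between the two compressed sizes is the practical face of the structural/random distinction: the compressor finds the 'meaningful' regularity in the first string and none in the second.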
Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Chamonix, France, 2010, 2010
At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several significant fundamental advances in our understanding of the process of inference, which make it clear that these are inferential theories. That is, rather than being a description of the behavior of the universe, these theories describe how observers can make optimal predictions about the universe. In such a picture, information plays a critical role. What is more, little clues, such as the fact that black holes have entropy, continue to suggest that information is fundamental to physics in general. In the last decade, our fundamental understanding of probability theory has led to a Bayesian revolution. In addition, we have come to recognize that the foundations go far deeper, and that Cox's approach of generalizing a Boolean algebra to a probability calculus is the first specific example of the more fundamental idea of assigning valuations to partially ordered sets. By considering this as a natural way to introduce quantification to the more fundamental notion of ordering, one obtains an entirely new way of deriving physical laws. I will introduce this new way of thinking by demonstrating how one can quantify partially ordered sets and, in the process, derive physical laws. The implication is that physical law does not reflect the order in the universe; instead, it is derived from the order imposed by our description of the universe. Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science.
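A minimal illustration of 'assigning valuations to partially ordered sets': on the lattice of subsets of a small set, a valuation built from atomic weights obeys the sum rule v(x ∨ y) = v(x) + v(y) - v(x ∧ y), the inclusion-exclusion form from which the probability calculus is derived. The set and weights below are arbitrary, chosen only for the demonstration; this is a sketch of the idea, not the paper's derivation:

```python
from itertools import combinations

# Atoms of the lattice and their (arbitrary, hypothetical) weights.
weights = {'a': 0.5, 'b': 0.3, 'c': 0.2}

def v(s: frozenset) -> float:
    # Valuation: the weight of a subset is the sum of its atoms' weights.
    return sum(weights[e] for e in s)

universe = list(weights)
subsets = [frozenset(c) for r in range(len(universe) + 1)
           for c in combinations(universe, r)]

# In the subset lattice, join is union and meet is intersection.
# Check the sum rule v(x v y) = v(x) + v(y) - v(x ^ y) on every pair.
for x in subsets:
    for y in subsets:
        join, meet = x | y, x & y
        assert abs(v(join) - (v(x) + v(y) - v(meet))) < 1e-12

print("sum rule holds on all", len(subsets) ** 2, "pairs")
```

The point of the construction is that once a valuation consistent with the partial order is demanded, the additive sum rule (and from it a probability calculus) follows; the check above verifies that additive set valuations do satisfy it.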
DIGITAL PHYSICS: Decoding the Universe, 2019
In this vital and groundbreaking exploration of the universe, the author has written a compelling work allowing us to understand the universe through the computational and consciousness model. In Digital Physics: Decoding the Universe, Ediho Lokanga argues that information and consciousness may offer a unifying language for quantum theory and general relativity, offering a new perspective on quantum gravity (QG) and the theory of everything (ToE), thus contributing to a better description of the universe. There is no doubt that the last few years have demonstrated the important role that information and consciousness play in allowing us to understand physics and science in a new light.
2014
Complexity is a catchword of certain extremely popular and rapidly developing interdisciplinary new sciences, often called accordingly the sciences of complexity. It is often closely associated with another notably popular but ambiguous word, information; information, in turn, may justly be called the central new concept in the whole of 20th-century science. Moreover, the notion of information is regularly coupled with a key concept of thermodynamics, viz. entropy. And as if this were not enough, it is quite usual to add one more at present extraordinarily popular notion, namely chaos, and wed it with the abovementioned concepts. It is my aim in this paper to critically analyse this conceptual mess from a logical and philosophical point of view, concentrating on the concepts of complexity and information and the question concerning the true relation between them. I shall focus especially on the so-called algorithmic information theory, which has lately become extraordinarily popular, especially in theoreti...