2020, Logos & Episteme
This paper aims to show that Selim Berker's widely discussed prime number case is merely an instance of the well-known generality problem for process reliabilism, and thus arguably not as interesting a case as one might have thought. Initially, Berker's case is introduced and interpreted. Then the most recent response to the case from the literature is presented. Finally, it is argued that Berker's case is nothing but a straightforward consequence of the generality problem, i.e., the problematic aspect of the case for process reliabilism (if any) is already captured by the generality problem.
2015
All the known approximations of π(n) for finite values of n are derived from real-valued functions that are asymptotic to π(x), such as x/logₑx, Li(x) and Riemann's function R(x) = Σ_{n=1}^{∞} (µ(n)/n) li(x^{1/n}). The degree of approximation for finite values of n is determined only heuristically, by conjecturing upon an error term in the asymptotic relation that can be seen to yield a closer approximation than others to the actual values of π(n) for computable values of n. None of these can, however, claim to estimate π(n) uniquely for all values of n. We show that statistically the probability of n being a prime is ∏_{i=1}^{π(√n)} (1 − 1/pᵢ), and that statistically the expected value of the number π(n) of primes less than or equal to n is given uniquely by Σ_{j=1}^{n} ∏_{i=1}^{π(√j)} (1 − 1/pᵢ) for all values of n. We then demonstrate how this yields elementary probability-based proofs of the Prime Number Theorem, Dirichlet's Theorem, and the Twin-Prime Conjecture.
HAL (Le Centre pour la Communication Scientifique Directe), 2015
We review the conventional wisdom that the distribution of primes suggested by the Prime Number Theorem is such that the probability of an integer being a prime can only be heuristically estimated as 1/log n; apparently reflecting an implicit faith in G. H. Hardy and J. E. Littlewood's 1922 quip that: "Probability is not a notion of pure mathematics, but of philosophy or physics".
RATIO MATHEMATICA, 2019
Within the conceptual framework of number theory, we consider prime numbers and the classic, still-unsolved problem of finding a complete law of their distribution. We ask whether such persisting difficulties could be understood as due to theoretical incompatibilities. We consider the problem in the conceptual framework of computational theory. This article is a contribution to the philosophy of mathematics, proposing different possible understandings of the supposed theoretical unavailability and indemonstrability of the existence of a law of distribution of prime numbers. Tentatively, we treat demonstrability as computability, in our case the conceptual availability of an algorithm able to compute the general properties of the presumed primes' distribution law without computing such a distribution. The link between the conceptual availability of a distribution law of primes and decidability is given by considering how to decide whether a number is prime without computing. The supposed distribution law should allow, for any given prime, knowing the next prime without factorial computing. Factorial properties of numbers, such as their property of primality, require factorisation (or an equivalent, e.g., the sieves), i.e., effective computing. However, while factorisation techniques are available, there are no known (non-quantum) algorithms which can effectively factor arbitrarily large integers. On this view, factorisation is undecidable. We consider the theoretical unavailability of a distribution law for factorial properties, such as being prime, equivalent to its non-computability, i.e., undecidability. The availability and demonstrability of a hypothetical law of distribution of primes are inconsistent with its undecidability. The perspective is to transform this conjecture into a theorem.
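The sieves mentioned above as the effective-computation route to primality can be sketched as follows; this is a minimal sieve of Eratosthenes for illustration, not code from the paper:

```python
def sieve(limit):
    """Sieve of Eratosthenes: return the list of primes <= limit.
    Primality is decided here only by effective computation,
    not by any closed-form distribution law."""
    is_prime = [True] * (limit + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # every multiple of p from p*p upward is composite
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n in range(2, limit + 1) if is_prime[n]]
```

The sieve decides primality for every number up to the limit at once, but only by actually computing; it yields no formula predicting the next prime.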
Alvin Goldman and Erik Olsson have recently proposed a novel solution to the value problem in epistemology, i.e., to the question of how to account for the apparent surplus value of knowledge over mere true belief. Their " conditional probability solution " maintains that even simple process reliabilism can account for the added value of knowledge, since forming true beliefs in a reliable way raises the objective probability that the subject will have more true belief of a similar kind in the future. I argue that this proposal confronts significant internal problems and implicitly invokes higher-level epistemic conditions that run against the spirit of externalism.
2018
The ability of randomness to increase polynomial-time computational power has been a subject of controversy for the past 40 years. More precisely, the problem lies in determining whether every problem decidable in bounded-error probabilistic polynomial time (BPP) can be decided in deterministic polynomial time (P). The answer to this question initially appeared to be negative, fueled by the success of the Miller-Rabin primality test in the late 1970s, along with many other "successes of randomization". However, the consensus on the issue gradually reversed, guided by the theoretical work of Adleman [1], Sipser [18], Lautemann [11] and Wigderson [9, 13], among others. In a recent paper [2], Agrawal provided strong support for the BPP = P view by demonstrating a deterministic polynomial-time primality test. This paper will follow this shift of consensus through the related theoretical developments that motivated it.
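The Miller-Rabin test discussed above can be sketched as follows; this is a standard textbook rendering for illustration, not code from the papers cited:

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic primality test: returns False if n is certainly
    composite, True if n is probably prime (error prob. < 4**-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 = 2**s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)          # a**d mod n by fast modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False          # a is a witness that n is composite
    return True
```

Each round is cheap (one modular exponentiation), which is why randomized primality testing looked like a genuine "success of randomization" before a deterministic polynomial-time test was found.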
2020
It has been said that every natural number was a personal friend to Ramanujan. Counting indeed corresponds to a fundamental necessity of humankind: acting on the world. This has, incidentally, never been more true than in our modernity, in which almost all our activities get numerically coded. When, at the beginning of the twentieth century, probability theory emerged from the theory of games and, partly due to the influence of actuaries, started to develop as a genuine part of mathematics, who could have imagined that the theory of numbers would have anything to do with randomness? Bachelier had investigated models for stock exchange quotes using random variables and had employed stochastic calculus to study their variations. But venerable arithmetic seemed preserved from these nebulous zones: how could such a crucial basis depend on uncertainty? Among other advances, Kolmogorov gave a thorough axiomatic construction to probability theory, and the topic has gradually become part of ...
arXiv (Cornell University), 2015
All the known approximations of the number of primes pi(n) not exceeding any given integer n are derived from real-valued functions that are asymptotic to pi(x), such as x/log x, Li(x) and Riemann's function R(x). The degree of approximation for finite values of n is determined only heuristically, by conjecturing upon an error term in the asymptotic relation that can be seen to yield a closer approximation than others to the actual values of pi(n) within a finite range of values of n. None of these can, however, claim to estimate pi(n) uniquely for all values of n. We show that the statistical probability of n being a prime is the product (1-1/p) over all primes not exceeding the square root of n; and that statistically the expected value of the number pi(n) of primes not exceeding n is given uniquely by the sum, over all j not exceeding n, of the product (1-1/p) over all primes not exceeding the square root of j. We then demonstrate how this yields elementary probability-based proofs of the Prime Number Theorem, Dirichlet's Theorem, and the Twin-Prime Conjecture.
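The statistical estimate described in words above — the sum over j ≤ n of the product of (1 − 1/p) over primes p not exceeding √j — can be computed directly. A minimal sketch for illustration; the helper names are introduced here, not taken from the paper:

```python
import math

def primes_upto(limit):
    """Trial-division helper: primes <= limit."""
    primes = []
    for n in range(2, limit + 1):
        if all(n % p for p in primes if p * p <= n):
            primes.append(n)
    return primes

def expected_pi(n):
    """Sum over j <= n of the product of (1 - 1/p)
    over primes p not exceeding sqrt(j)."""
    primes = primes_upto(math.isqrt(n))
    total = 0.0
    for j in range(1, n + 1):
        r = math.isqrt(j)
        prod = 1.0
        for p in primes:
            if p > r:
                break
            prod *= 1 - 1 / p
        total += prod
    return total
```

For n = 100 the estimate comes out in the same ballpark as the true value pi(100) = 25, which is the kind of agreement the abstract's statistical argument relies on.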
viXra, 2017
Abstract: A prime number (or a prime) is a natural number greater than 1 that has no positive divisors other than 1 and itself. The crucial importance of prime numbers to number theory and mathematics in general stems from the fundamental theorem of arithmetic, which states that every integer larger than 1 can be written as a product of one or more primes in a way that is unique except for the order of the prime factors. Primes can thus be considered the “basic building blocks”, the atoms, of the natural numbers. There are infinitely many primes, as demonstrated by Euclid around 300 BC. There is no known simple formula that separates prime numbers from composite numbers. However, the distribution of primes, that is to say, the statistical behavior of primes in the large, can be modelled. The first result in that direction is the prime number theorem, proven at the end of the 19th century, which says that the probability that a given, randomly chosen number n is prime is inversely pr...
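The prime number theorem heuristic alluded to above — that the density of primes near a large N is about 1/log N — is easy to check empirically. A minimal sketch; the function name is introduced here for illustration:

```python
import math

def prime_density(start, width):
    """Fraction of primes in the window [start, start + width)."""
    def is_prime(n):
        if n < 2:
            return False
        for p in range(2, math.isqrt(n) + 1):
            if n % p == 0:
                return False
        return True
    hits = sum(is_prime(n) for n in range(start, start + width))
    return hits / width

# Near N = 10**6 the PNT heuristic predicts a density of roughly 1/ln N.
empirical = prime_density(10**6, 10**4)
predicted = 1 / math.log(10**6)
```

The empirical density in a window near 10^6 lands close to the predicted 1/ln(10^6) ≈ 0.072, illustrating the "statistical behavior of primes in the large" the abstract refers to.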
We speak of randomness when it is not possible to determine a pattern in the observed outcomes. A computer follows a sequence of fixed instructions to produce any of its output, hence the difficulty of choosing numbers randomly by algorithmic means. However, some algorithms based on mathematical formulas, like the Linear Congruential algorithm and the Lagged Fibonacci generator, appear to produce "true" random sequences to anyone who does not know the secret initial input [1]. Up to now, we cannot rigorously answer the question of the randomness of prime numbers [2, page 1], and this highlights a connection between random number generators and the distribution of primes. From [3] and [4] one sees that it is quite naive to expect good random reproduction with prime numbers. We are, however, interested in the properties underlying the distribution of prime numbers, which emerge as sufficient or insufficient arguments to conclude a proof by contradiction which tends to show that...
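The Linear Congruential algorithm mentioned above can be sketched as follows; the constants are one classic published choice, and the class name is introduced here for the example:

```python
class LCG:
    """Linear congruential generator: x_{k+1} = (a*x_k + c) mod m.
    Fully deterministic; only the secret initial seed hides the pattern."""
    def __init__(self, seed):
        self.m = 2**32          # modulus
        self.a = 1664525        # multiplier (Numerical Recipes choice)
        self.c = 1013904223     # increment
        self.state = seed % self.m

    def next(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

gen = LCG(seed=42)
sample = [gen.next() % 100 for _ in range(5)]
```

Two generators started from the same seed reproduce the same stream exactly, which is precisely why such output is only "true" randomness to an observer who does not know the seed.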
2017
Currently there is no known efficient formula for primes. Moreover, prime numbers are of great importance in information technology, e.g., in public-key cryptography, and their position and possible or impossible functional generation among the natural numbers is an ancient dilemma. The properties of the function 2ab+a+b in the domain of natural numbers are introduced, analyzed, and exhibited to illustrate how it singles out all the prime numbers from the full set of odd numbers. The characterization of odd primes vs. odd non-primes can be done with 2ab+a+b among the odd natural numbers as an analogue to the other, well-known type of fundamental characterization of irrational vs. rational numbers among the real numbers. The prime number theorem, twin primes and the erratic nature of primes are also commented upon with respect to selection, as well as with the Fermat and Euler numbers as examples. Keywords: prime number generator, prime number theorem, twin primes, erratic nature of primes
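The 2ab+a+b characterization can be made concrete: an odd n ≥ 3 is composite exactly when (n − 1)/2 = 2ab + a + b for some integers a, b ≥ 1, since (2a + 1)(2b + 1) = 2(2ab + a + b) + 1. A minimal sketch assuming that reading of the abstract; function names are introduced here, not taken from the paper:

```python
def odd_is_composite(n):
    """For odd n >= 3: n is composite iff (n - 1) // 2 can be written
    as 2ab + a + b with a, b >= 1, because
    (2a + 1)(2b + 1) = 2(2ab + a + b) + 1."""
    assert n >= 3 and n % 2 == 1
    k = (n - 1) // 2
    a = 1
    while 2 * a * a + 2 * a <= k:      # smallest value of 2ab+a+b when b = a
        # solving 2ab + a + b = k for b gives b = (k - a) / (2a + 1)
        if (k - a) % (2 * a + 1) == 0:
            return True
        a += 1
    return False

odd_primes = [n for n in range(3, 40, 2) if not odd_is_composite(n)]
```

Odd numbers whose half-value escapes the form 2ab+a+b are exactly the odd primes, which is the "singling out" the abstract describes.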
Journal of Cryptology, 1988
In this paper we make two observations on Rabin's probabilistic primality test. The first is a provocative reason why Rabin's test is so good. It turns out that a single iteration has a non-negligible probability of failing only on composite numbers that can actually be split in expected polynomial time. Therefore, factoring would be easy if Rabin's test systematically failed with a 25% probability on each composite integer (which, of course, it does not). The second observation is more fundamental because it is not restricted to primality testing: it has consequences for the entire field of probabilistic algorithms. The failure probability when using a probabilistic algorithm for the purpose of testing some property is compared to that when using it for the purpose of obtaining a random element hopefully having this property. More specifically, we investigate the question of how reliable Rabin's test is when used to generate a random integer that is probably prime, rather than to test a specific integer for primality.
2003
We show one possible dynamical approach to the study of the distribution of prime numbers. Our approach is based on two complexity methods, the Computable Information Content and the Entropy Information Gain, looking for analogies between the prime numbers and intermittency.
2004
Let π(x) denote the number of primes smaller than or equal to x. We compare √π(x) with √R(x) and √ℓi(x), where R(x) and ℓi(x) are the Riemann function and the logarithmic integral, respectively. We show a regularity in the distribution of the natural numbers in terms of a phase related to (√π(x) − √R(x)) and indicate how ℓi(x) can cross π(x) for the first time.
2024
This paper aims to provide a set of considerations that allow us to see a possible solution to the problematic issue of Goldbach's "strong" conjecture, which amounts to asserting that any even natural number greater than 2 can be written as the sum of two prime numbers that are not necessarily distinct. Specifically, we will show mathematically that a hypothetical scenario in which no even composite number exists as a sum of two primes is impossible. This will be done by adopting a probabilistic method far simpler than the arithmetical attempts already present in the literature.
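The conjecture under discussion is easy to verify exhaustively in small ranges. A minimal brute-force sketch for illustration (not the paper's probabilistic method; function names introduced here):

```python
def goldbach_pairs(n):
    """For even n > 2, list all decompositions n = p + q with primes p <= q."""
    def is_prime(m):
        if m < 2:
            return False
        d = 2
        while d * d <= m:
            if m % d == 0:
                return False
            d += 1
        return True
    assert n > 2 and n % 2 == 0
    return [(p, n - p) for p in range(2, n // 2 + 1)
            if is_prime(p) and is_prime(n - p)]

# Every even number in a small range has at least one decomposition.
all_ok = all(goldbach_pairs(n) for n in range(4, 1000, 2))
```

Exhaustive checks of this kind confirm the conjecture far beyond this range, but of course prove nothing in general, which is why probabilistic and arithmetical arguments like the paper's are pursued.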
This article is concerned with a statistical proposal due to James R. Beebe for how to solve the generality problem for process reliabilism. The proposal is highlighted by Alvin I. Goldman as an interesting candidate solution. However, Goldman raises the worry that the proposal may not always yield a determinate result. We address this worry by proving a dilemma: either the statistical approach does not yield a determinate result or it leads to trivialization: reliability collapses into truth (and anti-reliability into falsehood). Various strategies for avoiding this predicament are considered, including revising the statistical rule or restricting its application to natural kinds. All amendments are seen to have serious problems of their own. We conclude that reliabilists need to look elsewhere for a convincing solution to the generality problem.
In "A Well-Founded Solution to the Generality Problem", Comesaña argues, inter alia, for three main claims. One is what I call the unavoidability claim: Any adequate epistemological theory needs to appeal, either implicitly or explicitly, to the notion of a belief's being based on certain evidence. Another is what I call the legitimacy claim: It is perfectly legitimate to appeal to the basing relation in solving a problem for an epistemological theory. According to Comesaña, the legitimacy claim follows straightforwardly from the unavoidability claim. The third is what I call the basing solution claim: An appeal to the notion of basing relation is all we need to solve the generality problem for (process) reliabilism. In this article, I argue that the unavoidability claim and the basing solution claim are false and that the legitimacy claim might be true only in a qualified sense.
https://econteenblog.wordpress.com/, 2018
One question investigated is whether there can be two ratios of prime numbers that are equal to each other. Other aspects of prime numbers are investigated as well, such as how they 'relate' to composite numbers. This paper was originally intended to show that it is impossible to find an underlying pattern or explanation using only the natural numbers, but that was likely incorrect.