2012
As Dempster-Shafer theory spreads to different application fields involving complex systems, the need arises for algorithms that randomly generate mass functions. Because such random generation is often perceived as secondary, most proposed algorithms use procedures whose sample statistical properties are difficult to characterize. Thus, although they produce randomly generated mass functions, it is difficult to control the sample statistical laws. In this paper, we briefly review classical algorithms, explaining why their statistical properties are hard to characterize, and then provide simple procedures to perform efficient and controlled random generation.
Thomas Burger: CNRS (FR3425), CEA (iRTSV/BGE), INSERM (
Computational Statistics, 2004
Random variate generation is an important tool in statistical computing. Many programs for simulation or statistical computing (e.g. R) provide a collection of random variate generators for many standard distributions. However, as statistical modeling has become more sophisticated, there is demand for larger classes of distributions. Adding generators for each newly required distribution is not a sustainable solution to this problem. Instead, so-called automatic (or black-box) methods have been developed over the last decade for sampling from fairly large classes of distributions with a single piece of code. For such algorithms, some data about the distribution must be given; typically the density function (or probability mass function) and, possibly, the (approximate) location of the mode. In this contribution we show how such algorithms work and suggest an interface for R as an example of a statistical library.
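The simplest black-box method of the kind described above is rejection sampling: given only a density and a bound on it (e.g. its value at the approximate mode), one can sample without any distribution-specific code. The sketch below is an illustration of the general idea, not the interface proposed in the paper; the function names are our own.

```python
import random

def rejection_sampler(pdf, lo, hi, pdf_max):
    """Draw one sample from a (possibly unnormalized) density `pdf` on [lo, hi]
    by plain rejection sampling; `pdf_max` must bound the density from above,
    e.g. its value at the (approximate) mode."""
    while True:
        x = random.uniform(lo, hi)                 # proposal from a uniform envelope
        if random.uniform(0.0, pdf_max) <= pdf(x):
            return x                               # accept with probability pdf(x)/pdf_max

# Example: a triangular density on [0, 1] with mode at 0.5
random.seed(1)
samples = [rejection_sampler(lambda x: 1 - abs(2 * x - 1), 0.0, 1.0, 1.0)
           for _ in range(10_000)]
mean = sum(samples) / len(samples)                 # should be close to 0.5
```

The acceptance rate equals the ratio of the area under the density to the area of the envelope, which is why automatic methods in practice construct tighter hat functions than the flat one used here.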
2009
Monte Carlo simulations are an important tool in statistical physics, complex systems science, and many other fields. An increasing number of these simulations are run on parallel systems ranging from multicore desktop computers to supercomputers with thousands of CPUs. This raises the issue of generating large amounts of random numbers in a parallel application. In this lecture we will learn
Brazilian Journal of Physics, 2015
The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations, spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, ..., p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described.
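One standard way to obtain a probability vector distributed uniformly on the simplex — offered here only as an illustration of the problem the paper studies, not as its specific iid, normalization, or trigonometric algorithms — is to normalize iid exponential variates, which yields a Dirichlet(1, ..., 1) draw:

```python
import random

def random_prob_vector(d, rng=random):
    """Return a probability vector (p_1, ..., p_d) uniform on the simplex:
    normalizing d iid Exp(1) variates gives a Dirichlet(1, ..., 1) draw.
    (Normalizing iid uniforms instead would bias the vector toward the
    center of the simplex.)"""
    e = [rng.expovariate(1.0) for _ in range(d)]
    s = sum(e)
    return [x / s for x in e]

random.seed(0)
p = random_prob_vector(4)
```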
Physical Review E, 2018
We show that collective chaotic behavior emerging from the Boltzmann picture of gases provides an alternative conceptual framework for analyzing and creating apparent randomness on computers. For instance, at equilibrium the distributions of various physical quantities, such as positions, velocities, or the two-dimensional projection of kinetic energy, can be used to generate uniform, Gaussian, and exponential distributions. Finally, based on these ideas we present a simplified but highly efficient mesoscopic Boltzmann-type algorithm for generating random distributions with a computational efficiency orders of magnitude higher than that of contemporary algorithms. We illustrate its effectiveness with two representative examples: spinodal decomposition of a binary alloy and rare-event sampling in biological systems.
Theory of Computing Systems / Mathematical Systems Theory, 2003
This paper is a combination of two independent works [17] and [24] and collaborative work. A summarized version appears in [19].
Lecture Notes in Computer Science, 2000
In this paper, a parallel pseudo-random generator, named PLFG, is presented. PLFG was designed specifically for MIMD parallel programming and implemented in C using the Message Passing Interface (MPI). It is highly scalable and, with the default parameters chosen, provides an astronomical period of at least 2^29(2^23209 − 1). Its scalability and period are essentially limited only by the hardware architecture on which it runs. The MPI implementation guarantees portability across the large number of high-performance parallel computers supported by MPI, ranging from clusters of workstations to massively parallel processor machines. PLFG has been subjected to the 2D Ising model Monte Carlo simulation test with the Wolff algorithm. Results from the test show that the quality of the pseudo-random numbers generated is comparable to that of other, more commonly used parallel pseudo-random generators. Timing results show that PLFG is faster than some PPRNGs and on par with others.
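The core recurrence behind PLFG is the additive lagged Fibonacci generator x_n = (x_{n-p} + x_{n-q}) mod 2^32. The toy sketch below uses deliberately small lags and is not PLFG itself — PLFG's huge period comes from far larger lags (23209 in the default configuration) plus its parallel lag-table scheme:

```python
import random

class LaggedFibonacci:
    """Toy additive lagged Fibonacci generator: x_n = (x_{n-p} + x_{n-q}) mod m,
    kept in a circular buffer of length p. Small lags (p=17, q=5) for
    illustration only; production generators use much larger lags."""
    def __init__(self, seed_state, q=5, m=2**32):
        self.state = list(seed_state)   # holds the last p values
        self.p = len(self.state)
        self.q = q
        self.m = m
        self.i = 0                      # index of the next output

    def next(self):
        j = self.i % self.p                       # slot of x_{n-p}, to be overwritten
        k = (self.i + self.p - self.q) % self.p   # slot of x_{n-q}
        v = (self.state[j] + self.state[k]) % self.m
        self.state[j] = v
        self.i += 1
        return v

random.seed(42)
seed_state = [random.getrandbits(32) for _ in range(17)]
gen = LaggedFibonacci(seed_state)
out = [gen.next() for _ in range(5)]
```

Seeding the lag table well matters: an all-zero (or low-entropy) table produces degenerate sequences, which is why real implementations bootstrap it from another generator.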
Revista Cubana De Ciencias Informaticas, 2014
The choice of effective and efficient algorithms for random number generation is a key problem in simulations of stochastic processes, diffusion being one of them. The random-walk model and the Langevin dynamical equation are the simplest approaches to the computational study of diffusion. Both models, in which the particles do not interact and move freely, are used to test the quality of the random number generators that will be employed in more complex computational simulations. In principle, generating random numbers with computers is impossible because computers operate through deterministic algorithms; nevertheless, one can use deterministic generators whose number sequences may, for practical purposes, be considered random. In the present work we present a combination of the random number generators reported by Numerical Recipes and the GNU Scientific Library
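The random-walk test mentioned above has a simple quantitative form: for an unbiased walk driven by a good generator, the mean squared displacement grows linearly with the number of steps (MSD ≈ t). A minimal sketch of such a check, with illustrative parameters of our own choosing:

```python
import random

def msd_after(steps, walkers, rng=random):
    """Mean squared displacement of 1-D unbiased random walkers after
    `steps` steps. For a good generator this should be close to `steps`,
    since MSD = t for free diffusion with unit step size."""
    total = 0
    for _ in range(walkers):
        x = 0
        for _ in range(steps):
            x += 1 if rng.random() < 0.5 else -1   # unbiased +/-1 step
        total += x * x
    return total / walkers

random.seed(7)
msd = msd_after(steps=100, walkers=2000)   # expect a value near 100
```

A generator with correlated bits would show up here as a systematic deviation of the MSD from linear growth.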
Concurrency and Computation: Practice and Experience, 2012
This paper presents an open source toolkit allowing a rigorous distribution of stochastic simulations. It is designed according to the state of the art in pseudo-random number partitioning techniques, and is based on a generic XML format for saving pseudo-random number generator states, where each state carries suitable metadata. This toolkit, named DistMe, is usable by modelers who are not specialists in parallelizing stochastic simulations; it helps in distributing the replications and in generating experimental plans. It automatically writes ready-to-run scripts for various parallel platforms, encapsulating the burden of managing status files for different pseudo-random number generators. The automation of this task avoids many human mistakes. The toolkit has been designed following a model-driven engineering approach: the user builds a model of their simulation, and the toolkit helps in distributing independent stochastic experiments. In this paper, the toolkit architecture is presented and two examples from life science research domains are detailed. The preliminary design of the DistMe toolkit was achieved when dealing with the distribution of a nuclear medicine application using the largest European computing grid, the European Grid Initiative (EGI). Thanks to an alpha version of the software toolbox, the equivalent of 3 years of computing was achieved in a few days. Next, we present a second application in another domain to show the potential and genericity of the DistMe toolkit. A small experimental plan with 1024 distributed stochastic experiments was run on a local computing cluster to explore scenarios of an environmental application. For both applications, the proposed toolkit was able to automatically generate distribution scripts with independent pseudo-random number streams, and it also automatically parameterized the simulation input files to follow an experimental design. The automatic generation of scripts and input files is achieved thanks to model transformations using a model-driven approach.
USSR Computational Mathematics and Mathematical Physics, 1967
This report presents a collection of computer-generated statistical distributions which are useful for performing Monte Carlo simulations. The distributions are encapsulated into a C++ class, called "Random," so that they can be used with any C++ program. The class currently contains 27 continuous distributions, 9 discrete distributions, data-driven distributions, bivariate distributions, and number-theoretic distributions. The class is designed to be flexible and extensible, and this is supported in two ways: (1) a function pointer is provided so that the user-programmer can specify an arbitrary probability density function, and (2) new distributions can be easily added by coding them directly into the class. The format of the report is designed to provide the practitioner of Monte Carlo simulations with a handy reference for generating statistical distributions. However, to be self-contained, various techniques for generating distributions are also discussed, as well as procedures for estimating distribution parameters from data. Since most of these distributions rely upon a good underlying uniform distribution of random numbers, several candidate generators are presented along with selection criteria and test results. Indeed, it is noted that one of the more popular generators is probably overused, and the conditions under which it should be avoided are identified.
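One of the standard techniques such a distribution library builds on — shown here as a hedged Python sketch rather than the report's actual C++ code — is inverse-transform sampling for discrete distributions: precompute the CDF once, then map each uniform variate through it by bisection.

```python
import random
from bisect import bisect_left
from itertools import accumulate

def make_discrete_sampler(values, probs, rng=random):
    """Inverse-transform sampler for a finite discrete distribution:
    precompute the cumulative probabilities, then locate each uniform
    draw in the CDF with binary search (O(log n) per sample)."""
    cdf = list(accumulate(probs))
    assert abs(cdf[-1] - 1.0) < 1e-9, "probabilities must sum to 1"
    def sample():
        return values[bisect_left(cdf, rng.random())]
    return sample

random.seed(3)
draw = make_discrete_sampler(["a", "b", "c"], [0.2, 0.5, 0.3])
counts = {v: 0 for v in "abc"}
for _ in range(10_000):
    counts[draw()] += 1
# observed frequencies should approximate (0.2, 0.5, 0.3)
```

For data-driven distributions the same machinery applies with the empirical CDF in place of `probs`.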
2011
A significant problem faced by scientific investigation of complex modern systems is that credible simulation studies of such systems on single computers frequently cannot be completed in a feasible time. Discrete-event simulation of dynamic stochastic systems, allowing multiple replications in parallel (MRIP) to speed up simulation, has become one of the most popular paradigms of investigation in many areas of science and engineering.
ACM Transactions on Mathematical Software, 2000
In this article we present background, rationale, and a description of the Scalable Parallel Random Number Generators (SPRNG) library. We begin by presenting some methods for parallel pseudorandom number generation. We focus on methods based on parameterization, meaning that we do not consider splitting methods such as the leap-frog or blocking methods. We describe, in detail, parameterized versions of the following pseudorandom number generators: (i) linear congruential generators, (ii) shift-register generators, and (iii) lagged-Fibonacci generators. We briefly describe the methods, detail some advantages and disadvantages of each, and recount results from number theory that impact our understanding of their quality in parallel applications. SPRNG was designed around the uniform implementation of different families of parameterized random number generators. We then present a short description of SPRNG. The description contained within this document is meant only to outline the rationale behind and the capabilities of SPRNG. Much more information, including examples and detailed documentation aimed at helping users install and use SPRNG on scalable systems, is available at http://sprng.cs.fsu.edu. In this description of SPRNG we discuss the random-number generator library as well as the suite of tests of randomness that is an integral part of SPRNG. Random-number tools for parallel Monte Carlo applications must be subjected to classical as well as new types of empirical tests of randomness to eliminate generators that show defects when used in scalable environments.
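The parameterization idea for LCGs can be illustrated in a few lines: every stream shares the same recurrence form x_{n+1} = (a·x_n + c) mod m but receives its own additive constant, so the streams follow genuinely different recurrences rather than disjoint pieces of one sequence. This sketch is ours, not SPRNG's implementation; the multiplier is a well-known 64-bit LCG constant due to Knuth.

```python
class LCGStream:
    """One stream of a parameterized LCG family:
    x_{n+1} = (a * x_n + c_i) mod 2^64, where each stream i gets a
    distinct odd additive constant c_i = 2*i + 1."""
    MULT = 6364136223846793005      # Knuth's 64-bit LCG multiplier (MMIX)

    def __init__(self, stream_id, seed=1):
        self.c = 2 * stream_id + 1  # odd constant guarantees full period mod 2^64
        self.x = seed

    def next(self):
        self.x = (self.MULT * self.x + self.c) % 2**64
        return self.x

# Four streams seeded identically still diverge immediately,
# because their additive constants differ.
streams = [LCGStream(i) for i in range(4)]
first = [s.next() for s in streams]
```

In contrast, splitting methods (leap-frog, blocking) carve one long sequence into pieces; parameterization avoids the risk of streams overlapping by construction.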
2000
In this paper, we propose the use of bispectrum-based tools to evaluate the statistical quality of a pseudo-random generator. Two well-known implementations of a pseudo-random generator and a new idea from physics are used as examples of how to apply these statistical tools.
Monte-Carlo simulations are common and inherently well suited to parallel processing, thus requiring random numbers that are also generated in parallel. We describe here a splitting approach to parallel random number generation. Various definitions of the Monte-Carlo method have been given. As one example (1): "The Monte-Carlo method is defined as representing the solution of a problem as a parameter of a hypothetical population, and using a random sequence of numbers to construct a sample of the population from which statistical estimates of the parameters can be obtained." The method has been defined more broadly (2) as "any technique making use of random numbers to solve a problem." A more encompassing description (6) is "a numerical method based on random sampling." In this article we take the definition in its most general sense, and it is the random aspect of Monte Carlo that is our focus. Monte-Carlo simulation consists of repeating the same ba...
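A splitting approach of the kind this abstract refers to can be sketched in its simplest leap-frog form: one base sequence is dealt out round-robin, so stream i of k receives elements i, i + k, i + 2k, and so on. This is a generic illustration of leap-frog splitting, not the specific scheme of the paper.

```python
import random

def leapfrog_streams(base_seq, n_streams):
    """Leap-frog splitting of one pseudo-random sequence:
    stream i receives elements i, i + k, i + 2k, ... (k = n_streams),
    so the streams are disjoint and together exhaust the base sequence."""
    return [base_seq[i::n_streams] for i in range(n_streams)]

random.seed(5)
base = [random.random() for _ in range(12)]
streams = leapfrog_streams(base, 3)

# Interleaving the streams round-robin recovers the original sequence.
merged = [streams[i % 3][i // 3] for i in range(12)]
assert merged == base
```

In production, leap-frogging is done by jumping the generator's internal state ahead by k per call rather than by materializing the base sequence, but the partition of indices is the same.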
Physica A: Statistical Mechanics and its Applications, 2003
We describe a generalized scheme for the probability-changing cluster (PCC) algorithm, based on the study of the finite-size scaling property of the correlation ratio, the ratio of the correlation functions with different distances. We apply this generalized PCC algorithm to the two-dimensional 6-state clock model. We also discuss the combination of the cluster algorithm and the extended ensemble method. We derive a rigorous broad histogram relation for the bond number. A Monte Carlo dynamics based on the number of potential moves for the bond number is proposed, and applied to the three-dimensional Ising and 3-state Potts models.
Proceedings of the 2023 ACM Southeast Conference
This paper discusses the development of a new pseudorandom number generator (PRNG) based on chaotic billiards and particle randomness. In this new system, two massless particles bounce, teleport, and collide with each other inside the classic Sinai billiard. Random sequences are generated from the collision coordinates of the two particles as they bounce off the circular central billiard wall. Three statistical tests conducted on the generated sequences indicate that the generator produces sequences comparable to truly random sequences. In addition to discussing the results of the three statistical analyses, this paper details how the model is set up and how the pseudorandom sequence is produced. CCS CONCEPTS • Theory of computation → Pseudorandomness and derandomization; • Mathematics of computing → Statistical graphics.
Combinatorics, Probability and Computing, 2004
This article proposes a surprisingly simple framework for the random generation of combinatorial configurations based on what we call Boltzmann models. The idea is to perform random generation of possibly complex structured objects by placing an appropriate measure spread over the whole of a combinatorial class -an object receives a probability essentially proportional to an exponential of its size. As demonstrated here, the resulting algorithms based on real-arithmetic operations often operate in linear time. They can be implemented easily, be analysed mathematically with great precision, and, when suitably tuned, tend to be very efficient in practice.
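The Boltzmann measure described above can be made concrete on the simplest combinatorial class, sequences of atoms (SEQ(Z)): at parameter 0 < x < 1, an object of size n receives probability (1 − x)·x^n, so the sampled size is geometric with expectation x/(1 − x), tunable through x. This toy sampler only illustrates the measure; the paper's framework covers far richer structured classes.

```python
import random

def boltzmann_sequence(x, rng=random):
    """Boltzmann sampler for SEQ(Z), the class of plain sequences of atoms:
    under the Boltzmann measure at parameter 0 < x < 1, a sequence of size n
    has probability (1 - x) * x**n, i.e. the size is geometric with
    expectation x / (1 - x)."""
    n = 0
    while rng.random() < x:   # extend the sequence with probability x
        n += 1
    return n

random.seed(9)
x = 0.75                      # expected size x / (1 - x) = 3
sizes = [boltzmann_sequence(x) for _ in range(20_000)]
avg = sum(sizes) / len(sizes)
```

Tuning x so the expected size hits a target is exactly the "suitably tuned" step the abstract mentions; sizes then concentrate near the target without any explicit size rejection for many classes.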