History of Probability and Statistics

This document discusses the history and development of probability and statistics from the 18th century to modern times. It covers early contributions from authors like Laplace, Gauss, Legendre, and Markov. Key developments mentioned include the establishment of the normal distribution, the method of least squares, and the modern axiomatic approach to probability based on measure theory developed by Kolmogorov. The document also briefly references contributions to integral geometry and the development of mathematical logic in relation to the foundations of mathematics in the late 19th century.



The doctrine of errors may be traced back to Roger Cotes's Opera Miscellanea (posthumous, 1722), but a memoir prepared by Thomas Simpson in 1755 (printed 1756) first applied the theory to the discussion of errors of observation.[citation needed] The reprint (1757) of this memoir lays down the axioms that positive and negative errors are equally probable, and that certain assignable limits define the range of all errors. Simpson also discusses continuous errors and describes a probability curve. The first two laws of error that were proposed both originated with Pierre-Simon Laplace. The first law was published in 1774 and stated that the frequency of an error could be expressed as an exponential function of the numerical magnitude of the error, disregarding sign. The second law of error was proposed in 1778 by Laplace and stated that the frequency of the error is an exponential function of the square of the error.[14] The second law of error is called the normal distribution or the Gauss law. "It is difficult historically to attribute that law to Gauss, who in spite of his well-known precocity had probably not made this discovery before he was two years old."[14] Daniel Bernoulli (1778) introduced the principle of the maximum product of the probabilities of a system of concurrent errors.
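In modern notation, the two laws can be written as density curves. The following is a sketch only; the parameter names m and h are illustrative and not from the original text:

% Laplace's first law of error (1774): the double-exponential
% (Laplace) distribution, with rate parameter m > 0.
\varphi_1(x) = \frac{m}{2}\, e^{-m|x|}

% Laplace's second law of error (1778): the normal (Gauss) law,
% with precision parameter h > 0.
\varphi_2(x) = \frac{h}{\sqrt{\pi}}\, e^{-h^2 x^2}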

Conditional probability

Conditional probability is written P(A | B),[21] and is read "the probability of A, given B". It is defined by

P(A | B) = P(A ∩ B) / P(B).

If P(B) = 0, then P(A | B) is formally undefined by this expression. However, it is possible to define a conditional probability for some zero-probability events using a σ-algebra of such events (such as those arising from a continuous random variable).[citation needed] For example, in a bag of 2 red balls and 2 blue balls (4 balls in total), the probability of taking a red ball is 1/2; however, when taking a second ball, the probability of its being either a red ball or a blue ball depends on the ball previously taken: if a red ball was taken, the probability of picking a red ball again would be 1/3, since only 1 red and 2 blue balls would have been remaining.
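The arithmetic of the ball example can be checked by simulation. A minimal sketch in Python (the trial count is arbitrary):

import random

def conditional_red(trials=100_000):
    # Bag of 2 red and 2 blue balls; estimate P(second red | first red).
    red_first = 0
    red_both = 0
    for _ in range(trials):
        bag = ["red", "red", "blue", "blue"]
        random.shuffle(bag)
        if bag[0] == "red":
            red_first += 1
            if bag[1] == "red":
                red_both += 1
    return red_both / red_first

print(conditional_red())  # approaches 1/3, i.e. P(A | B) = P(A and B) / P(B) = (1/6) / (1/2)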
Inverse probability

In probability theory and applications, Bayes' rule relates the odds of event A1 to event A2, before (prior to) and after (posterior to) conditioning on another event B. The odds on A1 to event A2 is simply the ratio of the probabilities of the two events. When arbitrarily many events are of interest, not just two, the rule can be rephrased as posterior is proportional to prior times likelihood,

P(A | B) ∝ P(A) P(B | A),

where the proportionality symbol means that the left hand side is proportional to (i.e., equals a constant times) the right hand side as A varies, for fixed or given B.
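Numerically, the rule amounts to multiplying each prior by its likelihood and renormalizing. A minimal sketch in Python, with hypotheses and numbers invented purely for illustration:

# Posterior is proportional to prior times likelihood:
# P(A_i | B) is proportional to P(A_i) * P(B | A_i), normalized over all A_i.
priors = {"fair coin": 0.5, "two-headed coin": 0.5}    # P(A_i), hypothetical
likelihoods = {"fair coin": 0.5 ** 3,                  # P(3 heads | fair)
               "two-headed coin": 1.0}                 # P(3 heads | two-headed)

unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
total = sum(unnormalized.values())
posterior = {h: p / total for h, p in unnormalized.items()}
print(posterior)  # {'fair coin': 0.111..., 'two-headed coin': 0.888...}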

Carl Friedrich Gauss

Adrien-Marie Legendre (1805) developed the method of least squares, and introduced it in his Nouvelles méthodes pour la détermination des orbites des comètes (New Methods for Determining the Orbits of Comets).[citation needed] In ignorance of Legendre's contribution, an Irish-American writer, Robert Adrain, editor of "The Analyst" (1808), first deduced the law of facility of error,

φ(x) = c·e^(−h²x²),

where h is a constant depending on precision of observation, and c is a scale factor ensuring that the area under the curve equals 1. He gave two proofs, the second being essentially the same as John Herschel's (1850).[citation needed] Gauss gave the first proof that seems to have been known in Europe (the third after Adrain's) in 1809. Further proofs were given by Laplace (1810, 1812), Gauss (1823), James Ivory (1825, 1826), Hagen (1837), Friedrich Bessel (1838), W. F. Donkin (1844, 1856), and Morgan Crofton (1870). Other contributors were Ellis (1844), De Morgan (1864), Glaisher (1872), and Giovanni Schiaparelli (1875). Peters's (1856) formula[clarification needed] for r, the probable error of a single observation, is well known.[to whom?]
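The claim that c normalizes the curve can be checked numerically: for this density, c = h/√π. A minimal sketch in Python, with an arbitrary choice of h:

import math

h = 1.5                      # precision parameter (arbitrary choice)
c = h / math.sqrt(math.pi)   # scale factor making the total area 1

def phi(x):
    # Adrain's law of facility of error: phi(x) = c * exp(-h^2 * x^2).
    return c * math.exp(-(h * x) ** 2)

# Integrate phi over [-10, 10] with the trapezoidal rule; the tails beyond
# this range are negligible for h = 1.5.
n = 100_000
a, b = -10.0, 10.0
dx = (b - a) / n
area = (0.5 * (phi(a) + phi(b)) + sum(phi(a + i * dx) for i in range(1, n))) * dx
print(area)  # ~1.0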

Non-Euclidean Geometries

After many failed attempts to derive the parallel postulate from other axioms, the study of the still hypothetical hyperbolic geometry by Johann Heinrich Lambert (1728–1777) led him to introduce the hyperbolic functions and compute the area of a hyperbolic triangle (where the sum of angles is less than 180°). Then the Russian mathematician Nikolai Lobachevsky (1792–1856) established in 1826 (and published in 1829) the coherence of this geometry (thus the independence of the parallel postulate), in parallel with the Hungarian mathematician János Bolyai (1802–1860) in 1832, and with Gauss. Later in the 19th century, the German mathematician Bernhard Riemann developed elliptic geometry, another non-Euclidean geometry where no parallel can be found and the sum of angles in a triangle is more than 180°. It was proved consistent by defining "point" to mean a pair of antipodal points on a fixed sphere and "line" to mean a great circle on the sphere. At that time, the main method for proving the consistency of a set of axioms was to provide a model for it.

Projective geometry

One of the traps in a deductive system is circular reasoning, a problem that seemed to befall projective geometry until it was resolved by Karl von Staudt. As explained by Laptev & Rosenfeld (1996):
In the mid-nineteenth century there was an acrimonious controversy between the proponents of synthetic and analytic methods in projective geometry, the two sides accusing each other of mixing projective and metric concepts. Indeed the basic concept that is applied in the synthetic presentation of projective geometry, the cross-ratio of four points of a line, was introduced through consideration of the lengths of intervals.

Mathematics has served since ancient times as a model of truth and rigor for rational inquiry, giving tools or even a foundation for other sciences (especially physics). But the many developments of mathematics towards higher abstractions in the 19th century brought new challenges and paradoxes, urging a deeper and more systematic examination of the nature and criteria of mathematical truth, as well as a unification of the diverse branches of mathematics into a coherent whole. The systematic search for the foundations of mathematics started at the end of the 19th century and formed a new mathematical discipline called mathematical logic, with strong links to theoretical computer science. It went through a series of crises with paradoxical results, until the discoveries stabilized during the 20th century.

The century also saw the creation of new areas of mathematics, such as group theory (which would later be used to study symmetry in physics and other fields) and abstract algebra. Concepts of vector spaces emerged from the conception of barycentric coordinates by Möbius in 1827, through to the modern definition of vector spaces and linear maps by Peano in 1888. Geometry was no longer limited to three dimensions. These concepts do not generalize numbers but combine notions of functions and sets which were not yet formalized, breaking away from familiar mathematical objects.

In the nineteenth century, authors on the general theory included Laplace, Sylvestre Lacroix (1816), Littrow (1833), Adolphe Quetelet (1853), Richard Dedekind (1860), Helmert (1872), Hermann Laurent (1873), Liagre, Didion, and Karl Pearson. Augustus De Morgan and George Boole improved the exposition of the theory. Andrey Markov introduced[citation needed] the notion of Markov chains (1906), which played an important role in the theory of stochastic processes and its applications. The modern theory of probability, based on measure theory, was developed by Andrey Kolmogorov (1933).[citation needed]

On the geometric side (see integral geometry) contributors to The Educational Times were influential (Miller, Crofton, McColl, Wolstenholme, Watson, and Artemas Martin).[citation needed]
Further information: History of statistics
