The dissertation titled 'An Exploration of Monte Carlo Methods' by Anand Sai P and Abel Joshy, submitted to St. Berchmans College, investigates the principles and applications of Monte Carlo methods in various fields. It covers foundational concepts such as sampling, probability distributions, and mathematical expectations, while also exploring practical applications in physics, finance, and other domains. The work aims to demonstrate the versatility and effectiveness of Monte Carlo techniques in solving complex problems that lack analytical solutions.

AN EXPLORATION OF MONTE CARLO METHODS

DISSERTATION SUBMITTED TO

THE CONTROLLER OF EXAMINATIONS,

ST BERCHMANS COLLEGE, CHANGANASSERY

IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE DEGREE OF

BACHELOR OF SCIENCE (MATHEMATICS)

ANAND SAI P

Reg.No. 12200005

ABEL JOSHY

Reg.No. 12200013

Under the guidance of

Ms. DEVI N

Department of Mathematics

St. Berchmans College, Changanassery

DEPARTMENT OF MATHEMATICS
ST. BERCHMANS COLLEGE
CHANGANASSERY
2022-2025
DECLARATION

We, Anand Sai P (Reg. No. 12200005) and Abel Joshy (Reg. No. 12200013), do hereby declare that the dissertation entitled “An Exploration of Monte Carlo Methods” is a bonafide record of research work done by us under the guidance and supervision of Ms. Devi N, Assistant Professor, Department of Mathematics, St. Berchmans College, Changanassery, and that this dissertation or any part thereof has not previously formed the basis for the award of any degree, diploma, associateship, fellowship or any other similar title of any University or Institution.

Changanassery                                          Anand Sai P

10/03/2025                                             Abel Joshy
CERTIFICATE

This is to certify that Mr. Anand Sai P and Mr. Abel Joshy have undergone the B.Sc. Degree Course at St. Berchmans College, Changanassery, during the period 2022–2025, and have undertaken this dissertation under the guidance of Ms. Devi N, Assistant Professor, Department of Mathematics, St. Berchmans College, Changanassery. They are permitted to submit this dissertation to the Controller of Examinations of the College.

Changanassery Fr. John J Chavara


10/03/2025 Head of the Department
CERTIFICATE

This is to certify that the dissertation entitled “An Exploration of Monte Carlo Methods” submitted by Mr. Anand Sai P and Mr. Abel Joshy to the Controller of Examinations, St. Berchmans College, Changanassery, for the award of the degree of Bachelor of Science in Mathematics, is a bonafide record of research carried out by them under my supervision. The contents of this dissertation, in full or in part, have not been submitted to any other Institute or University for the award of any degree or diploma.

Changanassery Ms. Devi N


10/03/2025 Assistant Professor
ACKNOWLEDGEMENT

We take this opportunity to express our profound gratitude and deep regards to our guide Ms. Devi N, Assistant Professor, Department of Mathematics, for her exemplary guidance, monitoring and constant encouragement throughout the course of this project. The blessing, help and guidance given by her from time to time shall carry us a long way in the journey of life on which we are about to embark.

We would also like to express our gratitude towards our family, our faculty and our friends for their kind cooperation and encouragement, which helped us in the completion of this project.

Anand Sai P

Abel Joshy
Contents

Introduction

1 Preliminary
  1.1 Sampling
  1.2 Probability Density Function
  1.3 Mathematical Expectation
  1.4 Exponential Distribution
  1.5 Gamma Distribution

2 Sampling Random Variables
  2.1 Importance Sampling

3 Simple Sampling Monte Carlo Methods
  3.1 Introduction
  3.2 Comparisons of Methods for Numerical Integration of Given Functions
  3.3 Boundary Value Problems
  3.4 Monte Carlo Optimization
  3.5 Simulation of Transport Properties
    3.5.1 Neutron Transport

4 Monte Carlo Simulations at the Periphery of Physics and Beyond
  4.1 Mathematics/Statistics
  4.2 Econophysics
  4.3 Astrophysics
  4.4 ‘Traffic’ Simulations
  4.5 Finance
  4.6 Disadvantages of Monte Carlo Methods

Conclusion

References
Introduction: Unveiling Complexity with Randomness: An Exploration of Monte Carlo Methods

Monte Carlo methods, a powerful class of computational algorithms, leverage the principles of randomness and repeated sampling to unravel the complexities inherent in a wide range of problems. From simulating the intricate dance of particles in a nuclear reactor to predicting the fluctuations of financial markets, these techniques offer a unique approach to tackling challenges that defy analytical solutions. This dissertation delves into the fascinating world of Monte Carlo methods, exploring their foundational principles, diverse applications, and inherent strengths and limitations.

We begin by establishing a preliminary understanding of the core concepts underpinning Monte Carlo simulations, including probability distributions, random variables, and the statistical interpretation of results. A crucial aspect of these methods lies in the ability to effectively generate random samples from desired distributions. Therefore, we dedicate a section to the essential techniques for “Sampling Random Variables,” with a particular focus on the efficiency-enhancing strategy of “Importance Sampling.”

Building upon this foundation, we introduce the fundamental “Simple Sampling Monte Carlo Methods.” Here, we explore how randomness can be harnessed to solve seemingly intractable problems, such as numerical integration. We will compare different approaches for numerical integration, highlighting the advantages and disadvantages of each. Furthermore, we examine the practical application of these methods in solving “Boundary Value Problems” and “Monte Carlo Optimization Problems,” and in simulating crucial “Transport Properties,” with a detailed case study on “Neutron Transport.” These examples will showcase the power of Monte Carlo simulations in addressing real-world challenges in physics and engineering.

Finally, we broaden our horizons to appreciate the versatility of Monte Carlo methods by exploring their applications “at the Periphery of Physics and Beyond.” We will discuss the applications, advantages and disadvantages of these techniques as employed in fields as diverse as “Astrophysics,” “Mathematics/Statistics,” “Econophysics,” “Traffic Simulations,” and “Finance,” demonstrating the adaptability and wide-ranging impact of Monte Carlo approaches across scientific and economic domains. Through this comprehensive exploration, we aim to gain a deep appreciation for the power and elegance of Monte Carlo methods in unraveling the complexities of our world.
Chapter 1

Preliminary

1.1 Sampling

Population and Sample

A population is the totality or aggregate of units defined according to the aim of a statistical investigation.

A sample may be defined as a representative part of a population.

If enumeration is done by enumerating a representative part of the population, the method is known as sampling.

1.2 Probability Density Function

Let X be a continuous Random Variable defined on R_X. Then f(x) is called the p.d.f. of X if

P[x ≤ X ≤ x + dx] = ∫_x^{x+dx} f(x) dx

Properties of the p.d.f.

1. f(x) ≥ 0

2. ∫_{R_X} f(x) dx = 1

1.3 Mathematical Expectation

Mathematical Expectation of a discrete Random Variable

Let X be a discrete Random Variable and let its values with non-zero probabilities be x₁, x₂, x₃, .... Let P(x) be its probability mass function, i.e., P(x) = P(X = x). Then the expectation or mathematical expectation of X, usually denoted by E(X), is given by

E(X) = Σ_x x P(x)

Example

Suppose that the probabilities that sets containing 1, 2, 3, 4 and 5 persons pay a visit to an art gallery are 0.2, 0.5, 0.2, 0.07 and 0.03 respectively. What is the expected number of persons per set?

Let X be the number of persons in any set of visitors, x = 1, 2, 3, 4, 5.

Expectation,

E(X) = Σ_x x P(x)
     = 1 × 0.2 + 2 × 0.5 + 3 × 0.2 + 4 × 0.07 + 5 × 0.03
     = 2.23

Mathematical Expectation of a continuous Random Variable

Let X be a continuous Random Variable with p.d.f. f(x). Then the mathematical expectation of X is defined as E(X) = ∫_{−∞}^{∞} x f(x) dx, provided the integral on the right hand side is absolutely convergent.

If the integral is not absolutely convergent, we say that the expectation of X does not exist.

Example

Find the expectation of X if f(x) = 30x⁴(1 − x), 0 ≤ x ≤ 1, and 0 elsewhere.

E(X) = ∫_{−∞}^{∞} x f(x) dx
     = ∫_0^1 x · 30x⁴(1 − x) dx
     = 30 ∫_0^1 x⁵(1 − x) dx
     = 30 ∫_0^1 (x⁵ − x⁶) dx
     = 30 [x⁶/6 − x⁷/7]₀¹
     = 30 (1/6 − 1/7)
     = 5/7

Expectation of a function of a Random Variable

Let X be a Random Variable with p.d.f. f(x) and ϕ(X) any measurable function of it. Then the expectation of ϕ(X) is defined as

E(ϕ(X)) = Σ_x ϕ(x) f(x)   (discrete case)

or

E(ϕ(X)) = ∫_x ϕ(x) f(x) dx   (continuous case)

Example

Let X be a Random Variable with p.d.f. f(x) = x/6 when x = 1, 2, 3 and 0 elsewhere. Find the expectation of (X + 2)².

E((X + 2)²) = Σ_{x=1}^{3} (x + 2)² · x/6
            = (1 + 2)² × 1/6 + (2 + 2)² × 2/6 + (3 + 2)² × 3/6
            = 58/3
Properties of Expectation

1. If X is any Random Variable, a and b any two constants, and ϕ(X) any measurable function of X, then

E(a·ϕ(X) + b) = a·E(ϕ(X)) + b

2. If X and Y are Random Variables, then E(X + Y) = E(X) + E(Y), provided all the expectations exist.

3. If X and Y are real valued Random Variables, then

(E(XY))² ≤ E(X²) E(Y²)

1.4 Exponential Distribution

A continuous Random Variable X is said to follow an exponential distribution with parameter λ if its probability density function is given by

f(x) = λe^{−λx},  0 < x < ∞, λ > 0
     = 0,  elsewhere

Mean and Variance of the Exponential Distribution

Mean:

μ₁′ = E(X) = ∫_x x f(x) dx
           = ∫_0^∞ x λe^{−λx} dx
           = λ ∫_0^∞ x^{2−1} e^{−λx} dx
           = λ · Γ(2)/λ²
           = 1/λ

Variance, V(X) = E(X²) − (E(X))²

E(X²) = ∫_x x² f(x) dx
      = ∫_0^∞ x² λe^{−λx} dx
      = λ ∫_0^∞ x^{3−1} e^{−λx} dx
      = λ · Γ(3)/λ³
      = 2/λ²

V(X) = 2/λ² − (1/λ)²
     = 2/λ² − 1/λ²
     = 1/λ²
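The mean and variance derived above can be checked numerically by simple sampling. A minimal Python sketch; the rate λ = 2, the seed, and the sample size are arbitrary illustrative choices, not values from the text:

```python
import random

# Monte Carlo check of the exponential distribution's mean (1/lambda)
# and variance (1/lambda^2) by simple sampling.
random.seed(42)
lam = 2.0
n = 200_000

samples = [random.expovariate(lam) for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

print(mean)  # close to 1/lambda = 0.5
print(var)   # close to 1/lambda^2 = 0.25
```

The agreement improves like 1/√n as the sample size grows, which is the basic convergence behavior of all the simple-sampling estimators in later chapters.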

1.5 Gamma Distribution

A continuous random variable X is said to follow a gamma distribution with parameter p if its p.d.f. is given by

f(x) = (1/Γ(p)) e^{−x} x^{p−1},  x > 0, p > 0
     = 0,  elsewhere

Mean and Variance of the Gamma Distribution

Mean:

μ₁′ = E(X) = ∫_x x f(x) dx
           = ∫_0^∞ x · (1/Γ(p)) e^{−x} x^{p−1} dx
           = (1/Γ(p)) ∫_0^∞ e^{−x} x^{p+1−1} dx
           = Γ(p + 1)/Γ(p)
           = p

Variance, V(X) = E(X²) − (E(X))²

E(X²) = ∫_x x² f(x) dx
      = ∫_0^∞ x² · (1/Γ(p)) e^{−x} x^{p−1} dx
      = (1/Γ(p)) ∫_0^∞ e^{−x} x^{p+2−1} dx
      = Γ(p + 2)/Γ(p)
      = p(p + 1)

V(X) = p(p + 1) − p² = p² + p − p² = p

Gamma Distribution of the Second Kind

A continuous random variable X is said to follow a gamma distribution with parameters m and p if its p.d.f. is given by

f(x) = (m^p/Γ(p)) e^{−mx} x^{p−1},  x > 0

For the gamma distribution of the second kind,

Mean = p/m
Variance = p/m²
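As with the exponential case, the results Mean = p and Variance = p for the one-parameter gamma density can be verified by sampling. A minimal Python sketch; the shape value p = 3, the seed, and the sample size are arbitrary illustrative choices:

```python
import random

# Sampling check of the gamma distribution's mean (p) and variance (p)
# for the one-parameter form f(x) = e^(-x) x^(p-1) / Gamma(p).
random.seed(7)
p = 3.0
n = 200_000

# gammavariate(alpha, beta) draws from a gamma with shape alpha and
# scale beta; beta = 1 gives exactly the density above.
samples = [random.gammavariate(p, 1.0) for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

print(mean)  # close to p = 3
print(var)   # close to p = 3
```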
Chapter 2

Sampling Random Variables

2.1 Importance Sampling

Importance Sampling (IS) is a variance reduction technique used in Monte Carlo methods to estimate expectations more efficiently. Instead of sampling from the original distribution, we sample from a more convenient proposal distribution and reweight the samples accordingly.

Suppose we want to compute the expectation of a function g(X) with respect to a probability density function (PDF) p(x):

E_p[g(X)] = ∫ g(x) p(x) dx.

Instead of sampling directly from p(x), we introduce an alternative proposal distribution q(x), which satisfies:

1. q(x) > 0 wherever p(x)g(x) is nonzero.

2. q(x) is easier to sample from than p(x).

Using the identity:

E_p[g(X)] = ∫ g(x) (p(x)/q(x)) q(x) dx,

we can approximate the expectation using Monte Carlo sampling from q(x):

E_p[g(X)] ≈ (1/N) Σ_{i=1}^{N} g(Xᵢ) wᵢ,

where:

• Xᵢ ∼ q(x) are samples from the proposal distribution.

• wᵢ = p(Xᵢ)/q(Xᵢ) are called importance weights.

Example: Approximating an Integral using Importance Sampling

Problem Statement

Estimate the integral

I = ∫_0^∞ x² e^{−x²} dx

using importance sampling.

Solution

1. Original Integrand

The integrand f(x) = x² e^{−x²} resembles a gamma-type density, but it is awkward to sample from directly, so we choose a proposal distribution instead.

2. Choose a Proposal Distribution

A natural choice is an exponential distribution q(x) = λe^{−λx} with λ = 1.

3. Compute the Importance Weights

The weight function is:

w(x) = f(x)/q(x) = x² e^{−x²}/e^{−x} = x² e^{x−x²}.

4. Monte Carlo Estimation

We generate N samples from q(x), then compute:

I ≈ (1/N) Σ_{i=1}^{N} f(xᵢ)/q(xᵢ)

Expanding f(x) and q(x):

I ≈ (1/N) Σ_{i=1}^{N} xᵢ² e^{−xᵢ²}/e^{−xᵢ}

which simplifies to:

I ≈ (1/N) Σ_{i=1}^{N} xᵢ² e^{−(xᵢ²−xᵢ)}

The estimated value of the integral using Monte Carlo importance sampling with this proposal distribution is approximately:

I ≈ 0.444

5. Exact Solution for Comparison

Using integration by parts together with the Gaussian integral, the exact value is:

I = √π/4 ≈ 0.443.
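The estimator in step 4 can be reproduced in a few lines of Python, using the Exp(1) proposal from step 2; the seed and sample size are illustrative choices:

```python
import math
import random

# Importance-sampling estimate of I = ∫_0^∞ x^2 e^(-x^2) dx
# with proposal q(x) = e^(-x), weight f(x)/q(x) = x^2 e^(x - x^2).
random.seed(0)
N = 200_000

total = 0.0
for _ in range(N):
    x = random.expovariate(1.0)           # draw x ~ q
    total += x * x * math.exp(x - x * x)  # accumulate the weighted sample

estimate = total / N
print(estimate)                  # close to sqrt(pi)/4 ≈ 0.443
print(math.sqrt(math.pi) / 4)
```

Because the weight x² e^{x−x²} is bounded on [0, ∞), the estimator has finite variance and converges quickly; a poorly matched proposal would not enjoy this property.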
Chapter 3

Simple Sampling Monte Carlo Methods

3.1 Introduction

Modern Monte Carlo methods have their roots in the 1940s, when Fermi, Ulam, von Neumann, Metropolis and others began considering the use of random numbers to examine different problems in physics from a stochastic perspective (Cooper, 1989; this set of biographical articles about S. Ulam provides fascinating insight into the early development of the Monte Carlo method, even before the advent of the modern computer). Very simple Monte Carlo methods were devised to provide a means to estimate answers to analytically intractable problems. Simple Monte Carlo techniques retain their importance because of the dramatic increase in accessible computing power which has taken place during the last two decades. In the remainder of this chapter we shall consider the application of simple Monte Carlo methods to a broad spectrum of interesting problems.

3.2 Comparisons of Methods for Numerical Integration of Given Functions

Simple Methods

Monte Carlo methods are a straightforward and effective technique for evaluating definite integrals that are analytically intractable. These methods, described in detail by Hammersley and Handscomb (1964), are discussed here for one-dimensional integrals but are particularly powerful for multidimensional cases. In the simplest case we wish to obtain the integral of f(x) over some fixed interval:

y = ∫_a^b f(x) dx    (3.1)

In Fig. 3.1 we show a pictorial representation of this problem. A straightforward Monte Carlo solution to this problem via the ‘hit-or-miss’ (or acceptance–rejection) method is to draw a box extending from a to b and from 0 to y₀, where y₀ > f(x) throughout this interval. Using random numbers drawn from a uniform distribution, we drop N points randomly into the box and count the number, N₀, which fall below f(x) for each value of x. An estimate for the integral is then given by the fraction of points which fall below the curve times the area of the box, i.e.

y_est = (N₀/N)[y₀(b − a)]    (3.2)

This estimate becomes increasingly precise as N → ∞ and will eventually converge to the correct answer. This technique is an example of a ‘simple sampling’ Monte Carlo method and is obviously dependent upon the quality of the random number sequence which is used. Independent estimates can be obtained by applying this same approach with different random number sequences, and by comparing these values the precision of the procedure can be ascertained.
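The hit-or-miss procedure of Eq. (3.2) can be sketched directly. In this minimal Python illustration the integrand f(x) = x², the interval [0, 1], and the box height y₀ = 1 are assumptions chosen for demonstration (the exact integral is 1/3):

```python
import random

# Hit-or-miss Monte Carlo integration, following Eq. (3.2):
# drop n points in the box [a, b] x [0, y0], count those below f.
random.seed(1)

def hit_or_miss(f, a, b, y0, n):
    n0 = sum(1 for _ in range(n)
             if random.uniform(0.0, y0) < f(random.uniform(a, b)))
    return (n0 / n) * y0 * (b - a)   # fraction below curve times box area

est = hit_or_miss(lambda x: x * x, 0.0, 1.0, 1.0, 200_000)
print(est)  # close to 1/3
```

Rerunning with a different seed gives an independent estimate; comparing several such runs gauges the precision, exactly as described in the text.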

3.3 Boundary Value Problems

There is a large class of problems which involve the solution of a differential equation subject to a specified boundary condition. As an example, we consider Laplace’s equation

∇²u = ∂²u/∂x² + ∂²u/∂y² = 0    (3.3)

where the function u(r) = f on the boundary. Equation (3.3) can be re-expressed as a finite difference equation, if the increment ∆ is sufficiently small,

∇²u = [u(x + ∆, y) + u(x − ∆, y) + u(x, y + ∆) + u(x, y − ∆) − 4u(x, y)]/∆² = 0,    (3.4)

or

u(x, y) = [u(x + ∆, y) + u(x − ∆, y) + u(x, y + ∆) + u(x, y − ∆)]/4.    (3.5)

If we examine the behavior of the function u(r) at points on a grid in the xy-plane with lattice spacing ∆, we may give this equation a probabilistic interpretation: the probability of a random walk moving from the point (x, y) to any of its nearest neighbor sites is 1/4. If we place the boundary on the grid, as shown in Fig. 3.2, a random walk will terminate at a boundary point (x′, y′), where the variable u has the value

u(x′, y′) = f(x′, y′).    (3.6)

One can then estimate the value of u(x, y) by executing many random walks which begin at the point (x, y), taking the average over all N walks which have been performed:

u(x, y) ≈ (1/N) Σᵢ f(x′ᵢ, y′ᵢ).    (3.7)

After a large number of such walks have been performed, a good estimate of u(x, y) will be produced, but the estimate will depend upon both the coarseness of the grid as well as the number of random walks generated.
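The random-walk estimator of Eq. (3.7) can be sketched as follows. This Python illustration assumes a unit-square domain with boundary condition f(x, y) = x, whose exact harmonic solution is u = x, so the walk estimate at the centre should be close to 0.5; the grid size and walk count are arbitrary illustrative choices:

```python
import random

# Random-walk solution of Laplace's equation on the unit square,
# per Eq. (3.7): walk from (i, j) to a random nearest neighbor
# until the boundary is hit, then record the boundary value.
random.seed(3)
L = 20  # grid of (L+1) x (L+1) points, spacing 1/L

def walk_estimate(i, j, n_walks):
    total = 0.0
    for _ in range(n_walks):
        x, y = i, j
        while 0 < x < L and 0 < y < L:
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = x + dx, y + dy
        total += x / L  # boundary condition f(x, y) = x
    return total / n_walks

u_center = walk_estimate(L // 2, L // 2, 4000)
print(u_center)  # close to the exact value u(0.5, 0.5) = 0.5
```

Note that each walk gives one boundary sample; the statistical error falls off as 1/√N while the discretization error is controlled separately by the grid spacing, matching the two error sources named in the text.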

3.4 Monte Carlo Optimization

Monte Carlo optimization is a stochastic optimization technique that relies on random sampling to find approximate solutions to optimization problems. It is particularly useful when:

• The search space is large and complex.

• The function is non-differentiable, noisy, or irregular.

• A global optimum is desired rather than a local optimum.

• Traditional gradient-based methods fail due to discontinuities or multiple local minima.

Formal Definition of Monte Carlo Optimization

Given an objective function f(x) to be minimized or maximized, we seek

x* = arg min_{x∈X} f(x)

or

x* = arg max_{x∈X} f(x)

where:

• x is the decision variable (it can be a vector in multidimensional problems).

• X is the search space (bounded or unbounded).

• f(x) is the objective function, which may include noise, constraints, and randomness.

• x* is the optimal solution.

Monte Carlo optimization estimates x* by randomly sampling points from X, evaluating f(x), and selecting the best candidate solution.
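The sampling procedure just described reduces, in its simplest form, to pure random search. A minimal Python sketch; the objective f(x) = (x − 2)² + 1 and the interval [−10, 10] are assumptions chosen for illustration (the true minimizer is x* = 2 with f(x*) = 1):

```python
import random

# Pure random-search Monte Carlo optimization: sample the search
# space uniformly, keep the best candidate seen so far.
random.seed(5)

def f(x):
    return (x - 2.0) ** 2 + 1.0

best_x, best_f = None, float("inf")
for _ in range(100_000):
    x = random.uniform(-10.0, 10.0)  # sample a candidate from X
    fx = f(x)
    if fx < best_f:                  # select the best candidate
        best_x, best_f = x, fx

print(best_x, best_f)  # best_x close to 2, best_f close to 1
```

More refined variants (e.g. simulated annealing, mentioned in Chapter 4) bias the sampling toward promising regions instead of drawing uniformly.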

Problem

Find the shortest route through 4 cities with coordinates:

City   Coordinates (x, y)
A      (0, 0)
B      (1, 2)
C      (4, 3)
D      (5, 0)

Solution

1. Generate Random Routes

• Possible random routes:

  – Route 1: A – B – C – D – A
  – Route 2: A – C – B – D – A
  – Route 3: A – D – B – C – A

2. Compute Route Distances

• Using the distance formula:

d_XY = √((x₂ − x₁)² + (y₂ − y₁)²)

• Route 1: A – B – C – D – A

d_AB = √((1 − 0)² + (2 − 0)²) = √5 ≈ 2.24
d_BC = √((4 − 1)² + (3 − 2)²) = √10 ≈ 3.16
d_CD = √((5 − 4)² + (0 − 3)²) = √10 ≈ 3.16
d_DA = √((5 − 0)² + (0 − 0)²) = 5

Total Distance = 2.24 + 3.16 + 3.16 + 5 = 13.56

• Route 2: A – C – B – D – A

d_AC = √((4 − 0)² + (3 − 0)²) = √25 = 5
d_CB = √((4 − 1)² + (3 − 2)²) = √10 ≈ 3.16
d_BD = √((5 − 1)² + (0 − 2)²) = √20 ≈ 4.47
d_DA = √((5 − 0)² + (0 − 0)²) = 5

Total Distance = 5 + 3.16 + 4.47 + 5 = 17.63

• Route 3: A – D – B – C – A

d_AD = √((5 − 0)² + (0 − 0)²) = 5
d_DB = √((5 − 1)² + (0 − 2)²) = √20 ≈ 4.47
d_BC = √((4 − 1)² + (3 − 2)²) = √10 ≈ 3.16
d_CA = √((4 − 0)² + (3 − 0)²) = √25 = 5

Total Distance = 5 + 4.47 + 3.16 + 5 = 17.63

3. Select the Shortest Route

The optimal route is Route 1, with a total distance of 13.56.
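The route comparison above can be automated by sampling random tours and keeping the shortest, which is the Monte Carlo approach in miniature. A Python sketch using the city coordinates from the problem; the seed and the number of sampled tours are illustrative choices:

```python
import math
import random

# Monte Carlo route search for the 4-city example: sample random
# tours, keep the shortest one found.
random.seed(9)
cities = {"A": (0, 0), "B": (1, 2), "C": (4, 3), "D": (5, 0)}

def tour_length(order):
    # Total length of the closed tour visiting cities in `order`.
    total = 0.0
    for p, q in zip(order, order[1:] + order[:1]):
        (x1, y1), (x2, y2) = cities[p], cities[q]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

best = min((random.sample(list(cities), 4) for _ in range(50)),
           key=tour_length)
print(best, round(tour_length(best), 2))  # an optimal tour, length 13.56
```

For 4 cities the 24 possible tours could simply be enumerated; random sampling only pays off when the number of cities, and hence the number of tours, grows too large to enumerate.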

Problem

A delivery service needs to find the shortest route between 5 delivery points in a city with randomly assigned distances.

City Layout and Distances

Points   A     B     C     D     E
A        0     4.5   3.2   7.1   5.4
B        4.5   0     2.8   6.3   3.7
C        3.2   2.8   0     5.1   2.9
D        7.1   6.3   5.1   0     4.0
E        5.4   3.7   2.9   4.0   0

Solution

1. Generate Random Routes

Possible random paths visiting all points once and returning to the start:

• Route 1: A → C → B → E → D → A
• Route 2: A → B → E → C → D → A
• Route 3: A → D → C → B → E → A

2. Compute Route Distances

Using the given table values, calculate the total distance for each route:

Route 1:

d_AC + d_CB + d_BE + d_ED + d_DA = 3.2 + 2.8 + 3.7 + 4.0 + 7.1 = 20.8

Route 2:

d_AB + d_BE + d_EC + d_CD + d_DA = 4.5 + 3.7 + 2.9 + 5.1 + 7.1 = 23.3

Route 3:

d_AD + d_DC + d_CB + d_BE + d_EA = 7.1 + 5.1 + 2.8 + 3.7 + 5.4 = 24.1

3. Select the Shortest Route

The optimal route is Route 1, with a total distance of 20.8.

3.5 Simulation of Transport Properties

3.5.1 Neutron transport

Historically, the examination of reactor criticality was among the first problems to which Monte Carlo methods were applied. The fundamental question at hand is the behavior of large numbers of neutrons inside the reactor. In fact, when neutrons traveling in the moderator are scattered, or when a neutron is absorbed in a uranium atom with a resultant fission event, particles fly off in random directions according to the appropriate differential cross-sections (as the conditional probabilities for such scattering events are called).

Neutron transport deals with the study of how neutrons move and interact with materials. This field is fundamental in nuclear reactor design, shielding analysis, and other applications of nuclear technology. The neutron transport equation describes the distribution of neutrons in a system. It accounts for changes in space, direction, energy, and time:

∂ψ(r⃗, Ω⃗, E, t)/∂t + Ω⃗ · ∇ψ(r⃗, Ω⃗, E, t) + Σ_t ψ(r⃗, Ω⃗, E, t)
  = ∫_0^∞ ∫_{4π} Σ_s(r⃗, E′ → E, Ω⃗′ · Ω⃗) ψ(r⃗, Ω⃗′, E′, t) dΩ′ dE′ + Q(r⃗, Ω⃗, E, t)    (3.10)

• ψ: neutron flux (density of neutrons in phase space).
• Ω⃗: direction of neutron motion.
• E: energy of neutrons.
• Σ_t: total macroscopic cross-section.
• Σ_s: scattering macroscopic cross-section.
• Q: neutron source term.

Here are some other key equations used in Monte Carlo simulations of neutron transport:

1. Monte Carlo Neutron Transport Simulation

In Monte Carlo simulations, each neutron is followed as it travels through the medium and interacts with particles. Key events include:

Mean Free Path: The mean free path λ for a neutron is given by the inverse of the total macroscopic cross-section Σ_total:

λ = 1/Σ_total = 1/(σ_a + σ_s)

where:

• σ_a is the absorption cross-section,
• σ_s is the scattering cross-section.

Collision Probability: The probability of a neutron undergoing a specific event (absorption or scattering) can be determined by the ratio of the corresponding cross-section to the total cross-section:

P_absorption = σ_a/(σ_a + σ_s),   P_scattering = σ_s/(σ_a + σ_s)

2. Henyey–Greenstein Phase Function (Anisotropic Scattering)

For anisotropic scattering, the scattering-angle distribution can be modeled using the Henyey–Greenstein phase function. It is often used to model forward and backward scattering of neutrons:

P(µ) = (1/2) · (1 − g²)/(1 + g² − 2gµ)^{3/2}

where:

• µ = cos θ is the cosine of the scattering angle,
• g is the asymmetry parameter, describing the directionality of scattering (i.e., forward scattering if g > 0).

The average scattering cosine (or mean cosine), ⟨µ⟩, is related to the asymmetry parameter by:

⟨µ⟩ = g

Question 1: Neutron Transport with Multiple Scattering Events

Problem: In a Monte Carlo simulation of neutron transport through a slab of material, you are given the following parameters:

• The material has a uniform density of n = 1.5 × 10²⁴ atoms/m³.
• The material has an absorption cross-section σ_a = 0.3 cm⁻¹ and a scattering cross-section σ_s = 1.5 cm⁻¹.
• Neutrons are initially emitted with a random angle and energy.

a) Calculate the total cross-section and the mean free path for a neutron in this material.

b) Simulate 5,000 neutrons in a Monte Carlo simulation, assuming isotropic scattering. Estimate the fraction of neutrons that will undergo at least one scattering event before being absorbed.

Solution:

a) The total macroscopic cross-section is:

Σ_total = σ_a + σ_s = 0.3 + 1.5 = 1.8 cm⁻¹

The mean free path is:

λ = 1/Σ_total = 1/1.8 cm ≈ 0.5556 cm

b) The probability of scattering at each collision is given by:

P_scattering = σ_s/(σ_a + σ_s) = 1.5/1.8 ≈ 0.833

Thus, the fraction of neutrons that will undergo at least one scattering event before being absorbed can be estimated as:

P_at least one scattering = 1 − P_absorption = 1 − σ_a/(σ_a + σ_s) = 1 − 0.3/1.8 ≈ 0.8333

Therefore, the estimated fraction of neutrons undergoing at least one scattering event is approximately 83.33%.
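Part (b) can be sketched as an actual simulation. This minimal Python illustration draws each neutron's first collision as scattering or absorption from the cross-section ratios; it is a deliberately stripped-down sketch (no geometry, angles, or energy dependence are tracked), so it verifies only the collision statistics, not a full transport calculation:

```python
import random

# Follow 5,000 neutrons; a neutron scatters at least once exactly
# when its first collision is a scattering event, which happens
# with probability sigma_s / (sigma_a + sigma_s).
random.seed(11)
sigma_a, sigma_s = 0.3, 1.5  # cm^-1, from the question
p_scatter = sigma_s / (sigma_a + sigma_s)

n_neutrons = 5_000
scattered = sum(1 for _ in range(n_neutrons)
                if random.random() < p_scatter)

fraction = scattered / n_neutrons
print(fraction)  # close to 1 - 0.3/1.8 ≈ 0.8333
```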

Question 2: Neutron Transport in an Anisotropic Scattering Medium

Problem: Consider a Monte Carlo simulation of neutron transport in a medium where the scattering is anisotropic. The material properties are:

• σ_s = 3 cm⁻¹ (total scattering cross-section),
• The scattering distribution is modeled using the Henyey–Greenstein phase function with an asymmetry parameter g = 0.5,
• The neutron absorption cross-section is σ_a = 0.7 cm⁻¹.

a) Calculate the total macroscopic cross-section for the material.

b) What is the value of the phase function for scattering in the forward direction, given the asymmetry parameter g = 0.5?

Solution:

a) The total macroscopic cross-section is simply the sum of the absorption and scattering cross-sections:

Σ_total = σ_a + σ_s = 0.7 + 3 = 3.7 cm⁻¹

b) The likelihood of scattering in the forward direction is governed by the asymmetry parameter g of the phase function. The Henyey–Greenstein phase function P(µ), where µ is the cosine of the scattering angle, is given by:

P(µ) = (1/2) · (1 − g²)/(1 + g² − 2gµ)^{3/2}

For forward scattering, µ = 1, so:

P(µ = 1) = (1/2) · (1 − g²)/(1 + g² − 2g(1))^{3/2}
         = (1/2) · (1 − (0.5)²)/(1 + (0.5)² − 2(0.5)(1))^{3/2}
         = (1/2) · (1 − 0.25)/(1 + 0.25 − 1)^{3/2}
         = (1/2) · 0.75/(0.25)^{3/2}
         = 0.5 × 0.75/0.125 = 3

Note that P(µ) is a probability density, so values greater than 1 are possible; the result indicates a strong preference for forward scattering due to the asymmetry. In a simulation, scattering angles would be sampled from this density, which is already normalized over µ ∈ [−1, 1].
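The evaluation in part (b) is a one-line function. A small Python sketch of the Henyey–Greenstein phase function, checked at the forward and backward directions:

```python
# Henyey-Greenstein phase function, as used in Question 2:
# P(mu) = (1/2) (1 - g^2) / (1 + g^2 - 2 g mu)^(3/2)
def henyey_greenstein(mu, g):
    return 0.5 * (1.0 - g * g) / (1.0 + g * g - 2.0 * g * mu) ** 1.5

g = 0.5
print(henyey_greenstein(1.0, g))   # forward direction: 3.0, as derived
print(henyey_greenstein(-1.0, g))  # backward direction: 1/9 ≈ 0.111
```

The large forward-to-backward ratio (27 here) quantifies how strongly g = 0.5 biases scattering toward the forward direction.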


Chapter 4

Monte Carlo Simulations at the Periphery of Physics and Beyond

4.1 Mathematics/Statistics

Monte Carlo methods are computational techniques that use random sampling to solve mathematical and statistical problems, especially when analytical solutions are difficult or impractical. These methods are named after the Monte Carlo casino due to their reliance on randomness, and they are widely used in diverse areas such as numerical integration, optimization, probability estimation, and statistical inference.

4.1.1 Key Applications

Numerical Integration: Monte Carlo methods are commonly used to estimate integrals, especially in high-dimensional spaces where traditional methods like Simpson’s rule are inefficient. By randomly sampling points in the domain, these methods provide an estimate of integrals that are difficult to compute analytically.

Estimation of Probabilities: Monte Carlo methods are effective for estimating complex probabilities, especially in situations involving uncertainty and randomness, such as risk assessment or reliability analysis. By simulating many trials, probabilities of certain events can be approximated.

Optimization: Monte Carlo methods, such as simulated annealing, are used to solve optimization problems, particularly when searching for the global minimum of a complex function. These methods are particularly useful for problems in high-dimensional spaces where conventional methods struggle.

Markov Chain Monte Carlo (MCMC): MCMC is a powerful technique for sampling from complex probability distributions, particularly in Bayesian statistics. It generates samples from a desired distribution by constructing a Markov chain, and is used in various applications such as Bayesian inference and statistical physics.

Bootstrapping and Resampling: In statistics, Monte Carlo methods like bootstrapping involve repeatedly resampling data to estimate statistical measures, such as confidence intervals or standard errors. This allows for non-parametric inference when the underlying distribution is unknown or complex.

Random Walks and Diffusion Processes: Monte Carlo simulations are used to model random walks, which describe paths consisting of random steps. These models are widely used in fields like diffusion theory, stock market analysis, and particle physics.

Solving Differential Equations: Monte Carlo methods are also employed to approximate solutions to stochastic differential equations (SDEs), which are prevalent in fields such as finance, physics, and biology, where systems are influenced by random factors.

4.1.2 Advantages

• Flexibility: These methods can be applied to a broad range of problems, from simple integrals to complex simulations involving many variables.

• Handling Complexity: Monte Carlo is ideal for systems that are too complex for analytical solutions, including those with high dimensions or random components.

• Probabilistic Insight: Monte Carlo methods provide statistical estimates that quantify uncertainty, making them valuable for risk analysis and decision-making under uncertainty.

4.2 Econophysics

Econophysics is a field that applies concepts from physics, particularly statistical mechanics and complex systems, to study financial and economic systems. In this field, Monte Carlo methods are commonly used to model and simulate financial market dynamics, risk management, portfolio optimization, and extreme events.

4.2.1 Key Applications

Stock Price Modeling: Monte Carlo simulations model the randomness in asset prices, using probabilistic models like Geometric Brownian Motion (GBM) to generate price paths and estimate risk and returns.

Risk Management: These methods simulate different market scenarios to estimate portfolio risk, including techniques like Value at Risk (VaR) to assess potential losses under uncertain conditions.

Option Pricing: Monte Carlo simulations are used to price complex financial derivatives, such as Asian options, where traditional formulas like Black–Scholes are not easily applicable due to complexity.

Agent-Based Models: Simulations are used to model interactions between agents (e.g., traders) to study how individual behaviors impact overall market dynamics, including phenomena like market crashes and herd behavior.

Extreme Events: Monte Carlo is used to simulate extreme financial events (e.g., market crashes), helping estimate the likelihood of rare but impactful occurrences and assess systemic risk.

Systemic Risk and Network Models: Monte Carlo methods help model the interconnectedness of financial institutions and the risk of contagion during financial crises.

4.2.2 Advantages

*Flexibility: Can handle complex and uncertain financial systems.

*Risk Estimation: Provides insights into the probability of different financial out-

comes, such as market crashes or extreme volatility.

*Non-linear Systems: Effective in simulating non-linear financial behaviors, like

those found in options pricing or crashes.

Monte Carlo methods in econophysics are a powerful tool for understanding and

simulating complex financial phenomena, helping estimate risks, price derivatives,

and model market behaviors. While they offer valuable insights, they are computa-

tionally demanding and rely on accurate model assumptions.

4.3 Astrophysics

Monte Carlo methods are vital computational tools used in astrophysics to simu-

late complex physical systems where analytical solutions are not feasible. These

methods rely on random sampling and statistical principles to model phenomena



under uncertainty, allowing scientists to study intricate processes in space, time,

and matter.

4.3.1 Key Applications

Radiative Transfer: Monte Carlo methods simulate how light (or other ra-

diation) interacts with matter, such as photons passing through stars, nebulae, or

interstellar gas clouds. This helps astrophysicists understand the emitted radiation,

its spectrum, and how it is absorbed or scattered by the surrounding medium.
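A toy illustration of Monte Carlo radiative transfer, for a one-dimensional slab of given optical depth with isotropic scattering and a single-scattering albedo, might look like the following (the geometry and parameter values are assumptions made for this sketch, not taken from the text):

```python
import math
import random

def photon_escape_fraction(tau_max, albedo, n_photons, seed=5):
    """Toy 1-D radiative transfer: each photon enters a slab of optical depth
    tau_max, travels an exponentially distributed optical path, and at each
    interaction is absorbed (probability 1 - albedo) or scattered into a
    random direction. Returns the fraction escaping through the far side."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_photons):
        tau, mu = 0.0, 1.0                 # optical depth position; direction cosine
        while True:
            tau += mu * -math.log(rng.random())   # exponential free path
            if tau >= tau_max:
                escaped += 1               # escaped through the far side
                break
            if tau <= 0.0 or rng.random() > albedo:
                break                      # escaped backward, or absorbed
            mu = 2.0 * rng.random() - 1.0  # isotropic scattering direction
    return escaped / n_photons

frac = photon_escape_fraction(tau_max=1.0, albedo=0.9, n_photons=50_000)
```

For a purely absorbing slab (albedo 0) the escape fraction reduces to e^(-tau_max), which makes a convenient sanity check on the simulation.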

Cosmology and Large-Scale Structure Formation: Monte Carlo simu-

lations model the evolution of large-scale cosmic structures, such as galaxies and

clusters, from the early universe to the present day. These simulations are essential

for testing different cosmological theories, including those involving dark matter and

dark energy, by comparing simulated results with real astronomical observations.

Particle Physics and Cosmic Rays: These methods are used to trace high-

energy cosmic rays as they interact with the Earth’s atmosphere or interstellar mat-

ter. By simulating particle interactions, Monte Carlo techniques help understand

phenomena like cosmic ray showers, particle acceleration, and gamma-ray bursts.

Supernovae Modeling: Monte Carlo methods model the radiation transfer

in supernovae, helping predict their light curves, spectra, and the interaction of

radiation with the surrounding environment. This is crucial for understanding the

explosion dynamics and luminosity of these powerful events.

Neutron Stars and Black Hole Accretion Disks: In environments around

black holes and neutron stars, Monte Carlo simulations help model complex pro-

cesses like photon absorption, scattering, and re-emission within accretion disks.

These simulations aid in understanding X-ray emissions and the physical conditions

near event horizons.

Gravitational Wave Astronomy: Monte Carlo methods simulate the poten-

tial sources of gravitational waves, such as mergers of compact objects, and predict

the signals detected by observatories like LIGO. These simulations help interpret

real data and refine our understanding of cosmic events that produce gravitational

waves.

Galaxy and Star Formation: Monte Carlo methods are used to model the

stochastic processes involved in star and galaxy formation, including gas cloud col-

lapse and stellar evolution. This provides insight into how galaxies and stellar

systems evolve over time.

4.3.2 Advantages

*Handling Complexity: Monte Carlo simulations can manage intricate physical

interactions, including those that involve multiple bodies, turbulence, or radiation

transfer, which are difficult to address analytically.

*Flexibility: These methods can be applied across a wide range of astrophysical

problems, from small-scale particle interactions to large-scale cosmological simula-

tions.

*Uncertainty Quantification: Monte Carlo methods offer probabilistic solutions

that help quantify uncertainties in simulations, which is essential in astrophysical

research where many parameters are unknown or uncertain.

Monte Carlo methods are indispensable in modern astrophysics, enabling scien-

tists to tackle problems too complex for traditional analytical methods. They offer

powerful tools for modeling everything from cosmic ray interactions to the formation

of galaxies and the dynamics of black holes. Despite the computational demands,

the flexibility and accuracy they provide make them a cornerstone of astrophysical

research, allowing for deeper understanding of the universe’s most mysterious and

dynamic processes.

4.4 Traffic Simulations

Monte Carlo Traffic Simulation uses random sampling and probabilistic models to

simulate and analyze traffic flow, congestion, and interactions in transportation sys-

tems. It is valuable in modeling the uncertainty and complexity of real-world traffic

conditions.

4.4.1 Key Applications:

Traffic Flow and Congestion: Monte Carlo methods simulate vehicle arrivals,

traffic signals, and lane changes to predict congestion and delays, helping optimize

traffic flow and reduce bottlenecks.

Pedestrian and Vehicle Interaction: Simulations model pedestrian behavior at

crosswalks and vehicle interactions, accounting for random events such as accidents

or lane changes.

Traffic Signal Timing: Monte Carlo simulations help optimize traffic signal tim-

ings by modeling random vehicle arrivals and testing different traffic light cycles to

minimize wait times and congestion.

Transportation Planning: Monte Carlo methods evaluate how changes in in-

frastructure (e.g., new roads or intersections) affect traffic flow, helping planners

optimize road networks and plan for future growth.

Accident and Safety Analysis: Simulations account for accidents or near-misses,

helping to identify accident hotspots and assess the effectiveness of safety measures

like traffic signs and speed limits.

Environmental Impact: Monte Carlo simulations estimate the environmental impact
of traffic by simulating vehicle emissions under various traffic conditions.
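A toy fixed-cycle signal model in this spirit, with Poisson arrivals each second and one departure per second of green, can be sketched as follows (the rates and cycle lengths are illustrative; a small Poisson sampler is included because Python's standard library does not provide one):

```python
import math
import random

def poisson(rng, lam):
    """Sample a Poisson variate via Knuth's multiplication method."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def mean_queue(arrival_rate, green, red, n_cycles, seed=7):
    """Mean queue length at a signal: Poisson vehicle arrivals every second,
    one vehicle departs per second while the light is green."""
    rng = random.Random(seed)
    queue, total, ticks = 0, 0, 0
    for _ in range(n_cycles):
        for t in range(green + red):
            queue += poisson(rng, arrival_rate)
            if t < green and queue > 0:
                queue -= 1           # one departure per green second
            total += queue
            ticks += 1
    return total / ticks

q_even = mean_queue(arrival_rate=0.3, green=30, red=30, n_cycles=500)
q_long_green = mean_queue(arrival_rate=0.3, green=45, red=15, n_cycles=500)
# lengthening the green phase should shrink the average queue
```

Comparing signal timings across many randomized cycles in this way is exactly the kind of scenario comparison such simulations support.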

4.4.2 Advantages

*Flexibility: Can simulate a wide range of traffic scenarios.

*Uncertainty Handling: Captures the randomness of traffic behavior and road

conditions.

*Scenario Exploration: Helps assess various "what-if" scenarios for better
decision-making.

Monte Carlo methods offer powerful tools for simulating and optimizing traffic sys-

tems, improving traffic flow, road safety, and environmental impact assessments.

Despite challenges in computation and data accuracy, they provide valuable insights

for transportation planning and traffic management.

4.5 Finance

Monte Carlo methods are a class of computational techniques that use random sam-

pling to solve problems that might be deterministic in principle. In finance, these

methods are particularly useful for pricing complex financial derivatives, managing

risk, and modeling uncertainties in various financial systems.

4.5.1 Key Applications

Option Pricing: Monte Carlo simulations are widely used to price options when

traditional closed-form models, like Black-Scholes, are not applicable. For example,

path-dependent options (Asian options, barrier options) or options with compli-

cated features benefit from Monte Carlo due to its flexibility in handling various

underlying asset price paths.
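A sketch of such a pricer for an arithmetic-average Asian call under risk-neutral GBM follows; all parameter values are illustrative, not drawn from the text:

```python
import math
import random

def asian_call_mc(s0, k, r, sigma, T, n_steps, n_paths, seed=1):
    """Monte Carlo price of an arithmetic-average Asian call under
    risk-neutral GBM. Payoff: max(mean(S_t) - K, 0), discounted at r."""
    rng = random.Random(seed)
    dt = T / n_steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    total = 0.0
    for _ in range(n_paths):
        s = s0
        running = 0.0
        for _ in range(n_steps):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            running += s
        total += max(running / n_steps - k, 0.0)   # path-dependent payoff
    return math.exp(-r * T) * total / n_paths

price = asian_call_mc(s0=100.0, k=100.0, r=0.05, sigma=0.2,
                      T=1.0, n_steps=50, n_paths=20_000)
```

Because the payoff depends on the whole price path, each path must be simulated step by step; variance-reduction refinements, such as using the geometric-average option as a control variate, are common in practice.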

Risk Management:

*Value at Risk (VaR): Monte Carlo simulations help estimate the potential

losses in a portfolio by simulating many different market scenarios. This helps as-

sess how much risk the portfolio is exposed to under various market conditions.

*Stress Testing: The method is useful for testing how a portfolio would behave

under extreme market events, such as market crashes or sharp volatility spikes.
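The VaR idea reduces to simulating many portfolio P&L scenarios and reading off a loss quantile. A minimal sketch, with a hypothetical normally distributed one-day P&L standing in for a real portfolio model:

```python
import random

def monte_carlo_var(pnl_simulator, n_scenarios, alpha=0.95, seed=42):
    """Estimate Value at Risk: the loss not exceeded with probability alpha,
    taken as the alpha-quantile of simulated losses (loss = -P&L)."""
    rng = random.Random(seed)
    losses = sorted(-pnl_simulator(rng) for _ in range(n_scenarios))
    return losses[int(alpha * n_scenarios)]

# Hypothetical one-day P&L: normal, mean 0, standard deviation 10,000
var_95 = monte_carlo_var(lambda rng: rng.gauss(0.0, 10_000.0),
                         n_scenarios=100_000)
```

For the normal P&L assumed here the 95% VaR is known in closed form (about 1.645 times the standard deviation), which gives a check on the simulation; real portfolio models rarely admit such a check, which is why Monte Carlo is used.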

Portfolio Optimization: Monte Carlo simulations assist in evaluating different

portfolio allocations by simulating thousands of potential future scenarios. This

helps identify optimal combinations of assets that achieve desired risk-return objec-

tives.

Interest Rate and Credit Risk Modeling: Monte Carlo techniques are used to

simulate the behavior of interest rates and assess the pricing of fixed-income instru-

ments and derivatives. Similarly, credit risk simulations help assess the likelihood

of default or the potential loss in credit portfolios.

Real Options Analysis: Monte Carlo methods allow for a more nuanced valuation

of real options, such as the flexibility to abandon, defer, or expand an investment,

especially under uncertainty, offering a more robust alternative to traditional dis-

counted cash flow (DCF) models.

4.5.2 Advantages

*Flexibility: Can be applied to nearly any financial model, regardless of its com-

plexity.

*No Closed-Form Solution Needed: Particularly useful for models that lack a

closed-form solution or are too complex for traditional methods.

*Accuracy: With enough iterations, Monte Carlo simulations can provide very

precise numerical estimates.

Monte Carlo methods are a versatile and powerful tool in modern finance, enabling

practitioners to model complex financial instruments, manage risk, and optimize

portfolios. Despite the computational cost, their ability to provide solutions where

traditional methods fail makes them indispensable for financial analysis, especially

in scenarios involving uncertainty and complexity.

4.6 Disadvantages of Monte Carlo Methods

1. High Computational Cost

Monte Carlo simulations often require a large number of random samples to achieve

accurate results, making them computationally expensive. This is especially prob-

lematic for high-resolution fluid dynamics problems.

2. Slow Convergence

Monte Carlo methods converge slowly compared to deterministic numerical methods
like finite difference or finite element methods. The statistical error decreases
proportionally to 1/√N, where N is the number of samples, so halving the error
requires four times as many samples.
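This 1/√N error scaling is easy to observe with the classic hit-or-miss estimate of π:

```python
import math
import random

def estimate_pi(n, seed=0):
    """Hit-or-miss estimate of pi from n uniform points in the unit square:
    4 * (fraction of points landing inside the quarter circle)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

for n in (100, 10_000, 1_000_000):
    est = estimate_pi(n)
    print(n, est, abs(est - math.pi))  # error shrinks roughly like 1/sqrt(n)
```

Each tenfold increase in N shrinks the typical error only by about √10 ≈ 3.2, which is why variance reduction techniques matter so much in practice.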

3. Noisy Results

Since Monte Carlo relies on random sampling, results can have high statistical noise

unless a very large number of samples is used. This makes it difficult to achieve

smooth, precise solutions.

4. Inefficiency for Low-Variance Problems

Monte Carlo methods are most effective for high-dimensional or highly random sys-

tems. However, for problems with low variance (e.g., laminar fluid flow), determin-

istic solvers like computational fluid dynamics (CFD) methods (e.g., Navier-Stokes

solvers) are often much more efficient.

5. Difficulty in Capturing Continuity and Small-Scale Features

Monte Carlo methods struggle to capture continuous properties like fluid velocity

gradients and small-scale structures in turbulence without an extremely high num-

ber of samples.

6. Requires Specialized Variance Reduction Techniques

To improve efficiency, Monte Carlo methods often require variance reduction
techniques (e.g., importance sampling, stratified sampling), which add complexity
to the implementation.
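As a small sketch of importance sampling (the tail-probability example is illustrative, not from the text): to estimate P(Z > 4) for standard normal Z, naive sampling would typically see only a handful of hits in 100,000 draws; sampling instead from the shifted proposal N(4, 1) and reweighting by the likelihood ratio exp(c²/2 − cx) concentrates the work in the tail.

```python
import math
import random

def tail_prob_is(c, n, seed=3):
    """Importance-sampling estimate of P(Z > c), Z ~ N(0,1): draw from the
    shifted proposal N(c, 1) and weight each tail sample by the likelihood
    ratio phi(x) / phi(x - c) = exp(c^2/2 - c*x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(c, 1.0)          # most draws land near the tail region
        if x > c:
            total += math.exp(c * c / 2.0 - c * x)
    return total / n

est = tail_prob_is(4.0, 100_000)   # true value is about 3.17e-5
```

The variance of this estimator is orders of magnitude below the naive one at the same sample budget, but choosing a good proposal distribution is the added implementation burden mentioned above.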

7. Not Ideal for Transient (Time-Dependent) Problems

Monte Carlo methods are generally not well-suited for time-dependent fluid flow

problems, as they require extensive sampling at each time step, making simulations

impractically slow.

Conclusion

In conclusion, this exploration of Monte Carlo methods has illuminated their re-

markable versatility and power in tackling complex problems across diverse fields.

From the foundational principles of random sampling and probability distributions

to advanced techniques like importance sampling, we have traced the evolution of

these methods and witnessed their application in solving real-world challenges. The

examination of simple sampling techniques, including numerical integration and

simulations of boundary value problems, Monte Carlo Optimization and transport

phenomena, underscored the core strength of Monte Carlo approaches in handling

problems intractable by analytical means. Furthermore, the journey beyond tradi-

tional physics applications, into domains like astrophysics, mathematics/statistics,

econophysics, traffic simulation, and finance, highlighted the truly interdisciplinary

nature of these techniques and their profound impact on scientific and economic

thought. While Monte Carlo methods offer a powerful toolkit for understanding

complex systems, it is crucial to acknowledge their inherent reliance on statistical

convergence and the potential for statistical error. Careful consideration of sample

size, variance reduction techniques, and the specific characteristics of the problem

at hand are essential for ensuring the accuracy and reliability of Monte Carlo simu-

lations. Despite these considerations, the ability of Monte Carlo methods to bridge

the gap between theoretical models and real-world observations solidifies their po-

sition as indispensable tools in the arsenal of modern science and engineering. As


computational power continues to grow, we can anticipate even wider adoption and

further refinement of these powerful techniques, paving the way for deeper insights

into the intricacies of our world.


