Practice Problems: Bayesian Statistics

A. Problems
1. Bayes’ Theorem - Symbolic Practice
Let events A and B satisfy P (A) = 0.3, P (B|A) = 0.8, and P (B|Aᶜ) = 0.4. Compute P (A|B).
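
Answers of this form are easy to sanity-check numerically. A minimal Python sketch for Problem 1 (plain arithmetic; the variable names are illustrative):

```python
# Bayes' theorem check for Problem 1.
p_a = 0.3           # P(A)
p_b_given_a = 0.8   # P(B|A)
p_b_given_ac = 0.4  # P(B|A^c)

# Law of total probability: P(B) = P(B|A)P(A) + P(B|A^c)P(A^c)
p_b = p_b_given_a * p_a + p_b_given_ac * (1 - p_a)

# Bayes' theorem: P(A|B) = P(B|A)P(A) / P(B)
print(p_b_given_a * p_a / p_b)  # 0.24 / 0.52 ≈ 0.4615
```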

2. Complement Event
Suppose the probability of an event A is given by P (A) = 0.3. Calculate P (Aᶜ), the probability of the
complement of event A.

3. Law of Total Probability


(a) Let events A1 , A2 partition the sample space with P (A1 ) = 0.6 and P (A2 ) = 0.4. If P (B|A1 ) =
0.2 and P (B|A2 ) = 0.5, compute P (B).
(b) A box contains three bags: Bag 1 (2 red, 3 green), Bag 2 (4 red, 1 green), Bag 3 (1 red, 4 green).
One bag is selected at random, and then a ball is drawn. What is the probability that the ball is red?
(c) Warehouse 1 contains 60% of items and has a 2% defect rate. Warehouse 2 has 40% of items and
a 5% defect rate. Find the overall probability that a randomly chosen item is defective.
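
All three parts of Problem 3 are instances of the same computation; a small Python helper (a sketch, with hypothetical names) makes that explicit:

```python
# Law of total probability over a partition {A_i}.
def total_probability(priors, likelihoods):
    """P(B) = sum_i P(B|A_i) * P(A_i)."""
    return sum(p * l for p, l in zip(priors, likelihoods))

print(total_probability([0.6, 0.4], [0.2, 0.5]))            # (a): 0.32
print(total_probability([1/3, 1/3, 1/3], [0.4, 0.8, 0.2]))  # (b): P(red) ≈ 0.467
print(total_probability([0.6, 0.4], [0.02, 0.05]))          # (c): 0.032
```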

4. Conditional Probability
(a) Given: P (A) = 0.4, P (B) = 0.5, and P (A ∩ B) = 0.2. Calculate P (A|B).
(b) Let P (A) = 0.3, P (B) = 0.4, and P (A ∩ B) = 0.1. Find P (Aᶜ|B).

5. Independent Events
(a) Given P (A) = 0.6, P (B) = 0.7, and P (A ∩ B) = 0.42. Are events A and B independent?
(b) Given P (A) = 0.5, P (B) = 0.6, and P (A ∩ B) = 0.3. Compute P (B|A) and determine if A and
B are independent.

6. Coin Flipping
You have two coins: Coin 1 is fair, and Coin 2 has P (H) = 0.75. You randomly choose one and flip it.
It lands heads. What is the probability it was Coin 1?
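
Problems like this one can also be checked by simulation instead of by formula. A Monte Carlo sketch for Problem 6 (standard library only):

```python
# Estimate P(Coin 1 | heads) by simulating the two-coin experiment.
import random

random.seed(0)
heads = coin1_and_heads = 0
for _ in range(100_000):
    coin = random.choice([1, 2])             # pick a coin uniformly at random
    p_head = 0.5 if coin == 1 else 0.75      # Coin 1 is fair, Coin 2 is biased
    if random.random() < p_head:             # the flip lands heads
        heads += 1
        coin1_and_heads += (coin == 1)
print(coin1_and_heads / heads)  # should approach 0.25 / 0.625 = 0.4
```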

7. Medical Testing
A disease affects 1% of the population. A test has 98% sensitivity (i.e., if tested on persons sick with the
disease, the result is positive 98% of the time) and 97% specificity (i.e., if tested on healthy people, the
test will correctly give 97% of them a negative result). If a person tests positive, what is the probability
the person actually has the disease?

8. Email Spam Filtering


A spam filter used to detect spam emails works as follows:

• If an email is spam, the filter correctly labels it as spam 99% of the time.
• If an email is not spam, the filter correctly labels it as not spam 95% of the time.

Suppose 20% of all emails are actually spam.


If the filter flags an email as spam, compute P (Spam|Flagged), i.e., the probability that the email is
actually spam.

9. Manufacturing Defects
A factory has two production lines (Line 1 and Line 2) manufacturing the same product. Line 1 makes
70% of items with 4% defects. Line 2 makes 30% with 8% defects. A defective item is found. What is
the probability it came from Line 1?

10. Internet Connectivity


Two internet service providers (ISP 1 and ISP 2) offer internet connectivity in a neighborhood. ISP 1
serves 80% of homes with 1% connection failure rate. ISP 2 serves 20%, with 3% connection failure
rate. If a household experiences a connection failure, what is the probability that they are using ISP 1?

11. Radiation Detector in Particle Physics


A radiation detector in a particle physics experiment is designed to identify a specific particle produced
in high-energy collisions. Given:

• The probability that the detector correctly identifies the particle when it is present (true positive
rate) is 95%.
• The probability that the detector gives a false signal when the particle is not present (false positive
rate) is 10%.
• Only 5% of events in the collider actually produce the particle.

A detection signal is recorded. What is the probability that this signal corresponds to a real particle (i.e.,
the particle is actually present in the event)?

12. Cosmic Ray Origin Inference


A cosmic ray is detected and is either from the Sun (S), the Milky Way (M ), or an extragalactic source
(E). Prior probabilities: P (S) = 0.2, P (M ) = 0.5, P (E) = 0.3. Spectral analysis yields a profile
with likelihoods: P (D|S) = 0.6, P (D|M ) = 0.4, P (D|E) = 0.9. Compute the posterior probability
that the ray came from an extragalactic source.

13. Spam Filtering Using Bayes’ Rule


A spam filter has learned that the word “winner” appears in 80% of spam emails and 10% of legitimate
emails. Assume 70% of all emails are spam. What is the probability that an email is spam, given that it
contains “winner”?

14. COVID Test During Pandemic


In a region where 5% of the population is infected with COVID-19, a test has the following properties:

• The test correctly identifies infected individuals 98% of the time (98% sensitivity).

• The test incorrectly returns a positive result for 3% of healthy individuals (3% false positive rate).

If a person tests positive, what is the probability they are actually infected?

15. Pendulum Model Validation
Two models for the period of a pendulum are being tested:
H0 : T = 2π √(l/g)

H1 : T = 2π √(l/g) · (1 + bθ²)

Based on experimental data D, it is known that

P (D | H0 ) = 0.3, P (D | H1 ) = 0.6

Assuming equal prior beliefs:


P (H0 ) = P (H1 ) = 0.5,
(a) Compute the posterior probabilities P (H0 | D) and P (H1 | D).
(b) Compute the Bayes factor B10 .
(c) Compute the posterior odds in favor of H1 over H0 .

16. Coin Toss - Beta Posterior


Suppose a coin is tossed 6 times and the result is 5 heads and 1 tail. Assume a uniform prior distribution
on the bias:
θ ∼ Beta(1, 1)
(a) Find the posterior distribution p(θ | data).
(b) Compute the expected value E[θ | data].
(c) Compute the MAP estimate of θ (the value of θ that maximizes the posterior distribution).
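
A quick numerical check of Problem 16 (a sketch assuming SciPy is available; Beta-binomial conjugacy gives the posterior in closed form):

```python
# Posterior for a coin bias after a uniform Beta(1, 1) prior and 5 heads, 1 tail.
from scipy.stats import beta

heads, tails = 5, 1
a, b = 1 + heads, 1 + tails     # posterior is Beta(6, 2)
print(beta(a, b).mean())        # E[theta | data] = a / (a + b) = 0.75
print((a - 1) / (a + b - 2))    # MAP = (a - 1) / (a + b - 2) = 5/6 ≈ 0.833
```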

17. Medical Test - Rare Disease


A disease affects 1% of the population. A diagnostic test satisfies the following:

• It correctly identifies 99% of infected individuals (sensitivity: the probability that the test is positive
given the person is infected).
• It correctly identifies 95% of healthy individuals (specificity: the probability that the test is nega-
tive given the person is not infected).

If a person tests positive, what is the probability that they actually have the disease?

18. Bayes’ Theorem with Multiple Hypotheses


A medical test can detect three different diseases: A, B, and C. Prior probabilities are:
P (A) = 0.3,
P (B) = 0.5,
P (C) = 0.2.
The probability that the test is positive given each disease is:

P (T + |A) = 0.9, P (T + |B) = 0.7, P (T + |C) = 0.5.

What is the probability that the disease is B, given a positive test?

19. Bayesian Parameter Estimation: Coin Bias
A coin is tossed 8 times and results in 6 heads. Assuming a uniform prior, find the posterior distribution.

20. Sequential Bayesian Update – Coin Toss


A coin is suspected to be biased. You toss this coin 5 times and obtain 4 heads. Using the prior
Beta(2, 2), obtain the posterior. You then toss the coin 4 more times and obtain 3 heads out of 4. What
will be the updated posterior now?

21. Sequential Bayesian Update — Normal Likelihood with Known Variance
A physicist is measuring the length L of a metal rod. The measurement process is subject to Gaussian
(normal) noise with known variance. The true length L is assumed to be normally distributed as:

L ∼ N (10, 1²)

This reflects the prior belief about the mean length of the rod.
Each measurement is modeled as:
xi ∼ N (L, σ²)
with σ = 1 cm (known measurement uncertainty).
The physicist makes the following sequential observations:

• First measurement: x1 = 11 cm
• Second measurement: x2 = 12 cm
(a) Perform the first Bayesian update using x1 , and compute the posterior mean and variance for L.
(b) Perform a second update using x2 , treating the previous posterior as the new prior. Compute the
updated posterior mean and variance.
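
The normal-normal update in Problem 21 reduces to adding precisions (inverse variances); a plain-Python sketch of that rule with the problem's numbers:

```python
# One conjugate update for L ~ N(mu0, var0) with observation x ~ N(L, var_lik).
def normal_update(mu0, var0, x, var_lik):
    prec = 1 / var0 + 1 / var_lik             # precisions add
    mu = (mu0 / var0 + x / var_lik) / prec    # precision-weighted mean
    return mu, 1 / prec

mu, var = normal_update(10.0, 1.0, 11.0, 1.0)  # (a): mean 10.5, variance 0.5
print(mu, var)
mu, var = normal_update(mu, var, 12.0, 1.0)    # (b): mean 11.0, variance 1/3
print(mu, var)
```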

22. Parameter Update with Conjugate Prior


A coin has unknown bias θ, representing the probability of landing heads. You begin with a prior belief
that θ ∼ Beta(2, 2).
You toss the coin 5 times and observe 4 heads and 1 tail.
(a) Determine the posterior distribution of θ given this data.
(b) Compute the posterior mean.
(c) Compute the MAP estimate of θ.
(d) Suppose you perform 5 more tosses and observe 2 additional heads and 3 tails. Update your
posterior again and compute the new MAP estimate. Compare it with the previous MAP and
interpret the result.
(e) Interpret your results

23. Poisson Rate Estimation


A physicist is monitoring radioactive decays using a Geiger counter. The number of decay events per
unit time is assumed to follow a Poisson process with unknown rate λ (measured in events per minute).
Before collecting data, the physicist models λ using an exponential prior:

p(λ) = e^(−λ), λ > 0

(a) In a 2-minute interval, the detector records 3 decay events. Obtain the posterior distribution p(λ |
data) up to proportionality.

(b) Identify the form of the posterior distribution and compute the MAP estimate of λ.
(c) Compute the mean of the posterior distribution.
(d) The experiment continues for 2 more minutes, and 4 additional decay events are recorded. Update
your posterior distribution using all the data, compute the new MAP estimate, and interpret how
your belief about λ has changed.
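
Since the prior e^(−λ) is a Gamma(1, 1) density, the updates in Problem 23 are conjugate: the shape grows by the event count and the rate by the observation time. A sketch of the bookkeeping:

```python
# Gamma-Poisson conjugate updates for the decay rate lambda.
shape, rate = 1.0, 1.0                   # p(lambda) = e^(-lambda) = Gamma(1, 1)
shape, rate = shape + 3, rate + 2        # (a)-(c): 3 events in 2 min -> Gamma(4, 3)
print((shape - 1) / rate, shape / rate)  # MAP = 1.0, posterior mean = 4/3
shape, rate = shape + 4, rate + 2        # (d): 4 more events in 2 min -> Gamma(8, 5)
print((shape - 1) / rate, shape / rate)  # MAP = 1.4, posterior mean = 1.6
```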

24. Posterior Mean and Variance of Beta Distribution


In a quantum optics experiment, the efficiency θ of a single-photon detector is defined as the probability
that an incoming photon is successfully detected.
The initial prior belief about θ is modeled using a Beta(2, 2) distribution, reflecting a belief in moderate
uncertainty.
The detector is then tested with a stream of photons. The results of two separate trials are as follows:

• Trial 1: 8 out of 10 photons were detected.


• Trial 2: 15 out of 20 photons were detected.
(a) Compute the posterior distribution after Trial 1.
(b) Update the posterior again after Trial 2.
(c) Compute the mean and variance of the final posterior distribution.
(d) Alternatively, compute the posterior using a single update that combines all data. Compare the
result with the sequential update and interpret.

25. A physicist is measuring the decay rate λ (in decays per minute) of a radioactive isotope using a Geiger
counter.
The number of decay events per unit time is modeled by a Poisson process with unknown rate λ. Before
any measurements, the physicist expresses their uncertainty about λ using an exponential prior:

p(λ) = e^(−λ), λ > 0

Two rounds of measurements are conducted:

• Trial 1: 4 decay events observed in a 2-minute interval.


• Trial 2: 3 decay events observed in a 1-minute interval.
(a) Compute the posterior distribution for λ after Trial 1.
(b) Update the posterior using data from Trial 2.
(c) Compute the MAP estimate and mean of the final posterior.
(d) Alternatively, compute the posterior using all data at once. Compare it with the sequential update
and interpret the result.

26. Gaussian Posterior for Known Variance


A nuclear physics experiment measures the average lifetime µ (in microseconds) of an excited nuclear
state. Each measurement is affected by thermal and electronic noise and is assumed to follow a Gaussian
distribution with known variance σ² = 1 (µs)².
The experiment proceeds in two phases:

• Phase 1: 10 measurements yield a sample mean of x̄1 = 2 µs

• Phase 2: 5 more measurements yield a sample mean of x̄2 = 1 µs

Assuming a prior belief µ ∼ N (0, 1),


(a) Compute the posterior mean of µ after Phase 1.
(b) Update the posterior using Phase 2 data and compute the new posterior mean.
(c) Alternatively, compute the posterior mean using a single update that combines all data.
(d) Compare the result from part (b) and part (c). Comment on the consistency of Bayesian updating.
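
The consistency claimed in part (d) can be verified numerically; a plain-Python sketch (hypothetical helper name) comparing the sequential and combined updates:

```python
# Normal-normal update for n observations with sample mean xbar, known var_lik = 1.
def update(mu0, var0, n, xbar, var_lik=1.0):
    prec = 1 / var0 + n / var_lik
    return (mu0 / var0 + n * xbar / var_lik) / prec, 1 / prec

mu, var = update(0.0, 1.0, 10, 2.0)      # Phase 1: posterior mean 20/11
mu, var = update(mu, var, 5, 1.0)        # Phase 2: posterior mean 25/16
print(mu)                                # 1.5625
print(update(0.0, 1.0, 15, 25 / 15)[0])  # single combined update: also 25/16
```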

27. A low-temperature solid-state physics experiment records the thermal noise amplitude x (in µV) of a
quantum device. The noise is known to be centered around a theoretically predicted mean µ = 0, but
its variance σ² is unknown due to environmental fluctuations.
To model uncertainty in σ², the researcher uses the prior:

σ² ∼ Inverse-Gamma(α0 = 3, β0 = 0.6)

A new batch of n = 6 measurements yields a sample variance s² = 0.15 µV².


(a) Determine the posterior distribution of σ² given this data.
(b) Compute the posterior mean of σ².
(c) What is the most probable (MAP) value of σ²?

28. In a particle physics experiment, the decay time x (in nanoseconds) of an unstable particle is measured
using a fast digital oscilloscope. Due to detector resolution limits and fluctuations, the measured decay
times are normally distributed with unknown mean µ and unknown variance σ².
The experimenter uses the following conjugate prior:

µ | σ² ∼ N (µ0 = 0, σ²/κ0 = σ²/1), σ² ∼ Inverse-Gamma(α0 = 2, β0 = 2)

A batch of n = 5 measurements yields:

x̄ = 1.2 ns, s² = 0.25 ns²

(a) Write down the posterior distribution of σ² given the data.
(b) Compute the posterior mean of σ².
(c) Write the posterior distribution of µ | σ².

29. In a quantum optics experiment, photons are passed through a polarization filter, and each photon’s
outcome is recorded as either horizontally (H) or vertically (V) polarized.
The probability of detecting a photon as horizontally polarized is denoted by θ. The researcher is
evaluating two competing hypotheses:

• Model H1 : The polarization detection is perfectly fair: θ = 0.5


• Model H2 : The detector may be biased, with θ ∼ Uniform(0, 1)

The researcher believes the detector is likely well-calibrated, and assigns:

P (H1 ) = 0.8, P (H2 ) = 0.2

Three photons are sent through the detector, and all are recorded as horizontally polarized (HHH).

(a) Compute the Bayes factor B21 = P (D|H2 ) / P (D|H1 ).
(b) Compute the posterior odds ratio P (H2 |D) / P (H1 |D).
(c) Which model is better supported by the data, and how does the prior belief affect the result?
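
The marginal likelihood under H2 is an integral of the likelihood over the uniform prior; a numerical sketch (assuming SciPy) for checking parts (a) and (b):

```python
# Bayes factor for HHH: point hypothesis theta = 0.5 vs. theta ~ Uniform(0, 1).
from scipy.integrate import quad

p_d_h1 = 0.5 ** 3                         # P(D|H1): three heads at theta = 0.5
p_d_h2, _ = quad(lambda t: t ** 3, 0, 1)  # P(D|H2): integral of theta^3 = 1/4
b21 = p_d_h2 / p_d_h1
print(b21)                                # 0.25 / 0.125 = 2.0
print(b21 * 0.2 / 0.8)                    # posterior odds = B21 * prior odds = 0.5
```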

30. Posterior Odds and Bayes Factor Calculation


A researcher is comparing two models based on some observed data D. The likelihoods under each
model and the prior probabilities are given:

• P (D | H1 ) = 0.1, P (H1 ) = 0.3


• P (D | H2 ) = 0.2, P (H2 ) = 0.7
(a) Compute the Bayes factor B21 .
(b) Compute the posterior odds ratio P (H2 |D) / P (H1 |D).

31. Posterior Odds and Bayes Factor — Coin Tossing


A coin is tossed 4 times and lands heads each time. A student wants to compare two models:

• H1 : The coin is fair → θ = 0.5


• H2 : The coin is biased → θ ∼ Uniform(0, 1)

The student assigns prior probabilities P (H1 ) = 0.75, P (H2 ) = 0.25.


(a) Compute the Bayes factor B21 in favor of the biased model.
(b) Compute the posterior odds P (H2 |D) / P (H1 |D).

32. Posterior Odds and Bayes Factor — Disease Testing


A diagnostic test is used to check for a disease. Two models are considered:

• H1 : The person is healthy → the test returns a false positive with probability 0.05


• H2 : The person is infected → the test returns a positive result with probability 0.90

The prior belief in infection is low: P (H2 ) = 0.1, P (H1 ) = 0.9.


Given a positive test result, answer:
(a) Compute the Bayes factor B21 .
(b) Compute the posterior odds P (H2 |Positive) / P (H1 |Positive).

33. Bayesian analysis of the exponential distribution


A lifetime X of a machine is modeled by an exponential distribution:

X ∼ Exponential(θ)

with unknown rate parameter θ. Suppose we observe:

X1 = 5, X2 = 6, X3 = 4

(the lifetimes in years of three different i.i.d. machines).


Assume that an expert believes θ has an exponential prior:

p(θ) = Exp(θ | λ)

(a) Choose the prior parameter λ such that E[θ] = 1/3.
(b) Determine the posterior distribution p(θ | D), where D = {X1 , X2 , X3 }.
(c) Is the exponential prior conjugate to the exponential likelihood?
(d) Compute the posterior mean E[θ | D].
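
Written as Gamma(1, λ), the exponential prior updates conjugately against the exponential likelihood: the shape grows by the number of observations and the rate by their sum. A sketch of the arithmetic for Problem 33:

```python
# Gamma-exponential conjugate update for the rate theta.
lam = 3.0                       # part (a): E[theta] = 1/lam = 1/3
shape, rate = 1.0, lam          # Exp(lam) prior written as Gamma(1, lam)
data = [5, 6, 4]
shape, rate = shape + len(data), rate + sum(data)  # posterior: Gamma(4, 18)
print(shape / rate)             # part (d): E[theta | D] = 4/18 ≈ 0.222
```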

34. Two fair dice are rolled. Someone tells you that the sum of the two scores is 9. What is the posterior
distribution of the dice scores (s1 , s2 )?
Assume that:

• The two dice are fair (i.e., each face from 1 to 6 is equally likely).
• You believed the scores were independent before receiving this information.
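
Problem 34 can be solved by brute-force enumeration; a sketch using only the standard library:

```python
# Condition the uniform prior over ordered dice pairs on the event s1 + s2 = 9.
from itertools import product

pairs = [(s1, s2) for s1, s2 in product(range(1, 7), repeat=2) if s1 + s2 == 9]
posterior = {pair: 1 / len(pairs) for pair in pairs}
print(posterior)  # {(3, 6): 0.25, (4, 5): 0.25, (5, 4): 0.25, (6, 3): 0.25}
```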

35. Suppose we toss a coin N = 10 times and observe N1 = 9 heads.


Let the null hypothesis H0 be that the coin is fair: θ = 0.5. Let the alternative hypothesis H1 be that
the coin has an unknown bias: θ ∼ Uniform(0, 1).
Assume that before observing the data, the two hypotheses are equally likely:

P (H0 ) = P (H1 ) = 0.5


(a) Compute the Bayes factor B10 = P (D|H1 ) / P (D|H0 ).
(b) Compute the posterior probabilities of H0 and H1 given the data.
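
A sketch for checking Problem 35 (assuming SciPy; the marginal likelihood under H1 reduces to a Beta function):

```python
# Bayes factor for 9 heads in 10 tosses: fair coin vs. uniform prior on the bias.
from scipy.stats import binom
from scipy.special import comb, beta as beta_fn

N, N1 = 10, 9
p_d_h0 = binom.pmf(N1, N, 0.5)                      # ≈ 0.00977
p_d_h1 = comb(N, N1) * beta_fn(N1 + 1, N - N1 + 1)  # = C(10,9) * B(10, 2) = 1/11
b10 = p_d_h1 / p_d_h0
print(b10)              # ≈ 9.31
print(b10 / (1 + b10))  # P(H1 | D) ≈ 0.90 under equal priors
```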

36. Posterior Distribution:


Suppose we toss a coin n = 5 times. Let X be the number of heads. We observe that there are fewer
than 3 heads, but we don’t know exactly how many. Assume a uniform prior on the probability of heads:

p(θ) = Beta(θ | 1, 1)

Compute the posterior distribution of θ given this partial information.
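
Because only the event "fewer than 3 heads" is observed, the posterior is proportional to P(X < 3 | θ) times the flat prior, a mixture of Beta densities rather than a single Beta. A grid sketch (assuming NumPy and SciPy) for checking the result:

```python
# Posterior over theta given the partial observation X <= 2 out of n = 5 tosses.
import numpy as np
from scipy.stats import binom

theta = np.linspace(0, 1, 10_001)
d = theta[1] - theta[0]
unnorm = binom.cdf(2, 5, theta)     # P(X <= 2 | theta) times the uniform prior
post = unnorm / (unnorm.sum() * d)  # normalize on the grid
print((theta * post).sum() * d)     # posterior mean -> 2/7 ≈ 0.286
```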

37. Posterior Odds and Bayes Factor — Signal Detection


In a particle detector experiment, a sensor gives a strong signal. Two hypotheses are considered:

• H1 : Background noise — strong signal occurs with probability 0.02


• H2 : True particle event — strong signal occurs with probability 0.5

The prior belief favors background noise: P (H1 ) = 0.9, P (H2 ) = 0.1
(a) Compute the Bayes factor B21 .
(b) Compute the posterior odds in favor of H2 .

38. Posterior Complements


Two competing hypotheses H1 and H2 are being considered. Assume that exactly one of these models
is true — i.e., the hypotheses are mutually exclusive and collectively exhaustive.
After observing data D, it is found that the posterior probability of H2 is:

P (H2 | D) = 0.8

(a) Compute the posterior probability P (H1 | D).
(b) Compute the posterior odds ratio P (H2 |D) / P (H1 |D).

39. Bayesian Posterior from Bayes Factor


Assume that two models, H1 and H2 are mutually exclusive and collectively exhaustive. The prior
probabilities of these models are given as:

P (H1 ) = 0.6, P (H2 ) = 0.4

Suppose the Bayes factor in favor of H2 over H1 is:

B21 = 4
(a) Compute the posterior odds P (H2 |D) / P (H1 |D)
(b) Compute the posterior probabilities P (H1 | D) and P (H2 | D)

40. Credible Interval for Beta Posterior


The posterior distribution for the bias θ of a coin is modeled as:

θ ∼ Beta(9, 3)

Using the table of quantiles below,

Cumulative Probability 0.05 0.10 0.20 0.40 0.60 0.80 0.90 0.95
Quantile Value 0.530 0.569 0.621 0.708 0.788 0.857 0.901 0.930

(a) compute the 80% central credible interval.


(b) Briefly explain the meaning of this interval in the Bayesian context.
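
The tabulated quantiles can be cross-checked directly (a one-line sketch assuming SciPy):

```python
# 80% central credible interval for theta ~ Beta(9, 3): the 0.10 and 0.90 quantiles.
from scipy.stats import beta

post = beta(9, 3)
print(post.ppf(0.10), post.ppf(0.90))  # ≈ (0.569, 0.901), matching the table
```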

B. Conceptual Questions
1. What is the fundamental philosophical difference between Bayesian and frequentist interpretations of
probability?

2. Why is the posterior probability a valid probability distribution?

3. How does the choice of prior influence the posterior in Bayesian inference?

4. What is a conjugate prior, and why is it useful?

5. Explain the concept of marginal likelihood and its role in Bayesian model comparison.

6. Why is Bayesian inference particularly useful in small data regimes?

7. What is the principle of maximum entropy, and how is it used in Bayesian inference?

8. When does the Beta distribution serve as a conjugate prior?

9. How do Bayes factors help in hypothesis testing?

C. Derivation-Based Questions
1. Derive Bayes’ Theorem from the definition of conditional probability.

2. Show that the Beta distribution is the conjugate prior for the binomial likelihood.

3. What does the denominator in Bayes’ Theorem represent? Why is it usually ignored?

4. Prove that the sum of independent Poisson random variables is also Poisson.

5. Derive the posterior for λ in a Poisson model with a Gamma prior.

6. Compute the marginal likelihood for a binomial likelihood and Beta prior.

7. Derive the posterior predictive distribution for a Bernoulli model with a Beta prior.
8. Show that the posterior mode of a Beta(α, β) distribution is (α − 1)/(α + β − 2), for α, β > 1.

9. Prove the law of total probability using mutually exclusive events.

10. Derive the posterior mean for Gaussian likelihood with known variance and Gaussian prior.

11. What is the conjugate prior for a Gaussian likelihood when both the mean and the variance are unknown?
With this prior, derive (i) the posterior density function and (ii) the marginal distribution of the mean.

12. Deriving Posterior from Bayes Factor and Priors


Let H1 and H2 be two competing models. Assume that these models are mutually exclusive (only one
can be true) and collectively exhaustive (one of them must be true).
They have prior probabilities P (H1 ) and P (H2 ), respectively. Let B21 denote the Bayes factor in favor
of H2 over H1 , defined as:
B21 = P (D | H2 ) / P (D | H1 )
(a) Show that the posterior odds ratio is given by:

P (H2 | D) / P (H1 | D) = B21 · P (H2 ) / P (H1 )

(b) Hence, derive the formula for the posterior probability P (H2 | D) in terms of the Bayes factor and
the prior probabilities.

