
ANEKANT EDUCATION SOCIETY’S

TULJARAM CHATURCHAND COLLEGE OF ARTS, SCIENCE AND COMMERCE, BARAMATI
AUTONOMOUS

QUESTION BANK

FOR

M.Sc(Sem-II)

STATISTICS

STAT- 4202: Parametric Inference

(With effect from June 2019)


Unit-1:
For 2 Marks:

Q1. Define the following terms with one illustration.

• Sufficient estimator of a parameter, using the conditional probability approach.
• Sufficiency.
• Sufficient statistic.
• Joint sufficiency.
• Likelihood equivalence.
• Minimal sufficient statistic.
• One-parameter exponential family.
• Multi-parameter exponential family.
Q2. Choose the correct alternative in each of the following:
• Let X1, X2 be a random sample from a Poisson distribution with mean θ. Then E[(X1 − X2)²] is
a) 2θ b) θ² c) θ d) None of these
• Let X1, X2, …, Xn be a random sample from N(θ, θ), where θ is unknown. Then which of the following statements is not true?
a) Σ Xi² is sufficient for θ.
b) Σ Xi is sufficient for θ.
c) (Σ Xi, Σ Xi²) is jointly sufficient for θ.
d) A sufficient statistic does not exist.

• Which of the following does not belong to the exponential family of distributions?
a) f(x, θ) = (1/θ) e^{−x/θ}, x > 0
b) f(x, θ) = e^{−(x−θ)}, x > θ
c) f(x, θ) = (5x⁴/θ) e^{−x⁵/θ}, x > 0
d) f(x, θ) = (x²/(2θ³)) e^{−x/θ}, x > 0
• Let X1 and X2 be a random sample from a Poisson(θ) distribution. Then the number of unbiased estimators of θ is:
a) infinity b) 3 c) 2 d) 4
• Consider the Pitman family of distributions {f(x, θ), θ ∈ Ω} with
f(x, θ) = u(x)/v(θ) if a(θ) < x < b(θ), and 0 otherwise,
where u(x), v(θ) > 0. Suppose a(θ) is an increasing and b(θ) a decreasing function of θ. Then the minimal sufficient statistic based on a random sample of size n is given by:
a) max{a⁻¹(X(1)), b⁻¹(X(n))} c) max{a⁻¹(X(n)), b⁻¹(X(1))}
b) min{a⁻¹(X(n)), b⁻¹(X(1))} d) min{a⁻¹(X(1)), b⁻¹(X(n))}

• Two random samples X and Y of sizes n and m respectively from Exp(θ) are likelihood equivalent if and only if
a) Σ_{i=1}^n Xi ≠ Σ_{i=1}^m Yi b) Σ_{i=1}^n Xi > Σ_{i=1}^m Yi
c) Σ_{i=1}^n Xi < Σ_{i=1}^m Yi d) Σ_{i=1}^n Xi = Σ_{i=1}^m Yi
• Let the random variable X follow U(θ, θ+1). Then which of the following statements is not correct?
a) X(1) is sufficient for θ.
b) (X̄ − 1/2) is an unbiased estimator of θ.
c) UMVUE will not exist for θ.
d) Any value of θ in the interval [X(n) − 1, X(1)] is a maximum likelihood estimator.

For 4 Marks:

1. Let X1, X2 be i.i.d. N(θ, 1) random variables. Show that (mX1 + nX2) is sufficient for θ if and only if m = n.
2. Define a sufficient statistic and state the Neyman factorization criterion for it. Prove the result in the discrete case.
3. Consider the following p.m.f.: P(X = 0) = e^{−α}, P(X = 1) = αe^{−α} and P(X = 2) = 1 − e^{−α} − αe^{−α}.
Given a random sample of size 2, show that X1 + X2 is not sufficient for α.
4. Define the one-parameter exponential family with parameter θ and obtain a minimal sufficient statistic for θ.
5. Check whether the following distribution belongs to the exponential family. Justify your answer:
f(x, θ) = (1/2) e^{−|x−θ|}, x ∈ ℝ, θ ∈ ℝ.
6. Let X1, X2, …, Xn be a random sample of size n having the following probability density function:
f(x, θ) = 3θ³/x⁴, 0 < θ < x < ∞.
Show that X(1) is a minimal sufficient statistic for θ and also find its probability density function.
7. Show that N(θ, σ²) is a member of the multi-parameter exponential family when both θ and σ² are unknown.
8. Check whether the following distribution belongs to the exponential family. Justify your answer:
f(x, θ) = 1/(π[1 + (x − θ)²]), x ∈ ℝ, θ ∈ ℝ.
9. Let X1, X2, X3 be i.i.d. Bernoulli(p) and S1 = (X1, X2, X3), S2 = (X1 + X2, X3), S3 = (X1 + X2 + X3). Check whether S1, S2, S3 form sufficient partitions.
10. Let X1, X2 be i.i.d. U(θ, θ+1), θ ∈ ℝ. Find a minimal sufficient statistic for θ.
11. Show that for a sample of size 3 from Poisson(λ), (X1, X2 + X3) is sufficient for λ but not minimal sufficient for λ.
12. If T is a sufficient statistic for θ, show that ψ(T) is also a sufficient statistic for θ when ψ is a one-to-one onto function.
13. Suppose X1, X2, …, Xn is a random sample from a beta distribution of the first kind with parameters (θ, 1). Show that T = Σ log Xi is a sufficient statistic for θ.
14. Check whether the following distributions are members of the one-parameter exponential family (a sample check for case (i) is sketched after this list):
i) Ber(θ) and Bin(n, θ), where n is known.
ii) P(θ) and Geo(θ).
iii) Discrete Uniform {x = 1, 2, …, N} and continuous U(0, θ).
iv) Exp(θ) and N(θ, 1).
v) Cauchy(θ, 1) and X ~ Laplace(θ) with pdf f(x, θ) = (1/2) e^{−|x−θ|}, x ∈ ℝ, θ ∈ ℝ.
vi) X ~ Laplace(θ) with pdf f(x, θ) = (1/(2θ)) e^{−|x|/θ}, x ∈ ℝ, θ > 0.
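For illustration, one way such a membership check typically proceeds (a sketch for case (i), Ber(θ); the component names c(θ), Q(θ), T(x), h(x) are the usual exponential-family notation and are assumed here, not fixed by the question bank):
\[
f(x,\theta) = \theta^{x}(1-\theta)^{1-x}
= (1-\theta)\exp\Big\{ x \log\tfrac{\theta}{1-\theta} \Big\},
\qquad x \in \{0,1\},\ 0<\theta<1,
\]
which has the one-parameter exponential form \( c(\theta)\, e^{Q(\theta)T(x)}\, h(x) \) with \( c(\theta)=1-\theta,\ Q(\theta)=\log\tfrac{\theta}{1-\theta},\ T(x)=x,\ h(x)=1 \); hence Ber(θ) belongs to the family.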

For 6 Marks:

1. Let X1, X2, …, Xn be a random sample from N(θ1, θ2). Show that (Σ Xi, Σ Xi²) forms a minimal sufficient statistic for θ = (θ1, θ2).
2. Define the m-parameter exponential family. Let X1, X2, …, Xn be a random sample from an m-parameter exponential family; then obtain a minimal sufficient statistic for the parameter θ = (θ1, θ2, …, θm)′.
3. Define the Pitman family and prove the following result: if a(θ) is increasing and b(θ) is decreasing in θ, then min{a⁻¹(X(1)), b⁻¹(X(n))} is a minimal sufficient statistic for θ.
4. Define a sufficient partition. Let X1, X2 be i.i.d. N(θ, 1); then show that T = lX1 + mX2 is sufficient iff l = m.
5. Show that N(θ, σ²) is a member of the multi-parameter exponential family when both θ and σ² are unknown. Hence obtain a minimal sufficient statistic for (θ, σ²) based on a random sample of size n drawn from N(θ, σ²).
6. Define the Pitman family and prove the following result: if a(θ) is decreasing and b(θ) is increasing in θ, then max{a⁻¹(X(1)), b⁻¹(X(n))} is a minimal sufficient statistic for θ.
7. Define likelihood equivalence and explain its usefulness in obtaining a minimal sufficient statistic.
8. Define a sufficient statistic. Give one example of a statistic that is sufficient and one which is not (with justification using the definition only).
9. State and prove the Neyman factorization theorem for a parametric family of discrete random variables. Hence check whether (i) X1 + X2 and (ii) X1 + 2X2 are sufficient, where X1 and X2 are i.i.d. with p.m.f. f(x, θ) = (1 − θ)θˣ, x = 0, 1, 2, …, 0 < θ < 1. (A sketch of the factorization step for (i) appears at the end of this section.)
10. Distinguish between a sufficient statistic and a minimal sufficient statistic.
Discuss the relationship between them.
11. State the Neyman factorization criterion for a statistic T(x1, …, xn) to be sufficient for the family {L(x1, …, xn; θ), x ∈ S, θ ∈ Ω}, and using this show that X(n) is sufficient for θ for a random sample (r.s.) of size n from U(0, θ), θ > 0.
12. Let {f(x, θ), θ ∈ Ω} be a family of probability density functions such that
f(x, θ) = u(x)/v(θ), a(θ) < x < b(θ),
= 0, otherwise.
State the minimal sufficient statistics in each of the following cases:
(i) a(θ) = a (constant)
(ii) b(θ) = b (constant)
13. Let X(1), X(2), …, X(n) be an ordered sample from {U(0, θ), θ > 0}. Find a minimal sufficient statistic for θ.
14. Let X1, X2, X3 and X4 be i.i.d. N(θ, 1), θ ∈ ℝ. Show that
(i) (X1 + X2) is not sufficient,
(ii) (X3 + X4) is not sufficient,
(iii) ((X1 + X2), (X3 + X4)) is not sufficient,
(iv) (X1 + X2 + X3 + X4) is sufficient.
15. Let X1, X2, …, Xn be i.i.d. U(θ − 1, θ + 1). Show that (X(1), X(n)) is sufficient for θ but not complete. Is (X(1), X(n)) minimal sufficient?
16. Let X1, X2, …, Xn be a random sample from N(θ, 1); then show that T = X1 + X2 + X3 is a sufficient statistic for θ.
17. Let X1, X2, …, Xn be a random sample with density belonging to the class {f(x, θ), θ ∈ Ω} which forms an exponential family; then prove that the joint distribution of X1, X2, …, Xn is also a member of the one-parameter exponential family.

18. Let X1, X2, …, Xn be a random sample from a Pareto distribution with density function f(x, α, θ) = αθ^α / x^{α+1}, x ≥ θ, θ > 0, α > 2. Find a sufficient statistic when (i) α is known, (ii) θ is known and (iii) when both are unknown.
Let X1, X2, …, Xn be a random sample from a random variable X with density function f(x, θ) = θ/(1 + x)^{θ+1}, x > 0, θ > 0. Find a minimal sufficient statistic for θ.
19. Let X1, X2, …, Xn be a random sample from a random variable X with density function f(x, α, θ) = (αx^{α−1}/θ) e^{−x^α/θ}, x > 0, α, θ > 0. Find a sufficient statistic when (i) α is known, (ii) θ is known and (iii) when both are unknown.
20. Let X1, X2, …, Xn be a random sample from a random variable X with density function f(x, θ) = (1/2) e^{−|x−θ|}, x ∈ ℝ, θ ∈ ℝ. Find a minimal sufficient statistic for θ.
21. Let X1, X2, …, Xn be a random sample from a random variable X with pmf f(x, θ) = θ(1 − θ)^{x−1}, x = 1, 2, …, 0 < θ < 1. Find a minimal sufficient statistic for θ.
22. Let X1, X2, …, Xn be a random sample from a N(μ, σ²) distribution. Show that (X̄, S²) is a minimal sufficient statistic.
23. Let X1, X2, …, Xn be a random sample from a random variable X with density function f(x, θ) = θ/(1 + x)^{θ+1}, x > 0, θ > 0. Find a minimal sufficient statistic for θ.
24. Check whether the following distribution belongs to a two-parameter exponential family:
f(x, y) = (x choose y) p^y (1 − p)^{x−y} · e^{−θ} θ^x / x!;  y = 0, 1, …, x;  x = 0, 1, …;  0 < p < 1;  θ > 0.
25. Check whether the following distribution belongs to a two-parameter exponential family:
f(x, α, θ) = [1/(Γ(α) θ^α)] x^{α−1} e^{−x/θ}, x > 0, α, θ > 0,
= 0 otherwise.
26. Let X1, X2, …, Xn be a random sample of size n from a U(θ, θ+1) distribution. Obtain a minimal sufficient statistic for θ.
27. Check whether the following distributions are members of the one-parameter exponential family:
(i) X ~ Laplace(θ) with pdf f(x, θ) = (1/2) e^{−|x−θ|}, x ∈ ℝ, θ ∈ ℝ.
(ii) X ~ Laplace(θ) with pdf f(x, θ) = (1/(2θ)) e^{−|x|/θ}, x ∈ ℝ, θ > 0.
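A sketch of the factorization step asked for in question 9(i) of this section (assuming the geometric-type p.m.f. stated there; the constant function h below is introduced only for illustration):
\[
f(x_1,\theta)\, f(x_2,\theta) = (1-\theta)^{2}\,\theta^{x_1+x_2}
= g_{\theta}(x_1+x_2)\, h(x_1,x_2), \qquad h(x_1,x_2) \equiv 1,
\]
so the joint p.m.f. depends on (x1, x2) only through x1 + x2, and by the Neyman factorization criterion T = X1 + X2 is sufficient for θ. The same factorization does not go through with x1 + 2x2 in place of x1 + x2, which is the point of part (ii).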

Unit-2:

For 2 marks:
Q1. Define the following terms:

• Fisher information function.
• Fisher information matrix.
• Unbiased estimator.
• Estimable function.
• Minimum variance unbiased estimator.
• Minimum variance bound unbiased estimator.
• Ancillary statistic.
• Complete family.
• Complete sufficient statistic.
• Cramer-Rao inequality.
Q2. Choose the correct alternatives of the following:
• Consider the Cramer family of distributions {f(x, θ), θ ∈ Ω}. Let T be unbiased for ψ(θ). Which of the following is the Cramer-Rao inequality?
a) P(|T − θ| ≥ k) ≤ Var(T)/k²
b) Var(T) ≥ [ψ′(θ)]² / I_X(θ)
c) P(sup |T − ψ(θ)| ≥ k) ≤ Var(T)/k²
d) Var(T) ≤ [ψ′(θ)]² / I_X(θ)
• The Cramer-Rao inequality concerns
a) Probability outside the critical region
b) Variance of an unbiased estimator
c) Bound on the power of a UMP test
d) Probability of the union of two random events.
• Let X1 and X2 be a random sample from Ber(θ). Then which of the following is not estimable?
a) θ(1 − θ²) b) θ² c) θ d) 1 − θ²
• Which of the following statements is not true?
a) The Minimum Variance Unbiased Estimator (MVUE) is unique.
b) The Minimum Variance Bound Unbiased Estimator (MVBUE) always exists.
c) Let X1, ..., Xn be a random sample from N(θ, 1). Then Σ xi is sufficient for θ.
d) Likelihood equivalence leads to minimal sufficient statistics.
• Let X be a random variable with probability density function f(x, θ) = e^{−(x−θ)}, x ≥ θ, θ ∈ (−∞, ∞). Then the MLE of θ is
a) (Σ_{i=1}^n Xi)/n b) (Σ_{i=1}^n Xi)/n − 1 c) X(n) d) X(1)
• Let f(x, θ) be the probability density function of a random variable X for which differentiation under the integral sign is permissible. Then E[(∂/∂θ) log f(x, θ)] is equal to (a sketch of this identity is given after this list)
a) 0 b) I(θ) c) 1 d) Var(X)

• Let X and Y be i.i.d. Poisson random variables with mean θ/2. Then
a) T1 = (X − Y)³ is an unbiased estimator of θ.
b) T2 = (X − Y)² is an unbiased estimator of θ.
c) T3 = (X − Y) is an unbiased estimator of θ.
d) An unbiased estimator of θ does not exist.
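For reference, the standard regularity argument behind the score-function identity in the list above (a sketch; it assumes the order of differentiation and integration may be interchanged, as the question itself stipulates):
\[
E\!\left[\frac{\partial}{\partial\theta}\log f(X,\theta)\right]
= \int \frac{\partial f(x,\theta)/\partial\theta}{f(x,\theta)}\, f(x,\theta)\,dx
= \frac{\partial}{\partial\theta}\int f(x,\theta)\,dx
= \frac{\partial}{\partial\theta}(1) = 0 .
\]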

For 4 Marks:

1. Show that an unbiased estimator T of ψ(θ) is the MVUE if and only if T is uncorrelated with every unbiased estimator of zero.
2. Let X1, X2 be a random sample of size two from P(θ). Is ψ(θ) = θ² estimable? Justify.
3. Define a complete sufficient statistic. Show that for the Poisson distribution with parameter θ, T = Σ Xi is a complete statistic for θ.
4. Define MVBUE. Under regularity conditions show that if T is the MVBUE of θ then it is sufficient for θ.
5. Let X1, X2, …, Xn be a random sample from P(λ), λ > 0 and n > 1. Find an unbiased estimator of λ²; based on this unbiased estimator, obtain the uniformly minimum variance unbiased estimator of λ².
6. Obtain the information function of the geometric distribution with parameter p.
7. Show that for a random sample of size n from Poisson with mean λ, λ > 0, T = Σ_{i=1}^n Xi is complete for λ.
8. Let X1, …, Xn be i.i.d. U(0, θ) and let T = max(X1, …, Xn), n ≥ 2. Find an unbiased estimator of ψ(θ) = 1/θ based on T.
9. Define a complete statistic. Show that for the Poisson distribution with parameter θ, T = Σ_{i=1}^n Xi is complete for θ.
10. Let X1, X2 be a random sample from Bernoulli(p); then show that ψ(p) = p³ is not estimable. (A sketch of the standard argument follows this list.)
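A sketch of the standard argument behind question 10 above (assuming the usual definition of estimability, i.e., the existence of an unbiased estimator):
\[
E_p\big[T(X_1,X_2)\big] = \sum_{x_1=0}^{1}\sum_{x_2=0}^{1} T(x_1,x_2)\, p^{x_1+x_2}(1-p)^{2-x_1-x_2},
\]
which, for any statistic T, is a polynomial in p of degree at most 2; hence it cannot equal p³ for all p ∈ (0, 1), so ψ(p) = p³ is not estimable from a sample of size two.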

For 6 Marks:

1. State and prove necessary and sufficient condition for existence of MVUE.
2. State and prove necessary and sufficient condition for existence of
MVBUE.
3. Let (X1, X2, …, Xn) be i.i.d. b(1, θ). Show that T1(X1, X2) = 1 if X1 = 1 and X2 = 1, and zero otherwise, is an unbiased estimator of θ², and obtain the Rao-Blackwellized version of T1 w.r.t. T = Σ_{i=1}^n Xi, which is known to be sufficient for θ.
4. State and prove the Rao-Blackwell theorem.
5. State and prove the Lehmann-Scheffé theorem.
6. State and prove Basu's theorem.
7. State and prove the Cramer-Rao inequality.
8. Let X have a N(0, θ²) distribution. Show that X is not complete but X² is complete.
9. Let X ~ Bin(n, p), where n is known. Find the MVUE of p² and p(1 − p).
10. Let X1, X2, …, Xn be a random sample from a P(θ) distribution. Find the MVUE of P[X ≥ 1].
11. Let X1, X2, …, Xn be a random sample from a random variable X with pmf f(x, θ) = θ(1 − θ)^{x−1}, x = 1, 2, …, 0 < θ < 1. Find the MVUE of θ.
12. Define a complete family of distributions. Show that {N(θ, 1), θ ∈ (−∞, ∞)} is a complete family of distributions.
13. Let X have a N(0, θ²) distribution. Show that X is not complete but X² is complete.
14. Let X1, X2, …, Xn be a random sample from a random variable X with pmf f(x, θ) = θ(1 − θ)^{x−1}, x = 1, 2, …, 0 < θ < 1. Find the MVUE of θ.
15. Let X1, X2, …, Xn be a random sample from a Poisson(θ) distribution. Suppose ψ(θ) = e^{−θ} is the parametric function of interest. Then show that ψ(θ) is an estimable function but the MVBUE of ψ(θ) does not exist. Suppose
T1 = 1 if X1 = 0,
= 0 if X1 ≠ 0.
Let M = Σ Xi; then carry out the Rao-Blackwellization of T1 with respect to M.

16. Define the Fisher information in a sample, I_X(θ), and the information in a statistic, I_T(θ). Show that I_T(θ) ≤ I_X(θ).
17. Let X1, X2, …, Xn be a random sample from N(θ, 1) and ψ(θ) = θ². Give an unbiased estimator of ψ(θ). Examine whether its variance attains the Cramer-Rao lower bound.
18. Let X1, X2, …, Xn be a random sample of size n ≥ 3 from Bernoulli(p). Obtain an unbiased estimator of the parametric function p²(1 − p). Hence, find the UMVUE of p²(1 − p).
19. Let X1, X2, …, Xn be a random sample from an exponential distribution with density function f(x, θ) = (1/θ) e^{−x/θ}, x > 0, θ > 0. Show that Y = Σ_{i=1}^n Xi is complete.
20. Show that the one-parameter exponential family of distributions is a complete family of distributions.
21. Define a complete statistic. If {f(x, θ), θ ∈ (−∞, ∞)} is a one-parameter exponential family, then show that T = Σ_{i=1}^n K(xi) is complete for θ.

22. Let (X1, X2, …, Xn) be i.i.d. b(1, θ). Show that T1(X1, X2) = 1 if X1 = 1 and X2 = 1, and zero otherwise, is an unbiased estimator of θ², and obtain the Rao-Blackwellized version of T1 w.r.t. T = Σ_{i=1}^n Xi, which is known to be sufficient for θ.
23. Let X1, X2, …, Xn be i.i.d. U(θ − 1, θ + 1). Show that (X(1), X(n)) is sufficient for θ but not complete. Is (X(1), X(n)) minimal sufficient?

Unit-3:
For 2 Marks:

Q1. Define the following terms with one illustration.

• Critical region
• Test function
• OC curve
• Level of significance
• Size of the test
• MP test
• UMP test
• MLR property
• UMPU test
• Type I error
• Type II error
• Composite hypothesis
Q2. Choose the correct alternatives of the following:
• Let X1, X2, ..., Xn be a random sample from N(θ, 1); then the SELCI of level (1 − α) for θ is:
a) (X̄ − Z_{1−α/2}/√n, X̄ + Z_{1−α/2}/√n) c) (X̄ − Z_{1−α/2}/√n, X̄ + Z_{α/2}/√n)
b) (X̄ − Z_{α/2}/√n, X̄ + Z_{α/2}/√n) d) (X̄ − Z_{α/2}/√n, X̄ + Z_{1−α/2}/√n)
• Let X ~ N(μ, 4); then a 95% confidence interval for μ is
a) (X − 2, X + 2) b) (X − 0.75, X + 0.75)
c) (X − 3.92, X + 3.92) d) (X − 4, X + 4)
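A quick numerical check of the arithmetic behind alternative (c) of the second MCQ above (a sketch only; it assumes the familiar z-interval X ± z_{0.975}·σ for a single observation with σ² = 4):

    # check_ci_width.py -- verify the half-width 1.96 * sqrt(4) = 3.92
    from scipy.stats import norm

    sigma = 2.0                      # sqrt(4)
    z = norm.ppf(0.975)              # upper 2.5% point of N(0, 1), about 1.96
    half_width = z * sigma
    print(round(half_width, 2))      # prints 3.92, matching option (c)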

For 4 Marks:

1. Define MLR property and show that distributions belonging to the one
parameter exponential family possess this property.
2. State the Neyman-Pearson lemma for testing a simple hypothesis against a simple alternative. Prove the sufficiency part for 0 < α < 1.

3. Define a test function. Distinguish between a randomized and a non-randomized test and explain the advantage in using the former.
4. Find the size and power of these tests. Which one will you prefer? Why?
5. Define UMP and UMPU test.
6. Find the UMP level α test for testing H0: θ = θ0 vs H1: θ < θ0, when a random sample is taken from the {U(0, θ), θ > 0} distribution.
7. Let f(x, θ) = θe^{−x} + (1 − θ)xe^{−x}, x > 0, 0 < θ < 1. On the basis of a sample of size one, obtain the MP test of size α = e^{−1} to test H0: θ = 1 vs H1: θ = 1/2.
8. Explain the terms: rejection region, errors of two types. Why can the probabilities of the two types of errors not be minimized simultaneously?
9. Let X1, …, Xn be i.i.d. exponential random variables with mean θ. Obtain the most powerful test for H0: θ = θ0 against H1: θ = θ1 (< θ0).
10. Consider a sample of size 1. Obtain a most powerful size α test for H0: f0(x) ~ N(0, 1) against H1: f1(x) ~ Cauchy(0, 1). Also obtain its power.
11. If X is a random variable having pdf f(x, θ) = (1/2) e^{−|x−θ|}, x ∈ ℝ, θ ∈ ℝ, show that the family has the MLR property in x.
12. Give an example (with brief justification) in which a UMP test exists against a two-sided alternative.
13. Obtain the MP level α test for testing H0: θ = θ0 vs H1: θ = θ1 (θ1 > θ0), based on a random sample of size n from an exponential distribution with mean θ.
14. Show that there does not exist a UMP test for testing H0: θ = 0 against H1: θ ≠ 0 at level α, when (x1, …, xn) is a r.s. from N(θ, 1).
15. Show that the same test as given in the above question continues to be a UMP level α test for testing H0: θ ≤ 1 vs H1: θ > 1.

For 6 Marks:

1. State the Neyman-Pearson lemma Part A and Part B and prove Part A.


2. Let X be a discrete r.v. with pmf given by f0(x) = 0.05 for x = 1, 2, …, 20 under H0, and under H1
f1(x) = 0.06 for x = 1,
= 0.15 for x = 2, 3,
= 0.10/17 for x = 4, 5, …, 20.
3. Define φ1(x) = 1 if x = 1, 2 and zero otherwise, and φ2(x) = 1/2 if x = 2, 3,
= 1 if x = 1,
= 0 otherwise.
4. Show that both φ1 and φ2 are MP tests of level α = 0.10 with the same power. Do φ1, φ2 satisfy the NP lemma?
5. For testing a composite null hypothesis H0: θ ∈ Ω_{H0} vs H1: θ ∈ Ω_{H1}, define (i) the size function, (ii) the power function, (iii) a level α test and (iv) a UMP level α test when H1 is also composite.
6. For testing H0: θ = 1 vs H1: θ > 1 on the basis of a sample of size n on f(x, θ) = θe^{−θx}, x > 0, θ > 0, obtain the UMP level α test and show that its power function is monotone.
7. Define a UMP test. Obtain such a test for a U(0, θ) r.v. based on a random sample of size n, for testing H0: θ = θ0 against H1: θ > θ0. How is the test modified if the alternative is left-sided?
8. Sketch the power curve of the UMP test for H0: θ = θ0 against H1: θ > θ0 for a normal variate with mean θ and variance 1. Give brief justification.
9. Define MLR property and illustrate with example as well as counter
example. How is it useful in deriving optimal tests?
10. Show that a UMP test does not exist for a two sided alternative, to test a
simple hypothesis about the population mean of a normal r.v. with known
variance.
11. Let X1, …, Xn be independent and identically distributed random variables with p.d.f. f(x, θ), θ ∈ Ω. Consider the problem of testing H0: θ = θ0 against H1: θ ≠ θ0. Does a UMP test exist for the above problem? If yes, give an example of a distribution for which it exists and derive the UMP test for the same.
12. State the Neyman-Pearson fundamental lemma for test functions.
13. Let X1, …, Xn be i.i.d. random variables from f(x, θ), θ ∈ Ω = {θ0, θ1}. Let H0: θ = θ0 and H1: θ = θ1. Show that for every α, 0 < α < 1, there exist a k ≥ 0 and a test φ_k(x) for which E[φ_k(x)] = α, where
φ_k(x) = 1 if L1(x) > kL0(x),
= γ if L1(x) = kL0(x),
= 0 otherwise.
14. Let f(x, θ) = 1/[π(1 + (x − θ)²)], x ∈ ℝ, and suppose it is desired to test H0: θ = 0 vs H1: θ = 1 on the basis of a single observation. Show that the test ψ(x) = 1 if 1 < x < 3 and zero otherwise is the MP test of its size.
15. Let f(x, θ) = θx^{θ−1}, 0 < x < 1, θ > 0. Let (X1, …, Xn) be a random sample of size n on {f(x, θ), θ > 0}. Obtain the UMP level α test for testing H0: θ ≤ 1 vs H1: θ > 1.
16. Define the monotone likelihood ratio property and check whether f(x, θ) = (1/2) exp{−|x − θ|}, x ∈ ℝ, θ ∈ ℝ, has this property.
17. Show that a UMP level α test does not exist for testing H0: θ = θ0 vs H1: θ ≠ θ0, based on a random sample of size n from {N(θ, 1), θ ∈ ℝ}. Suggest a UMP unbiased test for the same problem.
18. Let {Xi}_{i=1}^n be i.i.d. with distribution f(x, θ), θ ∈ Ω. Suppose it is desired to test H0: θ = θ0 vs H1: θ = θ1. Show that any test φ_k(x), k ≥ 0, of the form
φ_k(x) = 1 if L(x; θ1) > kL(x; θ0),
= γ if L(x; θ1) = kL(x; θ0),
= 0 if L(x; θ1) < kL(x; θ0),
is an MP test of size E[φ_k(X) | θ0].
19. What is the MLR property? Using the MLR property, obtain the UMP test for testing H0: θ = θ0 vs H1: θ > θ0, using a random sample of size n from f(x, θ) = e^{−(x−θ)}, x > θ.
20. Let X follow the p.d.f. f(x) = θx^{θ−1}, 0 < x < 1, θ > 0. Derive the UMP test for H0: θ = 1 against H1: θ < 1 based on a sample of size n. (A numerical sketch of a cutoff for a test of this form appears at the end of this section.)
21. Derive a most powerful test based on a single observation on a r.v. X for H0: X follows a Weibull distribution with p.d.f. x exp(−x²/2) against H1: X follows a folded normal distribution with p.d.f. √(2/π) exp(−x²/2). Sketch the two density functions. Explain why the critical region obtained above is intuitively reasonable.
22. Define a UMP test. Explain how the MLR property is useful in deriving such a test. Show that an MP test is always unbiased. Hence sketch the power function for testing H0: the mean of a normal distribution with unit variance is zero, against a one-sided alternative. Explain briefly why a UMP test does not exist against a two-sided alternative.
23. Let X be a discrete random variable with pmf under H0 and H1 given by:
x:        1     2     3     4
P0(x):  0.45  0.05  0.05  0.45
P1(x):  0.20  0.30  0.30  0.20
Define φ1(x) = 1 if x = 2, and 0 if x = 1, 3, 4; and φ2(x) = 1 if x = 3, and 0 if x = 1, 2, 4.
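A numerical sketch related to question 20 above (an assumption-laden illustration, not a derivation: it takes for granted that the rejection region for H0: θ = 1 vs H1: θ < 1 ends up being of the form −Σ log Xi > c, with −Σ log Xi distributed as Gamma(n, 1) under H0; establishing that form is the actual exercise):

    # ump_cutoff_sketch.py -- critical value for -sum(log Xi) under H0: theta = 1
    from scipy.stats import gamma

    n = 10          # sample size (illustrative value)
    alpha = 0.05    # level of the test (illustrative value)

    # Under H0 the Xi are U(0, 1), so -log(Xi) ~ Exp(1) and the sum is Gamma(n, 1).
    c = gamma.ppf(1 - alpha, a=n)   # reject H0 when -sum(log Xi) exceeds c
    print(c)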
Unit-4:
For 2 Marks:

Q1. Define the following terms with one illustration.

• Confidence Interval.
• Shortest Expected Length C.I.
• Uniformly Most Accurate C.I.
• Prior Distribution.
• Posterior Distribution.
• Loss Function.
• Conjugate Family.
• Confidence Coefficient of a Confidence Interval.
• Equal-Tailed Confidence Interval.
• Pivotal Quantity.
• Bayes Estimator.
• Minimax Decision Rule.
• Risk Function.
Q2. Choose the correct alternatives of the following.

• If (X(1), X(n)) is a confidence interval for the population median, then the confidence coefficient is
a) 1 − 1/2^n b) 1 − 1/2^{n−1} c) 1/2^n d) 1 − 1/2^{n+1}
• The pivotal quantity used for the construction of a confidence interval for σ², in the case of the N(µ, σ²) distribution, follows a
a) Normal distribution b) Chi-square distribution
c) t-distribution d) F-distribution
• We prefer the confidence interval with confidence coefficient (1 − α) if it has
a) shortest width b) equidistant confidence limits from the parameter
c) longest width d) one-sided confidence limits

For 4 Marks:

1. Explain the concept of a confidence interval (CI). What is a pivotal quantity? Show with an example its use in obtaining a CI.
2. Explain the connection between CI and test of hypothesis. What is a
uniformly most accurate confidence bound? How can we get such a bound
for the mean of a normal distribution with known variance?
3. Suppose a random sample of size 15 from a Bernoulli distribution with parameter p is as follows:
1, 0, 0, 1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 0, 0
The prior distribution of p is a Beta distribution with parameters α = 2 and β = . Using the squared error loss function, obtain the Bayes estimate of p.
4. The diameters of 10 ball bearings, measured in suitable units, are as follows:
12.01, 12.00, 12.02, 12.01, 12.02, 12.01, 12.03, 12.02, 12.01, 12.00.
Find the 95% C.I. for the mean diameter, assuming the diameters to be normally distributed. (A numerical sketch using this data appears after this list.)
5. If X1, X2, …, Xn is a r.s. from an exponential distribution with mean 1/θ, find the (1 − α)100% C.I. for θ.
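A numerical sketch using the data in question 4 above (assuming the usual t-based interval x̄ ± t_{n−1, 0.975} · s/√n for a normal mean with unknown variance):

    # ball_bearing_ci.py -- 95% t-interval for the mean diameter in question 4
    import numpy as np
    from scipy import stats

    x = np.array([12.01, 12.00, 12.02, 12.01, 12.02,
                  12.01, 12.03, 12.02, 12.01, 12.00])
    n = len(x)
    xbar = x.mean()
    s = x.std(ddof=1)                      # sample standard deviation
    t = stats.t.ppf(0.975, df=n - 1)       # 97.5% point of t with n-1 d.f.
    half = t * s / np.sqrt(n)
    print((round(xbar - half, 4), round(xbar + half, 4)))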

For 6 Marks:

1. Obtain the Shortest Expected Length Confidence Interval (SELCI) of level (1 − α) for θ based on an independent random sample of size n from N(θ, 1), by using a pivotal quantity depending on the minimal sufficient statistic for θ.

2. Explain the term posterior distribution. If X is a random variable having a Ber(θ) distribution and θ ~ U(0, 1), then obtain the posterior distribution of θ.
3. Define a uniformly most accurate (UMA) confidence interval. Let X1, X2, ..., Xn be a random sample from N(μ, σ²) where σ² is unknown; derive the equal-tailed confidence interval for μ of level (1 − α).

4. Define a minimax decision rule. Let X ~ Ber(p), p ∈ {1/4, 1/2}, and A = {a1, a2}. Let the loss function be given by
            a1   a2
p = 1/4      1    4
p = 1/2      3    2
Obtain the minimax decision rule if the four decision rules are
i) δ1(0) = δ1(1) = a1   ii) δ2(0) = a1, δ2(1) = a2
iii) δ3(0) = a2, δ3(1) = a1   iv) δ4(0) = δ4(1) = a2
5. Let X1, X2, …, Xn be a random sample from a distribution with probability density function
f(x, θ) = θx^{θ−1}, 0 < x < 1, θ > 0,
= 0 otherwise.
Obtain the UMA confidence interval for θ with confidence coefficient (1 − α).

6. Describe a method of obtaining a confidence interval for a parameter θ based on a large sample. Hence obtain a 100(1 − α)% confidence interval for θ, the mean of an exponential distribution.
7. Let X1, X2, …, Xn be a random sample of size n from a population with p.d.f.
f(x, θ) = θe^{−θx}, x > 0,
= 0 otherwise.
Let the parameter θ have the p.d.f.
h(θ) = e^{−θ}, θ > 0,
= 0 otherwise.
Obtain the Bayes solution for θ using a squared error loss function and Y = Σ Xi.
8. If X follows a Binomial distribution with parameters (k, p) and the prior distribution of p is a Beta distribution of the first kind with parameters (α, β), then find the posterior distribution of p based on a random sample X1, X2, …, Xn of size n from the Binomial distribution.
9. If X follows a Poisson distribution with parameter λ and the prior distribution of λ is a Gamma distribution with parameters (α, β), then find the posterior distribution of λ based on Y = Σ Xi, where X1, X2, …, Xn is a r.s. of size n from the Poisson distribution.
10. Let X1, X2, …, Xn be a r.s. of size n from a Bernoulli distribution with parameter p as the probability of success. Let Y = Σ Xi, and let the prior distribution of p be a Beta distribution of the first kind with parameters (α, β). Obtain the Bayes estimator of p using a squared error loss function. (A numerical sketch appears at the end of this section.)
11. Let X1, X2, …, Xn be a r.s. of size n from a Binomial distribution with parameters k and p. Let Y = Σ Xi, and let the prior distribution of p be a Beta distribution of the first kind with parameters (α, β). Obtain the Bayes estimator of p using a squared error loss function.
12. Let X1, X2, …, Xn be a r.s. of size n from a Normal distribution with parameters µ and σ², where σ² is known. Let Y = X̄, where X̄ is the sample mean. Let the prior distribution of µ be normal with mean µ0 and standard deviation σ0. Obtain the Bayes estimator of µ using a squared error loss function.
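A numerical sketch related to question 10 above (assuming the standard conjugate result that a Beta(α, β) prior with Y = Σ Xi successes out of n Bernoulli trials gives a Beta(α + Y, β + n − Y) posterior, whose mean is the Bayes estimator under squared error loss; the sample size n = 15 and count y = 6 echo the data in question 3 of the 4-mark section, while β = 3 is purely illustrative since the prior's second parameter is not given there):

    # beta_bernoulli_bayes.py -- posterior mean as the Bayes estimate of p
    alpha, beta = 2.0, 3.0    # prior parameters (beta chosen only for illustration)
    n, y = 15, 6              # trials and successes, as in the 4-mark question 3 data

    posterior_mean = (alpha + y) / (alpha + beta + n)   # Bayes estimator of p
    print(posterior_mean)                               # 8/20 = 0.4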

*******
