Statistical Estimation and Testing Concepts

This document covers concepts of statistical inference, including properties of estimators such as unbiasedness, consistency, and efficiency, together with examples of statistical tests and their assumptions. Key topics include minimal sufficient statistics, the Cramer-Rao inequality, maximum likelihood estimation, and the properties that make an estimator admissible or consistent. Multiple-choice questions address common topics such as hypothesis testing, parameter estimation, and properties of estimators.

1. Let $Z_1, Z_2, \cdots, Z_n$ be independently and identically distributed random variables, satisfying $E[|Z_t|]<\infty$. Let $N$ be an integer-valued random variable whose value $n$ depends only on the values of the first $n$ $Z_i$'s. Suppose $E(N)<\infty$; then $E(Z_1+Z_2+\cdots+Z_N)=E(N)E(Z_1)$ is called

1. Independence Equation
2. Neyman Pearson Lemma
3. Sequential Probability Likelihood Equation
4. Wald’s Equation
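
As a quick sanity check of this identity, here is a minimal simulation sketch (my own illustration, not part of the original question); for simplicity it draws $N$ independently of the $Z_i$'s, a special case of the stopping-time condition:

```python
import random

# Check Wald's equation E(Z_1 + ... + Z_N) = E(N) * E(Z_1) by simulation.
# N is drawn independently of the Z_i's -- a special case of "N depends
# only on the values of the first n Z_i's". Here E(N) = 5.5, E(Z_i) = 0.5.
random.seed(0)
trials = 100_000
total = 0.0
for _ in range(trials):
    n = random.randint(1, 10)                                # integer-valued N
    total += sum(random.expovariate(2.0) for _ in range(n))  # Z_i ~ Exp(rate 2)
print(total / trials)  # approximately 5.5 * 0.5 = 2.75
```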

2. $Var_\theta(T) \geq \dfrac{[\tau'(\theta)]^2}{n\,E\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^2\right]}$, where $T=t(X_1,X_2,\cdots,X_n)$ is an unbiased estimator of $\tau(\theta)$. Then the above inequality is called

Cauchy Schwarz Inequality

Boole’s Inequality

Chebyshev’s Inequality

Cramer Rao Inequality
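
As a worked instance of this bound (my own illustration): for a random sample from $N(\theta,\sigma^2)$ with $\sigma^2$ known and $\tau(\theta)=\theta$,

$$E\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^2\right]=\frac{1}{\sigma^2}, \qquad\text{so}\qquad Var_\theta(T)\geq\frac{\sigma^2}{n},$$

and $\bar{X}$, with $Var(\bar{X})=\sigma^2/n$, attains the bound exactly.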

3. Which of the following assumptions are required to show the consistency, unbiasedness, and
efficiency of the OLS estimator?

i. $E(\mu_t)=0$
ii. $Var(\mu_t)=\sigma^2$
iii. $Cov(\mu_t,\mu_{t-j})=0;t\neq t-j$
iv. $\mu_t \sim N(0,\sigma^2)$

(ii) and (iv) only

(i) and (iii) only

(i), (ii) and (iii) only

(i), (ii), (iii) and (iv) only
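
For context, a minimal sketch (my own, with made-up numbers) of computing the OLS estimator via the normal equations $(X'X)\hat{\beta}=X'y$, with simulated errors satisfying (i)–(iii); normality (iv) is not needed for the estimates themselves:

```python
import numpy as np

# Illustrative OLS fit via the normal equations (X'X) b = X'y.
# The simulated errors have zero mean, constant variance, and no
# serial correlation, i.e. assumptions (i)-(iii).
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + regressor
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=1.0, size=n)
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)           # normal equations
print(beta_hat)  # close to [1.0, 2.0]
```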

4. A set of jointly sufficient statistics is defined to be minimal sufficient if and only if

It is a function of every other set of sufficient statistics

It is not a function of every other set of sufficient statistics

It is a function of some other set of sufficient statistics

It is a function of any sufficient statistics in the set

5. A test is said to be the most powerful test of size $\alpha$, if

Among all other tests of size $\alpha$ or greater it has the largest $\beta$

Among all other tests of size $\alpha$ or less, it has the largest power

Among all other tests of size $\alpha$ or greater it has the larger $1-\alpha$

Among all other tests of size $\alpha$ or greater, it has the smallest power

6. In statistical inference, the best asymptotically normal estimator is denoted by

BAN

CANE

BANE

A) and B)

7. For a biased estimator $\hat{\theta}$ of $\theta$, which one is correct

$MSE(\hat{\theta})=SD(\hat{\theta}) + Bias$
$MSE(\hat{\theta})=Var(\hat{\theta}) + Bias^2$
$MSE(\hat{\theta})=Var(\hat{\theta}) + Bias$
$MSE(\hat{\theta})=SD(\hat{\theta}) + Bias^2$
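
For reference, the decomposition behind this question follows by adding and subtracting $E(\hat{\theta})$ inside the square (the cross term vanishes):

$$MSE(\hat{\theta}) = E[(\hat{\theta}-\theta)^2] = E[(\hat{\theta}-E(\hat{\theta}))^2] + (E(\hat{\theta})-\theta)^2 = Var(\hat{\theta}) + Bias^2.$$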

8. If $f(x_1,x_2,\cdots,x_n;\theta)=g(\hat{\theta};\theta)h(x_1,x_2,\cdots,x_n)$, then $\hat{\theta}$ is

Unbiased

Efficient

Sufficient

Consistent
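
As a concrete instance of this factorization criterion (my own example): for a Bernoulli($\theta$) sample,

$$f(x_1,x_2,\cdots,x_n;\theta) = \theta^{\sum x_i}(1-\theta)^{n-\sum x_i} = g\left(\sum x_i;\theta\right)\cdot h(x_1,x_2,\cdots,x_n)$$

with $h\equiv 1$, so $\sum X_i$ is sufficient for $\theta$.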

9. For two estimators $T_1=t_1(X_1,X_2,\cdots,X_n)$ and $T_2=t_2(X_1,X_2,\cdots,X_n)$, the estimator $t_1$ with $R_{t_1}(\theta) \leq R_{t_2}(\theta)$ for all $\theta$ in $\Theta$ is defined to be

Admissible Estimator

Sufficient Estimator

Consistent Estimator

Minimax Estimator

10. Let $X_1,X_2,\cdots,X_n$ be a random sample from the density $f(x;\theta)$, where $\theta$ may be a vector. If the conditional distribution of $X_1,X_2,\cdots,X_n$ given $S=s$ does not depend on $\theta$ for any value $s$ of $S$, then the statistic is called

Minimax Statistics

Efficient

Sufficient Statistic

Minimal Sufficient Statistic

11. If the conditional distribution of $X_1,X_2,\cdots,X_n$ given $S=s$ does not depend on $\theta$, for any value of $S=s$, the statistic $S=s(X_1,X_2,\cdots,X_n)$ is called

Unbiased

Consistent

Sufficient

Efficient

12. If $E(\hat{\theta})=\theta$, then $\hat{\theta}$ is said to be

Unbiased

Sufficient

Efficient

Consistent

13. If $Var(T_2)<Var(T_1)$, then $T_2$ is

Unbiased

Efficient

Sufficient

Consistent

14. If $Var(\hat{\theta})\to 0$ as $n\to\infty$, then $\hat{\theta}$ is said to be

Unbiased

Sufficient

Efficient

Consistent
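
The reasoning behind this condition (a standard argument, added for clarity): if $\hat{\theta}$ is unbiased and $Var(\hat{\theta})\to 0$, Chebyshev's inequality gives

$$P(|\hat{\theta}-\theta|>\varepsilon) \leq \frac{Var(\hat{\theta})}{\varepsilon^2} \to 0 \quad\text{as } n\to\infty,$$

so $\hat{\theta}$ converges in probability to $\theta$.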

15. Let $X_1,X_2,\cdots,X_n$ be a random sample from a density $f(x|\theta)$, where $\theta$ is a value of the random variable $\Theta$ with known density $g_\Theta(\theta)$. Then the estimator of $\tau(\theta)$ with respect to the prior $g_\Theta(\theta)$, defined as $E[\tau(\theta)|X_1,X_2,\cdots,X_n]$, is called

Minimax Estimator

Posterior Bayes Estimator

Bayes Estimator

Sufficient Estimator

16. Let $L(\theta;X_1,X_2,\cdots,X_n)$ be the likelihood function for a sample $X_1,X_2,\cdots,X_n$ having joint density $f(x_1,x_2,\cdots,x_n;\theta)$, where $\theta$ belongs to the parameter space $\Theta$. Then a test defined as $\lambda=\lambda_n=\lambda(x_1,x_2,\cdots,x_n)=\dfrac{\sup_{\theta\in\Theta_0} L(\theta;x_1,x_2,\cdots,x_n)}{\sup_{\theta\in\Theta} L(\theta;x_1,x_2,\cdots,x_n)}$ is called

Generalized Likelihood Ratio test

Uniformly Most Powerful Test

Monotone Likelihood Ratio Test

Unbiased Test
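
As a worked instance of this ratio (my own example): for testing $H_0:\theta=\theta_0$ against $H_1:\theta\neq\theta_0$ with a sample from $N(\theta,\sigma^2)$, $\sigma^2$ known, the supremum over $\Theta$ is attained at $\hat{\theta}=\bar{x}$, so

$$\lambda = \exp\left(-\frac{n(\bar{x}-\theta_0)^2}{2\sigma^2}\right),$$

and rejecting for small $\lambda$ is equivalent to the usual two-sided z-test.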

17. If $f(x_1,x_2,\cdots,x_n;\theta)$ is the joint density of $n$ random variables, say, $X_1,X_2,\cdots,X_n$, which is considered to be a function of $\theta$, then $L(\theta;x_1,x_2,\cdots,x_n)$ is called

Maximum Likelihood function

Likelihood Function

Log Function

Marginal Function
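
For a quick illustration of a likelihood function (my own example): for an i.i.d. exponential sample with density $f(x;\theta)=\theta e^{-\theta x}$,

$$L(\theta;x_1,x_2,\cdots,x_n) = \prod_{i=1}^{n}\theta e^{-\theta x_i} = \theta^n e^{-\theta\sum x_i},$$

and maximizing over $\theta$ gives the maximum likelihood estimate $\hat{\theta}=1/\bar{x}$.
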
1. To test the randomness of a sample, the appropriate test is:

Run Test
Sign Test
Median Test
Page’s Test
2. By the method of moments one can estimate:

All Constants of a Population
Only Mean and Variance of a Distribution
All Moments of a Population Distribution
All of the Above
3. Most non-parametric methods utilize measurements on:

Interval Scale
Ratio Scale
Ordinal Scale
Nominal Scale
4. Homogeneity of several variances can be tested by:

Bartlett’s Test
Fisher’s Exact Test
F-test
t-test
5. Parameters are those constants which occur in:

Samples
Probability Density Functions
A Formula
None of these
6. The equations obtained in the process of least squares estimation are called:

Normal Equations
Intrinsic Equations
Simultaneous Equations
All of the Above
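
In matrix form (added for reference), the normal equations for the model $y=X\beta+\mu$ are

$$X'X\hat{\beta} = X'y, \qquad\text{so}\qquad \hat{\beta}=(X'X)^{-1}X'y \text{ when } X'X \text{ is invertible.}$$
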
7. If the sample average $\bar{x}$ is an estimate of the population mean $\mu$, then $\bar{x}$ is:
Unbiased and Efficient
Unbiased and Inefficient
Biased and Efficient
Biased and Inefficient
8. Equality of several normal population means can be tested by:

Bartlett’s Test
F-test
$\chi^2$-test
t-test
9. The power of a test is related to:

Type-I Error
Type-II Error
Type-I and Type-II Error Both
None of the Above
10. The Rao-Blackwell Theorem enables us to obtain a minimum variance unbiased estimator through:

Unbiased Estimators
Complete Statistics
Efficient Statistics
Sufficient Statistics
11. For a particular hypothesis test, $\alpha=0.05$ and $\beta=0.10$. The power of this test is:
0.15
0.90
0.85
0.95
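
For reference, power is the complement of the Type-II error probability: $\text{Power} = 1-\beta = 1-0.10 = 0.90$.
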
12. For an estimator to be consistent, the unbiasedness of the estimator is:

Necessary
Sufficient
Neither Necessary nor Sufficient
None of these
13. When the null hypothesis is accepted, it is possible that:

A Correct Decision has been Made
A Type-II Error has been Made
Both (A) and (B) have Occurred
Neither (A) nor (B) has Occurred
14. The sample median as an estimator of the population mean is always:

Unbiased
Efficient
Sufficient
None of These
15. An estimator $T_n$ is said to be a sufficient statistic for a parameter function $\tau(\theta)$ if it contains all the information which is contained in the:
Population
Parametric Function $\tau(\theta)$
Sample
None of these
16. The sign test assumes that the:

Samples are Independent
Samples are Dependent
Samples have the Same Mean
None of These
17. The Cramer-Rao inequality is valid in the case of:

Upper Bound on the Variance
Lower Bound on the Variance
The Asymptotic Variance of an Estimator
None of these
18. With a lower significance level, the probability of rejecting a null hypothesis that is
actually true:

Decreases
Remains the Same
Increases
All of the Above
