Estimation of Parameters
Point Estimate: A single value used to estimate a population parameter. For example,
using the sample mean to estimate the population mean.
Interval Estimate: An estimate that provides a range of values (e.g., confidence
intervals) within which the parameter is expected to fall.
Types of Estimates
1. Maximum Likelihood Estimate (MLE):
o This is the estimate that maximizes the likelihood function, i.e., the
probability (or density) of the observed data viewed as a function of the
parameters.
o It is widely used for various probability distributions and models.
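As a quick sketch (the data, seed, and parameter values here are hypothetical), for a normal model the MLE has a closed form: the sample mean, and the sum of squared deviations divided by n (not n − 1). The snippet below computes both and checks that the log-likelihood is indeed higher at the MLE than at a nearby parameter value.

```python
import math
import random

# Hypothetical data: 1,000 draws from a normal distribution (mu=5, sigma=2).
random.seed(42)
data = [random.gauss(5.0, 2.0) for _ in range(1000)]
n = len(data)

# Closed-form normal MLEs: sample mean, and squared deviations divided by n.
mu_hat = sum(data) / n
var_hat = sum((x - mu_hat) ** 2 for x in data) / n

def log_likelihood(mu, var):
    """Log of the probability of the observed data under N(mu, var)."""
    return sum(-0.5 * math.log(2 * math.pi * var)
               - (x - mu) ** 2 / (2 * var) for x in data)

# The likelihood at the MLE is at least as large as at nearby values.
assert log_likelihood(mu_hat, var_hat) >= log_likelihood(mu_hat + 0.5, var_hat)
```

For models without a closed form, the same idea applies with a numerical optimizer maximizing `log_likelihood`.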
2. Method of Moments Estimate:
o This method involves equating sample moments (e.g., sample mean, sample
variance) to theoretical moments (e.g., population mean, population variance) to
solve for the parameters.
o It is simpler to compute than MLE but can be less efficient (its estimates
tend to have higher variance).
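A small sketch of the method (the gamma parameters and seed are hypothetical): for a gamma distribution with shape k and scale θ, the population mean is kθ and the variance is kθ², so equating the first two sample moments to these expressions and solving gives the estimates directly.

```python
import random

# Hypothetical data: draws from a gamma distribution with shape 3, scale 2.
random.seed(0)
data = [random.gammavariate(3.0, 2.0) for _ in range(50000)]
n = len(data)

# First and second sample moments.
m1 = sum(data) / n
m2 = sum(x * x for x in data) / n
sample_var = m2 - m1 ** 2

# Equate to theoretical moments: mean = k*theta, variance = k*theta**2,
# then solve the two equations for theta and k.
theta_hat = sample_var / m1
k_hat = m1 / theta_hat
```

With 50,000 draws, `k_hat` and `theta_hat` land close to the true values 3 and 2.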
3. Bayesian Estimate:
o Involves updating prior beliefs about parameters with data to produce a posterior
distribution.
o The mode, mean, or median of the posterior distribution can serve as a point
estimate.
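The conjugate Beta-Binomial case makes the prior-to-posterior update a one-line computation (the prior and the coin-flip counts below are hypothetical): with a Beta(a, b) prior on a coin's heads probability and h heads in f flips, the posterior is Beta(a + h, b + f − h).

```python
# Hypothetical prior: Beta(2, 2) on the coin's heads probability p.
a_prior, b_prior = 2, 2

# Hypothetical data: 7 heads observed in 10 flips.
heads, flips = 7, 10

# Conjugate update: add heads to a, tails to b.
a_post = a_prior + heads
b_post = b_prior + (flips - heads)

# The posterior mean serves as a point estimate of p.
posterior_mean = a_post / (a_post + b_post)  # 9 / 14
```

Note how the prior pulls the estimate slightly toward 0.5 relative to the raw proportion 7/10.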
4. Least Squares Estimate (LSE):
o Used primarily in regression analysis to minimize the sum of squared differences
between observed values and the values predicted by the model.
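For simple linear regression, the least squares estimates have a closed form: the slope is the sum of cross-deviations over the sum of squared x-deviations, and the intercept follows from the means. A minimal sketch with hypothetical data:

```python
# Hypothetical observations roughly following y = 2x.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(xs)

x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Closed-form least squares solution for slope and intercept.
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar
```

These values minimize the sum of squared residuals over all possible lines.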
Characteristics of a Good Estimator
A good estimator should ideally have the following properties:
1. Unbiasedness:
o An estimator is unbiased if the expected value of the estimator equals the true
value of the parameter being estimated. In other words, on average, it hits the true
parameter value.
o Mathematically: E(θ̂) = θ, where θ̂ is the estimator and θ is the true
parameter.
2. Efficiency:
o An estimator is efficient if it has the smallest possible variance among all
unbiased estimators. That means it provides the most precise estimate.
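A classic illustration (simulation settings here are hypothetical): for normal data, both the sample mean and the sample median are unbiased for the center, but the mean has smaller variance, so it is the more efficient of the two.

```python
import random
import statistics

random.seed(2)

# Repeatedly estimate the center of N(0, 1) with both the mean and the median.
means, medians = [], []
for _ in range(3000):
    sample = [random.gauss(0.0, 1.0) for _ in range(51)]
    means.append(statistics.mean(sample))
    medians.append(statistics.median(sample))

var_mean = statistics.pvariance(means)
var_median = statistics.pvariance(medians)

# The sample mean's variance is smaller: it is the more efficient estimator.
assert var_mean < var_median
```

For normal data the median's variance is roughly π/2 times the mean's, which the simulated ratio reflects.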
3. Consistency:
o An estimator is consistent if, as the sample size grows to infinity, the estimator
converges in probability to the true value of the parameter.
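Consistency can be seen by letting the sample size grow (true mean, spread, and sizes below are hypothetical): the sample mean's error shrinks toward zero as n increases.

```python
import random

random.seed(3)
true_mu = 4.0

# Estimate the mean at increasing sample sizes and record the absolute error.
errors = []
for n in (10, 1000, 100000):
    sample = [random.gauss(true_mu, 2.0) for _ in range(n)]
    errors.append(abs(sum(sample) / n - true_mu))

# At n = 100,000 the estimate is essentially on top of the true value.
assert errors[-1] < 0.1
```

This mirrors convergence in probability: for any tolerance, large enough samples keep the error inside it with high probability.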
4. Sufficiency:
o An estimator is sufficient if it uses all the information in the sample that
is relevant to the parameter; no other statistic computed from the same
sample can add information about it.
5. Robustness:
o An estimator is robust if it remains relatively unaffected by small deviations or
outliers in the data.
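The standard contrast is the mean versus the median under contamination (the data values below are hypothetical): one extreme outlier drags the mean far from the bulk of the data while barely moving the median, which is why the median is called robust.

```python
import statistics

# Hypothetical clean measurements clustered near 10.
data = [9.8, 10.1, 10.0, 9.9, 10.2]

# The same data with one gross outlier appended.
contaminated = data + [100.0]

# How far does each estimator move when the outlier is added?
mean_shift = abs(statistics.mean(contaminated) - statistics.mean(data))
median_shift = abs(statistics.median(contaminated) - statistics.median(data))

# The median barely moves; the mean is pulled far off.
assert median_shift < mean_shift
```

Here the mean jumps by 15 while the median shifts by only 0.05.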