Estimation Techniques in Statistics

Estimation involves using sample statistics to estimate unknown population parameters. There are two main types: point estimation, which provides a single value estimate, and interval estimation, which provides a range of values within which the parameter is expected to lie. For point estimation to be valid, the estimator should be unbiased, consistent, efficient, and sufficient. Common point estimation methods are maximum likelihood and method of moments. Interval estimation uses confidence intervals centered around a point estimate, with confidence limits that vary with the confidence coefficient.

Uploaded by rohitbatra
Copyright © Attribution Non-Commercial (BY-NC)

ESTIMATION


5/30/12

Meaning
Estimation is a statistical technique of estimating unknown population parameters from the corresponding sample statistics. A population parameter can be estimated in two ways:
1. Point Estimation
2. Interval Estimation

Point Estimation
Meaning:- It provides a single value of a statistic that is used to estimate an unknown population parameter.

Estimator:- The statistic which is used to obtain a point estimate is called the estimator.

Estimate:- The value of the statistic is the estimate.

Criteria for a Good Estimator
1. Unbiasedness
2. Consistency
3. Efficiency
4. Sufficiency

Unbiasedness
A statistic is said to be an unbiased estimator of a parameter if its expected value is equal to the value of the parameter. The expected value of the statistic, E, is the arithmetic mean of the sampling distribution of the statistic.

Contd..
1. The sample mean is an unbiased estimator of the population mean, because the mean of the sampling distribution of sample means is equal to the population mean.

2. The sample variance is a biased estimator of the population variance, because the expected value of the sample variance is not equal to the population variance.

3. An unbiased estimator of the population variance is given by:

   S^2 = (n/(n-1)) s^2 = (1/(n-1)) Σ (x_i - x̄)^2
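The bias of the uncorrected sample variance, and the effect of the n/(n-1) correction, can be checked with a short simulation. This is a minimal sketch assuming a standard normal population (so the true variance is 1); the trial counts and sample size are illustrative choices:

```python
import random

random.seed(42)
N_TRIALS = 20000   # number of repeated samples
n = 5              # a small sample size makes the bias visible
# Population: standard normal, so the true variance is 1.0.

biased_sum = 0.0
unbiased_sum = 0.0
for _ in range(N_TRIALS):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    mean = sum(sample) / n
    ss = sum((x - mean) ** 2 for x in sample)
    biased_sum += ss / n          # divides by n: E[s^2] = ((n-1)/n) sigma^2
    unbiased_sum += ss / (n - 1)  # divides by n-1: the corrected estimator S^2

print("average s^2 (biased):   ", round(biased_sum / N_TRIALS, 3))
print("average S^2 (unbiased): ", round(unbiased_sum / N_TRIALS, 3))
```

With n = 5 the average of the uncorrected s² settles near (n-1)/n = 0.8, while the corrected S² settles near the true value 1.0.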

Consistency
A statistic is said to be a consistent estimator of a parameter if it comes closer to the value of the parameter as the sample size (n) tends to infinity. Example:- In random sampling from a normal population, both the sample mean and the sample median are consistent estimators of the population mean.
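Consistency of the sample mean can be illustrated numerically. This sketch assumes a normal population with mean 10 (an invented example); the sample mean drifts toward the parameter as n grows:

```python
import random

random.seed(7)
TRUE_MEAN = 10.0  # assumed population mean for illustration

# Draw progressively larger samples and watch the estimate settle.
for n in (10, 100, 10_000):
    sample = [random.gauss(TRUE_MEAN, 2.0) for _ in range(n)]
    est = sum(sample) / n
    print(f"n = {n:>6}: sample mean = {est:.3f}")
```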

Efficiency
A consistent statistic is said to be the most efficient estimator of a parameter if its sampling variance is less than that of any other consistent estimator. Example:- The sample mean is more efficient than the median in estimating the population mean, since the variance of the mean is smaller than the variance of the median.
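The mean-versus-median comparison can be checked by simulating the two sampling distributions and comparing their variances. A sketch assuming a standard normal population (the sample size and trial count are illustrative); for a normal population the variance ratio is known to approach π/2 ≈ 1.57 for large n:

```python
import random
import statistics

random.seed(1)
N_TRIALS = 5000
n = 25  # sample size for each simulated sample

means, medians = [], []
for _ in range(N_TRIALS):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    means.append(sum(sample) / n)
    medians.append(statistics.median(sample))

# Sampling variance of each estimator across the simulated samples.
var_mean = statistics.pvariance(means)
var_median = statistics.pvariance(medians)
print("variance of sample mean:  ", round(var_mean, 4))
print("variance of sample median:", round(var_median, 4))
print("ratio (median / mean):    ", round(var_median / var_mean, 2))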

Contd..
A statistic which has the minimum variance among all estimators of the population parameter is called the Minimum Variance estimator (MV). A statistic which is unbiased and also has minimum variance is called the Minimum Variance Unbiased estimator (MVU).

Sufficiency
A statistic is said to be a sufficient estimator of a parameter if it contains all the information in the sample about the population parameter. Example:- In random sampling from a normal population, the sample mean is a sufficient estimator of the population mean.

Methods of Point Estimation
1. Method of Maximum Likelihood
2. Method of Moments

Method of Maximum Likelihood
It is the process of choosing, as an estimator of the population parameter θ, that statistic which, when substituted for θ, maximises the likelihood function L.

The statistic which maximises the likelihood function L is called a Maximum Likelihood Estimator (MLE).

L = f(x1, θ) f(x2, θ) ... f(xn, θ)
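As a sketch of the idea, the following hypothetical example finds the MLE of a Poisson parameter λ by maximising the log of the likelihood function L over a grid. The data values are invented for illustration; the known analytic answer for the Poisson is that the MLE equals the sample mean, which the grid search reproduces:

```python
import math

# Hypothetical observed counts; the Poisson model is an assumption for illustration.
data = [2, 3, 1, 4, 2, 5, 3, 2, 4, 3]

def log_likelihood(lam, xs):
    # log L = sum of log f(x_i, lambda) over the sample, for the Poisson pmf.
    return sum(x * math.log(lam) - lam - math.log(math.factorial(x)) for x in xs)

# Search a grid of candidate lambda values for the one maximising log L.
best = max((k / 100 for k in range(1, 1001)), key=lambda lam: log_likelihood(lam, data))
print("MLE of lambda (grid search):", best)
print("sample mean:                ", sum(data) / len(data))
```

Maximising log L is equivalent to maximising L itself, since the logarithm is monotonic; the product of densities becomes a sum, which is numerically better behaved.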

Properties of MLE
1. It is consistent, most efficient, and also sufficient, provided a sufficient estimator exists.

2. It tends to be normally distributed for large samples.

3. It is not necessarily unbiased; a biased MLE can be converted into an unbiased estimator by a slight modification.

4. It is invariant under functional transformations.

Methods of Moments
It is a process of equating the first few moments of the population with the corresponding moments of the sample.

Condition:- This method is applicable only when the population moments exist.

Utility:- This method is generally applied for fitting a theoretical distribution to observed data.
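A minimal sketch of the method, assuming an exponential population purely for illustration: the first population moment is E[X] = 1/λ, so equating it to the sample mean x̄ gives the moment estimator λ̂ = 1/x̄:

```python
import random

random.seed(3)
TRUE_LAMBDA = 0.5  # assumed population parameter for illustration
data = [random.expovariate(TRUE_LAMBDA) for _ in range(10_000)]

# Equate the first population moment E[X] = 1/lambda to the sample mean.
x_bar = sum(data) / len(data)
lambda_hat = 1 / x_bar
print("true lambda:    ", TRUE_LAMBDA)
print("moment estimate:", round(lambda_hat, 3))
```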

Limitation of Point Estimation
It provides a single value of the estimator, which cannot be expected to coincide with the true value of the parameter and may in some cases differ widely from it.

Interval Estimation
Meaning:- It provides an interval of finite width, centered at the point estimate of the parameter, within which the unknown parameter is expected to lie with a specified probability. Such an interval is called a confidence interval for the population parameter.

Confidence Limits:- The lower and upper limits of the confidence interval are called confidence limits.

Confidence Coefficient:- The probability with which the confidence interval will include the true value of the parameter is known as the confidence coefficient of the interval. The width of the confidence interval varies with the confidence coefficient.

Significance of Confidence Limits:- If many independent random samples are drawn from the same population and a confidence interval is calculated from each sample, then the parameter will actually be included in the intervals in approximately the stated proportion of cases in the long run.
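This long-run interpretation can be checked by simulation: build a 95% interval from each of many samples and count how often it covers the parameter. A sketch with invented population parameters and a known σ for simplicity:

```python
import math
import random

random.seed(9)
MU, SIGMA = 50.0, 8.0   # assumed population parameters for illustration
n = 40                  # large sample, so the normal approximation applies
Z = 1.96                # standard normal value for a 95% confidence coefficient
N_TRIALS = 2000

covered = 0
for _ in range(N_TRIALS):
    sample = [random.gauss(MU, SIGMA) for _ in range(n)]
    mean = sum(sample) / n
    half_width = Z * SIGMA / math.sqrt(n)  # known sigma keeps the sketch simple
    if mean - half_width <= MU <= mean + half_width:
        covered += 1

print("proportion of intervals covering mu:", covered / N_TRIALS)
```

The printed proportion sits close to 0.95, matching the stated confidence coefficient.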

Calculation of Confidence Limits
It is based on knowledge of the sampling distribution of an appropriate statistic. If the population is normal or the sample size (n) is large (> 30), the percentage of area under the standard normal curve may be used to find confidence limits corresponding to any specified percentage of confidence.
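For example, with a large sample the 95% confidence limits for a population mean are x̄ ± 1.96 s/√n, where 1.96 cuts off 95% of the area under the standard normal curve. The summary figures below are invented for illustration:

```python
import math

# Illustrative summary statistics (n > 30, so s can stand in for sigma).
n = 64
x_bar = 112.0
s = 16.0
z = 1.96  # standard normal value for 95% confidence

se = s / math.sqrt(n)      # standard error of the mean = 16/8 = 2
lower = x_bar - z * se     # lower confidence limit
upper = x_bar + z * se     # upper confidence limit
print(f"95% confidence interval: ({lower:.2f}, {upper:.2f})")  # (108.08, 115.92)
```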

Common questions
The method of maximum likelihood is widely used for point estimation because it generally yields estimators with desirable properties such as consistency, efficiency, and sufficiency, provided a sufficient estimator exists. MLE is intuitive, as it maximizes the probability or likelihood of observing the given sample, leading to naturally optimal parameter estimates. Additionally, MLE's adaptability through transformation and its close approximation to the normal distribution for large samples further contribute to its widespread applicability and acceptance in statistical analysis.

MLE typically provides estimators that are consistent, efficient, and sufficient, and which, though potentially biased, can be adjusted to be unbiased. MLE is also robust under functional transformations and approximates a normal distribution for large samples. On the other hand, the method of moments equates sample moments to population moments, assuming these moments exist, and is often simpler computationally; however, it may lack the efficiency and consistency properties of MLE unless specific conditions are met. The choice between them often depends on the specific data characteristics and parameters being estimated.

MLE is considered efficient because it achieves the lowest variance among consistent estimators, providing the most precise parameter estimates. In large-sample scenarios, MLE tends to be normally distributed due to the Central Limit Theorem, reinforcing its utility for inferential statistics. Although MLE is not inherently unbiased, it retains efficiency and consistency, and can be adjusted to be unbiased through modifications. It is also invariant under transformations, meaning any transformation of a parameter estimated via MLE remains optimal.

The confidence coefficient represents the probability that the confidence interval contains the true population parameter. It indicates the level of certainty or reliability associated with the interval estimate. A higher confidence coefficient results in a wider confidence interval, reflecting increased assurance of encompassing the true parameter but at the cost of precision. This trade-off is crucial in balancing confidence against the interval's informativeness when interpreting statistical results.

The concept of sufficiency affects the choice of estimators by ensuring that the estimator captures all the information in the sample relevant to estimating the parameter. A sufficient estimator encapsulates all necessary sample information, maximizing the use of data while simplifying analysis. This property helps in reducing data redundancy, thus providing a more comprehensive and accurate parameter estimate, ultimately leading to more informed data-driven decisions.

Point estimation provides a single-value statistic to estimate an unknown population parameter, whereas interval estimation offers a range (interval) within which the parameter is expected to lie, along with a specified probability (confidence level). Point estimates are more straightforward but may not coincide exactly with the parameter, whereas interval estimates include a confidence interval that provides more information about the potential variability and reliability of the estimation.

A 'good' estimator should meet the criteria of unbiasedness, consistency, efficiency, and sufficiency. Unbiasedness means that the expected value of the estimator equals the true parameter value, ensuring no systematic error. Consistency indicates the estimator approaches the actual parameter value as the sample size increases. Efficiency relates to an estimator having the lowest variance among all unbiased estimators, providing more precise estimates. Sufficiency ensures the estimator utilizes all available information in the data related to the parameter. These criteria collectively ensure accurate, reliable, and informative estimates of population parameters.

Variance plays a critical role in determining the efficiency of an estimator because efficiency is measured by comparing the variances of different estimators. A more efficient estimator has a lower variance, meaning it produces estimates that are more tightly clustered around the true parameter value. Efficiency is assessed by comparing an estimator's variance with that of other consistent estimators; the estimator with the minimum variance is often preferred, as it indicates more precise estimation of the parameter. This concept helps prioritize estimators that make best use of data, providing robust statistical insights.

Unbiasedness and consistency are related but distinct characteristics of estimators. Unbiasedness ensures that the expected value of an estimator equals the true parameter value, reflecting no systematic error. Consistency, on the other hand, ensures that as the sample size increases to infinity, the estimator converges in probability to the true parameter value, which implies reliable long-term performance. While unbiasedness focuses on the expected accuracy for a given sample size, consistency ensures durability of this accuracy as more data become available.

The limitation of point estimation lies in its provision of a single estimate that does not account for variability or uncertainty surrounding the parameter, potentially differing significantly from the true parameter value. Interval estimation addresses this limitation by providing a range of plausible values (confidence interval) for the parameter, along with a confidence coefficient that quantifies the probability of the interval containing the true parameter, thereby incorporating uncertainty and allowing for more robust conclusions.
