STA260 Lecture 07
- Review:
- $\hat{\theta}$ is unbiased for parameter $\theta$ if $E[\hat{\theta}] = \theta$.
- $S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$
- Where $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$
- This is unbiased: $E[S^2] = \sigma^2$.
- $\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$
- Here there is a scaling factor: $E[\hat{\sigma}^2] = \frac{n-1}{n}\sigma^2$.
- So on average it always underestimates $\sigma^2$.
- The bias of the Estimator is $B(\hat{\theta}) = E[\hat{\theta}] - \theta$.
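Not from the lecture, but a quick NumPy sketch (assuming a normal sample with $\sigma^2 = 4$) can verify the review claims: dividing by $n-1$ is unbiased, while dividing by $n$ scales the expectation by $\frac{n-1}{n}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma2, reps = 10, 4.0, 100_000

# Draw many samples of size n and compute both variance estimators.
samples = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(reps, n))
s2_unbiased = samples.var(axis=1, ddof=1)  # divide by n - 1
s2_biased = samples.var(axis=1, ddof=0)    # divide by n

print(s2_unbiased.mean())  # ~ 4.0: E[S^2] = sigma^2 (unbiased)
print(s2_biased.mean())    # ~ 3.6: (n-1)/n * sigma^2 (biased low)
```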
- Sampling Distribution
- Consistency
- Convergence in Probability
- We need to show in general that $\lim_{n \to \infty} P(|\hat{\theta}_n - \theta| \le \varepsilon) = 1$ or, equivalently, $\lim_{n \to \infty} P(|\hat{\theta}_n - \theta| > \varepsilon) = 0$.
- Chebyshev's Inequality#STA260: $P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}$
- A consistent Estimator
- $\hat{\theta}_n$ is a consistent estimator for parameter $\theta$ if, for every $\varepsilon > 0$, $\lim_{n \to \infty} P(|\hat{\theta}_n - \theta| \le \varepsilon) = 1$.
- #tk review the proof of the weak law of large numbers.
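A minimal simulation sketch of the definition (my own example, assuming $\hat{\theta}_n = \bar{X}_n$ for Uniform(0, 1) data, so $\theta = 0.5$): estimate $P(|\hat{\theta}_n - \theta| \le \varepsilon)$ and watch it climb toward 1.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, eps, reps = 0.5, 0.05, 1_000   # true mean of Uniform(0, 1)

for n in [10, 100, 1_000, 10_000]:
    xbar = rng.uniform(0, 1, size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(xbar - theta) <= eps)
    print(n, prob)   # P(|theta_hat - theta| <= eps) climbs toward 1
```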
- Example proof:
- Bernoulli Trial
- Just an experiment with true or false as outcomes.
- Success or failure.
- Success occurs with probability $p$.
- The sum of $n$ Bernoulli trials is a Binomial random variable.
- $X$ is the total number of successes in $n$ trials.
- $n$ is our number of trials.
- We want to show that $\hat{p} = \frac{X}{n} \xrightarrow{P} p$.
- Let $X \sim \text{Bin}(n, p)$
- $E[X] = np$ and $V[X] = np(1-p)$
- Let $\hat{p} = \frac{X}{n}$
- Example: we have our successes and failures enumerated as $x_1, x_2, \ldots, x_n$, each $1$ or $0$.
- So we can just add the outcomes over the number of trials.
- So $\hat{p} = \frac{X}{n} = \frac{1}{n}\sum_{i=1}^{n} x_i$
- $E[\hat{p}] = \frac{E[X]}{n} = \frac{np}{n} = p$
- $V[\hat{p}] = \frac{V[X]}{n^2} = \frac{np(1-p)}{n^2} = \frac{p(1-p)}{n}$
- Now we have our mean and variance for $\hat{p}$.
- In Chebyshev's Inequality, we will replace $X$ with our estimator $\hat{p}$.
- Now let's use the definition of Convergence in Probability
- By Chebyshev's Inequality: $P(|\hat{p} - p| > \varepsilon) \le \frac{V(\hat{p})}{\varepsilon^2} = \frac{p(1-p)}{n\varepsilon^2}$
- Since probability is always $\ge 0$: $0 \le P(|\hat{p} - p| > \varepsilon) \le \frac{p(1-p)}{n\varepsilon^2}$
- The upper bound takes the form of what we need: $\frac{p(1-p)}{n\varepsilon^2} \to 0$ as $n \to \infty$.
- So $\lim_{n \to \infty} P(|\hat{p} - p| > \varepsilon) = 0$.
- This means $\hat{p}$ is a consistent Estimator for $p$.
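To see the conclusion numerically, a sketch (not from the slides; assuming $p = 0.3$ and $\varepsilon = 0.05$) comparing the empirical tail probability to the Chebyshev bound $\frac{p(1-p)}{n\varepsilon^2}$:

```python
import numpy as np

rng = np.random.default_rng(2)
p, eps, reps = 0.3, 0.05, 20_000

for n in [50, 500, 5_000]:
    p_hat = rng.binomial(n, p, size=reps) / n   # X/n over many repeats
    tail = np.mean(np.abs(p_hat - p) > eps)     # empirical P(|p_hat - p| > eps)
    bound = p * (1 - p) / (n * eps**2)          # Chebyshev upper bound
    print(n, tail, min(bound, 1.0))             # both shrink toward 0
```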
- Sufficient condition for consistency of unbiased estimators (Theorem 5)
- If $\lim_{n \to \infty} V(\hat{\theta}_n) = 0$, where $\hat{\theta}_n$ is an unbiased estimator of $\theta$, then $\hat{\theta}_n$ is a consistent Estimator of $\theta$.
- Proof:
- By Chebyshev's Inequality: $P(|\hat{\theta}_n - E[\hat{\theta}_n]| > \varepsilon) \le \frac{V(\hat{\theta}_n)}{\varepsilon^2}$
- Let $\varepsilon > 0$ be arbitrary.
- $E[\hat{\theta}_n] = \theta$, because it's unbiased.
- Since we know $\lim_{n \to \infty} V(\hat{\theta}_n) = 0$, the upper bound goes to $0$.
- No negative probabilities: $0 \le P(|\hat{\theta}_n - \theta| > \varepsilon)$.
- Meaning $\lim_{n \to \infty} P(|\hat{\theta}_n - \theta| > \varepsilon) = 0$, so it's a consistent estimator.
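Since the proof leans entirely on Chebyshev's Inequality, here is a quick numerical check of the inequality itself (my example, assuming an Exponential(1) distribution, which has $\mu = \sigma = 1$):

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma = 1.0, 1.0                           # Exponential(1): mean 1, sd 1
x = rng.exponential(scale=1.0, size=200_000)

for k in [1.5, 2.0, 3.0]:
    empirical = np.mean(np.abs(x - mu) >= k * sigma)
    print(k, empirical, 1 / k**2)   # empirical tail stays below 1/k^2
```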
- Biased yet Consistent:
- $\hat{p} = \frac{X}{n}$ was an unbiased estimator of $p$.
- Unbiasedness is not a necessary condition for Convergence in Probability.
- Remember that the bias of an estimator is $B(\hat{\theta}) = E[\hat{\theta}] - \theta$.
- So $E[\hat{\theta}] = \theta + B(\hat{\theta})$.
- If $\lim_{n \to \infty} B(\hat{\theta}_n) = 0$ (and the variance still goes to $0$), then we can still have Convergence in Probability and a consistent yet biased estimator.
- Exercise: #tk
- Let $X_1, \ldots, X_n$ be iid with …
- Let $\hat{\theta}_n = \ldots$
- Show that $\hat{\theta}_n$ is consistent even though it is biased.
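The exercise details are still #tk, but a standard illustration of biased-yet-consistent (my choice, assuming normal data with $\sigma^2 = 4$) is the divide-by-$n$ variance estimator: its bias $-\sigma^2/n$ and its spread both vanish as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(4)
sigma2, reps = 4.0, 20_000

for n in [5, 50, 500]:
    samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
    est = samples.var(axis=1, ddof=0)           # divide by n: biased
    bias = est.mean() - sigma2                  # ~ -sigma^2 / n, shrinks
    tail = np.mean(np.abs(est - sigma2) > 0.5)  # P(|est - sigma^2| > 0.5)
    print(n, round(bias, 3), tail)              # bias -> 0 and tail -> 0
```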
- Example:
- Let $X_1, \ldots, X_n$ be iid with mean $\mu$ and variance $\sigma^2 < \infty$.
- Want to show that the sample mean is consistent.
- Show $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ is a consistent Estimator of the mean $\mu$.
- This is the Weak Law of Large Numbers; we can prove it with Theorem 5.
- Using Theorem 5:
- We need an unbiased estimator.
- We know that $E[\bar{X}_n] = \mu$, so it's unbiased.
- #tk exercise
- We also need that $\lim_{n \to \infty} V(\bar{X}_n) = \lim_{n \to \infty} \frac{\sigma^2}{n} = 0$.
- So we're done.
- By Theorem 5, the conditions hold and tell us that $\bar{X}_n \xrightarrow{P} \mu$, meaning $\bar{X}_n$ is a consistent Estimator of $\mu$.
- Either use Chebyshev's Inequality directly, or just show that the estimator is unbiased and the limit of the Variance is $0$.
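A sketch checking both Theorem 5 conditions for $\bar{X}_n$ by simulation (my example, assuming Exponential data with $\mu = 2$, $\sigma^2 = 4$):

```python
import numpy as np

rng = np.random.default_rng(5)
mu, reps = 2.0, 10_000   # Exponential(scale=2): mean 2, variance 4

for n in [10, 100, 1_000]:
    xbar = rng.exponential(scale=mu, size=(reps, n)).mean(axis=1)
    print(n, xbar.mean(), xbar.var())   # mean stays ~2 (unbiased);
                                        # variance ~ 4/n -> 0 (Theorem 5)
```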
- Theorem 6:
- Basically: consistency is preserved under arithmetic operations.
- If $\hat{\theta}_n$ is a consistent estimator of $\theta$ and $\hat{\theta}'_n$ is a consistent estimator of $\theta'$,
- then $\hat{\theta}_n + \hat{\theta}'_n$ is also a consistent estimator of $\theta + \theta'$.
- $\hat{\theta}_n \cdot \hat{\theta}'_n$ also estimates $\theta \cdot \theta'$ consistently (and similarly for quotients when $\theta' \ne 0$).
- The Continuous Mapping Theorem also applies here: if $g$ is continuous, $g(\hat{\theta}_n)$ is consistent for $g(\theta)$.
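A sketch of the continuous-mapping side of Theorem 6 (my example, assuming Uniform(1, 5) data, so $\mu = 3$): since $\bar{X}_n$ is consistent for $\mu$, $\bar{X}_n^2$ is consistent for $\mu^2 = 9$.

```python
import numpy as np

rng = np.random.default_rng(6)
mu = 3.0   # Uniform(1, 5) has mean 3, so mu^2 = 9

for n in [10, 100, 1_000]:
    xbar = rng.uniform(1, 5, size=(10_000, n)).mean(axis=1)
    tail = np.mean(np.abs(xbar**2 - mu**2) > 0.5)
    print(n, tail)   # P(|xbar^2 - 9| > 0.5) -> 0, so xbar^2 is consistent
```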
- Method of Moments:
- A technique to Estimate Parameters of a Distribution by matching theoretical moments to sample moments.
- Moment:
- A numerical value that summarizes the shape related features of a distribution of a random variable.
- $k$-th moment
- $k$-th theoretical moment: $\mu_k' = E[Y^k]$
- $k$-th sample moment: $m_k' = \frac{1}{n}\sum_{i=1}^{n} y_i^k$
- For data points $y_1, \ldots, y_n$, the $k$-th sample moment is found by adding the data raised to the power $k$, then dividing by $n$.
- Suppose the target distribution has many parameters: $\theta_1, \theta_2, \ldots, \theta_t$.
- The Method of Moments technique involves setting up and solving a system of equations.
- Equate the sample moments to the theoretical moments: $\mu_k' = m_k'$ for $k = 1, \ldots, t$.
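A sketch of the general recipe (`sample_moment` is a hypothetical helper, not from the slides), checked on Exponential data where $E[Y] = \beta = 2$ and $E[Y^2] = 2\beta^2 = 8$:

```python
import numpy as np

def sample_moment(data: np.ndarray, k: int) -> float:
    """k-th sample moment: (1/n) * sum(y_i ** k)."""
    return float(np.mean(data ** k))

rng = np.random.default_rng(7)
y = rng.exponential(scale=2.0, size=10_000)
print(sample_moment(y, 1))  # ~ 2.0 = E[Y]
print(sample_moment(y, 2))  # ~ 8.0 = E[Y^2]
```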
- Example:
- Exponential($\beta$)
- PDF: $f(y) = \frac{1}{\beta} e^{-y/\beta}$ for $y > 0$
- $E[Y] = \beta$ and $V[Y] = \beta^2$
- Estimate $\beta$ using the method of moments.
- First theoretical moment: $\mu_1' = E[Y] = \beta$
- First sample moment: $m_1' = \frac{1}{n}\sum_{i=1}^{n} Y_i = \bar{Y}$
- Equate the theoretical to the sample moment: $\beta = \bar{Y}$
- The MOM Estimator tells us that $\hat{\beta} = \bar{Y}$.
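A sketch of this estimator in action (assuming a true $\beta = 3$):

```python
import numpy as np

rng = np.random.default_rng(8)
beta_true = 3.0
y = rng.exponential(scale=beta_true, size=5_000)

beta_hat = y.mean()   # MOM: set beta = m1' = ybar
print(beta_hat)       # ~ 3.0
```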
- Example:
- Continuous Uniform Distribution: $Y_i \sim \text{Unif}(0, \theta)$
- Estimate $\theta$ using the MOM Estimator.
- Find the theoretical moment and the sample moment.
- Theoretical: $\mu_1' = E[Y] = \frac{\theta}{2}$
- Sample: $m_1' = \bar{Y}$
- Equate: $\frac{\theta}{2} = \bar{Y}$
- Meaning that the MOM Estimator of $\theta$ tells us that $\hat{\theta} = 2\bar{Y}$.
- Note: in the slides, they calculate $V(\hat{\theta}) = V(2\bar{Y}) = \frac{4}{n^2}\sum_{i=1}^{n} V(Y_i)$.
- By Independence, the variance of the sum is the sum of the variances.
- Since it's uniform, $V(Y_i) = \frac{\theta^2}{12}$, so $V(\hat{\theta}) = \frac{\theta^2}{3n} \to 0$.
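Same sketch for the uniform case (assuming a true $\theta = 10$):

```python
import numpy as np

rng = np.random.default_rng(9)
theta_true = 10.0
y = rng.uniform(0, theta_true, size=5_000)

theta_hat = 2 * y.mean()   # MOM: theta / 2 = ybar  =>  theta_hat = 2 * ybar
print(theta_hat)           # ~ 10.0
```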
- Example:
- Gamma Distribution
- Let $Y_1, \ldots, Y_n$ be random samples from Gamma($\alpha$, $\beta$).
- Find the MOM estimates for $\alpha$ and $\beta$.
- First theoretical moment: $\mu_1' = E[Y] = \alpha\beta$
- First sample moment: $m_1' = \bar{Y}$
- Second theoretical moment: $\mu_2' = E[Y^2] = V[Y] + (E[Y])^2 = \alpha\beta^2 + \alpha^2\beta^2$
- Second sample moment: $m_2' = \frac{1}{n}\sum_{i=1}^{n} Y_i^2$
- Equate both pairs and solve: $\alpha\beta = \bar{Y}$ and $\alpha\beta^2 + \alpha^2\beta^2 = m_2'$, which gives $\hat{\beta} = \frac{m_2' - \bar{Y}^2}{\bar{Y}}$ and $\hat{\alpha} = \frac{\bar{Y}}{\hat{\beta}} = \frac{\bar{Y}^2}{m_2' - \bar{Y}^2}$.
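A sketch of the full two-parameter recipe (assuming true values $\alpha = 2$, $\beta = 3$):

```python
import numpy as np

rng = np.random.default_rng(10)
alpha_true, beta_true = 2.0, 3.0
y = rng.gamma(shape=alpha_true, scale=beta_true, size=50_000)

m1 = y.mean()          # first sample moment: ybar
m2 = np.mean(y ** 2)   # second sample moment

# Solve alpha * beta = m1 and alpha * beta^2 + (alpha * beta)^2 = m2:
beta_hat = (m2 - m1**2) / m1   # alpha * beta^2 = m2 - m1^2
alpha_hat = m1 / beta_hat
print(alpha_hat, beta_hat)     # ~ 2.0, ~ 3.0
```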