STA258 Lecture 05
- Consistency, Convergence in Probability
- Let $X_1, X_2, \ldots$ be a sequence of random variables.
- Let $X$ be a target random variable, defined on the same sample space.
- The sequence $\{X_n\}$ converges in probability to $X$, written $X_n \xrightarrow{p} X$, if for every $\varepsilon > 0$, $\lim_{n \to \infty} P(|X_n - X| \geq \varepsilon) = 0$.
- An application of convergence in probability is the Weak Law of Large Numbers: if $X_1, \ldots, X_n$ are iid with mean $\mu$ and finite variance, then $\bar{X}_n \xrightarrow{p} \mu$. An estimator that converges in probability to its target is called consistent.
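To see the WLLN numerically, we can simulate sample means at increasing $n$ and watch them settle near $\mu$. A minimal sketch (the exponential population, $\mu = 5$, and the seed are arbitrary choices, not from the lecture):

```python
import random

random.seed(1)
mu = 5.0  # true mean of the simulated population

def sample_mean(n):
    """Mean of n iid Exponential(rate 1/mu) draws; E(X) = mu."""
    return sum(random.expovariate(1 / mu) for _ in range(n)) / n

# Larger n pulls the sample mean toward mu, as the WLLN predicts.
for n in (10, 1_000, 100_000):
    m = sample_mean(n)
    print(n, round(m, 4))
```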
- Chebyshev's Inequality
- If $X$ has mean $\mu$ and finite variance $\sigma^2$,
- then for any $\varepsilon > 0$, $P(|X - \mu| \geq \varepsilon) \leq \frac{\sigma^2}{\varepsilon^2}$.
- As we know, $\mathrm{Var}(\bar{X}_n) = \frac{\sigma^2}{n}$, so applying the inequality to $\bar{X}_n$ gives $P(|\bar{X}_n - \mu| \geq \varepsilon) \leq \frac{\sigma^2}{n \varepsilon^2} \to 0$, which proves the WLLN.
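The bound can be checked empirically: the observed fraction of draws at least $k\sigma$ from $\mu$ should never exceed $1/k^2$. A small sketch with a Uniform(0, 1) population (my choice, not from the lecture):

```python
import random

random.seed(2)
# Uniform(0, 1) population: mu = 0.5, sigma = sqrt(1/12)
mu, sigma = 0.5, (1 / 12) ** 0.5
draws = [random.random() for _ in range(100_000)]

def tail_frac(k):
    """Observed P(|X - mu| >= k * sigma) in the simulated sample."""
    return sum(abs(x - mu) >= k * sigma for x in draws) / len(draws)

# The observed tail probability stays below the Chebyshev bound 1/k^2.
for k in (1.5, 2.0, 3.0):
    print(k, tail_frac(k), "bound:", round(1 / k**2, 4))
```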
- Useful lemma (via Chebyshev's Inequality)
- If $E(\hat{\theta}_n) \to \theta$ and $\mathrm{Var}(\hat{\theta}_n) \to 0$ as $n \to \infty$,
- then $\hat{\theta}_n \xrightarrow{p} \theta$, i.e. $\hat{\theta}_n$ is a consistent estimator of $\theta$.
- Ex: $\bar{X}_n$ is consistent for $\mu$, since $E(\bar{X}_n) = \mu$ and $\mathrm{Var}(\bar{X}_n) = \frac{\sigma^2}{n} \to 0$.
- Useful tip: the WLLN applies to transformed samples too. Since $g(X_1), \ldots, g(X_n)$ are also iid, $\frac{1}{n} \sum_{i=1}^{n} g(X_i) \xrightarrow{p} E[g(X)]$ whenever $E[g(X)]$ is finite; in particular $\frac{1}{n} \sum X_i^2 \xrightarrow{p} E(X^2)$.
- Theorem (Continuous Mapping): if $X_n \xrightarrow{p} a$ and $g$ is continuous at $a$, then $g(X_n) \xrightarrow{p} g(a)$.
- Question:
- Show that $S^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2$ is a consistent estimator of $\sigma^2$.
- Solution:
- Rewrite $S^2 = \frac{n}{n-1} \left( \frac{1}{n} \sum_{i=1}^{n} X_i^2 - \bar{X}^2 \right)$.
- The second moment term satisfies $\frac{1}{n} \sum X_i^2 \xrightarrow{p} E(X^2) = \sigma^2 + \mu^2$ by the WLLN.
- Also $\bar{X} \xrightarrow{p} \mu$, so $\bar{X}^2 \xrightarrow{p} \mu^2$
- by the Continuous Mapping Theorem, and $\frac{n}{n-1} \to 1$.
- So $S^2 \xrightarrow{p} (\sigma^2 + \mu^2) - \mu^2 = \sigma^2$, i.e. $S^2$ is a consistent estimator of the variance.
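A simulation makes the consistency of $S^2$ concrete: for larger samples, $S^2$ lands closer to the true variance. A minimal sketch (the $N(0, \mathrm{sd}=2)$ population and seed are arbitrary choices):

```python
import random

random.seed(3)

def s_squared(xs):
    """Sample variance with the n - 1 divisor."""
    n = len(xs)
    xbar = sum(xs) / n
    return sum((x - xbar) ** 2 for x in xs) / (n - 1)

# N(0, sd=2) population: true variance is 4. S^2 should settle near 4.
for n in (20, 2_000, 200_000):
    est = s_squared([random.gauss(0, 2) for _ in range(n)])
    print(n, round(est, 4))
```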
- Standardized Random Variables
- Let $X$ be a random variable with mean $\mu$ and variance $\sigma^2$.
- Then set $Z = \frac{X - \mu}{\sigma}$ to standardise.
- If $Z = \frac{X - \mu}{\sigma}$, then
- $E(Z) = \frac{1}{\sigma} \left( E(X) - \mu \right) = 0$
- So the mean is always $0$.
- $\mathrm{Var}(Z) = \frac{1}{\sigma^2} \mathrm{Var}(X) = \frac{\sigma^2}{\sigma^2} = 1$
- From above, every standardized random variable has mean $0$ and variance $1$.
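The two facts above are easy to confirm numerically: standardising any sample with its known mean and sd gives $z$-values whose mean is near $0$ and whose sd is near $1$. A quick sketch ($\mu = 10$, $\sigma = 3$ are arbitrary):

```python
import random
import statistics

random.seed(4)
mu, sigma = 10.0, 3.0  # arbitrary population parameters
xs = [random.gauss(mu, sigma) for _ in range(50_000)]

# Standardise each value with the known mean and sd.
zs = [(x - mu) / sigma for x in xs]
print(round(statistics.mean(zs), 3), round(statistics.pstdev(zs), 3))
```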
- Example:
- Since weight and height aren't in the same units, you standardize each to a $z$-score and then compare the $z$-scores.
- Example:
- Random Sample: $X_1, \ldots, X_n$ iid with mean $\mu$ and variance $\sigma^2$.
- If the random sample comes from the Normal distribution, then the standardisation $Z = \frac{\bar{X} - \mu}{\sigma / \sqrt{n}}$ gives exactly the Standard Normal distribution.
- Otherwise, by the Central Limit Theorem, the standardised $\bar{X}$ approximately follows a Standard Normal distribution for large $n$.
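To illustrate the approximate case, standardised means from a decidedly non-normal population should still behave like $N(0, 1)$ values, e.g. roughly 95% of them fall in $(-1.96, 1.96)$. A sketch using a Uniform(0, 1) population (my choice of population, $n$, and seed):

```python
import random

random.seed(5)
n = 30                              # sample size per replication
mu, sigma = 0.5, (1 / 12) ** 0.5    # Uniform(0, 1) population

def z_stat():
    """Standardised sample mean from a (non-normal) uniform population."""
    xbar = sum(random.random() for _ in range(n)) / n
    return (xbar - mu) / (sigma / n**0.5)

zs = [z_stat() for _ in range(20_000)]
# If the normal approximation holds, about 95% of z-values land in (-1.96, 1.96).
cover = sum(-1.96 < z < 1.96 for z in zs) / len(zs)
print(round(cover, 3))
```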
- Ex: given the population mean $\mu$ and standard deviation $\sigma$,
- and a sample of size $n$,
- find the probability that $\bar{X}$ exceeds a given cutoff $c$.
- Standardise: $P(\bar{X} > c) = P\left( Z > \frac{c - \mu}{\sigma / \sqrt{n}} \right)$; from the table this is the corresponding upper-tail probability.
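Table lookups of this kind can be reproduced with the identity $\Phi(z) = \frac{1}{2}\left(1 + \mathrm{erf}\!\left(\frac{z}{\sqrt{2}}\right)\right)$, which needs only the standard library. A sketch (the cutoffs 1.96 and 1.645 are just familiar table values, not from the lecture's example):

```python
import math

def phi(z):
    """Standard normal CDF: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(round(1 - phi(1.96), 4))             # upper tail beyond 1.96, about 0.025
print(round(phi(1.645) - phi(-1.645), 4))  # central area, about 0.90
```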
- Ex:
- $\bar{X}_1$ and $\bar{X}_2$ are independent sample means.
- What is the distribution of $\bar{X}_1 - \bar{X}_2$?
- This is our difference statistic: $E(\bar{X}_1 - \bar{X}_2) = \mu_1 - \mu_2$ and, by independence, $\mathrm{Var}(\bar{X}_1 - \bar{X}_2) = \frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}$.
- Ex:
- If $\mu_1 - \mu_2 = 0$, the performance of the two classes is the same.
- If $\bar{X}_1 - \bar{X}_2$ is clearly positive or negative, one class performed better than the other.
- What's the variance of $\bar{X}_1 - \bar{X}_2$? By independence, $\mathrm{Var}(\bar{X}_1 - \bar{X}_2) = \frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}$.
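The variance formula for the difference statistic can be verified by simulation: repeatedly draw two independent samples and check the spread of $\bar{X}_1 - \bar{X}_2$ against $\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}$. A sketch with hypothetical sds and sample sizes (all chosen for illustration):

```python
import random

random.seed(6)
# Hypothetical setup: independent classes with sds 3 and 2, sizes 40 and 60.
n1, n2 = 40, 60
sd1, sd2 = 3.0, 2.0

diffs = []
for _ in range(20_000):
    m1 = sum(random.gauss(0, sd1) for _ in range(n1)) / n1
    m2 = sum(random.gauss(0, sd2) for _ in range(n2)) / n2
    diffs.append(m1 - m2)

mean_d = sum(diffs) / len(diffs)
var_d = sum((d - mean_d) ** 2 for d in diffs) / (len(diffs) - 1)
print(round(var_d, 4), "theory:", sd1**2 / n1 + sd2**2 / n2)
```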
- Suppose $\sigma_1$ and $\sigma_2$ are known and the two samples have equal size $n_1 = n_2 = n$.
- Find the sample sizes so that $\bar{X}_1 - \bar{X}_2$ will be within $\varepsilon$ of $\mu_1 - \mu_2$ with a given probability: set $z_{\alpha/2} \sqrt{\frac{\sigma_1^2 + \sigma_2^2}{n}} = \varepsilon$ and solve for $n = \frac{z_{\alpha/2}^2 (\sigma_1^2 + \sigma_2^2)}{\varepsilon^2}$.
- Always round up with sample sizes.
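The sample-size step above is a one-line computation. A sketch with hypothetical inputs (the sds, margin, and $z$-value are assumed values, not the lecture's numbers):

```python
import math

def required_n(sd1, sd2, eps, z):
    """Smallest equal sample size n satisfying
    z * sqrt((sd1^2 + sd2^2) / n) <= eps, i.e. n >= z^2 (sd1^2 + sd2^2) / eps^2."""
    return math.ceil(z**2 * (sd1**2 + sd2**2) / eps**2)  # always round up

# Hypothetical inputs: sds 2 and 2.5, margin 0.5, z = 1.96 for 95% probability.
print(required_n(2.0, 2.5, 0.5, 1.96))
```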
- Chi-squared Distribution
- Suppose we have $Z_1, \ldots, Z_n$ independent with each $Z_i \sim N(0, 1)$.
- Then $\sum_{i=1}^{n} Z_i^2 \sim \chi^2_{(n)}$, a Chi-squared distribution with $n$ degrees of freedom.
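This definition can be exercised directly: summing squared standard normals and averaging many such draws should give a value near the degrees of freedom, since $E(\chi^2_{(\nu)}) = \nu$. A sketch with $\nu = 5$ (an arbitrary choice):

```python
import random

random.seed(7)
df = 5  # degrees of freedom

def chi2_draw():
    """Sum of df squared independent standard normals."""
    return sum(random.gauss(0, 1) ** 2 for _ in range(df))

draws = [chi2_draw() for _ in range(50_000)]
mean = sum(draws) / len(draws)
print(round(mean, 3))  # mean of a chi-squared with df degrees of freedom is df
```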
- Suppose we have $Y \sim \chi^2_{(\nu)}$.
- Find $a$ and $b$ such that $P(a \leq Y \leq b) = 1 - \alpha$, splitting $\alpha/2$ into each tail.
- We then get $P(Y < a) = \alpha/2$,
- We then get $P(Y > b) = \alpha/2$.
- So $a = \chi^2_{1 - \alpha/2}$ and $b = \chi^2_{\alpha/2}$ (the lower and upper critical values).
- On the table we look these up in the row for $\nu$ degrees of freedom.
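Without a table, the critical values can be approximated by simulating chi-squared draws and taking empirical quantiles. A sketch for $\nu = 5$ (table values for comparison, $\approx 0.831$ and $\approx 12.833$, are standard $\chi^2_5$ 2.5%/97.5% cutoffs):

```python
import random

random.seed(8)
df = 5
draws = sorted(
    sum(random.gauss(0, 1) ** 2 for _ in range(df)) for _ in range(50_000)
)

def quantile(p):
    """Empirical p-quantile of the simulated chi-squared sample."""
    return draws[int(p * (len(draws) - 1))]

# Table values for df = 5: lower 2.5% cutoff ~0.831, upper 97.5% cutoff ~12.833.
print(round(quantile(0.025), 3), round(quantile(0.975), 3))
```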
- Suppose we have a random sample $X_1, \ldots, X_n$ from $N(\mu, \sigma^2)$.
- We know that $\frac{(n-1) S^2}{\sigma^2} \sim \chi^2_{(n-1)}$.
- Consider $E(S^2) = \frac{\sigma^2}{n-1} E\left( \frac{(n-1) S^2}{\sigma^2} \right) = \frac{\sigma^2}{n-1} (n-1) = \sigma^2$.
- It is an unbiased estimator.
- $\hat{\theta}$ is unbiased for estimating $\theta$ if $E(\hat{\theta}) = \theta$.
- Variance of $S^2$:
- Because the variance of any Chi-squared distribution with $\nu$ degrees of freedom is $2\nu$,
- So $\mathrm{Var}\left( \frac{(n-1) S^2}{\sigma^2} \right) = 2(n-1)$, giving $\mathrm{Var}(S^2) = \frac{2 \sigma^4}{n-1}$.
- This means $\mathrm{Var}(S^2) \to 0$ as $n \to \infty$, so $S^2$ is also consistent for $\sigma^2$.
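Both facts, $E(S^2) = \sigma^2$ and $\mathrm{Var}(S^2) = \frac{2\sigma^4}{n-1}$, can be checked by simulating many small normal samples. A sketch with $n = 10$ and $\sigma = 2$ (arbitrary choices; theory predicts mean $4$ and variance $\frac{2 \cdot 16}{9} \approx 3.556$):

```python
import random

random.seed(9)
n, sigma = 10, 2.0  # small samples from N(0, sd=2)

def s_squared():
    xs = [random.gauss(0, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    return sum((x - xbar) ** 2 for x in xs) / (n - 1)

vals = [s_squared() for _ in range(100_000)]
m = sum(vals) / len(vals)
v = sum((x - m) ** 2 for x in vals) / (len(vals) - 1)
print("E(S^2) ~", round(m, 3), "(theory:", sigma**2, ")")
print("Var(S^2) ~", round(v, 3), "(theory:", round(2 * sigma**4 / (n - 1), 3), ")")
```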