What is the variance of the mean of a sample?
Variance. The variance of the sampling distribution of the mean is computed as follows:

$\sigma_M^2 = \frac{\sigma^2}{N}$

That is, the variance of the sampling distribution of the mean is the population variance divided by N, the sample size (the number of scores used to compute a mean).
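As a quick sanity check, here is a minimal NumPy simulation sketch (the normal population and the particular values of σ², N, and the replication count are illustrative assumptions, not from the original): draw many samples of size N, take each sample's mean, and compare the empirical variance of those means to σ²/N.

```python
import numpy as np

rng = np.random.default_rng(0)

sigma2 = 4.0    # population variance (illustrative choice)
N = 25          # sample size
reps = 100_000  # number of simulated samples

# Draw many samples of size N and compute each sample's mean.
samples = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(reps, N))
means = samples.mean(axis=1)

print("empirical variance of sample means:", means.var())
print("theoretical sigma^2 / N:           ", sigma2 / N)
```

With 100,000 replications the two printed values should agree to within a percent or so.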
How do you calculate the variance of a distribution of sample means?
Sampling Variance
For the sum of N independent scores, each with variance $\sigma^2$, the variance of the sum is $N\sigma^2$. Since the mean is $1/N$ times the sum, and scaling a variable by $1/N$ scales its variance by $1/N^2$, the variance of the sampling distribution of the mean is

$\operatorname{Var}(\bar{X}) = \frac{1}{N^2} \cdot N\sigma^2 = \frac{\sigma^2}{N}$
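The same two-step argument can be checked numerically. In this sketch (same illustrative assumptions as above: normal draws, arbitrary σ² and N), the variance of the sum comes out close to Nσ², and dividing the sum by N brings the variance down to σ²/N.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2, N, reps = 4.0, 25, 100_000

# reps independent samples of N scores, each score with variance sigma2
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, N))

# Step 1: the variance of the sum of N independent scores is N * sigma^2.
sums = x.sum(axis=1)
print("Var(sum): ", sums.var(), " vs N*sigma^2 =", N * sigma2)

# Step 2: the mean is (1/N) * sum, so its variance is
# (1/N^2) * N*sigma^2 = sigma^2 / N.
means = sums / N
print("Var(mean):", means.var(), " vs sigma^2/N =", sigma2 / N)
```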