1 vote
If the sample size is multiplied by 4, what happens to the standard deviation of the distribution of sample means?

1 Answer

4 votes

Multiplying the sample size by 4 multiplies the standard error by 1/2, so the standard deviation of the distribution of sample means is halved. A confidence interval for the mean will therefore be about half as wide. The same holds approximately for intervals for a population mean based on the t-distribution, as long as the t multiplier doesn't change much when the sample size increases.
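A quick sketch of the algebra behind this, using the standard formula for the standard error of the sample mean (here σ is the population standard deviation and n the original sample size):

```latex
\documentclass{article}
\begin{document}
% Standard error of the sample mean with sample size $n$:
\[
  \operatorname{SE}(\bar{X}) \;=\; \frac{\sigma}{\sqrt{n}}
\]
% Quadrupling the sample size ($n \to 4n$) halves the standard error:
\[
  \operatorname{SE}_{\text{new}} \;=\; \frac{\sigma}{\sqrt{4n}}
  \;=\; \frac{1}{2}\cdot\frac{\sigma}{\sqrt{n}}
  \;=\; \tfrac{1}{2}\,\operatorname{SE}(\bar{X})
\]
\end{document}
```

So any quantity proportional to the standard error, such as the margin of error of a confidence interval, also shrinks by a factor of 2.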

answered by Leni Kirilov (8.4k points)

