A sample of 100 independent random numbers is taken from this distribution, and its average is used to estimate the mean of the distribution. What is the standard error of this estimate?

asked by User Bgoosman

1 Answer


Answer:

The standard error in estimating the mean = (0.1 × standard deviation of the distribution)

Explanation:

The standard error of the sample mean, σₓ, is related to the standard deviation of the distribution, σ, and the sample size, n, by

σₓ = σ/√n

n = sample size = 100

σₓ = σ/(√100)

σₓ = (σ/10) = 0.1σ

Hence, the standard error in estimating the mean = (0.1 × standard deviation of the distribution)
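The relation above can be checked numerically. As a minimal sketch (the question does not specify the distribution, so a normal distribution with an assumed σ = 5 is used purely for illustration), drawing many samples of size 100 and taking the standard deviation of their means should recover σ/10:

```python
import random
import statistics

# Hypothetical example: the question's distribution is unspecified,
# so we assume a normal distribution with sigma = 5 for illustration.
random.seed(0)
sigma = 5.0
n = 100  # sample size from the question

# Theoretical standard error of the mean: sigma / sqrt(n) = 0.1 * sigma
theoretical_se = sigma / n ** 0.5  # 5 / 10 = 0.5

# Empirical check: the standard deviation of many sample means
# should be close to the theoretical standard error.
means = [
    statistics.fmean(random.gauss(0.0, sigma) for _ in range(n))
    for _ in range(5000)
]
empirical_se = statistics.stdev(means)

print(theoretical_se)  # 0.5
print(empirical_se)    # should be close to 0.5
```

With 5,000 simulated samples, the empirical value typically lands within a few percent of the theoretical 0.1σ.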

answered by User Bhaumik Pandhi
