An unknown data set has a standard deviation of 10. What is the standard deviation when each value in the data set is multiplied by 5?

asked by User Asthme

1 Answer

Let \mathbf x=\{x_i~:~1\le i\le n\} be the sample, with n the size of the sample.

The mean of the sample is


\displaystyle\bar x=\frac1n\sum_{i=1}^n x_i

Let \mathbf y be the same data set but with each value multiplied by 5. Then its mean is


\bar y=\displaystyle\frac1n\sum_{i=1}^n y_i=\frac5n\sum_{i=1}^n x_i=5\bar x

The standard deviation for \mathbf x is given by


s_{\mathbf x}=\sqrt{\displaystyle\frac1{n-1}\sum_{i=1}^n(\bar x-x_i)^2}
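
As a quick numerical illustration (not part of the original answer; the sample values below are made up), this formula can be computed directly in Python:

```python
import math

# Hypothetical sample, chosen only for illustration.
x = [4.0, 9.0, 11.0, 12.0, 17.0]
n = len(x)

x_bar = sum(x) / n  # sample mean
# Sample standard deviation with the n-1 denominator used above.
s_x = math.sqrt(sum((x_bar - xi) ** 2 for xi in x) / (n - 1))

print(x_bar, s_x)
```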

For \mathbf y, you would have


s_{\mathbf y}=\sqrt{\displaystyle\frac1{n-1}\sum_{i=1}^n(\bar y-y_i)^2}

s_{\mathbf y}=\sqrt{\displaystyle\frac1{n-1}\sum_{i=1}^n(5\bar x-5x_i)^2}

s_{\mathbf y}=\sqrt{\displaystyle\frac{25}{n-1}\sum_{i=1}^n(\bar x-x_i)^2}

s_{\mathbf y}=5\sqrt{\displaystyle\frac1{n-1}\sum_{i=1}^n(\bar x-x_i)^2}

s_{\mathbf y}=5s_{\mathbf x}

Since s_{\mathbf x}=10 here, the new standard deviation is 5\cdot10=50.
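
As a sanity check (again with made-up values, not from the original answer), NumPy confirms that both the mean and the sample standard deviation scale by 5:

```python
import numpy as np

# Hypothetical sample values, chosen only for illustration.
x = np.array([4.0, 9.0, 11.0, 12.0, 17.0, 5.0, 8.0, 12.0, 14.0])
y = 5 * x  # each value multiplied by 5

# ddof=1 gives the sample standard deviation (the n-1 denominator above).
print(y.mean() / x.mean())                    # 5.0
print(np.std(y, ddof=1) / np.std(x, ddof=1))  # 5.0
```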
answered by User Abysslover

