3 votes
If the standard deviation of a data set was originally 5, and if each value in the data set was multiplied by 3.6, what would be the standard deviation of the resulting data?

2 Answers

4 votes
You would take 5 and multiply it by 3.6, which gives you 18. Multiplying every value in a data set by a constant multiplies the standard deviation by the absolute value of that constant, so the new standard deviation is 3.6 × 5 = 18. Hope it helped.
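
As a quick numerical check, here is a minimal Python sketch (the sample values below are made up purely for illustration) showing that scaling a data set by 3.6 scales its sample standard deviation by the same factor:

import statistics

# Hypothetical sample data, chosen only for illustration.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
scaled = [3.6 * x for x in data]

s = statistics.stdev(data)           # sample standard deviation of the original data
s_scaled = statistics.stdev(scaled)  # sample standard deviation after scaling

print(s_scaled / s)  # prints 3.6 (up to floating-point rounding)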
answered by User Thom (8.7k points)
4 votes

Answer:

New Standard Deviation = 18

Explanation:


The sample standard deviation is given by:

s=\sqrt{\frac{\sum_{i=1}^{n}(x_i-\bar{x})^2}{n-1}}

When each value is multiplied by 3.6, n is unaffected, since the number of terms stays the same.

But the mean will be affected.


\bar{x}=\frac{\sum_{i=1}^{n} x_i}{n}\implies\text{New mean }=\frac{\sum_{i=1}^{n} 3.6\,x_i}{n}=3.6\,\bar{x}

Also, each term x_i is multiplied by 3.6, so the new terms are 3.6 × x_i.

So the new standard deviation becomes:


s'=\sqrt{\frac{\sum_{i=1}^{n}(3.6\,x_i-3.6\,\bar{x})^2}{n-1}}

\implies s'=3.6\sqrt{\frac{\sum_{i=1}^{n}(x_i-\bar{x})^2}{n-1}}

\implies s'=3.6\times s=3.6\times 5

\implies \textbf{New Standard Deviation}=\mathbf{18}
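
To tie the algebra back to the formula, here is a short Python sketch (the data set is hypothetical, chosen only for illustration) that computes s directly from the definition above and confirms that the scaled data gives s' = 3.6 × s:

import math

def sample_std(xs):
    # Sample standard deviation per the formula above:
    # sqrt(sum((x_i - mean)^2) / (n - 1))
    n = len(xs)
    mean = sum(xs) / n
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))

# Hypothetical sample; the 3.6 ratio holds for any data set.
data = [1.0, 3.0, 6.0, 10.0, 15.0]
scaled = [3.6 * x for x in data]

print(sample_std(scaled), 3.6 * sample_std(data))  # the two values agree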

answered by User Feugy (8.1k points)
