If the standard deviation of a data set was originally 5, and if each value in the data set was multiplied by 4.8, what would be the standard deviation of the resulting data?

asked by User Darkade (7.4k points)

2 Answers

2 votes

Answer: (no explanation given)

answered by User Birsen (7.9k points)
5 votes
When every value in a data set is multiplied by a constant, the standard deviation (like the mean) is multiplied by that same constant. Here each value is multiplied by 4.8, so the new standard deviation is 5 × 4.8 = 24.
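A quick numerical check of the scaling rule, using a hypothetical two-point data set chosen so its population standard deviation is exactly 5:

```python
import statistics

# Hypothetical data set: mean 5, deviations ±5, so pstdev is exactly 5
data = [0, 10]
assert statistics.pstdev(data) == 5.0

# Multiply every value by 4.8, as in the question
scaled = [x * 4.8 for x in data]

print(statistics.pstdev(scaled))  # 24.0, i.e. 5 * 4.8
```

The ratio of the new standard deviation to the old one is always the multiplier itself, regardless of which data set you start from.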
answered by User GuavaMantis (8.7k points)
