2 votes
If the standard deviation of a data set were originally 12, and if each value in the data set were multiplied by 1.75, what would be the standard deviation of the resulting data?

asked by User Go (8.3k points)

2 Answers

2 votes

Answer: 21

Explanation:

Multiplying every value in a data set by a constant multiplies the standard deviation by that same constant, so the new standard deviation is 12 × 1.75 = 21.

answered by User MGwynne (8.0k points)
6 votes

Answer: 21

Explanation:

Take 12 and multiply it by 1.75 (12 × 1.75) and you'll get 21.
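
This works because scaling every value by a constant scales every deviation from the mean by that constant, so the standard deviation scales by it too. Here's a minimal Python sketch to check the arithmetic numerically (the sample values are made up for illustration; any data set shows the same effect):

```python
import statistics

# Hypothetical sample data, chosen only for illustration.
data = [4, 10, 16, 22, 28]
scaled = [x * 1.75 for x in data]

sd_before = statistics.pstdev(data)   # population standard deviation
sd_after = statistics.pstdev(scaled)

print(sd_before)              # 8.485...
print(sd_after)               # 14.849..., which is 1.75 * sd_before
print(sd_after / sd_before)   # 1.75
```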

answered by User Jeenyus (7.5k points)

