56.3k views
0 votes
PLEASE HELP! I WILL GIVE 50 POINTS!!! A radar gun measured the speed of a baseball at 92 miles per hour. If the baseball was actually going 90.3 miles per hour, what was the percent error in this measurement?

asked by User Sachi (9.0k points)

2 Answers

6 votes

1.883% error


\text{percent error} = \left|\frac{\text{actual} - \text{measured}}{\text{actual}}\right| \times 100\% = \left|\frac{90.3 - 92}{90.3}\right| \times 100\% = \frac{1.7}{90.3} \times 100\% \approx 1.883\%
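To double-check the arithmetic, here is a minimal Python sketch of the same formula (the percent_error helper is just an illustrative name, not anything from the original post):

    def percent_error(measured: float, actual: float) -> float:
        """Absolute percent error of a measurement relative to the true value."""
        return abs(measured - actual) / actual * 100

    # The radar gun read 92 mph; the true speed was 90.3 mph.
    print(f"{percent_error(92, 90.3):.3f}%")  # prints 1.883%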

answered by User Gawbul (7.9k points)
1 vote

Answer: ≈ 0.19% error (note: this answer addresses a variant of the question with different numbers)

Explanation:

A radar gun measured the speed of a baseball at 103 miles per hour, while the baseball was actually going 102.8 miles per hour. Applying the same formula, the percent error is |103 − 102.8| / 102.8 × 100% = 0.2 / 102.8 × 100% ≈ 0.19%.

answered by User Saurabh Kumar (8.6k points)
