73.4k views
4 votes
A radar measures the speed of a car at 45 miles per hour. The actual speed of the car is 40 miles per hour. What is the percent error in the reading of the radar?

asked by User Gauranga (8.2k points)

2 Answers

4 votes
Answer: 11.11%

Explanation:
The radar reading was off by 5 miles per hour, so we need to find what percent 5 is of 45.
Take 45 as 100%, since it is the radar's output, and let x% represent the value we are looking for.
That gives the simple pair of equations 100% = 45 and x% = 5.
Dividing the first equation by the second gives 100%/x% = 45/5; taking the reciprocal, x%/100% = 5/45, so x = (5/45) × 100 ≈ 11.11%.
5 is 11.11% of 45, so the radar reading is off by 11.11%.
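
If you want to check the arithmetic, here is a minimal Python sketch of the proportion calculation described above. It uses the radar reading (45 mph) as the 100% base, exactly as this answer does; the function name percent_of is just an illustrative choice, not anything standard.

def percent_of(part, whole):
    """Return what percent `part` is of `whole`."""
    return part / whole * 100

measured = 45  # radar reading, in mph
actual = 40    # true speed, in mph
error = abs(measured - actual)  # 5 mph

# What percent is 5 of 45? (the reading is taken as the base here)
print(round(percent_of(error, measured), 2))  # prints 11.11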
answered by User Jianweichuah (7.6k points)
4 votes

Answer: 45

Explanation:

answered by User Tekz (8.1k points)

