Jace measured a line to be 14 inches long. If the actual length of the line is 13.1 inches, then what was the percent error of the measurement, to the nearest tenth of a percent?

asked by User Migajek (8.9k points)

1 Answer

Answer:

6.9% (the unrounded value is approximately 6.87022901%)

Explanation:

He measured 14 inches against an actual length of 13.1 inches, so the difference is 14 − 13.1 = 0.9 inches. The error is therefore 0.9 out of 13.1. Dividing 0.9 by 13.1 gives 0.0687022901; multiply by 100 to express it as a percent, then round to the desired decimal place. To the nearest tenth of a percent, that is 6.9%.
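As a quick check, the same result follows from the standard percent error formula (error divided by the actual value, times 100):

    percent error = |measured − actual| / actual × 100
                  = |14 − 13.1| / 13.1 × 100
                  = 0.9 / 13.1 × 100
                  ≈ 6.87
                  ≈ 6.9 (to the nearest tenth of a percent)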

answered by User Wiseguy (7.7k points)