asked · 158k views
0 votes
In baseball, the pitcher's mound is 60.5 feet from home plate. If the fastest pitch ever recorded in Major League Baseball was 105.1 miles per hour, how many seconds did it take for the ball to travel from the pitcher's mound to home plate? Round the answer to the nearest tenth.

2 Answers

4 votes

Answer:

t ≈ 0.4 seconds

Explanation:

(1 hour / 105.1 miles) × (1 mile / 5280 feet) × (60 minutes / 1 hour) × (60 seconds / 1 minute) × (60.5 feet) ≈ 0.4 seconds
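
As a sanity check, here is a minimal Python sketch of the same chain of conversion factors (the variable names are illustrative, not part of the original answer):

```python
# Multiply the conversion factors exactly as written above:
# (1 hr / 105.1 mi)(1 mi / 5280 ft)(60 min / 1 hr)(60 s / 1 min)(60.5 ft)
speed_mph = 105.1    # fastest recorded MLB pitch, miles per hour
distance_ft = 60.5   # pitcher's mound to home plate, feet

t = (1 / speed_mph) * (1 / 5280) * 60 * 60 * distance_ft  # seconds
print(round(t, 1))   # 0.4
```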

answered by User Eric Skroch (8.2k points)

7 votes

Answer:

t=0.4 sec

Explanation:

Remember that

1 mile = 5,280 feet
1 hour = 3,600 seconds

Let

s ----> the speed in ft/sec
d ----> the distance in ft
t ----> the time in sec

s = d/t

Solve for t:

t = d/s

Step 1

Convert miles/hour to ft/sec:

105.1 mi/h = 105.1 × (5,280/3,600) ≈ 154.15 ft/sec

Step 2

Find the time:

t = d/s

We have

s = 154.15 ft/sec
d = 60.5 ft

Substitute:

t = 60.5/154.15 ≈ 0.4 sec
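
Here is a short Python sketch of the same two steps (the variable names are illustrative):

```python
# Step 1: convert miles per hour to feet per second.
speed_mph = 105.1
s = speed_mph * 5280 / 3600   # 1 mile = 5,280 ft, 1 hour = 3,600 s
print(round(s, 2))            # 154.15 ft/sec

# Step 2: solve t = d / s with d = 60.5 ft.
d = 60.5
t = d / s
print(round(t, 1))            # 0.4 sec
```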

answered by User Ronal (7.5k points)