4 votes
5. When a kid drops a rock off the edge of a cliff, it takes 4.0 s to reach the ground below. When he throws the rock down, it strikes the ground in 3.0 s. What initial speed did he give the rock?

asked by User Benkc (8.1k points)

2 Answers

1 vote

Answer: 11.4 m/s

Step-by-step explanation:

When the kid drops the rock off the edge of the cliff, it takes 4.0 s to reach the ground below:

u = 0, a = 9.8 m/s^2, t = 4.0 s, s = ?

s= ut + (1/2) a t^2

s = 0 + (1/2) 9.8 (4)^2

s= 78.4 m

Now for the throw, the initial velocity is unknown:

u = ?, s = 78.4 m, a = 9.8 m/s^2, t = 3.0 s

s= ut + (1/2) a t^2

u = (s - (1/2) a t^2) / t

u = (78.4 - (1/2)(9.8)(3)^2) / 3

u = 34.3 / 3

u = 11.4 m/s
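The two-step calculation above can be sketched in Python (a minimal check assuming g = 9.8 m/s^2 and the drop/throw times from the problem):

```python
# Step 1: height of the cliff from the free-fall drop, s = (1/2) g t^2
# Step 2: initial speed from s = u t + (1/2) g t^2, solved for u
g = 9.8          # acceleration due to gravity, m/s^2 (as used in this answer)
t_drop = 4.0     # time for the dropped rock, s
t_throw = 3.0    # time for the thrown rock, s

h = 0.5 * g * t_drop**2                     # cliff height, 78.4 m
u = (h - 0.5 * g * t_throw**2) / t_throw    # initial speed, ~11.4 m/s

print(h, round(u, 1))
```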

answered by User Wheresrhys (7.8k points)
3 votes

Answer: ≈ 11.4 m/s

Step-by-step explanation:

Assume that the only force acting on the rock is gravity, which gives it a constant acceleration of 9.81 m/s^2.

When the kid drops the rock, it starts with an initial velocity of 0 m/s and falls for a time of 4.0 s before reaching the ground. Using the equation:

d = (1/2)at^2 + vit

where d is the distance fallen, a = 9.81 m/s^2 is the acceleration due to gravity, t is the time, and vi is the initial velocity (0 m/s for the drop). Solving for d gives the height of the cliff:

d = (1/2)(9.81 m/s^2)(4.0 s)^2

d = 78.5 m

When the kid throws the rock, it starts with an initial velocity vi and falls for a time of 3.0 s before reaching the ground. Using the same equation:

d = (1/2)at^2 + vit

and solving for vi:

vi = (d - (1/2)at^2) / t

vi = (78.5 m - (1/2)(9.81 m/s^2)(3.0 s)^2) / 3.0 s

vi ≈ 11.4 m/s
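The same rearrangement can be checked numerically (a sketch assuming g = 9.81 m/s^2, as this answer uses):

```python
g = 9.81    # acceleration due to gravity, m/s^2 (as used in this answer)

# Cliff height from the 4.0 s drop: d = (1/2) a t^2
d = 0.5 * g * 4.0**2                # ~78.5 m

# Initial velocity from d = (1/2) a t^2 + vi t, solved for vi
vi = (d - 0.5 * g * 3.0**2) / 3.0   # ~11.4 m/s

print(round(vi, 1))
```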

answered by User Tom Irving (8.4k points)