113k views
4 votes
HELPPPP.

Suppose a bird is 500 feet above the ground. It descends at a steady rate. After 10 seconds, it is 250 feet above the ground.
a) Write an equation that gives the height of the bird as a function of time. Be sure to define your variables!
b) After how many seconds will the bird land on the ground?

asked by User Adsurbum (7.5k points)

2 Answers

2 votes

Answer:

a)
h(t)=500-25t

b)
t=20

Explanation:

a)

Let h = the bird's height above the ground (in feet)

Let t = time (in seconds)

Because the bird descends at a steady rate from 500 feet to 250 feet in 10 seconds, it drops 250 feet in 10 seconds, which is 250/10 = 25 feet per second. The height starts at 500 and decreases by 25 each second:


h(t) = 500-25t

b)

To find the time it takes until the bird reaches the ground, set h equal to 0:


0=500-25t


25t=500


t=20

∴ It takes 20 seconds for the bird to reach the ground.
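As a quick sanity check, here is a minimal Python sketch of the same steps (the numbers 500, 250, and 10 come from the problem; the names height and landing_time are just for illustration):

# Work out the descent rate from the two known heights.
start_height = 500                            # feet above the ground at t = 0
height_at_10s = 250                           # feet above the ground at t = 10 seconds
rate = (start_height - height_at_10s) / 10    # 25 feet per second

def height(t):
    # Height of the bird (in feet) after t seconds: h(t) = 500 - 25t
    return start_height - rate * t

# Part b: solve 0 = 500 - 25t for t.
landing_time = start_height / rate
print(height(10))       # 250.0, matches the given information
print(landing_time)     # 20.0 seconds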

answered by User Sdedelbrock (8.5k points)
1 vote

Answer:

500-10x=250

x = how many feet the bird descends each second

The bird will land on the ground in another 10 seconds.

Explanation:

The bird descended 250 feet in the first 10 seconds, which is 250/10 = 25 feet per second. It still has 250 feet to go, so divide 250 by 25 to get 10. That is how you know the bird will land on the ground in another 10 seconds (20 seconds after it started descending).
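The same arithmetic as a small Python sketch (the variable names are just for illustration):

descent_in_first_10s = 500 - 250    # 250 feet dropped so far
x = descent_in_first_10s / 10       # 25 feet per second, from 500 - 10x = 250
remaining_time = 250 / x            # 10 more seconds until the bird lands
print(x, remaining_time)            # 25.0 10.0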

answered by User Jason Renaldo (8.5k points)

