112k views
5 votes
URGENT, PLEASE HELP. A radio signal travels at 3.00 * 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 * 10^7 meters? Show all the steps that you used to solve this problem.

asked by User Hgolov (7.9k points)

2 Answers

6 votes
t = d/s

t = 3.54 x 10^7 m / 3 x 10^8 m/s

t = 0.118 s

It would take 0.118 seconds for the radio signal to reach Earth.
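As a quick sanity check, here is a minimal Python sketch of the same t = d/s calculation (the variable names are illustrative, not from the original answer):

# compute t = d / s with the numbers from the problem
distance_m = 3.54e7        # satellite altitude, 3.54 x 10^7 m
speed_m_per_s = 3.00e8     # speed of the radio signal, 3.00 x 10^8 m/s
travel_time_s = distance_m / speed_m_per_s
print(f"t = {travel_time_s:.3f} s")   # prints t = 0.118 s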
answered by User Stefan Luv (8.2k points)
5 votes
So distance = rate x time, and if we rearrange this to find the time we get time = distance/rate. Now we can just plug in the numbers.

time (seconds) = 3.54*10^7 / 3*10^8
To solve this we can just divide the parts of the scientific notation separately.

3.54/3=1.18
10^7/10^8=1/10

Now we just multiply our answers together because we separated them before we divided.

1.18 * 1/10 = 0.118

So the answer is 0.118 seconds.
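The same piecewise division can be checked with a short Python sketch (illustrative names, mirroring the mantissa/power-of-ten split above):

# divide the mantissas and the powers of ten separately, then recombine
mantissa = 3.54 / 3            # 1.18
power_of_ten = 10**7 / 10**8   # 1/10
time_s = mantissa * power_of_ten
print(round(time_s, 3))        # 0.118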
answered by User Manjit Kumar (7.6k points)

