A radio signal travels at $3.00\times 10^{8}$ meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of $3.54\times 10^{7}$ meters? Show your work.

1 Answer


Answer:

$0.118\ \text{s}$

Explanation:


v = velocity of the radio signal = $3.00\times 10^{8}\ \text{m/s}$

d = distance from the satellite to the surface = $3.54\times 10^{7}\ \text{m}$

Time is given by


$t = \dfrac{d}{v}$

$\Rightarrow t = \dfrac{3.54\times 10^{7}\ \text{m}}{3.00\times 10^{8}\ \text{m/s}}$

$\Rightarrow t = 0.118\ \text{s}$

The time taken by the signal to travel from the satellite to the surface of Earth is $0.118\ \text{s}$.
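As a quick sanity check, here is a minimal Python sketch of the same calculation (the variable names are just illustrative):

```python
# Time for a radio signal to travel from the satellite to Earth's surface: t = d / v
v = 3.00e8   # speed of the radio signal in m/s (speed of light)
d = 3.54e7   # orbital height of the satellite in m

t = d / v
print(f"t = {t:.3f} s")  # prints: t = 0.118 s
```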

answered by User Hyojin (8.4k points)