A jet's velocity with respect to the air is 520 miles per hour, and its bearing is N45°E. The jet encounters a wind with a velocity of 50 miles per hour from the west.

After leaving Airport A and flying in a straight line for 2 hours (with the wind described above), the jet lands at Airport B. Later, the wind stops, and the jet flies to Airport C, which is 432 miles due south of Airport A. Find the ground distance between Airports B and C, and determine the direction (bearing) the jet must fly to Airport C. Draw a diagram to support your work. Round your answers to the nearest whole number.

1 Answer

Answer:

The ground distance from Airport B to Airport C is approximately 1,436 miles, and the jet must fly on a bearing of approximately S36°W (about 36° west of due south).

Explanation:

Let's start by drawing a diagram to represent the problem. The jet's velocity with respect to the air is 520 miles per hour at a bearing of N45°E, which means 45° east of due north. The wind blows from the west at 50 miles per hour, so it pushes the jet toward the east. We can represent the jet's air velocity as a vector of magnitude 520 pointing N45°E and the wind as a vector of magnitude 50 pointing due east. The jet's ground velocity is the vector sum of these two vectors.

[Jet diagram: the air-velocity vector (520 mph, N45°E) and the wind vector (50 mph, due east) added head to tail to give the ground-velocity vector.]

To find the jet's ground velocity, resolve the air velocity into east and north components and then add the wind. Since the bearing N45°E is 45° east of due north:

east component = 520 sin(45°) ≈ 367.7 mph
north component = 520 cos(45°) ≈ 367.7 mph

The wind from the west adds 50 mph to the east component and nothing to the north component:

east = 367.7 + 50 = 417.7 mph
north = 367.7 mph

The ground speed v is the magnitude of this resultant, and its direction θ (measured east of due north) comes from the tangent:

v = sqrt(417.7^2 + 367.7^2) ≈ 556 mph
tan(θ) = 417.7 / 367.7, so θ = tan^-1(417.7 / 367.7) ≈ 48.6°

So the ground speed of the jet is approximately 556 miles per hour, on a ground track of about N49°E.
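If you want to double-check the arithmetic, here is a minimal Python sketch of this vector addition (standard library only; the variable names are just for illustration):

    import math

    airspeed = 520.0      # mph, bearing N45°E
    wind_east = 50.0      # mph; wind "from the west" blows toward the east

    # resolve the air velocity into east/north components (45° east of due north)
    east = airspeed * math.sin(math.radians(45)) + wind_east   # ≈ 417.7 mph
    north = airspeed * math.cos(math.radians(45))               # ≈ 367.7 mph

    ground_speed = math.hypot(east, north)                # ≈ 556 mph
    track = math.degrees(math.atan2(east, north))         # ≈ 48.6° east of due north
    print(round(ground_speed), round(track, 1))           # 556 48.6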

After flying for two hours with this ground velocity, the jet's displacement from Airport A is:

east: 417.7 × 2 ≈ 835.4 miles
north: 367.7 × 2 ≈ 735.4 miles

so Airport B lies about 835.4 miles east and 735.4 miles north of Airport A, a straight-line distance of roughly 556 × 2 ≈ 1,113 miles.
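The position of Airport B follows from multiplying the ground-velocity components by the flight time. This short self-contained snippet is only a sketch to verify that arithmetic:

    import math

    east = 520 * math.sin(math.radians(45)) + 50    # ground-velocity east component, ≈ 417.7 mph
    north = 520 * math.cos(math.radians(45))        # ground-velocity north component, ≈ 367.7 mph

    b_east, b_north = 2 * east, 2 * north           # Airport B after 2 hours, in miles from A
    print(round(b_east, 1), round(b_north, 1))      # 835.4 735.4
    print(round(math.hypot(b_east, b_north)))       # straight-line distance A to B: 1113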

Now let's find the distance between Airports B and C. Airport C is 432 miles due south of Airport A, so relative to Airport B it lies about 835.4 miles to the west and 735.4 + 432 = 1,167.4 miles to the south. By the Pythagorean theorem:

distance = sqrt(835.4^2 + 1,167.4^2)
distance ≈ 1,436 miles
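With Airport A as the origin and (east, north) as coordinates, the B-to-C distance becomes a one-line check (again just a sketch, not part of the required work):

    import math

    east = 520 * math.sin(math.radians(45)) + 50     # ground-velocity east component, mph
    north = 520 * math.cos(math.radians(45))         # ground-velocity north component, mph
    b = (2 * east, 2 * north)                        # Airport B ≈ (835.4, 735.4) miles from A
    c = (0.0, -432.0)                                # Airport C: 432 miles due south of A

    bc = math.hypot(c[0] - b[0], c[1] - b[1])
    print(round(bc))                                 # 1436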

Finally, let's find the direction (bearing) the jet must fly to reach Airport C from Airport B. From B, Airport C lies to the south and to the west, so we measure the angle from due south toward the west:

tan(θ) = 835.4 / 1,167.4
θ = tan^-1(835.4 / 1,167.4) ≈ 35.6°

So the angle between due south and the required direction is approximately 36°, toward the west.

Therefore, to reach Airport C from Airport B, the jet must fly on a bearing of approximately S36°W, and the ground distance between Airports B and C is approximately 1,436 miles.
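The bearing falls out of the same displacement vector. This last sketch hard-codes the coordinates of B found above (an assumption carried over from the earlier steps) and measures the angle from due south toward the west:

    import math

    b = (835.39, 735.39)     # Airport B, (east, north) miles from Airport A
    c = (0.0, -432.0)        # Airport C, 432 miles due south of Airport A

    d_east = c[0] - b[0]     # negative: C is west of B
    d_north = c[1] - b[1]    # negative: C is south of B

    theta = math.degrees(math.atan2(abs(d_east), abs(d_north)))   # angle west of due south
    print(f"S{theta:.0f}°W")                                      # S36°W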

I hope this helps! Let me know if you have any other questions or if there is anything else I can help you with.

answered by User Ginchen