We can use the Pythagorean theorem to determine the distance of the plane from the airport.
The plane first travels $19$ km east, so we can draw a line segment of length $19$ to the right (east) on a coordinate plane.
Then, the plane travels $24$ km south, so we can draw a line segment of length $24$ downward (south) from the end of the first line segment.
Because east and south are perpendicular directions, these two segments meet at a right angle. The starting point, the turning point, and the plane's final position therefore form a right triangle, and the distance from the plane to the airport is the hypotenuse of this triangle.
Using the Pythagorean theorem, we can find the hypotenuse:
\begin{align*}
c^2 &= a^2 + b^2 \\
c^2 &= 19^2 + 24^2 \\
c^2 &= 361 + 576 \\
c^2 &= 937 \\
c &= \sqrt{937} \\
c &\approx 30.6
\end{align*}
Therefore, the plane is approximately $30.6$ km from the airport.
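As a quick numerical check (not part of the solution itself), here is a minimal Python sketch that repeats the same Pythagorean calculation; the variable names are illustrative only.

```python
import math

# Legs of the right triangle: 19 km east and 24 km south.
east_km = 19
south_km = 24

# Hypotenuse via the Pythagorean theorem: c = sqrt(a^2 + b^2).
distance_km = math.hypot(east_km, south_km)

print(f"Distance from airport: {distance_km:.1f} km")  # ≈ 30.6 km
```

Running this prints a distance of about $30.6$ km, matching the result above.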