Final answer:
The distance between two parallel hyperplanes equals the absolute difference of their constant terms divided by the norm of their common normal vector. The correct formula for the distance is d = |b2 - b1| / ||a||.
Step-by-step explanation:
The question asks for the distance between two parallel hyperplanes {x ∈ R^n : a^T x = b1} and {x ∈ R^n : a^T x = b2}, where a is their common normal vector.
To find the distance, consider a point x1 on the first hyperplane and a point x2 on the second hyperplane such that the segment joining them is perpendicular to both hyperplanes.
The distance d between the hyperplanes is the length of this segment, ||x2 - x1||.
Because the segment is perpendicular to both hyperplanes, the vector x2 - x1 is parallel to the normal a, so x2 - x1 = t a for some scalar t.
Since a^T x1 = b1 and a^T x2 = b2, we get a^T(x2 - x1) = b2 - b1, i.e. t ||a||^2 = b2 - b1, so t = (b2 - b1) / ||a||^2.
Therefore d = ||x2 - x1|| = |t| ||a|| = |b2 - b1| / ||a||, and the correct answer is (a) |b2 - b1| / ||a||.
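As an illustrative numerical check (not part of the original question), here is a minimal NumPy sketch of the formula, assuming both hyperplanes share the same normal vector a; the function name parallel_hyperplane_distance is chosen here for illustration.

    import numpy as np

    def parallel_hyperplane_distance(a, b1, b2):
        # Distance between the parallel hyperplanes a^T x = b1 and a^T x = b2.
        a = np.asarray(a, dtype=float)
        return abs(b2 - b1) / np.linalg.norm(a)

    # Example in R^3: the planes 2x + y - 2z = 1 and 2x + y - 2z = 7.
    # ||a|| = sqrt(4 + 1 + 4) = 3, so d = |7 - 1| / 3 = 2.
    print(parallel_hyperplane_distance([2.0, 1.0, -2.0], 1.0, 7.0))  # prints 2.0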