A double-slit diffraction pattern is formed on a distant screen. If the separation between the slits decreases, what happens to the distance between interference fringes? Assume the angles involved remain small.

asked by User Yaquelin (8.2k points)

1 Answer


Answer:

The distance between the interference fringes increases.

Step-by-step explanation:

For double-slit interference, the distance of the m-th order bright fringe (maximum) from the centre of the pattern on a distant screen is

y = \frac{m \lambda D}{d}

where \lambda is the wavelength of the light, D is the distance from the slits to the screen, and d is the separation between the slits. The distance between two consecutive bright fringes (orders m and m+1) is therefore

\Delta y = \frac{(m+1)\lambda D}{d} - \frac{m \lambda D}{d} = \frac{\lambda D}{d}

We see that the fringe spacing is inversely proportional to the slit separation d. Therefore, when the separation between the slits decreases, the distance between the interference fringes increases.
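
As a quick numerical check of this inverse proportionality, here is a minimal Python sketch; the wavelength, screen distance, and slit separations used below are assumed example values, not given in the question.

```python
# Numerical check of the small-angle fringe-spacing formula Δy = λD/d.
# All numbers below are illustrative assumptions, not from the original question.

wavelength = 600e-9   # λ: 600 nm, typical visible light (assumed)
D = 2.0               # distance from slits to screen, in metres (assumed)

def fringe_spacing(d):
    """Spacing between adjacent bright fringes, Δy = λD/d (small angles)."""
    return wavelength * D / d

d1 = 0.20e-3          # original slit separation: 0.20 mm (assumed)
d2 = d1 / 2           # slits moved closer together

print(f"Δy at d = {d1*1e3:.2f} mm: {fringe_spacing(d1)*1e3:.1f} mm")
print(f"Δy at d = {d2*1e3:.2f} mm: {fringe_spacing(d2)*1e3:.1f} mm")
# Halving d doubles the fringe spacing, consistent with Δy ∝ 1/d.
```

Running this prints a spacing of 6.0 mm for d = 0.20 mm and 12.0 mm for d = 0.10 mm, so decreasing the slit separation spreads the fringes farther apart.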

answered by User AfamO (8.1k points)