Use the sandwich theorem to find the limit, as n → ∞, of

1/√(n² +1) + 1/√(n² +2) + 1/√(n² +3) + ... + 1/√(n² +n)

1 Answer

Final answer:

Using the sandwich theorem, we trap the sum 1/√(n² +1) + ... + 1/√(n² +n) between the bounds n/√(n²+n) and n/√(n²) = 1, both of which converge to 1 as n approaches infinity. Thus, the limit of the sum is 1.

Step-by-step explanation:

To find the limit of the sum Sₙ = 1/√(n²+1) + 1/√(n²+2) + ... + 1/√(n²+n) using the sandwich theorem, we need lower and upper bounds on Sₙ that converge to the same limit as n approaches infinity. Recall the theorem: if aₙ ≤ Sₙ ≤ cₙ for all n, and both aₙ and cₙ converge to L, then Sₙ converges to L as well.

Since n² is less than n²+k for every positive integer k, √(n²) < √(n²+k), and therefore each term satisfies 1/√(n²+k) < 1/√(n²) = 1/n. This gives an upper bound for each term.

A lower bound follows similarly: n²+k ≤ n²+n for k = 1, ..., n, so √(n²+k) ≤ √(n²+n) and hence 1/√(n²+k) ≥ 1/√(n²+n). Summing these bounds over the n terms traps Sₙ between n/√(n²+n) and n/√(n²), as written out below.
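
Written out in one chain, with Sₙ denoting the sum as above:

1/√(n²+n) ≤ 1/√(n²+k) ≤ 1/√(n²) = 1/n,   for each k = 1, 2, ..., n,

and summing over the n terms,

n/√(n²+n) ≤ Sₙ ≤ n/√(n²) = 1.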

As n goes to infinity, the lower bound n/√(n²+n) = 1/√(1 + 1/n) converges to 1, while the upper bound n/√(n²) equals 1 for every n. Since both bounds converge to 1, the sandwich theorem gives that the limit of Sₙ is 1.
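
As an optional numerical sanity check (not a substitute for the proof), the short Python sketch below evaluates Sₙ together with both bounds for a few values of n; the helper name partial_sum is just an illustrative choice, not anything from the problem:

import math

def partial_sum(n):
    # S_n = 1/sqrt(n^2+1) + 1/sqrt(n^2+2) + ... + 1/sqrt(n^2+n)
    return sum(1.0 / math.sqrt(n * n + k) for k in range(1, n + 1))

for n in (10, 100, 1000, 10000):
    lower = n / math.sqrt(n * n + n)  # lower bound: n/sqrt(n^2+n)
    upper = n / math.sqrt(n * n)      # upper bound: n/sqrt(n^2) = 1
    print(f"n = {n:>5}:  {lower:.6f} <= {partial_sum(n):.6f} <= {upper:.6f}")

Each printed row shows the sum sitting between the two bounds, with all three values approaching 1 as n grows, consistent with the limit above.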

answered by Mayank Bansal
