A person invests 5500 dollars in a bank. The bank pays 4.5% interest compounded annually. To the nearest tenth of a year, how long must the person leave the money in the bank until it reaches 6700 dollars?

asked by Avelis (8.0k points)

1 Answer


We can use the formula for compound interest to solve this problem:

A = P(1 + r/n)^(nt)

where:

A = the final amount (6700 dollars)

P = the principal amount (5500 dollars)

r = the annual interest rate (4.5% or 0.045)

n = the number of times the interest is compounded per year (once annually)

t = the time period (in years) for which the money is invested
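These quantities map directly onto a small helper function. This is just a sketch; the name `compound_amount` is illustrative, not from the original answer:

```python
def compound_amount(principal, rate, years, periods_per_year=1):
    """Final amount under compound interest: A = P(1 + r/n)^(n*t)."""
    return principal * (1 + rate / periods_per_year) ** (periods_per_year * years)
```

For example, `compound_amount(5500, 0.045, 1)` gives the balance after one year of annual compounding.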

Substituting these values in the formula, we get:

6700 = 5500(1 + 0.045/1)^t = 5500(1.045)^t

Dividing both sides by 5500, we get:

1.21818181818 = 1.045^t

Taking the logarithm of both sides (any base works), we get:

log(1.21818181818) = t · log(1.045)

Solving for t:

t = log(1.21818181818) / log(1.045)

Evaluating this with a calculator, we get:

t ≈ 4.5

Therefore, the person must leave the money in the bank for approximately 4.5 years (to the nearest tenth of a year) until it reaches 6700 dollars.
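The arithmetic can be double-checked with Python's standard math module:

```python
import math

A, P, r = 6700.0, 5500.0, 0.045
# Annual compounding: A = P(1 + r)^t, so t = log(A/P) / log(1 + r)
t = math.log(A / P) / math.log(1 + r)
print(round(t, 1))  # → 4.5
```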

answered by Milla Sense (7.8k points)