I am stuck on a problem that I believe is algebraically solvable:
If an account starts with $3000 and $200 is withdrawn annually, the account is empty in 15 years.
If 3% interest is paid on the account annually, then it takes longer for the account to reach zero.
For the first year, (3000 − 200) + (3000 − 200)(0.03) = (3000 − 200)(1.03) would be the value of the account going into the second year, but what is the correct formula for the number of years it takes the account to reach zero with annual withdrawals of $200?
I am sure there is an algebraic formula that gives the answer; would anyone care to help out?
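For what it's worth, here is a quick numerical check of the recurrence I have in mind, as a sketch rather than an answer. It assumes the $200 is withdrawn at the start of each year and the 3% interest is then credited on the remainder, matching the expression above:

```python
# Simulate the balance year by year: withdraw $200 at the start of each year,
# then credit 3% interest on what remains.
balance = 3000.0
rate = 0.03
withdrawal = 200.0

years = 0
while balance > 0:
    balance = (balance - withdrawal) * (1 + rate)
    years += 1

print(years)  # number of annual withdrawals until the balance is exhausted
```

This counts roughly 20 years instead of 15, so the interest clearly delays the depletion; I'm after the algebraic formula that produces that number directly.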
Thanks