Hey, thanks in advance!
I'm at such a loss with this problem that I don't know where to begin. I tried a few things, but I felt I was overcomplicating it. I was told there's a formula that applies here, and I'd really appreciate being reminded what it is. The problem is:
A wealthy donor has given MIT $500,000 to be used to pay the tuition of one student every other year until the money
runs out. Under the assumptions that
a. The current annual tuition is $34,986 per year,
b. Tuition goes up at a rate of 4.1%/year, and
c. The principal will be invested so that it earns 5%/year,
How many students will the donation pay for?
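To make the setup concrete, here's a rough brute-force check of how I'm reading the problem. It is just a sketch under my own assumptions (the first payment happens immediately, each payment covers one year of tuition at the then-current rate, and between payments the balance earns two years of 5% interest while tuition takes two years of 4.1% increases); I'm not certain that's the intended timing, and it's not the formula I'm after.

```python
# Brute-force sketch of the fund, under assumed timing (not the closed-form approach).
balance = 500_000.00   # the donation
tuition = 34_986.00    # current annual tuition

students = 0
while balance >= tuition:
    balance -= tuition          # pay one student's tuition now
    students += 1
    balance *= 1.05 ** 2        # two years of 5% interest on what's left
    tuition *= 1.041 ** 2       # two years of 4.1% tuition increases

print(students)
```

I'd still rather understand the recurrence/closed-form formula that applies here than rely on counting it out like this, so any pointer to that would be great.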