This is not a homework problem, but a real-world problem I'm having at work.

I need to find an "initial amount" that, when increased by a percentage (11.52%), gives an "end amount" of $60,532.61.

My boss told me to take $60,532.61 × (1 − 0.1152) to get the "initial amount," but that just isn't working for me. When I do that, I get $53,559.25. And when I check it by taking $53,559.25 + ($53,559.25 × 0.1152), I end up with $59,729.28, which is NOT $60,532.61.
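A quick sketch (in Python, just to reproduce the arithmetic above) confirms the mismatch I'm describing:

```python
# Apply the boss's suggested formula, then verify by
# adding 11.52% back onto the result.
end_amount = 60532.61
candidate = end_amount * (1 - 0.1152)    # boss's suggestion
check = candidate + candidate * 0.1152   # grow candidate by 11.52%

print(round(candidate, 2))  # 53559.25
print(round(check, 2))      # 59729.28 -- not 60532.61
```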

What am I doing wrong?