So here's the deal: I'm trying to create a new kind of fantasy football format for my friends that involves assigning fictitious contract values to NFL players based on fantasy projections.
After trying many, many variations of how to produce a fair but reality-reflective set of values, I've found that the compound-interest formula works best.
Basically, if Adrian Peterson is ranked #1 and worth approximately $10,000,000.00, and Anthony Allen is ranked #100 and worth $400,000, then plotting ranks 99–2 along a 3.3% compounding curve comes out pretty well.
A = 400,000 × (1 + 0.033/1)^100 = $10,282,344.33 — that's the equation for figuring out Adrian Peterson's fictitious salary. But now I need the curve to look just slightly more logarithmic.
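For reference, here is a quick sketch of the curve as described. The exponent is assumed to be 101 − rank (an assumption on my part, since that is what reproduces the $10,282,344.33 figure for rank 1 and roughly $400,000 for rank 100):

```python
# Current compound-interest curve: rank 100 is roughly the $400,000
# baseline, and each step up in rank compounds the value by 3.3%.
BASE = 400_000
RATE = 0.033

def salary(rank: int) -> float:
    """Fictitious salary for a player at the given rank (1 = best).
    Exponent 101 - rank is an assumption inferred from the rank-1 figure."""
    return BASE * (1 + RATE) ** (101 - rank)

for r in (1, 51, 100):
    print(f"Rank {r:3d}: ${salary(r):,.2f}")
```

Rank 51 lands a bit over $2,000,000 on this curve, which is the value I want to pull down.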
As it stands right now, the 51st-ranked player is worth $2,028,037.90, and I need him to be worth approximately $1,000,000.
What can I do to this formula to make those points match up and have everything else fall into place? How can I "flare" out the edges of the curve a little better?
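One possible approach (a sketch, not necessarily the right answer): a plain compound-interest curve has only two free parameters, so it can't hit three targets at once. Keeping the exponential shape but letting the exponent be quadratic in the number of steps above the baseline gives exactly one extra degree of freedom, enough to force rank 100 → $400,000, rank 51 → $1,000,000, and rank 1 → $10,000,000 simultaneously, with the "flare" concentrated at the top:

```python
import math

# salary(rank) = 400,000 * exp(a*x + b*x^2), where x = 100 - rank.
# x = 0 (rank 100) gives $400,000 automatically; a and b are solved
# so that rank 51 -> $1,000,000 and rank 1 -> $10,000,000.
BASE = 400_000

# Constraints: a*49 + b*49^2 = ln(1,000,000 / 400,000)   (x = 49)
#              a*99 + b*99^2 = ln(10,000,000 / 400,000)  (x = 99)
t1 = math.log(1_000_000 / BASE)
t2 = math.log(10_000_000 / BASE)

# Solve the 2x2 linear system by Cramer's rule.
det = 49 * 99**2 - 99 * 49**2
a = (t1 * 99**2 - t2 * 49**2) / det
b = (49 * t2 - 99 * t1) / det

def salary(rank: int) -> float:
    """Quadratic-exponent curve through all three anchor points."""
    x = 100 - rank
    return BASE * math.exp(a * x + b * x * x)

for r in (1, 25, 51, 75, 100):
    print(f"Rank {r:3d}: ${salary(r):,.2f}")
```

Since both a and b come out positive, the curve is strictly increasing toward rank 1 and steepens near the top, which is the flared shape described above.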
-thanks