Adjusted Rate of Decay Projections vs Actuals

Firehazurd

Hey folks!

I've got a rate-of-decay model that projects a certain decay out over future intervals, and then, as time passes, an end user inputs the actual data for the most recent interval.


interval | initial value | decay proj | decay value | actual initial value | decay actual | decay actual value
1        | 100.00        | -2%        | -2.0        | 100.0                | 0%           | 0.0
2        |  98.00        | -2%        | -2.0        | 100.0                | 0%           | 0.0
3        |  96.04        | -2%        | -1.9        | 100.0                |              |
4        |  94.12        | -2%        | -1.9        |  98.1                |              |
5        |  92.24        | -2%        | -1.8        |  96.2                |              |
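For reference, the projected columns above are just simple compounding at the projected rate. Here's a rough Python sketch (names are mine, purely for illustration) of how they're generated:

```python
# Rough sketch of how the projected columns in the table are generated:
# each interval's decay value is the projected rate applied to that
# interval's initial value, and the result carries into the next interval.

def project(initial_value, rate, intervals):
    """Return (initial value, decay value) pairs for each interval."""
    rows, value = [], initial_value
    for _ in range(intervals):
        decay_value = value * rate            # e.g. 100.00 * -0.02 = -2.0
        rows.append((round(value, 2), round(decay_value, 2)))
        value += decay_value                  # next interval's initial value
    return rows

print(project(100.00, -0.02, 5))
# [(100.0, -2.0), (98.0, -1.96), (96.04, -1.92), (94.12, -1.88), (92.24, -1.84)]
```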


Looking at the example above, intervals 1 and 2 have already passed, and the actual decay during those intervals was 0%. My model expects the past results to have no impact on future values, so the future rates have to be adjusted to compensate. I'm currently doing this by taking the aggregate of the two past actual decay values plus the three remaining projected decay values, and dividing by the sum of the corresponding initial values. In the example above, that tells me the decay should be set at -1.17% in order to net the same total decay as my original projection. When I use that rate, however, whether by manually finding the decay value interval by interval or by using the decay rate formula y = a(1 - r)^x, I end up with a value that is slightly different from what I get by simply adding the remaining projected decay values to the latest initial value.
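To make the arithmetic concrete, here's roughly what I'm doing, spelled out in Python (my own variable names, and I may be oversimplifying; the numbers come from the table above, kept unrounded):

```python
# Adjusted-rate calculation: total decay (actual for passed intervals plus
# projected for remaining intervals) divided by the corresponding initial
# values, then two ways of rolling that forward that don't quite agree.

rate_proj = -0.02

# Remaining projected intervals 3-5, starting from the projected 96.04.
proj_initials, proj_decays, value = [], [], 96.04
for _ in range(3):
    proj_initials.append(value)
    proj_decays.append(value * rate_proj)
    value += value * rate_proj

# Actual results for intervals 1-2: no decay at all.
actual_initials = [100.00, 100.00]
actual_decays   = [0.0, 0.0]

adj_rate = sum(actual_decays + proj_decays) / sum(actual_initials + proj_initials)
print(f"adjusted rate: {adj_rate:.2%}")                    # -1.17%

# Option A: compound the adjusted rate over the intervals, y = a(1 - r)^x.
compounded = 100.00 * (1 + adj_rate) ** 5
print(f"compounded at adjusted rate: {compounded:.2f}")    # ~94.28

# Option B: add the remaining projected decay values to the latest actual value.
summed = 100.00 + sum(proj_decays)
print(f"latest actual + projected decays: {summed:.2f}")   # ~94.35
```

The two results differ by a small amount (about 0.07 in this example), which is the variance I can't get to reconcile.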

This is a problem because the data doesn't get laid out as a single set. Multiple concurrent projections can overlap and affect the decay rates, so I need to be able to back into the adjustment I should make to each one without having to parse out each individual set.

The variances are small, and they only show up between the point when a particular data set gets its first actual value and the point when it gets its last, but it's troubling not to be able to get it to reconcile.

I likely didn't describe my problem perfectly, but let me know if you have any questions or if you know exactly what I'm doing wrong!

Thanks!
 