To start, I want to say that this is not a school question; I am not in a statistics class, nor do I know whether my question is even answerable.
But anyway, here's my question: statistically, snow in October in Washington D.C. should occur about once every 43.3 years, which means that in any given year the chance is only about 2.3%. How does the percent chance change as you get closer to a multiple of 43.3? Does it change at all?
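(If I understand return periods correctly, those two numbers are really the same statement, since

$$p = \frac{1}{43.3} \approx 0.0231 \approx 2.3\%,$$

i.e. a once-per-43.3-years event has about a 2.3% chance of happening in any single year.)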
I was thinking maybe something like
$$\Delta P = 43.3x - T,$$
in which the change in percent chance ($\Delta P$) is equal to the difference between a multiple of 43.3 ($43.3x$, with $x$ a positive integer) and the current time ($T$).
Ideally this would lead to a function that could be graphed, or one you could solve for the percent chance at any given time.
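In case it helps, here is a rough simulation I sketched to test the naive version of this, under the assumption (which may be exactly what's wrong) that each October is an independent trial with probability 1/43.3. The "within half a year of a multiple" cutoff is just an arbitrary choice of mine for illustration:

```python
import random

# Rough sketch: assume (perhaps wrongly) that each October is an
# independent trial with snow probability p = 1/43.3, then compare
# how often it snows in years "near" a multiple of 43.3 versus in
# years overall. The 0.5-year cutoff is an arbitrary choice.

random.seed(0)
p = 1 / 43.3         # per-year chance of October snow, ~2.3%
n_years = 1_000_000  # number of simulated years

all_hits = 0                 # snowy Octobers overall
near_hits = near_total = 0   # snowy Octobers / years near a multiple

for year in range(1, n_years + 1):
    snowed = random.random() < p
    all_hits += snowed
    # "Near a multiple" = within half a year of 43.3 * x for integer x.
    if abs(year - 43.3 * round(year / 43.3)) <= 0.5:
        near_total += 1
        near_hits += snowed

print(f"overall frequency:        {all_hits / n_years:.4f}")
print(f"frequency near multiples: {near_hits / near_total:.4f}")
```

Under that independence assumption, both printed frequencies should come out around 0.023, which would suggest the chance doesn't change at all as a multiple approaches; but the independence assumption is exactly the part I'm unsure about.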
Any suggestions?