Basically, what I am trying to do is scale a standard deviation using a value x between 0 and 1.
For example, suppose the standard deviation is S = .0001 and y is the scaled value produced by applying x to S.
If x is in the interval [0, .5), then y < .0001, with y getting incrementally smaller as x decreases.
If x is in the interval (.5, 1], then y > .0001, with y getting incrementally larger as x increases (so y = S at the midpoint x = .5).
How would you accomplish this?
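For illustration, here is a minimal sketch of one way the behavior could be achieved, assuming a simple linear scaling factor of 2x (so y = S exactly at x = .5); other monotonic factors such as 2^(2x - 1) would satisfy the same shrink/grow requirement. The function name scale_std is just a placeholder.

```python
def scale_std(S: float, x: float) -> float:
    """Scale the standard deviation S using x in (0, 1).

    Assumed linear factor 2*x: y < S when x < 0.5, y == S at x == 0.5,
    and y > S when x > 0.5.
    """
    if not 0 < x < 1:
        raise ValueError("x must lie strictly between 0 and 1")
    return S * (2 * x)

# Example with S = 0.0001:
# scale_std(0.0001, 0.25) -> 0.00005   (smaller, since x < 0.5)
# scale_std(0.0001, 0.5)  -> 0.0001    (unchanged at the midpoint)
# scale_std(0.0001, 0.75) -> 0.00015   (larger, since x > 0.5)
```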