Hi folks!
I'm writing a program that will graphically move a ball along a series of parabolas to simulate the ball bouncing. I'm using a nested loop to do this.
I used the following quadratic equation to plot the path of the ball:
f(x) = 0.068x^2 - 20.4x + 1620
then I shifted it using f(x - shift) = 0.068(x - shift)^2 - 20.4(x - shift) + 1620, where 'shift' is a variable that the outer loop advances.
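Here's a minimal sketch of that nested loop in Python (the bounce width of 300 is my own inference from f(0) = f(300) = 1620, and the drawing call is a placeholder, not real code from my program):

    # Rough sketch of the nested-loop bounce, assuming screen coordinates
    # (y grows downward, so the upward-opening parabola reads as an arc).
    def f(x):
        """Path of one bounce: f(0) == f(300) == 1620, vertex at (150, 90)."""
        return 0.068 * x**2 - 20.4 * x + 1620

    BOUNCE_WIDTH = 300  # one arc spans x in [0, 300] since f(0) == f(300)

    for bounce in range(5):                 # outer loop: one pass per bounce
        shift = bounce * BOUNCE_WIDTH
        for x in range(shift, shift + BOUNCE_WIDTH + 1):  # inner loop: trace the arc
            y = f(x - shift)                # same arc, translated right by 'shift'
            print(x, y)                     # stand-in for the actual drawing call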
So far, so good. The ball 'bounces' along the path of the parabola, advancing along the x-axis with each bounce. However, I would like each successive parabola to shrink, to simulate the ball losing energy with each bounce. To be clear, I don't want a shift in y: I need to scale the parabola so that the ball's starting y-value remains constant.
So I believe that I need a formula for scaling polynomials. Any help or nudge in the right direction would be greatly appreciated.
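For reference, the effect I'm after would look something like this vertical scaling about the starting height, if that's the right track (a rough sketch; the 0.7 shrink factor is just a number I picked):

    def f(x):
        return 0.068 * x**2 - 20.4 * x + 1620

    Y0 = f(0)   # 1620: the ball's starting height, which must stay fixed

    def g(x, s):
        """Arc scaled vertically about the line y = Y0.

        g(0) == g(300) == Y0 for any s, so the start height never moves;
        0 < s < 1 flattens the arc into a shallower bounce.
        """
        return Y0 + s * (f(x) - Y0)

    s = 1.0
    for bounce in range(5):
        # ... trace the shifted arc with g(x - shift, s) in place of f(x - shift) ...
        s *= 0.7    # shrink each successive bounce (0.7 is an arbitrary guess)

If I expand g by hand, the constant term s*1620 + (1 - s)*1620 stays 1620 because Y0 equals the constant term here, so this works out to g(x) = 0.068s*x^2 - 20.4s*x + 1620, i.e. just the x^2 and x coefficients get multiplied by s. Is that the scaling formula I'm looking for?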
Regards,
CJ