Code:
void TimerManager(void)
{
    gametime = timer->getTime();
    frameduration = gametime - oldgametime; // gametime and oldgametime are global variables
    oldgametime = gametime;
    // How many SIXTIETHS OF A SECOND between the last frame and the current one?
    fpsmultiplicationfactor = frameduration * 0.06;
}
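For concreteness, assuming timer->getTime() returns milliseconds (as Irrlicht's ITimer::getTime() does): a frame at 60 fps lasts about 16.7 ms, giving fpsmultiplicationfactor ≈ 1.0, while a frame at 30 fps lasts about 33.3 ms, giving ≈ 2.0. In other words, the factor is the number of sixtieths of a second the last frame took.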
Code:
if (!dukerot.Y)
    dukex = dukepos.X + DUKESTEP * fpsmultiplicationfactor;
else
    dukex = dukepos.X - DUKESTEP * fpsmultiplicationfactor;
This works as expected, and the running speed is exactly the same at any frame rate.
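As a quick check: at 60 fps the factor is about 1 and there are 60 frames per second, so Duke covers roughly 60 × DUKESTEP per second; at 30 fps the factor is about 2 but there are only 30 frames, which again gives roughly 60 × DUKESTEP per second.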
Jumping and falling are where the problem lies. I know the theory: to make it look realistic, jumping must be a uniformly decelerating motion, where the velocity starts at a maximum and is brought down to zero, frame after frame. Falling, instead, must be a uniformly accelerating motion.
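Just to spell out the kinematics I mean, here is a minimal sketch (the function names are made up for illustration, not taken from my code):
Code:
// Minimal sketch of uniformly decelerated motion under constant gravity.
// v0 is the take-off speed, g the deceleration, both per unit of time.
float timeToApex(float v0, float g)
{
    return v0 / g;                  // time until the upward speed reaches zero
}

float apexHeight(float v0, float g)
{
    return (v0 * v0) / (2.0f * g);  // height gained before falling back: v0^2 / (2g)
}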
Anyway, this is the function that is executed every time I want the character to jump:
Code:
void StartJumping(void)
{
    // when Duke is jumping, he is "falling up"
    dukefallingspeed = 12 * fpsmultiplicationfactor;
}
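(With the factor from above, the take-off speed comes out as 12 at 60 fps but 24 at 30 fps.)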
Code:
// executed every frame: while Duke is above the ground, apply gravity
// to his vertical speed and update his position
if (dukepos.Y > 0)
{
    dukefallingspeed = dukefallingspeed - GRAVITY * fpsmultiplicationfactor;
    dukey = dukepos.Y + dukefallingspeed;
}
node[duke]->setPosition(core::vector3df(dukex, dukey, 0));
What I want to happen is for the main character to jump at the same height, at the same speed, no matter the frame rate.
What actually happens is that the height and speed are what I expect only when the game runs at 60 fps. If it runs at 30 fps instead, the jump is much higher and the deceleration is much lower. But... why? Am I not supposed to multiply the velocity change by a factor that depends on the duration of each frame, just like I do with the (uniform) horizontal motion?
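To make the numbers concrete, here is a small standalone sketch that steps exactly the update above, once with a factor of 1.0 (60 fps) and once with a factor of 2.0 (30 fps). The initial speed of 12 and the GRAVITY value of 1 are placeholders for my real constants:
Code:
#include <stdio.h>

// Standalone sketch of the jump update above (placeholder GRAVITY value).
// factor = elapsed sixtieths of a second per frame: ~1.0 at 60 fps, ~2.0 at 30 fps.
float simulateApex(float factor)
{
    const float GRAVITY = 1.0f;              // placeholder, not my real constant
    float fallingSpeed = 12.0f * factor;     // same as StartJumping()
    float y = 0.0f;
    float apex = 0.0f;

    while (fallingSpeed > 0.0f)              // rising part of the jump
    {
        fallingSpeed -= GRAVITY * factor;    // same velocity update as above
        y += fallingSpeed;                   // same position update as above
        if (y > apex) apex = y;
    }
    return apex;
}

int main(void)
{
    printf("apex at 60 fps: %f\n", simulateApex(1.0f));
    printf("apex at 30 fps: %f\n", simulateApex(2.0f));
    return 0;
}
With these placeholder numbers the 30 fps run peaks at twice the height of the 60 fps run, which is consistent with what I see in the game.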