Hi. I am having real trouble calculating a time factor that compensates for fps changes. That is, if the factor were 1 at 60 fps, it would be 2 at 30 fps and 0.5 at 120 fps, so any per-frame value multiplied by this factor becomes fps-independent (halving the fps doubles the per-frame movement, keeping the overall speed constant).
Now the problem is that I can't get an exact fps value. The one from the driver is too inaccurate. I also tried saving the timer value over a number of frames (anywhere from 10 up to 500) and calculating the fps from that, but whatever I do, I always walk more slowly at lower fps in my game. The factor does rise, but apparently not enough, so something is wrong.
Does anyone know an easy way to get such an fps factor, or at least a more accurate fps value? I could post my code, but since I think it should only take one or two lines (mine takes about eight, they are a bit messy and don't really work), you probably won't need to correct it and can just post your short solution to this problem.
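To make the idea concrete, here is a rough sketch of the kind of per-frame factor I mean (this is not my actual code, and the ITimer usage is just one assumed way to do it):
[code]
// Rough sketch (not my real code): per-frame factor from Irrlicht's timer.
// Assumes the device is already created elsewhere; 60 fps is the reference rate.
#include <irrlicht.h>
using namespace irr;

void gameLoop(IrrlichtDevice* device)
{
    ITimer* timer = device->getTimer();
    u32 lastTime = timer->getTime();                 // milliseconds

    const f32 referenceFrameTime = 1000.f / 60.f;    // factor == 1 at 60 fps

    while (device->run())
    {
        const u32 now = timer->getTime();
        const f32 frameTime = (f32)(now - lastTime); // ms used by the last frame
        lastTime = now;

        // 1 at 60 fps, 2 at 30 fps, 0.5 at 120 fps
        const f32 factor = frameTime / referenceFrameTime;

        // position += speedPerFrameAt60Fps * factor;
    }
}
[/code]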
Thank you for your help!
Problem with calculating time factor
I'm not a native English speaker.
hybrid, bear in mind that on Windows with a multi-core CPU, getTime()/getRealTime() will use GetTickCount() rather than QueryPerformanceCounter(), which can have appalling accuracy. I've just seen 15-16ms(!) granularity in getTime() while running at > 1000 fps.
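For what it's worth, here is a minimal sketch of reading QueryPerformanceCounter() directly to time a frame (Windows only, purely illustrative, not Irrlicht code):
[code]
// Illustrative only: sub-millisecond frame timing on Windows with
// QueryPerformanceCounter(), instead of the GetTickCount()-backed getTime().
#include <windows.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);    // counter ticks per second
    QueryPerformanceCounter(&prev);

    for (int frame = 0; frame < 10; ++frame)
    {
        // ... render the frame here ...
        QueryPerformanceCounter(&now);
        const double frameTimeMs =
            1000.0 * (double)(now.QuadPart - prev.QuadPart) / (double)freq.QuadPart;
        prev = now;

        std::printf("frame time: %.3f ms\n", frameTimeMs);
    }
    return 0;
}
[/code]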
Please upload candidate patches to the tracker.
Need help now? IRC to #irrlicht on irc.freenode.net
How To Ask Questions The Smart Way