Having problems with floating point numbers? Here's your fix
I'm integrating irrlicht into an existing data display app I have, so I'm using irrlicht to render to a window of my own creation.
I ran across a problem yesterday: when I opened the window that uses irrlicht, my other data-plotting windows suddenly couldn't do math correctly. After a bunch of digging around, I determined that my floating point calcs were losing precision.
The last thing you expect is to add a number like 10.5 to a number like 1.3 billion and have the answer be wrong, when both values are doubles.
I figured out that the FPU was magically going from its default 53-bit precision to 24-bit precision when I opened my irrlicht window.
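Here's a minimal way to reproduce the effect outside of irrlicht (just a sketch, assuming a 32-bit MSVC build where double math still goes through the x87 FPU; on x64 the precision-control bits are ignored because doubles go through SSE2):

#include <float.h>   // _control87, _PC_24, _PC_53, _MCW_PC
#include <stdio.h>

int main()
{
    // volatile so the compiler actually does the additions at runtime
    volatile double big   = 1.3e9;   // 1,300,000,000
    volatile double small = 10.5;

    printf("53-bit FPU: %.1f\n", big + small);   // 1300000010.5, as expected

    // force 24-bit precision, which is what CreateDevice does by default
    _control87(_PC_24, _MCW_PC);
    printf("24-bit FPU: %.1f\n", big + small);   // the 10.5 gets rounded away entirely

    _control87(_PC_53, _MCW_PC);                 // put it back to double precision
    return 0;
}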
The culprit is the pID3D->CreateDevice() call. By default, DirectX uses single-precision calcs and sets the FPU to 24-bit precision. This is faster than using double precision, but it will kill you if you use doubles in your app and expect them to work.
To fix it, change the 4th parameter on all the CreateDevice calls from D3DCREATE_*_VERTEXPROCESSING to D3DCREATE_*_VERTEXPROCESSING | D3DCREATE_FPU_PRESERVE (where * is whatever the mode is for that line: MIXED, HARDWARE, or SOFTWARE).
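For the hardware vertex processing case, the changed call would look something like this (the D3D9 interface is shown, and hWnd / presentParams / pDevice are placeholder names, not the actual variables irrlicht uses):

IDirect3DDevice9* pDevice = NULL;
D3DPRESENT_PARAMETERS presentParams = {0};   // assume this is filled in as usual

HRESULT hr = pID3D->CreateDevice(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_HAL,
    hWnd,                        // placeholder window handle
    // was: D3DCREATE_HARDWARE_VERTEXPROCESSING
    D3DCREATE_HARDWARE_VERTEXPROCESSING | D3DCREATE_FPU_PRESERVE,
    &presentParams,
    &pDevice);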
This is a bummer, though... I don't need double precision in my DirectX stuff, just in the other parts of my app. I may end up having to wrap my drawing stuff with _control87() calls to manage the FPU state myself, so D3D can use single precision and the rest of my app can continue to do double precision.
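If I go that route, the wrapper would be something along these lines (RenderScene here is just a hypothetical stand-in for whatever function drives the D3D drawing):

#include <float.h>   // _control87, _PC_24, _MCW_PC

// hypothetical wrapper around the 3D drawing pass: let D3D run at 24-bit
// (single) precision for speed, then restore whatever precision the rest
// of the app was using so my doubles keep working
void RenderScene()
{
    unsigned int oldCw = _control87(0, 0);   // read the current control word
    _control87(_PC_24, _MCW_PC);             // single precision for the D3D calls

    // ... irrlicht / D3D drawing calls go here ...

    _control87(oldCw, _MCW_PC);              // restore the precision bits
}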
Nope. And I could give a bunch of reasons why, but that would just rile up the cross-platform geeks.
I'm using irrlicht for my current demo because:
1) It doesn't require external libraries; it has all the source in it
2) It looks like its primary development was done with DX and on Windows
3) The source shipped with real live VC project files, meaning that someone actually used a real dev environment on it, which means integrating it would be quicker
and MOST IMPORTANTLY, it's not LGPL or GPL. I'm happy to share all the mods I make, and help where I can, but no way do I wish to be contractually obligated to do so.
I also very much like the fact that a single developer is currently controlling its release and development plans. Open source projects tend to turn into design by committee, and get overrun by folks trying to make stuff run on Linux.
If the focus shifts away from windows/dx, then I'd likely end up dropping it, but for now, I'm happy.
</rant>
I don't care about being cross-platform (I like to be in my own projects, but that's just my thing). However, for you, it sounds like you just need something that works. Since you're using IrrLicht, you don't touch the underlying API directly, so the only reason the choice should matter to you is the results you get on your hardware.
You seem to be having problems with DX, but perhaps those problems don't occur with OGL. On top of that, perhaps your graphics/FPS are still just as good?
Or is there some other reason to care about what API is running under the hood?
Yep. The product has to eventually be installed on end users' computers. 5 years ago D3D totally sucked, and OGL was the way to go. But that's no longer true. DirectX has much better driver support on a wide array of hardware. Try sticking a bunch of video adapters in a computer and running an OGL app in full screen on the primary card. Let me know after you finish rebooting. Short answer... on Windows, DirectX is just plain better.
With DX, I can make my installation and support situation much better than with OGL. OGL is playing catch-up with DX on features now, too.
I'm not writing a stand-alone 3D app where the 3D stuff is the main part. My 3D displays are just a small part of the overall GUI displays in the app, so I want to add as few extra dependencies as possible.
powerpop wrote: ummm, irrlicht is LGPL (i think)
It's zlib; it's not as restrictive as the (L)GPL: http://irrlicht.sourceforge.net/license.html
With the LGPL you MUST make public any modifications you make to the engine. With zlib you're free to release the source or not, but you CANNOT SAY it's completely yours.
I think it's more or less correct.
Edit: here is the Ogre LGPL license explanation from www.ogre3d.org:
Under the LGPL you may use Ogre for any purpose you wish, as long as you:
1. Release any modifications to the OGRE source back to the community
2. Pass on the source to Ogre with all the copyrights intact
3. Make it clear where you have customised it.