Help on how to enable triple buffering in an Example App

If you are a new Irrlicht Engine user and have a newbie question, this is the forum for you. You may also post general programming questions here.
nightrobin
Posts: 5
Joined: Tue Jun 24, 2014 11:12 am

Help on how to enable triple buffering in an Example App

Post by nightrobin »

Good day,
I just want to ask how to enable triple buffering in a basic sample Irrlicht app.
Thanks in advance :o
hybrid
Admin
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany

Re: Help on how to enable triple buffering in an Example App

Post by hybrid »

There is no triple buffering option in Irrlicht. Use your gfx card setup to enable it globally or per app.
Destructavator
Posts: 40
Joined: Sat Feb 06, 2010 9:00 pm
Location: Ohio, USA

Re: Help on how to enable triple buffering in an Example App

Post by Destructavator »

Triple buffering is really something that should be left up to an end-user's driver settings, either per-application or on a global level. If you (or any end-user running a program you wrote) want it, as hybrid said, the graphics card settings / driver control panel are where it is best done.

This is true not just of Irrlicht, but of a lot of 3D engines and frameworks I've seen, and as far as OpenGL goes, even the forum on their official site says pretty much the same thing.

You might also want to look at:
http://gamedev.stackexchange.com/questi ... ndows?lq=1

...So although there are ways to write a lot of code and handle it manually, doing so typically isn't recommended, at least not unless something very special or unusual is being done.

Regarding direct, "raw" OpenGL commands, I don't know of a SwapBuffersTriple() command or anything like it that would do it all for you.
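
For what it's worth, on Windows the swap itself is just the ordinary GDI SwapBuffers() call, and the driver decides how many buffers actually sit behind it. A minimal sketch of a typical frame (plain Win32/OpenGL, not Irrlicht; assumes an HDC named hdc from your window setup):

// Hedged sketch: the application only ever issues a normal swap;
// whether the driver keeps two or three buffers behind this call
// is decided in the driver settings, not in GL code.
#include <windows.h>
#include <GL/gl.h>

void renderFrame(HDC hdc)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... draw the scene here ...
    SwapBuffers(hdc); // same call whether double or triple buffered
}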

For DirectX, I'm really not sure.
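
That said, if I remember right, plain Direct3D 9 does let you ask for an extra back buffer when the device is created, which effectively gives you triple buffering. A hedged, untested sketch (again not Irrlicht API; assumes hwnd is a valid window handle):

// Hedged sketch, plain Direct3D 9: two back buffers plus the front
// buffer effectively gives triple buffering.
#include <d3d9.h>

IDirect3DDevice9* createTripleBufferedDevice(HWND hwnd)
{
    IDirect3D9* d3d9 = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d9)
        return 0;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferCount = 2;  // 2 back buffers + front = triple buffering
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // vsync on

    IDirect3DDevice9* device = 0;
    if (FAILED(d3d9->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
        D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device)))
        device = 0;
    return device;
}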

Probably the best way is to let your program do a regular double-buffer swap (in Irrlicht, I believe the engine does this when IVideoDriver::endScene() is called). If an end-user of your program has triple buffering turned on in their driver settings, the driver should take care of it and make it happen, so it usually isn't something the program developer needs to worry about.
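
To make that concrete, here is the usual minimal Irrlicht loop (standard example code, nothing triple-buffering-specific anywhere); the endScene() call at the end of each frame is all the app ever does:

#include <irrlicht.h>
using namespace irr;

int main()
{
    // Standard example setup; vsync (the last flag) is turned on so
    // the driver's buffering settings take effect in the usual way.
    IrrlichtDevice* device = createDevice(video::EDT_OPENGL,
        core::dimension2d<u32>(640, 480), 16, false, false, true);
    if (!device)
        return 1;

    video::IVideoDriver* driver = device->getVideoDriver();
    scene::ISceneManager* smgr = device->getSceneManager();

    while (device->run())
    {
        driver->beginScene(true, true, video::SColor(255, 100, 101, 140));
        smgr->drawAll();
        driver->endScene(); // ordinary swap; with triple buffering forced
                            // in the driver settings, the driver adds the
                            // third buffer behind this call on its own
    }

    device->drop();
    return 0;
}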

If you want to distribute a program and recommend that end-users run specific settings with your game (or whatever it is you've coded and released), there are ways to do that. I don't know all the details, but I think it involves either using components of a gfx card vendor's public SDKs or registering your game with the vendors (such as AMD, NVIDIA, etc.), so that when your game is published and installed on an end-user's machine, a "profile" with recommended, optimal settings is published along with it. I could be off on this point, as I've barely looked into it, but I know there are ways of doing it.

Of course, long before that point, before a final product is published, you can just use your driver control panel on your own test machine and turn it on globally or per-application for your project. The NVIDIA control panel allows creating custom profiles for third-party programs not in its own list of games; IIRC the ATI video card control panel has something similar.

Don't worry about manually coding it in your program, though. Yes, it's possible, but I wouldn't do it unless you really need some special or non-standard rendering scheme.
- - - -
"Destructavator" Dave: Developer of OS GPL VST "ScorchCrafter" Audio Plug-ins, contributer to UFO AI, Day Job: Guitar Luthier (Customize musical instruments, repainting w/ custom artwork, graphics, logos, decals, etc.)