Hello everyone.
I'm developing a port of Wallpaper Engine for Linux, and I decided to use Irrlicht as the graphics engine early in development because I had already used it in other projects. After reverse-engineering the wallpaper format and how the original Wallpaper Engine renders things, I realized that it performs multiple shader passes over textures. The most common case is a texture with one shader that applies alpha and another that applies an effect on top of it (for example, water ripples). Looking through Irrlicht's documentation I don't see any way to apply multiple shaders to the same texture, so I can't really replicate this functionality, and searching for the issue hasn't been helpful either.
I tried hacking something together with render-to-texture, but I can't really get good results, and I'm not sure that's the way to go.
This is the repository if anyone is interested in taking a look: https://github.com/Almamu/linux-wallpap ... separation
Has anyone implemented something like this before who can point me in the right direction? Will the engine support this in the future?
Thanks in advance!
Multiple shader passes on a material
Re: Multiple shader passes on a material
Not sure. I mean, you can render nodes several times and switch the material in between, but it depends on the use case how you handle the z-buffer when doing that. It's more common to have multiple stages by rendering to a render target texture and then combining the render target textures with their own shader. The new PostProcessing example in Irrlicht trunk does that.
IRC: #irrlicht on irc.libera.chat
Code snippet repository: https://github.com/mzeilfelder/irr-playground-micha
Free racer made with Irrlicht: http://www.irrgheist.com/hcraftsource.htm
Re: Multiple shader passes on a material
CuteAlien wrote:Not sure. I mean you can render nodes several times and switch the material in between. But depends on the use-case how you handle the z-buffer when doing that. It's more common to have multiple stages by rendering to a rendertargettexture and then combine the rendertargettextures with their own shader. The new PostProcessing example in Irrlicht trunk does that.
Thank you for the information; it looks like I was on the right track to properly support it. Looking through the post-processing example, I see that I'm doing mostly everything correctly. After debugging the render system a bit more, my textures or camera are being flipped somewhere in the process, so I'll check the matrices and renders; and for some reason the transparency information is lost (which is what made me think at first that the shaders weren't being applied properly).
Re: Multiple shader passes on a material
Phew, flipped textures are something you run into a lot when working with OpenGL. Irrlicht was originally written for DirectX, which has the image origin (0,0) at the top-left, while OpenGL has it at the bottom-left. Irrlicht flips the texture coordinates (not the texture itself, just the texture matrix) in the fixed-function pipeline before passing them to OpenGL. If you want to see the details, this is done in COpenGLDriver::setTextureRenderStates. But render target textures are still rendered with the origin at the bottom-left; in the fixed-function pipeline we handle that by simply _not_ flipping the texture matrix for those.
But with shaders Irrlicht can't do any flipping, as the users handle texture matrices in their shaders. So either you handle it in your shaders or in your texture matrix. It's also sometimes possible to handle it with the camera, but that's trickier, as you end up with a left-/right-handed coordinate change when doing that (you need the camera upside-down plus mirrored). So when doing it with the camera you also have to swap front-/back-face culling for each polygon.
Note that it gets a bit more complicated if you also need cubemaps (those have their origin at the top-left even on OpenGL).