A couple shader questions...

Are you an experienced programmer with a problem concerning the engine, shaders, or advanced effects? Here you'll get answers.
No questions about C++ programming or topics that are already answered in the tutorials!
Post Reply
Nyxojaele
Posts: 98
Joined: Mon Sep 18, 2006 4:06 am

A couple shader questions...

Post by Nyxojaele »

I know nothing about shader programming yet, so please excuse my ignorance (I'm going to be teaching myself in a couple of weeks though...)

So- 2 questions:
  1. Is it possible to apply multiple shaders to the same texture? I suspect the answer is "not directly", so I'll also ask: is it possible to apply a shader to a texture, render that shaded texture to another texture, and then apply another shader to that one?
  2. For one effect I have in mind, I'd like to fade a shader in, let it sit on the texture for a while, then fade it out. Is there a way to do this in code, or should I be doing it in the shader itself (i.e. making it appear as if it's fading in/out)?
Praetor
Posts: 42
Joined: Wed Jun 20, 2007 2:31 am

Post by Praetor »

1. If you mean multiple passes, yes, this is possible (I'm not familiar with Irrlicht's shader system, but it should be possible...).
2. You could have a parameter in the shader that controls the darkness or transparency of the pixels, and change that parameter over time in your code to do whatever you need.
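The time-driven parameter idea can be sketched without any engine code at all: a small function that maps elapsed time to a fade factor (ramp in, hold, ramp out), whose result you would then upload as a shader constant each frame. The timing values and the function name here are made-up examples, not part of any API:

```cpp
#include <cassert>

// Map elapsed seconds to a 0..1 fade factor:
// ramp up over fadeIn seconds, stay at 1 for hold seconds,
// then ramp back down over fadeOut seconds.
float fadeFactor(float t, float fadeIn, float hold, float fadeOut)
{
    if (t <= 0.f)
        return 0.f;                                  // not started yet
    if (t < fadeIn)
        return t / fadeIn;                           // fading in
    if (t < fadeIn + hold)
        return 1.f;                                  // fully visible
    if (t < fadeIn + hold + fadeOut)
        return 1.f - (t - fadeIn - hold) / fadeOut;  // fading out
    return 0.f;                                      // effect finished
}
```

Each frame you would feed this the timer value and hand the result to the shader; in Irrlicht that would typically happen in your IShaderConstantSetCallBack::OnSetConstants(), with something like services->setPixelShaderConstant("Fade", &fade, 1) (the constant name "Fade" being whatever your shader declares).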

I recommend you look for some kind of book on shaders, and if you need examples that use Irrlicht, look for some of the shader packs (BlindSide's XEffects, for example) in the Project Announcements forum.
"Surely we don’t need to waste resources on pathfinding; they just need to walk along the shortest route from one place to another." - EA Producer
Nyxojaele
Posts: 98
Joined: Mon Sep 18, 2006 4:06 am

Post by Nyxojaele »

  1. So multiple passes, each one calling a different shader, or multiple passes within the shader code itself?
  2. I see- that makes sense. Thank you.
Yeah, I already have a plan for how I'll be learning about shaders, I just won't have the time for a bit. And thank you, but I've already got shaders working in my application, so now it's just a matter of writing my own shaders and possibly adding different "features" to my shader setup, which is where the questions stem from. ^^
arras
Posts: 1622
Joined: Mon Apr 05, 2004 8:35 am
Location: Slovakia
Contact:

Post by arras »

1 - Each pass may be rendered with a different shader.
Nyxojaele
Posts: 98
Joined: Mon Sep 18, 2006 4:06 am

Post by Nyxojaele »

I was messing around with that a bit, since I wanted my application to be able to apply post-processing to only the 3D scene, but also to everything including the GUI.

What I ended up doing was creating 2 screen quads, each with its own RTT and its own shader, and my main loop contained something like this:

Code:

mSceneMgr->getVideoDriver()->setRenderTarget(mSceneOnlyQuad->getMaterial(0).TextureLayer[0].Texture, true, true, color);
mSceneMgr->drawAll();  //Draw only the 3D Scene to mSceneOnlyQuad

mSceneMgr->getVideoDriver()->setRenderTarget(mEverythingQuad->getMaterial(0).TextureLayer[0].Texture, true, true, color);
mSceneOnlyQuad->render();  //Draw mSceneOnlyQuad to mEverythingQuad
mGUIMgr->drawAll();  //Draw the GUI to mEverythingQuad

mSceneMgr->getVideoDriver()->setRenderTarget(0, true, true, color);
mEverythingQuad->render();  //Draw mEverythingQuad to the frame buffer
I realize that in this situation it's not quite the same thing, since I want some stuff shaded and some stuff not (or rather, some stuff in both shaders and some stuff in only one). But later on I'd like to allow, say, my mSceneOnlyQuad to have 2 or 3 different shaders applied to it. Would I have to do something similar to what I have now, basically creating a stack of screen quads, or is there another way to achieve multiple shader passes on one screen quad?
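One common alternative to a growing stack of quads is ping-ponging between just two render targets: each pass reads the texture the previous pass wrote and writes into the other one, and only the final pass goes to the frame buffer. A minimal sketch of the bookkeeping, with plain indices standing in for the two RTTs (the names rtt and shaderForPass in the comments are hypothetical):

```cpp
#include <cassert>
#include <utility>

// Ping-pong between two render targets (index 0 and 1) for numPasses
// shader passes. The scene starts in target 0; returns the index of
// the target holding the final image.
int runPasses(int numPasses)
{
    int src = 0, dst = 1;
    for (int pass = 0; pass < numPasses; ++pass)
    {
        // In engine code this step would look roughly like:
        //   driver->setRenderTarget(rtt[dst], ...);
        //   quad->setTexture(rtt[src]);
        //   quad->setMaterialType(shaderForPass[pass]);
        //   quad->render();
        std::swap(src, dst);  // this pass's output becomes the next pass's input
    }
    return src;               // after the loop, src holds the newest result
}
```

The advantage over a chain of quads is that memory use stays constant no matter how many passes you add; only the very last pass renders to the back buffer.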

On the subject of screen quads, is there a way in [Irrlicht] code to position the screen quad in front of the screen (i.e. where it should be)? Currently my code relies on the shader to do that, but as soon as I set the material type to something non-shader, the quad is obviously no longer in front of the screen. I suppose I could write a "do nothing" shader that only positions the quad, but then anybody whose video card doesn't support shaders can't use my application at all. >< So I was hoping to do the screen quad placement in code if possible...
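One shader-free way to place the quad is to build it directly in normalized device coordinates: set the world, view, and projection transforms to identity and give the quad corners at x, y = ±1, z = 0, with matching texture coordinates. A small sketch of just the corner setup, independent of any engine types (the struct and function names are mine, not Irrlicht's):

```cpp
#include <cassert>

// A fullscreen quad in normalized device coordinates: with identity
// world/view/projection matrices these corners cover the whole screen,
// so no vertex shader is needed to position the quad.
struct QuadVertex { float x, y, z, u, v; };

// Corner order: bottom-left, bottom-right, top-right, top-left.
// Note that v is flipped (v = 1 at the bottom) because texture space
// has its origin at the top-left while NDC y points up.
void makeScreenQuad(QuadVertex out[4])
{
    out[0] = { -1.f, -1.f, 0.f, 0.f, 1.f };
    out[1] = {  1.f, -1.f, 0.f, 1.f, 1.f };
    out[2] = {  1.f,  1.f, 0.f, 1.f, 0.f };
    out[3] = { -1.f,  1.f, 0.f, 0.f, 0.f };
}
```

In Irrlicht you would put these into S3DVertex positions and TCoords and set the three transforms to identity via driver->setTransform() before drawing; since this uses only the fixed-function pipeline, it should also work as a fallback on cards without shader support (though on D3D9 you may still want to account for the half-texel offset).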
Nyxojaele
Posts: 98
Joined: Mon Sep 18, 2006 4:06 am

Post by Nyxojaele »

Okay, so I've started looking into writing some shaders, and I have a basic understanding of how it integrates with C++ code, but what's getting me is that I just can't find the code in Irrlicht where the actual INTERACTION takes place.

What I'm looking for is where the actual vertex stream is passed into the (in this case, DirectX9) driver, for the vertex shader to use.

Looking at various other DirectX tutorials, I see function calls, enumerations, and structures that look DirectX-native, which I assume Irrlicht would need in its D3D9-specific files in order to use shaders: things like D3DDECLMETHOD_DEFAULT, CreateVertexDeclaration, and SetStreamSource. But I search all the Irrlicht files and nothing comes up. I've tried walking through the Irrlicht files myself to find it, to no avail; I'm just not familiar enough with the inner workings of Irrlicht or DirectX, I guess. ><

So exactly how is it that Irrlicht sends the data to the shaders? What file/function?

<EDIT>
After some research, I found the answer to my question- so here it is for anybody interested:
CD3D9Driver.cpp: drawVertexPrimitiveList()

This calls setVertexShader(), which defines the vertex format, and depending on the primitive type it also calls various SetRenderState()s and eventually DrawIndexedPrimitiveUP() on the IDirect3DDevice9 object, which submits the vertex stream.

So I guess this means we have no way to define our own vertex stream for the shaders, am I correct? Is this something that may be implemented later? I can see it being a problem for some of the more advanced shader programmers...
</EDIT>
Post Reply