My SceneNode for filter post processing shaders

ItIsFree

My SceneNode for filter post processing shaders

Post by ItIsFree »

Hello all. I am working on a scene node for Irrlicht for anyone who wants to use post-processing filter shaders such as motion blur, black and white, etc.

It is still in development, but version 1.0 is finished. I am posting the code for people's suggestions. For 2.0 I want to use two render targets for complex filters.

The SceneNodeFilter is an object, and the filter shaders must be in an HLSL file.

In the first listing you can see that the object has a render target, an IShaderConstantSetCallBack object for the shader constants, and everything else it needs.

The idea is as follows: you create the filter object and associate it with the HLSL file containing the filter, you create your scene (all nodes), you render your scene into the filter object's render target, and then you select render target 0 and render the filter object.


I am posting 4 code listings:

1) main.cpp, with the code of the filter object and an example of its use
2) normal.hlsl (pass-through filter -> no effect)
3) Black&White.hlsl
4) BLUR.hlsl (for this filter you need a graphics card that supports ps 2.0)

In future versions I will add:
a second render target for complex filters
a Gaussian filter
alpha test
reflection
depth of field
z-testing

etc. I am thinking of putting it in the wiki section, but I don't know how to do that.

MAIN.cpp


#include "include/irrlicht.h"

using namespace irr;

#pragma comment(lib, "Irrlicht.lib")


class MyShaderCallBack : public video::IShaderConstantSetCallBack
{
public:

	float viewport_inv_width;
	float viewport_inv_height;

	virtual void OnSetConstants(video::IMaterialRendererServices* services)
	{
		// Pass the two viewport constants to the shader by name.
		services->setVertexShaderConstant("viewport_inv_width", &viewport_inv_width, 1);
		services->setVertexShaderConstant("viewport_inv_height", &viewport_inv_height, 1);
	}
};


class CBaseFilter : public scene::ISceneNode
{

	core::aabbox3d<f32> Box;
	video::S3DVertex Vertices[6];

public:
	video::SMaterial Material;
	c8* vsFileName; // filename for the vertex shader
	c8* psFileName; // filename for the pixel shader

	int newMaterialType1;
	MyShaderCallBack* mc;

	video::ITexture* rt0;
	int alto;
	int ancho;

public:

	CBaseFilter(scene::ISceneNode* parent, scene::ISceneManager* mgr, s32 id)
		: scene::ISceneNode(parent, mgr, id)
	{
		Material.Wireframe = false;
		Material.Lighting = false;

		Vertices[0] = video::S3DVertex(-1.0f, -1.0f, 0.0f,1,1,0, video::SColor(255,0,255,255), 0.0f, 1.0f);
		Vertices[1] = video::S3DVertex(-1.0f,  1.0f, 0.0f,1,1,0, video::SColor(255,0,255,255), 0.0f, 0.0f);
		Vertices[2] = video::S3DVertex( 1.0f,  1.0f, 0.0f,1,1,0, video::SColor(255,0,255,255), 1.0f, 0.0f);
		Vertices[3] = video::S3DVertex( 1.0f, -1.0f, 0.0f,1,1,0, video::SColor(255,0,255,255), 1.0f, 1.0f);
		Vertices[4] = video::S3DVertex(-1.0f, -1.0f, 0.0f,1,1,0, video::SColor(255,0,255,255), 0.0f, 1.0f);
		Vertices[5] = video::S3DVertex( 1.0f,  1.0f, 0.0f,1,1,0, video::SColor(255,0,255,255), 1.0f, 0.0f);
	
		Box.reset(Vertices[0].Pos);
		for (s32 i=1; i<4; ++i)
			Box.addInternalPoint(Vertices[i].Pos);

		mc = new MyShaderCallBack();

		video::IVideoDriver* driver = SceneManager->getVideoDriver();
		rt0 = driver->createRenderTargetTexture(core::dimension2d<s32>(640,480));
		setMaterialTexture(0, rt0);
	}


	virtual void OnPreRender()
	{
		// Do not register the node for automatic rendering:
		// render() is called manually after the scene pass.
		//if (IsVisible)
		//	SceneManager->registerNodeForRendering(this);

		ISceneNode::OnPreRender();
	}


	virtual void render()
	{
		u16 indices[] = {0,1,2,3,4,5};
		video::IVideoDriver* driver = SceneManager->getVideoDriver();

		driver->setMaterial(Material);
		driver->setTransform(video::ETS_WORLD, AbsoluteTransformation);
		driver->drawIndexedTriangleList(&Vertices[0], 6, &indices[0], 2);
	}


	virtual const core::aabbox3d<f32>& getBoundingBox() const
	{
		return Box;
	}

	virtual s32 getMaterialCount()
	{
		return 1;
	}

	virtual video::SMaterial& getMaterial(s32 i)
	{
		return Material;
	}	
};


int main()
{
	// create engine and camera

	IrrlichtDevice *device =
		createDevice(video::EDT_DIRECTX9, core::dimension2d<s32>(640, 480), 16, false);

	device->setWindowCaption(L"Custom Scene Node - Irrlicht Engine Demo");

	video::IVideoDriver* driver = device->getVideoDriver();
	scene::ISceneManager* smgr = device->getSceneManager();
	video::IGPUProgrammingServices* gpu = driver->getGPUProgrammingServices();

	smgr->addCameraSceneNodeFPS();

//------------------------------------------------------------------------
	// create my filter object

	//1)create of object
	CBaseFilter *myFilter = new CBaseFilter(smgr->getRootSceneNode(), smgr, 666);
	
	//2)select shader
	myFilter->psFileName = "Black&White.hlsl";
	myFilter->vsFileName = "Black&White.hlsl";

	//3)create "new material shader" and associate with filter
	myFilter->newMaterialType1 = gpu->addHighLevelShaderMaterialFromFiles(
				myFilter->vsFileName,"vertexMain", video::EVST_VS_1_1,
				myFilter->psFileName, "pixelMain", video::EPST_PS_1_1,
				myFilter->mc, video::EMT_SOLID);

	myFilter->setMaterialType((video::E_MATERIAL_TYPE)myFilter->newMaterialType1);

	//4) pass the constant-variables to IShaderConstantSetCallBack
	myFilter->mc->viewport_inv_height = 1.0f / 480.0f; // float division; 1/480 would truncate to 0
	myFilter->mc->viewport_inv_width  = 1.0f / 640.0f;

	myFilter->drop();
//------------------------------------------------------------------------------------
// create a simple object for rendering any scene (this is only like example)
	scene::ISceneNode* node = smgr->addTestSceneNode(50);
	node->setPosition(core::vector3df(0,0,80));
	node->setMaterialTexture(0, driver->getTexture("wall.bmp"));
//------------------------------------------------------------------------------------

	while(device->run())
	{
		driver->beginScene(true, true, video::SColor(0,100,100,100));

		driver->setRenderTarget(myFilter->rt0,true, true, video::SColor(0,0,0,255));
		smgr->drawAll();
		driver->setRenderTarget(0);
		myFilter->render();
		driver->endScene();
	}

	device->drop();
	
	return 0;
}

NORMAL.HLSL


float4x4 view_proj_matrix;
float viewport_inv_width;
float viewport_inv_height;

struct VS_OUTPUT
{
	float4 Pos: POSITION;
	float2 texCoord: TEXCOORD0;
};

VS_OUTPUT vertexMain( in float4 Pos : POSITION)
{
	VS_OUTPUT Out;

	Out.Pos = float4(Pos.xy, 0, 1);

	Out.texCoord.x = 0.5 * (1 + Pos.x - viewport_inv_width);
	Out.texCoord.y = 0.5 * (1 - Pos.y - viewport_inv_height);
	
	return Out;
}




struct PS_OUTPUT
{
    float4 RGBColor : COLOR0;  // Pixel color    
};


sampler2D tex0;
	
PS_OUTPUT pixelMain( float2 TexCoord : TEXCOORD0,
                     float4 Position : POSITION,
                     float4 Diffuse  : COLOR0 ) 
{ 
	PS_OUTPUT Output;

	float4 col = tex2D( tex0, TexCoord ); 

	Output.RGBColor= col;

	return Output;
}

BLACK&WHITE.hlsl


float4x4 view_proj_matrix;
float viewport_inv_width;
float viewport_inv_height;

struct VS_OUTPUT
{
	float4 Pos: POSITION;
	float2 texCoord: TEXCOORD0;
};

VS_OUTPUT vertexMain( in float4 Pos : POSITION)
{
	VS_OUTPUT Out;

	Out.Pos = float4(Pos.xy, 0, 1);

	Out.texCoord.x = 0.5 * (1 + Pos.x - viewport_inv_width);
	Out.texCoord.y = 0.5 * (1 - Pos.y - viewport_inv_height);
	
	return Out;
}




struct PS_OUTPUT
{
    float4 RGBColor : COLOR0;  // Pixel color    
};


sampler2D tex0;
	
PS_OUTPUT pixelMain( float2 TexCoord : TEXCOORD0,
                     float4 Position : POSITION,
                     float4 Diffuse  : COLOR0 ) 
{ 
	PS_OUTPUT Output;

	float4 col = tex2D( tex0, TexCoord );

	float Intensity = 0.299*col.r + 0.587*col.g + 0.114*col.b;

	Output.RGBColor = float4(Intensity.xxx, col.a);

	return Output;
}

BLUR.hlsl
float4x4 view_proj_matrix;
float viewport_inv_width;
float viewport_inv_height;

struct VS_OUTPUT
{
	float4 Pos: POSITION;
	float2 texCoord: TEXCOORD0;
};

VS_OUTPUT vertexMain( in float4 Pos : POSITION)
{
	VS_OUTPUT Out;

	Out.Pos = float4(Pos.xy, 0, 1);

	Out.texCoord.x = 0.5 * (1 + Pos.x - viewport_inv_width);
	Out.texCoord.y = 0.5 * (1 - Pos.y - viewport_inv_height);

	return Out;
}



sampler2D Texture0;

const float4 samples[4] = {
	float4(-1.0,  0.0, 0, 0.25),
	float4( 1.0,  0.0, 0, 0.25),
	float4( 0.0,  1.0, 0, 0.25),
	float4( 0.0, -1.0, 0, 0.25)
};

float4 pixelMain( float2 texCoord: TEXCOORD0 ) : COLOR
{
	float4 col = float4(0,0,0,0);

	// Sample and average the four neighbouring pixels (note samples[i].y
	// scales the vertical offset; using .x for both axes was a bug)
	for(int i=0; i<4; i++)
		col += samples[i].w * tex2D(Texture0, texCoord + float2(samples[i].x*viewport_inv_width, samples[i].y*viewport_inv_height));

	return col;
}

Guest

Post by Guest »

Real-time example:

http://pruebalaboratorio.iespana.es/1.bmp
Guest

Post by Guest »

Sorry, here is the image again.
dracflamloc
Posts: 142
Joined: Sat Dec 11, 2004 8:13 am
Contact:

Post by dracflamloc »

That's pretty cool!
hybrid

Post by hybrid »

Is this object capable of rendering parts of the scene tree with its effects, and parts without? Or is it basically a new root for the scene tree which just adds global render effects?

BTW: Second link does not work either, had to copy it manually. What about this one:
http://pruebalaboratorio.iespana.es/1.bmp
ItIsFree

Post by ItIsFree »

No, it applies a filter to any scene you render. It is for motion blur, depth of field, antialiasing, etc., but for any scene.

You render all nodes into the filter's render target, which has a shader associated with it; the filter then paints the scene with its shader.

Look at this example:

msdn.microsoft.com/library/en-us/directx9_c/PostProcess_Sample.asp
ItIsFree

Post by ItIsFree »

Another example, for more information:

http://pruebalaboratorio.iespana.es/2.bmp

This example uses my Black&White.hlsl shader, but anyone can swap in another filter without touching the scene node's code.

I think we could build a library of filter shaders for everyone.
Salvadore

Post by Salvadore »

Non comprende. Can we have an example program, please?
Guest

Post by Guest »

something like dx9's effects interface?
ItIsFree

Post by ItIsFree »

Yes. In a shader-oriented engine, it is the last stage of the pipeline.

A post-processing filter is the last change applied to the final 2D image.

You render your scene (all nodes) normally, using Irrlicht as usual, but you render into a render-target texture rather than into the back buffer (setRenderTarget(0)).

Then you can apply a shader (filter) to the final 2D image.

You can make it black and white, apply motion blur to it, etc. When I get home I will put up an example with images showing how to render with Irrlicht and then apply a post-processing filter to the final 2D image.
krama757
Posts: 451
Joined: Sun Nov 06, 2005 12:07 am

Post by krama757 »

Thats beautiful! Thanks a lot for the help.

This really makes me want to learn HLSL so I can make some grass for terrain (of course I'll release the shader to the public if I learn it :D).
ItIsFree

Post by ItIsFree »

To Salvadore

I think this can help you understand post-processing filters:

http://pruebalaboratorio.iespana.es/Dibujo.gif
http://pruebalaboratorio.iespana.es/Dibujo.bmp
zelle

Post by zelle »

I just wonder how you want to achieve motion blur? are you storing the movement vectors somewhere? tell me a little more.
cheers!
ItIsFree

Post by ItIsFree »

You can create two post-processing nodes with the blur.hlsl shader. Each post-processing node has its own render-target texture. You tell Irrlicht to render into the first one, so the 2D image is now in node 1's render target. Then you tell Irrlicht to render into node 2's render target and render node 1 into it, which applies the first blur pass. Now node 2's render target holds the image with one level of blur.

Now you can loop between the nodes. If you want another pass (blur level 2), you tell Irrlicht to render into node 1's render target again and render node 2 (the post-processing node) into it; node 1's render target then holds the 2D image with two levels of blur, and so on around the loop.

When you finish, the resulting 2D image is in one of the render targets (node 1's or node 2's). Then you tell Irrlicht to select render target 0 (the back buffer) and render the node that holds the result.

But the architecture of the post-processing filter node is open: you can write your own, more complex blur shader. If you want to control it with a variable, a direction, or anything else, you only have to change the .hlsl file for your own effect.
krama757
Posts: 451
Joined: Sun Nov 06, 2005 12:07 am

Post by krama757 »

ItisFree, this is an awesome node.

Could you give us the example program you are posting pictures of
(the .cpp and .h files)?

That way we can figure out how to implement it a bit faster ^_^

=================
Never mind! I didn't see what you put up top :P