[fixed]draw2DImage making shaders unusable

You discovered a bug in the engine, and you are sure that it is not a problem of your code? Just post it in here. Please read the bug posting guidelines first.
xDan
Competition winner
Posts: 673
Joined: Thu Mar 30, 2006 1:23 pm
Location: UK
Contact:

[fixed]draw2DImage making shaders unusable

Post by xDan »

Hi,

When I add a GUIImage and a mesh node with a shader at the same time, the framerate drops to a crawl, like a frame every 10 seconds... or even slower; if I have more than a few shader-ed objects visible it seems to freeze completely.

Yet if I don't add the GUI image it works fine, and if I disable shaders on the mesh node it also works, but I can't have both at the same time. So somehow they interfere with each other :S

If I just put one mesh node in the scene with shaders set, it slows down while the camera is looking at the object, but if I look away and it gets culled it returns to the correct speed.

I have tested using draw2DImage directly instead of adding a GUIImage and the problem is the same.

Unfortunately I can't seem to reproduce this in the Shaders Example to give a test case... and the relevant functions are spread across several classes in my engine code, so I can't really post the code.

I will continue trying to make a test case, but maybe someone knows what might cause this?


Update:

I've tried using a profiler for the first time (Sleepy), and it seems most of the time is spent in the following functions:
atiPPHSN
RtlAllocateHeap
RtlFreeHeap
tan
wcsncpy
DllMain

So it seems it is definitely not frozen completely, just taking a huge amount of time doing whatever it is trying to do.

I took a screenshot of the profiler here: http://xzist.org/temp/sleepy.png
xDan
Competition winner
Posts: 673
Joined: Thu Mar 30, 2006 1:23 pm
Location: UK
Contact:

Post by xDan »

OK so through a tedious process of commenting out and recompiling the engine I have discovered the following:


draw2DImage calls COpenGLDriver::setRenderStates2DMode(bool alpha, bool texture, bool alphaChannel)

And in this setRenderStates2DMode method,

Code:

	if (CurrentRenderMode != ERM_2D || Transformation3DChanged)
	{
		
		// unset last 3d material
		if (CurrentRenderMode == ERM_3D)
		{
			if (static_cast<u32>(LastMaterial.MaterialType) < MaterialRenderers.size())
				MaterialRenderers[LastMaterial.MaterialType].Renderer->OnUnsetMaterial();
			SMaterial mat;
			mat.ZBuffer=ECFN_NEVER;
			mat.Lighting=false;
			mat.AntiAliasing=video::EAAM_OFF; // <------ THIS LINE!!
			mat.TextureLayer[0].BilinearFilter=false;
			setBasicRenderStates(mat, mat, true);
			LastMaterial = mat;
			glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
		}

Setting mat.AntiAliasing=video::EAAM_OFF; causes the problem.

If I comment out that line, it works correctly at a good speed.


Now why is this? How can turning off anti-aliasing for this "LastMaterial" make rendering so slow?!


My own guess would be that turning it off on this LastMaterial somehow causes some huge level of anti-aliasing to be enabled when rendering the shader?! But I haven't a clue how or why (or, really, what the purpose of LastMaterial is).
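For readers wondering the same thing: LastMaterial appears to be the driver's cached copy of the material it most recently applied, used to skip redundant GL state changes. A hypothetical, stripped-down sketch of that caching pattern (all names and types here are illustrative, not engine code):

```cpp
// Stripped-down sketch of the state-caching pattern LastMaterial
// appears to implement: the driver remembers the last material it sent
// to OpenGL and only issues state changes for fields that differ.
struct Material {
    int antiAliasing = 0; // e.g. 0 = off, 4 = line smoothing
};

struct FakeDriver {
    Material lastMaterial;  // the cache, playing the role of LastMaterial
    int glStateChanges = 0; // stands in for real glEnable/glDisable calls

    void setBasicRenderStates(const Material& m) {
        // Only touch GL when the cached value differs from the request.
        if (m.antiAliasing != lastMaterial.antiAliasing)
            ++glStateChanges;
        lastMaterial = m;
    }
};
```

The point is that whatever the 2D path stores into this cache decides which states get (re)applied when the next 3D material arrives, so a field forced in setRenderStates2DMode can change what a later shader material effectively inherits.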





...

Following this backwards, I can now fix the problem in my own code without commenting out that line, by setting
material.AntiAliasing = video::EAAM_OFF;
in my shader material. (Or I can set it to EAAM_SIMPLE, but setting it to EAAM_LINE_SMOOTH, one of the defaults, causes the problem.)

However there still must be some kind of bug; behaviour shouldn't change simply because a GUI image is added.
hybrid
Admin
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany
Contact:

Post by hybrid »

Yes, I also see this problem, although I couldn't work around it by changing the 2D render mode method. I'll try to find the actual cause.
hybrid
Admin
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany
Contact:

Post by hybrid »

OK, fixed already. It was a change in the anti-aliasing settings. If multi-sampling is enabled, line smoothing comes at almost no cost. However, I had changed this setting so that it was also enabled without MSAA, which in many cases drops OpenGL into software emulation. Hence, line smoothing is off again by default, and should only be enabled together with multi-sampling (anti-aliasing in the device settings) or on certain cards (the really large ones for CAD workstations).
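The policy hybrid describes can be sketched with a toy check. The flag values mirror Irrlicht's E_ANTI_ALIASING_MODE enum, but treat the whole snippet as illustration rather than engine code:

```cpp
// Toy sketch of the fixed policy: line smoothing should only be
// requested together with multi-sampling, because without MSAA it can
// push many OpenGL drivers onto a software-emulated path.
enum E_ANTI_ALIASING_MODE {
    EAAM_OFF = 0,
    EAAM_SIMPLE = 1,       // multi-sampling (MSAA)
    EAAM_QUALITY = 2,
    EAAM_LINE_SMOOTH = 4,  // GL_LINE_SMOOTH
    EAAM_POINT_SMOOTH = 8
};

// True when the requested flags are "safe": line smoothing is either
// absent or accompanied by multi-sampling.
inline bool lineSmoothIsSafe(unsigned aaFlags) {
    if (!(aaFlags & EAAM_LINE_SMOOTH))
        return true;
    return (aaFlags & EAAM_SIMPLE) != 0;
}
```

Under this rule, a material asking for EAAM_LINE_SMOOTH alone (the old default that triggered the bug) fails the check, while EAAM_SIMPLE | EAAM_LINE_SMOOTH passes.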
xDan
Competition winner
Posts: 673
Joined: Thu Mar 30, 2006 1:23 pm
Location: UK
Contact:

Post by xDan »

Great! :)
booltox
Posts: 7
Joined: Thu Nov 05, 2009 5:35 am

Post by booltox »

xDan wrote:

Code:

	if (CurrentRenderMode != ERM_2D || Transformation3DChanged)
	{
		
		// unset last 3d material
		if (CurrentRenderMode == ERM_3D)
		{
			if (static_cast<u32>(LastMaterial.MaterialType) < MaterialRenderers.size())
				MaterialRenderers[LastMaterial.MaterialType].Renderer->OnUnsetMaterial();
			SMaterial mat;
			mat.ZBuffer=ECFN_NEVER;
			mat.Lighting=false;
			mat.AntiAliasing=video::EAAM_OFF; // <------ THIS LINE!!
			mat.TextureLayer[0].BilinearFilter=false;
			setBasicRenderStates(mat, mat, true);
			LastMaterial = mat;
			glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
		}
This might be a bit off topic, but I'm curious about this code. Is calling glBlendFunc the only way to enable alpha blending when you are using your own material format? Is there a driver-independent method?

I have been working with the assumption that using

m_Material.MaterialTypeParam = pack_texureBlendFunc( video::EBF_SRC_COLOR, video::EBF_DST_COLOR, video::EMFN_MODULATE_1X, video::EAS_TEXTURE );

would do the trick, but I don't think it's working.


If this has been covered somewhere else I apologize for not finding it.

Thanks,
booltox
hybrid
Admin
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany
Contact:

Post by hybrid »

The marked line and your text don't seem to be related at all. The textureBlendFunc value is only supported by the OneTextureBlend material.
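To illustrate why the packed value means nothing to other materials: the blend settings are packed bitwise into an integer whose bits are reinterpreted as the f32 MaterialTypeParam, and only a renderer that unpacks them again (OneTextureBlend) acts on them. A self-contained sketch of the idea; the exact bit layout below is an assumption for illustration, not necessarily the engine's:

```cpp
#include <cstring>

// Pack blend factors, modulation, and alpha source into one integer.
// The field layout here is illustrative only.
inline unsigned packBlend(unsigned srcFact, unsigned dstFact,
                          unsigned modulate, unsigned alphaSource) {
    return (alphaSource << 12) | (modulate << 8) | (srcFact << 4) | dstFact;
}

// Reinterpret the integer's bits as a float, as MaterialTypeParam is an
// f32. memcpy is used for bit reinterpretation, not a value cast.
inline float toMaterialTypeParam(unsigned packed) {
    float f;
    std::memcpy(&f, &packed, sizeof f);
    return f;
}

// The receiving material renderer must do the inverse to recover the
// settings; a renderer that never does this simply ignores the value.
inline unsigned fromMaterialTypeParam(float param) {
    unsigned u;
    std::memcpy(&u, &param, sizeof u);
    return u;
}
```

Because the float is just a bit container, any material whose renderer does not unpack MaterialTypeParam ignores the blend settings entirely, which matches the answer above.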
booltox
Posts: 7
Joined: Thu Nov 05, 2009 5:35 am

Post by booltox »

hybrid wrote:The marked line and your text don't seem to be related at all. the textureblendfunc value is only supported in OneTextureBlend material.
That answers my question. I have been building a feature that uses a material based on a fragment shader and I want it to use the alpha blender functions as well. OpenGL and DirectX both have ways to do this, I just haven't found the irrlicht way of doing it. Calling glBlendFunc like the post above isn't so bad, but its obviously not going to work with DirectX.

Thanks for quick answer.

Cheers,
b