[no bug] addRenderTargetTexture / removeTexture problem


Post by virasena »

Hi, I'm new to Irrlicht and want to create a 2D-accelerated engine with it, the idea being that if I ever need 3D it will be available. I've written a lot of code in libSDL and am porting it because I want hardware acceleration.

I'm creating lots of textures that I can blit to and from, using them like SDL_Surface. I'm using ITexture instead of IImage because ITexture blits in hardware.
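
Roughly, each "blit" is just a 2D draw into a render target. A sketch of the kind of helper I have in mind (BlitTexture is my own name, not an Irrlicht function) -

Code: Select all

//hypothetical SDL_BlitSurface-style helper: copy all of Src onto Dst at (x, y)
void BlitTexture(IVideoDriver *Driver, ITexture *Dst, ITexture *Src, int x, int y)
{
   Driver->setRenderTarget(Dst, false);
   Driver->draw2DImage(Src, position2d<s32>(x, y),
      rect<s32>(0, 0, Src->getSize().Width, Src->getSize().Height));
   Driver->setRenderTarget(ERT_FRAME_BUFFER, false);
}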

For some reason when creating lots of textures and removing them, I get a sort of texture corruption. To illustrate, I wrote the following example -

Code: Select all

#include "irrlicht.h"

#include <string>

using namespace irr;
using namespace video;
using namespace gui;
using namespace core;
using namespace io;


void DrawRect(IVideoDriver *Driver, ITexture *Texture, int x, int y, int w, int h, int r, int g, int b, int a)
{
   Driver->setRenderTarget(Texture, false);
      
   //clear the view matrix before drawing - in OpenGL it will not draw correctly unless we do so.
   matrix4 mat; 
   mat.makeIdentity(); 
   Driver->setTransform(video::ETS_VIEW,mat); 
            
   SColor scolor(a, r, g, b);
   
   Driver->draw2DRectangle(scolor, rect<s32>(x, y, x + w, y + h));
      
   Driver->setRenderTarget(ERT_FRAME_BUFFER, false);    
}

int main(int argc, char *argv[])
{
    IrrlichtDevice *Device = createDevice(EDT_OPENGL, dimension2d<u32>(1600, 1024), 32, true, false, false, NULL);
    
    IVideoDriver *Driver = Device->getVideoDriver();
       
    //initial setup            
    ITexture *t1 = Driver->addRenderTargetTexture(dimension2d<u32>(1600,1024), "1", ECF_A8R8G8B8); 
    ITexture *t2 = Driver->addRenderTargetTexture(dimension2d<u32>(1600,1024), "2", ECF_A8R8G8B8);        
    ITexture *t3 = Driver->addRenderTargetTexture(dimension2d<u32>(760,970), "3", ECF_A8R8G8B8); 
    ITexture *t4 = Driver->addRenderTargetTexture(dimension2d<u32>(780,990), "4", ECF_A8R8G8B8); 
    ITexture *t5 = Driver->addRenderTargetTexture(dimension2d<u32>(780,990), "5", ECF_A8R8G8B8); 
    ITexture *t6 = Driver->addRenderTargetTexture(dimension2d<u32>(780,990), "6", ECF_A8R8G8B8); 
    ITexture *t7 = Driver->addRenderTargetTexture(dimension2d<u32>(780,990), "7", ECF_A8R8G8B8);     
   
    //confirming the correct size was created (needs <cstdio>)
    /*
    fprintf(stderr, "%u x %u\n", t7->getSize().Width, t7->getSize().Height);
    */

    
    //clear t7 with black
    Device->run();
    Driver->beginScene();
    DrawRect(Driver, t7, 0, 0, 780, 990, 0, 0, 0, 255);        
    Driver->endScene();
   
    //Destroy t3
    Driver->removeTexture(t3);
        
    //create t8
    ITexture *t8 = Driver->addRenderTargetTexture(dimension2d<u32>(740,950), "8", ECF_A8R8G8B8); 
    
        
    while (Device->run())
    {
         Driver->beginScene();
         
         //draw a red square on T7
         DrawRect(Driver, t7, 100, 100, 100, 100, 255, 0, 0, 255);    
                  
         Driver->setRenderTarget(ERT_FRAME_BUFFER, false);        

         //clear the view matrix before drawing - in OpenGL it will not draw correctly unless we do so.
         matrix4 mat; 
         mat.makeIdentity(); 
         Driver->setTransform(video::ETS_VIEW,mat); 

         Driver->draw2DImage(t7, position2d<s32> (0, 0), 
            rect<s32>(0, 0, t7->getSize().Width, t7->getSize().Height), 0, SColor(255,255,255,255), true);
         Driver->endScene();
    }
       
    Device->drop();
    return 0;
}
The code displays texture t7 on screen, and I would expect to see a black background with a red square. However, I get this (not to scale) -

[screenshot: the red square appears, but the background is filled with garbage pixels instead of black]

The background "corruption" differs depending on what has been blitted or created beforehand, as if the texture is invalid and displaying some random area of memory.

I've tried things like moving the addRenderTargetTexture and removeTexture calls in between the beginScene() and endScene() calls, but that doesn't help. However, if I remove either the call that creates t8 or the one that destroys t3, it works. It also works if I create fewer textures (t1-t6 only).

I'm using OpenGL as I want it to be portable between platforms. When I run with the software renderer I just get a blank screen. The program is compiled with Dev-C++.
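
As an aside, it's probably worth checking up front whether the driver can render to textures at all (which might explain the software renderer's blank screen); a quick sketch -

Code: Select all

//some drivers cannot render to textures at all
if (!Driver->queryFeature(EVDF_RENDER_TO_TARGET))
   fprintf(stderr, "render to target not supported by this driver\n");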

Any help would be appreciated :) Maybe some other people could compile this and see if they get the same issue? Irrlicht doesn't seem to be primarily designed for 2D, so this is a bit of a hack; I'm not sure whether the approach itself is OK.

Post by hybrid »

Hmm, maybe memory is full and the last texture is not created properly. Or your driver rejects textures at some point. We'll have to investigate this some more. Moving to the bug forum.
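
Until then, you could at least verify that each creation call succeeds; an untested sketch (createCheckedRTT is just a made-up helper name) -

Code: Select all

ITexture* createCheckedRTT(IVideoDriver* driver,
	const core::dimension2d<u32>& size, const io::path& name)
{
	ITexture* tex = driver->addRenderTargetTexture(size, name, ECF_A8R8G8B8);
	if (!tex)
		fprintf(stderr, "render target creation failed\n");
	else if (tex->getSize() != size)
		fprintf(stderr, "render target was created at a different size\n");
	return tex;
}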

Possible workaround

Post by virasena »

I've done some investigating and have a possible workaround. The problem seems to be with removing the texture - I can call addRenderTargetTexture many times without a problem, but the call to removeTexture messes the program up. I could be wrong though, since creating fewer textures also works, as does calling the second DrawRect only once.

Anyway, looking at the Irrlicht source, the addRenderTargetTexture implementation for OpenGL is as follows -

Code: Select all

ITexture* COpenGLDriver::addRenderTargetTexture(const core::dimension2d<u32>& size,
					const io::path& name,
					const ECOLOR_FORMAT format)
{
	//disable mip-mapping
	bool generateMipLevels = getTextureCreationFlag(ETCF_CREATE_MIP_MAPS);
	setTextureCreationFlag(ETCF_CREATE_MIP_MAPS, false);

	video::ITexture* rtt = 0;
#if defined(GL_EXT_framebuffer_object)
	// if driver supports FrameBufferObjects, use them
	if (queryFeature(EVDF_FRAMEBUFFER_OBJECT))
	{
		rtt = new COpenGLFBOTexture(size, name, this, format);
		if (rtt)
		{
			bool success = false;
			addTexture(rtt);

			ITexture* tex = createDepthTexture(rtt);
			if (tex)
			{
				success = static_cast<video::COpenGLFBODepthTexture*>(tex)->attach(rtt);
				tex->drop();
			}
			rtt->drop();

			if (!success)
			{
				removeTexture(rtt);
				rtt=0;
			}
		}
	}
	else
#endif
	{
		// the simple texture is only possible for size <= screensize
		// we try to find an optimal size with the original constraints
		core::dimension2du destSize(core::min_(size.Width,ScreenSize.Width), core::min_(size.Height,ScreenSize.Height));
		destSize = destSize.getOptimalSize((size==size.getOptimalSize()), false, false);
		rtt = addTexture(destSize, name, ECF_A8R8G8B8);
		if (rtt)
		{
			static_cast<video::COpenGLTexture*>(rtt)->setIsRenderTarget(true);
		}
	}

	//restore mip-mapping
	setTextureCreationFlag(ETCF_CREATE_MIP_MAPS, generateMipLevels);

	return rtt;
}
The implementation uses framebuffer objects (FBOs) and creates two textures to do this - one for the main render target and a secondary depth texture used as the depth buffer. The depth texture is attached to the main texture with the attach() method, so when you destroy the main texture, the secondary one is destroyed too.

createDepthTexture will reuse an existing depth texture if one of the same size already exists, and I thought this might be the problem. Changing the call to createDepthTexture(rtt, false) makes sure a new depth texture is always created. However, this had no effect and the bug still occurred.
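
For reference, the two variants of the call look like this (if I read the source right, the second parameter is called 'shared') -

Code: Select all

//default: reuse an existing depth texture of the same size, if there is one
ITexture* tex = createDepthTexture(rtt);           //shared == true
//force a fresh depth texture for this render target instead:
//ITexture* tex = createDepthTexture(rtt, false);  //shared == false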

Tracing the removeTexture call: it ends up in the destructor, which first drops the depth texture if one exists, then deletes the frame buffer (used for rendering to texture) -

Code: Select all

COpenGLFBOTexture::~COpenGLFBOTexture()
{
	if (DepthTexture) //this is the main texture (DepthTexture pointer is valid)
		if (DepthTexture->drop())
			Driver->removeDepthTexture(DepthTexture);
	if (ColorFrameBuffer) //this deletes the frame buffer for both main and depth textures
		Driver->extGlDeleteFramebuffers(1, &ColorFrameBuffer);
}
I found that removing the call to delete the frame buffer made the bug go away. Some further tests suggested the problem only occurs when deleting the frame buffer of the main texture. The depth texture also creates a frame buffer, but it isn't used, and deleting that one doesn't cause any problems.

Code: Select all

//this works fine
COpenGLFBOTexture::~COpenGLFBOTexture()
{
	if (DepthTexture) //this is the main texture (DepthTexture pointer is valid)
	{
		if (DepthTexture->drop())
			Driver->removeDepthTexture(DepthTexture);
	}
	else //this is the depth texture
	{
		//only delete the color frame buffer for the depth texture
		if (ColorFrameBuffer)
			Driver->extGlDeleteFramebuffers(1, &ColorFrameBuffer);
	}
	//removed the unconditional delete of the color frame buffer:
	//if (ColorFrameBuffer)
	//	Driver->extGlDeleteFramebuffers(1, &ColorFrameBuffer);
}

Obviously this is not a fix, because removing the call to delete the color frame buffer leaks the framebuffer object.

Funnily enough, as a side note: if I delete the color frame buffer for the main texture but don't call glDeleteTextures to delete the actual texture data, the bug doesn't occur either.
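
One more idea I haven't actually tried: making sure no FBO is bound at the moment it gets deleted, in case the driver chokes on deleting the currently bound framebuffer. Something like this in the destructor (assuming extGlBindFramebuffer is the extension handler's wrapper for glBindFramebufferEXT) -

Code: Select all

//unbind any FBO before deleting ours
Driver->extGlBindFramebuffer(GL_FRAMEBUFFER_EXT, 0);
if (ColorFrameBuffer)
	Driver->extGlDeleteFramebuffers(1, &ColorFrameBuffer);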

Anyway, at the OpenGL level the depth render buffer is attached to the main texture's frame buffer with the following code in the attach() method -

Code: Select all

rtt->bindRTT();
...
Driver->extGlFramebufferRenderbuffer(GL_FRAMEBUFFER_EXT,
	GL_DEPTH_ATTACHMENT_EXT,
	GL_RENDERBUFFER_EXT,
	DepthRenderBuffer);
...
rtt->unbindRTT();
I found that removing the call to extGlFramebufferRenderbuffer also fixed the problem, suggesting that the problem may lie in deleting the color frame buffer while the depth render buffer is still attached. I tried detaching it first before deleting, but that didn't work -

Code: Select all

//this doesn't work, bug still exists
COpenGLFBOTexture::~COpenGLFBOTexture()
{
	if (DepthTexture) //this is the main texture (DepthTexture pointer is valid)
	{
		//detach the depth render buffer first
		bindRTT();
		Driver->extGlFramebufferRenderbuffer(GL_FRAMEBUFFER_EXT,
			GL_DEPTH_ATTACHMENT_EXT,
			GL_RENDERBUFFER_EXT,
			0);
		unbindRTT();

		if (DepthTexture->drop())
			Driver->removeDepthTexture(DepthTexture);
	}

	if (ColorFrameBuffer)
		Driver->extGlDeleteFramebuffers(1, &ColorFrameBuffer);
}
So at the end of the day, the bug still occurred :( However, a depth buffer is not required for 2D rendering, so I created a second function, addRenderTargetTextureB, that simply doesn't attach one -

Code: Select all

ITexture* COpenGLDriver::addRenderTargetTextureB(const core::dimension2d<u32>& size,
					const io::path& name,
					const ECOLOR_FORMAT format)
{
	//version with no depth buffer
	//disable mip-mapping
	bool generateMipLevels = getTextureCreationFlag(ETCF_CREATE_MIP_MAPS);
	setTextureCreationFlag(ETCF_CREATE_MIP_MAPS, false);

	video::ITexture* rtt = 0;
#if defined(GL_EXT_framebuffer_object)
	// if driver supports FrameBufferObjects, use them
	if (queryFeature(EVDF_FRAMEBUFFER_OBJECT))
	{
		rtt = new COpenGLFBOTexture(size, name, this, format);
		if (rtt)
		{
			addTexture(rtt);
			rtt->drop();
		}
	}
	else
#endif
	{
		// the simple texture is only possible for size <= screensize
		// we try to find an optimal size with the original constraints
		core::dimension2du destSize(core::min_(size.Width,ScreenSize.Width), core::min_(size.Height,ScreenSize.Height));
		destSize = destSize.getOptimalSize((size==size.getOptimalSize()), false, false);
		rtt = addTexture(destSize, name, ECF_A8R8G8B8);
		if (rtt)
		{
			static_cast<video::COpenGLTexture*>(rtt)->setIsRenderTarget(true);
		}
	}

	//restore mip-mapping
	setTextureCreationFlag(ETCF_CREATE_MIP_MAPS, generateMipLevels);

	return rtt;
}
So far this has been working fine, and the bug hasn't reappeared (yet ;)). I'll use this function for 2D operations and keep the original one for render-to-texture in a 3D scene.
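
To call the new function from application code I also had to expose a declaration; a sketch of my local patch (my own addition, obviously not part of the stock Irrlicht API) -

Code: Select all

//in IVideoDriver.h (local patch): RTT creation without a depth buffer.
//non-pure, so drivers that don't override it fall back to the normal version.
virtual ITexture* addRenderTargetTextureB(const core::dimension2d<u32>& size,
		const io::path& name = "rt",
		const ECOLOR_FORMAT format = ECF_UNKNOWN)
{
	return addRenderTargetTexture(size, name, format);
}

//application side, used exactly like the original:
ITexture *t = Driver->addRenderTargetTextureB(dimension2d<u32>(780,990), "7b", ECF_A8R8G8B8);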

Post by hybrid »

Yeah, indeed, I also believe there's a bug in the FBO code, but I haven't found time to trace it yet. Maybe 1.7.1 or 1.8 will have the fix.
The texture creation functions will be enhanced in the future, allowing you to choose whether a new depth texture is created, or none at all. That's why I implemented all those classes; it's basically just the missing API that prevents this so far.

Video card drivers were the problem?

Post by virasena »

Quick update - this problem resurfaced a few days later, even without the depth buffer.

Then I decided to try updating the video drivers for my NVidia 6600GT card - from version 94.24 (about two years old) to the latest, 196.21.

This fixed the problem :) So it looks like this was an issue with the OpenGL drivers, not Irrlicht.
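
In case anyone else hits this: logging the driver string at startup makes outdated drivers easy to spot; for the OpenGL driver, getName() should return the GL version string -

Code: Select all

//print which video driver (and, for OpenGL, which GL version) we are running on
Device->getLogger()->log(Driver->getName());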