Shaders released after destroying glx context -> segfault

Found a bug in the engine and you're sure it's not a problem in your code? Just post it here. Please read the bug posting guidelines first.
Post Reply
hendu
Posts: 2600
Joined: Sat Dec 18, 2010 12:53 pm

Shaders released after destroying glx context -> segfault

Post by hendu »

I finally managed to track down the reason why irrlicht segfaults on exit when GLSL shaders are used. The context is destroyed before the shaders are freed!

Thus glGetAttachedObjects doesn't write anything to count, which is left holding random stack memory; Irrlicht then indexes the shaders[8] array beyond its bounds -> segfault.


The quick fix -> initialize count to 0 in COpenGLSLMaterialRenderer::~COpenGLSLMaterialRenderer().

The proper fix -> don't rely on C++ garbage collection here, free all OpenGL resources before destroying the context.
ChaiRuiPeng
Posts: 363
Joined: Thu Dec 16, 2010 8:50 pm
Location: Somewhere in the clouds.. drinking pink lemonade and sunshine..

Re: Shaders released after destroying glx context -> segf

Post by ChaiRuiPeng »

hendu wrote:don't rely on C++ garbage collection

C++ doesn't have garbage collection :P Are you referring to Irrlicht's reference counting?
ent1ty wrote: success is a matter of concentration and desire
Butler Lampson wrote: all problems in Computer Science can be solved by another level of indirection
at a cost measured in computer resources ;)
hendu
Posts: 2600
Joined: Sat Dec 18, 2010 12:53 pm

Post by hendu »

Whatever automatic method is used now :P
Sure it does, destructors get called when objects go out of scope, no?


This can be seen in 1.7.2 on all mesa drivers, in stock example 10 (opengl + enable high-level shaders).
ChaiRuiPeng
Posts: 363
Joined: Thu Dec 16, 2010 8:50 pm
Location: Somewhere in the clouds.. drinking pink lemonade and sunshine..

Post by ChaiRuiPeng »

hendu wrote:Whatever automatic method is used now :P
Sure it does, destructors get called when objects go out of scope, no?


This can be seen in 1.7.2 on all mesa drivers, in stock example 10 (opengl + enable high-level shaders).
Yes, but that isn't exactly the garbage collection we need.

Languages like Java have general-purpose garbage collection, but that can and in many cases does slow down apps, because it is tricky to tune a collector to fit the needs of every app.

AFAIK C++ does not have garbage collection; that's one of its greatest strengths, but also one of the biggest pitfalls for newcomers.
Radikalizm
Posts: 1215
Joined: Tue Jan 09, 2007 7:03 pm
Location: Leuven, Belgium

Post by Radikalizm »

C++ has absolutely no garbage collection or built-in memory management scheme.

Whenever memory is allocated (e.g. with the new keyword) it always needs to be freed explicitly to prevent memory leaks.

Reference counting is a good solution for freeing memory semi-automatically, but it can still cause a huge mess when you create a cyclic reference somewhere.
hendu
Posts: 2600
Joined: Sat Dec 18, 2010 12:53 pm

Post by hendu »

Kay, kay, bad wording. But the C++ memory discussion is hardly relevant to the bug; can anyone reproduce?
hybrid
Admin
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany
Contact:

Post by hybrid »

I've initialised the value to 0 in the extension functions.
hendu
Posts: 2600
Joined: Sat Dec 18, 2010 12:53 pm

Post by hendu »

BTW, there are about 900 other OpenGL calls made after destroying the context :P Illegal in OpenGL...

This is just in my simple app, I expect there would be a ton more in a complex app that used more shaders etc.
hybrid
Admin
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany
Contact:

Post by hybrid »

Calling functions without a valid context is properly defined, so we don't mess with anything bad here. Moreover, all objects and memory on the GPU will be bound to the context and released when it's destroyed. So I doubt we will get any problem here. But so far I also didn't get any severe problems with gDebugger, maybe because I just used those examples without manually added shaders.
hendu
Posts: 2600
Joined: Sat Dec 18, 2010 12:53 pm

Post by hendu »

Mesa disagrees:

GL User Error: calling GL function without a rendering context

LIBGL_DEBUG=1 ./shadertest 2>&1 | grep User | wc -l
2551

edit: I don't know if I'm reading the right spec, but [1] says:
Issuing GL commands when the program is not connected to a context results in undefined behavior.
[1] http://www.opengl.org/documentation/spe ... spec20.pdf
hybrid
Admin
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany
Contact:

Post by hybrid »

Hmm, that indeed sounds as if we should take more care with this. I'll check whether this only affects the Linux device - well, actually I'll check whether the Windows device causes problems under gDebugger. At least that might be a reason why I did not see any errors.

Ok, short update: The context is properly destroyed under Windows if the window is closed with Alt-F4. If the console is killed before that, there is a mem leak (i.e. no proper cleanup, though the app is dead anyway) and the context goes out of service uncleaned. There might be a similar problem under Linux.
robmar
Posts: 1125
Joined: Sun Aug 14, 2011 11:30 pm

Re: Shaders released after destroying glx context -> segfaul

Post by robmar »

I think my app has a problem with resources not being cleaned up under OpenGL on Windows. I'm reusing the Irrlicht device, but after several cycles of loading scenes, meshes, textures, etc., there are megabytes of data that haven't been freed, which eventually crashes the system.

Using D3D, I can reuse the device without any problems.

At present, the only workaround is to delete the device and reload, which causes delays of seconds between each reload of the device.

If anyone knows how to clean up the GL driver without closing it, please post.
Post Reply