Shaders released after destroying glx context -> segfault
I finally managed to track down the reason why irrlicht segfaults on exit when GLSL shaders are used. The context is destroyed before the shaders are freed!
Thus glGetAttachedObjects fails silently and never writes to count, which is left holding random stack memory; Irrlicht then reads the shaders[8] array beyond its 8 elements -> segfault. This can be seen in 1.7.2 on all mesa drivers, in stock example 10 (opengl + enable high-level shaders).
The quick fix -> initialize count to 0 in COpenGLSLMaterialRenderer::~COpenGLSLMaterialRenderer().
The proper fix -> don't rely on C++ garbage collection here, free all OpenGL resources before destroying the context.
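For reference, here is roughly what the quick fix looks like. This is a sketch paraphrased from the 1.7-era COpenGLSLMaterialRenderer destructor, so the surrounding details may differ from the actual source:
Code:
// Sketch of the destructor with the quick fix applied.
COpenGLSLMaterialRenderer::~COpenGLSLMaterialRenderer()
{
	if (Program)
	{
		GLhandleARB shaders[8];
		// The fix: start at 0, so a silently failing GL call
		// (no current context) cannot leave count holding
		// random stack memory.
		GLint count = 0;
		Driver->extGlGetAttachedObjects(Program, 8, &count, shaders);
		// Some drivers report more objects than requested; clamp it.
		count = core::min_(count, 8);
		for (GLint i = 0; i < count; ++i)
			Driver->extGlDeleteObject(shaders[i]);
		Driver->extGlDeleteObject(Program);
		Program = 0;
	}
}
With count initialized to 0, a failed query simply deletes nothing instead of indexing past the array.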
Re: Shaders released after destroying glx context -> segfault
hendu wrote: don't rely on C++ garbage collection
C++ doesn't have garbage collection; are you referring to Irrlicht's reference counting?
hendu wrote: Sure it does, destructors get called when objects go out of scope, no? Whatever automatic method is used now.
Yes, but that isn't exactly the garbage collection we need. Languages like Java have generic garbage collection, but that can and in many cases does slow apps down, because it is tricky to optimize a collector to fit the needs of all apps. AFAIK C++ does not have garbage collection; that is one of its greatest powers, but also one of its greatest pitfalls for newcomers.
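To pin the terminology down: what C++ actually guarantees is deterministic destruction, not garbage collection. A minimal standalone illustration (not Irrlicht code):
Code:
#include <cstdio>

struct ScopedResource
{
	ScopedResource()  { std::printf("acquire\n"); }
	~ScopedResource() { std::printf("release\n"); }
};

int main()
{
	{
		ScopedResource r;
	} // r's destructor runs exactly here, when r goes out of scope
	std::printf("after scope\n");
	return 0;
}
In Irrlicht's case the destructors do run; the problem is that they run at a point where the GL context is already gone, which is exactly the bug reported above.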
C++ has absolutely no garbage collection or built-in memory management scheme.
Whenever memory is allocated (e.g. with the new keyword), it always needs to be freed again to prevent memory leaks.
Reference counting is a good solution for freeing memory semi-automatically, but it can still cause a huge mess when you create a cyclic reference somewhere.
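The cyclic-reference trap is easy to demonstrate with any reference-counting scheme. A minimal sketch using std::shared_ptr; the same effect occurs with Irrlicht's grab()/drop() counting:
Code:
#include <memory>

struct Node
{
	std::shared_ptr<Node> other; // strong reference
};

int main()
{
	auto a = std::make_shared<Node>();
	auto b = std::make_shared<Node>();
	a->other = b; // b's refcount: 2
	b->other = a; // a's refcount: 2
	// When a and b go out of scope, each count drops to 1 but
	// never to 0, so neither destructor runs -> leak. A weak
	// reference (std::weak_ptr), or manually breaking the cycle
	// before releasing the last owner, fixes this.
	return 0;
}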
Calling GL functions without a valid context is properly defined, so we don't mess with anything bad here. Moreover, all objects and memory on the GPU are bound to the context and released when it is destroyed, so I doubt we will get any problems here. So far I also didn't see any severe problems in gDebugger, though maybe that's because I only used the stock examples without manually added shaders.
Mesa disagrees:
Code:
GL User Error: calling GL function without a rendering context
Counting them:
Code:
LIBGL_DEBUG=1 ./shadertest 2>&1 | grep User | wc -l
2551
edit: I don't know if I'm reading the right spec, but [1] says:
"Issuing GL commands when the program is not connected to a context results in undefined behavior."
[1] http://www.opengl.org/documentation/spe ... spec20.pdf
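A defensive option on the GLX side, purely a sketch of my own and not something Irrlicht currently does, would be to refuse GL cleanup when no context is current:
Code:
#include <GL/glx.h>

// Guard GL teardown: per the spec excerpt above, issuing GL commands
// without a current context is undefined behavior, so skip the calls
// and accept that the driver reclaims the objects with the context.
static bool haveCurrentGLContext()
{
	return glXGetCurrentContext() != 0;
}

// Usage in a destructor:
//   if (haveCurrentGLContext())
//       { /* delete shaders, buffers, ... */ }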
Hmm, that does sound as if we should take more care with this. I'll check whether this only affects the Linux device; well, actually I'll check whether the Windows device shows problems under gDebugger. At least that might be a reason why I did not see any errors.
Ok, short update: the context is properly destroyed under Windows if the window is closed with Alt-F4. If the console is killed before that, there is a memory leak (i.e. no proper cleanup, though the app is dead anyway) and the context goes away uncleaned. There might be a similar problem under Linux.
Re: Shaders released after destroying glx context -> segfault
I think my app has a problem with resources not being cleaned up under OpenGL on Windows. I'm reusing the Irrlicht device, but after several cycles of loading scenes, meshes, textures, etc., there are megabytes of data that haven't been freed, which eventually crashes the system.
Using D3D, I can reuse the device without any problems.
At present, the only workaround is to delete the device and reload, which causes delays of several seconds between each reload of the device.
If anyone knows how to cleanup the GL driver without closing, please post.
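Two things worth trying, sketched below assuming a stock Irrlicht 1.7 API. The cache-flushing calls are standard API, but whether they release everything the GL driver holds is exactly the open question in this thread:
Code:
#include <irrlicht.h>
using namespace irr;

// Option 1: flush Irrlicht's caches between scene loads and keep the
// device alive. Make sure nothing still references the textures and
// meshes before calling these.
void partialCleanup(IrrlichtDevice* device)
{
	device->getSceneManager()->clear();                 // scene nodes
	device->getSceneManager()->getMeshCache()->clear(); // cached meshes
	device->getVideoDriver()->removeAllTextures();
	device->getVideoDriver()->removeAllHardwareBuffers();
}

// Option 2: the slow but reliable workaround described above: destroy
// the device (and with it the GL context) and create a fresh one.
IrrlichtDevice* fullReload(IrrlichtDevice* device)
{
	device->closeDevice();
	device->drop();
	return createDevice(video::EDT_OPENGL,
			core::dimension2d<u32>(800, 600)); // illustrative size
}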