Sharing OpenGL context with Irrlicht

floriang
Posts: 13
Joined: Mon Jul 14, 2014 12:03 pm

Sharing OpenGL context with Irrlicht

Post by floriang »

Hello,
I am developing a library that basically needs to render to a few textures, using data from a database, so that the user of the library can then use those textures in OpenGL to draw his application (via the OpenGL texture names).

I am using Irrlicht because it has been easy to use so far; until now I was drawing directly to the screen and it worked like a charm.
I also like the architecture and the overall simplicity (the code is not difficult to browse, which makes tweaking it feasible).

So now I am trying to share an OpenGL context with Irrlicht, and then draw a texture that I get from Irrlicht (via COpenGLTexture::getOpenGLTextureName()) with plain OpenGL.
Actually I do not need Irrlicht to render to the screen at all; everything offscreen is fine for me. I just want to render a scene to a texture and then use that texture from the other OpenGL context.
The Irrlicht rendering is a little complex, but in the end I just render the resulting texture fullscreen.

What I have tried so far:
1. Initialize Irrlicht from a window handle, and create an OpenGL context with another window handle (I am actually rendering to panels in a Windows Forms application, but that works the same from what I have seen).
2. After I get the OpenGL rendering context, I call wglShareLists() to share this new rendering context with Irrlicht's.
I get Irrlicht's rendering context via driver->getExposedVideoData().
The call to wglShareLists() seems to succeed (it returns true).
3. In my main rendering function, I render my drawing once to a texture with Irrlicht (using addRenderTargetTexture()/setRenderTarget()), then get the Irrlicht texture's name via getOpenGLTextureName() and pass the result to OpenGL (see the sketch after this list).
To draw in OpenGL, I clear the back and z buffers, load the identity matrix, and draw two triangles onto which I map the texture.
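Roughly, step 3 looks like this (just a sketch: the 512x512 size, "driver", "smgr" and the handle names are placeholders, and the cast to COpenGLTexture needs the internal COpenGLTexture.h header plus the small export modification I mention below):

// Render the Irrlicht scene once into a render-target texture.
irr::video::ITexture* rt =
    driver->addRenderTargetTexture(irr::core::dimension2d<irr::u32>(512, 512), "rt");

driver->beginScene(true, true, irr::video::SColor(255, 0, 0, 0));
driver->setRenderTarget(rt, true, true, irr::video::SColor(255, 0, 0, 0));
smgr->drawAll();
driver->setRenderTarget(0, false, false, irr::video::SColor(0, 0, 0, 0));
driver->endScene();

// Grab the raw GL texture name and use it from the other (shared) context.
GLuint texName = static_cast<irr::video::COpenGLTexture*>(rt)->getOpenGLTextureName();
// wglMakeCurrent(myHDC, myHRC);
// glBindTexture(GL_TEXTURE_2D, texName);
// ... draw the two textured triangles ...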

The problem I have is that all I get is a black screen.
I double-checked the code by rendering a self-loaded texture from a bitmap in OpenGL on one side, and by checking Irrlicht's output on the panel on the other side.
I also seem to get the correct name for the texture used by Irrlicht (I checked with a debugger that the values match).

I also noticed that in my Irrlicht rendering process, if I call endScene(), the drawing result is displayed on the panel I used to initialize Irrlicht.
What could be the problem?

Do you think this can be done with this approach, or do you recommend another one?
Did anyone ever do anything similar?

Note that I had to make a minor modification to export getOpenGLTextureName() from COpenGLTexture.

Otherwise, I was wondering if I could create a rendering context for Irrlicht myself and pass it somewhere when initializing the Irrlicht device (createDeviceEx()).
That doesn't seem to be supported out of the box?
I did see that we can pass our own SExposedVideoData to beginScene(), but I don't really understand how that works either (see the sketch below).
Can we completely ignore the rendering context created by createDeviceEx() if we use that extra argument of beginScene()?
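For reference, this is the call form I mean; just a sketch, where "otherHwnd", "otherHdc" and "otherHrc" are placeholder handles and I have not verified exactly how the OpenGL driver interprets them:

// Pass our own window/context handles for this frame via SExposedVideoData.
irr::video::SExposedVideoData vd;
vd.OpenGLWin32.HWnd = reinterpret_cast<void*>(otherHwnd);
vd.OpenGLWin32.HDc  = reinterpret_cast<void*>(otherHdc);
vd.OpenGLWin32.HRc  = reinterpret_cast<void*>(otherHrc);
driver->beginScene(true, true, irr::video::SColor(255, 0, 0, 0), vd);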
I am using Irrlicht 1.8, by the way, and I will have to make this work on the OpenGL ES branch too once I get this OpenGL sample fixed.

Also, if anyone has a working sample that shows how to mix plain OpenGL and Irrlicht, that would help greatly.

Thank you.
Florian
Nadro
Posts: 1648
Joined: Sun Feb 19, 2006 9:08 am
Location: Warsaw, Poland

Re: Sharing OpenGL context with Irrlicht

Post by Nadro »

You shouldn't use the shared-context feature, and you don't need it in this case. The performance of this approach is really bad and it is buggy on some drivers. You should use Irrlicht's context instead. In trunk and ogl-es we use a cache for OpenGL states, so if you want to modify OpenGL states with pure OpenGL functions, you have to restore the original states before any Irrlicht rendering call.
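Roughly something like this (just a sketch; driver, smgr and guienv are the usual device objects, and glPushAttrib/glPopAttrib is desktop GL only and does not cover everything, e.g. buffer bindings, so in ogl-es you have to restore the states you touched by hand):

driver->beginScene(true, true, irr::video::SColor(255, 0, 0, 0));
smgr->drawAll();                     // Irrlicht rendering

glPushAttrib(GL_ALL_ATTRIB_BITS);    // save the fixed-function state (desktop GL only)
// ... your pure OpenGL drawing here ...
glPopAttrib();                       // restore it before the next Irrlicht call

guienv->drawAll();                   // more Irrlicht rendering is safe again
driver->endScene();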
Library helping with network requests, tasks management, logger etc in desktop and mobile apps: https://github.com/GrupaPracuj/hermes
floriang
Posts: 13
Joined: Mon Jul 14, 2014 12:03 pm

Re: Sharing OpenGL context with Irrlicht

Post by floriang »

Hello Nadro,

Edit
I originally said here that it worked, and it looked like it at first, but the result is buggy.
So I will go with your proposal (using the same context for the OpenGL rendering).

Thanks.
Florian

=========== Disregard the comment below ===============
Thank you for your reply.
Actually I was able to make it work with two different contexts!
My main mistake was to create a new device context. It seems that the two shared rendering contexts need to use the same device context, so I now use the one created by Irrlicht.
Then I can share two different rendering contexts and it works.
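For reference, the setup that works for me looks roughly like this (a sketch only, error handling omitted; the member names come from SExposedVideoData.h):

// Reuse Irrlicht's device context instead of creating a new one.
const irr::video::SExposedVideoData& ed = driver->getExposedVideoData();
HDC   irrHDC = reinterpret_cast<HDC>(ed.OpenGLWin32.HDc);
HGLRC irrHRC = reinterpret_cast<HGLRC>(ed.OpenGLWin32.HRc);

HGLRC myHRC = wglCreateContext(irrHDC);   // second rendering context on the same HDC
wglShareLists(irrHRC, myHRC);             // share textures etc. with Irrlicht's context

// Plain OpenGL drawing goes through the second context...
wglMakeCurrent(irrHDC, myHRC);
// ... draw the fullscreen quad with the Irrlicht texture here ...
// ...and Irrlicht gets its own context back before the next beginScene():
wglMakeCurrent(irrHDC, irrHRC);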

I also had to remove the SwapBuffers() call from endScene() in Irrlicht's OpenGL driver to avoid flickering.
It would be nice if the buffer swap could be made an option of endScene(),
e.g. endScene(true) -> buffers are swapped,
but endScene(false) -> the buffer swap is left to the user.
A little bit like the option of not clearing the back buffer in beginScene().

About performance, I was actually wondering:
I have read that if I have two rendering contexts, I can have one thread per context.
This could be good in my case, because I could then do my texture creation on one CPU core and let the user do the UI rendering on another. I'm pretty sure that would help performance if the GPU can handle it (but can it really?).

What do you think about it?
Should I avoid multithreading even if I have two contexts?
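What I have in mind is roughly this (a sketch only, Windows/WGL; "running", the handle names and the work inside the loop are placeholders):

#include <atomic>
#include <thread>
#include <windows.h>
#include <GL/gl.h>

std::atomic<bool> running(true);

// A rendering context can be current on only one thread at a time,
// so the worker binds its own (shared) context once and keeps it.
void textureWorker(HDC hdc, HGLRC workerRC)
{
    wglMakeCurrent(hdc, workerRC);
    while (running)
    {
        // ... render the database-driven textures here ...
        glFlush();   // make the results visible to the other context
    }
    wglMakeCurrent(0, 0);
}

// std::thread worker(textureWorker, irrHDC, myHRC);   // Irrlicht/UI stays on the main thread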

At the moment I have also implemented what you advised, so I think I will fall back to it if I run into issues.
floriang
Posts: 13
Joined: Mon Jul 14, 2014 12:03 pm

Re: Sharing OpenGL context with Irrlicht

Post by floriang »

Hello again,

Edit:
Never mind, I found it:

// Reset all of the driver's transforms to identity...
irdriver->setTransform(irr::video::ETS_WORLD, irr::core::matrix4());
irdriver->setTransform(irr::video::ETS_VIEW, irr::core::matrix4());
irdriver->setTransform(irr::video::ETS_PROJECTION, irr::core::matrix4());
irdriver->setTransform(irr::video::ETS_TEXTURE_0, irr::core::matrix4());
// ...draw a degenerate line so the driver actually applies them...
irdriver->draw3DLine( irr::core::vector3df(), irr::core::vector3df(), 0x0 );
// ...and unset the render target.
irdriver->setRenderTarget(0, false, false, 0x0);

Note that I had to force draw an empty line so that the transformation matrices are actually updated by the driver.


*************************************
Following the previous discussion, I am now able to draw properly using OpenGL.
Now I have to do the same using OpenGL ES 2.

My problem is: after doing a draw pass with Irrlicht, how do I reset the state?

In openGL I was doing:

static void reset_opengl_state() {
    glDisable(GL_TEXTURE_2D);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();   // reset the modelview matrix
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();   // reset the projection matrix
    glMatrixMode(GL_TEXTURE);
    glLoadIdentity();   // reset the texture matrix
    glFlush();
}

But what can I do for OpenGL ES 2, where none of those fixed-function calls exist?

Actually, is there an Irrlicht function that could do this for me?

So that I could do my scene->drawAll()
and then something like IrDriver->resetDrawState()?
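In case it helps, this is what I am considering for the ES2 side. Just a sketch: ES2 has no fixed-function matrix stack, so "resetting" mostly means unbinding whatever my raw GL code bound, and the exact list of states worth touching is my assumption:

#include <GLES2/gl2.h>

static void reset_gles2_state() {
    glUseProgram(0);                         // no shader program bound
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);    // back to the default framebuffer (0 on most EGL setups)
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, 0);
    glDisable(GL_BLEND);
    glDisable(GL_SCISSOR_TEST);
    glFlush();
}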

Thank you.
pyro
Posts: 1
Joined: Mon Nov 10, 2014 11:15 am

Re: Sharing OpenGL context with Irrlicht

Post by pyro »

I'm working on a Qt project and I'm trying to use Irrlicht with it.
Qt creates its own OpenGL context.
It would indeed be useful to have an option to disable the buffer swap during endScene().
Using two contexts seems to work on Windows; I'm trying to do the same on Android with OpenGL ES.
pandoragami
Posts: 226
Joined: Wed Jan 26, 2011 5:37 pm

Re: Sharing OpenGL context with Irrlicht

Post by pandoragami »

I know it's not easy, but you should just ditch Irrlicht and use OpenGL only. If you have written a lot of Irrlicht code it's probably not that simple, but mixing the two (Irrlicht and OpenGL) is even harder than using one or the other. It would be like creating a whole new library on top of a library that was itself built on top of OpenGL, which is bizarre indeed.