Discuss: Event-driven rendering
-
- Competition winner
- Posts: 688
- Joined: Mon Sep 10, 2012 8:51 am
Discuss: Event-driven rendering
Once upon a time, someone suggested an event-driven scene manager:
http://irrlicht.sourceforge.net/forum/v ... en#p200532
While this idea isn't so practical for a scene manager, it would make sense to redraw GUI elements based on user events. I imagine this would speed things up significantly provided GUI elements are not transparent over some active scene. But rather than doing it for the whole system, it might be simpler just to do it on a per-element basis. If the element has been changed, it sets a flag that tells it to redraw itself when draw() is called. When the flag is not set, the child draw() functions are still called so as to let them decide if they need to update.
The difficult part would be determining all of the methods in the existing GUI elements that trigger the need for updating what is drawn.
I'm still kinda toying with the idea at the moment, but I wanted to know your thoughts.
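To make the per-element flag idea a bit more concrete, here is a minimal sketch of what such an element could look like. It assumes a custom element derived from IGUIElement with a made-up Dirty flag and markDirty() helper; nothing like this exists in Irrlicht yet:
Code: Select all
#include <irrlicht.h>
using namespace irr;

// Sketch only: an element that repaints its own visuals when flagged,
// but still calls its children so they can apply the same test.
class DirtyAwareElement : public gui::IGUIElement
{
public:
    DirtyAwareElement(gui::IGUIEnvironment* env, gui::IGUIElement* parent,
                      s32 id, const core::rect<s32>& rect)
        : gui::IGUIElement(gui::EGUIET_ELEMENT, env, parent, id, rect),
          Dirty(true) // draw at least once
    {}

    // Anything that changes the element's appearance would call this.
    void markDirty() { Dirty = true; }

    virtual void draw()
    {
        if (!IsVisible)
            return;

        if (Dirty)
        {
            // Repaint this element's own visuals here, e.g. through
            // Environment->getSkin()->draw2DRectangle(...).
            Dirty = false;
        }

        // The base implementation calls draw() on all children, letting
        // each child decide whether it needs to update.
        gui::IGUIElement::draw();
    }

private:
    bool Dirty;
};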
-
- Posts: 1638
- Joined: Mon Apr 30, 2007 3:24 am
- Location: Montreal, CANADA
- Contact:
Re: Discuss: Event-driven rendering
I think it would be possible. You would have to "capture" the GUI rendered buffer and put it back when the scene manager renders everything.
Let's say the main render loop looks like this:
Code: Select all
smgr->drawAll();       // render the 3D scene as usual
guibuffer->draw();     // paste the previously captured GUI back over it
Then you would render (and update) the GUI stuff in the event receiver when it receives a GUI event. Let's say:
Code: Select all
guienv->drawAll();     // redraw the GUI because something changed
guibuffer->capture();  // capture the result for reuse on later frames
You will surely have to do some tweaks, as you would need to refresh the buffer each time the GUI is changed and process everything that produces GUI events. (For example, a keyboard event that changes a GUI element, or a direct code change to a GUI element, would also have to be accounted for.)
The tricky part would be the "guibuffer" in this example. It would "capture" the rendered GUI display and then only "paste" the result back in the refresh loop. Looking at it, "guibuffer" could be a scene manager node that takes care of this (redrawing the updated buffer when draw is requested by the scene manager...).
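To illustrate the event receiver side of that, here is a rough sketch. The guiNeedsRedraw flag and the guibuffer object are assumptions carried over from the example above, not anything that exists in Irrlicht:
Code: Select all
#include <irrlicht.h>
using namespace irr;

// Assumed flag: the main loop checks it before re-rendering the GUI.
bool guiNeedsRedraw = true; // draw at least once

class GuiRedrawReceiver : public IEventReceiver
{
public:
    virtual bool OnEvent(const SEvent& event)
    {
        // Any GUI event means something on screen may have changed.
        if (event.EventType == EET_GUI_EVENT)
            guiNeedsRedraw = true;

        return false; // let Irrlicht keep processing the event normally
    }
};

// Inside the render loop, the two snippets above would then combine to:
//
//   smgr->drawAll();
//   if (guiNeedsRedraw)
//   {
//       guienv->drawAll();
//       guibuffer->capture();   // hypothetical, as described above
//       guiNeedsRedraw = false;
//   }
//   else
//   {
//       guibuffer->draw();      // just paste the captured GUI back
//   }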
-
- Competition winner
- Posts: 688
- Joined: Mon Sep 10, 2012 8:51 am
Re: Discuss: Event-driven rendering
Those are good ideas. That would probably be the most memory-efficient approach. To account for GUI changes, there could be a second buffer that is rendered over the first and contains only the active GUI element. When the active GUI element changes, both buffers need to be updated, but since Irrlicht 1.8 has sub-elements, that updating wouldn't happen as often as it would without that system.
A nice benefit of this is that the second buffer only needs to be cleared around the area where the active GUI element was.
I'm thinking this could actually be implemented in the GUI skin without modifying the GUI environment or the current GUI elements, except those that call the video driver directly in order to update. This is because the GUI skin functions require the GUI element to pass itself to the skin, which can be used to check whether it is the active GUI element and thus whether it should be drawn on the second render buffer.
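As a rough illustration of the composite step only, the end of the render loop could look like the sketch below. The two textures (guiBackBuffer for inactive elements, guiFrontBuffer for the active element) are assumed to be screen-sized render-target textures that the GUI was drawn into elsewhere; none of this is existing Irrlicht behaviour:
Code: Select all
#include <irrlicht.h>
using namespace irr;

// Paste the two GUI buffers over the already-rendered scene,
// inactive elements first, the active element on top.
void compositeGui(video::IVideoDriver* driver,
                  video::ITexture* guiBackBuffer,
                  video::ITexture* guiFrontBuffer)
{
    const core::dimension2d<u32> size = driver->getScreenSize();
    const core::rect<s32> screen(0, 0, (s32)size.Width, (s32)size.Height);

    driver->draw2DImage(guiBackBuffer, core::position2d<s32>(0, 0), screen,
                        0, video::SColor(255, 255, 255, 255),
                        true); // use the texture's alpha channel
    driver->draw2DImage(guiFrontBuffer, core::position2d<s32>(0, 0), screen,
                        0, video::SColor(255, 255, 255, 255),
                        true);
}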
-
- Posts: 1638
- Joined: Mon Apr 30, 2007 3:24 am
- Location: Montreal, CANADA
- Contact:
Re: Discuss: Event-driven rendering
There could be a rendered output per GUI element, with each element drawn to its own texture as a second "buffer".
You would then need to update the final rendering via some kind of "compositor" that puts all the textures onto the main buffer. So if a GUI element changes, the "compositor" only changes the affected element in the composition. In essence, the "compositor" would do "Photoshop"-style composition of elements by using their alpha channels, with each GUI texture acting as a "layer".
The problem with this is that you would have to modify Irrlicht extensively to let it render its elements into a texture instead of directly, and you would need some kind of flag (like "setDirty") and an index (for the "compositor") so that, when an element needs to be updated, the layers stay in the same order the GUI environment draws them. I think that would have to be done directly inside the IGUIElement class.
The first method I mentioned would require fewer changes to the current structure, and I think it would be easier to do than this. (The first method could be used as a "first iteration" and this one as a second...)
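To picture that "compositor", here is a very small sketch. It assumes each element has already been rendered into its own layer texture, kept in draw order; the GuiLayer struct and its dirty flag are made up for the example:
Code: Select all
#include <irrlicht.h>
#include <cstddef>
#include <vector>
using namespace irr;

// Hypothetical per-element layer: one texture per GUI element.
struct GuiLayer
{
    video::ITexture* texture;  // the element rendered into its own texture
    core::position2d<s32> pos; // where the element sits on screen
    bool dirty;                // set via a hypothetical "setDirty"; the step
                               // that re-renders the element into its texture
                               // (not shown here) would check this flag
};

// "Photoshop"-style composition: paste every layer onto the main buffer,
// back to front, using each layer's alpha channel.
void composeLayers(video::IVideoDriver* driver,
                   const std::vector<GuiLayer>& layers)
{
    for (std::size_t i = 0; i < layers.size(); ++i)
    {
        const GuiLayer& layer = layers[i];
        const core::dimension2d<u32> size = layer.texture->getSize();
        const core::rect<s32> src(0, 0, (s32)size.Width, (s32)size.Height);

        driver->draw2DImage(layer.texture, layer.pos, src, 0,
                            video::SColor(255, 255, 255, 255),
                            true); // respect the layer's alpha
    }
}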
-
- Competition winner
- Posts: 688
- Joined: Mon Sep 10, 2012 8:51 am
Re: Discuss: Event-driven rendering
I had considered that idea, but came to the same conclusions you did. Furthermore, that method would be more memory-intensive because it requires a buffer for every single element. It would also lose one of the major benefits of event-driven rendering: avoiding redrawing transparent elements over each other. Your first post about the single buffer inspired me to think of a double-buffer system, which I believe could be done via the GUI skin. It does have the slight disadvantage of being twice the size, but that's only one extra image.
I suppose it's hard to tell how effective it will be without an implementation.
Re: Discuss: Event-driven rendering
This is pretty pointless compared to eliminating all sorts of searching (binary or not, it doesn't matter) from Irrlicht and being able to compile a DEPRECATED-feature-free Irrlicht.
Again, I would push for vector2d, 3d, and 4d based on SSE like they do in Bullet Physics.
And eliminating pointless matrix transforms; I mean, why does every call to getAbsoluteTransform() need to do a matrix multiplication of EVERY SINGLE ANCESTOR MATRIX???
Oh, and dropping the OpenGL 3.1 compatibility stuff would be epic too.
-
- Competition winner
- Posts: 688
- Joined: Mon Sep 10, 2012 8:51 am
Re: Discuss: Event-driven rendering
@Devsh - That's an almost completely off-topic post (unless somehow vector4d is related to this topic). This thread isn't about updates to Irrlicht; it's about event-driven rendering.
And no, it isn't pointless if your program is entirely GUI. I think you've been working on that game too long.
-
- Posts: 1638
- Joined: Mon Apr 30, 2007 3:24 am
- Location: Montreal, CANADA
- Contact:
Re: Discuss: Event-driven rendering
@devsh:
The only thing I see that seems on topic is optimization. But how are the optimizations you are talking about related to getting better performance out of the GUI system?
If you would like to propose better ways to improve Irrlicht's general performance, and it is not directly related to what we are discussing in this thread, I would suggest writing up the details of your proposal and opening a new thread on that subject. Going off topic myself for a moment: congratulations on the work you accomplished on "Build a world".
Getting back on the subject: it would be really nice if we could redirect the GUI drawing to an ITexture (in IGUIElement::draw()).
With this, it would surely be possible to test these ideas.
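For what it's worth, the render-target half of that can already be tried from the application side with the existing driver API (setRenderTarget / addRenderTargetTexture), without touching IGUIElement::draw() at all. A minimal sketch, assuming the driver reports EVDF_RENDER_TO_TARGET support and a screen-sized "gui_rt" texture created by us:
Code: Select all
#include <irrlicht.h>
using namespace irr;

// Render the whole GUI into a texture, then restore the back buffer.
// The texture can afterwards be pasted over the scene with draw2DImage().
void drawGuiToTexture(video::IVideoDriver* driver,
                      gui::IGUIEnvironment* guienv,
                      video::ITexture* guiTarget)
{
    // Redirect drawing into the texture, cleared to fully transparent.
    driver->setRenderTarget(guiTarget, true, true,
                            video::SColor(0, 0, 0, 0));
    guienv->drawAll();

    // Back to the normal back buffer.
    driver->setRenderTarget(0, false, false);
}

// Setup would look roughly like this:
//   video::ITexture* guiTarget = 0;
//   if (driver->queryFeature(video::EVDF_RENDER_TO_TARGET))
//       guiTarget = driver->addRenderTargetTexture(
//           core::dimension2d<u32>(1024, 768), "gui_rt");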
-
- Competition winner
- Posts: 688
- Joined: Mon Sep 10, 2012 8:51 am
Re: Discuss: Event-driven rendering
To have the GUI elements draw on an ITexture, we'd change the render target of the video driver temporarily. Then, when all of the elements are drawn, we draw the final texture.
To control the GUI element drawing, we would make the GUISkin also be an IGUIElement that would be directly attached to the root GUI element.
rootGUIElement -> GUISkin -> all other GUI elements
This way, the GUISkin would have a draw() function through which we could control when other GUI elements are drawn. The pseudo code for the GUISkin::draw() function might look something like this:
Code: Select all
if update needed:
    draw children (which buffer they draw on is handled by the other GUISkin functions)
    draw GUI back buffer (inactive elements)
    draw GUI front buffer
A GUI skin drawing function would look something like this:
Code: Select all
if ( element passed in is the GUI environment's focused element )
    set video driver render target to GUI front buffer
else
    set video driver render target to GUI back buffer
perform the usual operations associated with this function
Of course, to optimize this code, I would copy and paste some contents from the methods in CGUISkin.cpp.
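Turning that pseudocode into a rough C++ sketch: everything below (the BufferedSkinElement class, the two render-target textures, the NeedsUpdate flag) is an assumption layered on top of the existing IGUIElement/IGUIEnvironment interfaces, not working Irrlicht code, and the real version would route the focus test through the skin functions rather than a helper:
Code: Select all
#include <irrlicht.h>
using namespace irr;

// Sketch only: a "skin element" sitting between the root element and the
// rest of the GUI, owning the two render-target textures.
class BufferedSkinElement : public gui::IGUIElement
{
public:
    BufferedSkinElement(gui::IGUIEnvironment* env, gui::IGUIElement* parent,
                        const core::rect<s32>& rect,
                        video::ITexture* backBuffer,
                        video::ITexture* frontBuffer)
        : gui::IGUIElement(gui::EGUIET_ELEMENT, env, parent, -1, rect),
          BackBuffer(backBuffer), FrontBuffer(frontBuffer), NeedsUpdate(true)
    {}

    void setNeedsUpdate() { NeedsUpdate = true; }

    // The skin drawing functions would call this before drawing an element,
    // so the focused element lands on the front buffer and everything else
    // on the back buffer.
    void selectTargetFor(video::IVideoDriver* driver, gui::IGUIElement* element)
    {
        const bool focused = (Environment->getFocus() == element);
        driver->setRenderTarget(focused ? FrontBuffer : BackBuffer,
                                false, false);
    }

    virtual void draw()
    {
        video::IVideoDriver* driver = Environment->getVideoDriver();

        if (NeedsUpdate)
        {
            // Children draw into the buffers; which buffer each one hits
            // is decided in selectTargetFor().
            gui::IGUIElement::draw();
            driver->setRenderTarget(0, false, false);
            NeedsUpdate = false;
        }

        // Paste the buffers back: inactive elements first, focused on top
        // (assuming both textures are screen-sized).
        const core::dimension2d<u32> size = driver->getScreenSize();
        const core::rect<s32> src(0, 0, (s32)size.Width, (s32)size.Height);
        driver->draw2DImage(BackBuffer, core::position2d<s32>(0, 0), src,
                            0, video::SColor(255, 255, 255, 255), true);
        driver->draw2DImage(FrontBuffer, core::position2d<s32>(0, 0), src,
                            0, video::SColor(255, 255, 255, 255), true);
    }

private:
    video::ITexture* BackBuffer;
    video::ITexture* FrontBuffer;
    bool NeedsUpdate;
};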