
How does real-time lighting get vertex context to the shader?

Posted: Thu May 30, 2024 5:06 pm
by Markuss
I understand how vertex shaders and fragment shaders work. When a game engine has a moving point light, that light has to change the colors of the vertices around it.

I understand the concept of shadow mapping, where a scene is rendered from the light's point of view, anything the light can see is marked in a texture, and that texture is then sent to the vertex shader and used to darken a vertex if it falls within that texture.

I don't understand how basic dynamic lights work in game engines like Unity or Godot or, I guess, Irrlicht. I think this is called real-time lighting? Is it the same technique as shadow mapping?

Re: How does real-time lighting get vertex context to the shader?

Posted: Thu May 30, 2024 7:06 pm
by CuteAlien
That's a pretty huge question. Shadow mapping is about calculating shadows - which is for the most part independent of calculating lights.
There's a website which covers most of the techniques used in modern engines: https://learnopengl.com/Lighting/Basic-Lighting
It's way better than my short explanation below ;-)

Irrlicht for the most part doesn't do the stuff from that website. By default it uses lighting calculations from the so-called fixed function pipeline. That's the functionality older OpenGL/D3D versions had built-in (in the ogl-es branch we emulate it with our own shaders, as ES2 no longer has it built-in). Fixed function means you can only modify existing functions with a few parameters, unlike with shaders where you write and run your own code on the GPU.

Those fixed function light calculations work like this: you can set up 1-8 lights, and then every vertex in your models checks the angle between its vertex normal and the light. Very roughly you can think of it as a polygon getting its "real" color when the vertex normal points exactly towards a light. Rotated more than 90° away from the light things go dark, and between 0° and 90° some interpolation is used. The exact calculation is the Phong reflection model. Several lights add up their colors. If you enable Gouraud shading (enabled by default in Irrlicht) then each pixel on a polygon interpolates between the values at its 3 vertex corners. And lastly, to avoid polygons facing away from all lights being completely black, the ambient color value is added (think of ambient as the simplest approximation of light reflecting from the environment). Note that at this point no polygon knows it's behind another polygon; it only knows its angles towards the lights.
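To make that per-vertex idea concrete, here is a minimal C++ sketch of just the diffuse + ambient part. Everything in it (Vec3, shadeVertex, ...) is made up for illustration; the real fixed function pipeline also handles specular, attenuation and several lights adding up, and runs on the GPU.

Code:

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

static Vec3 normalize(const Vec3& v)
{
    const float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Brightness factor in [0,1] for one vertex and one light: full brightness
// when the normal points straight at the light, falling off to 0 at 90
// degrees and beyond. The ambient term keeps back-facing vertices from
// going completely black.
float shadeVertex(const Vec3& vertexPos, const Vec3& vertexNormal,
                  const Vec3& lightPos, float ambient)
{
    const Vec3 toLight = normalize({ lightPos.x - vertexPos.x,
                                     lightPos.y - vertexPos.y,
                                     lightPos.z - vertexPos.z });
    const float diffuse = std::max(0.f, dot(normalize(vertexNormal), toLight));
    return std::min(1.f, ambient + diffuse);
}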

Irrlicht has some materials like EMT_NORMAL_MAP_SOLID and EMT_PARALLAX_MAP_SOLID which go beyond the fixed function pipeline and use built-in shaders. Those are still similar to the fixed function lighting, but are calculated per pixel/fragment instead of per vertex. And they add one thing: you can set a second texture which tells how the light reflects at each texture pixel. So instead of one normal per vertex you basically set a normal at each point in your texture (called normal maps). Which gives a slight illusion of depth. Though they also come with downsides, like only reacting to 2 lights currently in Irrlicht (that's simply the way whoever coded it back then did it; the same could be coded for more lights).
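Setting that material up looks roughly like this (loosely following the PerPixelLighting example that ships with the SDK; the mesh and texture file names are placeholders and error handling is left out):

Code:

#include <irrlicht.h>
using namespace irr;

scene::ISceneNode* addNormalMappedNode(scene::ISceneManager* smgr,
                                       video::IVideoDriver* driver)
{
    // EMT_NORMAL_MAP_SOLID needs a vertex format with tangents.
    scene::IAnimatedMesh* mesh = smgr->getMesh("room.3ds"); // placeholder
    scene::IMesh* tangentMesh =
        smgr->getMeshManipulator()->createMeshWithTangents(mesh->getMesh(0));
    scene::ISceneNode* node = smgr->addMeshSceneNode(tangentMesh);
    tangentMesh->drop(); // the scene node keeps its own reference

    video::ITexture* color = driver->getTexture("wall.jpg");        // placeholder
    video::ITexture* bumps = driver->getTexture("wall_height.bmp"); // placeholder
    driver->makeNormalMapTexture(bumps, 9.f); // turn the height map into a normal map

    node->setMaterialTexture(0, color);
    node->setMaterialTexture(1, bumps); // the per-pixel normals
    node->setMaterialType(video::EMT_NORMAL_MAP_SOLID);
    return node;
}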

Irrlicht has some shadow calculations you can add, but again it's a bit of an older algorithm: no shadow mapping unless you use XEffects. Instead each light kind of extends the polygon silhouette edges infinitely, and then, with some clever tricks with the stencil buffer, the areas which are inside the shadows are found and polygons inside those areas are darkened.
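Using them is just a few calls; a sketch based on the SpecialFX example (the mesh file is a placeholder, and note the device has to be created with the stencil buffer enabled):

Code:

#include <irrlicht.h>
using namespace irr;

int main()
{
    // 5th parameter: stencil buffer on, needed for the shadow volumes
    IrrlichtDevice* device = createDevice(video::EDT_OPENGL,
        core::dimension2d<u32>(800, 600), 32, false, true, false);
    video::IVideoDriver* driver = device->getVideoDriver();
    scene::ISceneManager* smgr = device->getSceneManager();

    scene::IAnimatedMeshSceneNode* node =
        smgr->addAnimatedMeshSceneNode(smgr->getMesh("dwarf.x")); // placeholder mesh
    node->addShadowVolumeSceneNode();                   // the stencil shadow
    smgr->setShadowColor(video::SColor(150, 0, 0, 0));  // how dark shadows get

    smgr->addLightSceneNode(0, core::vector3df(0, 50, 0),
                            video::SColorf(1.f, 1.f, 1.f), 200.f);
    smgr->addCameraSceneNode(0, core::vector3df(0, 30, -40), core::vector3df(0, 5, 0));

    while (device->run())
    {
        driver->beginScene();
        smgr->drawAll();
        driver->endScene();
    }
    device->drop();
    return 0;
}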

For more advanced lighting you'll have to write shaders. The most common light shaders these days are PBR (physically based rendering) shaders, usually calculated per pixel/fragment. Though you can start with the simpler Phong algorithm first, as it's very similar and PBR can be thought of as an advanced version of it (it adds things like more realistic specular highlights based on tuning parameters such as roughness and metalness). To start writing such shaders you can look at the shaders in the ogl-es branch of Irrlicht (vertex values have to be passed a bit differently in OpenGL+Irrlicht, otherwise the ES2 shaders are nearly the same).
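Hooking your own shaders into Irrlicht as a new material type looks roughly like this (a sketch along the lines of the "Shaders" example; the file names "phong.vert"/"phong.frag" and the callback are placeholders you'd provide yourself):

Code:

#include <irrlicht.h>
using namespace irr;

// The callback is where you pass light positions, camera position etc.
// to the shader each frame (via IShaderConstantSetCallBack::OnSetConstants).
video::E_MATERIAL_TYPE addMyLightMaterial(video::IVideoDriver* driver,
                                          video::IShaderConstantSetCallBack* callback)
{
    video::IGPUProgrammingServices* gpu = driver->getGPUProgrammingServices();
    s32 material = gpu->addHighLevelShaderMaterialFromFiles(
        "phong.vert", "main", video::EVST_VS_1_1,   // vertex shader
        "phong.frag", "main", video::EPST_PS_1_1,   // fragment shader
        callback, video::EMT_SOLID);
    return (video::E_MATERIAL_TYPE)material;
}

// usage: node->setMaterialType(addMyLightMaterial(driver, myCallback));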

And after the light calculation the shadows are added (or subtracted or multiplied) with the shadow mapping you mentioned. That works as you wrote: the scene is rendered from the viewpoint of each light, which gives you textures telling how far each light reaches into the scene at each pixel. Then the real scene is rendered and reads those values back out of the shadow map textures. Though it has to translate the coordinates back into its own camera view, and there is quite some loss of precision going on, so the real struggle is getting rid of ugly jagged borders and light gaps where shadows should be. There are more advanced variations which for example calculate more than one shadow map per light and choose the best one or interpolate between them, and stuff like that.
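The lookup in the end boils down to a depth comparison per pixel; a tiny sketch of just that step (all names made up), with the bias being the usual band-aid for the precision loss mentioned above:

Code:

// Before this, the fragment's world position has to be transformed by the
// light's view-projection matrix to find which shadow map texel to read and
// what its depth is as seen from the light.
//
// lightSpaceDepth: this fragment's depth from the light's point of view
// shadowMapDepth:  depth stored in the shadow map at the matching texel
// bias:            small offset hiding precision errors ("shadow acne");
//                  too much of it detaches shadows and creates light gaps
bool inShadow(float lightSpaceDepth, float shadowMapDepth, float bias = 0.002f)
{
    return lightSpaceDepth - bias > shadowMapDepth;
}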

Note that so far this is all direct lighting - no light rays reflecting off the scene around a polygon yet (except the ambient factor). So the really advanced stuff comes next - environment lighting. Very often that uses lots of pre-calculated textures (e.g. pre-rendered cubemaps at certain points). Or other tricks... lots of tricks; check the website I linked above, I think it covers many of them.

By now people are even starting to use ray tracing instead. That wasn't possible in real time in the past, but it is slowly getting fast enough, especially with modern graphics cards which have some support for it built-in. And unlike the other methods, ray tracing does combine light and shadow calculations: you automatically get shadows in places which can't be reached by the light. The light calculation itself still uses the PBR algorithms, just applied way more often. Basically this one shoots tons of rays around the scene - either from the polygon toward the light (but also in a few more directions to get reflections from walls etc.) or the other way round from the light in all kinds of directions (a few years ago rays started only from the polygons, but modern algorithms often go in both directions). And the real trick is getting a low enough number of rays to make it fast enough for real time while still having enough rays to get realistic light calculations.
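The "shadows come for free" part is just a shadow ray: from the surface point, shoot a ray toward the light and check whether anything blocks it. A toy C++ sketch with a sphere-only scene (everything here is made up for illustration; real ray tracers use acceleration structures and, nowadays, hardware intersection tests):

Code:

#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 center; float radius; };

// Does the ray origin + t*dir (unit dir, 0 < t < maxT) hit the sphere?
static bool hitsSphere(Vec3 origin, Vec3 dir, float maxT, const Sphere& s)
{
    const Vec3 oc = sub(origin, s.center);
    const float b = dot(oc, dir);
    const float c = dot(oc, oc) - s.radius * s.radius;
    const float disc = b * b - c;
    if (disc < 0.f)
        return false;
    const float sqrtDisc = std::sqrt(disc);
    const float t0 = -b - sqrtDisc;
    const float t1 = -b + sqrtDisc;
    const float t = (t0 > 0.001f) ? t0 : t1; // small offset avoids self-hits
    return t > 0.001f && t < maxT;
}

// In shadow if any sphere sits between the point and the light.
bool pointInShadow(Vec3 point, Vec3 lightPos, const std::vector<Sphere>& scene)
{
    Vec3 toLight = sub(lightPos, point);
    const float dist = std::sqrt(dot(toLight, toLight));
    toLight = { toLight.x / dist, toLight.y / dist, toLight.z / dist };
    for (const Sphere& s : scene)
        if (hitsSphere(point, toLight, dist, s))
            return true;
    return false;
}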

Btw. "real-time lighting" is not about the algorithm; it just means the algorithm you used is fast enough to give you results while you watch. So the computer doesn't have 10 minutes to render a frame which you need in 1/60th of a second; it really has to finish its calculations 60 times per second. It has to deliver in our real time :-)

Hope I managed to give you a rough overview; just ask if I wasn't explaining it well (I just typed this down quickly). Though I'm also not that super deep into this stuff.

Also note that if you want to do anything slightly more advanced, make sure to work with Irrlicht svn trunk and not Irrlicht 1.8. Many things just weren't possible back then, but things have gotten a bit better by now. And even if you stay with the default stuff I'd recommend svn trunk, as the shadows have had quite a lot of bugs fixed since 1.8 (though they're still not that good really).

edit: If you got some free time, watch this video: https://www.youtube.com/watch?v=P6UKhR0T6cs&t=10s
A decade old by now, but it goes over all the basics.

Re: How does real-time lighting get vertex context to the shader?

Posted: Tue Jun 04, 2024 2:05 am
by Noiecity
CuteAlien wrote: Thu May 30, 2024 7:06 pm ..
That's a pretty huge question. Shadow mapping is about calculating shadows - which is for the most part independent of calculating lights.
..
Thanks for the extensive explanation, Mr. CuteAlien, living up to the name with an explanation from another galaxy lmao :mrgreen:

Soon I will upload an example of realism using Irrlicht svn as a base, without adding more than models, textures and the default lighting, with baked textures of course, which for some reason gives me better results in Irrlicht than in Unity.

I had problems with my charger recently and, well, I wanted to continue creating free assets.