Global Illumination creator for Static Meshes
I am a bit downed by the fact that I created my best screenshot so far back in 2008, so I decided to rise out of the depths of hell and create something eyegasmic...
I'm currently coding a level editor; I just need to get it to be able to select mesh buffers and join them (and obviously to save and load previous work).
Currently I have a texture loader and browser working.
Then the only features left will be moving, scaling and rotating objects/nodes and meshbuffers, along with combining them (and some limited forms of splitting).
The only really useful feature is going to be the radiosity calculator, which will save a detailed 3D grid of spherical harmonics coding for the ambient, diffuse, normals, depth and opacity (the opacity could be omitted by using depth in a clever way) of the meshbuffer (11 or 10 channels as floating-point data == 360 bytes per harmonic). Obviously mip-mapped versions of the data would exist...
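To spell out the 360 bytes: 2nd-order spherical harmonics need 9 coefficients per channel, and 10 channels of 32-bit floats gives 10 * 9 * 4 = 360 bytes per probe. A minimal sketch of such a layout (the exact channel assignment here is my own illustration, nothing is fixed yet):

Code:
// One SH probe, assuming 2nd-order harmonics (9 coefficients per
// channel) stored as 32-bit floats. The channel split below is
// illustrative; opacity is folded away by using depth cleverly.
struct SHProbe
{
    static const unsigned BANDS  = 3;             // SH bands 0..2
    static const unsigned COEFFS = BANDS * BANDS; // 9 coefficients

    float ambient[3][COEFFS]; // RGB ambient
    float diffuse[3][COEFFS]; // RGB diffuse
    float normal[3][COEFFS];  // XYZ normal
    float depth[COEFFS];      // depth (the 10th channel)
};
// sizeof(SHProbe) == 10 channels * 9 coeffs * 4 bytes == 360 bytes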
It will be a bit of a pain, since I will have to code support for floating-point 3D textures.
What I am planning to do with the data later on: if the memory costs are huge, then I'll develop some kind of streaming (where objects far away load a lower-order harmonic grid). I will also implement quadlinear filtering (trilinear for the grid plus one more interpolation across mip-map levels). Then I shall sample the data with a grid stuck to the camera and try to somehow blend the harmonics from overlapping grids (planning to do some research here). This camera grid is going to be used in the ambient pass of the inferred rendering pipeline in my engine, giving me ambient illumination. I hope to gain performance by only updating the grid when the camera moves or an object in the frustum moves.
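Roughly what I mean by quadlinear, as a CPU-side sketch for a single scalar channel (the real thing would live in a shader; all names here are made up):

Code:
#include <cmath>
#include <vector>
#include <algorithm>

// One mip level of a scalar 3D grid (e.g. a single SH coefficient).
struct Grid3D
{
    int nx, ny, nz;
    std::vector<float> v; // nx*ny*nz samples
    float at(int x, int y, int z) const
    {
        x = std::min(std::max(x, 0), nx - 1); // clamp to edge
        y = std::min(std::max(y, 0), ny - 1);
        z = std::min(std::max(z, 0), nz - 1);
        return v[(z * ny + y) * nx + x];
    }
};

// Trilinear filtering inside one mip level.
float trilinear(const Grid3D& g, float x, float y, float z)
{
    const int x0 = (int)std::floor(x), y0 = (int)std::floor(y), z0 = (int)std::floor(z);
    const float fx = x - x0, fy = y - y0, fz = z - z0;
    const float c00 = g.at(x0, y0,     z0    ) * (1 - fx) + g.at(x0 + 1, y0,     z0    ) * fx;
    const float c10 = g.at(x0, y0 + 1, z0    ) * (1 - fx) + g.at(x0 + 1, y0 + 1, z0    ) * fx;
    const float c01 = g.at(x0, y0,     z0 + 1) * (1 - fx) + g.at(x0 + 1, y0,     z0 + 1) * fx;
    const float c11 = g.at(x0, y0 + 1, z0 + 1) * (1 - fx) + g.at(x0 + 1, y0 + 1, z0 + 1) * fx;
    const float c0 = c00 * (1 - fy) + c10 * fy;
    const float c1 = c01 * (1 - fy) + c11 * fy;
    return c0 * (1 - fz) + c1 * fz;
}

// Quadlinear: trilinear in two adjacent mips, then lerp between them.
// x,y,z are texel coordinates of the finest level; lod >= 0.
float quadlinear(const std::vector<Grid3D>& mips, float x, float y, float z, float lod)
{
    const int l0 = (int)std::floor(lod);
    const int l1 = std::min(l0 + 1, (int)mips.size() - 1);
    const float f = lod - l0;
    const float s = (float)(1 << l0); // mip l0 is 1/s the base resolution
    const float a = trilinear(mips[l0], x / s, y / s, z / s);
    const float b = trilinear(mips[l1], x / (s * 2), y / (s * 2), z / (s * 2));
    return a * (1 - f) + b * f;
}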
The illumination is going to be two-bounce:
mix(illumination from the skybox, illumination from the skybox for the harmonic of the radiating object, visibility of the skybox)
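In shader terms that is just a GLSL-style mix (a lerp); a sketch with illustrative names, not the actual shader:

Code:
// GLSL-style mix: x*(1-a) + y*a
float mixf(float x, float y, float a) { return x * (1.0f - a) + y * a; }

// Two-bounce blend, argument order exactly as written above.
float twoBounce(float skyDirect, float skyViaObject, float skyVisibility)
{
    return mixf(skyDirect, skyViaObject, skyVisibility);
}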
I'm also hoping that the depth and diffuse will enable me to calculate diffuse light from dynamic lights bouncing off objects (only if the harmonics blend in some way). From personal experience I can tell that a directional light (the sun) could be possible, but point lights would be significantly harder (although apparently there is a way to multiply the harmonics if the light data is made into a harmonic too). I would be pleasantly surprised if I got spot lights working. Obviously shadows are a massive no-no.
Anyway, for those who do not know: I have hacked together a quick Global Illumination example for Irrlicht...
http://project-ninja-star.googlecode.com/files/GIv2.zip
Hi DevSH!
I'm really interested to see everything you can come up with!
I tried to compile and run the program with Code::Blocks on Windows, but got a segmentation fault when launching it. It happens just after loading the "bunny" and initializing the shaders.
The line (1366) that causes the segmentation fault is a memcpy, in the Irrlicht includes, in the matrix4.h file:

Code:
Program received signal SIGSEGV, Segmentation fault.
At /GIv2/IRRLIC~1.1/examples/SPHERI~1/../../include/matrix4.h:1366
memcpy(M, other.M, 16*sizeof(T));

I have a GeForce GTX 460 and my reported GL version is 4.1. Do we need an uber video card to run this? The console output before the crash:

Code:
Loaded mesh: bunny.obj
SHs X: 56 Y: 12 Z: 56
Total: 37632 Pixels: 285100.000000
Unsupported texture format
There are no light nodes for GI; I'm programming my GI so it's dynamic in the following ways:
-Dynamic for moving static meshes (you can move a car and GI will change)
-Dynamic for animated meshes (they will be able to receive GI and maybe influence it, at least for keyframe meshes; rigged/boned ones will be a pain in the ass to do)
-Dynamic in response to the sun's ambient (simple mul)
-Dynamic in response to the skybox
-Dynamic in response to the dynamic lights' ambient (with attenuation)
-Dynamic in response to the dynamic lights' diffuse (with attenuation)
As for animated skinned meshes, there are ways to do dynamic harmonics on them, but none of them is perfect. One would be to recalculate each harmonic by feeding in the triangles, but you would have to run the shader on the CPU and use the lowest mipmap possible. Another would be to flatten the geometry of the mesh onto a 2D plane (a texture), recording raw position, normal, bone weight, ambient colour, diffuse colour and everything else needed for a harmonic. Each group of 3 pixels would then code for a triangle; the normal hardware shader would run to transform (animate) that data and output triangle sizes (differential areas), diffuse, ambient, normal and the other harmonic inputs. Then yet another shader would run to render into the spherical harmonic grid, accumulating the data from all triangles (pixels) of the model. Finally the data would most probably have to be downloaded from the GPU to be blended into the freakin' camera frustum grid.
This would all mean that animated skinned meshes would have to have small grids, or GI would eat FPS due to fillrate. For a 10000-poly animated mesh, fewer than 200 probes would have to be used.
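To illustrate the flattening step (three texels per triangle comes from the description above; the exact channel packing below is made up):

Code:
#include <vector>

struct Texel4f { float r, g, b, a; };

// CPU-side layers that would be uploaded as float RGBA textures.
struct FlattenedMesh
{
    std::vector<Texel4f> position; // xyz + bone weight in .a
    std::vector<Texel4f> normal;   // xyz + spare channel in .a
    std::vector<Texel4f> diffuse;  // rgb + spare channel in .a
};

// Pack one triangle (source vertex indices i0,i1,i2) into three
// consecutive texels of every layer.
void packTriangle(FlattenedMesh& out,
                  const float* pos, const float* nrm, const float* col,
                  const float* boneWeight, int i0, int i1, int i2)
{
    const int idx[3] = { i0, i1, i2 };
    for (int k = 0; k < 3; ++k)
    {
        const int i = idx[k];
        const Texel4f p = { pos[3*i], pos[3*i+1], pos[3*i+2], boneWeight[i] };
        const Texel4f n = { nrm[3*i], nrm[3*i+1], nrm[3*i+2], 0.0f };
        const Texel4f d = { col[3*i], col[3*i+1], col[3*i+2], 0.0f };
        out.position.push_back(p);
        out.normal.push_back(n);
        out.diffuse.push_back(d);
    }
}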
Then I am unsure whether I'll achieve these:
-Dynamic in response to the sun's diffuse with shadows
-Dynamic in response to the dynamic lights' shadows
P.S. If you want the texture browser code, I can give it to you.
I devised an improvised version of the texture cache browser: I don't duplicate all the textures as GUI images, I only draw as many as fit in the bar and don't concern myself with the others. The things drawn are hardware textures, straight from the video driver.
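The gist of it, as a sketch against Irrlicht's IVideoDriver (the function and parameter names are made up):

Code:
#include <irrlicht.h>
using namespace irr;

// Draw only the textures that fit in the bar, straight from the
// driver's texture cache -- no GUI image duplicates.
void drawTextureBar(video::IVideoDriver* driver, core::rect<s32> bar,
                    u32 firstVisible, s32 slotSize)
{
    const u32 slots = (u32)(bar.getWidth() / slotSize); // how many fit
    for (u32 i = 0; i < slots; ++i)
    {
        const u32 idx = firstVisible + i;
        if (idx >= driver->getTextureCount())
            break; // don't concern ourselves with the rest
        video::ITexture* tex = driver->getTextureByIndex(idx);
        const core::position2d<s32> pos(bar.UpperLeftCorner.X + (s32)i * slotSize,
                                        bar.UpperLeftCorner.Y);
        driver->draw2DImage(tex, pos);
    }
}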
Phew, 5 hours of debugging and I can save and load my engine's/editor's world (textures only).
The interesting thing is that my engine uses a "resource" approach where you constantly query for the pointer to anything (Irrlicht node, mesh, texture, Python script, module, function) for the duration of the function. This makes for an idiot-proof system, because when the engine is restarted/reloaded the resources are too, and your gamestate/game class retains only the handles to the resources, so it doesn't notice a difference. The handle in this case is a u32 index (a number); the unsafe thing is that you need to typecast the pointers you get back from the engine and check for NULL (a null pointer means an invalid resource handle).
This is very useful with my binary world file: I can write all the resources into a file, and when recording which mesh a node uses (i.e. a filename or a pointer, which would not apply later on) I only write the 4-byte resource handle.
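In sketch form (the names are illustrative; the real engine code differs):

Code:
#include <vector>

typedef unsigned int u32;
typedef u32 ResourceHandle; // the 4-byte handle written to the world file

struct ResourceTable
{
    std::vector<void*> slots; // rebuilt on every engine restart/reload

    // Returns NULL for an invalid handle; callers must typecast and
    // check, since a null pointer means the handle is stale.
    void* query(ResourceHandle h) const
    {
        if (h >= slots.size())
            return 0;
        return slots[h];
    }
};

// Usage: game code keeps only the handle across reloads, e.g.
//   scene::IMesh* mesh = (scene::IMesh*)table.query(meshHandle);
//   if (!mesh) { /* invalid resource handle */ }
// and serializing a node's mesh reference is just the 4 bytes:
//   file->write(&meshHandle, sizeof(ResourceHandle));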
Looks nice devsh
Are you using any papers as a resource? I've got an SSGI implementation myself, but screen-space GI calculations have some very obvious flaws.
I'm not too interested in implementing a full-blown GI system just yet; I want to optimize my pipeline to the fullest first, but it'd be nice to brush up my knowledge.
Full-blown GI has flaws too, but it could be done relatively cheaply with some thought...
I just need some help... does anyone know a good function/method that will give me sampling points on a sphere at equal distances from each other (for an arbitrary number of points)?
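One well-known option for this is the Fibonacci (golden spiral) sphere, which spaces an arbitrary number of points near-evenly; a generic sketch:

Code:
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// N near-evenly spaced points on the unit sphere via the golden angle.
std::vector<Vec3> fibonacciSphere(int n)
{
    std::vector<Vec3> pts;
    pts.reserve(n);
    const float golden = 3.14159265f * (3.0f - std::sqrt(5.0f));
    for (int i = 0; i < n; ++i)
    {
        const float y   = 1.0f - 2.0f * (i + 0.5f) / n; // y in (-1, 1)
        const float r   = std::sqrt(1.0f - y * y);      // ring radius
        const float phi = golden * i;                   // spiral angle
        const Vec3 p = { r * std::cos(phi), y, r * std::sin(phi) };
        pts.push_back(p);
    }
    return pts;
}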
P.S. Loading and unloading MeshSceneNodes works like a charm with my resource system. Does anyone know a good lossless compression scheme for images??? (I could use zip, but it's really not made for the job.)
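For reference, the trick that makes generic DEFLATE work well on images (and is essentially what PNG does) is a pre-filter that turns pixels into deltas to their neighbours; a generic sketch using zlib:

Code:
#include <zlib.h>
#include <vector>

// PNG-style "sub" pre-filter + DEFLATE. Neighbouring pixels become
// small deltas, which compress far better than raw RGB data does.
// 'stride' is bytes per row, 'bpp' is bytes per pixel.
std::vector<unsigned char> compressImage(const unsigned char* pixels,
                                         int rows, int stride, int bpp)
{
    std::vector<unsigned char> filtered(rows * stride);
    for (int y = 0; y < rows; ++y)
        for (int x = 0; x < stride; ++x)
        {
            const unsigned char left = (x >= bpp) ? pixels[y*stride + x - bpp] : 0;
            filtered[y*stride + x] = (unsigned char)(pixels[y*stride + x] - left);
        }

    uLongf destLen = compressBound((uLong)filtered.size());
    std::vector<unsigned char> out(destLen);
    compress2(&out[0], &destLen, &filtered[0], (uLong)filtered.size(),
              Z_BEST_COMPRESSION);
    out.resize((size_t)destLen);
    return out;
}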