OK, I'm now regarding this thing as complete and closed. You can expect another example zip with source code within 72 hours; I just need to remove some hardcoded constants so the bake process is flexible (the tile factor is currently hardcoded into the shader). I will also check the Y index reconstruction and refactor all my ugly hacks into nicely structured code.
I have finally handled merging/blending the GI into the scene.
Here are some screenshots (30-odd):
http://project-ninja-star.googlecode.com/files/GI.zip
Click here for EyeGasm (RL GI) [FULL SRC && DEMO AVAILABLE]
Haha, lol... I just found a simple solution to the light-going-through-walls issue, allowing a smaller normal offset than the previous solution needed (which messed up small models). It's disabling BACKFACE CULLING!!!
Also, I finally have a general idea of how to implement the lighting. We have three SHs: diffuse, normal, and depth. Now, Tom Forsyth says SHs can be added and multiplied with each other just fine, which means I can multiply the normal SH by the depth SH (the normal SH is really three SHs, one each for x, y, and z) to find the xyz distance from the centre of the harmonic. Next, we pretend that the light position minus the harmonic centre coordinate is itself a harmonic (using the standard projection scheme, but with only one sample), and we can add this harmonic to the normal*depth one, resulting in a surface-to-light vector encoded for the cubemap in three SHs. We can then dot product it with the normal harmonic, obtaining a single-channel harmonic that gives us the Lambertian lighting coefficient, which can be multiplied with the diffuse harmonic and the result added to the main ambient harmonic. All of this can be done directly on the hardware if the SHs are presented as lookup textures.
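Two standard SH facts do the heavy lifting in the scheme above: a single direction (one "sample") can be projected straight into the SH basis, and the sphere integral of the product of two SH-encoded functions reduces to the dot product of their coefficient vectors. Here is a minimal band-0/band-1 (4-coefficient) sketch in Python; all function names are illustrative, not from the actual codebase:

```python
import math

# Real SH basis constants, bands 0 and 1 (4 coefficients total):
# Y00 = 1/(2*sqrt(pi)), band-1 factor = sqrt(3)/(2*sqrt(pi)).
Y00 = 0.5 / math.sqrt(math.pi)                      # ~0.282095
Y1 = math.sqrt(3.0) / (2.0 * math.sqrt(math.pi))    # ~0.488603

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def project_direction(d):
    """Project a single direction into 4 SH coefficients, i.e. the
    one-sample projection the post uses for the surface-to-light vector."""
    x, y, z = normalize(d)
    return (Y00, Y1 * y, Y1 * z, Y1 * x)

def sh_dot(a, b):
    """Dot product of SH coefficient vectors = integral over the sphere
    of the product of the two encoded functions."""
    return sum(ca * cb for ca, cb in zip(a, b))

n = project_direction((0.0, 0.0, 1.0))    # surface normal
l_above = project_direction((0.0, 0.0, 1.0))   # light straight above
l_below = project_direction((0.0, 0.0, -1.0))  # light from below

# The dot product grows with alignment, which is what makes it usable
# as a (soft, low-order) Lambert-like term.
print(sh_dot(n, l_above))  # aligned: larger
print(sh_dot(n, l_below))  # opposed: smaller
```

Note that multiplying two SHs (as opposed to dotting them) is a triple-product and is more involved than this; the sketch only covers the one-sample projection and the dot-product step.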
To have a physically correct solution, you should also make sure that if a mesh is closed, when you disable backface culling, any polygon seen from the side of its inverted normal (i.e. when the viewDir and normal vectors give a negative dot product) renders totally black; otherwise you could be receiving light from inside an object, which is another source of light leaks.
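That rule boils down to a sign test at shading time. A hedged Python illustration rather than actual shader code, following the post's convention that a negative dot(viewDir, N) means we are looking at the inverted-normal side:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(normal, view_dir, lit_color):
    """With backface culling disabled on a closed mesh, force backfaces
    to black so they cannot contribute light from inside the object."""
    if dot(view_dir, normal) < 0.0:
        return (0.0, 0.0, 0.0)  # backface: totally black
    return lit_color  # front face: keep its lighting

# Front face keeps its lighting, backface goes black.
print(shade((0, 0, 1), (0, 0, 1), (1.0, 0.5, 0.2)))
print(shade((0, 0, 1), (0, 0, -1), (1.0, 0.5, 0.2)))
```

In a real shader this would just be a conditional (or a `step`/`max` on the dot product) in the fragment stage; the sign convention flips if your viewDir points from the camera toward the surface instead.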
"There is nothing truly useless, it always serves as a bad example". Arthur A. Schmitt