
NVidiaCG Shader Implementation Gotcha!

Posted: Tue Dec 16, 2014 9:08 pm
by rabbit
Ok guys, this isn't so much a bug as it is a heads-up about using the NvidiaCG implementation (I am using Irrlicht 1.8).

You can't pass float or int arrays to CG shaders.

On another note, the CG integration only supports a limited number of data types, mainly ints and floats. There is no support for char, short, etc.
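
For illustration, here is a minimal sketch of the kind of call that fails (Irrlicht 1.8 style; the uniform name "LightPositions" is just an example, not from the engine). With the GLSL/HLSL drivers a constant-set callback like this uploads the array fine; per the above, the Cg renderer does not accept float/int arrays this way.

Code:
#include <irrlicht.h>
using namespace irr;

// Sketch of a shader constant callback that tries to upload a float array.
class ArrayCallback : public video::IShaderConstantSetCallBack
{
public:
    virtual void OnSetConstants(video::IMaterialRendererServices* services, s32 userData)
    {
        // Four light positions packed as 12 floats.
        f32 lightPositions[12] =
        {
            0.f, 10.f, 0.f,
            5.f, 10.f, 0.f,
            0.f, 10.f, 5.f,
            5.f, 10.f, 5.f
        };

        // Works with the GLSL/HLSL material renderers; reportedly fails
        // when the material was created through the Cg integration.
        services->setVertexShaderConstant("LightPositions", lightPositions, 12);
    }
};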

Re: NVidiaCG Shader Implementation Gotcha!

Posted: Tue Dec 30, 2014 8:08 pm
by chronologicaldot
Isn't NVidia going to dump CG? Or am I reading things wrong?
I suppose you could put this topic under "Everything 2d/3d Graphics".

Thanks for the heads up!

Re: NVidiaCG Shader Implementation Gotcha!

Posted: Sun Jan 04, 2015 12:45 pm
by rabbit
I figure Cg should be excluded from the next version of Irrlicht because it has been discontinued.

It is a very poor choice of API because it doesn't support arrays. This means that multi-light shaders are practically impossible, since you can't pass in arrays of light positions / colours.
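
A possible workaround (not from this thread, just a sketch under the assumption that only single named float uniforms get through): declare separate uniforms per light in the shader, e.g. light0Pos, light1Pos, light0Colour, ..., and set them one by one from the callback. The uniform names are illustrative.

Code:
#include <irrlicht.h>
#include <cstdio>
using namespace irr;

// Sketch: upload each light as an individually named uniform instead of an array.
class PerLightCallback : public video::IShaderConstantSetCallBack
{
public:
    virtual void OnSetConstants(video::IMaterialRendererServices* services, s32 userData)
    {
        const f32 positions[2][3] = { { 0.f, 10.f, 0.f }, { 5.f, 10.f, 5.f } };
        const f32 colours[2][3]   = { { 1.f, 1.f, 1.f }, { 1.f, 0.f, 0.f } };

        for (int i = 0; i < 2; ++i)
        {
            c8 name[32];
            std::snprintf(name, sizeof(name), "light%dPos", i);
            services->setVertexShaderConstant(name, positions[i], 3);
            std::snprintf(name, sizeof(name), "light%dColour", i);
            services->setPixelShaderConstant(name, colours[i], 3);
        }
    }
};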

Re: NVidiaCG Shader Implementation Gotcha!

Posted: Sun Jan 04, 2015 1:55 pm
by Nadro
Yes, Cg support will be removed in v1.9.