[not a bug] Light + Scale Node + GeForce 7600 GS = ERROR!!!

You discovered a bug in the engine, and you are sure that it is not a problem of your code? Just post it in here. Please read the bug posting guidelines first.
Magnet
Posts: 101
Joined: Wed Sep 27, 2006 7:32 pm

[not a bug] Light + Scale Node + GeForce 7600 GS = ERROR!!!

Post by Magnet »

I just bought a new PC.
It has an NVIDIA GeForce 7600 GS video card.
My project has this code:

Code:

device->getFileSystem()->changeWorkingDirectoryTo("../media/");
IAnimatedMesh* mesh = smgr->getMesh("waterLily.ms3d");
IAnimatedMeshSceneNode* node = smgr->addAnimatedMeshSceneNode(mesh);

if (node)
{
	node->setMaterialFlag(video::EMF_BACK_FACE_CULLING, false);
	node->setMaterialFlag(video::EMF_LIGHTING, true);
	//node->setScale(vector3df(0.5f, 0.5f, 0.5f));
}
Result:
[screenshot]

If I uncomment this line:

Code:

node->setScale(vector3df(0.5,0.5,0.5));
I get this result:
[screenshot]
If I set the scale > 1, my program works correctly.


This error does not appear on my second computer, which has an NVIDIA GeForce 5200.

The error also does not appear if I disable lighting.

Therefore, the error appears only on my PC with the NVIDIA GeForce 7600 GS, and only if I create two nodes:
1. a light node
2. a SCALED object with lighting enabled (and the error appears if the scale factor is more than one)
vitek
Bug Slayer
Posts: 3919
Joined: Mon Jan 16, 2006 10:52 am
Location: Corvallis, OR

Post by vitek »

You might have a look at the EMF_NORMALIZE_NORMALS material flag.

Travis
needforhint
Posts: 322
Joined: Tue Aug 30, 2005 10:34 am
Location: slovakia

Post by needforhint »

I would also post your problem on the NVidia developer forums, or report it to them somehow. Judging from your test results, it is clearly a matter of the card. ... and if you solve the problem, by looking at the normalize-normals flag for example, tell them you have a solution :!:
what is this thing...
Saturn
Posts: 418
Joined: Mon Sep 25, 2006 5:58 pm

Post by Saturn »

Why? This is exactly what vitek already said, and it is perfectly expected behaviour. Unnormalised normals yield undefined results, and cards can react differently. This is not a bug in the driver, in Irrlicht, or in the GPU.

This is why normal renormalisation exists in the first place. So one would expect NVidia to know about it already. ;)