1 pixel off for origin in OpenGL NVidia

You discovered a bug in the engine, and you are sure that it is not a problem of your code? Just post it in here. Please read the bug posting guidelines first.
Post Reply
JoeWright
Posts: 74
Joined: Mon Dec 08, 2003 3:51 pm

1 pixel off for origin in OpenGL NVidia

Post by JoeWright »

I have an Nvidia GeForce 2 GTS and, even with the latest drivers, there is a bug with OpenGL and 2D graphics.

This can be seen in Irrlicht but is an OpenGL/Nvidia problem. Basically, the 2D commands work with an origin of (1,1) instead of (0,0), so everything is one pixel off both horizontally and vertically.

This problem doesn't appear on an ATI 9200 VIVO.

For example, a matrix of:

Code: Select all

glOrtho(0.0, (GLdouble) kWidth, (GLdouble) kHeight, 0.0, -1.0, 1.0);
causes the error whereas this fixes it:

Code: Select all

glOrtho(-0.5, (GLdouble) kWidth-0.5, (GLdouble) kHeight-0.5, -0.5, -1.0, 1.0);
However, I'm guessing this won't work on every platform (I haven't tried it on the ATI yet); perhaps some will end up with an origin of (-1,-1).

I'm thinking the complete solution would be for the program to try a few matrices, plot a pixel, test its actual screen position, and pick the matrix that works as expected.

Any thoughts?

Joe
Spintz
Posts: 1688
Joined: Thu Nov 04, 2004 3:25 pm

Post by Spintz »

new videocard!!! :P
JoeWright
Posts: 74
Joined: Mon Dec 08, 2003 3:51 pm

Post by JoeWright »

Yes please :)
Guest

Post by Guest »

I have the same card (MX version, though), and I, too, find this very annoying. Please don't write this off because the cards are old; other OpenGL programs don't seem to have these problems. Besides, when I develop a program, some of its users may also be on older hardware.

I already wrote a small program that shows the direction of the displacement with a small arrow. I plan to show the user some buttons to adjust a displacement that is added to all sourceRects, until the arrow turns into a circle... hard to explain, it's a bit of a hack.

MedO
Guest

Post by Guest »

It might be of interest that this problem also affects more recent cards.
A friend of mine uses something like an ATI Radeon 9600 (I don't know exactly), and in his case the parts of the textures drawn to the screen are a pixel too large in the x-direction. I wouldn't have noticed this without a test program, because nothing is cut off as in my case, but the problem is there.

I wouldn't mind it too much (it's normally easy to work around), but on my PC it really cripples Irrlicht's font display, since all letters have one pixel cut off on the left.

MedO
niko
Site Admin
Posts: 1759
Joined: Fri Aug 22, 2003 4:44 am
Location: Vienna, Austria
Contact:

Post by niko »

Thanks for posting this. I also noticed this problem on Linux on my PC, but when I switch to another task and back to Irrlicht, it disappears. I don't know if this is the same problem, but I'll investigate a bit. :)
Guest

Post by Guest »

Thanks Niko!
I can send you my test program if you like, or test patches.
One note, though: I misinterpreted the result my friend got on his PC. The screenshot he sent me was blurry, and I first thought that was bad JPEG compression, but he has since sent a better version and it's blurred there too. It looks like his problem is also an offset, but by maybe half a pixel, so the card interpolates. Sorry if this doesn't help you much.

MedO
Guest

Post by Guest »

It looks like you can forget about the problem on the ATI card. Some testing on my new PC (which also has a recent ATI card) suggests that my friend's problem was caused by speed-optimized mipmapping preferences in the driver setup.
The Nvidia-problem, however, remains.

MedO
JoeWright
Posts: 74
Joined: Mon Dec 08, 2003 3:51 pm

SOLVED! I hope

Post by JoeWright »

From: http://www.opengl.org/resources/faq/tec ... m#tran0030
If exact pixelization is required, you might want to put a small translation in the ModelView matrix, as shown below:

Code: Select all

glMatrixMode (GL_MODELVIEW);
glLoadIdentity ();
glTranslatef (0.375, 0.375, 0.);

The discussion (http://www.gamedev.net/community/forums ... hichPage=2) seems to imply that OpenGL does much of its rendering exactly at the middle of a pixel, and that some drivers interpret where that middle is one pixel differently. Adding the translation above should make them all agree.

How does this work for people?
JoeWright
Posts: 74
Joined: Mon Dec 08, 2003 3:51 pm

Post by JoeWright »

I forgot to say that there are probably some additional things to do for textures to make them pixel-accurate. I can't remember the details, but those two links should provide all the needed info.

Joe
Guest

Post by Guest »

There is another thread discussing this problem (see http://irrlicht.sourceforge.net/phpBB2/ ... php?t=8493), where we tried removing the "+0.5f" occurrences from the GL draw2DImage functions.
This way, the GL texture coordinates (when drawing an entire texture) should range from (0.0f, 0.0f) to (1.0f, 1.0f), and the screen coordinates from (-1.0f, -1.0f) to (1.0f, 1.0f). Some positive replies, no negative ones yet... but it's still not well tested.
You seem to know a bit about GL; maybe you could look it over and see if it makes sense. Remember, though, that for drawing a 64x64 texture the sourceRect is (0,0,64,64), not (0,0,63,63)... at least in the reference implementation (the software renderer).

MedO
Post Reply