
Apps don't show up in 32 bit in OpenGL

Posted: Tue Dec 06, 2005 4:11 am
by afecelis
Now that 32-bit rendering is the default in version 0.14, only my apps compiled with DX show up OK.

If compiled with OpenGL, they show up in 16 bit.

Anyone else having this problem?

DX:
http://www.danielpatton.com/afecelis/Irrlicht/skydx.jpg

GL: (notice the 16 bit stripes)
http://www.danielpatton.com/afecelis/Irrlicht/skygl.jpg

Posted: Tue Dec 06, 2005 12:00 pm
by Guest
Are you running them windowed? I know it takes the desktop BPP and uses that in windowed mode, but I'm not sure why it would do that to OGL and not DX... Of course, I'm sure you have already ruled that out; you're hardly a "newb" ;)

Posted: Tue Dec 06, 2005 3:36 pm
by sRc
hmmmm... this is strange

I changed the skybox in the examples to something that I knew would band due to its nature (in this case a gradient) so I could look at it with the example programs, and it looks fine in both DirectX and OpenGL to me

DirectX: [screenshot]
OpenGL: [screenshot]

But if I set my desktop color depth to 16 bit to force the banding, something is obviously going on, because what is displayed then differs between the two...

DirectX: [screenshot]
OpenGL: [screenshot]

very strange...

(BTW, the green border around them is because I made a green gradient first, then went back to make it black and white and lower the JPEG compression.)
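
For reference, here is a minimal sketch (not from the thread) of the kind of gradient test image described above. It writes a plain grayscale ramp as a binary PPM; the filename and size are arbitrary, and you would convert it to a format Irrlicht loads (BMP, TGA, JPG) before using it as a skybox texture.

Code:

#include <cstdio>

// Write a 256x64 horizontal grayscale gradient as a binary PPM.
// A smooth ramp like this makes 16-bit banding easy to spot.
int main()
{
    const int width = 256, height = 64;
    std::FILE* f = std::fopen("gradient.ppm", "wb");
    if (!f)
        return 1;

    std::fprintf(f, "P6\n%d %d\n255\n", width, height);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
        {
            unsigned char v = static_cast<unsigned char>(x); // 0..255 ramp
            std::fputc(v, f); // R
            std::fputc(v, f); // G
            std::fputc(v, f); // B
        }
    std::fclose(f);
    return 0;
}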

Posted: Tue Dec 06, 2005 8:03 pm
by peterparker
I could be wrong, but I think this is a clear demonstration of how the color components are used differently in 16-bit mode:

As you can see, the OpenGL picture has 16 clear levels of grey, i.e. 16 intensity levels, which means 4 bits per color component. Since 4 bits * 3 (for RGB) = 12, OpenGL is apparently reserving the remaining 4 bits for the alpha channel. This is commonly known as "4444 mode".

The DirectX screen has 32 different levels, plus a very thin line in the middle of each band. I think this is "565 mode", where 5 bits are used for red, 6 bits for green and 5 bits for blue: red and blue get 32 (2^5) intensity levels while green gets 64, since the eye is more sensitive to green tones. However, a 565 surface has no support for an alpha channel.
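
A small sketch (plain C++, not from the thread) of the per-channel level counts implied by the two 16-bit layouts described above:

Code:

#include <cstdio>

// Number of intensity levels available for a channel with the given bit count.
static int levels(int bits) { return 1 << bits; }

int main()
{
    // A4R4G4B4 ("4444" mode): 4 bits each for alpha, red, green and blue.
    std::printf("4444: R=%d G=%d B=%d A=%d levels\n",
                levels(4), levels(4), levels(4), levels(4));

    // R5G6B5 ("565" mode): 5 bits red, 6 bits green, 5 bits blue, no alpha.
    // A grey ramp shows ~32 coarse steps (red/blue) with extra half-steps
    // from the 64 green levels -- the thin line in the middle of each band.
    std::printf("565:  R=%d G=%d B=%d levels\n",
                levels(5), levels(6), levels(5));
    return 0;
}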

Greetings,
Peter

Posted: Wed Dec 07, 2005 1:29 am
by afecelis
Thanks for the comments, guys, and for the explanation, Peter. It's weird since my desktop depth is at 32 bits. Perhaps it's an ATI issue? I'll try it on another box with Nvidia graphics to see if it makes any difference.

But it's something that grabs my attention, since it didn't happen in earlier versions when you had to set it manually:

Code:

driver->setTextureCreationFlag(video::ETCF_ALWAYS_32_BIT, true);

Let's see what happens with Nvidia
:wink:
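
For context, a minimal sketch of how that flag is typically set before any textures are loaded. The exact createDevice parameters differ between Irrlicht versions, and the texture path is a hypothetical placeholder.

Code:

#include <irrlicht.h>
using namespace irr;

int main()
{
    // Request a 32-bit OpenGL device (signature may vary slightly per version).
    IrrlichtDevice* device = createDevice(video::EDT_OPENGL,
                                          core::dimension2d<s32>(640, 480),
                                          32 /* bits */);
    if (!device)
        return 1;

    video::IVideoDriver* driver = device->getVideoDriver();

    // Ask the driver to keep textures at 32 bit; must be set before loading them.
    driver->setTextureCreationFlag(video::ETCF_ALWAYS_32_BIT, true);

    // Hypothetical texture path, purely for illustration.
    video::ITexture* sky = driver->getTexture("media/skybox_top.jpg");

    device->drop();
    return sky ? 0 : 1;
}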

Posted: Sat Feb 04, 2006 8:46 pm
by Pazystamo
I have the same problem... I can't see a difference when I set the ETCF_ALWAYS_32_BIT or ETCF_ALWAYS_16_BIT flag in OpenGL with a 32-bit TGA texture. My screen uses 32 bit. How can this problem/bug be solved?

Posted: Mon May 22, 2006 9:41 pm
by Andi|xng
This is very, very easy! (Thanks to Phil for the idea!) Irrlicht seems to ignore the 32-bit flag for OpenGL... Change the following line:

COpenGLTexture.cpp, line 173:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, ImageSize.Width,
    ImageSize.Height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, ImageData);

to

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, ImageSize.Width,
    ImageSize.Height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, ImageData);

This single character forces OpenGL to use 32-bit textures... Have fun!
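
To spell out the difference in a standalone sketch (not from Irrlicht's source): the unsized GL_RGBA internal format lets the driver pick the precision, which may silently become 16 bit, while the sized GL_RGBA8 explicitly requests 8 bits per channel. The glGetTexLevelParameteriv query at the end is just one way to check what was actually allocated.

Code:

#include <GL/gl.h>
#include <cstdio>

#ifndef GL_BGRA_EXT
#define GL_BGRA_EXT 0x80E1
#endif

// Assumes a current OpenGL context and a texture already bound to GL_TEXTURE_2D.
// 'pixels' is a width*height BGRA buffer with 8 bits per channel.
void uploadTexture32(const void* pixels, int width, int height)
{
    // GL_RGBA8 is a sized internal format: 8 bits per component, so the
    // driver cannot quietly fall back to a 16-bit layout the way it may
    // with the unsized GL_RGBA.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);

    // Optional: query how many bits the driver actually gave each channel.
    GLint r = 0, g = 0, b = 0, a = 0;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE, &r);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE, &b);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &a);
    std::printf("texture bits: R%d G%d B%d A%d\n", r, g, b, a);
}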