apps don't show up in 32 bit in OpenGL

afecelis
Admin
Posts: 3075
Joined: Sun Feb 22, 2004 10:44 pm
Location: Colombia

apps don't show up in 32 bit in OpenGL

Post by afecelis »

Now that 32-bit rendering is the default in version 0.14, only my apps compiled with DX show up OK.

If compiled with OpenGL, they show up in 16 bit.

Anyone else having this problem?

DX:
http://www.danielpatton.com/afecelis/Irrlicht/skydx.jpg

GL: (notice the 16 bit stripes)
http://www.danielpatton.com/afecelis/Irrlicht/skygl.jpg
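
For reference, a minimal sketch of the kind of setup being compared here, assuming the 0.14-era API from the examples (the window size, bit depth and texture file names below are placeholders, not taken from this post):

Code: Select all

// minimal sketch: request a 32-bit OpenGL device and add a skybox,
// roughly as in the Irrlicht examples (texture paths are placeholders)
#include <irrlicht.h>
using namespace irr;

int main()
{
    // swapping video::EDT_OPENGL for the DirectX driver type gives the other screenshot
    IrrlichtDevice* device = createDevice(video::EDT_OPENGL,
        core::dimension2d<s32>(800, 600), 32 /*bits*/, false, false, false, 0);
    if (!device)
        return 1;

    video::IVideoDriver* driver = device->getVideoDriver();
    scene::ISceneManager* smgr = device->getSceneManager();

    smgr->addSkyBoxSceneNode(
        driver->getTexture("sky_up.jpg"), driver->getTexture("sky_dn.jpg"),
        driver->getTexture("sky_lf.jpg"), driver->getTexture("sky_rt.jpg"),
        driver->getTexture("sky_ft.jpg"), driver->getTexture("sky_bk.jpg"));

    while (device->run())
    {
        driver->beginScene(true, true, video::SColor(0,0,0,0));
        smgr->drawAll();
        driver->endScene();
    }

    device->drop();
    return 0;
}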
Guest

Post by Guest »

Are you running them windowed? I know it takes the desktop BPP and uses that when windowed, but I'm not sure why it would do that to OpenGL and not DX... of course I am sure you have ruled that out, you are hardly a "newb" ;)
sRc
Posts: 431
Joined: Thu Jul 28, 2005 1:44 am
Location: Salt Lake City, Utah

Post by sRc »

hmmmm... this is strange

I changed the skybox in the examples to something that I knew would band due to its nature (in this case a gradient) so I could look at it with the example programs, and it looks fine in both DirectX and OpenGL to me
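
A sketch of that kind of test, assuming the skybox call from the examples ("gradient.jpg" is a placeholder name for the gradient image):

Code: Select all

// use the same gradient texture on every face so any banding is easy to spot
video::ITexture* grad = driver->getTexture("gradient.jpg");
smgr->addSkyBoxSceneNode(grad, grad, grad, grad, grad, grad);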

DirectX:
Image
OpenGL:
Image

But if I set the color depth of my desktop to 16 bit to force the banding, it becomes obvious that SOMETHING is going on, because what the two drivers display then differs...

DirectX:
Image
OpenGL:
Image

very strange...

(btw the green border around them is because I made a green gradient first, then went back to make it black and white and lower the JPEG compression)
The Bard sRc

peterparker
Posts: 11
Joined: Mon Nov 28, 2005 5:00 am
Location: N.Y.

Post by peterparker »

I could be wrong, but I think this is a clear demonstration of how the color components are used differently in 16-bit mode:
As you can see, the OpenGL picture has 16 distinct levels of grey, i.e. 16 intensity levels, which means 4 bits per color component. Since 4 bits * 3 [for RGB] = 12, you can see that OpenGL is reserving the remaining 4 bits for an alpha channel. This is also well known as "4444 mode".

The DirectX screen has 32 distinct levels, plus a very thin line in the middle of them. I think this is "565 mode", where 5 bits are used for red, 6 bits for green and 5 bits for blue: red and blue get 32 [2^5] intensity levels and green gets 64, since the eye is more sensitive to green tones. However, such a surface has no support for an alpha channel.
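
A small sketch of the arithmetic behind that explanation (plain C++, not Irrlicht code; it just quantizes an 8-bit grey value the way the two 16-bit formats would):

Code: Select all

// RGBA4444: 4 bits each for R, G, B, A   -> 2^4 = 16 grey levels, alpha kept
// RGB565:   5 bits R, 6 bits G, 5 bits B -> 2^5 = 32 levels (64 for green), no alpha

// quantize an 8-bit grey value (0..255) to 'bits' bits and expand it back,
// which reproduces the banding seen in the screenshots
unsigned char quantize(unsigned char value, int bits)
{
    unsigned char q = value >> (8 - bits);                 // drop the low bits
    return (unsigned char)(q * 255 / ((1 << bits) - 1));   // expand back to 0..255
}

// e.g. grey values 0..15 all collapse onto one band with 4 bits,
// but split into two bands (0..7 and 8..15) with 5 bits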

Greetings,
Peter
afecelis
Admin
Posts: 3075
Joined: Sun Feb 22, 2004 10:44 pm
Location: Colombia

Post by afecelis »

thanx for the comments guys, and for the explanation, Peter. It's weird, since my desktop depth is at 32 bits. Perhaps it's an ATI issue? I'll try it on another box with Nvidia graphics to see if it makes any difference.

But it's something that grabs my attention, since it didn't happen in earlier versions, when you had to set it manually:

Code: Select all

driver->setTextureCreationFlag(video::ETCF_ALWAYS_32_BIT, true);
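
For comparison, a sketch of how that flag was typically used in earlier versions, set before any textures are loaded (the texture path is a placeholder):

Code: Select all

// force 32-bit texture creation before loading anything (pre-0.14 style)
driver->setTextureCreationFlag(video::ETCF_ALWAYS_32_BIT, true);
video::ITexture* sky = driver->getTexture("sky_up.jpg"); // now created as 32 bit
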
let's see what happens with Nvidia
;)
Pazystamo
Posts: 115
Joined: Sat Dec 03, 2005 5:56 pm
Location: Lithuania

Post by Pazystamo »

I have the same problem... I can't see a difference when I set the ETCF_ALWAYS_32_BIT or ETCF_ALWAYS_16_BIT flag in OpenGL with a 32-bit TGA texture. My screen uses 32 bit. How can this problem/bug be solved?
My project in forum- ATMOsphere - new dynamic sky dome for Irrlicht
Andi|xng
Posts: 83
Joined: Thu Mar 24, 2005 10:49 pm
Location: Schrobenhausen, Germany

Post by Andi|xng »

This is very, very easy! (Thanks to Phil for the idea!) Irrlicht seems to ignore the 32-bit flag for OpenGL... Change the following line:

COpenGLTexture.cpp, line 173:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, ImageSize.Width,
    ImageSize.Height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, ImageData);

to

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, ImageSize.Width,
    ImageSize.Height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, ImageData);

This single added character (the "8", which explicitly requests 8 bits per channel) forces OpenGL to use 32-bit textures... Have fun!
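
A possible follow-up to that fix, so the hard-coded format also respects the texture creation flags - just a sketch, assuming COpenGLTexture can reach the driver's flags through getTextureCreationFlag():

// choose the sized internal format from the driver's texture creation flags
GLint internalFormat = GL_RGBA8;                               // default: 32 bit
if (Driver->getTextureCreationFlag(video::ETCF_ALWAYS_16_BIT))
    internalFormat = GL_RGBA4;                                 // 16 bit (4444)

glTexImage2D(GL_TEXTURE_2D, 0, internalFormat, ImageSize.Width,
    ImageSize.Height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, ImageData);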