
double memory when loading a texture? WHY?

Posted: Wed Dec 19, 2007 1:14 pm
by necropower
Guys, I am loading a 4 MB, 1024x1024, uncompressed TGA texture, and when I use IrrVideoDriver.GetTexture to load it, it consumes 8.2 MB of system memory!

I REALLY can't see a reason for that. Why would it consume double the memory of the actual texture?

I already set CreateMipMaps to false, and that reduced it a little, but it's still nowhere near the file size...

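For reference, in native C++ Irrlicht terms the equivalent of turning CreateMipMaps off before the load would look roughly like this (a sketch against the 1.4-era IVideoDriver API; the .NET wrapper names differ, and "texture.tga" is only a placeholder):

Code:

// Sketch: disable mipmap generation before loading the texture.
// Assumes the Irrlicht 1.4-era native C++ API.
#include <irrlicht.h>
using namespace irr;

void loadWithoutMipMaps(video::IVideoDriver* driver)
{
    // ETCF_CREATE_MIP_MAPS controls whether getTexture() builds a mip chain,
    // which adds roughly one third on top of the base texture size.
    driver->setTextureCreationFlag(video::ETCF_CREATE_MIP_MAPS, false);

    // "texture.tga" is only a placeholder filename.
    video::ITexture* tex = driver->getTexture("texture.tga");
    (void)tex;
}
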
I tried to do the same thing with another engine, TrueVision3D, and there it consumed 5.5 MB.

Could someone enlighten me as to why this is happening, or tell me what I should do to reduce it further?

thank you

Posted: Wed Dec 19, 2007 1:24 pm
by psyco001
The texture is internally saved as a bitmap, so it costs more memory.

Posted: Wed Dec 19, 2007 1:28 pm
by necropower
OK, that's not the case: I did the same test with a .bmp file (BTW, same specifications: 4 MB, 1024x1024) and it still eats up the same amount of memory (8.2 MB).

any other ideas??? anyone? :D

Posted: Wed Dec 19, 2007 1:52 pm
by JP
It's stored as a bitmap, just uncompressed, so it takes up w*h*4 bytes of space. Effectively that's the same size as a 32-bit bitmap, though.

But if it's an uncompressed TGA, I would have thought it would be the same size in memory as it was on disk, so that's strange...

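As a rough sanity check, here is what w*h*4 works out to for a 1024x1024 image, with and without a mip chain (plain arithmetic, nothing Irrlicht-specific):

Code:

// Expected in-memory sizes for a 1024x1024 image stored at 32 bits per pixel.
#include <cstdio>

int main()
{
    const unsigned w = 1024, h = 1024;
    const unsigned bytesPerPixel = 4;            // A8R8G8B8
    const unsigned base = w * h * bytesPerPixel; // 4 MiB for the top level
    // A full mip chain adds roughly one third on top of the base level.
    const unsigned withMips = base + base / 3;

    std::printf("base level:   %u bytes (~%.1f MiB)\n",
                base, base / (1024.0 * 1024.0));
    std::printf("with mipmaps: %u bytes (~%.1f MiB)\n",
                withMips, withMips / (1024.0 * 1024.0));
    // Neither figure comes close to 8.2 MB, which is what points at a
    // second full copy of the image being kept somewhere.
    return 0;
}
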
Posted: Wed Dec 19, 2007 2:01 pm
by necropower
That's what I thought. Who on the Irrlicht team should I ask about this? :D
Because it really does seem to be duplicating memory now...

Posted: Wed Dec 19, 2007 2:09 pm
by CuteAlien
Are you using the newest Irrlicht version? I know that in older versions (< 1.2) a copy was made in CImage.cpp which was not really needed in most cases. But that part seems to have been rewritten, so I suppose this is solved in a better way by now. I have not done memory tests on the newer versions myself, though.

Posted: Wed Dec 19, 2007 2:15 pm
by psyco001
Maybe your TGA is saved with a 16-bit color depth, or it is saved as a 16-bit bitmap; in that case the file isn't W*H*4 bytes to begin with. Rough per-pixel sizes:
32-bit with alpha: W*H*4 bytes
24-bit (same, without alpha): W*H*3 bytes
16-bit (with or without alpha): W*H*2 bytes
8-bit: W*H*1 byte
So it is expected that the amount of memory used is higher than the image file size when the image is internally saved as a 32-bit bitmap.

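If the internal 32-bit format is the problem, the driver can also be asked to create 16-bit textures, which halves the per-texel cost (a sketch assuming the 1.4-era texture creation flags; image quality visibly drops for some textures, and "texture.tga" is a placeholder):

Code:

// Sketch: request 16-bit internal texture storage to halve texture memory.
// Assumes the Irrlicht 1.4-era E_TEXTURE_CREATION_FLAG values.
#include <irrlicht.h>
using namespace irr;

void loadAs16Bit(video::IVideoDriver* driver)
{
    // Force 16 bits per texel (e.g. A1R5G5B5) instead of 32.
    driver->setTextureCreationFlag(video::ETCF_ALWAYS_16_BIT, true);

    // A 1024x1024 image then needs about 2 MB for the base level
    // instead of 4 MB.
    video::ITexture* tex = driver->getTexture("texture.tga");
    (void)tex;
}
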
Posted: Wed Dec 19, 2007 3:26 pm
by rogerborg
Hmm. Direct3D 9, right? CD3D9Texture hangs onto the CImage used to decode the image until the actual DirectX texture is deleted, and I can't see any need for that. The diff below isn't a candidate patch; it just demonstrates (e.g. by running the demo example app with DX9) that there's no harm in dropping the image after it has been used to create the DX9 texture. The CImage (and the duplicated memory) then gets freed in CNullDriver::loadTextureFromFile().

Code:

Index: source/Irrlicht/CD3D9Texture.cpp
===================================================================
--- source/Irrlicht/CD3D9Texture.cpp	(revision 1113)
+++ source/Irrlicht/CD3D9Texture.cpp	(working copy)
@@ -90,6 +90,9 @@
 		}
 		else
 			os::Printer::log("Could not create DIRECT3D9 Texture.", ELL_WARNING);
+
+        Image->drop();
+        Image = 0;
 	}
 }
 

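If anyone wants to double-check the effect, a crude way on Windows is to compare the process working set before and after the load (a sketch using GetProcessMemoryInfo from psapi; link against psapi.lib, and treat the numbers as approximate since the working set covers the whole process):

Code:

// Rough Windows-only check of how much memory a texture load really costs.
// Uses GetProcessMemoryInfo(); link with psapi.lib.
#include <windows.h>
#include <psapi.h>
#include <cstdio>

static SIZE_T workingSet()
{
    PROCESS_MEMORY_COUNTERS pmc = {};
    pmc.cb = sizeof(pmc);
    GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc));
    return pmc.WorkingSetSize;
}

// Wrap the load with two measurements, e.g.:
//   SIZE_T before = workingSet();
//   driver->getTexture("texture.tga");   // placeholder filename
//   SIZE_T after  = workingSet();
//   std::printf("grew by ~%.1f MB\n", (after - before) / (1024.0 * 1024.0));
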
Posted: Thu Dec 20, 2007 12:11 pm
by necropower
I think that should go into the RC, since it doesn't seem necessary to keep a CImage around and double the consumed memory (BTW, it worked, thank you).

Posted: Thu Dec 20, 2007 1:52 pm
by rogerborg
Yup, I've raised a bug report and candidate fix. Thanks for reporting this.