From the snippet below, I thought the SIrrlichtCreationParameters::AntiAlias variable was meant to set the strength of the anti-aliasing, i.e. the multisample factor.
Code:
//! Specifies if the device should use fullscreen anti aliasing
/** Makes sharp/pixelated edges softer, but requires more
performance. Also, 2D elements might look blurred with this
switched on. The resulting rendering quality also depends on
the hardware and driver you are using, your program might look
different on different hardware with this. So if you are
writing a game/application with AntiAlias switched on, it would
be a good idea to make it possible to switch this option off
again by the user.
The value is the maximal antialiasing factor requested for
the device. The creation method will automatically try smaller
values if no window can be created with the given value.
Value one is usually the same as 0 (disabled), but might be a
special value on some platforms. On D3D devices it maps to
NONMASKABLE.
Default value: 0 - disabled */
u8 AntiAlias;
Am I wrong? I also set my mesh material's AntiAliasing field (which selects the anti-aliasing mode) to EAAM_FULL_BASIC, with no visible result.
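For reference, here is roughly the setup I mean (a minimal sketch, not my exact code; the window size, sample count 4 and the mesh path are just example values):

Code:
#include <irrlicht.h>
using namespace irr;

int main()
{
	SIrrlichtCreationParameters params;
	params.DriverType = video::EDT_OPENGL;
	params.WindowSize = core::dimension2d<u32>(800, 600);
	params.Bits = 32;
	params.AntiAlias = 4; // requested multisample count, not an on/off flag

	IrrlichtDevice* device = createDeviceEx(params);
	if (!device)
		return 1;

	video::IVideoDriver* driver = device->getVideoDriver();
	scene::ISceneManager* smgr = device->getSceneManager();

	// Per-material anti-aliasing mode on a loaded mesh
	// (mesh path is a placeholder):
	scene::IAnimatedMeshSceneNode* node =
		smgr->addAnimatedMeshSceneNode(smgr->getMesh("mymesh.md2"));
	if (node)
		node->getMaterial(0).AntiAliasing = video::EAAM_FULL_BASIC;

	while (device->run())
	{
		driver->beginScene(true, true, video::SColor(255,100,101,140));
		smgr->drawAll();
		driver->endScene();
	}
	device->drop();
	return 0;
}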
When I trace the driver initialization, I find that the following code leaves PixelFormat at 0 (no acceptable format found) with my Irrlicht parameters at 800x600. And I know my CrossFired dual ATI Radeon HD 5870 setup can handle a multisampled format.
Code:
s32 rv=0;
// Try to get an acceptable pixel format, lowering the requested
// sample count (iAttributes[21], the WGL_SAMPLES_ARB value) by one
// on each failed attempt.
do
{
	int pixelFormat=0;
	UINT numFormats=0;
	const BOOL valid = wglChoosePixelFormat_ARB(HDc,iAttributes,fAttributes,1,&pixelFormat,&numFormats);
	if (valid && numFormats)
		rv = pixelFormat;
	else
		iAttributes[21] -= 1;
}
while(rv==0 && iAttributes[21]>1);
if (rv)
{
	// Success: remember the chosen format and the sample count that worked.
	PixelFormat=rv;
	AntiAlias=iAttributes[21];
}
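To narrow down where it fails, I would add a trace to that same loop (a sketch only; it assumes printf/<cstdio> is usable in COpenGLDriver.cpp, and relies on iAttributes[21] being the WGL_SAMPLES_ARB value, as the loop itself shows):

Code:
s32 rv=0;
do
{
	int pixelFormat=0;
	UINT numFormats=0;
	const BOOL valid = wglChoosePixelFormat_ARB(HDc,iAttributes,fAttributes,1,&pixelFormat,&numFormats);
	// Trace each attempt: which sample count was requested and what came back.
	printf("WGL_SAMPLES_ARB=%d valid=%d numFormats=%u pixelFormat=%d\n",
		iAttributes[21], (int)valid, numFormats, pixelFormat);
	if (valid && numFormats)
		rv = pixelFormat;
	else
		iAttributes[21] -= 1;
}
while(rv==0 && iAttributes[21]>1);

That would show whether wglChoosePixelFormat_ARB itself fails (valid is FALSE) or simply reports zero matching formats at every sample count.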