
I lost AntiAlias in 1.8

Posted: Sun Nov 11, 2012 2:55 am
by BrianATSI

Code: Select all

        SIrrlichtCreationParameters         irrparams;
        irrparams.DriverType =              FDriverType;
        irrparams.Bits =                    32;
        irrparams.Fullscreen =              true;
        irrparams.Stencilbuffer =           FShadows;
        irrparams.Vsync =                   FVSync;
        irrparams.AntiAlias =               (FAntiAlias)? 8 : 0;
        irrparams.ZBufferBits =             8;
        irrparams.LoggingLevel =            ELL_INFORMATION;
FDriverType is set to OpenGL
This setting used to give me anti-aliasing in 1.7.3, but now when I trace the wglChoosePixelFormat_ARB call it never returns a PixelFormat > 0, even after AntiAlias gets reduced to 2. Am I doing something wrong? On a side note, if I set HandleSRGB to TRUE I do get anti-aliasing, but the colors are different and the FPS is lower.

Re: I lost AntiAlias in 1.8

Posted: Sun Nov 11, 2012 10:42 am
by CuteAlien
You should use one of the E_ANTI_ALIASING_MODE values: http://irrlicht.sourceforge.net/docu/na ... 3e8f2b149d
irr::video::EAAM_FULL_BASIC is pretty typical. Using the value 8 would only give you point smoothing, which isn't even supported by some cards.
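
For illustration, a minimal sketch of both levels, assuming Irrlicht 1.8 with the OpenGL driver (the mesh filename, window size and sample count are just placeholders):

Code: Select all

        // Sketch: request MSAA at device creation, then set the per-material
        // anti-aliasing mode on a loaded scene node. "mesh.obj" and the 4x
        // sample count are placeholders.
        #include <irrlicht.h>
        using namespace irr;

        int main()
        {
            SIrrlichtCreationParameters params;
            params.DriverType = video::EDT_OPENGL;
            params.WindowSize = core::dimension2d<u32>(800, 600);
            params.AntiAlias  = 4; // requested multisample count for the window
            IrrlichtDevice* device = createDeviceEx(params);
            if (!device)
                return 1;

            scene::ISceneManager* smgr = device->getSceneManager();
            scene::IAnimatedMeshSceneNode* node =
                smgr->addAnimatedMeshSceneNode(smgr->getMesh("mesh.obj"));
            if (node)
            {
                // EAAM_FULL_BASIC = simple + quality + line smooth
                for (u32 i = 0; i < node->getMaterialCount(); ++i)
                    node->getMaterial(i).AntiAliasing = video::EAAM_FULL_BASIC;
            }

            // (render loop omitted)
            device->drop();
            return 0;
        }
Note that both levels matter: if the window was created without multisample buffers, setting the material mode alone won't give you full-scene AA.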

Re: I lost AntiAlias in 1.8

Posted: Sun Nov 11, 2012 12:10 pm
by Mel
I never got antialiasing working with GL

Re: I lost AntiAlias in 1.8

Posted: Sun Nov 11, 2012 3:39 pm
by BrianATSI
From the snippet below I thought that the SIrrlichtCreationParameters::AntiAlias variable was meant to set the anti-aliasing factor, i.e. the multisample count.

Code: Select all

        //! Specifies if the device should use fullscreen anti aliasing
        /** Makes sharp/pixelated edges softer, but requires more
        performance. Also, 2D elements might look blurred with this
        switched on. The resulting rendering quality also depends on
        the hardware and driver you are using, your program might look
        different on different hardware with this. So if you are
        writing a game/application with AntiAlias switched on, it would
        be a good idea to make it possible to switch this option off
        again by the user.
        The value is the maximal antialiasing factor requested for
        the device. The creation method will automatically try smaller
        values if no window can be created with the given value.
        Value one is usually the same as 0 (disabled), but might be a
        special value on some platforms. On D3D devices it maps to
        NONMASKABLE.
        Default value: 0 - disabled */
        u8 AntiAlias;
Am I wrong? I also set my mesh material's AntiAliasing (which is the mode) to EAAM_FULL_BASIC with no result.

When I trace the driver initialization I find that the following code returns a PixelFormat of 0 with my Irrlicht parameters at 800x600, and I know my crossfired dual ATI Radeon HD 5870 setup can handle that format.

Code: Select all

        s32 rv=0;
        // Try to get an acceptable pixel format
        do
        {
            int pixelFormat=0;
            UINT numFormats=0;
            const BOOL valid = wglChoosePixelFormat_ARB(HDc,iAttributes,fAttributes,1,&pixelFormat,&numFormats);
 
            if (valid && numFormats)
                rv = pixelFormat;
            else
                iAttributes[21] -= 1; // lower the requested sample count and retry
        }
        while(rv==0 && iAttributes[21]>1);
        if (rv)
        {
            PixelFormat=rv;
            AntiAlias=iAttributes[21];
        }
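
To double-check what actually got created, I can also query the multisample state directly with raw GL once the device exists. A rough sketch (it assumes the Irrlicht OpenGL context is current on the calling thread, which it should be right after createDeviceEx, and defines the GL tokens in case the headers are too old to have them):

Code: Select all

        // Sketch: query the window framebuffer's multisample state with raw GL.
        // Assumes the Irrlicht OpenGL context is current on this thread.
        #ifdef _WIN32
        #include <windows.h>
        #endif
        #include <GL/gl.h>
        #include <cstdio>

        #ifndef GL_SAMPLE_BUFFERS
        #define GL_SAMPLE_BUFFERS 0x80A8
        #endif
        #ifndef GL_SAMPLES
        #define GL_SAMPLES 0x80A9
        #endif

        void printMultisampleState()
        {
            GLint sampleBuffers = 0, samples = 0;
            glGetIntegerv(GL_SAMPLE_BUFFERS, &sampleBuffers);
            glGetIntegerv(GL_SAMPLES, &samples);
            std::printf("sample buffers: %d, samples: %d\n",
                        (int)sampleBuffers, (int)samples);
        }
If samples comes back 0 right after device creation, the loop above really did fall through to a non-multisampled pixel format.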

Re: I lost AntiAlias in 1.8

Posted: Sun Nov 11, 2012 4:33 pm
by BrianATSI
Mel wrote:I never got antialiasing working with GL
Ah, yes, I tried D3D9 and it renders with anti-aliasing. So it's an OpenGL thing. :?

Re: I lost AntiAlias in 1.8

Posted: Sun Nov 11, 2012 10:27 pm
by hybrid
The Irrlicht creation params just take the number of samples used for anti-aliasing. The enum is for the materials, which can enable or disable AA per scene node.
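
For example, a quick sketch of the per-node switch (the helper name here is made up):

Code: Select all

        #include <irrlicht.h>

        // Sketch: turn anti-aliasing on or off for all materials of one
        // scene node (helper name is made up).
        void setNodeAntiAliasing(irr::scene::ISceneNode* node, bool enabled)
        {
            using namespace irr;
            const u8 mode = enabled ? (u8)video::EAAM_FULL_BASIC
                                    : (u8)video::EAAM_OFF;
            for (u32 i = 0; i < node->getMaterialCount(); ++i)
                node->getMaterial(i).AntiAliasing = mode;
        }
The device-level sample count, on the other hand, is fixed when the window is created via SIrrlichtCreationParameters::AntiAlias and can't be changed afterwards.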

Re: I lost AntiAlias in 1.8

Posted: Mon Nov 12, 2012 4:15 am
by BrianATSI
hybrid wrote:The Irrlicht creation params just take the number of samples used for anti-aliasing. The enum is for the materials, which can enable or disable AA per scene node.
Yeah, that's what I was setting. AA is just weird for me in OGL; I seem to remember having issues in the previous version finding the right settings to get OGL to actually return a pixel format.