Irrlicht and Speed

Discuss about anything related to the Irrlicht Engine, or read announcements about any significant features or usage changes.
sRc
Posts: 431
Joined: Thu Jul 28, 2005 1:44 am
Location: Salt Lake City, Utah
Contact:

Post by sRc »

hybrid wrote:Wonderful. The glorious vsync pitfall. Benchmark succeeded :lol:
indeed! cheers to master1234567! :lol:
The Bard sRc

Blog | Twitter
hybrid
Admin
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany
Contact:

Post by hybrid »

He has no 5 yet :wink:
elander
Posts: 193
Joined: Tue Oct 05, 2004 11:37 am

Post by elander »

hybrid wrote:After all, your way of 'benchmarking' is childish at best. So let's hope you're still young enough to learn.
You are not reading my posts at all. I never claimed that comparing Irrlicht demo results with an Ogre demo was a benchmark; to me it was a good hint. The only comparison I consider accurate is running the Ogre demo with both of its renderers, and that comparison accurately shows that Ogre's OpenGL path is too slow.

You are playing the same demo on the same computer, with a well-tuned system that eliminates external problems, so the conclusion should be obvious when you switch renderers and get an absurd drop in FPS.

If that is still hard to understand, then I can't explain it any more simply.
hybrid wrote:Real benchmarking requires experience and professional techniques.
I have been benchmarking my own systems for about 8 years, and I often use hardware forums to compare my results with those obtained by other people with similar systems. I even buy my hardware based on what other people in hardware forums buy, to get more reliable comparisons in my tests against the most common systems.

You don't need a degree to make simple, accurate benchmarks or to get your system well tuned. As much as some people would like others to believe that we live in ignorance-land and nothing can be measured, this is not the case.

Go to a hardware forum, learn how to fine-tune your system to eliminate driver and software problems and how to benchmark it properly, then do the tests yourself. This is not simple, but it is not rocket science either. You will never get anywhere if you don't program based on benchmarks and performance tests you run yourself.
The notion that one individual system suffices for reliable benchmarking data is laughable, and shows that you are the one who has no clue about benchmarking.
Which isn't what I wrote at all, but it isn't a surprise that you have misunderstood what I wrote by now. What I said is that you need to know whether your system is well tuned or not. My test is valid for my computer only, of course, but my system is a normal one: a Pentium 4, 2 GB of DDR RAM, an ordinary Asus motherboard, and a Radeon 9600 Pro, so I think there is no excuse. If you accept that my system is fine-tuned, and that there are no differences in driver performance between DX and OGL, then you have to accept that the problem must be the OpenGL renderer, because it is the only thing that changes.
And don't for a minute believe the Ogre-Team to be noobs with OpenGL.
Perhaps not noobs, but if you don't put some effort into the quality of your OpenGL renderer, then it's better not to ship one at all: it only gives OpenGL a bad image that it doesn't deserve. It's even more serious when some Ogre developers claim that OpenGL is slower than DirectX; whoever said that is certainly a noob.
Baal Cadar
Posts: 377
Joined: Fri Oct 28, 2005 10:28 am
Contact:

Post by Baal Cadar »

elander wrote:If you accept that my system is fine-tuned, and that there are no differences in driver performance between DX and OGL, then you have to accept that the problem must be the OpenGL renderer, because it is the only thing that changes.
I accept the first part, not the latter. I only said what would follow *if* drivers were of equal quality for both APIs. I did not say, and in fact don't believe, that drivers for both APIs are of equal quality. Not at all.

Doom3/Quake4 runs considerably slower on comparable ATI cards than on NVidia cards (the German computer magazine c't uses Doom3/Quake4 for comparisons, and their benchmarking shows that it favors NVidia more than DirectX-based games do). This doesn't prove that it is the driver's fault, but it does suggest it.

But I gathered my own figures today, and they are much like yours: the same scenes rendered in OGL and DX using Ogre. I tested three demos which I believed to be at least mildly demanding (grass, shadow, fresnel).
While writing this answer, I had a friend run the same tests on his NVidia-based machine, and he got more or less the same ratio between OpenGL and DX, which is around 1:1.6.

I will investigate this further and report back here.
elander wrote:
And don't for a minute believe the Ogre-Team to be noobs with OpenGL.
Perhaps not noobs, but if you don't put some effort into the quality of your OpenGL renderer, then it's better not to ship one at all: it only gives OpenGL a bad image that it doesn't deserve. It's even more serious when some Ogre developers claim that OpenGL is slower than DirectX; whoever said that is certainly a noob.
They really do put a lot of effort into it. Sinbad, the creator of Ogre, likes OpenGL more than DirectX anyway, so why would he not care about the OpenGL implementation? See his blog if you want to see whether he puts any thought into the OpenGL path...
http://www.stevestreeting.com/?p=359
http://www.stevestreeting.com/?p=285
Especially from the last post you can see that they try to get it to work, and to work as well as they can. What the nature of this problem is, I do not know, but I am curious now.
No offense :)
elander
Posts: 193
Joined: Tue Oct 05, 2004 11:37 am

Post by elander »

A difference of 1:1.6 is too big, with or without good drivers; that should be obvious. You can minimize the driver-quality differences between ATI and NVidia. The worst thing people can do is cross their arms and blame the drivers. It's a nice thing that the main Ogre developer thinks that way; now if only he can convince the other people working with him.

I understand that game engine programmers are more interested in programming than in benchmarking and in fixing or dodging driver and Windows problems like Carmack did with Doom3, but this is unavoidable. Microsoft won't make life easy for OpenGL programmers.

Also, I think you should not despise simple benchmarks with old games like Half-Life, or tests with not-so-modern but not-so-old games like FarCry that support both OpenGL and DirectX renderers. These tests can show deficiencies in a game engine (if you compare similar scenes rendered with similar techniques) that other tests hide behind lots of mumbo-jumbo and pixel shaders. A constant performance hit is only acceptable when it is very small and can be justified by optimizations for feature-packed scenes.

Here's a nice benchmarking app you can use:

http://unrealmark.net/

Download an Unreal Tournament game demo to use this benchmark. It supports Windows, Linux and Mac. I don't advise you to put too much faith in 3DMark unless the only thing you want to test is how well your engine runs with DirectX on Windows.
Post Reply