Discussion in 'Industry News' started by HardwareHeaven, Jun 19, 2006.
I was under the impression that the drivers weren't ready yet.
Yes, but you are claiming that these settings aren't comparable. This is what is misunderstood about the so-called ugly word "optimisations". Optimisations aren't a bad thing; in the right instance they help a particular game run and look its best.
With ATI drivers running at high quality there are still optimisations running via Catalyst AI. With Nvidia drivers running at high quality there are no optimisations, so Nvidia drivers at quality are comparable to ATI running at high quality with Catalyst AI on.
The drivers to run Quad SLI aren't ready, and buying two of the GX2s right now to run in Quad SLI is a waste of money. I think it will take Nvidia quite some time to get them working right in the majority of games. The only title I know of right now that works properly with 2x GX2 in Quad SLI is F.E.A.R.
That is completely untrue. NVidia just does not give the end-user the option to disable their optimizations. ATI listened to their customers and gave them the Catalyst A.I. option to disable their optimizations.
And no, those settings are not comparable. If you want equivalent settings, you should set them both to "Quality" or both to "High Quality". ATI defaults their A.I. setting to low while nVidia uses maximum optimizations all the time. It would be closer if you turned A.I. to high.
A.I. Normal = game bug fixes only.
A.I. Off = no game bug fixes. Some games will not work correctly or will show graphical glitches.
A.I. Advanced = game bug fixes plus special optimizations to help increase performance (what nVidia's drivers do all the time).
I like the DH review menu on the side - that is a lot better than the one that used to pop up from the bottom.
About the control panel settings: it is very hard to get an equivalent setting between the two brands; there are so many variables.
What about the way they both do anisotropic filtering? They are completely different right now. In most games you can't tell anyway.
One example of a game where you can tell the difference a lot is rFactor, or the other race sim based on the same ISI graphics engine, GT Legends (DH reviewed that one). There are a lot of problems with Nvidia cards in those games: the lines on the road are either very aliased or very blurry, and you can't get a good result like you can with ATI.
No, that is incorrect. AI Normal is not bug fixes only; try turning Cat AI off in some games and you will see FPS decreases - unless of course you call that a "bug fix". There are certain optimisations for specific games.

Regardless, the point you were making originally is not right: there is no way to equally compare cards across platforms. Each company uses its own methods of rendering, so to say "put both to high quality and this will give exactly the same results" is nonsense. If you put NV's drivers to high quality it disables game optimisations, while ATI's high quality STILL has Cat AI helping out with optimisations - that is the definition of Cat AI in the first place.

One point I'd like to stress is that "optimisations" on both sides AREN'T "cheats". They are specific methods of rendering for certain titles to help make them run better, exactly the same as Catalyst AI without the fancy moniker.
If you can see any noticeable difference between high quality and quality on NV while playing a game, then your eyesight is probably one of the best in the world and you are just picking holes in the review because, I feel, due to your prior position you side with ATI (as you always have done on this site).
I've just bought a 7950 and I don't think the review does the card justice at all. Even at stock it's still a lot faster than a X1900XTX and 7900GTX.
But here's the best part: these cards overclock a lot!
I'm running mine at 600/1600 right now, scoring around 8700 in 3DMark06...
Also, I game at 1920*1200 and the X1900XTX cannot cope at this resolution with eye candy enabled, which is why I upgraded.
Well, you'd better be careful - the 79xx series are notorious for dying a quick death when overclocked (or even at default speed).
That goes for the chip on the GX2 as well. Just be careful.
Mine started acting up the other day - EVGA are cross-shipping me a new one.
Wrong. The "Quality" and "High Quality" settings affect mipmap detail levels only. Switching from "Quality" to "High Quality" turns off the optimizations for mipmap detail levels and nothing else; it has no effect on all the other optimizations. NVidia's current optimizations are responsible for the shimmering seen by some people in certain games.
NVidia's officially announced position on optimizations is they will continue to do them and shader replacement as much as possible as long as it follows their official optimization guidelines. They have never retracted their official position and there is no reason to believe they ever would.
Changing A.I. to "Off" will increase performance in some games and will decrease performance in others. Some of the game bug fixes actually reduce performance and have been highly documented on the internet.
Basically, ATI is being punished by game reviewers for listening to their customers and allowing the end-user control over optimizations. A lot of them even turn on "High quality anisotropic filtering" on ATI cards just to slow them down even more, knowing nVidia's cards are incapable of this setting and that leaving it turned off would actually make both cards equivalent.
Really? Interesting. I've noticed FPS drops with Catalyst AI off, so I guess it must be the three or four machines I've tried this on. Unless there are "bugs" with the games and ATI has fixed them in Catalyst AI? Hmmm..... :uhoh:
Of course, in the end I gave up with Catalyst AI and the bloated CCC and went the Nvidia route, because basically the drivers never seem to work right with the games I play, with numerous shadow problems and in-game abnormalities. Let's hope your faith in ATI and their "listening to their customers" means they ditch the hideous .NET CCC. Basically I think ATI "are being punished", as you dramatically put it, because their drivers really aren't up to par and haven't been for a good 6 months. Take a look at Crossfire and the issues users are having with that. ATI need to sort out their driver support, because right now the only ones I see being "punished" are the end users who buy their hardware.
Yes, there are bugs with individual games and the A.I. setting fixes them, which is what has been happening with all graphics card drivers from all manufacturers for many, many years. The performance increase or drop all depends on the game/program and the associated bug.
I happen to agree with some of your objections to the other things and I’m certainly no ATI apologist.
How about ATI hardware with Nvidia making the drivers for it? That's what I want.
I hate the CCC and all the things the ATI drivers can't do, like custom resolutions, locking refresh rates per resolution, and game profiles, for example. ATI hardware, to me, is better though. I don't see forums with hundreds of people reporting the conditions their x1900 cards died in... probably because they don't die anywhere near as often.
Per game/application detection by the drivers to fix bugs or provide optimizations has always been a part of graphics card drivers and has been going on for years before it was ever discovered. Catalyst A.I. was simply providing a name and control over a feature that has probably been present forever (and yes, nVidia does it too). After nVidia announced their official position on optimizations, ATI asked their users what they should do. Some people wanted all optimizations turned off (A.I. Off). It was pointed out that doing so would make certain programs not function correctly due to bug fixes (A.I. Normal). Some people thought optimizations were fine as long as it did not alter image quality (nVidia's official position, A.I. Advanced). Instead of choosing one, they gave it a name and allowed us to choose which one we wanted.
So when do an optimization and a bug fix become one and the same? Just curious.
Well, it's an interesting thought, I guess, but it is not a road we have ever gone down on DH. Nvidia have not given us any stipulations for testing the hardware apart from supplying the card, and we get the drivers from nzone - just like anyone else would when buying the hardware.
I have very strong views on this and regardless of whether people believe me or not, this site does not pander to any political agendas to specific hardware companies.
For example, we recently went to Intel in Munich and were shown Conroe benchmarks. This basically involved us sitting at a few systems and running prebuilt benchmarks on prebuilt machines. We then had the opportunity to publish these results and get a large number of hits in doing so. We declined, because we want to build the systems, install the testing software ourselves, and ensure we test the way we want, not the way any company feels is necessary to force upon us. Intel have a Conroe setup with us now, so we will test this system thoroughly and report the findings when we are finished, as accurately as we can. It would be a disservice to our public not to do so.
The same applies with Nvidia or ATI: neither of these companies is heavy-handed in its approach, and neither forces any testing methodology on us just "so we get hardware" to review. They are equally fair and equally appreciative of our audience. I think it is something of a net myth that Nvidia come round to your door and sit with you while you test their hardware, making sure you run what they say. That is ludicrous. They aren't the evil twin to ATI.....
This has been an interesting debate, and I understand where ChrisW is coming from to a certain extent. However, please be aware this is not a GX2 v X1900 head-to-head review. It is a real-world test with various setups running default quality settings, to give an overall (and again, real-world) overview of how each piece of hardware handles a specific series of games. In doing so we aim to inform our readers of possible increases/decreases and/or gains with their favourite games.

Out-of-the-box experiences are very important. I for one run all games on my 7900GTX SLI setup at "quality", and having done extensive testing with "high quality" I can see no difference in 95% of games. Certainly, under thorough Photoshop analysis there have been some minor differences, but this does not in any way affect in-game experiences, and certainly no more than running Catalyst AI on or off.

I feel people need to get away from another common myth: that Nvidia IQ is vastly inferior. Yes, there was a time when NV IQ was drastically lower than ATI's, but in 2006 this is no longer the case. I agree that Nvidia's AA isn't quite in the league of ATI's, but on a default "out of the box" driver setting I find them very comparable and of the highest standard. It is also worth noting that we did run high IQ settings on a dedicated page for those interested, so I feel we have covered all bases and given the end user (hopefully) an educated, informed and entertaining look at how the GX2 would perform powering their gaming system.
Obviously you are our audience and we value all your opinions and we do read all the emails you send and the posts on the forums, that is why and how our forum and reviews have changed over the years, and I think for the better.
Interesting. Maybe I should try running "quality" instead of "high quality". I have never even looked at a game without going into the NV CP and setting it up for maximum IQ (all performance tweaks turned off). Maybe I could get a little more speed with just "quality"; I'll have to try it.
Thanks. If you say it, I know it must be true. It would be hypocritical of me not to criticize this review for the same things I criticize other reviewers for.
As far as the quality difference goes... that is not my objection. My point is that optimizations are being allowed on one card while not on the other. It is just as possible that the "Quality" setting may provide just as good an image as "High Quality" on the ATI card, while providing a modest performance increase.
It can be said that ATI's 6x FSAA is of equal or better quality than nVidia's 8x. Yet nobody affords ATI the luxury of benchmarking those two modes against each other (and seeing wonderful benchmark numbers), while it is apparently perfectly acceptable to benchmark "Quality" on one card against "High Quality" on the other.
If it happens, it happens. If it's going to fail, I'd rather it did so sooner rather than later.. :lol:
Well, I can't disagree with that. LOL.. :lol:
To anyone else: if you go with Nvidia, go with EVGA (just from my own personal experience).
They have the best customer service, better than any company I have ever dealt with. They are completely fair with the customer, and "on top" of every little thing. Customer service like that goes a long way.
I posted this: http://www.driverheaven.net/showthread.php?p=923804#post923804
a few weeks ago... great card. It's a monster.
Have a great day.