Driverheaven Farcry Benchmark

Discussion in 'Industry News' started by Stuart_Davidson, Apr 26, 2004.

Thread Status:
Not open for further replies.
  1. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    Driverheaven Farcry Benchmark / 6800 Performance

    There have been a few stories circulating on the Internet today regarding 6800 Ultra performance in Farcry and how IQ is impacted when the Device ID is changed.

    Last weekend we tested Farcry under these conditions for our upcoming 6800 Ultra review. Since information on the subject has become widely available today, it may be useful to share what our findings regarding performance were when these IQ changes were made.

    Test System:
    AMD Athlon 64 FX-53
    ASUS SK8N
    2x 512MB DDR433 (supplied by, and thanks to, Corsair)
    AKASA AK-855 Cooler
    Nvidia GeForce 6800 Ultra 256MB
    Gainward GeForce FX 5950 Ultra 256MB
    Sapphire Radeon 9800 XT 256MB
    IBM Deskstar 120GXP 40GB 7200rpm Hard Drive
    Sony CRX300E DVD/CDRW
    Sony Floppy Drive
    Mercury 400W PSU
    AOC 19" 9GLR CRT

    Software:
    Windows XP SP1a
    DirectX 9.0b / SDK
    Forceware 60.72 (6800)
    Forceware 56.72 (5950) WHQL
    Catalyst 4.4 WHQL
    Nforce Driver 3.13
    Farcry with 1.1 patch
    3DAnalyze latest build
    Fraps latest build

    The test system was built from scratch: the hard drive was formatted (NTFS) and then Windows XP was installed. Following the completion of the install, the nForce drivers were installed. The only updates applied were SP1a and DirectX 9.0b. Following a reboot the 60.72/56.72/4.4 display drivers were installed. Next the benchmarking tools were installed and finally the hard drive was defragmented. For all tests the Nvidia/ATI drivers were set to best image quality. This included manually setting trilinear mipmap filtering and manually disabling trilinear optimisations.


    [benchmark graph: Farcry framerates (average/minimum) for the 6800 Ultra on its default and forced R300 paths, the 5950 Ultra and the 9800 XT; resolution and AA/AF levels are noted on the graph]

    As you can see, the results are considerably different for the 6800 when the R300 device ID is forced. In addition to the 6800 Ultra then using the same pixel shaders as the Radeon card, the display issues which are well documented on the Internet are fixed by using the Radeon path.
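
    For anyone wondering what "forcing the device ID" actually touches: a Direct3D 9 game can query the adapter's PCI vendor/device IDs and choose a shader path from them. The sketch below is purely illustrative and not Crytek's code - the path names and the NV40 device-ID range are assumptions, though the vendor IDs (0x10DE NVIDIA, 0x1002 ATI) are the real ones.

    Code:
    // Sketch of device-ID-based render path selection in a Direct3D 9 game.
    // NOT Crytek's actual code; the path names and the NV40 device-ID range
    // are illustrative assumptions.
    #include <d3d9.h>

    enum RenderPath { PATH_NV3X, PATH_NV40, PATH_R300, PATH_GENERIC };

    RenderPath SelectRenderPath(IDirect3D9* d3d)
    {
        D3DADAPTER_IDENTIFIER9 id = {};
        if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
            return PATH_GENERIC;

        if (id.VendorId == 0x10DE)  // NVIDIA
            // Assume 0x004x = NV40 (GeForce 6800 family); other NVIDIA
            // parts get the NV3x path and its lower-precision shaders.
            return (id.DeviceId & 0xFFF0) == 0x0040 ? PATH_NV40 : PATH_NV3X;

        if (id.VendorId == 0x1002)  // ATI
            return PATH_R300;       // standard full-precision PS 2.0 path

        return PATH_GENERIC;
    }

    A tool like 3DAnalyze intervenes at exactly this point: it changes the IDs the game sees, so the NVIDIA branch is never taken and the 6800 falls through to the R300 path.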

    In summary, forcing the R300 path if you have a 6800 Ultra is advisable, as it gives you decent performance with better IQ than the standard path.

    One other point of note is that the Radeon 9800XT is very close in performance to the 6800 Ultra at the same IQ levels. Only 7fps in average framerate separates the cards, which is a phenomenal result for the Radeon.

    It may be interesting to hear what Nvidia have to say about why they don't run a fully working path by default. After all, surely they should be providing their customers with the best experience possible… shouldn't they?
     
  2. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    39,625
    Likes Received:
    1,491
    Trophy Points:
    138
    the card ID change... news to me... interesting findings...
     
  3. Logla

    Logla Well-Known Member

    Joined:
    May 10, 2003
    Messages:
    3,310
    Likes Received:
    2
    Trophy Points:
    48
    nVidia - the way it's meant to be played - apart from a few IQ-reducing tweaks which actually mean it's not quite the way it was meant to be played.
     
  4. zerodamage

    zerodamage New Member

    Joined:
    May 16, 2003
    Messages:
    3,478
    Likes Received:
    0
    Trophy Points:
    0
    The 9800 XT doesn't look so bad when both cards are rendering at the same quality.
     
  5. The_Neon_Cowboy

    The_Neon_Cowboy Well-Known Member

    Joined:
    Dec 18, 2002
    Messages:
    16,076
    Likes Received:
    28
    Trophy Points:
    73
    Rofl @ nvidia
     
  6. zerodamage

    zerodamage New Member

    Joined:
    May 16, 2003
    Messages:
    3,478
    Likes Received:
    0
    Trophy Points:
    0
    To be honest though, the drivers may be tweaked and optimized more before the card hits the shelves. It just seems like an ongoing trend that disappoints me.
     
  7. werty316

    werty316 BANNED

    Joined:
    Apr 11, 2004
    Messages:
    604
    Likes Received:
    0
    Trophy Points:
    0
    Changing the ID is new to me and this was informative. Nvidia is scheming again....
     
  8. Chr0n1c

    Chr0n1c AMG Driver

    Joined:
    Apr 6, 2004
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    0
    erm wtf? :confused:

    That's just changing the visual settings through the config, right? Basically? So in turn that isn't the actual performance path of the R300, since isn't that done through drivers and not through a game's tweaking of visual quality... right?

    It just doesn't make sense to me. :confused:

    Someone plz simplify this for me. :D
     
  9. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    It's making the game think it's using an R300, so any NV-specific settings for NV40/3x are not used and any R3xx-specific settings are.
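
    To make that concrete: a tool like 3DAnalyze sits between the game and Direct3D and rewrites what the adapter query returns. The wrapper below is a hypothetical sketch - the spoofed values (0x1002 / 0x4E44) are my assumption for an R300-class Radeon, and 3DAnalyze's real internals may differ.

    Code:
    // Hypothetical sketch of what an interception tool does: wrap the real
    // IDirect3D9 object and lie about the adapter identity. The spoofed IDs
    // (0x1002 / 0x4E44, an R300-class Radeon) are assumptions.
    #include <d3d9.h>

    class SpoofedD3D9
    {
        IDirect3D9* real_;
    public:
        explicit SpoofedD3D9(IDirect3D9* real) : real_(real) {}

        HRESULT GetAdapterIdentifier(UINT adapter, DWORD flags,
                                     D3DADAPTER_IDENTIFIER9* out)
        {
            HRESULT hr = real_->GetAdapterIdentifier(adapter, flags, out);
            if (SUCCEEDED(hr)) {
                out->VendorId = 0x1002;  // report ATI...
                out->DeviceId = 0x4E44;  // ...specifically an R300
            }
            return hr;  // the game now picks its R3xx shader path
        }
        // ...every other IDirect3D9 method would be forwarded unchanged...
    };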
     
  10. Chr0n1c

    Chr0n1c AMG Driver

    Joined:
    Apr 6, 2004
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    0
    I see, so this is similar to renaming the game executable to lose driver optimizations?
     
  11. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    Similar idea, yes, but not the same... close enough though.
     
  12. Chr0n1c

    Chr0n1c AMG Driver

    Joined:
    Apr 6, 2004
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    0
    Alright, :sigh: @ NV
     
  13. BWX

    BWX get out and ride

    Joined:
    Nov 29, 2002
    Messages:
    19,684
    Likes Received:
    63
    Trophy Points:
    73
    Could you see a difference in image quality? Too bad you didn't take some screenshots.

    Basically the game lowers image quality when it detects a 6800u being used?

    Shouldn't it RAISE the IQ for the more powerful card? I mean damn, when people start shelling out big bucks for the new Nvidia card they are going to want the best image quality. I know I would.

    The 9800XT still kicks some serious Nvidia butt if you ask me; I bet the R420 will beat the 6800u easily, with better IQ.


    edit - And what happens when you force the 6800u path (the default device ID) while the 9800XT is being used?
     
  14. digitalwanderer

    digitalwanderer Colour Commentator

    Joined:
    May 8, 2002
    Messages:
    5,619
    Likes Received:
    2
    Trophy Points:
    0
    Supposedly the next FarCry patch will change the way they run the 6800, but those results are just kind of bloody shocking to me! :wtf:

    A couple of dumb questions that I probably just missed reading your post:

    - What AA/AF levels, if any?

    -What were the max FPS like? (I know it ain't important, I'm just curious if the 6800 had some ungodly spikes or not)

    -Was v-sync on or off?

    Good read, thanks for posting it. I look forward to more findings from you.


    BTW - As a personal favor, could you run 3dm2k1se for me at 1024x768, all default settings (no AA/AF or v-sync), on the 6800 you got? I haven't seen that score and I'd really like to. (Yeah, I'm still old fashioned and like 3dm2k1se. :rolleyes:)
     
  15. griswoold

    griswoold mr. vögi

    Joined:
    Apr 6, 2003
    Messages:
    161
    Likes Received:
    0
    Trophy Points:
    0
    You don't need to ask nvidia, you have to ask Crytek. They developed the game, not nvidia.
     
  16. digitalwanderer

    digitalwanderer Colour Commentator

    Joined:
    May 8, 2002
    Messages:
    5,619
    Likes Received:
    2
    Trophy Points:
    0
    It's a TWIMTBP title though, isn't it? And the game had a mixed-mode path already, which I'm pretty sure nVidia had some input into. (I could be wrong, I don't know for sure.)
     
  17. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    Questions, questions, questions...

    ...ok.

    DW, res and AA/AF are on the graph... v-sync was off. I'm not able to run 3dm01 just now. Sorry.

    Panamajack, forcing the XT to run as an FX increased performance (it was using lower shaders). Didn't think that needed stating. Read up on the other articles posted on the net today... IQ is considerably different between the GeForce and the Radeon due to the PS versions used.

    Griswoold, I would have thought Nvidia would have made sure a TWIMTBP game was playing well on their hardware, even if it was a case of a driver hack to use the working path.
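
    On the "lower shaders" point, for anyone unclear: the same effect can be compiled against different pixel shader profiles, and ps_1_x trades precision and instruction count for speed compared with ps_2_0, which is where the IQ difference comes from. A tiny illustrative sketch using the D3DX compiler - the entry point "main", psSource and the commented-out calls are made-up examples, not Farcry's actual shaders.

    Code:
    // Illustration only: compiling one HLSL source against two profiles.
    #include <d3dx9.h>
    #include <string.h>

    ID3DXBuffer* CompileFor(const char* src, const char* profile)
    {
        ID3DXBuffer *code = NULL, *errors = NULL;
        // ps_1_1: short, low-precision programs; ps_2_0: full precision
        D3DXCompileShader(src, (UINT)strlen(src), NULL, NULL,
                          "main", profile, 0, &code, &errors, NULL);
        if (errors) errors->Release();
        return code;  // NULL on failure
    }
    // CompileFor(psSource, "ps_1_1");  // what an FX-class path might request
    // CompileFor(psSource, "ps_2_0");  // what the R300 path uses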
     
  18. BWX

    BWX get out and ride

    Joined:
    Nov 29, 2002
    Messages:
    19,684
    Likes Received:
    63
    Trophy Points:
    73
    I think the lowest FPS with both using the same IQ is the most important thing here - it looks like the difference is only 1 or 2 fps - which means under the most taxing circumstances the 9800XT is just about as fast as the "amazing new" 6800u. That is pretty amazing.
     
  19. Forge

    Forge New Member

    Joined:
    May 18, 2002
    Messages:
    107
    Likes Received:
    0
    Trophy Points:
    0
    I'd be more interested to see both cards forced to a totally arbitrary PCI ID. Using the R300's ID might enable some R300-specific settings that are detrimental to NV40 performance.


    Probably not, but worth looking into.
     
  20. zerodamage

    zerodamage New Member

    Joined:
    May 16, 2003
    Messages:
    3,478
    Likes Received:
    0
    Trophy Points:
    0
    You would think with double the pipes the 6800 would be rocking. I am thinking it's a driver issue. The card is new, the game is new.
     