Nvidia accuses ATI of FUD

Discussion in 'Industry News' started by HardwareHeaven, Jun 18, 2004.

  1. HardwareHeaven

    HardwareHeaven Administrator Staff Member

    Joined:
    May 6, 2002
    Messages:
    32,274
    Likes Received:
    163
    Trophy Points:
    88
    In a not-so-nice document seen by the INQUIRER, Nvidia has again launched all-out war on ATI. It's partly retaliation for ATI dissing Nvidia's HSI, and this is, we suppose, Graphzilla's way of letting its customers know what it really thinks about the Canadian firm.

    Without discussing HSI's merits or demerits, as we're testing that in the next few days, we just want to show you one nice picture.

    See this and more at the Inquirer.
     
  2. IamLakota

    IamLakota New Member

    Joined:
    Dec 18, 2003
    Messages:
    552
    Likes Received:
    5
    Trophy Points:
    0
    Ok, so what is FUD, F***** Up Development??? Sounds like the Inquirer likes to start trouble.
    What is your take on this, Zardon? I didn't think PCI-E cards were even out yet.
     
  3. mike2h

    mike2h New Member

    Joined:
    Nov 11, 2002
    Messages:
    6,359
    Likes Received:
    69
    Trophy Points:
    0
    Cats & dogs fighting it out. This little war is nothing but good for the consumer. We WILL get better hardware & software, faster & cheaper, the more heated this battle gets.

    Then again (in my paranoid mind :duh:), maybe they are just playing silly games. None of this crap presented by either side is really significant. Maybe they are just pretending to have a 'war' while running a little price-fixing scam on the back side. I mean, both companies have told us their products are cheaper to make. :evil:
     
    Last edited: Jun 18, 2004
  4. Helios_D

    Helios_D Banned

    Joined:
    Jun 7, 2004
    Messages:
    34
    Likes Received:
    0
    Trophy Points:
    0
    Seems like ATI is on a roll these days -- in lying, that is.

    First, they accuse nVIDIA of using AF optimizations, then get caught using one themselves.

    Then they bash nVIDIA for using a bridge chip while claiming they're using the real PCI-E spec. Well, that picture reveals another one of ATI's lies.

    I don't get why ATI is screwing around like this all of a sudden. Are they so worried about nVIDIA that they had to resort to these low-life lies??

    They should've just been honest, or held back the hypocritical bashing in the first place.
     
    Last edited: Jun 18, 2004
  5. HawK

    HawK Banned

    Joined:
    May 13, 2002
    Messages:
    2,092
    Likes Received:
    0
    Trophy Points:
    0
    And you see the pic and recognise this as a PCI-E bridge?? Good for you :)

    From the earlier Inq. article:
    Maybe so, or maybe not... time will tell.
    Though the one who IS using a bridge is honest NVidia ;)
     
  6. The_Neon_Cowboy

    The_Neon_Cowboy Well-Known Member

    Joined:
    Dec 18, 2002
    Messages:
    16,076
    Likes Received:
    28
    Trophy Points:
    73
    What's NVIDIA doing X-raying ATI's chips? Isn't that illegal -- decompiling their technology and studying and learning from it?

    Bolted on? WTF?

    Please note they're talking about the RV380, the new low end, not the R4XX series… As ATI said, this is buffers!

    Also, we're not microchip experts with doctorates on the subject -- how are we supposed to know what we're looking at in these? So you just have to blindly believe.


    NVIDIA cheats... NVIDIA has lied like crazy ever since the 5800 FX.

    You forget ATI's model more closely matches refrast (the reference rasterizer)… The ATI model doesn't sacrifice IQ for speed like NVIDIA's optimizations do…

    You forget the per-application detection that cheats accordingly..

    Nvidia is still the dirt on the bottom of ATI's shoe in my book....
     
  7. LeanWolf

    LeanWolf alpha male

    Joined:
    Dec 3, 2002
    Messages:
    5,790
    Likes Received:
    41
    Trophy Points:
    58
    Why does anyone read anything linked to the inquirer? Why do these stupid inquirer articles keep showing up here? C'mon, it's the INQUIRER! :rolleyes: If it were true, it'd be on a real news site!
     
  8. HardwareHeaven

    HardwareHeaven Administrator Staff Member

    Joined:
    May 6, 2002
    Messages:
    32,274
    Likes Received:
    163
    Trophy Points:
    88
    They keep showing up here because, believe it or not, the Inquirer does tend to write articles that have some level of truth in them -- not always, of course, but I tend to link to articles which have some grounding -- and even if they aren't 100% accurate, they're entertaining reading, right? I mean, we all take what we want from articles; I don't think people blindly believe anything until it's proven. And the more information we have, the more we form our OWN viewpoint on something.
     
  9. dipstick

    dipstick New Member

    Joined:
    May 29, 2002
    Messages:
    3,609
    Likes Received:
    16
    Trophy Points:
    0
    LOL@nVidia
     
  10. WaltC

    WaltC New Member

    Joined:
    May 9, 2003
    Messages:
    245
    Likes Received:
    4
    Trophy Points:
    0


    Actually, they didn't just "accuse" nVidia of disabling trilinear filtering on detail textures in the game UT2K3 last year (for nV3x, simply to make it appear the GF FX was running the game faster than it actually was); they pointed it out as an observation, and several websites verified it afterwards.

    But nVidia didn't stop their trilinear optimization, despite the negative publicity it generated at the time. Instead they expanded their trilinear optimizations so that detail textures weren't treated with trilinear in any 3d game at all, not just UT2K3, as was the case in the beginning. In response to a lot of criticism, nVidia finally in its latest series of drivers has added a "trilinear optimization disable" switch to its control panel--but it has been "broken" in at least one set of nVidia drivers thus far (which means you can't turn them off), and no tests have been run to verify whether the optimization defeat switch actually works across the entire spectrum of 3d games. That is yet to be established.

    ATi has been entirely up front about having added an automatic trilinear optimization in its latest drivers that, unlike nVidia's thus far, turns itself on and OFF, depending on the situation, with the idea being that the trilinear optimization only runs in situations where it will enhance performance without degrading IQ. Where it would degrade IQ, it turns itself OFF without user intervention.

    NVidia was found out last year because their optimizations did indeed degrade IQ; however, their optimization technique has improved a lot since then and there's much less of it visible in current drivers, from what I read. So far, it seems impossible to capture any IQ degradation from ATi's optimization in a screen shot. My own opinion is that as long as it doesn't degrade IQ, all optimizations are A-OK. The ones I don't like are the ones which degrade IQ visibly and markedly simply for the sake of securing higher benchmark frame-rate scores in popular games--just for the sake of selling cards. That was at the root of the original nVidia trilinear optimization complaints.
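
    For what it's worth, here's roughly the difference between textbook trilinear and a reduced-trilinear ("brilinear") shortcut, in a little Python sketch. This is purely illustrative -- the "band" cutoff is a made-up parameter of mine, sample_mip is a stand-in for the texture sampler, and none of this claims to be ATi's or nVidia's actual hardware logic:

        # Illustrative sketch only, not either vendor's real algorithm.
        # Textbook trilinear blends the two nearest mip levels by the fractional
        # LOD; the reduced version narrows the blend band so that most pixels
        # fall back to a single (cheaper, bilinear) mip level.

        def trilinear(sample_mip, lod):
            """Full trilinear: always blend the two nearest mip levels."""
            lo = int(lod)
            frac = lod - lo
            return (1.0 - frac) * sample_mip(lo) + frac * sample_mip(lo + 1)

        def reduced_trilinear(sample_mip, lod, band=0.25):
            """Reduced ('brilinear') filtering: blend only near mip transitions."""
            lo = int(lod)
            frac = lod - lo
            if frac < band:              # close to the lower level: skip the blend
                return sample_mip(lo)
            if frac > 1.0 - band:        # close to the upper level: skip the blend
                return sample_mip(lo + 1)
            t = (frac - band) / (1.0 - 2.0 * band)   # re-normalise the middle region
            return (1.0 - t) * sample_mip(lo) + t * sample_mip(lo + 1)

    An "adaptive" scheme in the sense described above would just pick between the two per texture, falling back to the full version whenever the shortcut would visibly change the output.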

    I have no idea what the INQ is talking about...;) A PCIe-to-AGP bridge is this: a native PCIe chip which uses a bridge *when deployed on an AGP x8* board. When running on a PCIe board, the bridge isn't used, because it isn't needed. nVidia's using an AGP x8-to-PCIe bridge, which means that on an AGP x8 board the bridge isn't used, but on a PCIe board it is. That seems a fairly big difference to me.

    Well, I think you need to reread the INQ piece--nVidia's the one supposedly x-raying the ATi chips and mis-stating what ATI's doing with them...;) Of course, this is the INQ, remember, and they almost always get lost in the details of the things they are told...;)
     
  11. dipstick

    dipstick New Member

    Joined:
    May 29, 2002
    Messages:
    3,609
    Likes Received:
    16
    Trophy Points:
    0
    Now they are up front with it, but they weren't during the initial reviews, methinks. If they were, I don't understand why they ticked so many people off :confused:
     
  12. No_Style

    No_Style Styleless Wonder

    Joined:
    Jun 18, 2002
    Messages:
    6,027
    Likes Received:
    0
    Trophy Points:
    46
    Yes~! Shiver me timbers, I think he got it.

    "A PCIe-to-AGp bridge is this: a native PCIe chip which uses a bridge *when deployed on an AGP x8* board. When running on a PCIe board, the bridge isn't used, because it isn't needed. nVidia's using an AGP x8-to-PCIe bridge, which means that on an AGP x8 board the bridge isn't used, but on a PCIe board it is. That seems a fairly big difference to me."

    Nice WaltC, that's good interpretation of it. Instead of using an external chip on the PCB, (Assuming they are using a "bridge")ATI has gone an implemented within the core itself. Like adding an extension of a house. Sure the house isn't 100% rebuilt, but it's completely different from nVidia's "trailer house in its backyard" extension :D
     
  13. No_Style

    No_Style Styleless Wonder

    Joined:
    Jun 18, 2002
    Messages:
    6,027
    Likes Received:
    0
    Trophy Points:
    46
    During review time, reviewers were not informed about the "Adaptive Trilinear" filtering method. My guess is that ATI thought their method would work flawlessly and nobody would catch on. But they underestimated reviewers' need to look at everything under a high-powered microscope, and the truth was revealed. BUT, like mentioned above, ATI came clean about it and isn't denying it. :)
     
  14. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    39,694
    Likes Received:
    1,540
    Trophy Points:
    138
    Remember though... the Adaptive Trilinear was considered -- and I think so myself -- a step up in trilinear coding (an advancement)... and ATI had been developing it for quite some time. If it doesn't degrade quality -- and so far no one has got any piece of evidence to prove otherwise -- why mention it until the patent goes through? (It's been in development for quite some time, and has been available on earlier cards for a while.)
     
  15. No_Style

    No_Style Styleless Wonder

    Joined:
    Jun 18, 2002
    Messages:
    6,027
    Likes Received:
    0
    Trophy Points:
    46
    It was flawed for the reviewers; that's why some caught on. Far Cry was the game people noticed it in, wasn't it? Yeah, it's been around for a long time, but the drivers used with the X800 reviews did something wrong and it was exposed.
     
  16. lar1r

    lar1r New Member

    Joined:
    Sep 24, 2003
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    0
    Nope, Far Cry acts like that on ALL cards. It has nothing to do with any filtering style -- try it out with your 5900 or your 9800. Someone provided the save game on B3D to test it -- I did.

    As for this PCI-E bull: you can't tell what is in an AGP component or a PCI Express component, much less say it's a bridge, from X-rays. They merely show the layout of the chip. In this case the AGP layout (if that is what it is) no longer exists in the new chip and is replaced by a larger PCI-E component.

    Don't know who NVIDIA is trying to fool with this stuff.
     
  17. No_Style

    No_Style Styleless Wonder

    Joined:
    Jun 18, 2002
    Messages:
    6,027
    Likes Received:
    0
    Trophy Points:
    46
    Ah, I see. Thanks for clearing that up :)
     
  18. tazdevl

    tazdevl New Member

    Joined:
    May 8, 2002
    Messages:
    310
    Likes Received:
    0
    Trophy Points:
    0
    A couple of comments. Nice to see Walt on here -- always a level-headed, knowledgeable response from him.

    FUD = Fear, Uncertainty and Doubt. A good tactic for creating perception gaps around a competitor's products.

    Neon, there is nothing wrong with X-raying a competitor's chip. Competitive intelligence and reverse engineering are used by every company out there. It's common business practice. How will you figure out how your competitor is making things work if you don't? Ever wonder how some new-fangled laundry detergent has competitors two months after it launches?

    As Walt said, the chip is native PCI-E with a bridge chip to go back to AGP. Makes sense considering the installed base of AGP motherboards, no? It's also a lot cheaper than developing two separate cores, one for each platform, and it might address some latency issues that could be encountered with nVIDIA's solution.

    Has a native PCI-E nVIDIA GPU been spotted in the wild?

    The filtering issue ATI has handled extremely well. Yes, they could have positioned things differently on the front end. But the bottom line is that if it doesn't affect image quality, I couldn't care less. It doesn't, so I don't care. I and 99.999999% of folks aren't going to sit in front of our computers taking screenshots of gameplay, making regedits and comparing the results.
     
  19. Helios_D

    Helios_D Banned

    Joined:
    Jun 7, 2004
    Messages:
    34
    Likes Received:
    0
    Trophy Points:
    0
    Well, it turns out that ATI's optimizations aren't perfect either. Look here for an example where ATI's filtering algorithm produces visual artifacts in a recent popular game.

    I just hope that ATI doesn't revert to its 8500 days of deception and sneakiness.
     
  20. means96

    means96 New Member

    Joined:
    Jun 20, 2002
    Messages:
    76
    Likes Received:
    1
    Trophy Points:
    0
    ATI vs. Nvidia

    No, it's not illegal for NVIDIA to X-ray and study ATI chips. This is common practice, and I would be highly surprised if ATI hasn't X-rayed NVIDIA chips. This argument is very silly -- ATI and NVIDIA are both corrupt and you cannot trust either.
     
