NVIDIA Passion for Gaming

Discussion in 'News Discussion' started by Tahir Khan, Apr 2, 2010.

  1. Tahir Khan

    Tahir Khan Hardware Geek

    Joined:
    Jan 29, 2010
    Messages:
    492
    Likes Received:
    5
    Trophy Points:
    0
    Read Full Story Here.
    _________________________

    Drew Henry of NVIDIA has posted a video claiming NVIDIA has a passion for the future of gaming.
     
  2. ET3D

    ET3D Hopeless Dreamer

    Joined:
    Aug 20, 2003
    Messages:
    3,086
    Likes Received:
    90
    Trophy Points:
    58
    New slogan: "Nvidia. It's all talk."
     
  3. gascieus

    gascieus Under the Crimson Air

    Joined:
    Dec 6, 2009
    Messages:
    140
    Likes Received:
    7
    Trophy Points:
    0
    It sounds like nothing more than damage control... High temperatures = not as good when it comes to overclocking, and those buying the top-tier cards are usually enthusiasts. The 7800/7900 GT/GTX were the same. My 7800 and 7900 both idled at 75C and went up to 115C under load. I don't care what they're trying to spin, it's not good for your cards.

    As for the part about their passion for gaming... well we all know it's just a PR thing, so nothing more to comment I guess.
     
  4. Uxot

    Uxot Active Member

    Joined:
    Oct 24, 2009
    Messages:
    313
    Likes Received:
    14
    Trophy Points:
    28
    Just ..LOL
     
  5. Sihastru

    Sihastru Never been clicked

    Joined:
    Dec 12, 2009
    Messages:
    428
    Likes Received:
    22
    Trophy Points:
    18
    When the HD2xxx and the HD3xxx came out I don't remember nVidia fans channeling so much hate towards ATi products.

    Why so hateful? What are you so afraid of? By all accounts ATi fans have nothing to fear from nVidia and yet all I see is people reacting... You know what goes around will eventually come around...
     
  6. Ramon Zarat

    Ramon Zarat New Member

    Joined:
    Mar 21, 2010
    Messages:
    8
    Likes Received:
    0
    Trophy Points:
    0
    Nvidia never sounded so desperate AND pitiful at the same time. Let me debunk that little blog...

    "We picked PAX East as the venue because it’s a show for gamers, by gamers"

    You didn't choose PAX. PAX chose you. No one had heard of this "gamers event" prior to this PR-stunt paper launch of yours. PAX just 'happened' to be good enough, for 2 reasons:

    1- By launching your "GPU" at PAX, Nvidia became the star of this otherwise very obscure event. No one to steal the show, no one to challenge you there.

    2- PAX was just lucky to coincide with the approximate date of your launch, which oddly enough, was pushed back, *again*, to April 12...

    "Having built these cards with passionate PC gamers in mind,"

    Fermi is first and foremost a GPGPU, with C++ support, high-performance double-precision floating point and ECC. The 480/470 incarnation of the Fermi GPGPU is an adaptation, a derivative, a less-than-ideal compromise to get gaming functionality. This "GPU" is a jack of all trades, master of none; merely an afterthought following Nvidia's cancellation of the 300 series.

    "Since launch, we’ve been getting great feedback"

    WOW... This bullshit stinks so much I nearly threw up while reading it. Nvidia obviously hasn't been reading the same forums as everybody else on this planet. As we all know by now, the general consensus is *far* from favorable.

    "We wanted to let you know that we’ve also heard your concerns about GTX 480 with respect to power and heat."

    No kidding!! No mention of idling at 90C 24/7 when 2 LCDs are in use, at 22C lab conditions? On a hot summer day they will go over 100C under load and die prematurely.

    "When you build a high performance GPU like the GTX 480 it will consume a lot of power to enable the performance and features I listed above. It was a trade-off for us, but we wanted it to be fast."

    Fermi is synonymous with "trade-off". With 3 billion transistors, roughly 40% more than the 2.15 billion of the 5870, the 480 only manages a 10-12% advantage on average while consuming and heating a lot more. Even the 4.3-billion-transistor 5970 is a lot faster for less power and heat. Efficiency-wise, Fermi is a disaster. Nvidia resorted to severe compromises to make their broken architecture work: they had to disable some SPs and overvolt the card considerably to make the remaining SPs switch at 700MHz.
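    To put rough numbers on that efficiency argument, here is a minimal sketch using the transistor counts quoted in this post; the 11% performance advantage is an assumption (the midpoint of the 10-12% range cited above), not an official figure:

    ```python
    # Transistor budget vs. performance, using the figures quoted in the post.
    gtx480_transistors = 3.0e9
    hd5870_transistors = 2.15e9

    extra_transistors = gtx480_transistors / hd5870_transistors - 1
    print(f"extra transistors: {extra_transistors:.0%}")  # prints "extra transistors: 40%"

    # Assumed average performance advantage (midpoint of the post's 10-12%).
    perf_advantage = 0.11
    perf_per_transistor = (1 + perf_advantage) / (1 + extra_transistors)
    print(f"relative perf per transistor: {perf_per_transistor:.2f}")  # prints "relative perf per transistor: 0.80"
    ```

    In other words, on these assumed numbers the 480 delivers about 80% of the 5870's performance per transistor, which is the inefficiency the post is complaining about.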

    "The chip is designed to run at high temperature so there is no effect on quality or longevity. We think the tradeoff is right"

    Funny that everybody else disagrees. It's a scientific fact that excess heat and high voltage are bad for chip longevity, and Fermi is plagued by both. Since when can Nvidia conjure away the laws of physics themselves?

    "The GF100 architecture is great"

    For GPGPU applications it has some potential, no one will deny that. But even there, its lack of efficiency is a real potential deal-breaker.

    "we think the right one for the next generation of gaming"

    Define the next "generation of gaming". If by that Nvidia means games and 3D engines that are not even out yet, do they claim to see the future?? Current DX11 game benchmarks clearly show the 480 is anything but the 5970 killer it was hyped to be; it barely beats the 5870 while costing $100 more and may even require a PSU upgrade on top of that.

    "The GTX 480 is the performance leader"

    That statement is misleading. Within the GTX 400 family, yes it leads. On the market? No... The 5970 is the undisputed king.

    "We built them for you."

    What a joke. I almost fell off my chair! It sounds like some kind of pure act of love!! Listen, Nvidia: my mother makes cookies for me, no strings attached. You, on the other hand, are in the business of selling products to make money and satisfy your shareholders. You are not doing me any favors. No one is.
     
  7. Ramon Zarat

    Ramon Zarat New Member

    Joined:
    Mar 21, 2010
    Messages:
    8
    Likes Received:
    0
    Trophy Points:
    0

    Nvidia lived wayyyy too long by the sword not to perish by the sword. If you have been around since the inception of the Riva 128, you know *exactly* what I'm talking about. I have no pity for them. No one should.
     
  8. Mousey

    Mousey HH's Official Rodent

    Joined:
    Jan 13, 2007
    Messages:
    7,902
    Likes Received:
    510
    Trophy Points:
    138
    Noooo no no no you've got it wrong ET;
    "Nvidia; The Way it's Meant To Be Said."

    Gotta admit that physics demo with barney the test pilot looked amazing
     
    Trusteft likes this.
  9. Earthmonger

    Earthmonger New Member

    Joined:
    May 4, 2009
    Messages:
    161
    Likes Received:
    14
    Trophy Points:
    0
    I had a good laugh and cheered as expected when nVidia announced these cards. Now though, it's April 3rd, and it's clear this wasn't an April Fool's joke. I'm not impressed.
     
  10. Cannyone

    Cannyone New Member

    Joined:
    May 15, 2006
    Messages:
    85
    Likes Received:
    2
    Trophy Points:
    0
    I have long since become immune to Marketing Hype. The fact is that sometimes Nvidia does make a decent card. And at other times they don't live up to people's expectations.

    And, for this round at least, ATI seems to have made it to market sooner with a better set of new cards. While Nvidia is going to have to settle for hind tit... Now we just hope ATI keeps the heat on by dropping the price a bit... :D
     
    Mousey likes this.
  11. Uxot

    Uxot Active Member

    Joined:
    Oct 24, 2009
    Messages:
    313
    Likes Received:
    14
    Trophy Points:
    28
    Nvidia has spent years thinking they were so good... and now ATI has owned them with the 5000 series. And ATI isn't out there every day shouting "OMG OUR PRODUCTS OWN ALL OTHERS!" (well, with the 5970 I could understand it, LOL).

    Nvidia DX11 = 2 cards? ATI DX11 = allllll of the 5000 series! I think ATI and DX11 are going to rule for a long time... (yes, I'm a freaking ATI fan, AND?)
     
  12. Trusteft

    Trusteft HH's Asteroids' Dominator

    Joined:
    Nov 2, 2004
    Messages:
    24,005
    Likes Received:
    3,823
    Trophy Points:
    153
    I think some of the "bad" comments about Nvidia come from the arrogance in their messaging ever since the "The Way It's Meant To Be Played" logo appeared, arrogance that only gets stronger with each passing month. Then there are the nasty/"clever" comments about Intel and ATI, the constant renaming of GPUs, and the high failure rate of Nvidia laptop GPUs and the LONG time it took them to admit to anything while they continued selling them.
    All of the above while selling their cards at a premium and taking too long to release any DX11 card (and even now they are not out yet).
    Then the patronizing tone about what people need, etc.

    Plenty of reasons really. Plus, some are AMD fanboys and would find any excuse to blast Nvidia. :)
     
  13. Uxot

    Uxot Active Member

    Joined:
    Oct 24, 2009
    Messages:
    313
    Likes Received:
    14
    Trophy Points:
    28
    True, true... but what can I say, I just hate Nvidia... <.< lol

    I didn't like it when AMD bought ATI... but with where they are now... no comment, lol.
    (I didn't like it when AMD started touching ATI's drivers and screwed most of them up :S)

    Meh, I'm off to sleep (4 AM, lol)
     
  14. sammorris

    sammorris New Member

    Joined:
    Aug 4, 2008
    Messages:
    178
    Likes Received:
    6
    Trophy Points:
    0
    "When the HD2xxx and the HD3xxx came out I don't remember nVidia fans channeling so much hate towards ATi products.

    Why so hateful? What are you so afraid of? By all accounts ATi fans have nothing to fear from nVidia and yet all I see is people reacting... You know what goes around will eventually come around..."

    Oh really? I think you'll find they did. Nvidia fans are all over ATI, scrutinising it at the best of times, let alone the worst. If this is an argument that Nvidia fanboys are somehow 'more lenient' than ATI fanboys, you can forget it.

    As far as the current case goes, though, there is genuinely a lot to complain about. The price. The heat. The noise. The inconsistent performance. The GTX470/480 is a GPU compute card. It's not a gaming card. Not at that price.
     
  15. Sihastru

    Sihastru Never been clicked

    Joined:
    Dec 12, 2009
    Messages:
    428
    Likes Received:
    22
    Trophy Points:
    18
    Well, in my opinion this product is not a failure. What people consider a failure at this moment comes down to the high power consumption and higher-than-average full-load temps.

    People who use low-end or mid-range cards are not really looking into buying any of the current high-end cards, and yet they are the ones reacting the most... People who are looking to buy a high-end card aren't really so much into power consumption, but rather raw FPS numbers.

    People looking into high-end rigs run high-end power supplies (750W-850W+). In the long run (which is usually 2 years), the power consumption difference between a HD5870 and a GTX480 won't save the planet. Noise is not so much of an issue; the cards are louder than the old GTXs, but they are not the loudest cards on the market.

    Whenever I attack an Apple fanboi (because I like to, it's a hobby), their only defense is telling me "well, (insert name calling here), first actually use (insert Apple product here) and then tell me that it's an overpriced POS", because apparently I can't have an opinion about something I'm not familiar with (interesting concept).

    I'd like to tell everyone mocking nVidia for their latest high-end cards... well, first actually use a high-end card for a year or so (any high-end card), and then complain about power consumption.

    My number crunch would be this one: ATi HD5870 has ~2.154 billion transistors, while the GTX480 has ~2.9 billion. HD5870 core is 334 sq mm, while the GTX480 core is 529 sq mm.

    Now, 529 / 334 = 1.584. And 2.9 / 2.154 = 1.346. There's a big areal density difference so it's safe to say both companies have a strange way of counting transistors.

    Anyway, (1.584 + 1.346) / 2 = 1.465, just to be on the safe side. The ATi HD5870 TDP (with 1GB of RAM) is 188W.

    188 * 1.465 = 275W.

    nVidia said GTX480 has a TDP of 250W. Tests with preproduction cards with 1.5GB (1.5x the amount the HD5870 has) of RAM suggest a higher number.

    So, is a TDP >250W for the GTX480 something you weren't expecting? Why is this all so disappointing?

    My opinion is that this is the only thing ATi fanbois can do before accepting defeat. The reason for all the bad-mouthing comments is 'envy'. Lol. Don't think it's a bad thing, it's what drives sales.

    If the performance per watt is something that is not appealing yet to you, then my advice is to wait for better drivers and retail cards.
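    The back-of-the-envelope TDP extrapolation in this post can be sketched as follows; all of the figures (transistor counts, die areas, the 188W HD5870 TDP) are the post's own numbers, not verified specs, and averaging the two ratios is the post's heuristic, not an engineering rule:

    ```python
    # Rough TDP extrapolation from HD5870 to GTX480, per the post's reasoning.
    hd5870_transistors = 2.154e9
    gtx480_transistors = 2.9e9
    hd5870_area_mm2 = 334
    gtx480_area_mm2 = 529
    hd5870_tdp_w = 188

    area_ratio = gtx480_area_mm2 / hd5870_area_mm2              # ~1.584
    transistor_ratio = gtx480_transistors / hd5870_transistors  # ~1.346

    # The post splits the difference between the two ratios "to be on the safe side".
    scale = (area_ratio + transistor_ratio) / 2                 # ~1.465

    estimated_tdp = hd5870_tdp_w * scale
    print(round(estimated_tdp))  # prints 275
    ```

    The point of the sketch is simply that a >250W GTX480 was predictable from die size alone, which is the post's argument for why the power numbers shouldn't surprise anyone.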
     
    crowTrobot likes this.
  16. xen.chris

    xen.chris dot daemon

    Joined:
    May 26, 2003
    Messages:
    39
    Likes Received:
    1
    Trophy Points:
    0
    Of the whole ATI 5000 series, which cards manage at least 30 fps (meaning playable DX11 games)? Perhaps the 5870, and definitely the 5970. Oops, that makes 2 cards as well. :D For the rest, it's just a spec: DX11 compatible. So what? Compatible doesn't necessarily imply playable.
    When DX11 games flood the market and the current GTX480 & 470 are $200 less, they'll be a best buy. Until then, they are not the best of the rest. And please take into account that besides Intel there are no fabs which can make a chip on a lower than 40nm manufacturing process. So from this point of view, the fault of the GTX480 is that it's an early child, ahead of its time. Whether this will prove to be a good thing or a bad one, only the cash from buyers will tell.
     
  17. OldBuzzard

    OldBuzzard DH's oldest Geek

    Joined:
    May 25, 2003
    Messages:
    2,777
    Likes Received:
    145
    Trophy Points:
    88
    With the amount of power it takes to run them, the amount of heat they generate, and the amount of noise they produce while doing so, the 470/480 will NEVER be a "best buy".

    Gotta love the nVidiots, and the fanatics.

    the nvidiots are defending the 470/480 just like the fanatics defended the X2900s.

    Get over it. Like the X2900, the Fermi is a turd. It's a shame too, as a decent nVidia card would have brought down the prices of the high end 5xxxs.
     
    Trusteft likes this.
  18. Trusteft

    Trusteft HH's Asteroids' Dominator

    Joined:
    Nov 2, 2004
    Messages:
    24,005
    Likes Received:
    3,823
    Trophy Points:
    153

    QFT wise one
     
  19. Sihastru

    Sihastru Never been clicked

    Joined:
    Dec 12, 2009
    Messages:
    428
    Likes Received:
    22
    Trophy Points:
    18
    One of you has a 4870X2, whose TDP is 286W; the other has two 4870s, each with a TDP of 160W (so ~320W, should be less in real life); and I have a GTX295 with a TDP of 289W.

    The GTX480 has a TDP of 250W (less than 300W anyway). So I don't see the problem: the card comes with all the current shared technologies (and some extra ones) and usually has better performance than all the combos above.

    I'm not sure where the truth is but it's in there somewhere...
     
  20. Trusteft

    Trusteft HH's Asteroids' Dominator

    Joined:
    Nov 2, 2004
    Messages:
    24,005
    Likes Received:
    3,823
    Trophy Points:
    153
    playing devil's advocate, I didn't know that the 480 was competing with the previous generation of cards.
    :p
     
