nVidia Guru David Kirk Answers Your Questions

Discussion in 'Industry News' started by Danhill, Dec 22, 2004.

  1. Danhill

    Danhill New Member

    Joined:
    May 16, 2003
    Messages:
    4,112
    Likes Received:
    0
    Trophy Points:
    0
    David Kirk has an insatiable need for power. Graphics rendering power, that is. He's fond of saying, "Why use a screwdriver when you can use a sledgehammer?" As nVidia's Chief Scientist, Kirk has overseen development of architectures from the original GeForce right through the current GeForce 6 series. nVidia has had a very impressive run of successes with its GPUs, though the FX series brought the company some unanticipated headaches. The GeForce 6 series has reestablished the company's position as a contender for the 3D performance crown, with the new SLI technology cementing its place at the top of the heap.

    Read more: Extremetech
     
  2. HardwareHeaven

    HardwareHeaven Administrator Staff Member

    Joined:
    May 6, 2002
    Messages:
    32,274
    Likes Received:
    163
    Trophy Points:
    88
    Reader Question: I've read at a number of sites that the video processor on the 6 series GPUs was "somewhat broken," but I've never heard any specifics. If you're willing to, could you share with us exactly how this processor was broken? I just picked up a 6800 GT, so this would be very nice to know.—felraiser

    David Kirk: Well, the first thing that I would say about "I've read at a number of sites..." is that you shouldn't believe everything that you read! The video processor of the series 6 GPUs is just that—a processor—and it runs software for video encoding, decoding, and image processing and enhancement. The GeForce 6 series video processor is a new processor, the first of its kind, and there was no legacy (pre-GeForce 6) code to run on it. At the time the original GeForce 6800 shipped, very little new code had been written for the new processor. So, for early GeForce 6800 customers, there was little or no improvement in video quality, performance, or reduction of CPU usage.
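
    To picture what Kirk is describing, here is a minimal C sketch (hypothetical, not NVIDIA's actual driver code) of the dispatch decision his answer implies: a codec only gets offloaded to the video processor if software for it has actually been written and shipped, otherwise decoding falls back to the CPU. The codec names and the capability query below are made-up stand-ins:

    ```c
    /* Hypothetical sketch: the GeForce 6 video engine is a programmable
     * processor, so offload depends on whether software for a given codec
     * exists yet. None of these names are real NVIDIA driver interfaces. */
    #include <stdbool.h>
    #include <stdio.h>

    typedef enum { CODEC_MPEG2, CODEC_WMV9, CODEC_DIVX } codec_t;

    /* Stand-in for a driver capability query. */
    static bool vp_software_available(codec_t codec) {
        switch (codec) {
        case CODEC_MPEG2: return true;   /* assume this path was ported first */
        default:          return false;  /* newer codecs await new VP code */
        }
    }

    static void decode_frame(codec_t codec) {
        if (vp_software_available(codec))
            printf("offload to video processor (low CPU usage)\n");
        else
            printf("fall back to CPU decode (high CPU usage)\n");
    }

    int main(void) {
        decode_frame(CODEC_MPEG2); /* accelerated */
        decode_frame(CODEC_WMV9);  /* CPU fallback until VP software ships */
        return 0;
    }
    ```

    That would match the complaints later in this thread: with no VP software for a codec, playback pegs the CPU even though the hardware is present.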


    ____________________________


     Wow, interesting answer indeed.
     
  3. PoopyTheJ

    PoopyTheJ New Member

    Joined:
    Jan 15, 2003
    Messages:
    1,584
    Likes Received:
    1
    Trophy Points:
    0
     I'd say the WMV decoding is still not working as of the 71.20 drivers, and no DivX as far as I can tell either. Of course, maybe there's an updated codec; I'll go look... Annoying doublespeak from a corporate shill, big surprise...
     
  4. Huigie

    Huigie New Member

    Joined:
    Oct 27, 2002
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    0
    "As to the GeForce 6800 video processor being "broken," I wouldn't say that."

     So what would you call it then, f'cked up? The answer seems "too open"... :sigh:
     
  5. The_Neon_Cowboy

    The_Neon_Cowboy Well-Known Member

    Joined:
    Dec 18, 2002
    Messages:
    16,076
    Likes Received:
    28
    Trophy Points:
    73
     I'd just chalk it up to their frequent lying & common deceptions.
     They really need to give their fans a little more credit than that...


     I guess they think all those hundreds of thousands of people on forums
     with 6800 cards that say it's borked are um... lying :D :rolleyes: :sigh: or covert ATI operatives :D

    "So, for early GeForce 6800 customers, there was little or no improvement in video quality, performance, or reduction of CPU usage."

     I'd guess the people on the forums with 100% CPU usage running videos and like 0% VPU usage would say otherwise :rofl:

     If it's not borked with the new 6800s, then are they saying they fixed it?

    "Graphics will continue to be a collection of clever tricks, to do just enough work to calculate visibility, lighting, shadows, and even motion and physics, without resorting to brutishly calculating every detail. So, I think that there's a great future both in more powerful and flexible GPUs as well as ever more clever graphics algorithms and approaches."

    Reader Question: With all of the pressing towards more powerful graphics cards to handle features such as FSAA and anisotropic filtering, why do we still use inefficient, "fake" methods to achieve these effects?—thalyn

    "Looking at my answer from the question about ray-casting and lighting effects, graphics is all "fake" methods. The trick is to perform a clever fake and not get caught! All graphics algorithms do less work than the real physical universe but attempt to produce a realistic simulation. "

     Hmm... an NV driver advertisement :lol:
     **I've bookmarked this**, great read
     
    Last edited: Dec 23, 2004
  6. Danhill

    Danhill New Member

    Joined:
    May 16, 2003
    Messages:
    4,112
    Likes Received:
    0
    Trophy Points:
    0
     Yes indeed, it's just like The X-Files... trust no one :wtf:
     
