Discussion in 'AMD Graphics Cards' started by HardwareHeaven, May 21, 2004.
Yes.. Thanx WaltC for a clear & understandable view
"I feel good duna-nuna-nunat" breaks into music
Zardon, your new avatar is freaking me out!!!! :wtf:
This current card is only a mid-step until DX10 comes out anyway. I can't say much about it, because I don't think it's all that worth it to buy one when my 9800xt works fine.
How can anything that keeps IQ at a high level and still improves game performance be considered bad? If ATI erred it was by not being upfront about the algorithm and letting everyone know what to expect. I won't say that they did it intentionally because from what I have read they really did not think it would be such a big issue. It certainly is not a cheat because, unlike nvidia, it is not application specific. Making it user selectable is an option, but why would anyone want to turn it off?
I think ATI will walk away from this understanding the need to be upfront about any new performance tweaks or changes in the drivers so it doesn't look like they are trying to hide something. They have clearly stated they want to offer gamers the best IQ AND best performance they can get. This algorithm is just another clever example of why ATI has the best driver crew in the business.
Does anyone know if TR is a pro nvidia site? We all know some sites are pro one way or another, so you often have to consider that when reading "breaking news" about hardware. Always consider the source.....
I'm not implying anything about TR, I just think that people all too often take what they read at face value without taking time to think about the source of the info and what their motive for this "news" could be. What do they get out of it? Why continue to make an issue of something that is so clearly a non-issue? Why so quickly dismiss the goodwill that ATI has built with the gaming community? Why are they only now discovering something that has been implemented for quite a while, and if they could not detect it before, how did they do so now? Did they get "inside info" from an outside source? Why does this seem to coincide with the release of ATI's new hardware? Who is really going to benefit from all of this?
I really think this is just another example of nvidia manipulating something to try to make themselves look better and to sway potential buyers their way. It's all about marketing......
JMHO of course.
I don't see what the big deal is. It's not like they are doing it just to win benchmarks and then in games it runs like ass or anything like that. The X800XT is looking to be the fastest card, again, and the image quality is still top notch. Relax, people.
I said it before in another post......
I don't care how much either sides "cheats" to achieve performance. I dislike nVidia for more reasons than just driver cheats (deviant cG coding, taking up two card slots, inferior IQ....etc.)
As far as I care, either side can cheat their brains out all they want, as long as my games run without corruption or crashes and I get good framerates, THAT is the only thing I care about.
The lines between an optimization or a cheat are getting finer all the time. People are nitpicking too much, me thinks.
I chose ATI for my graphics needs and I'm going to stick with ATI. All this "cheating" talk doesn't faze that decision one little bit.
Maybe we just need to make new words to fit what VPU makers are (supposedly) doing: "optimicheations" or "cheatimizations" :lol: for all the nitpickers.
It's a non-issue to me. Besides...what is this? The second (supposed) cheat ATI has been caught in? How many has nVidia been caught in? Surely more than 2. For all nVidia's cheating, what has it got them? They are still number two in performance.
...it sure gives us all something to talk about, doesn't it?
As in many posts before, a clear view of the situation presented, thank you for posting - and I agree...
Trilinear filtering is, as was stated in ATI's response, an effect applied to the image - and a name for a process. The process itself can be and is done differently by the manufacturers ( there's also VIA, and of course Intel - what about their methods - nah, leave 'em be...) which is fine with me. What matters most to me about the process is...
Accuracy - of the image as displayed on the screen.
ATI has their "eye" on that point, and have been an industry leader for many years in displaying an accurate rendition of what the software author intended to display to the user.
I commend ATI for their hard work, and for the honesty they display in the response concerning Trilinear Filtering. I don't believe that they ever intended to "cheat" or deceive at any point on that issue. Their explanation shows they were, and are, striving for accuracy in their processes. From the tone of the response, ATI likely didn't think their adaptive algorithm - keyed to how a person perceives the transition within the image when the algorithm is applied - would raise a stink, because of the benefits the user receives - both speed and image quality simultaneously.
Raise a stink it did though - but ATI - unlike another company - gave an immediate response.
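For anyone following along who hasn't seen the process everyone is arguing about written down: textbook trilinear filtering just blends bilinear samples from the two nearest mipmap levels. The sketch below is purely illustrative (plain Python, greyscale textures, made-up function names) and is not how any vendor's hardware actually implements it:

```python
def bilinear_sample(tex, u, v):
    """Bilinearly interpolate a square greyscale texture (list of rows)
    at normalised coordinates u, v in [0, 1]."""
    n = len(tex)
    x, y = u * (n - 1), v * (n - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, n - 1), min(y0 + 1, n - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def trilinear_sample(mips, u, v, lod):
    """Full trilinear: bilinear taps on the two nearest mip levels,
    blended by the fractional part of the level-of-detail."""
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    frac = lod - int(lod)
    return (bilinear_sample(mips[lo], u, v) * (1 - frac)
            + bilinear_sample(mips[hi], u, v) * frac)
```

The whole "adaptive" debate boils down to how much of that second bilinear tap can be skipped when the blend weight is near 0 or 1 without anyone noticing.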
I shared my thoughts about ATI with you guys earlier in this thread;
I still feel the same way.
About Nvidia, I just don't know what to believe or not. I really thought that they had actually learned something from last year, but they didn't.
I also think it's very professional of ATI to go out and chat with people about the problem found earlier.
ATI ALL THE WAY. Once you've had ATI you won't go back, regardless. In regards to this thread, they need to do a poll; that way it's easier to see how many care and how many don't.
*shrugs* I guess they should have been more clear and up front about the process, but other than that I don't care. And I don't care much about that, really. Better performance without sacrificing image quality is a win in my book.
I think Tech Report is full of shit. They start bashing ATI for not giving out proprietary information that ATI's R&D developed, and why should they? This optimization clearly is not affecting IQ; if it clearly damaged the IQ, then I would say yes, ATI is bad for doing so. This comes down to "the all-fuxing-important reviewer". Who do these people think they are? They are not all that important. Just because they didn't read the PDFs and didn't have ATI deliver the information to them on a silver fuxing platter, that entitles them to slander a company? I don't goddamn think so. What's next? Hey, I think ATI/Nvidia should just give us the schematics for the R420 or NV40 so we can really review their technology. ATI has not done anything wrong, and because some back-assed reviewer hasn't read all the info, that entitles them to something special? Nvidia has done shady things before but is correcting its mistakes; what separates them and ATI is that ATI admitted to the algorithm where Nvidia denied everything.
Bottom line: an optimization that increases performance without degradation of the experience (i.e. IQ) is entitled to be considered a valid and legit optimization. I believe Tim Sweeney said that. (I may be wrong on that quote, though.) If it's not his, I claim the quote.
Personally? The "press" lies to us enough. So screw em. Oh, I shouldn't say lie. I should say the tainted truth due to heavy biases. You can't trust online sites for tech info anymore. At all.
The main thing these folks seem to be missing is the fact that you can still enable regular trilinear. And what the hell, I mean trilinear filtering hasn't changed in 10 years. Shouldn't advancements in it be welcome?
It's no wonder nVidia has a wall of PR people. You have to deal with all these stupid people that go crusading on a whim and screaming for blood.
Since I only partly understand what's going on with the whole trilinear optimization business, and the sometimes overboard, over-the-top, mean-spirited accusations made by all the fly-by-night web/tech sites, I'll just say that if any company claims their products have certain attributes and features, they'd damn well better deliver them. If they don't, they won't be in business.
Now, what I do know is this. In my opinion and views, the following:
The good things about ATI:
1.) Since the 9700 Pro came into my possession some 1.5 years ago (and now I also own the 9800 Pro and 9800XT), I have been consistently pleased, and sometimes pleasantly shocked, at how well the Radeon products render 3D images. With the games and apps I play and use, Anisotropic Filtering and FSAA make extremely minimal compromises to the overall performance and speed of the images. The I.Q. of all my Radeons is for the most part incredible. The 9800XT, while not a big jump in performance over my 9800 Pro, makes up for that by having outstanding I.Q.
2.) ATI is one of the handful of companies, in any industry, that actually seem to value their customers and the customers' needs, desires, wants and feedback. Just guessing, but I would say 95% of the companies in the USA and beyond, be they makers of graphics cards, other PC components, or someone who makes TV dinners or condoms, care little if any about the customer after they have the money in their tight little fists. ATI actually giving a crap about who they are selling to is consistently shocking to me.
In my view, the bad things about ATI:
1.) What's with the new X800 Pro and X800XT names? I know they are following the old Roman numeral system by using the X as a 10, in effect, but really, come on, it's kinda stupid. To me, the product names X800 Pro and X800XT seem like in-house, beta "project" names. Kind of reminds me of Tandy/Radio Shack. For years that company has been trying to find itself, all the while watching its stock value and profits take dives. Well, to me, it's real simple: who the f**k wants to go into any place named "Shack"? Hello?!!!!!! Anybody freaking home?!! Despite the old adage, there is much to a name. (And yes, I have a lot of sales/marketing experience, and could come up with much better and more attention-catching themes, slogans and ideas. For a price.)
2.) ATI, and really even nVidia, need to slow down their production of "next generation" cards. The software out today rarely challenges the products that were made a year ago, much less 6 months. Painkiller is one example: it plays well above necessary speeds on my Radeons at 8x forced Aniso and 4x AA at 1280x960. Far Cry, I understand, is the most graphically sophisticated game out right now. I tried to get the demos to run, just to see, but they would lock up at the loading screen. Since I didn't care much for the genre and theme of the game, I didn't dive too deeply into what might be the problem. (*Can someone tell me what happened... a patch? Glitches in the demos?) The consumer needs time to breathe. The software people need time to catch up with the technology and really exploit what's already out there. And a new product should perform at least twice as well as the previous one. 20-30% faster or better is not much, and will tire the consumer out quickly.
it surprises me that some of you don't think the optimizations turning off during the colored mipmap test is the problem. turning them off for the colored mipmap test hides the optimization's effects, thereby keeping it a secret and not telling people about it. they detected when a colored mipmap was being used and turned the optimization off, so that you wouldn't otherwise know it was running, since they make no mention of it anywhere else.
they also claimed to use full trilinear filtering all the time, which is clearly not the case.
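to make it concrete, the detection being described would amount to something like the sketch below. this is pure guesswork on my part: ATI never published the heuristic, and every name and threshold here is made up for illustration.

```python
def _avg(tex):
    """Mean value of a small greyscale texture (list of rows)."""
    return sum(map(sum, tex)) / (len(tex) * len(tex[0]))

def mips_look_hand_colored(mips, tolerance=0.1):
    """Guess whether a mip chain was generated by normal downscaling or
    hand-painted (like a reviewer's colored-mipmap test). Purely
    illustrative -- the real driver heuristic was never published."""
    # A properly downscaled level keeps roughly the same average
    # brightness as its parent; flat rainbow test levels do not.
    return any(abs(_avg(fine) - _avg(coarse)) > tolerance
               for fine, coarse in zip(mips, mips[1:]))

def choose_filter(mips):
    """The adaptive scheme as described in this thread: cheaper
    filtering on ordinary textures, full trilinear whenever the chain
    looks hand-colored (i.e. someone is probably inspecting it)."""
    return "full_trilinear" if mips_look_hand_colored(mips) else "adaptive"
```

in other words, exactly the behaviour people are complaining about: the one case that gets honest full trilinear is the case where someone is looking.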
i try not to keep my views biased, but this is indeed a big deal, as this company has just deceived everyone, intentionally. i will buy the fastest card; brand and make don't make a huge mark on my decision. right now i'm thinking about the 6800 Ultra, as it won in all of the tests in CGW and also has PS3.0 support and 128-bit floating point precision. true, it is a fat PCI-slot hog and a nice little PSU burden, but i can overlook those for performance, as i don't use all of my PCI slots anyway.
I've seen all the debates, and read ATI's chat log, and all the rest on this subject, so now I'll put forth my opinion.
First thing is, while ATI has been very upfront about what exactly happens with this optimization, it DOES invalidate some of the scores that all the reviewers have been posting (which is why some of them are so unhappy, and going rather overboard on hammering ATI over it). My reason for saying this is simple: since ATI's optimization was running on the X800(XT) versus the 6800U where it was specifically manually disabled, all of those reviews that were doing their best to present actual card speed differences in an "apples to apples" comparison did so inaccurately. With modern vidcards it's getting harder and harder to achieve "apples to apples" comparisons anyhow, but if the reviewers had known, they would at least not have presented the results as such.
What is needed to fix this "issue" at this point in time is simply a way to turn it off. While I suspect it will have minimal impact on the speed of the vidcard, it would allow the reviewers to correct any results they got from the benchmarking, as well as make all of the rainbow tunnel screenies they subject us to show the actual visual differences between the big two's vidcard offerings, instead of what we have now, where the screenies we are shown do NOT exactly match what a game looks like while it's normally running.
ATI's optimization is a GOOD THING. But with that said, they have known that the reviews posted since they put this in place, which showed the good-looking ATI filtering versus the fairly ugly Nvidia "brilinear" filtering screenies, were inaccurate. Obviously, they should have said so at the time, since they had to know about it. ATI didn't, and hasn't lied about this issue, but they do deserve the PR black eye they are getting from it, since they knew that reviewers were "getting it wrong" for the past two years and didn't correct them.
the fact that they got caught isn't good, but well... it's not like detecting apps and launching special tweaks for them. they are doing a general detection on mipmaps, which i think is a good idea. it's just bad that they got caught, but hey, it's still a great idea, isn't it?
ATI has better IQ, that's the truth. ATI rules.
Nvidia needs to work on its next video card if it still wants to sell video cards in 2006.
ATI's IQ does rule. But from what I have read, they instructed reviewers to disable Nvidia's "optimizations" to make it fair, while, unnoticed by anyone, ATI was running this. I believe it may have been a tactic on ATI's part to gain a performance boost. I love ATI, and an X800 will be in my system soon.
Nvidia and ATI have both been dishonest with their filtering. I still like both companies and will always buy the faster card, whether it be ATI or Nvidia; I have no favorites. I want the best and fastest card at the time of purchase. I think both companies have come a long way over the past several years, and I commend them on their awesome cards. I just hope both companies can learn from their mistakes and continue to make us the video cards they have made in the past couple of years.
I completely agree! I went from a GeForce -> GeForce 3 -> GeForce 4 4600 (remember, Nvidia was it back then) -> to my 9800 Pro -> (possibly) -> X800.
The concern I have is: when I buy a 400-500 dollar card, I don't want them using this optimization without telling me! When I think I'm getting TRUE TRILINEAR at 40fps or something, and without these optimizations I would get 2 (just a figure of speech)... I would like to use these things because they're nice, but I don't want them clouding my judgement of what the card can/could do!