Discussion in 'Reviews & Articles Discussion' started by craig5320, Oct 12, 2011.
OK, my Swedish isn't too hot, but I can't see any mention of specifics about the test platform. E.g. was SP1 used? Did they have all CPUs/test systems on the same drivers? Were the games patched? Timedemos? All new testing? And so on.
And I'll say again... how is our benchmark GPU limited if two different CPUs produce different FPS figures? If it were GPU limited, then surely both scores would be identical?
Also, I notice you didn't give us the link to your review yet. Can you provide it?
Well, as I am Swedish, I can translate some of the information.
First of all, you are correct that there is no information about the platforms besides what hardware they used; nothing on whether SP1 was used or how the systems were installed prior to each test. As far as I can tell they may have used the same drivers or they may not have; the information isn't there.
What they do say is that they downclocked all CPUs to 2.9GHz. Why is beyond me, since that's not how the CPUs were meant to run.
As for the games, all of them were tested with timedemos. There's no information about patches, and nothing states whether all the tests were made for this review or whether old results were reused.
So the NordicHardware review doesn't really prove anything, which pretty much proves your point, Veridian.
Great, thanks for the translation, Liqourice. Much appreciated... interesting to know about the 2.9GHz info too.
I find it really strange. I mean, I've never seen a motoring journalist compare two cars using only the first three gears... of course the stock settings are what's relevant.
I don't think the results are at 2.9GHz, because BD wouldn't stand a chance against e.g. the older Thuban at the same clocks. The wPrime 32M score is at 7 secs, and the WinRAR score at 3XXX is where e.g. a 2600K should be. I just gave NH as an example because it gives you a better idea of how the CPU can influence a game when it isn't GPU bottlenecked... and it shows raw performance figures. Maybe we are talking about two different things here...
Also, Bluemak, I never said anything about testing at 800x600; 1600x1200 or 1920x1080 is normal... In fact you can even enjoy games on a Q6600 with a modern graphics card if you like. In a CPU review the intention is to show what influence the CPU makes. If there's hardly any difference at a given resolution and detail level, why not go lower to show a clearer difference? Because when you upgrade the GPU later, what's under the hood might suddenly make a difference...
As for the differing results, maybe the AMD platform is just better when being hammered in the graphics department with an ATI GPU onboard. Better driver support due to the newly released drivers? I don't have the 6950 here, only a 5870. I will go for the GTX 580 anyway to see if I can spot a difference...
But if the systems are the same (where possible), with the CPU as the only difference, then why does the higher resolution matter? Why is it a problem to show which CPU performs better that way, for gaming benchmarks? If a system with CPU A and, say, a 6990 gives 80fps at 1920x1200 and a system with CPU B gives 120fps, doesn't that give a clear signal which CPU is faster? It's not as if the harder the GPU load, the less important the CPU is... because if that were the case there wouldn't be an issue to talk about.
I just don't get where the problem is.
We are talking about gaming benchmarks; you can't take the game out of a gaming benchmark, and you only play games at resolutions that make sense. If you have an old GPU then perhaps there would be a problem, but in that case, why would anyone care how the CPU performs in games when they should clearly upgrade the GPU first to be able to make use of the CPU at all?
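For what it's worth, the "GPU limited" argument being debated here can be pictured with a toy frame-time model. This is a deliberate simplification (real engines pipeline CPU and GPU work), and every number below is invented purely for illustration:

```python
# Toy model: a frame can't be displayed until both the CPU and the GPU have
# finished their share of the work, so FPS is roughly 1000 / max(cpu_ms, gpu_ms).
# All timings are made up for illustration only.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_a, cpu_b = 8.0, 12.0   # two hypothetical CPUs' per-frame cost in ms

# High resolution: the GPU needs 25 ms per frame, so both CPUs report 40 FPS.
print(fps(cpu_a, 25.0), fps(cpu_b, 25.0))   # 40.0 40.0 -> "GPU limited"

# Low resolution: the GPU needs only 5 ms, and the CPU gap becomes visible.
print(fps(cpu_a, 5.0), fps(cpu_b, 5.0))     # 125.0 vs ~83.3 -> "CPU limited"
```

On this model, two CPUs producing different FPS (as in the earlier post) simply means the test isn't fully GPU limited for at least one of them; identical FPS is what a truly GPU-bound test would show.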
As someone who has been reading reviews on this site since before it got its new name (back when it was still called DriverHeaven.net), I have to say that this site has some of the best testing methodology when it comes time to review new gear. That doesn't mean I always agree with the review conclusions, though, and this review is such a case. However, that doesn't mean the review is biased in any way, shape, or form. There is absolutely nothing stopping me from looking at the various test numbers and coming to my own conclusion. In fact, since each of us has different needs and wants, it is very difficult to come up with a conclusion which will satisfy everyone. This is especially true in the case of this processor, which seems to perform on par with a 2600K in gaming while not being as good in encoding and general processing.
Although I am not in the market for a new processor right now (still rocking an i7-930 OC'd at 3.8GHz), I don't think I would buy the FX-8150, because I am very interested in encoding and general computing performance. However, if I already had a compatible AMD-based motherboard, this might be a worthwhile upgrade.
My only real disappointment with the review was the lack of any commentary regarding the excessive power consumption during overclocking. If I was a serious overclocker, I would be very concerned about the much larger power draw and, in turn, more heat produced by the FX-8150. Still, the review gave the power draw numbers, so once again readers can come to their own conclusions.
Actually, encoding and rendering seem to be where the strengths of this processor lie. Consider the 10 second difference between the 2600K and the 8150. I know there is about a $50 US difference in price between the two, but considering that the next processor down in Intel's line is the 2500K, and that the 2500K doesn't have Hyper-Threading, the gap between the 2500K and 2600K in this type of application will be larger than 10 seconds. So if a person were picking between a 2500K and the 8150 for encoding, the 8150 would clearly be the better choice of the two. I would also note that the 8150 seems to be better in other modelling and encoding applications compared to the 2600K, and is almost always faster than the 2500K. The exceptions are unoptimised encoding applications like iTunes.
So I guess its true strengths will lie in heavily threaded applications and computer games.
I am by no means currently a fan of AMD... I'm a fan of what is efficient, what works, and what is currently showing real-world figures. I'm not a "brand" fanatic, and while I used AMD exclusively for many more years than I did Intel, I'm fairly familiar with both. Fact is, I've always had a good experience overclocking AMD CPUs too...
But down to the facts... While yes, you could run games at the lowest possible resolution/settings/etc., hell, why not go full wireframe and disable some of the effects/settings that aren't typically disableable?
I personally don't think it would be all that difficult to take the short amount of time to set the graphics and other game-specific settings to their lowest values and rerun the tests, just to see what kind of figures pop up. But again, as has been repeatedly mentioned, how many people buy anything and run it at such settings?
Even if you took the most modern games today and ran them at the lowest possible settings, the CPUs still wouldn't be loaded to capacity, nowhere near it... You'd have to run multiple instances of the game in tandem, something not easily accomplished.
The entire review included a video card throughout, so it doesn't seem entirely practical to include ultra-low settings just to pull triple-digit figures that have no practical place in benchmarking. It's not even realistic to call it a real-world test.
But as I said, it's not terribly difficult to add an extra set of graphs with ultra-low settings included, specifically for CPU-limited scenarios.
The last time I took a good look at low-detail CPU comparisons was back in the pre-hardware-T&L days, when CPU power played a gigantic role in games; much of the graphics, although accelerated, was still handled mostly by the CPU. Much of that work is now unnecessary and doesn't even exist any more.
I wouldn't mind getting a BD if it really matches your review. I do find it curious, though, that you're basically the only major website that shows BD in this favourable a light.
This doesn't exactly look like a CPU that bests 2600K in gaming (from Xbit Labs):
As has been said before, it's not really possible to compare reviews against each other. In the case of those two examples that you quote we can't really compare, as Metro and SC2 are not in our review; Xbit also stop at 1680x1050 and don't list whether they use AA/AF or not. Not quite going down as far as 800x600 low detail, but equally not really stressing the full system.
What I would say is that in the one test they did run at 1920x1080, which was 3DMark Extreme, the FX and 2600K are almost identical in performance (just as our results showed).
To be honest, I'm disappointed. The gaming benchmarks give the false impression that BD is on a par with the 2600K, when in reality all that is going on is that the GPU is fully loaded in these tests. If you were genuinely looking to show that the CPU doesn't matter at these settings, I think your argument would have been better served by also comparing other, slower CPUs, which some of the other reviews have done. At least that way readers can see that they don't even need to spend BD or i7 money to get the same performance.
Hey, I'm happy with my T1055 chip, OC'd at 3.05GHz with turbo up to 3.6GHz. It never goes above 40C watching video etc. I haven't gamed that much on the PC side yet.
Seriously, there are like 4 or 5 now after the first one :uhoh: I wouldn't be surprised if all the accounts are from the same guy... if that's the case, please just go and get a life.
Also, I expected more from Bulldozer, but it seems OK as a budget option against the Intel solutions.
Yup, that's how a proper CPU review is done. Lower the settings to let the CPUs flex their muscles; all HH have done is a GPU benchmark that proves nothing and gives the misleading idea that Bulldozer is decent at gaming, which it isn't. It gives lousy 2008-era Phenom II performance for more ££ and more power.
To quote someone else's good analogy:
"Comparing CPU performance at 2560x1600 with all the goodies turned on is like comparing car performance during rush hour traffic in Los Angeles. Sure, it may be my real world experience, but it's a shit test for determining which car is faster."
I too was puzzled by the high score (9/10) - but thankfully I also read 25 other reviews. This site must see something that others don't.
Yup, and thats what makes us awesome. Thanks for visiting, you have enhanced your life.
Did you read the AT page on turbo core? You might be better off going for a slightly higher base clock-speed.
Hi, first time poster wanting to ask these other first time posters a question.
What makes you think that reducing graphics gives you a fair system?
I'm a 'hobbyist game programmer', so while I don't have years of expertise behind me, I'd like to submit two points:
1) Reducing graphics options can quite easily reduce CPU usage.
2) FPS is NOT linear.
Because of this:
You cannot be certain that a 'low res' FPS result would still apply at normal graphics settings.
You cannot make assumptions about performance - 60 vs 120 FPS at low settings could in theory translate to nothing more than a 5% improvement at real settings.
It's an important and valid testing option - but to claim it's more valid or more important than actually testing the games at real world settings is a fundamental misunderstanding of testing philosophy. Real world testing gives you facts, 'artificial' gives you nothing more than an indication.
[There are of course some exceptions to these rules - but you have to obtain guarantees first. If a test is dependent on a certain number of frames/game ticks (E.g. SupCom replay) then the FPS is more likely to be linear]
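To put some illustrative numbers behind points 1 and 2, here is a deliberately crude sketch in which per-frame CPU and GPU costs simply add up (pessimistic, since real engines overlap the two, and every figure below is invented):

```python
# Crude additive frame-time model: total frame time = CPU cost + GPU cost.
# It shows why a large low-resolution FPS gap need not survive real settings.
# All numbers are invented for illustration only.

def fps_from(cpu_ms, gpu_ms):
    return 1000.0 / (cpu_ms + gpu_ms)

cpu_fast, cpu_slow = 8.3, 16.7   # ~120 FPS vs ~60 FPS when GPU cost is tiny

# "Low res": almost no GPU work, so the fast CPU looks twice as quick.
print(fps_from(cpu_fast, 0.1), fps_from(cpu_slow, 0.1))      # ~119 vs ~59.5

# "Real settings": 100 ms of GPU work per frame dwarfs the CPU difference.
print(fps_from(cpu_fast, 100.0), fps_from(cpu_slow, 100.0))  # ~9.2 vs ~8.6
```

In this toy model, the 2x lead at low resolution shrinks to under 10% once the GPU dominates, which is exactly the sense in which a 60 vs 120 FPS gap need not carry over to real-world settings.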