Just a side note: after this build I might not even go for a high-end build for my next PC. I have finally realized it is about gameplay and having fun, not eye candy, especially after what Nvidia did to us 2xxx series card owners. Granted, I did buy the RTX 2080 when it was about halfway through its lifetime cycle. The thing that pisses me off is that my laptop is still relatively new and has an RTX 2060 in it, which at the time I got it I actually thought was good value.
The 30 series didn't suddenly destroy your cards. Edit: I do agree though about not going for the top end. It's pointless unless you are rich.
Videos are funny, but I see no reason for any 20-series owner to be upset. There's no logic behind it. Did you expect no advancement?
Advancement, I expected. The price of that advancement I did not expect. I paid just over 800 bucks for my card, and it had already been out for some time. If I had waited and upgraded this cycle, I could have paid around 600-700 and gotten a card twice as powerful.
A 970 to a 1660 is what, a 30% bump in performance? Turing also has the better streaming capabilities, if I recall, with better encoding quality on the RTX 2060. I just think it would have been wiser to wait another month and see what AMD launched and how Nvidia responded; prices likely would have started dropping to make room for the newer GPUs. Also, it's best to leave the existing Nvidia drivers installed, install the card, let it pull the driver in, then update the drivers.
So the issue, if any, is with you not waiting, not with Nvidia. It's absurd to blame them because you didn't wait when you knew the new cards were coming. It's like me complaining about getting the 1660S when a "2660" comes out next year with double the performance of my 1660S for the same price or even a bit lower. How is that Nvidia's fault? I could understand if the card you bought was a new release and a few months later they released the 3080, but that is clearly not the case here, is it?
The 1660 Super, according to multiple benchmarks/games I have seen over the last few weeks, is up to about 40% faster than a 970, which is a good bump. And as you said, the Turing improvements are also very important and worth it for me, as is the extra VRAM; at 1440p and in video editing, the "4" GB of the 970 hurts. True, AMD may release killer cards that force Nvidia to reduce prices, but either of those might not happen. So I'd rather spend these months not suffering, and yes, suffering is the word I am going to use, at least for the drop in performance in 4K video editing, among other things. Thanks for your input.
If I were you I'd have still waited, just sayin'. Edit: but you do have a point in saying that the next time you upgrade, it'll be your entire machine.
Yep. Plus, if for some reason I have money in a year or so from now, the next versions of the 30 series will be out, the Super or Ti or whatever.
I posted these in the "show your new stuff" thread, but here they are for you too. I am going to take a picture of the card later; I have to run some tests on the 970 first to compare them.
Cool stuff, though I would have liked to see the backplate; my card has a really cool design on its backplate as well.
I didn't have time to change it yet; I hope I will later today. I ran a couple of tests, not many, to see how my GTX 970 is handling things, with actual numbers.

World of Tanks: it's really difficult to test properly unless you run a replay, but I am not sure a replay puts the card under exactly the same stress as real-time play, so I decided to play some battles and also run the Encore benchmark tool. Ultra settings except for light quality, which I have at high, extra effects quality at high, grass in sniper mode disabled, FOV 120. During play at 1440p I got an average of about 78 (over 5 battles), min 57, max 103, 1% low 57, 0.1% low 45. The Encore benchmark tool at ultra settings, ray-traced shadows off, AA at ultra, 1440p: average 59.8, min 40.1, max 166.6, 1% low 39.5, 0.1% low 36.

Fallout 4, 1440p, max settings other than bokeh, lens flare, and motion blur, which I have disabled. 38 minutes of actual gameplay, exploring and fighting: avg 57, min 27.5, max 74.6, 1% low 28.8, 0.1% low 3.1. Mind you, the last two shouldn't be taken too seriously, as opening the Pip-Boy all the time skews them. Not bad actually; you don't really notice any issue with the game even when it dips into the 30s. I also played some at 1080p. I didn't record the results, but the min was in the 90s and the game ran at anything between 104 and 150 or so.

GTA V, max settings other than motion blur (unless the benchmark tool enables it on its own, I don't know).

GTA V 1080p max settings:
Pass 0: min 7.8, max 148.7, avg 78.2
Pass 1: min 31.0, max 122.7, avg 57.2
Pass 2: min 49.6, max 279.0, avg 72.9
Pass 3: min 49.0, max 190.5, avg 81.3
Pass 4: min 30.2, max 197.5, avg 78.9

GTA V 1440p max settings:
Pass 0: min 8.5, max 61.6, avg 47.4
Pass 1: min 21.2, max 118.1, avg 38.8
Pass 2: min 33.4, max 236.9, avg 44.1
Pass 3: min 16.9, max 193.5, avg 49.4
Pass 4: min 18.3, max 269.7, avg 49.0

The 1440p benchmark run felt slower than these numbers suggest. One thing worth mentioning is that VRAM was almost always maxed out, as was GPU utilization, at 99-100 percent. The CPU not so much; I think the highest I saw it go was 53%. It's a 2014 Intel i7 5960X (8C/16T, 3 GHz, boost to 3.5 GHz). I am now considering what other benchmarks to run on my GTX 970, as long as I still have the games, that is.
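For anyone wondering how the 1% / 0.1% low figures I listed are computed, here's a minimal sketch, assuming a per-frame frame-time log in milliseconds. The file name "frametimes.csv" and its one-number-per-line format are placeholders; real tools like CapFrameX or PresentMon export richer CSVs. It uses one common definition, the 1st / 0.1th percentile of instantaneous FPS; some tools instead average the slowest 1% of frames, so numbers can differ slightly between tools.

    # Sketch: derive avg / min / max / 1% low / 0.1% low from a frame-time log.
    # "frametimes.csv" (one frame time in ms per line) is a placeholder format.
    import statistics

    with open("frametimes.csv") as f:
        frame_ms = [float(line) for line in f if line.strip()]

    fps = sorted(1000.0 / ms for ms in frame_ms)  # instantaneous FPS, ascending

    def percentile_low(sorted_fps, fraction):
        # FPS value that only the slowest `fraction` of frames fall below;
        # with very few frames this degrades to the absolute minimum.
        return sorted_fps[int(len(sorted_fps) * fraction)]

    print(f"avg {statistics.mean(fps):.1f}, min {fps[0]:.1f}, max {fps[-1]:.1f}, "
          f"1% low {percentile_low(fps, 0.01):.1f}, "
          f"0.1% low {percentile_low(fps, 0.001):.1f}")

That's also why my Fallout 4 0.1% low of 3.1 is misleading: a handful of long frames from opening the Pip-Boy is enough to dominate the bottom 0.1% of a 38-minute log.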