From GTX260+ 216 OC worth upgrading to GTX 285?

Discussion in 'NVIDIA Graphics Cards' started by LilRazor, Jul 10, 2009.

  1. LilRazor

    LilRazor Active Member

    Joined:
    Jul 30, 2008
    Messages:
    158
    Likes Received:
    6
    Trophy Points:
    28
    Hi, is it worth upgrading from a Galaxy GTX 260+ 216 Overclocked to a GTX 285? Or am I just wasting my money for little to no performance difference?

    Or should I spend my money on an SLI motherboard and another GTX 260,
    along with a 24-inch LCD monitor?
    Thanks in advance.

    The GTX 285 is tempting T.T
     
  2. Teme

    Teme Super Moderator

    Joined:
    Dec 22, 2004
    Messages:
    8,496
    Likes Received:
    175
    Trophy Points:
    73
    Thread moved to nVidia cards forums.
     
  3. mike2h

    mike2h New Member

    Joined:
    Nov 11, 2002
    Messages:
    6,359
    Likes Received:
    69
    Trophy Points:
    0
    Well, another motherboard (assuming you get a quality one) and a good 24" monitor are going to cost you more than the 285. That being said, I don't think a motherboard upgrade is really that cost-effective for you unless you go to the i7 platform, which entails even more money.
    The upgrade to the i7 platform is what I would save for personally. It will cost you more, but it will be well worth the extra expense. Then down the road you could get another video card and use your 260 either in SLI or as a PhysX card, depending on which way you go.
    Of course, if you have some crappy 17" monitor, or an even crappier 19", I would probably make that my priority, as I believe most people don't place enough emphasis on the main interface with their computer: their monitor.
    Don't know if that helps much :)
    By the way, the 285 will give you a noticeable improvement over your 260. Of course, some of that improvement depends on the size/quality of your monitor.
     
    Last edited: Jul 10, 2009
  4. PH3N0M

    PH3N0M The Master of Sarcasm

    Joined:
    Aug 15, 2008
    Messages:
    348
    Likes Received:
    23
    Trophy Points:
    0

    Save your cash and purchase a Core i7 setup. In my opinion, it's rather pointless to spend money on outdated hardware such as the Core2Q line of chips. I'm not saying they're shit; it just makes more sense to upgrade to NEW technology instead of sinking money into older technology.

    Case in point: I was considering upgrading my Q9550 to the QX9770, but there was one problem - the QX9770 is still $600+. I decided that for that kind of money, I could upgrade to the Intel X58 chipset and a Core i7 920 with 6GB of Corsair Dominator DDR3-1600. As you can see from my system specs, that's the route I chose rather than spend money on older technology.

    My advice, save your money and go with the Core i7 setup. Then, purchase a bigger monitor, and finish it off with purchasing another GTX 260.

    Trust me, you'll be glad you did!
     
  5. Trusteft

    Trusteft HH's Asteroids' Dominator

    Joined:
    Nov 2, 2004
    Messages:
    24,010
    Likes Received:
    3,831
    Trophy Points:
    153
    PH3N0M, did you actually get a performance increase going from the Q9550 to the i7 920?
     
  6. PH3N0M

    PH3N0M The Master of Sarcasm

    Joined:
    Aug 15, 2008
    Messages:
    348
    Likes Received:
    23
    Trophy Points:
    0

    Oh, hell yeah. For example, I fired up Prime95 the other night and I was able to play COD4 and do other things without my CPU even once acting as if it were overloaded. I can't believe how much of a beast this CPU is. And, yes, all of the cores were maxed at 100%.

     
  7. LilRazor

    LilRazor Active Member

    Joined:
    Jul 30, 2008
    Messages:
    158
    Likes Received:
    6
    Trophy Points:
    28
    o.o So I should go straight for Core i7?

    Will this do?
    Core i7 920 2.66GHz
    Corsair Dominator 6GB 1600MHz
    Asus P6T
    24" LCD HP w2407
    and another GTX 260+ or a GTX 285
    Is that OK?
    And how much of a performance increase will I see with 2x GTX 260 vs. a single GTX 285?
     
  8. mike2h

    mike2h New Member

    Joined:
    Nov 11, 2002
    Messages:
    6,359
    Likes Received:
    69
    Trophy Points:
    0
    That's the ticket right there, though I would take a good look at the Intel X58 board too.
    IMO, I would go with the 285 and use your 260 for PhysX (assuming you play the types of games that are adopting it quickest), and then when it starts to slow down a little, get another 285.
    I'm not sure if the 285 supports tri-SLI, but if it does, then down the road you would have two 285s in SLI with the 260 doing PhysX.
    FYI, the 260 is way overkill for PhysX alone, so an option would be to sell it (assuming you go the 285 path), get a lower-end 9000-series card, and pocket the extra cash. I'm just not sure whether the extra cash you might get would make that worthwhile.
    And of course, getting another 260 and going SLI that way is a VERY good option.
     
  9. kris23

    kris23 Going Insane.....

    Joined:
    Dec 30, 2006
    Messages:
    5,984
    Likes Received:
    152
    Trophy Points:
    73
    GTX 260s are cheap; I recommend going that route....

    If I had an i7, I'd go with triple 4870s in CrossFireX since they're so cheap now..... $130 a pop.
     
  10. mike2h

    mike2h New Member

    Joined:
    Nov 11, 2002
    Messages:
    6,359
    Likes Received:
    69
    Trophy Points:
    0
    The problem with that is that ATI's cards are great performers, but CrossFireX is not as robust as SLI. That, and there is no real reason to go triple-anything if you are using mid- to high-end cards from either company.
    Plus, he already has a 260... and yeah, SLI 260s would give him the best bang for the buck right now.
     
  11. PH3N0M

    PH3N0M The Master of Sarcasm

    Joined:
    Aug 15, 2008
    Messages:
    348
    Likes Received:
    23
    Trophy Points:
    0


    Yes, I would recommend going with that setup, and since you already have one GTX 260, I'd recommend purchasing another for SLI. The two 260s will perform better than a single GTX 285, and you'd also save some cash.
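    Just as a rough back-of-envelope sketch of why the SLI route comes out ahead: assuming the usual ballpark figures people quote (a GTX 285 roughly 20-25% faster than a single GTX 260-216, and SLI scaling around 70-90% in games that support it - both numbers are assumptions, not benchmarks), the comparison looks like this:

```python
# Back-of-envelope comparison (hypothetical ballpark numbers, not benchmarks).
gtx260_perf = 1.00   # single GTX 260-216 as the baseline
gtx285_perf = 1.22   # assume the 285 is ~22% faster than one 260-216
sli_scaling = 0.80   # assume ~80% benefit from the second card in SLI

# Two 260s: one card's worth of performance plus the scaled second card.
sli_260_perf = gtx260_perf * (1 + sli_scaling)

print(f"GTX 285 vs one 260: +{(gtx285_perf - 1) * 100:.0f}%")
print(f"SLI 260 vs one 260: +{(sli_260_perf - 1) * 100:.0f}%")
# With these assumptions, SLI 260s land about 48% ahead of a single 285.
print(f"SLI 260 vs GTX 285: +{(sli_260_perf / gtx285_perf - 1) * 100:.0f}%")
```

    The caveat baked into the scaling factor is that SLI only helps in games with working profiles; in a title with no SLI support, the single 285 wins.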
     
  12. LilRazor

    LilRazor Active Member

    Joined:
    Jul 30, 2008
    Messages:
    158
    Likes Received:
    6
    Trophy Points:
    28
    Oooo, OK then, thanks!
     
  13. LilRazor

    LilRazor Active Member

    Joined:
    Jul 30, 2008
    Messages:
    158
    Likes Received:
    6
    Trophy Points:
    28
    OK, now I have another question =.= I can't seem to control the fan speed on my GTX 260+; it's always stuck at 40%. RivaTuner and the EVGA programs don't work... Why is that?
     
  14. PH3N0M

    PH3N0M The Master of Sarcasm

    Joined:
    Aug 15, 2008
    Messages:
    348
    Likes Received:
    23
    Trophy Points:
    0

    I use the NVIDIA System Tools with ESA Support. You can download it directly from NVIDIA using the link below.

    NVIDIA System Tools with ESA Support v6.05


     
  15. LilRazor

    LilRazor Active Member

    Joined:
    Jul 30, 2008
    Messages:
    158
    Likes Received:
    6
    Trophy Points:
    28
    I've downloaded it from the link you gave me, but I still see no difference... no fan noise increase, no temp drop, nothing!
     
  16. nicnik

    nicnik In the Land of Snow

    Joined:
    Oct 18, 2005
    Messages:
    569
    Likes Received:
    10
    Trophy Points:
    0
    I'd go with two 260s; just be sure to get a cool and quiet 55nm one, because chances are both cards will have to sit right next to each other, making the top one much hotter. As far as I know, it doesn't fit properly in the CM690 if you use slots 1 and 3 for graphics, though much depends on your motherboard.

    If you only have a limited budget, then a second GTX 260 will make the biggest impact on your system's gaming performance.
     
  17. LilRazor

    LilRazor Active Member

    Joined:
    Jul 30, 2008
    Messages:
    158
    Likes Received:
    6
    Trophy Points:
    28
    Well, it sure fits perfectly in my CM690 =D
     
  18. kris23

    kris23 Going Insane.....

    Joined:
    Dec 30, 2006
    Messages:
    5,984
    Likes Received:
    152
    Trophy Points:
    73
    Most likely the reason you can't control your fans is that they're probably hardwired to the main power..... It's a custom cooling solution, so it probably doesn't even register in the BIOS for that reason.
     
  19. PH3N0M

    PH3N0M The Master of Sarcasm

    Joined:
    Aug 15, 2008
    Messages:
    348
    Likes Received:
    23
    Trophy Points:
    0

    That's because you have to open the NVIDIA Control Panel, click VIEW at the top of the window, select DEFINE CUSTOM VIEW, and check everything you want to adjust settings for. You will need to accept the End User License Agreement, and then click VIEW > Define Custom View and select the device settings, before it will allow you to adjust ANYTHING on your video card(s).
     
  20. LilRazor

    LilRazor Active Member

    Joined:
    Jul 30, 2008
    Messages:
    158
    Likes Received:
    6
    Trophy Points:
    28
    I've done that already. I set the fan speed to 100% and clicked Apply; still no difference.
     
