
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. sammorris (Senior member)
    The HD7970 may not reach retailers for another two weeks, but the reviews are nonetheless out of NDA. The 50% increase (PR315 versus 210 for the HD6970) turned out to be exactly right (aren't I clever? :p), but not in the way that I expected.

    From a 'graphics horsepower' point of view, the HD7970 is actually only 30% faster than the HD6970, so in older/simpler games (including more basic modern titles like DiRT 3), that's all you get. However, the average is brought up to 50% by a much heightened resilience to tessellation, matching and besting that of the Geforces. AMD are no longer so vulnerable to the 'anti-AMD tessellation tax'. As a result, performance in high-end games like Battlefield 3 is up 50%, and in tessellation-biased games like Arkham City, a considerable 70% improvement can be had (PR350). Crossfire gains are much the same as the HD6 series: 95-100% in the games where it works well, less obviously where it doesn't :p.
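    For anyone who wants to check the maths, here's a quick sketch, treating the quoted PR ratings as a simple relative-performance index (my assumption, nothing official):

    # Rough sanity check of the percentages above, treating the quoted PR
    # ratings as a straightforward relative-performance index (my assumption).
    hd6970 = 210          # PR figure quoted for the HD6970
    hd7970_avg = 315      # average PR figure quoted for the HD7970
    hd7970_arkham = 350   # PR figure quoted for Arkham City

    print(f"Average uplift:     {(hd7970_avg / hd6970 - 1) * 100:.0f}%")     # ~50%
    print(f"Arkham City uplift: {(hd7970_arkham / hd6970 - 1) * 100:.0f}%")  # ~67%, roughly the 70% quoted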

    Power consumption at idle is down from 20W to 11W, and load consumption is similar but marginally higher at 220W vs 210W nominal. The HD7970 retains the same 10.5" size and 8+6 power configuration as its predecessor, and keeps the same rear I/O connectors, with the exception of losing the half-bandwidth (single-link) DVI port to allow a full-width rear exhaust slot. This likely explains the reduced noise level (a return to slower fans, it seems) and lower temperatures, despite the marginally higher power usage.

    The downside? The price. With no rival product from nvidia anywhere near the horizon, AMD are free to take advantage of the fact that they have a card that's reliably faster and more efficient than the GTX580. Since the GTX580 is still overpriced, the HD7970 will not be cheap - $550 or £425, and likely to be gouged on availability at first sale, so I would expect $570-$600 or £450 for the initial period into February, if you can get one.

    Still, there now exists a single card that's quiet, fits in most systems, and has the memory and processing power to run a considerable number of games in Eyefinity by itself, yet still retains the excellent potential to run in Crossfire without placing excessive demands on the rest of the system. It may not be the massacre of the previous gen that some may have hoped for, but the HD7970 is some very capable hardware.
     
  2. omegaman7 (Senior member)
    Sounds good, Sam :) I may just upgrade in February.
     
  3. sammorris (Senior member)
    Forgot to mention, they also support 'Eyefinity 2', which enables custom monitor configurations such as P20/L30/P20, and they are also capable of driving UHD displays from a single output.
     
  4. omegaman7 (Senior member)
  5. KillerBug (Active member)
    $570-$600? Unless you are gaming in 3D, I just don't see the point. My 560 runs everything with maxed-out settings at 1080p flawlessly, plus it has CUDA cores for desktop apps.
     
  6. Estuansis (Active member)
    Haha, yes and no. I run at 1920 x 1200 and 1080p, and even with Crossfire working excellently, many games are far from flawless. But bah, don't mind me, just being nitpicky :p
     
  7. omegaman7 (Senior member)
    Once again, I can't see the last page.

    And now I can :D

    Jeff, are you a videophile? :p
     
    Last edited: Dec 25, 2011
  8. Estuansis (Active member)
    Weee got BF3 for Christmas! Now to start my next multi-hundred hour addiction. Just broke 400 hours on Bad Company 2. Will probably be putting it to rest in the near future.
     
  9. omegaman7 (Senior member)
    They're already trying to phase out DVI, aren't they? I see monitors on Newegg that don't have it. Oddly, they have D-sub/VGA though. Rather strange. In any case, DisplayPort and HDMI are my main concerns anymore. I'm very interested in Eyefinity now. I'll probably buy two more (cheap) monitors in the not-too-distant future. I like the videos of Eyefinity in action ;) Probably get an HDTV too. You can probably imagine what I'd use that bad boy for. Although a projector is something I've always wanted as well!
     
  10. sammorris (Senior member)
    Well, no it doesn't. It just runs everything that you play maxed out, and that's fine for your purposes, so for you there would be no point in upgrading. For people who play more demanding games, though, there are plenty of reasons to own more powerful cards like this. In addition, it's not just nvidia GPUs that can accelerate desktop applications; both brands can do it.

    They're not phasing out DVI. Where you don't see it is normally on cheap displays - companies are still using VGA on the cheapest displays because it seemingly costs less. The only occasions you wouldn't see it on higher-end displays are when they have TV-style functionality and use HDMI along with some other standard-def connectors. DVI's not going anywhere yet, because HDMI was not intended to replace it, and DisplayPort, which was, has done a very bad job so far, as it's an unreliable technology (nvidia haven't even adopted it yet!). DVI's days will begin to be numbered once 3840x2160 displays start showing up, as they will require HDMI/DisplayPort to function.
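    To put some rough numbers on that last point (my own back-of-envelope figures, using the standard dual-link DVI pixel clock limit, not anything from a review):

    # Why 3840x2160 panels will need HDMI/DisplayPort: dual-link DVI tops out
    # at a 2x165MHz pixel clock, which isn't enough for UHD at 60Hz.
    uhd_pixels_per_sec = 3840 * 2160 * 60   # ~498 Mpixels/s, before blanking overhead
    dual_link_dvi_max = 2 * 165_000_000     # ~330 Mpixels/s

    print(f"UHD @ 60Hz needs:  {uhd_pixels_per_sec / 1e6:.0f} Mpx/s (plus blanking)")
    print(f"Dual-link DVI max: {dual_link_dvi_max / 1e6:.0f} Mpx/s")
    print("Fits over dual-link DVI?", uhd_pixels_per_sec <= dual_link_dvi_max)  # False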
     
    Last edited: Dec 25, 2011
  11. sammorris (Senior member)
    Accidental double post. My bad.
     
    Last edited: Dec 25, 2011
  12. omegaman7 (Senior member)
    Ahh, ok. I understand. Thanks ;)

    Eesh, February feels so far away :(
     
    Last edited: Dec 25, 2011
  13. Estuansis (Active member)
    Kinda rewarding to know my 2407WFP is essentially still at the head of the tech curve. Features-wise, it's still more than a match for any comparable new product.
     
  14. omegaman7 (Senior member)
    Ditto! I'm still a bit upset about my faulty SD reader, but what are you gonna do...
    There's no way I'm sending it to Dell for replacement. With my luck, the replacement would have something else wrong, e.g. dead/stuck/hot pixels.

    I definitely love my U2410. I've been doing a lot of Christmas projects with Photoshop. WAYYY better than that old Sammy I had!
     
  15. Hi, I am new to this forum...
     
  16. omegaman7 (Senior member)
    LOL! Is that right...
     
  17. sammorris (Senior member)
    I'd say the 2408 is, thanks to DisplayPort, as that does make Eyefinity possible, versus very difficult otherwise. Apart from that though, the UltraSharps are still 'it', even the 5+ year old ones. It kind of sickens me to think that my 3-year-old 3008WFP, which wasn't really new when it was purchased, is still the most technologically advanced monitor out there...
     
  18. omegaman7 (Senior member)
    I find that rather humorous. Disappointing as well. I'd like to see new technology released to compete with the Dell behemoth!
     
  19. Estuansis (Active member)
    I find it satisfying, as my rather expensive monitor is the only piece of the puzzle my other components have to work to match. Adding the 23.6" 1080p ASUS screen to replace the 22" Acer was icing on the cake. I can now play at 1920 x 1200 or 1920 x 1080, and the oldest LCD I own is by far the best. I'm strongly considering LED though. Most of the new flat-panel tech makes me yawn - plasma, 120Hz, etc, etc, boring. LED actually got me excited: the single greatest leap in quality since we went to flat panels. It really negates the CRT vs flat panel colors argument. Really, really beautiful. I might replace my secondary yet again with a larger LED for movies.

    I also decided not to make my planned jump to Sandy Bridge. Especially after some play with Skyrim and BF3 with Crossfire working properly, I don't see a need. Until I upgrade my video cards, my system has remained at a very happy zenith. It doesn't matter if my CPU is bottlenecking these cards if my games still average in the 60s maxed with AA... I daresay the "too little, too late" Phenom II has truly shown its colors. I might be arsed to hop to a 1090T if only for the OCing and extra 2 cores. Should keep me happy. Would definitely reduce the bottleneck to almost nothing.
     
  20. omegaman7 (Senior member)
    I feel about the same as you do, Jeff. I've strongly considered Bulldozer as well as the new platform/motherboard, but I just don't see the need at this time. I'm still considering the 1090T, and may just jump on it in February, but even then my performance gains would not be incredible. I would likely see gains in x264 and multitasking, but in other applications I may even see a slight drop in performance. The difference probably wouldn't be easily noticed though. Since I run my long encode jobs at night (while sleeping), I'm not sure a 6-8 core is even needed. I would, however, love to run encodes during the day while shutting off 2 cores; I generally don't run intensive jobs during the day because I need my CPU strength for other programs ;) With the 1090T, I may just get what I want.

    I'm hoping the price drops at least a little in February. Maybe even the 3TB drives - I see they're down a little. They don't cost a whole lot more than I spent on my first one. Of course, it's wise for me to get a board with UEFI first.
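    A quick sketch of the arithmetic behind that last UEFI point (my own numbers, assuming the usual 512-byte sectors): the old MBR partition scheme addresses sectors with 32 bits, which runs out well before 3TB, so booting from one of those drives generally means GPT, and booting from GPT generally means a UEFI board.

    # Why a 3TB drive pushes you towards UEFI/GPT: MBR uses 32-bit sector
    # addresses, so with 512-byte sectors it can't reach past ~2.2TB.
    sector_size = 512                    # bytes (typical for these drives)
    mbr_max_bytes = 2**32 * sector_size

    print(f"MBR addressable limit: {mbr_max_bytes / 1e12:.2f} TB")  # ~2.20 TB, short of 3 TB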
     
