
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Aw... No screenshots? :p
     
  2. sammorris

    sammorris Senior member

    The noise wasn't unbearable, but it was loud. The heat - well, it wasn't too bad with a window open in winter, but in summer the A/C was a necessity for a small room (and A/C is pretty rare in the UK).
    This isn't obviously anything especially to do with running 4 GPUs, it's simply that two HD4870X2s and an overclocked Quad Core run about 700W DC or 800-850W AC in gaming load. If you're having a moment of madness and you burn test all 4 GPUs and the CPU at the same time, you can pull about 930W DC (1050W AC). Since I only have an 850W PSU though, I cancelled that test a few seconds in when I saw the power draw.
    Graphics cards have got a little more power hungry since the HD4870X2 days, but the difference is, people tend not to run quad graphics as often, because it's also more expensive - you have to remember, those HD4870X2s had an initial MSRP of about $500, and I effectively paid $720+tax for the pair. As you will have seen, you can spend a lot more than that these days on single GPUs, and the current dual-GPU king of the hill, the GTX690, is $1000 by itself - so two of them would be $2000, almost triple what I paid for my quad GPU setup.
    If you did run two GTX690s though, you'd be looking at a good 50-100W or so more than the 4870X2 setup.
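The DC-to-AC conversion above is just the DC load divided by the PSU's efficiency. A quick sketch (the 88% efficiency figure is an assumption for a decent unit at high load, not a measured number):

```python
def ac_draw(dc_watts, efficiency=0.88):
    """Wall-side (AC) power implied by a DC load.
    efficiency=0.88 is an assumed figure, not measured."""
    return dc_watts / efficiency

print(round(ac_draw(700)))  # in the region of the 800-850W AC quoted above
print(round(ac_draw(930)))  # in the region of the 1050W AC quoted above
```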

    Back when the HD4870 and GTX280 were the most powerful GPUs out there, running QuadCF made things that were otherwise well out of reach, possible at 2560x1600. I was able to run Crysis Warhead with a fairly respectable level of detail at that res, when it would have been hopeless on a single card (although 1GB VRAM was a bit limiting here).
I think perhaps we should be wary of giving QuadCF too much praise though, because do you really need it right now? I'm willing to bet that once you see what's achievable with two HD7970s (or perhaps HD8970s at the time you'll be buying) you may not consider QuadCF necessary. The newer HD7970s are a clear 50%+ faster than my HD6970s, and although I could perhaps do with a little more processing power, I'm not feeling the pinch too hard yet [at least, not until I start playing the likes of Far Cry 3 and Crysis 3!]

The QuadCF system in a mid-tower was a very short-lived affair by the way, as during the initial 'troubleshooting' phase, I decided that perhaps 60°C+ was a bit much for my X48 chipset, so I relocated to a full tower in March 2009, a couple of months after I bought the second X2 (I'd been running one X2 since August happily in the mid tower with just a 520W Corsair HX).

The 'digital' in Corsair's digital PSUs refers to digital signal processors that analyse the output voltage and adjust the PSU's output on the fly - a little like load line calibration on motherboards, but in a PSU. To my mind, the more there is in a PSU, the more there is to go wrong, so I think it's a bit over the top.
     
  3. sammorris

    sammorris Senior member

    ACE COMBAT: Assault Horizon
    SLI Support: No
    Crossfire Support: Minimal (18% scaling)

    Average 60fps:
    1920x1080: Radeon HD5870/6950/7850, Geforce GTX480/570/590/660
    2560x1600: Radeon HD7970GE, Geforce GTX680
    CPU: AMD Athlon II X4 610e/any Phenom II X4/X6/any FX, any Intel Core 2 Quad/Core i3/i5/i7

    Minimum 60fps:
    1920x1080: Radeon HD6970/7870, Geforce GTX580/660
    2560x1600: No current cards capable (HD7970GE@55Hz, GTX680@50Hz)
    CPU: AMD Phenom II X4 940/X6 1090T, FX-4320/6120/6300/8150/8300, Intel Core 2 Quad Q6700/Q8200, i3 2100/any i5/i7

    Average 120fps CPU: Intel Core i5 Mk1 @ 2.90Ghz, i7 930/2600/3770 stock
    Minimum 120fps CPU: Intel Core i5 2500K @ 3.60Ghz, i7 2600K @ 3.50Ghz, i5 3570K/i7 3770K stock, i7 3930K @ 3.90Ghz
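A note on reading the scaling percentages in these entries: the figure is the extra performance the second GPU adds over a single card. A minimal sketch (the 50fps input is just an illustrative number, not from the benchmarks):

```python
def dual_gpu_fps(single_fps, scaling_pct):
    """Estimate a two-card frame rate from one card's frame rate:
    scaling_pct is the gain the second GPU contributes."""
    return single_fps * (1 + scaling_pct / 100)

# ACE COMBAT's 18% Crossfire scaling barely moves the needle:
print(dual_gpu_fps(50, 18))   # ~59 fps
# whereas near-perfect 99% scaling (as in DMC below) almost doubles it:
print(dual_gpu_fps(50, 99))   # ~99.5 fps
```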


    DMC: Devil May Cry
    SLI Scaling: 84%
    Crossfire Scaling: 99%

    Average 60fps:
    1920x1080: Radeon HD5830/6790/7750, Geforce GTS450/GTX550Ti/650
    2560x1600: Radeon HD5870/6950/7850, Geforce GTX470/560/650Ti
    3840x2160: Radeon HD6990/7970, Geforce GTX590/680
    CPU: AMD Athlon64 X2 3800+/any Athlon II/Phenom II/FX, Intel Pentium E2160/any Core 2 Duo/Quad/i3/i5/i7

    Minimum 60fps:
    1920x1080: Radeon HD5830/6850/7750, Geforce GTX460 768MB/550Ti/650
    2560x1600: Radeon HD5870/6970/7850, Geforce GTX480/560Ti/660
    3840x2160: Geforce GTX590/690, Radeon HD7850 Crossfire (No single AMD cards compliant - HD7970GE/HD6990@55Hz)
    CPU: AMD Athlon64 X2 5600+/any Athlon II/Phenom II/FX, Intel Pentium E2200/Core 2 Duo E4500/E6400/any Core 2 Quad/i3/i5/i7

    Average 120fps:
    1920x1080: Radeon HD6990/7850, Geforce GTX480/570/660
    CPU: AMD Athlon II X4 615e/any Phenom II X2/X4/X6/FX, Intel Core 2 Duo E8500/any Core 2 Quad/i3/i5/i7

    Minimum 120fps:
    1920x1080: Radeon HD6990/7870, Geforce GTX480/570/660
    CPU: AMD FX-6200@3.90Ghz/FX-8150@3.80Ghz/ FX-4320/8350 stock, Intel Core 2 Q9550/i3 2100/any i5/i7

    Video memory required:
    1920x1080: 512MB AMD, 640MB Nvidia
    2560x1600: 640MB AMD, 896MB Nvidia
    3840x2160 (estimated): 1GB AMD, 1.25GB Nvidia
     
    Last edited: Jan 28, 2013
  4. sammorris

    sammorris Senior member

    Crysis 3 Multiplayer Beta:
    Crossfire scaling: 89% at maximum load, 52% at playable load
    SLI scaling: 83-94% variable

    Average 60fps:
    1680x1050 HIGH (1.5GB VRAM required): Radeon HD7970, Geforce GTX590/660Ti
    1920x1080 HIGH (1.5GB VRAM required): Radeon HD7970GE, Geforce GTX590/680
    2560x1600 HIGH (2GB VRAM required): Geforce GTX690 (No single/AMD cards capable - HD7970GE/GTX680@35Hz, HD6990/7970@30Hz)

    1680x1050 VHI (1.5GB VRAM required): Geforce GTX690 (No single/AMD cards capable - GTX590@55Hz, HD7970GE/GTX680@50Hz, HD7970@45Hz, HD6990/HD7950/GTX660Ti/GTX670@40Hz)
    1920x1080 VHI (2GB VRAM required): Geforce GTX690 (No single/AMD cards capable - GTX590@50Hz, HD7970/GTX680@40Hz, HD6990/HD7950/GTX660Ti/GTX670@35Hz)
    2560x1600 VHI (2.5GB VRAM required): No current cards capable (GTX690@45Hz, HD6990@30Hz, HD7970GE/GTX590@25Hz, HD7950/7970/GTX660Ti/670/680@20Hz)

    CPU: AMD Phenom II X4 940/X6 1050T/any FX-series, Intel Core 2 Quad QX6850/Q8400/Core i3 2100/any i5/i7

    Minimum 60fps:
    1680x1050 HIGH: Geforce GTX590/680 (No AMD cards capable - HD7970GE@50Hz)
    1920x1080 HIGH: Geforce GTX590/690 (No single/AMD cards capable - HD7970GE/GTX680@45Hz)
    2560x1600 HIGH: No current cards capable (GTX690@50Hz, GTX590@30Hz, HD6990/7970/GTX680@25Hz)

    1680x1050 VHI: Geforce GTX690 (No single/AMD cards capable - GTX590@45Hz, HD6990/7970/GTX680@35Hz)
    1920x1080 VHI: Geforce GTX690 (No single/AMD cards capable - GTX590@35Hz, HD6990/7970/GTX680@30Hz)
    2560x1600 VHI: No current cards capable (GTX690@30Hz, HD6990/GTX590@20Hz, HD7950/7970/GTX660Ti/670/680@15Hz)

    CPU: Intel Core i5 7xx @ 3.50Ghz, 25xx @ 3.45Ghz, 34xx @ 3.20Ghz, i7 960/2600/3700 stock, No AMD CPUs capable


NOTE: Despite what we said earlier, the multiplayer beta for Crysis 3 shipped nvidia-biased 'out of the box' - the latest Catalyst 13.2 beta needs to be applied to see the results shown above on AMD hardware.
    This driver constitutes a 25% boost on HD5 series, 17% boost on HD6 series and 14% on HD7 series products. With this fix considered, performance of AMD/nvidia is equal.


    If you're shooting for an average of 60fps and don't mind periodic drops to 30 (bear in mind this is a multiplayer title), then the CPU load is fairly reasonable, any of the 2008/2009-gen Phenom IIs and Core 2 Quads can handle it.
    If, however, you want a fluid 60fps experience (and you have the graphics hardware to back it up, or the graphics settings low enough), then it's an Intel-only zone. Anything Sandy Bridge or Ivy Bridge, if it has 4 cores, just the turbo-boost alone will handle the game fine without needing to overclock.
    However, the original Nehalem generation can also hit the mark if overclocked into the higher 3s, not difficult for most of them except for a few of the original stepping i7 920s.
    Even with Ivy Bridge CPUs, 120Hz is not an attainable target in this game yet, so we need not focus on the graphics hardware required for such (It should be noted that the GTX690 can actually hit 130fps average at 1680x1050 High, but not minimum - and you wouldn't be spending $1000 in 2013 on graphics to run below 'High' at 1680x1050!)

    If we look to the future when the 'not yet capable's will start disappearing - a lot of this relies on Crossfire support improving. There is little that the GTX680 (and therefore, two high-end GTX600 GPUs) can't handle outside 2560x1600.
    At 1920x1080 High, two 1Ghz HD7970s will actually achieve a minimum of 60fps with crossfire in the state it's currently in, so this resolution on High is currently within either brand's grasp.
    At 1920x1080 Very High, we will need a more typical 90-95% scaling figure for two 1Ghz HD7970s to succeed. It seems likely that further driver revisions will have achieved enough, that by the time the HD8970 is released, two HD8970s will also be sufficient here.
    At 2560x1600 High, two HD8970s should just achieve the 60fps minimum here, assuming as previously stated, that crossfire scaling tops out in excess of 90% by the time of their release.
    At 2560x1600 Very High, unless quad-crossfire is successful in this title, and people can justify the expense/power/noise etc. we will be waiting at least an additional generation (HD9) before this will be smooth. Quad-SLI is an unknown here, two GTX690s may suffice if scaling is perfect, but history does not imply this. A 'GTX790' might be necessary here.

    This of course is multiplayer. Arguably the SP will probably be more demanding, but the significance of 60fps will be reduced...
     
    Last edited: Feb 2, 2013
  5. Estuansis

    Estuansis Active member

    lol 60FPS minimum. Sam you're nuts :p
     
  6. sammorris

    sammorris Senior member

Applies for all, but this in particular is a multiplayer benchmark. Do you like frame rates to drop below 60fps often in multiplayer games? Not sure I do.
     
  7. Estuansis

    Estuansis Active member

    Considering most can't even maintain the 60FPS average, I don't see it as much of a loss not to have the minimum. I've had astoundingly good BFBC2 games below 60FPS.
     
  8. sammorris

    sammorris Senior member

    What is the benchmark standard you aim for? Perhaps I should include some lower minimum frame rate results, but I'm not sure where to set the bar...
     
  9. Estuansis

    Estuansis Active member

    My standard is usually minimum 40FPS for most heavy games. BF3 maintains that standard at maximum settings with no AA and the lowest level of FXAA. Average is maybe around 60-70 with dips to the low 50s and high 40s. 40FPS minimum is an absolute worst case scenario. Crysis 2 manages minimum 30 absolute worst case and I would consider it quite smooth as it averages 50-60. Both games remain quite fluid. I wouldn't say so if they didn't. Also take into account that there are usually tweaks available for games that are simple to do and make a significant difference in how they perform. BF3 in particular is transformed by changing the RenderAheadLimit to 1 and disabling triple buffering through an ini command. It runs like a champ even with Ultra Textures on 1GB cards.
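For reference, the BF3 tweaks mentioned above go in a user.cfg in the game's folder (or can be entered in the in-game console). These are the two commands being described; as ever, test on your own setup:

```
RenderDevice.RenderAheadLimit 1
RenderDevice.TripleBufferingEnable 0
```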

    In a great many older and even more recent games based on well-used engines I can indeed manage minimum 60 easily. Heavy stuff like Crysis 2 and BF3 need a separate standard for most people though I would think. My tolerances are maybe looser than yours, but I do like to keep it above or around 60 most of the time. Minimum 60 is just too much. Even with the new reserves of power I've unlocked, it's simply unrealistic. I would rate my PC as maybe slightly above average as far as gaming PCs go, if a bit behind the curve.

    BTW Several newer Total War games and other RTSs are showing improvements from the 1090T as well. Supreme Commander Forged Alliance in particular skyrocketed in framerate. I think I can safely say that a large chunk of modern games support hex-cores quite well and take proper advantage of them. The extra two cores on the 1090T are not just for show. This thing is a beast. I've had good improvements in almost all of my games, sometimes quite large. I would be interested to throw in a better pair of video cards say 7850s or 6970s and see what happens.
     
    Last edited: Feb 4, 2013
  10. sammorris

    sammorris Senior member

    I think I should point out the last few games I've played have been hovering around the 40-50fps mark more often than being above 60 - it's simply that, disregarding 120Hz, a minimum of 60fps is the 'absolute max' for a title - if you can hit that, you have the best protection from scenes that suddenly drop the frame rate, and the experience should be wholly fluid throughout. For a more reasonable level of demand, that's what average of 60fps is for, but agreed on a reduced minimum fps, I may introduce that later.
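One way to make a 'minimum fps' figure less hostage to a single bad frame is to take it from the worst 1% of frame times rather than the single slowest frame. A rough sketch of the idea (the capture data here is invented for illustration):

```python
def fps_stats(frame_times_ms):
    """Average fps, plus a 'minimum' based on the 99th-percentile
    frame time, i.e. the threshold of the 1% slowest frames."""
    times = sorted(frame_times_ms)
    avg_fps = 1000 * len(times) / sum(times)
    p99 = times[min(int(0.99 * len(times)), len(times) - 1)]
    return avg_fps, 1000 / p99

# 95 frames at a ~60fps pace plus five 33ms stutters:
capture = [16.7] * 95 + [33.3] * 5
avg, low = fps_stats(capture)
print(round(avg), round(low))  # average ~57, 1%-low ~30
```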
     
  11. sammorris

    sammorris Senior member

    Dead Space 3
    SLI Scaling: 80%
    Crossfire Scaling: 85%

    Average 60fps:
    1920x1080 (512MB VRAM required): Radeon HD5830/6850/7770, Geforce GTX460-1G/560/650Ti
    2560x1600 (640MB VRAM required): Radeon HD6970/7870, Geforce GTX580/660
    3840x2160 (896MB VRAM required [est]): Radeon HD7870CF, Geforce GTX690 (No single cards capable, HD6990/HD7970GE@50Hz, GTX680@40Hz)
    CPU: AMD Athlon64 X2 3800+, Intel Pentium D805/E2140

    Minimum 60fps:
    1920x1080: Radeon HD5850/6950/7850, Geforce GTX470/560Ti/650Ti
    2560x1600: Radeon HD6990/7950, Geforce GTX590/670
    3840x2160: Radeon HD7970CF, Geforce GTX690 (No AMD cards capable, HD6990@40Hz, HD7970GE/GTX680@35Hz)
    CPU: AMD Athlon64 X2 4200+, Intel Pentium D820/E2140


    Going down to minimum 40fps didn't really seem necessary here...
     
  12. Enigma346

    Enigma346 Member

Ok guys. I need help on this. I play various games, shooters, MMORPGs and so on. I like newer games so I need a card that can push them very nicely. So this is where I am. I came down to the GeForce GTX 660 (not the Ti) or the Radeon HD 7870. I am leaning more towards the HD 7870. If I do decide on this card, is there a specific brand I should get? I mean there is Sapphire, Diamond, XFX and so on. Does one really outshine the others in terms of longevity and such?
     
  13. Red_Maw

    Red_Maw Regular member

I have owned both Sapphire and XFX cards, and as far as I know both are still working (I gave the Sapphire to a friend), but the quality of the XFX is much higher. The XFX runs much quieter and cooler than the Sapphire, to the point that I would probably not buy another Sapphire card without a very strong recommendation.

As for the 660 vs. the 7870, I think they are about equal in performance, so what I usually do is pick the one that does best in my highest-priority games. Hopefully someone with more knowledge of current-gen graphics will chime in >_>
     
  14. sammorris

    sammorris Senior member

    I personally would vote for XFX or MSI. The latter tend to be a bit stingy with accessories (e.g. not including crossfire connectors) but their cards seem fairly solid. Sapphire stuff has been pretty dreadful for the last few years - very high failure rates, often poor noise levels, and poor design choices (e.g. PCB-sharing with higher-end models, thus requiring more power connectors than all other brands for a given product).
The GTX660 and HD7870 are fairly well matched, both in price and performance. It really comes down to the sort of games you play most: if you often play games that give the GTX660 an advantage, get that. Otherwise, the HD7870.

Now that reference designs are becoming increasingly rare in the graphics industry, nvidia's lack of manufacturing quality is somewhat less of an issue, since non-reference cards - whichever brand they come from - are almost universally built to a poorer standard than reference, so neither side has an edge there. In all other aspects besides their unpleasant corporate attitude, nvidia have come good in recent years. Most of their products are now reasonably priced, their power consumption is very reasonable, and they are employing performance optimisations across the board that make their gameplay experience every bit the equal of AMD's, if not marginally superior.
    The behaviour of nvidia as a company still disgusts me, but there's now very little to criticise about their product lineup, and as AMD's successes in the industry continue to dwindle (e.g. the failure of the HD7990 to be released as a product, potential delays to the HD8 series, which already looked a very minor refresh compared to the GTX7 series as well as their CPU troubles) I suspect the graphics industry may start becoming increasingly one-sided at the high end.
What I can only hope doesn't happen is that AMD get left in the same position for graphics as they are already in for CPUs - i.e. only serving the midrange and low end of the market. That would leave nvidia free to charge whatever they please and stunt innovation at the bleeding edge. Bizarrely, most gamers seem fine with this: you're hard pressed to find any news article about graphics hardware these days that isn't peppered with comments calling graphics upgrades pointless because "until new consoles come out, you can max out any game without issue".

    As terribly inaccurate as that statement is, it's a commonly held belief, and unfortunately as you can see from the success of EA and Activision, what gamers say goes, even if it isn't what's right...

    I'll also point out that when discussing the next gen of consoles, we're hearing comments like 'It's not about the specs, it's about the gameplay' - the same remarks Nintendo used to fend off criticism about the outdated hardware in the original Wii. I fear the technological leap this time round will be nothing like as significant as last gen.
     
    Last edited: Feb 14, 2013
  15. Enigma346

    Enigma346 Member

Thank you both for the very helpful answers. When it comes to what a game would run better on... if I see it is sponsored by Nvidia, then a Geforce would probably be better for that game? Same for AMD? I play a lot of Guild Wars 2 right now, and from what I have seen, AMD supposedly plays Guild Wars 2 better because of the memory bandwidth. When it comes to the GTX 660 Ti and the 7870, they rival each other in that game, the 7870 being the cheaper card while providing about the same gameplay capabilities as the 660 Ti. Overclocked, from what I have read, the 7870 could actually perform better than the 660 Ti. I think I am going to go with the HD7870. One more question: I noticed the card is PCIE 3.0 x16. Would it work in my PCI Express 2.0 x16 slot?
     
  16. sammorris

    sammorris Senior member

Normally speaking, a game sponsored by nvidia (e.g. "The way it's meant to be played") suggests it will favour nvidia hardware, but not always - sometimes the reverse is true. It pays to check benchmarks before you buy, or if you're really not sure, ask here; we should be able to find some evidence of whether the game is biased to one or the other, or neutral. I'd avoid comparing the HD7870 to the GTX660Ti on price, as the GTX660Ti commonly surpasses the HD7870 and is really a worthy competitor to the much more expensive HD7950. The standard GTX660 and the HD7870 are more appropriate rivals.

    PCIe3 and PCIe2 cards/motherboards are interchangeable both ways.
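To put numbers on why the slot generation rarely matters for a single card, per-lane throughput after line-coding overhead works out as follows (rates and encodings are from the PCIe specifications; GB/s here is decimal):

```python
# Usable MB/s per lane: PCIe 2.0 runs 5 GT/s with 8b/10b coding
# (500 MB/s); PCIe 3.0 runs 8 GT/s with 128b/130b coding (~985 MB/s).
LANE_MB_S = {"2.0": 500.0, "3.0": 8000 * 128 / 130 / 8}

def slot_gb_s(gen, lanes=16):
    """Total one-way bandwidth of a slot in GB/s."""
    return LANE_MB_S[gen] * lanes / 1000

print(slot_gb_s("2.0"))            # 8.0 GB/s for a 2.0 x16 slot
print(round(slot_gb_s("3.0"), 1))  # ~15.8 GB/s for a 3.0 x16 slot
```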
     
  17. Enigma346

    Enigma346 Member

    Thank you once again for the quick answers. I appreciate the help and I have now chosen. :) Thanks!
     
  18. sammorris

    sammorris Senior member

Not a particularly great time for AMD's desktop graphics division.
    The HD8 series is now likely to be delayed by about 6 months until late summer 2013, and has already been announced to be a 'minor refresh' of HD7, similar to what we saw with HD6 versus HD5. This means that the flagship product of the HD8 generation will surpass the GTX680 by a small handful of percent. Meanwhile, the curiously named 'GTX Titan' has been announced to the press by nvidia, with claims of a 30% lead over the GTX680. Now nvidia are known for broken promises but if this makes it out in any significant numbers, the high-end of the graphics sector will be largely a one-horse race this time round. With AMD already having exited the high-end of the CPU sector, graphics may well follow, since their successes with games consoles this gen mean their income stream is more effective elsewhere.
    One brand at the top with GPUs is likely to cause the same sort of stagnation we see with CPUs, and is not good for the consumer. If there is no dual-GPU part from AMD this gen either, that will only make things worse.

    Edit: AMD have stated there will not be a dual-card version of the HD8 series until at least 2014, perhaps ever.
     
    Last edited: Feb 18, 2013
  19. sammorris

    sammorris Senior member

    OK, updated state of play:

    2013 will be rather a bewildering year for the desktop graphics industry.

Having now achieved "the fastest products at every price point", AMD have cancelled their next line of products. With the possible exception of a reference version of the HD7990, there will be no new products from AMD that advance performance frontiers in 2013. The HD8 series, as we knew it, has been cancelled.
    What will happen is as follows:

    New 'Sea Islands' cards will be released, but under the HD7000 series moniker still, and as low-midrange products.
Existing 'Southern Islands' cards will continue to be sold, with low-end derivatives rebadged as the HD8000 series for OEM system-building reasons.

    The highest-performing GPU from AMD will remain the HD7970 1Ghz Edition until at least the end of 2013, 18 months after its release.

    Meanwhile nvidia have released their single-GPU behemoth, the GTX Titan, to the world. With a $1000 price tag, it's near GTX690 territory, but as by far the most powerful single GPU card ever to be released, nvidia being themselves, have priced it 'appropriately' for that accolade. You can triple-SLI them if you so wish, providing the equivalent power of four GTX680s in three GPUs, or as a standalone card offering a little north of a 30% boost over one GTX680.

    Now here's the interesting thing. In a large number of games, the GTX Titan obviously far outstrips the competition (but not the non-reference HD7990s or the GTX690 excluding non-dual-graphics titles), but in several major titles, such as Sleeping Dogs and DiRT Showdown, this vast performance boost isn't even enough to surpass the HD7970 Ghz Edition.

    Running the numbers, more often than not the 1Ghz HD7970 is taking on the GTX680 and winning. I'm not sure when this happened, but it's rather caught me out. For the moment, AMD's 'we win at every price point' actually seems to have some merit to it.

    Why that's a license to rest on your laurels for 18 months I have no idea, and we know a refresh from nvidia is due in 2013, which AMD will have to compete with without releasing any new products. How well that will go down, I have absolutely no idea... Strange times ahead, but not particularly exciting from a gamer's point of view, since the fastest card at christmas in over 10 months time will be the same card many have already owned since last summer...

    If you were hoping for a latest and greatest to help tackle the heavy demands of Crysis 3 (the full game is better optimised than the beta by the way), then you'll be disappointed, at least for another year. All you can hope is that that crossfire setup you were lusting for becomes a bit more affordable...
     
  20. Estuansis

    Estuansis Active member

    Crysis 3 is a step up in graphical quality from Crysis 2 in every way. It rivals Crysis for the most part and beats it in several ways. Gameplay seems to be a bit disappointing though. Very short, and lacking creativity. Not a bad game by any stretch, but it seems Crysis 1 was the peak of Crytek's enthusiasm for the series.

As far as AMD goes, they are in the middle of a major restructuring, so it's no surprise. I would like to see their cards drop in price a bit. I'm not opposed to a new pair of cards, and the 7900 series is hardly slow, as you can see :p I quite like the performance numbers those cards generate, and would gladly buy them if prices were to drop some more. As it stands, high-end 7900s are still kind of expensive - a pair would be about $700 on sale. More reasonable are 7850s, which are faster than my cards by a wide margin and have a lot of the architecture improvements.

I doubt we are going to see them disappear any time soon. They have an ongoing deal in the console market, and we all know consoles generate income, be it directly or indirectly.
     
    Last edited: Feb 21, 2013
