
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    They are excellent at tessellation, when tessellation is all that is going on. Whether that ability can be put to use in a game environment without lag remains to be seen. On top of that, the current mediocre tessellation performance of Radeons is said to be a driver limitation, so their performance may yet improve.

    Prison Break: The Conspiracy
    Hardly a game I'm interested in, but included for comparison's sake.
    Minimal: Radeon X1800 series/HD2600XT/HD3650/HD4550/HD5450 or above, Geforce 7600GT/8600GT/9500GT/GT220 or above
    Reduced: Radeon X1900 series/HD2900 series/HD3690/HD4650/HD5570 or above, Geforce 7900 series/8600GTS/9500GT/GT220 or above
    Moderate: Radeon HD3800 series/HD4700 series/HD5570 or above, Geforce 8800 series/9600GSO G92/GT240 or above
    Good: Radeon HD4700 series/HD5750 or above, Geforce 8800GT/9800GT/GT240 or above
    Optimal: Radeon HD4860/HD5770 or above, Geforce GTS250 or above
    Extreme: Radeon HD4870X2/HD5850 or above, Geforce GTX295/GTX470 (est) or above
     
  2. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    This is true, but I went to my friend's house to try it on his 5850. Tessellation is very cool when the effect is working properly, but it doesn't seem to work in Metro 2033, or is very poorly implemented and results in a massive FPS drop. Other than the respective motion blur and DOF shaders, DX9 looks exactly the same as DX11. I'll take FPS any day over better motion blur. Besides, the DX9 motion blur is great and the DOF is already a bit overdone to begin with.

    There is a theory, though, that the game is going to stay crippled until Fermi comes out, seeing as Metro 2033 was basically Nvidia's DX11 development testbed. Then tessellation will work properly, but only on Nvidia cards. Wouldn't be too surprised if that were true either :p

    Still waiting on driver optimizations as well. At the same settings a 5770 outperforms a 4890. Now, I understand it's newer technology, but the card just isn't that damn fast. A 4890 is way faster. Something is holding the HD4000 series back in Metro 2033, and it's not the lack of DX11.
     
    Last edited: Mar 22, 2010
  3. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Tessellation always causes a massive FPS drop because it's a huge amount of extra computation. Whether newer versions of Catalyst, or indeed the Geforce GTX400 series, can improve on this remains to be seen. Generally, tessellation tends to cause the same sort of performance drop as 8-12x AA.
    While it's quite possible (and even likely) that TWIMTBP games will be ATI-crippled from a tessellation perspective, the benchmark of Metro 2033 I've seen actually shows the game running better on ATI cards than on their nvidia equivalents. On top of that, I don't see any evidence of a 5770 beating a 4890 either.
    At 1280x1024 in DX10, the HD4890 has a 27%/29%/29%/23% advantage in minimum fps and a 26%/28%/26%/25% advantage in average fps at the respective detail levels. At 1920x1080 this becomes 24%/29%/24%/33% and 27%/26%/28%/22%. This all sounds about right. With DirectX 11 and all its associated features turned on, the additional graphics performance required to achieve the same frame rate is 54% at 1280x1024 and 49% at 1920x1080, so on the same card you'll be losing about a third of your frame rate by enabling all the DX11 effects (quick arithmetic below).
    As it stands right now, on minimum frame rates ATI are slightly ahead, with the HD4870 comparing not with the GTX260-216 but with the GTX280, and the HD4890 on a par with the GTX285. On average frame rates it's a closer call, with the typical equivalencies being accurate.
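    To spell out that arithmetic (a back-of-envelope working, not a figure from the benchmark itself): if DX11 needs k times the graphics performance for the same frame rate, the same card keeps 1/k of its fps.
        1280x1024: 1 / 1.54 ≈ 0.65 → roughly 35% of the frame rate lost
        1920x1080: 1 / 1.49 ≈ 0.67 → roughly 33% of the frame rate lost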
     
  4. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    The Settlers 7 (AA excluded)
    Low Quality

    Minimal: Radeon X1800 series/HD2600XT/HD3470/HD4450/HD5450 or above, Geforce 7600GT/8600GT/9500GT/GT220 or above
    Reduced: Radeon X1900 series/HD2900 series/HD3650/HD4650/HD5570 or above, Geforce 7900GT/8600GTS/9500GT/GT220 or above
    Moderate: Radeon HD2900XT/HD3800 series/HD4670/HD5570 or above, Geforce 8800 series/9600GSO G92/GT240 or above
    Good: Radeon HD3870/HD4700 series/HD5700 series or above, Geforce 8800GT/9800GT/GTS250 or above
    Optimal: Radeon HD4850/HD5770 or above, Geforce GTX260 or above
    Extreme: Radeon HD5850 or above, Geforce GTX295/GTX470 (est) or above

    Medium Quality
    Minimal: Radeon X1800XT/HD2900 series/HD3650/HD4650/HD5570 or above, Geforce 7900GT/8600GTS/9500GT/GT220 or above
    Reduced: Radeon HD2900 Pro/HD3800 series/HD4670/HD5570 or above, Geforce 8800 series/9600GSO G92/GT240 or above
    Moderate: Radeon HD4750/HD4850/HD5700 series or above, Geforce GTS250 or above
    Good: Radeon HD4870/HD5800 series or above, Geforce GTX280/GTX275 or above
    Optimal: Radeon HD5850 or above, Geforce GTX295/GTX480 (est) or above
    Extreme: Radeon HD5970 or above, Geforce GTX285 Tri-SLI/GTX480 SLI (est) or above

    Maximum Quality
    Minimal: Radeon HD2900 Pro/HD3800 series/HD4670/HD5570 or above, Geforce 8800 series/9600GSO G92/GT240 or above
    Reduced: Radeon HD4700 series/HD5700 series or above, Geforce 8800GTS G92/9800GTX/GTS250 or above
    Moderate: Radeon HD4870X2/HD5830 or above, Geforce GTX295/GTX470 (est) or above
    Good: Radeon HD5970 or above, Geforce GTX285 SLI/GTX470 (est) or above
    Optimal: Radeon HD5870 CF or above, Geforce GTX285 Quad SLI/GTX480 SLI (est)
    Extreme: Radeon HD5970 CF or above, Geforce GTX480 Quad SLI (est)
     
  5. cincyrob

    cincyrob Active member

    Joined:
    Feb 15, 2006
    Messages:
    4,201
    Likes Received:
    0
    Trophy Points:
    96
    Which 3DMark do you guys use to test your GPU? I've finally installed the 8800GTS (well, it has been installed, just not used). I've been playing COD MW2, actually just beat it. I noticed my temps got up to 63°C at max; I'd say that's pretty good. So I want to test it out and see what kind of score I get.
     
  6. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    3DMark06. My system scores around 17,000.
     
  7. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    3dmark06 is the typical standard. Vantage is less popular, as it requires at least Windows Vista or 7.
    In 3dmark06 my PC scores about 27,000; in Vantage it's 24,000. Today's average gaming PC scores around 10,000-12,000 in 06. Note, however, that the test is CPU-limited for a lot of people.
     
    Last edited: Mar 23, 2010
  8. cincyrob

    cincyrob Active member

    Joined:
    Feb 15, 2006
    Messages:
    4,201
    Likes Received:
    0
    Trophy Points:
    96
    Downloading it now.
     
  9. cincyrob

    cincyrob Active member

    Joined:
    Feb 15, 2006
    Messages:
    4,201
    Likes Received:
    0
    Trophy Points:
    96
    [image: 3DMark06 result screenshot]

    I think it said the FPS was 32 and 34; it had two sets of numbers.

    How does this fare with others?
     
  10. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Sounds about right for your system: high CPU score, low GPU score. You aren't likely to see much higher than that without a graphics upgrade. On the other hand, Jeff's system, which scores 17,000, would score 22,000 if he were to use a Q9550 instead of his Phenom II. That's not saying he should; that's just what I know the same graphics setup can achieve with that CPU.
     
  11. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    LOL! Is that because 90% of all software favors Intel, or because the processor is indeed that much better? :p
     
  12. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Simply because an overclocked Q9550 is that much more powerful than an overclocked 940, and when you have something as powerful as two HD4870s, 3dmark06 is entirely CPU-limited. When I had a Q9550 and upgraded from two 4870s to four, my score only went from 21,500 to 23,500; that's how limiting it is. It took my i5 to get me to 27,000.
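
    A rough Amdahl's-law style sanity check (illustrative only, and assuming doubling the GPUs doubles graphics throughput, which CrossFire scaling never quite manages): if c is the share of the benchmark bound by the CPU, then
        speedup = 1 / (c + (1 - c) / 2)
        23,500 / 21,500 ≈ 1.09 → c + (1 - c) / 2 ≈ 0.92 → c ≈ 0.83
    On that crude estimate, something like four-fifths of the run was waiting on the CPU rather than on the four 4870s.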
     
  13. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    So the 9550 is nearly 30% faster than the 940, WOW!
     
  14. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    Well, at the same clocks, yes. At stock I'd say they are pretty much equal.
     
  15. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    I see nothing wrong there. Respectable score. The 8800 series is still pretty good for gaming too. Not ideal, but quite capable.
     
  16. Red_Maw

    Red_Maw Regular member

    Joined:
    Nov 7, 2005
    Messages:
    913
    Likes Received:
    0
    Trophy Points:
    26
    I use 3DMark 2006 since it came free with my motherboard XD. 63°C is excellent for an 8800GTS; that's about my record low under load, with the high being closer to 110°C.
     
  17. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Not too worried. 3.7GHz on mine is a respectable speed and more than enough power to drive these cards. Later on I might get a new CPU, probably an i5, but first I'm getting a graphics upgrade. Twin 5850s pretty soon, I think... I saw just ONE in action and it was one of the most beautiful things I have ever seen. I'm giggling like a schoolgirl at all the leet pwnage :p

    This one was on a stock 940 as well, because the guy won't OC it. I can't say I blame him either; he wants everything to just work. But yeah, that means I might have some decent headroom with my OC :)

    Also, with the Q6600 and the 940BE both at 3.7GHz, the AMD is faster in games, snappier in the OS and booting, and runs cooler on the same voltage. The Intel is obviously faster in a few things, but not for my main use, which is pure gaming XD
     
    Last edited: Mar 23, 2010
  18. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Omega: The main issue is not so much the stock performance but the overclocking. The Q9550 and X4 940 are a mere 6-8% apart at stock, but the 940 will typically only overclock by around 25%, whereas the Q9550 will do at least that, typically 30-45%. In the instance I am comparing, Jeff's 940 is at 23%, whereas the Q9550 in question is at 41%.
    Ultimately, both are respectable CPUs at this performance level, and while the Q9550 would be considered an upgrade, it would be too slight to ever suggest to someone who currently owns a 940. Heck, even an Athlon II 620/630 owner would be difficult to convince, and rightly so. The Core i5/i7 range represents the only noteworthy upgrade from AMD's current quad lineup.
    As for the 3dmark score, I would generally class 8 or 9 thousand as a reasonable gaming-grade score, as that represents the typical performance of a brand-new high-end gaming PC in late 2006/early 2007. An E6600 and a stock 8800GTX would score around 9,200 3dmarks. You will note that the brunt of the Q9550's power has taken up the slack of only using the 8800GTS and added an extra 1,000 or so to the score, due to 3dmark's CPU limitations.

    It is worth noting that while clock for clock the 45nm Yorkfield Intels are 20% ahead of the Phenom IIs, the original 65nm Kentsfield chips are dead-on equals. Thus, given the better memory performance of the AMD CPUs compared to the FSB-limited Intels, you probably would see slightly faster operation on the Phenom than on the Q6600.
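
    Putting rough numbers on the overclocking point (illustrative arithmetic only, taking the midpoint of the quoted stock gap):
        stock gap ≈ 1.07; overclock headroom: Q9550 ×1.41 vs 940 ×1.23
        combined: 1.07 × (1.41 / 1.23) ≈ 1.23
    That is the right ballpark for the 17,000 vs 22,000 scores mentioned earlier (22,000 / 17,000 ≈ 1.29); the match isn't exact because the 3dmark score doesn't scale purely with CPU speed.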
     
  19. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Thanks for clearing that up. So in the industry's best interest, both Intel and AMD are head to head, except for some slight voltage differences. And for the enthusiast, Intel provides the best overclockers as of late. :)
     
  20. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Generally, for stock CPUs, AMD and Intel are on an even footing; AMD provide better CPUs at the cheaper end of the market, but have no truly high-end offerings.
     
