
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Perhaps I misunderstood :S - when you have multiple inputs there is always a switch; you don't have to unplug the cable for it to recognise the other input.
     
  2. harvrdguy

    harvrdguy Regular member

    Joined:
    Sep 4, 2007
    Messages:
    1,193
    Likes Received:
    0
    Trophy Points:
    46
    Sam, that answer is interesting as hell - I didn't think of that, and it makes perfect sense. At the same time, that answer brings me completely back to the discussion about the impact on the 5970 (not your 4870x2 cards) of being put in an 8x slot versus a 16x slot.

    I realize your current motherboard does not provide dual 16x slots for crossfire (correct me if I'm wrong). So presumably the two 5850s with 2 gigs of memory each that you could buy, versus a 4-gig petunia, would end up at 8x each, just as the two GPUs on a 5970 will end up at 8x each.

    So now, what you're saying makes sense - for you.

    For me, on the other hand, or for anybody who might have a crossfire board with two 16x slots - I hear you agreeing with the idea of two 5850s instead of a 5970.

    At the same time, doesn't this bolster the argument Hardware Canucks was making - that the 5970, on a PCI-E 2.0 motherboard, would take a bad hit if it had to be put in an 8x slot, thereby reducing the bandwidth to each GPU to only 4x?
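
    (For reference, a quick sketch of the bandwidth arithmetic behind that worry, using the PCI-E 2.0 figure of 500 MB/s per lane per direction; the helper name is just for illustration:)

```python
# PCI-E 2.0 carries 500 MB/s per lane in each direction.
LANE_MB_S = 500

def slot_bandwidth_gb_s(lanes):
    """Total one-way bandwidth of a slot, in GB/s."""
    return lanes * LANE_MB_S / 1000

# A dual-GPU card like the 5970 splits its slot's lanes between the two GPUs,
# so an x8 slot leaves each GPU with only x4 worth of bandwidth.
print(slot_bandwidth_gb_s(16))      # x16 slot: 8.0 GB/s
print(slot_bandwidth_gb_s(8))       # x8 slot:  4.0 GB/s
print(slot_bandwidth_gb_s(8) / 2)   # per GPU on an x8 slot: 2.0 GB/s
```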

    Anyway, yes, I picked the worst game. Here's how all the games on that page stacked up - drop in fps, average and minimum, at 2560x1600, from moving the 5970 to an x8 instead of an x16 slot on a PCI-E 2.0 motherboard:

    Fallout 3: average -5% minimum -10.5%
    Far Cry 2: average -16% minimum -36%
    Hawx (DX10): average -2.9% minimum -13.6%

    Hmmmm. Hahaha. Well, clearly Far Cry 2 was the very worst. But Sam, you are quite clear that you pay strict attention to minimums - so drops of 10% and 13% are not exactly trivial and could make the difference between something that's playable or not.

    Anyway, I concede the point - it's not as bad as the Far Cry 2 data looked. LOL
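
    (Those percentages are just the usual relative-drop formula; here's a minimal sketch with made-up fps figures, not the review's actual data:)

```python
def percent_drop(fps_x16, fps_x8):
    """Relative fps loss, in percent, from running at x8 instead of x16."""
    return (fps_x16 - fps_x8) / fps_x16 * 100

# Hypothetical example: a 50 fps average falling to 42 fps is a 16% drop,
# the same size as the Far Cry 2 average hit quoted above.
print(round(percent_drop(50, 42), 1))  # 16.0
```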

    You and me both Shaff!

    If Miles gives me that 9450 4 gig with 8800 card, does that count?

    Wow, hyperthreading can kill some applications! I was just playing around with an m4v-to-avi converter for DivX movies - my Mac brother downloads legit podcasts from iTunes, and we picked up a little thing that plays DivX on the TV off a thumb drive or USB disk drive. Anyway, I tried the converter on a variety of computers - the 4GHz P4 was getting 140 frames per second (the laptop with its 1.6GHz Pentium M had scored 155 - both of them with 2MB of L2 cache) and I noticed it wasn't pulling 100% CPU load like the other computers - only about 50% on each hyperthreaded logical core. I turned off HT, and CPU usage jumped to 100% and frames jumped to 230, up from 140 - that's about 65% faster with HT off! I left it off!
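
    (For anyone checking the arithmetic: 230 over 140 works out to roughly 64%, so the ~65% figure above is about right:)

```python
# Frames per second from the converter runs described above.
fps_ht_on = 140   # Hyper-Threading enabled
fps_ht_off = 230  # Hyper-Threading disabled

# Relative speedup from turning HT off.
speedup_pct = (fps_ht_off / fps_ht_on - 1) * 100
print(round(speedup_pct))  # 64
```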

    Well I have to run. That was quite a nice thrashing you administered to Shaff as Jeff noted, Sam - lol - that's what he gets for complaining about me never upgrading. Hahaha.

    (So look guys, if Miles gives me that 4-gig 9450 and I throw a 2-gig 5850 in it, what kind of 3DMark06 score will I get at stock clocks? And then, how high should I try to overclock it? I don't know what motherboard he's got until I get my hands on it.)

    Rich
     
  3. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    So I have 2 5850s now. Watch for benchies in the next few days :D
     
  4. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Confirmed. My Samsung is even more awesome than I originally realized :p

    Jeff, I'm jealous. I can't upgrade for probably quite some time :(
     
    Last edited: Apr 28, 2010
  5. Estuansis

    Estuansis Active member

    I'm in an even worse spot! The 6-core Phenom II is out and it's awesomely priced. I have the board, I have the high-performance memory, I have the video horsepower, but I'm not sure I can justify the money. As mentioned elsewhere, it's great value for money, but many games don't fully take advantage of the two extra cores. If it shows good benefits in the games I play (namely BFBC2, Crysis, L4D2, Metro 2033), then I might be seriously tempted to get it. But there still remains the question of whether I should or not. And if so, will I part with my faithful old 940 or put it in a budget box?
     
    Last edited: Apr 28, 2010
  6. omegaman7

    omegaman7 Senior member

    I may look at selling some of my stuff so I can upgrade, LOL! Or I'll just be patient. I do have some things to sell, though. For instance, I have a wireless router and PCI adapter card, still in the box, which I can't see using now. By the time I need such a device, it'll probably be too slow ;) Long story why I have them...

    The 260 is faring OK with most modern games. And the 965 should get me by for a while. I've only seen a handful of processes capable of bringing it to its knees ;)
     
  7. Estuansis

    Estuansis Active member

    Yeah, the GTX260, especially your 216 model, is a very capable video card with any modern title. The few demanding games out there that challenge it are brand spanking new. There are still games yet to come out that you could probably max.
     
  8. sammorris

    sammorris Senior member

    Jeff: I can certainly see the appeal of the Phenom II X6. If there were a similarly priced hex-core for i5 I'd be sorely tempted as well. Add to that, the Phenom II X6 is hands down the best CPU AMD have ever made. It ups the core count to six while retaining the 125W power draw of the newest 955/965 steppings, thus it can overclock to 4GHz on air.
    However, ultimately it is still a Phenom II, which means that versus your 940 there is no benefit per core per clock cycle. You are either buying higher clock speed (a waste, with your 940 already at 3.7) or the extra cores. So it comes down to whether you'll use the extra cores. With four cores, the 1090T is just a 965; with six, it's a midrange i7 (as in, 870/940/950). If the upgrade is almost solely concerned with gaming, I wouldn't bother. If it's concerned with video encodes and rendering that can use all six cores, go for it - it's well worth the money.
    Good to hear you've got your second 5850, that's turning into a seriously powerful machine :p
    Rich: I wouldn't leap at suggesting two HD5850s over an HD5970 for someone with a full 32x bandwidth. They'd get slightly more performance but I'd want to make sure that's all they wanted to do before suggesting it. An HD5970 is a fair bit more expensive, but occupies a lot less space, and more importantly, allows for extra GPUs to be added, or alternatively, something else to use the slots otherwise unusable with two cards installed. For many, it's a no-brainer, the two 5850s win every time because they're a lot cheaper, but the 5970 price is falling in the UK, so it's not necessarily an easy choice, depending on the user.
    You seem to have overlooked the article we posted about how GPUs perform at 4x bandwidth. Remember, you cited Far Cry 2 being bad, but that was the only case of the three games where there was any effect at all. There is a much more comprehensive review - from Tweaktown, I think - that tested over a dozen games; with the exception of STALKER Clear Sky, all of them showed a 5% drop or less running an HD5870 at 4x.
    The one difference is that sadly said article did not consider minimums. This is something I will have to look into, but bear in mind, even if the drop in minimum fps works out to be 12% on average, running 3/4 GPUs is still very much worth it, as even with only 50% scaling on the additional GPUs it more than makes up for the difference.
    A Q9450 and an HD5850? Tricky to say - at stock clocks, not a great deal; low teens of thousands. Get the CPU up to 3.7-3.8GHz and you might get close to 20,000 - you may even go over it.
     
  9. Estuansis

    Estuansis Active member

    I'm not too watchful of encoding times and whatnot. I've never done it enough to tweak my PC specifically for it. As you probably guessed, this build is 100% gaming focused. I want to find out conclusively whether or not it helps in certain games. It's an afterthought really, but it is a CPU upgrade for my current platform. That's why it's so tempting. Why go all-out and switch to i5, when I could just get a 6 core Phenom II? That plus considering it uses the newest stepping means it would probably OC higher too. I'm interested to see where the prices settle down to. It could very well be my next CPU upgrade.

    http://www.techpowerup.com/reviews/AMD/HD_5870_PCI-Express_Scaling/1.html

    Read 'em and weep. Barely any performance loss down to 4x. Theoretically, then, even 16x/4x Crossfire would be worth using. If so, I know a few budget board users who will be jumping on the Crossfire bandwagon :p

    Of course, 2560 x 1600 on 1x is obviously out of the question due to the sheer data throughput needed. But up to 1920 x 1200 with 4x is consistently within ~5% at all times. I daresay those results are pretty conclusive. They run something like 22 tests with consistent results all the way through.
     
    Last edited: Apr 28, 2010
  10. sammorris

    sammorris Senior member

    Quad cores for gaming make sense for the most part, but they're far from being properly utilised. I can hardly think of any games that will work with 6 cores.
     
  11. omegaman7

    omegaman7 Senior member

    Been playing Arkham Asylum for over half an hour now. Reminds me of how Left 4 Dead runs - real smooth, averaging 60fps. Can't seem to find any graphics settings; I likely haven't looked hard enough though, LOL! While I think it looks good, I wouldn't say it looks incredible. But that could be due to my resolution settings. It's definitely not running at 1920 x 1200.
    I haven't bought it yet. I decided to see if it warranted opening my wallet ;) So far, it's definitely really close. They sure do make the game pretty simple. They hand you clues on a silver platter LOL!
     
  12. Estuansis

    Estuansis Active member

    lolwut? Did you try looking in the install folder? I'm pretty sure there's a launcher exe in there with a video settings menu. With your card you should be able to absolutely crank it at 1920 x 1200.
     
  13. omegaman7

    omegaman7 Senior member

    While I'm perfectly capable of figuring out such a process, why would they do that? Why not put those settings WITHIN the game? Not everyone knows how to navigate the system folders. I suppose the manual that comes with the game probably states as much, though :p
     
  14. omegaman7

    omegaman7 Senior member

    Hey Jeff, when you say launcher exe, do you mean Launcher .ini? I'm familiar with ini files - I'm 99% confident in understanding their functions. I've had to reconfigure more than one, on more than one occasion LOL! Is this seriously the way Batman Arkham Asylum is re/configured, though???
     
    Last edited: Apr 29, 2010
  15. Estuansis

    Estuansis Active member

    Nonono

    C:\Program Files (x86)\Eidos\Batman Arkham Asylum\Binaries\BmLauncher.exe
     
  16. omegaman7

    omegaman7 Senior member

    LOL! Thanks. That's still a silly way to have it set up. Not everyone is gonna know to look there.
    Looks like it was set for 1680 x 1050, with anti-aliasing disabled, and V-sync enabled. Ever since Sam's recommendation, I've wondered about having it enabled. From what I understood, V-sync limits frames to 60fps??? Or at least all that Fraps can record...
     
    Last edited: Apr 29, 2010
  17. sammorris

    sammorris Senior member

    VSync does limit the fps - to the monitor's refresh rate, which is typically 60. In some instances it can stop tearing (good), in others it causes lag (bad). AA can only be enabled in Arkham Asylum on nvidia cards without a config hack, as nvidia inserted a line of code into the game along the lines of 'If Primary_RenderDevice=ATI, AntiAliasing=disabled'
    Jeff: Yes, but at 2560x1600 there is a fairly substantial hit :p
     
  18. Estuansis

    Estuansis Active member

    Haha, new discovery. Enabling Vsync in the Metro 2033 cfg file improves framerates significantly. This can be duplicated and seems to work on EVERY system. The theory is that turning on Vsync disables the 3D Vision optimizations. I'm not sure if this is really true, but I DID get a 10FPS increase. The minimum at my benchmarking spot also went from 17 to 22. Make of that what you will.

    http://www.gamefaqs.com/boards/genmessage.php?board=935068&topic=53962903

    I don't recommend the r_gi tweak because it causes drops more often than improvements and messes with how certain effects are rendered.

    The Vsync and FOV tweak though, are magical. Especially the FOV really makes the game sing! :D
     
    Last edited: Apr 29, 2010
  19. omegaman7

    omegaman7 Senior member

    Thanks to both of you for helping me out. Means a lot ;)
     
  20. Estuansis

    Estuansis Active member

    Not a problem. I pride myself on knowing just how to tweak a given game. It is my pleasure to pass some of the knowledge on :D

    BTW check my edit if you plan to play Metro. The Vsync tweak really works!!! Even on ATi cards!
     
    Last edited: Apr 29, 2010
