The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    It's my excuse and I'm sticking with it LOL!
     
  2. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Yep there are two buttons just below the scroll wheel. One turns the DPI up and one turns it down. Not software controlled either. It changes the setting right in the firmware of the mouse.

    Well usually I turn the sensitivity in-game right down. It's usually there to make up for gamers with slower mice. Turning the setting all the way down in the game basically makes it full hardware so it depends entirely on the DPI of the mouse itself. A few games actually have the default sensitivity right in the middle so I leave them at stock. Like Sam says I usually leave it unless it feels way off from my desktop sensitivity.
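
    The trade-off described above boils down to simple arithmetic: what the game sees is hardware DPI times the in-game multiplier. A minimal sketch (function name and numbers are illustrative, not from any real API):

```python
# Rough sketch of the hardware-DPI vs. in-game-sensitivity trade-off.
# The function name and the numbers are illustrative only.

def effective_sensitivity(mouse_dpi, ingame_multiplier):
    """Effective counts-per-inch the game actually responds to."""
    return mouse_dpi * ingame_multiplier

# Turning the in-game slider down while turning hardware DPI up keeps
# the same effective sensitivity, but lets the mouse firmware do the
# work instead of software interpolation:
print(effective_sensitivity(1600, 0.5) == effective_sensitivity(800, 1.0))  # True
```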

    ----------------------------------------------------------

    As far as SSDs go, the only game I think can actually benefit is Elder Scrolls 4: Oblivion. To add mods you need to uncompress the entire game's data folders, which takes it from about 6GB in 7 or 8 files to about 9GB in thousands and thousands of little files. When my virus scan runs, it literally spends half its time in my Oblivion folder due to the sheer number of small files. Even with my current rig it hitches like mad in outdoor areas. I've already pretty much confirmed that with an SSD it would be almost hitch-free. Mind you, this is one of the worst games for hitching and loading pauses in history.
     
  3. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    A game, if sufficiently badly coded, will lag on any storage hardware: if it has to stop to load, then unless that loading process can be sped up to many thousands of times its current rate, there'll still be a dropped frame.
     
  4. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Well Oblivion responds extraordinarily well to SSDs, as shown by a friend who got one recently. Of course, it will probably never be hitch-free, but it can get quite close. And considering every object in the game has some sort of stats or interactive qualities, I think the game is quite decently coded.

    Consider that I first played it on an X2 4400+ at 2.6GHz and an X1800XT, and the only thing to really improve since has been the amount of AA I can use. Hitching has only gotten better as I've found a few tools that control the memory loading, plus my myriad CVAR tweaks. For me it's a game well worth tweaking. As seen in Fallout 3, they severely underused the power of the engine and missed the graphics and optimization sweet spot.

    Fully modded out, Oblivion is breathtakingly beautiful and challenges even brand-new games. The weakest point was the textures. I've now added about 4GB of textures myself and the game looks amazing, and this is without any fundamental changes to the engine. Just better textures.
     
  5. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Then it's a well coded engine, but a badly designed configurator :p

    Made a little rough estimation chart for CPUs

    P4 Northwood 3000: 3000
    P4 Prescott 3000: 2800
    Athlon XP 3000+ [2.16Ghz]: 2800
    Athlon 64 3500+ [2.2 Ghz]: 3500
    Athlon 64 X2 4200+ [2.2Ghz]: 3500 x2
    Pentium D 930 [3.0 Ghz]: 2800 x2
    Core 2 Duo E4300 [1.8Ghz]: 3200 x2
    Core 2 Duo E6300 [1.86Ghz]: 3500 x2
    Core 2 Quad Q6600 [2.4Ghz]: 4500 x4
    Phenom X4 9950 [2.6Ghz]: 4400 x4
    Phenom II X4 955 [3.2Ghz]: 6100 x4
    Core 2 Quad Q9550 [2.83Ghz]: 6100 x4
    Phenom II X6 1090T [3.2Ghz]: 6200 x6
    Core i7 920 [2.66Ghz]: 6800 x4
    Core i5 750 [2.66Ghz]: 6900 x4
    Core 2 Duo E8600 [3.33Ghz]: 7200 x2
    Core i7 980X [3.33Ghz]: 8600 x6

    So as you can see, here's roughly how Rich's CPU fares against CPUs I've had, and the one Jeff has:
    Athlon XP 3000+ @ 2.16Ghz: 2800 x1
    P4 Prescott @ 3.8Ghz: 3600 x1
    Core 2 Duo E4300 @ 3.15Ghz: 5600 x2
    Core 2 Quad Q6600 @ 3.24Ghz: 6100 x4
    Core 2 Quad Q9550 @ 3.4Ghz: 7300 x4
    Phenom II X4 940 @ 3.7Ghz: 7100 x4
    Core i5 750 @ 4.12Ghz: 10600 x4

    The i5 I use, which cost all of £150 to buy, is three times Rich's P4 per-core, so 12 times as powerful. That really makes a difference.
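
    That closing comparison checks out against the overclocked list above. A quick sketch (the tuple layout is just an illustration of the per-core-score × core-count notation in the chart):

```python
# Each entry from the chart above is (per-core score, core count).
p4_prescott = (3600, 1)   # P4 Prescott @ 3.8GHz (Rich's CPU)
i5_750      = (10600, 4)  # Core i5 750 @ 4.12GHz (Sam's CPU)

per_core = i5_750[0] / p4_prescott[0]  # ratio per core
overall  = per_core * i5_750[1]        # times four cores vs. one

# Roughly "three times per core, so about 12 times as powerful":
print(round(per_core, 1), round(overall, 1))  # 2.9 11.8
```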
     
    Last edited: Jul 26, 2010
  6. harvrdguy

    harvrdguy Regular member

    Joined:
    Sep 4, 2007
    Messages:
    1,193
    Likes Received:
    0
    Trophy Points:
    46
    Wow, that cpu chart was beautiful:

    Hmmmmm! (Why does this discussion remind me of Kevin's transmission? I swear, you guys, if Miles gives me that 9450/9550, whichever one it is, I'm gonna jump quick like a bunny from 6,000 to 15,000 3DMark06 points as soon as I add in a 5000-series board, and then I'll start participating in some of these new games.)

    The most modern processor around right now where I live is my brother's original Phenom. (He bought it without asking me, and quickly realized afterwards that he should have gotten the Phenom II.)

    I ran a little MKV-to-AVI converter on all my computers as a single-threaded benchmark. My P4 runs the program about 30% faster with hyperthreading turned off, which tells me it's single-threaded - correct me if I'm wrong. Factoring in the much slower clock of the Phenom, at about 2.4GHz, versus my 3.99GHz P4, and the fact that the Phenom is quite a bit faster on the conversion, it seems the Phenom has a clock-for-clock 2.2x advantage over the P4.
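
    That clock-for-clock inference can be written as a formula: scale the runtime ratio by the clock ratio. The conversion times below are hypothetical placeholders; only the method comes from the post.

```python
def per_clock_advantage(time_a, freq_ghz_a, time_b, freq_ghz_b):
    """How much more work chip B does per clock cycle than chip A,
    given runtimes of the same single-threaded job on each."""
    return (time_a / time_b) * (freq_ghz_a / freq_ghz_b)

# Hypothetical runtimes: a 3.99GHz P4 taking 100s vs. a 2.4GHz Phenom taking 75s
print(round(per_clock_advantage(100, 3.99, 75, 2.4), 2))  # 2.22
```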

    Sam, from your CPU chart, I see you're rating the i7 the same as the i5 clock-for-clock. Do you see any advantage at all of the i7 over your i5 for gaming?

    The i7 has hyperthreading and I think your i5 does not, as I recall, but for gaming we know that's not an advantage. What about PCIe lanes? Wasn't there a compromise on the i5 where you can't run two x16 graphics slots simultaneously? If so, do you see that ever holding you back with generation 6, or maybe with 2012's generation 7 ATI boards on a smaller die?

    Hahahaha! Sounds like I'm not gonna entice you yet, Kevin, come on dude, you're missin out! The transmission story was pretty good though, I'll give you that! LOL

    Rich
     
  7. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Actually a Q9550 moderately overclocked with a 5800 would be quite close to 20k in 06. With the 3850 though you're looking at around the 9-10k mark I'd imagine.

    Different programs are biased pro/con Intel or AMD in that sort of arena, so you can never get fully accurate results, but as terrible as the Phenom is, it's still faster than your P4 by quite a margin, as you see there.
    The i7 has no advantage whatsoever over the i5 for games, other than 32 PCIe lanes versus 16. Given that even with two dual-GPU 5870 cards that's going to hold you back less than 10% in all but a couple of rare titles, it's not a big issue. Later on it may become one, but since the HD6 won't be a big quantum leap, there's no danger of 16x becoming inadequate yet. PCIe3 will be out by the time it's a big issue.
    I'm also going to be consolidating to two GPUs in the future, so it'll be no issue at all, 8x is indistinguishable from 16x to any GPU in any game right now.
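
    A back-of-envelope sketch of the bandwidth behind that point, assuming PCIe 2.0's roughly 500MB/s per lane per direction (that per-lane figure is my assumption, not stated in the post):

```python
LANE_MBPS = 500  # approx. PCIe 2.0 bandwidth per lane, per direction

x16 = 16 * LANE_MBPS  # a full x16 slot: ~8000 MB/s
x8  = 8 * LANE_MBPS   # each card when two share 16 lanes: ~4000 MB/s

# Halving the link only matters if a game actually saturates it, which
# is why x8 is currently indistinguishable from x16 in practice.
print(x16, x8)  # 8000 4000
```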
     
  8. harvrdguy

    harvrdguy Regular member

    Joined:
    Sep 4, 2007
    Messages:
    1,193
    Likes Received:
    0
    Trophy Points:
    46
    Hmmmm. Well, sounds like you'll stick with that i5 for the foreseeable future then - will it carry forward to PCIe3? Or am I talking so far out, with HD7 in 2012-13, that we might as well assume a new mobo, new CPU, and new GPU cards.

    I can't carry my 3850 forward - it's an agp board.

    Wow, 20,000 sounds outstanding! I'd be like a kid in a candy store! I'd be like omega with a flushed transmission and dragon speaking naturally working again, lol. I'd be like shaff with a hackintosh laptop and top rank once more on xfire! :p

    I could actually finish the rain chapter in Hell's Highway! For now I'm gonna assume the 9450 (was there a cache difference for the 9550, or just clock speed?)

    Miles said something about an nvidia card in the 9450 - I think it was some kind of 8800 - does that mean the 9450 is sitting on an nvidia chipset motherboard?

    When he said "give" it to me, he actually said "loan", because he doesn't know if -v- will ask for it back one day, so I can't mod it much beyond throwing a new GPU card in there. I probably shouldn't try to overclock the 9450 - unless I'm ready to replace it if I burn it up. Well, I guess as long as I don't butcher the case to install a fan the way I did with my P4 case, I could change it back to stock pretty easily if he needs it back.

    He's got another tower for me - earlier hardware - he has no idea what - probably a Core 2 Duo E5200 or some such - that would be a big improvement also. I told him I'd find room for any hardware he wanted to move out of his garage, lol.

    Then shaff could finally stop bit**ing at me about upgrading. Or maybe I should say, Shaff, and Sam, and Jeff, and ddp - no ddp hasn't said anything yet. He'll be next. "Rich I'm gonna ban you if you don't get rid of that p4 by the end of the year!" So next year, even though I don't have it yet, I'll have to start lying about how great the 9450 is, and then you guys will ask me how I liked the ending of Hells Highway, and I won't know what to say - then soon after that I'll be in psychotherapy, with Kevin, discussing metallic flakes in our transmission fluid.

    That was a sad ending. And all I ever wanted was 10,000 3DMark06 points. :p

    Rich
     
  9. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Right now there's no indication whether or not HD7 will use PCIe3. That tech is still very much in its infancy, so I'm not sure when it will appear.
    I'm well aware you can't move the 3850 AGP to a PCIe system, but I was saying just for comparison's sake to give you an idea about the 3850's capabilities as a whole.

    Not necessarily, that's no indication of the chipset used. You did need an nvidia chipset to use SLI with Core 2s, but you don't need one to run a single nvidia card, thankfully.

    The E5200 is actually just as new as the Q9450, if not newer; it's just lower-end. It may have been a budget chip, but I own one, and they're quite potent: easily a competitor to the old E6700, and, with the exception of the high-power Phenom dual cores, as fast as anything AMD have to offer even now in the dual-core sector.
     
  10. harvrdguy

    harvrdguy Regular member

    Joined:
    Sep 4, 2007
    Messages:
    1,193
    Likes Received:
    0
    Trophy Points:
    46
    Hah! I'm gonna be driving up there Monday to go see Inception with Miles - maybe he'll bring up the garage being clogged up!

    Yeah, I'd be really happy with either tower. I KNOW that he doesn't have anything that goes back as far as P4, so whatever he has, I'll be miles ahead, no pun intended.

    When I finally retire the P4 (under threat from ddp) I'll put it in the sunroom for anybody to use, replacing the 400MHz XP machine that's sitting there now - for 10x faster internet access. When I do that I plan to quiet the P4 down considerably - I'll pick up two more 120mm 800rpm FDB fans to replace the 1600rpm ones, and I'll put a throttle (that came with the V1000 3850 cooler) on the 1900rpm 120mm exhaust that I squeezed into the lower rear - no need for so much turbulence looking up movie showtimes, lol.

    Rich
     
  11. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    Still writing essays I see :p
     
  12. ddp

    ddp Moderator Staff Member

    Joined:
    Oct 15, 2004
    Messages:
    39,169
    Likes Received:
    137
    Trophy Points:
    143
    you can say that again as i did not threaten him. i'm innocent!!!!
     
  13. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    Haha. I have so much to catch up on around here :(

    Hey guys how good is the 330M in the MBPs?
     
  14. Red_Maw

    Red_Maw Regular member

    Joined:
    Nov 7, 2005
    Messages:
    913
    Likes Received:
    0
    Trophy Points:
    26
    Can someone explain to me why the reference 5850/5870 GPUs are touted as being the only non-crap 5850/5870s? Maybe my standards are too low, but try as I might I can't find anything bad to say about my Sapphire 5850.
     
  15. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    330M? I assume you mean the mobile GPU. It's an nvidia mobile chip, so it won't last very long before you'll need to RMA it, but it's moderately powerful at least.
    Maw: Every non-reference Sapphire card I've had has been broken; every reference card I've had has been rock solid.
     
  16. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    I sincerely apologize. I think I missed the part where you defined the difference between reference and non-reference cards. How can you tell the difference? :S
     
  18. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    No, it won't die as standard. Just remember this is in an Apple product. They do their research (on the MBP) to get the most solid parts. It's made for people who like stuff to just work, with no hassle, and people pay the premium for it.

    The nvidia chip debacle was over the 8400/8600GT.
     
  19. Red_Maw

    Red_Maw Regular member

    Joined:
    Nov 7, 2005
    Messages:
    913
    Likes Received:
    0
    Trophy Points:
    26
    Well I certainly hope mine fares better lol. Thanks sam.
     
  20. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    I could always get an Envy 14 lol, but I'm stuck on stupid OSX.

    Y'all excited for Halo Reach?
     