
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    That should be all the evidence you need.
     
  2. navskin

    navskin Regular member

    Joined:
    Jan 7, 2007
    Messages:
    347
    Likes Received:
    0
    Trophy Points:
    26
    Hi everyone, been a long time since I posted here. How's everyone doing anyway?

    OK, so I just got a new rig and I think I have a problem, but I'm not sure. I have two XFX GTX 260s in SLI, and for some mad reason the clocks on both cards have been downclocked. Is this normal behaviour for these cards?

    Spec of my rig:

    CPU: Intel Core 2 Quad Q6600 @ 2.40GHz
    Motherboard: Abit IN9 32X-MAX
    RAM: 4GB Corsair
    GPUs: XFX GTX 260s in SLI (with downed clocks, AHHHHH)
    HDD: 1TB Seagate


    thanks for any help you can give me.


    EDIT: the core got downclocked from 576MHz to 301MHz, and the shader from 1242MHz to 601MHz.


    EDIT AGAIN: and the 3DMark06 score was 11673.
     
    Last edited: Jul 23, 2009
  3. sammorris

    sammorris Senior member

    You are monitoring the clock speeds of the cards at load, right? When sitting at the desktop not running a game, all graphics cards clock down so they use less power and produce less heat.

    Your 3DMark score is probably low due to the clock speed of your processor. 3DMark06 is primarily a CPU test nowadays, as the CPU bottleneck kicks in too low to accurately measure GPU performance. Run Vantage, or test a game.
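    That idle/load behaviour can be sketched as a toy model (illustrative only - the 2D/3D clock values are the ones navskin reported for his GTX 260s, and the 10% load threshold is my invention, not how NVIDIA's driver actually decides):

```python
# Toy model of GPU power-saving clocks: the driver drops to low "2D" clocks
# at idle and restores full "3D" clocks under load. Clock values (MHz) are
# the GTX 260 figures from the post above; real drivers use more states.
IDLE_CLOCKS = {"core": 301, "shader": 601}
LOAD_CLOCKS = {"core": 576, "shader": 1242}

def current_clocks(gpu_load_percent):
    """Return the clock state this toy driver would pick for a given load."""
    return LOAD_CLOCKS if gpu_load_percent > 10 else IDLE_CLOCKS

print(current_clocks(0))    # sitting at the desktop: downclocked
print(current_clocks(95))   # running a game: full clocks
```

    So reading 301/601 at the desktop is expected; the number to check is what the cards report while a game is actually running.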
     
    Last edited: Jul 23, 2009
  4. navskin

    navskin Regular member

    Sam, you're always spot on. They're fine when playing a game, and when you close the game the clocks drop back to the energy-saver state, so you may as well call it Cool'n'Quiet for graphics cards.

    Thanks for the info.
     
  5. sammorris

    sammorris Senior member

    That's what it is. For ATI cards it's about as effective as Cool'n'Quiet, but for NVIDIA cards it's much better - just as well, as on the whole they're less efficient architectures.
     
  6. harvrdguy

    harvrdguy Regular member

    Joined:
    Sep 4, 2007
    Messages:
    1,193
    Likes Received:
    0
    Trophy Points:
    46
    Sam, thanks for the detailed response. One thing I was slightly confused about - you mentioned I had 3 degrees of room on the CPU, but my BIOS has an 85-degree emergency shutoff, and I was running 73, so did you mean 12 degrees of room - or are you saying that a CPU should never run hotter than 76 at the most?

    I actually experienced good temps yesterday, and it was another hot day!

    I ran Left 4 Dead about 10 hours. CPU temp stayed in the high 60s - about 68 or 69 (I turned SpeedFan off after a couple of hours and stopped monitoring CPU temp to reduce CPU load). GPU temps, which I continued to monitor the whole time, never exceeded 78 degrees. (I was happy with these temps - it looks like the ATT renderer was a lot more of a system stresser than L4D itself.)

    I did crash once about 2 hours after I started - I was running 695 gpu clock, and 936 memory clock. I dropped the memory clock 2% to 918 and I was okay the next 8 hours - I had one close call when it started stuttering - but it recovered.

    Since my temps aren't bad, the only thing a cooler could do for me is allow me to push the memory clock and gpu clock back up higher for slightly smoother fps without crashing - I assume that's what we're talking about a cooler doing for me - right?

    So for coolers, the thermalright HR-03 looks awesome. By the way, Sam, I noticed a Thermalright 4870x2 solution in the FAQ on that page - using one HR-03 cooler that sits on top of the card, and an older cooler that sits on the bottom - I followed the Russian link and saw the photo. Nice!

    I really like the way the HR-03 sits on top of the card - that would mean I would have no problem with my side fans hitting anything mounted under the card.

    However, after a lot of study, I truly doubt the Thermalright will fit in my super-narrow 7" case. The HR-03 is 6.14" wide. Here's a picture from the installation PDF.

    [image]

    The pipes come up way out from the edge of the card. I actually measured, and I have 1 1/2 inches from card edge to side wall which is more than I thought I had.

    So from the bottom part of the picture, the HR-03 looks okay, but from the top of the picture, it doesn't look okay.

    I say that because the picture shows a card with the gpu chip mounted toward the mobo side of the card, but my chip is mounted in the middle equidistant from case wall side to mobo side.

    If I were able to slide the HR-03 mount along the pipes, (see the blue writing) it looks like I could make it work - but you tell me, Sam - I am assuming the mount is fixed in position and can't be slid along the pipes, am I right?

    If that is true, then I think it is highly unlikely that the HR-03 will fit in my unusually narrow case.

    But now that I know that I have 1 1/2" from edge of card to case wall, it looks like my 120mm intake fans do not come into the space of the 3850 card, so either the vf900 or the vf1000 will work since they are narrow enough not to overhang the side walls of the card.

    My 3850 card is 5 1/2" wide. The VF900 is essentially 3 3/4" wide and the VF1000 is 3 1/8" wide, so both coolers will be indented inside the card about an inch on each side, and my existing intake fans won't be anywhere close to hitting.

    So I'll just keep watching the three ebay auctions, lol.

    You got that right, especially the part about the mauling. LOL I thought about it later - I have NO expansion slots, and only one agp slot, hahaha. Thank god I have built-in ethernet, and built-in 5.1 sound.

    I thought to myself - "What if one day I wanted a wireless card?" But then I remembered I already own a little usb wireless US Robotics device that's in my "spare parts kit" - those things work pretty well.

    PSU-wise I have the Corsair 450 that you're looking at, per Sam's prior recommendation to me - very quiet. I replaced a 500 watt Allied, more total watts but with only 20 amps on the 12 volt rail. Like Shaff found out - 12 volt amps count!

    Hey Nav, long time no hear. I was thinking the same "Are the cards throttling down under no load?" (I've learned a few things from Sam and Shaff and Jeff, lol.) Glad to hear they're working fine. I remember when "just 11,600" 3dmarks was way more than anybody had! (Easy for me to remember since I'm still back there in the stone age with just under 5,000!)

    Rich
     
  7. sammorris

    sammorris Senior member

    That's exactly what I meant - seeing a CPU beyond 75ºC is not at all recommended. For the tiny overclocks you're using, Rich, I don't see why you don't just forget about instability and temperature issues and leave the clocks stock; the performance difference is so marginal it isn't worth the effort.
    The 4870X2 unofficial solution worries me. For one, you can only use it with one card, since it occupies an impressive five slots; for two, because it's unofficial, there are no officially supported VRM sinks provided with those coolers, so I've no idea how you cool the VRMs properly - my guess is you don't. I'd rather spend £70 importing an Accelero 4870X2 than spend £45 on those coolers and fans and worry about destruction.
    The HR-03 leaves loads of room around the outside; I have at least a couple of inches in my NZXT Lexa. The mounting point, I believe, is attached separately to the base, but it can only be fitted in one place, else the base will not make contact with the GPU, so you have about a quarter of an inch's adjustment at best. I really don't like having something like that off-centre in case it breaks contact - the die surface on AMD GPUs is tiny compared to CPUs and NVIDIA GPUs, and even a small lack of contact would spell doom for the card.
    "I already own a little USB wireless US Robotics device that's in my 'spare parts kit' - those things work pretty well" - not really; I don't know a single one that's anywhere near as good as a PCI wireless adapter, let alone wired networking.
     
  8. shaffaaf

    shaffaaf Regular member

    Man, the only problem I see with that TR is the absurd amount of slots it uses, especially when you stick a fan on it.
     
  9. sammorris

    sammorris Senior member

    What, three? The same amount any large aftermarket cooler uses with a fan on it. They only use five slots if you install it upside down, which is only intended for limited room crossfire setups with large cases and wide-spacing boards.
     
  10. shaffaaf

    shaffaaf Regular member

    Well, putting it facing up, wouldn't that interfere with the NB?
     
  11. harvrdguy

    harvrdguy Regular member

    OK, Sam, now I understand - don't push the P4 above 75°C as the highest limit. In that regard, I think I'm good. Yesterday's CPU temps were in the high 60s.

    One time I was thinking about putting an Arctic cooler on it - and cloning onto a huge IDE drive, since I had read about the dangers of losing your SATA on some mobos where the SATA clock is a multiple of the FSB clock - just as happened to me when I tried going from 200 to 210.

    But now that I have the SATAs, I can already tell the improved speed of level loading - I don't go idle and find my bot having left the safe house any more, lol. So I can't overclock the P4. Therefore, if the P4 will support the present temps, then I'll leave it with the stock HSF. (Come to think of it, I have already Sunbeam-modded the CPU inlet so it's sucking cool air directly from outside the case - so I have already improved on the stock HSF - otherwise I bet my temps would be higher.)

    No, Sam, No No No!

    Hahaha!


    FIGHTING FOR EVERY FPS

    You would be absolutely amazed! The performance differences ARE HUGE! When I talk about reasonably fluid gameplay - I mean getting about 31 fps MAX most of the time.

    Believe me guys, I am RIGHT AT THAT BREAKING POINT between smooth gameplay and noticeable lag.

    It's like moving pictures - to fool the eye of MOST PEOPLE you need at least 24 fps - isn't that the magic number? I could google it - but it's something like that. If you try 23, you can see the flicker.

    That is just like me and smooth gameplay. Before this hot weather, I have been running the Catalyst recommended "Auto tune" of 702 and 936, and my gameplay has been smooth - most of the time when I look up into the upper right corner, I see 31 fps. I haven't really been feeling any lag, due to the remarkable ability of Valve to scale for my weak system. That's probably not the ultra smooth you are getting, Sam, dipping down to 60 in the rough spots. But compared to how my older games play - where I know I'm getting in the 40s and 50s - Left 4 Dead feels similarly reasonably smooth and fluid.

    Remember, too, that I am using End it All to free the cpu as much as I can, and per an earlier tip of yours, Sam, I have disabled hyperthreading. I can verify by experience that no hyperthreading is best.

    Regarding running stock clocks, I did in fact try that one time, by accident. Let me tell you guys how that worked out for me.

    What happened is that I forgot to restart ATI Tray Tool after using End It All. That left my GPU and vram clocks at the stock 2D-mode values - 668 and 828 - which is what I have ATI Tray Tool set them back to when I exit 3D mode. Stock clocks are a 5% slower GPU clock and a 13% slower vram clock than 3D mode.

    So I started up Left 4 Dead. All of a sudden - I hadn't been playing for more than a few minutes - I noticed the game was REALLY laggy! I asked myself, "What the hell is going on? This game is playing like total sh*t!"

    Then I noticed that I didn't have any fps display in the upper right corner. I exited L4D, restarted Ati tray tool, and was GREATLY RELIEVED to come back to smooth gameplay.

    So believe me, Sam, guys - the small graphics overclocks ARE HUGE!! I am fighting for every single fps. Hahaha!


    YESTERDAY'S GAMING SESSION

    So during yesterday's test of the new exhaust system - which Keith accurately characterized as the "mauling" of my case (I LOVE that word - truer words were never spoken!) - I didn't even try the 702 Catalyst auto-tune GPU clock, because of the scary high temps (up to 101°C) in the stress test the day before. I throttled back 1% to 695, which had dropped the stress-test temps by 7 degrees, leaving me a mere but significant 4% above the 668 stock. But I DID use the auto-tune recommended memory clock of 936, which is a bigger 13% above the 828 stock.
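    The percentages Rich quotes check out with a couple of lines (just arithmetic on the MHz figures from his post):

```python
def percent_over_stock(clock_mhz, stock_mhz):
    """Overclock expressed as a percentage above the stock clock."""
    return 100.0 * (clock_mhz - stock_mhz) / stock_mhz

# Figures from the post: 668 MHz stock core, 828 MHz stock memory.
print(round(percent_over_stock(695, 668)))  # core at 695: about 4% over
print(round(percent_over_stock(936, 828)))  # memory at 936: about 13% over
print(round(percent_over_stock(918, 828)))  # backed off to 918: about 11% over
```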

    Then I had the one crash of the day.

    After the crash, which happened in the first 2 hours, I rolled the memory down by 2% to 918, leaving me still a significant 11% above stock. After that I had only one two-second scare: "Is it crashing ...." (and it wasn't even a high action spot) but the system recovered.

    I DID notice one time when I was spraying zombies who were trying to break into the stairwell on hospital finale, that there was an element of lagginess - but other than that things were smooth the whole time. (But perhaps not quite as smooth as the 702 and 936 that I have gotten used to - I can tell the very slight differences.)

    Believe me Sam, I am playable now - but JUST BARELY. I need every stinking fps I can get, lol!

    So tell me, if I mess with a vga cooler, which I am inclined to do, in your opinion should that allow me to move back to 702 and 936? All things being equal, assuming I haven't damaged my card somehow, I SHOULD be able to at least return to those settings, right?

    - - - - - - - - - - - - - - - - - - - - - - - -


    Hey I have one funny story from yesterday that you guys might enjoy:

    Rich
     
  12. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    LOL Rich! I can relate in some sense.

    I thought this was kinda funny. It says "Beefy Computer" on the side of the case. Look familiar to anyone? ROFLMFAO!!! You guys may have noticed already, but I guess I saw inanimate objects as irrelevant, so my mind naturally gave them no attention. However, when I'm in certain moods, I find myself REALLY scanning the areas and looking MUCH deeper into things, especially since I've been reaching for my artistic side. If I have one, LOL! Nah, I think everyone has an artistic side, even if only a small one.
    Sorry, Sam, but I gotta post this one :D
    [image]
     
  13. sammorris

    sammorris Senior member

    Shaff: Nope, it never has in any of the 3 boards I've had that cooler installed in, even with the P5N-E SLI's huge heatsink.
    Rich: There are plenty of games I play even now that run at 30fps-ish; they aren't Left 4 Dead, but they do exist. In such instances, the performance increase from overclocking is negligible. Seriously, the mathematics don't add up. Get 90fps, overclock your GPU by 4%, and you get 94fps - great, but you can't tell the difference. Get 30fps, overclock your GPU by 4%, and you get 31fps. 1fps does not make a blind bit of difference in these circumstances. The more your games lag, the fewer extra frames overclocking will give you - it's that simple.
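    Sam's arithmetic can be sketched in two lines, assuming (optimistically) that frame rate scales linearly with GPU clock - in practice memory bandwidth and the CPU limit it further:

```python
def overclocked_fps(base_fps, oc_percent):
    # Best case: fps scales linearly with the GPU clock increase.
    return base_fps * (1 + oc_percent / 100.0)

print(overclocked_fps(90, 4))  # ~93.6 fps: indistinguishable from 90
print(overclocked_fps(30, 4))  # ~31.2 fps: one extra frame, still laggy
```

    The absolute gain shrinks exactly when you need it most: 4% of 30fps is barely one frame.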
    The 24fps rule applies to visual images. However, since the refresh rate of a monitor isn't infinite, you get problems with the synchronisation of frames. Say your game runs at 25fps - that's a frame every 40ms. Your monitor will most likely be 60Hz - a refresh every 16.666ms. An image only comes out of the screen when the monitor refreshes. If the image has changed since last time, great - smooth gameplay. If it hasn't, it manifests itself as a jerk, because you'll have to wait until the next refresh for the image to change, even if the PC rendered the next frame a tiny fraction of a millisecond afterwards.
    Imagine the timeline - 40ms for a frame rendered, 16.66ms per monitor refresh
    0: New frame on the screen
    16.666: Refresh - no new image
    33.333: Refresh - no new image
    40: New frame rendered - no refresh yet
    50: Refresh - new frame on the screen

    Thus what we have here is 25fps that has effectively dropped to 20fps. I hope that makes sense.
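    The timeline above can be simulated in a few lines (a sketch, assuming a frame is shown at the first monitor refresh after it finishes rendering):

```python
def display_times(frame_ms, refresh_hz, n_frames):
    """For each rendered frame, the time (ms) it actually reaches the screen."""
    refresh_ms = 1000.0 / refresh_hz
    shown = []
    for i in range(n_frames):
        rendered = i * frame_ms
        # Wait for the next monitor refresh at or after the render time.
        refreshes_elapsed = -(-rendered // refresh_ms)  # ceiling division
        shown.append(refreshes_elapsed * refresh_ms)
    return shown

times = display_times(40, 60, 4)   # 25fps rendered on a 60Hz monitor
gaps = [round(b - a, 1) for a, b in zip(times, times[1:])]
print(gaps)  # on-screen gaps alternate ~50ms / ~33.3ms instead of a steady 40ms
```

    The worst-case gap between displayed frames is 50ms - the "effectively 20fps" in Sam's example - even though the GPU is steadily producing a frame every 40ms.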

    For this reason, unless the refresh rate of the monitor were 1kHz or so, the only guarantee of smooth gameplay is to meet the refresh rate of the monitor - 24fps on its own isn't enough.
    Hold on, you might be thinking - isn't this what VSync does? Yes it is: it will produce 60Hz for a 60Hz refresh if that can be achieved; if not, it will drop to 30 (some games use 45, but this is rare). VSync has other advantages, like reducing vertical mismatching in frames, called 'tearing'. However, there is a problem with VSync, which is why I don't use it. If you play a game that is always smooth, well above 60fps, then unless you get lots of tearing VSync doesn't have much of a point, since you always meet the refresh rate anyway. When you do have lag and VSync could potentially compensate for it - say your frame rate varies wildly from 70-odd to 32, not an uncommon range in games - VSync will be chopping and changing between 30 and 60fps constantly, and this causes lag. It's really irritating, and for that reason I simply live with VSync turned off.
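    The "chopping between 30 and 60" follows from how VSync quantises frame rate. A simplified model of double-buffered VSync (my assumption - it doesn't capture the 45fps case Sam mentions, which needs triple buffering):

```python
import math

def vsync_fps(raw_fps, refresh_hz=60):
    """With double-buffered VSync, each frame is held on screen for a whole
    number of refresh intervals, so the rate snaps to a divisor of the
    refresh rate (60, 30, 20, 15, ... for a 60Hz monitor)."""
    intervals = math.ceil(refresh_hz / raw_fps)  # refreshes each frame is held
    return refresh_hz / intervals

for raw in (70, 60, 45, 32):
    print(raw, "->", vsync_fps(raw))  # 70 and 60 snap to 60; 45 and 32 snap to 30
```

    So a game wobbling between 70 and 32 fps gets slammed between 60 and 30 - exactly the juddering Sam describes.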
    Further to the 'you can only see 24fps' argument, the same actually applies to TV. You may not notice it at first because TV tends not to lag, but look closely and, for the same reasons mentioned above, you can tell it's a problem (this is more an issue for PAL and video rips than for NTSC broadcast TV, as NTSC is 60 fields interlaced/30 frames progressive, while PAL is 50 or 25). Now watch the same footage on a 100Hz TV instead. Due to the higher refresh rate, the picture from the same source looks much smoother. Then watch a 60fps progressive video (there are some out there, usually home-recorded HD streams) and you'll suddenly become aware of how incredible the difference is.
    Brief spurts of lag in Left 4 Dead are more related to CPU than graphics (as in game freezes for a couple of seconds) - it happens sometimes with weaker CPUs.
    A new GPU cooler could possibly give you stable speeds at a higher clock. In one part of your posts I see you're getting 100ºC temps, which are certainly going to cause you problems with overclocking, and then in the next part you tell me they're in the 70s, which is ample cooling. Long story short, heavy overclocks don't match up well with temperatures over 75ºC. Do realise, though, that the limit of stability may well be voltage-side, and that 702/936 will always be slightly unstable. For the record, don't overvolt your graphics card - it ALWAYS ends in tears.

    Omega: The clear Dell Dimension rip-offs branded 'Beefy' computers are a Valve trademark; they've been in games since Counter-Strike: Source. If you look closely in Half-Life 2, you can also identify that the CRT monitors are ViewSonics.
    In CSS at least (not sure about Left 4 Dead), if you shoot the side panel, it comes off, revealing a motherboard, CPU cooler etc. inside.
     
  14. keith1993

    keith1993 Regular member

    Joined:
    Aug 14, 2008
    Messages:
    434
    Likes Received:
    0
    Trophy Points:
    26
    Just the other day I couldn't help but notice this:
    [image]

    It's my mouse, albeit wireless and silver, and that's a Microsoft keyboard to go with it.
    You've got to admire the level of detail Valve put into their games.

    EDIT: oh no, the curse of ImageShack has struck again - the odd image is refusing to load...
     
    Last edited: Jul 24, 2009
  15. sammorris

    sammorris Senior member

    I can see the image fine. Also note the Asus CD-ROM drive...
     
  16. keith1993

    keith1993 Regular member

    lol, that's because I edited it with a re-uploaded image.

    Damn, I actually have enough for that PSU, but I'm gonna end up spending all my pennies on holiday - hence I won't be posting for a week starting tomorrow. Whitby (what's wrong with staying in the UK for your holiday?), here I come!!!
     
  17. navskin

    navskin Regular member

    Hi there peeps, how do I know if the two GTXs I've got are in SLI in Windows Vista x64? I have checked the NVIDIA control panel and there is no option for SLI.

    Thanks
     
  18. sammorris

    sammorris Senior member

    You can't SLI an 8600 with an 8800.
     
  19. shaffaaf

    shaffaaf Regular member

    He knows that - he has 2 GTX 260s, as seen on the last page.


    Umm, on the control panel, is there not an option somewhere for SLI? If not, check GPU-Z to see if it's enabled. The best way would be to benchmark a game, then take a card out and see if the frame rate drops.
     
  20. sammorris

    sammorris Senior member

    Remember I respond to 20 threads or so a day, if not more, so I tend to go by what's in people's sigs for reference; if they're not up to date then that's an issue :p
     
