
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Yup, I'm a nervous wreck! :p $330... I've read reviews of my GPU quitting prematurely, so I'm taking steps to ensure it lasts me a while.

    68F in my office, and the GPU still idles at 47C. Man they're friggin beasts LOL!
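    (Quick sanity check on that gap, just a back-of-the-envelope Python sketch using the two figures quoted in this post:)

        # Convert the 68F office temperature and compare against the 47C idle.
        ambient_f = 68.0
        gpu_idle_c = 47.0
        ambient_c = (ambient_f - 32.0) * 5.0 / 9.0   # 68F -> 20C
        delta_c = gpu_idle_c - ambient_c             # 27C above ambient
        print(f"ambient {ambient_c:.0f}C, GPU idles {delta_c:.0f}C above ambient")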
     
    Last edited: Feb 28, 2012
  2. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    About 25% from what I can see.

    Omega, that's the case with pretty much any GPU: they'll never idle much below 40 due to the low airflow of radial coolers and the ambient heat from the rest of the card (and the PC itself, for that matter).

    Even budget basic cards often idle in the 40s.
     
  3. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    The air in here feels soo cool, it's just amazing LOL! My CPU is idling around 25C :D I know, I know, idle temps are trivial! :p
     
  4. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Indeed, most stock components now, even high-end ones, use less than 15W at idle. That's as much as one older hard disk!
     
  5. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Anywhere in the 40s idle is quite cool for a video card actually. My 4870s would idle near 50-60 and load in the 80s.
     
  6. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Yes, because an HD4870 used 50W at idle, compared with the HD5800 series at 27W, the HD6800 series at 19W and the HD7900 series at 8W.

    For ref, the GTX200s were about 40-50W, the GTX400 series about 40W (60W for the GTX480, 110W with two monitors attached!), and the GTX500 series about 30W.
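    (To put those idle draws in perspective, a rough Python sketch; the 24/7 idle uptime is an assumption of mine, not something anyone claimed here:)

        # Annual idle energy gap between an HD4870 (50W) and an HD7970 (8W),
        # assuming the machine sits at idle 24/7 -- an assumed duty cycle.
        hours_per_year = 24 * 365
        kwh_saved = (50 - 8) * hours_per_year / 1000   # watt-hours -> kWh
        print(f"~{kwh_saved:.0f} kWh/year less at idle")  # ~368 kWh/year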
     
  7. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    My 260 idled cooler. But this one's certainly more beefy ;)
     
  8. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Unusual, but then the temperature sensor might not have been accurate anyway. Nvidia are known to produce intentionally miscalibrated temperature sensors, but this is usually more common with chipsets than graphics cards.
     
  9. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    LOL my 680i sensor was off by about 20 degrees. The laser thermometer was showing some scary temps. IIRC it averaged in the 80s and 90s under load on a daily basis, and that's with an active cooling fan right on the chipset sinks. The on-board thermistor showed more like 60s and 70s, and that was still worrying. Go figure that every Nvidia chipset after the wonderful 590 SLI would suck terribly. My ASUS M2N32-SLI Deluxe was one such 590 SLI board that performed admirably and ran at reasonable temps. Ofc this was also in the days when ASUS was known for quality hardware and intelligent board layouts. Passive chipset cooler as well, which is a stark contrast to the 680i's near-meltdown waiting to happen the second that little fan quits, which they did quite often.

    Have been considering putting one of my old systems back together, BTW. I still have the large majority of the original components working: 4400+ AM2 Windsor, X1800XT, etc. I also still have the 620W Enermax Liberty sitting here, with good components but a bad fan controller. Would like to get it tested before I bother repairing it, but eh, I really cbf'd to do it XD My 550VX and 620HX are still A-OK.
     
    Last edited: Feb 28, 2012
  10. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Fans on the nForce 6 series were still rare; mine didn't have one. A thermoprobe placed the heatsink of my 650i (less power-hungry than the 680) at 72C, which puts the internal chip temperature at 80C or more. The highest the board's own sensor ever read was 42C.
     
  11. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Ha ha! The new EVGA Precision has a frame limiter. The GPU can run even cooler now :p Limiting to 80fps drops the temps 10C in GRiD. I know, I'm overly cautious, eh? Perhaps... but I like things to run cool, having had Nvidia chips overheat in the past ;)
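    (For the curious, a frame limiter is conceptually just this; a minimal Python sketch of the general technique, not EVGA Precision's actual implementation, with render_frame() standing in for real rendering work:)

        import time

        TARGET_FPS = 80
        FRAME_BUDGET = 1.0 / TARGET_FPS          # 12.5 ms per frame at 80fps

        def render_frame():
            time.sleep(0.004)                    # placeholder for ~4 ms of real GPU work

        for _ in range(240):                     # a few seconds' worth of frames
            start = time.perf_counter()
            render_frame()
            spare = FRAME_BUDGET - (time.perf_counter() - start)
            if spare > 0:
                time.sleep(spare)                # the GPU sits idle here instead of drawing
                                                 # frames nobody sees - that idle time is
                                                 # where the temperature drop comes from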
     
  12. harvrdguy

    harvrdguy Regular member

    Joined:
    Sep 4, 2007
    Messages:
    1,193
    Likes Received:
    0
    Trophy Points:
    46
    Well thanks Jeff and Sam. That sounds better than "sucks." Haha.

    I'm with Kevin. That Zalman cooler (was it the V1000?) you told me about, Jeff, dropped the 3850 by so many degrees - and the very quiet fan always ran at 100% since it wasn't adjustable. I don't think I could ever get it to hit 80 after that aftermarket cooler. Prior to that it was many times in the 90s, bordering on 100, and I too was a nervous wreck. The only thing that helped was telling myself I only paid $110 for the card on eBay, not the $330 that Kevin just forked out!

    And on the current rig - the 8800GTX - it gets up above 90 in warm weather in FurMark (I know that's an artificially heavy load), even using RivaTuner to fix the fan at 100%, which I still do when I game. The Antec just doesn't have any decent airflow. I started to fix that somewhat with the Kama Bay in front - especially when I put a 3000rpm Kaze in there! You said, Sam, that I'd hear it through the headphones, but when I'm gaming I really don't hear it. (I play it pretty loud - people will come into the trailer and be standing right in front of me, and I don't know they're there, lol.)

    And I think I posted about six months ago that Miles said to me that after a year, it was unlikely anybody would ask for that rig back. I didn't tell him, but that sounded like I could drill some holes.

    So I mounted a side intake - another 3000rpm Kaze - blowing right on the graphics card, to complement the front intake. That right there dropped temps 10 degrees. Very rarely do I bust over 80 - but as you guys have just posted, when I do hit the 80s, I may be in for some crashing, which happens from time to time. With those two Kazes blowing, when I crank up the fan controller, a lot of wind comes out the back of that Antec.

    So I think that my comfort factor is more like Kevin's - mid-to-low 70s makes me happy; above that, no. :D

    Rich
     
  13. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Well, certainly don't let my opinions sway your decision to buy the game. A lot of people had fun with it, and it is a really interesting game. Part of my opinion is there so people don't expect too much from over-hyped games, and part of it is that I don't agree with lazy game-making. At least for its time, Bioshock 1 was creative, technically competent, and good-looking. There was no excuse for that steaming pile of a sequel.

    Sadly it's a problem stemming from the current generation of consoles lasting as long as they have. Any multi-platform release WILL be limited to match the consoles; it's up to the developers to decide if the game is limited FOR consoles or limited BY consoles. Regrettably, any company wishing to make a technological impact either develops for PC only or is heavily PC-centric. Even console gaming's biggest franchise, Call of Duty, started quite firmly on the PC. More interesting is that games designed as console or even platform exclusives end up being better quality in the end. God of War 3, Metal Gear Solid 4, and Halo Reach come to mind.
     
    Last edited: Mar 5, 2012
  14. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    No discussion on any of the other new generations?
     
  15. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    So the GTX680 is out. It's faster than the HD7970 (albeit the difference only shows at 1920x1200), it's cheaper than the HD7970, and it uses no more power than the HD7970. Long and short, it's better unless you run 2560x1600+, in which case it's about even.

    HD7970 price cut incoming or AMD are heading for trouble...


    Card / Performance Index @ 1920x1200 / Performance Index @ 2560x1600 / Idle power consumption / Load power consumption:

    HD6970/195/210/20W/210W
    GTX580/230/230/50W/250W
    HD7970/240/310/10W/210W
    GTX680/300/320/10W/210W
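    (Running those figures as performance per load watt, a quick Python sketch using only the table above:)

        # Performance index per watt at load, 1920x1200 column of the table.
        cards = {
            "HD6970": (195, 210),   # (performance index, load W)
            "GTX580": (230, 250),
            "HD7970": (240, 210),
            "GTX680": (300, 210),
        }
        for name, (index, load_w) in cards.items():
            print(f"{name}: {index / load_w:.2f} index points per watt")
        # GTX680 comes out ~1.43/W vs the HD7970's ~1.14/W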
     
    Last edited: Mar 22, 2012
  16. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    Wow, Nvidia did bloody well. Power consumption is shockingly good for the performance, considering it's from Nvidia.
     
  17. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Wish my hand hadn't been forced into buying a new card. My GTX 260 still had some life in it :( I guess it still could. But of course I won't know for a while. I'd hate to pop my PSU! I guess I could use an external power supply and see what happens. I have a cheapy lying around ;)
     
  18. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Have recently discovered the Crysis 1 port for Xbox 360. Built on CryEngine 3 and IIRC released shortly after Crysis 2. I almost cried thinking PC had lost its elite status, but LOL, the differences had me rolling on the floor. It's a competent version of Crysis, but it is not even in the same ballpark as the original. It does seem to run very well though. Have rented it for a few days just to try it, and it's definitely Crysis, but the entire game is stripped down or otherwise graphically compromised. The lighting, textures, draw distance and poly count are all akin to Low settings in the PC version, with something like medium shaders and shadows. Basically as bland as possible. It does, for the most part, resemble Crysis. They also adjusted the time of day in a few levels, so it's hard to make a fair comparison in some areas.

    The particle effects are still generally strong, but the physics are stunted. Nothing is quite as breakable or interactive. The plants don't blow in the wind, but they do move as you push through them. The framerate dips a bit but is a surprisingly stable 30 for the most part. Objects also have very noticeable pop-in, akin to the console version of Oblivion, ie s***.

    So basically this proves that, even extremely optimised, consoles cannot run Crysis properly. As a stand-alone console game it would be bearable, but it lacks most of what made Crysis impressive: the graphics. It looks like a slightly more polished 360 game, which is what it is. It's not terrible, but it's no credit to consoles either, because it's a 5-year-old game that barely runs at settings we could manage years ago, and only at 720p mind you.

    A lot of reviewers are saying it's a better use of CryEngine 3 than Crysis 2 was, because it's a better game besides. I'd have to agree. It is still Crysis, ie a damn good game. I'm enjoying it. I will say it does have better sound effects than Crysis 1 on the PC; it sounds nice. 100/100 for effort on the port. 7.5/10 as a stand-alone game, compared to the 9 to 9.5 of Crysis PC. Maybe 7.5 to 8.5 for Crysis 2, for console and PC respectively, the PC version scoring higher only because of the high-res texture pack and DX11, which are dang pretty.

    Also, woohoo Charter.

    www.speakeasy.net/speedtest
    Download Speed: 35453 kbps (4431.6 KB/sec transfer rate)
    Upload Speed: 4250 kbps (531.3 KB/sec transfer rate)
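    (Those two readouts per line are the same figure in different units; kbps to KB/sec is just a divide by eight:)

        # Convert the speedtest's kilobit figures to kilobytes per second.
        download_kbps = 35453
        upload_kbps = 4250
        print(download_kbps / 8)   # 4431.625 -> the 4431.6 KB/sec shown
        print(upload_kbps / 8)     # 531.25   -> the 531.3 KB/sec shown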
     
    Last edited: Mar 23, 2012
  19. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    From what I can gather, the trend comes out something like this:

    Figures stated for 1920x1200 / 2560x1600
    HD6950: 175/190
    HD6970: 195/210
    GTX570: 195/195
    GTX580: 225/230
    GTX590: 360/350
    HD6990: 380/400

    HD7950: 250/260
    HD7970: 295/305
    GTX680: 305/300

    Not had a look at the lower-end HD7 series yet, but as far as I can tell they're roughly analogous to their predecessors (i.e. 7870~6970, 7850~6950, 7770~6870, 7750~6850).
     
