
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. shaffaaf

    shaffaaf Regular member

    Basically they took the G80 and pumped it with steroids to get the GT200 core (exactly what Intel did with the P4s, and look how AMD destroyed them).

    It has 1.4 billion transistors, a 512-bit bus, and 1GB of RAM (this is the GTX 280, not the GTX 260), plus 240 shader processors (now called shader cores to compete more with Intel).
     
  2. sammorris

    sammorris Senior member

    To be fair though, when Intel pumped the P4 with steroids it actually got slower; the GTX280 is still a lot faster than its predecessors, just not faster enough to warrant the price.
     
  3. Estuansis

    Estuansis Active member

    So it's basically a G80/G92 with overkill specs? That makes sense, and it sounds like a desperate move to maintain market supremacy, seeing as the only GeForce 9 card worth the PCB it's stamped on is the 9600GT, IMO.

    Nvidia newsletter says that this will be their last single core card. Supposedly they are coming out with duals and quads pretty soon. I don't know how that'll work, but I see ATI making the same move eventually.

    Also, the 9800GTX has dropped to the $200 price point at Newegg.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814500039
     
  4. sammorris

    sammorris Senior member

    Given the current state of Crossfire, that stands to lose nvidia a lot - SLI is good and all, but the gains being posted by AMD's current dual card offering are nothing short of eyepopping compared to how dual GPU performance used to be, and for a big part still is with SLI.
     
  5. Estuansis

    Estuansis Active member

    AFAIK, Crossfire has always scaled better than SLI. Even X850 crossfire was showing near 100% performance gains.

    The GTX280 is a very impressive card, but the price is laughable given the $300 price tag on the 4870. I hope Nvidia doesn't release their next card too soon. AMD/ATI needs some time on top after a disappointing last-gen effort :)

    Any news on what AMD plans to do with Phenom? Are they releasing a successor any time soon or are they making any improvements?
     
    Last edited: Jun 30, 2008
  6. sammorris

    sammorris Senior member

    Yes, but how often did you see those gains? :p

    As for what nvidia have planned next, I think their next offering will probably be a dual card based on a pair of GTX260s, with some tweaks to get the power consumption similar to that of the GTX280.
     
  7. Estuansis

    Estuansis Active member

    By dual card do you mean something like the 9800GX2? From what I've heard it's going to be an actual dual core. If that's the case, quite a few older games might have compatibility problems like what we saw during the jump from single to dual core processors.

    Also, check this out.

    http://www.fudzilla.com/index.php?option=com_content&task=view&id=8001&Itemid=1

    If this can be done on a regular basis, I might seriously make the jump back to AMD. If only for brand loyalty.
     
    Last edited: Jun 30, 2008
  8. sammorris

    sammorris Senior member

    I don't really see the advantage of going dual core with graphics cards. Another new technology to have to program for, and it's not as if you multitask with graphics; you're just trying to add processing power... The only benefit I see is if it means the disadvantages of SLI are removed, in which case I'm all for it.
     
  9. Estuansis

    Estuansis Active member

    I can see gains from dual core, but I don't think they have fully tapped the power of single-core graphics yet. We are a long way away from needing an added core. Even Crysis still has the same basic formula as Far Cry: models, textures, shaders. And that's it. No extra effects to utilize a dual core.
     
  10. sammorris

    sammorris Senior member

    Heh, yes, Age of Conan typically runs at 40-60% GPU usage on ATI cards... what a farce.
     
  11. Estuansis

    Estuansis Active member

    Exactly. Look at Bioshock. It's based on the same Unreal 2.5 engine as Rainbow Six Vegas, but it runs about like FEAR. Maxed settings at 1920 x 1200 with 4xAA and 8xAF get me ~65FPS average. It runs smooth as butter and looks fantastic.

    Granted, it looks about as good as FEAR from a character model/environment standpoint, but the lighting looks much better, IMO. A good example of a well optimized game regardless of which platform you run it on, be it PC or Xbox 360.

    Keep in mind that, aside from the monitor and RAM, this is a fairly budget oriented rig. Even the X1800XT 512MB can handle it maxed at 1280 x 1024.
     
    Last edited: Jul 2, 2008
  12. shaffaaf

    shaffaaf Regular member

    The Unreal engine is one of the few that actually take advantage of quad cores.

    UT3 shows big gains, and with the Unreal engine supporting PhysX (the Ageia one that Nvidia now controls), the Unreal 3 engine is a good one, well, bar the fact that you have to set AA from the Nvidia control panel/CCC, as they don't provide it in game. haha
     
  13. Estuansis

    Estuansis Active member

    That's never been a big deal for me as I always use the control panel anyway. Instead of using separate profiles for each game, I just go in and change my settings depending on what game I'm playing.

    Bioshock and Rainbow Six Vegas are based on UE 2.5. It's basically Unreal 2 modified with parts of Unreal 3. So they don't get the quad core advantage like UT3, but AFAIK they still support dual core.

    I only wish RS: Vegas had been optimized for better performance. It runs like poop even on heavy duty SLI/Crossfire configs.

    Unreal engine 3 is a very good engine that I can see staying around at least as long as other Unreal versions. It performs excellently and produces jaw-dropping graphics. Gears of War for PC runs like a dream on my rig and is easily a graphical rival to HL2: Ep2 or Crysis, IMO.

    A lot of people say Unreal Tournament 3 isn't that good looking a game. But it has a lot to do with the style of gameplay. It's too fast-paced to stop and actually enjoy the graphics that much. The game is loaded with very high resolution textures and very nice looking models.

    @shaffaaf, how well does UT3 run on your 7800GT?
     
    Last edited: Jul 2, 2008
  14. sammorris

    sammorris Senior member

    Estuansis: Bioshock runs well. To say that it's like FEAR in that regard is a bit of a misnomer. FEAR is the most scalable game I have ever seen. It can be scaled to run well on even an X1400, or it can be scaled to run like dump on even a GTX280 - yes, you heard. It basically comes down to the fact that the shadow engine in FEAR is incredible - it can deliver typical run-of-the-mill shadows or no shadows, or very intensive effects that were ahead of their time for 2005. We're saying Crysis is ridiculous now because we can't get anywhere near to turning it to max, but in truth, maxing FEAR in 2005 was like maxing Crysis now; it was just less noticeable with FEAR because if you dropped the settings a bit, it would actually run properly.
    Rainbow Six Vegas should really be excluded from Unreal engine comparisons because it's a port, and one of the worst (not THE worst, that one goes to Double Agent), but any game that can pull only 4fps on an HD3870 at some settings when it was made back in 2006 gets a gold medal for the 'ridiculous requirements' list.
    The reason it runs badly on SLI/Crossfire setups is that it isn't supported. I make 4fps with one 3870, but made only 3 with two of them, on late drivers too.

    Back to Bioshock, I ran it at 1920x1200 back with my old X1900XT with all the eye candy on, I'm pretty sure it made a good 40fps or so. Not too shabby at all!

    Shaff: You mention PhysX, but other than Cell Factor, UT3, and the 3dmark vantage test, what actually uses it, years on from its introduction?

    I've actually stopped using Catalyst control center now - over the past two or three XP installs, I've racked up over five thousand errors in my system log for Catalyst control center issues. The program's still a joke, even now. No wonder it takes so long to start up.

    Gears of War and UT3 look, and run quite similarly. I'd give UT3 one extra plus point for graphics for better textures, but two minus points for very poor quality motion blur and haze, that spoil the pretty textures underneath. Both can be maxed at 1680x1050 with an X1900XT, and both can be maxed at 2560x1600 and be smoothish with an HD3870. The only opportunity I had to test crossfire with them, CF wasn't yet implemented for the games, but now that it is, undoubtedly any HD3000 or 4000 crossfire setup will mince them even at my res. I didn't test AA, because it's not in the game.
    As for the overall quality, HL2 Ep2 beats GoW/UT3 quite handily for graphical quality - whilst some parts of crysis are nice, others are so bad, I place it further down on the list.


    Wow, that was a long one! :)
     
  15. shaffaaf

    shaffaaf Regular member

    SAM, STOP MAKING ME READ :p

    @ Estuansis

    I now have an HD 2900GT, and I haven't actually got the game (no money to buy it yet, but it's my next one :) )

    sorry.

    Sam, now that Nvidia have bought it, and with the plethora of TWIMTBP games coming out, they will take the opportunity to use CUDA to implement it. I'm sure of it. Look how Nvidia killed DX10.1 with TWIMTBP. I'm very sure it will be implemented, just as Blizzard have now partnered with ATI and are announcing Havok physics for Diablo 3.
     
  16. sammorris

    sammorris Senior member

    Indeed, nvidia using their market dominance to abolish a technology that offers superior image quality. Well, it wouldn't be the first time eh?
     
  17. Estuansis

    Estuansis Active member

    http://www.guru3d.com/article/geforce-gtx-280-review-test/18

    Is this the article you're referencing? Either way, wow, 27FPS on a GTX280. You wouldn't think FEAR would be so graphics intensive. I'll have to admit that it's an incredible looking game, despite its age. Kind of drab and boring though. No change in scenery.

    I know, isn't that nice? Even the 7600GT could handle it mostly high at 1280 x 1024. That makes it easy to play on an older PC. Check this out:

    http://www.youtube.com/watch?v=FFoNAHPQxKg

    Not only did it run decent, it looked decent. I think this video says a lot about the game. Like the guy in the video says, a lot of gamers would be happy if the developers released a shader model 2.0 patch. An X850XT or X800XL could handle the game just fine at 1280 x 1024 with medium-high settings.

    Episode 2 is by far the best looking game ever made, IMO. It's not quite as "pretty" as UT3 but the models are very smooth and the lighting is very natural.
     
    Last edited: Jul 2, 2008
  18. sammorris

    sammorris Senior member

    I think so yeah. You make a good point about FEAR, my comment about it has always been it's one of the best looking games in the worst looking environment - so much more could probably be done with the engine than to render those sorts of environments.

    Agreed - Crysis isn't good in its entirety so you lose the effect; games like GoW/UT3 aren't as good looking full stop, but HL2:Ep2, whilst not offering as many stunning effects as other games, looks the business, and does from start to finish - there isn't a section I can name where there's a sudden break in the quality of the graphics - they're uniformly excellent, as is the performance. 2560x1600 with AA on one 3870? How often do you see that in games produced after 2004?


    I just spotted this post from earlier:
    It's the same with the HD3870 - if you own an HD series card, you can't run Return to Castle Wolfenstein - not supported. The reason is that 3dfx Glide support is no longer added to new graphics cards, so games that rely on it won't work full stop - break out the software rendering! Ugh...
     
  19. Estuansis

    Estuansis Active member

    It's the same with Call of Duty: United Offensive. And to be honest, UO looks closer to CoD2 than CoD1. These are modern games, but the newest hardware no longer supports them. WTF. I'm even using XP with Dx 9, and I still have problems.

    Thankfully the X1800XT plays all Q3 engine games with no problems. A 2.4GHz X2 and a 512MB X1800XT are still a very potent platform for a lot of the latest games. There are very few games it can't run with fairly maxed settings at 1680 x 1050.

    The Q3 engine is very scalable as well. I had no problem getting CoD: UO to run choppy on a 7600GT with the settings cranked right up. There's no reason these games shouldn't be supported. They still have huge fanbases.

    EDIT:

    Just saw this:

    Rename R6Vegas.exe to FEAR.exe. Your performance will go through the roof.
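
    If anyone wants to script that, here's a rough sketch in Python (the install folder below is just a made-up placeholder, point it at wherever your copy actually lives); it keeps a backup of the original exe before renaming:

        import os
        import shutil

        # Sketch of the rename trick above. The folder is an example path only -
        # change it to your own Rainbow Six Vegas install directory.
        game_dir = r"C:\Games\Rainbow Six Vegas"

        src = os.path.join(game_dir, "R6Vegas.exe")
        shutil.copy2(src, src + ".bak")  # keep a backup of the original exe
        os.rename(src, os.path.join(game_dir, "FEAR.exe"))  # driver now sees "FEAR.exe"

    Restoring the .bak (or renaming it back) undoes the trick if anything misbehaves.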
     
    Last edited: Jul 2, 2008
  20. abuzar1

    abuzar1 Senior member

    Hmm, I wish I could talk about playing games. Unfortunately, all I can tell you is that Mass Effect looks pretty damn good on my 360.

    By chance, have any of you played Advent Rising?
     
