
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. Sweetman0 (Member)
    Yep I'm using a slimline case. This is depressing to hear D;
     
  2. Sweetman0 (Member)
    A bit off topic, but if I were to sell my PC, how much could I get for it? Just the slimline with a cordless keyboard/HP printer. I have a few games I could sell to raise enough money to buy a new PC, maybe.
     
  3. shaffaaf (Regular member)
    Inflated metric?
     
  4. sammorris (Senior member)
    All of the tests from GameGPU show a distinct correlation between different cards with no variation, suggesting that some of their results are extrapolated from a metric produced using a certain set of games, rather than every card being tested with every game. The GTX480 at GameGPU scores higher against its rivals and lower-end brethren than it does anywhere else, so they've presumably produced their metric for it based on an anomalous result.
     
  5. sammorris (Senior member)
  6. shaffaaf (Regular member)
    I didn't know there could be GFX conspiracy theories(!)

    Why has it? They are just reporting the numbers they got.
     
  7. DXR88 (Regular member)
    So Sam, would an AGEIA PhysX card defeat the limiter in place, and help an ATI card achieve optimal performance in Nvidia-based games?

    I've always been told, and it was my impression, that the PhysX cards were useless in a real-world scenario.
     
  8. sammorris (Senior member)
    AGEIA PhysX cards have long since been disabled by nvidia's drivers to stop people getting PhysX with Radeons. To get PhysX with AGEIA cards you have to hack the drivers, and because AGEIA cards were rare, the hacks are better optimised for using ATI primaries with geforce secondaries. PhysX is somewhat useless, as physics can never affect actual gameplay: that would make it impossible for nvidia owners to play against ATI owners online. As a result, PhysX is basically being used as a means to give nvidia 'better graphics' than ATI, through the use of GPU processing in PhysX to render particulate effects. None of this is very extensive, and a high-end CPU would be able to take care of it fine. Of course, that is why nvidia have coded PhysX so that if it detects it's being run on a CPU, it limits itself to using 0.4-0.5 cores, no matter how good your CPU is, making CPU PhysX impossibly laggy on any system. There's no reason for it; it's just another reason why nvidia are a corrupt organisation.
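
    (As a rough illustration of that claim, a toy model; every throughput number here is invented, and the only "real" figure is the ~0.45-core cap described above:)

    # Toy model of the claimed CPU PhysX throttle. All throughput numbers
    # are invented for illustration; the ~0.45-core cap is the claim above.
    def physx_frame_time_ms(work_units, on_gpu, n_cores=4,
                            core_throughput=50.0, gpu_throughput=400.0):
        """Time to simulate one frame's physics, in milliseconds."""
        if on_gpu:
            return 1000 * work_units / gpu_throughput
        # The cap: ~0.45 of one core, no matter how many cores you have.
        effective_cores = min(0.45, n_cores)
        return 1000 * work_units / (core_throughput * effective_cores)

    print(physx_frame_time_ms(10, on_gpu=True))              # ~25 ms
    print(physx_frame_time_ms(10, on_gpu=False, n_cores=8))  # ~444 ms, cores ignored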

    Shaff: Think about what the article is saying. Given HardOCP's prior reputation, that article alone will have earned nvidia a huge chunk of sales, despite them having used three of the most biased current games out there. HardOCP themselves commented that using 'Very High' on Metro 2033 with a geforce adjusts some of the settings down so the game looks worse, hence the higher fps scored by geforces. Bad Company 2 has serious issues with AA on ATI hardware; that's not nvidia's doing, but it's well known, and they omitted testing without AA. AvP is of course hugely nvidia-biased, and always has been. If there were a single game in there that wasn't massively biased, I wouldn't make such a fuss of it, but quite frankly that's an nvidia PR stunt, not a review article.
     
  9. shaffaaf (Regular member)
    Would love proof of the PhysX coding for the CPU as well.

    Once again, they are reporting on those games. People go and check to see if the games they want to play can be played, and with what GPUs. Now, if those games are popular, they will use them. Why wouldn't they? It seems to me that a lot of games have an nvidia "bias" over ATI. If so many games do, especially popular ones, maybe it is that the card is better.

    In the end, personally, most consumers don't care about bias. They want to know what card will run best for their game, and will buy according to that. Not too sure why you make a big deal about it.
     
  10. sammorris (Senior member)
    That's already been documented, I forget by whom; I think it may have been TechReport.
    As for why there are so many biased games: it's not because the cards are inherently better, since they gain no such lead in any of the older games that never were biased. It's simply that nvidia have a hand in the development of the majority of PC games now. Usually this only brings about a minor level of bias, but the titles nvidia are keen to focus on will have substantial boosts in them. Because ATI don't bribe developers like that, for any particular Radeon to be perceived as "as good as" its nvidia counterpart, it has to be a lot more powerful. Taking neutral games (there are still a significant number), you can see where the real performance figures are: GTX470 ~ HD5850, GTX480 = c. 1.07x HD5870, GTX465 = HD5830, GTX460 = 0.95x HD5830, GTX460 1GB = 1.05x HD5830, give or take a couple of percent. Any game where the GTX470 takes a significant lead over the HD5850, the GTX480 sits more than 10% above the HD5870, or the GTX460/465 are significantly ahead of the HD5830, is biased. It's not that the GTX cards are superior hardware (that's exactly what they want you to think); it's that those games are optimised for nvidia hardware. You could perversely argue that because such a vast quantity of games are biased nowadays, the geforces are effectively better by default, but given that almost none of the games I play on a regular basis are biased, that's not something I'm going to concede.
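
    (To put a number on "biased", a quick sketch; the neutral ratios are the ones I list above, and the fps figures are hypothetical:)

    # Flag a game as nvidia-biased when a geforce beats its radeon rival
    # by clearly more than the neutral-game ratio quoted above.
    NEUTRAL_RATIO = {
        ("GTX470", "HD5850"): 1.00,
        ("GTX480", "HD5870"): 1.07,
        ("GTX460 1GB", "HD5830"): 1.05,
    }

    def bias_factor(pair, geforce_fps, radeon_fps):
        """Roughly 1.0 in a neutral title; well above 1.0 in a biased one."""
        return (geforce_fps / radeon_fps) / NEUTRAL_RATIO[pair]

    # Hypothetical result: GTX480 at 90fps vs HD5870 at 70fps
    print(round(bias_factor(("GTX480", "HD5870"), 90, 70), 2))  # ~1.2: biased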

    Most consumers do indeed care about how the card they buy will run their games. Hence, if you've ever read any of my spec posts, you'll notice I often ask about the sort of games people play, because if those are considerably biased, now that Fermi hardware isn't awful (well, one of them isn't), it makes sense for them to buy a geforce. However, if they're not playing the biased games much and they buy a card based on how it performs in a biased test, they're considerably misled. I don't stick up for ATI, I stick up for honesty. Not something you find from nvidia, or anything to do with them.
     
  11. Estuansis (Active member)
    I agree with Sam 100%. Just because Nvidia-biased games are becoming the norm doesn't suddenly mean Nvidia hardware is superior. And for the record, a good chunk of my favorite games are Nvidia's bitches. But there are so many reasons not to buy Nvidia cards that any informed consumer would be a complete retard to do so. Only if you are building for a specific game would Nvidia be a reasonable buy; otherwise they are a waste of money.

    Three kinds of people pi** me off:

    - People who act with indignation when you respond unfavorably to disrespect
    - People who will argue to the death for a piece of information that is wrong by definition of fact
    - People who will blindly and adamantly refuse to think for themselves

    Sadly, most Nvidia fanbois fall directly under all three categories.
     
    Last edited: Aug 12, 2010
  12. ddp (Moderator, Staff Member)
    I'm innocent!!!!!
     
  13. Estuansis (Active member)
    Not necessarily saying that Nvidia cards are junk. In the games that are Nvidia-biased, they are very good. But it also stands to reason that Nvidia has paid big money to make themselves appear the better product.

    Not to say, either, that all Nvidia users are idiots. Many simply and truly are looking for the best bang for the buck, and a lot of Nvidia-biased games make it a tempting option. There is no real fault in this train of thought. One of my only other true PC geek friends uses a GTX285 and it runs his games excellently. He realizes all of the downfalls of Nvidia and he acknowledges them. But in the long run he really just wants to play some games, plain and simple. And this guy is very intelligent and analytical as well, along the lines of Sam. You would get along very well with him, Sam, as he offers good reasons for his hardware choices. He's a Gigabyte fan as well :D

    Premature death is a big issue for all Nvidia products. Personally I have seen:

    - Countless dead Nvidia motherboards; not coincidentally, the Asus P5N-E SLI is #1 for failures out of all boards ever made. Next after that are the 750i/a boards and the 680i boards. NForce 3, 4 and 5 were reasonably good.
    - 3 dead Nvidia-based laptops, again 2 of them being Asus.
    - Well over 10 dead Nvidia video cards, 3 of them owned by me.

    ATi products?

    - 1 dead X800GTO, after being unlocked to 16 pipes and overclocked to X850XT specs; presumably better cooling would have saved it.
    - Just about any Asus card or 3rd-party Sapphire card. Never seen a dead reference card personally.

    Just for the record, my 5-year-old X850XT is still running perfectly and sees frequent and heavy gaming. I can't say the same for my two 7600GTs or my 8600GTS, which were all newer, used much less power, and produced much less heat.
     
    Last edited: Aug 12, 2010
  14. Red_Maw (Regular member)
    If you ask me, that's a rough definition of "fanboy"; it doesn't really matter what they're a fan of.
     
  15. Estuansis (Active member)
    This is true. But Nvidia fanboys follow that list like a religion =/

    I can't say I'm not a fanboy either, having built an AMD rig versus a comparably priced i5 rig (if I had waited). At least I can admit that AMD is the worse product from a technical standpoint.

    Also, I have bought several Nvidia cards in the past. As previously mentioned, 3 of them have died, but one is still ticking just fine: my trusty old 8800GTS. Will be very interested to see if or when it dies...
     
    Last edited: Aug 13, 2010
  16. DXR88 (Regular member)
    Nvidia and Intel are the same cup of tea, only one's got 2 lumps of sugar instead of 1.

    'Nvidiot', I believe, is the term we're looking for.
     
  17. Estuansis (Active member)
    Well, Intel at least backs up their shoddy business practices with good technology. I actually can't level any decent complaints at Intel's products other than price, and even the price has improved lately. Nvidia is in an entirely different league of asshattery. Intel may have a few small biases in some things, but the CPU market is mostly fair performance-wise. Sure, they throw their branding everywhere, but they don't actively cripple the competition to the extent that Nvidia has done to ATi.

    Oh yeah, just hit 80 hours of Battlefield: Bad Company 2. Man, do I love that game. Where have you been all my life? lol
     
  18. sammorris (Senior member)
    Textwall incoming.

    The funny thing is, having had a long and hard think about it last night, running an SLI system for my next upgrade actually makes a lot of sense. Yes, it's capitulation to the corrupt organisation I've been trying to avoid, but on all other points:

    -All the multiplayer games I play routinely except Alien Swarm (which isn't likely to take off unless they add much more content) run fine on my current cards. They may be a little CPU-bound, but they're not GPU-bound.
    -All the multiplayer games I play routinely are neutral or ATI-biased.
    -A considerable majority of the crazy single-player games I'd like to be able to play, but can't run smoothly, are nvidia-biased games.
    On that basis, if I upgrade, it actually makes more sense at present to buy an nvidia card, unless ATI gain such an advantage in performance/£/$/whatever that this is nullified.
    They used to have such a lead, but nvidia have done the one thing that can make up for the Fermi architecture: slash prices.
    Back in the spring when I considered upgrading, this is how things stood:
    HD5850: £225
    GTX470: £310
    HD5870: £320
    GTX480: £420
    HD5970: £520

    Now in August this is how things stand:

    GTX460: £150
    HD5830: £160
    GTX460 1GB: £170
    GTX465: £175
    GTX470: £220
    HD5850: £230
    HD5870: £310
    GTX480: £360
    HD5970: £480
    Now, you occasionally find deals for less than that (an HD5870 for £290, for example), but that applies to both brands.
    When you consider these cards against their rivals, it's not exceptional, but it's about right. The GTX460s sit either side of the HD5830, both in price and in performance. The HD5850 and GTX470 are similarly priced. The HD5870 is still cheaper than the GTX480 by a reasonable, but not huge, margin.
    However, apply the good old nvidia bias and this changes considerably; things get worse when you apply high levels of AA, and considerably worse in dual-GPU mode.
    Perhaps unsurprisingly, the games where nvidia leads on a single GPU are the games where nvidia trounces with two. Any game with a hint of bias towards nvidia, whether it's their fault or not, tends to see c. 100% scaling with at least two GPUs. This is something only the closest ATI games (i.e. Valve and STALKER titles) can achieve.
    While I can question their choice of titles, I can't really question HardOCP's data: in the right game, two GTX460s (£340) pull serious leads over two HD5870s (£620).

    So, from a performance perspective, it seems beneficial to go with SLI.
    Now consider heat/noise/power consumption. No upgrade there, but not much of a downgrade either. Given that in a high-end game my 4870X2s run at pretty much 100% fan speed (5000rpm+) versus the 4000rpm of the GTX480, the high temperatures of the GTX400s will be helped by my side fans, and the power consumption of two 480s versus two 4870X2s is pretty much identical.
    The only problem, therefore, is reliability.
    You stated that the P5N-E SLI was the #1 failed board of all time; I'd imagine it probably shares that title with the Striker boards. Most P5N-E SLI boards made it to 6 months before failing. I don't know of a Striker board that made 3 (out of c. 20 boards).
    Still, despite that, we'll lay some of the blame on Asus, since there have been premature demises of plenty of non-nvidia boards from them.
    nvidia cards are less reliable than Radeons, that's pretty much a given. However, most failures tend to happen 18 months to 2 years down the line, or more, which is roughly in tune with upgrade time. The only risk is if the market stagnates for a period, or we get a period of minimal gains like the one we may be in at present.

    I drew out the benefits I would have from each upgrade type, taking 100 as the score of an HD4870.
    This chart only considers GPUs on their neutral merits with expected CF/SLI scaling. Biased games are not considered here, only biased CF/SLI scaling.

    2xHD4870X2 - 100-400, typical(ATI)350, typical(neutral)250, typical(nvidia)170 - sums 770, 572W TDP, £0 required
    HD5970 - 155-310, typical(ATI)300, typical(neutral)270, typical(nvidia)220 - sums 790, 294W TDP, £480 required
    2xHD5870 - 180-360, typical(ATI)350, typical(neutral)320, typical(nvidia)250 - sums 920, 376W TDP, £620 required
    HD5970+HD5850 - 180-490, typical(ATI)470, typical(neutral)360, typical(nvidia)280 - sums 1110, 445W TDP, £700 required
    HD5970+HD5870 - 180-540, typical(ATI)520, typical(neutral)390, typical(nvidia)290 - sums 1200, 482W TDP, £790 required
    2xHD5970 - 155-620, typical(ATI)540, typical(neutral)390, typical(nvidia)280 - sums 1210, 588W TDP, £960 required
    2xGTX470 - 160-320, typical(ATI)270, typical(neutral)290, typical(nvidia)320 - sums 880, 430W-500W(?) TDP, £440 required
    2xGTX480 - 190-380, typical(ATI)320, typical(neutral)340, typical(nvidia)380 - sums 1040, 570W TDP, £720 required

    Assigning all types of game an equal weighting, two 470s score similarly to two 5870s, for two-thirds of the price, with a higher TDP, but less than I presently use.
    Two 480s score almost as high as the 3-GPU setups, with a noticeably higher TDP but the same cost, though obviously the lead is taken in the nvidia titles.
    An interesting point this raises is how useless the quad-GPU setup with the 5970s seems.
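
    (Purely as a sketch of that equal weighting, with the figures copied from the list above; the 470 TDP is the midpoint of my 430-500W guess:)

    # Reproduce the "sums" column (ATI + neutral + nvidia typicals) and
    # add score-per-pound and score-per-watt. HD4870 = 100 baseline.
    setups = {
        # name: (typ_ati, typ_neutral, typ_nvidia, tdp_w, price_gbp)
        "2xHD5870": (350, 320, 250, 376, 620),
        "2xGTX470": (270, 290, 320, 465, 440),  # TDP guessed at 465W
        "2xGTX480": (320, 340, 380, 570, 720),
    }
    for name, (ati, neu, nv, tdp, price) in setups.items():
        total = ati + neu + nv
        print(f"{name}: sum {total}, {total/price:.2f} pts/GBP, "
              f"{total/tdp:.2f} pts/W")
    # 2xGTX470 works out at ~2.0 pts/GBP vs ~1.5 for 2xHD5870: the
    # "similar score for two thirds of the price" point above.
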
    I'll now include, for pure speculation, how two HD6870s might fare in the test.

    2xHD6870 - 260-520, typical(ATI)500, typical(neutral)460, typical(nvidia)360 - sums 1320, 500W TDP, £800 required

    Interesting. Price and TDP are of course absolute guesswork, but this scores far higher than two 480s, coming very close to them even in geforce titles (unbiased ones at least), with a lower TDP and a not dissimilar cost.
    Will have to wait for the real facts there.

    However, here comes the tricky part. Let's take a massively nvidia-biased game, of which there are plenty:

    Aliens vs Predator - DX11, 2560x1600, no AA (minimum / average fps)

    HD5870 CF: 21 / 63.5
    HD6870 CF (est): 30.5 / 92.1
    GTX460 1GB SLI: 32 / 69.5
    GTX480 SLI (extrapolated): 47 / 104.6

    Plenty of room in the average fps to turn AA on, you'd think, until you see the minimum fps figures. Those fall into distinct categories: the HD5870s are unplayable in parts, the HD6870s/460s are only rough in parts, and the 480s are actually (for AvP at least) pretty fluid, with a 55%+ gain in minimum fps, which is arguably more important.
    Also consider that the 480s have 1.5GB of RAM, which may be necessary when AA is applied.

    Now let's try a title quite important to my system upgrade:

    Battlefield: Bad Company 2 - DX11, 2560x1600, 4xAA, HBAO (minimum / average fps)

    HD5870 CF: 12 / 43.8
    HD6870 CF (est): 17.4 / 63.5
    GTX460 1GB SLI: 22.5 / 46.6
    GTX480 SLI (extrapolated): 33.3 / 70.2

    Once again, there's not much in it for average fps, but it's a slam dunk for minimums. Even the best minimum of 33 is really too low for a multiplayer game, so HBAO would likely be turned off in a real example.
    I will also concede that Catalyst 10.7, despite being officially released as bug-free in BBC2 (hence why it must be the driver that is benchmarked), has severe performance problems in the title. Previous Catalyst releases do fare better. Of course, I doubt they're the requisite 91.4% better that would put a future unreleased GPU on an even footing with one that's been out for 4 months and already costs less than the HD6870 probably will on release, in a game that nvidia had no hand in biasing.
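
    (For the record, the 91.4% is just the minimum-fps gap in the BBC2 table above:)

    # Where the 91.4% comes from: extrapolated GTX480 SLI minimum vs the
    # estimated HD6870 CF minimum in the Bad Company 2 figures above.
    gtx480_sli_min = 33.3
    hd6870_cf_min = 17.4
    print(f"{(gtx480_sli_min / hd6870_cf_min - 1) * 100:.1f}%")  # 91.4%
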
    You can see my point here, right?
     
  19. Estuansis (Active member)
    You do remember that Nvidia kills image quality to gain FPS, right? After seeing the horror that is Metro 2033 on an Nvidia card, I don't think I will ever buy an Nvidia card again. That, and considering that Nvidia holds the lead in so many games literally due to bribes? I refuse to buy them as a matter of principle.

    Consider that the cheapest 5850 available on Newegg is still within ~$20 of what I bought mine for at the original MSRP. I mean, Nvidia has already had to slash prices significantly to match the value of ATi cards, and ATi have barely budged yet. I would assume they want to keep their lead, and thus will match Nvidia's slash with one of their own, especially with the impending release of the HD6 series in the not-so-distant future. The market didn't suddenly set in concrete. AMD/ATi are experienced competitors.

    I mean, looking at your own prices: from £310 to £220 on the GTX470? That's one hell of a price cut. ATi won't sit idle for long. I'd say the war is just beginning :p

    Though, like my other friend, I wouldn't blame you one bit for jumping on the Nvidia wagon if you could truly justify it. I think it would be sort of neat to have both an ATi and an Nvidia rig, though. Lord knows that Q9550 isn't even near the end of its useful life.
     
    Last edited: Aug 13, 2010
  20. sammorris (Senior member)
    There's conclusive proof of that for Metro 2033, but not for any other title, and I've seen plenty of comparisons. Even if you ignore the needless requirement for PhysX to enable some effects in games, the fact that support for AA is coded out for ATI cards but not for nvidia means that, one way or another, for a fair few games you stand to get better image quality using geforce cards. Metro 2033 so far seems to be the only unfortunate example.
    The fact that ATI haven't cut their prices since launch (with the exception of the HD5970) shows they're confident; a little too confident, I expect. They released the HD5830 far too expensive for what it was and had to cut it back, same as nvidia had to do with the GTX470/480. nvidia are famous for releasing the fastest single-GPU card ever and charging through the roof for the privilege: they did it with the 8800GTX and Ultra, they did it with the GTX280/285, and they did it with the 480. They couldn't do it with the 470, as the HD5870 was superior. However, when they release their midrange cards they usually price things right (cite the 8800GT, GTS250 and GTX460). The GTX460 is priced exactly right.
    To my mind, ATI are charging prices that were relevant back when they were utterly superior to geforce products, i.e. everything up to shortly before the GTX460's arrival and the corresponding price cuts. Consider the cards on neutral merits, and the pricing is right. Consider how many games are nvidia-biased, and the Radeons look overpriced, especially when you look at multi-GPU setups.

    The problem I have, of course, is cost. There would be plenty of ideal outcomes to this confusion if I had ample money to spend. Sadly, I don't. As it stands now, geforce price cut or no price cut, beating what I have now is still quite marginal, and costs well in excess of what I paid for my last cards. Here's hoping something good comes of the 6870s to stir things up, and soon. Having both an ATI and an nvidia PC would be quite handy, but very expensive.
     
