
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. shaffaaf

    shaffaaf Regular member

    BC2 is bug-filled though. I get the overexposure that Jeff mentions, the water fail that Sam mentions, and texture flickering randomly on snow or sand as well. Me don't like. This was with both one and two 4870s, and on my brother's PC, which has a 9800GT.
     
  2. sammorris

    sammorris Senior member

    Yeah, same, I saw buildings randomly appearing and disappearing simply by moving inches at a time, and that was also on a 9800GT. This is all with legitimate copies of the game too, so for once we can't blame dodgy cracks :p
     
  3. omegaman7

    omegaman7 Senior member

    Ok, I totally misunderstood you then. I was once told that GTA IV was a CPU whore. The GPU still has a role to play then, LOL! Can't wait to get more GPU memory, as well as the processing strength that will come with the right GPU. I like a more fluid experience ;)
     
  4. sammorris

    sammorris Senior member

    It is a CPU whore, even more so than a GPU whore. Doesn't mean it's not a GPU whore as well :p
     
  5. omegaman7

    omegaman7 Senior member

    Well... I'm NOT overclocking my GTX 260, LOL! I prefer to let it perform at its stock, guaranteed settings, LOL! Besides, I think within a few months' time I'm gonna upgrade to something ATI. Not sure what yet. 2-3 months down the road there could be something new and improved, which would drive current prices down. I'll definitely be paying attention to your posts regarding benchmark-to-price ratios ;)
     
  6. Estuansis

    Estuansis Active member

    Aha you're on about hardware requirements. I'm on about graphical quality, which is subjective ;P

    I don't remember saying anything like that??? It runs at 37 FPS average, all maxed, at 1680 x 1050. I haven't actually benched 1920 x 1200 yet. I would imagine it does a sight better than 20 FPS on a 5850 though.


    No game yet has let my imagination run wild like Crysis. It is one of the most immersive, atmospheric, and awe-inspiring games ever made, thanks to how it implements all the eye candy and brings the effects together to create an entire experience. As far as I'm concerned, Warhead is just Crysis with sharper textures, smoother shaders and cleaner shadows. My point being, they both create the same effect. Yes, Warhead is better, but I'm talking Crysis in general.

    YOU MUST PLAY IT!!! Quite literally the best console-to-PC port ever made. The graphics SHOCKED me. It actually looks amazing all cranked with AA. Very nice looking game.

    Umm, WTF? That's messed up. I hope that's not apples to apples.


    Sam, I basically agree with everything you said. As far as demanding games go, all the ones you listed hit it right on the head. I think we maybe got our arguments crossed up :p

    I only get the overexposure. Not a single other thing I can think of. The game renders perfectly in DX10.

    Oh, I quite agree, Omega. I never OC my video cards. The gain is nowhere near worth the cost if something goes wrong. With CPUs you just crash and try lower settings; with GPUs, well, I have seen several fry at stock voltage just from OCing.
     
    Last edited: Apr 10, 2010
  7. omegaman7

    omegaman7 Senior member

  8. sammorris

    sammorris Senior member

    Omega: It's certainly possible, though ATI claimed to be producing the HD5890 to nullify the GTX480. Given how bad it is, they don't really need to. Even if it's only a 12% improvement like with the HD4890 vs HD4870, that's enough to make the HD5890 a better card for performance, before you consider all the GTX480's flaws.
    The next ATI architecture is likely to be due in perhaps as little as 7 months.
    Jeff:
    That was my point, you claimed Metro 2033 was no more demanding than Crysis (at least, I think that's what you said). It is :p
    I am quite 'unstable' for immersion. An immersive game has to be fundamentally enjoyable, believable, and have no inconsistencies. Bogus science in a game meant to be realistic, or a sudden glitch in the graphics of an otherwise realistic-looking game, breaks the experience for me. I'm not set on stuff being realistic by today's standards though. I'm quite a fan of the 1987-2001 Star Trek series, because even though it's all bogus science, it's all consistent with itself, so having watched the earlier stuff, it all makes sense.
    I will have to try the RE5 PC port, as I never played the Xbox version in its entirety, just most of it, co-op, and not even always in high def :p
    As for the Metro 2033 shot, it is apples to apples, on the 'Very High' setting. Apparently it can't be replicated at lower detail levels.
    As for BC2, unless I'm meant to change it somewhere, it is running DX10 for me as well. I don't have a DX11 card yet, and I certainly didn't force DX9.
    I sometimes overclock my GPUs, but only for 3DMark lulz; in the real world I've never owned anything that overclocks significantly enough for it to be worthwhile, primarily because I buy the 'pinnacle' cards. If I buy the 4GB HD5970, I suspect that will be the same.

    Omega: The angles are different because it's a high-speed cutscene, but seriously, you don't notice the difference? The detail level on the left is pathetic compared to the right; look at the texturing.
     
  9. omegaman7

    omegaman7 Senior member

    The ATI screen looks slightly darker, and smoother. Perhaps you're right. But the Nvidia side had it tougher; there are angles in its shot that make it more difficult. In other words, the frames almost look rigged to favor ATI.
    I say this as a completely neutral party. I rarely favor ANY brand over another. I'm quite open minded in that respect. Unless a brand has wronged me... ;)

    It's also possible that your eyes are sharper than mine, LOL! I do wear glasses, but my ability to focus optically has decreased in the last decade. A particular illegal narcotic. My own fault...
     
  10. shaffaaf

    shaffaaf Regular member

    I preferred the look of Nvidia TBH. ATI's seemed, how do I say it, too sharp?
     
  11. sammorris

    sammorris Senior member

    Am I the only one who's noticed that the resolution is practically half?
     
  12. omegaman7

    omegaman7 Senior member

    Resolution? How do you figure? The angle is certainly way different, but how do you come to that conclusion?
     
  13. shaffaaf

    shaffaaf Regular member

    texture resolution?
     
  14. omegaman7

    omegaman7 Senior member

    You just lost me. Sorry... :S
     
  15. shaffaaf

    shaffaaf Regular member

    Look at his hair/(poncho?): the resolution of the textures on the Nvidia side seems to be lower than on the ATI side, i.e. making the ATI version seem sharper.
     
  16. Estuansis

    Estuansis Active member

    Noticed that too. It might be that the Nvidia one is the same resolution but a zoomed-in picture??? I'm not entirely sure why the two are different, but the game can't look THAT much worse on Nvidia, can it? I seriously question the validity of that comparison. I mean, I have no problem believing Nvidia have dropped the quality a bit to gain some performance and make their card more competitive. They've done it before. But I think there are tricks in that comparison used to expand the gap further. The angle is different as well, which, without AF, WILL affect texture sharpness (see the sketch at the end of this post).

    Not entirely sure though. I'd like to see the article that came from. Possibly taken out of context?
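
    To make the AF point concrete, here's a rough back-of-the-envelope sketch in Python (my own simplification of how mip selection works, not anything taken from the article or the screenshots): plain trilinear filtering picks the mip level from the larger axis of a pixel's footprint in texel space, so a surface seen at a grazing angle gets a blurry mip, while anisotropic filtering takes extra samples along the long axis and keeps the sharp one.

    import math

    def trilinear_lod(footprint_x, footprint_y):
        # Plain trilinear picks the mip level from the LARGER axis of the pixel's
        # footprint in texel space, so a steep viewing angle means a blurrier mip.
        return math.log2(max(footprint_x, footprint_y, 1.0))

    def anisotropic_lod(footprint_x, footprint_y, max_aniso=16.0):
        # AF takes up to max_aniso samples along the long axis and picks the mip
        # from what's left over, so the texture stays sharp at grazing angles.
        p_max = max(footprint_x, footprint_y)
        p_min = max(min(footprint_x, footprint_y), 1e-6)
        ratio = min(p_max / p_min, max_aniso)
        return math.log2(max(p_max / ratio, 1.0)), ratio

    # A texture seen nearly edge-on: one pixel covers ~1 texel across the surface
    # but ~16 texels along the view direction.
    print(trilinear_lod(1.0, 16.0))    # ~4.0 -> the 1/16th-size mip, smeared
    print(anisotropic_lod(1.0, 16.0))  # (~0.0, 16.0) -> full-res mip with 16 taps

    Seen head-on, the two footprints are roughly equal and both paths land on the same mip, which is why a different camera angle in a no-AF screenshot isn't really an apples-to-apples texture comparison.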
     
  17. omegaman7

    omegaman7 Senior member

    That much worse, THAT MUCH WORSE?! LOL! Have my eyeballs taken a vacation or something? LOL! The two really don't look THAT different in pic quality. The angles are the only LARGE difference in my opinion. I guess my eyes are no longer trustworthy ;)
     
  18. Estuansis

    Estuansis Active member

    The Nvidia picture has drastically lower texture and lighting quality. Don't blame your eyes, blame your tolerances. Sam and I are IQ freaks (Sam more so than I), and something "small" like that is very large to us indeed. This is essentially the difference between a good-looking game and one that vies for some of the best graphics of all time. The ATI shot has WAY sharper textures and much more depth in its lighting.


    Essentially, the Nvidia shot is horrifying to me, as I would imagine it is to Sam as well. His mention of the resolution refers to the actual texture resolution itself: the textures on the Nvidia side are about half the size, but stretched to fit the same model (rough sketch at the end of this post).

    I'm aware of the angle difference but I don't think that would cause THIS big a quality difference.
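
    As a minimal sketch of that "half the size, stretched to fit" point (a synthetic example of mine using NumPy and Pillow, not taken from the actual screenshots): halving a texture's resolution leaves only a quarter of the texels, and stretching it back over the same model can't recover detail finer than what the half-res copy could hold.

    import numpy as np
    from PIL import Image

    # Synthetic 512x512 "texture" with very fine detail: a 2-pixel checker pattern,
    # right at the limit of what a half-resolution copy can no longer represent.
    i, j = np.indices((512, 512))
    full = Image.fromarray(((i // 2 + j // 2) % 2 * 255).astype(np.uint8))

    # Halve the resolution, then stretch it back over the same area, the way a
    # lower-res asset ends up mapped onto the same model.
    half = full.resize((256, 256), Image.BILINEAR)
    stretched = half.resize((512, 512), Image.BILINEAR)

    print("full-res texels:", full.size[0] * full.size[1])   # 262144
    print("half-res texels:", half.size[0] * half.size[1])   # 65536 -> a quarter
    err = np.abs(np.asarray(full, dtype=float) - np.asarray(stretched, dtype=float))
    print("mean per-pixel error:", round(float(err.mean()), 1))  # large: the fine checker is averaged to grey

    The display resolution of the screenshot doesn't matter here; the loss comes purely from the texture itself being stored at half size.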
     
    Last edited: Apr 11, 2010
  19. sammorris

    sammorris Senior member

    Omega: The difference, considering they're meant to be the same detail level, is absolutely disgusting. No wonder you think GTA4 looks good! lol

    Jeff: It can't be as simple as zooming in, because all the light shading has gone down too; it's literally as if every detail slider has been set to 'high' instead of 'very high'.
    Let's be clear, HardOCP didn't even bother looking at this until they heard loads of complaints from the community, then began investigating...
    If this is how Nvidia get their 20% performance edge, I'm worried; a drop in detail like that should probably net you more of a frame-rate advantage than that.
     
  20. Estuansis

    Estuansis Active member

    Yeah, GTA4 looks good FOR A GTA. But considering the many titles that came out before and after... well, let's just say GTA4 is not with the times in the graphics department. Had they released it two years sooner and running better, then maybe the graphics would be considered AVERAGE. Not even kidding. Is it like the only game you play, Omega?

    Yeah just a theory :p I know exactly what you mean though. If that really is apples to apples, not only will I avoid Nvidia for the foreseeable future, I am now outspoken AGAINST them. Makes me sick really if that's true. I can understand a few rendering shortcuts here and there, but actually butchering the game? Not good business, not good tactics, and not good for gamers of any kind.

    No, I can't really place any blame on the OCP. They have been one of the very few bias-free review sites out there, with easily the most comprehensive performance and IQ reviews I have ever seen. If that really is a genuine shot, there is nothing you could do to get the two to look THAT different.

    It's either a mix-up of what settings they were using, or Nvidia has seriously failed in all ways in which it is possible to fail. And they're being bitter about it too :p

    Yeah, seriously, dropping to High in my game looks almost exactly like that shot, and I'll tell you what, the FPS boost is larger than 20%... considering Very High adds about 90% of the good visual effects...
     
