That's true. But it's just an average. I'm sure the HD3850 is faster than the 7900GTX in the games you play.
The HD3850 256MB unfortunately gets owned at higher resolutions, but if you only game at up to 1280x1024 you're not likely to hit big frame drops caused by the memory alone, since the only games that need more than 256MB at that res aren't going to run smoothly on an HD3850 anyway. If you use 1680x1050, or more so 1920x1200+, then the 512MB matters more.

Estuansis: We've had the Crysis discussion before; it's not a valid ATi vs nvidia test. It's useful in the sense that, if you can run Crysis, you can run anything, but as a brand-for-brand comparison it's once again skewed in nvidia's favour (remember that this is how nvidia operates: landmark titles such as 3DMark, Oblivion and now Crysis are "optimised" to give the highest frame rates regardless of image quality). Given the different rendering techniques used, I avoid using Crysis for card comparisons. There's nothing explicitly wrong with how nvidia cards draw Crysis, hence how they got away with it, but a large amount of detail is missing compared to how the game was designed to look. "The way it's meant to be played" - what a laugh!

Abuzar: I used to trust the THG VGA charts, and I still trust their articles, but the recent VGA charts are a crock of ***t, because all of the nvidias outperform all of the ATis - a 6600GT beating an HD3870? Give me a break. Look for the individual articles on the 8800s and HD3800s for better testing.
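To put a rough number on the 256MB vs 512MB point above, here's a back-of-envelope sketch (assumed values, not measurements; real usage depends entirely on the engine, texture pool and driver) of how much memory the render targets alone eat as resolution and AA go up:

```python
# Back-of-envelope estimate of render-target memory at different resolutions.
# Illustrative assumptions only: 32-bit colour, 32-bit depth/stencil, and MSAA
# multiplying both buffers. Real games add extra targets (HDR, shadow maps,
# post-processing), so actual usage is higher and varies per title.

def render_targets_mb(width, height, aa_samples=1, bytes_per_pixel=4):
    colour = width * height * bytes_per_pixel * aa_samples
    depth = width * height * 4 * aa_samples
    return (colour + depth) / (1024 ** 2)

for width, height in [(1280, 1024), (1680, 1050), (1920, 1200)]:
    for aa in (1, 4):
        mb = render_targets_mb(width, height, aa)
        print(f"{width}x{height} {aa}xAA: ~{mb:.0f} MB")
```

At 1280x1024 even 4xAA leaves most of a 256MB card free for textures; at 1920x1200 with AA the buffers start crowding out the texture budget, which is roughly why the 512MB card pulls ahead there.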
Heh, for a pretty large list of games that's true. Of course, it doesn't apply to all games. I've run CS:S at 1600x1200 with 4xAA on only 256MB before, but then unless you use HDR, CS:S isn't really a very demanding game.
Link please? I'll believe it when I see it. Crysis and Oblivion look exactly the same on my PC as they did with the X1800XT at the same settings. Only differences I noticed were maybe smoother AA and sharper AF in favor of the ATI card.
That's because you haven't compared them side by side. As I say, you don't notice it directly, because there's nothing explicitly wrong with it, but there's a significant amount missing with the nvidia card. I posted it in one of the threads a little while back; VR-Zone demonstrate the difference in Crysis.
I built my mate a PC a few weeks back with an 8800GT, and had it running alongside my 3870. In CoD4 you tend to see more detail in the distance than you do with the 8800GT. It's subtle, but I think this is one of the 'nvidia traits': they do gain higher frame rates, but I agree with Sam, you can see it when you have them side by side. Plus, with the release of ATI Cat 8.3 the frame rate is up to 25% faster on the 3800 series cards (well, so they report). I have yet to check this out though; I haven't noticed much increase, but then my frame rates are high enough not to notice a small one.
The following performance gains are noticed with this release of Catalyst™:
- Call of Juarez: up to 20% improvement on ATI CrossFireX™ configured systems containing an ATI Radeon™ HD38x0 series product
- Company of Heroes DX10: up to 17% improvement, especially at lower resolutions, on ATI Radeon™ HD38x0 products
- Crysis DX10: up to 15% improvement on all supported ATI Radeon™ products
- Lost Planet DX9: up to 36% improvement across all supported ATI Radeon™ products and in ATI CrossFireX™ configured systems
- Shadermark 2.1: performance scores increased up to 35% across all supported Radeon™ products and in ATI CrossFireX™ configured systems
- Unreal Tournament 2004: up to 10% increase in performance on systems containing an ATI Radeon™ HD2400, HD2600, or HD3400 series product, or in ATI CrossFireX™ configured systems

That's from the release notes, but I read the 25% gain somewhere on the ATI/AMD site. I'm moving up to a 790X mobo soon; the Gigabyte GA-M56S-S3 is pretty crap, onboard LAN died within 3 weeks, as is common with a lot of nvidia chipsets.
Thus you understand the move to the AMD chipset; the SB600 is far better. Mind you, I'm leaving onboard LAN alone now, as I get a better ping through my PCI gigabit LAN card: ping on the CoD4 servers I tend to go on is 19ms compared to 45-70ms before. Just had a couple of games; the drivers are better, but still not 100% there in my eyes. Let's see what Cat 8.4 brings to the table.
I suppose I don't notice the ATI/Nvidia difference like you guys do. For me, ATI definitely has the edge in image/texture/line quality, but that's where the differences stop. I see no wrong in going for a higher-performing Nvidia card and sacrificing a few (relatively) minor details. Could somebody find that article for me? I can't seem to track it down.

EDIT: I'll have to admit, though, that I was going to get an HD3870. I DO notice the IQ differences, even if you guys notice MORE differences. Heck, the only game the HD3870 struggles on is Crysis. And I suppose you could still come to a happy compromise, seeing as the HD3870 produces impressive performance regardless: all high in DX9 @ 1280 x 960 with 2xAA is easily doable on the HD3870. But since I have the 8800GTS now, it would be a shame to upgrade so soon.

Plus, here's a kicker for me: they always test Crysis at high resolution in DX10 with very high settings. Of course it's not going to be playable. The game is made to scale forwards more than backwards; it's basically a big tech demo. I've noticed big framerate differences between DX10 and DX9. Since most gamers are still happily using XP, shouldn't they be testing in the most common conditions? All high (vs very high) DX9 at middling resolutions with some AA is going to give a more realistic picture of a video card's value and performance. I'd even settle for the very high XP tweak. Vista and DX10 kill performance; Windows XP is for me until they completely resolve those problems (and eliminate DRM).

And as for getting the best IQ? 2x/4x anti-aliasing cleans up lower resolutions significantly, even on my 1920 x 1200 monitor. 1280 x 960 w/ 4xAA looks as good (to me) as 1920 x 1200 w/ no AA AND gets higher performance. Honestly, I'd rather have had the HD3870 over even my 8800GTX; it just wasn't announced until after I got the 8800. Crysis aside, there is NO game where there is a significant advantage in choosing Nvidia over ATI. FPS of 90+ vs FPS of 120+ makes no difference to me.
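On the 1280x960 + 4xAA vs 1920x1200 point, the raw pixel counts back that up. A quick sketch of just the arithmetic (MSAA adds some bandwidth cost, but nowhere near the cost of shading the extra pixels, so this is a rough indication rather than a benchmark):

```python
# Pixel-count arithmetic behind the "lower resolution plus AA" argument.
# Per-pixel shader work scales roughly with pixels drawn; 4xMSAA multiplies
# the colour/depth resolve but not the full shading cost.

low = 1280 * 960      # ~1.23 million pixels
high = 1920 * 1200    # ~2.30 million pixels

print(f"1280x960  : {low:,} pixels")
print(f"1920x1200 : {high:,} pixels")
print(f"1920x1200 pushes ~{high / low:.2f}x the pixels per frame")
```

Roughly 1.9x the pixels per frame at the higher resolution, which is where the frame rate headroom at 1280x960 comes from.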
ck5134: Hmm, I have a board with an ATI chipset as well. Whilst it still works, it has its quirks, much like the other Asus boards I've had... The graphics drivers for ATI are nowhere near 100%, unfortunately, but neither are they for nvidia. In short, I'm not really impressed with anyone's graphics drivers, be it for XP or Linux.

Abuzar: Remember though, a 10% frame rate boost from overclocking my card earned me only 500 extra marks on top of my original 10K+ score. Overclocking it further for an extra 1% gave me an extra 200. I wouldn't gauge 3DMark as a measure of performance boosts.

Estuansis: I quite agree with the entirety of that post. Whilst you can say going for an image quality boost that is marginal in some games is puerile, so is going for a frame rate boost you're not going to see. Even a frame rate boost that is noticeable, e.g. the HD3870 to the 8800GTX, isn't big enough to justify an upgrade, and consequently isn't big enough to be worth panicking about.
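For what it's worth, the arithmetic on those 3DMark figures (using the approximate numbers quoted above, since the score was only given as "10K+") shows why the score doesn't track frame rate 1:1 - presumably the CPU tests and the score weighting dilute GPU-only gains:

```python
# Rough arithmetic on the overclocking figures quoted above (approximate).

base = 10_000        # original score, roughly
gain_10pct = 500     # marks gained from a ~10% frame-rate boost
gain_1pct = 200      # marks gained from a further ~1%

print(f"~10% more FPS -> {gain_10pct / base:.1%} more marks")
print(f" ~1% more FPS -> {gain_1pct / (base + gain_10pct):.1%} more marks")
```

A 10% FPS gain shows up as about 5% more marks, and a further 1% as about 2% more, so the mapping is neither proportional nor consistent - hence not gauging performance boosts by 3DMark alone.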