Maybe those are fake... I mean, when two 8800GTs in SLI (essentially the 9800GX2) only get 26FPS, how is this possible?
Given the infancy of the card I wouldn't rule that out. Given the comparative score to the GTX though, those tests seem legit.
Just performed a modest overclock on my GT, taking it from 650/1900 to 700/2000, the speed of the SSC edition card I was going to get but which wasn't in stock. Rather painless, I must say, with no noticeable jump in temps, and I cruised past 12k in 3DMark. Put over 400 points on my score. Doubt it'll help very much for Crysis tho!! Maybe an fps if I'm lucky!
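For perspective, the relative gains work out like this; a quick sketch using the clocks from the post (the ~11,600 pre-overclock score is an assumption, inferred from "past 12k" minus the 400 points gained):

```python
# Relative gains from the overclock described above.
# Clock figures are from the post; the baseline score is an assumption.
old_core, new_core = 650, 700        # core MHz before / after
old_mem, new_mem = 1900, 2000        # memory MHz before / after
old_score, gained = 11600, 400       # assumed 3DMark score before, points gained

core_gain = (new_core - old_core) / old_core * 100
mem_gain = (new_mem - old_mem) / old_mem * 100
score_gain = gained / old_score * 100

print(f"core +{core_gain:.1f}%, mem +{mem_gain:.1f}%, score +{score_gain:.1f}%")
```

So a roughly 7.7% core bump buying about a 3.4% score bump, which is about what you'd expect when the benchmark isn't purely GPU limited.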
Not bad... Suppose I ought to post this here as well: http://img98.imageshack.us/img98/4097/3dmarkrecordqv5.jpg
Nice. Oh, if I had a better C2D! That extra 500MHz is deffo giving you the edge. Knew I should have got the E6420; then I'd have the extra multiplier and could crack the 3.0GHz mark. I'm happy with it tho, and I'm sure you are with yours as well! At 2.99GHz and 11727, it's certainly an improvement on 8579 at 1.86GHz!
Sam, how does Crysis play? Maybe you can run some benchmarks, like the Crysis bench? What are the temps on your 3870X2?
Maccer: I run a 4300. Surely the E6320 is better? Waymon: X2? I own but the humble single HD3870, thanks very much. Just because I bought two of them... As for Crysis, using the hacked Ultra settings in DX9 (yes, Ultra; some random guy apparently tweaked the settings beyond Very High), I get 20-23fps at 1680x1050.
Well, as you know, the differences are the FSB and the cache: 200MHz vs 266MHz, and 2MB vs 4MB. Yes, at stock speeds the E6320 is better, but yours has the luxury of the higher multiplier and a better o/c! That's why, if I had the E6420, I'd be at 3.4GHz at the same bus speed. Interesting to note, though, that my CPU score is a few points higher even with the difference in speeds. Perhaps that extra cache is doing its job; I always thought the difference between 2MB and 4MB was pretty much nil. I'm sure your graphics card is making good use of that extra speed though, even if it doesn't make up for having half the cache in the CPU test.
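The clock maths there is just bus speed times multiplier; a minimal sketch (the 7x/8x multipliers are the stock values for the E6320/E6420, and the 427MHz bus is an assumption inferred from 2.99GHz at 7x):

```python
# Core clock = front-side bus x multiplier.
# Multipliers: 7x (E6320 stock) and 8x (E6420 stock).
def core_mhz(fsb_mhz, multiplier):
    """Resulting core clock in MHz for a given bus speed and multiplier."""
    return fsb_mhz * multiplier

fsb = 427  # assumed overclocked bus; roughly what gives 2.99 GHz at 7x
print(core_mhz(fsb, 7))  # E6320 at this bus
print(core_mhz(fsb, 8))  # E6420 at the same bus
```

Same bus, one extra multiplier step: that's where the "3.4GHz at the same bus speed" figure comes from.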
Whoa, wait a second, Sam! I thought you said the 3870X2 was tempting; that's why I thought you got that! lol Anyway, those scores are great for a single card then! Also, I use the Ultra high settings config as well! Looks so much better... I play it at 1280x1024 with around the same frames as you, though sometimes I get into the mid 30s in the jungle, I guess because there's not too much action.
It would be tempting, but for the fact it's much too expensive. Abuzar: because Crysis doesn't work properly with Crossfire.
The benefits of dual-GPU setups show up at 1920x1200 and higher resolutions with AA and full eye candy, etc. Hell, even your 3850 would be the same as the GTX at 1024x768 in most games, which would then be CPU limited.
Hey guys, I just thought of something. The 2900 PRO can be flashed to an XT easily. The 2900XT beats out the 3850, and it's only $144! http://www.newegg.com/Product/Product.aspx?Item=N82E16814102717
Yeah, but is it worth the extra power and the huge slot design? I also hear a couple of people on Newegg complaining about overheating, which a card like that with the giant cooler shouldn't be doing. So Abuzar, if you buy it, can I have your 3850?
I LOVE the "huge" slot design. It puts air OUT of my case. I don't like single-slot cards. Depends, how much do you want my card for? It's a pre-overclocked MSI 3850; I'll sell it for $180. Think my 550VX can handle the 2900XT?
The 550W VX will be able to handle the 2900XT, but bear in mind that if you have your PC on for 6 hours a day and game for one of them (less than what's normal for people like us I'd expect) your electricity bill will rise by about $250 a year...
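Whatever the exact figure, the arithmetic behind that kind of estimate is simple; a rough sketch (all the wattage and price-per-kWh figures below are illustrative assumptions, not measurements, so plug in your own):

```python
# Rough yearly running-cost estimate for a graphics card alone.
# All figures are illustrative assumptions, not measured values.
IDLE_WATTS = 60       # assumed idle draw of a 2900XT-class card
LOAD_WATTS = 220      # assumed 3D-load draw
RATE_PER_KWH = 0.12   # assumed electricity price in $/kWh

def yearly_cost(idle_hours, load_hours, rate=RATE_PER_KWH):
    """Yearly electricity cost in $ for the card at the given daily usage."""
    daily_kwh = (idle_hours * IDLE_WATTS + load_hours * LOAD_WATTS) / 1000.0
    return daily_kwh * 365 * rate

# 5 idle hours + 1 gaming hour per day, as in the scenario above:
print(round(yearly_cost(5, 1), 2))
```

The result scales linearly with both hours and wattage, so heavier gamers on a power-hungry card can multiply it up quickly.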
Well, the 2900XT uses at least double the power of the 3870 at load, so for what it's worth you may as well stick with the 3800 series. Short-term outlay, long-term gain. I'm going to do some investigating with my Sapphire single-slot card tomorrow. Its power usage suggested it was stuck in 3D mode; however, based on the results with my Powercolor card, it looks like the 3D-mode power consumption only applies until the drivers are loaded, then it spools down. If the Sapphire card can manage this in Windows, my concerns about it are unfounded. After all, it's noisy, but presumably only in 3D mode, and if you're overclocking, the reference cooler is louder anyway, because you have to bring the fan speed up.
I hate the coolers on the 3850. Stupid ATI, would two-slot coolers have KILLED them? All that heat in my case, passing over my CPU! I'm still selling my 3850 though, for $180. One of my friends might be buying it. Then I can get this: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130318 EDIT: Shit! Just went out of stock. OK, then I'll get this instead. They're having blow-out sales on the 8800GTs! The 3870 is actually 10 dollars more! http://www.newegg.com/Product/Product.aspx?Item=N82E16814500005