Hello. Fanboys, please stay away from here. I am a neutral PC enthusiast who has an ATI X1900 XTX and is thinking about either buying an X1900 CrossFire master card OR selling the one I've got and buying a different card: an 8800 GTX, or something less expensive like an 8800 GTS (320 MB or 640 MB), or maybe a Radeon HD 2900 XT? I searched for a while and couldn't find a decent benchmark showing X1900 CrossFire performing against a single new-generation card (8800 GTX, etc., the ones mentioned above). Any help would be greatly appreciated! Thanks! EDIT: Sorry, I forgot to say I have a Conroe E6600 running at 3.15 GHz per core, 2 GB of memory, and an OCZ 700 W power supply (so PSU power fortunately isn't an issue).
Plain and simple: one newer card is better than two older cards, period. In a year those two cards will be outdated, while the nVidia 8800 will still be a fairly decent piece of hardware. Plus you get DirectX 10 support.
Yeah, you're right, I shouldn't be thinking about CrossFire here, but about getting a new card. I think the 8800 GTS 320 MB is going to be my choice. Thanks, dude. PS: I'm not worried about DX10; I think that by the time I play anything in DX10, I'll probably have sold this card already and gotten a new one.
The 8800 GTS is the cheaper solution. It may not bag you quite as many frames per second in the games that support CrossFire, but in the overwhelming majority that don't, the 8800 GTS will be significantly faster, not to mention quieter, smaller, and cooler. In fact, there really isn't a lot going for the CrossFire option. Don't get me wrong, I'm hardly an nVidia fanboy; I haven't used one of their cards in nearly four years. But right now, the 8800 is the only realistic option for high performance.
How does the nVidia 8800 series compare to the Radeon HD 2000 series? I have a 7950 GT OC and have thought about upgrading it sometime, though not right away, so I was wondering which one is doing better at the moment. Then again, I've heard something about a new 8900 coming out too (but I don't know when), so that might be something to consider, depending on the price.
I'll tell it like it is (no fanboyism, just truth):
8800 GTS: cheap, fast, quiet.
8800 GTX: expensive, very fast, quiet but hot.
HD 2900 XT: quite expensive, not very fast, very loud and very hot.
Go for an 8800 every time.
Yeah, then it sounds like it's something from the 8800 series for me. The 7950 I have now already runs way too hot for me; it gets to around 90 C with GRAW, and that's with a slot fan installed! Sorry, off topic, but I was just wondering. Thanks for the info.
90C is actually fine for a graphics card, lots of them run that hot. The 8800s don't, but my X1900 did, until I replaced the cooler with a Thermalright one.
Oh, OK. Well, that helps a lot; I was afraid I was going to fry it soon because of the constantly high temps. My old 7300 LE didn't get above 70 C, so this one just scared me. But if others run that hot too, I guess I'm fine. Thanks for letting me know before I went and bought another cooler for it.
I did the same in 2004: bought a PC with a Radeon 9200 in it and upgraded to an X800 Pro within six months.
Hey guys, thanks so much for your responses. And yeah, 90 C for a video card is not that bad; my X1900 XTX runs at like 80-90 C all the time. I think I'll go with the 8800 GTS 320 MB, since the 640 MB is too expensive for me... EDIT: Just so you know, 90 C for a processor would be VERY, VERY hot.
Indeed it would. Anything above the 55 C mark is undesirably hot. Stock processors can run up to about 70 C before serious issues occur, but 55 C is about the limit before overclocked CPUs start to become unstable. The thermal emergency shutdown for CPUs is typically between 80 C and 100 C. GPUs are happy to go up to 130 C.
Yeah, my E6600, overclocked (not too much, just 3.15 GHz; I didn't want to put more voltage into it), runs at 50 C loaded and 32 C idle. Of course, I don't trust Core Temp; otherwise my house would have burned down, because the damn thing always shows something like 85 C.
Everest is usually a reliable measurement of temperatures, even if it sometimes gets them the wrong way round!
Then I guess I don't have to worry about heat. Before I found out my graphics card is safe at 90 C, I had gotten it down to 65-70 C under full load, and my CPU runs at around 33 C idle and 40 C under full load. Sorry, kind of off topic, but he's already decided which graphics card to get anyway. And now I've decided what my next card will be too, unless the 8900 series turns out decently cheap, though I doubt it will.
I'm glad so many people contributed to this thread, with no fanboy war at all. I can't believe so many people defend brands as if they made money from them... BTW, I have a Scythe Ninja Plus on my Core 2 Duo E6600 running at 3.15 GHz, and I still get temps of 50-55 C under full load and 30-32 C idle (with an ambient temperature of about 28 C). My case is the Aerocool with the 25 cm fan on the side, the 10 cm fan in front, and two exhaust fans on the back... Oh, and with no extra voltage in the OC. EDIT: I forgot to type my question: how do you keep yours so low under load? What's your ambient temp, please?
Those temps are characteristic of a Scythe Ninja running passively. I'm going to guess you don't have a fan attached to the heatsink itself, only one in the case behind it. Attaching a fan directly to the heatsink makes a surprising amount of difference to load temps with a Ninja.