For the price, it's a good deal. It'll play most games at maxed settings at a decent resolution. The 5400+ BE might hold you back in newer games. Though, AFAIK, the AMD X2s are still "good enough" for gaming with up to an 8800GT and will be decent for a while still.
Yeah, that's one thing I'd be wary of. But the nForce 7s have proven to be a bit more stable and reliable than the nForce 6 series. Remember that they've made improvements; the nForce 4s and 5s were actually good. I should know, I've owned both, so the nForce 7 can't be the worst chipset you can get. Something else to be careful of is that Ultra PSU. Most of the Ultras I've seen have failing 12V rails, among other things.
Thanks guys. Actually, when the time comes to do a build (I was hoping to build one this past summer, but things didn't work out), it won't be a hot rod, but somewhere in between or maybe better, so I'm kind of looking at options. Already I've learned to stay away from this one, because two of its components are already in question: the Ultra PSU and the nForce 7. When I finally do get the chance, I want to do a build that doesn't get a bad review from anybody, which I know might be difficult. Everybody has an opinion and it can set off a frenzy lol, but I have plenty of time.
The 790i Ultra is the only good surviving Nvidia chipset for Intel. Hmm, who has seen the benches? So far, SLI is scaling much better than CF on the X58 boards.
I'm not even remotely interested in X58 at this point. My X38 is still a fairly new high-end chipset capable of monster overclocks and excellent x16 Crossfire support with the latest PCI Express 2.0 cards. X48 was mostly just an update for overclocking and power management, and X58 seems more like a high-end server-grade chipset.

DDR3 is way too expensive and DDR2 isn't really that old yet, especially seeing how long DDR1 went as the gold standard. DDR3 doesn't even give a noticeable performance improvement.

And SLI scaling is better because almost no games actually SUPPORT Quad Crossfire. Most games these days are TWIMTBP titles and thus offer better support for Nvidia's multi-card solutions. Consider also that Core i7/X58 is really only useful for GTX280 SLI or similar, which, IMO, is absolute overkill even at high res with maxed AA and AF. My Q6600 is enough for 2 HD4870s in Crossfire, and that's still overkill for everything but Crysis. But, apparently, a 3.4GHz quad bottlenecks 2 GTX280s badly. Overpriced, overkill, over and done.
The 780i was diabolical. The 790i was better, but it still uses as much power as two P35 boards at once and had instability and incompatibility issues on occasion. nForce chipsets are just bad; the nForce 6 was admittedly worse, though. Re: SLI/CF on X58, at this stage it's too early to call, as i7 is such a rip-off that not even a new builder should really consider it unless money really is no object. As for dual versus quad graphics at the highest resolution there is with anti-aliasing applied, most games will run fine like that. Supreme Commander: Forged Alliance runs sufficiently for the most part, so that just leaves Crysis and Crysis Warhead. Neither of those has proper Triple-SLI scaling either, and on top of that, they require 2GB of memory per GPU to run maxed out with AA. It's worth noting that the benchmarks run at 2560x1600, all Very High, that were noted as playable on triple GTX280s did not include anti-aliasing. Without AA, 2560x1600 all High at least is very playable on a 4870X2 for the original Crysis. Maybe not Warhead, but with a CPU ceiling of 21fps, who cares?
My girlfriend's Dell is running a 4400+ and I'm thinking of OC'ing it if the mobo allows. How far did you get yours, and what cooling were you running on it?
You can't overclock Dells, not in the normal sense. You can try using CPUFSB, but I doubt you'll get very far.
To be honest, I'm still a major AMD/ATi fanboy, but I buy whatever is most worth my money. It just so happens that the Q6600 gives me stellar performance within range of a $600 Core i7 and pushes a considerably high-end ATi system quite adequately. Core i7 is not even that much of an upgrade over Core 2 Duo. The hardware is ludicrously expensive and nowhere near justifies the extra performance, especially when we don't really need it at all.

And as far as ATi vs Nvidia goes, I like Nvidia. My 8800GTS G92 is probably one of the best cards I've ever used. But the fact remains that they hold most of the gaming world by the throat, and it was a nice change of pace to see ATi deliver the better product this time around. Hmm, $250 (HD4870) vs $450 (GTX280): you tell me which one you're getting, considering both are massive overkill cards. I sure don't have an extra $200 a card for power I don't need. Even my own dual HD4870 setup was questionable, because just one is fast enough to max everything but Crysis adequately. The GTX280, X58, Core i7, DDR3 and the like may be impressive, but it's really just a huge ass-load of wasted money for minimal gain and no real noticeable improvement.

To answer your question: I got mine stable at 2.6GHz, but could have possibly gone higher had I been using a different motherboard at the time. I was using an Arctic Freezer 64 Pro. But all this is irrelevant; you can't overclock Dells.

EDIT: Sam, I'm trying Vista Ultimate x64 on an 80GB drive tomorrow. I'll use it for a while and see if I like it. If I do decide to switch over, will an upgrade install go smoothly, or am I better off reformatting?
Well, I had the money, so I bought a 4870X2. But had I more money, I certainly would have gone GTX280 in SLI and X58. I just made a statement that SLI was scaling better than CF (didn't mention quad) and that i7 helps a LOT with dual GPUs vs Core 2. I didn't tell anyone to buy it. And no, it isn't overkill; ask Sam, we killed his system trying to run Crysis at 2560, DX10, Very High, 8xAA. It's needed. If a 4870 were overkill, it would easily play Crysis or Warhead or FC2 or even FSX, but really it cannot.
Yeah, but games have come out after Crysis with similar graphical goals and triple the performance. Also consider that Sam is running 2560 x 1600, and he is the only home user I know who does. So yes, Core i7 can be useful with multi-card graphics at high res, but 95% of us will never push a system that hard. The performance hardware I'm using is more than adequate; if there's a CPU bottleneck in my system, I haven't found it yet. And my single 4870 could do Far Cry 2 quite adequately with everything on Very High at native res, and two together do all Ultra with 4xAA smoothly. At 1920 x 1200, that's impressive.
Right, but I never recommended that anyone buy it, or said that it's good for the home user; all I said was that SLI > CF on i7... What game is remotely as good-looking as Crysis yet works decently? And by smooth and decent I mean 60FPS, not 25 or 30. What are your frames per second?
No, you didn't say that. But I'm dissing it for the simple fact that it's way overpriced and only beneficial to the few super-high-res gamers like Sam. I'm not arguing against you; I'm arguing against Core i7 and the ridiculous amount of money you have to spend to even USE it, let alone the relatively minor performance gains compared to the insane prices.

I get 45-60FPS maxed at 1920 x 1200 w/ 4xAA in Far Cry 2. I daresay that's quite SMOOTH, and it's easily a visual rival to Crysis. Dead Space is another one: it has comparable-quality textures and models and performs like a dream; I get 100+ FPS in most cases. And what about Left 4 Dead and Episode 2? Those are both superior in overall quality and art direction (IMO) but give blistering performance as well. And Call of Duty 4 looks seriously nice all maxed and also performs like a champ.

Don't get me wrong, Crysis is an excellent game with technically superior graphics. But many games manage to look even nicer IQ-wise for the sheer fact that you can use AA and AF without destroying your performance. Not only that, but the AA in Crysis gives that performance hit without really DOING anything. There is a noticeable effect, but compared to something like CoD4, the AA in Crysis is practically useless; I don't even bother to use it.

I didn't upgrade for Crysis. I upgraded to play future games. We're at a point where the best-looking games get awesome performance on even budget rigs, but Crysis breaks that trend by running like crap, being extremely system-intensive, and glitchy to boot. You can't base your hardware purchases on one game that is already unanimously considered a poorly coded system hog. Crysis may be cutting edge, but if you only worry about your Crysis performance, you may as well buy an Xbox 360 and save yourself the headache.
OK well, tbh I hate Crysis as it is, but really, nowadays (well, as soon as my Thermochill PA 120.3 gets fixed) I will most likely buy components for 3DMark. Dead Space scares me too much to play, and IMO lush jungle > barren desert any day. Also, I may hate Crysis, but my god, FC2 is boring lol. On DX10 you will see why Crysis looks better than any game; just go underwater or look through the trees toward the sun. CoD4 looks nowhere near as good as Crysis, and trust me, I'd know; I have put over 900 hours into that game.
Seeing SLI scale better than Crossfire goes against what I've seen on 775 for several months. Crysis never came out on the Xbox; it's too demanding.
LOL, it's not THAT scary ;P

That's definitely a matter of opinion. I think the African setting is a refreshing change of scenery; the deserts and plains have a beauty of their own, and I'm kind of sick of nothing but jungles. And I think Far Cry 2 is fun as hell. It all depends on what kind of gamer you are. I really enjoy the immersion in the world: real weapon fatigue, getting tired after running, fixing your vehicle, using a map to find your way to the next location. I thought that was the POINT of an open-ended shooter. This is the kind of game I was hoping Crysis would be, though I think Crysis IS the better game overall, with much more thrilling action and a better story.

As for DX10: LOL, it's called hacked Very High. I get all the light beams and god rays and translucent leaves and color grading and POM and everything else. DX10 is a fricking farce; EVERYTHING in the Very High settings can be hacked in except for the more subtle blur shaders. DX10 was tacked on to sell more copies of Vista. Crysis was originally developed as a DX9-native game.
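If anyone wants to try it, the hack is just console variables in a config file, no DX10 required. Rough sketch from memory, and I'm going off what I recall of the CryEngine 2 cvar names, so double-check them against a proper Very High-in-XP guide before blaming me: drop an autoexec.cfg in the Crysis install folder with lines along these lines

r_sunshafts = 1
r_colorgrading = 1
r_UsePOM = 1
e_water_ocean_fft = 1

which turn on the god rays, the Very High color tone, parallax occlusion mapping, and the fancier ocean. Some guides edit the files under Game\Config\CVarGroups instead, bumping the High preset values up to the Very High ones; same idea either way. The subtle DX10-only blur shaders are the one thing you can't get back.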