Dude, OC the Q6600. It OCs easily to 3.15 on Auto. Overclocking is fun and it doesn't take too long, maybe a couple hours a day, then run a stability test while you're at work or whatever. I built my own computer and overclocked perfectly fine within the first week of doing either of those things for the first time. He's right, you pay the same price for half as many cores.
What if it's the clock speed of the CPU that matters, not the number of cores? Having a 3.15GHz Q6600 won't make any difference if that's the case...
Maybe you're right, but I don't really like the idea. I was never much of a gambler... I'd like it to last a while because I won't be upgrading for at least a year or two. Hmm... guess that implies the Q6600. =p I'll go with the Q6600 then, but I need to research the OCing aspect of it. I may change my mind if I hear about them only lasting a month.
That's the 45nm quads, which is why I'm not buying one, apart from them being double the price. The 65nm quads seem to overclock fine within reason.
That's true. All I know is that a quad core is more than good enough for gaming and especially good for applications. At 3.15GHz the clock speed is already higher than a stock dual core's... but then again you can overclock the dual core as well. As of yesterday I set all my start-up processes to my second core (Core 1), as Estuansis (sp?) recommended, and my computer boots faster and navigates a bit better, though it's only a minor, barely noticeable change.
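For anyone wondering how pinning processes to a specific core actually works underneath: Windows represents CPU affinity as a bitmask, so "Core 1" (the second core) means bit 1 is set. Here's a minimal sketch of building that mask; the helper function is my own illustration, not part of any tool mentioned in this thread:

```python
def affinity_mask(cores):
    """Build a CPU affinity bitmask from a list of core indices."""
    mask = 0
    for core in cores:
        mask |= 1 << core  # set one bit per allowed core
    return mask

# Core 1 only (the second core): bit 1 set -> mask 0x2
print(hex(affinity_mask([1])))     # 0x2
# Both cores of a dual core: bits 0 and 1 -> mask 0x3
print(hex(affinity_mask([0, 1])))  # 0x3

# On Windows, a hex mask like this can be passed to the built-in
# start command, e.g.:  start /affinity 2 someprogram.exe
```

Task Manager's "Set Affinity" dialog is doing the same bitmask selection behind the checkboxes.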
I just thought about it, and I think the Q6600 will be plenty tough for me. My current computer has an integrated graphics chip and is not very good at... anything. When you get to high resolutions with your CPU at 3.0GHz+, the GPU should become the ceiling. I am trying to google your problem, sam, because it just boggles me. Edit: after extensive hunting... I have no idea. All that I've gathered is that AoC has an option for using "all available cores", but for Crysis, a guy with a Q6600 said that he only sees two cores being used. The GPU is definitely not the bottleneck, and you've been wanting a quad for a while anyway. lol
Yeah, jump on the bandwagon, sam. lol Truth is, some games are starting to use quads, so I figured I could future-proof myself a bit with this upgrade. I'll have the quad and the option for CrossFire, so I can hopefully handle future games pretty well. I still can't wait for Crysis Warhead. I want to see what they do about performance. Maybe they'll add some new features to the engine, like better shadowing, and maybe fix some of the rough spots we see in Crysis. I can only hope.
The 45nm chips do seem a little faster at the same clock speed, though; from what I've read, a stock 3.0GHz E8400 would be more like a 3.3-3.4GHz E6750. That means an E8400 would be an improvement, but not much of one, as I wouldn't want to take it above 3.6GHz. And, like with the gamble of the quad core, what if having four cores would make a difference? Then I'd have spent £100 for a minimal performance boost... proxyRAX: Remember, I am using an HD4870X2, and in XP too, so that may have something to do with it. As for AoC, it's not a top priority since it hardly even uses more than one core, but I don't want it to set a precedent for other games. I'm thinking mainly about the future here. Estuansis: I'm looking forward to Warhead too, especially given how the original runs now.
Well, I used to have an E6750 at 3.8GHz, and I've been using a quad core for the past few computers. Personally I would take a quad core even if it is a little bit more expensive. TheftAuto is right. Do you ONLY game? I do other things with my computer as well, things that really use that extra L2 cache. Hell, it doesn't shorten the life of your CPU SIGNIFICANTLY even at 3.4GHz. IMHO, the Core 2 Quads are very safe to OC as long as you stay within a safe voltage. Anyway, who keeps their computer for more than 6 months anyway? lol JK
From what I've seen and read, the differences are more like 10%, maximum. A 3.4GHz E6750 is way faster than a stock E8400. The advantage of the E8400 is its higher OC potential: you can reach something like 4.2GHz on one with a good tower cooler. I'll post the article if I can find it. A lucky E6750 owner, by contrast, will hit more like 3.8GHz with a tower cooler. I tried it with the Tuniq and could not break 3.79GHz without bluescreens, though the temps were still OK. I could have run it at 3.6GHz, but 3.4GHz let me drop the voltage a bit and lower the temps. The Tuniq could handle it but not the Freezer 7 Pro, which would idle at 40*C, a bit toasty for me. I agree, the E6750 was sweet and still is. So you need something quite recent to see any sort of improvement. The Q9450 is well above it in specs, cores, and cooling potential, and the price is right: not a beast, but not cheap. So I hope for an improvement, even if it's not really noticeable in performance. Last I checked, some guys did 3.4GHz at 1.3v; I'm going to aim for that and see where I end up. I still don't think I need to go over 3.4GHz, though. Maybe if I go Xfire I'll need some extra power.
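The ~10% figure lines up with the clock-equivalence claim earlier in the thread: if a stock 3.0GHz E8400 performs like a 3.3GHz E6750, the implied per-clock advantage is 3.3/3.0, about 10%. A quick back-of-the-envelope check (the numbers are the ones quoted in this thread, nothing more authoritative):

```python
# Per-clock uplift implied by "3.0GHz E8400 ~= 3.3GHz E6750"
e8400_clock = 3.0
equivalent_e6750_clock = 3.3

uplift = equivalent_e6750_clock / e8400_clock - 1
print(f"Implied per-clock advantage: {uplift:.0%}")  # 10%

# By the same ratio, a 3.4GHz E6750 would match an E8400 at roughly:
e6750_oc = 3.4
print(f"3.4GHz E6750 ~= {e6750_oc / (1 + uplift):.2f}GHz E8400")
```

Which is why the E8400 only wins if you actually push its extra OC headroom.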
Yeah, plus it comes REALLY in handy if you encode movies or stuff. Oh and IT IS GOOD for multitasking.
I don't give a monkey's about idle temps, which is why the 80*C idle temp of the 4870 didn't bother me. A CPU will never overheat at idle, so I only really care about load temps, and with a highly OC'ed CPU and a Freezer 7, they're high. Maybe not too high, but high enough.
Max load at 3.6GHz was somewhere around 65*C. So, not exactly in the danger zone, but getting close. Too close for me. I like to keep it under 60 at all times if possible. As far as the 4870s running hot goes, what does ATi say the safe limit for the card is? I would guess 100*C or so. Idling at 80 is not safe for that kind of hardware at any rate. Turn the fan speed up a bit and watch the temps drop: someone turned the fan to 50% (on a single 4870) and his idle dropped to 46*C. It's the second post down: http://www.overclock.net/ati/367444-safe-temp-hd4870.html Could you please try this once on either card? I would like to see what you think about the fan noise and what the temps are. Also, my Q9450 and 4870 are going to be here tomorrow. This is going to be exciting!
Idle temp on my 4870 was about 77-78*C (900rpm); load temp was about 87*C (2400rpm). I didn't have any fan speed control software with my 4870. I could potentially test it in my PC, but it took me so long to install the X2 that I'm not really going to spend all that time just for that purpose. What I will tell you, though, is that the X2 idles at around 65*C (1200rpm) and loads in the low nineties at 2400rpm in CrossFire, eighties when not (it also runs at a lower fan speed when CF isn't working). Upping the fan speed to 50% (there's not currently a tool to go any lower) drops the idle temps to about 42-45*C; load I'm not sure, as I don't yet have a game that will use CrossFire in windowed mode to check. When you get your Q9450 and HD4870 running, would you mind running 3DMark06 twice? I'm curious to see what score you get a) at stock, and b) at whatever you consider a reasonable overclock.
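The pattern in those numbers (900rpm quiet at idle, 2400rpm under load, and forcing 50% fan knocking ~30*C off idle temps) is exactly what a fan curve automates: map temperature to fan duty. A minimal linear-interpolation sketch of the idea; the breakpoints here are made up for illustration, not anything ATi actually ships:

```python
def fan_duty(temp_c, points=((40, 20), (60, 35), (80, 50), (95, 100))):
    """Linearly interpolate fan duty (%) from (temp, duty) breakpoints."""
    if temp_c <= points[0][0]:
        return points[0][1]   # below the curve: minimum (quiet) duty
    if temp_c >= points[-1][0]:
        return points[-1][1]  # above the curve: pin to maximum
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(30))  # 20   -> quiet on the desktop
print(fan_duty(78))  # 48.5 -> near the 50% people are forcing manually
print(fan_duty(90))  # ~83  -> ramps hard as load temps climb
```

In other words, forcing 50% by hand is just overriding whatever (too-conservative) curve the stock fan controller is running.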
That's good to hear. Hopefully I can do the same with my single 4870. I want to find a happy medium with fan noise, though: during gaming I can stand some fan noise, but on the desktop I want it barely noticeable. Sure, I can do that. I wonder if the OC boosts the scores at all.