I've used CPU-Z since v1.2 and I'm currently on v1.44, and I can't remember vcore ever being included in the online validation. Somebody enlighten me as to what patch or version added this; I'm really curious. I used to capture and ink in my vcore for the purpose of sharing, never to try to fake anything. My fonts were simple Photoshop work and weren't meant to look like or pass for CPU-Z.
I have a question about two GPUs. Which one is better: Intel Extreme Graphics 2 with 32MB of shared RAM, or an ATI Radeon 7000 with 64MB of on-board RAM? This is for a ThinkCentre (British spelling, as it says on the system... don't know why they used that spelling in the US, but whatever) 8183-41U. It only has two PCI slots and nothing else (it's one of those really thin desktops), so I can't use an AGP card (there's no slot), nor a PCI-E card (which didn't exist back then). thanks, -im1992
Would I be able to run Quake 3 at higher settings than I could with the onboard GPU? And would I see an FPS improvement? thanks, -im1992
I'm sure the 3450/3470 comes in PCI, as does the 3650; you can pick them up for cheap. BTW, a mate is offering me an HD2900XT for £35. Should I take it to tide me over till mid-July?
OK, I got it up and stable at 3.78GHz. I increased my vcore to 1.3000V, and my memory is OC'd a little as well, up to 840MHz. Here are the OCCT results after an hour of stressing. I don't like the temps: http://i72.photobucket.com/albums/i196/Cincrob/2008-06-16-12h38-VCore.png http://i72.photobucket.com/albums/i196/Cincrob/2008-06-16-12h38-Volt12.png
The HD2900XT isn't a badly performing card; in most circumstances it performs near the level of the HD3870, and that level of performance for that price is pretty decent. You'll just have to bear with the mega-loud cooler and the astonishing power consumption.
Is it true that the G80 core yields more performance than the G92 core at the same clock speed? Or am I being a complete idiot? thanks, -im1992
Not sure how that's relevant to the HD2900XT, but yes, it is. The 8800GTX is similar in performance to the G92 8800GTS - the former is clocked at 575MHz, the latter at 650MHz.
I'd have assumed it would be better at 1920 with AA than the 3870, due to the 512-bit interface. At least that way, maybe I won't need to upgrade to the 4850 just now, which will leave me more towards the 24" monitor.
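For what it's worth, here's the rough bandwidth math behind the 512-bit argument. This is just a sketch; the effective memory clocks (roughly 1650MHz GDDR3 on the 2900XT, 2250MHz GDDR4 on the 3870) are assumptions from memory, not verified specs:

```python
# Rough memory bandwidth: (bus width in bits / 8) * effective memory clock.
# Clocks below are assumptions, not verified specs.
cards = {
    "HD2900XT": (512, 1650),  # 512-bit bus, ~1650MHz effective GDDR3 (assumed)
    "HD3870":   (256, 2250),  # 256-bit bus, ~2250MHz effective GDDR4 (assumed)
}

for name, (bus_bits, eff_mhz) in cards.items():
    # bytes per transfer * transfers per second -> GB/s
    gbps = bus_bits / 8 * eff_mhz * 1e6 / 1e9
    print(f"{name}: {gbps:.1f} GB/s")
# HD2900XT: ~105.6 GB/s vs HD3870: ~72.0 GB/s, which is why the wide bus
# can help once AA starts hammering memory bandwidth.
```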
That's because the G92 cores had fewer ROPs and a narrower memory bus, but the clocks were much higher to compensate. OUCH, £450 for a GTX 280, which is banging heads with the 9800GX2 at £300. I know which one I'd choose. Come on, £350 4870X2 or £250 3850X2.
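A back-of-envelope fillrate comparison shows why the lower-clocked G80 keeps up. This is a sketch using the crude ROPs × core clock approximation; the ROP counts (24 on G80, 16 on the G92 GTS) are assumptions from memory:

```python
# Crude first-order pixel fillrate: ROPs * core clock (MHz).
# ROP counts are assumptions, not verified specs.
g80_gtx = 24 * 575  # 8800GTX (G80):    13800 MPix/s
g92_gts = 16 * 650  # 8800GTS (G92):    10400 MPix/s
print(f"8800GTX (G80): {g80_gtx} MPix/s")
print(f"8800GTS (G92): {g92_gts} MPix/s")
# The extra ROPs let the G80 push more pixels per clock,
# which offsets its lower core clock.
```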
I see, so the performance would be the same? With Rainbow Six Vegas 2 I get 19 FPS average with an E6300 @ 3.088GHz and two GeForce 8800 GTS 320s in SLI. Is this normal? Oh yeah, and I'm on the latest Nvidia drivers with SLI enabled... -im1992
Out of the 2900XT and the 3870, the latter is certainly better, but for 35 quid it's a steal. im: I get about the same frame rate, but at 2560x1600. I get 30+ at 1920x1200 and the game is smooth. This is with one HD3870.
I run mine at 1440x900 and I get 19 FPS average, so this is bad, right? Because you get 30+ at a higher res than me????? -im1992
Absolutely fine. With two of them, you can even max it at 2560x1600. im: Yes, very bad. The resolution I use has more than triple the pixels.
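The pixel arithmetic behind "more than triple", as a quick sketch:

```python
# Pixels per frame at each resolution, and the ratio between them.
hi = 2560 * 1600  # 4,096,000 pixels
lo = 1440 * 900   # 1,296,000 pixels
print(f"{hi / lo:.2f}x")  # ~3.16x, i.e. more than triple the pixels per frame
```

So 19 FPS at 1440x900 on two cards versus roughly the same frame rate at 2560x1600 on one card means the single 3870 is pushing over three times the pixels for the same result.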