Sam, I don't know. If I were doing a new AMD Phenom II X4 955BE build, I think I might spring for the extra $36 and buy the 1055T. With both Intel and AMD now producing 6-core CPUs, I don't think it will be long before we see a decent-sized shift toward more multi-threaded apps. The Intel chips are far too expensive for them to have manufactured just for the exercise. As for upgrading my own CPU, I would go with the 1090T BE, because I can recoup what I paid for my 955BE quite easily, given the low price I paid for it and how new it is. I would only need to come up with about $150 in cash, and I can make very good use of the two extra cores for video encoding. Russ
The AMD X6 CPUs are really only of much use to existing AMD owners though, and many of those may already have decent CPUs. My point is that someone who already owns a 955 may as well just keep it rather than buy an X6, and someone who doesn't may as well get a 955. The 1055T may only be $36 more, but it has to be overclocked a not insubstantial amount, from 2.8GHz to around 3.4GHz, to perform equally per core. As you rightly point out, most people don't overclock. For people who regularly use apps that can take advantage of six threads, the X6s are of course a much more viable option. For the majority though, they aren't a worthwhile upgrade.
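To put a rough number on that per-core gap, here's a quick back-of-the-envelope sketch (Python), assuming performance scales roughly linearly with clock within the same Phenom II architecture; the 3.4GHz target is just the figure Sam quotes, not an official spec:

# Hypothetical quick calc: overclock a 1055T needs to match the
# per-core clock Sam is comparing against.
stock_1055t = 2.8   # GHz, Phenom II X6 1055T stock clock
target      = 3.4   # GHz, per-core clock quoted above
overclock_pct = (target - stock_1055t) / stock_1055t * 100
print(f"Required overclock: {overclock_pct:.0f}%")   # roughly 21%

So you're looking at roughly a 21% overclock just to break even per core, before the two extra cores buy you anything.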
Sam, Damn! Shot down with my own words! You are right though, it's only worth it for the Enthusiast! We have met the Minority, and they is Us! LOL! Russ
Sure. I'd love to have an X6 to play with in video encoding apps if I actually used them; overclocking the heck out of it would produce some seriously nice numbers. Fact is though, what one core of an AMD can do at 4GHz, a core of even an i5 can do at 3GHz, an i7 could probably manage a little bit more, and I'm willing to bet you can still overclock i5s and i7s a fair bit further on conventional hardware. For an existing AMD owner who's at enthusiast level and encodes video, the X6 is the perfect CPU, and even for someone building a new system solely for video editing, the X6 is probably the better choice. For a non-enthusiast it's not such a hot deal though, and if gaming is the main concern, forget it, for now at least. Personally I see Intel's approach as relatively sound. The sort of people who can make full use of six-core machines are typically businesses buying top-grade machines, and when you can afford the extreme CPUs, the hex-core i7s are in a class of their own, not to mention they still overclock like hellfire. Once six cores become beneficial in other areas, I reckon mainstream 6-core CPUs may appear, but in truth, I think by the time that happens 8-core CPUs will be around, and it may end up being a bit of a washout.
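The 4GHz-vs-3GHz comparison above is really a statement about per-clock throughput (IPC). A minimal sketch of that reasoning, where the i5's IPC advantage is simply the ratio implied by the claim itself rather than a measured figure:

# Per-core performance modelled as clock * IPC (instructions per clock).
# The ~1.33x IPC figure is only what the "4GHz Phenom II ~= 3GHz i5"
# claim implies; it is illustrative, not benchmarked.
phenom_clock, phenom_ipc = 4.0, 1.0        # GHz, IPC normalised to 1
i5_clock,     i5_ipc     = 3.0, 4.0 / 3.0  # implied ~33% higher IPC
phenom_perf = phenom_clock * phenom_ipc
i5_perf     = i5_clock * i5_ipc
print(phenom_perf, i5_perf)   # both ~4.0, i.e. roughly equal per core

In other words, a sizeable clock deficit can be cancelled out entirely by a higher-IPC architecture, which is why raw GHz comparisons between the two brands don't mean much.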
I was looking at them from the standpoint of having to buy the CPU, motherboard, and RAM. Obviously, if all you need is a CPU, that's the most cost-effective route. BTW, if I'm not mistaken, hasn't AMD already released an 8-core CPU?
Red_Maw, Those were the Opteron 6100 series, server chips for Socket G34 server motherboards. Not only 8 cores, but 12 cores as well, from the 8-core 1.8GHz 65W 6124HE up to the 12-core 2.36GHz 115W 6176HE. All have 4MB of L2 and 12MB of L3 cache. I would imagine that some of these will wind up on either Socket AM3 or a new socket for the general consumer market. I'm betting on AM3 for the 65W 1.8GHz 6124 and the 80W 2.0GHz 6128, replacing the current two 6-cores, at about $265 and $295 respectively! Just think, we have Bobcat and Bulldozer to look forward to in 2011. Things are getting exciting at AMD! Russ
Well, I figure you can look at the Clawhammer-era Opterons. We had the Athlon 64 architecture in those well before the technology hit consumer grade. It's only a matter of time before 8-core CPUs trickle down and we have even more hardware to play with. Sure, the 6-core Phenom II is a beast, but, like Sam says, not really useful for a gamer. That's why I still haven't given up my original-run 940. Outside of a few CPU-hungry games, it performs spectacularly. I think the next CPU upgrade I jump on will be Bulldozer. They sound a lot more confident about it than they ever did about Phenom I. I'd really like to see AMD take the lead again after so long being restricted to the budget and value market. Sure, my Phenom II is an excellent-performing CPU, but it still doesn't touch an i5 or i7. I suppose if I'd bought that 30" screen and the graphics hardware to use it, I'd have had an i5 long ago. But as it stands, keeping my graphics setup tame has allowed me to squeak by with my Phenom II for now.
Hmm, okay, so those Opteron 6100 CPUs were what I was thinking of. Not to burst any bubbles, but if I remember correctly, the last thing I heard about them was that they barely compete with the Intel 6-cores. Bulldozer, on the other hand, still shows some promise. Even if it doesn't beat or provide stiff competition to the i5/i7 CPUs, I'm interested in seeing how AMD's "inverse hyper-threading" works out. With programming for parallel computing still a long way off, something like that might just be the performance boost we've been waiting for.
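As a side note on why "programming for parallel computing" matters here: an app only benefits from six or eight cores if its work is explicitly split into independent chunks. A toy Python sketch of that idea (the worker count and the crunch() workload are purely illustrative):

# Toy example: spreading CPU-bound work across cores with multiprocessing.
# A single-threaded app sits on one core no matter how many the CPU has.
from multiprocessing import Pool

def crunch(chunk):
    # stand-in for real per-chunk work, e.g. encoding one segment of video
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(6_000_000))
    chunks = [data[i::6] for i in range(6)]   # six chunks for six cores
    with Pool(processes=6) as pool:
        results = pool.map(crunch, chunks)    # chunks run in parallel workers
    print(sum(results))

As I understand the rumour, "inverse hyper-threading" was meant to attack the same problem from the other end: making several cores look like one faster core to code that can't be split up like this.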
Hey guys, I'm about to build an AMD box running off an ASUS M4N98TD EVO AM3 mobo with an AMD Phenom II X6 1090T Black Edition Thuban 3.2GHz processor, and I have a compatibility question. I checked the specs of the mobo, and it says the only rear-panel port listed for anything optical is an S/PDIF Out port. I've always wanted a Blu-ray player in my machine and was wondering if I could use a Blu-ray optical drive (the LITE-ON Black 8X BD-ROM 16X DVD-ROM 48X CD-ROM SATA Internal BD-COMBO Model ihes108-29 - OEM) with a SATA-only interface on the ASUS M4N98TD EVO AM3 mobo. Thanks guys, J
walri, The S/PDIF is for sound only! The optical output takes a fiber-optic cable, which plugs into a matching port on many sound systems.
S/PDIF is an external audio interface, for when you have a home receiver or other external system to connect instead of using the computer's onboard audio outputs. If your BD reader is connected over SATA alone, you can still hear the onboard audio just fine. EDIT: And TheOne wins by 1 (minute)
So basically they will be compatible with one another? What I'm really asking is which port on this particular motherboard I would attach the Blu-ray drive to.
And PLEASE don't do it while the mobo is powered on. You can wreak havoc on the southbridge and the channel doing that. My brother made that mistake on one of his Dell boards; it fried one of the IDE channels.
Oman7, I know the suggestion was made that I unplug and replug the monitor while the power is on. Ain't happening, at least not by me! I got the RMA after I called MSI, so now I can send the 9500GT back for repair or replacement. I think I will get one of these to replace the 9500 with. Looks like a decent enough video card! http://www.newegg.com/Product/Product.aspx?Item=N82E16814127478 Or perhaps the 512MB one: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127495 I'm also considering one of these, a 9800GT: http://www.newegg.com/Product/Product.aspx?Item=N82E16814133304 Right now I'm leaning more towards the GTS 250. My main reason is that the technology is newer. The 9800GT is getting a bit long in the tooth; it's been around about 3 years! Russ
Estuansis, I like the 1GB MSI card. I'm just wondering whether the extra 512MB of memory is worth the price difference. A lot of people online say yes! Russ
The GTS 250 is quicker, but they're based on the same tech, unfortunately. Both are G92b GPUs; the GTS 250 just has 128 "CUDA cores" (aka stream processors) while the 9800GT has 112. Basically what NVIDIA have done is: 8800GTS 512MB > 9800GTX > 9800GTX+ > GTS 250. All the same core, just branded differently over three different generations, because most of the public won't have a clue as to what has happened. Yes, there has been a die shrink and an overclock, but essentially it's the exact same card.
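For a rough sense of how much the extra shaders and clock bump matter on paper, here's a back-of-the-envelope sketch; the shader clocks are approximate reference figures, so treat the result as illustrative only:

# Theoretical shader throughput comparison, GTS 250 vs 9800GT.
# Shader clocks below are approximate reference specs, not measured values.
gts250_shaders,   gts250_shader_clock   = 128, 1836   # MHz
gf9800gt_shaders, gf9800gt_shader_clock = 112, 1500   # MHz

ratio_shaders_only = gts250_shaders / gf9800gt_shaders
ratio_with_clocks  = (gts250_shaders * gts250_shader_clock) / \
                     (gf9800gt_shaders * gf9800gt_shader_clock)
print(f"{ratio_shaders_only:.2f}x from shader count alone")   # ~1.14x
print(f"{ratio_with_clocks:.2f}x including the clock bump")   # ~1.40x

Real-world gains are smaller than the paper numbers, since memory bandwidth and the rest of the pipeline haven't changed much, which is the point above: it's the same G92 silicon dressed up.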