
Newbie planning out computer parts

Discussion in 'Building a new PC' started by dytopia, Jul 6, 2008.

  1. PeaInAPod

    PeaInAPod Active member

    Joined:
    Nov 28, 2005
    Messages:
    3,050
    Likes Received:
    0
    Trophy Points:
    66
    Taking into account some of your choices, dytopia, I have put together a complete machine on newegg.com. Let me know what you think.

    For the motherboard and videocard I went with eVGA for both. The reason is that their products are of exceptionally high quality, and both the motherboard and videocard are covered by a full lifetime warranty if you register them with eVGA. Considering you are going to do some heavy gaming, the fact that this board supports SLI is a plus, and this motherboard is reckoned to be one of the best mobos for overclocking! And the videocard is brand spanking new and hands down the fastest product on the market. The motherboard is $260 and the videocard is $540.
    eVGA 780i SLI mobo (two-time winner of Newegg's customer choice award)
    eVGA GTX 280 videocard

    At first I was going to go with one of the new Intel quad-core CPUs (the Q9300, 2.5GHz @ $250), but I opted to go with a higher-clocked dual core, as you net more performance now and don't rely on the ability to overclock the quad core for higher speeds. So for $190 you're getting an easily overclockable Intel E8400 that's already at 3.0GHz.
    CPU Specs

    PC Power and Cooling is easily the best PSU company. For $160 you're getting a 750 watt PSU capable of driving four graphics cards that still has an 80 percent efficiency rating. It is a very capable PSU; I have had a similar model for 6 years and have had zero problems. If you're not sure about this company's reputation/quality, consider this: one of the leading PC magazines (Maximum PC) annually builds a "Dream Machine", and the PSU company of choice for all their top-end equipment? PC Power and Cooling.
    PSU Specs.

    For memory I went with Corsair. The sticks linked below are DDR2 800 speed, $75, and are on the eVGA certified memory list for this motherboard.
    Memory Specs.

    For the case I opted for an Antec Nine Hundred @ $119. It is a very capable case and one of, if not the, best cases for air-cooled systems.
    Case Specs.

    For the rest I spec'ed out a 500GB Seagate SATA harddrive for $80 with a 5 year warranty, two LG IDE DVD burners for $24 each (I went with two because it is really handy to have two drives), and a $6 tube of Arctic Silver 5 thermal paste.

    This PC is made up of the latest and greatest parts on the market, and includes everything you need to get started for $1,477.91, and that's before any mail-in rebates. In an earlier post you asked about the IDE ports on a motherboard; there is one IDE port on this board, which supports up to two devices, which would be the dual DVD burners. IDE harddrives are old and pretty much defunct; SATA is the new connection for harddrives. It is MUCH faster, and prices for SATA harddrives are no different than IDE drives, if anything they're cheaper. This board has 6 SATA ports, so you could connect up to 6 HDs.

    I didn't include the Thermalright 120-Extreme heatsink because I couldn't find it but expect to spend another $50-70 for it.
     
    Last edited: Jul 7, 2008
  2. dytopia

    dytopia Member

    Joined:
    Apr 29, 2007
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    11
    Looks good, but I'd prefer 4GB of RAM over 2GB.

    Seems quite pricey with the S&H plus the taxes. Would you mind telling me the major differences between this machine and the other one that was put together?

    Because to be honest I don't know much about all this...
     
  3. PeaInAPod

    PeaInAPod Active member

    Joined:
    Nov 28, 2005
    Messages:
    3,050
    Likes Received:
    0
    Trophy Points:
    66
    I'll explain shortly, I've got to leave for 30-45 minutes. brb.
     
  4. dytopia

    dytopia Member

    Joined:
    Apr 29, 2007
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    11
    Gah, I live in Canada =/

    What about this quote taken directly from their website, unless you have a Newegg Canada site?

    EDIT: source - http://www.newegg.com/HelpInfo/FAQDetail.aspx?Module=4
     
  5. PeaInAPod

    PeaInAPod Active member

    Joined:
    Nov 28, 2005
    Messages:
    3,050
    Likes Received:
    0
    Trophy Points:
    66
    lol, that took a lot longer than expected. Alright, as for 4GB of memory: that is really overkill unless you plan on running a 64-bit OS, and any modern version of Windows you buy is most likely 32-bit. Due to the way a 32-bit OS maps (read: distributes) memory, the system can never fully take advantage of 4GB; in most cases 3GB is what the system displays as usable memory, and in some cases 3.5GB is shown. Now why not just go to a 64-bit OS, right? I mean you can then use upwards of 8GB! The reason is software support. Updates for 64-bit software/drivers/etc. are few and far between. And in all honesty 2GB of memory is plenty for today's programs and games. Now if you plan to install a version of Windows Vista instead of XP Home/Pro, then it may be worth going with Vista 64-bit if you're adamant on using 4GB; otherwise XP works just fine with 2GB.
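
    A rough sketch of where the "3-3.5GB usable" figure comes from: a 32-bit OS has a 2^32-byte (4GB) address space, and part of it gets reserved for device address ranges (videocard memory, PCI, BIOS), leaving less than 4GB for RAM. The reserved sizes below are made-up illustrative values, not specs for any particular board.

    [CODE]
    # Illustrative only: why a 32-bit OS shows ~3-3.5GB usable out of 4GB of RAM.
    # The reserved sizes are assumed example values; real ones vary by system.
    address_space_gb = 2 ** 32 / 1024 ** 3        # 4GB total addressable
    reserved_gb = {
        "videocard memory (e.g. a 512MB card)": 0.5,
        "PCI / chipset / BIOS ranges (example)": 0.25,
    }
    usable_gb = address_space_gb - sum(reserved_gb.values())
    print(f"Addressable: {address_space_gb:.2f}GB, usable RAM: ~{usable_gb:.2f}GB")
    [/CODE]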

    For the differences between the machines. They come down to this....

    For the PSU's...
    *Not counting mail-in rebates, the PC Power and Cooling offers better value. Using a simple price/watts formula, the Corsair PSU would cost you $0.23 per watt, whereas the PC Power and Cooling costs $0.21 per watt (the arithmetic is sketched in the code just after this list).

    *It does not use modular plugs, which increase resistance and potential for electrical mishaps.

    *It features one solid +12 volt rail. Many PSUs, such as this Corsair unit, feature "multiple +12 volt rails". That is a bad design for a high-end PSU. The way a single high-output +12 volt rail works is that it has, let's say, 60 amps to distribute, so if only one piece of hardware is utilizing the rail it has access to all 60 amps. With multiple rails, the power from the +12 volt output is split between however many rails there are, so 60 amps across three rails equates to roughly 20 amps per +12 volt line. If you only have one high-power-draw piece of hardware, it is limited to the 20 amps available to its rail, and the power delivered to the other +12 volt rails sits there inaccessible and unused. But with a single +12 volt rail, that high-power-draw piece of hardware has the ability to use the full 60 amps should it need it.
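
    Quick arithmetic behind the price-per-watt and rail-splitting points above, as a sketch using the prices, wattages, and the hypothetical 60 amp example quoted in this thread (not authoritative specs):

    [CODE]
    # Price per watt, from the figures quoted above.
    def price_per_watt(price_usd, watts):
        return price_usd / watts

    print(f"PC Power and Cooling: ${price_per_watt(160, 750):.2f} per watt")  # ~$0.21
    # The Corsair works out to roughly $0.23 per watt by the same formula.

    # Single vs. multiple +12V rails, using the hypothetical 60 amp example above.
    total_amps, rails = 60, 3
    print(f"Single rail: one device can draw up to {total_amps}A if it needs it")
    print(f"{rails} rails: each rail is capped at roughly {total_amps // rails}A")
    [/CODE]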

    For the motherboards...
    *First and foremost is the warranty. Gigabyte, to the best of my knowledge, offers a 3 year warranty; eVGA offers a full lifetime warranty on the board. nForce chipsets are touted as "performance/gaming oriented" and "power user" chipsets, whereas the chipsets from Intel commonly fall behind in various benchmarks.

    *The Gigabyte only sports one PCI-Express x16 connection, which kills any possibility of dual-videocard setups/upgrades. The eVGA board sports three PCI-Express x16 slots for the ability to use 1, 2, or 3 graphics cards in unison via SLI technology.

    *The Gigabyte has 8 SATA ports, the eVGA has 6. I consider this a moot point, as filling up 6-8 SATA ports means using 6-8 terabytes worth of harddrive space, and how soon is any home user going to use that much?

    *The eVGA board has dual ethernet plugs, which are handy in some cases. I for one use port 1 as my internet port and have a router acting as a wifi expander hooked up to the 2nd port, so any traffic coming in over my wifi access point gets routed through the 2nd port, to the first, and then out to the main router. The feature is mostly unnecessary but quite handy in some cases.


    For the processors....
    *Well, Intel is definitely the company to look to for high-performance parts, and the CPU choice is a toss-up. The quad-core offers brute multi-core performance, best for video encoding/editing and all other types of multimedia work. Unless you are either very confident in your overclocking ability or willing to spend around $1,000 on a CPU, quad-core speeds usually come in around 2.4-2.6GHz for $200.
    Dual-core CPUs on the other hand will spank a quad-core CPU in just about any modern game; they do, however, fall behind quad-core CPUs in multimedia work. Another thing is that for about the same price ($10 less, actually) as a 2.4GHz quad-core, a 3.0GHz dual-core can be had. Both processors work in both motherboards, so it's your choice. I myself have the Q6600 in my machine. I do a lot of gaming but also do a LOT of video/audio work with apps that scale well with 4 cores. It does well in gaming, but for the same amount of money I spent, a faster dual core could have been bought that would be better for games.


    For the videocards....
    *Here is a paragraph from the conclusion of an article from PC news site Anandtech where a comparison was done between a single AMD/ATI Radeon HD 4850 and a single nVidia GTX 280...

    So this conclusion paragraph essentially states that...
    *2 Radeon HD 4850's come close to the performance of a single nVidia GTX 280 and it takes 2 HD 4870's to beat a single GTX 280
    *nVidia parts offer more performance now
    *1 GTX 280 is more powerful than 2 HD 4850's so the need to upgrade is lessened/shortened.
    *If 1 HD 4850 is "easily enough to run anything out today", then 1 GTX 280 should be able to play all these games faster/at a higher level of quality than the HD 4850 parts. And with the ability to daisy-chain multiple cards, the performance gap just keeps increasing.
     
    Last edited: Jul 8, 2008
  6. PeaInAPod

    PeaInAPod Active member

    Joined:
    Nov 28, 2005
    Messages:
    3,050
    Likes Received:
    0
    Trophy Points:
    66
    The best canadian site I found was http://www.tigerdirect.ca

    and when using Newegg's tech chat support applet I asked a rep about shipping to Canada, and they said they can't do it due to an agreement between them and another Canadian retailer (they failed to mention the name). They did say that if you call them after placing an order, they can ship it to the closest U.S. UPS facility and you can pick it all up there. Not really a great option unless you live on the Canada/US border.
     
    Last edited: Jul 7, 2008
  7. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    Actually, most games have crossfire as a plus now that drivers are updated, and the GTX 280 is being beaten by two 4850s. Wait till the reviews with the 8.7 drivers come out.

    Next, a lot of the stuff you have said there seems to be picked off the manufacturers' pages.

    There is no proof that modular PSUs cause any electrical mishaps, nor extra resistance. Why would many extremely stable PSUs, some of the best in the world, use modular cables?


    Secondly, MOST PSUs that offer multiple rails actually have about two 12V rails. Although their sticker may say one thing, the wiring inside tells a different story. The maximum an 8800 ULTRA draws from the PSU is about 10A, therefore rails with "only" 20A are PERFECTLY fine, and Corsair has a system where, if more amps are needed on one "rail", it can supply more and reduce the power for the other "rails". This in effect works as if there is only one rail to begin with.

    Also with 64bit vista, have you tried it? I have been running it since December, and use a lot of programs, and upgrade a decent amount of hardware, and I am yet to find a driver that didn’t work etc. MS has made it clear to manufacturers that if they want certified 32bit drivers, they need 64 bit drivers as well.

    Crossfire scales near to 100% with the new 4 series, whereas it only got near 80% with the 9600GT, and most others don't scale more than 50%. Tri-SLI is even worse.

    nVidia boards cost a LOT more, and offer a LOT less, than Intel boards. Yes, they may have lifetime warranties, but it's for a good reason... the nVidia chipset has too many problems. Intel chipsets, esp. the X48 and P45, offer unparalleled overclocking. And dual Ethernet... WTF, you honestly put that over stability and overclockability? MANY MANY of the P45, X48 and X38 chipset boards have dual LAN, and some of the P45s even have quad LAN.
     
  8. PeaInAPod

    PeaInAPod Active member

    Joined:
    Nov 28, 2005
    Messages:
    3,050
    Likes Received:
    0
    Trophy Points:
    66
    I don't mean to be an ass, but there are so many misstatements and whatnot in your post I can't help but rip it apart :)

    This makes no sense. You're bragging that it takes two HD 4850's to beat a single GTX 280? 1) No driver fix can solve that, and 2) I wouldn't brag about that fact. And "most games have crossfire as a plus..."??? Crossfire IS NOT a technology used in game software. Crossfire is a technology that enables two videocards to interface with each other for the benefit of increased 3D performance. Crossfire isn't something a game can support or not support; Crossfire is hardware. Yes, the HD 4850 is an excellent card for the money, but for all-out video gaming power the GTX 280 can't be beat.

    Why would PSUs use modular cables? Because people buy them, they are handy, and it's another feature for the manufacturer to tout. Ask any real electrician: breaking and reconnecting a power line WILL increase resistance and increase the chance of an "electrical mishap". With modular cables you are at risk for shorts, cables coming loose, corrosion, etc. With non-modular PSUs you have a smaller risk of a short. Yes, cables can still come loose from the motherboard, but one risk spot is better than two (mobo and PSU if using modular cables), so by using non-modular cables you essentially halve your risk. And as for the corrosion, it could lead to shorts and the PSU's inability to reliably deliver power along the corroded line, among other things.
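
    For what it's worth, the quantity being argued about here is just the voltage drop across an extra connector junction, which depends on how much contact resistance the junction adds and how much current the cable carries. A minimal sketch with assumed, illustrative values (not measured figures for any particular PSU):

    [CODE]
    # V = I * R across an extra connector junction.
    # Both numbers below are assumptions for illustration only.
    contact_resistance_ohms = 0.005   # 5 milliohms per junction (assumed)
    current_amps = 20                 # example load on one +12V cable

    drop_volts = current_amps * contact_resistance_ohms        # V = I * R
    heat_watts = current_amps ** 2 * contact_resistance_ohms   # P = I^2 * R

    print(f"Voltage drop: {drop_volts * 1000:.0f} mV on a 12V line")
    print(f"Heat dissipated in the junction: {heat_watts:.1f} W")
    [/CODE]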


    Where are you getting your power ratings from?!? Here are two different (card 1, card 2) 8800 Ultras; both state they need at least 30-34 amps on the +12V rail, so I don't know where you get "10 amps" from. But if 10 amps was all that was needed for an 8800 Ultra, we could all be running multi-GPU gaming setups on 400 watt PSUs!! lmao, 10 amps, that's a riot! :) And as for Corsair's power management feature you speak of, it may work, but a dedicated +12 volt rail will always be better, period.

    You seem to think I have something against 64-bit Vista. I don't. I even recommend it to dytopia if s/he wants 4+GB of RAM and needs a new OS. Also, I didn't say that the drivers didn't work. I said, in other words, that since the market majority is using a 32-bit OS, the majority of driver development is going to be focused on the 32-bit drivers. The 64-bit drivers, while working, will always lag behind the 32-bit ones until 64-bit goes mainstream. And having certified 64-bit drivers doesn't necessarily mean they are equal to their 32-bit brethren.

    Total BS. Neither Crossfire nor SLI scales 100%, and neither likely ever will. That said, scaling is going to be around 65%-75% across the board regardless of videocard, SLI, or Crossfire. The reason no dual-videocard solution will ever hit 100% scaling is pretty simple: with two videocards you have to factor in the time for them to communicate and the additional delay needed for the master card to render the input from the 2nd card along with its own data. Due to the added workload it can't scale to 100%. The same logic holds true for tri-SLI: the more cards added (1 card + 2 additional cards), the more work the master/main card has to do, therefore reducing its ability to scale higher.

    As far as costing more and getting less, that is just bias. Also, nVidia makes chipsets only, not actual mobos, unlike Intel, which makes both chipsets and motherboards. As far as I know, the only nVidia-chipset motherboard manufacturer to offer a full lifetime warranty is eVGA; they also offer this with their videocards. That has nothing to do with the chipset, rather the company's desire for satisfied customers and good public relations. The nVidia 680i chipset was rushed to market and it (the 680i chipset) had problems. I say had because those problems were fixed and/or addressed with the 780i series.

    And you talk of dual Ethernet on Intel boards; nVidia was the first chipset manufacturer to bring that feature to the consumer market. (sarcasm ahead ;p) WOW! Quad LAN on the P45s? Really? Because the three most expensive P45 boards on Newegg didn't have this "Quad LAN" feature. In fact, the only motherboards to have it were a few one-off boards by Gigabyte and Asus, among others. Find one board manufacturer who is actually still incorporating 4 LAN ports and I won't dismiss this as a marketing gimmick.

    And yes, I admit Intel CPUs in general OC easier on Intel chipsets; that is to be expected. Notice I said they overclock easier, not necessarily better. The ability to OC lies mainly in the quality of the motherboard and the features of the board's BIOS. And there are no stability/OC'ing issues on nVidia chipsets at stock settings. Only with a bad OC can issues start to show themselves, but then again a bad OC would do that with any company's chipset.
     
    Last edited: Jul 8, 2008
  9. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    No problem with the ripping, I may have to do the same :)

    I like debates, it allows people to see both sides of the story :)

    Yes, I am bragging that two 4850s will beat a GTX 280. £240 for two 4850s vs £340 for one GTX 280; I know which one I would take.

    And I'm sorry, but your information is wrong. Crossfire is software based; apart from the connectors it's fully software based, whereas SLI is hardware based. This is why you could Crossfire a 3850 and a 3870X2, and they are working on drivers to Crossfire between the 3 and 4 series of cards.

    Give me 3 reliable sources showing this has ever happened to anyone? Nope? Thought not. As I am doing electronic engineering, I think I would know a little about this :) Disconnecting and reconnecting a circuit has NO effect on its resistance. Unless you reconnect the cables with a different metal, there will be ZERO difference. Corrosion is a non-worry, as oxidisation does not occur. Cables will not come loose unless you tug and break their retention clips.


    Do you honestly believe anything written by a manufacturer? They mean for the WHOLE system. They do this to cover their arses, so people cannot complain. They test this with extreme quad cores, overclocked massively, with a lot of high-power fans, HDDs, ODDs, etc. etc.

    OK, 10A was off; I think it was 11A for the one card.

    Can you give proof that a single rail is better, period?
    If the 100% scaling in Crossfire is BS, have a look for yourself:

    [benchmark chart]

    No nVidia card does it, yet TWO of the ATI cards just have.

    Could we keep it to no swearing? I do not want this thread closed down :)

    [benchmark chart]

    [benchmark chart]

    :D Of course, in games where Crossfire support is not ready this won't work, but I am sure you would check that before you go and spend £240 :)


    OK, let me firstly clarify: when I say Intel and nVidia boards, I do not mean the actual mobo itself, but rather the chipset. Sorry if I confused you :)

    I honestly think that quad LAN is a gimmick, as is dual LAN, but since you rated the dual LAN I thought you might like that.

    Have you not heard of the horror problems with the 790i chipset, which was erasing HDDs? Until supposedly a BIOS update fixed it, but it didn't :(


    As for OCing abilities, yes they may be equal, hell, the nVidia may even be better, but in stability there is only one winner, and that is Intel. The amount of problems that the nVidia chipsets have is pathetic to say the least, esp. with the 4 series southbridge still going "strong" ¬_¬


    http://www.overclockers.co.uk/showp...P45 (Socket 775) PCI-Express DDR2 Motherboard

    Gigabyte GA-EP45-DQ6 Intel P45

    LAN: 4x Realtek 8111C Gigabit LAN

    http://www.gigabyte.com.tw/Products/Motherboard/Products_Overview.aspx?ProductID=2831




     
  10. PeaInAPod

    PeaInAPod Active member

    Joined:
    Nov 28, 2005
    Messages:
    3,050
    Likes Received:
    0
    Trophy Points:
    66
    Again, the fact that it takes two cards to beat one is not something to brag about. And according to Anandtech, two HD 4850's are slower than a single GTX 280. Here's the quote; check the article out for yourself if you don't believe me.
    "A pair of Radeon HD 4850s can come close to the performance of a GeForce GTX 280,"

    That is also why Crossfire tends to be the lesser performing of the two multi-GPU technologies when they are compared to each other.

    Maybe you should read my posts twice before posting. Did I ever say that modular connectors would guarantee electrical problems? No! I said that having modular plugs raised the risk of those problems, which is entirely true. You're an electrical engineer? Hard to believe, since you're stating that modular cables don't cause more resistance, and this type of thing is electricity 101: the more breaks in the circuit, the higher the resistance. To prove my point about modular cables, here is an excerpt from wikipedia.org....

    "While modular cabling can help reduce case clutter, they have often been criticized for creating significant amounts of electrical resistance. Some third party websites that do power supply testing have confirmed that the quality of the connector, the age of the connector, the amount of times it was inserted/removed, and various other variables such as dust can all raise resistance.[8]

    While eliminating the excess cables can improve the flow of cooling air inside the computer case, the modular connectors tend to reduce airflow inside the power supply itself. The emphasis on appearance in modular power supply marketing tends to underscore this point"

    I think this question should be directed at you. I explained my reasoning for my (and others') belief that a single +12 volt rail is better. You, on the other hand, haven't backed yourself up.

    There's no way in hell you're an electrical engineer, because nothing you said above related to electricity/PC power makes any sense. First off, power supplies are tested in a PSU tester, a machine that can max out the PSU's load limit for extended periods of time while measuring the various voltages being output and the variance in them. And if you think that 8800 Ultras or ANY modern mid-to-high-end graphics card can run off of 10-11 amps, you're sadly mistaken. Go find one modern videocard that runs off of 10-11 amps and I will say I was wrong. Saying that is just plain ignorant.

    As for all those charts you posted, they prove my point! Look at the first chart with the readings for frames per second (FPS) for a single Radeon HD 4850: it is 21.5 FPS. Two 4850's in Crossfire mode would have to post 43 FPS to be considered 100% scaling. It gets 39.5 FPS, a shortfall of roughly 3.5 FPS. The same comparison but with nVidia's GTX 280 shows a 10.2 FPS shortfall. With 3.5 and 10.2 being our numbers, the Crossfire shortfall is only about a third of the SLI one, but that is only for these two benchmarks, and two benchmarks are hardly enough to call anything definite. And this all goes along with what I said before about scaling performance being 65%-75%.
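
    Working the scaling arithmetic from the figures quoted above (21.5 FPS for one HD 4850, 39.5 FPS for two in Crossfire), a quick sketch of the usual percentage calculation:

    [CODE]
    # Scaling from the FPS figures quoted above. "100% scaling" = exactly double.
    single_fps = 21.5   # one HD 4850 (figure quoted in this thread)
    dual_fps = 39.5     # two HD 4850s in Crossfire (figure quoted in this thread)

    scaling_pct = (dual_fps - single_fps) / single_fps * 100   # gain from card #2
    shortfall_fps = single_fps * 2 - dual_fps                  # FPS short of a perfect 2x

    print(f"Scaling from the second card: {scaling_pct:.0f}%")       # ~84%
    print(f"Short of perfect doubling by: {shortfall_fps:.1f} FPS")  # 3.5 FPS
    [/CODE]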

    I didn't state that dual Ethernet ports are a feature you should buy for. I simply stated that in certain circumstances having a 2nd Ethernet port is quite handy.

    I didn't hear about this. I did hear about people with RAID setups getting too zealous with their overclocking, damaging their RAID controller, which then didn't allow them to access their drives and in other cases corrupted all their data. But that was user error, not a manufacturing defect. Prove me wrong; post an article that says otherwise.

    Problems? Like what?

    How so? Like I said before, the stability of any CPU in any mobo will be perfect as long as it isn't OC'd. I also said that when OC'ing, to get a stable system, a solid motherboard with a good BIOS and chipset is needed. I said that Intel was the easier board to OC on, but that in general higher OC's could be had on an nVidia board due to the performance-oriented nature of their chipsets and BIOS. So unless you can think of something smarter to add than "Intel is the most stable", don't bother posting about it.

    And those four links for the Gigabyte motherboard: that is the same motherboard I referenced in my post. But like I said, how many consumers have bought that board? Both AD and the PC parts site you linked me to have ZERO reviews for the product, which leads me to believe no one has bothered to purchase it. And my belief stands that the inclusion of 4 LAN ports is but a gimmick.
     
  11. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Considering that a GTX280 costs as much as three HD4850s, getting two and having the same performance as a 280 (better in quite a few instances) makes it the better buy in my mind. Yes, you have to get a good motherboard, but who wouldn't when they're spending six hundred dollars on a graphics card?

    Crossfire used to be gimmicky, but now it's just plain sense. Oh, and Anandtech have had a fair few suspicious results in the past. Crossfire DOES now give 100% boosts - and claiming the GTX280 is the latest and greatest because it only takes one card is mad - it costs as much as two normal cards, it uses as much power as two normal cards, it runs as fast as two normal cards. So what if it's a single card? Economically it makes no difference but for the fact it can be run in any old motherboard, and with that much power why would you want to?
     
    Last edited: Jul 8, 2008
  12. PeaInAPod

    PeaInAPod Active member

    Joined:
    Nov 28, 2005
    Messages:
    3,050
    Likes Received:
    0
    Trophy Points:
    66
    I know, acknowledge, and admit that...
    1)the GTX 280 is ungodly expensive
    2)buying 2 HD4850's will net you near the same performance for about $100 less
    3)fiscally the HD4850 makes sense

    But what I am getting at is that for gaming you want to buy the best part you can now. So it makes more sense in a gaming mindset to buy the more powerful GTX280 now. Yes, it's expensive now, but it can beat many two-card configurations on the market by itself. And down the road, should a graphics upgrade be needed, prices will have dropped and another GTX 280 can be slotted in to serve as an upgrade that will extend the life of the machine a good many years.

    I'll just finish off by reiterating my stance that, cost and other issues aside, any single card that can outrun two cards running Crossfire/SLI is a superior card. Yes, it may be more expensive, but you're buying the performance.
     
  13. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    There doesn't seem much point in doing that to me - let me explain.
    Firstly, the only way the GTX280 will ever improve is through SLI, completely out of the question for most systems solely on the power requirement - not just having to upgrade the PSU, but the absurd increase in the cost of electricity, and the cost to the planet. As for the GTX280 being a superior card, from a performance standpoint it is, but that does not make it a superior product, simple as that. Its vast size, heat output and so on render it impractical for future development. A pair of 4850s will be no more practical, but having spent so much less money, the saving offsets any future upgrades, and when an upgrade does come, the two cards can be split up to perform different roles.
     
  14. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    IMO it is something to brag about, not because it is two cards, but because of the price to performance. If you want to talk about one card being that good, wait till the 4870X2 comes out (about 1 month), and it will wipe the floor with the GTX280.

    How can you say Crossfire is the lesser performing of the two technologies? Are you really that far into being an nVidia fanboy? (Ask Sam, I am not: when the 8800GT came out I was recommending that a LOT, and the 9600GT too. Price to performance is the biggest winner; fanboys are always the losers.)

    When the scaling of adding a second card in SLI reaches 100%, then come back and tell me. Until then, Crossfire performance can increase with software, whereas for SLI that's not really an option: a game is either coded well for it or not.

    WOW, you quoted that most reliable of sources, WIKI, whereas I am quoting my teachers, who are more than qualified.

    I don't know what you know about resistance, BUT having breaks in a circuit will NEVER increase resistance. Resistance increases if the resistivity of a wire is higher. If there are more atoms in the same volume of a modular cable vs a non-modular cable, then YES, resistance will increase. This is because as electrons flow, they carry charge; this charge is the electrical energy needed to (for example) light a bulb. If there are a lot of atoms in the wire, or fewer but bigger atoms (e.g. from an element with a higher proton number), then as the electrons flow they will hit these atoms and, lo and behold, waste the energy they have stored. THIS is resistance: electrons hitting the atoms, which slows their drift velocity down, wasting energy in trying to increase it again.

    Having modular cables DOES NOT increase the number of atoms in a wire like magic or something. Turning a circuit on and off does not increase resistance. BUT if a wire is longer, resistance increases, as there are more atoms in the wire. This is where modular cables come in well: they allow you to use shorter cables, therefore reducing resistance. Why is it that as you extend a telephone wire, the quality is reduced? This is due to resistance. But if you have a shorter wire, less resistance.

    I don't give two hoots whether you believe that I do electronic engineering or not; that does not affect what I do. But I am not an idiot, and I do not read these companies' words like they are the gospel truth. I do my own research and apply the knowledge I have gained academically.

    BTW, have you ever read the reviews? They are not taking exact frames but rounding up. When rounding, you can have a 1 FPS error in both the upper and lower limits (simple standard deviation).

    With not just this site BUT MANY sites getting 100%, or near enough 99%, scaling on the new cards in games that support Crossfire, it's unheard of and amazing. Take The Witcher: before the 8.6 hotfix it showed pathetic scaling, but with a mere driver fix it shows near 100% scaling. GRID did not use Crossfire until the new drivers.

    http://www.xcpus.com/forums/rumor-mill/12406-790i-780i-may-cause-hd-corruption-failure.html

    read that, and the links they provide.

    http://www.anandtech.com/mb/showdoc.aspx?i=3279&p=4

    http://www.xbitlabs.com/news/chipse...a_Corruption_During_Overclocking_Company.html

    http://www.xtremesystems.org/forums/showthread.php?s=08384819b23d5b996de2463d8c104269&t=183119

    Talk about instability with nVidia chipsets, and yet you still support people getting them? These problems have never been had with the Intel chipsets.

    Obviously having an nForce 4 chipset still on the nForce 7 boards is also okay with you?

    Talk about nVidia innovation ¬_¬

    Oh, what a way to argue: just dismiss what I said and act as if it means s**t all.

    Stability is the first and foremost thing an OCer looks to. You look to take a chip beyond a certain level BUT keep it stable. Who cares if it is stable enough to run a Q6600 at 4.5GHz for 3DMark and post it on the net, just to grow your e-penis? MOST OCers look to compromise and get the Q6600 to 3.6GHz and have a fully 24/7 stable computer, rather than BSODs and corruption every day.

    Why the hell would you spend hundreds on a board to keep a CPU at stock? Doesn't that just defeat the purpose? Or is it that nVidia have such a stranglehold on you, they need your money for SLI?

    I would switch back to nVidia in a heartbeat if their price to performance was better than ATI's, and if their drivers worked nicely on Vista 64-bit (which they didn't, but the second I switched to an ATI card, no more BSODs - but then that's my experience).


    Quad LAN is a gimmick, as is dual LAN, but I am sure there are people that use it.


    I'm sorry if I sounded rude, but I tried to stay cheerful in the other post, whereas you obviously decided against that and sounded completely rude to me in your last post.


    here is a review for the DQ6
    http://www.cpu3d.com/content/view/5141/54/


     
  15. dytopia

    dytopia Member

    Joined:
    Apr 29, 2007
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    11
    May I just say, I'm brand new to this, and I know barely anything about computer parts and building; I don't even know what overclocking means. I know just the very basics of hardware, which is nothing in comparison to you guys. So all I want to do is build a top of the line computer that will not malfunction on me. I just want to keep it safe... safe and cheap. Under $1500. So I really need a straightforward, non-conflicting answer here because I'm really lost. Sorry, I'm just really new to this.
     
  16. PeaInAPod

    PeaInAPod Active member

    Joined:
    Nov 28, 2005
    Messages:
    3,050
    Likes Received:
    0
    Trophy Points:
    66
    Yes, we get it! I even said the HD 4850 was the more sensible/economical videocard to get, but that for all-out graphics horsepower the GTX 280 reigns supreme. I start to get rude because you only seem to read the parts of my posts that you're interested in and nothing more. If you bothered to completely read my posts, you would have realized that I have agreed that the HD 4850 offers higher value, but that I still stand firm in saying the nVidia-based cards are better when it comes to performance. I am not taking into account cost or other fiscal issues. I am simply stating that to date the GTX 280 is the single fastest card on the market. Argue about that all you want, but you know it's true.

    No, I am not a fanboy. I am a power user who believes that when one product beats another product (like one GTX 280 handily beating one HD 4850), that product is the better product. I buy the best parts at the time. And I do take into consideration price-to-performance; it is why I bought a Q6600 for $260 and OC'ed it instead of a $1,000 quad core.

    And again with the electrical nonsense. You really don't seem to know what you're talking about. Resistance can be increased by any number of factors, such as the quality of the conducting material. But again, it is electricity 101: if you have two electrical lines, both the same length, one being a solid run of copper and the other spliced together with crimp/spade-terminal connectors, there will nearly always be a higher resistance measured on the spliced length of cable. I mean, really, there's no way you can say this isn't true. Heck, just another way to prove you're wrong about "resistance only coming from the amount/size of atoms in a cable": what about when corrosion builds up on battery terminals, does that not raise resistance? Yes it does, and the rise in resistance has nothing to do with higher-resistance cabling in this instance, so your theory about resistance is wrong.

    Thanks for posting that. I had already acknowledged the fact that the 790i caused data problems when I posted this...
    And it is why I recommended the 780i instead of the 790i.
    If you would get off your high horse about the modular cable issue, you would realize that I never said that having modular cables will definitely result in more resistance, only that it raises the possibility of an increase in resistance.

    I am sticking to this: what empirical proof do you have that Intel chipsets are superior to nVidia? Heck, I already said this....
    I am not saying that OC'ing with one chipset over another will result in an increase in OC ability, rather that a higher OC is more easily obtained on an Intel chipset. Again, you don't fully read what I post before running off to post your rebuttal.

    Hypocrite, in the same post too!
     
  17. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    For once it's not me starting the fires! lol
    Shaffaaf: Brand affection is something that you have experienced as much as anyone else. However, this said, you are usually quick to spot when it clouds your vision and are responsible and mature when dealing with it. As a result, barring a few brief moments, you know where it's at with most PC hardware. The 8800GT and 9600GT have both been excellent cards, and now it's the HD4850's turn, you've realised that. I don't think Pea has yet.
    I won't say much about the electronics - not that I don't understand, I just don't want to get involved - but the idea that modular cabling causes electrical problems is bull. The addition of extra connectors introduces about the same resistance as lengthening the cables maybe a couple of inches at best. If there were a problem with resistance, you'd start seeing voltages drop, because the effective resistances of circuits in a PC are so low (think: so many amps off so few volts). The fact is, they don't. What does happen is that the cheaper plastic connectors can become loose with frequent use and not make good contact - and if you have a PSU that uses components of that quality, then modular connectors are the least of your worries. Modular PSUs are fine, just like good 500-600W PSUs are fine for running dual graphics cards; the naysayers in both camps are usually the same folk.
    About GRiD briefly, the 4800s support crossfire with 8.5 and 8.6, the 3800s don't seem to support it with either. Not a problem as such, but it goes to show, the two cards crossfire in completely different ways. This helps explain the greater reliability of the system and the bigger performance gains. It's a big result for ATI, one which nvidia have yet to see themselves. If the 9800GTX scaled for double in SLI, people would be buying far more of them, the HD4850 would potentially have a competitor. Alright, not the 9800GTX, it's crap value, but certainly the 8800GT. I remember the 9600GT SLI results, they were impressive indeed, but still fell a little short of the mark - if the other cards could get even this result they'd be away, but they can't, seemingly, I'm not sure why.
    Shaff brings about the big point with SLI though. Nvidia chipsets SUCK. A lot. Untold problems, stability, overclocking, compatibility, you name it, they mucked it up. Intel's chipsets aren't perfect, but the fact that they support dual graphics without all the hideous bugs is absolutely glorious for ATI - heck, with the help of companies like Gigabyte you can even buy a cheap board that runs dual graphics and doesn't fall to bits, crash or cause driver issues with various different bits of hardware.

    As far as quad LAN goes, not seen it, but I've never had the need for four gigabit ports, I only used two for a time. Have you heard of something called a gigabit switch?


    Dytopia: Sorry you're seeing all this, but it's all in the name of getting the right message across, not just for you, but for anyone else. See the bottom of my post for a simple answer.



    Peainapod: "Reigns supreme" - you don't hear lines like that every day. Fanboy much? We've all been there, we see an excellent product and latch onto it like nothing else matters. Fact is though, it's not the only way to build a gaming PC. There's nothing wrong with it, but why spend all that extra money? It's an even greater difference in the UK...
    Lmao - the HD2900XT is faster than the 8600GT, but would you buy one? Hell no! There's a lot more to graphics cards than how fast they are.



    Dytopia: A pair of HD4850s in a Gigabyte Intel chipset board, such as the EX38-DS4 is an inexpensive, winning combination that will offer incredible graphics performance for a long time to come. It's easy to setup, easy to use, cheap to buy, and above all reliable. You can go with the X48-DS4 if you prefer which nets you an even better board. Avoid nvidia chipset boards, always.
     
  18. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    TL;DR
     
  19. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    Then why post? LOL.

    You know what, peainapod, if it makes you feel better: I really cannot be bothered arguing with someone who is not open to the fact he may be wrong. It's just not going to achieve anything, esp. when you won't actually listen to the things I say.

    Just one thing I'd like to pick out, in which you are really ticking me off:

    So I provide you something which is important and should be seen by many, and you brush it off saying you had said it before, yet you did not. THIS is what you said:

    So what I did was show you it WAS A MANUFACTURING fault, a fault of NVIDIA, and they made a statement about it. Yet you seem to be so up yourself that you will not eat some humble pie when you are wrong; instead you say you knew it. ¬_¬ This is one of the main reasons there is no point in bothering with fanboys such as yourself.

    You say you want price to performance (Q6600), yet you'd rather get a GTX280, which will not just cost you $200 more, but also cost you a better mobo (in case you don't have an SLI mobo), and to top it off cost you an EXPENSIVE PSU. Yet for some reason I'm supposed to see that as price to performance... !_!

    What qualifications do you have in electronic engineering that you just dismiss physics for your opinion? There is no way that modular cables, nor their connectors, will use a different metal with a different resistivity from the wire; if they did, voltages would be horrible, yet you do not see that. I am not stupid (which you did not say, but I can assume it from your posts); I have experience with electricity, and just because you have some mangled book of what they tell you in electricity 101 (whatever that BS is) does not mean I am wrong. I am going from years of experience with electricity and study in the area.


    You keep on saying it won't do damage but will increase the mishaps... that's the same thing, and I am telling you it WILL NOT increase the mishaps.
    It's not my fault you read the PC Power and Cooling pages as gospel truth.

    And that e-peen statement was not directed at YOU specifically; I should have written "one's" instead of "your".

    Anyway, this is the last post I'll make to you.

    I agree with what Sam says. Get what he recommends.

    And I am sorry for confusing you.
     
  20. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    Ignore these idiots, listen to me. Just kidding. They aren't idiots. Still, listen to me.

    Intel Q9450
    Gigabyte GA-X48-DS4(Or the Asus Rampage Formula if you can afford it)
    Corsair Dominator PC2-8500 4GB
    2 ATI HD4850
    Corsair 620HX or 650TX (they perform about the same, but the 620 is modular and about 50 dollars more; I use the TX)
    Xigmatek HDT-S1283 CPU cooler with the retention bracket (or the Thermalright 120 Extreme)
    Case is your personal preference. Some of my recommendations are the Antec 900, NZXT Tempest, Thermaltake Armor, and Raidmax Smilodon (the one WITHOUT the PSU)
    I would get a WD Velociraptor for my OS and a Samsung Spinpoint F1 for my other files

    Oh and ncix.com has a good selection, and good prices. It's based in Canada so you should be good.
     
    Last edited: Jul 8, 2008
