
The Official PC building thread - 4th Edition

Discussion in 'Building a new PC' started by ddp, Sep 13, 2010.

  1. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Temp0 is Ambient, Temp1 is the on-board CPU temp, Temp2 is the Northbridge. AMD's on-core sensors are notoriously badly calibrated. I pay much more attention to the Temp1 sensor than to the cores. It is working properly. Remember it's the max temps we need to look at.
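    If you want to capture those maximums yourself rather than eyeballing a monitoring app, a minimal logging sketch along these lines works (assuming Python with psutil on a Linux box where lm-sensors exposes the readings; the chip and label names, like Temp0/Temp1/Temp2, vary by motherboard):

    ```python
    # Minimal max-temperature logger: a sketch, assuming Linux + psutil.
    # Sensor chip/label names are motherboard-specific; Temp0/Temp1/Temp2
    # above are just this board's labels.
    import time
    import psutil

    def log_max_temps(duration_s=300, interval_s=5):
        maxima = {}
        end = time.time() + duration_s
        while time.time() < end:
            for chip, readings in psutil.sensors_temperatures().items():
                for r in readings:
                    key = "%s/%s" % (chip, r.label or "temp")
                    maxima[key] = max(maxima.get(key, -999.0), r.current)
            time.sleep(interval_s)
        for key, temp in sorted(maxima.items()):
            print("%s: max %.1f C" % (key, temp))

    if __name__ == "__main__":
        log_max_temps()
    ```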

    Ha most of those should be in the Q6600 filebox but they always seem to float around depending on what I need to do. The FALS is my OS drive and there are 2 more Seagates below those. The FALS and two newer Seagates make up this system's normal drive array. The HAF is currently full up on drives :p

    The Northbridge is currently not overvolted; everything is at stock volts. I have the cooler swapped out from the older board, and the 40mm fan for it is next on my list. I want to get it back below 60°C or, as you say, it's not going to last long-term.

    Take comfort in the fact that it (the Northbridge) never even goes to 55°C under gaming and other loads; only Prime95 takes it that high. I played some 4 hours of Max Payne 3 today and it only hit 51°C.

    Under gaming load, which is the hottest this PC is going to get daily, the temps are VERY comfortable. Not a single hint of being too hot. In fact, quite cool compared to similar AMD systems.


    My current temps sitting idle with a few browser windows open, a movie playing, and some torrents downloading.

    As you can see, there is no cause for alarm here.
     
    Last edited: Jul 27, 2013
  2. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    To be honest, my temps get a bit silly when doing a proper stress test, primarily because the heat the GPUs generate when stressed (500W between them in an all-transistors test environment) heats the PCH up as it sits directly underneath them, and that reaches 65°C odd unless I max the side fans out at 1900rpm. I've since lost two of them since last doing that (one failed fan, one failed channel on the Scythe controller). In normal usage, though, in the most hardware-demanding (not performance-demanding) games like Left 4 Dead 2, it's mostly 50s on anything that isn't the GPUs. Annoyingly, since finally updating from an old 2012 set of drivers to a 2013 one (13.6 maybe?), maybe once in every 10 or so L4D2 games I'll get a hardware lockup mid-game. The timing correlates too well with the graphics driver update for me to think it's hardware, and it doesn't happen in any other titles that I've seen.

    Given that crossfire on the HD7 series is still quite poor to actually use, I was at one point considering moving to SLI for my next upgrade (which'd still be a long way off at the moment), but now that a 4K monitor is definitely on the shopping list, that option's been removed, as nvidia still don't support 4K60 properly due to the displayport debacle. AMD it is.
     
  3. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    The final step in my little cooling upgrade will be to make sure the temps do not get silly at all. Obviously, practicality must take priority here. My PC is not going to be running Prime 24/7, so it's never going to be pinned at max temp like the screenshots show. But I would like to have that headroom nonetheless, if for nothing but peace of mind...

    I think every performance user and OC'er would like to be able to run a stress test 24/7 and never have an issue. Impractical, maybe, but it sure is a bullet-proof way to know your cooling will always be enough.

    As for stress testing GPUs, I think it's absolutely stupid. No real workload, not GPU compute, not rendering, nothing, will ever push them to that type of power draw or heat output. It's completely unrealistic, much more so than using Prime or IBT on a CPU, and will never serve a purpose except to destroy good hardware and to test liquid loops.

    As for drivers, AMD have been dragging their feet for about 4-5 months now. I am currently considering an Nvidia card for my next upgrade, as AMD have severely let me down with their driver support for the HD6800 series: basically no Crossfire support for any new games until they're far beyond old news. Still waiting for a proper CAP for Metro Last Light, as my current scaling is only some 40%, when Metro 2033 has about 90%.
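    (For anyone wondering how scaling percentages like that are worked out, it's just FPS arithmetic: the gain from the second GPU expressed as a fraction of one card's performance. A quick sketch, with made-up numbers purely for illustration:)

    ```python
    # How multi-GPU "scaling" figures like the 40% / 90% above are
    # commonly derived from average FPS. The FPS values here are
    # invented for illustration, not actual benchmarks.
    def scaling_percent(fps_one_gpu, fps_two_gpus):
        """Extra performance the second GPU adds, as a % of one card."""
        return (fps_two_gpus / fps_one_gpu - 1.0) * 100.0

    print(scaling_percent(40.0, 56.0))  # 40.0 -> ~40% scaling
    print(scaling_percent(40.0, 76.0))  # 90.0 -> ~90% scaling
    ```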

    Not to mention that in the past my raw performance has been dictated by AMD's mood that month. Right after I got these cards, a driver update necessary for Crossfire scaling in some newer titles caused their single GPU performance to take a 10-15% dive and it was never remedied. Excuse my language but that's absolute BULLSHIT. My hardware got directly nerfed to sell newer cards. I believe you've recorded this change yourself.

    I'll also mention that several companies have blatantly halved the performance of AMD cards versus the equivalent Nvidia ones. TWIMTBP isn't stagnating, it's growing. Don't even get me started on PhysX: features that have nothing to do with PhysX get removed from the game and withheld until you buy an Nvidia card.

    http://www.youtube.com/watch?v=VafzR7JqO2I

    I rest my case. Blatant BS. Not one single effect here actually requires PhysX to pull off. Havok is capable of all of it with the same general precision and performance, and CPUs are currently underutilized in games, so there's easily enough power in most gaming rigs to render it. Those are some pretty spectacular effects as well, and I feel that I am truly being cheated out of the full game. Try enabling PhysX on an AMD card, and you get the effects, but at 1/4 the framerate...
     
    Last edited: Jul 28, 2013
  4. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    This did happen, but as I understand it, it was because the HD6800 series' performance was artificially inflated by an image quality hack at launch, which was quickly removed once reviewers spotted it. In all honesty, I'd rather have it gone and accept lower performance, as historically, image quality over raw performance was what owning an ATI card was all about.
    AMD's attitude re: TWIMTBP is 'If you can't beat them, join them' and there is now a small number of titles where roles are reversed, but not many. AMD have never been anything like as good at marketing as nvidia, and it shows when you look at how many fanboys there are of either side. I'd probably say it's at least a 4:1 or even 5:1 ratio of nvidia fanboys versus AMD fanboys, which is what makes querying (or even advising) AMD hardware all the more difficult in open forum.

    Code was found in nvidia's PhysX implementation years back that artificially limits the frame rate by capping the PhysX process at c. 30% of one core of your CPU (and as far as I know it has never been removed). I'll also point out that this cap, though crippling on any CPU, hits owners of AMD FX CPUs, with their lower per-core performance, even harder.
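    Just to illustrate the arithmetic of such a cap (my own sketch with assumed step costs, not nvidia's actual code): a thread allowed only ~30% of one core's time per second completes proportionally fewer physics steps, and a weaker core, which pays more CPU time per step, falls further still.

    ```python
    # Effect of a ~30% single-core duty-cycle cap on a physics thread.
    # Step costs are hypothetical; this is illustrative arithmetic only.
    def physics_steps_per_sec(duty_cycle, step_cost_s):
        # CPU-time budget per wall-clock second / cost per step.
        return duty_cycle / step_cost_s

    fast_core = 0.002  # assumed seconds per physics step, strong core
    slow_core = 0.003  # assumed seconds per physics step, weaker core
    print(physics_steps_per_sec(1.0, fast_core))  # uncapped: 500 steps/s
    print(physics_steps_per_sec(0.3, fast_core))  # capped:   150 steps/s
    print(physics_steps_per_sec(0.3, slow_core))  # capped + weak core: 100
    ```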
    PhysX is perfectly capable of running on the CPU if you've got the spare cores to handle it, but extra performance and visual effects are not what PhysX is about - it's all about degrading the performance of your opposition through, effectively, bribing game developers.
    It's a disgusting practice, and one I've long been of the mindset of trying to counter by avoiding the purchase/recommendation of nvidia hardware unless there's an obvious economic case for it. Frankly, though, people who actually do the research into this sort of stuff and properly understand what's going on are a real minority: a small drop in an ocean of 15 year old Call of Duty players with blue LED-lined cases who think nvidia are god's gift to the earth. It's all marketing; nobody's born to think that way.

    I could understand nvidia's reluctance to embrace displayport, as the implementation of adapting it to fit DVI displays (for eyefinity purposes) is absolutely disgusting, and by far the worst attribute of AMD's driver standards. That said, they have now at least added it to the recent crop of Geforce cards. It's still only a single full-size connector rather than multiple mini-displayports though, and further, nvidia's implementation of 4K is quite poor still.
    Really, going for ultra high-res displays is still AMD's territory - nvidia aren't going to develop cards to work well with a platform that leaves their GPUs decidedly second best (Geforce performance drops off quite rapidly above 2560x1440).
    Thing is though, the way things currently are, SLI delivers an enjoyable gaming experience, Crossfire doesn't. Unless that changes, one single HD7970 is as good as it gets on the AMD side, and for 3840x2160 gaming, even if you disable AA due to the higher DPI on the screen, that's looking a bit thin.

    I'll re-assess the situation at such time I can actually afford a 4K display, so probably a little under a year from now. By then, if Crossfire still suffers the same issues it does today with the HD7 series, it'll be time to start looking at the Geforce lineup. Otherwise, it's still going to be an AMD solution next upgrade.
     
  5. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    I resent the Blue LED remark :p I like flashy stuff as long as it's tasteful.

    That said, I agree with the entirety of that post, especially AMD's absolutely piss-poor management of Crossfire. More often than not I'm stuck gaming on a single card lately, because AMD haven't released a relevant CAP for about 6 months.

    I will certainly be going single card this time around. Crossfire was dead awesome when AMD actually gave a crap and kept their CAPs updated. The ones they have released lately affect games that don't need it or games that nobody plays. Look at the latest 13.5. Still tweaking CoD4 performance? Really? You mean that game that runs at over 100FPS maxed on every card since the GeForce 8800s came out? Basically zero effort from AMD on their driver situation for quite a while now.

    I also agree that SLI is currently much better than Crossfire. Being hardware based means it simply works in most things. Crossfire depends entirely on AMD to release new drivers. The HD7s have been out for quite a while now. I don't think they plan to put any more effort into supporting Crossfire as a serious technology.

    It really sucks that you basically need to go Nvidia these days, or you get screwed. It doesn't seem like AMD gives much of a damn about their customers any more.
     
    Last edited: Jul 28, 2013
  6. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Oh I quite agree, but it's rarely tasteful. Unfortunately you very rarely see it with other colours. I was very fond of blue LEDs when they first came out and even went with a blue themed PC for a while, but it became too synonymous with that sort of cheap tacky system, so I scaled back the lighting and went looking for other colours. I've abandoned blue backlit keyboards, never had a blue mouse, but ultimately there are blue LEDs on all my monitors, my Z-5500 console, all my external hard disk docks bar one, all the hotswap trays on my server, and the HDD LEDs on my PCs. That's plenty :D

    SLI is every bit as profile and driver bound as Crossfire, and nvidia don't always get it right; there were a fair few complaints about titles that went 5 months before a driver update. The difference is, with Geforces that's the occasional case, whereas with Crossfire it's the norm, and it's unusual to see otherwise.

    There's still much I resent about nvidia and the PhysX situation is the tip of the iceberg, but I've been 'voting with my wallet' for almost 10 years and they're most definitely still the number 2 manufacturer.

    I still genuinely feel I'd compromise the long-term longevity of my system if I installed a Geforce card, as they still don't seem to last as long on average, but if it means a decent running system for 2-3 years, that'll probably do.
    The mentality of 'bad software is fixable, bad hardware isn't' is starting to wear a bit thin now...

    The HD6900s have always been fairly good performers in Crossfire, so they're not going anywhere for now. If I went with HD7900s, though, it'd be a different story, and one HD7970GE is still not an upgrade from two HD6970s. Two GTX780s are currently way out of my league cost-wise anyway, and as said, I'm not in any dire need of more graphics power until I either push even higher up the resolution scale or get round to Far Cry 3, Crysis 3 and Metro, which is still a long way off; too much else to play that'll run well on my existing hardware first!
     
  7. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Haha, but no LED fans or CCFLs in the case? For shame :p

    No I agree that flashy lights and gizmos have become the norm for cheaply built junk PCs. Blue is EVERYWHERE. I know quite a few people who fancy themselves "PC Gamers" because they got a cheap case with some lights or a bargain bin PSU with an LED fan. It does begin to grate on one's senses when people choose to illuminate shoddy workmanship and crap components.

    I do prefer that my PC have some dedicated lighting however. Call it a diehard habit from my noob days. I also don't mind LED fans if they're decent quality. I like it flashy so as to illuminate my attention to detail and build quality. I take a great amount of pride in my clean builds as I spent many years developing my wire management skills :)

    I agree entirely on driver releases. You don't see anywhere near the same complaints leveled at Nvidia's driver department. And when a game is lacking proper SLI support, Nvidia make it a priority, as they know people will be eagerly awaiting the next driver. AMD have yet to release a non-beta driver since 13.4; it'll soon be 4 months without a proper driver, and the next one still probably won't fix anything. In that time, Nvidia have released nearly a dozen driver updates, all addressing issues with the newest games. I haven't seen AMD put that kind of priority on a driver release since 2011... Not to mention the ENTIRE CROSSFIRE USERBASE IS STILL WAITING TO PLAY METRO LAST LIGHT.

    Yes, the HD6900s never got shafted as badly as the HD6800s and 7900s; AMD made a major screwup there. My HD5850s and HD4870s were much less problematic as well.

    Oh I definitely agree. I'm sure if we both sat down and compiled a list of known biases, bribes, and outright cheating, it would be several forum pages long. Nvidia has a LOT of things not to like. PhysX is most certainly one of many, many complaints.

    Nvidia's longevity issues are indeed still prevalent. I can't help but agree, though, that Nvidia hardware simply seems to work better on average and faces significantly fewer issues with performance, scaling, functionality, etc. Of course, the argument of bad hardware vs bad software still holds weight, but AMD still have yet to fix their bad software...

    Crysis 3 I'll certainly give you. Far Cry 3 is much more reasonable and quite well optimised but still very demanding. Depending on which Metro you're referring to I'm not sure what to say. Metro 2033 would actually be quite doable considering my own performance with the game. 60+ at almost all times, excellent Crossfire scaling. Metro Last Light on the other hand... refer to Crysis 3...
    -----------------------------------------------------------------------

    On another note, Crysis 2 was a step in the wrong direction but actually managed to be an excellent game otherwise. Crysis 3... they reintroduce Psycho with a whole different personality, a new face model, and a new voice actor. They completely ruined the best character in the game. Also, Nomad, the main character from Crysis 1, gets a retconned death in some obscure, poorly made comic. Don't even get me started on their treatment of the story. The first games set everything up for an amazing sequel, and they basically scrapped everything for generic enemies and the lamest FPS story in years.

    How the hell do they manage to put out solid gold like Crysis and Warhead, then make the sequels such a steaming pile? They took the story, quality, ambition, meticulous attention to detail, art direction, and even some of the technology completely out of the sequels and threw it all down the drain. Crysis 2 and 3 are so much lower in quality, and so differently made, than Crysis and Warhead that I'm starting to wonder if they were even made by the same team.

    Crysis 1 and Warhead managed near photorealism at times and had gigantic, sprawling environments packed with hand-placed details... and Crysis 2 and 3 are horrible CoD-clone corridor crawls with way too much gloss, uninspired level design and mediocre graphics. Crysis 1 still looks miles better than 3.
     
    Last edited: Jul 28, 2013
  8. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Nah, the only LED fan I have in use now is a red Antec Tricool as the CPU fan in my gaming PC.
    The LED is useful as it means I can see whether the motherboard fan controller has activated the fan (once every 100 or so boots it 'forgets', leading to an overheat pretty rapidly; a 200W+ CPU doesn't work with no fan!), but really it's because it's substantial enough to survive attached to the cooler. The Slipstream fans I use won't work on hot surfaces as they're too fragile. I tried a Gentle Typhoon high-speed for the extra grunt, but the bearing noise was horrendous. The S-Flex fans are still the best I've had for balance of quietness, airflow and longevity, but 1500rpm doesn't quite cut it when the room is hot.

    The HD4800s were definitely the last 'just works' AMD cards: none of the GSODs the HD5s had, none of the Crossfire woes the HD68s and HD7s have (different issues, I know), consistent performance, solid reliability etc.
    All that said, the HD4870X2 pair was high maintenance. Once they got old (2 years+), the coolers that were perfectly adequate at 3000rpm when new were no longer really keeping up at 6000rpm, even with a good clean. The 60dB+ noise level, and having to remember to crank all four 1900rpm side case fans up to max as well, is not something I miss, nor is the 800W power consumption/heat output.
    The HD6970s have been inconspicuous. With the lack of light in my case you can't even see them through the window, they make minimal noise even in games, and just last month they surpassed the age my HD4870X2s were when they were retired. The performance is all there, slightly more of it in fact, since you're not relying on 4-GPU scaling or a cap of 80% (something the HD6 series did improve on, upping it to the current 95%), but with none of the heat/noise/fuss.
    Apart from the lack of new game profiles, they work great together. More than can be said for the HD7s, unfortunately.

    I can't say the HD7s are bad as singular cards though - the MSI HD7770 I bought for my LAN PC (see http://www.newegg.com/Product/Product.aspx?Item=N82E16814127687 ) was cheap (<£100), is very quiet (<20dB) even when gaming, and has so far worked perfectly since I bought it, even if it isn't used often. The power usage and heat output are tiny for the level of performance you get, which is effectively an HD5850/HD6870's worth, in a proper slot-length card with a single power connector.
    Just a shame that multi-GPU technology hasn't advanced at the same pace...if at all...if not regressed.
     
    Last edited: Jul 28, 2013
  9. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    For the record, Bioshock Infinite runs absolutely perfectly maxed with 4xAA: 60FPS locked with Vsync (with some minor tweaking, of course), and damn near 100% scaling. I sure wish more attention would be paid to big-name titles; there are a great many I could run beautifully if Crossfire simply worked properly. War Thunder, on the ACES engine, is another example. The engine is already firmly established in several big-name flight sim releases, and Crossfire works wonderfully in them. War Thunder, however, needs a CAP. It's already capable of multi-GPU, as SLI works for it, and enabling AFR makes Crossfire work with some 80% scaling, but with visual artifacts. It's all down to AMD to release a proper driver that supports it. It's a big-name competitive simulation MMO with 3 million players; you'd think some attention would be paid to it. Nothing for months and months now. Been playing since March.
     
  10. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Rather gutting news: the next-generation Nexus 10 will be manufactured by Asus. I'm pretty attached to my current-gen Nexus 10, it's pretty indispensable, so it looks like I now have to look into getting another current-gen 10 before they go out of production, in case anything happens to my current one. Since I'm still waiting for anyone but Apple or Google to produce a high-res tablet, it looks like in the worst case I'll have to keep two on me at all times and ensure everything on them is cloud-based, so that when one fails I can carry on with the other. Such a nuisance; Asus seem to be making almost everything these days.
     
  11. Ripper

    Ripper Active member

    Joined:
    Feb 20, 2006
    Messages:
    4,697
    Likes Received:
    13
    Trophy Points:
    68
    What's your beef with ASUS? Or are you just speaking from a market diversity point of view?

    I think it was the obvious commercial move, and the N10 did a lot worse than its smaller counterpart.
     
  12. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    I think a primary reason for that is the price though. If an Asus-manufactured N10 is cheaper (which it could well be) then that's as good a reason as any.

    An Asus product is no good to me personally because I really don't want to spend the time and effort sending stuff back every 6 months. The 10 has become such an integral device for me at work that I can't afford to be in a position where at any minute I could be left without use of it. Carrying two 10" tablets around for the sake of contingency really isn't very practical.
     
    Last edited: Aug 13, 2013
  13. Mr-Movies

    Mr-Movies Active member

    Joined:
    Nov 9, 2002
    Messages:
    1,225
    Likes Received:
    0
    Trophy Points:
    66
    ASUS or not, that could just happen; no one is covered 100% of the time unless you have good redundancy.

    I don't think ASUS tablets are bad. I've sold plenty and haven't seen many come back, unlike some other tablets.
     
  14. ddp

    ddp Moderator Staff Member

    Joined:
    Oct 15, 2004
    Messages:
    39,167
    Likes Received:
    136
    Trophy Points:
    143
    Mr-Movies, what "other tablets"?
     
  15. FredBun

    FredBun Active member

    Joined:
    Nov 27, 2003
    Messages:
    940
    Likes Received:
    0
    Trophy Points:
    66
    Some time ago, if anybody remembers, I made friends with the owner of a computer repair shop in my neighborhood, a guy named Boris. I often visit him, and hanging around on weekends I see tons of laptops come in. I've seen so many different brands come in for repair, or people asking Boris if he'd be interested in buying their used laptops, which he does often; he repairs and sells them. Only once did I see an Asus come in. I asked him why, and he said they last longer, are better made, and most of all are easier to repair than all the others: nice layout and roomier to work in.

    I'm sure everybody has their own opinion on which is best, but watching with my own eyes, the story speaks for itself. I've also learned that when a Mac comes in is when I see Boris's face cringe. Again I asked why, and he answered there's hardly any room to work in. Then I saw for myself he was not bull-crapping; taking them apart looked like a nightmare.
     
  16. Mr-Movies

    Mr-Movies Active member

    Joined:
    Nov 9, 2002
    Messages:
    1,225
    Likes Received:
    0
    Trophy Points:
    66
    For adults, this is what I can think of off the top of my head. Most of them are good or OK, even the Kyros, which is cheap. Supersonic and Ematic, not so good. We've had problems with the Galaxy Tab, and there was a recall, but I would still buy one; the Nexus is solid too, plus I like Jelly Bean. Polaroid is crap, no surprise there. I also like the Iconia, but people complain about the heat, which is normal for that kind of CPU power. The Lenovo Lynx is nice if you like Windows 8, though I'm not a fan really. I've found with Lenovo that their moderate to high priced laptops and pads are decent, but like most manufacturers, some of their stuff is crap.

    Samsung Galaxy Tab's (1, 2, 3)
    Ematic Genesis Prime & Pro versions
    Acer Iconia (multiple versions)
    Polaroid 10.1" Internet Tablet
    Coby Kyros (Multiple versions)
    Lenovo Lynx 11.6" 64GB Tablet with Windows 8
    Supersonic Matrix
    ASUS Nexus 7
    Galaxy Note 10.1 Inch 16 GB Slate Tablet
    ASUS Eee Pad Transformer TF101
    Asus Transformer Pad Infinity
     
  17. ddp

    ddp Moderator Staff Member

    Joined:
    Oct 15, 2004
    Messages:
    39,167
    Likes Received:
    136
    Trophy Points:
    143
    Had to do that on a 2- or 3-year-old Mac Pro laptop where either the operating system screwed up or the hard drive was developing a bad spot. Had to pull the drive out & run a data recovery program on it to retrieve the data the customer's daughter hadn't got around to backing up. 205 gigs & 27 hrs later, it was done.
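    (That works out to roughly 2 MB/s, which is about what you'd expect from a drive with developing bad sectors; a quick back-of-envelope check:)

    ```python
    # Back-of-envelope check of that recovery rate: 205 GB in 27 hours.
    gigs, hours = 205, 27
    mb_per_sec = gigs * 1024.0 / (hours * 3600)
    print("%.1f MB/s" % mb_per_sec)  # ~2.2 MB/s
    ```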
     
  18. FredBun

    FredBun Active member

    Joined:
    Nov 27, 2003
    Messages:
    940
    Likes Received:
    0
    Trophy Points:
    66
    LOL I rest my case.
     
  19. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Many years ago, around 2002-2004, that was true; it's how Asus' reputation was built. Since around 2006, however, they've been selling filth at the same premium price, and for some reason people still keep buying it. You can't plan for unexpected failures, of course, but there's a difference between having a product that suddenly fails and having one that you know will fail any minute but you've no idea when; it produces a whole different mentality. It's something I'm glad I'm rid of in the PC sector. Having left Asus behind, I've finally got to the stage where I have several PCs all reaching quite considerable age (3-5 years) yet still perfectly reliable enough for daily use. The days of replacing motherboards every few months are not days I miss. The same is true for mobile devices: why would I put myself through that sort of experience all over again?
     
  20. Ripper

    Ripper Active member

    Joined:
    Feb 20, 2006
    Messages:
    4,697
    Likes Received:
    13
    Trophy Points:
    68
    Yeah, obviously price is the reason and I'd have thought the new N10 will literally be a bigger version of the new N7; nothing innovative, it just does the job at the price point.

    I've had an original N7 for a while and not had any problems, but I use it casually; I too would need a bigger tablet for work purposes or heavy usage.
     
