
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Read my edit. The great majority of my widescreen video is anamorphic 21:9 or 2.35:1 so it would be an awesome benefit for me. Gaming might not be seamless but it would work alright for the vast majority of content. It would make a passable substitute for triple screen gaming, though a hugely expensive curved 21:9 monitor might excel in that area. Blowing video up to that size is amazing with even decent 720p. DVDs and the like are awful but any sort of proper HD looks pretty good at large sizes. At least that's my opinion. Low bitrate video is always going to be awful.

    http://www.ebay.com/itm/LG-34UM56-3...551545?hash=item1c55eb0b79:g:4hcAAOSw-vlVn5Bu

    Super price for it. Mighty tempting. Not 1440p but still okay. It's between that and the Samsung for me. The benefits are very real and I've been thinking about a 21:9 for ages. 1440p is magical though so it's kind of a tough decision.

    Also, all of the 32" 1440p panels have varying quality issues. The 21:9s are a lot more reliable and consistent from the outset, being IPS-based panels. Both are South Korean panels. Also, with a wireless keyboard I can move the display a bit closer if need be. The 27" will hang around regardless of my choice, but I need something more epic. Either one eliminates the chance of using the 27" at the same time, though. Unless I can somehow rearrange my PC setup...

    The drop to 1080p for the 21:9 is an interesting consequence, but 27" 16:9s at 1080p seem to look pretty decent, and the LG is basically a wider one of those. Desktop real estate isn't really a major issue for me. I want size mostly. I need an entertainment-focused monitor. Both the 32" Samsung and the 34" LG fit that bill pretty nicely. The 32" 16:9 would be pretty awful at 1080p, but the 34" 21:9 at 2560 x 1080 is a little more interesting. Its pixel density isn't far from the 32" at 1440p. Likewise, the 32" at 1440p is also a very interesting combination. I like both of them, and wish I had the desk space for both, or a way to try them both out.

    It seems like for sheer size I'd want a 32", but wow, that anamorphic widescreen. I am very much into video and audio playback at an enthusiast level, so the 21:9 appeals to me greatly. Its greatest weakness is going to be A/V peripherals like consoles and the like. Those will only display in a 16:9 box roughly the size of my current 27".
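    For the curious, the pixel-density comparison works out like this (a quick Python sketch; the sizes and resolutions are the ones discussed above):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(h_px, v_px) / diagonal_in

# 34" 21:9 at 2560x1080 vs 32" 16:9 at 2560x1440
print(f'34" LG:      {ppi(2560, 1080, 34):.1f} PPI')  # ~81.7
print(f'32" Samsung: {ppi(2560, 1440, 32):.1f} PPI')  # ~91.8
```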
     
    Last edited: Oct 30, 2015
  2. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    My god DDP, your boat model building detail is so accurate you need a microscope? I could see how you and Kevin might want it for circuit boards but for modeling? Hey, when are you going to post a picture of one of your models?


    Jeff, thanks for linking to that video.


    Jeff, there is no doubt about it.
    For you I like that 21:9 1440 option!

    Here are a few reasons why:

    1. You have a VERY powerful setup, and while that puts an extra 34% load on your rig, your hardware can handle that.

    2. You like racing and flying games, and you like seeing modern WIDE screen videos without borders. I also like all the new movies - and no question about it - they are all very widescreen.

    3. You just picked up one of the best racing games you have ever had, and watching that video - it strongly reminds me of eyefinity - the very wide field of view has got to add to the immersion factor - you even have a racing steering wheel to increase your immersion.


    -- THAT IS A RACING MONITOR IF I EVER SAW ONE --


    4. You recently joined the high-def world, and falling back to 1080 - I don't know about that. You already appreciate the pixel density of high-res - you have gotten used to it. You may not like moving back to medium res.

    Is 21:9 1440 too much of a leap into more pixels? Well, how does it compare to 4K? Sam is running 8.3 million pixels at 4K. This would be 4.95 million.

    So, without going all the way to 4k, which as you have accurately pointed out is a whopping giant step in the direction of overpowering most hardware, the 4.95 million pixels would put "merely" a 34% increased strain on your system, as we mentioned.
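    The pixel counts above can be sketched out (a quick Python sketch; resolutions as discussed in this thread):

```python
pixels_1440p = 2560 * 1440   # a 16:9 1440p monitor
pixels_uw    = 3440 * 1440   # the 21:9 1440 ultrawide
pixels_4k    = 3840 * 2160   # Sam's 4K setup

print(f"2560x1440: {pixels_1440p / 1e6:.2f} million")  # 3.69
print(f"3440x1440: {pixels_uw / 1e6:.2f} million")     # 4.95
print(f"3840x2160: {pixels_4k / 1e6:.2f} million")     # 8.29
# the ~34% extra load quoted above:
print(f"extra load vs 1440p: {pixels_uw / pixels_1440p - 1:.0%}")
```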

    You have the horsepower to pull it off - in style. Your SLI is working wonderfully, and you have a very fast cpu. As you say, Intel and Nvidia are doing great things for you! You can do this!


    Yes, okay the extra pixels might possibly force you to occasionally drop V-sync - but in any case you would definitely stay well above 40 fps!


    By comparison, I just finished Unity averaging about 25 fps the entire time, slightly lagging most of the time, noticeably lagging in high-population areas, on the graphics setting of High (not Very High, and certainly not Ultra High). I have comparison screenshots I'll post later.

    So I couldn't consider a monitor like that with my present hardware, but you have moved way ahead in powerful hardware - you can pull it off.

    Later, that 21:9 1440 might also turn out to be my upgrade path.


    (My 30" lately has been blinking off momentarily. When I first sit down to play (just finished AC Unity) and the table moves slightly, the monitor blinks off momentarily. I guess that something is loosening up in the monitor board - the cables seem to be tight.)


    One day, on my gaming table, I could easily replace my 30" with one of those 3440x1440 monitors, sacrificing 10% of vertical pixels, but picking up all those horizontal pixels to feed my peripheral vision and enhance immersion - now that would be really cool.

    I don't play racing games any more, but I did play two of the games in the video that you linked to - crysis 3 and AC Unity.

    I never knew such a monitor existed. Those two games looked very good on that monitor. :)

    Rich

    PS I like the Samsung. We have a 24" Samsung 1080p tv/monitor up in LA at the animator's sister's house - it has an awesome picture (the same pixel density as a 48" 4K monitor) and it is very reliable.

    This price is about what I paid for my 30" Dell 5-6 years ago.

    http://www.microsoftstore.com/store...afeed_Google&gclid=CPKY283E68gCFQRrfgodjDsBxw


    And look, Dell has a curved one also - $750.

    http://www.walmart.com/ip/43857334?...5867856&wl4=&wl5=pla&wl6=101795925416&veh=sem
     
    Last edited: Oct 31, 2015
  3. Estuansis

    Estuansis Active member

    Aha that's just a little out of the range of what I want to spend on a single monitor right now :) I have considered the option of getting the Samsung first, and later getting the 21:9 1080 LG. I simply don't want to spend what it takes to get a 1440 curved monitor, but I would certainly like the benefits of 21:9 in that size. Problem is, to fit both on my desk is nearly impossible. If I had a larger room maybe. As it stands, I'm probably going to opt for the 32" Samsung and call it a day. It's still a vertical and horizontal increase in size, and for 4:3 and 16:9 content it's going to give me a much larger viewing area.

    If there were a larger 21:9 option in 1440p at that price I would spend whatever it takes. A 34" is the same vertical height as my 27", which is the only thing stopping me. My primary aim is to make the objects on the screen larger, which the Samsung will do, in 1440p, at half the cost of a 34" 1440p. I'm still going to have at least one 1440 monitor on my desk no matter what haha. If I were going to buy a 34" I'd still want to find a way to do dual monitors. There would be room for that. The 32" though is going to replace the Dell on my main PC I think. The Dell will be sold to a good friend if it works out well.
     
    Last edited: Oct 31, 2015
  4. harvardguy

    harvardguy Regular member

    Which Samsung is it - can you link?
     
  5. Estuansis

    Estuansis Active member

    http://www.amazon.com/Samsung-WQHD-LED-Monitor-S32D850T/dp/B00L3KNOF4/ref=sr_1_1?s=electronics&ie=UTF8&qid=1446066264&sr=1-1&keywords=Samsung+32"+1440

    It's pretty epic all on its own.

    Size Comparison:
    http://www.displaywars.com/32-inch-16x9-vs-34-inch-21x9

    Excellent little tool, that. As you can see they each have their strengths. Overall the Samsung has more benefits than just about any other monitor for my uses, with none of the 4K drawbacks. It's also the best-built monitor using that generic AMVA+ panel, as they use the latest revision of the panel and better surrounding electronics. They also tend to get shipped better panels from the get-go.
     
  6. harvardguy

    harvardguy Regular member

    Oh, now I completely understand. You're staying with the same pixel count, but moving from 27" to 32", which will increase the size of everything by about 20% - good move! I am more and more impressed with Samsung monitors. The price on that one looks good.

    Your comparison chart seems to indicate that you will end up with more screen surface area with the 32" 1440p than if you went to the 34". Oh yeah, they have surface area below - that is correct, 437.55 vs 418.55 square inches.
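    Those square-inch figures can be reproduced from the diagonal and aspect ratio (a quick Python sketch; the results match the comparison site's numbers):

```python
import math

def screen_area_sq_in(diagonal_in, ar_w, ar_h):
    """Screen area in square inches from diagonal size and aspect ratio."""
    diag_units = math.hypot(ar_w, ar_h)
    width = diagonal_in * ar_w / diag_units
    height = diagonal_in * ar_h / diag_units
    return width * height

print(f'32" 16:9: {screen_area_sq_in(32, 16, 9):.2f} sq in')  # 437.55
print(f'34" 21:9: {screen_area_sq_in(34, 21, 9):.2f} sq in')  # 418.55
```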

    Let me ask you this - what is available in 1600p in a 32, or maybe in a 34". A quick look and I don't see anything. The visual comparison chart doesn't include those specs. Google isn't coming up with anything.

    Looks like 1440p is the sweet spot of good monitors at great prices. Your 32" should work out well for you.

    By the way, going back about two weeks, your discussion of keyboards was quite interesting. I picked up the Saitek backlit gaming keyboard that you used to have - blue backlighting - it's performed well all these years, although some of the key labels have rubbed off for WASD - no problem. I have it securely taped in place.

    For my desktop, I picked up a logitech back-lit keyboard, part number 820-001268 - original about $75 - newegg had a refurbished for about $35 - best keyboard I have ever owned. The labels don't rub off unlike the cheapee logitechs. The action is very precise - nice tactile feedback.

    I googled that part number, and this one came up with a different number. But I think it is very much like this one:
    http://www.walmart.com/ip/Logitech-...71885d3893790a6fab4b426ef8a2b4a&veh=cse#about
     
  7. ddp

    ddp Moderator Staff Member

    Joined:
    Oct 15, 2004
    Messages:
    39,167
    Likes Received:
    136
    Trophy Points:
    143
    Harvard, my monitor on this computer is a 19" Samsung frankenstein, as it has parts from 2 different Samsung monitors: screen & cover from one, and the video controller plus menu-button board from another. What was just VGA is now DVI & VGA, with DVI connected to the videocard.
     
  8. Estuansis

    Estuansis Active member

    At Sam from a page back about the notched capslock key.

    Personally I have huge hands, so I hit the capslock key by accident fairly often when typing. It's extremely frustrating if you are writing in volume, and it's easy to put the wrong meaning across in messaging. It changes an amazing typing keyboard into an obnoxious eyesore. The Razer generally feels really good, if a little soft for my liking, but the capslock key makes me hate it so much. Also, the macro buttons on the far left feel exactly like the other keys, and it's easy to hit them when going for the left modifiers - hitting "M5" instead of Control, for example. I feel for those keys by finding the edge of the keyboard, which obviously isn't possible on the Razer, as there's an extra row of keys.

    I would imagine other typists have had similar problems, or a notched capslock key (or the DOUBLE-notched one on the Logitech K350) wouldn't exist. Even my 30+ year old IBM Model M (M for freakin' masterpiece) has a notched capslock. Perhaps it's an American standard? Maybe we have a different pattern of typing due to our differing uses of the same language and it's a requirement? I really don't know. I am not a standard touch-typist, as most here know. Usually my index fingers mostly, with my middle fingers thrown in here and there. Lol, it's kind of amazing when I watch myself typing. Those fingers fly to cover the entire keyboard, and I match any traditional typist pretty well in practical scenarios. Maybe a few more typos, lol, but as evidenced by this post and the thousands of others I have made, not really a hindrance.

    I would use my IBM keyboard permanently, but it's EXTREMELY loud, and EXTREMELY heavy. You could literally beat down a door with it then immediately after type up the collective works of Shakespeare with flawless precision. Just way too much of a tank for me. Gonna keep it until death it's that amazing. It will definitely outlive me by a long time.
     
  9. Estuansis

    Estuansis Active member

    The Logitech K350 wireless keyboard got here today. Fairly solid build for a keyboard, but nothing special. Pretty lightweight. It's quieter and softer than I expected, but otherwise it types beautifully. All the extra buttons are programmable, which is great. I have a use for all of them. The modified layout feels very much like a natural evolution of QWERTY. Easy to get used to. It was $30 "Used - Very Good" from eBay, which is a great deal. Very clean condition with no scratches or blemishes on top and only a torn label on the bottom. No discernible input lag. Played a round of games last night and it worked just fine. The palm rest and the shape/angle of the keyboard make it exceptionally comfortable to rest my hands on. Far better for long sessions than the Razer or IBM because of less wrist fatigue from holding my wrists straight. The ergonomic shape is not extreme like Microsoft keyboards, so it's not awkward to use. I daresay this is one of the better wireless keyboards I've tried. Overall it's not ideal, but far better than either of my other keyboards and very utilitarian, so it will stay on the main PC for now.

    Also, it has a very simple UI for some functions such as volume, zoom and the like, showing a basic scrollbar. Very nice to see where your Windows volume is at in games. The keyboard essentially has features that are gimmicky, but work well. As a wireless keyboard there have been far worse.

    Having a totally wireless desktop is awesome as well. Much less of a chore to move things around or move my PC. The mouse also functions as a high performance wired mouse if I really need it to. As long as they last, it seems Logitech have made some pretty decent peripherals here. *knock on wood* Sometimes functionality and ergonomics are more important than pure performance. Though the performance seems to be there.

    I run my mouse at 333-500Hz polling instead of 500-1000Hz and set it to power saver mode, which lowers the signal strength a little and allows the mouse to sleep, which I want. I haven't noticed a major difference at 333-500Hz, but at 250Hz it's slightly noticeable. Signal strength seems unaffected.
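    For reference, polling rate maps directly to the interval between position reports (a quick Python sketch; generic USB polling arithmetic, not specific to this mouse):

```python
def report_interval_ms(polling_hz):
    """Milliseconds between input reports at a given polling rate."""
    return 1000.0 / polling_hz

for hz in (250, 333, 500, 1000):
    print(f"{hz:>4} Hz -> {report_interval_ms(hz):.1f} ms between reports")
```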

    A better rechargeable battery should help a lot and get me beyond a day on this mouse. Enough where I can just plug it in when I go to bed and it's ready for a whole day again. Currently I have to charge this one every 10-12 hours. Not ideal, but should be bearable with a pair of high-performance batteries and the excellent multi-charger I have. The keyboard simply sips power in its default mode. Logitech advertise that a good pair of alkalines should last years. Duracell Coppertops it is until the low self-discharge NiMHs arrive. I am a minor rechargeable battery enthusiast, so this is just another opportunity to do some research and get a quality set of batteries :) Samsung, Sanyo, and Sony are my top picks.

    Sony are by far the best, but it can be hard to find some of their premium Li-Ions and they are always more expensive. Some of their best batteries are very high demand, but low production.

    Samsung is almost as good as Sony, but they are far more available and cheaper. When buying Li-Ions, always a safe choice. These are my personal favorite for vaporizer batteries. Their 18650-25Rs are safe, high performance, and last a long time. Not great for very low resistance atomizer coils, but I prefer not to run that low anyway. They aren't designed for <.2 ohm connections. 20A max continuous limits them to roughly .2 ohms and above at 3.7v. Very powerful batteries that fit a lot of applications. Popular in high-perf flashlights.
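    That ~0.2-ohm floor is just Ohm's law applied to the cell's rating (a quick Python sketch using the 20A continuous limit and 3.7V nominal mentioned above):

```python
def min_safe_resistance_ohms(voltage_v, max_current_a):
    """Lowest coil resistance keeping current under the cell's continuous rating (R = V / I)."""
    return voltage_v / max_current_a

r = min_safe_resistance_ohms(3.7, 20)  # Samsung 25R: 20A max continuous
print(f"minimum safe coil resistance: {r:.3f} ohm")  # 0.185
```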

    Sanyo makes the very best of some types of batteries, but since acquisition by Panasonic, outsources others to cheaper manufacturers, so they're hit and miss. Their Eneloop Ni-mhs are still among the best if not the best.
     
    Last edited: Nov 5, 2015
  10. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Oh god, been missing forum replies yet again...

    In the meantime, I bought some new kit ;)

    [photo of the new parts]
     
  11. Estuansis

    Estuansis Active member

    Awesome! No K model for the main PC? I suppose these are already a lot faster than a first Gen i5 without OCing. I won't accept DDR4 though until 3200 is more popular. I find it takes double the previous standard to get the proper benefits. 400, 800, 1600, 3200, etc. Nice component selection. Can't complain about anything there. Probably what I'd buy, more or less. Skylake should slap your old Lynnfield around. The newer generation CPUs are far improved from the first gen i7s and i5s. Whole different world.

    Going back through the venerable Warcraft III + expansions after a loooooong time away. For 2002-2003 the graphics are stupendous. Very colorful and detailed 3D, and as I remember it was very light on system requirements for the time. My mom's old Celeron D/Extreme Graphics II desktop could run it playably at medium-low settings. At 2560 x 1440 it really pops. Some of the textures are a bit weak, but the game has aged extremely gracefully. Very eye-pleasing art style.
     
  12. sammorris

    sammorris Senior member

    Nah, the 6600K is substantially more money than the 6600 (A good 40% more) due to stock availability issues. Plus, dare I say it, spending my working day on IT gives me less enthusiasm to fit a large noisy cooler and have to tinker with overclocking speeds. The 6600 is sufficiently fast that even at stock it'll be a moderate upgrade.

    Agreed on War3 and its expansions, it looks old but it's stood the test of time well, we still play custom maps in it periodically.
     
  13. Estuansis

    Estuansis Active member

    There's really a lot to be said about art styles affecting a game's longevity. I can still play Zelda Wind Waker no problem and it still feels fresh and new. With the sharpness of the Gamecube's excellent quality component signal, it looks downright mouthwatering and gorgeous. Metroid Prime, Pikmin and Super Mario Sunshine are like that as well because Nintendo's development studios were really on point with the Gamecube. It's really hard to make 3D graphics ageless but some games really succeed. On PC there are a lot of 2D isometric titles that still look gorgeous and are not an eyesore to play, and there are actually a great many 3D games. Warcraft III is interesting to say the least. Almost a beta for World of Warcraft.
     
  14. sammorris

    sammorris Senior member

    On the matter of memory, Skylake CPUs only officially support DDR3L or DDR4, not DDR3. Further, all the high-end boards seem to use DDR4 which is really the intended memory. I very nearly ordered two HD3P boards until I realised that of course only the UD5 upwards has an SLI license. Just in case I ever get rich enough to operate two GTX980Tis or similar in the future, that's something I really want to be prepared for (hence the 1000W PSU, certainly way overboard for a single 970).
     
  15. Estuansis

    Estuansis Active member

    Sam I don't disagree with any of your choices. I only think you need faster memory, but that's just me :)
     
  16. harvardguy

    harvardguy Regular member

    I can't believe you aren't a touch-typist Jeff - amazing "pecking skill." Did you ever want to take a keyboarding class? I AM a touch-typist, but I do hit the Caps Lock key a lot. However, as I don't really have to look at the keyboard, I notice pretty quickly WHEN EVERYTHING BECOMES ALL CAPS AND I AM SHOUTING LIKE AN IDIOT.

    It happens a lot more often on texting - in that case I AM looking at the tiny qwerty keyboard as I am NOT touch-typing - fingers are nowhere small enough - but I still love the keyboard versus the old way of using that app they have that guesses the word you are trying to type - and is correct most of the time. Before I discovered that mode, it was hell to type a text message - but that's at least five years ago.

    Anyway, touch-typing RULZ - really - why don't you take a keyboard class - like even a night-school, or maybe even just a program if you think you might be able to teach yourself. The only reason I suggest it is that in my opinion it requires FAR LESS ENERGY to touch type, than to peck - and like you said yourself - you're amazed at the speed and accuracy of those fingers flying all over the keyboard - hearing you say that is impressive enough - I have seen some guys who, like you, were amazingly fast. Almost as fast as I am. :)

    Sam - I am going to make a guess that you are a touch-typist - am I wrong? What about you Kevin?

    How about you, DDP? (Oh, I forgot, they don't know about touch typing in Canada. Well, DDP, it's where each finger has a certain set of keys that it types, and you just lay your hand on the keyboard, left pointing finger on F, right pointing finger on J. You could probably watch a youtube about it, and then write to a Canadian newspaper, and create a sensation! You might as well get some laughs before me and the Finnians come calling.)

    By the way, DDP, you are officially the monitor guru- taking apart those Samsung monitors and frankensteining them like that - impressive.

    So, Sam, those parts were for your current 4k gaming? I thought you had said you were going to pick up some stuff for your older server. Did you gain maybe 20% performance increase?

    That had to be a pretty good chunk of change - my guess is at least $600 with the power supply. So now that Jeff has jumped ahead of everybody with some fast equipment, are you also getting geared up for SLI?
     
  17. sammorris

    sammorris Senior member

    DDR4 is all new to me so I did a fair bit of reading beforehand and in any benchmarks that really mean anything to me (i.e. anything other than a synthetic memory bandwidth benchmark) I could find no appreciable difference between the speeds apart from 2133 being a bit slower than anything else.

    Rich - I'm not a touch typist either, I type when relaxed around a 70-80wpm pace, but when rushed for time can type effectively up to about 100-110wpm 'two fingered'. The only time anything other than my two index fingers get used is for things like space (right thumb), enter (right ring finger) and stuff like shift/ctrl/alt (left ring finger/left ring finger/left thumb respectively) - otherwise the rest of the keyboard layout is all index finger. If I ever have anything wrong with either of them (usually a byproduct of my annoying habit of biting my nails), my typing speed plummets.

    I should perhaps explore frankensteining my old 3008WFP, as it could probably be resurrected. Trouble is, unlike the 19" Samsungs in DDP's post, the parts it uses are going to be pretty much bespoke, as no other manufacturer produced a copy of the 3008 (probably because it was admittedly a terrible design, definitely not up to Dell's usual standard), so given its considerable four-figure original price tag, replacement parts for it are likely to be very expensive. It's already had a replacement Schottky in the PSU to fix that known defect; the other design flaw has now manifested itself, which is under-rated power control for the CCFL backlights - even from new they would overheat above 50% brightness, and they have presumably now burned out altogether.

    On the new kit, you're way off I'm afraid, it was about £870, so £725 pre-tax ($1090). Taking equivalent prices pre-tax, the PSU would be $170, the RAM $100 per pair, The i5 6600 $220, The Z170X-UD5 board $190, The i5 6500 $200 and the Z170-HD3P board $115.

    One of the pairs of DDR4, the 6600, Z170X-UD5 and RM1000i PSU are for Voyager, the gaming PC, because the stability of the current i5 750/P55A-UD4 setup is such that I've had to turn the overclock off altogether to guarantee stability. Whilst at 3.6-4Ghz the 750 could still hold its own against modern chips, now that I've had to do that it really is time for an upgrade as the difference will be considerable. It's probably my fault as I used a fairly high voltage on the chip to ensure stability in the early part of its life (1.325V vs 1.1875V stock) but considering the combo would be 6 years old in February it's had a very good run for enthusiast-level gear. It's also a sad sign of the times that equipment that old has been relevant so long (and arguably still would be now if it could still keep its original 4.12Ghz overclock that it ran for around the first 4 years).

    The other set of RAM, the 6500 and the Z170-HD3P are for Intrepid, the file server. Since almost 2 years ago now it's had the occasional bluescreen, which I originally attributed to an RMM tool we were testing out for work (and it's just as well that we tested!) which caused blue screens on almost an hourly basis. After removing that, the system was never quite the same, so I had intended to reinstall Windows, but latterly discovered I could increase the interval between bluescreens by turning fan-speed control off, attaching a fan to the CPU cooler, and leaving my window open, so figured if it's heat-related, it must be hardware. That kit is even older, the Q9550 and XMS2 Dominator RAM dating from Q4 2008 and the X48-DS4 from summer 2009. The server was built new in mid-2012 and was using second-hand parts then, so once again it has served me well. I can recall the days when 6-year-old hardware was good for absolutely nothing due to its comparative performance to new stuff. The PSU in the server was only replaced around a year or so back due to a failed fan bearing in the Nexus NX-5000 it had, so I took the opportunity, given the large number of disks installed, to up the wattage rating a bit with an RM650.

    The PSU is being replaced in Voyager because I'm concerned at the fact that the GTX970 regularly crashed (several times a day) when connected to the 8+6 pin connectors on the current Zalman ZM850-HP (now approaching 7 years of age). When running on the 6+6 pair it's fine. nvidia hardware may be picky compared to AMD and I may well not have noticed if the R9 290X had worked out, but that set of circumstances suggested to me that if I ever did want to go dual graphics, I'd probably have power problems. Since I'm future-proofing the motherboard for SLI now, I may as well do the same with the PSU to avoid having to redo the whole build again later down the line. Dropping a graphics card in is a lot less work than redoing all your PSU cabling.

    Jeff's PC is streets ahead of mine spec-wise at the moment, whilst when I install the 6600 I'll catch up to him in terms of CPU performance, it'll be a little while before I make the graphics upgrade. Due to my resolution I've decided not to explore SLI with the GTX970, there's simply no need. The next graphics purchase I make will either be a GTX980Ti or whatever the then equivalent is. This considerable expense is one I had been putting off, but circumstances have really rather forced my hand. It'll take some time (certainly well after christmas) to restore the bank balance sufficiently to justify a graphics upgrade, but I do concede that at 3840x2160, I definitely need one to run games on high detail. The 970 does a very admirable job at this considering what it is, but it was only ever intended to be a temporary measure to evaluate how nvidia handle MST (not well it turns out, but better than AMD's new cards at least). Here we are a whole year later and it's still not being replaced in the near future.
     
  18. Estuansis

    Estuansis Active member

    Sam, your typing style is very similar to my own. I've never taken the time to pay attention to which finger hits which key, but most are similar. My thumbs are largely idle though. They rest on space, but I use my index fingers for space. I may have to try practicing with my thumbs to improve my efficiency. Right now I can keep a steady 40-60wpm, and when writing at length can build up to about 80-90. I can hold that 80-90 for a couple hours if I can keep my focus. Good music ups my concentration a lot because I can really absorb myself into music. For me, it's comparable in level of engagement to watching a movie. Keeps my mind on one track.

    As far as CPU performance goes, you'd still be hard-pressed at stock to match my OC'd Haswell/Devil's Canyon. Skylake is pretty good, but it's not that good. And the memory is the same speed with higher latencies, for what that's worth. Nehalem/Lynnfield was only so-so as a CPU despite the performance at the time, and compared to newer generations isn't really that good at all. I'd say the upgrade will make a significant difference. Sandy Bridge and up are really a different beast. Skylake is just a gratuity :) Also, SLI'd GTX 970s basically match or beat a 980 Ti in most benches. A pair of 980 Tis though... that's power, man.

    Intel have released so many gens of microarchitecture in a short time. They all improved something but only some made a splash. When Sandy Bridge came out there was really no reason to buy an Ivy Bridge later unless you were a latecomer. When Broadwell came out it had(and still has) the best integrated graphics ever but as a CPU was simply a more expensive Haswell. I imagine Skylake will have an under-appreciated younger brother as well. A Sandy Bridge would have shut me up. There'd have been zero reason to buy the 4690K. An i5 750 though? Lots of reason to upgrade. They were a mediocre CPU generation on a mediocre platform. Skylake is far better.
     
  19. sammorris

    sammorris Senior member

    Nor me until I just had a look at it.

    Sandy Bridge really wasn't that much of a step forward from Lynnfield, though it was a bigger step than Ivy Bridge was with respect to Sandy Bridge. The only architectures in the normal sector (disregarding higher-end offerings like Bloomfield and Sandy Bridge-E) I've not used are Sandy Bridge and Broadwell, the latter due to how short-lived it was and never really appearing in the desktop consumer market. 750 imminently being retired, 3470 in the other LAN machine (Princess), 4690S in the current LAN machine (Endeavour) and now 6500/6600 being deployed, hopefully tomorrow if I have the time.

    The jump from Yorkfield CPUs like the Q9550 to Lynnfield/Bloomfield was a smaller change than the 'big bang' of when the Core 2 Duo was first introduced, but it's still the biggest jump there's been in CPUs since that time. From my recollection, the first-gen i5s and i7s were offering 20% per clock gains on the CPUs they replaced, though admittedly operating at lower clock speeds in many cases except for the very expensive i7 940/950/965/975. Beyond that point I haven't really seen one particular gen offer any more than 10% per clock, partly because there's been no competition to force anything better. You're absolutely right that a 6600K would offer considerably more performance than the 6600 standard I settled with as I imagine I'd be able to overclock it beyond 4Ghz fairly comfortably, versus the 3.3Ghz I'm going to be lumbered with here, but frankly, with the £70+ cost increase of the chip itself, the extra cost of another tower cooler to fit it, the extra heat & noise from running an overclocked chip and the labour cost of maintaining the cooler and guaranteeing the overclock is stable, nah. At a time when more often than not I barely have spare time to fit new hard disks, that's more trouble than it's worth for me at the moment sadly. Were the 6600K and 6600 more similarly priced I may well have considered it.
     
  20. sammorris

    sammorris Senior member

    Painfully slow process so far - the UD5 motherboard is DOA, and due to an apparent incompatibility between the Z170 chipset and a large number of SSD types, I'm unable to boot off that with either board. While the UD5 will be RMA'ed, it appears any Z170 board will have the SSD issue, so I'm going to have to try and figure out if there's any way round it. Google has been largely fruitless thus far.

    Edit: Was talking nonsense about the disk issue. What I think happened was Windows had put its bootloader onto the WD Green drive I replaced at the same time (owing to bad sectors), as that drive is common to both my current SSD and the previous one Windows ran on. Ran a repair check on my Windows install and off it went, with the 6500 and the HD3P board. UD5 RMA booked in...
     
    Last edited: Nov 7, 2015
