
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. omegaman7 (Senior member)
  2. sammorris (Senior member)
    Yeah you probably should, that's a valuable monitor to just leave lying around :p

    Alas, my old 3008WFP has finally bitten the dust. There's no urgency to replace it, as it was just a spare secondary display for work, but it's a little annoying that I now have to dispose of it!

    On the gaming front, it's still going to be tricky to tear me away from Brutal Doom - very simplistic, but I still very much enjoy it with all the extra stuff that mod brings.
     
  3. omegaman7 (Senior member)
    Valuable to us, yes. I've looked at buying them used since, and they certainly don't hold their resale value! :S I spent $500, and last time I looked you could get them for barely over $100 USD.

    I suppose that's at least partly down to newer technologies: LED backlighting is more electrically efficient, and of course there's UHD. When better technology is out there, people usually flock to it, driving the price of outdated tech down.
     
    Last edited: Oct 11, 2015
  4. harvardguy (Regular member)
    Still a nice monitor though.

    What do you use your microscope for, Kevin?

    What about you, DDP?


    ================================

    Brutal Doom?

    Sam, holy crap, I just watched the trailer.


    View: https://www.youtube.com/watch?v=oSzYliSASKc


    That's what me and my Finnian friends are going to do to any Canucks who get in our way as we take the US back, but I don't yet want DDP to know about it.


    Talk about retro! Doesn't the non-stop bloodbath get old?

    And I thought Left 4 Dead was nerve-wracking. :)

    What's the next mod - tactical nukes?




    Well, Sam, if your old 30 inch Dell just died, I hope that doesn't mean that mine has limited life. If I had to buy another one, I would probably do something really foolish and get 4K, and then I'd have twice as many pixels to not be able to drive with my always-outdated hardware!
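    The "twice as many pixels" worry is about right. A quick back-of-envelope check, using the 30" Dell's 2560x1600 resolution and 4K UHD (3840x2160) from the posts above:

```python
# Pixel-count comparison: 30" Dell (2560x1600) vs 4K UHD (3840x2160).
def pixel_count(width, height):
    return width * height

dell_30 = pixel_count(2560, 1600)   # 4,096,000 pixels
uhd_4k = pixel_count(3840, 2160)    # 8,294,400 pixels
ratio = uhd_4k / dell_30
print(f"4K has {ratio:.2f}x the pixels of 2560x1600")  # ~2.02x
```

    So a 4K upgrade really does roughly double the pixels the graphics cards have to drive.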


    On the positive front, I am now the proud owner of three Pixma MX870 printer/scanner/fax units. Two of them are good; one is just for messing around - it's fine as long as you don't try to print something really nice on high-res paper.

    The printer is a nice piece of engineering. Two sets of sample packs of some very fancy matte paper are on their way to me, so the animator's sister can come down and we'll try to create some artistic renderings for wallpapering her bedroom!

    I have most of these in decent resolution like at least 1080p.


    I never got to Arma3 yesterday, but maybe tonight. More and more I think I'll do the Operation Greenstorm again, against the 100 enemies - now that I've seen that Brutal Doom trailer it will be child's play - and create a save just when I shout out to myself - "Special forces squad 200 yards north!"

    I'll take the sniper rifle. And now I've learned to take off things like zoom sights, to use on other weapons. So maybe I'll put the zoom sights on the Zephyr machine gun - REALLY deadly when you do that!!!
     
  5. ddp (Moderator, Staff Member)
    Harvard, I'll be using it to work on small details on my model warships, plus any circuit boards that need repairing using SMT parts. You'll have to take the US back from your own countrymen, as we didn't take it. Maybe it was those pesky Russians?
     
  6. omegaman7 (Senior member)
    DDP nailed it. It works wonderfully with circuit boards. I also use it for grading coins, so as not to stress my eyeballs, LOL! They are quite good eyeballs, and I can force them to see small things, but eye strain gives ME the worst kind of headaches.
     
  7. sammorris (Senior member)
    Not especially - Doom 2, partly due to its age and partly due to how simple it is to modify, has vast quantities of third-party content to play. The 'non-stop bloodbath' (which I think is a fair assessment) just becomes the norm after a while and isn't something I really think much about, but it does make the game engine seem far less antiquated and stale. It's mainly the other additions - weapon reloads, hand grenades, akimbo and the use of alien weapons - that rejuvenate it, as well as behavioural changes to the enemies themselves.

    On the Dell front, I wouldn't be unduly concerned - the ultimate failing of my 3008 was a direct result of a design flaw inherent to the 3008 to begin with, overheating backlight drivers. The 3007 never had this issue, so I don't think there's any concern for yours.
     
  8. Estuansis (Active member)
    Just a few-months-later update. SLI is turning out to be straight up better and more reliable than Crossfire ever was for me. It still has a few of the same quirks, and the performance gains aren't much different from what Crossfire gave me. That said, it works with a larger number of games, and Nvidia update their drivers weekly or monthly, whereas AMD release WHQL drivers maybe 3 or 4 times a year, with buggy beta drivers in between. AMD took until six months after I purchased my GTX 760 to release a non-beta driver for their cards, and that was AFTER I got sick of waiting 4+ months to play Metro: Last Light in Crossfire. The better part of a year spent waiting for Crossfire compatibility? That's unacceptable and a joke. Every new game I've tried has worked with SLI out of the box. Regardless of the architectural differences, SLI is far better engineered and relies far less on software optimization to work properly. Nvidia almost always release SLI fixes within days or weeks, not almost a year. AMD severely let me down there.

    Memory so far hasn't been a big concern. Even the most demanding titles like BF4 and Dragon Age: Inquisition barely bat an eye at 8xAA, and remember this is at 1440p. Only freak anomalies like Middle-earth: Shadow of Mordor and GTA V are affected, and they can easily be tweaked under the limit while maintaining AA and maxed or nearly maxed settings. My general framerates in every game I've played are spectacular. This is the first time I've ever been able to use Vsync as a default option in most games, and simultaneously at the most demanding resolution I've ever personally used. Even very demanding titles simply do not drop below 60FPS. Vsync certainly is icing on the cake, as screen tearing is far worse and more noticeable for me than jaggies OR low framerates.


    Also, unlike with AMD, I rarely if ever notice microstutter. I had to use third-party tools to get a lot of games to run acceptably on Crossfire. SLI, on the other hand, I simply turned on once and have never touched the option since. A lot of games also required Crossfire to be disabled to function properly; I've not had a single conflict with SLI. Either it works or it falls back to a single card. No glitching or freaking out.

    I'm sure the 1GB 6850 fiasco will resurface in the near future, but the hardware and software work so much better, it's almost a non-issue.
     
  9. sammorris (Senior member)
    I wouldn't say nvidia offer SLI support for new titles instantly - judging from various benches it sometimes takes a few months - but it's undeniable that SLI is better supported by the manufacturer than crossfire, which is effectively a pot-luck scenario: if it doesn't happen to work now, it probably never will. nvidia's frame pacing is probably the reason why you're not seeing microstutter. I believe AMD have now added this too, but it was a relatively recent feature, confined to the R9 200 series at the earliest, possibly only the 300 series and above.
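    Frame pacing, loosely speaking, means delaying the presentation of finished frames so they arrive at an even cadence, instead of the fast/slow alternation that unpaced alternate-frame rendering tends to produce. A toy sketch of the idea (an illustration only, not any vendor's actual algorithm):

```python
# Toy sketch of frame pacing: present each frame no sooner than a target
# interval after the previous one, smoothing out microstutter.
def paced_presentation(render_times_ms, target_interval_ms):
    """Return presentation timestamps for frames with the given render times."""
    finish = 0.0          # cumulative time at which each frame finishes rendering
    last_shown = None
    shown = []
    for rt in render_times_ms:
        finish += rt
        # A frame cannot be shown before it is rendered, nor before the
        # pacing interval since the previously shown frame has elapsed.
        t = finish if last_shown is None else max(finish, last_shown + target_interval_ms)
        shown.append(t)
        last_shown = t
    return shown

# Alternating 5 ms / 28 ms render times average ~16.5 ms but feel jerky;
# pacing to a 16.6 ms target yields near-even presentation intervals.
times = paced_presentation([5, 28] * 4, 16.6)
intervals = [b - a for a, b in zip(times, times[1:])]
print([round(i, 1) for i in intervals])  # settles to 16.6 after the first gap
```

    The trade-off is a little added latency on the fast frames in exchange for even spacing, which is why paced multi-GPU feels smoother at the same average framerate.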
     
  10. Estuansis (Active member)
    My 6850s had the option in the driver control panel, but it didn't seem to make much difference. The 970s have been very smooth sailing - excellent factory-OC'd cards with aftermarket cooling and tons of horsepower. I don't remember my 512MB or 1GB 4870s having the issue, but my 5850s certainly did. The 5850s were outright better cards than the 6850s, by the way: they never hit memory limits as suddenly, and took a far smaller hit in FPS and stuttering when they did. The 970s drop right off the grid when they hit about 3.7GB. No mercy at all. Only a very few games go that high, though, and always because of either bad programming or the use of AA.
     
  11. sammorris (Senior member)
    Yeah, the HD5850s had a lot more actual grunt behind them than the HD6850s - it rather felt like the HD6850 was a smaller GPU sped up, rather than a GPU sized to befit its performance, probably because that's pretty much what it was. nvidia's memory management seems rather less efficient than AMD's, especially with the advent of HBM, but 3.5GB is still a fair amount of video memory to have, even if it's not what it will say on the box. Bear in mind I'm running 3840x2160 on only a single GTX970 at present, so twice the pixels with half the power, and it still does a fair job. Frankly though, I'm still loath to touch a dual-GPU setup until I can ditch MST, because while nvidia's support for it is a little better than that of the abysmal R9 200 series, it's still nothing like as good as that of the officially-unsupported HD6000 series cards that preceded either of them. Ditching the monitor for one that supports SST would be one solution if any decent products were available (Dell's R&D seem to have ditched SST at 4K even though almost every other display manufacturer is happy to use it), but what are people using multiple displays supposed to do? The blind acceptance, on either side, of the travesty that is multi-display merging is dreadful.
     
  12. Estuansis (Active member)
    Just got Dirt Rally on Steam. Simply superb!!! Everything I've wanted in a Dirt game since the beginning. Given the lack of a proper Colin McRae game in almost a decade, this is a true return to the series' roots. Every little detail is perfect: all the right camera modes, astounding graphics deserving of the money I've spent on this PC, excellent online play. Also, proper Grid 2 handling in a Dirt game for the first time, which is very comparable to what Gran Turismo has been doing since GT4. The graphics are something special and it is appropriately demanding - all maxed with 4xAA, averaging about 55-65FPS, which I consider well within the sweet spot given my 59Hz monitor. Remember, this is on SLI GTX 970s, so this game is a beast compared to anything I've tried before. Very few games are this rough on my video cards, mostly hardcore military/space/flight sims. The number of graphical options is high, and when maxed out the visuals are mind-bogglingly realistic. Also, the Intel-only extras work properly now - Grid 2 had detected my CPU as non-Haswell (it's technically Devil's Canyon).

    Easily one of the best game releases of the year, if not the past few years. I am absolutely tickled at how good it is. Also, no forced multi-discipline racing like the previous Dirt games had: pure rally, with small smatterings of other rally-related events from beginning to end. This is easily a rival to any other racing game available. The same could definitely NOT be said about previous entries in the series, which had great technology but were far too arcadey and unrealistic. Only Dirt 3 (possibly) and Grid 2 are comparable.

    Additionally, it automatically detects my PS3-intended Logitech Driving Force GT and auto-maps the controls to match its console counterpart, for ease of access. Certainly no complaints there. This wheel is definitely a cheapo, but it has all the features of higher-end wheels, so its performance feels very premium. It works well and greatly enhances the realism and precision of control, and most importantly it is fun to use.

    Looking at the performance numbers of similar systems, it's become obvious to me that my PC is far more of a beast than it's ever been. What can I say? Intel and Nvidia are treating me well :) My friend just purchased a Gigabyte 390X G1 Edition to pair with his 4770K. Curious how that turns out, because it certainly whomps my 970s individually in benchmarks.
     
    Last edited: Oct 24, 2015
  13. sammorris (Senior member)
    What can I say, you're buying what's best at the time, not being led by brand allegiance!

    Still eyeing up my options for next upgrades, so much to spend money on it's hard to know where to begin - desktop CPU, desktop graphics, file server needs a new platform soon, the list goes on! :p
     
  14. Estuansis (Active member)
    I found the 4690K to be the best possible value: all the newest technologies at the lower price of an i5. Benched against my friend's 4770K with HT disabled, it's more or less the same while using less power. In games the two are indistinguishable. Even the best AMD can't compete - the 1100T might be able to push this system okay, but the bottlenecks would be bad. It was already bottlenecking a single 970 in a lot of titles.

    In desktop graphics the 980 Ti or Fury are currently the top-end value kings, with the 390X edging out the 980 a bit in the high end. For 4K I imagine the 390X might make a lot of sense, though the question is how it handles your MST display. My Core 2 Quad secondary is still soldiering on with daily use. It plays all the newest games and hosts game servers with no problem whatsoever. I have a particular friend who visits nearly every day and flogs it pretty hard with no issues. Still snappy and feels new. Works great *knock on wood*.

    Just in: got a lightly used WD1001FALS in perfect health, for free. It's getting added to my main PC as soon as I can be bothered to route the cables.
     
  15. Estuansis (Active member)
    Slowly coming to the realization that my 27" display is not big enough. The resolution is perfect for me - not so high as to be hilariously demanding, but high enough. Size, though, is still an issue, especially when I sit back in a comfortable driving position, or lean back to watch a movie. For those uses an HDTV was absolutely superior.

    I've been looking at some of the 32" 1440p Korean panels, and possibly the Samsung equivalent - they all use the same panel anyway. Most of the Korean brands are seriously problematic or too sparse in features, though one called Crossover seems to put a little more effort in. Samsung and the like are more likely to produce a better monitor, but pricey...

    I am determined to stay above 1080p but under 4K, so that limits my choices unless I want a 30" 1600p panel, most of which lack HDMI. HDMI is an absolute must, no questions asked, and DisplayPort is almost the same priority. Most of the 32" panels have exactly what I'm looking for, but I need to find the best implementation. There is a BenQ model, but it has questionable long-term reliability. These monitors are exactly what I wanted from the beginning. Hopefully I can find a quality model.

    My Gigabyte Raptor mouse developed a double-click in one of the side switches, so I simply don't trust it anymore. A great-performing, well-built mouse, but I want better. Sold it to a friend for $30 and he is tickled with it.

    Replaced it with a Logitech G700S wireless laser. (Also bought a Roccat Taito Control mouse mat to go with it - OMG, what a nice mousemat, especially for $15. I want to buy 3 more as backups.) The mouse is basically the older G700 with some of the quality issues addressed, mostly better click switches. It comes with a top-of-the-line Panasonic Eneloop rechargeable NiMH AA battery: almost no energy drain while idle, but only 2100mAh nominal, more like 1900-2000 according to internet testing sources. It lasts about one solid day of use before needing a recharge over a micro-USB cable, as the mouse can charge its own battery.

    I'm going to go ahead and run a rotating pair of Eneloop XX/Pros instead. They're rated at 2550mAh, more like 2400-2500 in testing. They can only be recharged about half as many times as the regular Eneloops, but they handle high draws better - and 500 cycles vs 1000 is still comparable to any lithium-ion cell of this size. My Samsung 18650 25Rs are really good quality batteries and they're only good for 500 cycles as well. With a pair rotating through my charger, I don't need the charge cable and can keep the mouse truly wireless.

    Most wireless mice can make batteries like these last for weeks or months, but this mouse uses a dedicated wireless receiver, has a 1000Hz polling rate, stores its settings in onboard ROM, uses a more powerful laser, and so on, so it draws far more power than other wireless mice. To be fair, it has far better performance than other wireless mice as well: 8200DPI and zero input lag are not common features in the wireless mouse realm. In theory it's every bit as good as any desktop mouse.
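    The "about one day per charge" figure is consistent with a simple capacity-over-draw estimate. The draw number below is an illustrative guess, not a measured G700S figure:

```python
# Back-of-envelope battery life: capacity divided by average current draw.
# The 100 mA average draw is an assumed figure for illustration only.
def runtime_hours(capacity_mah, avg_draw_ma):
    return capacity_mah / avg_draw_ma

print(f"~2000 mAh Eneloop:     {runtime_hours(2000, 100):.1f} h")  # 20.0 h
print(f"~2450 mAh Eneloop Pro: {runtime_hours(2450, 100):.1f} h")  # 24.5 h
```

    Under that assumption, the higher-capacity Pros buy roughly four to five extra hours per charge.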

    My Razer BlackWidow mechanical keyboard performs great and feels good for gaming, but I simply do not like how it types, and I loathe the absence of a notched CAPS LOCK key. A frustrating keyboard for me to type with. Stiffer switches with more feedback, more like my trusty IBM Model M from 1983, would be welcome. I've found myself using the IBM far more often recently, as it has a superior feel under my hands. A shame, as the Razer is a sweeeeet piece of kit and is otherwise astoundingly good. Just not for me.

    Experimenting with a Logitech K350 wireless, which I love the layout of, and which has a double-notched caps lock. The only downside is membrane switches, but in a quality keyboard those are usually fine for me and give satisfying feedback - the cheap Saitek membrane boards I've owned always had excellent typing characteristics. It has lots of great reviews and looks very comfortable to use versus my Razer or IBM, which need a tall padded wrist rest. The K350 seems to be a gold standard in wireless keyboards. The more expensive Logitech K800 is backlit, which is what I prefer, but uses scissor switches, which almost always feel terrible - not really interested. We'll see how I like it when it arrives. Wireless is nice if it works properly, and it would be nice to have a wireless desktop if I like both devices. So far the mouse is a win, but the keyboard is a little more difficult to pin down.
     
    Last edited: Oct 28, 2015
  16. sammorris (Senior member)
    Haha, another Roccat Taito user eh? Welcome to the club - use them both at home and at the office, relatively inexpensive and a nice surface.

    For the monitor I think something like the Samsung S32D850D would be good for you and not too expensive, but I just can't abide VA panels these days. Unfortunately I think the only IPS 2560x1440 panel is 27".
    Frustratingly, it appears that even though it'll come with HDMI 2.0 (which might suffice), the Dell UP3216Q still needs MST to be able to use DisplayPort. Even though all other manufacturers have left that unnecessary encumbrance behind, the only monitor brand I really trust in that sector to use decent panels (the Sharp panel in the UP3214Q is amazing) is still the only monitor brand stuck three years in the past for electronics.

    Not sure what you mean by a notched caps lock key - very few keyboards I've used in the UK have a caps lock key that looks like anything other than a wider tab key. Personally I prefer a lightweight response, and if anything the Ducky I use, with its cushioned keys, is too soft - but if it means it lasts longer without falling to pieces like the Qpad did, so be it. Two and a half years in and it's still in pretty good shape. Ditching the Razer Taipan for a SteelSeries Sensei Raw was a good move. It might be a little more accurate (which wouldn't take much!), but it's much smoother to use, lighter and more comfortable. The Taipan, which I moved to the office, just feels horrible by comparison. The more lightweight Abyssus is far nicer, but only three buttons rather than 7. The Abyssus has also since developed bounce on its scrollwheel, which is annoying.
     
  17. Estuansis (Active member)
    http://www.amazon.com/Samsung-WQHD-LED-Monitor-S32D850T/dp/B00L3KNOF4/ref=sr_1_1?s=electronics&ie=UTF8&qid=1446066264&sr=1-1&keywords=Samsung+32"+1440

    That's the hot ticket right there. DP and HDMI, about $450 used, which I can stomach for a Samsung display. I'll be ebay-sniping around for one, or possibly grab one on Amazon. Again, all the monitors of this type share a single panel from a single manufacturer, but they all have different implementations. It's an AMVA+ panel, which means not amazing but great color accuracy and deep, rich blacks. Cheaper models have horrible gray-to-gray response times, so they ghost pretty badly; the more premium builds seem to be okay, and Samsung is among those. DisplayPort is said to help reduce this issue as well. It shouldn't be too large a shift away from my IPS-based Dell UltraSharps, and definitely a far cry from the HDTVs I was using.

    I like the benefits of higher resolution, but hate the drawbacks of going too high. A 32" 4K panel would be amazing, but my poor 3.5GB 970s would cry - not to mention it's very pricey for both the panel and the cards to drive it. Out of my range. 1440p is very doable, and a single powerful card can max most games at that resolution. It reaches some sort of philosophical perfection deep in my mind, haha. It also seems to be a high enough resolution to mostly eliminate the pitfalls of desktop usage on a screen that size. Certainly miles better than 1080p.
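    For a sense of why resolution and AA both push video memory, here's a crude estimate of just the color buffers for a triple-buffered swapchain with MSAA. Real games add textures, geometry, shadow maps and driver overhead on top, and the 4 bytes/pixel and triple-buffering figures are assumptions, so treat the numbers as a floor rather than a prediction:

```python
# Rough color-buffer footprint for a triple-buffered swapchain with MSAA.
# Illustrative only: real VRAM usage is dominated by textures and other data.
def color_buffer_mb(width, height, bytes_per_px=4, msaa=1, buffers=3):
    return width * height * bytes_per_px * msaa * buffers / 1024**2

print(f"2560x1440, 4xMSAA: {color_buffer_mb(2560, 1440, msaa=4):.0f} MB")  # ~169 MB
print(f"3840x2160, 4xMSAA: {color_buffer_mb(3840, 2160, msaa=4):.0f} MB")  # ~380 MB
```

    The raw buffers alone more than double going from 1440p to 4K, and everything rendered per-pixel scales with them, which is why a 3.5GB limit bites much sooner at 4K.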
     
  18. sammorris (Senior member)
    Poor 970s plural? I'm still using a single 970!

    Really the lack of proper SST monitors wouldn't be a problem if AMD or nvidia had a clue when it comes to graphics drivers. As it stands though, I still update my 970's drivers regularly and nothing ever changes, I still routinely have to put up with display driver crashes and bezel compensation errors that require a PC or monitor reboot, respectively, to rectify. I pity those who still struggle on with properly tiled displays like eyefinity and vision surround. I honestly don't know how people stand it. I like to take steps to ensure my PC is stable and operates properly. Knowing that achieving that simply isn't possible due to the technology being used would drive me mad...
     
  19. Estuansis (Active member)
    I'm also giving a lot of thought to the 34" 21:9 displays. They would be simply stupendous for widescreen movies, and avoid many of the issues with things like Eyefinity or Surround. LG has one at 2560x1080, which isn't too far from a 32" 1440p in pixel density. There is a more expensive version available in the same size at 3440x1440. It would be excellent, but ouch, that demanding resolution. Also, the 34" 21:9s aren't any taller than my 27", and I need the extra vertical size for the graphics in games to be physically larger. Perhaps a wider field of view would shut me up though, haha. The 1080-vertical 21:9s are really nicely priced, but the 1440 ones are hellishly expensive in comparison to everything else I'm looking at. That price gap alone might be enough to tip the scales toward a 16:9 32".

    EDIT:


    View: https://www.youtube.com/watch?v=JCqHmQas3_Q

    Very very tempting. It would be a drop in resolution to go with the reasonably priced option, but the immersion especially in racing/flying sims would be insane. Not to mention wide format movies in full screen. Currently have to expand them, cutting off the sides to fill the screen. I hate that. The reverse would be true though for some things. This screen would have black bars on the sides for 16:9 content. 2560 x 1080 would give me some cushioning for the future as well... Choices. I love 1440p, but wouldn't mind losing it for more viewing area while keeping reasonable clarity. The 21:9 1440 might be a solid choice though. I'm already monstrously overpowered for most games. The panels don't seem to have any serious quality issues, so I might want to hunt around for one. So 32" 16:9 1440 vs 34" 21:9 1080 vs 34" 21:9 1440. All of them are larger than this, and all have their benefits. Hmmm. Thoughts?
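    The size comparisons above are easy to sanity-check: pixel density and physical panel height follow from the diagonal and resolution, assuming square pixels. A quick sketch for the four candidates being weighed:

```python
import math

# Pixel density (PPI) and physical panel height from diagonal + resolution,
# assuming square pixels.
def ppi_and_height(diag_in, w_px, h_px):
    ppi = math.hypot(w_px, h_px) / diag_in   # pixels along the diagonal / inches
    return ppi, h_px / ppi                   # (density, physical height in inches)

for name, d, w, h in [
    ('27in 16:9 2560x1440', 27, 2560, 1440),
    ('32in 16:9 2560x1440', 32, 2560, 1440),
    ('34in 21:9 2560x1080', 34, 2560, 1080),
    ('34in 21:9 3440x1440', 34, 3440, 1440),
]:
    ppi, height = ppi_and_height(d, w, h)
    print(f"{name}: {ppi:5.1f} PPI, {height:4.1f} in tall")
```

    Both 34" 21:9 panels come out around 13.2" tall, essentially identical to the 27" 16:9, which matches the observation that the ultrawides add width but no vertical size; only the 32" 16:9 (about 15.7" tall) actually makes the picture taller.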
     
    Last edited: Oct 28, 2015
  20. sammorris (Senior member)
    16:9 is plenty wide enough - what I'd gain for panoramic films (which would only be 1920x800 anyway and therefore look bad made so large), I'd lose for stuff filmed at the 'correct' aspect ratio...
     
