
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Bridge?

    I should have mentioned:

    [image]


    uhhhhh - that's scuba gear, and night vision.

    no bridge required



    Hey, so you're building a couple dozen models at once. What the hell!


    You take a sheet of plastic and you carve out the individual part of the ship!!!

    Are you a mad scientist!

    Actually it sounds pretty intense. Kind of like these art pictures we are doing. But wayyy more work. I still think you could post some kind of a picture - even just showing one part taking shape, like an artillery gun. :)
     
  2. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Been a while since I've played a good single-player game in detail, will definitely need to pick one up again soon :p

    I've recently picked up Warframe again after 2 or 3 years of not playing it. Amazingly, it's still in beta so there are a few odd bugs lying around, but on the whole it's a fun, if slightly repetitive TPS. Playing as a group of 4 on a defense mission is quite good fun and the mod system based on collecting numerous different types of resource from the remains of the enemies keeps things somewhat interesting at high level. As a free to play title, there are no substantial barriers to playing it all without charge.
     
  3. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Well, Sam, if you're looking for suggestions....

    I never heard of Warframe, but it sounds like it is fun - and if you have buddies who will jump in with you.

    But thinking of single-player .....

    I keep thinking back to those two Russian titles - I didn't want to play them until Jeff bugged me about it - the subway games based on the books. Oh, yeah, Metro. I totally had the wrong idea - zombie freaks in a dark tunnel - boy was I wrong! You rarely see children in video games - but they had them in those games. The atmosphere was unique and compelling. The story gripped you. The game was challenging. There was a lot of replay value. In about a year I could go back and do both of those again.

    I don't know if you ever played those, Sam, and I don't know how intense they would be texture-wise for your graphics card to try to run them in 4k - but they are the two that most stick in my mind.

    (Actually, I have to amend my list - there was one other that was the equal of those two. That was Dishonored, by Viktor Antonov, the famous art director who was in charge of Half Life 2. And by the way, the two DLCs were also of the same quality - superb. One more game that I found out about through Jeff.)

    Of all that I've played since then - all 4 Batmans, which were great; the Assassin's Creeds, which were all great (3, 4, and just now Unity, and later this year I will probably get into Syndicate); the COD games, which were not bad (not great, but not bad); Arma 3, which continues to amaze, with still 100 community-authored scenarios to go; and Far Cry 2, 3, and 4, especially Far Cry 3, which was utterly epic - those 3 still stand out in my mind.

    Rich
     
  4. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    At long last, another step forwards in graphics hardware is upon us. Not exactly an enormous one, but substantial enough that I had a go at producing a few more unified figures, taken from the average results of a fairly broad range of benchmark suites from different sites. The results are as follows:


    GTX 1080: 1295
    GTX Titan X: 1019
    GTX 980 Ti: 1000
    GTX 980: 784
    GTX 970: 659
    GTX 960: 470
    GTX 950: 427

    R9 Fury X: 957
    R9 Fury: 884
    R9 Nano: 847
    R9 390X: 821
    R9 290X: 720
    R9 290: 652
    R9 280X: 571
    R9 380X: 520
    HD 7970: 503
    HD 7950: 421
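
    Just to make the index easier to play with, here's a minimal Python sketch (my own illustration, not part of the benchmark data) that reads a card off that list and estimates a dual-card score under an assumed scaling factor - perfect scaling being the optimistic case:

    ```python
    # Rough sketch: estimate a multi-GPU score from the unified index above.
    # The index values are copied from the table; the scaling factor is an
    # assumption (1.0 = perfect scaling, which real games rarely achieve).

    PERF_INDEX = {
        "GTX 1080": 1295, "GTX Titan X": 1019, "GTX 980 Ti": 1000,
        "GTX 980": 784, "GTX 970": 659, "GTX 960": 470, "GTX 950": 427,
        "R9 Fury X": 957, "R9 Fury": 884, "R9 Nano": 847, "R9 390X": 821,
        "R9 290X": 720, "R9 290": 652, "R9 280X": 571, "R9 380X": 520,
        "HD 7970": 503, "HD 7950": 421,
    }

    def multi_gpu_score(card: str, num_cards: int = 2, scaling: float = 1.0) -> float:
        """Single-card score plus (num_cards - 1) extra cards at the given scaling."""
        base = PERF_INDEX[card]
        return base * (1 + (num_cards - 1) * scaling)

    if __name__ == "__main__":
        # Two HD 7950s at perfect scaling vs. a more realistic ~70% scaling.
        print(round(multi_gpu_score("HD 7950", 2, 1.0)))   # 842
        print(round(multi_gpu_score("HD 7950", 2, 0.7)))   # 716
    ```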
     
  5. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Nice!!

    So my two 7950s, at around 500 each, give me 1000 at perfect scaling - near the GTX 980 Ti, and a little ahead of one R9 Fury X. What kind of price are those higher cards commanding? Still no move to smaller dies?
     
  6. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    The HD7950 scores around 420 on that chart - at perfect scaling you'd see 840, around the level of an R9 Nano, but I'll be honest, the likelihood of getting perfect scaling in modern titles is dwindling. Believe it or not, nVidia now only support 2-way SLI on their latest cards, using a double-width single bridge for extra bandwidth rather than allowing the cards to be daisy-chained like before - arguably because, especially of late, running more than 2 GeForces is a waste of time. Historically AMD have fared slightly better with 3+ GPUs, but it was still very much a gamble, and now that AMD are so very far behind in efficiency and raw performance, there really isn't any need to use Crossfire in modern machines at all.

    AMD have announced that the new Polaris 10 architecture is only going to be an upper midrange offering, serving second-fiddle to the extant Nano / Fury / Fury X lineup, but with much better power consumption than the 390 series, something AMD sorely needs.

    Both the GTX1080 and the upcoming Polaris cards are based on a massively reduced fab size, the first such shrink for almost half a decade. Typically we're used to seeing shrinks of the order of 25-30% - 130nm to 90, 90nm to 65, 65 to 45 and so on. This time around, due to failures of the intermediate nodes, nVidia have jumped straight from 28nm to 16nm and AMD from 28 to 14. This means that while the GTX1080 is 30% faster than the GTX980Ti, it has only 70% of the TDP at 180W vs 250W, and test bench power results suggest those numbers are accurate.

    As AMD's new parts will only be midrange parts, I'd expect dramatically reduced power consumption, possibly as low as 120W for even the fastest offering, which is single-6-pin territory. Nonetheless, the top card (likely to be an R9 480X) is likely to be no higher than around 800 on that performance scale. For anything that competes with nVidia in the high end you'll need to wait until at least Christmas, probably 2017, for their high-end parts to arrive.

    By that time I imagine nVidia will be preparing a fully-fledged GPU based on their new 16nm architecture - the GTX1080 is far from the full-size offering on the new platform despite the impressive numbers. After all, once the silicon process is a little more mature, there is nothing to stop them extrapolating what the GTX1080 can do at 180W back up to the 250W figure common to high-end GPUs these days. I would expect similar gains to GTX980Ti vs GTX980 on top of the GTX1080 once that occurs, probably meaning that within a year or so there may be a card on offer that scores in excess of 1600 on that scale. Whether it makes it out the door as a $600-$700 'GTX1080Ti' or a $1000+ 'GTX Titan V' or some letter they haven't used yet will likely depend, I suspect, on whether AMD's next generation is any good.
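
    Purely as a back-of-the-envelope illustration of that extrapolation - assuming roughly linear performance-per-watt, which real silicon won't follow exactly - the arithmetic looks something like this:

    ```python
    # Back-of-envelope sketch: what a full-size 250W part might score if the
    # GTX 1080's performance-per-watt held roughly linearly. Pure assumption -
    # clocks, memory bandwidth and yields all bend this in practice.

    GTX_1080_SCORE = 1295   # from the unified index above
    GTX_1080_TDP = 180      # watts
    BIG_CHIP_TDP = 250      # typical high-end TDP

    perf_per_watt = GTX_1080_SCORE / GTX_1080_TDP      # ~7.2 points per watt
    linear_estimate = perf_per_watt * BIG_CHIP_TDP     # ~1800 points

    # A more conservative guess, mirroring the step from GTX 980 (784) to 980 Ti (1000):
    ti_style_estimate = GTX_1080_SCORE * (1000 / 784)  # ~1650 points

    print(round(linear_estimate), round(ti_style_estimate))  # 1799 1652
    ```

    Either way it lands comfortably past 1600 on that scale, which is where the estimate above comes from.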
     
  7. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    In most games my pair of 970s is already insanely overpowered. Only a very tiny few have memory limits (GTA V, Skyrim w/ mods, Shadow of Mordor, Dirt Rally); otherwise the graphical horsepower available is more than enough for a long time. A true shame as the 970s perform excellently and scale really well. 3.5GB is just not enough. 6GB on a 980Ti would be ample, and the 8GB on a 390X or 1070 would be excellent. May make the jump to a discount 390X, or wait a bit and get a 1070. The Fury, Fury Nano, and Fury X are basically vaporware considering their price. I'm still looking around the 300-400 mark for a single card. A well equipped 1070 G1 or similar would probably fall within that range, or maybe I can save a bit of cash and eBay a factory OC'd 390X for cheaper due to price drops when more new cards release.

    Problem with the 390X though is it's noisy as hell, and would basically require a new PSU to Crossfire a pair - somewhere in the neighborhood of 1000-1200W. Stock pairs have been measured drawing some 800W+. Otherwise I'd already have them. A discounted one from a buddy or the internet could probably be available in the future. I would certainly experiment with a single card and see what it does. Maybe the memory is enough. Heck, if the 970 had a proper 4GB of memory, I might not be hitting the limits that I am. Awful waste of a video card.

    Reminds me of issues in the past. Should have gotten 2GB 6850s. Should have waited for a 390X or gotten a 980. Keeps repeating =/ At least CPUs aren't advancing too quickly. My 4690K seems up to any task I throw at it. Don't think I've ever seen it struggle even a bit.

    TL;DR - My 970s have way more than enough power but I need more graphics memory. Lots of suitable cards currently available, and the promise of decent prices for new cards. Nothing is a great deal though so I'll probably wait for discount or used current-gen rather than go to the next-gen. I just don't need the rendering power as much as I do the memory.
     
    Last edited: May 18, 2016
  8. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    It's interesting, as despite running a higher video resolution than you, I find the 3.5GB of the 970 less restricting, primarily because with just one of them I'm not able to turn detail settings in big hitters up high enough to require that much memory - the additional processing power granted by SLI, plus the fact that it carries its own video memory overhead, makes that much more of an issue. It's still fairly rare for games to much exceed 4GB at 2560x1440 these days, but given that in practice with two 970s you're effectively using around 3.25GB per card, I'm guessing that's just a little on the short side. In true nVidia style, corners were cut everywhere on the 970, but I suppose it is a testament to the architecture how successful the card is in spite of that.

    I wouldn't regard the R9 Fury and Nano as vapourware - they're widely available - and even though I don't really think vapourware is an appropriate term for a product very few buy due to its price (e.g. stuff like the 295X2 and Titan X), I wouldn't classify the Nano and Fury in that category either. They're both a little pricey for what they are, but not excessively so. The main issue is the thermal output - in the case of the Nano it's a much more efficient card than the other two, but the heatsink still isn't large enough to deal with the heat on its own, so it requires a fair bit of ambient ventilation to avoid throttling; with the Fury it's just a case of dumping a considerable amount of heat into the case; and with the Fury X it's neither of the previous two issues but all the potential pitfalls of running an AIO water cooler in your machine. The 390s are more affordably priced and arguably a much better value buy, but offer slightly lower performance still for the same large power and heat output. The AIB GTX1070s are likely to be around the $400 mark - I'd strongly suggest sticking it out for one of those - GTX980Ti-like performance in a 150W TDP for that sort of money, plus 8GB of video memory, is going to make it a very compelling card indeed - should be available in a month or two's time.

    In gaming, CPU has become a total non-issue - where I wish I may have gone with a slightly higher-end CPU is during 4K video playback, game screen capture and video transcoding (the latter two not necessarily 4K).
     
  9. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Skyrim uses 3.8GB, GTA V uses 3.7GB, Shadow of Mordor uses nearly 4GB, and Dirt Rally uses about 3.4-ish but also has massive performance requirements. Anything requiring over 3.5 is nearly always an instant drop to unplayable for me, and too much stuff is borderline at the settings I use, even without AA. 1080p would be a different story entirely, but then I would never have bothered with SLI.
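
    Put as a trivial sketch (the usage numbers are the rough figures I just quoted at my settings; the 3.5GB threshold is the 970's full-speed memory pool), the problem looks like this:

    ```python
    # Sketch: which of my regulars spill past the 970's fast 3.5GB memory pool.
    # Usage figures are the approximate numbers quoted above, in GB.

    FAST_POOL_GB = 3.5  # the 970's full-speed memory segment

    vram_usage = {
        "Skyrim (modded)": 3.8,
        "GTA V": 3.7,
        "Shadow of Mordor": 3.9,   # "nearly 4GB"
        "Dirt Rally": 3.4,
    }

    for game, gb in vram_usage.items():
        verdict = "over the fast pool - drops to unplayable" if gb > FAST_POOL_GB else "borderline"
        print(f"{game}: {gb}GB - {verdict}")
    ```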

    Games I've struggled with in the past are a non-issue though. Most notably Crysis 3, Company of Heroes 2, Empire: Total War (old but insanely demanding when modded), Men of War: Assault Squad 2, and Sins of a Solar Empire: Rebellion. Those last 3 I play frequently.

    So yeah, the power is enough, but the memory is not. And as you can see, with a proper 4GB I probably wouldn't have hit a limit so soon. It's like they purposely neutered the card to limit its capability, not to save money.

    BTW I do notice performance drops after about 3.2GB as well, so even the main chunk of memory has something going on with it. The 970 is essentially a 3GB card in disguise as a 3.5GB card masquerading as a 4GB card. Reminds me of the American playing an Australian playing a Black soldier in Tropic Thunder. Funny in the movie, but a shameful waste of what could have been one of the best video cards ever released. I feel like Nvidia sort of ripped me off. These could have been my go-to for several years more. It's not like memory costs much these days. For $360+ they could have included a proper 4GB. Cards several generations older managed it.
     
    Last edited: May 19, 2016
  10. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    It's a double whammy - with such a loyal fan base and such lacklustre competition from AMD, the 'scandal' that was the GTX970's memory allocation was utterly inconsequential. People either lived with it, or they got a 980 instead. Not too many people switched to the AMD alternative, as even with the revelation that the 970 was a marketing scam - one that continues to this day, as they still sell it as a 4GB card - it still worked out to be, for many, myself included, the better product. Incidentally, if the 290X had worked properly with an MST display like the UP3214Q I use, I'd probably still be using it, assuming it hadn't failed - the 95°C operating temperature on the 290X brings the average failure rate across all the various partner boards of the 290 and 290X up from the overall industry average of 1 in 50 (more like 1 in 25-30 for high-end graphics cards) to a pretty dismal 1 in 12.
    However, if you got one of the ones that survived the temps, had good enough ambient airflow to keep it close to the full clock speed at 95°C or could live with hacking in a custom BIOS to allow the fan speed to exceed 50%, and weren't using an MST monitor (note: both HD6970s could handle it just fine. The 290X was incompatible. The GTX970 still does a worse job than the 6970s but at least it's sort of tolerable), then the 290X could make a viable competitor - but damn, why not just avoid ALL of those issues and settle for slightly lower memory and marginally lower GPU performance in return for half the noise and less of the BS.

    I made a lot of excuses for AMD in the past and I'm still far from being a 'fanboy' of either brand, but putting all the real world attributes of the products on paper and not just the frame rate results, it isn't difficult to see why nvidia have an ever increasing market share in the industry. For every two AMD desktop GPUs that are sold, nvidia sell nine.

    It's not a good thing either, as lack of competition is primarily how things like the GTX970 scandal are 'acceptable' in the first place, and also part of the reason hardware progress is stagnating. I'm actually quite surprised at how potent the GTX1080 is, as there is no rival to it whatsoever for at least a year, possibly not ever the way things are going at the moment. Just look at the CPU market to see where lack of competition is leading. Ever-increasing performance-per-watt possibilities lead server CPUs to grow bigger and more capable, with things like the E5-2699 V4 dropping an incredible 22 2.2GHz Broadwell cores into a single 145W package. Given it's an HT CPU, that's 44 threads you can play with on a virtual server host and the like. Plenty of AMD desktop CPUs pull more than 145W delivering 8 'cores' to desktop users, cores that are likely to be slower still than the Broadwell offering. Look at Intel's mainstream quad-core CPU products on the other hand and performance hasn't even doubled in 5 years...

    Back onto graphics, my current plan is to get hold of a GTX1080 as soon as I can lay my hands on one, set up a new fresh Windows 10 install to replace the 8.1 I'm using now and see how MST fares. If sticking with nvidia means it's still as buggy as it is on the 970, then the UP3214Q will be upgraded to a UP3216Q and I'll be using HDMI 2.0 instead. I'm rather hoping that won't be necessary as the upgrade doesn't earn me anything other than fixing those bugs - the panel of the UP3214Q is already first class - the new display even looks identical. Posting and taking delivery of 32" displays is such a nuisance. I've become too accustomed to collecting parts from pick-up points and Amazon lockers!
     
  11. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Well the possibility of a second hand 390X coming my way is definitely real, so I'd still give one a go. My friend is currently considering a 1070 or 1080 to replace his. The 390X is enough memory, and the price (one of my 970s + $50 cash) is hard to argue with. It would be quite noticeably faster than a single 970, and even a bit faster than a 980 if your numbers are accurate.

    Also, while the 390X is a monster power sucking card, my 2-3 year old 750HX should at least handle a single one with zero issue. I could probably barely squeak by in Crossfire if I turned my CPU to stock, but lately it's becoming more compelling to get a faster single card. From numbers I've seen even a stock 390X is more than enough performance, and the used one would be the cheapest way to get what I need for memory.

    A brand new Samsung 850 Evo 256GB M.2 SSD may head my way as part of that deal as well. I'd certainly be interested to see what he wants for it. M.2 is the direction I'm headed for my OS drive. The drive being SATA over M.2 instead of NVMe/PCI-E doesn't really bother me. It's bound to at least be similar to my 128GB 850 Pro. Lifetime is supposedly better on the pro but honestly I don't see myself wearing out an 850 Evo too quickly. SSDs have advanced amazingly far in reliability in a few years.

    Monitor-wise I'm going to ignore the issues with my 32" and use it as normal. Just played 4 hours of GTA V on it and never looked at the spot once. It's faint enough that I can mostly shut up. If more 32" 1440p options existed I'd be looking into them.

    Windows 10 isn't perfect. Win 7 is still the king of Microsoft OS's, but 10 is still better than 8 and 8.1 in every capacity. Absolutely superior. Plus the spying is fairly easy to disable.
     
    Last edited: May 19, 2016
  12. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Probably not an issue if you're likely to go back to using one card, but one thing to be aware of is that if you do go with an M.2 SSD it's usually the case that running them with dual graphics cannibalises PCIe bandwidth from the graphics cards as the 16-lane CPUs can't keep up - not an issue for X99 users of course.

    Yes, a single full 300W card will work happily off a 520W PSU - I did exactly that with an HD4870X2 in the past, and in full gaming load with both GPUs fully laden it pulled about 450W AC, so around 390W from the DC end. If I recall that was with a mildly overclocked Q6600, so your 4690K is likely to produce similar results, perhaps slightly more, but either way 600W would cover it, let alone 750. CF on full-size cards like that is really the domain of 850W PSUs - I ran the same test with a pair of 4870X2s and out of the PSU it pulled around 700W in games with my Q9550, and in a burn test around 920W, which I was quite surprised to see the 850W produce. Then again, it was almost identical to its 1000W counterpart.
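
    For anyone wanting to sanity-check those wall-power figures, here's a tiny sketch of the AC-draw vs DC-load arithmetic; the ~87% efficiency number is just an assumption for a decent PSU of that era, not a measured value:

    ```python
    # Sketch of the AC-draw vs DC-load arithmetic used above.
    # Efficiency is an assumption (~87% for a decent unit); real curves vary with load.

    def dc_load(ac_draw_watts: float, efficiency: float = 0.87) -> float:
        """Approximate DC output the PSU is delivering for a given wall draw."""
        return ac_draw_watts * efficiency

    def headroom(psu_rating_watts: float, ac_draw_watts: float, efficiency: float = 0.87) -> float:
        """Rated capacity left over after the estimated DC load."""
        return psu_rating_watts - dc_load(ac_draw_watts, efficiency)

    print(round(dc_load(450)))        # ~392W DC - the ~390W gaming figure quoted above
    print(round(headroom(850, 920)))  # ~50W to spare, if the 920W burn figure was measured at the wall
    ```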
     
  13. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    As far as PSUs go I wouldn't go lower than 1KW for a pair of 390X's or similar. They have been known to spike quite high. I certainly wouldn't try it with an 850W. Overclocked CPUs can also take monster power. 200W isn't uncommon, especially with things like my relatively cheap AIO cooler. I can easily get the 4690K to pull that much well within safe limits. So at 200W plus 300W for each card you are coming dangerously close to that 850W PSU's upper limits. That's not counting drives, board, memory, fans, etc. Thus 1KW+ PSU or don't do it in my eyes. They are particularly power-hungry cards.

    In contrast, I'd have no issue running say a pair of GTX980s at ~165W max TDP, or even have a go at a pair of 980Tis at ~250W. Add 20-ish watts each for factory OC'd cards. They simply don't spike like the AMD cards do, which are 275W TDP and spike to 300+. Factory OC'd 390X models like I want are easily 300W per card and spike much higher. An 850W PSU would be hard pressed to power the GPUs alone. Most quality PSUs will power a single one though.

    On Z97 M.2 draws PCI-E 2.0 bandwidth from the chipset. The chipset provides a full 8 lanes of usable 2.0 bandwidth separate from the CPU. CPU is interfacing with the full-size PCI-E slots only. It would share with my SATA Express ports and my sound card. The sound card is 1x and I think a PCI-E SSD would be 1x/4x so should work fine. Besides that, the 850 Evo M.2 is a SATA Express SSD which I think equates to only 1x. So I don't think there should be any major conflicts. I was worried about this when installing my sound card, and Gigabyte tech support were very helpful and gave me a full explanation. You can populate just about every port and slot in the system without issue, as long as your total bandwidth isn't greater than 8x on the non-full-size slots. PCI slots are supplied by the same bus. I'm not sure if SATA Express and PCI-E are treated differently on this bus or count towards the same bandwidth cap. This should cover all Z97 chipsets from all brands but I think a few very basic models lack the "Plex" PCI-E/SATA combo controller chip that makes it possible.
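
    Purely as an illustration of that budget - the lane counts below are taken from this post and Gigabyte support's explanation as I understood it, not from a datasheet, so treat them as assumptions - a quick tally looks like this:

    ```python
    # Illustrative tally of the chipset-side PCIe 2.0 budget described above.
    # The 8-lane cap and per-device lane counts are assumptions based on this
    # post, not on a board manual.

    CHIPSET_LANE_BUDGET = 8  # usable PCIe 2.0 lanes hanging off the Z97 chipset

    devices = {
        "sound card (PCIe x1)": 1,
        "850 Evo M.2 (SATA mode)": 1,
        "SATA Express ports": 2,
    }

    used = sum(devices.values())
    print(f"{used}/{CHIPSET_LANE_BUDGET} chipset lanes spoken for, "
          f"{CHIPSET_LANE_BUDGET - used} left over")
    ```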

    The engineers knew what they were doing it seems :p

    [image]
     
    Last edited: May 23, 2016
  14. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    So after helping my family with some financial issues over the past few months, it's time for me to get my windfall. Daddy-o agreed to get me a ~40" LED HDTV. Next week or so I'll be demoing some at local stores. If it works out like I hope, the 32" Samsung monitor will probably get put away. I'm looking for a set that includes VGA; otherwise, just HDMI for everything else is acceptable. My Yamaha receiver outputs clean, unmolested component video through HDMI. My receiver also has fewer compatibility issues with TVs vs monitors, which make it act funky sometimes.

    Have given the situation some thought, and new video card(s) at this point would be wasteful when I'm much less satisfied with my display. I never found the quality to be lacking, but given its other flaws and my current performance issues, this 32" display needs to be replaced with something bigger and easier to drive games on. If it means a resolution drop to keep my budget from going hog-wild, so be it. A larger display benefits me in many other ways, and it being a TV means better connectivity and scaling with my five connected consoles. N64 and Dreamcast through S-Video, PS3 through HDMI, Gamecube and Xbox through component. A TV scaler will work far better with these consoles, especially the Xbox and PS3 which do not scale well with my monitor(though they scale great through component on the old 2407WFP).

    Dropping to 1080p would greatly alleviate my memory issues for the time being. Skyrim drops from 3.8GB usage to 3.4GB with no other changes (rough arithmetic on where that saving comes from is sketched below), and I can tweak it down from there by a little bit. It goes from jumpy and unstable while flying around with noclip to very smooth and consistent. I have several mods installed where I chose the 2K or 4K version where a 1K or 2K one might have sufficed. At 1080p they would certainly look fine and are still a vast upgrade from the stock graphics. I could probably get the smaller versions of a few and reduce my usage by another 100-200MB or so. If I can create enough cushion it should be totally stable and smooth no matter what I'm doing.

    Think of the best graphics you can imagine. Modding Skyrim is worth it for the dedicated. Installing basic mods is a no-brainer even for a novice, but installing large or advanced mods, or several large, advanced mods together, has an extreme learning curve. The average PC gamer could install Mod Organizer and get a few big texture mods and expansion mods going with no issues. Add an elf chick with skimpy armor or something. Trying to retexture and generally enhance nearly everything in the game is a different story entirely. Sitting at 395 installed; will be just over 400 when done. Almost all textures, with two user-made expansions, and a heap of fixer and enhancer mods to flesh out the game. Fix mechanics, optimize bloated scripts, increase distant LOD, etc.
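
    As a rough sense of where that memory saving comes from (my own arithmetic; the buffer count and bytes-per-pixel are assumptions, not measured data), the render-target portion alone can be estimated like this:

    ```python
    # Rough arithmetic for how much of the VRAM drop comes from render targets alone.
    # num_buffers and bytes_per_pixel are assumptions - modern engines keep many
    # full-screen G-buffer / post-processing surfaces around.

    def render_target_mb(width: int, height: int,
                         num_buffers: int = 16, bytes_per_pixel: int = 4) -> float:
        return width * height * bytes_per_pixel * num_buffers / (1024 ** 2)

    mb_1440p = render_target_mb(2560, 1440)    # ~225 MB
    mb_1080p = render_target_mb(1920, 1080)    # ~127 MB
    print(round(mb_1440p - mb_1080p))          # ~98 MB saved from full-screen buffers alone

    # The observed drop (3.8GB -> 3.4GB) is larger, since texture streaming and AA
    # buffers also scale down with resolution.
    ```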

    Additionally, LED is a direct remedy for some of the issues I had using TVs previously. Mostly unnatural, oversaturated colors, and loss of contrast when I lowered the brightness, common on most LCDs. Any decent LED should be able to maintain color accuracy and contrast much better when adjusting brightness. They also have excellent contrast out of the box.

    I want clean video scaling, size and video playback here first and foremost. Even a bargain bin LED TV will give me that. So if I take it a step or two up and get, say, a slightly older Sony, LG, or Samsung mid-range model at discount prices, I should get an okay TV. Everything else is secondary. At 40" and below, 1080p is crisp enough not to bother me at all for desktop use; only above 40" does it become a horribly blurred mess. My Coby 39" worked A-OK in that capacity.

    Gaming also works great at 1080p, and for minutely detailed RTS games and the like, there's always SLI AA. Even with 4K film there is a benefit because I usually do a few notches of pan-and-scan/zoom with anamorphic content. Have experimented around and found a decent setting that doesn't cut off anything important, and leaves room around the edges in scenes framed by a full anamorphic screen. 4K film looks amazing if you blow it up a little on a large screen. Even at the lower 1080p resolution, the color information is still there. It's still a cleaner, smoother image.

    My biggest priority is HD film and consoles. My main display is the center of my entire entertainment setup, so 32" is just lacking, despite the benefits of 1440p. An updated 39" LED would be a large upgrade from the older LCDs I've used. All the reasons stated above are enough to convince me that my original plan was the right idea all along.
     
    Last edited: May 23, 2016
  15. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Just an update. HiSense 40H5B purchased for a bargain. Bought from a local swap page on Facebook for $200. New-in-box purchased, opened and never used a few months ago. Tape holding screen protector and sealed plastic still intact. Receipts and everything included, original price $350. 40" 1080p LED Smart TV(not that I'll use that feature). It also has some sort of lame emulated 120Hz mode. I'll be sure to check it out, but thankfully it can be entirely disabled, leaving a standard 60Hz display. Some TVs can emulate the 120Hz effect quite well though and cheap hardware can surprise so I'll still try it.

    Image quality seems good and colors are mostly co-operating while I adjust contrast and brightness. Quite a nice looking display. A hair on the cooler side of color temperature but fairly neutral. At 4 feet viewing distance, basically at the back of a large desk, text legibility and desktop usability are pretty good. Nearly identical to my 39" so the difference must not be huge. Productivity so far has been okay. The drop in resolution hurts a bit, but I'm finding myself adjusting quickly. Again, this is at a considerably larger viewing distance than my 32", so bear that in mind.

    That being said, HDTVs do make good monitors if you know what to expect and are prepared to tweak the image settings a bit. Their response times aren't amazing, so they tend to ghost just a hair. They also have the color tint way too high from the factory, and contrast and brightness that will burn your retinas. Scale it all back though, and they are generally a pretty decent display - comparable to any modern TN panel monitor in everything except response times. No worse than my A-MVA Samsung monitor's mediocre response times though. LED TVs are a leap from older CCFL LCDs in that regard. More expensive models will accept a full 4:4:4 chroma signal (i.e. no chroma subsampling), which gives crisper text and slightly better overall definition because color detail isn't averaged across neighboring pixels. This is something most modern monitors do by default, but most TVs do not. On TVs it usually introduces more latency though, which can be hit or miss already. TVs without 4:4:4 chroma usually need Windows ClearType, which works well if you know how to use it. ClearType makes text look pretty good on my cheap-o display.

    So yeah, other than it being a generic-branded TV with a built-in chance of dying, it works quite well. The size and image quality are definitely enough. Wish me luck on the long-term reliability roulette.
     
    Last edited: May 24, 2016
  16. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    For $200 you can't really argue I suppose. It'll take a lot to convince me of the quality of using HDTVs as monitors; every example I've used, regardless of contrast quality and potential ghosting issues, has never offered a sharp image at any viewing distance, making text very difficult to read. Fine for video but not for desktop use really. Still, if it works for you then fair enough!
     
  17. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    If you know what to tweak and aren't expecting amazing results, they are very functional. The panels themselves are usually a dime a dozen and used in all different brands. Unless you want something more expensive with a proprietary panel, you are likely getting the same TV from several different manufacturers.

    The biggest issues with their usability are that color tint/saturation and contrast are set far too high from the factory, even for normal TV viewing. Brightness is also very high, but honestly that evens out a LOT when contrast and tint are lowered. It's a little blinding when I first wake up, but for all day use, I'm not having any noticeable eyestrain like an un-adjusted TV might give. In fact, I find TVs to be more comfortable on my eyes than a monitor once adjusted. And you KNOW I've had a few decent monitors. I think the lower DPI agrees with my brain in some weird way.

    Another setting from factory that's usually WAY too high is sharpness. Instead of a sharp and smooth image, the over-used sharpness setting creates a very grainy and overexposed image. Most TVs will look AWFUL for desktop use out of the box due to this setting. Until you reduce this setting or turn it off entirely, you'll NEVER get a good image from an HDTV at desktop distances. My TV came with it set at 10 notches out of a max of 20. Currently I have it at 1 out of 20. Enough to keep the image from being too soft, but not immediately noticeable to the naked eye unless you use the display for a while. Quite a nice effect when used as sparingly as possible.

    1080p at 40" may seem like a horribly blurry display, but it's actually wonderful for desktop-distance games and movies. Like blows my 32" 1440p monitor out of the water. No comparison whatsoever. Even my old LCDs were great for that, which is what convinced me an updated LED would be very nice indeed. Size trumps everything for immersion. Detailed FPS's/RPG's/RTS's and the like included. Also, proper 1080p film looks amazing with 1:1 scaling, even directly in your face. No monitor I've ever used can even touch a cheap HDTV(LED or not) for movie viewing. The mainstream adoption of LED in the last half decade or so just means that the lower-end for color reproduction has jumped up a little, so it's the cherry ontop of an already decent pie. An LED TV will never touch my MVA or IPS panels for color accuracy, but will destroy them in contrast ratio(a HUGE deal for film and games). They can certainly be calibrated to at least approximate a modern TN panel monitor in accuracy though. It's close enough to satisfy me when read with the calibration tool. Not far off my old ASUS 1080p 23.6". A bit on the cool side for color temperature, but totally acceptable considering the quality of the panel. Older CCFL LCD displays could almost never be adjusted to acceptable levels.

    Windows ClearType plays a MAJOR role in making text bearable. I do tons of reading daily and it's actually easier on my eyes than a monitor for that, so it's a perfect usage scenario for ClearType. A TV with 4:4:4 Chroma Subsampling will do a vastly better job with text, but will be proportionately more expensive. So if you are nitpicky and willing to spend, you can get a pretty nice monitor-type display out of a TV. I will not however, say they compare to a decent monitor in displaying text. I leave ClearType off on my monitors because text already maps wonderfully to their native resolution. It's absolutely 100% essential when using a TV though. Would be curious to try a more expensive 4:4:4 Chroma panel in my own home and see if that holds true. It might be that a more expensive TV would be enough to display sharp text natively with no adjustment. In short, displaying text is an HDTV's largest weakness. You are absolutely correct there. It can be decently remedied with ClearType though, and not all TVs are created equal. More expensive ones will do it far better.

    Also keep in mind that my usual viewing distance is about 4+ feet from the monitor leaning back in my chair. Even with the 32" things can get a bit squinty for me. Forces me to always be in typing posture to read or browse the web or anything else with lots of text. That's fine for when I'm doing work or being productive, but limits my casual enjoyment of what I'm doing. That's kind of a priority, because it's more a glorified HTPC and gaming box than anything else. The viewing and listening experience matter far more to me than any other aspect. In that respect, even my older and far worse HDTVs excelled over any of my monitors. They just take the cake for gaming. Nothing like playing Skyrim, GTA V or FarCry 4, and having an awesome mountain vista or a sprawling city fill your entire view. It is a feast for the eyes, that not even my 32" monitor can properly emulate. To get anything even remotely close in a monitor, requires many times the price.

    As far as gaming clarity goes, it's far better than you'd expect. Also, dropping back to 1080p means my video cards are even more hilariously overpowered than they were at 1440p. I mean, minus video memory issues, they were stupidly powerful for 1440p. Insanely powerful. At 1080, I can apply basically as much AA as I want to a lot of games, and get pretty nice results because I have so much memory and GPU overhead. Believe it or not, 16x SLI AA removes jaggies entirely, even out of native res. The performance hit is huge for SSAA variants like that, but it does work a treat, even on this very low DPI display. For newer games, 4xAA is enough to clean up the image spectacularly. Again, no comparison to a monitor for clarity, but like color accuracy and displaying text, there are some pretty reasonable methods to bring it in line. Granted, even without AA, it produces fairly crisp image for games. IMO 1080p is a pretty suitable standard. Any lower and a TV would simply suck, but it's JUST enough. Even at 40" you have to really shove your face in the screen to see individual pixels. Size also helps make up some of the detail lost by dropping resolution. There's a LOT to be said for simply making your display larger with no other changes.

    I'm never going to say an HDTV makes for an amazing display and I'll never try to convince you it's better than a quality monitor. They are simply too limited in capability to do what a monitor does. Simple facts about how they function mean they are far more suited than most people think though. Most of the biggest issues can be countered with workarounds or fixes. If it were truly a bad display, I wouldn't use it. As far as your experiences telling you they can't produce a sharp image, you are doing something wrong or haven't taken the time to really adjust it properly. It is perfectly 100% acceptable for me, even coming from high DPI monitors. A few tweaks can go a LONG way. It's not perfect, but the benefits outweigh the drawbacks by a landslide for my usage.

    TL;DR, If you know what to tweak, TVs can be gotten to quite acceptable quality. Plus, expensive TVs need far less tinkering to make them acceptable. Also, 1080p means my expensive and overkill powerful video cards might actually last a few years instead of being replaced for something so pathetic as a lack of memory. I find TVs to be a great alternative for a monitor under the right conditions. If I used my PC differently, it might be a different story. They are godly for movies and games, and have little to no consequence for anything else I do.
     
    Last edited: May 25, 2016
  18. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Hmmmm, so Jeff, you're contemplating ditching sli, and coming back over to AMD, increasing your memory to 8 GB - getting a 390X all for only $50 plus one of your 970s. Sounds interesting. Nice to have friends who have their own plans that dovetail into your plans.

    Also interesting, Sam, is the new drop in die size - finally!

    So what kind of work are you doing these days?

    Ohhhh - forget the 390X - now you're going for bigger monitor! Okay, you still have a rich treasure-load of console games.

    Congrats on your new 40"!!

    That's an interesting comment. As I recall, you sit pretty far away from your monitor - far away meaning a lot further than my 1 foot. I have no choice - that end of the trailer is small to begin with so I am right on top of the monitor. But if I had an opportunity to sit back, yeah I'd want a bigger monitor.


    Yeah, that's how I remembered it - way back in the saddle!!

    Well, it seems like you are back at 1080p to a nice comfort zone - your dual 970s are now no longer memory-limited, you can mod to your heart's content, no need to moan about the lack of 4GB - and you're back to a large size better suited to your 4-foot viewing distance.

    Not bad!

    So you sound like you are a master of tv knowledge - do you work at Best Buy now, or what kind of work have you been involved with lately?

    Rich
     
  19. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Rich, I have worked in TV and electronics repair and service jobs for a long time, and I fix PCs out of my home frequently. Also, I'm an enthusiast with Google and a thirst for knowledge :)

    If you can sit close enough, a smaller display will probably suffice for PC usage. For me though, I am basically running a small home theater in my room with gigantic 4ft tall 200W+ speakers. The size of the screen doesn't fit the size of the sound and is a weak way to display epic cinematic masterpieces such as Lord of the Rings, Star Wars, etc. I have an actually pretty powerful 5.1 setup with a meek little 32" computer monitor as the main display for everything. Just NOT enough.

    A 40" LED TV is a major improvement in size. Almost a small cinema screen at my distance. Also because it's a fully up-to-date LED-lit display, it's far better than the HDTVs I've used as monitors in the past. The contrast level is absolutely fantastic, which is great for both movies and games. It also has reasonable color accuracy and the colors are very vibrant and eye-pleasing. 1080p at 40" is a bit large for desktop usage, and the OS side of things is a bit "softer" than on a high-DPI monitor. ClearType makes text sharper though, so it's perfectly usable with no real annoyance. Nothing is blurry though a few things are a bit softer. It's surprisingly sharp for the most part. 1080p movies have only improved in sharpness and clarity, because they map to the screen's resolution much more cleanly. Only very low bitrate video suffers, and that's not the fault of the display. A 39" like my old Coby or a 37/38" would have been slightly better in clarity, but odd sizes are hard to find. 40" is awesome for movies though, and easily enough for gaming with or without AA.

    The 970s will still get replaced in due time. But if the current trend holds, the use of 1080p should be enough to stave off the video memory goblins for a while. I'm talking extending the life of my cards by a number of years instead of replacing them after just over a year. That's major. They are high-end cards purchased for almost $400 each. They are also among the fastest factory OC'd 970s available. They need to last longer or they were a waste of money. I was honestly regretting not just getting a second 4GB GTX760 for a while. They wouldn't have had the same memory issues, and two of those in SLI is still a fairly beefy setup. With the 1080p LED TV however, things are on an entirely different playing field. It means they perform how they should, without restrictions. It changes them from my greatest liability into my strongest asset. That difference alone is almost worth the change.

    Finally my 970s are incredible, hilarious, glorious overkill for my native resolution. THAT is what I bought nearly $800 in video cards for. My performance issues have disappeared. My movie viewing and gaming experience is vastly improved. High resolution monitors do not offer what I seek. It is FAR too expensive to get one in this size, and the performance cost for something like 4K is completely insane. With un-hobbled cards, 1440p might have still been okay, but find a 1440p display this large.

    It's only a VERY SMALL few games that even touch the limit for me. They just happen to be premium quality games that I want to put a LOT of time into. GTA V and Skyrim I play currently. Middle-earth: Shadow of Mordor I haven't been able to play yet due to video memory constraints. Also Dragon Age: Inquisition and Battlefield 4, which have seen very little play time from me due to constant "out of video memory" errors, despite great performance. With such powerful video cards, running at reduced settings is NOT an option. I refuse.

    Several people on other forums and elsewhere have stated that 970s seem tailor-made for performing at 1080p. That seems to be true. It probably is true.

    Also, the 32" 1440p Samsung monitor had developed a large discolored blotch on the upper left section of the display. Layers de-laminating which is sadly common with these displays. Any other 32" 1440p panel is going to be the exact same manufacturer, so I have serious doubts about investing any more time or money into the idea. A TV matches my budget, and does what I want it to do BETTER than any monitor within a range of $1000. A more expensive TV might have been a better solution, but this one was silly cheap and is brand new. It'd take a pretty expensive TV to provide anything noticeably better, and 4K isn't a realistic option.
     
    Last edited: May 26, 2016
  20. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Jeff, I was following you pretty well, until at the last you said

    Am I to take that to mean - prior to reducing resolution to 1080p?

    Or are those titles STILL going to continue to plague you with the "out of video memory" errors?

    I think it's the former, and that you have solved your problem with those games - which if that IS THE CASE - is wonderful.

    Also I totally see what you mean with home theater, large speakers, but tiny monitor. Not balanced. Way better now with the 40.

    Rich
     
