
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
I don't think they're limited to Nvidia hardware, only Nvidia-designed, much like Nvidia cards can run TressFX. I haven't heard any mention of it either way. They're all perfectly standard graphics techniques. Don't take my word for it, though.
     
  2. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
Far Cry 4 is also a far more technically advanced game that does a lot more with the engine, and does it more efficiently. Lots of neat stuff. Some parts are lacking a bit, but it is overall superior to Far Cry 3 and easily one of the best FPSs ever. I only score it as high as I do because it is probably one of the best-put-together games I have ever seen. The graphics are NOT better than Crysis 3's, but it has a lot of exceptionally well-implemented features.

    Also, enjoying it on my newly repaired LG 32" HDTV. Much sharper and more desk-space-practical than the 39", but the colors/contrast/brightness need some more fiddling than the Coby 39" required. The Coby has much better default color settings than the LG.

    As for the fix, dead mainboard replaced with compatible one from a 47" Vizio. Menu/splash screen is different from original and has different image/audio controls. The remote largely doesn't work save for power, volume, and channel. All the menus need to be navigated with the hardware buttons on the TV. A hassle but worth the benefits and the buttons are easy to reach from my chair without stretching or feeling around. No functions missing either so no performance/function downside. Just a slight pain to switch inputs.

    All things considered it's a better quality LCD panel, not to mention much sharper and crisper due to smaller size. Much more forgiving to my 480p consoles as well, again due to size. Easily comfortable with this as a long term solution until I bother to pony up for an LED of similar size and capability. So far, it's proven exceedingly difficult to find a <40" LED through local stores that isn't limited to 720p. 32" or 37" would be perfect. Basically the highest possible capability 32" LED would be my ideal monitor. Monster contrast ratio, 120Hz, 1080p, the basics of any mid to high end TV these days.

Mind you, the color/contrast difference between HDTVs and monitors can be completely eliminated through the available adjustments almost 100% of the time. Viewed simply as a 32" 1080p LCD monitor, this TV is easily comparable to my old 23.6" 1080p Asus. Obviously the trusty 2407WFP blows away any other flat panel I have ever owned in sheer image quality and color accuracy, but this 32" is in a nice little sweet spot; the pixels were simply too large on the 39". Color and contrast can otherwise be adjusted to come quite close.
     
    Last edited: Nov 24, 2014
  3. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Wow, brilliant review Jeff.

    Thanks for shoe-horning in that amazing video.

I realize that I am not a total graphics connoisseur like you guys, and some of the effects, the soft shadows, etc., I would probably miss. But I liked how the hair was handled on the buffaloes and on the tiger.

    Before I read Sam's remarks - "a suite of Nvidia-unique features" - I was thinking - "I won't be able to run this game."

    But AFTER reading Sam's remarks, I'm thinking, "Well, I'll maybe be able to run it, but I don't have to worry about soft shadows - I won't see them."

    However I have the idea that AMD will figure out a way to take a lot of those effects and come close to reproducing them, given time, and as soon as they see how well the game is selling. That tells me that I will only benefit by holding off, until a Far Cry 4-enhanced Catalyst is released. (If you Nvidia guys hear of that happening - please post it.)

    I'm glad to see Ubisoft continue to do some good things.

On that other eagerly-anticipated Ubi title, I'm looking forward to your eventual review, Jeff, of Assassin's Creed Unity, about which Sam posted that 4K screen (and a bunch of glitch videos - leave it to Sam. :D )

So now you're running a 32", down from the 39" TV. I forgot that you had switched to the TV. Do you still lean way back in the chair? Or do you play close up like I do with my 30"? (And I still wonder if you'll eventually find yourself migrating to high-definition.)

    Rich
     
  4. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Actually, the Nvidia features, except for fur, all run better than the normal ultra setting.

    I still sit back in my chair. 1920 x 1080 at 32" is ample size to read and play games.
     
  5. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
Not quite, according to GameGPU's testing. It's not a huge difference, admittedly, but the Nvidia settings look slightly more demanding than Ultra+:
[Image: GameGPU Far Cry 4 benchmark graphs]

The lack of AMD results at the 'Nvidia' setting initially led me to believe that the option couldn't be selected on AMD hardware, but of course it may simply be that they assumed as much based on the designation and never experimented with it. I would imagine, especially given how the rest of the game behaves, that AMD hardware would take more of a performance hit at that setting...
     
  6. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Gigabyte GTX 970 G1 Edition ordered and on the way. Only reported issue with these cards is some coil whine. As long as I avoid that, should be a damn fine purchase :)
     
  7. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
Yeah, coil whine has been a major complaint with all GTX 970s. I get it, but to be honest I'm not sure it's a great deal worse than most other cards I've owned; the difference is that the 970 is quiet enough for the whine to be more noticeable when it happens.
     
  8. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    No coil whine to speak of on the GTX760
     
  9. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
Coil whine's often a bit particular anyway. I used to get it on my HD 6970s, but only when drawing a very high frame rate, say 400+, so intro videos did it, as did things like GZDoom, but mainstream titles made no noise whatsoever, same for the Windows desktop. My 970 seems to behave the exact same way; perhaps the coil whine is a little louder when it does it, but even then it's not really much different. The only time it's ever bothered me was when the WEI refreshed itself on the one rare occasion I left my PC on overnight, and it started squealing away while I was trying to sleep. Not having done that for a long time, for all I know the HD 6970s could have done that as well...
     
  10. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
Seems we have some dubious Chinese 'leaked' graphs of R9 390X performance. Interested to see how accurate this is; quite promising if they're true, primarily from a performance-per-watt perspective, though raw performance is still solid:

[Image: leaked R9 390X benchmark graphs]

    This places the R9 390X at a performance index of 1.31 relative to the 290X, and in the case of this test, 1.16 relative to the GTX980.
    Wattage, however, is the bigger win, with the card pulling 200W off the 12V rail rather than the excessive 280W the 290X currently needs. Not quite the 185W the GTX980 achieves (note nvidia still under-rate their TDPs, even though they're clear winners in the power department this gen, as the 980 is officially only rated at 165).
If true, the 390X represents a fairly impressive 85% increase in performance per watt. Of course, IF this is a 20nm card, that means even with a process shrink they are only tied for performance-per-watt with Nvidia, who will of course steamroller them when they produce a 20nm card of their own.
If it's still 28nm, that's both positive, in that AMD at last have an architecture that isn't horrendously inefficient (comparatively speaking) on their hands, and disappointing, in that their new card is still 28nm and we'd have to wait yet another gen to see 20nm. Either way is not ideal, but progress is progress, certainly, and AMD GPUs might be tolerable for heat/noise once again... maybe.
    That of course assumes this isn't just numbers plucked out of the air - given the dev boards were recently sent out though, the timing is viable... we shall see.
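The performance-per-watt arithmetic can be sanity-checked in a few lines (a sketch using the leaked figures quoted in this post; none of these values are confirmed specs):

```python
# Leaked/assumed figures from the post above, not official specs.
perf_290x, watts_290x = 1.00, 280.0   # 290X as baseline index, reported 12V draw
perf_390x, watts_390x = 1.31, 200.0   # leaked 390X index and 12V draw

# Performance per watt for each card, and the relative improvement.
ppw_290x = perf_290x / watts_290x
ppw_390x = perf_390x / watts_390x
gain = ppw_390x / ppw_290x - 1.0

print(f"perf/watt gain: {gain:.0%}")  # ~83%, roughly the ~85% figure quoted
```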
     
  11. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
So Vishera has had a nice price cut. Maybe it's time to experiment. Intel won't go anywhere for the foreseeable future. Plus I have the best Vishera board available :) At worst, it's still far cheaper than buying an Intel board+CPU right now.
     
  12. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    http://www.techspot.com/review/917-far-cry-4-benchmarks/page3.html

Look specifically at the GTX 970: about 71 FPS average in Far Cry 4.

    http://www.techspot.com/review/917-far-cry-4-benchmarks/page5.html

Now look between the 4320 and the first i3: about 70-71 FPS. That's about where my Thuban would score as it sits.

I think this may be the last card I ever use on this CPU. It's not a huge bottleneck in this game, but it's still a bottleneck, however slight. I have achieved equilibrium lol. The Thuban was a monster chip for its time. Took this long to find a real bottleneck :)
     
  13. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    CPU tests for games are still a little variable to be honest, and far from an exact science - those results are believable, and about what I'd expect, but nonetheless you can never be totally sure until you try a game and, if experiencing performance issues, assess how much (or otherwise) your CPU and GPU are being loaded.

    Generally speaking, AMD's top-end Phenom II and higher-end FX-series CPUs are still adequate to play the vast majority of games out there, but you're much further into the 'comfort zone' using a contemporary i5 or i7 quad core. Depends how much of a 60fps nut you are - if you need a solid 60 in everything you play, AMD should have been left behind long ago. For more 'realistic' gamers though, either are arguably still fine - I just see little value in building a contemporary high-performance system using an AMD CPU.
     
  14. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    So just bought one of my favorite games ever today. Gauntlet Dark Legacy for the Xbox. As with many cross platform games, the Xbox version is easily the best.

Gauntlet Legends and its expansion Dark Legacy (more characters and levels) were released in arcades very close together chronologically, so they had a mixture of ports on the consoles. The main distinction to be made is between "Stock" and "Enhanced" ports. The Enhanced ports add an inventory system, which is extremely useful because it allows you to store power-ups for later instead of using them on pickup. It also adds a player shop where basic items can be replenished, and makes a few tweaks to gameplay and balance to suit home consoles. These are pretty huge differences that change the gameplay strategy wildly. Here's a basic rundown of the versions available, in order of release:

    PS1: Enhanced Port. Legends. Proper audio but terrible graphics with a low framerate and very reduced number of onscreen characters. Widely accepted as a bad quality port. Enhanced features are nice but the game is very much butchered by the lack of hardware power. 2 Player only.

    N64: Enhanced Port. Legends. Much downgraded graphics but still far superior to PS1 and with everything intact. Full number of onscreen enemies. Music is rewritten in MIDI audio to save cartridge space for voiceovers. Still sounds good for the N64. Generally considered the best port of Gauntlet Legends. 3 player only but can be increased to the full 4 players with the N64 Memory Expansion Pak. Large part of my childhood :) Recommended Legends Experience

Dreamcast: Stock Port. Legends/Dark Legacy. Arcade-perfect graphics far above the N64, with full quality audio. Includes some characters and levels from Dark Legacy but lacks the Enhanced inventory system from the N64/PS1 versions. Really a downer if you've played an Enhanced port; the inventory changes the entire way you play the game and makes it much more fun for casual play. Technically a truly excellent arcade port, but lacking the one essential feature of console Gauntlet. It does have the unique ability to display in 480p on an RGB-capable monitor, just like the arcade version; none of the other consoles do this. Recommended Arcade-Perfect Experience

    PS2: Stock Port. Dark Legacy. A truly perfect full arcade port of Dark Legacy. This means it lacks the Enhanced features much like the Dreamcast. Well regarded but lacking that all-important inventory. I only recommend the Dreamcast over this for arcade die-hards because the Dreamcast version runs progressive scan with otherwise identical graphics while offering almost all the same levels and characters. This is otherwise a better game due to a bit more content. 4 player available with PS2 multitap adapter.

Gamecube: Enhanced Port. Dark Legacy. Arcade-perfect graphics and sound, with the Enhanced features, but it suffers from bad slowdown in large areas due to the Gamecube EMULATING the PS2 version. All the features are there, but it simply runs very badly on the console. I'd recommend playing the PS2 or Dreamcast version without the Enhanced features, it's that bad. 4 player playable, but the framerate is even worse. Again, it's playable, and Enhanced, so still not a terrible game. Just a bad port.

Xbox: Enhanced Port. Dark Legacy. Absolutely arcade-perfect graphics and sound, rebuilt from the ground up for the Xbox. Features the best framerate of all the versions, plus those wonderful Enhanced features. Easily the best version on 6th gen consoles. 4 player available. Recommended Dark Legacy Experience

This was an extremely interesting and important couple of generations for consoles, the early 6th gen especially. The Xbox version of multiplatform games is commonly drastically superior to its brethren, whether it be 480p/720p or 16:9 support where there was none, better graphics or framerate, or simply more features due to the Xbox being the latest release of its generation.

Star Wars Starfighter Special Edition is an excellent example of this. More ships, better graphics, better sound, higher quality FMVs, etc. The Xbox was both far more powerful than the PS2 and released later, so an enhanced version made great sense to sell Xboxes and provide an ultimately better game. GTA III and Vice City especially were greatly enhanced for the Xbox. It was a big deal at the time. Alongside all-new models for all the people and cars, there were better textures, better framerate, better draw distance, and, my favorite feature, the ability to import custom soundtracks. San Andreas was not nearly as enhanced, with only slightly better textures, framerate and draw distance, since it pushed the PS2 much harder. Still better on the Xbox though.

I highly value my Xbox as the definitive way to enjoy multi-plat 6th generation console games. It was easily the most influential generation for modern games, and the most instrumental in establishing today's monster franchises. For example, NO game on Gamecube or PS2 can touch the visual splendor of Half-Life 2. The Xbox ran it, not smoothly at all times, and not cranked, but it ran playably with all the eyecandy effects. Awe-inspiring for that gen of consoles to even attempt that, let alone pull it off as well as it did.
     
    Last edited: Nov 28, 2014
  15. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Somehow I got pulled out of the loop! :S So, I'll renew my subscription lol

Hey Sam, didn't you have some issue running 4K resolution? What I mean is, weren't you locked at 30Hz for a time? I've been reading of others having issues in that regard. Since I'm in the market for a 3D panel, I thought, why not attempt to get 4K as well. That way, passive 3D will theoretically be 1080p per eye :)

I realize those panels are expensive. I was thinking about waiting until my tax return. Not real hot on the idea, since I'm itching for 3D. But I'd prefer to stay WITH the technology curve! LOL!

I'd really like to run it at at LEAST 120Hz! Is this not currently possible? Or is it simply down to proper cables? I've seen HDTVs boasting 4K@120Hz. But if a PC can't achieve this, I'll be waiting until it can...
     
  16. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Several issues :p

The main difficulty in getting 4K to work at 60Hz on the desktop is that most panel controllers can't handle the bandwidth required, so two panel controllers were used in most monitors (my UP3214Q included), each one handling one half of the screen. The UP2715K (a '5K' screen) actually uses four such controllers, one for each quarter.

This method, known as 'display tiling' (or MST, for 'multi-stream transport'), therefore relies on PCs being able to handle the two separate virtual displays as one display space: enter AMD Eyefinity or Nvidia Vision Surround. This is where the issues arise. Neither of these technologies is fully developed, though Vision Surround is arguably far further advanced than Eyefinity.
    I owned an R9 290X briefly but had to send it back as it performed so badly in eyefinity it was unusable at 4K - far worse in fact than the HD6970s it replaced. The raw power was there, plenty of it, but being in the middle of a game and 'no signal' coming up on a routine basis just isn't livable.
The GTX 970 I replaced it with has its own problems: most of the time it can't boot into Windows with that monitor on its own. It requires either another monitor to be connected (or latterly I find myself using remote desktop instead, as it's easier) to reboot the PC, and only on the second boot will the screen come up. Once there, it's stable, but once in every 15-20 or so times I launch or exit a game, the bezel compensation will error and I'll end up with a duplicated band of my screen on one side. That's easily corrected by opening the control panel and resetting the scaling settings, all of 5 seconds to do, but either way, plug-and-play user-friendly tech this ain't!
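The tiling idea itself is easy to picture: the panel is split into equal strips, one per controller, and the GPU has to stitch the separate streams back into one desktop. A minimal sketch (illustrative only; real tiling happens at the signal level, and the function name here is made up):

```python
WIDTH, HEIGHT = 3840, 2160  # UHD desktop

def tile_for_controllers(width, height, controllers=2):
    """Split the screen into equal vertical strips (x, y, w, h), one per controller."""
    strip = width // controllers
    return [(i * strip, 0, strip, height) for i in range(controllers)]

# Two controllers each drive a 1920x2160 half, as on the UP3214Q;
# a 5K screen like the UP2715K would use controllers=4 quarters instead.
tiles = tile_for_controllers(WIDTH, HEIGHT)
print(tiles)  # [(0, 0, 1920, 2160), (1920, 0, 1920, 2160)]
```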

TVs that say they do 4K at 120Hz will be lying, for the most part. It's simply not feasible to do that yet; you will more than likely find that they support "120Hz mode" (I'll get to the quotation marks in a moment) at 1080p, but not at 4K.
I use the quotation marks because TVs have never supported 120Hz inputs from a device except in very rare circumstances. They achieve 120Hz by interpolating frames at the display side: e.g. they take a 30Hz signal, compare Frame A to Frame B, and insert the corresponding three 'intermediate' frames in between, each representing a proportional change from one frame to the next, to give a smoother effect.
    This makes motion more fluid, but it does not introduce any extra detail, and fundamentally does not mean you get to, for instance, play games at 120Hz. For that you would need a genuine 120Hz PC monitor - those are fairly plentiful, but typically only with TN panels. Personally, I'd take IPS picture quality over a faster response time any day.
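The interpolation described above is essentially a weighted blend between consecutive real frames. A toy sketch of the idea (frames reduced to lists of brightness values; real TVs use motion-compensated methods, not a plain linear blend like this):

```python
def interpolate(frame_a, frame_b, steps=3):
    """Return `steps` intermediate frames blending frame_a toward frame_b linearly."""
    out = []
    for i in range(1, steps + 1):
        t = i / (steps + 1)  # fraction of the way from A to B
        out.append([a + (b - a) * t for a, b in zip(frame_a, frame_b)])
    return out

# Three inserted frames turn each 30Hz frame pair into a 120Hz sequence.
mid = interpolate([0, 0, 0], [100, 40, 8])
print(mid)  # [[25.0, 10.0, 2.0], [50.0, 20.0, 4.0], [75.0, 30.0, 6.0]]
```

Note that every inserted frame is manufactured from the two real ones, which is why this smooths motion without adding any detail.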

    There are some '4K' (technically that should also be questioned, as 3840x2160 is not 4K either, it's UHD!) monitors that support 60Hz over SST (single-stream transport) which means they do not require all the technical complexities of eyefinity or similar, but they are currently only offered with TN panels, which I'm not a fan of, so would rather wait.

4K 120Hz panels are, to my knowledge, non-existent, as the DisplayPort specification to support them is still in development. It will be at least a year or two, I'd have thought, before we see them (might I also point out I don't think there have been any 1440p 120Hz monitors either, let alone 2160p). Also bear in mind that running 3840x2160 requires FOUR times the graphics power that 1920x1080 does. Running 120Hz at that resolution brings it up to eight times.
    Graphics processors simply aren't up to that at present. If you desire things so smooth that you want 120Hz over 60, the level of micro-stutter evident in dual-GPU setups will simply render your efforts pointless, so you'd be looking at eight times the demand from a single graphics card. For modern titles, that just isn't going to work :p
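The four-times and eight-times figures fall straight out of raw pixel throughput (a crude proxy for GPU load, ignoring everything else that scales differently):

```python
def pixels_per_second(width, height, hz):
    """Raw pixel throughput for a given display mode."""
    return width * height * hz

base = pixels_per_second(1920, 1080, 60)      # 1080p60 baseline
uhd_60 = pixels_per_second(3840, 2160, 60)    # UHD at 60Hz
uhd_120 = pixels_per_second(3840, 2160, 120)  # UHD at 120Hz

print(uhd_60 // base, uhd_120 // base)  # 4 8
```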
     
  17. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
Way to smash my hopes and dreams LOL! Nah, I had a feeling your reply would be something like this. I prefer to steer clear of TN panels myself too. After owning the Dell Ultrasharp U2410 (IPS), I could never knowingly go back! I have read reviews of genuine 120Hz monitors that game really well, but it still feels like outdated tech to me. It has to be IPS, or no deal. And at this time, I would never take anything less than 120Hz/3D either.
Despite your explanation, I may opt to become a guinea pig :S lol. Not sure when yet; I do have significant paydays coming up, so I'll consider it then. But my tax return should be decent as well. (However, a few months out...)

Whether 4K (UHD) is ready for mainstream or not, I would benefit from the real estate. I always like a larger work space :D If I could find one that pulls less than 55W, I'd really be sitting well, since the Dell typically pulls ~55W.
     
    Last edited: Dec 1, 2014
  18. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
Well, that's it: going from the 3007WFP to the 3008WFP, which were both IPS, and now the IGZO UP3214Q, plus having IPS auxiliary monitors in the form of two U2312HMs, TN panels just look terrible to me. I mean, fine if you need a cheap monitor and don't much care, TN is still very cheap (though with Iiyama's IPS displays now sub-£100 for businesses, even that logic is questionable now!), but if you're going to spend big bucks on a '4K' display, half-measuring it with a naff TN panel just seems like a colossal waste.

    I'm not currently aware of a single 120Hz IPS monitor that has ever existed at any resolution/size - feel free to correct me if I'm wrong on that, but you may be waiting a while!

    Is the wattage of the monitor really that important compared to the rest of your hardware? Not wishing to open up the whole Intel vs AMD can of worms, but if power consumption really means that much to you and you're still running an AMD CPU, especially if overclocked, you're easily using more than 55W over and above the equivalent Intel...
     
  19. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
Nah, I don't have my 1090T overclocked. Six cores do plenty fine at stock when encoding x264. Substantially better than the 965 I once had. If I can save on the electric bill, I'm all for it. I try to be green :p Besides, my northbridge runs uncomfortably warm when overclocking. I may try again on the next board, which I also want to consider in February.
I see your point :( Patient I must be. True 120Hz IPS panels seem to be scarce/non-existent.

    Probably just gonna go with a cheapy 3D passive monitor, to get me by for now. Been strongly considering this one.
    http://www.lg.com/hu/monitorok/lg-DM2752D
I refuse to bow to Nvidia and their infernal active technology! :p

Seems to be low wattage too! 35W, if it's stable at that.
     
  20. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
My i5 3470 is now more than two years old, but not a great deal has changed since it was released: the i-series CPUs have got a little more energy-efficient still, but admittedly not a great deal faster. Still, at less than 80W all-in for both the CPU and the chipset (which of course is in the CPU), the 3470 performs broadly equivalent to how my 750 does at 3.8GHz. Where AMD arguably does best versus the i5, on second-pass x264 video encoding, that still makes it 20% faster than a 1090T, at pretty much half the wattage.

I might also point out that current-gen (4000 series) i5s idle at less than 5W. The heat output of a modern Intel build, until you add a substantial graphics card and/or mechanical disk drives, is almost nil. Furthermore, due to the low idle power consumption of modern graphics cards, a system kitted out with a 500GB-odd SSD for everything that matters, a single slow 4TB drive like a WD Red for bulk content, a modern i5 CPU, 16GB RAM and a pretty much top-end graphics card like a GTX980 can, when idling, pull as little as 45W: barely more than a Wii U games console, and certainly far less than something like the PS4 or Xbox One, let alone a powerful gaming PC of old. If you have a system running almost 24/7, it's certainly something to think about.
     
