
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    When gaming my head is roughly 2 feet from the screen and both monitors adequately fill my immediate field of view on their own. That being said, the difference between 4K and 2K textures is significant for me, as I DO spend some of my time examining things up close. Skyrim in particular benefits hugely from texture and mesh mods as the game is very up-close and personal. It actually encourages scrutinizing the environment around you to look for small details. You don't need a 4K monitor to get the effect of 4K textures, though a monitor with good pixel density such as my 1920 x 1200 24" or better is recommended. A 1080p monitor is somewhat limited in that regard. Okay for gaming, but the sharpness and clarity aren't even in the same ballpark. High pixel density makes good textures really SHINE. 2560 x 1600 in particular is great for that. I have spent a bit of time gaming on a 30" 3007WFP. They are no joke, even though my 2407WFP is already far better than the average monitor.

    As far as FXAA goes, no way, ewww. The only implementation of it that I like so far is being able to turn it to its lowest setting in the Battlefield games. Then it looks quite good and can be a passable substitute for real AA.

    Every other game simply has it turned up WAY too high at default which makes it seem as if someone smeared Vaseline over the screen. It's a useful effect, but needs to be used extremely sparingly as it blurs EVERYTHING on the screen, in contrast to AA which just blends edges.

    Mind you FXAA is noticeably less terrible with my 2407WFP vs the ASUS 1080p display. Low pixel density + FXAA = terrible. In that case, real AA is the only way to go.
     
    Last edited: Dec 15, 2013
  2. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    We're not talking about viewing distance here. Viewing distance obviously affects crispness: the further you get from any image, the crisper it looks. If you're playing with the same source resolution, the monitor resolution makes little difference. I say little and not none because, as you might expect, upscaling 1920x1080 from a film, for example, will look slightly better on 3840x2160 than on 2560x1600, since no interpolation is required - 3840x2160 is an exact multiple of 1920x1080, so you simply get a 2x2 square of the same colour for each original pixel, rather than having to try and interpolate half-pixels, which 'smudges' the image in the same way as zooming in on an image in Windows Photo Viewer.
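    If you want to see that in arithmetic terms, here's a quick Python sketch (just an illustration of the scaling maths, not how any real scaler is implemented):

    Code:
    import numpy as np

    # A 1080p frame of random colour values, standing in for film content.
    frame_1080p = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

    # Exact 2x upscale to 3840x2160: every source pixel becomes a 2x2 block,
    # so no new colour values are invented and edges stay sharp.
    frame_2160p = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)
    assert frame_2160p.shape == (2160, 3840, 3)

    # Non-integer upscale to 2560x1600: the sample positions fall between
    # source pixels, so the scaler has to interpolate ('smudge') half-pixels.
    xs = np.linspace(0, 1919, 2560)
    print(xs[:4])   # roughly 0.0, 0.75, 1.5, 2.25 - fractional source positions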
    As it stands, the higher pixels per inch of the 4K display, assuming you are looking at a 4K image such as a 2160p video, or more likely your desktop, allows you to sit closer to the screen and still get the 'far away' effect. Everything like the start menu and desktop icons is smaller and, as a result, crisper to start with. Sitting in front of the UP3214Q with the standard 23" 1080p U2312HM beside it, the U2312HM looks very fuzzy and low-res. There's nothing actually wrong with it; it's just that seeing the two beside each other makes you realise what a difference it makes.

    Now, back to the game analogy I used. Think about this for a minute. Say there's a wall texture that's 1024x432 pixels, that you're standing fairly close to, but not right in front of. Where your player is currently standing, it might make up a certain proportion of your screen, say 32% of your screen's width and 24% of its height.
    If you had a 1920x1080 monitor, standing in that place, you'd be seeing a 1024x432 texture at only 614x259 - because that's 32% of your screen's width resolution and 24% of its height resolution. Meanwhile at 3840x2160 you can see every pixel of the texture, which will in fact be upscaled, as you're now viewing a 1024x432 texture at 1229x518.
    Apply this to something further away in your game that only makes up 1% of your screen's size in each direction - certainly large enough to be, for example, a unit heading in your direction that you can still see. At 1080p that object is 19x11 pixels, not much to go on about its detail. At 3840x2160, however, that same object will be rendered at 38x22 pixels. That may well be enough detail to get a clear view of what it is that's approaching.
    The resolution of this object's texture is clearly going to be much higher than either of these values, but having that higher screen resolution means you get a much better render of it from a distance. Since, as you say, 4K textures are rare, most objects will be 'sharpened' by a 4K monitor beyond a couple of feet in front of you. Even if most textures are sub-1000 pixels wide and you upgrade from a 1920-wide to a 3840-wide monitor, all the texture needs to do is take up fewer than those 1000 pixels on your screen and you're seeing an improvement. For most games that only takes a very short distance, as most individual textures in a game aren't covering half your screen at once!
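    To make that arithmetic concrete, here's a quick Python sketch (my own illustration, using the coverage figures from above):

    Code:
    def on_screen_size(screen_w, screen_h, frac_w, frac_h):
        # Pixels a surface occupies when it covers the given fraction of the screen.
        return round(screen_w * frac_w), round(screen_h * frac_h)

    # Wall covering 32% x 24% of the screen (source texture is 1024x432):
    print(on_screen_size(1920, 1080, 0.32, 0.24))   # (614, 259) - texture downsampled
    print(on_screen_size(3840, 2160, 0.32, 0.24))   # (1229, 518) - texture upscaled

    # Distant unit covering 1% of the screen in each direction:
    print(on_screen_size(1920, 1080, 0.01, 0.01))   # (19, 11)
    print(on_screen_size(3840, 2160, 0.01, 0.01))   # (38, 22)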
    On the other hand, as Jeff said in his post, there may be occasions when you want to view stuff really up close - as in walk up to a wall until you can go no further, such that the wall texture fills more than 100% of your screen. Most games don't actually provide wall textures at very high resolutions at all, which is most evident when you see safety notices on the walls, walk up to them and discover they're too blurry to read. The advantage of having 4K textures in a game for people who don't have 4K monitors is that you can do things like this and still have a clear image in front of you. I am very much in favour of better textures for games, as they're still the most tangible indicator of 'good graphics' you can get, and far from the most demanding on hardware. Trouble is, they're also the most costly for the devs.

    Now to the displayport debate again, so you understand what you're dealing with.

    Displayport allows daisy-chaining off ports using a hub, much the same way as USB allows you to use hubs to expand the number of ports you have. This technology is called MST. Previously, you could buy MST hubs and run three displayport monitors off one port on your graphics card (as it's rare for graphics cards to come with three displayport connectors). This was useful for people using eyefinity as it meant they could use displayport for all three monitors. Mixing and matching monitors with and without displayport in eyefinity was always troublesome because different timing methods are used, and the displays could go out of sync with each other.
    As it stands right now, there is no image handler chip that can handle a single 3840x2160 stream at 60Hz, only 30. The way the Dell UP3214Q overcomes this is an internal MST hub. As far as your PC is concerned, a Dell UP3214Q (along with the other 31.5" 4K monitors on sale from Asus and Sharp) is two 1920x2160 monitors, one for the left half of the panel, and one on the right.
    Now, the "left" hand display processor supports 3840x2160 at 30Hz, so if you set the monitor to 30Hz mode (Displayport 1.1) in the menu, you can still get 4K without using eyefinity, just with a 30Hz refresh rate. This isn't very nice to use, but if you have a device that doesn't support the newer displayport standard you want to connect, this will work.
    In order to get 60Hz though, you have to use both sides - to get them to appear as a single monitor you need to use Eyefinity to merge the two displays together.
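    If it helps to picture it, here's a tiny Python sketch of how the tiling works (just an illustration, not anything the monitor or driver actually runs):

    Code:
    import numpy as np

    desktop = np.zeros((2160, 3840, 3), dtype=np.uint8)   # one full 4K frame

    left_tile  = desktop[:, :1920]    # what the "left" 1920x2160 display receives
    right_tile = desktop[:, 1920:]    # what the "right" 1920x2160 display receives

    assert left_tile.shape == right_tile.shape == (2160, 1920, 3)
    # Eyefinity stitches the two halves back into one 3840x2160 surface,
    # which is why the OS would otherwise see two separate monitors.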

    With regard to AC4, two observations:
    - Increasing Anti-Aliasing never increases the CPU load. If you're seeing higher CPU usage, something else may be going on with the game such as your current location. If you are seeing a drop in GPU load making you think the CPU is working harder, the GPUs may be running out of video memory.
    - It's impossible to distinguish any quality difference in the thumbnails provided. To really judge the differences in quality between the screenshots you'll have to post them at their original resolution, rather than 900x600 odd.

    As for Jeff's comments on dpi, DPI is king, it really is -

    42" 1920x1080 (e.g. HDTV): 52.5 dpi -> 2.75k pixels/sq in
    32" 1920x1080 (e.g. HDTV): 68.8 dpi -> 4.74k pixels/sq in
    31.5" 3840x2160 (e.g. UP3214Q): 139.9 dpi -> 19.6k pixels/sq in
    30" 2560x1600 (e.g. 3007WFP): 100.6 dpi -> 10.1k pixels/sq in
    27" 2560x1440 (e.g. U2711): 108.8 dpi -> 11.8k pixels/sq in
    24" 1920x1200 (e.g. 2407WFP): 94.3 dpi -> 8.9k pixels/sq in
    23" 1920x1080 (e.g. U2312HM): 95.8 dpi -> 9.2k pixels/sq in
    21.5" 1920x1080 (e.g. S2240L): 102.5 dpi -> 10.5k pixels/sq in

    You'd be amazed what a difference even the 102.5dpi makes on a 21.5" vs a 23" at 1080p. I set an LG 21.5" up for a customer at my office once and it sat next to my 23" Dell, and I remember thinking 'wow, that's much crisper than mine' - and that was a 14% difference in the amount you could display in a given area.
    Going from a 30" 2560x1600 to 31.5" 3840x2160 is a 94% increase. That's a big deal.
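    If anyone wants to check or extend those figures, here's the arithmetic as a quick Python sketch (my own illustration):

    Code:
    import math

    def ppi(diagonal_in, res_w, res_h):
        # Linear pixel density - the 'dpi' figures quoted above.
        return math.hypot(res_w, res_h) / diagonal_in

    def kpx_per_sq_in(diagonal_in, res_w, res_h):
        # Areal pixel density, in thousands of pixels per square inch.
        return ppi(diagonal_in, res_w, res_h) ** 2 / 1000

    print(round(ppi(30.0, 2560, 1600), 1))             # 100.6
    print(round(kpx_per_sq_in(30.0, 2560, 1600), 1))   # 10.1
    print(round(ppi(31.5, 3840, 2160), 1))             # 139.9
    print(round(kpx_per_sq_in(31.5, 3840, 2160), 1))   # 19.6

    # Areal increase from the 3007WFP to the UP3214Q:
    print(round(kpx_per_sq_in(31.5, 3840, 2160) / kpx_per_sq_in(30.0, 2560, 1600) - 1, 2))   # ~0.93, i.e. the ~94% above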
     
  3. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Hi Guys - happy holidays to all!!

    You're right - I could not tell any difference in those images. Furthermore, running around the game yesterday after a one-week absence (building a W7 + XP tower for my black roommate Daryl), I said to myself, "Well, that's really nice - it must be at the EQAA 8x(16) setting," but it was back at my 4x(8) earlier compromise, two settings down, and the game still looks quite good.

    How do I link to the full-size image? I am using photobucket per Shaff's recommendation of 5 years ago. Do I load the image full-size into there, and then just display the link, plus maybe the 900 width thumbnail?

    So I hear what you guys, Jeff and Sam, are saying about dpi. Your explanation of being able to see more of the wall texture, because of the high res monitor, even if the texture is still back at the 2560x1600 size - that made a lot of sense to me - good job Sam - I think I followed that. Also, thanks for explaining the way eyefinity works - I did follow that very well.

    The customer's monitor looking crisper - that made sense, and the 94% increase of the 4k monitor - huge jump in crispness. Maybe 2 years from now when the foundries have made further reductions in die size the 8000 or 9000 family AMD cards will be fully up to the challenge.

    (I just heard about a new thing - graphene - a 2d graphite structure one atom thick - which IBM used to make an experimental integrated circuit. Maybe that's what will be required eventually to run 4k without using enormous amounts of graphics card power and heat.)

    I haven't been totally scientific about this - but I was logging Core Temp and GPU-Z. At one point, on a Core Temp once-per-second log, there is one line that shows cpu load at 100,100,100,100. Can't ask for more than that from the struggling 9450. Gpu load maxes at around 78% on each gpu. Framerates are down at the higher graphics settings. But reduce the EQAA and framerates rise, gpu load drops, and so does cpu load, to the mid-to-high 90s - no more maxing out at 100%.

    So .... if this data is correct .... is this counter-intuitive? I remember on Sleeping Dogs that I set framerates to 30, to stop the cpu from maxing at 100% on all cores, which was producing wild framerate variations as I drove around town. So higher framerates equal higher processing load, right?

    Yet the AC4 data seems to indicate that despite reducing framerates, increasing EQAA has the effect of increasing cpu load.

    As to whether I am exceeding my vram limit of 3gigs - maybe - I didn't check that. But even if I am - how does that cause my cpu to max? Anyway, when you add it all up, obviously I am cpu limited.

    The lagginess started to bug me, and I dropped back to the lower setting to get the frames up. I discovered that the MSI Afterburner OSD costs me about 2 fps for the gpu information, and another 2 fps for the cpu load information, which comes from enabling RivaTuner - which tucks the cpu core temps and loads inside the MSI OSD. BUT... the hide hotkey now works quite well, and as soon as I hide the OSD I get all my framerate back.

    The effect is particularly dramatic in Furmark, which I use to test my OSD. I have fraps now, for $37, well worth it, and I can run fraps and the MSI OSD at the same time in a 2560x1600 window. Furmark and fraps show 25 fps. I hide the MSI OSD, and now Furmark and fraps show 50 fps - what I used to get before when I was displaying the MSI OSD.

    I don't know what happened on my system, but I never noticed the MSI OSD having any kind of effect on framerates - certainly not cutting Furmark in half. I was always getting 50 - I put that information on the shortcut as part of the name, as my quick test to verify that both cards were truly running in crossfire.

    But suddenly - half the framerate with the MSI OSD. I uninstalled 2.2.3, and re-installed the former 2.2.2, but no change. Maybe a registry key got modified and stayed modified. Anyway, as long as it hides quickly - why should I care? So as of yesterday when all this became apparent, I now hide it in AC4 unless I want to know what time it is, and just rely on fraps for whether or not I can boost my graphics setting.

    As I think I already posted about, I had bought the fraps program because MSI didn't work at all for a couple of recent games, COD: Ghosts and Battlefield 4. Both of them were not as bad as I thought they were going to be, and ultimately I decided they were worth the coin. Ghosts single player had some amazing space station shooting, flying around in zero gravity - twice. And they had some underwater scuba shooting - similar type of feeling - with special bullets and special guns - and I got killed a bunch of times before I mastered it. So there was some good challenge. Those guns are real - I had a hard time believing it - but the skinny long bullets are effective at some little distance - like 30 or 40 feet - which as I say is hard to believe. Each one is like a little torpedo or spear-gun bolt - it actually dashes along for up to 30 or 40 feet according to the game.


    The Russians designed one that carries regular bullets at the same time, and automatically knows when you leave the water to change ammo.

    I should look up the exact specs.

    Oh my god, I did. The small bullets up to 15 meters = roughly 45 feet, the AK size to 75 feet, and a very large dart armor-piercing round can maintain its effectiveness for up to 60 meters - almost 200 feet. What the hell!!

    As for Battlefield 4, the single player again wasn't too bad. I enjoyed parts of it. The multiplayer - I did put about 20 hours into it. But the black uniforms against black shadows renders the enemy invisible - often - and I began to get bugged by that. On the hotel map, I had some good success staying outside and fighting on the perimeter - that was actually fun. But the frenetic inside activity was bleak and more of a twitch shooter feeling - it wasn't my style.

    I actually went back to the Medal of Honor multiplayer and had great success with the repeating sniper weapon. I think I posted about this before. So if I want multiplayer, I'll go back to that, or to BF:BC2 - or even to the BF2 demo, or once in a while to one of the CODs.

    The challenge is running out on AC4 - I'm about 75% through the game arc. Early on I ran around the map and looted soooo many boats, and consequently upgraded my ship so much before coming back to the main story arc, that the things they throw at me now are a piece of cake. Well - still a bit of a challenge - but I'm glad to have the strongest hull, the extra-strength mortars, and the golden super-powerful repeating swivel cannon. I might be ready to take on one of those legendary ships one day. They are very hard to kill, but if you disable a few ships close by, that you can leave sitting there ready to board in order to repair your ship - like a giant health pack :p - that might be a way to finally beat one of them.

    Happy holidays again to all,
    Rich
     
  4. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    I use imgur for my image hosting at the moment, as it's not cluttered with useless features and ads - you just have to learn to ignore all the 'internet culture' images it offers on the homepage from other users.
    I found imageshack became pretty much unusable without paying for an account, and photobucket often only keeps images a few months before deleting them.

    Graphene's been a hot topic for a year or two now, and might potentially offer something to replace silicon, but we won't reach the 'silicon's not good enough' stage for another 5-10 years I don't think.
    Remember the R9 290/290X are still built on a 28nm silicon process, whereas Intel's Haswell CPUs are already on 22nm. Intel only expect issues to crop up once going below 10nm.

    As for GPU/CPU usage and limitations, I think you're pushing both boundaries at once, which makes analysis difficult. The game you're playing overstretches your CPU, and the detail/resolution settings you're using overstretch your GPU at the same time.
    When your GPU is hammered by AA, the frame rate may be lower but more consistent, such that the CPU is always at 100%. When you reduce the work on the GPU, the graphical load may be more variable, so the CPU periodically slows down slightly when graphically intensive operations come in fits and starts.
    Remember also that there are some CPU operations that are not frame rate-dependent.

    Exceeding 3GB VRAM at 2560x1600 is very unlikely. I don't currently know of any games that can do that without mods unless you are supersampling.

    Frame rate drops like that with OSD applications highlight why I try to avoid using them at all costs :) There is no reason why such applications should cause that level of drop in performance. Combined, you're losing a good 10% there, which is the difference between models in a product range.

    NEVER use Furmark as a means of testing frame rate. It will not react the same way to changes in graphical load/capability as games do since it is designed to stress-test the GPU, not provide a measure of performance. The frame rate counter is really there to show the GPU is not dropping frames inconsistently due to overclock or fault-related instability.

    You're not the only one to have complained about the darkness in BF4 multiplayer. I'm personally hanging on until there are fewer bugs, and possibly until I have a card more adept at playing it, but we'll see. For me to enjoy Battlefield games I really need to be in a squad on voicechat with friends.

    Happy christmas to all etc :)
     
  5. ddp

    ddp Moderator Staff Member

    Joined:
    Oct 15, 2004
    Messages:
    39,167
    Likes Received:
    136
    Trophy Points:
    143
    same to you guys.
     
  6. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Happy holidays to everyone. Have fun and stay safe :)

    Treated myself to a new pair of headphones. The Creative Fatal1ty's were finally getting used up. I've repaired the flimsy spaghetti cable about 10 times. They were getting physically and functionally beat. Too bad as they were broken in wonderfully and had beautiful sound.

    Got a nice deal on a Logitech G230 analog headset over at Best Buy with my discount card. Normally $60 retail, with small after-holiday discount, plus a discount from my card brought it to $42. Very nice considering the Creative Fatal1ty headset was about $50. Similar drivers and performance specs to my previous headphones, look a little nicer, more comfortable, maybe a bit more solidly built. Time will tell there as the Creative headset really stood up to some abuse despite the fragile cable.

    Sound quality so far is quite comparable, though maybe starts to distort at a lower volume than the Creative unit. Bass is a little more powerful, but at the cost of some sharpness. Time will also tell there as these 40mm headphone drivers always require a little breaking in before their true quality becomes apparent. The Creatives were the same way.

    Mic is not quite as good. Picks up more background noise, and the sound card does not have noise cancelling. Not that bad or annoying, but something worth noting for some users.

    $50 at time of purchase. Quite an old design.
    http://www.newegg.com/Product/Product.aspx?Item=N82E16826158082

    $42. Normally $60, even at Newegg.
    http://www.newegg.com/Product/Product.aspx?Item=N82E16826104840

    The 7.1-capable G430s were available for the same price, but I've heard them in some detail before and IMO the low price + attempting proper 7.1 (i.e. trying to do too much) = low quality components and sound. The identical, but stereo-only, G230s seemed a much better buy. Also, they're analog. I have good reason to use my analog :)
     
    Last edited: Dec 26, 2013
  7. Ripper

    Ripper Active member

    Joined:
    Feb 20, 2006
    Messages:
    4,697
    Likes Received:
    13
    Trophy Points:
    68
    Comfortably the youngest regular still here checking in.

    Hope everyone had a good Christmas!
     
  8. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    How old are you Ripper? I joined the site at like 15.
     
  9. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    I too joined the forum at age 15, looking for support. Sufficiently disillusioned, my first signature I believe read 'my computer's crap, yours?'
    How times change...
    In February I believe I celebrate a decade at aD.
     
  10. ddp

    ddp Moderator Staff Member

    Joined:
    Oct 15, 2004
    Messages:
    39,167
    Likes Received:
    136
    Trophy Points:
    143
    you've been here a bit longer than i have as i joined oct 15 2004.
     
  11. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    February makes 6yrs for me. But I've been aware of Afterdawn for significantly longer ;) I wanna say, discovery was around 2004. Could be 2005 though. 2003 - 2004 is when I really got involved with my first computer. Dial-up internet :S Patience builder!

    Though certain "essential" softwares were introduced to me, around 2002. Obsolete now... Winmx!
     
  12. Ripper

    Ripper Active member

    Joined:
    Feb 20, 2006
    Messages:
    4,697
    Likes Received:
    13
    Trophy Points:
    68
    My reg date is Feb 2006 and I'm 22 in the first half of 2014, making me 13 when I joined. So 8 years coming up!

    I distinctly remember people assuming I was a lot older though, so I didn't correct them!

    Edit: (Which would explain why I was probably a total PITA to begin with, right ddp!)
     
    Last edited: Dec 30, 2013
  13. cincyrob

    cincyrob Active member

    Joined:
    Feb 15, 2006
    Messages:
    4,201
    Likes Received:
    0
    Trophy Points:
    96
    Feb 15 2006 is when i joined... yes its been a long time since my last visit....lol
     
  14. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Joined in January 2006. This January will be 8 years for me :D
     
  15. ddp

    ddp Moderator Staff Member

    Joined:
    Oct 15, 2004
    Messages:
    39,167
    Likes Received:
    136
    Trophy Points:
    143
    Ripper, not as bad as doggybot was.
     
  16. Ripper

    Ripper Active member

    Joined:
    Feb 20, 2006
    Messages:
    4,697
    Likes Received:
    13
    Trophy Points:
    68
    Before my time I think, don't recall the name, but it doesn't look like he made the cut!

    I've never been banned - on purpose anyway! Haha.
     
    Last edited: Dec 30, 2013
  17. ddp

    ddp Moderator Staff Member

    Joined:
    Oct 15, 2004
    Messages:
    39,167
    Likes Received:
    136
    Trophy Points:
    143
    banned him jan 3 2006.
     
  18. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    I'd really like to understand how a 60Hz monitor/TV is capable of being 3D compliant.
    http://www.newegg.com/Product/Product.aspx?Item=9SIA3PC1711414
    Further, how HDTVs that have 120Hz+ can be incapable of 3D. I was under the impression that 120Hz was essential to the standard. Clearly not. I understand how some TVs would not have the ability, based on the selling schemes of companies.

    Clearly I don't understand current 3D technology :( Is 27 inches large enough for a desktop environment for 3D viewing? I once heard the larger the better, but surely there's a point where one has reached overkill ;)

    There has to be a way to simulate 3D on my current Dell U2410 - essentially, make the software believe it is 3D capable. I know that AnyDVD HD has a setting, "Simulate Connected 3D Display", but I'm not sure that's what I'm asking for.

    Looks like Newegg is mistaken about the component/composite connections...
     
    Last edited: Jan 5, 2014
  19. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    3D only needs 30Hz per eye - all TV/film content is 24Hz or 30Hz, remember - so 60Hz screens can display it.

    The reason you can't do it on your U2410 is that the display still needs to support polarisation for the glasses to work.
     
    Last edited: Jan 5, 2014
  20. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    So that's it. I guess that helps me understand a bit. Thanks :)
     
