
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Dropping to 1080p has solved my performance problems. All of them. The larger screen also provides a superior viewing experience. It's enough resolution for me to not really mind the difference. The size FAR outweighs the loss of resolution. As mentioned above, if TVs were a lower standard than 1080p, they would suck for this purpose. It just happens to be enough to give a crisp experience at this size and allows comfortable desktop real estate. I wouldn't try anything larger because 40" is nearing the limit of acceptable. A 42"-46", even a good one, would likely look much worse than a 40". A 50" or 55" would be plain awful. 32"-40" is fine though. I aimed as large as I felt would be comfortable. My ideal size would be 37" but those are somewhat uncommon in LED.

    Either way, I now get incredible performance with maxed settings where I used to need medium or high settings. Most games ran stupendously fast at 1440p with everything cranked plus AA. A few were just frustrating though and they were ones I've wanted to play for some time. Playing games not maxed sucks when you've just recently made several major upgrades. I'm having no second thoughts about the switch. It's something I've thought about doing for a couple years, and have been considering heavily since I got the 970s.

    BTW it works wonders in Dirt Rally. Size AND performance were needed in that game. Really wonderful to play in-car with a racing wheel now and much smoother with higher settings. I love it.
     
    Last edited: May 27, 2016
  2. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    That completely makes sense.

    At your normal viewing distance of about 4 feet, with your home theater design and large, powerful speakers - as opposed to my crowded "right on top of the monitor wearing headphones" setup - whether the human eye can see the difference between 1080p and 1440p is called into question. I am not really sure that it can.

    You would know more about that than I would.

    When you say "crisp" - that's the issue.

    We actually recently picked up a Black Friday 4K Samsung smart TV for about $580 - I think it's a 46".

    Some months ago, our one paying roommate got a "free 1080p" from a neighbor across the street - it always took a long time to power up, and after 10 years, with 5 years of that hassle, they were giving it away. We snagged it and set it up in a corner of the sunroom on a solid wall to your left as you walk in, where it doesn't affect the viewing windows at all.

    When that TV died after 3 months, and my brother happened to be visiting from Korea, we decided to all chip in and pick up a Toshiba for $480 on a Black Friday special - then, when my brother and roommate were over at Best Buy, they came across this thing for another $100. I knew my TV-knowledgeable roommate was all interested in 4K - he kept talking about it - so I thought it would be a good idea.

    When I go in the sunroom some evenings, I sit about 8 feet from it - I stepped it off just now - and I cannot tell any difference in crispness versus the 1080p Toshiba 47" in the kitchen, where I also sit about 8-10 feet away on the other side of the large dining table. They both look crisp with the PowerDVD 13 extra sharpness and upscaling to 1080p.

    Of course I realize that the content is not 4K - it is in fact 720p videos that I upscale with PowerDVD 13. I am convinced that a 3 gig video at 720p looks better than a 3 gig video at 1080p, because of the PowerDVD 13 upscaling. So I can store 3 gig 720p videos instead of 10 gig 1080p videos.

    Anyway, the Samsung is supposed to have a built-in upscaler to 4K. Both displays look crisp.

    We have put on some cable 4K material, and I have seen the extra sharpness, and yes it is pretty cool. If I one day got a 4K video - would I be able to see the sharpness difference from 10 feet away? I don't know.


    So I am really glad to hear that you have optimized your home-theater gaming experience, gone back to your prior 1080p, and all your memory limitation problems have disappeared - and now you can enjoy your considerable modding upgrades at fully maxed-out settings. That's awesome!

    Rich
     
  3. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Yeah there are downsides to HDTVs for desktop use, especially if you demand a razor sharp display. Not even the best HDTV is going to give you that. Particularly with 4K TVs you need to pay out the ass to get a 4:4:4 Chroma display that isn't fuzzy up close. Most 4K TVs are 4:2:2 or 4:2:0.

    These numbers basically represent the quality of reproduction and separation of color for each individual pixel. If you look closely, a 4:4:4 chroma subsampling TV will have very sharp and defined pixels, while a lesser TV will be a bit fuzzy with some slight color bleed and aliasing. The idea is that the human brain detects differences in color much less easily than differences in brightness. So manufacturers often make panels that display higher quality information for luma (brightness/luminance) than they do for chroma (color). It usually works out pretty well, as even lower quality displays can look very sharp indeed. Very large displays (50"+) start to need the full 4:4:4 though, even at far viewing distances.

    [Image: side-by-side chroma subsampling comparison]
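    To put some rough numbers on that, here's a minimal sketch (Python, using numpy, with made-up frame data and a simplified planar layout - not how any particular TV implements it) of how much chroma information 4:2:0 throws away relative to 4:4:4, and why the chroma planes end up slightly soft when they're stretched back out:

    Code:
    import numpy as np

    h, w = 2160, 3840                 # a 4K frame
    luma = np.random.rand(h, w)       # Y: always kept at full resolution
    cb   = np.random.rand(h, w)       # Cb: blue-difference chroma
    cr   = np.random.rand(h, w)       # Cr: red-difference chroma

    # 4:4:4 keeps every chroma sample; 4:2:0 keeps one per 2x2 block of pixels.
    cb_420, cr_420 = cb[::2, ::2], cr[::2, ::2]

    samples_444 = luma.size + cb.size + cr.size
    samples_420 = luma.size + cb_420.size + cr_420.size
    print(f"4:4:4 samples per frame: {samples_444:,}")
    print(f"4:2:0 samples per frame: {samples_420:,} ({samples_420 / samples_444:.0%} of 4:4:4)")

    # For display the chroma planes are stretched back to full size, which is
    # where the slight color bleed around sharp edges (desktop text) comes from.
    cb_display = np.repeat(np.repeat(cb_420, 2, axis=0), 2, axis=1)

    Running that shows 4:2:0 carrying only half the samples of 4:4:4 per frame, with all of the loss coming out of the color channels.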

    With some fiddling the lower quality panels are passable enough for desktop use, and more than good enough to enjoy games at ultra settings. Even detailed RTS games have enough detail and clarity to enjoy fully. Recently been playing World in Conflict again, and it's AWESOME on a large screen. The graphics are still pretty great. So it's really a minor nitpick having a cheap panel. If I were serious about going 4K, I'd want somewhere in the size range of my current display or larger with 4:4:4 chroma subsampling. But then I'd want a pair of GTX 1080s, a matching RAM kit to double my RAM, and a new PSU at the very least as well.

    For home theater or TV viewing, only enthusiasts or the well-to-do should really be concerned about panel quality. As your experiences have shown, with a good quality source, most TVs tend to work out pretty evenly.

    Most of my films are 1080p rips with the majority being 13-17GB for live-action films and 10GB-ish for animated films. Basically the most compression you can get before impacting image quality noticeably. Very clean video quality at those sizes, indistinguishable from a Blu-Ray at normal viewing distances, and almost always have the HD surround track. Sometimes a hair imperfect with your face right in the screen, but even I would be hard pressed to tell the difference unless I knew beforehand. Good encoders can minimize the impact of compression.

    I usually aim for having DTS-HD MA or Dolby TrueHD in my rips, but the more basic DTS and Dolby Digital are acceptable for most movies. The basic Blu-Ray DTS track in particular is double the bitrate of DVD DTS, so it's pretty good even if it's not lossless like DTS-HD. That makes it VERY common in rips that want to save space for video quality while still giving a "better than DVD" sound experience. If you ever see a 1.5 Mbps DTS track on a film, it's definitely the core of a DTS-HD track ripped from a Blu-Ray.

    I reserve some space for particularly special movies such as Lord of the Rings, Star Wars, Saving Private Ryan etc. Those are usually 20-35GB rips depending on the film, which are the full, untouched Blu-Ray video stream. At that level they almost always come with the full HD surround track (sometimes even with the track re-encoded into 5.1 FLAC for easier surround decoding), and are entirely un-altered so you can shove your face right into the glorious scenery. They do take some crazy storage space though - 100GB a set for the LoTR and Hobbit trilogies, untouched.

    For small file sizes, 720p is going to be far better. Less space is wasted on resolution and more is used for color information and audio quality. A 3GB 720p will look much cleaner than a 3GB 1080p on a large screen, and will likely include better sound. On a smaller screen, it might be harder to tell the difference. My usual rule is to use 720p if the file is smaller than 10GB and 1080p if larger. 1080p in small files usually looks grainy and has compression artifacts during fast motion scenes. Small 1080p rips usually have crap quality audio too - 128kbps stereo or 384kbps 5.1, which is bearable but not ideal. Have seen some 192kbps 5.1 surround as well and it's awful. For a storage-conscious collector, 5-8GB 720p rips are a pretty popular niche among content uploaders. They usually include DVD quality Dolby Digital or DTS and are far more than enough to enjoy the movie on an HD screen without using way too much space.
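    For the curious, here's a minimal back-of-the-envelope sketch (Python; the 2-hour runtime, 24 fps and ~640 kbps audio reservation are illustrative assumptions, not measurements) of why a 3 GB 720p encode has so many more bits to spend on each pixel than a 3 GB 1080p one:

    Code:
    def bits_per_pixel(file_gb, width, height, minutes=120, fps=24, audio_kbps=640):
        # Everything that isn't audio is assumed to be video payload.
        total_bits = file_gb * 8e9
        audio_bits = audio_kbps * 1e3 * minutes * 60
        video_bits = total_bits - audio_bits
        pixels = width * height * fps * minutes * 60
        return video_bits / pixels

    for label, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
        print(f"3 GB {label}: {bits_per_pixel(3, w, h):.3f} bits per pixel")

    Under those assumptions the 720p encode gets roughly 2.25x the bits per pixel, which is why it tends to hold up better in fast motion at the same file size.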

    EDIT:

    The same kind of principle applies in video games. Things like textures and other game elements are usually compressed or reduced from their original source to some degree. If done well, the difference can be indistinguishable. Having dropped to 1080p, Skyrim uses much less video memory, and many of my highest resolution textures can be swapped for their optimized or lower resolution counterparts with little change in visual quality. The savings are compounded, which means better overall stability and performance in the game.

    Skyrim is very demanding when modded, and possibly unstable, because the massive world causes constant issues with memory constraints. It's not a 64-bit program, and it has a quirk where, in stock form, it counts your video memory usage against its 3GB cap. Utility mods facilitate moving your video memory use back to the video cards, among other things. In stock form this never really becomes an issue as long as you have more than 4GB of RAM. Modded, however, resource usage shoots through the roof, with the main process using ~2GB and video memory quickly approaching 4GB with a heavy configuration. When choosing to add textures you need to be conservative, and using a memory patch is almost unavoidable if you plan to use more than one or two big texture packs.

    It's not like the game is constant BS to run modded. You just have to read the mod documentation and know what you are adding to the game and how it will affect it. With the proper memory patch mods in place, it is entirely video card horsepower and video memory limited. When I am done with this project, the game is possibly going to be more stable than it was stock, as long as you have a 3.5GB GTX970 or better :p That would make it easy to run AA on an 8GB card in the future.

    Most of my mod textures are 2K. Bethesda did add a bunch of nice 2K textures in the high resolution texture DLC, but the art and compression quality on the modded textures is superior, so the DLC textures are overwritten despite being the same resolution. The rest of the replacements are mostly 1K replacing 512, or 512 replacing 256. I essentially aimed for double the stock resolution on smaller things, and a more consistent 2K for the main world. It worked out like this:

    A very few large environment features such as mountains, and all of the Dragons - 4K

    Environments, architecture, characters, creatures, armor, weapons, equipment, i.e. the primary assets - 2K

    Smaller armor pieces, i.e. gauntlets, boots, etc., people's hair, tools, larger plants, items on tables and desks, potions and food, etc - 1K

    Small environmental details, small plants, bugs, people's teeth and eyes, small objects, etc - 512

    My original mod configuration leaned much more toward the high end of that scale. The newer configuration will lean more toward 2K files as the max. Only a very few 4K files have been selected, as noted above. I took several files from 4K back down to 2K, and many more down from 2K to an optimized version of 2K with 1K normal maps underneath. Nearly the same quality, with significant video memory savings over a large number of files. I'm still trying not to use too many 512 textures, but some very small items can take it easily, so it's a process. If I had known what to look for to begin with, I would have been much more conservative and had a far easier time of it. Simply bringing many small things up from 256 to 512 and from 512 to 1K makes a drastic difference, and barely affects your video memory.
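    To give a feel for the scale of those savings, here's a minimal sketch (Python; it assumes uncompressed 32-bit RGBA plus roughly a third extra for mipmaps - real Skyrim textures are DXT-compressed, so absolute sizes are much smaller, but the 4x jump per resolution step is the same):

    Code:
    def texture_mb(size, bytes_per_pixel=4, mipmaps=True):
        # Square texture, optional ~33% mipmap chain overhead, reported in MB.
        base = size * size * bytes_per_pixel
        return base * (4 / 3 if mipmaps else 1) / 2**20

    for size in (4096, 2048, 1024, 512, 256):
        print(f"{size:>4} x {size:<4} ~ {texture_mb(size):6.1f} MB")

    On those assumptions a single 4K diffuse plus 4K normal map pair costs around 170 MB, while a 2K diffuse with a 1K normal map is closer to 27 MB - which is why trimming just the handful of largest textures frees up so much more video memory than touching the 512s ever could.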
     
    Last edited: Jun 7, 2016
  4. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Well, 3 days later than it was supposed to be here, my GTX1080 is finally installed (never mind that on top of the pretty steep $750 MSRP over here, I was still charged the equivalent of $30 for 'free Saturday delivery' - when it didn't actually arrive until Tuesday; I have sent the retailer an email about that). As for the card itself, how is it?
    Well, it all seems to work OK. Under normal conditions it's very quiet, however a bug with fan speeds on all GTX1080s, which seemingly nobody at nvidia picked up during testing, means that you do get occasional sporadic bouts of fan noise - nothing excessive, but it's rather reminiscent of the fan noise you get when first booting a machine, except happening at random mid-game.
    https://www.techpowerup.com/222895/...edition-owners-complain-of-fan-revving-issues

    That's a minor technicality though. The GTX1080 also takes another step down the chain of MST support, or lack thereof. You may recall that when first acquiring the Dell UP3214Q the tiled display proved almost a non-issue with the crossfire HD6970 setup it was originally paired with - it caused problems with Crossfire on some titles (but then what didn't?) but was largely stable at the desktop. Then came the short-lived R9 290X, short-lived because it was very unstable with MST and would lose sync at random after only a few minutes' use, repeatedly, which would cause games to crash when they realised no monitor was connected. The GTX970 was a little better, being largely stable in game, only losing sync mid-use every few weeks, but did have to be rebooted between 1 and 5 times almost every time the PC was booted, or screen resolution switched (e.g. launching and closing full-screen games), which was very tedious.
    The GTX1080 makes things simpler: it does not support the UP3214Q at all. To see anything other than the 'MSI' BIOS splash on the screen, I need to use the HDMI connection - and with no HDMI 2.0 on the UP3214Q that means 30Hz at 2160p or 60Hz at 1080p, which I am currently switching between depending on desktop (former) or gaming (latter) use. Honestly, though, I think I'll put the GTX970 back in tomorrow.
    To mitigate this I've ordered a UP3216Q to replace the 3214. This was a step I had intended to take even if the 1080 duplicated the 970's behaviour - 18 months is long enough to put up with those MST issues - but it's frustrating that, thanks to the 72 hour delay in receiving the card, the replacement monitor will be at least another 36 hours away before the card can really be used properly. It also of course forces the issue of spending the equivalent of $1030 on the new display at a different retailer rather than waiting for them to come back into stock at the normal MSRP ($950). There's also the issue that nvidia's drivers are utterly hopeless at using HDMI, continuously altering colour settings and disabling the 'digital vibrance' feature any time a game is opened or closed (and not re-enabling it afterwards), leaving colours looking very washed out. My concern is that HDMI 2.0 will duplicate that problem, but perhaps DP1.4 over single-stream will do the job. Time will tell I suppose....
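    For anyone wondering why the HDMI 1.4 port tops out at 30Hz for 2160p, here's a minimal sketch of the bandwidth arithmetic (Python; the ~11% blanking overhead and the usable-payload link rates are rough approximations, good enough for the comparison but not exact spec figures):

    Code:
    def video_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.11):
        # Approximate payload rate for an 8-bit RGB signal including blanking.
        return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

    links = {"HDMI 1.4 (~8.2 Gbps usable)": 8.2,
             "HDMI 2.0 (~14.4 Gbps usable)": 14.4,
             "DP 1.2 HBR2 (~17.3 Gbps usable)": 17.3}

    for hz in (30, 60):
        need = video_gbps(3840, 2160, hz)
        fits = [name for name, cap in links.items() if cap >= need]
        print(f"2160p{hz}: needs ~{need:.1f} Gbps -> fits on: {', '.join(fits) or 'nothing here'}")

    2160p60 comes out around 13 Gbps, comfortably over what HDMI 1.4 can carry, which is exactly the 30Hz-or-1080p trade-off described above.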
     
  5. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Hey Jeff,

    Those photos were very helpful. Reading left to right I could clearly see the quality improve on color and on gray scale.



    It is very funny that you mention Saving Private Ryan as one of your special movies. I too have that on a pretty good rip, at 720p - I think around 5 gig.

    I pulled that movie out a few days ago after reading about Vin Diesel, and how he made a short 20 minute film called Multi-Facial, about his audition challenges.


    By the way, I found the DVD with that short on it, and bought it through Amazon from Goodwill, for $4. I'll rip it, and I'll have it in case you or Sam or Kevin might be interested in watching it - anybody who is sort of a Vin Diesel fan as I have become, will probably like it.

    Steven Spielberg reportedly cast Vin Diesel in Saving Private Ryan after having watched that short. When I read that I thought - "I sure don't remember him in that film." (spoiler alert: As you know Jeff, he is the one that gets killed very early on as they go looking for Matt Damon, after grabbing that little girl from the French couple because she reminds him of his niece, right after Tom Hanks grabs the kid away from him. A sniper up in the church tower nails him and he lies there groaning for about 3 minutes.)

    Vin Diesel, as you know, is better known today for the Fast and Furious films, and also the Riddick films.

    I am a big fan of Avatar, and after realizing that Michelle Rodriguez, the helicopter pilot from Avatar, starred in the Fast and Furious titles, I ended up watching that series and appreciating Vin Diesel, whom I saw a long time ago at the theater in The Chronicles of Riddick, which I thought was pretty good. My brother who was with me didn't like it so much, but I did enjoy it.

    Through Michelle Rodriguez, who I think is really cool to watch, I also got into the Resident Evil series with Milla Jovovich, after reading that Michelle Rodriguez starred in the first (and last) of that series. I thought that series was good also, particularly the last one with the little girl who thinks Jovovich is her mom. In my opinion, excellent - bringing out a whole other dimension to the Jovovich character.



    I see you agree with my perception that a 3GB 720 is better than a similar-sized 1080p. I have observed the same thing you have.

    For me I benefit from the upscaling that Power DVD 13 does.

    We do have some space considerations. Occasionally, I am forced to use Handbrake to convert an 8 GB 720p or 1080p down to a 3 GB 720p MKV.

    We recently bought a 4 TB external - maybe a year ago - finally upgrading from a 1.5 TB after 3 years. That 4 TB is already half full!

    So yes, I will Handbrake those 8 GB 1080p files down to 3 GB 720p, usually with the default quality of 20. I do a test encode of 1%, starting about two minutes into the video to get past the credits, and see how big the resulting file is - I'm aiming for 30 MB to get the final 3 gigs. Sometimes I'll increase the quality to 18, or to 15. I also have to use Handbrake if I accidentally get an H.265 file - PowerDVD 13 won't play those - Handbrake will give me the H.264 version.
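    The arithmetic behind that 1% test clip is simple enough to jot down - a minimal sketch (Python; it assumes the bitrate of the test slice is roughly representative of the whole film, which is the whole premise of the trick):

    Code:
    def target_test_size_mb(target_gb, clip_fraction=0.01):
        # If the final file should be ~3 GB, a 1% slice should land near 30 MB.
        return target_gb * 1000 * clip_fraction

    def predicted_full_size_gb(test_clip_mb, clip_fraction=0.01):
        # Scale the test clip back up to estimate the full encode.
        return test_clip_mb / clip_fraction / 1000

    print(f"Aiming for 3 GB -> test clip target: ~{target_test_size_mb(3):.0f} MB")
    print(f"A 42 MB test clip -> full encode near {predicted_full_size_gb(42):.1f} GB")

    If the test clip comes out well over the target, back the quality off (a higher RF number); if it comes out well under, there's room to drop RF from 20 toward 18 or 15, exactly as described above.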


    I have said it before and I'll say it again - you are the modding guru.

    I believe I actually installed and play-tested Skyrim. I see that you really get immersed in it. It didn't quite match my style of game - we've talked about this before. I just couldn't get into it. Not sure why.


    ARMA SCENARIOS - SCOPES - CARRY TWO RIFLES - 3X GRENADIER ROUNDS
    Meanwhile I am about 20% into all those Arma 3 community-authored scenarios - missions. Some of them have been really interesting. I came across a new scope - I forget the name - you don't get it playing NATO. It zooms from 5x to 25x in small increments, about 10 steps in all, as you click through your mouse scroll wheel. That is a terrific zoom, but you drop back to 5x when you reload your rifle. Ghetto! And that's not real world in my opinion - why would your optics change on inserting a new clip? The other scopes do not do that. That can actually get you killed in a tight spot.

    But if that's all that is available, then - similar to the TWG scope, which I consider the best in the game - the 5x-25x gives you 3 color options, hitting N for night mode: some bizarre colors arranged around the red/orange spectrum. Hitting N 3 times gives you a color where people glow pink even behind bushes. So this is their counterpart to the TWG, where thermal imaging gives you white hot for the enemy soldiers.

    The TWG scope is 8 as I recall, or 18 on the second click. I might be slightly off on these numbers. The 8 is a little bit much for 50 yard shooting, but it's manageable - however 5x is better. The jump to 18 (or is it 22) is a huge jump, and long distance only. So you can see that the 5x to 25x in a sense is a better scope, except for the fact that it drops back to 5x all the time.

    Oh - the other problem about the 5x to 25x, is that you increase zoom by the mouse wheel, and if you go past the max zoom of 25, the next mouse wheel click actually changes weapons, and you find yourself holding your pistol. Now THAT can really get you killed, lol.

    Anyway, there was one scenario recently, where I led my team up to the top of a pretty tall hill. It turned out to be the hill from hell - enemies were firing at us from 3 directions - from about 300 meters out on their hill, where, in prone, they were very accurate.

    I later left my team at the camp, and went up there alone with a drone. I created a save point just when I reached the top, and then I ran forward about 25 meters, 5 meters off the top, to a slight flat, where I could go prone, and address 3 enemies moving toward me and approaching the base of my hill, where I would no longer be able to target them in heavy tree cover. I had to kill them quickly.

    I repeated that save point, choosing a different scope each time - iron sights (2x), red dot (2x), DTS (5x/12x), TWG (8x/18x), and the 5x-25x, plus the ultra sniper scope, which is 22x/75x as I recall. By far the best was the TWG. If, upon my first shot, they got slightly behind the trees, I could still pick them out.

    The basic red dot was a joke - sometimes I got all 3, but it was hell finding them behind the trees, and sometimes I just guessed correctly. I really needed at least 5x minimum. The DTS 5x/12x was good, but there was no thermal imaging to help behind slight tree cover. The 5x-25x was not too bad - at 5x I could target much better than at the 2x of the red dot, of course. And I finally learned not to click past the 25 and change weapons to my pistol. The sniper scope was way too much, even at the first 22x setting - certainly at 75x - these guys were about 200 yards out, and 22/75 works better at 500-700 yards and beyond, up to 1200 yards. The sniper scope is interesting - at extreme range, like 1000-1200m, the 75x zoom is no good: you have to lift your rifle so high to compensate for bullet drop that you can target better at 22x, where you can at least barely keep the enemy at the bottom of your scope. (That's why you can even use the TWG scope on a sniper rifle. Its zoom of 18 is not far below the 22 first step of the sniper scope, and with the thermal imaging you can see the targets behind cover - and keep them in your scope, maybe at the extreme bottom of the scope.)

    After killing those 3 guys about 40 different ways testing out all the scopes, I launched the drone, and had a lot of practice moving from the drone map to my regular map, finding guys that were moving in the drone map, and placing them in the regular map where I could do a SHIFT+LEFT Mouse, and put a position marker on their location, then find them coming out of the map.

    And I discovered something really amazing! You actually can carry two rifles in that game, if you have the large carry-all backpack. You can carry the heavy-duty long range sniper rifle, and you can carry a regular assault rifle, even the 100 round version of it that I prefer - at the same time - plus your pistol. I had no idea before that I could move a rifle into my backpack!!! You can also carry an anti-tank weapon with you - pre-loaded - you won't have room for more rockets. They have the fatigue now built into the game, but if you carry your pistol, you can move steadily at that slow-paced run - you don't have to walk which is REALLY slow - and usually the fatigue doesn't happen unless you are doing a lot of climbing.

    Oh, one other thing, I came across the 3x grenadier round. You can fire 3 grenades without reloading. Awesome! I tried it on another scenario attacking a patrol about 400 meters away. But it wasn't as much fun as I thought because the grenade takes so long to shoot that far. I WAS able to create a lot of damage, but ultimately I was much more effective just with the rifle.

    Well, I'll let you go - enough about Arma3. Aren't you the one that got me started on that game?

    BTW I sent you a couple emails.

    - Rich


    EDIT - SAM - CONGRATULATIONS!!

    Sam - wow you jumped into that big bad 1295 performance GTX1080! And you paid a pretty penny for it, and then a monitor upgrade for another $1,000!

    So in about two days you'll see if you finally have solved some of your vexing issues with 4k resolution! Nice!

    I had to chuckle when I read your remarks about Digital Vibrance!! Weren't you the one who said "too cartoonish - not for me - my buddies don't care for it either" about 7-8 years ago, when I was raving about Digital Vibrance while playing the Treyarch title World at War (COD5)? I had JUST gotten the GTX 280, I think, which came with my quad Q9450, and I accidentally discovered digital vibrance while adjusting my monitor - and I LOVED IT! It made the forest-on-fire effects so colorful, and that red tractor in that one field was so vivid - I was so jazzed! They used fire all over that game - in the flame-thrower fighting in the Pacific - everywhere. A very colorful game!

    And then a year after that - maybe two years later - coming back to AMD, I found that the video display settings in Catalyst let you duplicate digital vibrance by increasing saturation and, if you like, by also changing the color temp, which I don't do anymore - but I for sure DO the saturation change, usually from the default 100 to 130.


    So when I read this:


    --- I have to admit, seeing that you are now a digital vibrance fanboy, like me - it gave me a big chuckle!!
     
    Last edited: May 31, 2016
  6. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Hehe - I thought of you when I made the comment, as I do remember you harping on about DV when you started using it years back - guilty as charged. But I have to be honest - I never even realised the setting was enabled until it started messing up just now - it really is just a combined contrast/saturation scale. It has evidently always been kept at the default 50% value, as I've never changed it. However, the picture really looks no different to how it did when I ran AMD GPUs. When that setting is disabled, however, colours are noticeably washed out - compared to the U2312HM to my right running off the HD7770, which is all on default settings. I know the UP3214Q is quite a wide gamut monitor, so I know that's wrong. I wouldn't say I'm a digital vibrance fanboy, but what I do know is that when I switch resolutions on HDMI the colour goes very off, and I fix that by dragging the digital vibrance slider one notch then putting it back to how it was before. For all I know, doing that could be affecting other settings too.

    Looking at the fact you use 130 for saturation on AMD, I'm guessing you turned DV up beyond 50%?
     
  7. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Okay, I see that I can't really call you a DV fanboy - but it's funny just the same!!!! :p

    What is my DV setting?

    Wow, Sam, good question. I have been away from nvidia DV for so long, I forget how the setting works. But wait - I think the computer in my room (as opposed to out here in the trailer) has a cheap nvidia card that I bought JUST for DV (by cheap I mean $7 - that's cheap, right?)

    The desktop tower - the one that animated Left 4 Dead, a Dell - has built-in Realtek video. But the colors were whack on that. I had to have my DV fix, so I just googled and found out which really cheap nvidia cards had DV - they came out with DV so long ago, like what, maybe 2003 or so - that there are a ton of really cheap nvidia cards that have it.

    Wait, I know where the PaperPort folder is, in my collection of folders under Z Internet Orders. I found it - Computers, graphics card, nvidia AGP cheap.

    Total price $6.21. MX200 64MB AGP, DVI card. Wait, that was for a different computer. What do I have in the milesdell?


    Okay, I looked up the specs - I have an nvidia GeForce 6800 with 256MB. I don't know if that is what Miles had in there when he was animating - maybe so. I was wrong in everything I just said - he didn't animate on Realtek on-board video, he animated on this GeForce 6800.


    I turned the computer on - in a minute I'll look up the nvidia settings ......................

    ...........................................


    NVIDIA SETTINGS IN MY ROOM
    So here is the nvidia 6800 desktop of my former 3 GHz P4 everyday tower. I stopped alternating through all the 1000+ wallpaper images for a while when this appeared. For some reason I really like that green with the red pepper.


    For about a year, I have settled on this one wallpaper, instead of rotating through over 1000 really nice images
    [Image: desktop wallpaper screenshot]



    And here are the nvidia control panel settings - I'm only at 20 DV. I can't remember how I set those - I think I did tweak them at one time - they are certainly not too wild, I don't think.


    Doesn't seem to be DV over 50 - it's only 20 - not sure when I set this, but the pepper is a nice shade of red
    [Image: nvidia control panel settings screenshot]




    And here below is my present desktop - that little $100 microtower Compaq with 4 gigs of RAM that I picked up on eBay for my roommate, then liked so much I got one for myself (it's up to 6 gigs now). It's not a P4, it's a Core 2 Duo E7300 - maybe like your computer about 10 years ago, when I was still on my 3.8 GHz P4. Where the frames-per-second conversion speed on my little speed test was 202 for the milesdell, the frame speed on this is 365 per core, and there are two cores, so 730 vs 202 - it's wayyyy faster.


    I had to go away from on-board realtek graphics to get somewhat the same shade of red on that pepper - catalyst saturation 130
    [Image: Compaq desktop screenshot with Catalyst saturation at 130]


    This is the one, remember, that I was debating - remove W7 and move back to XP? You guys said NO! So I have two partitions, an XP partition and a W7 partition, and on W7 I also run XP Mode, a virtual machine. I mostly boot into XP, but I did pick up a copy of Outlook Express for W7, and I am making sure that I can totally run off W7 if I finally take the plunge.

    But here's a strange thing. On XP, I actually see thumbnails in folders for the UFO files, which have live objects, from my PhotoImpact, which I recently upgraded from 4.2 to 13 - the last version before they stopped making it. But in my W7 partition, they show an icon, not a thumbnail. I have no idea why XP started showing thumbnails - was it Service Pack 3, or was it the new PhotoImpact 13, which was written for XP? Probably the latter.

    Anyway, back to our discussion of saturation - the last image is Catalyst, on a $39 graphics card, the HD6450 - my 4th purchase of that terrific passively cooled card, with HDMI, DVI and VGA outputs. It's the home theater card on 3 computers, using the HDMI, but here I'm just running off the DVI. I got the card because the on-board video wouldn't let me match the red chile plant on the wallpaper of the top image.

    As soon as I switched, and amped saturation from 100 to 130, I was able to pretty much match my prior desktop.

    Rich
     
  8. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Lots of promising info coming from AMD about their new CPUs on the AM4 platform. I remain skeptical after the Bulldozer debacle. However, 8 cores and 16 threads with SMT, a 95W TDP, and a 40% boost in IPC sure sounds like a winner. AMD themselves say it closes the gap and is on par with 7th generation i-series CPUs. Possibly exaggeration, but if it can get anywhere near close and maintain good clockspeed scaling, the doubled number of cores alone would make it an upgrade to my 4690K - a sizable one at that. Apparently they had an easier time with gains by focusing all of their R&D on a single CPU core design (like with Phenom II), instead of multiple. Instead of focusing on budget CPUs, they will build the high-end flagship first and let budget parts trickle down as the yields improve. There will also be no dual core version at all, though I imagine hex-core models and the like will appear.
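    Just to put the core-count argument in rough numbers, here's a minimal sketch (Python) that treats throughput as IPC x clock x cores; every figure below is an illustrative assumption (the claimed +40% IPC, guessed clocks, ideal multithreaded scaling), not a benchmark:

    Code:
    def throughput(ipc, ghz, cores):
        # Crude model: total multithreaded throughput with perfect scaling.
        return ipc * ghz * cores

    i5_4690k = throughput(ipc=1.00, ghz=3.9, cores=4)         # baseline quad
    zen_8c   = throughput(ipc=0.75 * 1.4, ghz=3.5, cores=8)   # weaker core + the claimed 40% IPC gain

    print(f"Per-core: {0.75 * 1.4 * 3.5 / (1.00 * 3.9):.2f}x of the 4690K")
    print(f"All cores busy: {zen_8c / i5_4690k:.2f}x of the 4690K")

    Even if the new core only roughly matches Haswell per clock, doubling the core count is what would make it an upgrade for threaded loads like Total War.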

    What I do in the future remains to be seen. Currently my CPU is amazingly overpowered for what I ask of it. It was a tremendous value for what I paid, and it outclasses pretty much everything except Haswell-E CPUs. It has proven a very intelligent buy, even considering how much better is still available. Extra cores and threads may be a must in the future though. Total War: Warhammer has proven that beyond-quad multi-core CPUs are still relevant when the envelope is being pushed. If the new AMD is a solid per-core competitor with reasonable power usage characteristics, I might switch on principle alone. I am a life-long fan, and if AMD goes under that's bad.

    Also the RX480 outperforms my video cards by a decent margin, uses about the same power, and has more than double the memory. Not to mention it'll be more than $100 cheaper at release than they were. I am sorely tempted to go back to Crossfire, but the lack of general support for it is still an issue. SLI has awesome support, and a superior method for running single GPU when the need arises. That may improve for AMD in the future though. They are starting to gain a little momentum, and the industry is starting to back their development policies. Bringing PC gaming to the masses and showing them that it can be affordable is a good thing, even if it dilutes the user base a bit. More money is sorely needed in the aftermarket.

    A video card that brings high end power to everyone is noteworthy. Those are rare. The last ones I can think of that come anywhere near close are maybe the 6600GT or the HD3850, and those weren't nearly as powerful for their respective release dates as the RX480 supposedly is. It's an awful lot of GPU power for $200 - probably more like ~$250 for a premium factory OC'd one with a good cooler.

    The more I use 1080p the more I realize how comfortable it is to use. I can maintain high framerates much more easily. I'm never wondering if the next new game will totally obsolete my GPUs. There's also far less fiddling to stay within my memory limits. Yes the GTX970 is a flawed product, but a true 4GB card wouldn't fare much better. Powerful 8GB GPUs might change a few things for me, but I'm probably going to opt for 1080p with lots of AA and totally maxed eye candy, vs 1440p or higher with compromises. I am able to fully enjoy a game and appreciate its fine intricacies and details at 1080p. Skyrim with high resolution textures(among many games with beautiful graphics) is no less spectacular. Will admit though that 1440 at 32" to 1080 at 40" is a pretty big transition.
     
    Last edited: Jun 10, 2016
  9. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    If all that's true then I'd be very pleased. However, that is a dramatic improvement from where things currently stand, and based on AMD's progress over the past few years I remain very sceptical. A chip with 40% higher IPC and a 25% (or more) reduction in TDP is an enormous improvement - not far off Core 2 vs. Pentium D levels. Anything that alleviates the current monopoly on performance CPUs is good for the consumer, as the current situation is pretty dreary; however, given AMD's CPU R&D has seemed even less effective than their GPU R&D, I honestly question how this has been achieved.

    The RX480 figures you're posting also have me sceptical. The cards are just about as fast as a GTX1080 in Crossfire with full scaling, which, based on benchmark testing I've seen of late, happens even less often now than it did in the past. Almost everybody I've spoken to regarding Crossfire / SLI in recent years says it has been downhill ever since the HD6/HD7 series. It wasn't exactly great with the HD6 series cards, though it was useful enough to justify the cost of two cards that, in their own right, were within 10-15% of the fastest cards available at my resolution. If two of these things can only just keep up with nvidia's best in ideal circumstances, it doesn't matter that the pair are still cheaper: mean frame rate across a common benchmark suite is going to be far lower, and TDP far higher, despite their improvements (2x150W is a lot more than 180W).

    If you're running 1920x1080, perhaps a single RX480 will be sufficient. It is, after all, a quarter of the pixel count of 2160p for only half the graphics power, and the GTX1080 is doing pretty well at 2160p all things considered. The price isn't extortionate and the power usage is reasonable, but it does beg the question: the GTX970 is half the performance of the 1080 as well, it's 145W, so broadly similar TDP, and the only thing it lacks is memory - 3.5GB vs. the 8GB of an RX480 (maybe; I hear the cheaper versions will be 4GB). Is there really any point in switching? At 1080p I wouldn't say it's common that I see 4GB prove insufficient.
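    The pixel-count arithmetic behind that, as a minimal sketch (Python; it assumes GPU load scales roughly with pixels pushed, which ignores memory and geometry but is fine for a ballpark):

    Code:
    resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
    base = resolutions["2160p"][0] * resolutions["2160p"][1]

    for name, (w, h) in resolutions.items():
        px = w * h
        print(f"{name}: {px:>9,} pixels ({px / base:.2f}x of 2160p)")

    So a card with half the GTX1080's horsepower driving a quarter of the pixels has, on paper, roughly twice the headroom per frame.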

    As far as 'high performance to the masses' goes - the GTX970 has fluctuated, but even though our currency over here has weakened of late, it can still be had for £250 at present. It's been around that price for the past 18 months - I only paid that much for mine back in 2014. The MSRP of the 8GB RX480 with tax added comes out at £210 - for a card that's likely to perform very similarly. Sure, it's an improvement, but it doesn't really make an enormous difference to what people can get for their money, I don't think. It just means people that are spending £200-odd can get something far more power-efficient for their money versus the power-hungry R9 300 series.

    I totally agree about a big screen providing immersion with games - I'm happy with 32" as a compromise. I was reluctant to eschew 30" 16:10 for 32" 16:9 initially, but having seen how much better TV & film look on it I've got used to it, and now it doesn't bother me at all. I'd struggle to fit anything larger in here, and I predominantly need the higher resolution for desktop work, not gaming space. For that it's extremely useful; even 1440p monitors feel comparatively claustrophobic. The fact that games look stunning at 2160p is an added bonus, but really just that. There is no denying, however, that running 1920x1080 max detail with smooth frame rates these days is a cinch compared to how it used to be. 1440p has become relatively easy with top-end cards; 2160p is really the only place the big difficulties lie, but I have to say so far the GTX1080 is doing an admirable job of it, especially considering its power footprint.
     
  10. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Just gonna put this out there. Tomb Raider benchmark, absolute max settings with 4xSSAA, 1920 x 1080, TressFX ON. Minimum 56FPS, Average 96FPS, Maximum 126FPS. There's an awful lot to be said for being able to run vsync with 4xSSAA in Tomb Raider. I will always run vsync if at all practical. It's like a drug for me. And with 4xSSAA jaggies are scarce, even at 40" and 1080p. Many games could do it at 1440p. At 1080p, the sky's the limit. Basically Skyrim, Oblivion, and Crysis 3 are the only games that won't lock to 60 maxed. Crysis 1 shows its longevity and sheer ambition at 60FPS. No longer the king, but only an elite few rival it yet.
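    Worth noting what 4xSSAA actually costs - a minimal sketch (Python; it assumes the common 2x2 ordered-grid implementation, i.e. the scene is shaded at four samples per pixel and downsampled):

    Code:
    w, h, ss_factor = 1920, 1080, 4
    internal = w * h * ss_factor
    print(f"1080p with 4xSSAA shades ~{internal:,} samples per frame")
    print(f"Native 2160p is {3840 * 2160:,} pixels - i.e. about the same load")

    Which is why holding 60FPS with 4xSSAA at 1080p is roughly the same achievement as holding it at native 4K with no AA.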
     
    Last edited: Jun 12, 2016
  11. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Well, Jeff, that was interesting news. I did a little googling - 8 GB is certainly a nice chunk of memory on this energy-efficient RX480. I pulled up a couple of articles about it. How does it compare, for example, to my Crossfire 7950s? I have them overclocked to 975, where they are supposed to be equivalent to the early 7970 at its 900 clock - memory of course is 3 GB. Probably not enough benchmark data yet to answer that question.

    So, Sam, sounds like you're pretty pleased with your GTX 1080. I paid particular attention when you said,



    Those are the kinds of statements that finally got me hooked on upgrading to the 30" monitor, well before I had the graphics horsepower for it (at that time I was still back on the AGP HD3850 - on my 4 GHz Pentium 4 - which, as Jeff just mentioned, was a pretty powerful graphics card in its time).


    What kinds of games are you now capable of running at 2160p? Could we get you to post a full-size screen shot?

    Rich
     
  12. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Sam, premium versions of the GTX970 launched at near $400 in the US. I paid $369 for one and about $300 for another, used, a few months later. The RX480 is launching at closer to $200. And yeah, it performs noticeably better, without any nasty memory constraints, at a fairly similar TDP. When it releases it's going to perform like a GTX980 at half the price or less. I daresay that's pretty significant. A card with that relative power for $200 hasn't been available for several generations. $100+ is a pretty big difference when budgeting an upgrade or an entire build, so it opens up a very decent option for several tiers of buyers, at least in North America.

    If AMD's driver division can get their heads out of the sand, and produce proper Crossfire Application Profiles, the RX480 could be extremely potent. As it stands now though, SLI is absolutely superior in every imaginable way. My last foray with Crossfire could be described as frustrating to say the least. Support outside of AAA titles is spotty at best, and Crossfire users had to wait about 6 months to play Metro Last Light, which is completely unacceptable. That's not much of a value at all. With SLI, it's far more rare to need single GPU, and I'm actually playing a lot of the same games I did when running Crossfire. AMD showing Crossfire numbers for the RX480 is a joke and pretty much everyone can see that.

    As far as running high-resolution goes, the benefits are clear, and I'm never going to argue that 1080p is better than 2160p or 1440p. There's just no comparison in sharpness, clarity, and sheer desktop real estate. 1080p is a bit claustrophobic after running 1440p for over a year, but it's still comfortable after settling back in. If TVs were a lower resolution I wouldn't even consider the trade-off reasonable. 1080p though is a different beast, and is hardly ever fully utilized by consumer hardware anyway. I don't think of it as limiting myself for performance, but as maximizing the use of the display. My 970s are wonderful on it, which is a stark contrast to their hit-and-miss nature on the 1440p. I routinely tax my video memory, and there's a difference between using too much and not having enough. 4GB is borderline at 1440p. 3.5GB is knife-edge. At 1080p, 3.5GB is plenty.

    Also, I am permanently running some older consoles, and monitors suck with them almost universally. Both the Dell and Samsung 1440p displays flickered terribly with my Xbox, which runs its menu in 1080i. In contrast, the image is very steady and sharp on a TV. And neither display upscaled very well, though the Samsung was noticeably better. 480p content is significantly better looking on the larger 1080p display, at the same viewing distance. TV scalers are usually superior, and this is a $300 TV vs a $500 Samsung and an $800 Dell. That's pretty sad. Not even gonna start on 480i/576i content which basically two whole generations of game consoles and MANY generations of consumer hardware relied on exclusively. I understand that interlaced content is now antiquated, but simply displaying it shouldn't be difficult for any basic display. If that's one of the trade-offs when using a monitor, the TV is probably going to stay right where it is. I highly enjoy PC gaming and it's easily my main platform no doubt. But I enjoy multimedia as well and monitors just seem to suck at that.
     
  13. harvardguy

    harvardguy Regular member

    Joined:
    Jun 9, 2012
    Messages:
    562
    Likes Received:
    7
    Trophy Points:
    28
    Jeff, you have clued me in on a lot of titles that I never would have thought of. I still have yet to play Assassin's Creed Revolution (is that the name - the industrial revolution in England?) But I will. I was wrong about the last title, the French revolution one. Once I got into it, it was great, and the side missions were fantastic.

    When you mentioned Metro Last Light - I guess it's good that I didn't jump right on that title when it first came out. By the time I got to it, it played beautifully - so that must have been after they fixed Crossfire.

    But again - that's a title I got from you - I even told you I wasn't sure I wanted to play that dark "subway freaks" game!


    I could not have been more wrong. Except that I was also just as wrong about Left 4 Dead - I wasn't going to even play it but you guys were all over it. I was sure wrong about that one too.


    But Metro - wow - the Slavic flavor of that game was very different from anything I had played before - the culture is very rich - those two games rank very high on my all-time favorites lists, along with all of the Bioshocks, AC3, AC4, Far Cry 3, COD2, COD4, Left 4 Dead, Bad Company Vietnam, Counterstrike GO, Crysis, and a few others.


    [Speaking of Bioshock, that reminds me - I finally watched that classic movie Brazil - in fact the director recently did a cameo in the critically panned Wachowskis title Jupiter Ascending. Yeah, they missed with that one, but one reviewer said the only part of that movie that made sense and kept a consistent tone was the bureaucratic paperwork sequence finishing up with the cameo.

    So I guess they were trying to make something similar to Brazil, but the execution was very flawed - the tone of Jupiter Ascending shifted all over. It didn't have the light tone of Brazil - it would have been so much better if it had. But when I watched Brazil itself, which got it right, I had the thought,

    "Wow, here is where a lot of the ideas came from for Bioshock!"

    One Bioshock inspiration right after the other - and by that I mean everything from the art deco, to the plastic surgery, to the lavish excesses of an ultra-rich aristocratic lifestyle.

    Have you guys all seen Brazil?]


    Somebody at Home Depot tipped me off to the Sniper Elite games and they were great, and I'm glad I played them - but they don't rank in that top strata of games like the two Metro titles. I'm glad I played the Batman games also - all four of them. All because Sam talked about the new Batman in a post - how unplayable it was because of the GPU load. Thank goodness AMD released a beta Catalyst driver for it - it finally played pretty nicely. Yeah, Batman was a lot of fun. But again, for me, not in that top strata of games like Metro.


    So Jeff, they don't have to be top strata games, but are you aware of any interesting games that I might have missed, or that might be coming along pretty soon?

    Sam from those performance benchmarks that you keep on top of - any new hot titles out there?
    And Sam - some screens please :)

    Rich
     
  14. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    I could certainly post some screenshots of games at full resolution, but without them being viewed on a higher-DPI display the effect will be somewhat lost, I think.

    I have seen Brazil, but it was quite a long time ago!

    In other news, I finally decided to do something I'd been promising to do for several years and install a larger desk in my room, so my workspace now looks like this:
    [Image: photo of the new desk and workspace]
    The conspicuous space on the bottom left used to house the LaserJet 4050N printer, which I have yet to put back as I haven't needed to use it yet and I enjoy having more space under the desk for my feet. Carpet in that area definitely needs a deep-clean though :p

    I have to credit someone in this thread (possibly Kevin?) for the idea of the hard disk storage boxes, which I have now implemented for all my major backup disks - so there are now 36 coloured boxes to find a home for! Eventually, as I retire all the old 1TB backup disks and 2 or 4TB disks take their place, that number will come down a bit.
     
  15. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    I think you're right about the high resolution uploads. Without the DPI (native resolution), it'll be lost in translation lol
    I sure wouldn't mind a higher resolution monitor, or a 4K TV for that matter. But it's not practical at the moment. Not to mention my GTX 570 would probably be hard pressed gaming at higher resolutions ;)

    Yeah, it could have been me that cited the hard drive storage containers. I recall being excited about them lol. I currently have 6, and only 1 is vacant. I store and swap particular drives as needed. Given the data they contain, they don't need to run 24/7. I give them exercise once a month-ish lol
    [Image: photo of the hard drive storage cases]
     
  16. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Yup, mine look identical to that. I'm not a great fan of the latch design - some of them are extremely stiff, which is not ideal when handling fragile components - but they're proving useful so far. Likewise, the disks I store in them are only used periodically to update my backups. To be sure of data integrity and to ensure they are properly exercised when they come out, I usually defrag and sector-optimise them after each update.
     
  17. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Ah yes. My 6TB drive tends to be slightly bulkier/tighter in the case. Not extreme, but noticeable.

    When I bought the 6TB drive, it freed up so much space on the more heavily used drives that I was able to set up a software RAID for the 2 x 2TB drives, running a software (Windows 7) RAID 1. The 2TB array houses very precious image files. I find that the OS drive frequently runs out of temporary cache for thumbnail previews and has to reload them. As a single 2TB drive, it took longer to load the files. While not a great deal better, the array is certainly noticeably better. Plus I feel better that the data resides on 2 drives ;)
     
  18. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    It's nothing to do with what size of disk goes in - they all fit comfortably - but the actual plastic mechanism for unlocking the case is very stiff on mine, so I had that trouble before there were any disks in them.

    I understand what you've done with the RAID 1 for those image files, but if they really are precious, I must quote one of the IT golden rules - RAID is not a backup.
     
  19. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Well, it was more for read speed than anything. And if one fails, I have the other :S
    What kind of catastrophe would take both drives simultaneously? Yes, a motherboard failure, PSU failure, etc. could take them, but that could also take all of my drives. What's the basis for the IT folks' claim?
     
  20. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    The fact that, outside the world of big business, I see RAID arrays fail almost as often as I see disks fail. Even businesses often don't get it right - you've listed some of the reasons there. If you want vital data backed up, you have to account for all points of failure. Anything which can wipe that entire machine - a rogue BIOS update, a RAID driver update, a faulty motherboard, a flaw that means both disks fail at a similar time (more common than you might think if they're the same brand/model), a power surge or failure, the PC being damaged or stolen, environmental damage to the property - there are plenty of risks there that RAID doesn't save you from. Backing up data means it goes onto an entirely different machine, and preferably also off-site.
     
