
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. harvardguy (Regular member)
    Hmmmm. Please define card cooker.

    As in .......... something that TOTALLY DESTROYS the video card?

    You don't mean that ... do you?

    Because the PowerColor is now toast. I'm talking lox and bagels. I'm talking garlic bread. I'm talking artifacts that would knock your socks off:

    [image: screenshot of the artifacts]

    Focus in for the large size. :(

    Rich
     
    Last edited: Jul 17, 2012
  2. sammorris (Senior member)
    Have you started overclocking the graphics card yet? Those artifacts are the exact sort of thing you see when you've overclocked the memory too far on a GPU.

    As for the memory heatsinks, why would they be necessary? It's rather like putting a better heatsink on your motherboard chipset or memory - if it's overheating otherwise then sure, but besides that there's no point.
     
  3. Estuansis (Active member)
    Simply put, FurMark puts way more load on your GPU than anything else, including GPGPU computing (i.e. Folding, Bitcoin, SETI) - MUCH more than any game. I have heard of and seen more than a few cards basically cooked to death by FurMark. Video cards aren't like a CPU, where you can ideally put 100% maximum load on it and depend on the cooler to just work. They are not normally subjected to total 100% load on all of their hardware, and the coolers are not built to handle it. FurMark is a dangerous tool in inexperienced hands - dangerous enough that both AMD and Nvidia now have their cards throttle clocks and voltage when it's detected.

    In the same way, you can't just push and push on the clocks and expect everything to go back to normal afterwards. Everything has to be done gradually and within limits, because of how the hardware works. Considering the artifacting you're getting, plus your somewhat erratic clock settings, you might have pushed too far on the memory and accidentally damaged the card. 900/1350 is VERY reasonable for those cards, but how high have you actually gone? If 900 is a mild OC, what's a high one? Video cards don't normally OC as well as CPUs.

    Agreed with Sam on the memory sinks. The card is designed to work properly and cool itself as-is; if it needed memory sinks to function properly, something else would be wrong. Likewise with aftermarket coolers - they void the warranty a lot of the time, with a high risk of damaging the card if not installed properly.
     
    Last edited: Jul 17, 2012
  4. sammorris (Senior member)
    For that exact reason, FurMark is almost defunct as a test unless you can disable all those limitations. So far OCCT still seems to have evaded the lockdowns. If you want an absolute burn test on the GPU/PSU, use OCCT. Beware though, it does some pretty alarming things. My 4870X2x2 setup with the Q9550 pulled between 700W and 810W from the mains in games with full quad-CF scaling (so about 620-720W DC). When I ran the OCCT PSU test (which basically max-stresses the CPU and GPU at the same time), it pulled 1060W from the mains, or around 920W DC - on an 850W PSU!
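
    To put numbers on the AC/DC relationship: PSU efficiency is just DC output divided by AC draw at the wall, so a rough sketch in Python (figures taken straight from the readings above) shows the unit running at roughly 87-89% efficiency, with the OCCT case landing well past its rating:

    Code:
    # Rough PSU efficiency check from the wall-draw figures quoted above.
    # Assumption: efficiency = DC output / AC input; real curves vary with load.
    readings = [
        (700, 620),   # gaming, quad-CF scaling: AC watts, approx DC watts
        (810, 720),   # gaming, upper end
        (1060, 920),  # OCCT PSU test (CPU + GPU maxed at once)
    ]
    for ac, dc in readings:
        print(f"{ac}W AC -> {dc}W DC: {dc / ac:.0%} efficient")
    # The OCCT case puts ~920W DC through an 850W-rated unit - past spec.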

    Also Rich, problems with MSI Afterburner are not rare. It's a useful program, but it's very badly coded - and if anybody criticises how the program is coded on their forum, they get banned instantly and their advice is not heeded. It's another one of those dev teams that just doesn't care. A shame really, as very few programs come close to offering the functionality that Afterburner does.
     
    Last edited: Jul 17, 2012
  5. Estuansis (Active member)
    I wouldn't recommend running those types of stress tests on a PC at all. PCs simply aren't designed for that; even under the most extreme real-world loads they don't get pushed nearly that hard. It's just an easy way to damage hardware that would have worked fine if left to its own devices.

    The best way to stress video cards IMO is to use a benchmark with proper scaling and a high GPU demand. Unigine Heaven is wonderful for this and pushes my cards much harder than any game barring Crysis/Warhead exclusively.

    With CPUs alone you can get away with a bit more, as almost anyone has an aftermarket cooler capable of handling quite a bit. IBT (IntelBurnTest) is great for CPU/memory stressing and puts on even more load than Prime95.


    Afterburner has terrible coding and is out of date. It also has a lot of compatibility problems with all sorts of hardware and interferes with proper functioning. Not to mention it's packed with bloatware, and you have to jump through hoops to OC with it. The clocks are very limited even after you unlock everything, and it doesn't let PowerPlay work properly, so it results in higher operating temps at idle. Afterburner is terrible.

    BTW, Sapphire Trixx offers everything Afterburner does, plus some. Better coded by far and extremely lightweight to boot. The only feature it's missing is the graphing function, but that's what GPU-Z is for, isn't it? The fact that Trixx works very nicely with PowerPlay and has a wider range of clocks is worth the difference to me personally.
     
    Last edited: Jul 17, 2012
  6. harvardguy (Regular member)
    Add one more card to that - my formerly good PowerColor (but don't tell Newegg - they've approved my RMA, of course).


    Yeah, but Sam, how do you know they are overheating? I am not aware of any sensor that detects memory temperature.

    The other thing - I am barely overclocking the memory. The stock clock is 1250. Catalyst lets you go to 1575. The overclocking guide said "go to 1575" right off the bat, before they even started on the GPU. But I read some of the Newegg reviews, and while some people apparently can overclock the memory like that, others can't. I have not done a huge amount of testing, but at 1400 memory it was iffy whether I would get a screen locked into a solid texture - meaning a crashed card and a reboot - or not.

    So I backed off to 1350, and that allowed me to get the core clock to 975. While I initially juiced the voltage a bit, I have now backed off to stock voltage: I have it set to 1087 mV, and the real voltage, per GPU-Z and also per the Trixx Windows 7 gadget, runs 0.965 to 1.05 V depending.


    Those artifacts are there because I ruined the card. They show up at the stock clocks of 800/1250 and at the lowest resolution test. The card is toast, and is now boxed up and going back - that mistake cost me about $70 in shipping and a 15% restocking fee, but at least not the whole $400.


    Just like Jeff said, I ran FurMark, and I should not have.

    I ran FurMark, and I allowed GPU temps to get up to 90°C. I have been so used to thinking that ATI cards can take high temps - even the 8800GTX can take 90 without blinking - that I was unaware I was ruining the card.

    The PowerColor initially ran the 3DMark 11 test beautifully. The card is going back, and I bought an XFX - the model with one slot of rear ventilation - although I am aware that most of the heat from the fans goes inside the case.

    I would have bought another HIS IceQ but they are more than two slots wide, and I can't fit it in slot 6 as it bumps up against the bottom of the case.

    That tool Trixx is amazing and beautiful.

    That graphics test Unigine Heaven is wonderful.

    But the artifacts I showed you did not show up in Heaven, only in 3DMark 11, which is why I am so glad I sprung for the $20 and picked it up, especially after you raved about the graphics, Jeff. I got lucky buying 3DMark 11. Without it I wouldn't be RMA'ing the PowerColor right now, and those artifacts would be haunting me in a month or two.

    At first I thought 3DMark 11 was boring, not understanding as well as you do, Jeff, the complexity of the textures and all the other stuff that is going on. But now, when a card shows those submersibles without artifacts, I know that a helluva lot of graphics processing is going on, and I totally appreciate the 3DMark 11 test. I really like Heaven too, but from my personal experience, 3DMark 11 caught the damaged card and Heaven did not.

    Furthermore, as far as stressing the card to test an overclock, the Guru3D guy who wrote the article on overclocking the 7950 said, "Just run one full session of 3DMark 11, and if it gets through that, the overclock is stable."


    My #1 overclock, like you mentioned, Jeff, is 900 and 1350. #2 is 925 and 1350, #3 is 950 and 1350, and the 4th saved position in Trixx is 975 and 1350, all at stock VDDC of 1087 mV, with power of course at the maximum allowable (200 watts if needed). The stock clocks of 800 core, 1250 memory don't need a position in Trixx - I can just hit Reset and everything is back to stock - so Trixx effectively lets you store 5 profiles at the touch of a button. Niiiiice!


    Fan is set to 100%. It's summer, it's hot, I'm now afraid of temps, and the Kazes make much more noise (though that HIS turbine is loud for sure - until I turn on the Kazes), but no sound gets through my Medusa 5.1 earphones, so I don't care.


    While I am gaming, I have Trixx open, and Afterburner, and Riva. I found that you can tweak with Trixx, and all the others show the new tweak, including Catalyst - except for Afterburner, which will display the clocks but not the VDDC voltage, and which wants me to reboot so it can read the graphics card. But if I do that, it glitches and gives me huge text strings for the graphics data, and I have to uninstall it and then re-install.

    I like the fan control on Afterburner - but Trixx has the same thing, and so does Catalyst for that matter.



    I use FurMark ONLY to test the OSD, which I keep in the upper right corner, at the 2048 position. No more long FurMark testing to cook cards - if I need a long stress test I'll run Heaven for a couple of hours; it just keeps going beautifully, and it shows the OSD very nicely.

    So I run FurMark for a minute, at 2560x1440, to test the OSD. Sometimes the CPU data ends up on top, like it did an hour ago. So I closed FurMark, exited Riva, ran FurMark again to make sure the graphics data was where it should be, then ran Riva and picked up the core data on the bottom where I want it.

    So the SINGLE, ONE and ONLY reason for Afterburner is to report the sensors in the OSD. (I can position the OSD anywhere, of course, and I can change the font color.)


    [image: screenshot of the OSD]


    In my humble opinion, that is a kick-ass OSD, running at position 2048, in the upper corner of the screen, giving me all the real-time info I need while I am gaming. It runs beautifully in Heaven. It runs in most games, but not in most multiplayer PunkBuster games. It runs nicely in Left 4 Dead also.


    So by way of summary, Afterburner will report the graphics card temp and various other sensor readings, and I don't need it for overclocking, just for the reporting - and it gives me the time of day, lol. Within Afterburner, Riva will report the core temps and loads.

    (I stumbled on that by accident - Riva displaying within Afterburner. I had been trying to get Riva to work, to no avail, and I turned on Afterburner, accidentally not turning off Riva, and suddenly there was all my core data!!!! WTF!! Actually a forum poster had mentioned that Afterburner would host the Riva data, but I didn't quite know what he meant until that happened. The core temp display was not as good as it is now - I had inexplicably left off the load for core 0, and I had the core frequency only on core 0, but it is better to display it on all the cores, because now I can just look past that and see the core loads stacked neatly on top of each other. So a few days ago I re-installed the 8800GTX: put the Nvidia driver back on, fixed the Riva core display, then ran Driver Cleaner to get rid of all the Nvidia stuff and put the 8800GTX back on the shelf. I am so jazzed, because I had been thinking that I would have to log CPU loads and temps through Core Temp, and that I would not be able to see them real-time.)

    One more time: I love this OSD. :D

    So:

    1. The tweaking tool is Trixx - thanks Jeff.
    2. The reporting tool is Afterburner plus Riva - thanks Unwinder you magnificent Russian basta*d.


    For someone to reproduce this OSD, they have to have an old Nvidia card lying around - vintage 8800 is good, maybe a 280 or so as well. That's about as far as Unwinder got on Riva before he dropped that project and started getting paid by MSI. Afterburner may not be able to tweak my 7950, but I don't care - I have Trixx. As long as Afterburner is at least up to date enough to accurately read the sensors - GPU load, GPU temp, GPU core clock, memory clock in real time, and VRAM usage (by the way, Max Payne 3 is so far the highest VRAM usage at 1.75 gigs) - that's all I need Afterburner to do.

    Rich
     
  7. Estuansis (Active member)
    Well, that would explain much of it right there. You never simply turn a component up to the max setting. Overclocking is a process of small increases with testing between each change. Just because 1575 is the max it lets you set, doesn't mean you should go there. The options exist on my motherboard to take the CPU all the way to 6GHz if I chose, but I would never expect it to actually do that - it's unrealistic.
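
    That process is essentially a loop. A minimal sketch in Python, where set_clocks and passes_stability_test are hypothetical stand-ins for whatever tweaking tool and benchmark you actually use:

    Code:
    # Sketch of "small increases, test between each change".
    # set_clocks() and passes_stability_test() are hypothetical stand-ins
    # for a real tool (Trixx, etc.) and a real test (3DMark, Heaven).

    def set_clocks(core_mhz):
        print(f"core clock -> {core_mhz} MHz")  # in reality: apply via your tool

    def passes_stability_test():
        return True  # in reality: run a full benchmark pass, watch for artifacts

    def find_stable_overclock(stock_mhz, step_mhz=25, ceiling_mhz=1100):
        clock = stock_mhz
        while clock + step_mhz <= ceiling_mhz:
            candidate = clock + step_mhz
            set_clocks(candidate)
            if not passes_stability_test():  # artifacts or a crash = too far
                set_clocks(clock)            # back off to the last good clock
                break
            clock = candidate                # keep the gain and try again
        return clock

    print(find_stable_overclock(800))  # e.g. the 7950's stock 800MHz core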

    Also, Newegg's reviewers are on the whole quite braindead. I have proven this with several different products - particularly motherboards getting low ratings because the majority of reviewers did zero research and used the wrong RAM.

    Nooooooooo. No overvolting without liquid cooling - just trust me. You should never have done that and then run FurMark on top of it. That would absolutely fry any card.

    Use GPU-Z for tracking video card temps. There are, I believe, 4 different thermistors on most ATi/AMD cards. If your core reading was at 90, I guarantee the memory and VRMs were well over 100.

    3DMark 11 is a bit more demanding than Heaven so would show artifacts faster. I can almost guarantee Heaven would have similar problems if you ran it long enough. A typical stress test with Heaven is several hours vs a few runs with 3DMark. As a stability test 3DMark has always excelled because it's sensitive to damage and OCing.

    You shouldn't have to be afraid of temps as long as you don't do daft things like run FurMark :p Those cards should run fine for years at the stock fan settings.

    Never run multiple OCing programs at the same time. Others have argued otherwise, but every single time, I then watch them struggle to make things work properly... If you need to track video card temps during a session, you can use GPU-Z. Just set it to update in the background, and you can track averages, maximums, minimums, etc. If you absolutely must run more than one, make sure only one single program has clock control. Ideally you shouldn't even have the AMD Overdrive window open at all.
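
    GPU-Z can also log all of that to a file ("Log to file" on the sensors tab writes comma-separated text), so summarising a session takes a few lines of Python. A rough sketch - note the temperature column header and the default log filename vary by GPU-Z version and card, so check your own log's first row:

    Code:
    import csv

    # Summarise a GPU-Z sensor log. Assumptions: the log is comma-separated
    # with a header row, and the temperature column is named something like
    # "GPU Temperature [°C]" - verify against your own file.

    def summarize(path, column="GPU Temperature [°C]"):
        temps = []
        with open(path, newline="", encoding="utf-8", errors="replace") as f:
            for row in csv.DictReader(f, skipinitialspace=True):
                value = (row.get(column) or "").strip()
                if value:
                    temps.append(float(value))
        print(f"samples: {len(temps)}, min: {min(temps):.1f}, "
              f"avg: {sum(temps) / len(temps):.1f}, max: {max(temps):.1f}")

    summarize("GPU-Z Sensor Log.txt")  # default log filename is an assumption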

    Don't run FurMark at all. It's bad, and it's not a realistic test in any sense of the word. It's an extreme test meant for hardcore benchmarkers and cooling enthusiasts to torture top-end systems, not to overwhelm the stock coolers on rigs meant to game and see normal use. You wouldn't take a passenger car and run it at the Bonneville Salt Flats, so why would you take a daily-use PC and subject it to extreme hardcore torture tests? It doesn't make sense.

    BTW, even big-time OCers and benchers only ever run FurMark for 15-20 minutes.
     
    Last edited: Jul 17, 2012
  8. harvardguy (Regular member)
    Amen!

    I got my RMA invoice back, and Newegg isn't even charging me the restocking fee - I am out only the $10 return shipping, since the initial shipping was free. I made out like a bandit!!!

    I swear, I am now petrified of FurMark.

    I run it only for less than 60 seconds to check the OSD - which it excels at doing.

    I could check in Heaven - but not really, because Heaven doesn't have a 2560 resolution, and besides, it takes forever to load, unlike the furry monster donut, which loads in 30 seconds and is more than ready to toast your graphics card PRONTO. Actually the FurMark heat buildup takes a while - pretty much 10 minutes - so running it as I do for about 30-60 seconds, the temps go up to maybe 60 if even that.
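
    That ramp-up behaviour looks a lot like a first-order thermal curve. A toy model in Python - all the numbers are illustrative, picked only to roughly match the temps quoted above (about 60°C after a minute, about 90°C on a long run):

    Code:
    import math

    # Toy first-order thermal model: temperature rises toward a steady state
    # exponentially, with time constant tau. Numbers are illustrative only.
    idle = 40.0          # idle temp, deg C (assumed)
    steady_rise = 50.0   # FurMark's steady-state rise above idle (assumed)
    tau = 120.0          # thermal time constant, seconds (assumed)

    for t in (30, 60, 300, 600):
        temp = idle + steady_rise * (1 - math.exp(-t / tau))
        print(f"{t:>4}s: ~{temp:.0f} deg C")
    # 30-60s stays around 50-60; by ten minutes you're pushing 90.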

    My new XFX came today! So I will put that in later tonight and torture it with furmark!

    NOT.

    Hey, I ran Crysis last night at VERY HIGH settings, and AA only 2x. I remember Sam saying it really doesn't need AA, so I wasn't going to use it at all, but settled at 2x for the moment.

    I am in the first chapter, and getting about 32fps. It seems very smooth and I haven't noticed any lag. MAN IS IT GORGEOUS !!!!!!


    I haven't tried your special Crysis mods, because I'm only getting 32 fps as it is. But once I start CrossFire, I'll apply the mods. Everything is so interesting - the graphics are awesome.

    I noticed a little group of crabs running around at the base of the cliff, after I slaughtered all the Koreans at the command post trailer area. I dropped down the cliff to investigate the crabs and caught a couple with my hands. Also I caught a little sand chicken. I should have tried that on the big chickens back at those two settlement areas - I think I'll go back and do that. Each creature has an interesting animation as it is captured in your hands, but when you let it go, it ends up dead. Oh well. I'll go catch a chicken or rooster tonight, and see the chicken animation, lol.

    That tranquilizer solves a lot of problems. I was trying to take out the boat gunner, to make things easier on myself at the radar-jamming trailer. I was hitting him with single shots. He flinched a lot, but 10 shots had no effect. So I tried the tranquilizer, although I was doubtful it would shoot that far.

    I shouldn't have doubted: one tranquilizer took him down, and then 3 shots at his drugged body killed him off. So that's a little trick - try to just tranquilize them if you can - probably saves bullets too. You could probably switch to fists and pound them with super strength and not use any bullets at all, I suppose, or super strength and grab them by the neck which always chokes them to death. :p

    You get an infinite supply of darts, so that is a good reason for keeping that rifle even if it is out of bullets. All of that might change when you have to start killing aliens, but it is good for now.

    The graphics are so good, I'm considering getting a new pair of Costco prescription glasses just for gaming, so I can totally enjoy every graphical nuance. I was cleaning off my lenses last night - but my eyes are not corrected the same, so a new prescription is really what I should do.

    Rich
     
  9. Estuansis (Active member)
    The effect is so minimal for the performance cost that I generally don't recommend it. It also interferes with the Edge AA, which IMO is more effective than the actual AA and has a smaller performance hit.


    Mods or no, you should be running the system.cfg as it improves performance as well.

    The problem is that those guys have body armor, and the silencer kills penetration at long ranges. Use unsilenced weapons for things like that. They do drop quite efficiently.

    For being stealthy, headshots are always a kill. I hold onto the SCAR and keep it in semi-auto with the assault scope and a silencer. If you use it conservatively and are observant, there's enough ammo throughout the game for it to be a very effective silent head-popper. It's more powerful and more accurate than the Korean assault rifle.

    My second weapon slot I always treat as interchangeable, and just pick up whatever good Korean weapon is around at the time.

    ---------------------------------------------------------------------

    If you want to play higher difficulties, be aware that you also sacrifice things like HUD features, controllable guns on vehicles, grenade indicators, etc. You can simply edit the difficulty files in the game folder and adjust those as you wish, while still getting the increase in difficulty. Google is your friend here, and most of the values are fairly obvious as well.

    I personally use a Normal profile, so the health regen, suit regen, damage and everything are left alone - it is balanced very well. Also driver-controlled guns, Koreans speaking Korean instead of English, etc. Then I turn all of the AI values - number of enemies allowed to fire at once and what have you - up to Delta difficulty numbers. There are other variables as well, like how sensitive they are to noises, how quickly they react, and how closely they search for you when someone spots you. They actually dumb down the AI on lower difficulty levels, and make you pay in basic features of the game to get the smart AI. And that's saying a lot, because the AI is awesome at any difficulty.
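
    A minimal sketch of that mix-and-match in Python - assuming the stock files are named diff_normal.cfg and diff_bauer.cfg (Delta), that lines look like "name = value" with "--" comments, and that the AI tuning values start with "ai". All of those are assumptions to verify against your own install:

    Code:
    # Sketch: keep Normal's suit/damage/HUD values, pull only the AI tuning
    # from the Delta file. Filenames, the "--" comment style and the "ai"
    # prefix are assumptions - check your own game folder.

    def read_cvars(path):
        cvars = {}
        with open(path, encoding="utf-8", errors="replace") as f:
            for line in f:
                line = line.split("--")[0].strip()  # drop comments and blanks
                if "=" in line:
                    key, value = line.split("=", 1)
                    cvars[key.strip()] = value.strip()
        return cvars

    normal = read_cvars("diff_normal.cfg")
    delta = read_cvars("diff_bauer.cfg")

    # Override only the AI tuning; everything else stays at Normal's values.
    normal.update({k: v for k, v in delta.items() if k.lower().startswith("ai")})

    with open("diff_custom.cfg", "w", encoding="utf-8") as out:
        out.writelines(f"{key} = {value}\n" for key, value in sorted(normal.items()))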

    So basically it has all the features of normal difficulty, with the good features of a harder difficulty. I don't think you should have to sacrifice all the neat features of the game just to get the higher difficulty. I also feel the suit is balanced enough, and that the way more deadly enemies are enough of a challenge for me.
     
    Last edited: Jul 18, 2012
  10. harvardguy (Regular member)
    Okay, I'll turn off AA. Great!

    Hey, for the rest of this post, I'm back over on the builder thread where my link sent me. I'm on your magnificent - best Crysis post of all time - post #4500 on page 180.

    Rich
     
  11. Estuansis (Active member)
    I have put a huge amount of effort into modding Crysis over the years. My first effort was discovering/creating/tweaking a "Fake" Very High setting that had the performance of High and the visuals of Very High. This was a great asset in the months after Crysis came out. It offered about 90% of the Very High visuals while allowing my hardware at the time to still play the game.

    Doing all these tweaks and tests has allowed me to really learn what makes Crysis "tick", so to speak. Long story short, that simple system.cfg is an excellent example of how extra graphics and performance can be attained without altering the game at all. The graphics mods are icing on the cake after years of research and testing.
     
  12. sammorris (Senior member)
  13. Estuansis (Active member)
    In the most basic terms, it means that Nvidia and AMD cards handle memory a bit differently because they have different performance goals - not because one has better or worse hardware. Am I even close?

    What jumped out at me the most is the argument for Eyefinity, and how AMD's wider memory bus allows more freedom at very high resolutions, whereas Nvidia's hardware is geared toward mainstream users on regular monitors (i.e. 1080p, 1200p, etc). But while they are optimised differently, they can still go head-to-head in most situations because of the relative power of the hardware.

    Something else that stuck out for me was that he figures Nvidia cards do not handle memory as efficiently. This has been a problem for them for a long time. I remember G92 cards facing the same limitations, where they would run out of memory more quickly than the equivalent AMD card, and how the HD4000 series in particular introduced a much better memory architecture for AMD and allowed their cards to handle AA more gracefully than the competition.

    So basically ATi/AMD and Nvidia are back to the core battle they have been fighting since the GeForce FX 5 series and the Radeon 9000 series (maybe even before that, but that would be before my time). Nvidia aims for more horsepower for the average user, while AMD aims for better quality for the advanced user.
     
    Last edited: Jul 19, 2012
  14. sammorris (Senior member)
    Yeah, it's pretty much an official confirmation of what has often been held as 'truth' about the way AMD and Nvidia's architectures differ. The different methodologies of multi-monitor tech are as good an illustration as any of AMD's somewhat more 'proper' approach to high-resolution gaming. The current HD7 generation suffers somewhat from uncompetitive pricing in the face of the GTX6 series, even though they've cut the price a few times. A shame, as the HD7900s are still great cards for multi-purpose gamers. Right now I couldn't justify an upgrade until I get a 30" display (or equivalent) again, and even then, I think I'll be waiting another generation. It's got to the stage where I just forget the HD6970s are there - they get the job done, and they don't really make much noise while doing it. A far cry from the old days, in both of those ways!
     
  15. Estuansis (Active member)
    I believe the release of the new Xbox and the PS4 will have something to do with that :p

    No, seriously. Because everything nowadays is basically designed multi-platform, we are in a period of stagnation for game advancement. Crysis 2, a game which should have been a major leap forward but failed, is a glaring reminder of this. When the power of the consoles increases, games will start to push better graphics again. For better or for worse, the console developers are setting our standards.

    Notice how PC exclusives have pushed the envelope. The Witcher 2 is a good example. I think they finally released a console version not long ago, and the comparison is laughable. Entire areas are stripped of their geometry - parts of buildings gone and everything. They had to severely limit the game to get it running properly on consoles.

    Crysis on the 360 is another excellent example. They had to move it to the less technically capable CryEngine 3, then severely limit everything to lower-than-Low settings. The game uses paper cut-out silhouettes for distant objects, and many parts of the game are heavily altered to adjust for the console's power. It's actually pretty nice looking overall for an Xbox game, but Crysis on the PC makes it seem laughable. It is a full port of Crysis, but everything that made Crysis great is gone.

    http://www.youtube.com/watch?v=xQ_1mcnH2yQ
     
    Last edited: Jul 19, 2012
  16. harvardguy (Regular member)
    I like how the AMD engineer put it: the better-looking games tend to use more VRAM. I like the 3 gigs of memory. Of course, I'm on a 30", thanks to Sam's influence, and I'm totally hooked on what I consider the increased immersion of that 30" experience.



    Wow! All I can say to that is I'm lucky you like shooters, among your favorite genres. If you were totally a Sim City type of guy, all your awesome knowledge would "go to waste" lol.

    That xbox vs pc crysis video is awesome.

    Actually I think the Xbox doesn't look bad. As you guys well know, both owning consoles, consider this: you play on your TV, and you buy a box for what, $400? You don't have to worry about hardware or operating systems. You throw a disc in it, and if you're not a PC gamer, you're happy. You get 1080p and it's halfway pretty good.

    If you go over to Jeff's house and see what "years of knowledge" could bring you, maybe you'll drop dead in astonishment, and then maybe you'll convert. But if not, you're a happy console player, you don't know what you're missing, and the hardware is cheap. What's not to like?

    I like consoles. I like game developers developing games for consoles. All those console players buying all those video games keep the game companies in business. The game developers all own PCs - everybody at Valve is on a 30" Dell. Miles isn't a gamer, but a lot of those guys are. Good for us PC gamers.

    If possible, they will almost always produce a PC version of the game - for the extra 1 million sales, and for themselves and their friends - while the consoles bring in the big 10-million-sale profits and pay all those dev salaries.

    I love consoles. But I don't want to own one. LOL

    Rich
     
  17. ddp (Moderator, Staff Member)
    problem with game consoles is that they are really not that upgradeable compared to pc's. pc's can be upgraded with faster cpu's (up to a certain point), more ram, bigger drives & better/faster videocards. consoles only have limited upgradeability: bigger drives & certain optical drives.
     
  18. Estuansis (Active member)
    I feel much the same way Rich. Consoles are great for the average Joe. I know plenty of serious gamers who are console exclusive and they get along just fine.

    As it happens, I was first introduced to PC gaming because I didn't have a console. I had some of the older ones, and some even older ones, lol, but no new ones, so I turned to PC gaming for my new-games fix. First it was a Dell Dimension with a 2.4GHz Celeron D, 512MB RAM, a drop-in X1300 PCI video card, etc. Very, very basic. But it allowed me to enjoy awesome games like Rome: Total War, Stronghold, and Half-Life 2 (the engine has since been updated and it's doubtful that PC would handle it anymore). I had a lot of great times with very little power at my disposal. Then I built my first gaming PC. I remember Sam actually advised me on it, because my first ever post here was asking for help on a build. I remember most of the components.

    AMD Sempron 3100+ "Palermo" (256K L2), Socket 754, OC'd 1.8 -> 2.4GHz, stock cooler
    Biostar T-Force 6100 (nice beginner board for the time)
    1GB (2 x 512MB) Patriot Signature Series PC3200 DDR-400 CL2.5
    Sapphire X800GTO -> varying OC and unlock states
    Thermaltake TR2 430W PSU

    That was a sweet system, but it really whet my appetite for more. It was able to play a lot of games, but struggled with brand new releases like FEAR and Oblivion. About a year later, I built this:

    AMD Athlon 64 X2 3800+ "Manchester" (2 x 512K L2), Socket 939, OC'd 2.0 -> 2.6GHz, Arctic Cooling Freezer 64 Pro
    ASUS A8N-5X, later a slightly higher-end ASUS A8N-E for better OCing
    2GB (4 x 512MB) Crucial Ballistix PC4000 DDR-500 CL3-4-4-8
    Sapphire X850XT
    Enermax Liberty 620W PSU

    In due time, socket AM2 came out and I got the itch for another upgrade. This time I invested in a much newer video card, some high performance RAM, my first kit of DDR2, etc. I went through a LOT of different configurations over time, before I figured screw it, buy something high-end and work from there.

    Consoles don't offer this kind of valuable experience. They don't offer this type of satisfaction. I built as I learned and I learned as I built. On an Xbox all you do is hit the power button.
     
  19. sammorris (Senior member)
    Guild Wars 2 Beta


    --VERY HIGH DETAIL--

    2560x1600 Certified: GTX690
    1920x1200 Compliant, 1680x1050 Certified: GTX680
    1920x1080 Compliant: GTX670
    1600x900 Compliant, 1440x900 Certified: GTX580
    1600x900 Compliant, 1366x768 Certified: GTX480,GTX570,HD7970
    1366x768 Compliant, 1280x720 Certified: GTX560Ti,HD5850,HD7950
    1366x768 Compliant, 1024x768 Certified: GTX470,GTX560,HD6970,HD7870
    1280x720 Compliant, 1024x600 Certified: GTX460,HD5830,HD6950
    1024x768 Compliant, 1024x600 Certified: GTX465,GTX460,HD6870,HD7850
    800x600 Certified: GTX460SE,GTX550Ti,HD5770,HD6850
    853x480 Certified, 800x600 Compliant: GTS450,HD5750,HD6790,HD7770
    640x480 Certified, 853x480 Compliant: HD7750

    No currently available CPUs are 60Hz Compliant for this title.
     
  20. Red_Maw (Regular member)
    If you actually played GW2, any thoughts on how good/bad the game actually looks, Sam? From what I've heard, the game does not look as good as the hardware required to play smoothly at high detail would suggest.
     
