
Video Card Thread (Mostly Gecube x1950xt)

Discussion in 'PC hardware help' started by Waymon3X6, Jun 28, 2007.

  1. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    MaccerM: Crysis doesn't support AA, you may as well turn it off. It'll be reducing your frame rate, but look closely and you'll find the game isn't anti-aliased at all.
    The HD3850 is what Abuzar owns, not the HD3870, which has been just as hard to find as the 8800GT over the past few weeks.
    I'm not going to compare the 8800GT to what you did have (which, I'll be honest, was complete rubbish), but I'll compare it to what you could have had. There are so many people who still fault ATi for certain card builders' inability to make decent graphics cards, which annoys me. They become fanboys for life, and as such get the raw end of the stick almost every time. The people who buy the 8800GT despite the HD3870 being available (so currently just the US) typically go "but I want it to be fast" - look at those benchmarks again. There are very few where the 8800GT is much more than 10-15% faster than the HD3870. It's always faster, but rarely by more than that amount. Now, consider the lowest frame rate you're willing to put up with in a game. 30fps? 40? I'll be honest with you, I'd have a hard time telling the difference between 30 and 33fps, or even between 40 and 46. I would instantly notice if my games had cruddy shadows though, as did most of the people who bought 8800GTXs a year ago so they could play Oblivion well. Imagine their disgust when their new £350 card rendered shadows only half as well as the X1900XT they had just got rid of!
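    To put some rough numbers on that 10-15% gap (the baseline frame rates here are purely illustrative, not taken from any particular benchmark):
    [code]
    # Illustration only: what a 10-15% advantage works out to in absolute fps.
    # The baseline frame rates below are made-up examples, not benchmark results.
    for base_fps in (30, 40, 60):
        for advantage in (0.10, 0.15):
            faster_fps = base_fps * (1 + advantage)
            print(f"{base_fps}fps vs {faster_fps:.0f}fps "
                  f"({faster_fps - base_fps:.1f}fps difference)")
    [/code]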
    This pretty much sums up where nvidia stand in the graphics card market.
    I suppose if you wanted to sound like a complete *profanity* you could compare them to two artists - one that can paint an entire landscape in 10 minutes, but it looks a bit sloppy, or one that spends a bit longer making it look neat and tidy.
    I don't buy powerful graphics cards to get more frames per second than even a high speed camera could see. I buy them so I can turn the resolution and level of detail up, to make my games look good. What use is being able to do that if at the end of it all, the game looks worse than it did before?

    What's more, most of the people who bought 8800GTs probably run 1280x1024 or 1680x1050 monitors. At this level, they can probably max a lot of games. This therefore means that an HD3870 could also probably max a lot of games - if you're maxing out the detail of a game with a decent frame rate, how about some decent quality rendering to go with it?
    [/rant]
     
    Last edited: Dec 21, 2007
  2. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    Hey I have my CPU clocked at 3.6GHz, and I haven't really messed with the video card. It's factory overclocked.
     
  3. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    I've just spotted the 512MB HD3850 in stock in the UK. I'm quite tempted, that's some tasty kit for £130.
     
  4. MaccerM

    MaccerM Regular member

    Joined:
    Aug 6, 2007
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    26
    Wow, 3.6GHz is quite a clock, that's a GHz over stock on the E6750 isn't it, no wonder your 3DMark score is so high! I don't think I'll get much over 2.6 on mine - I'll give it a go soon once my TIM has set and let you know.
    Sam, I take your points - for now the games would play just about as fast, and have better render quality, on the 3870 - but in a few titles' time, that extra 10-20% is going to come in handy. When the frame rates start to slip, the 8800 is going to play smoother than the 3870. You are going to have to sacrifice level of detail to play at the same fps with the 3870, so the 8800 will last you that little bit longer. So the painters will be painting away, and in a few pictures' time the 3870 guy will have to start using smaller canvases or cut out some of the detail to finish at the same time as the 8800 guy!
    When a single card no longer cuts it you can buy another one and slap it in SLI, which everyone agrees has better game support, even if it doesn't scale as well as Crossfire.
     
  5. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    Actually I think SLI scales better than Crossfire.

    Also, I have taken my E6750 to 4GHz but I didn't like the heat it produced. I did start to overclock the video card, but I got carried away and screwed it up. I had to reinstall Windows to get my PC working again.
     
  6. MaccerM

    MaccerM Regular member

    Joined:
    Aug 6, 2007
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    26
    In a review I read, it said that in some games Crossfire scaled at virtually 100%, whereas SLI was about 60-70%. The net effect was that they came out about the same, since the 8800s had a bit more power to start with, but in some games Crossfire scales really well.
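    Roughly, with made-up single-card numbers just to show how those scaling figures can even out:
    [code]
    # Hypothetical single-card scores, invented purely to illustrate the argument
    # that better multi-GPU scaling can cancel out a slower starting point.
    single_8800gt = 115       # assume the 8800GT starts roughly 15% ahead
    single_hd3870 = 100
    sli_gain = 0.65           # ~60-70% extra from the second card in SLI
    crossfire_gain = 0.95     # close to 100% extra in the best Crossfire cases

    print("8800GT SLI:       ", single_8800gt * (1 + sli_gain))        # ~190
    print("HD3870 Crossfire: ", single_hd3870 * (1 + crossfire_gain))  # 195
    [/code]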
    4GHz is a mental overclock! It's quite amazing that you can buy CPUs now that have so much more potential than their rated speeds. I can't imagine you'd ever choke a 4GHz chip at the moment!
     
  7. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    I just played around with my overclock a bit and now I'm at 11,000! I'm going to lower the clocks now though. Need to get a nicer cooler...
     
  8. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    SLI indeed scales far better than Crossfire, and has better game support, which is why I never recommend anyone go with Crossfire. As for the performance gap between the cards, as frame rates dip I think it'll become even less relevant: 10% of 20fps is just 2fps, hardly a big difference.
    Still, I like my Anti-Aliasing so I will never support a card that's unable to render it if another similarly priced one will.

    As for a decent cooler, can I sneak a bit of brand favouritism in here and recommend a Thermalright HR-03GT?
     
    Last edited: Dec 24, 2007
  9. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    Haha I knew you would say that but I meant a CPU cooler. Maybe I should get a Thermalright Ultra 120? lol
     
  10. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Sure, why not? Three hours of gaming in a room with an ambient temperature of 24C and my CPU temp is 41C. Not bad considering I didn't bother changing the Arctic Silver after I removed my old heatsink (so only half of it is still on there!)
     
  11. MaccerM

    MaccerM Regular member

    Joined:
    Aug 6, 2007
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    26
    On the link between CPU speed and 3DMark scores: I clocked my CPU up the other day to 2.60GHz and got 10,904. This was in CPU/RAM unlinked mode though, with loose 5-5-5-15 timings on the RAM, which I have since tightened to 4-4-4-12. My score at 2.33GHz (with tight timings) was 10,220. That translates to just over 2½ points of increase per MHz on the CPU clock, compared with the 0.75 points per MHz Sam was getting (rough working at the bottom of this post). So Sam, I think you may be underestimating the effect of CPU speed on 3DMark scores; I think the gain you see depends on your graphics card and whether it can use that extra CPU power. To put that in context, Sam's 4600-5500 with the XT is a 20% increase in score for a 67% overclock on the CPU. My 10,220-10,904 is a 7% increase for an 11% overclock on the CPU – clearly much better figures.
    So, at 2.66GHz I'll be 11,000 plus easily, and that's at least a whole GHz lower than Abu, but I'm holding off fiddling too much (back at 2.33 now) until I get my 800MHz RAM through, which will get me to 2.8GHz in linked memory mode, and that should get me past 11,500 with no problems, just by tweaking the CPU and without touching the graphics card. And if I had a 3.0GHz chip, or if I could get mine to 3.0GHz, then we'd be talking 12,000, and that would be about a 30% or more gain on a 60% overclock.
    So Abu, I hope I'll be able to set you a challenge in a short while – crack 11,500 with your setup!! For now I'd be interested to see your score with your CPU at stock, that's 2.66 isn't it? (9,000s??)
     
  12. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    So what if I told you that by increasing my overclock by a further 150MHz, I got an extra 122 marks?
    3.0GHz: 6755
    3.15GHz: 6877
     
  13. MaccerM

    MaccerM Regular member

    Joined:
    Aug 6, 2007
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    26
    On your overclocked graphics card? Well, that proves the point: the faster your graphics card, the better it is able to use the extra CPU power and convert it into points. My original comparison was based on CPU clocks at stock GPU speeds so that it's fair. When you o/c the graphics card you're changing additional variables, so you change the results, but going on your old figures with the card at 663MHz you went from 6700 to 6870, which is a 3% increase on a 5% CPU o/c, which is obviously much better than the 20% increase on the 67% CPU o/c you got at stock speeds. Obviously mine would scale better if I o/c'd the GPU as well, but that's not the comparison. The comparison was that the 8800's performance scales much better with CPU clock (at stock GPU speeds) than the X1900's. It just shows that the newer cards benefit more from CPU speed increases.
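    A rough way to compare the two cases is just score gain divided by CPU clock gain, restating the rounded percentages above:
    [code]
    # Score gain per unit of CPU overclock, using the rounded percentages above.
    def scaling_ratio(score_gain_pct: float, clock_gain_pct: float) -> float:
        """Fraction of each percent of CPU overclock that shows up as extra score."""
        return score_gain_pct / clock_gain_pct

    # X1900XT with the GPU overclocked to 663MHz: ~3% more score on a ~5% CPU o/c.
    print(scaling_ratio(3, 5))               # 0.6
    # X1900XT at stock GPU speeds: ~20% more score on a ~67% CPU o/c.
    print(round(scaling_ratio(20, 67), 2))   # 0.3
    [/code]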
     
  14. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Yeah, 663/873 is my overclocked speed. 625/725 is the original.
     
  15. Waymon3X6

    Waymon3X6 Regular member

    Joined:
    Mar 9, 2006
    Messages:
    2,193
    Likes Received:
    0
    Trophy Points:
    46
    Pretty good overclock, Sam! Is that with the stock cooler? Oh wait, you're running the HR-03, right?

    Just ran 3DMark06 again with the quad @ 4GHz and 1.6V on the vCore. Still testing to see where the FSB wall is. Other people say they get to 4.20 on 1.6 before they hit the wall. Might try to run it at those speeds, and increase the GPU clock to 900MHz when I get the flashed BIOS loaded.

    20265 3DMarks
    SM2:8270 Marks
    SM3:9379 Marks
    CPU:5547 Marks

    Having some problems w/ publishing the results, it says I am running an illegal copy when I used the 3DMark06 key code that came with the Maximus...

    Heres the link anyway, not sure how long it will be up there:

    http://service.futuremark.com/orb/resultanalyzer.jsp?projectType=14&XLID=0&UID=13254827

    EDIT: Here's the link to the screen shot, sorry for the giant size!
    http://img172.imageshack.us/my.php?image=3dmark06gw8.jpg
     
    Last edited: Jan 8, 2008
  16. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    I am, but I achieved the same overclock on the GPU with its stock cooler. It was obviously just a lot noisier!
    1.6V vCore? Holy hell, that's high. Are you going to keep running it at that voltage?
     
  17. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    Ray, you're going to burn out your CPU at that voltage, and your parents will be mad lol. 4GHz on a quad core is crazy, I didn't know they could do that much! Seriously though, I wouldn't take the voltage up past 1.52V.
     
  18. Waymon3X6

    Waymon3X6 Regular member

    Joined:
    Mar 9, 2006
    Messages:
    2,193
    Likes Received:
    0
    Trophy Points:
    46
    ...It was only for one 3DMark run, as I was desperate to break 20k. I think that voltage is the max for the quad, so I will try lowering it in very small increments to find the lowest voltage where I can still keep 4GHz.
     
  19. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Then I'll ignore that figure. What do you get at a CPU voltage you're happy to live with?
     
  20. abuzar1

    abuzar1 Senior member

    Joined:
    Jan 5, 2005
    Messages:
    5,818
    Likes Received:
    4
    Trophy Points:
    118
    The voltage our CPUs can handle should be about the same, as they are both G0 stepping. In fact I think a Duo can overclock more and take more voltage.
     
