
Video Card Thread (Mostly Gecube x1950xt)

Discussion in 'PC hardware help' started by Waymon3X6, Jun 28, 2007.

  1. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    No, I thought they were still 32-bit, with maybe 64-bit versions as well.
    Why else would I have read that people will be stuck with their C2Ds when "Vienna", being 64-bit only, comes out?
    That's weird...

    But yeah, it's also because of Vista.
    I haven't found much that speaks in favour of upgrading to Vista, especially with XP being lighter and still good and supported and all.
    (Supported until like 2009 or 2014 or something; I know there will be an SP3, for instance, right?)
    I just got a better feeling from reading about them making no mistakes with the next Windows OS.

    The only good reason I can come up with to upgrade to Vista is to get DX10.
    But that alone is not nearly enough...


    So, I'll try the frequency one step up, and I'll see what happens.
     
  2. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Yeah, let us know how you get on.
    As far as I can recall, only the Core Duos in laptops (not Core 2 Duos) were lacking 64-bit support.
     
  3. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    Ah, alright.
    But are there also 32-bit C2Ds for desktops?
    Otherwise they wouldn't really have had to create 32-bit versions of Vista.


    But ahm, I read the BIOS wrong.
    I can't edit the second two lines (under 'Memory Frequency For') that I mentioned.
    I can only set the 'Auto' to '1.33', '1.66' or '2.0'.
    And it seems to be at 2.0 already, since that shows the same values as Auto does.
    So I'll leave it at that, since the other options are lower anyway.

    But I can edit the first two lines (under 'CPU Host Clock Control') like I mentioned, hence the 'X' before the options/values.
    So should I change the values behind 'AGP/PCI/SRC Fixed', since they are the same values as after 'AGP/PCI/SRC Frequency (Mhz)'?
    Maybe that's the way to change it; maybe 'Fixed' means you can edit the value individually or something?
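
    To get my head around it, here's the arithmetic as I understand it (just a sketch with guessed numbers; I'm assuming the multiplier is an FSB-to-DRAM ratio, and the usual 66/33/100MHz defaults for AGP/PCI/SRC, since I don't have the manual in front of me):

        # Sketch of how these BIOS values probably relate (assumptions,
        # not figures from my board's manual).
        fsb_mhz = 200  # example host clock; the real FSB may differ

        # 'Memory Frequency For' looks like an FSB-to-DRAM ratio:
        for ratio in (1.33, 1.66, 2.0):
            print(f"ratio {ratio}: DRAM at {fsb_mhz * ratio:.0f}MHz")

        # 'Fixed' presumably pins these buses at their defaults so they
        # don't climb along with the host clock when overclocking:
        agp_mhz, pci_mhz, src_mhz = 66, 33, 100
        print(f"AGP {agp_mhz}MHz, PCI {pci_mhz}MHz, SRC {src_mhz}MHz")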
     
    Last edited: Aug 4, 2007
  4. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Well, 64-bit Vista works a lot differently from 32-bit Vista; it's not a simple case of "if your CPU can do it, then do it".
     
  5. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    But you can only run a 64-bit OS on a 64-bit CPU, right?
    Or is it like, you can run UP TO 64-bit, so lower bit-widths, including 32-bit, too?


    But ahm, should I change the 'AGP/PCI/SRC Fixed' then?
    As I showed in the previous post.
     
  6. Waymon3X6

    Waymon3X6 Regular member

    Joined:
    Mar 9, 2006
    Messages:
    2,193
    Likes Received:
    0
    Trophy Points:
    46
    wow this is way off topic.... lol
     
  7. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    No it's not; I'm still trying to speed up the AGP bus to help the card.
    I don't know if I should do the thing I said.

    The other half, about CPUs, is, yeah.
     
  8. Waymon3X6

    Waymon3X6 Regular member

    Joined:
    Mar 9, 2006
    Messages:
    2,193
    Likes Received:
    0
    Trophy Points:
    46
    That's what I was referring to...

    Anyway, maybe this card will always perform mediocre? Since it is a PCIe GPU modified to fit an AGP slot, the AGP slot just can't handle the bandwidth, I guess.

    I have not seen one person with this card working 100%. I guess I kinda got mine to work; medium/high on BF2 at 25-40fps is good enough for me, since that's what I bought this card for.

    I guess I'll start to hate this card when Crysis comes out, and I can only play on low at 20fps. :(
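
    Rough numbers, if I have them right (textbook figures, not measurements from this card):

        # Back-of-envelope bus bandwidth, using the nominal specs:
        # AGP 8x is a 32-bit bus at ~66MHz with 8 transfers per clock.
        agp8x_mb_s = 66.66 * 8 * 32 / 8        # ~2133 MB/s, shared both ways
        # PCIe 1.0 x16 is 16 lanes at 250 MB/s per lane, per direction.
        pcie_x16_mb_s = 16 * 250               # 4000 MB/s each way
        print(f"AGP 8x:   ~{agp8x_mb_s:.0f} MB/s (shared both directions)")
        print(f"PCIe x16: ~{pcie_x16_mb_s} MB/s per direction")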
     
  9. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    I think you will, but I also think you'll hate AGP. If you want to play Crysis, you're just going to have to dig deep, I'm afraid. I'm not looking forward to subjecting my poor little PC to that.

    As for the CPU, correct: all Core 2 Duos, Athlon 64s, Athlon X2s and Athlon FXs, along with the Pentium 4 600 series (and I think the 500s as well) and the Pentium D 800/900, support 64-bit, but will run 32-bit OSes fine.

    Yes, go for it on the AGP/PCI/SRC frequency.
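
    If you ever want to check a machine yourself, here's one quick way (a sketch assuming a Linux live CD; the 'lm' flag in /proc/cpuinfo marks x86-64 capability):

        # Quick 64-bit capability check on Linux: the 'lm' (long mode)
        # flag in /proc/cpuinfo means the CPU can run a 64-bit x86 OS.
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    capable = "lm" in line.split()
                    print("64-bit capable" if capable else "32-bit only")
                    break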
     
  10. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    I didn't get this card for medium/high settings, but for the highest, depending on the game.
    Well, I would put some settings in a game like Battlefield at Medium to let the system "breathe" a bit, for performance's sake.
    With most settings in BF you don't even see much difference between Medium and High, while some are critical to have at High(est), though.
    But this card should just suck it up and pump 100fps at the highest settings in BF2, damnit lol.
    This card is from early '07, like May or something; not that it's the best card, but still.
    It had better run a damn 2005 game well, especially for its price.

    I'm sorry, but it's kinda pissing me off now.
    Oh well, I'll try the frequency thing...
     
    Last edited: Aug 5, 2007
  11. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Actually, I'm not entirely sure my card can pull off 100fps at max at that res in BF2, and that's PCI Express.
     
  12. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    Yeah ok, but that sounds like it should be able to do 50-75+ constantly.
     
  13. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    True. I guess it's now obvious why nobody else made the X1950XT in AGP. If you're still not happy, all I can suggest is trying a 7800GS+.
     
  14. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    I tried the frequency thing, but I think it only smoothed things out a bit.
    Frames still drop too low.

    I have been thinking about another one I've seen, yes, by XFX, but I'd have to find the bookmark of the exact one again.
    I don't know that it would perform better, but I feel more confident with nVidia and a more standard-looking card.
    Plus it's more of an official card, probably made for AGP anyway.

    The messed-up thing is probably that it's the same price as this one.
    Which just feels weird, you know, paying the same for way lower specifications.
    I know it's the performance that counts, but still...
     
  15. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    As far as I'm aware, they're also non-standard and made by only one company; I think it's Gainward. There's nothing explicitly wrong with ATi cards, it's just that this one's a bit odd.
     
  16. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    Yeah, that's what I mean.
    This one's a pretty different case than the other ones, which always happens to me with products (and girls and most other situations).
    Anyway, the XFX is one of those more standard-looking ones too: a flat (1-slot) card with a "blower" and one of those smaller fans.

    Plus, I haven't read or experienced many positive things about ATi.
    Even though I believe, or know, that it's one of the (at least) two top GPU brands.
    No doubt about it actually, right...

    But it's like, I always have a feeling a GeForce will work right.
    I put the card in, install the drivers, and it's ready to go.
    But with ATi, there are things like the CCC not even being able to open, bad performance, stuff like that.
    I've also read that the quality of ATi Radeon graphics is a bit lower, like more fragments and things like that.

    It's always been pretty good to have a GeForce, starting with the... ehm... GeForce 3 Ti200 or something; before that it was a Trust Voodoo XD, and then the 6800XT.
    Apart from the fact that they weren't always the best cards out, I never had many problems besides low performance.
    With the reason being low specs, that is; not having high specs and still getting low performance like now :p .
     
    Last edited: Aug 6, 2007
  17. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Actually, up until the end of the 7000 series, ATi cards offered a marginally better picture at marginally decreased performance, which I preferred: if you don't want it quite so sharp, you can turn the detail down to get your performance back, whereas the nVidia cards never let you get the best visual experience at all. When the 8800 arrived, that was finally resolved.
    The truth is, there are only two major players in the desktop graphics market, in much the same way as there are only really two CPU makers now, although more do exist. However, unlike Intel vs AMD, a long-standing rivalry of underdog vs market menace, ATi vs nVidia is a long-standing know-it-all (ATi) versus a ground-up newcomer (nVidia). The first GeForce only appeared a few years ago, so it's pleasing to see just how far nVidia have come.
    I hold no fanboyism for or against either company, but up until the 8000 series I usually went with ATi cards, due to the scandals nVidia used to get where they are now. The GeForce FX series was awful, left in the dust by the performance of the Radeon 9000 series, but that's not what the benchmarks had you believe, simply because back then the cards and drivers were engineered to rack up massive 3DMark scores to drive sales, while offering poor performance everywhere else.

    I too had a GeForce, an MX440 PCI, a long while back; it was fine, and none of the six graphics cards I've owned over the years has ever had a fault, so I trust both manufacturers on that front.
    You've just had a negative experience with one brand and been put off using them, and in your case it isn't strictly ATi you should be cross with, it's GeCube. There are plenty of people who've had issues with an nVidia card (and you can bet your life the fault was with the card manufacturer, not nVidia) and sworn never to use them again.


    These are the cards I've had over the years:

    ATi Rage Pro 8MB, integrated over PCI bus, 1999-2002
    Sparkle GeForce 4 MX440 64MB PCI, 64-bit, 2002-2004
    Sapphire Radeon 9200 Atlantis 256MB AGP, 128-bit, Jan-Sep 2004
    Sapphire Radeon X800 Pro 256MB AGP, 256-bit, Sep 2004 - Jul 2006
    Sapphire Radeon X1900 XT 512MB PCI Express, 2006 - present (refitted with a Thermalright HR-03 cooler)
    Sapphire Radeon X1600 Pro 256MB PCI Express, 64-bit, 2007 - present, in multimedia server
     
    Last edited: Aug 6, 2007
  18. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    I know it's usually the manufacturer to blame, and I can easily "guess" that GeCube isn't one of the bigger GPU brands.
    I only found the brand GeCube when I searched for the AGP card with the highest specifications, months ago.
    I even had to get used to the name, since I know the Nintendo GameCube from years ago lol.

    But I guess the only other Radeon I had, which was a lower-end 9200 LE Family or something, was also because it was from Gigabyte, like my motherboard.
    Only, Gigabyte isn't one of those big GPU brands like Sapphire, for instance, but it is good with motherboards and things like that.
    I've never seen anything bad about them anyway, and it's not letting me down so far.

    Then again, I had bought a Sapphire before, which was an X1600XT (512MB) I believe.
    I'm not sure, but something like that.
    Only it wouldn't even show anything on-screen when I started the computer.
    But I think it was the computer to blame, since it was a DELL.
    I believe that company is known to, like... lock features and stuff; not even being able to edit the BIOS and things like that?


    I didn't get very deep into the technical side of computers before, like, I think 2005.
    But I only started to know about GeForces when they had those GeForce 2, 3, 4 and maybe 5(?) and all.
    And that's just from hearing about them from other people and seeing them displayed in stores.
    Later I saw a lot of people on the internet had those high-end ones from the 7-series, and that they were so awesome and everything.
    And also PCIe, SLI and Crossfire appearing and stuff like that.


    Oh yeah, I just remembered we also had the Viper 550 after the Voodoo2, I believe.
    Around those times that 3dfx came out, which was all awesome and stuff hèhè.
    I still remember games having splash/intro screens for 3dfx, like GTA 2.

    And I think this is the best AGP GeForce I could find:
    http://www.xfxforce.com/web/product...ce™+7900&productConfigurationId=1006114
    It's really pushing against the barrier of being an 8-series lol.
    I mean like: SEVENTY-NINE-FIFFFTY... GEE.. TEE *PUSH* FIVEHUNDREDANDTWELVE EMBEEEEES ARGH! *explosion*

    :p ...

    I'm not a big fan of those colors and all, but I like how it looks on its own.
    Plus the green would kinda match my case; I believe it's even got a light-up logo on the top side.

    Again, a blower facing the wrong side, but I guess I could also look at one of those other ones, I forgot the brand.
    Or else Zalman again...

    What do you think of it?
    How would it perform with its specifications; would it be able to play HD quality fluently (without stutters)?
    And at what game settings, of course, things like that...


    Sorry for the long posts, just blabbing on; take your time lol...
     
  19. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Pretty much all of those problems sound like they weren't the graphics card.
    Anyway, you're welcome to try the 7950; XFX are a good brand. Graphics power is sort of important for playing HD video, but the processor is too.
     
  20. DInc

    DInc Regular member

    Joined:
    Jul 19, 2007
    Messages:
    254
    Likes Received:
    0
    Trophy Points:
    26
    Yeah, the same guy with the bus-frequency idea told me the 6800XT didn't have this certain technology to run HD correctly.
    Even though I do have PowerDVD with this certain HD codec (or something) installed.
    With this GeCube card, HD still performed about the same.
    Oh, I do see the H.264 for HD mentioned at the bottom of the page, though.

    I don't know if you know whether my processor could handle it?

    Copied:
    "High-Definition H.264, MPEG-2 and WMV Hardware Acceleration2
    Smoothly playback H.264, MPEG-2, and WMV video—including WMV HD—with minimal CPU usage so the PC is free to do other work."

    I guess it would give my processor a break.



    And how do you think games would do, like an estimate of settings vs. framerates?

    You have to know that I want it to play the heavy game Rainbow Six: Vegas, so I don't know if the card would be enough.
    And I'm not really planning to keep the best-looking settings turned down.
    I'm sure it's an improvement over the 6800XT, but... again, maybe it's not enough to take it.


    http://www.gpureview.com/show_cards.php?card1=441&card2=511
    It seems like every number is about double or even 3-4 times as much.

    And compared to the X1950XT:
    http://www.gpureview.com/show_cards.php?card1=511&card2=503

    Some numbers are like, damn, that's so much more than it might need.
    Note that, surprisingly, the XFX's Memory Clock is listed as 1300MHz instead of 700, so 400MHz more than the X1950XT's 900MHz.
    Or actually 600MHz more, since mine shows 702 requested in the Overdrive menu, strangely enough.

    Again, what I also meant before is that the Texture Fill Rate is usually better(?) on GeForces than on Radeons.
    Plus it has double the amount of video RAM.

    The Memory Bandwidth is a bit lower, but I guess that would rather fix the trouble it has here??

    I'm only worried about the Shader Operations being 2-3 times as much on the X1950XT.
    I mean, shaders are a big thing in games, right?
    It does have Shader Model 3.0, obviously...
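
    To untangle those numbers a bit, here's the arithmetic as I understand it (a sketch using the figures above; I'm assuming DDR memory moves data twice per clock, so 1300 is the already-doubled "effective" rate, while my 702 is the actual clock):

        # Sketch of the spec arithmetic; figures come from the comparison
        # pages above, and DDR memory doubles the data rate per clock.

        def bandwidth_gb_s(effective_mhz, bus_width_bits):
            # bandwidth = effective clock * bus width in bytes per transfer
            return effective_mhz * (bus_width_bits / 8) / 1000.0

        # XFX 7950GT: 1300MHz is the effective (doubled) rate, 650MHz actual.
        print(bandwidth_gb_s(1300, 256))   # ~41.6 GB/s

        # X1950XT: 900MHz actual -> 1800MHz effective on the stock card;
        # mine reports 702 actual, so only ~1404 effective.
        print(bandwidth_gb_s(1800, 256))   # ~57.6 GB/s
        print(bandwidth_gb_s(1404, 256))   # ~44.9 GB/s

        # The Shader Operations gap likewise follows from unit counts
        # (if I read gpureview right): 48 pixel shaders * 625MHz on the
        # X1950XT vs 24 pipelines * 550MHz on the 7950GT, roughly 2.3x.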


    I wish there was an easier way to find honest results.
    Like some website where you can "build a system", obviously with the same parts as your own, and see what it would do with a certain card.

    Doesn't something like that exist?
    Or at least a website with existing results from similar systems?
     
