
C&C Generals Playing Errors

Discussion in 'CD-R' started by Crossfire, Aug 6, 2003.

  1. Applecorp

    Applecorp Member

    Joined:
    Aug 19, 2003
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    16
    Yeah, I'm going to look for a cheap GF4 MX card I think. I can't afford the best :(
     
  2. Praetor

    Praetor Moderator Staff Member

    Joined:
    Jun 4, 2003
    Messages:
    6,830
    Likes Received:
    1
    Trophy Points:
    118
    After looking through some of my docs on the GF3Ti and the GF4MX, I think you may find that a GF3Ti yields better performance than the GF4MX... your call really. :)
     
  3. Applecorp

    Applecorp Member

    Joined:
    Aug 19, 2003
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    16
    Thanks Praetor. I was considering getting a used card, and a GF3Ti might be the best option - and very cheap.

    Thanks again :)
     
  4. Praetor

    Praetor Moderator Staff Member

    Joined:
    Jun 4, 2003
    Messages:
    6,830
    Likes Received:
    1
    Trophy Points:
    118
    Well, MaxPC says both the GF3-series and GF4-series cards are DX8 cards; however, I have read in places that the GF4MX cards are DX7, which is slightly disturbing hehe. Try to get a DX8 card, dude.
     
  5. Applecorp

    Applecorp Member

    Joined:
    Aug 19, 2003
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    16
    MX DX7? That is disconcerting!

    I have found an "Abit nVIDIA GF4 Ti4200 128MB-DDR 8xAGP Rtl GphCard" new for £106 (that's $167 US).

    Any good?
     
    Last edited: Aug 24, 2003
  6. Shoey

    Shoey Guest

    A) Yes, although the price seems high. I bought my video card about 4 months ago and paid 90 buckaroos total. "Google" up Jayton or eVGA.

    Shoey :)
     
  7. Applecorp

    Applecorp Member

    Joined:
    Aug 19, 2003
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    16
    Well I'm in the UK so prices are bound to be higher!
     
  8. Praetor

    Praetor Moderator Staff Member

    Joined:
    Jun 4, 2003
    Messages:
    6,830
    Likes Received:
    1
    Trophy Points:
    118
    I'm in Canada and the price seems about $40 CAD too high, but well within a reasonable range. Then again, if I think about it a little more, Abit is a decent manufacturer and it may be worth the extra bit of moolah.
     
  9. Praetor

    Praetor Moderator Staff Member

    Joined:
    Jun 4, 2003
    Messages:
    6,830
    Likes Received:
    1
    Trophy Points:
    118
    Ah.... finally... manual found... now to clarify some stuff:

    FROM THE MSI MANUALS

    GF4Ti4600
    - Features the nVidia nfiniteFX II engine
    - Dual Programmable Vertex Shaders <-- This is indicative of DX8/DX9 cards... DX7 cards did not feature programmable anything
    - Lightspeed Memory Architecture II
    - Accuview Antialiasing
    - 4 dual-rendering pipelines
    - 8 texel/cycle
    - Dual cube environment mapping
    - 10.4GB/s memory bandwidth
    - 136M vertices/sec
    - 4.8G AA samples/sec fill rate
    - 1.23T operations/sec
    - DX8.1 Card

    GF4Ti4400
    - Features the nVidia nfiniteFX II engine
    - Dual Programmable Vertex Shaders <-- This is indicative of DX8/DX9 cards... DX7 cards did not feature programmable anything
    - Lightspeed Memory Architecture II
    - Accuview Antialiasing
    - 4 dual-rendering pipelines
    - 8 texel/cycle
    - Dual cube environment mapping
    - 8.8GB/s memory bandwidth
    - 125M vertices/sec
    - 4.4G AA samples/sec fill rate
    - 1.12T operations/sec
    - DX8.1 Card

    GF4Ti4200
    - Features the nVidia nfiniteFX II engine
    - Dual Programmable Vertex Shaders <-- This is indicative of DX8/DX9 cards... DX7 cards did not feature programmable anything
    - Lightspeed Memory Architecture II
    - Accuview Antialiasing
    - 4 dual-rendering pipelines
    - 8 texel/cycle
    - Dual cube environment mapping
    - 8.0GB/s memory bandwidth
    - 113M vertices/sec
    - 4.0G AA samples/sec fill rate
    - 1.03T operations/sec
    - DX8.1 Card

    GF4MX460
    - 2nd Generation T&L Engines
    - Non-programmable Shading rasterizer with 24 of 26 DX8 pixel shading functions <-- so is this really a DX8 card?
    - 38M Triangles/sec
    - 1.2G texel/sec fill rate
    - 600M pixel/sec fill rate
    - Single cube environment mapping
    - 8.8GB/sec memory bandwidth
    - According to MSI, this is a DX8.1 capable card

    GF4MX440
    - 2nd Generation T&L Engines
    - Non-programmable Shading rasterizer with 24 of 26 DX8 pixel shading functions <-- so is this really a DX8 card?
    - 34M Triangles/sec
    - 1.1G texel/sec fill rate
    - 540M pixel/sec fill rate
    - Single cube environment mapping
    - 6.4GB/sec memory bandwidth
    - According to MSI, this is a DX8.1 capable card

    GF4MX420
    - 2nd Generation T&L Engines
    - Non-programmable Shading rasterizer with 24 of 26 DX8 pixel shading functions <-- so is this really a DX8 card?
    - 31M Triangles/sec
    - 1.0G texel/sec fill rate
    - 500M pixel/sec fill rate
    - Single cube environment mapping
    - 2.7GB/sec memory bandwidth <-- OUCH!
    - According to MSI, this is a DX8.1 capable card

    GF3Ti200Pro
    - nfiniteFX engine for full programmability <-- something the GF4MXs don't have
    - Lightspeed Memory Architecture <-- yet another thing the GF4MXs don't have
    - Programmable Vertex Shader <-- and again
    - Programmable Pixel Shader <-- and again
    - The manual says "integrated hardware T&L" so I would imagine only 1st generation T&L
    - 2.8G AA samples/sec fill rate
    - 6.4GB/sec memory bandwidth
    - DX8.1 capable card
    Interesting that they don't specify the pixel and texel fill rates.


    FROM THE ASUS MANUALS

    GF3Ti500Pro
    - nfiniteFX engine for full programmability <-- something the GF4MXs don't have
    - Lightspeed Memory Architecture <-- yet another thing the GF4MXs don't have
    - Programmable Vertex Shader <-- and again
    - Programmable Pixel Shader <-- and again
    - The manual says "integrated hardware T&L" so I would imagine only 1st generation T&L
    - 3.84G AA samples/sec fill rate
    - 6.4GB/sec memory bandwidth
    - DX8.1 capable card
    Interesting that they don't specify the pixel and texel fill rates.
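
    As a side note, the memory bandwidth numbers above are just bus width times effective memory clock, so you can sanity-check them yourself. A quick sketch (the 128-bit bus widths and the clock figures here are the commonly published ones, not taken from the manuals quoted in this post, so treat them as assumptions):

    [code]
    #include <stdio.h>

    // Peak memory bandwidth = bus width in bytes x effective memory clock.
    // Bus widths and clocks are assumed from commonly published specs,
    // not from the MSI/ASUS manuals quoted above.
    double bandwidth_gbs(int bus_bits, double effective_mhz)
    {
        return (bus_bits / 8.0) * effective_mhz / 1000.0; // GB/s
    }

    int main()
    {
        printf("GF4Ti4600: %.1f GB/s\n", bandwidth_gbs(128, 650.0)); // 10.4
        printf("GF4Ti4400: %.1f GB/s\n", bandwidth_gbs(128, 550.0)); //  8.8
        printf("GF4Ti4200: %.1f GB/s\n", bandwidth_gbs(128, 500.0)); //  8.0
        printf("GF4MX420:  %.1f GB/s\n", bandwidth_gbs(128, 166.0)); // ~2.7 (SDR)
        return 0;
    }
    [/code]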


    Summary
    All the GeForce4Ti model cards are based on the NV25 chipset
    All the GeForce4MX model cards are based on the NV17 chipset
    I have no idea what the chip model is for the GF3Ti cards (not even sure they made GF3MX cards)
    If you have the budget, jump to a GF4Ti; otherwise hunt around for a GF3Ti.

    According to MaximumPC's August issue, their final summaries state that any nVidia GeForce3 and any nVidia GeForce4 card is a DirectX8-compliant card; however, their first paragraph states that DirectX8 cards were the first to include programmable shaders on the GPU.

    Furthermore, in MaximumPC's June issue, they compared some of the older video cards:
    -GF3: "...Although quite anemic compared to today's DirectX9 accelerators, the GeForce3 holds its own in the poor-boy scene. But will a core speed of 200MHz and 230MHz DDR be enough to trump the GeForce4 MX 460?"
    -GF4MX460: "When you see the 'MX' designator, you think 'budget'. But while nVidia's budget video card, the GeForce4 MX 460, doesn't run programmable shaders in hardware, it does feature higher core and memory speeds than the programmable GeForce3 (300/275MHz as opposed to 200/230MHz), and more memory bandwidth than the GeForce4 Ti4200. We didn't think the nForce2 could match the raw power of the GeForce4 MX 460"
    -The Contest: "...And despite our conjecture that the highly clocked GeForce4 MX 460 would be the overall champ, the old GeForce3 nosed by it to steal victory. It turns out that in this contest, the GeForce3's 57 million transistors and four pipelines beat the GeForce4 MX 460's 29 million transistors and two pipelines - core and memory speeds notwithstanding."

    [i]Personally[/i], I tend to agree with the June issue's judgement about the GF4MX and the GF3 (and they were comparing with the MX460 -- odds are, the one you will find is an MX440; they did not, however, specify whether the GF3 was a GF3Ti200 or GF3Ti500). It seems their August issue made a slight goof-up in classifying ALL GF4's as DirectX8 capable. (Even according to the specs, the MX cards only feature 24 of 26 DX8 pixel shading functions.) So, in summary (hehe, this has turned out to be quite the rant): if you have the money, hunt down a GF4Ti; if you don't, then (if you have the time, as they are incredibly hard to come by, in my experience) hunt down a GF3Ti; and as a last resort, grab a GF4MX.
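
    If you'd rather not trust the marketing at all, you can ask the card itself what it supports: DirectX 8 exposes the hardware shader versions through the device caps. A minimal sketch (assuming the DirectX 8 SDK headers and d3d8.lib are available; error handling trimmed for brevity):

    [code]
    #include <windows.h>
    #include <d3d8.h>
    #include <stdio.h>

    int main()
    {
        // Create the Direct3D 8 object and query the primary adapter's caps.
        IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
        if (!d3d) { printf("DirectX 8 runtime not found\n"); return 1; }

        D3DCAPS8 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
            printf("No hardware (HAL) device\n");
            d3d->Release();
            return 1;
        }

        // Shader versions are packed values; the macros pull out major.minor.
        printf("Vertex shader %d.%d, pixel shader %d.%d\n",
               D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
               D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion),
               D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
               D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));

        // A "true" DX8 part reports at least pixel shader 1.1;
        // a GF4MX-class card reports 0.0 here despite the DX8.1 branding.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
            printf("Hardware pixel shaders present\n");
        else
            printf("No hardware pixel shaders (DX7-class hardware)\n");

        d3d->Release();
        return 0;
    }
    [/code]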
     
  10. Applecorp

    Applecorp Member

    Joined:
    Aug 19, 2003
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    16
    I think I'm going to go for the card I found. I know I can find it cheaper, though not by much, and as you said, Abit are good - my GF3 Siluro is an Abit and I am very happy with its performance.
     
  11. Shoey

    Shoey Guest


    GF4Ti4200-TD
    - Features the nVidia nfiniteFX II engine
    - Dual Programmable Vertex Shaders <-- This is indicative of DX8/DX9 cards... DX7 cards did not feature programmable anything
    - Lightspeed Memory Architecture II
    - Accuview Antialiasing
    - 4 dual-rendering pipelines
    - 8 texel/cycle
    - Dual cube environment mapping
    - 8.0GB/s memory bandwidth
    - 113M vertices/sec
    - 4.0G AA samples/sec fill rate
    - 1.03T operations/sec
    - DX8.1 Card


    http://www.msi.com.tw/program/products/vga/vga/pro_vga_detail.php?UID=378

    [b]Here's a great MSI video card, "if the price is right".[/b]
    MSI FX5600-TD256 (nVidia GeForceFX 5600, AGP 8x)
    http://www.msi.com.tw/program/products/vga/vga/pro_vga_detail.php?UID=446

    Shoey :)
     
    Last edited by a moderator: Aug 24, 2003
  12. Praetor

    Praetor Moderator Staff Member

    Joined:
    Jun 4, 2003
    Messages:
    6,830
    Likes Received:
    1
    Trophy Points:
    118
    Yes, indeed an excellent card! Beware the MSI drivers though - they tend to be a bit dated... and finicky at times. I would recommend you use the nVidia reference drivers unless you have a specific reason not to. :)

    [small]ASUS A7V8X-X, AMD 2500+
    Samsung 1024MB, PC2700
    360GB [3x120GB, 7200, 8MB]
    MSI Starforce, GeForce4 Ti4400 128MB

    AFTERDAWN IRC: irc.emule-project.net, #ad_buddies
    COME SAY HI![/small]
     
    Last edited: Aug 24, 2003
  13. Praetor

    Praetor Moderator Staff Member

    Joined:
    Jun 4, 2003
    Messages:
    6,830
    Likes Received:
    1
    Trophy Points:
    118
    Finally dug it up... now to put this DX7/DX8 thing to rest hehe. According to PC Gamer (April 2002), all the GF4Tis use the NV25 core while the GF4MXs use an NV17 core.
    So it's decided... I'm surprised Maximum PC made such a goof-up :S
     
  14. Applecorp

    Applecorp Member

    Joined:
    Aug 19, 2003
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    16
    I was surprised the GF3 series didn't last that long; the GF2 series seemed to go on indefinitely.

    But the leap between 2 and 3 is quite immense don't you agree?

    My GF3 card quite happily runs today's most demanding games.

    Also, what are your thoughts on DirectX 9.0b for my GeForce2 Pro and later cards?
     
    Last edited: Aug 27, 2003
  15. Praetor

    Praetor Moderator Staff Member

    Joined:
    Jun 4, 2003
    Messages:
    6,830
    Likes Received:
    1
    Trophy Points:
    118
    Yes, indeed I would agree the jump from GF2 to GF3 is immense... the introduction of the programmable pixel/vertex shaders accounts for that. Also, the GF3s are DX8-hardware cards whereas the GF2s are DX7 cards. All this fancy crap just means the hardware is natively capable of handling DX7/DX8 instructions. If I am not mistaken, if your hardware does not support, say, DX9 instructions, then those instructions are relegated to your CPU, which kinda negates the point of having a video accelerator in the first place hehe. This of course only applies to specific application calls for DX9 effects and not necessarily to the entirety of the game (I hardly believe that the main menu is 3D-rendered and requires extensive hardware DX9 support hehe).
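
    To make that concrete: a DX8 game typically checks the caps at startup, and if the card can't do the transform/vertex work in hardware, it creates the device with software vertex processing - exactly the "relegated to your CPU" case. A rough sketch (CreateDeviceWithFallback is a made-up helper name, and the window handle and present parameters are assumed to be set up elsewhere):

    [code]
    #include <windows.h>
    #include <d3d8.h>

    // Hypothetical helper: pick hardware or software vertex processing
    // based on what the caps report, then create the device.
    IDirect3DDevice8* CreateDeviceWithFallback(IDirect3D8* d3d, HWND hWnd,
                                               D3DPRESENT_PARAMETERS* pp)
    {
        D3DCAPS8 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
            return NULL;

        // No hardware T&L -> the CPU ends up doing the vertex work.
        DWORD vp = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
                 ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                 : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

        IDirect3DDevice8* device = NULL;
        if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                     hWnd, vp, pp, &device)))
            return NULL;
        return device;
    }
    [/code]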

    That it may, but with the coming of HL2 (which I'm not looking forward to hehe) and DOOM3 (which I am looking forward to), your current hardware setup may be pushed quite a bit harder than even some of the more demanding games push it now hehe. I would venture to guess that to play HL2/D3 at an "enjoyable" level one would need a GF4Ti-series or better card... a GF3 may do it, but your enjoyment factor will be heavily dependent on the other aspects of your hardware.
     
  16. Applecorp

    Applecorp Member

    Joined:
    Aug 19, 2003
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    16
    I eventually bought a used GF4 Ti4200 for £60 on eBay; not bad, I think.

    I too am very much anticipating HL2 and DOOM 3, as I would prefer to play them on PC rather than my XBOX. Is HL2 coming to XBOX?

    Would my new card be sufficient to play these games at a decent framerate, or would my CPU (Athlon 1GHz), RAM (256MB), etc. drag it down? I'm pretty sure I need more RAM at least.
     
    Last edited: Sep 14, 2003
  17. Shoey

    Shoey Guest

    You'll get the best performance upping your video card RAM if you're a PC gamer, m8. Sure, there are "tweak" programs out there to help.
    Seriously consider upping your system RAM to at least 512MB. I'm running 512MB as it is and I'm not comfortable; I'll soon be upping to 1 gig.

    Shoey :)
     
  18. Applecorp

    Applecorp Member

    Joined:
    Aug 19, 2003
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    16
    Thanks, I'll probably add another 512MB.

    So no CPU problems then? If I had to change it, it would mean buying another mobo. My CPU seems to be performing very well, all things considered; I recently ran a benchmark/stress test and it passed.
     
  19. Praetor

    Praetor Moderator Staff Member

    Joined:
    Jun 4, 2003
    Messages:
    6,830
    Likes Received:
    1
    Trophy Points:
    118
    Athlon 1GHz... I take it that's the TBird or the original Athlon. In either case, you may find that HL2 and D3 run slowly (more so if you ramp up the quality settings hehe). Consider upgrading the memory, dude. I'm running 1GB and it's not enough hehe...


    About upgrading the CPU... you won't have to upgrade the mobo if you stay within the same CPU class (generally). However, since the Athlon/TBird is ancient as far as the newer systems go, if you want to upgrade the CPU to something newer (a Palomino/TBred/Barton) you will have to upgrade the mobo... look around, but you should be able to find the ASUS A7V8X-X mobo (same as me) and a TBred 2400+ for fairly cheap.
     
  20. Shoey

    Shoey Guest

    Wonders why? (just kiddin' m8)

    You can do wonders if your mobo supports the AthlonXP 2100+, as this has a multiplier of 13x (I think) and you can overclock. If your mobo doesn't support this high of a CPU, maybe a mobo BIOS flash "might" get you there, but I seriously doubt it. Look at some Asus and MSI (Micro Star) mobos, as you can find one to suit your needs at a reasonably fair price.
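
    (For what it's worth, the multiplier claim checks out roughly: assuming a 133MHz FSB board, 13 x 133MHz = 1733MHz, which is the AthlonXP 2100+'s rated clock.)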

    Shoey :)
     
