
Intel P4 vs AMD

Discussion in 'PC hardware help' started by brobear, Sep 23, 2005.

Thread Status:
Not open for further replies.
  1. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    If I could get CCC to work on my machine I'd tell you what I get. However, since I can't, that'll have to wait!
     
  2. brobear

    brobear Guest

    aabbccdd
    On PSUs, always allow a decent overhead, as you don't want a PSU running at max. At least 10% over the required output would be a good level to shoot for. If the PSU has more reserve, that just means it won't have to work as hard. Constant heat buildup shortens the life of electrical components, so the cooler it runs (less load) the better. For basic systems, though, some of the big PSUs are overkill. I'm actually getting more than I need with the Antec, but it was purchased with another, higher-performance build in mind for later.
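    brobear's 10% rule of thumb is easy to put into numbers. A minimal sketch (the 350 W load is a made-up figure for illustration):

```python
def min_psu_watts(load_watts, headroom_pct=10):
    # Size the supply at least `headroom_pct` percent above the system's
    # real draw so it never runs flat out (integer math keeps it exact).
    return load_watts * (100 + headroom_pct) // 100

# A hypothetical 350 W system load with the suggested 10% margin:
print(min_psu_watts(350))  # 385
```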
     
    Last edited by a moderator: Mar 31, 2006
  3. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Hi, sorry to divert the subject slightly, but earlier we had a debate about ATI vs. nVidia on image quality. I've been looking around, and the most noticeable example is this image.

    FOLLOW THE SLANTED WALL AT THE BOTTOM.

    (Image courtesy of Apple740 at Hardforum)


    [IMG]

    If anyone finds anything that proves the scenario the other way round I'd like to see it.
     
  4. Sophocles

    Sophocles Senior member

    Joined:
    Mar 1, 2003
    Messages:
    5,985
    Likes Received:
    77
    Trophy Points:
    128
    tocool4u

    Sam was right. Those were video card settings. You'll note that your memory clock setting of 270 MHz is exactly one half of 540.
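    The 270-versus-540 relationship is just DDR's double data rate: the memory transfers on both clock edges, so monitoring tools often report the actual clock at half the advertised "effective" rate. A quick sketch:

```python
def ddr_effective(actual_mhz):
    # DDR memory moves data on both clock edges, so the advertised
    # "effective" rate is double the actual memory clock.
    return actual_mhz * 2

print(ddr_effective(270))  # 540
```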

    Graphics cards are easy to overclock, but success depends on your card's GPU and on the cooling for both the memory and the GPU. If you want to get the most out of your cards, you might want to consider the highly optimized drivers by Omega. They take drivers from nVidia and ATI and improve them for enthusiasts. If you choose to try them out, be sure to research what you're getting into and follow the instructions. I use them all the time. Since I'm not using an nVidia card at this time, here's the link for ATI users.


    http://www.omegadrivers.net/
     
  5. vspede

    vspede Member

    Joined:
    Dec 25, 2005
    Messages:
    90
    Likes Received:
    0
    Trophy Points:
    16
    Wow, your AI Booster can go up to 20%? Mine only goes up to 10%.

    Yeah, with mine I have both an OC setting and a NOS setting. I tried setting both OC and NOS to 10%, but my computer freezes.

    Yeah, I was such a newb when I bought it. I saw the listing for my motherboard and CPU saying FSB 2000!!! I was like, whoa, my old PC was only FSB 800. But apparently it was an advertising trick where the bus can be OC'd to 1000 and HyperTransport's double data rate makes it 2000 or something.

    The most I can get so far with AI Booster for my AMD 64 3700+ is 2.4GHz and an FSB of 878, with a 10% boost from the AI Booster.

    Anyways, about that Half-Life 2 picture: I don't see a difference, do you?
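    vspede's numbers line up with how percentage boosters work: the reference clock rises by the chosen percentage, and with a fixed multiplier the CPU clock scales by the same factor. A sketch, assuming the 2200 MHz (socket 939) version of the Athlon 64 3700+:

```python
def boosted_mhz(base_mhz, boost_pct):
    # AI Booster raises the reference clock by a percentage; with a
    # fixed multiplier the core clock scales by the same factor.
    return base_mhz * (100 + boost_pct) // 100

# A 2200 MHz Athlon 64 3700+ with a 10% boost:
print(boosted_mhz(2200, 10))  # 2420, roughly the "2.4GHz" reported
```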
     
  6. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Look again, and as I said, FOLLOW THE SLANTED WALL AT THE BOTTOM, all the way down the corridor. At the end of the corridor you should notice a major difference.

    That 3700+ speed is nothing to sneeze at: with it running at 2400MHz, that's effectively an Athlon 64 4000+, which costs not far off twice as much as your CPU.

    You're right about HyperTransport, by the way: it is 2x1000MHz.
     
  7. tocool4u

    tocool4u Guest

    I see a difference... Look at the end of each picture. One of them is blurred out, thus making the top one better.


    And might I add, I think this is the best, most organized thread on AD!!!
    Don't you?
     
    Last edited by a moderator: Mar 31, 2006
  8. Sophocles

    Sophocles Senior member

    Joined:
    Mar 1, 2003
    Messages:
    5,985
    Likes Received:
    77
    Trophy Points:
    128
    tocool4u

    It's certainly one of them! Here's another, for DVD copying, and you'll note 6402, brobear, and others throughout that one too.




    http://forums.afterdawn.com/thread_view.cfm/97052
     
    Last edited: Apr 1, 2006
  9. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Brobear,

    I too crashed the first time I tried 20%. I went back to 10% and started playing with the memory settings, but only the auto settings. It wasn't until I got the Corsair XMS that I started really playing with the memory settings. First question: what is the front side bus speed of your CPU, 533 or 800? My Dhrystone benchmarks are marginally better than the P4E 570 at 3.8, and I blow it away by a wide margin in Whetstone. They improve very little going to 3.70. I imagine, the way Intel did things by raising the speeds, that the higher the base CPU frequency, the less headroom you have to OC. Running yours at 4.10 to 4.20 seems to me to be a realistic goal. After I finish this post, I'm going to look in the BIOS and I will let you know what I have set. Some of the settings I arrived at were trial and error!

    Happy Computering,

    theonejrs
     
  10. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Brobear,

    O.K., I'm back and here goes. I didn't do the whole BIOS, just the things that matter.

    In the [bold]Main Menu:[/bold] IDE Configuration is set to advanced mode
    Enhanced mode support = on P/ATA

    [bold]Advanced Menu:[/bold] AI Overclock Tuner set to Manual
    I have my CPU External Frequency set to 240
    DRAM Frequency set to Auto
    AGP/PCI Frequency set to 66.66/33.33
    CPU Voltage set to Auto
    DDR Reference set to Auto
    AGP VDDQ Voltage set to 1.50
    Performance mode is set to Turbo

    [bold]Configure advanced CPU settings[/bold]
    Max CPUID value limit set to disabled
    Enhanced C1 Control set to Auto
    CPU Internal Thermal Control set to Auto

    [bold]Advanced chipset settings[/bold]
    Configure DRAM Timing by SPD set to Disabled
    DRAM settings are 2.0 2 2 2 5
    DRAM Burst length is 4 clocks
    Memory acceleration mode is Enabled
    DRAM Idle Timer set to Infinite
    DRAM Refresh Rate is 15.4 uSec
    Graphics Adapter Priority is AGP/PCI
    Aperture size is 128MB
    ICH Delayed Transaction is enabled

    My computer will run stable with reasonable temps at any CPU speed from 3.0 to 3.6 with these settings. All I have to do is change the CPU External Frequency to set the CPU speed. Some of these settings make a huge difference in my benchmarks, but like I said in my last post, my 3.0 will beat a 3.8!
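    The relationship theonejrs is exploiting can be sketched in a few lines: the Pentium 4's core speed is the external (front side bus) frequency times the chip's locked multiplier, so raising the external frequency is the whole game. The 15x multiplier here is the stock value for a 3.0 GHz / 800 FSB Northwood:

```python
def core_mhz(external_mhz, multiplier):
    # Core speed = external (front side bus) frequency x locked multiplier.
    return external_mhz * multiplier

# Stock 3.0 GHz Northwood: 15 x 200 MHz. With the external frequency
# raised to 240 MHz as in the BIOS settings above:
print(core_mhz(240, 15))  # 3600, i.e. 3.6 GHz
```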

    Hope this helps,

    theonejrs



     
  11. tocool4u

    tocool4u Guest

    hey,
    I heard that some AMD processors are "locked" for OCing.
    What does this mean, and how would you unlock it?
    My bro's PC runs an AMD Sempron (I don't know if it is locked).

    Oh, and I heard that the difference between the Celeron and the P4 is that there is a part on the Celeron that is disabled.
    Thanks
     
    Last edited by a moderator: Mar 31, 2006
  12. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    sammorris,

    re: the video card pics. First off, let me say this: these are two entirely different video cards, with vastly different GPUs. Both take different means to achieve the results seen. Because these are subjective things, it is difficult to say which is best.

    At a quick first look it seems that the X1800XL has the edge! But does it? Down low in the background it shows the wall much sharper. Then take a close look at the holes in the tower: they appear much sharper with the 7800GT. Also, look at the renderings of the buildings in the extreme background; the windows are much sharper, with much more detail, than on the X1800XL.

    I copied the image and blew it up in Windows Picture and Fax Viewer. The dots you see are actually little squares. There are more squares, and they are more distinct, on the 7800. Blow them up and see for yourself. Frankly, I'm a little suspicious of the results. These are still only 2-D pics, so why is the upper-left wall about the same on both while the lower wall seems indistinct and out of focus on the 7800GT? If this were a moving train, the blur would only serve to be a little more lifelike. I personally would be happy with either card, but I don't think one is superior to the other overall. It just seems to me that both card designers concentrated their efforts and priorities on different areas of the screen. Just my observations and 2 cents' worth!

    Happy Computering,

    theonejrs
     
  13. brobear

    brobear Guest

    But it's not a moving train, and without extreme magnification the ATI clearly shows its superiority in this capture. Plus, we're all seeing the pictures on different monitors, using different graphics cards ourselves. I don't see the difference that theonejrs is seeing with those background windows and buildings. Blowing up pixels can show what the makeup of a picture is like, but the picture itself has a lot to say at its native resolution. The definition of the wall disappears, and there's more blurring around the door at the rear. As for the windows in the background, I'm seeing those about the same in both pictures: no great definition, but visible.
     
  14. brobear

    brobear Guest

    Theonejrs
    I wasn't naive enough to believe my system would do an automatic 20% OC, especially without some power to pull it off. I just wanted to see what the AI would do. I was already aware one wasn't going to hurt the system with the automatic settings invoked. If things are working as they're supposed to, the system protects itself. As I mentioned, I was rewarded with a shrill female voice emanating from the AI through my speakers. ;)

    As for shooting P4 3.8GHz 570s out of the water, the Northwood can do that at less than 3.6GHz. I've already seen some of what it can do, and Scubabud's posts show some of the potential. Here's a copy of my system's bench running at 3.57 (5%). The 3.8GHz isn't what it should be. Just look at Scubabud's system running at 3.7. Here's my bench at 5%:

    [IMG]
     
  15. aabbccdd

    aabbccdd Guest

    sammorris, looks like I have the right video card, huh? lol (per my sig.)
     
  16. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Brobear,

    I agree totally. I think the monitor has everything to do with what you actually see. Maybe this one just does a better job. Maybe, like some DVD copy programs, the algorithms used concentrate on different areas of the screen, concentrating more on the center and the far distance on the 7800. I also checked it out on the Dell 420 Workstation, which now has the Dell H1226 19" on it, and it does look a little different. I repeat, I would take either card and be very happy with it!

    I just got done viewing some digital pictures on the Sceptre and they (like everything else, so far) look outstanding. I also scanned some photos of my kids and they also look great. All my friends that have seen it want one!

    I did notice one thing when copying a made-for-TV movie with DVD Shrink. If you re-author and copy only the movie, the quality is poor. Copy the whole DVD and the quality goes way up. Maybe it's the timing of the videotape. Copy just the movie and the colors look washed out with no real depth. Copy the whole DVD and both the colors and the color depth improve dramatically.

    I sent you my BIOS settings just as a guide. There are a couple of things in there that, when enabled or set on Auto, make all the difference in the world in the benchmarks, with a noticeable difference in performance, particularly with Memory Acceleration enabled and the Performance mode set to Turbo. I may just get a 3.4 Northwood for it eventually and find out for myself. No real rush, though, as I am more than satisfied with the way it runs right now. I know that the 3.8 is not as good a chip as the 3.6, potential-wise, but I'm pleased that my OC'd 3.0 beats both of them! It wasn't the cheapest way to more performance, but it sure was fun learning. As far as the Sceptre goes, it is my second-best buy ever at about $400. The only better deal I ever got was the Dell 1226, used, for $50.

    By the way, if you come across a windowed case like this one, please let me know. I bought this one from Computer Geeks and was going to get a black one for the new build. I waited too long and they no longer have it. It was a cheap case that got expensive after I bought it and added everything to make it complete, and I really like it. I like the drive door, as it keeps the dust (very bad here) out of the DVD and CD drives. It has to open left to right, as it sits on the right side of my desk. I also like the long LED display on the door. If it opened the other way I would have to reach around the open door to put a disk in.
    [IMG]

    Happy Computering,

    theonejrs

     
    Last edited: Mar 31, 2006
  17. tocool4u

    tocool4u Guest

    Last edited by a moderator: Mar 31, 2006
  18. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    tocool4u,

    Thanks for the link. The last time I looked for that particular case, the Geeks told me they no longer had that model and probably wouldn't get any more. I think it looks "bitchin" in black. I'll order one tomorrow. What I like about it is it has a lot of room but is not too big. Add some quality fans and a good power supply and it's ready to go. Looks nice too! My only complaint is I can't get the front headphone and mic plug-ins to work with my MB, and I have no idea why.

    Thanks again,
    theonejrs
     
  19. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    aabbccdd, tocool4u and brobear, I'm glad I'm not the only one who sees a noticeable difference between the two, in so much as the top one is better.

    Theonejrs, don't take this as an insult, but I'm picking up slight traces of nvidia fanboy...

    tocool4u, as for "locked" CPUs: all desktop CPUs have a locked multiplier, meaning how many times the front side bus speed the CPU runs at, which is why we have to adjust the front side bus itself so much to see results. However, some motherboards, such as brobear's old Dell board, don't permit any of the functions required for overclocking. So whether you're able to overclock at all is less to do with the CPU and more to do with the motherboard.
    However, how far you can go is all down to the CPU and RAM.
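    Since the multiplier is locked, the only lever is the bus clock, and working backwards from a target speed is simple arithmetic. A sketch (the 3.5 GHz target and 15x multiplier are illustrative figures, not anyone's actual chip):

```python
def fsb_for_target(target_mhz, multiplier):
    # Solve target = fsb x multiplier for fsb; ceiling division so the
    # result never undershoots the target.
    return -(-target_mhz // multiplier)

# Pushing a hypothetical 15x-locked chip to 3.5 GHz:
print(fsb_for_target(3500, 15))  # 234 MHz on the bus
```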
     
  20. brobear

    brobear Guest

    Theonejrs
    Thanks for taking the interest and recording those BIOS settings. As for the graphics and monitors, I have decent equipment. I wasn't talking about an inability to show the quality of a photo at any point. I'm sure you're happy with your Sceptre monitor and nVidia card. I'm pleased with my system as well, but I don't feel the need to create hypothetical situations where it may do better, or to supply pixel counts to make a supposition as to possible quality. If you feel so strongly that Sophocles maligned your favored brand, the thing to do is pick up the gauntlet he dropped and go find some photographic proof that supports your claims. As far as everyone else here is concerned, the nVidia-generated photo loses definition where the Radeon is clear (it's a still, not a moving train). Time to start googling around to find the proof to support your assertions. Otherwise, we may be forced to believe Sophocles' presentation of the nVidia breaking down where the Radeon is going strong.
     
    Last edited by a moderator: Apr 1, 2006
