Intel P4 vs AMD

Discussion in 'PC hardware help' started by brobear, Sep 23, 2005.

Thread Status:
Not open for further replies.
  1. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Absolutely. I'm still looking, and have yet to find something that disproves what I posted earlier...
     
  2. Sophocles

    Sophocles Senior member

    Joined:
    Mar 1, 2003
    Messages:
    5,993
    Likes Received:
    77
    Trophy Points:
    128
    The ATI card clearly renders the better image. My monitor is no slouch when it comes to visuals, so what I'm seeing is real. This shouldn't come as a surprise, because numerous comparisons have shown the ATI 1000 series to be the clear winner when it comes to image rendering, especially if you happen to have a TV tuner and need top-notch video deinterlacing. However, I expect that Nvidia will get their act together quite soon, and then one of them will try to outdo the other on another front as the battle goes on.

    ExtremeTech comparison.

    http://www.extremetech.com/article2/0,1697,1916973,00.asp

     
  3. brobear

    brobear Guest

    I'm all for competition, better equipment and lower prices.
     
  4. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Anybody who isn't needs to get their head together!
     
  5. brobear

    brobear Guest

    LOL I thought that went without saying. ;) There has to be the exception, but I'm never there when they start giving the money away.
     
  6. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Lol you said it!
    I know exactly the feeling, "I'm sure that was cheaper yesterday..."
     
  7. Sophocles

    Sophocles Senior member

    Joined:
    Mar 1, 2003
    Messages:
    5,993
    Likes Received:
    77
    Trophy Points:
    128
    Only a few days ago a rumor surfaced that Dell had just acquired Alienware. Many believe that Dell acquired Alienware so that they could build PCs using AMD CPUs. There's a long-running debate as to whether or not Intel had been putting pressure on Dell to use only Intel CPUs. One would think that this is big news in the PC world, and it would be, if it weren't for an even bigger piece of news that has just surfaced.

    It is now rumored that Newegg has acquired Dell. Some of the information is still a little sketchy, but since the news came from a reputable tech site (TG Daily, an extension of Tom's Hardware), I'm inclined to give it some credence. If anyone is interested in reading about this latest bit of news, here's the link.

    http://www.tgdaily.com/2006/04/01/newegg_to_buy_dell/

    Below is another link about Dell's acquisition of Alienware.

    http://www.betanews.com/article/Dell_Acquires_Alienware/1143070457
     
  8. aabbccdd

    aabbccdd Guest

    Hummm, Intel isn't going to like this, huh? lol

    And sammorris, that's a great pic comparing the two video cards. Now we have proof in the ATI camp, unless someone can come up with a pic to disprove this.
     
  9. ScubaBud

    ScubaBud Regular member

    Joined:
    Dec 29, 2004
    Messages:
    1,951
    Likes Received:
    0
    Trophy Points:
    46
    Am I in the right place? Is today April 1st, 2006? <G>
     
  10. aabbccdd

    aabbccdd Guest

    Yup ScubaBud, Scotty has just beamed you up lol, it is April 1, 2006.
     
  11. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Hmm, I'll believe the Alienware-Dell deal, but I'm not going to believe the Newegg-Dell deal just yet. Firstly, it seems absurd; I'm sure Dell is a far bigger company than Newegg. Secondly, that was posted on April Fools' Day.

    Edit: I've now actually read the article, and I'm definitely not believing it lol.
     
    Last edited: Apr 1, 2006
  12. tocool4u

    tocool4u Guest

    That first paragraph doesn't seem like it is true....
     
  13. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Well quite, the more you read, the sillier it gets!
     
  14. tocool4u

    tocool4u Guest

    I like this paragraph....


     
  15. ScubaBud

    ScubaBud Regular member

    Joined:
    Dec 29, 2004
    Messages:
    1,951
    Likes Received:
    0
    Trophy Points:
    46
    Knock Knock...

     
  16. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    That's what told me to read the article...
     
  17. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    sammorris,

    I was being very objective in my observations. This is just the way I viewed the posted pics. To me it makes a lot of sense to enhance certain areas of the screen over others to enhance the action. Of course this enhancement has to come from somewhere, and I'm sure that the game designers do this when they design games. They all rob Peter to pay Paul! After thinking about it overnight, I think those posted pics rely more on the programmer's skills. After all, I was viewing them using an FX5200 AGP card. It didn't dawn on me that my monitor might have more than a little something to do with the things I observed and commented on. I mean, it is a "static" picture of a game. After looking at them on my old monitor, it just increased my admiration for the Sceptre and its picture quality.

    While I may be a fan of XFX, as I've had very good luck and results with their video cards, I will more than likely go with an ATi X800 AGP 8x for this machine. It just offers me more for the same price. Since I'm going to use this computer as an entertainment center in my room after I build the new one, why turn down a TV tuner and VIVO that are essentially free when both cards cost the same? The difference in overall video quality is so slight that most people would not notice it anyway. It's like the old Atari 5200 vs. Colecovision video arguments. The 5200 technically had the better, more advanced graphics engine, but the Colecovision always looked better. Anyone who ever played Centipede on both machines will tell you that. For its day, Centipede on the Colecovision was probably the best and most faithful translation of any arcade video game either machine offered. If you had the "roller controller", you were in heaven! It was superior by light years to the 5200 version, which had a terrible blocky look to its graphics. I wish I could find that ROM for my Coleco emulator! This was the only game Atari ever ported to the Coleco, and it was, and still is, magnificent! Lots of fun too!

    vspede,

    Forget about the AI and use the BIOS to set things. You get much better control of the settings. On mine I can increase the base FSB, using the manual CPU settings, from 200MHz (800 quad-pumped) in increments of 1MHz. It gives you much better control over the performance of the CPU and memory. I know most everyone is in a hurry to see what they can make their computer do, but go slow. My Mom used to say, "Nobody wants to hear about the labor pains! They just want to see the baby!" Trust me when I say, these are the labor pains! Take changes in small steps. Start at 10% (if it will boot). Live with this for a week or so. Benchmark it to see where the improvements are. Use it in the real world and see what the real benefits are for all the software you use. I took mine all the way to 3.70 and it ran great. One day I needed a quick copy of a DVD, so I did it with DVD Shrink. The copy failed! I lowered the CPU speed back to 3.60 and it worked just fine. I raised it back to 3.70 and it failed again! All my other programs ran perfectly at 3.70, so you have to spend some time using your programs looking for glitches. You can't do this overnight!!! If everything works OK, then make some more "small" changes and benchmark again. Look at Sophocles' sig. He's running at 2.67GHz with his Opteron 175. Why the odd number? It's partly due to the multiplier (mine gains 4MHz for every 1MHz increase in the base FSB), and that's what it runs best at (for now)! He's spent considerable time experimenting with the BIOS settings. There is no quick route to great performance! It takes time and patience, but the rewards are worth it. My 3.0 out-benchmarks the 3.8 P4/1MB L2 processor that costs almost four times as much!
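
    To put rough numbers on the small-step approach above, here's a minimal Python sketch of the FSB-times-multiplier arithmetic. The 15x multiplier and 200MHz base FSB are assumptions for a stock 3.0GHz P4 (the chip discussed in this post), not figures taken from the original advice, and the function names are just for illustration.

        # Sketch of the FSB overclocking arithmetic described above.
        # Assumptions: 15x multiplier, 200 MHz stock base FSB (quad-pumped to 800 MHz),
        # BIOS steps of 1 MHz. Adjust these for your own CPU.

        MULTIPLIER = 15        # assumed fixed CPU multiplier (3.0 GHz / 200 MHz)
        STOCK_FSB_MHZ = 200    # base FSB; advertised as 800 MHz quad-pumped
        STEP_MHZ = 1           # BIOS increment

        def cpu_clock_ghz(fsb_mhz, multiplier=MULTIPLIER):
            """Effective core clock = base FSB x multiplier."""
            return fsb_mhz * multiplier / 1000.0

        def overclock_plan(target_percent=10.0):
            """Print each 1 MHz FSB step up to roughly +target_percent core clock."""
            target_fsb = round(STOCK_FSB_MHZ * (1 + target_percent / 100.0))
            for fsb in range(STOCK_FSB_MHZ, target_fsb + 1, STEP_MHZ):
                print(f"FSB {fsb} MHz (quad-pumped {fsb * 4} MHz) "
                      f"-> core {cpu_clock_ghz(fsb):.2f} GHz")

        if __name__ == "__main__":
            overclock_plan()
            # Under these assumptions, 220 MHz FSB gives ~3.30 GHz (the ~10% first
            # milestone), 240 MHz gives 3.60 GHz, and ~247 MHz gives ~3.70 GHz,
            # matching the speeds mentioned in the post.

    Each printed step is a candidate setting to live with and benchmark for a while before moving on, per the advice above.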

    Happy Computering,

    theonejrs
     
  18. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Hmmm, it's an interesting standpoint. And I agree about overclocking: do it slowly.
     
  19. Sophocles

    Sophocles Senior member

    Joined:
    Mar 1, 2003
    Messages:
    5,993
    Likes Received:
    77
    Trophy Points:
    128
    Does no one have a sense of humor anymore?
     
  20. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Yes, I do... I laughed at that article... and?
     