
The Official OC (OverClocking) Thread!

Discussion in 'PC hardware help' started by Praetor, May 1, 2004.

  1. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Both of my quad cores can handle Vista and 7 just fine with their current OCs. The dual core can do XP and 7 fine, but Vista is super sensitive and I have to back it down to 3.1GHz to stop the bluescreens.

    Mind you, this is under Orthos load. The 7750 BE tests clean for 24 hours in XP and 7, but doing seemingly mundane tasks in Vista will bluescreen it pretty quickly.
     
    Last edited: Jun 12, 2009
  2. chop2113

    chop2113 Regular member

    Joined:
    Oct 18, 2006
    Messages:
    265
    Likes Received:
    0
    Trophy Points:
    26
    I see. I've never used Vista, only XP Pro and now Win7. I'm really enjoying Win7 64-bit. It's refreshing after being with XP for so long. Took a little getting used to, but I'm comfortable with it now. It runs everything I had in XP either the same or better.
     
  3. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Yeah Windows 7 is actually turning out a LOT nicer than I expected it to be. I never thought I'd say this but WAY TO GO MICROSOFT! You actually did something right for a change :D

    Not to mention Windows 7 is WAY more forgiving on slower PCs than Vista. I have a friend running it quite decently on a 1GHz Pentium 3 with 384MB of RAM. It could barely boot Vista.

    I'm still pretty happy with Windows XP SP3 32-bit as my main OS though. At this point it's nearly completely bug free, performs 100% consistently, and is very resource light. I will be getting Windows 7 x64 Ultimate the day it releases. But until then I'm quite happy to spruce up XP a bit with some custom themes, programs and matched wallpapers. My desktop looks quite sleek and organized actually.

    Also, for my games I use RocketDock to keep my icons hidden. I have about 40 games ATM in a hidden pop-down menu ready to go whenever I may get the urge. Most of them are cracked so I don't need disks. Very convenient to just click an icon and have my game. RocketDock is also resource light: all sorts of cool icon effects and transparencies and whatnot, and it's only taking 10MB of RAM.
     
    Last edited: Jun 12, 2009
  4. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    My Windows 7 experience has been pretty good. Extremely minimal problems. A volume issue with a game at times, but supposedly there is a fix for it. So far most things install without a nag. Rivatuner gives me headaches unless I boot with driver signing enforcement disabled. Ah well. I'm still pretty content with XP for now, until Windows 7 is released and all the patches I require are released. I would love a touchscreen to go with it. THEN the OS would be VERY productive :D
     
  5. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Well, those bugs are to be expected with what is practically a beta OS...

    And LOL, Rivatuner. If you're OCing your video card, that could cause issues in itself. I don't generally recommend it. If you're using Rivatuner for fan speeds, well then, bum luck. Try Nvidia nTune; it works well enough for both overclocking and fan speeds.
     
    Last edited: Jun 13, 2009
  6. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Estuansis,
    I like the codename you gave the 7750-BE. The Little Dual That Could! You have that right! I just finished the largest DVD I have ever seen (7.8GB/2:52) with DVDRB/CCE, and it took 66 minutes total! Doesn't sound very good until you compare it to the 85 minutes total it took on my E6750! The 7750-BE certainly can! LOL!!

    Russ
     
  7. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Well it was certainly a surprising CPU for the money. I bought it expecting a refresh of the Athlon 64 X2. It blew my mind when I found it decently outpacing my old E6600 at the same 3.2GHz. It's not quite as fast as my E6750 at 3.4GHz was, but it gets pretty close for the most part. And apparently it IS faster in a few things. The thing games like a beast! It pushes that big, fat, factory-OC'd 9800GTX+ no problem. Even if a lot of Intels are much faster, AMD has always kept up quite well in gaming. Intels don't have the 3DNow!(+) instruction set.

    For being a Phenom I dual core it's really impressive. Had it been released with the Phenom I quads it would have sold like hotcakes. Too bad it came so late in the game. But it certainly offered an interesting upgrade for my trusty old 5000+ BE! I didn't think the difference would be so huge! Maybe a Phenom II dual core is in line soon to replace it? I don't have unlimited funds but I like getting new hardware to play with :)

    Also, the Phenom architecture offers an interesting angle on OCing. The NB/HTT tweaks you showed me drastically improved the response of the entire system without ever leaving 3.2GHz!
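
    For anyone curious what the HT link tweak actually buys on paper: assuming a standard 16-bit, double-pumped HT link, bandwidth scales directly with the link clock. A minimal Python sketch of that math (the MHz figures are only examples, not measurements from my board):

    Code:
    # Rough HyperTransport link bandwidth math -- an illustration only.
    # Assumes a standard 16-bit-wide, double-pumped (DDR) HT link.
    def ht_bandwidth_gbs(link_clock_mhz, width_bits=16):
        transfers = link_clock_mhz * 1e6 * 2       # DDR: 2 transfers per clock
        return transfers * width_bits / 8 / 1e9    # GB/s, per direction

    print(ht_bandwidth_gbs(2000))   # 2000MHz link -> 8.0 GB/s each way
    print(ht_bandwidth_gbs(2200))   # 2200MHz link -> 8.8 GB/s each way

    By that math, going from 2000MHz to 2200MHz is a straight 10% more link bandwidth, which fits how much snappier everything feels without touching the core clock.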
     
    Last edited: Jun 13, 2009
  8. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    85 minutes on an E6750? Huh. My Athlon X2 5200 darn near matches that, and it has a lower clock. I wonder what the price difference between the two was when they were released.
     
  9. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Estuansis,
    I think if they had released it back when the Phenom came out, the 7750-BE would have been a flop. It would have had the same problems that the quads experienced. I think it's a much better chip today than it would have been even a year ago. The architecture of the Barcelona core has always been good, but unfortunately there were mistakes in the implementation of the architecture that were cast in the silicon. Imagine the success today's Phenom would have had if it had been right from the start. With AMD's experience designing and building a monolithic Quad, as opposed to the route Intel took with the Core 2 Quads, that's beginning to turn around. Remember also that not only did AMD build a true Quad, they were the first to get to 45nm. This may not seem important, but in the end it might be!

    This is in no way a rant against Intel, but they did struggle to get to 45nm, and finally had to go to Metal Gate transistors to accomplish it. Now we are all patiently waiting for the Metal Gate transistors for the Phenom and Phenom IIs, which should improve overclocking and scaling of the CPU, as well as consuming about 30% less power, not to mention that they are faster in operation than conventional transistors. It's not just the cores that draw power! Don't forget also that there is still lots of room within the architecture for things to be improved in the future, while Intel has pretty much painted themselves into a corner, as their C2Q and C2D architecture is about tapped out. There's just no room left to allow for much improvement without a new design. Consider too that, originally, the cost of making AMD's Quads was horrendous, as AMD chose a more expensive path than Intel. At first there were poor yields with the wafers, combined with all the other problems the Barcelona had, and it was a disaster! Today, that's an entirely different story. Wafer yields have come way up, and because there is no need to employ people just to connect the cores, their production costs are lower than Intel's!

    Internally, as a company, AMD was no better off! That's what happens when people are told not to think for themselves. It stagnates the brain! It also almost put AMD under! Intel was the very opposite of that. Had there not been a handful of people who thought for themselves, and had faith that they were going in the right direction with the P-III architecture, even after they were ordered to stop working on its development, there would have been no C2D as we know it today! Netburst architecture would never have cut it!

    Here's where I break out my crystal ball! I think the AMD lines will be Regor, Phenom, or Phenom II based, all made on three production lines as needed. As the wafers improve, the percentage of Quads with substandard cores will decrease. I think those will still be used for Dual Cores, but I think the majority of the Dual Cores will be produced as Dual Cores, to meet demand. Same goes for the triples, to a degree. That's one of the reasons I am thinking about skipping the 550-BE altogether and getting the 720-BE x3 Phenom II, as more of them will be Quads with a substandard core. That can only mean the triples will be better binned than the duals, as the standards for the quads would be the highest, and three of the cores would have met or surpassed that standard. In all the tests I've seen, the triple has shown some very strong points, does better in most things than the x2 550, and games extremely well. The price difference at the moment is only $28, both with free shipping at the egg! I also don't think it will be obsolete as quickly as the duals! Let's face it, the Brisbane has had its day. Long live Regor! I'm sure there will be many more models to come!

    My best guess is that eventually the Athlon x2s will be Regor core, with Phenom x3s and x4s based on the Barcelona core and Phenom II x2s, x3s, and x4s based on the Deneb core. I still see the x2 as necessary, as it's an easy and inexpensive step up to better and more efficient architecture and AM3, once we see the appearance of the metal gate transistors! Whatever AMD does, it's going to get interesting! I'll also say this: ATI is what saved AMD! I think that buying ATI was AMD's saving grace. Had it not happened, there would have been no one around to see and understand where the problems really were, and that was within the company itself and the way it did things. Far too many people were listening to what other people think instead of using their own brains and doing some thinking for themselves. You can't learn anything if other people do your thinking for you!

    Best Regards,
    Russ
     
  10. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Oman7,
    I do have to point out that I have never seen a more complex DVD in my life. This was a conversion of a 35mm film at 24fps to NTSC, which is 30fps interlaced. That's all work that has to be addressed by the encode. My AMD was about 20% faster at encoding and about 50% faster for this rebuild. I don't know how many total frames there were, but the number was huge! I know that every time I looked up it was chewing on another 19,000 frames! It rebuilt the whole DVD in just under 6 minutes!

    I think I understand what raising the CPU NB and HT link frequencies has accomplished! The CPU doesn't have to wait as long for its data, so the machine is faster, and the CPU is fed data at a faster rate thanks to the 2200MHz CPU NB frequency and the 2200MHz HT link. It adds up to fatter memory bandwidth with a faster HT link delivering the information to the CPU. I think that's what makes the encode times so much faster. The rebuild is stunning to watch! Compared to before, the blue line moves noticeably faster. By the time it reaches halfway, you know it's flying! It also speeds up the VOB cleaning at the end of RI4M, before it starts DVD Shrink. Now if I could only figure out a way to stop DVD Shrink from even running, as I almost never use it! LOL!!
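
    A quick sanity check on my own numbers, using the 85 and 66 minute totals from above (just back-of-the-envelope Python, nothing more):

    Code:
    # Back-of-the-envelope speedup from the quoted totals -- illustration only.
    e6750_minutes = 85    # whole DVDRB/CCE job on the E6750
    kuma_minutes  = 66    # same job on the 7750-BE

    saved = (e6750_minutes - kuma_minutes) / e6750_minutes
    print(f"{saved:.0%} less wall time")                      # ~22% less time
    print(f"{e6750_minutes / kuma_minutes:.2f}x throughput")  # ~1.29x

    So the overall job comes out roughly 22% quicker, which squares with the 20%-encode/50%-rebuild estimates, since the encode is the bulk of the total.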

    Oh! My 64x2 4800+ in Oxi averages about 5-7 minutes slower than the E6750 did when encoding with DVDRB/CCE! No OC on it at all!

    Best Regards,
    Russ
     
  11. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    I'm sure Intel hit 45nm a year before AMD.....
     
  12. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Yeah, I think they did as well. The Q9 and E8 series were the first, I think, and were released 18 months or so ago, maybe a bit longer than that.
     
    Last edited: Jun 13, 2009
  13. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Sam & Shaff,
    The 45nm Quad was first demonstrated by AMD in August of 2006. In fact, Intel was third to 45nm chips, as Toshiba had a 45nm GPU about a month after AMD demonstrated the 45nm quad-core Opteron!

    Best Regards,
    Russ
     
    Last edited: Jun 13, 2009
  14. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    Showcasing it and bringing it to market are completely different things.
     
  15. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    shaffaaf,
    They didn't just showcase it, they demonstrated a working model at 45nm. Of course it took much longer to get them out, but AMD had the first operational 45nm CPU, closely followed by Toshiba's 45nm GPU! It was in one of the articles I posted in the AMD thread!

    Best Regards,
    Russ
     
  16. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    Sounds to me like they were being cautious. Nothing wrong with that. Sounds like a wise business decision to me :D
     
  17. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    I wouldn't say so. Imagine if the PII x4 had come out 2 years ago!
     
  18. theonejrs

    theonejrs Senior member

    Joined:
    Aug 3, 2005
    Messages:
    7,895
    Likes Received:
    1
    Trophy Points:
    116
    Shaff,
    It was almost 3 years ago when AMD demonstrated a working 45nm Quad Opteron, in August of 2006! That's 34 months ago! That's a bit more than two years ago! LOL!!

    Russ
     
  19. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    I think if the Phenom IIs had come out 2 years ago, AMD would have dominated easily. But now they are merely catching up. I know I'll be on it when they release something newer and faster, but my 940 is the same chip as all the other Phenom IIs, so buying anything else would be useless. Intel, on the other hand, offers several variations on their quads and duals, so there's more freedom of choice within their product line. AMD's tri-cores offer something similar, but they are really a whole class of their own. I know what you mean about the 720BE, Russ; that is a fantastic value CPU.
     
  20. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Demonstrated is irrelevant. Yes, they made one, but it means nothing unless the public can actually buy them. As for 'if it had come out two years ago it would have dominated', you could say the same for any technology. Had the HD4870 come out when the 8800GTX still ruled the roost, it would have been ridiculously overpowered. Had the Core 2 Duos come out before the Athlon X2s, that would have been ridiculous too.
     
