Same here, just plain old Windows Defender and the firewall. Viruses still make it through the net on occasion, and false positives negate much of the peace of mind you get from having an AV - having a file suddenly vanish because it tripped a false positive is extremely annoying. Yes, you can whitelist the folder, but only once you realise what has happened and get the file restored, and sometimes that's not so easy.
Yah, I only install/run Malwarebytes once in a blue moon. I'm careful what I download and where I browse. I suppose one day it'll bite me in the butt, though, as the people who host this stuff get trickier.
I use AVG like Russ but turn off Resident Shield, which is their real-time AV component. I run it manually after visiting dangerous sites and have it only scan key areas of my system: hidden system files, cache folders, temp folders, and so on. This cleans out the cookies and whatnot that can be used to infiltrate your system - there is nothing you can do about them while surfing, no matter how savvy you think you are. This keeps me clean and without problems. I do have some files to whitelist on an initial AVG install, but it's minimal and pretty easy to do, as I never let AVG automatically fix problems and always have it flag me when an issue arises.
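If anyone wants to point a manual scan at those same key areas without clicking around, here's a rough sketch of the idea in Python - the cache paths are only examples and will differ by browser and profile, so treat the list as illustrative:

```python
import os

# Illustrative list of the "key areas" a targeted manual scan might cover.
# These paths are assumptions -- adjust for your own browser/profile layout.
CANDIDATES = [
    os.environ.get("TEMP", r"C:\Windows\Temp"),                          # user temp folder
    r"C:\Windows\Temp",                                                  # system temp folder
    os.path.expandvars(r"%LOCALAPPDATA%\Microsoft\Windows\INetCache"),   # IE/Edge cache
    os.path.expandvars(r"%APPDATA%\Mozilla\Firefox\Profiles"),           # Firefox profiles
]

def folder_size_mb(path: str) -> float:
    """Total size of all readable files under path, in megabytes."""
    total = 0
    for root, _dirs, files in os.walk(path):   # os.walk silently skips unreadable dirs
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:                    # locked/system files we can't stat
                pass
    return total / 1_048_576

for c in CANDIDATES:
    if os.path.isdir(c):
        print(f"{c}  ~{folder_size_mb(c):.0f} MB")   # feed these paths to the scanner
    else:
        print(f"{c}  (not present)")
```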
Sam, you're a smart guy. No, as a matter of fact I was not running them in the same PCI-E slot, and I had not re-loaded the drivers. I will do both of those things tonight when testing the Gigabyte, which just showed up. I was running the non-HIS IceQ cards, the Power Color and then the XFX, in slot 6, where they will eventually go, having pulled out the HIS card while testing the others. So you're totally correct, my testing has been apples and oranges. Additionally I have not been re-installing the drivers, but since you mention it, I will do that today. I don't exactly understand why - it's the same chip - but I am sure you have a good reason. Following that procedure, this Gigabyte will get the fairest test.

I will say this: the XFX, which was crashing all the time, is just a hot-running card at 82, and I am guessing the choice of PCI-E slot would not have affected that too much, though maybe the crashing would have been reduced. The HIS IceQ at 100% fan was running, as I mentioned, around 54. Last night I played for about 5 hours with the fan on auto. The low fan speed is 900 rpm, the high is 3400. The fan sat at 2100 the whole time, with the card at 63 degrees, so HIS is comfortable letting the card run at 63. Again, by comparison, the XFX was at 82. I can't remember what the Power Color was running at, but I may have written that down. I do recall telling myself, "Well, it is no IceQ," so it definitely was not as cool as the HIS. If they could modify that HIS design to fit in two slots, they'd sell a lot more cards, I think - I for one would choose no other, even at a price premium.

I looked at CPU and graphics loads last night - this was all on Crysis. I played a couple of hours just killing the nanosuit guys in the cemetery. It turns out the Korean AK is just fine - no close-in weapon is needed. But you do need to be within about 20-30 feet, which isn't hard once you get used to seeing them cloaked. They apparently can't see me, but when he said "cheap suit," maybe he meant that once you grenade them a couple of times, the suit has a few bright spots that linger even while they're cloaked. Even before that, once you get used to it, you can spot them - that ghostly ice person moving about. They go down fast with one clip of AK. (I know it's not really an AK, it's an FY-71 or some such.) I accidentally hit the X key and switched the SMG to single shot, and I still killed one of the nanos who was close in, firing very rapidly in single-shot mode - the SMG is pretty powerful. I also tried to kill them with just grenades, meaning hand grenades and the grenade launcher (I finished off the first two who ghosted in close, then spent a couple of hours trying different ways of killing the last two). I managed it several times, but the grenade launcher is tricky to use and does not seem as powerful as hand grenades. Hand grenades are definitely helpful.

Anyway, I looked at my loads. GPU was of course 99%. The CPU cores were all at about 25% or so, but the last core was at 80%. Maybe that core was managing smoke, etc. So I guess that would be the limiting factor on the CPU, allowing me only about 20% of fps improvement before that core runs at 100%. My frame rates have been around 33, and in the snow they dropped to 25, but again, that is not bad for me - sort of minimal, but I didn't really notice any lagging, even shooting all the flying monsters that came when I was defending Prophet. That tells me my CPU would probably limit me to about 40 fps. Again, that's fine for me.
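Spelling that estimate out, for what it's worth (a rough sketch - it leans on the assumption that the busiest core's load scales linearly with frame rate, which may not hold):

```python
# Back-of-the-envelope version of the estimate above.  The key assumption --
# that the busiest core's load scales linearly with frame rate -- may not hold.
observed_fps    = 33     # typical frame rate in the cemetery section
busiest_core    = 0.80   # fourth core sitting at ~80% load
cpu_ceiling_fps = observed_fps / busiest_core   # ~41 fps before that core hits 100%
print(f"Rough CPU-side ceiling: {cpu_ceiling_fps:.0f} fps")
```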
As the graphics get more demanding, that won't necessarily increase the load on the fourth core - just on the crossfire cards. So if the 9450 at 3.334 gives me 40 fps even under the heaviest graphics load with crossfire 7950s, that's fine for now. Regarding that fourth core, I should have checked whether the same core was loaded in every chapter, but I forgot. I might check that tonight - see if it is the same fourth core running at 80% in regular play at the mine with the mini-gun, in the cemetery with the nanosuits, and in the snow protecting Prophet. If I remember I'll jump into those three chapters to take a look. Rich
Update to yesterday's post, with 10 more hours of Crysis testing, including completing the game, and testing the Gigabyte 7950 in slot 3 - the same slot as the HIS IceQ - for apples-to-apples testing. Also, per my on-screen display, the fourth 9450 core is the most active in every part of Crysis, on all types of maps. At one point it hit 90%, and I was at about 33 fps, so even 40 might be only occasionally obtainable due to the CPU bottleneck once I have crossfire running. I don't know if all games use that fourth core so heavily - maybe just Crysis.

FPS ACROSS ALL OF CRYSIS, SURPRISINGLY UNIFORM, AND NO LAGGING ANYWHERE

The good news is that I finished Crysis, and I can now report on fps across the entire game: the average was 33, max about 38, min 25. It never dropped below 25 or 26, and I never felt lag - not in the snow, not in the alien ship, not even in the last crazy battles on the aircraft carrier fighting the giant ship, with all those explosions and all those purple flying monsters. So the best fps was sometimes 38 but usually 33, and the worst was 25 - about a 25% reduction between the average of 33 and the most difficult sections at 25. That is a surprisingly small gap between minimum and average, compared to what I have heard about earlier AMD families like the 4000, 5000, and 6000 series. These tests again were with the 3 GB VRAM 7950, running 2560x1600, ultra settings, but no AA. Jeff advised me that AA interferes with Crysis's superior internal edge-smoothing, which runs better at less of a performance hit. I eventually applied Jeff's System.cfg, which picked me up about 4 fps in various sections - prior to that I never saw more than 34, but after applying it I saw up to 38, depending on the section. If you go to the advanced graphics settings after using Jeff's System.cfg, 3 of the 10 settings are no longer VERY HIGH; instead they say CUSTOM. For more info on what that System.cfg does, refer to page 180 in the thread, Jeff's post #4500.

GIGABYTE HD7950 VERSUS HIS ICEQ HD7950, SAME SLOT, RE-INSTALLED DRIVERS

Now for the card testing. I put the Gigabyte into the first GPU slot, slot 3, where I had the HIS IceQ, and re-installed the drivers. The cool-running 59-degree Gigabyte crashed every 40 minutes. If that sounds bad, by comparison to the XFX it was not so bad - the hot-running XFX was crashing every 10 minutes once it hit 82. To reduce the Gigabyte crashing I tried underclocking the card, which ships at a 900 overclock compared to the reference 800, by dropping back to 850. That seemed to help. Most interestingly, the last four crashes were blue screens.

The cards seem to run at different VDDC voltages: the Gigabyte reports 981 mV VDDC to Trixx, and when you hit reset in Trixx, that is what it shows. I seem to recall that the HIS IceQ also wanted 981 according to Trixx when I first installed it 3 weeks ago, but last night when I pulled the Gigabyte out and put the HIS back in, Trixx immediately changed the reset VDDC from 981 to 1087. That had to come from the HIS card - where else could it have come from? The registry? I assume the card is notifying the Trixx tweaker. I was initially running the HIS at 1050 as usual, where I had it when it ran 6-8 hours at a time with no crashing, but when it first hung I changed it to 1087 - if they designed the card for a higher VDDC voltage, who am I to argue?
Even at 1087, the HIS still hung twice more, about 2 hours apart, at the end of the game with very difficult graphics - lots of explosions, shooting, alien weapons, etc. I might try a higher VDDC, maybe 1120, maybe even 1150. The HIS can handle the extra heat, so if that stabilizes it, that's what I should do, I think.

BLUE SCREEN VERSUS IN-GAME HANG

Now, the difference in the crashes. The HIS crashes were the old-style kind - the graphics card hangs. I was able to hit Ctrl-Alt-Del to get to the W7 Task Manager and kill Crysis. That occurred a total of 3 times over about 8 hours. (I don't like it, but since I am overclocking the HIS, maybe, as I say, I should try a higher VDDC.) For temps, the HIS with auto fan runs about 64 at 2100 rpm; at max fan it runs about 55, with the fan at 3400. When it first hung, I changed to max fan. The Gigabyte runs about 5 degrees hotter - at max fan it ran at 59. So the Gigabyte with auto fan was running about 65, but when it started crashing I changed to max fan, which is 4400 rpm for the 3 small fans, and that dropped temps to a max of 59. Still it crashed. I tried different VDDC values, such as 1050, and then I tried underclocking with lower VDDC. I reasoned that higher VDDC should not reduce stability - the opposite should be true - though it will increase heat, of course. It was not crashing as often as the XFX, but unlike the XFX, whose crashes were all in-game hangs, the Gigabyte crashes - the last 4 of them - were all blue screens at the end of the game.

Correct me if I am wrong, but a blue screen means computational errors - checksum errors and the like. In other words, you might question my 9450 overclock to 3.334. Dialing that back to 3.2 or 3.3 might stop the blue screens. But unlike with the Gigabyte, I didn't get any blue screens on the HIS IceQ, nor on the XFX. Why not?

1. The XFX was running hot. It was not getting rid of its heat by dumping it into the case; it kept the heat, got hot, and hung every 10 minutes - but no blue screens.
2. The HIS IceQ runs cool, and it does not dump heat into the case - it is the old-style single-fan turbine, shoving the heat out the back. It still hung 3 times, every 2 hours, but no blue screens.
3. The Gigabyte gets rid of its heat and stays cool, but all that heat goes straight into the case, affecting the CPU and northbridge. My system blue-screened every 40 minutes.

CARD HEAT-EXHAUST STYLE DIFFERENCES, GIGABYTE VS HIS ICEQ

Looking at the card architecture, as we said, the HIS is the old-style single-fan turbine, which blows most of the hot air out the back of the case. The Gigabyte is a 3-fan cooler, which blows all the hot air inside the case. I didn't notice high core temps with the Gigabyte, but maybe I wasn't checking often enough - I should log them with Core Temp. I could also have been experiencing high northbridge temps without knowing. In CPUID, I think SYSTIN is the northbridge. I can't log it, but if I keep that program open, at least it will tell me min, current, and max. If I tab out of Crysis for a while every half hour, before a blue screen, I might be able to tell from the max SYSTIN how my northbridge temp is doing. I could put a little 20mm or 25mm fan on the northbridge - I have several I've picked up over the years - if not a separate northbridge cooler. Maybe that would stop the blue screens.

SIGNAL CONTAMINATION ON GIGABYTE AND XFX, VERSUS CLEANER SIGNAL ON HIS ICEQ?
The other thing that bothered me a bit about the Gigabyte, besides running 5 degrees hotter than the HIS and blue-screening twice as often, was that I got some green sparkling with it, similar to what I also saw on the XFX. It was not massive - confined to an area about 1/2" high and 2" wide, on the menu and also in-game, for example on a solid black floor. There was more of it on the XFX, and also on the Power Color when I was going through Max Payne 3. I don't know where that comes from - is it voltage handling, or is it signal contamination? I don't see it on the HIS except maybe microscopically, if at all, and only when I was looking for it right after removing the Gigabyte. On the HIS it is not noticeable, whereas it is very noticeable on the other cards. I would call that build quality - the electronic components. There should be no sparkling, right? A cheap DVI cable could do that, but mine is thick and highly rated, and again, I don't get it on the HIS IceQ.

CASE MOD WITH BOTTOM FILTERED INLET, AND SECOND HIS ICEQ

I am about ready to give up on any other type of 7950. So one solution is to send the Gigabyte back and make the HIS work in slot 6. I believe I can do that by modding the case bottom and cutting out a hole for the turbine fan. That would be similar to modding the side panel and cutting a hole for a 120mm fan, which I have already done. I cut many similar holes in my former P4 gaming rig for additional fans. It takes at least 2-3 hours and maybe 200 tiny drill holes - a pain, but doable. I always add a 120mm fan filter. I believe in positive case pressure to keep dust out, and all inlet fans get washable filters. The filters are plastic, 2 in a set for $3 from Xoxide or other net retailers. The mod would involve putting feet on the case - I could always rig some feet from two 6"-long pieces of wood, one at the front, one at the back. I would drill out the hole, then maybe take a cheap 120mm fan, cut out the fan housing, glue that to the case bottom, and mount the filter on it, so the HIS has room to protrude out that extended bottom. The glue has to withstand popping the fan filter cover off from time to time; epoxy or a glue gun should be plenty strong.

So all in all - from the old-style turbine design on the IceQ, which the other cards seem not to offer any more, to the seemingly better cooling, to the seemingly better prevention of signal contamination (meaning no sparkling) - the HIS IceQ seems head and shoulders above any of these other cards in build quality. What do you guys think of the implication of the blue screens versus hangs, and of my idea for the case mod to add a second HIS IceQ for crossfire?

EDIT: Rather than drill holes and put on feet, I think it's time to bust out the Spedo, which has massive case-ventilation possibilities and a bottom PSU. The part of the HIS IceQ that sticks out would sit past the PSU, so slot 6 has a lot of room below it once you get past the PSU, and I believe that would be perfect for the HIS IceQ in slot 6. Rich
If three different graphics cards crash, whether or not in the same way, it's probably not the cards at fault. My money's on the OS/drivers, or the motherboard. Usually it's the latter.
It's as much an AK as any other. Definitely an AK. The FY-71 may be a reference to it being license-built. If you're using my System.cfg, that would be the physics load on the 4th core. Crysis runs everything else on cores 0, 1, and 2. This is actually good for performance, as it frees up the other cores to run the rest of the game. Your math has no basis at all. My Phenom II is comparable in speed and Crysis frequently hits 60 FPS. Any FPS limit you are hitting is because you're using one card. Increasing the graphical performance won't magically change the CPU load drastically; you should see the same or similar CPU usage whether or not you run Crossfire. The CPU is running the core game, the physics, AI, scripting, etc. These do not increase in demand as the graphics do. In short, Crysis is GPU bound, not CPU bound. Your CPU is not currently a limiting factor AT ALL. Really you should have been using the System.cfg from the start. It essentially fixes the game, and the differences are more than just FPS numbers - it has to do with everything from hitching to physics loading to video memory usage.
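If you want to watch that core split for yourself rather than relying on an OSD, here's a quick sketch - assumes Python with the third-party psutil package installed; run it alongside the game:

```python
import psutil  # third-party: pip install psutil

# Print per-core CPU load once a second for 30 seconds; run this while the game
# is up and watch whether one core (the physics core) sits well above the rest.
for _ in range(30):
    loads = psutil.cpu_percent(interval=1.0, percpu=True)
    print("  ".join(f"core{i}: {load:4.0f}%" for i, load in enumerate(loads)))
```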
I found my Q6600 quite limiting in Crysis, but not my Q9550. That should suggest where the limiting factor was - bear in mind both of these CPUs were overclocked.
Well, if the HIS IceQ is hanging - are we still thinking motherboard, Sam? Remember, it's only in really tough areas, and only at the end of the game - once every 2 hours, a total of 3 times. It never hung through the prior 20 hours of Crysis, played maybe 10 hours at a time. So is this saying that my 9450 overclock is not stable after all? Maybe that is the case.

Jeff, I am running the System.cfg for sure now, so if it's the physics that is loading the 4th core at 80-90%, are you saying that the physics load does not go up with the frame rate? I read that as saying that whether I am getting 30 fps or 60 fps, the CPU load on the fourth core - let's say for physics - won't increase when the frame rate doubles. So the fourth core's physics load could be roughly constant, at 80% or so, independent of the frame rate? If so, that is quite interesting, and unexpected. I didn't realize it worked that way; I was under the misperception that CPU load increased at higher frame rates. I guess when you get right down to it, I don't understand frame rates. So: a building is falling - the fourth core calculates all the physics involved in the fall, and that loads the core to 80%. The graphics card takes the position of the building, the lighting effects, the smoke, etc., and depending on how powerful it is, and whether crossfire is running, produces 30 frames a second or 60 frames a second as the building falls. The fourth core's load stays at 80% the whole time in this hypothetical. If I have stated that roughly correctly, then again, I have to say: most interesting!

I was on the verge of unboxing the Spedo for the first time this afternoon - pulling it from its perch on the top garage shelf where it has sat for four years. I did download the manual last night, which is in about 8 languages; Thermaltake definitely marketed that case all over the world. I don't have the Advanced version with the side 230mm fan, but it looks like the side door has at least two 120mm fan positions before I do any further modding. There may also be a ceiling exhaust, at least two rear exhausts, and I don't know how many front intakes.

I bought some sheet metal today to prepare for modding my own Kama Bay, since those are no longer being produced, and I just bought 6 fan filters. I figure that by cutting 1/2" strips of sheet metal, maybe 10 inches long, I can drill holes and mount them on the top front of a 120mm fan (another strip on the bottom front), using the filter base to create a sandwich with the sheet metal in the middle. Then I can bend the two sheet-metal sides back, find the 5.25" optical-drive side screw channels, drill some holes, and use the long channel to position the fan so it lines up neatly with the front of the case. I would do the same with another sheet-metal strip on the bottom front two fan holes, of course. This way I think I can mount the 120mm fan, intake fan filter, and snapping cover securely enough to allow snapping and unsnapping that filter cover, which takes a little effort. I might have to add black weatherstripping around the edges of the fan body to fill any gaps, but the black filter cover looks pretty good and the overall look should work out. (Maybe too many modders were doing exactly that instead of forking out the $25 for the Kama Bay, and maybe that is why they stopped making the product.)
I was going to do this for both of Miles' i7 computers, in addition to reseating the HSFs - probably in fact installing a heatpipe HSF on both computers, which have only the stock Intel coolers at this time. He called me - his temps were way up in the 90s again as he was playing Doda, which he is doing modeling work on, and his computer was crashing. I told him to remove the side cover. He texted me a few minutes later: core temps had immediately dropped back to the 70s. I was shocked - I really didn't think he'd see an improvement, but I was out of ideas when I suggested removing the panel. So he obviously has an airflow problem.

I saw him today and was kidding him: "You work for a gaming company, but they don't give you guys gaming cases." He laughed, but it's so true. They really do have to play those games from time to time to playtest and to check their work, and that's when the heat builds up. Perhaps his company doesn't want a lot of noisy cases around, but Sam, you sure proved that you can have good airflow and virtually no noise. (With my noisy Kazes I haven't proved any such thing, but that might all change with the Spedo - it WILL be interesting breaking out of ATX size and going full tower.)

Sorry about the lack of pictures for now, but when I do the filtered front-intake mod, I think I'll take some photos to show how it was done, as well as the final look. This of course is for cases with no front intake, or maybe just one, and at least 3 unused 5.25" optical-drive bays where a top-front 120mm intake fan could go. Rich
Also, you were not using the ideal video hardware for Crysis, and you were in a period of motherboard turmoil for quite a while... Not saying that isn't true, but the Q6600 is hardly much different in performance from the Q9550 at equal clock speeds. Also, Crysis has proven to be much more video-limited than CPU-limited in my experience; my Q6600 has not been a limit at all in my personal testing. If I recall correctly, your Q6600 vs Q9550 issue turned out to be an overhead issue from running 4 cards in Crossfire. Correct me if I'm wrong.

Yes and no. Think of it like this: every game world is like a human body. The CPU renders the bones (scripting, AI, physics, etc.), and the video card renders whatever skin (objects, textures, shaders, etc.) you choose. Just because the skin is prettier doesn't make the bones more complex. So while the graphical load may increase, the core game underneath isn't really doing anything more than it would with any other graphics - it just looks better. You can still expect a CPU bottleneck in just about any game, but only in very intense situations. It shouldn't affect your overall average framerate as long as the CPU is adequately powerful.
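To make the bones-versus-skin point concrete, here's a toy sketch of the fixed-timestep loop most engines are built around (not Crysis's actual code, just the general pattern):

```python
import time

SIM_HZ = 30                  # "bones": physics/AI ticked at a fixed rate
SIM_DT = 1.0 / SIM_HZ

def simulate(dt):            # scripting, AI, physics -- cost is per tick, not per frame
    pass

def render(alpha):           # "skin": drawn as often as the GPU allows
    pass

accumulator = 0.0
previous = time.perf_counter()
for frame in range(600):                 # run a few hundred frames for illustration
    now = time.perf_counter()
    accumulator += now - previous
    previous = now
    while accumulator >= SIM_DT:         # same number of sim ticks per second...
        simulate(SIM_DT)                 # ...whether the GPU delivers 30 fps or 60 fps
        accumulator -= SIM_DT
    render(accumulator / SIM_DT)         # interpolate between the last two sim states
```

Double the frame rate and render() runs twice as often, but simulate() still ticks 30 times a second - which is why the CPU side doesn't scale up with the graphics.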
I never ran quad CF with the Q6600 - at the time of buying the Q9550 I was only using a single 4870X2. The Q6600 and Q9550 were not overclocked to the same speeds: the Q6600 maxed out at 3.24 on a tolerable voltage, while the Q9550 reached 3.65. That, plus the additional per-clock performance of Penryn vs Core, added up to about a 30-35% performance increase - and with two GPUs that certainly made a difference.

Rich - no game crashes the entire PC just because. If you get issues like that, there is definitely a stability problem somewhere, and if the problem shows up with three different graphics cards, even if it manifests itself in different ways, the problem is probably not the graphics card. First of all, try without the CPU overclocks; if it's still the same, you ideally need to try the cards in a PC with a different board. Whenever I've reformatted due to stability problems like these, tried different graphics cards and so on, it has always been the motherboard at fault - it's a more common issue than you might think.
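Roughly how the 30-35% falls out, with the Penryn per-clock gain being an assumption rather than a measurement:

```python
# Rough arithmetic behind the ~30-35% estimate above.
clock_gain = 3.65 / 3.24        # ~1.13x from the higher Q9550 overclock
ipc_gain   = 1.17               # assumed Penryn-over-Conroe per-clock advantage
total_gain = clock_gain * ipc_gain - 1
print(f"Combined: roughly {total_gain * 100:.0f}% faster")   # ~32%
```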
Different clock speeds PLUS the Conroe -> Penryn difference makes some sense. I still don't see how it was a major bottleneck though. Would be interested to see some numbers now that average cards can run the game decently.
I have some great news - STABILITY has returned. More on that in a bit.

Okay, maybe I'm starting to understand it a bit. Sam, thank you for trying to help me figure this out. But MAYBE - I'm not positive - but MAYBE I have discovered the culprit: namely a clash between Catalyst and Trixx. The proof of stability is that I am ecstatic to report 10 hours of solid gaming last night without one single hangup or crash from the HIS IceQ card. This was on Crysis 2, fully loaded, all ultra settings (except motion blur on medium), 2560x1600, with the DirectX 11 package and also the 1.9 GB high-res textures package.

Holy moly! I had NO IDEA what anybody was talking about when they said "DirectX 11 eye candy," but let me just say: the DirectX 11 Crysis 2 graphics with the High Res Texture Pack are indescribably stunning. Words don't do it justice. I guess I could post a screenshot, and that might help. Everywhere you turn, everywhere you look, almost no matter where you are, I have never seen anything this good looking. I have caught glimpses of it in some parts of Hells Highway. I caught many glimpses of it in Crysis. But Crysis 2 with DX11 and the high-res texture pack puts it all together. I read a few net postings from people who rated Crysis 2 an 8.5, then came back to play it again with the DX11 and high-res packs, and revised their rating to 9.5. For me, so far, I rank it 9.8 - the same as Crysis - and that is heavily biased by the effect of these graphics. The gameplay itself is not extraordinary so far, and maybe will never live up to the Crysis action (although I really enjoyed the very brief amphibious-vehicle section - firing rockets at everybody, shooting down the chase helicopter, greeting the two enemy AVs with a taste of their own medicine - very thrilling, albeit very short).

Here's what I discovered as I started to play - mostly just as a test to compare against later, once I have crossfire running. I loaded up the game, DX11, high res, and saw that my fps was about 17. Okay, not really playable, but let's check it out for a few minutes anyway. I noticed in that first little section in the park that the graphics seemed extraordinary. My OSD reported a curiosity: the memory clock was at 1250, although I had set it at 1350. My HIS settings, which had been very stable, were 975, 1350, and 1087 VDDC. I changed the memory to 1400 to see what that would do - in-game it was still 1250. Hmmmm. I alt-tabbed out of the game, looked at Task Manager's processes sorted by CPU to see what was running, and I noticed CCC and MOM. They should not have been there - I had turned them off by unchecking them in msconfig quite a while back. "I wonder if Catalyst is holding back the memory overclock - of course it is, because I have not enabled Overdrive."

Then I did one more thing. I had mentioned before that the Unigine Heaven benchmark is a very fast - maybe 30-second - test of whether I am getting full GPU performance. I just load it up with Jeff's settings - normal tessellation, 16x anisotropic, 4x AA, 1920x1080 - and run it windowed. As soon as it loads and starts running I should get around 50 fps. But the fps was only 27!! WTF!!! Ah hah, a Catalyst and Trixx clash!! I unchecked Catalyst in msconfig, rebooted, ran Heaven, got my 50, and started up Crysis 2, and now I'm getting about 26 fps - playable and smooth. "I'll just check it out a bit."
That turned into 10 hours of trying to reach the scientist guy, finishing off with protecting him as he headed downtown while I had to take my own route. My frame rate was max 33, average 26, and minimum 23-24. For example, when the freeway collapsed and all that dust filled the air, I noticed some lag - which I attributed to the seismic activity - but then I noticed my frames at about 24. However, as I am playing at the hardest difficulty, I stay cloaked most of the time and am not running about very much - these guys are crack shots. Occasionally I have to run like hell when some unexpected shooter appears overhead, but amazingly I usually escape. LOL

(The other interesting thing the OSD told me was that my core loads are no longer 80-90% on the fourth core only, with the rest at 20-30%. This time all the cores were loaded roughly equally, at around 46-55%. Following Jeff's explanation of CPU bottlenecking above, it sounds like this 3.334 overclocked 9450 will indeed carry me a long way on 30" gaming and crossfire 7950s, until a 2013 or more likely 2014 motherboard upgrade to Ivy Bridge allows a CFx3 with one more 7950, and maybe enough power to take me into 2016.)

Looking back, I guess the Catalyst incompatibility with Trixx caused the 3 hangups of the mighty HIS card, every two hours, in the final sections of Crysis, after I had removed the Gigabyte, which was causing blue screens. That would mean the incompatibility was there unnoticed while I was testing the Gigabyte. But my frame rates were okay during that final Crysis sequence, and I didn't notice the 1250 vs 1350 memory-clock discrepancy - though maybe I just overlooked it. So I still believe the blue screens were the result of the extra heat being dumped into the case by the Gigabyte card, as opposed to the turbine fan of the HIS pushing heat out the back. The extra heat didn't appear to affect my core temps, from what I can remember, but maybe my northbridge temps were excessive. I have a very mild voltage increase on the northbridge of only 1.35, vs the standard 1.25 - nowhere near the 1.5 that Sam told me was the highest I should go.

So if my testing was unfair to the Gigabyte, due to Catalyst fighting Trixx, the only remaining mark against it is the very visible green sparkling on one part of the screen, about 1/2" high and 2" wide, against a black menu or in-game against a black floor - which the HIS does not do. Since I am planning to change cases over the next few days, after I take receipt of and thoroughly test the second HIS I ordered yesterday, perhaps the Gigabyte would work, since the Spedo case will vent the hot air it produces - however, the minor sparkling issue remains. So I will likely return it within the next couple of weeks, before my 30-day trial is up. By the time I buy a 3-graphics-card motherboard, the Gigabyte's price may have dropped from $359 to the $200 range - since that may not be until 2014, if this HIS IceQ crossfire rig performs as well as I expect, given how well the single 7950 is performing so far. My ideal 3-GPU board would offer 16x slots at 1, 4, and 7, but there will probably be no board like that, so one card will always have to be something other than the 3-slot HIS IceQ. The closest board so far that will work is the 1155 Ivy Bridge Gigabyte board, with slots at 2, 5, and 7, allowing two 3-slot HIS IceQ cards and one 2-slot Gigabyte or similar.

So - as you can imagine - I am walking on clouds.
My 9450 overclock is rock-solid, and my HIS IceQ returns to its former status as the can't-get-hot, never-hangs 7950 graphics card champion. Rich
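PS - for anyone else who hits this Catalyst/Trixx clash, here's a sketch of how the CCC/MOM check could be scripted instead of eyeballing Task Manager (assumes Python with the third-party psutil package; the process names are just the ones Task Manager showed me):

```python
import psutil  # third-party: pip install psutil

# Names as they appeared in Task Manager; matched case-insensitively.
SUSPECTS = {"ccc.exe", "mom.exe"}

running = {p.info["name"].lower()
           for p in psutil.process_iter(["name"])
           if p.info["name"]}
clash = SUSPECTS & running
print("Catalyst helpers running:", ", ".join(sorted(clash)) if clash else "none")
```

If it prints anything, Catalyst crept back in after the msconfig change and is probably fighting the tweaker again.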
Well, I'm all done with installing for now, and I'm real pleased with the results. Win7 has some minor issues - the screen saver won't work, and the system's plug and play doesn't work very well. It won't even recognize my M$ 3000 mouse. Still, overall I'm very happy with the performance.

Temps: http://i1171.photobucket.com/albums/r556/theonejrs/Pics1/temps71612.jpg

I'm finally going to have to pull the radiator and clean it out, as it's never gone over 53C before and now it peaks at 55C. It hasn't been cleaned since it was installed in the case over a year ago.

WEI: http://i1171.photobucket.com/albums/r556/theonejrs/Pics1/WEI74.jpg

I'm pretty pleased with the Windows Experience rating of 7.4, up from 7.0. Switching to AHCI for the SSD increased the SSD's score from 7.0 to 7.4, and it's noticeably faster. It purrs like a kitten! LOL!! Best Regards, Russ
You've had nothing but problems with that Microsoft 3000 mouse - I thought you had finally found a good driver for it? Apparently not! So it didn't Plug 'N Play for you the first time around; what made you think it would the second time? LOL I just re-installed my notebook with a 60GB SSD for the OS drive and my WEI # went up about the same the second time around, but I didn't change anything, so go figure.
Steve, It identified the mouse OK the first time, but wouldn't install it - a hardware conflict of some kind. They will get it working again. Typical Windows install, never the same way twice! Are you able to get your notebook into AHCI mode for the SSD? Best Regards, Russ
Yes - by default my notebook BIOS sets up AHCI automatically, and I installed the Intel H61 AHCI driver first, before the chipset driver, which is backwards from normal practice. However, if I don't do it that way, Windows seems to have a conflict and doesn't install the AHCI driver properly. That is probably why my numbers were better the second time around.

Now, my desktop AMD Gigabyte setup was more difficult, as it was very confusing understanding which controllers - two different ones - were what in the BIOS. There was much trial and error to get it right with my configuration. Plus it didn't help that Adobe kept corrupting my hidden shadow structure around I30; I should have asked Kevin how he disabled Adobe updates instead of doing it the hard way. Both SSD systems work perfectly now in AHCI mode and do perform better since I straightened everything out. I see that OCZ 120GBs are down around $70 now, so it has me thinking about upgrading the 60GB in the notebook, as I'm 17GB from full just using it as an OS drive - the formatted total is around 55GB.

Glad to hear the mouse is in check again, Stevo
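PS: for anyone flipping an existing Win7 install to AHCI after the fact, the usual gotcha is that the msahci driver is set to disabled until you tell Windows otherwise - change that before switching the BIOS, or you get a 0x7B blue screen at boot. A rough sketch of the registry change using Python's winreg (run it elevated, or make the same edit by hand in regedit; back up the registry first):

```python
import winreg  # standard library on Windows; run from an elevated prompt

KEY = r"SYSTEM\CurrentControlSet\services\msahci"

# Set the AHCI miniport driver to start at boot (Start = 0) so Windows 7 can
# boot once AHCI is enabled in the BIOS; the default of 3 causes a 0x7B stop.
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "Start", 0, winreg.REG_DWORD, 0)

print("msahci set to boot start - now enable AHCI in the BIOS and reboot.")
```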
In the process of going through my old games collection and enjoying everything that requires a flight stick and throttle. To my surprise, X2: The Threat, which has normally worked fine on Win 7 before, was having massive audio issues. It was like only the very low end of the sounds was being played - basically good bass response but no actual sounds coming from the speakers. After about an hour of fooling with the game, EAX, different versions of my hacked drivers, etc., I had a great dawning of realization. I had previously installed a new MP3 codec called MAD, an open-source codec that solves a myriad of sound issues in Gamebryo/Bethesda games (mostly the vicious sound stutter in Fallout and New Vegas on Creative cards). It seems X2 did not like this codec at all! So I broke out the ole' "InstalledCodec" utility and switched it back to the Windows default "Fraunhofer IIS MPEG Layer-3 Codec". Everything worked without a hitch.
-----------------------------------------------------------
It's amazing the problems I cause myself sometimes in the pursuit of flawless gameplay. I even have 3rd-party drivers for my 360 controller that require me to update my driver software in Device Manager to switch between "XBCD" mode and "360 for Windows" mode. This driver allows me to set a deadzone for my slightly loose joystick, as it's very much an original gamepad, lol. Any GFWL game that auto-detects it usually has a deadzone built in, but many older games and emulators do not have this hardware deadzone setting.
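Incidentally, going back to the codec swap above: if you just want to see which ACM codec is currently wired up without opening a utility, the mappings that InstalledCodec edits live under the Drivers32 registry key. A quick read-only sketch in Python (the Drivers32 location is the standard spot for ACM codec mappings as far as I know - treat it as an assumption):

```python
import winreg

# ACM codec mappings (what InstalledCodec shows) live under Drivers32;
# the "msacm.l3acm" entry is the active MP3 (Layer-3) codec.
KEY = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Drivers32"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    i = 0
    while True:
        try:
            name, value, _vtype = winreg.EnumValue(key, i)
        except OSError:          # no more values under the key
            break
        if name.lower().startswith("msacm"):
            print(f"{name} = {value}")
        i += 1
```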
I've now had 2 Actiontec modem/routers fail, and I'm 99.9% sure a 3rd is failing as well. Needless to say, I'm not interested in another Actiontec. I'm currently looking at a Netgear: http://www.newegg.com/Product/Product.aspx?Item=N82E16833122377 Not too fond of the price, but it does seem to get reasonable reviews for quality and range. What would you guys recommend? Keep in mind it needs to be a DSL modem, and I REALLY want wireless N.