Graphics cut out, Windows default driver fixes

Discussion in 'Home Theater PC' started by jcalton88, Jul 1, 2009.

  1. jcalton88

    I have an XFX Nvidia GeForce 9500 GT 1 GB DDR2 video card.

    It worked fine, but it just cut out this afternoon. I couldn't get a display up with it at all, so I took it out, used the onboard video, and uninstalled the drivers. The card won't come up because Windows automatically installs the drivers as soon as I start up the PC with it in. If I start in Safe Mode it will work (using default drivers, I guess?), but if I install the drivers again and restart, it fails.

    I have been able to get it to work using really old drivers, but they don't offer the features I bought the card for.

    I guess my question is: what would you recommend? Is it worth it to keep trying, wait for a new driver release, or just buy another card, possibly switching to an ATI card?
     
  2. 8chaos

    First off, what operating system are you using? Second, what chipset is your motherboard using? Or better yet, what is the motherboard?

    Do you mean that the card will give a POST screen, but will not boot into Windows?

    The fact that you can boot into Safe Mode means this is most likely a driver issue, not an underlying hardware problem, as you well know by now.

    What are the features you mention that the old drivers don't have? Are you looking for the TV-out function, HDMI audio, or overclocking utilities?

    Is this the card, or a similar product? --> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150322

    Uninstall the Windows default drivers via Device Manager; if you are not able to do that in normal mode, do it in Safe Mode. When you boot back into normal mode, press 'Cancel' when the system gives the "New Hardware Found" dialog box and install the driver from the disc. If that does not work, uninstall the system drivers and try to force Windows into using the GPU driver: in Device Manager > right-click > 'Update Driver' > 'Install from a specific source' > 'Don't search...' > 'Have Disk...' > point to the location.
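
    If you'd rather do the cleanup from a command line, here's a rough Python sketch of my own (not an official procedure) that lists which display driver Windows has bound and then removes the device with Microsoft's devcon tool, so the auto-installed driver lets go of the card before you point the wizard at the disc. devcon is not installed by default (it comes with the Windows Driver Kit / support tools), and the "PCI\VEN_10DE*" hardware-ID pattern is an assumption; confirm the real ID with 'devcon find *' first.

    Code:
    # Sketch only: check which display driver Windows bound, then remove the
    # device with devcon so the inbox driver doesn't grab it again before you
    # can install from the disc. Requires devcon.exe on the PATH.
    import subprocess

    def show_display_drivers():
        # List the video controllers Windows sees and the driver versions in use.
        subprocess.check_call(
            ["wmic", "path", "Win32_VideoController",
             "get", "Name,DriverVersion,DriverDate"])

    def remove_nvidia_device(hardware_id=r"PCI\VEN_10DE*"):
        # Remove the matching device (NVIDIA's PCI vendor ID is 10DE), then
        # rescan; cancel the "New Hardware Found" wizard and use the disc.
        subprocess.check_call(["devcon", "remove", hardware_id])
        subprocess.check_call(["devcon", "rescan"])

    if __name__ == "__main__":
        show_display_drivers()
        # remove_nvidia_device()  # uncomment once the hardware ID is confirmed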

    There is a workaround if you are still unable to install; look at this thread. It starts off talking about ATI drivers and XP SP1, but it has relevant information you can use with Nvidia drivers on XP/Vista.

    Also, check out this Microsoft article; it may help.

    Good luck, and to answer your final question:
    IMO, this card will work; it's just a matter of getting the system to take the driver... although that may involve an amount of work you don't want to do. It won't be worth it to wait for a new driver release... if you are not able to install the newest drivers right off the bat, then you probably won't be able to with any of the other new releases either. It's up to you if you want to get another card... if you are in a position to return the card for a different one, then it's all good... it just boils down to how much time you want to spend.
     
  3. jcalton88

    I played around with it for a while last night, and with the newest drivers installed (186.xx) I got it to work with the S-Video output on the card.

    And yes, that card you linked to is the one I have.

    I'm not sure of the motherboard (pre-built PC; I upgraded components to use it as an HTPC). The manufacturer's website doesn't give much detail as to what motherboard it is either.

    I'm not sitting in front of the computer to say exactly, but I do know it is an AMD chipset.

    The functions I'm looking for are the TV-out functions. The computer is hooked up to a 42" HDTV and I use it as an HTPC. The card has a DVI output and I use a DVI-to-HDMI cable.

    So, here's a wrap-up so far:

    Got Windows to take the driver; it displays over S-Video and (now that I've played with the settings while I could see them) VGA.

    I am almost happy to leave it like it is, BUT the VGA output doesn't seem to support the resolution I want. I'd like a full HD resolution. If I set the card's resolution to 1920x1080, it displays, but with black bars on either side of the screen. The bars are about 2" wide each. At lower settings it takes up the whole screen. When the DVI output worked, it would display 1080p and fit the screen perfectly.

    I am going to play with it more today and try some of the tips in the posts you linked me to. Now that I've made some progress, any new ideas as to why it's not working with the DVI? If I cannot get it to work I might buy a new card anyway, as I use the card to display movies and TV shows in HD on my HDTV.

    What card would you recommend for running an HTPC? I'm really leaning towards ATI, as it seems most of their cards support audio through HDMI.
     
  4. 8chaos

    It may well be that your TV doesn't support 1920x1080 at 60 Hz over the VGA input. Check your owner's manual to see what specific resolution it needs. Mine, for instance, requires 1920x1080 @ 59 Hz, and oddly it's out of whack at 60 Hz over VGA. It would be interesting to see what your TV will do at 1080i (1366x768 @ 25 Hz) or even 720p (1280x720) over VGA. Regardless, quality is going to suffer with VGA compared to DVI.

    With the older Nvidia drivers there was a display mode called "CVT" that needed to be switched on for bigger displays. Looking at the Release 186 (July 2, 2009) documentation, it mentions nothing regarding CVT or Coordinated Video Timings mode, so it sounds like a moot point with this version.

    As far as it not working with DVI: you had the DVI output working at one point? Was that with an older driver or setting? Try booting into Safe Mode with DVI; it should display at 640x480, just to see if you can get a signal. Also, there is a setting in the Nvidia Control Panel named "Force Television Detection at Startup" that may help if you're not getting any signal. Try hooking your standard computer monitor up to your HTPC and set a resolution you know the TV will take, such as 1024x768. Then, while connected to your TV, change the resolution to 1920x1080.
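
    Since you'll be switching back and forth between the monitor and the TV and may not be able to see the desktop while you do it, a small script can set the mode "blind". This is just a sketch of my own using the pywin32 package (win32api/win32con), not anything that comes with the Nvidia driver:

    Code:
    # Sketch (assumes the pywin32 package is installed): switch the primary
    # display to a known mode without going through the control panel.
    import win32api
    import win32con

    def set_mode(width, height, refresh):
        # Start from the current mode and change only size and refresh rate.
        dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
        dm.PelsWidth = width
        dm.PelsHeight = height
        dm.DisplayFrequency = refresh
        dm.Fields = (win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT |
                     win32con.DM_DISPLAYFREQUENCY)
        result = win32api.ChangeDisplaySettings(dm, 0)
        if result != win32con.DISP_CHANGE_SUCCESSFUL:
            print("Mode change rejected (code %d); the driver/display does "
                  "not advertise that mode." % result)

    if __name__ == "__main__":
        set_mode(1024, 768, 60)     # a mode the TV should accept
        # set_mode(1920, 1080, 60)  # then try full HD over DVI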

    Although 1920x1080 is an implied resolution and should work out of the box; that's why it sounds odd. If your TV does not support one of the "standard" DVI resolutions, one of them being HDTV (1920x1080) @ 60 Hz, it may unfortunately be a fruitless venture.
     
  5. jcalton88

    The TV is detected by the card over S-Video, but of course I can't get anywhere near the resolution I'd like. It worked before at full 1080p @ 60 Hz, and after the new drivers, nothing. I realize that quality will suffer, but I'm just happy to see it right now.

    I want a card that supports audio through HDMI, and it looks like ATI is my best bet.

    I'm going to try those steps you gave me and see if I get anything from it. Is there a way (a file, XML, or something) that I can change the DVI resolution without having the TV connected? I know the control panel doesn't give the option unless it's connected. I do think this is the reason for my problems.

    And as far as the manual goes, the TV doesn't support anything higher than 1280x1024 @ 60 Hz.
     
  6. 8chaos

    What happens when you change the resolution while your traditional computer monitor is connected to the machine via DVI, then switch the computer over to the TV's DVI input? As for another method... you can try to change the resolution via the registry. I have not tried this method personally, but you can give it a shot. I would definitely try it with your computer monitor first, to be safe. Here are instructions:
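
    Roughly speaking (this is only a sketch of the idea, not the exact steps): on XP/Vista the per-adapter default mode is normally stored under HKLM\SYSTEM\CurrentControlSet\Control\Video\{adapter GUID}\000N, in values like DefaultSettings.XResolution, DefaultSettings.YResolution and DefaultSettings.VRefresh. Something like the Python below just dumps what's there so you can tell which 000N entry belongs to which output before you edit anything by hand; back up the key first, and treat the key and value names as something to verify on your own machine.

    Code:
    # Sketch only: print the per-adapter default display modes stored in the
    # registry so you can see which entry maps to which output. Read-only;
    # back up the key before making any changes by hand.
    import winreg  # on Python 2 this module is named _winreg

    BASE = r"SYSTEM\CurrentControlSet\Control\Video"

    def dump_default_modes():
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as video:
            i = 0
            while True:
                try:
                    guid = winreg.EnumKey(video, i)
                except OSError:
                    break  # no more adapter GUIDs
                i += 1
                for sub in ("0000", "0001"):  # usually one subkey per output/head
                    path = BASE + "\\" + guid + "\\" + sub
                    try:
                        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
                            w, _ = winreg.QueryValueEx(key, "DefaultSettings.XResolution")
                            h, _ = winreg.QueryValueEx(key, "DefaultSettings.YResolution")
                            hz, _ = winreg.QueryValueEx(key, "DefaultSettings.VRefresh")
                            print("%s\\%s -> %dx%d @ %d Hz" % (guid, sub, w, h, hz))
                    except OSError:
                        pass  # this entry has no stored default mode

    if __name__ == "__main__":
        dump_default_modes()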

    You may have just answered your own question. However, it's odd that you said you once had 1920×1080 displayed. Your best bet with your HTPC may be 1280×720 (720p) if nothing else works. Let us know what happens!
     
    Last edited: Jul 15, 2009
  7. jcalton88

    I have to track down a monitor with DVI first. None of my monitors support it, only VGA. And as far as the TV not supporting 1920x1080, that is only through VGA. Through HDMI it supports full 1080p; that is how I had it displaying at 1920x1080 before.

    I'll see about getting another monitor to try it on; I just wish I didn't have to spend money on something I'll never use.
     
  8. 8chaos

    Ah, I see. I just assumed that you had a monitor you were working with over DVI. You can try the registry tweak nonetheless, to at least see what happens...
     
  9. jcalton88

    Nope, I have a flatscreen VGA monitor whose max resolution is 720p. I've had the card working through DVI on the TV, but that's the only thing that's ever used the DVI port on my card.

    As a matter of fact, I don't even have that flatscreen monitor right now; I'm letting my parents use it. I'm going to play with it and the registry tweaks. I just wonder if the registry has separate settings for the DVI vs. the VGA outputs. It should, but hopefully I'll be able to tell them apart.
     
