DVI monitor display

Discussion in 'PC hardware help' started by shadob, Jan 9, 2006.

  1. shadob

    shadob Guest

    I have a Samsung 19" monitor that will display both DVI and analog. I installed a GeForce FX5500 to get the digital output, then found my motherboard will not support the DVI function. Can I get an upgrade for the MB, or buy a new board, or have I wasted the money on the card? Anyone have suggestions? Please let me know. Thx
     
  2. ozzy214

    ozzy214 Regular member

    Joined:
    Jul 28, 2005
    Messages:
    918
    Likes Received:
    0
    Trophy Points:
    26
    It's not the motherboard, as far as I'm aware, that supports DVI...it's the card. Most likely that card will only support analog, not digital DVI, since it's an older card. And yeah, the monitor is definitely a digital signal. Yes, there are two different types of DVI.

    So double-check your card and make sure it's putting out a digital DVI signal. But I doubt the mobo has anything to do with it.:>
     
  3. shadob

    shadob Guest

    The card has a DVI output and the seller told me it would work with a DVI monitor. The monitor has analog and DVI hookups. How do I check the card to see if it is putting out a DVI signal? Thank You
     
  4. ozzy214

    ozzy214 Regular member

    Joined:
    Jul 28, 2005
    Messages:
    918
    Likes Received:
    0
    Trophy Points:
    26
    I'm not sure how, but I will say this, from my own experience. I bought an IBM monitor made by Sony. It's in my sig. I went out and got a DVI cable, only to find out the monitor wouldn't work. I went over to IBM's site and found out the DVI was actually not a true digital DVI, but an analog DVI, and just another means to hook the monitor to the computer.

    So if it was me, I'd google for the card. Make sure it's listed as having a digital DVI signal output, and that the DVI on your monitor is also a digital input.

    Also, another thing to try. You can't have both the analog and DVI cables hooked up at the same time. Just turn the comp off. Hook the DVI cable up to the monitor and the comp. Make sure the monitor is set to use the DVI input, either with a switch on the front or in the menu on the monitor. Now boot up the comp and see if you get a signal.

    Sometimes the DVI on the monitor will just be listed as input one or two. If all else fails, then the card or the monitor is bust.:>

    Also, you've got to make sure you have the right cable. There are three different ones: one for analog (DVI-A), one for digital (DVI-D), and one for both (DVI-I).
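If it helps, here's a little sketch of how the three connector types match up (just an illustration, using the standard DVI-A/DVI-D/DVI-I names; the helper function is made up for this example):

```python
# Illustrative sketch: which DVI connector types can carry which signal.
# DVI-A = analog only, DVI-D = digital only, DVI-I = both (integrated).
SIGNALS = {
    "DVI-A": {"analog"},
    "DVI-D": {"digital"},
    "DVI-I": {"analog", "digital"},
}

def link_works(card_port: str, cable: str, monitor_input: str, want: str) -> bool:
    """True if the wanted signal type can pass card -> cable -> monitor."""
    return all(want in SIGNALS[part] for part in (card_port, cable, monitor_input))

# A DVI-D cable between two DVI-I ports carries a digital signal fine...
print(link_works("DVI-I", "DVI-D", "DVI-I", "digital"))  # True
# ...but an analog-only DVI-A cable can never carry a digital picture.
print(link_works("DVI-I", "DVI-A", "DVI-I", "digital"))  # False
```

The point being: every piece in the chain (card port, cable, monitor input) has to support the signal type you want, or you get no picture.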

    http://www-307.ibm.com/pc/support/site.wss/document.do?sitestyle=lenovo&lndocid=MIGR-41219

    This is for my monitor, but it explains the differences between the digital and analog DVI cables. So make sure you have the right cable, for starters.
     
    Last edited: Jan 9, 2006
  5. shadob

    shadob Guest

    Wow, lots of good info! Will start checking each thing, one at a time. Will take a while. Will post the answer when I find it! Thank you for all of your great info. This site is really a help to know-nothings like me. Thanks again.
     
  6. ScubaBud

    ScubaBud Regular member

    Joined:
    Dec 29, 2004
    Messages:
    1,951
    Likes Received:
    0
    Trophy Points:
    46
    If your monitor has a DVI input and your graphics card has a DVI output, you are in!

    Make sure you set your monitor in its setup menu to select DVI first, and once connected, your card should pick this up.
     