
Help connecting DVI to HDMI

Discussion in 'Home Theater PC' started by knickzfan, Jul 28, 2006.

  1. knickzfan

    knickzfan Guest

  1. knickzfan

    knickzfan Guest

    hi everyone... first time in this forum, but after reading some of the threads, you people rock. OK, here is the problem:

    i'm trying to connect my Nvidia GeForce 7800 GS video card to my new Sony Bravia KDL-32S2000. the video card has a DVI output and i am connecting to the HDMI input on the Sony via a DVI-to-HDMI cable manufactured by Link Depot.

    i can actually get a signal, and the dimensions (1920 x 1080) are cool... BUT the resolution is crap! it looks fuzzy and there is a good deal of flickering. i have tried the Nvidia software, different connectors and even the PowerStrip software. NOTHING!! i am so at a loss and need one of the scientists from this forum to help me.

    thanks...
     
    Last edited by a moderator: Jul 28, 2006
  2. id3372

    id3372 Member

    Joined:
    Jul 21, 2006
    Messages:
    40
    Likes Received:
    0
    Trophy Points:
    16
    It sounds like your converter is converting the signal to analog before converting it to HDMI.
     
  3. wildo2ne

    wildo2ne Member

    Joined:
    Apr 26, 2006
    Messages:
    86
    Likes Received:
    0
    Trophy Points:
    16
    HDMI = High-Definition Multimedia Interface

    DVI = Digital Visual Interface

    DVI is not meant to drive HDMI; it does not have the bandwidth to do it properly. Hook up a DVD player or Xbox 360 to the HDMI input instead. See if Nvidia has an RGB adapter for your card; if not, ditch it and get an ATI 9500 or greater, then get ATI's RGB adapter and hook up your computer that way.
     
  5. knickzfan

    knickzfan Guest

    hey there. thanks for the responses. the Nvidia card actually has a DVI and a VGA output. when i connect to the LCD using VGA, i can get a resolution of 1366 x 768. this is respectable, but everything is big and i want to take advantage of what the LCD can really do... which is 1920 x 1200 (and better). the only way i could get this was going from the DVI to the HDMI, and as mentioned before, the clarity is crap.

    what i had hoped to get was some kind of (software or hardware) way to rig the DVI to the HDMI and get proper clarity. as for my converter possibly converting the signal to analog, this can't be... otherwise i wouldn't see the higher resolution. at least, that's my deduction. any more help would be greatly appreciated.

    wildo2ne, are you saying that the ATI 9500 (or greater) has an HDMI signal? if so, i will ditch my $349 Nvidia for it. money is just money. thanks again.
     
  6. 223322

    223322 Guest

    Hey knickzfan, I have an Nvidia 7600 GT and a 32" LCD w/HDMI; my video card is DVI out. The truth is there's no easy answer for these resolution issues, and I'm not sure wildo2ne knows what he/she is talking about, because HDMI and DVI are both fully digital, though not always fully compatible. The only difference is that DVI is a video-only signal while HDMI carries both audio and video.

    The problem is that TV manufacturers don't give much thought to natural video card resolutions (which must always be divisible by 8!!!). Many common TV resolutions like 1366 x 768 are not divisible by 8. 760 or 768 vertical lines work with your video card's accepted resolutions because those numbers are divisible by 8.

    My problem was that when I tried to run my TV at its native res (1366 x 768), it would switch to 1080i, which looked less clear than 720p. The workaround that worked for me: accept that you may not be able to run your set at its native res and settle for a little less.

    HDMI is, just like it says, an HD signal, so it will only truly display 720p/1080i; if you try to exceed 720p as a resolution, it will automatically switch to 1080i! When my TV did this it looked like crapolla, so I tried another setting: 1280 x 720 (a good natural 16:9 HD res). Then enable monitor scaling in your video card options, and finally go to overscan compensation and reduce the size of your display to fit your TV. I have mine running at 1216 x 676 in 720p and I'm very satisfied with my picture quality.
     
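    The overscan workaround described above is just arithmetic: shrink the 1280 x 720 mode by a few percent and snap each dimension to a multiple of 8. A minimal Python sketch of that calculation (the function name and the 5% figure are illustrative assumptions, not anything from a driver API):

    ```python
    def overscan_resolution(width, height, overscan_pct, step=8):
        """Shrink a display mode by overscan_pct percent and round each
        dimension down to a multiple of `step` (8 here, since video cards
        commonly only accept dimensions divisible by 8)."""
        scale = 1 - overscan_pct / 100
        w = int(width * scale) // step * step
        h = int(height * scale) // step * step
        return w, h

    # Starting from a clean 1280 x 720 (720p) mode, shaved 5% for overscan:
    print(overscan_resolution(1280, 720, 5))  # -> (1216, 680)
    ```

    This lands close to the 1216 x 676 quoted above (676 is not itself a multiple of 8, so that exact value presumably came from the driver's overscan slider rather than a strict divide-by-8 snap).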
