
Cheap and Nasty s-video cable triumphs!

Discussion in 'Home Theater PC' started by dworkeen, Jan 6, 2006.

  1. dworkeen

    dworkeen Member

    Joined:
    Jan 6, 2006
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    11
On a near-finished HTPC setup I replaced a working but nasty plastic s-video/SCART cable to the CRT. Oddly, though the new and pricier cable meters out as good, the video card (a twin-head ATI by Sapphire) has difficulty seeing the TV on the end of it. On boot it doesn't recognise it (no enabled display in Display Properties\Settings). The only way to get it to work is to boot with the cheap cable, then swap in the better one. Weird, huh? Any thoughts most welcome.
     
  2. SypherTek

    SypherTek Guest

    You've probably got it set up as a monitor and TV, right?

    When you're using the two RGB ports you need it set to dual monitor, as TV will only be available on the S-video port (TVs don't usually use RGB). It may be getting picked up as the default monitor.
     
  3. dworkeen

    dworkeen Member

    Joined:
    Jan 6, 2006
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    11
    Thanks SypherTek.
    Sorry, my original post wasn't very clear.

    There's no monitor in the system, just a TV (CRT) on the SCART end of an s-video-to-SCART cable (s-video out from the computer, of course).
    If I boot with the nasty cable (very cheap, Maplin I think) the ATI card (Radeon 7000) recognises there's a TV on the other end of its output, and all is fine if not great.
    But if I boot with a better, though not great, Philips cable, the Radeon doesn't recognise it!

    Both cables are wired the same, using the correct pin-outs, but only the cheaper one is seen on boot. After boot the better cable can be hot-swapped in and then works, after a fashion, since some of the ATI graphics properties are greyed out.

    I can only assume it's some arcane feature of the way the ATI 'sees' what's on its output.
    Trouble is, I'm not especially video literate.
    Also, there's a very noticeable increase in luminance with the better cable; the display is way brighter.
    Any idea how ATI detects displays?
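    For what it's worth, my understanding (and this is an assumption, since ATI's actual detection logic lives in the card's BIOS/driver and isn't public) is that TV-out detection is usually done by load sensing: at boot the DAC briefly drives a test level and checks whether a properly terminated (roughly 75-ohm) line pulls the sensed voltage down. A cable with odd termination could plausibly fool that check. A toy sketch of the idea, with made-up values:

    ```python
    # Toy model of boot-time TV detection by load sensing.
    # All constants are hypothetical illustrations, not real ATI values.

    DRIVE_MV = 700             # test level the DAC drives, in millivolts
    SOURCE_OHMS = 75           # assumed source impedance inside the card
    DETECT_THRESHOLD_MV = 500  # sensed level below this => assume a TV is attached

    def sensed_level_mv(load_ohms):
        """Simple voltage divider: what the sense comparator would see."""
        if load_ohms is None:  # open circuit: no cable or no TV
            return DRIVE_MV
        return DRIVE_MV * load_ohms / (SOURCE_OHMS + load_ohms)

    def tv_detected(load_ohms):
        """True if the line looks terminated, i.e. something is loading it."""
        return sensed_level_mv(load_ohms) < DETECT_THRESHOLD_MV

    print(tv_detected(75))    # correct 75-ohm termination -> detected
    print(tv_detected(None))  # open line -> not detected
    print(tv_detected(300))   # high-impedance cable -> not detected
    ```

    If something like this is going on, the "good" cable may simply present a higher impedance to the card at boot, which would also fit the brightness difference you're seeing.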
     