I have recently built a new PC, spec as follows:

CPU: AMD Phenom II X4 955 Black Edition
Mobo: Asus M4A89GTD PRO (HD 4290 onboard)
GFX: XFX Radeon HD 4890
RAM: Corsair XMS3 2x2GB 1333MHz
HDD: Seagate Barracuda 500GB, 7200rpm, 16MB cache
DVD: LiteOn etc.

I am using a DVI to D-sub adapter (on the graphics card) to connect to my TV. The output appears as a background with a foreground of vertical bands of horizontal lines, like so:

............ ...................... .........................
............ ...................... .........................
............ ...................... .........................
............ ...................... .........................

except closer together. There are also bad-quality images showing underneath the lines, and once I get past the XP loading screen the output becomes just a sea of dots and then cuts out: no output at all. When I use a D-sub (VGA) cable or HDMI from the onboard graphics, the output is fine. This is not acceptable, as the main purpose of the build was gaming, and if I can't use the graphics card it has already defeated its purpose. Any ideas on how to solve this are appreciated. Thanks in advance.
Does your TV support HDMI? If so, I would try it that way. If it does not support HDMI, then it is probably an old tube TV with no HD support at all. In that case, you will probably have to drop the resolution to 640x480, and hope it is not one of the older models that only supports 320x240.
Nah, it's a full HD 32-inch LG. The graphics card has dual-link DVI ports and an S-Video connector (which it calls HDTV-out); the TV accepts VGA (or D-sub, I can't tell the difference) and HDMI. One of the HDMI ports is labelled HDMI1 (DVI), so I'm going to try and get a DVI-HDMI adapter.
HDTVs generally cannot use D-sub properly. Even high-end TVs have severe D-sub limitations, such as a 1024x768 maximum resolution on a 1080p set. I'd recommend using HDMI or component to the TV.
The max res I get from my laptop using VGA (can someone tell me if this is the same as D-sub, please?) is full 1080p, which is 1960x1080 methinks.
D-sub is the name of the connector type (the general physical connector); VGA is the name of the signal format being carried over it. 1080p is 1920x1080, not 1960x1080, and those extra 40 horizontal pixels can cause lots of problems: sometimes they get cut off, sometimes the display resizes the whole image to keep the aspect ratio, and sometimes it will just freak out and do random weirdness. It all depends on how ambitious or lazy the engineer was when designing the circuitry.
1080p (1920x1080) is what the laptop outputs over VGA. However, most TVs are limited to 1024x768 on their VGA input, so sending 1080p to them over VGA is likely to just cause corruption or an 'out of range' message. This is why you need to use HDMI or component.
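To lay the logic out, here is a minimal sketch of the mode-matching idea. The mode lists are illustrative assumptions (typical values, not read from any particular TV); check your TV's manual for its real supported modes.

# Illustrative only: typical VGA-input limit vs. native HDMI modes on an HDTV
TV_VGA_MODES = [(640, 480), (800, 600), (1024, 768)]
TV_HDMI_MODES = [(1280, 720), (1920, 1080)]

def signal_result(requested, supported_modes):
    # A TV will normally show a mode it supports and reject or garble one it doesn't
    if requested in supported_modes:
        return "displayed correctly"
    return "out of range / corrupted picture"

print("VGA  1920x1080:", signal_result((1920, 1080), TV_VGA_MODES))   # fails
print("VGA  1024x768 :", signal_result((1024, 768), TV_VGA_MODES))    # works
print("HDMI 1920x1080:", signal_result((1920, 1080), TV_HDMI_MODES))  # works

The point is simply that the same 1920x1080 signal that works over HDMI can be rejected outright on the VGA input, because the two inputs advertise different mode lists.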
My laptop has VGA and HDMI out. At max resolution they are the same, 1920x1080 (which I thank you for confirming for me). The motherboard connectors for the onboard graphics are DVI, HDMI and VGA; the HDMI and VGA outputs max out at 1920x1080 and always work with my TV. The two DVI ports on the graphics card are dual-link, so they have a few extra pins, and the adapter is designed for that, so I can't test it with the DVI port on the mobo.
Straight HDMI from the laptop or the onboard desktop graphics works perfectly fine at 1920x1080. Straight VGA from the laptop or the onboard desktop graphics also works perfectly fine at 1920x1080. My TV has no DVI port, so I am going to try a DVI-HDMI adapter to see if that works. As the DVI ports on the graphics card are dual-link, they have extra pins, so their adapter doesn't fit the mobo's DVI port, and I currently have no way of testing it there.
Ahh right, I see. Yeah, it sounds like dodgy VGA output on your PC's graphics card. Just use DVI->HDMI; it provides a much better picture than DVI->VGA ever could, even when the latter is working properly.