I have a 32" LCD TV with a 1280x768 native resolution. My video card is displaying at 1024x768. Is this a less-than-optimal resolution? My video card wouldn't accept being reconfigured at full res. Thanks.
Your display is widescreen (16:9 or close to it), but you have your video card at 1024x768, a 4:3 non-widescreen resolution. Why can't your video card display 1280x768? What video card do you have, and do you have the latest drivers installed?
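To put rough numbers on that mismatch (just width divided by height for the two resolutions mentioned so far; a trivial Python sketch, nothing about your hardware assumed):

```python
# Aspect ratio is just width divided by height.
for w, h in ((1024, 768), (1280, 768)):
    print("%dx%d -> %.2f:1" % (w, h, w / h))

# 1024x768 -> 1.33:1  (4:3, standard)
# 1280x768 -> 1.67:1  (15:9, i.e. widescreen)
```

So anything drawn at 1024x768 either gets stretched sideways to fill the panel or gets shown with black bars at the sides.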
My card is an NVIDIA GeForce 6600 GT 128 MB PCI-E with DVI-I. To be honest, my TV doesn't lend itself to easy configuration. I was looking into a tool called PowerStrip (it forces resolutions on your video card); you may have heard of it, but it's a pain in the ass to configure and maintain. Besides, most games play at 1024x768 @ 60 Hz, so I guess it's not that bad. I was just wondering if others have had this problem, or if people think it's worthwhile to force the resolution?
Hold on, if you have an NVIDIA card, then the ForceWare driver software will easily do that resolution. It can also do custom resolutions, so you don't need PowerStrip. If you run XP, the latest driver version is 84.21.
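If you want to sanity-check which modes the driver is currently exposing before (or after) installing anything, here's a rough sketch. It assumes Python with the pywin32 package installed (that package is an add-on, not part of a stock Windows setup), and just lists whatever Windows reports for the primary display:

```python
# List every display mode the installed driver reports for the primary display.
# Assumes the pywin32 package is installed; it is not part of a stock Windows setup.
import win32api

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = primary display
    except win32api.error:
        break  # pywin32 raises once there are no more modes to enumerate
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for width, height, hz in sorted(modes):
    print("%dx%d @ %d Hz" % (width, height, hz))
```

If 1280x768 (or 1366x768) never shows up in that list, that's when the driver's custom-resolution feature, or a tool like PowerStrip, comes into play.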
Don't be misled by native TV resolution. It all depends on the video card/computer configuration and the type of display you're using. You could literally have the same computer hooked up to a much less expensive display that functions quite well via DVI, while a higher-end, much more expensive display works better through VGA or SVGA. In fact, it's only on rare occasions that a display looks better at anything higher than 1024x768. Further, while it's true for the most part that newer drivers offer more stability and functionality, it is NOT true that they HAVE TO BE the drivers of choice. It varies greatly from one computer/display combo to another.
Basically, I have a Proview 32" LCD with HDMI/VGA/component inputs. My video card is an MSI NVIDIA GeForce 6600 GT. The TV manufacturer claims my TV has a 1366x768 native res. At first I had it set up like this: digital satellite into the HDMI input, and the DVI-I output from my PC with a VGA converter on it into the VGA input on my TV. Between my card and the TV it would only allow a configuration of 1024x768 @ 60 Hz, which at the time I was OK with.

Then I started thinking: for one, wouldn't the optimal picture on my TV come from running at the set's native resolution, and two, I wanted to maintain a straight digital-to-digital connection from PC to TV. So I went out and purchased a DVI-I to HDMI converter and went to set it up, but it would only allow 1280x768 and the picture didn't look any better, so I said to hell with that ($50 CDN, which is what, like $150 US now? LOL). Took the converter back and got Guild Wars instead. I'm thinking now maybe I should've been more persistent with it, or called the manufacturer for help. Do you think it would be worth it for me to try this configuration again to get the maximum resolution?
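For what it's worth, the arithmetic backs up your hunch about native resolution. A quick sketch using the manufacturer's 1366x768 figure and the modes you were offered (exact scaling behaviour depends on the TV, so treat this as the stretch the scaler would need to fill the panel width):

```python
# Horizontal stretch needed to fill a 1366-pixel-wide panel from each source mode
# (vertical is already 768 in every case).
panel_width = 1366
for source_width in (1024, 1280, 1366):
    print("%dx768 source: stretch x%.2f" % (source_width, panel_width / source_width))

# 1024x768 source: stretch x1.33  (everything ~33% wider, or pillarboxed instead)
# 1280x768 source: stretch x1.07  (about a 7% stretch, hard to notice)
# 1366x768 source: stretch x1.00  (1:1 pixel mapping, the sharpest case)
```

So 1280x768 over the DVI-to-HDMI converter was already most of the way there, which is probably why the picture didn't look dramatically better.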
Well, I have a variety of connections into my HDTV at home and different ones at my store. That being said, I should have asked: are you using Windows Media Center, and if so, what type of separate tuner card? I use a separate card for HD as well, of course, which is now supported through MCE, but only with the appropriate updates.
I'm using Windows XP Home Edition, with no separate tuner card. If I were to reconfigure this to max res, wouldn't that only apply to my desktop/media players and not to my games? Because I haven't come across many games, if any, that support 1366x768.
It's been my experience that with certain displays, if you set it to its highest res and then run a program/game that doesn't support that res, the program/game won't always scale back to the highest supported resolution. In many instances your screen will go black and stay that way longer than the 30 seconds Windows normally allows when you purposely change the screen resolution to something your screen won't support. Then you'll have to go through the drill, in some cases, of rebooting your machine in Safe Mode and making the changes accordingly, which, as you know, is a bit of a pain.

As you've already noticed, just because you shelled out the big bucks for DVI/HDMI, that does NOT mean you're going to get better video. As I indicated previously, it can be trial and error, which is OK for me since I have all of the cables, displays, computers, and tuner cards I need. That is why I explain to my customers that we "might" have to switch the video and/or tuner cards I've chosen for my custom-built media center machines, since they don't always mate with the display like you might think. To answer your question, though: if you're happy with your display, leave well enough alone.
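One addendum: if you ever want to check in advance whether a particular mode will "take" without actually switching and risking the black screen, Windows can do a dry-run test of a mode. A rough sketch, again assuming the pywin32 package is installed:

```python
# Ask Windows whether 1366x768 @ 60 Hz would be accepted, without switching to it.
# CDS_TEST tells ChangeDisplaySettings to validate the mode against the driver
# instead of applying it. Assumes the pywin32 package is installed.
import win32api
import win32con

dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
dm.PelsWidth = 1366
dm.PelsHeight = 768
dm.DisplayFrequency = 60
dm.Fields = (win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT
             | win32con.DM_DISPLAYFREQUENCY)

result = win32api.ChangeDisplaySettings(dm, win32con.CDS_TEST)
if result == win32con.DISP_CHANGE_SUCCESSFUL:
    print("Driver says 1366x768 @ 60 Hz would be accepted.")
else:
    print("Driver rejected that mode (code %d)." % result)
```

That only tells you what the video driver will accept, of course; whether the TV then displays it properly is still trial and error.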
Yeah, good advice. Once they announce which video cards are going to be compatible with DirectX 10, I'll purchase a new graphics card, and then I'll be able to look at a more permanent solution.