Hi all... this seems to be a popular topic in this forum, and I'm sorry to bug you with another thread, but I can't seem to wrap my head around how all of this hi-def stuff works. Here goes:

I have a Toshiba 32" LCD TV (32HLC56) that I want to connect to my PC via my video card's DVI port. I know native resolution is an important piece of info; the native res for my TV is 1366x768. I currently have an ATI Radeon X300, though I'm upgrading within the week to another ATI card, the X1950XT. I figure that since they're both ATI cards with DVI outputs, any settings I adjust now will be ready and waiting when I install the new card.

Anyway, here's the problem. I run a DVI-to-HDMI cable from the PC to the TV. I've messed around in Catalyst Control Center and gotten the TV to display an image using the "720p60 Optimized (1152x648 @ 60 Hz)" setting in the Digital Panel Properties section. The problem is that everything now looks too big and blocky, and some of the text is difficult to read; the image just isn't as sharp as I thought it would be. Unlike my VGA cable, the DVI-to-HDMI connection does let the picture fill the entire screen. But when I try to set the TV to something close to its native resolution, the resulting image looks terrible - very glitchy and flickery (is that a word?).

Forgive my ignorance... I've tried reading the FAQs on this subject, but I can't seem to figure it all out. Any help would be very much appreciated. Thanks!