I would do every kind of conversion in Canopus because it does them right, except for one thing that is mind-boggling. Under Source you can set or change the video aspect/pixel ratio to make a different-sized output. Aspect and pixel ratio go hand in hand: change one and the other changes, or vice versa. I understand that.

HERE'S WHAT'S WEIRD. Starting with square pixels, which means the output will be the same size as the input (at least for AVI files, which usually use square pixels), the encoder works harder, takes longer, and the result is a video that looks pixelated on playback. I've heard of pixel/aspect-ratio flags and I know what those are, but I don't think that's the problem, because if I set the aspect/pixel ratio just slightly lower on the height side, say a 9:8 pixel ratio, the encoding process is twice as fast and the result is a good-looking picture. On the other hand, if I set it at 9:7, it's slow and the result is bad again. Going the other way, making the ratio higher on the height side: 9:10 is good again, 9:9 = bad, 9:8 = good, 16:9 = bad, 4:3 = bad, 8:6 = good. Or sometimes I might type in 720x480 = bad, while 720x437 might be good again. Why does it do that at various stages of resizing? It seems like just one small change will set it off, like going from 640x300 to 640x301, or from 400x250 to 395x250.

I can usually tell right off the bat whether the result is going to look pixelated, because when I hit the convert button it shows how fast it's encoding relative to realtime. When it hangs around 0.30 to 0.40x realtime, the result is bad; around 0.70 up to 1.2x realtime, it looks good.

Now if I get a source file that is 4:3 fullscreen, I can only make the height bigger or smaller, because the ratio that is slow and bad-looking just so happens to be the 1:1 square ratio. So with 4:3 I would have to go smaller on the height, or else it will start to crop, which is fine because I do like a little widescreen, like 16:9 or close to it.
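Just to illustrate the "hand in hand" relationship I mean: display aspect, frame size, and pixel aspect are tied together by DAR = PAR x (width/height), so fixing any two determines the third. A minimal sketch (the function name here is my own, not anything from Canopus):

```python
from fractions import Fraction

def pixel_aspect(display_aspect: Fraction, width: int, height: int) -> Fraction:
    """Pixel (sample) aspect ratio implied by a display aspect and frame size.

    DAR = PAR * (width / height)  =>  PAR = DAR * height / width
    """
    return display_aspect * Fraction(height, width)

# A 720x480 frame displayed at 4:3 needs non-square 8:9 pixels (anamorphic).
print(pixel_aspect(Fraction(4, 3), 720, 480))   # 8/9

# A 640x480 frame displayed at 4:3 works out to 1:1, i.e. square pixels.
print(pixel_aspect(Fraction(4, 3), 640, 480))   # 1
```

This is why changing the pixel ratio in the Source settings changes the aspect (or the output size) at the same time: the converter is just solving this equation the other way around.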
But I can't figure out why I can't just type in any old size (within reason, anyway) and have it work. It's not the codecs either, because it doesn't always use the same one, though usually it's either the new DivX or XviD. PLEASE HELP IF YOU CAN. THANKS.