ok.. so I encoded a piece of video as a standard SVCD, then encoded the same piece as a non-standard VCD with all the same bitrates. I played them both back to compare, but the VCD still looks like it did as a standard VCD :/ while the SVCD looks good... What I can't figure out is how they can look that different when they were encoded at the same bitrate. Is it possibly the VCD and SVCD encoding standards themselves that differ? Like the encoding methods used by VCD and SVCD are different?

[small]uggh[/small]
Well, VCD is MPEG-1, so at the same bitrate it will be lower quality than the MPEG-2 used by SVCD. Standard VCD is also lower resolution (352x240 NTSC / 352x288 PAL) than SVCD (480x480 / 480x576), so even with matched bitrates the two won't look the same. And if you created a non-standard VCD with an MPEG-2 file, then it's probably the DVD player's playback that is reducing the quality!
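If you want to reproduce the comparison from a script, here's a rough sketch using ffmpeg's -target presets (the filenames and the 2500k figure are just placeholders for your own clip and an SVCD-like rate; forcing the VCD bitrate that high may trigger VBV buffer warnings since the VCD mux wasn't designed for it):

[code]
# Minimal sketch of the VCD-vs-SVCD experiment, assuming ffmpeg is
# installed. "clip.avi" is a hypothetical source file.
import subprocess

SOURCE = "clip.avi"  # placeholder input

# Standard SVCD: the preset selects MPEG-2 video at 480x480 (NTSC).
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-target", "ntsc-svcd",   # MPEG-2, 480x480, SVCD muxing
    "svcd_standard.mpg",
], check=True)

# Non-standard VCD: same clip, but with the video bitrate forced up
# from the standard 1150 kbps to an SVCD-like rate for comparison.
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-target", "ntsc-vcd",    # MPEG-1, 352x240, VCD muxing
    "-b:v", "2500k",          # override the standard VCD bitrate
    "vcd_nonstandard.mpg",
], check=True)
[/code]

Even with the bitrate matched, the VCD output stays MPEG-1 at 352x240, which is exactly why it still looks like a standard VCD next to the SVCD.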