Do TVs preserve the frame rate of a video source while displaying the video, or do they convert it to a certain frame rate? To clarify the question, consider this example: say you have a 720p@24fps source and you connect it to a TV which you know is able to accept 720p@24fps. Will the TV display the video at 24fps, or convert it to its default frame rate?
It's a complicated yet simple answer: it depends on the type of display in question. It is my understanding that...

A CRT (based on the ATSC spec) will display a 720p@24fps signal interlaced at 60 fields per second, meaning it upconverts to 60Hz. A fixed-pixel display (plasma, LCD, DLP, or LCoS) will upconvert the 720p@24fps stream to its native resolution and frame rate.

In both cases a good video processor will recognize that the stream was originally created from film and take the necessary steps to apply 3:2 pulldown decoding. The display then either shows the stream using reverse 3:2 pulldown processing (if it is a CRT), or, if it is a fixed-pixel display, applies reverse 3:2 pulldown and then deinterlaces the fields.

For example:

CRT: 720p@24fps -> 3:2 pulldown -> CRT -> reverse 3:2 pulldown -> scale to 1280x720 -> 720p@60fps. (Technically a CRT can support just about any resolution at 60fps since it isn't a fixed-resolution display; that is just one example of many possible resolutions.)

Fixed-pixel display: 720p@24fps -> 3:2 pulldown -> FPD -> reverse 3:2 pulldown -> deinterlace -> scale to native resolution (1280x768, 1366x768, or 1920x1080) -> either 720p@60fps or 1080p@60fps.

I could be wrong about the end frame rates, but the method is sound.

For more explanation:

3:2 pulldown and deinterlacing: http://www.theprojectorpros.com/learn.php?s=learn&p=theater_pulldown_deinterlacing

Scaling: http://www.theprojectorpros.com/learn.php?s=learn&p=theater_scalers

I love technical questions, Ced
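To make the 3:2 pulldown step concrete, here is a minimal sketch (not from the answer above, just an illustration) of the cadence that maps 24fps film frames to 60 interlaced fields per second: alternating film frames contribute 3 fields, then 2 fields, so every 4 film frames become 10 fields and 24 frames become 60. The function name and the top/bottom field labels are my own choices for the example.

```python
def three_two_pulldown(frames):
    """Map a list of film frames (24fps) to interlaced fields (60 fields/s)
    using the 3:2 cadence: odd-numbered frames yield 3 fields, even-numbered
    frames yield 2, alternating."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2  # 3-2-3-2... cadence
        for _ in range(repeats):
            # Field parity simply alternates top/bottom in the output stream.
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# 4 film frames produce 3+2+3+2 = 10 fields, i.e. 24fps -> 60 fields/s.
fields = three_two_pulldown(["A", "B", "C", "D"])
print([f for f, _ in fields])  # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Reverse 3:2 pulldown, as mentioned above, is the inverse: the processor detects this repeating cadence in the field stream and reassembles the original 24 progressive frames before scaling or deinterlacing.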