Am I right to assume that a film or HDTV image (maybe even a computer monitor) duplicates whole frames for a more flicker-free image? I.e., frame 1 is identical to frame 2, then the next new frame is frame 3? Thus a 50fps signal is actually 50Hz?
I'm not sure I understand the question. You seem to be asking if a 50fps source needs to have frames duplicated to play at 50fps, so I assume I'm not understanding what you mean.
With interlaced scanning, it scans one field in 1/60th of a second, covering the odd-numbered lines and skipping all the even ones, then scans the next field in 1/60th of a second, covering the even lines and skipping the odd. That means it takes 1/30th of a second to build a complete frame. Progressive scan draws the whole frame in 1/60th of a second, hence the increase in picture quality.
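If it helps, here's a minimal Python sketch of what I mean; the 60Hz field rate and the toy 10-line frame are just illustrative assumptions, not taken from any real video API:

```python
# Interlaced scanning sends alternating line sets as two fields per frame;
# progressive scanning sends every line of the frame in a single pass.

FIELD_RATE_HZ = 60  # assumed NTSC-style field rate, for illustration only

def interlaced_fields(frame_lines):
    """Split one frame into two fields: odd-numbered lines, then even."""
    odd_field = frame_lines[0::2]   # lines 1, 3, 5, ... (odd-numbered)
    even_field = frame_lines[1::2]  # lines 2, 4, 6, ... (even-numbered)
    # each field takes 1/60 s, so the full frame takes 1/30 s
    return [odd_field, even_field]

def progressive_frame(frame_lines):
    """Progressive scan delivers the whole frame in one 1/60 s pass."""
    return [frame_lines]

frame = [f"line {n}" for n in range(1, 11)]  # a toy 10-line frame
print(len(interlaced_fields(frame)), "fields -> complete frame in 1/30 s")
print(len(progressive_frame(frame)), "pass  -> complete frame in 1/60 s")
```

So at the same field/refresh rate, progressive gives you a full-resolution frame twice as often, which is where the quality difference comes from.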