1080i vs 720p

Discussion in 'HDTV discussion' started by gserve, Sep 17, 2006.

  1. gserve

    gserve Member

    Joined:
    Nov 7, 2004
    Messages:
    82
    Likes Received:
    0
    Trophy Points:
    16
    First off, I have an HDTV that will display both formats, and my cable converter box will output either format. Which format is better? I notice that all the flyers for new HDTVs advertise 720p, but when I bought my set they were advertised at 1080i. Please advise. Thanks
     
  2. dblbogey7

    dblbogey7 Guest

    It really depends on the display and the source. Basically, try both settings (720p and 1080i) from your source and choose whichever looks better to you. My cable box looks better at 1080i on my Sony SXRD, while my cousin prefers 720p from a similar Motorola box on his Panasonic PJ.
     
  3. MarkDogg1

    MarkDogg1 Member

    Joined:
    Aug 4, 2005
    Messages:
    26
    Likes Received:
    0
    Trophy Points:
    16
    Look at it like this: 1080i is like 540p, and 720p is like 1440i. 1440i is always better than 1080i.
     
  4. BurningAs

    BurningAs Regular member

    Joined:
    Jan 4, 2006
    Messages:
    1,756
    Likes Received:
    0
    Trophy Points:
    46
    Yeah, what he said, only with this important correction: 720p shouldn't be looked at as 1440i, it's just not! 1080i is like 540p because only 540 lines are displayed at any given time, but 720p displays all 720 lines at any given time.

    So yeah, 720p should be better. But once you get into how the eye works and how to trick it... 1080i looks very good, even though it's actually 540 lines at any given time.
     
  5. sdifox

    sdifox Regular member

    Joined:
    Dec 30, 2003
    Messages:
    116
    Likes Received:
    0
    Trophy Points:
    26
    720p is better for action, 1080i for static images like scenery.
     
  6. lxfactor

    lxfactor Regular member

    Joined:
    Jun 5, 2005
    Messages:
    559
    Likes Received:
    0
    Trophy Points:
    26
    What about 1080p?
     
  7. eatsushi

    eatsushi Regular member

    Joined:
    May 7, 2006
    Messages:
    572
    Likes Received:
    0
    Trophy Points:
    26
    I have the Sony 1080p SXRD and it deals with both 1080i and 720p sources very well. The HD signal from my cable box looks very good indeed, and I've tried both the 720p and 1080i output. I don't really see any difference between the 1080i channels (HBO-HD, CBS-HD, NBC-HD, etc.) and the 720p channels (ESPN-HD, ABC-HD, etc.).
     
  8. gerry1

    gerry1 Guest

    My cable box will do 1080p but not my TV. I've tried both 720p and 1080i. Strangely, 720p seemed to have an even better picture than 1080i, but it was "darker". I didn't have time to toy with readjusting the color, brightness, contrast, etc., but I will when I have the time. For now I'm back to 1080i, but I still have some playing around to do with 720p, which, methinks, will be the better picture once I've got things readjusted (if possible).
     
  9. Chaosphr

    Chaosphr Member

    Joined:
    Oct 28, 2006
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    11
    Sounds like you're making up these arguments on the fly!!

    Well, you're all wrong. Here's the simple logical truth, just do the math:

    So you're saying 1920x1080i = 1920x540p.
    Fine, it's not correct by definition, but it returns the same numbers.
    So let's multiply:

    1920 * 540 = 1,036,800
    1280 * 720 = 921,600

    That's all you need to know about numbers. 1080i has more pixels, end of story.
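
    If you want to check that multiplication yourself, here's a trivial Python sketch (pure pixel arithmetic; it ignores compression, chroma subsampling, and everything else):

        # Pixels per 1080i field vs. pixels per 720p frame (plain arithmetic).
        print(1920 * 540)   # 1080i field: 1,036,800 pixels
        print(1280 * 720)   # 720p frame:    921,600 pixels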

    Then there's the other, more important side of this argument.
    When you're talking about MOVIES, which is what people usually talk about when it comes to home theatres (you're not buying all that gear just to watch the news), the framerate matters.
    Movies are filmed at 24 full frames per second, which is to say 24p, i.e. progressive. If you scan the movie at full HD you get 1920x1080p24; to convert it to the American broadcast standard, which is 60 Hz, you do a 3:2 pull-down (look it up), which splits the frames into 60 fields per second and exactly converts the 24p into 60i, interlaced. This is lossless and reversible: basically it shows 60 half-frames per second (including some duplicate fields) instead of 24 full frames. You can return to 24p by a process called inverse telecine (i.e. 2:3 pull-up).

    This, ladies and gents, means that for all movie content, a 1080i60 signal will deliver exactly 1080p24 quality if you treat the signal correctly. Make sure your TV knows how to do inverse telecine correctly without dropping any information.
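
    To make the 3:2 pull-down concrete, here is a minimal Python sketch of the cadence. It's a toy model (frames are just labels, not real images), but it shows why the conversion is reversible:

        # Toy 3:2 pull-down: 24 film frames/sec become 60 video fields/sec.
        # Alternating 3 fields and 2 fields per film frame gives 5 fields per
        # 2 frames, so 24 frames -> 60 fields.
        def telecine(frames):
            fields = []
            for i, frame in enumerate(frames):
                fields += [frame] * (3 if i % 2 == 0 else 2)
            return fields

        # Toy inverse telecine (2:3 pull-up): drop the duplicate fields to get
        # 24p back. (Real inverse telecine locks onto the cadence instead of
        # comparing field contents, but the idea is the same.)
        def inverse_telecine(fields):
            frames = []
            for f in fields:
                if not frames or frames[-1] != f:
                    frames.append(f)
            return frames

        film = ["A", "B", "C", "D"]                 # 4 film frames
        fields = telecine(film)                     # 10 fields = 4 * 60/24
        assert fields == ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]
        assert inverse_telecine(fields) == film     # lossless, as claimed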

    So let's multiply again, with this newfound knowledge, shall we?
    1920 * 1080 = 2,073,600
    1280 * 720 = 921,600

    Now, I don't want ANYONE saying that 720 is better than 1080 again. It's just half the information if the source is a movie, and even if it isn't a movie, there's still more information in 1080i.

    720p is a middle ground in HDTV which really doesn't deliver that much above DVD; it's nice but not great.

    You may wonder what 1080p is good for. Well, imagine 1080p60: that's 60 full frames per second at 1080 resolution. That doubles the bandwidth, and it's mostly live situations and video that use that sort of refresh rate; no movies use it.
     
  10. wolfniggr

    wolfniggr Regular member

    Joined:
    Dec 15, 2004
    Messages:
    803
    Likes Received:
    0
    Trophy Points:
    46
    Well put.
     
  11. egomaniac

    egomaniac Regular member

    Joined:
    Mar 24, 2003
    Messages:
    128
    Likes Received:
    0
    Trophy Points:
    26
    Ok, so not to hijack, but what are the top 1080p sets out now?
     
  12. diabolos

    diabolos Guest

    Don't forget about video games!

    Or my threads...

    Ced
     
  13. HD_nut

    HD_nut Regular member

    Joined:
    Oct 31, 2006
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    26
    I got news for you... 720p is better than 1080p.
     
  14. HD_nut

    HD_nut Regular member

    Joined:
    Oct 31, 2006
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    26
    720p has better temporal resolution than 1080p.
    720p has 60 frames per second.
    1080p has 24, up to 30, frames per second.
    1080p sets pump out 60 frames to present a 60-frame broadcast like 720p.
    But although 1080p TVs pump at a frame rate of 60 per second, that does not give you a true 60 frames per second if you are viewing a format at 24 frames per second, like 1080p/24.
    The frame rates are inconsistent.
    An example of this would be throwing 3 balls at a juggler: no matter how fast his hands are moving, he still has only 3 balls.
    Now picture the balls as individual picture frames.
    The same with golf balls: if you have three and hit each ball twice at me, it is still 3 golf balls that come back to me.
    1080x1920 at 30 frames per second is 62,208,000 pixels per second: each frame consists of 2,073,600 pixels, delivered 30 times a second, and each frame will be pumped at you twice by the TV's 60 fps output.
    If 1080p is converted to the ATSC format of 720p/60, those pixels are fed into the scaler and delivered to you as 60 different frames of 921,600 pixels each. 1080p at 24 frames per second is 49,766,400 pixels per second.
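
    Those pixels-per-second figures are easy to reproduce. A quick Python sketch (pure arithmetic; it ignores compression entirely):

        # Pixels per second = width * height * frames per second.
        def pixels_per_second(width, height, fps):
            return width * height * fps

        print(pixels_per_second(1920, 1080, 30))   # 1080p/30: 62,208,000
        print(pixels_per_second(1920, 1080, 24))   # 1080p/24: 49,766,400
        print(pixels_per_second(1280, 720, 60))    # 720p/60:  55,296,000
        print(pixels_per_second(1366, 768, 60))    # 1366x768 at 60: 62,945,280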

    1080p/24 going into a 1080p/60 TV: the frame rate between signal and TV is inconsistent.

    720p/60: 60 different frames pumped at you, each shown once, in one second. That's 55,296,000 pixels per second. Now imagine a photographer snaps his camera 60 times. You now have 60 pictures, and he shows you each picture once. So you can see why this is better for moving images.
    The pixels are in a different position in each frame.

    With 720p/60, or with 1080p converted to the ATSC 720p/60 format, the frame rate is consistent with a 720p/60 TV.
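
    Here's a small Python sketch of that consistency argument: it just counts how many display refreshes each source frame is held for (a toy model, nothing more):

        # How many display refreshes each source frame occupies.
        def repeats_per_frame(source_fps, display_hz, n_frames):
            return [(i + 1) * display_hz // source_fps - i * display_hz // source_fps
                    for i in range(n_frames)]

        print(repeats_per_frame(24, 60, 8))   # [2, 3, 2, 3, ...] -> uneven hold times
        print(repeats_per_frame(60, 60, 8))   # [1, 1, 1, 1, ...] -> consistent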

    A WXGA 1366x768 set is designed for the 720p signal. There is enough information in the 720p signal to scale it up to the panel's fixed 1366x768 resolution.

    That would be 62,945,280 pixels per second (60 frames) when choosing the 720p format.
    I'd rather have 720p or 768p: faster rates, and not too much of a climb for my DVD or standard-definition signals.
    768p = 62,945,280 pixels per second
    1080p = 49,766,400 pixels per second
    Standard DVD is bad on 1080p sets.
    Standard DVD resolution is 480 interlaced lines x 720 horizontal pixels.
    When playing this on a 1080p set, the set must scale each 240-line field into 1080 lines.
    Then it must scale 720 horizontal pixels into 1920: THAT IS ADDING 1,200 PIXELS TO EACH HORIZONTAL LINE, which is 1,296,000 made-up pixels per frame.
    1,200 per line is more than what is actually there (the 720 real horizontal pixels).
    It also has to make up 600 vertical lines of resolution.
    On a TV with a resolution of 1366x768 a DVD looks much better, because there is less of a climb: the set has to make up only 646 pixels per line and 288 lines.
    On a TV with a resolution of 1024x768 a DVD looks its best through a DVD upscaler, because there is even less of a climb: now it has to scale 720 pixels into 1024, which is only 304 horizontal pixels and 288 vertical lines. With a good DVD upscaler and a good TV converter chip, many DVDs are able to make the trip up to 768p.
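
    The "climb" is easy to tabulate. A minimal Python sketch of how many pixels per line and how many lines the scaler has to invent for each panel (just the subtractions from above):

        # Extra pixels a scaler must invent to fit 720x480 DVD video on each panel.
        dvd_w, dvd_h = 720, 480
        for w, h in [(1920, 1080), (1366, 768), (1024, 768)]:
            print(f"{w}x{h}: +{w - dvd_w} pixels per line, +{h - dvd_h} lines")
        # 1920x1080: +1200 per line, +600 lines
        # 1366x768:  +646 per line,  +288 lines
        # 1024x768:  +304 per line,  +288 lines
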
    With an HD broadcast of 720 vertical x 1280 horizontal pixels, a 1080p set has to make up 640 pixels per line and 360 lines, which results in color loss and resolution loss as the set tries to spread that data across a wider field!
    Some may say that a 1080i broadcast fits the 1080p scale well, since it is just 1080i scaled to 1080p, but they fail to realize that MOST HD 1080i networks really broadcast at 1035i, officially 1035 x 1440.
    Proof:
    "1080/30i is defined to be 1080 lines of 1920 pixels each delivered every 1/30th of a second (540 of them at a time), but it is implemented by its practitioners as 1035 lines of 1440 pixels each"
    http://alvyray.com/DigitalTV/DTV_Bandwidths.htm
    Also, the temporal resolution of a 1080p disc is 24 fps. This is horrible for sports, especially on an LCD TV, because an LCD does not have what CRT TVs do: a refresh rate. (This is why there are no eye-strain problems with an LCD monitor.) A slow temporal resolution of 24 fps will therefore produce motion blur on moderate to fast-moving images, especially on an LCD TV.
    WHY 1080p ???
    "at the average viewing distance, with the average size of consumer HDTV sets, the human eye would not actually be able to perceive the difference in resolution between 720p and 1080p. This is because the 720p image "saturates" the perceivable resolution of the eye at this distance"
    http://www.witwib.com/720p
    "In a 50-inch plasma display with an array of 1366x768 pixels, the pitch of individual pixels is typically less than 1 mm (about 0.9 mm), which equals 0.039 inches. Do the math, and you'll see that standing 10 feet from a 50-inch plasma means you can barely perceive the HD pixel structure, and that's only if you have 20-20 vision."
    http://proav.pubdyn.com/2005_January/13-ProAV-Old Site Content-2005-501proavparallaxview.htm
    720p better than 1080p: the clear winner!!
    But humans can perceive 60 frames per second, which makes 720p the clear winner over 1080p.
    Proof:
    60 frames is better than 24
    "humans can perceive up to 60+ fps".
    http://www.daniele.ch/school/30vs60/30vs60_3.html
    1080p has problems with artifacts
    "A high-resolution image with image artifacts such as motion smearing, incorrect white balance or color points, and grayscale rendering problems may not look as realistic as a lower-resolution image without these problems."
    http://proav.pubdyn.com/2005_January/13-ProAV-Old Site Content-2005-501proavparallaxview.htm
    720p best for sports
    "1080p/24 is totally inappropriate for broadcasting sports. No sports fan would tolerate the motion blur and loss of fine detail in fast-moving objects. Even 1080i/30 would be a better choice."
    http://www.hdtvexpert.com/pages_b/followup.html
    There could be compression problems that hurt the 1080p image.
    "It would take too much bandwidth to do it. 1080i barely fits for some types of content–for example, sports–and concerts and specials," Beyler says. "For some types of content there just are not enough bits with 1080i. 1080p would be even that much more demanding."
    "And will going from 1080p to 19.4 be better than going from 1080i to 19.4?," Garvin asks. "We know that when we ultimately can record 1080p, the recording will be better. What we don't know yet is whether 1080p looks better compressed at 19.4 than does 1080i. We believe that it will, but we obviously haven't seen that yet."
    http://www.cedmagazine.com/article/CA422050.html
    768p & 720p has more pixels per second that 1080p
    1080p/24 49766400 pixels per second
    720p/60 55296000 pixels per second.
    720p on 1366x768 62945280 pixels per second
     
  15. HD_nut

    HD_nut Regular member

    Joined:
    Oct 31, 2006
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    26
    Also... about the "2 fields of 540 at a time (or 517 as practiced)" vs. 720p argument? I won't waste much time on that; it is like comparing a Tonka truck to a BMW.
    Interlaced is horrible and should be outlawed. How many monitors are made interlaced today? It washes out colors and flickers like hell.
    Please take into consideration that there are 3 resolution factors that must all come into play together: vertical, horizontal, and temporal. When added together, 720p wins out.
    Vertical: 720, progressive
    Horizontal: 1280
    Temporal: 60
     
  16. Chaosphr

    Chaosphr Member

    Joined:
    Oct 28, 2006
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    11
    Short answer: No


    You're talking about 720p60, which is never used for movies.

    Compare 720p24 to 1080p24 or 1080i60, and 720 will lose any way you choose to twist and turn the numbers.

    Such a silly argument to talk about temporal resolution when ALL MOVIES IN EXISTENCE are filmed at 24 frames per second. Broadcasting such a movie in 720p60 would mean every other frame is a duplicate of the previous one. That simply doesn't exist, because it's idiotic.

    720p isn't 60 fps by definition; it's just ONE of many HDTV standards, and besides, it's 720p50 in Europe.

    Temporal resolution does not mean A THING for movies, because the framerate is already FIXED by the physical film medium.
     
    Last edited: Nov 1, 2006
  17. HD_nut

    HD_nut Regular member

    Joined:
    Oct 31, 2006
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    26
    ,
     
    Last edited: Nov 1, 2006
  18. HD_nut

    HD_nut Regular member

    Joined:
    Oct 31, 2006
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    26
    Stop trying to level the field down to 24 frames; we're talking about a real 720/60: Monday Night Football, Lost, etc. You can convert to 720/60, as you will see.
    But let's take everything into consideration.
    Let's touch upon these issues:

    Color
    Contrast
    What is the picture painted on?
    Which one flickers?
    Which one has jaggies?
    Which one loses pixels after compression?
    Which one showed artifacts?
    Which one displayed increasing quantization noise and blockiness?
    In a test by the ATSC and the FCC, which one won out with no artifacts?
    In the time it takes 1080i to really paint 1080 lines, how many lines does 720p paint?

    When you convert 1080i/30 to 720p, say on CSI: NY, the box is now pumping at a rate of 60 frames per second; the format is rearranged.
    Let's round it off (I'll even say 1080i is 1080i and not 1035i): with 720p, 768p, and 1080i you get about 1 million pixels at a time, so call it a 60-million-pixel-per-second broadcast. With 1080i you get 2 different interlaced fields of about a million pixels each, each frame and field 30 times; you see 540 lines at a time.
    When you convert that to progressive you change its format: it takes the information of the 1 million pixels per field and gives it to you 60 times.
    "ABC has chosen the 720 line, 60 frame per second, progressive scan high definition format. Fox Broadcasting will be using the 480 line, 24 or 30 frame per second, progressive scan format for all of its digital broadcasts.

    Each of these networks requires the delivery copies of programs to conform to their individual digital broadcast standard."
    http://www.laserpacific.com/emoryarticle.htm

    "To convert an interlaced image into a 60-frames/sec image you can simply combine successive fields and display them twice."

    http://www.hdtvprimer.com/ISSUES/what_is_ATSC.html
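
    That quoted conversion is simple to picture in code. A minimal Python sketch, with toy data (a "field" is just a list of line strings), assuming a static picture; weaving fields that were captured at different instants is where real deinterlacers earn their money:

        # Toy weave deinterlace: combine an odd/even field pair into one frame,
        # then show that frame twice to turn 30 field-pairs/sec into 60 frames/sec.
        def weave(top_field, bottom_field):
            frame = []
            for top_line, bottom_line in zip(top_field, bottom_field):
                frame.append(top_line)      # lines 0, 2, 4, ...
                frame.append(bottom_line)   # lines 1, 3, 5, ...
            return frame

        def interlaced_to_60p(fields):
            output = []
            for i in range(0, len(fields), 2):
                frame = weave(fields[i], fields[i + 1])
                output.extend([frame, frame])   # display each woven frame twice
            return output

        top = ["line0", "line2"]        # two fields of a 4-line picture
        bottom = ["line1", "line3"]
        print(interlaced_to_60p([top, bottom])[0])
        # ['line0', 'line1', 'line2', 'line3']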

    The issue really is not movies; it is which is better, a true 720p/60 or a true 1080i/30. That is the issue we are discussing.

    What is the big difference? PROGRESSIVE scanning!
    You're not getting 2 million pixels at once with 1080i, even before taking the temporal resolution into consideration.

    Get an HD TiVo and a 1080i CRT TV, tune to a 1080i broadcast, and pause it: notice the blur.
    Even though you don't notice that blur while you're watching the picture, you can see that this is what your picture is really doing. You notice it in the loss of color saturation, contrast, etc.
    You're seeing the set alternate between odd and even fields; that is the blur.

    Now do this with the TiVo in 720p on a 768p set, then pause the picture: it looks like a glossy photograph.
    As for the 60 frames, they're very important.
    Get a camera and take 60 pictures of a basketball game while the guy next to you takes 30 in the same time: which set of pictures is going to give you more information? The frames really are pictures.
    When a TiVo is in 1080i it is pumping 30 times a second; when you convert to 720p, the TiVo pumps at 60.
    It's brought to you in a different format.
    The only things you are losing with 720p compared to interlaced broadcasting are blur, flicker, and washed-out color.
    I've seen the same broadcast in 1080i on a 1080i CRT Sony and, right next to it, the same 1080i broadcast on the Sony 768p XBR LCD.
    The LCD blew it away in every respect, and that was a cross-conversion. A real 720p broadcast like Lost will give the ultimate resolution.




    "Interlacers often claim that their system has more pixels. See bandwidths for the actual numbers. Summary: It is not the pixels in a still frame that counts - still video is boring. It is the pixels per second delivered to viewers that matters. The 540I system, as defined, delivers only slightly more pixels per second than 720P, and, as practiced, it delivers less! This, of course, is before compression. As argued above, what matters to consumers is after compression. After compression, the formats are indistinguishable so far as pixel count is concerned. However, the interlaced system is fundamentally flawed because of its flickering, which does not go away during compression.

    CAUTION! When comparing systems, be sure you are watching a true comparison. If you are not, then what you are really comparing is the equipment used for format conversion, not the display technology. This seems obvious but I have seen many "comparisons" in the chat rooms, emails, websites, etc, where, for example, crummy source material displayed on a wonderful display system is "compared" with material originated in the format designed for a competing display technology. The crummy material still looks crummy. What a surprise! In a true comparison, the originating technology must be specified and all conversions that have been made to the material before display must be specified. Does the following work for a "comparison"?: Take material originated in 1080i (aka 540i) and (1) display it on a 1080i monitor; (2) convert it to 720p with a format converter; (3) display the converted material on a 720p monitor against the display in step (1)? Not a valid comparison! Step (2) will cause a degeneration (it has to because material has been thrown away for interlace - namely every other line) that no converter, except one costing about $100,000, can possibly compensate for well. So this "test" would be comparing a good version of the source material to a bad version. It would be testing the quality of the converter, not the display technologies. Another mistake I see is comparing low-res material upconverted to high-res against (originally) high-res material. Not a chance that the upconverted stuff will look as good. Again, this is comparing converters, not display technologies."

    From

    http://alvyray.com/DigitalTV/Naming_Proposal.htm
     
    Last edited: Nov 2, 2006
  19. HD_nut

    HD_nut Regular member

    Joined:
    Oct 31, 2006
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    26
    Last edited: Nov 1, 2006
  20. HD_nut

    HD_nut Regular member

    Joined:
    Oct 31, 2006
    Messages:
    139
    Likes Received:
    0
    Trophy Points:
    26
    With 720p:


    "The number of lines of resolution in progressive and interlace pictures are not an "apples-to-apples" comparison. In the time it takes 720P to paint 720 lines, 1080I paints only 540 lines. And, by the time 1080I does paint 1080 lines, 720p has painted 1440 lines.

    Contrast and brightness have a greater impact on the human visual system than does resolution. The 720P picture is brighter and has greater contrast than the 1080I picture.

    In side-by-side subjective testing performed by the Advanced Television Test Center under the auspices of the FCC’s Advisory Committee on Advanced Television Services (ACATS), it was shown that 720P had "no artifacts" under a variety of conditions, while 1080I, under the same conditions, showed "increasing quantization noise and blockiness…" Nevertheless, these distinctions are slight, and the ACATS took pains to note that there was no substantive difference in picture quality between the two formats. "

    http://alvyray.com/DigitalTV/Progressive_FAQ.htm

    So there you have it, my man: from the ATSC & the FCC, the holy grail of DTV. And I'll take their word over anyone's!
    Also,
    by the time 1080p/24 has painted 25,920 lines (one second's worth), 768p/60 has painted 46,080 lines.
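
    Checking that arithmetic (lines per second = lines per frame * frames per second):

        print(1080 * 24)   # 1080p/24: 25,920 lines per second
        print(768 * 60)    # 768p/60:  46,080 lines per second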

    The resolution of the human eye maxes out around 1366x768 at typical screen sizes and viewing distances:

    "at the average viewing distance, with the average size of consumer HDTV sets, the human eye would not actually be able to perceive the difference in resolution between 720p and 1080p. This is because the 720p image "saturates" the perceivable resolution of the eye at this distance"
    http://www.witwib.com/720p

    "In a 50-inch plasma display with an array of 1366x768 pixels, the pitch of individual pixels is typically less than 1 mm (about 0.9 mm), which equals 0.039 inches. Do the math, and you'll see that standing 10 feet from a 50-inch plasma means you can barely perceive the HD pixel structure, and that's only if you have 20-20 vision."

    http://proav.pubdyn.com/2005_January/13-ProAV-Old Site Content-2005-501proavparallaxview.htm

    So why deal with flicker, jaggies, poorer contrast & color, low frame rates, increasing quantization noise, and artifacts?




     
    Last edited: Nov 2, 2006
