Connor's Blog

720p vs 1080i: Who’s the winner?

The short answer? 720p.  Now, here’s the long answer…

First we have to understand the difference between “p” and “i”.  The “i” stands for Interlaced, and the “p” stands for Progressive.  Video files, DVDs, and TV broadcasts are coded one way or the other; you don’t mix and match.  If you do, you get an inconsistent image because it was not edited correctly.  Any video signal that you watch on your TV set will be 24fps (frames per second) for movies, film, etc…, or 30fps for sports, news, cartoons, sitcoms, etc…  A 1080 interlaced video signal (1080i) means that each frame contains only half of the picture content.  Huh? Ok, let’s say you are watching NBC’s HD broadcast (1080i 30fps).  Each second contains 30 frames.  The odd frames contain 540 horizontal lines of the first image and the even frames contain 540 horizontal lines of the next image.  Like, when a car goes across the screen, left to right, the even frames contain where the car is going to be next, while the odd frames contain the current position of the car.  The odd frames are called the “Odd Field” or “Upper/Top Field” and the even frames are called the “Even Field” or “Lower/Bottom Field”.  Since the broadcast is 1080 lines, the Odd Field contains lines 1, 3, 5, 7, 9, 11, 13, etc…, whereas the Even Field contains 2, 4, 6, 8, 10, 12, etc…
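If you like code better than prose, the odd/even split above is just slicing: odd-numbered scanlines go to the top field, even-numbered ones to the bottom field. A minimal sketch (a toy model, not broadcast-exact; each “line” is just its 1-based line number):

```python
def split_into_fields(frame):
    """Return (top_field, bottom_field) from a list of scanlines.

    top_field gets source lines 1, 3, 5, ... (0-based slice [0::2]);
    bottom_field gets source lines 2, 4, 6, ... (slice [1::2]).
    """
    return frame[0::2], frame[1::2]

# A toy 1080-line frame.
frame = list(range(1, 1081))
top, bottom = split_into_fields(frame)
print(len(top), len(bottom))   # 540 540
print(top[:4], bottom[:4])     # [1, 3, 5, 7] [2, 4, 6, 8]
```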

Now…we as humans have a hard time seeing just 1/30th of a second, so we can’t tell that video consists of tons of sequential/interlaced images; however, if we were to slow it down to 15fps or 10fps we would see lots of glitchy video.  The first field contains the 540 odd-numbered lines, and 1/30th of a second later it is followed by the 540 even-numbered lines, completing the picture.  So basically, within 1 second of video in a 1080i broadcast, there are only 15 full 1080-line frames, if they were combined (deinterlaced).
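The “combining” step above is what a weave deinterlacer does: pair consecutive half-pictures and interleave their lines back into full frames. A sketch using the post’s counts of 30 half-pictures per second (broadcast 1080i actually carries 60 fields per second, as a later comment notes; the numbers here just follow the paragraph above):

```python
def weave(top, bottom):
    """Interleave two 540-line fields into one 1080-line frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame += [t, b]
    return frame

# One second of fields; the string "f3l17" stands for field 3, line 17.
second = [[f"f{i}l{j}" for j in range(540)] for i in range(30)]
full_frames = [weave(second[i], second[i + 1]) for i in range(0, 30, 2)]
print(len(full_frames), len(full_frames[0]))  # 15 1080
```

Pairing 30 half-pictures yields the 15 woven full frames the paragraph arrives at.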

With 720p, you do not have the even/odd frame method of displaying the picture.  Every single frame is a full frame, 720 lines.  So the bottom line is that in a 1080i video you have 540 lines per frame, and in a 720p video you have 720 lines per frame.  This is why the ABC network adopted 720p a long time ago, even when 1080i was an option: when every frame is progressive you get a much smoother picture on fast action scenes, sports, etc… So ABC made a smart move going 720p.
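The per-image line counts being compared boil down to one division:

```python
# Lines drawn in each new image, per the comparison above: a 1080i
# field carries half of the 1080 lines, while every 720p frame is
# complete.
lines_per_1080i_field = 1080 // 2   # 540 lines per interlaced field
lines_per_720p_frame = 720          # 720 lines per progressive frame
print(lines_per_1080i_field, lines_per_720p_frame)  # 540 720
```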

Interlaced vs Progressive clip:

Notice the lines that display very clearly on interlaced motion.  The lines are there during the slow scenes as well; you just don’t notice them until a fast, high-motion sequence occurs.

Sidenote: Don’t be fooled by this new Dish Network TurboHD service.  It claims “1080p VOD” (video on demand), but all it is is deinterlaced 1080i signals, because no network right now is broadcasting in 1080p.  Right now if you have a “1080p” television, the TV is just deinterlacing the signal to make each frame a full frame…that’s it.  It is just a cosmetic gimmick; you are not really getting a 1080p video signal.  So all TurboHD does is convert/deinterlace the signal on their end before they send it out…it is still a 1080i signal.  The only benefit is that it will look better on televisions that do not have a built-in deinterlacer (i.e. older LCDs and RPTVs).  Almost every new TV now has a built-in deinterlacer that can deinterlace any 1080i signal.  Why do you think their packages start at $24.99? It’s nothing new, but I am surprised they haven’t been sued by other companies, since it’s not true 1080p.

So, back to the main discussion…720p contains full frames every 30th of a second, whereas 1080i contains interlaced frames every 30th of a second.  If you were to convert 1080i to progressive you would get 540p, which is less than 720p.  If you were to convert 720p to interlaced you would have 1440i, which is more than 1080i.  You could even go as low as a 542p video signal.  Even that would be higher quality than 1080i, because it would equate to 1084i (if it were interlaced). Get it?
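The conversion rule this paragraph uses can be written as two one-line functions. This is only a sketch of the post’s own arithmetic, which treats interlacing as doubling the effective line count and deinterlacing as halving it; comment 3 below disputes that equivalence, and the code reproduces the rule rather than endorsing it:

```python
def to_interlaced_lines(progressive_lines):
    """The post's rule of thumb: Np -> (2N)i."""
    return progressive_lines * 2

def to_progressive_lines(interlaced_lines):
    """The post's rule of thumb: Ni -> (N/2)p."""
    return interlaced_lines // 2

print(to_progressive_lines(1080))  # 540  -> "1080i ~ 540p"
print(to_interlaced_lines(720))    # 1440 -> "720p ~ 1440i"
print(to_interlaced_lines(542))    # 1084 -> "even 542p beats 1080i by this rule"
```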

720p not only gives you more lines any way you look at it, but since every sequential frame is progressive (full, not interlaced), it handles motion and fast action flawlessly.  While 1080i has to produce two frames to give you a picture, 720p just has to produce one frame to give you that same picture.  I will gladly take (and film in) 1280x720p over 1920x1080i any day.

11 comments for “720p vs 1080i: Who’s the winner?”

  1. Fred
    October 19, 2008 at 1:57 pm

    Sorry, but it doesn’t work that way in real life. I just bought my first HD TV, and one of the first things I noticed was that the broadcasts on CBS and NBC looked a lot better than the broadcasts on ABC. We’re talking “knock your socks off” clarity on CBS & NBC versus “OK” on ABC. I wondered why, but a press of the “info” button on my remote revealed that the CBS & NBC broadcasts were in 1080i, while the ABC broadcasts were in 720p. The eyes do not lie. There is no question that the 1080i broadcasts look remarkably better. I find myself gravitating away from ABC (which has been my favorite network in the past) and towards whatever is showing on NBC and CBS, simply because the 1080i image is so dazzling. ABC needs to get on the bandwagon before they lose viewers.

  2. October 21, 2008 at 2:22 pm

    That is because your TV (if it is new within the last year or two) has a built in deinterlacer which deinterlaces the 1080i (interlaced) image and makes it into 1080p (progressive) before it reaches your display.

    Now that all broadcast signals are digital (for the most part) the TV can determine if it is interlaced or not when it receives it, and thus deinterlaces it. However, although it “looks” better to your eyes, it is a fake progressive image since it is converted from an interlaced source.

    Pretty soon 2K and 4K TVs will be out (in the next 10 years) and 1080p won’t “look” so good anymore.

    **Conclusion still stands: a 720p image is higher resolution than a 1080i image. I would much rather film in 720p than 1080i. All those people out there buying 1080i HD/HDV cameras that don’t have 720p functionality are getting ripped off. Why do you think consumer 1080i video cameras are $500 now while 720p consumer cameras are $1,500 and up? There’s a reason for that, and it’s “quality”.

    NOTE: If I interlaced a 720p video it would become 1440i (better than 1080i). It is a higher resolution image, and if a 1440i image were pumped into your TV, you would see an even bigger improvement over your 1080i images.

  3. Brenton
    November 20, 2008 at 5:57 am

    Connor, I believe your information is misleading. If you converted a 720p image to interlaced you would not get 1440i, you would get 720i. The reason for this is that an interlaced image consists of alternate frames containing alternate (different) lines of the original image. 1080i does not equate to 540p; if it did, you would lose half the data in the broadcast. In 1080i there are actually 1080 unique (different) lines; these lines are interlaced and shown to the end user in alternate frames. So frame 1 would show lines 1, 3, 5, 7… 1079, and frame 2 would show 2, 4, 6, 8… 1080 (each frame has 540 unique lines). Because of this, 1080i video is displaying 1080 unique lines, although not all at one time. The human eye is easily fooled and what you see is a higher resolution image – contrary to your information. 720p consists of 720 unique lines and therefore cannot be broadcast as 1440i, ever. 720p as an interlaced image would end up as: frame 1 showing lines 1, 3, 5, 7… 719, and frame 2 containing 2, 4, 6… 720. Each of these frames contains only 360 lines, hence it would be 720i. The difference can usually be seen (as you pointed out) in fast moving images, where interlaced video begins to tear and progressive does not. It comes down to the end user’s preference, and my opinion, after switching between 720p and 1080i, is that 1080i generally produces a better image. Now, before you point out that if I were to slow the video I would see tearing, you should realise that video is almost always shown at full speed and tearing is rarely seen (in my opinion).

  4. Brenton
    November 20, 2008 at 6:27 am

    PS: It should also be pointed out that your 1080i image at the top of the page is extremely misleading. An interlaced video is not shown with each frame containing 1 line of the image and another with no data (a grey line). Take an imaginary TV that has a resolution of 5 physical horizontal lines. Now this TV could show a 5p video or a 10i video natively. A 10i video displayed on a TV containing 5 (physical) horizontal lines would display lines 1, 3, 5, 7, 9 in frame 1 and 2, 4, 6, 8, 10 in frame 2. Each frame completely populates the 5 (physical) lines of the TV, unlike your image showing grey lines on the screen. If showing a 5p video it would simply show 5 lines (1, 2, 3, 4, 5) each frame. This also shows that although each single frame of a 720p video contains more lines than each single frame of a 1080i video, combined, a 1080i video contains more lines in total and generally fools the user into seeing a higher resolution image. Which, after all, is what we’re all looking for :)
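    [Editor's sketch] The five-line TV described in the comment above can be written out as a toy Python model (line numbers are 1-based source lines, as in the comment):

    ```python
    PHYSICAL_LINES = 5  # the imaginary TV's physical resolution

    def fields_10i():
        """The two fields of a 10i source on the 5-line TV."""
        odd_field = list(range(1, 11, 2))    # source lines 1, 3, 5, 7, 9
        even_field = list(range(2, 11, 2))   # source lines 2, 4, 6, 8, 10
        return odd_field, even_field

    odd, even = fields_10i()
    print(odd, even)  # [1, 3, 5, 7, 9] [2, 4, 6, 8, 10]
    # Each field fills all 5 physical lines -- no grey gaps.
    print(len(odd) == PHYSICAL_LINES and len(even) == PHYSICAL_LINES)  # True
    ```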

  5. November 30, 2008 at 5:17 pm

    Can we all agree that 1080p is better than both 1080i and 720p? lol.

    720p is better because it is a progressive image. I can tell when a source is interlaced, and it is annoying knowing I could have watched a progressive version of it. Interlaced, even when it has more lines, is not as enjoyable to watch. 1080i is only enjoyable on interlaced HD TVs, which aren’t sold anymore, -or- if you have a built-in deinterlacer…but then you are just being fooled watching a fake “p” signal. I will take 720p over 1080i any day since I have a progressive display with no deinterlacer.

  6. Mark
    June 4, 2009 at 8:13 pm

    Let’s break it down in a simple way! During any 30th of a second of viewing in 720p, you’re watching a full screen image, whereas in 1080i, you’re only viewing half of a screen image every 30th of a second. It doesn’t matter how many lines there are; the fact is that only half of the lines are displaying an image in any screen that is interlaced.

  7. Greg
    July 11, 2009 at 7:38 pm

    Each frame is made up of two fields, so a frame rate of 30 per second consists of 60 fields per second. This gives double the number of images per second, which results in superior movement resolution, particularly in sport.
    1080 lines versus 720 lines, the answer is obvious. Ideally broadcasters would use 1080p with a frame rate of 50/60.

  8. tom
    October 26, 2009 at 11:20 am

    1080p is just 440X4

  9. tom
    October 26, 2009 at 11:40 am

    Bad math: 1080p is just 540×4, not 1080i ×2.

  10. Mike
    January 31, 2010 at 2:37 pm

    All I know is that I have a 3-year-old Sharp 26″ 720p flat screen and a 6-month-old Toshiba 42″ 1080i flat screen, and the quality of the 1080i picture is light years better than the 720p. I compared the 1080i picture to other 1080p sets in the store and the 1080i was the best in clarity and depth of color of any set at a comparable price. My 2 cents…

  11. February 1, 2010 at 9:42 pm

    It is all a matter of opinion, really, when it comes down to it. I am able to see lines in an interlaced image. Some people can, and some people cannot with the naked eye…I am unfortunately one of those who can, and it drives me batty seeing half images interspersed through whole images. I would rather have a progressive image in SD over an interlaced image in HD. Now, with TVs including upscalers and automatic deinterlacers, this topic does not apply anymore, but at the time of writing, de-interlacing was not a standard feature on all HD TVs…that is why I wrote this article originally, and still to this day interlaced images many times give me headaches. Thanks for participating! :)
