Difference between 1080i and 1080p: same resolution, different things

Wyatt21

There still seems to be some confusion about the difference between 1080i and 1080p. Both are 1,920x1,080-pixel resolution. Both have 2,073,600 pixels. From one perspective (motion, not pixel count), 1080i is actually greater than Blu-ray, which tops out at 1080p/24 for movies. And you can't even get a full 1080p/60 source other than a PC, a camcorder, or some still cameras that shoot video.

True, 1080i and 1080p aren't the same thing, but they are the same resolution.

1080i
The 1080i designation is 1,920x1,080 pixels, interlaced: the signal is sent as 60 fields per second, each field carrying half the lines (the odd or even rows), which works out to 30 full frames per second. This is what CBS, NBC, and just about every other broadcaster uses. The math is actually pretty simple: 1080 lines at 30fps is roughly the same amount of data as 720 lines at 60fps (or at least, close enough for what we're talking about).
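As a back-of-the-envelope check of that math (raw pixel counts only, ignoring how broadcasters actually compress the signal), here's a quick Python sketch:

```python
# Rough pixel-throughput comparison between 1080i/30 and 720p/60.
# Illustrative arithmetic only; real broadcast bit rates depend on compression.

def pixels_per_second(width: int, height: int, frames_per_second: int) -> int:
    """Uncompressed pixel throughput for a given resolution and frame rate."""
    return width * height * frames_per_second

rate_1080i = pixels_per_second(1920, 1080, 30)   # 30 full frames/s (60 interlaced fields/s)
rate_720p  = pixels_per_second(1280, 720, 60)    # 60 progressive frames/s

print(f"1080i/30: {rate_1080i:,} pixels per second")   # 62,208,000
print(f"720p/60:  {rate_720p:,} pixels per second")    # 55,296,000
```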

1080p
Yes, what about it? Your 1080p TV accepts many different resolutions, and converts them all to 1,920x1,080 pixels. For most sources, this is done through a process known as upconversion.

When your TV is sent a 1080i signal, however, a different process occurs: deinterlacing. This is when the TV combines each pair of fields into a full frame. If it's done right, the TV then repeats each full frame to create 60 "fps" from the original 30.
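Here's a minimal sketch of that "combine the fields" (weave) step, assuming both fields come from the same original frame, as with film-based content; the function and variable names are just for illustration, not any TV's actual implementation:

```python
import numpy as np

def weave(field_odd: np.ndarray, field_even: np.ndarray) -> np.ndarray:
    """Weave two 540-line fields into one 1,080-line frame.

    field_odd holds the odd scan lines, field_even the even ones;
    each is shaped (540, 1920).
    """
    frame = np.empty((field_odd.shape[0] * 2, field_odd.shape[1]), dtype=field_odd.dtype)
    frame[0::2] = field_even  # even scan lines (0, 2, 4, ...)
    frame[1::2] = field_odd   # odd scan lines (1, 3, 5, ...)
    return frame

# Two dummy fields become one full 1,920x1,080 frame, which the TV
# can then repeat to show 60 frames per second from 30 source frames.
odd = np.zeros((540, 1920), dtype=np.uint8)
even = np.ones((540, 1920), dtype=np.uint8)
print(weave(odd, even).shape)  # (1080, 1920)
```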

If it's done wrong, the TV instead takes each field and just doubles its lines. So you're actually getting 1,920x540p. Many early 1080p HDTVs did this, but pretty much no modern one does. In a TV review, this is the main thing we're checking when we test a TV's deinterlacing prowess.
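And a sketch of that lazy version, where one 540-line field is simply line-doubled to fill the screen (again, purely an illustration):

```python
import numpy as np

def line_double(field: np.ndarray) -> np.ndarray:
    """Stretch one 540-line field to 1,080 lines by repeating every scan line,
    discarding the detail that was in the other field."""
    return np.repeat(field, 2, axis=0)

field = np.zeros((540, 1920), dtype=np.uint8)
print(line_double(field).shape)  # (1080, 1920), but only 540 lines of real detail
```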



Bottom line
While 1080i and 1080p have the same number of pixels, they do have different frame rates (and one is interlaced). The reality is, other than PC games, there isn't any commercially available 1080p/60 content. It's either 1080i content deinterlaced by your TV, 1080p/24 content from Blu-ray, or upconverted content from console games.
 