If you are confused about the difference between 720p, 1080i and so on, you have come to the right place. Here, I will explain what these formats mean, along with some other information you might find useful.
Firstly, I will explain what the numbers mean; we will get to the whole 'i' and 'p' thing later. The two common HDTV formats in place today are 720p and 1080i. These numbers may seem meaningless to you right now, and that is understandable, but the concept is not really hard to grasp. You may know that a digital video is basically a large matrix of pixels. The number of pixels is often referred to as the resolution, and is measured as the number of pixels across x the number of pixels high. The number in the format name is the vertical count, the number of pixels from top to bottom. Say you had a high definition video streaming in at a resolution of 1280x720; this would be either 720p or 720i. There is also a larger resolution, 1920x1080, and this is where the 1080 comes from. Only a select few screens on the market today can display that full image, and they are very expensive.
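To make the sizes concrete, here is a quick sketch in Python (the dictionary and labels are my own, just for illustration) that works out how many pixels each HD resolution actually contains per frame:

```python
# Pixel counts for the two common HD resolutions.
# Resolution is written as width x height; the HD label (720, 1080)
# comes from the height, i.e. the number of rows.
resolutions = {"720p/720i": (1280, 720), "1080i": (1920, 1080)}

for label, (width, height) in resolutions.items():
    total = width * height
    print(f"{label}: {width}x{height} = {total:,} pixels per frame")
# → 720p/720i: 1280x720 = 921,600 pixels per frame
# → 1080i: 1920x1080 = 2,073,600 pixels per frame
```

As you can see, the 1080 picture carries more than twice the pixels of the 720 picture, which is why so few screens can display it fully.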
All HD is in a 16:9 aspect ratio. This means that the picture is 16 pixels wide for every 9 pixels it is high. The numbers work out if you do the maths: divide 1080 by 9 and multiply by 16, and you get 1920, which is the horizontal resolution of the 1080 picture. Do the same with 720 and you get 1280 in the horizontal direction.
There are two types of signals: interlaced and progressive. You probably already know that the pixels refresh from left to right, line by line. With progressive scanning, every line is refreshed, one after the other, for every frame. With interlaced, every second line is scanned per pass: for example, every even-numbered row is scanned in the first refresh, then every odd-numbered row in the next. So if the refresh rate is 60 Hz with interlaced, the whole screen is technically refreshed only 30 times every second, whereas with progressive it is 60 times per second.

Interlacing was introduced because of limits on the amount of data that could be sent through. A 1920x1080 resolution is obviously a very large number of pixels to refresh 60 times every second. This is why, for the time being, it is only available as interlaced. Some people prefer 1080i and others prefer 720p. It may seem impossible to notice interlaced scanning when the usual eye can only distinguish at around 12 Hz, but you can notice some difference. In scenes with a lot of movement you can sometimes see a flicker, and for some people this can get annoying. In general, progressive scan is a lot smoother to watch than interlaced, but different people have different opinions.
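The trade-off between the two scan types can be sketched in Python as well (a rough illustration under my own naming; the helper is not from any real video API). It shows the full-frame rate of each mode and why interlacing saves bandwidth:

```python
# Interlaced scanning draws half the lines per pass, so two passes
# (fields) are needed to refresh one complete frame.
def full_frames_per_second(refresh_hz, interlaced):
    return refresh_hz // 2 if interlaced else refresh_hz

print(full_frames_per_second(60, interlaced=True))   # → 30  (1080i)
print(full_frames_per_second(60, interlaced=False))  # → 60  (720p)

# Rough throughput comparison: pixels refreshed per second.
pixels_1080i = 1920 * 1080 * full_frames_per_second(60, interlaced=True)
pixels_720p = 1280 * 720 * full_frames_per_second(60, interlaced=False)
print(f"1080i: {pixels_1080i:,} pixels/s")  # → 1080i: 62,208,000 pixels/s
print(f"720p:  {pixels_720p:,} pixels/s")   # → 720p:  55,296,000 pixels/s
```

Even interlaced, the 1080 picture pushes through more pixel data per second than 720p does progressively, which is roughly why full 1080 progressive was too much data for the broadcast pipes of the day.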
If you have had experience with both types of hd, please post a comment telling us your thoughts on what is best. Also, if you have any other questions or queries about what is going on in this article, please do not hesitate to post a comment with your question. Thanks for reading.