# 720p better than 1080p?



## thekochs

I couldn't resist posting this....figured it would get everyone's juices going.

http://www.electronichouse.com/article/expert_interview_720p_vs_1080p


----------



## macEarl

So long as the choice was 1080i vs. 720p, for me it was a no brainer - 720p.

Why?

Because interlace is horizontal, just like the whole eye-brain-perception wiring for us. So lying on your side watching TV is more fatiguing - and irritating - because of the differential between how you scan and how your TV scans.

But now that 1080p is out, we want one. Our 720p isn't good enough anymore. And you want to know why? Because we're Americans. Because bigger is better. And we don't mind living with ourselves. (We in this case literally means my household - we.)

So, here's an interesting factoid to offset any offense I've made with the above. Those flight simulators you see in the movies - the REAL ones - have projectors that spin with 6Gs of acceleration. Yep, 6 Gs. Put your hand too close to one when it's in full gear and it'll sawdust the bones, pronto. Your eye will move with that kind of rapidity - it withstands a lot of stress and moves more quickly than you might think.

PS - oh yeah - I lie on my side watching TV a lot- er, sometimes a lot, you know....


----------



## Smthkd

I have both: a 720p 50" Hitachi LCD and a 1080p 65" HP DLP, and I must say my 1080p has a much clearer and cleaner picture. The colors are more vivid, and on black levels it's no contest. 720p isn't even close to 1080p!


----------



## macEarl

Smthkd said:


> I have both: a 720p 50" Hitachi LCD and a 1080p 65" HP DLP, and I must say my 1080p has a much clearer and cleaner picture. The colors are more vivid, and on black levels it's no contest. 720p isn't even close to 1080p!


My 720p LCD isn't as good looking as the 720p DLP I've got. But that's different - I want to join the 1080p DLP club, no shame about it!


----------



## Stewart Vernon

Something I NEVER see anyone mention in the various 720 vs 1080 debates.

One of the "problems" with 1080i right now will also be the problem with 1080p via broadcast. In order to fit the 1080 data into the available bandwidth (I'm using OTA as an example since we all know cable/satellite is compressed even more) a lot of data is thrown away via MPEG2 encoding. FYI, MPEG4 for satellite throws away data too, it is the nature of the compression beast.

Anyway... When you have a higher resolution and throw away more data, you can see flaws sooner than if you had less data and threw away less.

So... folks who don't know any better attribute many of today's glitches to 1080i "interlace problems" and yell/scream that this is why 720p is better for being progressive... those folks will be in for a HUGE SURPRISE if OTA or satellite/cable goes to 1080p and they find the same glitches present there too!

The problem isn't interlacing... Unless you have epilepsy or extremely sensitive eyes or sit too close to your TV screen... your brain/eye combination processes interlaced imagery just fine and has for the entire history of television!

The problem is lossy MPEG2/4 compression... and unless they change the OTA channel bandwidth requirements, or satellite allocates more bandwidth to a channel... you will never see completely clean 1080i OR 1080p via broadcast.

DVDs (and now HD DVD or BluRay) can do it because they can cram as much data onto the disc as it will hold and shoot a much higher bitrate to your TV, at as close to full resolution as possible with a minimum of compression.

DVD media (and now HD media) will always be capable of better and cleaner video than broadcast HD signals because of the nature of the beast... and I doubt anyone comparing 1080i vs 1080p from an HD DVD/BluRay player could say one is better than the other by watching the TV with their eyes rather than bean-counting numbers.

Bottom line... 1080p will have the same problems 1080i has OTA or satellite/cable because of limited bandwidth and the large amount of data present. 720p looks cleaner many times because there is less data, so less compression is used in the stream. Since there is no 720i standard in the US, we can never compare... but I have no doubt that 720i would be just as clean and indistinguishable from 720p if there were such a comparison that could be made.
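To put rough numbers on the bandwidth point above, here's a back-of-the-envelope sketch. The ~19.39 Mbps payload is the standard ATSC figure; the 12 bits/pixel assumption (8-bit 4:2:0 video) and the "compression ratio" framing are mine, so treat the ratios as ballpark only.

```python
# Rough compression ratios needed to fit HD formats into one ATSC channel.
# Assumes 8-bit 4:2:0 chroma subsampling (~12 bits per pixel on average);
# 19.39 Mbps is the standard ATSC MPEG-2 payload rate.

ATSC_PAYLOAD_BPS = 19.39e6
BITS_PER_PIXEL = 12  # 8-bit luma plus subsampled 4:2:0 chroma

def raw_bitrate(width, height, images_per_second):
    """Uncompressed video bitrate in bits per second."""
    return width * height * BITS_PER_PIXEL * images_per_second

formats = {
    "720p60": raw_bitrate(1280, 720, 60),
    "1080i (60 fields)": raw_bitrate(1920, 540, 60),  # each field is half a frame
    "1080p30": raw_bitrate(1920, 1080, 30),
}

for name, bps in formats.items():
    print(f"{name:>18}: {bps/1e6:7.1f} Mbps raw, "
          f"roughly {bps/ATSC_PAYLOAD_BPS:.0f}:1 compression needed")
```

Every format needs compression on the order of 30:1 or more, which is why encoder quality, not interlacing, dominates what you see.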


----------



## macEarl

HDMe said:


> The problem isn't interlacing... Unless you have epilepsy or extremely sensitive eyes or sit too close to your TV screen... your brain/eye combination processes interlaced imagery just fine and has for the entire history of television!


Yep - unless it's interlacing up/down instead of side/side - then it creates fatigue.

Great post, thanks!


----------



## 4DThinker

Read the article. The only argument for a 720p set is when you can't buy a 1080i set with all the other variables as good. There's also not much reason for 1080i if your eyesight is ordinary and your TV is small.

When the content is 1920x1080 on either of my 1080i TVs, it ALWAYS looks better than it does at 720p. It's easy to test, too, since I can switch back and forth between 1080i and 720p resolution with either the TVs or my DirecTV HD receivers.

Yes, my 1080i sets cost more than the same size of 720p sets from the same manufacturers. Yes, some 720p sets have better specs than my 1080i sets, but none have the same native resolution.


----------



## Kapeman

I bought my first HDTV back in '03 (pronounced aught three) and it was a 46" RP Sony with a native res of 1080i.

My new set is a Sammy 50" DLP with a native of 720p.

The big thing I noticed this time around was that there weren't a lot of 1080i sets to be found. Almost all were 720p. The 1080p option may explain it, but you would still think there would be as many 1080i sets as 720p ones.


----------



## Geronimo

The article never makes the claim that 720p is better than 1080p. It simply says that resolution is only one factor and that other factors need to be considered.


----------



## BudShark

Kapeman said:


> I bought my first HDTV back in '03 (pronounced aught three) and it was a 46" RP Sony with a native res of 1080i.
> 
> My new set is a Sammy 50" DLP with a native of 720p.
> 
> The big thing I noticed this time around was that there weren't a lot of 1080i sets to be found. Almost all were 720p. The 1080p option may explain it, but you would still think there would be as many 1080i sets as 720p ones.


The reason for the difference (fewer 1080i sets) is the move to digital displays - hence natively progressive. Hitachi made some 1080i digitals (why???), but in general, going interlaced on a digital display tends to create more problems than it's worth.

I went from a Panny 1080i, to a Hitachi 1080i RPTV, to a Samsung 1080p DLP. There is no doubt that the Samsung is sharper, richer, better all around - but you're comparing apples to oranges at that point.


----------



## Cholly

The message is pretty clear -- since digital displays have fixed resolutions, upscaling or downscaling of program source must usually be performed. When one has converter boxes or DVD players with built-in scalers AND a TV with a built-in scaler, it becomes a problem of just which scaler to use. Throw into the mix those folks who buy high quality scalers to do the format conversion and it's a whole new ball game.


----------



## Radio Enginerd

Smthkd said:


> I have both: a 720p 50" Hitachi LCD and a 1080p 65" HP DLP, and I must say my 1080p has a much clearer and cleaner picture. The colors are more vivid, and on black levels it's no contest. 720p isn't even close to 1080p!


I've been told that until you get above 50", the difference between 720p and 1080p is indistinguishable. One would imagine that with your monster 65" monitor (I'm jealous) you'd be the first to notice.


----------



## premio

macEarl said:


> Yep - unless it's interlacing up/down instead of side/side - then it creates fatigue.
> 
> Great post, thanks!


They should create diagonal refresh patterns to balance that out.


----------



## AllieVi

I didn't notice anyone mentioning camera resolution. If the camera captures X lines, the picture will look best on a display with X lines.

When lines have to be added (720 camera to 1080 display), picture information that didn't exist has to be generated. When lines are removed (1080 camera to 720 display), information has to be deleted.

You can't expect a 1080 display to give a better picture if the content was shot with a 720 camera or underwent a format change somewhere in the transmission process.
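The add-or-delete-lines point above can be sketched in a few lines of code. This is a toy vertical scaler using linear interpolation - real TV scalers use fancier filters, so this is illustrative only.

```python
# Minimal sketch of vertical scaling between line counts, illustrating that
# 720 -> 1080 must invent lines and 1080 -> 720 must discard detail.
# Plain linear interpolation; real scalers use multi-tap filters.

def scale_lines(lines, target_count):
    """Resample a list of per-line values to target_count lines."""
    n = len(lines)
    out = []
    for i in range(target_count):
        # Map target line i back to a fractional source position.
        pos = i * (n - 1) / (target_count - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(lines[lo] * (1 - frac) + lines[hi] * frac)
    return out

src = list(range(720))                       # stand-in for 720 scan lines
up = scale_lines(src, 1080)                  # 360 lines' worth is interpolated
down = scale_lines(list(range(1080)), 720)   # 360 lines' worth is thrown away
```

Either direction, the display never shows exactly what the camera captured - which is the point being made above.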


----------



## paulman182

Smthkd said:


> I have both: a 720p 50" Hitachi LCD and a 1080p 65" HP DLP, and I must say my 1080p has a much clearer and cleaner picture. The colors are more vivid, and on black levels it's no contest. 720p isn't even close to 1080p!


I believe what you are saying, but I don't see what the number of scan lines has to do with vividness of colors or deepness of blacks. Is that really why the DLP is better, or is there some other reason?


----------



## Cholly

DLP is capable of more vivid colors than LCD, and better blacks. It also doesn't suffer from LCD's slow response time. That being said, plasma is better than both DLP and LCD for picture quality.

On another track, just to clarify the talk about interlacing: horizontal interlacing is not "sideways interlacing". It is the scanning of a field of odd-numbered lines followed by a field of even-numbered lines to make up a single picture frame. Each field takes 1/60 second, thus 30 frames per second. With progressive scan, the entire frame is scanned in a single pass -- 60 frames/second.

720p is perceived to be better for fast action -- no "jaggies". 1080i is perceived to be better for static pictures or slower motion, due to the greater number of pixels in the picture. As I indicated in my previous post, the quality of the picture you see is dependent on your TV's native resolution: if the TV has a native resolution of 720p, then 1080i program content must be downconverted to 720 lines for viewing. Conversely, if your TV has a native resolution of 1080 lines, 720p content has to be scaled up to 1080.
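The odd/even-field description can be made concrete with a toy "weave": two 540-line fields interleaved back into one 1080-line frame. This is just an illustration of the field structure, not how any actual deinterlacer works.

```python
# Toy illustration of interlacing: a 1080i frame is carried as two fields,
# one holding the even-numbered scan lines and one the odd-numbered lines.

def weave(even_field, odd_field):
    """Interleave two fields (lists of scan lines) into one full frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # lines 0, 2, 4, ...
        frame.append(odd_line)   # lines 1, 3, 5, ...
    return frame

# A full 1080-line frame split into its two 540-line fields:
full = [f"line {i}" for i in range(1080)]
evens = full[0::2]  # sent in the first 1/60 second
odds = full[1::2]   # sent in the next 1/60 second

assert weave(evens, odds) == full  # weaving the fields rebuilds the frame
```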


----------



## AllieVi

Cholly said:


> ... On another track, just to clarify the talk about interlacing: horizontal interlacing is not "sideways interlacing". It is the scanning of a field of odd numbered lines followed by the scanning of a field of even numbered lines to make up a single picture frame. It takes 1/60 second per field, thus 30 frames per second. With progressive scan, the entire frame is scanned in a single pass -- 60 frames/second. ...


I thought progressive gave 30 frames per second, too, but produced each frame in a single pass.

If progressive presents twice as many full frames per second, it requires twice the bandwidth. Are you really sure?


----------



## P Smith

AllieVi said:


> I thought progressive gave 30 frames per second, too, but produced each frame in a single pass.
> 
> If progressive presents twice as many full frames per second, it requires twice the bandwidth. Are you really sure?


You could easily clear up your confusion by reading any of the zillion articles on the Internet, like this one: http://en.wikipedia.org/wiki/1080i - and avoid adding to the noise here.


----------



## Cholly

AllieVi said:


> I thought progressive gave 30 frames per second, too, but produced each frame in a single pass.
> 
> If progressive presents twice as many full frames per second, it requires twice the bandwidth. Are you really sure?


Oops! In my haste to compose my reply, I goofed :grrr: Of course, 720p is transmitted at 30 frames/second -- another nice link explaining the differences and "who's better" is this one: http://ezinearticles.com/?720p-Vs-1080i-HDTV&id=91443


----------



## premio

I love this debate. Everyone has their own opinions, so I'll list mine and someone will counter it. Among the extreme opinions I have seen debated: people on the Wii board arguing that still pictures at 480i weren't as good as the same paused image at 480p.

I thought long and hard when I bought a 720p plasma vs. the then uber-expensive new 1080p ones. My set was originally the 5679LED LCD from Samsung, but that just wasn't bright enough for my off-angle viewing.

Progressive is better; interlacing was only developed because of bandwidth constraints. And 1080p is hands down better than 720p, as long as your monitor is displaying 1080p without scaling it back to something else. It's just more resolution. Do you need it? At 50" and under, it's a definite no - from a standard viewing distance, you just can't perceive the difference. Once you go bigger, I'd say it's mandatory.

I would personally rather have progressive than interlaced even at less resolution, because of the way it cleans up fast-moving sporting events (like playoff hockey). But a 1080i feed looks just as good most of the time on the 50". At 56" and over, I think interlacing gets too visible.
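The "can't see it at 50-inch and under" claim can be sanity-checked with the usual visual-acuity rule of thumb: 20/20 vision resolves about one arcminute, so beyond the distance where a pixel subtends less than that, extra resolution is invisible. The one-arcminute figure is the textbook approximation (real thresholds vary by viewer), so this is a sketch, not a verdict.

```python
import math

# Distance beyond which a single pixel subtends less than one arcminute
# (roughly the limit of 20/20 vision), for a 16:9 screen of given diagonal.

ARCMIN = math.radians(1 / 60)  # one arcminute in radians

def max_useful_distance_inches(diagonal_in, vertical_pixels):
    height = diagonal_in * 9 / math.hypot(16, 9)  # screen height in inches
    pixel = height / vertical_pixels              # pixel height in inches
    return pixel / math.tan(ARCMIN)

d720 = max_useful_distance_inches(50, 720)    # ~9.8 ft
d1080 = max_useful_distance_inches(50, 1080)  # ~6.5 ft

print(f'50" 720p:  pixels blend beyond ~{d720/12:.1f} ft')
print(f'50" 1080p: pixels blend beyond ~{d1080/12:.1f} ft')
```

So from a typical ~10 ft couch, a 50" set's 1080 lines are already below the acuity limit, which is consistent with the claim above.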


----------



## Stewart Vernon

premio said:


> I would personally rather have progressive than interlaced even at less resolution, because of the way it cleans up fast-moving sporting events (like playoff hockey). But a 1080i feed looks just as good most of the time on the 50". At 56" and over, I think interlacing gets too visible.


Generally speaking, most folks cannot see interlacing. Some perceive it on a subconscious level and get headaches or, in extreme cases, epileptic seizures... but by and large you cannot see the interlacing in real time because of the way our brains process things.

That said... what most folks *think* they are seeing as interlacing problems are actually problems due to MPEG compression of the larger 1080 image data vs. the 720 image data. Current equipment does a better job at 720 resolution. It is that simple.

IF we had 1080p broadcast today, then it would be subject to the same problems folks see with 1080i right now because the real-time encoders just don't do a good job yet with the extra data. In time, however, this will be a thing of the past as the encoders are continuing to improve.

Now, when we talk about HD DVD or Blu Ray... they have the advantage of non real-time compression, and should theoretically not only be able to provide more bandwidth than broadcast but also they can let their computers churn as long as needed to produce a quality DVD image. This is something that just cannot be done in real-time with the current encoder technology.

So... I understand why some folks think 720p looks better than 1080i... and sometimes I agree with them... but most folks don't know why the image looks better, and lots of these mini urban-myth type descriptions get perpetuated.

With any luck, in a couple of years the 1080 compression in real-time will be good enough to compete and then folks will not notice the artifacting as much... but it will still have nothing to do with interlaced vs progressive displays.


----------



## guzmania

There is no home delivery system for 1080p at this time. The only thing that puts out 1080p other than broadcast cameras or VTRs is a gaming console. The output of a Sony 1500 HD cam has to use two coax cables to get the data to a VTR. 1080p is better than 1080i, but only for objects that are moving horizontally or that are still-framed (unless both fields are displayed).

I think 1080i is the superior format. In tests I was involved in, using objects which moved at a constant rate, the 1080i signal delivered more horizontal resolution than the 720p pictures made with the same camera and lens on the same recorder. The test used all broadcast equipment and a CRT monitor. More data = more resolution.

Both systems scan horizontally. The guys who developed the idea of interlaced pictures considered many aspects of how we see. There is no psycho-visual detriment to an interlaced picture. I've watched interlaced pictures for 50+ years and have been happy with the results.


----------



## 4DThinker

Look here: http://www.popularmechanics.com/technology/how_to/4216631.html

Pay attention to Myth #2. Ultimately it states that all 1080 flat panels display progressively. So if you're all hyped up over 1080p, you get it even if you buy a 1080i panel. 1080p panels simply let the set accept 1080p input signals, but both sets (1080i and 1080p) are going to display the same way (progressively). Pay attention to the resolution, the refresh rate, the contrast ratio, the video settings, etc. - those are things you can see. If you think you can see interlacing flaws when watching a 1080i flat panel set, then you've got a creative imagination.


----------



## guzmania

To make a 1080p picture out of a 1080i source, both fields of the "i" picture are held for a bit of time and displayed line by line, instead of line by line of field 1 and then line by line of field 2. The picture source is the determining factor in the final display: yes, it is a 1080p picture, but it is from a 1080i source.

As for the claim that p and i are vastly different: the first year I was involved in a live Fox broadcast in 720p, we had cameras that were 1080i native. They had a board that made 720p pictures. Fox insisted that all cameras be capable of 720p "native" pictures, so we brought in the appropriate cameras. When the show later in the season wanted to increase the number of cameras for the shoot, Fox did not want to supply the 720p cameras, so they accepted our 1080i version of 720p. We could never tell the difference, and neither could they. 1080 pix can easily be made into 720 pix; the opposite cannot be said. (edit: Television cameras in the US all make 1080i native pictures. Some also make "p" native pictures, 1080 or 720. 1080p cameras can make 720p native pictures.)

Progressive sounds so macho and, well, progressive, whereas interlaced sounds kind of old-lady-with-7-cats. I think this is the reason many people are on the "p" bandwagon. Fox built a whole distribution facility that was 480p, contending that that was the format of the future for digital transmission - big mistake. They have since joined the ranks of the HD "lite" distributors. The real underlying reason ABC and Fox didn't join the 1080i "Grand Alliance" is that they think they can squeeze some use out of the bandwidth given by the government for other stuff, with other revenue. That, and Bill Gates jumped in with both feet and wanted a computer-style format for display. Now if we could get the geeks who do the compression to get rid of those artifacts, we'd be in high cotton.

Bottom line: HD looks great in all HD formats (well, I still have problems with 24p). Beauty is in the eye of the beholder, and after a couple of minutes, if the programming is decent, I want to forget the tech stuff and just watch.


----------



## Cholly

Guzmania and HDMe: I agree in general with both of you. Most people will be quite content with both 720p and 1080i programming. The Fox/ABC/ESPN sports shows all look great. Planet Earth on Discovery HD Theater was stunning. 

A few additions: 1080p HD DVD players are available from Toshiba for as low as $399 after $100 rebate. 
Until last year, if you wanted an LCD HD television, whether direct view or rear projection, it was 720p native, period. 1080i was downconverted for viewing.

The big disadvantages of LCD televisions are brightness falloff when viewed from an angle and slow LCD response times (usually 8 to 12 milliseconds).

The big advantages of plasma are superior color rendering (including BTB), uniform brightness at all viewing angles and of course, very thin cabinets. The big disadvantage is heat output. Bulb life on current sets is not an issue.


----------



## Hoxxx

macEarl said:


> My 720p LCD isn't as good looking as the 720p DLP I've got. But that's different - I want to join the 1080p DLP club, no shame about it!


I have and it is a great improvement over 720P.


----------



## tkrandall

HDMe said:


> Something I NEVER see anyone say in the various 720 vs 1080 debates.
> 
> One of the "problems" with 1080i right now will also be the problem with 1080p via broadcast. In order to fit the 1080 data into the available bandwidth (I'm using OTA as an example since we all know cable/satellite is compressed even more) a lot of data is thrown away via MPEG2 encoding. FYI, MPEG4 for satellite throws away data too, it is the nature of the compression beast.
> 
> Anyway... When you have a higher resolution and throw away more data, you can see flaws sooner than if you had less data and threw away less.
> 
> So... folks who don't know any better attribute many of today's glitches to 1080i "interlace problems" and yell/scream that this is why 720p is better for being progressive... those folks will be in for a HUGE SURPRISE if OTA or satellite/cable goes to 1080p and they find the same glitches present there too!


I don't believe the ATSC standard (the OTA standard) even includes a provision for 1080p broadcasts. As far as I know, there is not enough bandwidth in the 6 MHz channel slice, using MPEG-2 (also specified by ATSC, I believe), for 1080p to be practical.

As to 720p versus 1080i picture quality, I think it depends a lot on the network feed, the TV set, and the local channel's practices. I have a Samsung 1080p DLP set. In the Atlanta area, I'd have to say WGCL "channel 46", using 1080i, has the best overall picture. For example, golf and football seem to look better than on other channels. Channel 46 also isn't co-broadcasting a 480i channel like some stations do, which seems to affect PQ a lot on the two network stations that carry a 480i feed as well.


----------



## aim2pls

If I remember correctly (please correct me if I'm wrong), 1080p isn't part of the current broadcast standards for HDTV (ATSC).


----------



## packfan909

macEarl said:


> So long as the choice was 1080i vs. 720p, for me it was a no brainer - 720p.
> 
> Why?
> 
> Because interlace is horizontal, just like the whole eye-brain-perception wiring for us. So lying on your side watching TV is more fatiguing - and irritating - because of the differential between how you scan and how your TV scans.
> 
> But now that 1080p is out, we want one. Our 720p isn't good enough anymore. And you want to know why? Because we're Americans. Because bigger is better. And we don't mind living with ourselves. (We in this case literally means my household - we.)
> 
> So, here's an interesting factoid to offset any offense I've made with the above. Those flight simulators you see in the movies - the REAL ones - have projectors that spin with 6Gs of acceleration. Yep, 6 Gs. Put your hand too close to one when it's in full gear and it'll sawdust the bones, pronto. Your eye will move with that kind of rapidity - it withstands a lot of stress and moves more quickly than you might think.
> 
> PS - oh yeah - I lie on my side watching TV a lot- er, sometimes a lot, you know....


This post reminded me of a Denis Leary diatribe from one of his stand-up acts.

Too good...

pf


----------



## Stewart Vernon

tkrandall said:


> I don't believe the ATSC standard (the OTA standard) even includes a provision for 1080p broadcasts OTA. There is not enough bandwidth in the 6 MHz channel slice, using MPEG-2 (also specified by ATSC I believe) for 1080p to be practical as far as I know.


I believe two flavors of 1080p are part of the broadcast standard. 1080p at 24fps and 1080p at 30 fps.

There is enough bandwidth to do this.

Right now 1080i is broadcast at 60 fields per second, with each field being half of the full interlaced frame... The same bandwidth could be used for 1080p at up to 30 fps.

My point was just that 1080p at 30 fps would have the same compression-artifact issues that 1080i does, because both would have to be compressed too much with today's encoders to fit into that 6 MHz OTA channel. Better encoders in the future will make this problem go away, though.


----------



## tkrandall

Good point. I also see that there does appear to be an ATSC 1080p/24 standard.


I guess 1080i is technically 30 fps, with 60 "fields" (every other line) per second. How much bandwidth does that require compared to 720p at 60 fps?

I wonder if/how they would be able to deploy new encoders and new tuners while still maintaining full compatibility with existing ATSC-standard tuners. Because it is free OTA TV, you can't just improve the standard unless you preserve current functionality as well. I wonder if in 25 years we will still be using MPEG-2 and current encoders for OTA! NTSC lasted how many decades?


----------



## Stewart Vernon

tkrandall said:


> Good point. I also see that there does appear to be an ATSC 1080p/24 standard.
> 
> I guess 1080i is technically 30 fps, with 60 "fields" (every other line) per second. How much bandwidth does that require compared to 720p at 60 fps?


If you consider each of the 60 interlaced "fields" as equivalent to a progressive frame for purposes of sending 60 per second...

1920x540 = 1036800
1280x720 = 921600

You can see just how close the data is... just a little more is required for 1080i, but it's really pushing the limits of the 6 MHz bandwidth. Even another 1 MHz would probably make a world of difference in the amount of compression required.

Consider that it takes two of the "fields" to get one frame... and each is slightly larger than a single 720p frame... there is a lot more compression happening to get 1080 resolution in there.

A 1080p frame (1920x1080) would take the same bandwidth as two of the interlaced "fields", so nothing would be gained quality-wise given the same bandwidth. If we had more bandwidth, most of the "i vs p" arguments would go away because no one would notice a difference.
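The arithmetic above can be restated in a few lines of code (nothing beyond the post's own numbers):

```python
# Pixel counts and per-second pixel throughput for the formats discussed.

field_1080i = 1920 * 540    # one interlaced field (half a 1080 frame)
frame_720p = 1280 * 720     # one progressive 720p frame
frame_1080p = 1920 * 1080   # one full progressive 1080p frame

assert field_1080i == 1036800
assert frame_720p == 921600

per_second = {
    "1080i (60 fields/s)": field_1080i * 60,
    "720p  (60 frames/s)": frame_720p * 60,
    "1080p (30 frames/s)": frame_1080p * 30,
}

# 1080i at 60 fields/s moves exactly as many pixels per second as 1080p at
# 30 frames/s -- which is why the same channel could carry either, under
# the same compression pressure.
assert per_second["1080i (60 fields/s)"] == per_second["1080p (30 frames/s)"]
```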



tkrandall said:


> I wonder if/how they would be able to deploy new encoders and new tuners while still maintaining full compatibility with existing ATSC-standard tuners. Because it is free OTA TV, you can't just improve the standard unless you preserve current functionality as well. I wonder if in 25 years we will still be using MPEG-2 and current encoders for OTA! NTSC lasted how many decades?


That's one downside of not making more bandwidth part of the ATSC spec in the first place... we are pretty much stuck now, since (as you note) we also can't switch to MPEG4 without screwing over every HDTV with a built-in digital tuner in the process.


----------



## Cholly

I haven't looked at the ATSC specs lately, but I'm wondering: is ATSC transmission similar to NTSC -- vestigial sideband, 4.5 MHz usable bandwidth?
(Update) -- The 8VSB transmission standard indicates a bandwidth at the half-power points of 5.38 MHz. Most of the text in the standard is incomprehensible to an old guy like me -- I've forgotten most of the math I ever learned other than the basics. :lol:
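For the curious, the familiar ~19.39 Mbps ATSC payload rate falls out of the 8VSB numbers. A sketch, assuming I have the A/53 constants right (symbol rate derived from 4.5 MHz, 2/3-rate trellis coding, Reed-Solomon and sync overhead):

```python
# Deriving the ~19.39 Mbps ATSC payload rate from 8VSB parameters.

symbol_rate = 4.5e6 * 684 / 286  # ATSC symbol rate, ~10.76 Msymbols/s
bits_per_symbol = 2              # 8VSB sends 3 bits/symbol, but 2/3-rate
                                 # trellis coding leaves 2 information bits

gross = symbol_rate * bits_per_symbol  # ~21.52 Mbps before overhead

# Each 188-byte MPEG transport packet occupies a 208-byte segment after
# Reed-Solomon parity and segment sync; one field-sync segment accompanies
# every 312 data segments.
payload = gross * (188 / 208) * (312 / 313)

print(f"ATSC payload: {payload/1e6:.2f} Mbps")  # → ATSC payload: 19.39 Mbps
```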


----------



## harsh

macEarl said:


> So lying on your side watching TV is more fatiguing - and irritating - because of the differential in how you scan vs. how your TV scans.


Because all non-CRT televisions are progressive scan, this assertion is a non sequitur.

I'm not convinced that sideways is the best way to watch TV.


----------



## Radio Enginerd

Very helpful and informative. Thanks.



HDMe said:


> If you consider each of the 60 interlaced "fields" as equivalent to a progressive frame for purposes of sending 60 per second...
> 
> 1920x540 = 1036800
> 1280x720 = 921600
> 
> You can see just how close the data is... just a little more is required for 1080i, but it's really pushing the limits of the 6 MHz bandwidth. Even another 1 MHz would probably make a world of difference in the amount of compression required.
> 
> Consider that it takes two of the "fields" to get one frame... and each is slightly larger than a single 720p frame... there is a lot more compression happening to get 1080 resolution in there.
> 
> A 1080p frame (1920x1080) would take the same bandwidth as two of the interlaced "fields", so nothing would be gained quality-wise given the same bandwidth. If we had more bandwidth, most of the "i vs p" arguments would go away because no one would notice a difference.
> 
> That's one downside of not making more bandwidth part of the ATSC spec in the first place... we are pretty much stuck now, since (as you note) we also can't switch to MPEG4 without screwing over every HDTV with a built-in digital tuner in the process.


----------

