# DirecTV 1080p?



## sarfdawg (Jan 21, 2007)

I have seen some posts on this from a few months ago, but I was wondering if anyone out there knew of any updates on the possibility of DirecTV potentially doing a PPV or a special that was in 1080p? I know about the bandwidth issues with 1080p, but I wonder if it would be possible to have a one-time, weekly offering.

Thanks!


----------



## Stuart Sweet (Jun 19, 2006)

I don't think any DIRECTV hardware would handle 1080p. I suppose theoretically an HR20 in native mode might but I doubt it's been tried outside a laboratory.


----------



## say-what (Dec 14, 2006)

Since none of the boxes support 1080p, how would anyone see this?


----------



## 66stang351 (Aug 10, 2006)

sarfdawg said:


> I have seen some posts on this from a few months ago, but I was wondering if anyone out there knew of any updates on the possibility of DirecTV potentially doing a PPV or a special that was in 1080p? I know about the bandwidth issues with 1080p, but I wonder if it would be possible to have a one-time, weekly offering.
> 
> Thanks!


There are zero(0) DIRECTV receivers that support 1080p. So the answer would be there is zero(0) possibility of DIRECTV doing a 1080p PPV or special.


----------



## Redlinetire (Jul 24, 2007)

So there are no broadcasts, no bandwidth, and no boxes to support it.

So what's the hang up? :lol:

EDIT: I forgot, there is a company trying to startup a 1080p satellite service:
http://www.xstreamhd.com/


----------



## jodyguercio (Aug 16, 2007)

Even if the hardware were there, nothing that I know of is being broadcast in 1080p except Blu-Ray and HD-DVD.


----------



## sarfdawg (Jan 21, 2007)

Wishful thinking then. I wonder if there is a possibility of OTA? Probably wishful thinking again!  

I got this cool 1080p TV, and I've only been able to air it out with one movie on a PS3.


----------



## jodyguercio (Aug 16, 2007)

sarfdawg said:


> Wishful thinking then. I wonder if there is a possibility of OTA? Probably wishful thinking again!
> 
> I got this cool 1080p TV, and I've only been able to air it out with one movie on a PS3.


And that's going to be the only way to...nothing is being broadcast in 1080p.


----------



## Bugg77 (Jun 27, 2007)

Not trying to tinkle in your cereal, but there are a number of "experts" who say 1080p is worthless for displays under 60".

I wish they would have said that before I bought my 46" 1080p TV! I could have saved at least $500!

On second thought, it would have cost me more because I would have run out and bought a 70" 1080p just to be safe!


----------



## sarfdawg (Jan 21, 2007)

Not tinkled on, I assure you. It is sort of a novelty in that I had to borrow a PS3 just to experiment in 1080p. 

Kind of an interesting side note, I was waiting to be blown away by 1080p, and I certainly wasn't. It was very good, but not jaw dropping like I was expecting. What I did notice is that I saw a lot of imperfections when I went back to 1080i on Discovery HD Theatre. Not huge, but definitely noticeable. Maybe that goes back to the 60" TV theory.

I am completely satisfied with my picture in 1080i/720p until something bigger or better comes down on a permanent basis. Now, if the wife wants to drop a PS3 on me... :lol:


----------



## BudShark (Aug 11, 2003)

Bugg77 said:


> Not trying to tinkle in your cereal, but there are a number of "experts" who say 1080p is worthless for displays under 60".
> 
> I wish they would have said that before I bought my 46" 1080p TV! I could have saved at least $500!
> 
> On second thought, it would have cost me more because I would have run out and bought a 70" 1080p just to be safe!


The so-called "experts" are citing various theories about human perception, lighting, etc...

The fact of the matter is 1080P is more than twice the pixels of 720P. So regardless of what an expert says - there is more data there, tighter pixel clarity, etc. While they may feel you can't perceive a difference - a lot of factors come into play such as proximity to the viewing area, lighting, source material, etc.

I have 2 1080P displays - a 52" and a 61". If I were to buy a 40+" set tomorrow it would be 1080P. You made the right decision.

Chris


----------



## final_thrill (Jun 5, 2006)

Redlinetire said:


> So there are no broadcasts, no bandwidth, and no boxes to support it.
> 
> So what's the hang up? :lol:
> 
> ...


This service at best could only ever do movies in 1080p since no cameras exist that broadcast in 1080p correct?


----------



## jacag04 (Jul 12, 2006)

BudShark said:


> The so-called "experts" are citing various theories about human perception, lighting, etc...
> 
> The fact of the matter is 1080P is more than twice the pixels of 720P. So regardless of what an expert says - there is more data there, tighter pixel clarity, etc. While they may feel you can't perceive a difference - a lot of factors come into play such as proximity to the viewing area, lighting, source material, etc.
> 
> ...


Can you go into more detail about the other factors that come into play while viewing your 1080p that make it that much better? (I'm considering 1080p if I can get enough evidence to support it.) Also, what makes you so confident of a 40" 1080p since a lot of people assume that resolution is pointless under 50"? I'm very interested in your opinions thanks.


----------



## Kansas Zephyr (Jun 30, 2007)

BudShark said:


> The so-called "experts" are citing various theories about human perception, lighting, etc...
> 
> The fact of the matter is 1080P is more than twice the pixels of 720P. So regardless of what an expert says - there is more data there, tighter pixel clarity, etc. While they may feel you can't perceive a difference - a lot of factors come into play such as proximity to the viewing area, lighting, source material, etc.
> 
> ...


I'm not taking a side, but the argument over 720p v. 1080i is:

720p is 1280 x 720 that's 921,600 pixels scanned 60 times per second.

1080i is 1920 x 1080 that's 1,036,800 pixels scanned 60 times per second.

So, 720p "lights up" only 115,200 (11%) fewer pixels each pass, but each pass is a complete image.

Again, I take no side. But, there are some that argue that 720p is better for fast motion due to the progressive scanning.

Even if your TV up-converts 1080i to 1080p (like one of mine does), you are essentially getting 540p, just "line-doubled" every scan. Yes, there are more total pixels being hit, but when up-converting 1080i you duplicate the data on two scan lines.

Unlike a true 1080p source (blu-ray, HD-DVD, etc.), that has unique data on each line.

As for future 1080p OTA broadcasts: the only way that can become a reality is if an encoding system is developed that supports the current standard and a new 1080p standard simultaneously, since we can't go back and reprogram every existing HDTV with a new format.

If you've got the coin, go 1080p. That way if you ever get a blu-ray, or HD-DVD player, you will be truly impressed.
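
The per-pass arithmetic above can be sanity-checked with a short script (a toy sketch; the resolutions and the 60 passes per second are the standard broadcast figures, and for interlaced formats each pass carries only half the frame's lines):

```python
# Sanity check of the per-scan pixel arithmetic above. For interlaced
# formats, each pass of the screen scan delivers only half the lines.
def pixels_per_pass(width, height, interlaced):
    """Pixels delivered in one pass of the screen scan."""
    lines = height // 2 if interlaced else height
    return width * lines

p720 = pixels_per_pass(1280, 720, interlaced=False)   # full 720p frame
i1080 = pixels_per_pass(1920, 1080, interlaced=True)  # one 1080i field

print(p720)           # pixels per 720p pass
print(i1080)          # pixels per 1080i pass
print(i1080 - p720)   # the gap per pass: about 11% in 1080i's favor
```

Running it reproduces the 921,600 vs. 1,036,800 figures, with 115,200 pixels (roughly 11%) separating the two formats on each pass.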

BTW...Bud: War Eagle! (Sorry, I couldn't help myself since you have that avatar.)


----------



## jodyguercio (Aug 16, 2007)

final_thrill said:


> This service at best could only ever do movies in 1080p since no cameras exist that broadcast in 1080p correct?


Correct. Even movies that are in 1080p are simply shot digitally and then remastered to 1080p, if I remember correctly, so nothing out there is currently being broadcast in 1080p.


----------



## Hansen (Jan 1, 2006)

jacag04 said:


> Can you go into more detail about the other factors that come into play while viewing your 1080p that make it that much better? (I'm considering 1080p if I can get enough evidence to support it.) Also, what makes you so confident of a 40" 1080p since a lot of people assume that resolution is pointless under 50"? I'm very interested in your opinions thanks.


There is quite a bit of information on this over at AVS forum. http://www.avsforum.com You might look through the various threads in the display forum. http://www.avsforum.com/avs-vb/forumdisplay.php?f=9


----------



## Kansas Zephyr (Jun 30, 2007)

jodyguercio said:


> Even movies that are in 1080p are simply shot digitally and then remastered to 1080p.


Or, shot on film which is scanned at 1080p for non-broadcast ATSC (blu-ray, HD-DVD) delivery.

Who knows, a cable/fiber/sat service may dedicate enough bandwidth for future 1080p, or better, channels?

But, the fixed bandwidth of OTA, and the requirement to be backward compatible with today's standard will be a huge hurdle.


----------



## apexmi (Jul 8, 2006)

Kansas Zephyr said:


> I'm not taking a side, but the argument over 720p v. 1080i is:
> 
> 720p is 1280 x 720 that's 921,600 pixels scanned 60 times per second.
> 
> ...


I have run my PS3 & HDA1 on 57" 1080i & 1080p sets and from my normal 12' seating distance see no noticeable difference.


----------



## bemenaker (Jan 6, 2008)

OTA broadcast and satellite broadcast are based on the Y-Pb-Pr standard. This standard does not support 1080p. That was added later. This standard was created specifically for satellites, to cut back on bandwidth requirements. The ORIGINAL spec was to do it through VGA, but it was far too bandwidth intensive. (I helped my dad build a home theater based on this, and we got screwed by the changing standards, in '97)

That 1080p signal can be recreated from a 1080i signal. The green channel has the full data. If your tv does 1080p, it should be converting any 1080i to p. My plasma does this. 

Other than xstream, I wouldn't expect any satellites to go 1080p for a long time, even with mpeg4's good compression. OTA doesn't have the bandwidth to broadcast 1080p.


----------



## cygnusloop (Jan 26, 2007)

jacag04 said:


> Can you go into more detail about the other factors that come into play while viewing your 1080p that make it that much better? (I'm considering 1080p if I can get enough evidence to support it.) Also, what makes you so confident of a 40" 1080p since a lot of people assume that resolution is pointless under 50"? I'm very interested in your opinions thanks.


I'm not Bud, but...

Many arguments can be made about how the human eye can resolve ~1/60 of a degree, and its ability to discern individual pixels at certain distances, but there is more to it than that. Inter-pixel spacing matters, visual acuity differs between people, and how the human brain processes motion is also a BIG factor. Some will see a difference, some won't. Human vision is a complicated process that can't be fully described by its ability to resolve pixels alone. If the display is a good one (with a quality deinterlacer), 1080i will be better on a 1080p set than on a 720p set.

Lest we forget, just as important as resolution (and in some ways more important) are contrast ratio, color saturation, and color accuracy. These qualities can vary wildly between TVs no matter what the resolution. I would spend at least as much time researching these qualities, and spend at least as much money on them as well.

As was said, a great place to research different displays and their relative qualities is over at AVS Forums.


----------



## Stuart Sweet (Jun 19, 2006)

jodyguercio said:


> Correct. Even movies that are in 1080p are simply shot digitally and then remastered to 1080p if I remember correctly so nothing out there is currently being broadcast in 1080p.


Actually I believe parts of Superman Returns were shot using a Panasonic Genesis in 1080p.


----------



## jodyguercio (Aug 16, 2007)

Stuart Sweet said:


> Actually I believe parts of Superman Returns were shot using a Panasonic Genesis in 1080p.


Stuart as always thank you for the help. I stand corrected.


----------



## cartrivision (Jul 25, 2007)

sarfdawg said:


> Not tinkled on, I assure you. It is sort of a novelty in that I had to borrow a PS3 just to experiment in 1080p.
> 
> Kind of an interesting side note, I was waiting to be blown away by 1080p, and I certainly wasn't. It was very good, but not jaw dropping like I was expecting. What I did notice is that I saw a lot of imperfections when I went back to 1080i on Discovery HD Theatre. Not huge, but definitely noticeable. Maybe that goes back to the 60" TV theory.
> 
> I am completely satisfied with my picture in 1080i/720p until something bigger or better comes down on a permanent basis. Now, if the wife wants to drop a PS3 on me... :lol:


Don't expect to be blown away by 1080p... even on bigger screens. 1080p doesn't give you any more pixels of resolution than 1080i does, it only eliminates certain interlacing artifacts that happen with interlaced video formats (such as 1080i).


----------



## rjheard (Dec 12, 2007)

1080P is useful for a computer input too.


----------



## phat78boy (Sep 12, 2007)

cartrivision said:


> Don't expect to be blown away by 1080p... even on bigger screens. 1080p doesn't give you any more pixels of resolution than 1080i does, it only eliminates certain interlacing artifacts that happen with interlaced video formats (such as 1080i).


You are a bit off. Interlaced means you only see half (540) of the specified lines (1080) at one time. With 1080P you would see all (1080) of the lines every time.


----------



## MarkN (Jul 13, 2007)

phat78boy said:


> You are a bit off. Interlaced means you only see half (540) the specified lines(1080) at one time. With 1080P you would see all(1080) the lines every time.


So does that mean that 720p is actually a higher resolution than 1080i?


----------



## Canis Lupus (Oct 16, 2006)

I have a 1080p and Im happy with the quality of both the PC input and the Blu-Ray and PS3. 

I will say, however, that the MPEG4 HD channels are very impressive, so much so that I usually delay a recording from HBO, for example, to get it in MPEG-4. And I leave Native off and keep the HR20 at 720p.

There are times that, to my eye, the MPEG4-HD is every bit as good as a Blu-Ray disc. 

So much of it depends on the type of program as well, at least for me.


----------



## Maverickster (Sep 20, 2007)

Acting under the assumption that there will be no 1080p broadcast material in the near future (if ever), the more interesting question (to me anyway) is whether DirecTV will attempt to capitalize on the popularity of 1080p televisions by creating an HD DVR that will upconvert and output 1080p, then market it as a 1080p HD DVR. Sure, none of the broadcast formats will have changed, but they would honestly be able to advertise a "1080p HD DVR" (in the sense that it's capable of upscaling/deinterlacing the available broadcast formats to 1080p). I'm actually very curious to see if they'll do this.

And, to be honest with you, if they do it "right" and produce something that can handle those duties better than my TV (with a high-quality processor a la Reon or Faroudja or something), that would be something I might actually be interested in buying. Since the broadcast material has to be upscaled/deinterlaced anyway to display on my 1080p TV, if DirecTV can give me an HD DVR that is capable of handling those tasks WELL and outputting the result at 1080p, I'd be first in line.

Incidentally, there appears to be some confusion over this in this thread, but I believe 1080p TVs are fixed-pixel displays and display everything at 1080p, regardless of whether the source is 720p, 1080i, or even 480i/p; those "lesser" formats are scaled/deinterlaced (as the case may be) to 1080p either by the TV or by some device along the way or both depending how you have it set up (native on/off; output format; whether you have a standalone video processor; etc.). As others have said, there's a wealth of information on this at AVS.

--Mav


----------



## cartrivision (Jul 25, 2007)

jodyguercio said:


> final_thrill said:
> 
> 
> > This service at best could only ever do movies in 1080p since no cameras exist that broadcast in 1080p correct?
> ...


That is not correct. There are many video cameras that are capable of producing a true (not upconverted) 1080p image.


----------



## cygnusloop (Jan 26, 2007)

cartrivision said:


> Don't expect to be blown away by 1080p... even on bigger screens. 1080p doesn't give you any more pixels of resolution than 1080i does, it only eliminates certain interlacing artifacts that happen with interlaced video formats (such as 1080i).


I disagree. I have been truly blown away by some 1080p Blu-ray content. Not so much by other 1080p Blu-ray content. It all depends on the source itself, and the care taken in producing the disc.

As I said a few posts up, resolution is only part of the equation. The biggest advantage of Blu-ray isn't the fact that it is 1080/24p as opposed to 1080/60i, it is the fact that the bit rate is so much higher. Particularly the newer BD MPEG4 encodes at 40+ Mbps. Compare that to the <10 Mbps that you get from MPEG4 over satellite, and you can see the advantage of the 50GB discs.

IMHO, on a reasonably calibrated display, with good source content, 1080p can show a great deal of improvement over SAT delivered 1080i.


----------



## awdpaul (Nov 28, 2007)

phat78boy said:


> You are a bit off. Interlaced means you only see half (540) the specified lines(1080) at one time. With 1080P you would see all(1080) the lines every time.


I think there still might be some confusion. The screen is still displaying 1080 lines, it's just that only half of them are being updated each scan. A 1080i image without motion on screen should look the same as a 1080p image. The difference comes into play with motion, as you can see artifacting since on each pass only half of the lines are redrawn. Motion therefore isn't completely updated until the second pass (a complete screen draw). So in terms of bandwidth, 1080i does have the same amount of data as 540p. However, in terms of the image actually seen on screen, it's the same resolution as (and therefore, minus motion considerations, would look the same as) 1080p. http://en.wikipedia.org/wiki/1080i

1080p benefits people with larger screens who want the performance on motion, where the extra resolution over 720p is an advantage. The other advantage is that since so many sources (OTA, DTV) are natively 1080i, you only need to do an interlace conversion on a native 1080p screen, vs. potentially both a resolution and an interlace conversion on a 720p display showing a 1080i source. So even on a smaller display (40-60") the lack of a resolution conversion probably means more consistent quality, depending on the quality of the processors in the TV.

At the end of the day, it's what your eye catches that matters. On my 40", 720p looks really good. On my 65", 1080p is noticeably better than 720p.

I don't think 1080p would even matter for shows or regular channels on DTV, but it would be a cool concept if they had the bandwidth to offer HD DVD/Blu-ray style quality (including audio) at 1080p like the xstreamhd service is planning. I imagine if xstream does halfway decent, E* or DTV will buy them up anyway....


----------



## tflorman (Sep 20, 2007)

phat78boy said:


> You are a bit off. Interlaced means you only see half (540) the specified lines(1080) at one time. With 1080P you would see all(1080) the lines every time.


1080p is usually at 30fps and 1080i is 60fps... with a quality deinterlacer (most TVs that support 1080p have one) the TV waits for each half of the 1080i signal and then displays them both at the same time, but at a 30fps rate, which is still faster than the 24fps that most movies are shot in.

1080i TVs can have issues with fast motion that wouldn't show up on a 720p display due to this interlacing.... if I recall correctly from my AVS readings, they call it judder.

As for the original purpose of this thread, there simply isn't the bandwidth available to send 1080p signals and still provide the amount of channels we currently enjoy. I personally feel that what we currently have is (quite) good enough, and it also helps out the hd dvd/Blu-ray makers by allowing them to claim a modicum of superiority when it comes to their offerings.


----------



## cartrivision (Jul 25, 2007)

phat78boy said:


> You are a bit off. Interlaced means you only see half (540) the specified lines(1080) at one time. With 1080P you would see all(1080) the lines every time.


I know what interlaced means, and it's based on the fact that even though only half of the lines of the frame are drawn each time, as long as the refresh rate is fast enough the human eye and brain see both of the separate interlaced fields as one frame that has twice as many lines of resolution as each field. So since you don't see any more lines of resolution from a 1080p source vs. a 1080i source, the advantage of 1080p isn't increased resolution, but elimination of certain motion artifacts that are caused by interlaced video.


----------



## SteveHas (Feb 7, 2007)

god 
I love dbstalk !!!!!!!!!!!!!!!!!!!!


----------



## cartrivision (Jul 25, 2007)

cygnusloop said:


> I disagree. I have been truly blown away by some 1080p Blu-ray content. Not so much by other 1080p Blu-ray content. It all depends on the source itself, and the care taken in producing the disc.
> 
> As I said a few posts up, resolution is only part of the equation. The biggest advantage of Blu-ray isn't the fact that it is 1080/24p as opposed to 1080/60i, it is the fact that the bit rate is so much higher. Particularly the newer BD MPEG4 encodes at 40+ Mbps. Compare that to the <10 Mbps that you get from MPEG4 over satellite, and you can see that the advantage of the 50GB discs.
> 
> IMHO, on a reasonably calibrated display, with good source content, 1080p can show a great deal of improvement over SAT delivered 1080i.


You are correct, but as you point out, it's the higher bit rate that makes Blu-ray look superior, much more so than the fact that they are in 1080p.... that higher bit rate gives you less motion (and other) artifacts, but not higher resolution. If DirecTV could broadcast a 1080i program with the bitrate that Blu-ray delivers, it would look almost as fantastic as a Blu-ray DVD does, but if they could broadcast a 1080p program at twice the current bit rate (and receivers could decode it), most people wouldn't be able to see a difference compared to 1080i.


----------



## cygnusloop (Jan 26, 2007)

cartrivision said:


> I know what interlaced means, and it's based on the fact that even though only half of the lines of the frame are drawn each time, as long as the refresh rate is fast enough the human eye and brain see both of the separate interlaced fields as one frame that has twice as many lines of resolution as each field. So since you don't see any more lines of resolution from a 1080p source vs. a 1080i source, the advantage of 1080p isn't increased resolution, but elimination of certain motion artifacts that are caused by interlaced video.


This is all true. Just a few caveats, though.

Just to get our vocabulary consistent, with any interlaced video, two fields (even and odd) make up a frame. Different displays produce the frame from the two fields in different ways. The standard way, and how your CRT SD TV works, is that it scans the fields independently. It draws the even field, then the odd field, then the even field, and so on, at 60Hz, resulting in a _frame_ every 30th of a second. Only CRTs can reproduce interlaced video "natively".

Newer fixed pixel displays cannot reproduce the interlaced fields natively. They must use a deinterlacer to create frames from the fields. The problem with this is that with fast moving objects, the object will be in a different place when the two different fields were originally recorded. This results in a kind of tearing effect that can be very distracting.

Modern deinterlacers buffer the fields and use all kinds of statistical tricks to reduce the perceptibility of these artifacts. The really good ones can eliminate it almost entirely.

This is why a 1080p display with a good deinterlacer will do a superior job with 1080i source material than a 720p display will.
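
The field/frame relationship described above can be sketched in a few lines. This is a toy model: scan lines are just list elements, and this is the naive "weave" combine, not the motion-adaptive processing a real deinterlacer does:

```python
def weave(even_field, odd_field):
    """Interleave two fields into one full progressive frame.

    even_field holds scan lines 0, 2, 4, ...; odd_field holds 1, 3, 5, ...
    """
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

# Toy 8-line frame split into two 4-line fields, then rewoven:
even = [0, 2, 4, 6]
odd = [1, 3, 5, 7]
print(weave(even, odd))  # [0, 1, 2, 3, 4, 5, 6, 7]
```

If the two fields were captured 1/60th of a second apart and something moved in between, the woven frame shows the combing/tearing described above; that is exactly what motion-adaptive deinterlacers try to hide.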


----------



## cartrivision (Jul 25, 2007)

tflorman said:


> 1080p is usually at 30fps and 1080i is 60fps... with a quality deinterlacer (most tv's that support 1080p have one) the tv waits for each half of the 1080i signal then displays them both at the same time, but at a 30 fps rate, which is still faster than the 24fps that most movies are shot in.


I believe that a 1080p-capable TV means that it can display a 1080p signal at 60fps, but regardless, the most common source of a 1080p signal for the typical consumer is going to be a Blu-ray DVD of a motion picture that only has a frame rate of 24fps, so a properly deinterlaced 30fps 1080i signal can essentially deliver the same picture quality at the same frame rate as long as both have the same bit rate.


----------



## cygnusloop (Jan 26, 2007)

cartrivision said:


> You are correct, but as you point out, it's the higher bit rate that makes Blu-ray look superior, much more so than the fact that they are in 1080p.... that higher bit rate gives you less motion (and other) artifacts, but not higher resolution. If DirecTV could broadcast a 1080i program with the bitrate that Blu-ray delivers, it would look almost as fantastic as a Blu-ray DVD does, but if they could broadcast a 1080p program at twice the current bit rate (and receivers could decode it), most people wouldn't be able to see a difference compared to 1080i.


Agreed. A quality 1080/60i source will "look as good" as a quality 1080/24p source. Both have their advantages and disadvantages as far as motion is concerned. A 1080/60i source can suffer from some of the motion artifacts described above. A 1080/24p source can suffer from "judder", the phenomenon of an object skipping from one position to another on the screen, the same thing you can occasionally see in a movie theater due to the fact that 24fps is just not fast enough to capture smooth motion in some cases.


----------



## cartrivision (Jul 25, 2007)

MarkN said:


> so does that mean that 720p is actually a higher resolution than 1080i?


No. 1080i has 50% more horizontal lines of resolution and 50% more vertical lines of resolution than 720p does. 720p has a faster refresh rate for each complete frame (twice as fast as 1080i) but at the expense of having less than half the number of pixels (vs. 1080i) dedicated to displaying each frame.
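
For reference, the ratios behind that comparison work out as simple arithmetic on the standard resolutions:

```python
# 1080i vs. 720p: lines of resolution and total pixels per full frame.
horizontal = 1920 / 1280   # 1.5 -> 50% more horizontal resolution
vertical = 1080 / 720      # 1.5 -> 50% more vertical resolution

# Full-frame pixel counts: 720p has less than half the pixels of 1080i.
pixel_ratio = (1920 * 1080) / (1280 * 720)   # 2.25

print(horizontal, vertical, pixel_ratio)
```

The 2.25x full-frame pixel ratio is also the figure behind the earlier "more than twice the pixels of 720P" remark.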


----------



## Dazed & Confused (Jun 13, 2007)

Kansas Zephyr said:


> I'm not taking a side, but the argument over 720p v. 1080i is:
> 
> 720p is 1280 x 720 that's 921,600 pixels scanned 60 times per second.
> 
> ...


With all the information being transmitted here, I am shocked no one has challenged your math.

1920 X 1080 = 2,073,600


----------



## whynot83706 (Jul 27, 2006)

66stang351 said:


> There are zero(0) DIRECTV receivers that support 1080p. So the answer would be there is zero(0) possibility of DIRECTV doing a 1080p PPV or special.


I am new so I am not sure if this has been explained....since they are not supporting 1080p, what are they supporting?


----------



## cygnusloop (Jan 26, 2007)

Dazed & Confused said:


> With all the information being transmitted here, I am shocked no one has challenged your math.
> 
> 1920 X 1080 = 2,073,600


Except that his final answer is correct as 1080/60i is actually:

1920x540 @60Hz
or, alternatively, 1920x1080 @30Hz

Which is 1,036,800 pixels every 1/60th of a second either way.

Which is, amazingly, over 62.2 million pixels being transmitted every second. Wow.
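
The per-second throughput figure works out as follows (assuming 60 fields per second, as above):

```python
# One 1080i field is 1920 x 540 pixels; 60 fields arrive every second.
pixels_per_field = 1920 * 540
per_second = pixels_per_field * 60

print(f"{pixels_per_field:,} pixels/field")   # 1,036,800 pixels/field
print(f"{per_second:,} pixels/second")        # 62,208,000 pixels/second
```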


----------



## cygnusloop (Jan 26, 2007)

whynot83706 said:


> I am new so I am not sure if this has been explained....since they are not supporting 1080p, what are they supporting?


All DIRECTV channels are broadcast in one of the following formats.
480i (All SD)
720p (HD)
1080i (HD)


----------



## dms1 (Oct 26, 2007)

tflorman said:


> 1080i TVs can have issues with fast motion that wouldn't show up on a 720p display due to this interlacing.... if I recall correctly from my AVS readings, they call it judder.


Leaving aside traditional CRT technology, there is no such thing as a 1080i, or indeed anything-i, TV. All modern TVs store the image data and therefore display a whole frame at all times (I'm glossing over the wobulation technique used by some DLPs). Unfortunately, the damage done by interlacing at the source cannot be undone further down the line.


----------



## dtrell (Dec 28, 2007)

phat78boy said:


> You are a bit off. Interlaced means you only see half (540) the specified lines(1080) at one time. With 1080P you would see all(1080) the lines every time.


He's not off at all. There's still 1080 lines that your eyes see. Your eyes are too slow to pick up that there are only 540 lines on one frame and 540 on the other frame. 1080i looks the same as 1080p to your eyes. As he said, the only difference may be some motion artifacts.


----------



## Kansas Zephyr (Jun 30, 2007)

Dazed & Confused said:


> With all the information being transmitted here, I am shocked no one has challenged your math.
> 
> 1920 X 1080 = 2,073,600


That's after 2 interlaced "passes" of the screen scan, which occurs 30 times a second.

I was comparing EACH single pass of the screen scan, at 60 times a second. So, you divide that number by 2.

The number of pixels in each pass of the screen scan that I quoted, for 720p vs. 1080i, was correct.

Sorry, I didn't make that more clear in my post.


----------



## cartrivision (Jul 25, 2007)

Kansas Zephyr said:


> I'm not taking a side, but the argument over 720p v. 1080i is:
> 
> 720p is 1280 x 720 that's 921,600 pixels scanned 60 times per second.
> 
> ...


That last paragraph is incorrect. Every scan line of a 1080i picture is unique. There is no line doubling. In 1080i, two separate fields of 540 lines each are combined into one frame that contains 1080 unique lines.


----------



## Kansas Zephyr (Jun 30, 2007)

cartrivision said:


> That last paragraph is incorrect. Every scan line of a 1080i picture is unique. There is no line doubling. In 1080i, two separate fields of 540 lines each are combined into one frame that contains 1080 unique lines.


Only if the 1080p HDTV is updating, or scanning, at 30fps; if it's updating at 60fps, then it's 540p "line-doubled" (when up-converting a 1080i source).

A 1080i HDTV is scanning at 30fps, or one "half-image" 540 lines, once every 1/60th of a second.


----------



## cartrivision (Jul 25, 2007)

Kansas Zephyr said:


> Only if the 1080p HDTV is updating "or scanning" at 30fps, if it's updating at 60fps then it's 540p "line-doubled" (when up-converting a 1080i source).
> 
> A 1080i HDTV is scanning at 30fps, or one "half-image" 540 lines, once every 1/60th of a second.


I don't know about your TV, but mine doesn't line double the fields of a 1080i signal. That would result in the display of only half of the vertical resolution that a 1080i signal is capable of displaying.... the same vertical resolution as a 540p signal. When I display a resolution test pattern on my TV using various output formats, I get twice the vertical resolution from a 1080i signal as I do from a 540p signal.


----------



## Kansas Zephyr (Jun 30, 2007)

cartrivision said:


> I don't know about your TV, but mine doesn't line double the fields of a 1080i signal. That would result in the display of only half of the vertical resolution that a 1080i signal is capable of displaying.... the same vertical resolution as a 540p signal. When I display a resolution test pattern on my TV using various output formats, I get twice the vertical resolution from a 1080i signal as I do from a 540p signal.


What are your fps rates?
If you have a 1080p HDTV, displaying a 1080i native source, you either have:

at 30fps: 1080 lines per scan (both 1/60th-second "half-images" combined and rendered as one scan)

OR

at 60fps: 1080 lines per scan (each 540-line half-image "line doubled" and rendered as one scan)

You can't create information that isn't there, when displaying a 1080i source on a 1080p HDTV.


----------



## cygnusloop (Jan 26, 2007)

Kansas Zephyr said:


> You can't create information that isn't there, when displaying a 1080i source on a 1080p HDTV.


Actually, you can... Sort of.

And that is in fact what most modern 1080p displays do. Line doubling alone is seldom used anymore as it does, in fact, do what cartrivision stated. It essentially halves the vertical resolution.

A good, modern deinterlacer will use some combination of edge detection, motion-adaptive blending, and other motion-compensation techniques to 'create information' by making good statistical guesses. Using these interpolations, each frame delivered at 60Hz is a unique, full 1080-line frame. In many cases, this can work very well. If the image is static, like a test pattern, it will be nearly perfect. If the source was created using the telecine process to convert 24fps film to 1080/60i, a good inverse-telecine deinterlace process can recreate the original almost perfectly. (This is what the "film mode" or 3:2 pulldown option that you may have on your TV does.)

In most cases, with satellite delivered 1080i programming, the compression artifacts will far, far, far outweigh any motion artifacts induced by a reasonably good deinterlace processor. All that said, however, interlace/deinterlace is a fundamentally lossy process.
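The resolution difference between "bob" (plain line doubling) and a proper weave is easy to demonstrate with a static test pattern. This is a toy NumPy sketch, not how any TV actually implements it; the function names and the test pattern are made up for illustration:

```python
import numpy as np

def split_fields(frame):
    """Split a progressive frame into its two interlaced fields."""
    return frame[0::2], frame[1::2]   # top field (rows 0,2,4...), bottom field

def weave(top_field, bottom_field):
    """Weave: interleave the two fields back into one full-height frame."""
    h = top_field.shape[0] + bottom_field.shape[0]
    out = np.empty((h, top_field.shape[1]), dtype=top_field.dtype)
    out[0::2] = top_field
    out[1::2] = bottom_field
    return out

def bob(field):
    """Bob / line doubling: repeat each field line, halving vertical detail."""
    return np.repeat(field, 2, axis=0)

# A static test pattern: alternating black/white lines (maximum vertical detail)
pattern = np.tile(np.array([[0], [255]], dtype=np.uint8), (540, 1920))
top, bottom = split_fields(pattern)

# Weave recovers the pattern exactly; bob flattens it to a solid field.
assert np.array_equal(weave(top, bottom), pattern)
assert len(np.unique(bob(top))) == 1    # all output lines identical: detail gone
```

On a static alternating-line pattern the weave is pixel-perfect while the bob output has lost all vertical detail, which is exactly why the test-pattern measurements discussed in this thread can tell the two methods apart.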


----------



## Kansas Zephyr (Jun 30, 2007)

cygnusloop said:


> Actually, you can... Sort of.
> 
> And that is in fact what most modern 1080p displays do. Line doubling alone is seldom used anymore as it does, in fact, do what cartrivision stated. It essentially halves the vertical resolution.
> 
> ...


"Sort of" is the key here. The term "line-doubling", in my lexicon, includes the techniques used in up-scaling.

So, yes, you are right in that respect.

So am I. My point was that unless the source is 1080p you are not getting a true (read exact) reproduction on a 1080p HDTV.


----------



## cygnusloop (Jan 26, 2007)

Kansas Zephyr said:


> ...My point was that unless the source is 1080p you are not getting a true (read exact) reproduction on a 1080p HDTV.


That I agree with, as I said, the interlace/deinterlace process is fundamentally lossy.

The one exception is the deinterlacing done by a proper inverse telecine process if the source was properly interlaced in the first place. This is the one instance where "exact" reconstruction of the progressive source material is, in fact, possible (and routinely done with reasonably priced consumer gear).
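The "exact reconstruction" claim can be illustrated with a toy bookkeeping model of the 2:3 pulldown cadence (the helper names are hypothetical; real inverse telecine works on pixels and has to detect the cadence phase itself):

```python
def telecine_23(frames):
    """2:3 pulldown: map 4 film frames to 10 interlaced fields (24p -> 60i).
    A field is (frame_id, 't'|'b'); field parity must alternate t,b,t,b,..."""
    a, b, c, d = frames
    return [(a, 't'), (a, 'b'),
            (b, 't'), (b, 'b'), (b, 't'),   # B's top field is repeated
            (c, 'b'), (c, 't'),
            (d, 'b'), (d, 't'), (d, 'b')]   # D's bottom field is repeated

def inverse_telecine_23(fields):
    """Undo the 2:3 cadence: drop the repeated fields and re-pair the rest.
    Assumes cadence lock at position 0 (real IVTC must find the phase)."""
    a = fields[0][0]
    b = fields[2][0]
    c = fields[5][0]
    d = fields[7][0]
    return [a, b, c, d]

frames = ['A', 'B', 'C', 'D']
assert inverse_telecine_23(telecine_23(frames)) == frames  # exact round trip
```

Because every original film frame survives intact in the field stream, dropping the repeats and re-pairing the fields recovers the 24p source exactly; no pixels are ever guessed.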


----------



## cartrivision (Jul 25, 2007)

Kansas Zephyr said:


> What are your fps rates?
> If you have a 1080p HDTV, displaying a 1080i native source, you either have:
> 
> at 30fps: 1080 lines per scan (both 1/60th-second "half-images" combined and rendered as one scan)
> ...


Since my TV isn't throwing out half of the vertical resolution that is present in a 1080i signal (as confirmed with a resolution test pattern), I assume that my TV is properly de-interlacing the two 540-line fields, which arrive at a rate of one field per 1/60 second, into a frame of 1080 unique vertical lines of resolution (a process that obviously takes two field periods, or 1/30 second, to complete). And since the refresh rate of the TV is 60Hz, each de-interlaced frame is displayed twice so that the 30Hz frame rate corresponds with the TV's 60Hz refresh rate.

Of course a less desirable way of doing it (and the way you insinuated that it was always done) would be to just line double every line of every 540 line field and display each line doubled field for 1/60 of a second, but doing it that way essentially loses half of the vertical resolution that is present in a 1080i signal.

I found a test report from 2005 in which 54 HDTVs were tested, and about half of the tested TVs used the line-doubling method and improperly displayed only half of the vertical resolution present in a 1080i signal. I assume that in 2008 the percentage of TVs that properly de-interlace and display the full vertical resolution of a 1080i signal is better than the roughly 50% that did it properly in 2005.


----------



## cartrivision (Jul 25, 2007)

Kansas Zephyr said:


> "Sort of" is the key here. The term "line-doubling", in my lexicon, includes the techniques used in up-scaling.
> 
> So, yes, you are right in that respect.
> 
> So am I. My point was that unless the source is 1080p you are not getting a true (read exact) reproduction on a 1080p HDTV.


Actually, your original point/statement (the one that I took exception to) was that each 540 line field of a 1080i signal is always just line doubled for display on a 1080p display and that therefore a 1080i signal only delivered 540 lines of vertical resolution. That's a lot different than your latest statement that a 1080i signal can't _*exactly*_ replicate what a 1080p signal can deliver. Nobody here ever claimed that a 1080i signal could without exception exactly reproduce what a 1080p signal can deliver (although in some cases, it can).


----------



## cartrivision (Jul 25, 2007)

cygnusloop said:


> Actually, you can... Sort of.
> 
> And that is in fact what most modern 1080p displays do. Line doubling alone is seldom used anymore as it does, in fact, do what cartrivision stated. It essentially halves the vertical resolution.
> 
> ...


cygnus, do you think that the above methods are typically done in today's $1500-3000 HD TV sets, or that they simply (and more cheaply) just interweave every 2 fields into a 1080 line frame and display it for 1/30 of a second? Or perhaps just combine every 2 adjacent fields into a 1080 line frame and display each of those frames for 1/60 second?


----------



## cygnusloop (Jan 26, 2007)

cartrivision said:


> cygnus, do you think that the above methods are typically done in today's $1500-3000 HD TV sets, or that they simply (and more cheaply) just interweave every 2 fields into a 1080 line frame and display it for 1/30 of a second? Or perhaps just combine every 2 adjacent fields into a 1080 line frame and display each of those frames for 1/60 second?


I think the number of reasonably priced displays capable of doing so is increasing. Here is an article by Gary Merson discussing exactly what we are talking about for a lineup of 2007 models. He claims the numbers improved significantly from 2005>2006>2007. The article includes a list of the tested displays and which displays passed or failed the different tests. Very informative article.

http://www.hometheatermag.com/hookmeup/1107hook2/


----------



## Stuart Sweet (Jun 19, 2006)

I feel it's important to mention...

There's a lot of technical stuff here and it's all very sound. However, the most important thing to you is going to be how it looks to you. Whatever you're planning on getting, see it in action and decide how it looks to you.


----------



## cygnusloop (Jan 26, 2007)

^^^

While I certainly agree with the gist of what Shadow is saying, some things are very difficult to ascertain in the store. This is due to the lack of proper source material in most stores, bad lighting, and sets in "torch mode" so that they look brighter and more vibrant than the sets next to them (which is often perceived as "better" at first glance).

This, coupled with the fact that most first time HDTV buyers don't really know what to look for in the first place, can lead to some less than ideal purchases. What "looks best to you" at first glance in the store is not necessarily what will look best to you in your home, or in your home 6 months hence.

I certainly don't recommend that someone buy a set based on the spec sheet, but your greatest tool when shopping for a new HDTV is education. Threads like this, the hundreds and hundreds like it over at AVS Forums, and things like the Gary Merson article linked above are part of that educational process.

The three main types of HDTV's out there now (LCD, Plasma, DLP) all have their own "look" with their own advantages, disadvantages and quirks. They all have their own price point per feature set and screen size. There is really no way to know which type of set is going to be the best "fit" for you without going out there and looking at a lot of sets.

Someone who has, or plans to get a STB like a DIRECTV H/HR2x, will be doing most of their HDTV viewing with a 1080i source. How their brand new 1080p HDTV will deinterlace and display that content should be an important part of the decision making process. It is simply something you can't do by casually "looking" in the store. It takes some research as well.

The point of threads like this is to help ensure that the HDTV that looks good to you in the store looks good to you when you get it home and hook it up to _your_ equipment, and still looks good to you a year or two later.

When you arm yourself with information, "how it looks to you" will have a great deal more meaning.


----------



## Kansas Zephyr (Jun 30, 2007)

cartrivision said:


> Actually, your original point/statement (the one that I took exception to) was that each 540 line field of a 1080i signal is always just line doubled for display on a 1080p display and that therefore a 1080i signal only delivered 540 lines of vertical resolution. That's a lot different than your latest statement that a 1080i signal can't _*exactly*_ replicate what a 1080p signal can deliver. Nobody here ever claimed that a 1080i signal could without exception exactly reproduce what a 1080p signal can deliver (although in some cases, it can).


The implied intent of my initial statement was that you are only getting 540 lines of "real data", and that the other 540 must be interpolated/created by the up-scaling process.

Not you. But some others have implied that the information exists within the 1080i broadcast to produce a true 1080p image.


----------



## gio12 (Jul 31, 2006)

Bugg77 said:


> Not trying to tinkle in your cereal, but there are a number of "experts" who say 1080p is worthless for displays under 60".
> 
> I wish they would have said that before I bought my 46" 1080p TV! I could have saved at least $500!
> 
> On second thought, it would have cost me more because I would have run out and bought a 70" 1080p just to be safe!


Unless you're sitting like 6ft or closer, the human eye basically cannot detect the difference between 720p and 1080p on sets smaller than 47".

Like those new Triple HD sets that are looming. Those resolutions are higher than the human eye can see!

A good rule of thumb is this: at 47" and under, go 720p, unless you buy a 1080p set and will sit closer than 6ft.

Over 47", go with 1080p, especially if you are going to get a BR or HD-DVD player.

I have viewed a 1080p BR on a 1080p 42" LCD and a 720p HD-DVD on my plasma. I could NOT tell a difference in either picture. I sit about 9ft away. Now when getting close, like 3ft, maybe, maybe....

Now my buddy has a 92" LCD projection in 720p. It looks like a good SD picture at that size. When it's down to 55" it looks really nice. BR or HD via OTA/MPEG-4, the picture does look the same.

He mentioned a 1080p projector for his HT was waaaay too much. Sadly, I think his picture sucks.

Think of it like a printed photo. A 3MP picture will print nicely at 4x6. Blow it up to 8x10 and it looks OK. Now take the same picture with a 7-10MP camera. The 4x6 will look the same, but blow it up to 8x10 or larger and it will still look just as good.

It's basically the same with TV screens. A 35" 1080p TV is a waste of money; a BR/HD-DVD will look the same as a 720p OTA broadcast.


----------



## final_thrill (Jun 5, 2006)

cartrivision said:


> That is not correct. There are many video cameras that are capable of producing a true (not upconverted) 1080p image.


I was under the impression that broadcast equipment capable of 1080p hasn't even been invented yet. Can you shed more light on this?

I also have another question. I have a Sammy FPT5084 1080p, and my eye tells me that the TV does a much better job with the 1080i channels than with the 720p channels. There are more compression or motion artifacts (not sure which) visible on the 720p channels. I assumed that this was because the 1080i channels are closer to the TV's native res than the 720p channels. However, after reading this thread, shouldn't the 720p channels look better than the 1080i channels, because the TV doesn't have to de-interlace?

Please clear up my confusion,
Thanks


----------



## sarfdawg (Jan 21, 2007)

Holy cow...thanks to all of you for all the info. You answered a bunch of questions that I didn't even know to ask. Thank you again!


----------



## Kansas Zephyr (Jun 30, 2007)

final_thrill said:


> I was under the impression that broadcast equipment capable of 1080p hasn't even been invented yet. Can you shed more light on this?


There are 1080p cameras and recording equipment.

There is no big push for OTA 1080p broadcast equipment, since it can't fit into the bandwidth currently allocated to local stations.


----------



## cygnusloop (Jan 26, 2007)

Kansas Zephyr said:


> There are 1080p cameras and recording equipment.
> 
> There is no big push for OTA 1080p broadcast equipment, since it can't fit into the bandwidth currently allocated to local stations.


Nor is allocating the necessary bandwidth practical for any existing cable or satellite provider.


----------



## phat78boy (Sep 12, 2007)

1080i, while displaying 1080 perceivable lines, is really 540. Now, if you challenged me to see only 540 lines, there is no way I could do that. What sticks out to me is motion scenes and sports, which I'm sure is the main reason the big sports networks use 720p. I actually prefer 720p over 1080i; I see many more artifacts and more blurring with 1080i. This is because that split second between movement or screen changes is where those 540 lines get crossed. With a progressive image you will not get that. 1080p would be even better, as there are more lines and thus greater detail, along with no "blurring".


----------



## phat78boy (Sep 12, 2007)

gio12 said:


> Unless you're sitting like 6ft or closer, the human eye basically cannot detect the difference between 720p and 1080p on sets smaller than 47".
> 
> Like those new Triple HD sets that are looming. Those resolutions are higher than the human eye can see!
> 
> ...


It's all about room light and viewing distance for HT projectors. If you are far enough back and the room is not overly bright, an HD projector should look very good. Also, proper configuration of a projector is a must. Older ones (even a year or two old) need a lot of tweaking. The newer ones have much better tools for this.

1080p projectors are not as bad as they were even a year ago. A very nice 1080p model can be had for $3000. That's cheaper than most 50" high-end flat panels.


----------



## gio12 (Jul 31, 2006)

phat78boy said:


> It's all about room light and viewing distance for HT projectors. If you are far enough back and the room is not overly bright, an HD projector should look very good. Also, proper configuration of a projector is a must. Older ones (even a year or two old) need a lot of tweaking. The newer ones have much better tools for this.
> 
> 1080p projectors are not as bad as they were even a year ago. A very nice 1080p model can be had for $3000. That's cheaper than most 50" high-end flat panels.


Well, this guy has gone through the roof with his HT. You sit about 7ft away in the first row and 10ft in the second. The projector is tweaked and calibrated. His screen is one of the nicest out there in quality. The lighting is well, well thought out.

The projector was bought last year and cost around $4000. The 1080p model was close to $10K.

Again, it looks good, but at 96" it's not as nice as a well-calibrated plasma.


----------



## cartrivision (Jul 25, 2007)

Kansas Zephyr said:


> The implied intent of my initial statement was that you are only getting 540 lines of "real data", and that the other 540 must be interpolated/created by the up-scaling process.
> 
> Not you. But some others have intimated that the information exists within the 1080i broadcast to produce a true 1080p image.


I understand what you are saying, but the way you are saying it may be misleading to some people, and to say that we only get 540 lines of "real data" for every frame of a 1080i picture is incorrect. It *might* be displayed that way if your TV doesn't de-interlace the 1080i signal correctly, but a correctly de-interlaced 1080i signal will yield a frame that has 1080 "real" (i.e. unique) lines of vertical resolution (at half the frame rate of 1080p), and it is capable of displaying the same vertical resolution as a 1080p signal.

Additionally, as cygnusloop pointed out, for 24fps film source, properly de-interlaced 1080i can provide the same reproduction of the source material as 1080p does. That doesn't mean that 1080i can give you everything that 1080p does in all cases. When the source is live video rather than 24fps film, yes, there can be some interpolation done in the de-interlacing process, but that interpolation is not to create 1080 lines out of 540 lines&#8230; there are still 1080 unique lines present in the signal that make up the 1080 "real" lines of a 1080i frame to start with, so no additional lines have to be created through interpolation.

What cygnusloop was saying about interpolation is that some advanced de-interlacers may, in addition to weaving two 540-line fields together into a true 1080-line frame, do further processing on those 1080 unique lines to try to smooth out things like motion artifacts that are a side effect of interlaced video. But that process starts out with 1080 unique lines and has nothing to do with creating an additional 540 interpolated lines from 540 "real" lines of data.


----------



## phat78boy (Sep 12, 2007)

gio12 said:


> Well, this guy has gone through the roof with his HT. You sit about 7ft away in the first row and 10ft in the second. The projector is tweaked and calibrated. His screen is one of the nicest out there in quality. The lighting is well, well thought out.
> 
> The projector was bought last year and cost around $4000. The 1080p model was close to $10K.
> 
> Again, it looks good, but at 96" it's not as nice as a well-calibrated plasma.


I get more favorable responses from my projector than I do from any of my other big screens. I have a DLP, an LCD, and a plasma that are all 1080p and 50" or larger, and most people enjoy the projector the most. It could be just the ambience of the whole room, but the picture is outstanding.

Also, the Sony Pearl (1080p) was only $5000 last year and could actually be had cheaper if bought through discount retailers. I'm not sure what projector brands he was looking at, but there were a lot of good 1080p projectors for around $8000 last year. This year that price has already fallen to $3000 for the same high quality.


----------



## gio12 (Jul 31, 2006)

phat78boy said:


> I get more favorable responses from my projector than I do from any of my other big screens. I have a DLP, an LCD, and a plasma that are all 1080p and 50" or larger, and most people enjoy the projector the most. It could be just the ambience of the whole room, but the picture is outstanding.
> 
> Also, the Sony Pearl (1080p) was only $5000 last year and could actually be had cheaper if bought through discount retailers. I'm not sure what projector brands he was looking at, but there were a lot of good 1080p projectors for around $8000 last year. This year that price has already fallen to $3000 for the same high quality.


I know prices came down this year. But last year his 720p cost was high enough.

He has gone all out in this room. It's been soundproofed and acoustically treated: sound panels, etc. Most of the equipment is about as good as it gets unless you are talking a multi-million dollar setup.

Hell, his sub was as much as the projector.

He said it's been adjusted again lately and he has changed the lighting, but I was not impressed. Regular DVDs look pretty crappy, IMO. BR does look nice, but I cannot see all the hype on a projector. Then again, a plasma that big costs too much anyway, so this was the only way to go over 75" within a reasonable cost.


----------



## Kansas Zephyr (Jun 30, 2007)

cartrivision said:


> I understand what you are saying, but the way you are saying it may be misleading to some people, and to say that we only get 540 lines of "real data" for every frame of a 1080i picture is incorrect. It *might* be displayed that way if your TV doesn't de-interlace the 1080i signal correctly, but a correctly de-interlaced 1080i signal will yield a frame that has 1080 "real" (i.e. unique) lines of vertical resolution (at half the frame rate of 1080p), and it is capable of displaying the same vertical resolution as a 1080p signal. Additionally, as cygnusloop pointed out, for 24fps film source, properly de-interlaced 1080i can provide the same reproduction of the source material as 1080p does. That doesn't mean that 1080i can give you everything that 1080p does in all cases. When the source is live video rather than 24fps film, yes, there can be some interpolation done in the de-interlacing process, but that interpolation is not to create 1080 lines out of 540 lines&#8230; there are still 1080 unique lines present in the signal that make up the 1080 "real" lines of a 1080i frame to start with, so no additional lines have to be created through interpolation. What cygnusloop was saying about interpolation is that some advanced de-interlacers may, in addition to weaving two 540-line fields together into a true 1080-line frame, do further processing on those 1080 unique lines to try to smooth out things like motion artifacts that are a side effect of interlaced video, but that process starts out with 1080 unique lines and has nothing to do with creating an additional 540 interpolated lines from 540 "real" lines of data.


Sorry, but if it's interpolated, it's still not "real" data, even if it's unique (unlike a 1080p source like Blu-ray, etc.). It may be damn good, or near perfect, but it's still being created (even if using information from the previous 540-line "half-frame" to create the next up-scaled 1080p image).

If 1080i only needs a 1080p HDTV to create perfect 1080p images, why develop 1080p sources like Blu-ray and HD-DVD? Since, you seem to say, there is no difference in PQ.


----------



## cartrivision (Jul 25, 2007)

Kansas Zephyr said:


> Sorry, but if it's interpolated, it's still not "real" data, even if it's unique (unlike a 1080p source like Blu-ray, etc.). It may be damn good, or near perfect, but it's still being created (even if using information from the previous 540-line "half-frame" to create the next up-scaled 1080p image).
> 
> If 1080i only needs a 1080p HDTV to create perfect 1080p images, why develop 1080p sources like Blu-ray and HD-DVD? Since, you seem to say, there is no difference in PQ.


I don't keep saying that there is never any difference, I keep saying that you are wrong in insisting that there is *always* a difference. In _*some*_ cases 1080i can deliver the exact same results that 1080p can, and in some cases it can't.

The main problem is that you keep trying to insinuate that going from 1080i to 1080p involves being "upscaled" from a source that only has 540 lines of vertical resolution, to "create" another 540 lines, when that is not even close to what is happening when 1080i video is properly de-interlaced. 1080i video has the same vertical resolution as 1080p video, and no upscaling or interpolation is necessary to achieve that resolution.

The only interpolation that is sometimes done on 1080i video has nothing to do with increasing the lines of resolution from 540 to 1080 (it already has 1080 lines of vertical resolution); it is done to remove other undesirable artifacts that are sometimes (but not always) present in 30fps interlaced video. In some cases, such as the reproduction of 24fps film source, no such interpolation is necessary at all, and in that case a 1080i signal can reproduce the exact same so-called "real" data that a 1080p signal does.... not just damn good.... not just near perfect.... but the same picture that would have been delivered by using 1080p.... and no interpolation required.


----------



## Kansas Zephyr (Jun 30, 2007)

Only for a perfectly still image...

In that case the up-scaling is using the last "half-image" (lines 1,3,5,7...1079) to fill in between lines 2,4,6,8...1080 of the next. Since there was no change in the image, then yes, it's a perfect copy.

I don't know of many, if any, real-world TV programs that are static.

Otherwise, it's taking that same info and creating a best-fit guess for the 540 lines that have no new "real" information available.

Every time the 1080p HDTV scans 1080 lines at 60fps, the data from a 1080i source is only providing 540 lines of new information. The other 540 are the product of the HDTV's up-scaling process. I'm not saying this is a bad thing. I do own a 1080p HDTV.

If it's scanning at 30fps, then yes, the HDTV "waits" for two "half-frames," combines them, and displays a true 1080p image.


----------



## P Smith (Jul 25, 2002)

Oh man !
"*The other 540 are the product of the HDTV's up-scaling process*."
You are a product of YOUR definitions.


----------



## Kansas Zephyr (Jun 30, 2007)

P Smith said:


> Oh man !
> "*The other 540 are the product of the HDTV's up-scaling process*."
> You are a product of YOUR definitions.


Main Entry: prod·uct
Pronunciation: \ˈprä-dəkt\
Function: noun
1 : the result of work or thought
2 a : the output of an industry or firm b : a thing created by manufacturing
3 in the civil law of Louisiana : something (as timber or a mineral) that is derived from something else and that diminishes the substance of the thing from which it is derived -compare FRUIT 2a

Geez...I thought the output created by the HDTV's up-scaling can be considered its product.

Uncle!


----------



## P Smith (Jul 25, 2002)

The HDTV doesn't "produce" those 'other' 540 lines; those lines come from the other field, by collecting packets from the MPEG stream, decompressing the I/B/P frames, and updating the video buffer rows dedicated to those lines (without going too deep into the details).


----------



## flipptyfloppity (Aug 20, 2007)

1080i is 1080 lines, not 540. Yes, it has less resolution, but what it has less of is TEMPORAL resolution. That means it is sending 1080 lines, just not 60 times a second.

Given that movies are only 24 frames a second anyway, in practice, with a good TV, there will be no visible difference between 1080i and 1080p. Any true native 1080p/60 material cannot be fully sent over 1080i without throwing away half the temporal information, however.


----------



## Kansas Zephyr (Jun 30, 2007)

flipptyfloppity said:


> 1080i is 1080 lines, not 540. Yes, it has less resolution, but what it has less of is TEMPORAL resolution. That means it is sending 1080 lines, just not 60 times a second.
> 
> Given that movies are only 24 frames a second anyway, in practice, with a good TV, there will be no visible difference between 1080i and 1080p. Any true native 1080p/60 material cannot be fully sent over 1080i without throwing away half the temporal information, however.


I've never said that 1080i isn't 1080 lines. This back and forth was started over how a 1080i OTA broadcast is displayed on a 1080p HDTV.

What about live sports at 30fps (CBS/NBC Sports)? Not movies.

There are 540 lines of information sent every 1/60th of a second during a 1080i OTA broadcast.

Two of these "half-images" (540) are combined for a *30fps* "full-image" (1080).

True or False?

If a 1080p HDTV is refreshing at *60fps*, only 540 of those 1080 lines can be from the newest data broadcast at 1080i. The other 540 lines are created by the 1080p HDTV's up-scaling process.

True or False?


----------



## P Smith (Jul 25, 2002)

"The other 540 lines are created by the 1080p HDTV's up-scaling process."
FALSE!
The HDTV's processor refreshes the FULL video buffer at full size: 1920 by 1080 pixels.
The two fields (odd and even rows) come into the BUFFER at a 1/30s tempo; visualization is done from the buffer at a 1/60s rate.


----------



## cygnusloop (Jan 26, 2007)

The enigmatic P Smith is correct.

The real issue is: how does the processor deal with the fact that the two fields were recorded 1/60th of a second apart? The (LCD, DLP, or plasma) 1080p TV will refresh the entire frame at 60Hz. At each refresh it has one "fresh" field to use and one field that is 1/60th of a second old. On the next refresh, the alternate field is fresh, and the one that was new is now 1/60th of a second old.

If there is little to no motion in the scene, just displaying the new field and the field that is 1/60th of a second old will look very good.

Once you add motion, the "smarter" TVs will modify the older field for the current refresh in order to compensate for the motion. Some TVs do a better job of this than others. So who is still not quite getting this?

Of course if you have an HDTV with a 120Hz refresh rate, things get even more fun.
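A rough sketch of that per-pixel motion-adaptive idea in Python/NumPy. This is a simplified illustration (the threshold, the field bookkeeping, and the spatial interpolator are all made up for the demo), not any vendor's actual algorithm:

```python
import numpy as np

def motion_adaptive_deinterlace(fresh_field, stale_field, prev_same_parity,
                                thresh=10):
    """Per-pixel motion-adaptive deinterlace (simplified sketch).
    fresh_field:      the field just received (its lines are trusted as-is)
    stale_field:      the opposite-parity field, 1/60 s old
    prev_same_parity: the previous field of the SAME parity as stale_field,
                      used only to detect motion on those lines
    Where the stale lines changed little, weave them in (full resolution);
    where they moved, replace them with a spatial estimate from fresh lines."""
    motion = np.abs(stale_field.astype(int) - prev_same_parity.astype(int))
    # spatial estimate: average of the fresh line above and the one below
    f = fresh_field.astype(int)
    spatial = (f + np.roll(f, -1, axis=0)) // 2
    stale_out = np.where(motion > thresh, spatial, stale_field).astype(np.uint8)
    # interleave: fresh lines on even rows, reconstructed stale lines on odd
    # (assumes fresh_field is the top field, for simplicity)
    h, w = fresh_field.shape
    frame = np.empty((2 * h, w), dtype=np.uint8)
    frame[0::2] = fresh_field
    frame[1::2] = stale_out
    return frame
```

Static regions keep the woven full-resolution detail, while moving regions fall back to a spatial estimate built from the fresh field's neighboring lines, which is the trade-off being debated in this thread.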


----------



## P Smith (Jul 25, 2002)

Just imagine a video memory buffer with two buses: one for writing (from stream processing), the other (independent) for reading and converting pixel info into analog (component, composite, S-Video) or digital (HDMI, DVI) signals.


----------



## gregjones (Sep 20, 2007)

flipptyfloppity said:


> 1080i is 1080 lines, not 540. Yes, it has less resolution, but what it has less of is TEMPORAL resolution. That means it is sending 1080 lines, just not 60 times a second.
> 
> Given that movies are only 24 frames a second anyway, in practice, with a good TV, there will be no visible difference between 1080i and 1080p. Any true native 1080p/60 material cannot be fully sent over 1080i without throwing away half the temporal information, however.


But even at less temporal resolution 1920x1080i is more information than 1280x720p. That is almost always overlooked. 1080i still represents more information per second.
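The raw numbers behind that point are easy to check. These are uncompressed active-sample rates per second (actual broadcast bit rates depend on the MPEG encoder, so this is only the information ceiling):

```python
# Raw active-pixel rates per second for common ATSC HD formats
# (uncompressed sample counts; delivered bit rates differ after compression).
formats = {
    "1080i60": 1920 * 1080 * 30,   # 1080 lines per frame, 30 full frames/s
    "720p60":  1280 * 720 * 60,    # 720 lines per frame, 60 frames/s
    "1080p60": 1920 * 1080 * 60,   # hypothetical full-rate progressive
}
assert formats["1080i60"] == 62_208_000
assert formats["720p60"] == 55_296_000
assert formats["1080i60"] > formats["720p60"]        # gregjones's point
assert formats["1080p60"] == 2 * formats["1080i60"]  # why 1080p needs more
```

So 1080i/60 carries roughly 12% more raw samples per second than 720p/60, while true 1080p/60 would need double the 1080i rate, which is exactly the bandwidth problem raised at the top of the thread.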


----------



## greenwave (Oct 23, 2006)

Bottom line it for me: I have a 1080p Sony XBR5 46" LCD. What is going to look better for me, having the HR20 set on 1080i or 720p output? I know it is "in the eye of the beholder", but I just want to know which you honestly think will consistently look the best. I do love my HD sports for what its worth.

And yes, my LCD has a 120hz refresh rate.


----------



## cygnusloop (Jan 26, 2007)

greenwave said:


> Bottom line it for me: I have a 1080p Sony XBR5 46" LCD. What is going to look better for me, having the HR20 set on 1080i or 720p output? I know it is "in the eye of the beholder", but I just want to know which you honestly think will consistently look the best. I do love my HD sports for what its worth.
> 
> And yes, my LCD has a 120hz refresh rate.


The answer is both.

Set your HR20 to Native=ON, and at least 720p and 1080i selected. The progressive 720 stream will then only need to be scaled to 1080p, which your TV will do very well. (As opposed to the HR20 scaling it, then interlacing it, sending it to your TV to be deinterlaced - less processing is almost always better than more processing).

The 1080i channels will be sent straight to your TV for deinterlacing, which should also look great. The debate about whether or not 720p is better than 1080i for sports has to do with the broadcaster. That decision is, however, already made for you. You will likely do best by giving the native resolution to your TV, and letting it do what it has to do to get it to 1080p.

Whether or not you want to include 480i/p is a personal choice, and the subject of many, many, many threads here.


----------



## Canis Lupus (Oct 16, 2006)

FWIW - I've had the best viewing experience on my Sammy 1080p DLP using 720p. I've found the combination of the DLP and keeping the whole delivery system in progressive has yielded the best results, especially with increased action/motion. 

As always though - eye of the beholder.


----------



## veryoldschool (Dec 10, 2006)

greenwave said:


> Bottom line it for me: I have a 1080p Sony XBR5 46" LCD. What is going to look better for me, having the HR20 set to 1080i or 720p output? I know it is "in the eye of the beholder," but I just want to know which you honestly think will consistently look best. I do love my HD sports, for what it's worth.
> 
> And yes, my LCD has a 120hz refresh rate.


As another Sony XBR owner, all resolutions selected and native on is the only way to go.


----------



## Canis Lupus (Oct 16, 2006)

How'd this one make it through? :eek2: :lol:



veryoldschool said:


> Ass


----------



## cartrivision (Jul 25, 2007)

Kansas Zephyr said:


> Only for a perfectly still image...
> 
> In that case the up-scaling is using the last "half-image" (lines 1,3,5,7...1079) to fill-in between lines 2,4,6,8...1080 on the next. Since there was no change in the image, then yes, it's a perfect copy.
> 
> ...


Only for a perfectly still image??? Once again you are wrong. As several people have pointed out, 24fps moving images (you know, like virtually every theatrical motion picture release that you might see on HBO or Showtime) can be reproduced with 1080i just as well as with 1080p, and *without* any "best fit" guessing or interpolation, just by using the pure "real" data present in a 1080i signal. In that very common use of 1080i, playback of properly deinterlaced 24fps source material encoded with 1080i will yield the exact same end result that using 1080p would give you.


----------



## cygnusloop (Jan 26, 2007)

cartrivision said:


> ... 24fps moving images (you know, like virtually every theatrical motion picture release that you might see on HBO or Showtime) can be reproduced with 1080i just as well as with 1080p, and *without* any "best fit" guessing or interpolation, just by using the pure "real" data present in a 1080i signal. In that very common use of 1080i, playback of properly deinterlaced 24fps source material encoded with 1080i will yield the exact same end result that using 1080p would give you.


You are absolutely correct, however, the operative phrase is *can be reproduced*. Unfortunately most displays don't (yet) do this properly.

From Gary Merson's article...



> ...A good processor should recognize when there is a 3:2 sequence in the video signal and only combine like fields (say, the first two fields that are from just the first film frame). This process is called inverse telecine. Done right, you'll see everything from the original film frame. That's worth repeating. As far as film-based content is concerned, as long as the 1080i is deinterlaced properly, it will appear identical to the original 1080p content. Done incorrectly, you can have artifacts, or worse, a loss of resolution when anything on the screen moves....
> 
> ...The failure rate (of the 74 2007 model year displays tested) for proper 3:2 processing is still very poor at 81.09 percent....


----------



## blueline (Dec 7, 2007)

But aren't LCD displays non-interlaced panels by default? What I don't understand is how an interlaced signal is being displayed on an LCD, since there are no horizontal scan lines.


----------



## P Smith (Jul 25, 2002)

blueline said:


> But aren't LCD displays non-interlaced panels by default? What I don't understand is how an interlaced signal is being displayed on an LCD, since there are no horizontal scan lines.


See post#82.


----------



## cygnusloop (Jan 26, 2007)

blueline said:


> But aren't LCD displays non-interlaced panels by default? What I don't understand is how an interlaced signal is being displayed on an LCD, since there are no horizontal scan lines.


I'll expand a bit on P Smith's reply.

Yes, LCD displays are progressive by nature. Only CRT's can "natively" display interlacing. So, how does it do it?

Take, for example, a 1080/60i signal coming into a 1080p display that refreshes at 60Hz. The 540 even lines come every 1/30th of a second, and the same for the odd lines, meaning the display receives a "field" every 1/60th of a second. The display is buffering these fields as they come in. So, for every refresh, the display buffer has one field that is brand new, and one that is 1/60th of a second old. The next refresh, the alternate field is new.

At its simplest, for every refresh, the display will alternately show the new odd field with the older even field, then the new even field with the older odd field. Some displays will try to modify the older field to compensate for any motion that is happening in the frame. Some do it better than others.
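The buffering scheme described above can be sketched in a few lines of Python. This is a minimal illustration only: the function name and field labels are hypothetical, and a real display operates on pixel buffers, not strings.

```python
# Minimal sketch of how a progressive panel "weaves" a 1080i signal.
# Every 1/60th of a second a new 540-line field arrives and is shown
# together with the buffered opposite-parity field, which is one
# refresh (1/60th of a second) older.

def weave_frames(fields):
    """fields: (parity, lines) pairs arriving every 1/60th of a second.
    Returns the (fresh, stale) pair displayed at each refresh."""
    buffer = {"odd": None, "even": None}
    shown = []
    for parity, lines in fields:
        buffer[parity] = lines                         # the brand-new field
        other = "even" if parity == "odd" else "odd"
        shown.append((buffer[parity], buffer[other]))  # fresh + 1/60s-old
    return shown

fields = [("odd", "O1"), ("even", "E1"), ("odd", "O2"), ("even", "E2")]
for fresh, stale in weave_frames(fields):
    print(fresh, stale)
```

A motion-adaptive deinterlacer would additionally adjust the stale field before display; the quality of that adjustment is exactly the "some do it better than others" part.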

And, :welcome_s to DBSTalk, blueline.


----------



## HoTat2 (Nov 16, 2005)

cygnusloop said:


> I'll expand a bit on P Smith's reply.
> 
> Yes, LCD displays are progressive by nature. Only CRT's can "natively" display interlacing. So, how does it do it?
> 
> ...


OK, I'm still trying to get my mind wrapped around this cygnusloop. So please bear with me. Now this here you wrote seems clear enough, but how exactly does the inverse telecine process work with this "new field + buffered "old field" presentation sequence every 1/60 sec, when and if the receiver properly recognizes a 3-2 field pattern on the input signal indicating 24 fps film source material?

And I also take it that if the display's refresh rate were higher, such as the 120Hz of greenwave's Sony XBR5 mentioned earlier, then it is simply a case of repeating the same constructed 1080p frame twice every 1/60th of a second?


----------



## cygnusloop (Jan 26, 2007)

HoTat2 said:


> OK, I'm still trying to get my mind wrapped around this cygnusloop. So please bear with me. Now this here you wrote seems clear enough, but how exactly does the inverse telecine process work with this "new field + buffered "old field" presentation sequence every 1/60 sec, when and if the receiver properly recognizes a 3-2 field pattern on the input signal indicating 24 fps film source material?


The short answer is that if the display can only do 1080/60i, it will never be able to reproduce the 1080/24p exactly. If the display has a variable refresh rate that can be a multiple of 24 (72, 96, or the magic number 120 - which is divisible by BOTH 60 and 24) then a good 3:2 pulldown mode can reproduce the 24fps exactly.

To understand how the _inverse _telecine process works, you naturally need to understand how the telecine process works. Telecine (pron. tele-seen) is the name of the process that converts 24fps film to 30fps (60 fields per second) video. It does this by using a 2:3 cadence. The first film frame is the first two fields, the second film frame is the next three fields, and so on. So, in the case of a telecine-converted program, the complete frames are available. A good inverse telecine processor is smart enough to recognize the cadence, and not combine fields that are from different frames. This is what a 3:2 pulldown mode (or film mode) on some HDTV's does.

Here's how it works. Let's call some four frames of a film A B C D. The telecine process would convert that to A A B B B C C D D D. When these fields are combined into frames with a straight deinterlace, you get AA BB BC CD DD. A good inverse telecine deinterlace process will not combine fields from different frames. It will give you AA BB CC DD, the original four frames.

In order to reproduce the original 24fps, the display must have the aforementioned variable refresh. A 60Hz only TV is stuck with combining fields from different frames together in some fashion, as 60 is not divisible by 24. Some 60Hz only displays use the same type of motion adaptation to minimize the "combing" that happens when dissimilar fields are combined. Some do this well enough to pass the HQV deinterlacing tests.
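The cadence walkthrough above can be demonstrated in a few lines. This is a hedged Python sketch with hypothetical frame labels, not any real TV firmware; it also simplifies by treating each field as a whole-frame label, whereas a real deinterlacer matches odd and even fields via cadence detection.

```python
# Sketch of the 2:3 telecine cadence and its inverse. Film frames become
# alternately 2 then 3 repeated fields; inverse telecine collapses the
# repeats to recover the original frames.
from itertools import cycle

def telecine(frames):
    """24fps film frames -> 60 fields/sec via the 2:3 cadence."""
    fields = []
    for frame, repeats in zip(frames, cycle([2, 3])):
        fields.extend([frame] * repeats)
    return fields

def inverse_telecine(fields):
    """Collapse consecutive repeated fields back into unique frames."""
    frames = []
    for field in fields:
        if not frames or frames[-1] != field:
            frames.append(field)
    return frames

print(telecine(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(inverse_telecine(telecine(["A", "B", "C", "D"])))
# ['A', 'B', 'C', 'D']
```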



> And I also take it that if the display's refresh rate were higher, such as the 120Hz of greenwave's Sony XBR5 mentioned earlier, then it is simply a case of repeating the same constructed 1080p frame twice every 1/60th of a second?


I would expect that, for the most part, a 120Hz display will simply repeat the same constructed frame when reproducing true 1080/60i video. The great thing about a 120Hz refresh rate is that it is capable of exactly reproducing both 24fps (24*5=120) and 30fps (30*4=120) material.


----------



## HoTat2 (Nov 16, 2005)

cygnusloop said:


> The short answer is that if the display can only do 1080/60i, it will never be able to reproduce the 1080/24p exactly. If the display has a variable refresh rate that can be a multiple of 24 (72, 96, or the magic number 120 - which is divisible by BOTH 60 and 24) then a good 3:2 pulldown mode can reproduce the 24fps exactly.
> 
> To understand how the _inverse _telecine process works, you, naturally, need to understand how the telecine process works. Telecine (pron. tele-seen) is the name of the process that converts 24fps film to 30fps (60 fields per second) video. It does this by using a 2:3 cadence. The first film frame is the first two fields, the second film frame is the next three fields, and so on. So, in the case of a telecine converted program, the complete frames are available. A good inverse telecine processor is smart enough to recognize the cadence, and not combine fields that are from different frames. This is what a 3:2 pulldown mode (or film mode) on some HDTV's do.
> 
> ...


OK, thanks cygnusloop;

So when and if the inverse telecine process has been completed properly, and the HDTV derives the original 24Hz frame rate of the cinema source material (and this is a big "if," I realize), then, using the Sony XBR5 120Hz refresh rate example again, it is simply a matter of repeatedly displaying each reconstructed 1080p film frame five times every 1/24th of a second, i.e. the same 1080p frame once per 1/120th of a second, five times, before presenting the next film frame the same way?


----------



## cygnusloop (Jan 26, 2007)

HoTat2 said:


> OK, thanks cygnusloop;
> 
> So when and if the inverse telecine process has been completed properly, and the HDTV derives the original 24Hz frame rate of the cinema source material (and this is a big "if," I realize), then, using the Sony XBR5 120Hz refresh rate example again, it is simply a matter of repeatedly displaying each reconstructed 1080p film frame five times every 1/24th of a second, i.e. the same 1080p frame once per 1/120th of a second, five times, before presenting the next film frame the same way?


Frankly, I don't know, but 5 repeated 120Hz frames would be my guess. I suppose some enterprising designer could try to interpolate the changes between the film frames and update through the 5 120Hz refreshes, but that would require some serious voodoo that I wouldn't pretend to understand. My understanding is HDTV's with true 5:5 pulldown are just coming to market.


----------



## cartrivision (Jul 25, 2007)

cygnusloop said:


> Here's how it works. Lets call some four frames of a film, A B C D. The telecine process would convert that to A A B B B C C D D D. When these fields are combined into frames with a straight deinterlace, you get AA BB BC CD DD. A good inverse telecine deinterlace process will not combine fields from different frames. It will give you AA BB CC DD, the original four frames.


So when the 3:2 pulldown process isn't done correctly and some video frames are a combination of two different film frames instead of a repeat of the previous frame, how does that manifest itself visually on the TV screen? I know how it would look if you could freeze the frame, but I mean, what is the effect when you are watching it in real time? Does it look obviously worse than if it was done correctly? I'd guess not, since I have never noticed any obvious difference between 1080p and 1080i when watching something that was encoded from a 24fps film source, but then I've never done a true side-by-side comparison.


----------



## inkahauts (Nov 13, 2006)

cartrivision said:


> So when the 3:2 pulldown process isn't done correctly and some video frames are a combination of two different film frames instead of a repeat of the previous frame, how does that manifest itself visually on the TV screen? I know how it would look if you could freeze the frame, but I mean what is the effect when you are watching it in real time? Does it look obviously worse than if it was done correctly. I'd guess not since I have never noticed any obvious difference between 1080p and 1080i when watching something that was encoded from a 24fps film source, but then I've never done a true side by side comparison.


It won't look quite as sharp. It may even look grainy. Frankly, you are probably more likely to notice a problem if it does some of a program right and some of it wrong than if it's just being done wrong all the time, because you would never see it done right and would assume the picture you see is as sharp as it's meant to be.... and most people at this point find HD so much better than SD that the differences won't show much to the average consumer....

Then there are people like me who notice every detail on the screen. I hate improperly processed material, so I always try to buy the highest-quality DVDs and TVs and turn off as much of the stupid noise-reduction junk as I can, because those functions usually try to compensate for this type of material and end up doing more harm than good.....


----------



## DanER40 (Oct 25, 2007)

How do these threads turn in to a 1080i vs 1080p debate so often? Are you guys just bored at work?


----------



## ColdCase (Sep 10, 2007)

FYI: Be careful about what you read into a 120Hz display. HDGuru says none of the current 120Hz LCDs have 5:5 pulldown for 1080p/60 or 1080i/60. The Sony LCD TV internally converts from 60Hz to 120Hz by frame-interpolating the 60Hz content. There seems to be some disagreement about what happens when you feed the Sony 1080p/24 directly; many LCDs first pull it to 60Hz and then interpolate to 120Hz. With its default settings, the XBR fails deinterlacing tests, resulting in a 50% loss in resolution. It does better when set to the cinema custom modes.

The Pioneer plasmas and some projectors will extract 24fps from 1080i, or take 24fps directly from a DVD player, and display it at a 72Hz refresh (an even multiple), so you do not need to deal with the 3:2 judder (if you notice it at all).


----------



## Kansas Zephyr (Jun 30, 2007)

P Smith said:


> "The other 540 lines are created by the 1080p HDTV's up-scaling process."
> FALSE !
> The HDTV's processor refreshes a FULL video buffer at full size - 1920 by 1080 pixels.
> The two fields (odd and even rows) come into the BUFFER at a 1/30s tempo; visualization is done from the buffer at a 1/60s rate.


I think we keep going around and around here looking at the same thing, using different language.

Every 1/60th of a second there are only 540 lines of new information broadcast on a 1080i stream?

Yes or No?

If the answer is "yes," and there are 1080 lines being displayed every 1/60th of a second on the screen of a 1080p HDTV, then there are only 540 lines of "new, real" data. The other 540 come from somewhere other than the live broadcast stream, at that instant.

It may be from the last 1/60th, or created by the up-scaling process within the HDTV, but it did not come from information in the current 1/60th of a second data.

Right?


----------



## gregjones (Sep 20, 2007)

Kansas Zephyr said:


> I think we keep going around and around here looking at the same thing, using different language.
> 
> Every 1/60th of a second there are only 540 lines of new information broadcast on a 1080i stream?
> 
> ...


But 540 is only one of the important numbers.

It is important to note that those 540 lines of information each contain 1920 pixels. This is in comparison to the 720 lines of information that contain 1280 pixels each. Saying 540 lines is less than 720 lines is easy, but horribly oversimplified if you ignore the resolution of each of those lines.


----------



## cygnusloop (Jan 26, 2007)

cartrivision said:


> ...how does that manifest itself visually on the TV screen? I know how it would look if you could freeze the frame, but I mean what is the effect when you are watching it in real time?


Well, as always, it depends on the source material. And, you're right, in most cases it won't be all that noticeable. Fast motion is, well, fast, and while the interlace artifacts may be the most pronounced in this case, they may not be the most noticeable.

To me, where I notice the combination of dissimilar fields is during a slow pan across a detailed background. For example, take a landscape shot. As the camera slowly pans, the trees in the background, instead of just moving slowly across the frame, kind of wiggle and pulse due to the deinterlace.

But, as inkahauts said, most people are used to this kind of artifacting due to the fact that we have been watching interlaced television all of our lives. HD is such a dramatic improvement that some minor deinterlace artifacting is no big deal (which it usually isn't, to me).


----------



## cygnusloop (Jan 26, 2007)

gregjones said:


> But 540 is only one of the important numbers.
> 
> It is important to note that those 540 lines of information each contain 1920 pixels. This is in comparison to the 720 lines of information that contain 1280 pixels each. Saying 540 lines is less than 720 lines is easy, but horribly oversimplified if you ignore the resolution of each of those lines.


This is true, and there is also temporal resolution to consider. As in how many new pixels are there every second?

*1080/60i*
1920*540*60 = 62,208,000 pixels/second

*720/60p*
1280*720*60 = 55,296,000 pixels/second

*1080/24p*
1920*1080*24 = 49,766,400 pixels/second

Interesting, huh? As (I think) I said, many, many posts ago, there is much more than just raw resolution that goes into a good HDTV picture.
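The figures above can be verified with a throwaway Python snippet (the format table below is just the three cases from this post, nothing more):

```python
# The pixels-per-second arithmetic above, spelled out. For 1080i only the
# 540 lines of one field are new at each 1/60th-second refresh.
formats = {
    "1080/60i": (1920, 540, 60),    # width, new lines per refresh, refreshes/sec
    "720/60p":  (1280, 720, 60),
    "1080/24p": (1920, 1080, 24),
}
for name, (width, new_lines, rate) in formats.items():
    print(f"{name}: {width * new_lines * rate:,} pixels/second")
# 1080/60i: 62,208,000 pixels/second
# 720/60p: 55,296,000 pixels/second
# 1080/24p: 49,766,400 pixels/second
```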


----------



## phat78boy (Sep 12, 2007)

cygnusloop said:


> This is true, and there is also temporal resolution to consider. As in how many new pixels are there every second?
> 
> *1080/60i*
> 1920*540*60 = 62,208,000 pixels/second
> ...


The above stats for 1080i aren't entirely accurate, as only half of those pixels get refreshed every second. So while they may be displayed, they aren't actively in motion. Both progressive resolutions refresh every pixel during the second.


----------



## cygnusloop (Jan 26, 2007)

phat78boy said:


> The above stats for 1080i aren't entirely accurate, as only half of those pixels get refreshed every second. So while they may be displayed, they aren't actively in motion. Both progressive resolutions refresh every pixel during the second.


Look again. The above calculation is done refreshing only 540 of the 1080 lines every 1/60th of a second. The numbers are accurate. I didn't do a calculation for 1080/60*p*, as I am not aware of any available source material capable of producing that (which would be 124,416,000 pixels/second).


----------



## phat78boy (Sep 12, 2007)

cygnusloop said:


> Look again. The above calculation is done refreshing only 540 of the 1080 lines every 1/60th of a second. The numbers are accurate. I didn't do a calculation for 1080/60*p*, as I am not aware of any available source material capable of producing that (which would be 124,416,000 pixels/second).


What's wrong is that with 1080i, 540 lines are only refreshed every 1/30th of a second. The other 540 are refreshed in the next 1/30th. So instead of times 60 it should have been times 30.


----------



## cygnusloop (Jan 26, 2007)

phat78boy said:


> Whats wrong is that with 1080i, 540 lines are only refreshed every 1/30th of a second. The other 540 are refreshed the next 1/30th. So instead of times 60 it should have been times 30.


I'm sorry, but you are misunderstanding. With a 1080i stream, there is a new *field *every 1/60th of a second. A field contains 540 horizontal lines. Every 1/60th of a second there is a new, unique field. There is a new *frame *every 1/30th of a second, if you want to look at it this way.

The calculation can be done one of two ways:

1920*540*60
or
1920*1080*30

The answer is the same either way.
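A trivial check of the two equivalent counts, per field or per frame:

```python
# Counting 1080i's new pixels per second either per field (540 new lines,
# 60 times a second) or per frame (1080 lines, 30 times a second)
# gives the same total.
per_field = 1920 * 540 * 60
per_frame = 1920 * 1080 * 30
print(per_field, per_frame, per_field == per_frame)
# 62208000 62208000 True
```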


----------



## phat78boy (Sep 12, 2007)

cygnusloop said:


> I'm sorry, but you are misunderstanding. With a 1080i stream, there is a new *field *every 1/60th of a second. A field contains 540 horizontal lines. Every 1/60th of a second there is a new, unique field. There is a new *frame *every 1/30th of a second, if you want to look at it this way.
> 
> The calculation can be done one of two ways:
> 
> ...


You are correct. I just don't like the whole 1080i/60 thing. It's really 1080i/30. My point was really that while there are more pixels, only half of them are actively changing, while with 720p every pixel changes every time.


----------



## gregjones (Sep 20, 2007)

phat78boy said:


> You are correct. I just don't like the whole 1080i/60 thing. Its really 1080i/30. My point was really that while there are more pixels, only half of them are actively changing. While with 720P every pixel changes every time.


But his post is accurate. 540 lines change every 1/60th of a second. This is why he multiplied 1920 x 540 x 60. The pixel count is correct. 1080/60i does update more pixels per second than 720/60p. This is the fact that many, many people continually overlook.

With 720p, each pixel does change every time, as you state. But you cannot ignore the fact that every pixel changing is still fewer than the number changing in a 1080i picture.


----------



## Kansas Zephyr (Jun 30, 2007)

gregjones said:


> But 540 is only one of the important numbers.
> 
> It is important to note that those 540 lines of information each contain 1920 pixels. This is in comparison to the 720 lines of information that contain 1280 pixels each. Saying 540 lines is less than 720 lines is easy, but horribly oversimplified if you ignore the resolution of each of those lines.


I don't remember ever saying that.

I was just explaining how a 1080i broadcast signal is displayed on a 1080p HDTV (1080 lines at 60fps), rather than a 1080i HDTV (1080 lines at 30fps), period.

I am not "taking sides" on the 720p vs. 1080i debate.


----------



## P Smith (Jul 25, 2002)

gregjones said:


> But his post is accurate. 540 lines change every 1/60th of a second. This is why he multiplied 1920 x 540 x 60. The pixel count is correct. 1080/60i does update more pixels per second than 720/60p. This is the fact that many, many people continually overlook.
> 
> With 720p, each pixel does change every time, as you state. But you cannot ignore the fact that every pixel changing is still fewer than the number changing in a 1080i picture.


Those "540" fields are DIFFERENT lines! ODD and EVEN!


----------



## phat78boy (Sep 12, 2007)

gregjones said:


> But his post is accurate. 540 lines change every 1/60th of a second. This is why he multiplied 1920 x 540 x 60. The pixel count is correct. 1080/60i does update more pixels per second than 720/60p. This is the fact that many, many people continually overlook.
> 
> With 720p, each pixel does change every time, as you state. But you cannot ignore the fact that every pixel changing is still fewer than the number changing in a 1080i picture.


I agree with the pixel counts, but the count is deceiving. Every pixel changes with 1080i, but only half at any one given time. So while all pixels get changed in 1/60th of a second, only half are changed for the first 1/60th and the second half the next part of that 1/60th. So my point of reasoning is not that there are not more pixels with 1080i, it's that only half of them are actively changing.

Great examples have been made, but fast-moving or wide pan shots are the most evident. At the point you see the picture, half of the screen (odd or even) is behind the rest, while at 720p or 1080p all the lines are exactly synced up.


----------



## Kansas Zephyr (Jun 30, 2007)

phat78boy said:


> I agree with the pixel counts, but the count is deceiving. Every pixel changes with 1080i, but only half at any one given time. So while all pixels get changed in 1/60th of a second, only half are changed for the first 1/60th and the second half the next part of that 1/60th.


No, all pixels do not "get changed" every 1/60th from a 1080i source.

Lines (1,3,5...1079) take 1/60th.

Lines (2,4,6...1080) take the NEXT 1/60th of a second (not the "next part of" the same 1/60th, but a "new" 1/60th).

Therefore, all lines "get changed" (odd lines are combined with even lines) every 1/30th of a second (two 1/60th-second fields) from a 1080i broadcast source, when displayed on a 1080i HDTV.


----------



## phat78boy (Sep 12, 2007)

Kansas Zephyr said:


> No, all pixels do not "get changed" every 1/60th from a 1080i source.
> 
> Lines (1,3,5...1079) take 1/60th.
> 
> ...


Yes, broadcast 1080i is 1080i/30. I just find it hard to compare motion video between progressive and interlaced. Interlaced is only changing half its resolution at any one time; progressive is changing all of it. So to say 1080i is better than 720p is not a good comparison. While 1080i might have the greater resolution, the data being changed at any one point is less than 720p's.


----------



## cygnusloop (Jan 26, 2007)

Kansas Zephyr said:


> I am not "taking sides" on the 720p vs. 1080i debate.


A debate which can never have a clear winner. Both have their strengths and weaknesses.



phat78boy said:


> So my point of reasoning is not that there are not more pixels with 1080i, it's that only half of them are actively changing.


Which is precisely what the meat of this thread was discussing. How different intrinsically progressive displays compensate for this phenomenon. The short of it is that some do it quite well, and others, not so well. This is one quality of an HDTV that can be researched quite easily. I would not put any HDTV on my "short list" that doesn't at least pass the basic deinterlace tests. (Which my current HDTV does). My next purchase will almost certainly require passing the 3:2 tests. (Which my current HDTV does not). Hopefully, in the near future, virtually all 1080p TV's will pass both of these tests with flying colors.



Kansas Zephyr said:


> Therefore, all lines "get changed" (odd lines are combined with even lines) every 1/30th (2 times 1/60th) of a second from a 1080i broadcast source, when displayed on a 1080i HDTV.


This is what the 1080/60i signal is, and what a native 1080i CRT display would show. However, on a 1080p display with good motion adaptation, each displayed frame does have 1080 "new" lines every 1/60th of a second. One field is the "fresh" one, and the other is an interpolated and "updated" version of the older field. The quality of this interpolation is what makes some displays better than others.


----------



## Kansas Zephyr (Jun 30, 2007)

cygnusloop said:


> This is what the 1080/60i signal is, and what a native 1080i CRT display would show. However, on a 1080p display with good motion adaptation, each displayed frame does have 1080 "new" lines every 1/60th of a second. One field is the "fresh" one, and the other is an interpolated and "updated" version of the older field. The quality of this interpolation is what makes some displays better than others.


Yep.

Notice that I specifically was talking about a 1080i HDTV, not 1080p in that post you quoted.

If you look at some of my others, you will see me post what you just described.


----------



## cygnusloop (Jan 26, 2007)

Kansas Zephyr said:


> Yep.
> 
> Notice that I specifically was talking about a 1080i HDTV, not 1080p in that post you quoted.


Yes, and I should have said as much.


> If you look at some of my others, you will see me post what you just described.


Which just goes to show, that great minds think alike.


----------



## gregjones (Sep 20, 2007)

phat78boy said:


> While 1080i might have the greater resolution, the data being changed at any one point is less than 720p's.


No, the amount of data being changed at one point is always greater in 1080i.

I was discussing with the others a preference for referring to 30fps or 60fps. Both are numerically identical. People may prefer either format. I am not fighting that fight. But either way, 1080i updates more pixels each cycle than 720p. This is not a matter of opinion, but one of math.

It is completely acceptable for people not to like the way 1080i does it or to prefer the way their set handles 720p. But for the amount of data, it is very straightforward. Even half of the lines being updated in one cycle of 1080i represents more pixels than all of the lines in 720p.
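That last claim is simple arithmetic, which a quick Python check confirms:

```python
# Checking the claim above: one 1080i field (half the frame's lines)
# still carries more pixels than a full 720p frame.
field_1080i = 1920 * 540   # 1,036,800 pixels in one field
frame_720p  = 1280 * 720   #   921,600 pixels in a full frame
print(field_1080i, frame_720p, field_1080i > frame_720p)
# 1036800 921600 True
```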


----------



## gregjones (Sep 20, 2007)

Kansas Zephyr said:


> I don't remember ever saying that.
> 
> I was just explaining how a 1080i broadcast signal is displayed on a 1080p HDTV (1080 lines at 60fps), rather than a 1080i HDTV (1080 lines at 30fps), period.
> 
> I am not "taking sides" on the 720p vs. 1080i debate.


I was responding to phat78boy's questions, not yours. I know lots of people prefer 720p over 1080i and vice versa. But lots of people consistently get the math wrong.


----------



## Kansas Zephyr (Jun 30, 2007)

gregjones said:


> No, the amount of data being changed at one point is always greater in 1080i.


+1


----------



## phat78boy (Sep 12, 2007)

gregjones said:


> No, the amount of data being changed at one point is always greater in 1080i.
> 
> I was discussing with the others a preference for referring to 30fps or 60fps. Both are numerically identical. People may prefer either format. I am not fighting that fight. But either way, 1080i updates more pixels each cycle than 720p. This is not a matter of opinion, but one of math.
> 
> It is completely acceptable for people not to like the way 1080i does it or to prefer the way their set handles 720p. But for the amount of data, it is very straightforward. Even half of the lines being updated in one cycle of 1080i represents more pixels than all of the lines in 720p.


If you are looking at a period of time, 1080i does change more pixels. At any one time, though, 1080i only changes half the pixels. The other half are not from that moment, or are "mirrored" by the processor in any given TV. With 720p, at any one moment, all pixels are changed by the source. There is no "mirroring" or other method to refresh pixels not in motion. I'm not saying it's extremely noticeable, but it is fact, as you say.


----------



## cartrivision (Jul 25, 2007)

Kansas Zephyr said:


> cygnusloop said:
> 
> 
> > This is what the 1080/60i signal is, and what a native 1080i CRT display would show. However, on a 1080p display with good motion adaptation, each displayed frame does have 1080 "new" lines every 1/60th of a second. One field is the "fresh" one, and the other is an interpolated and "updated" version of the older field. The quality of this interpolation is what makes some displays better than others.
> ...


Although originally you said that upconverting 1080i to 1080p was done by line doubling the 540 lines of each 1080i field, essentially yielding a 540p frame and limiting the vertical resolution to 540 lines, which isn't true for correctly upconverted 1080i.


----------



## gregjones (Sep 20, 2007)

phat78boy said:


> If you are looking at period of time, 1080i does change more pixels. At anyone time 1080i only changes half the pixels though. The other half are not from that moment or are "mirrored" by the processor in any given TV. With 720P, at any one moment, all pixles are changed by the source. There is no "mirroring" or other method to refresh pixels not in motion. I'm not saying its extremely noticeable, but it is fact as you say.


For me, I'll take half of 2,073,600 over 100% of 921,600 any day. So many other factors play into the image you get (camera, transmission method, compression) that I will stick with the numbers. Your mileage may vary.


----------



## Kansas Zephyr (Jun 30, 2007)

cartrivision said:


> Although originally you said that upconverting 1080i to 1080p was done by line doubling the 540 lines of each 1080i field, essentially yielding a 540p frame and limiting the vertical resolution to 540 lines, which isn't true for correctly upconverted 1080i.


...and you can reread my posts following that one to better understand my intent,

rather than my re-posting this now-circular discussion as new content.


----------



## phat78boy (Sep 12, 2007)

http://www.hometheaterblog.com/hometheater/2004/11/true_hdtv.html



> Interlaced scanning produces a still picture field, or a 'frame', by scanning two sets of alternating lines. Progressive scanning creates a frame in one pass. If both are moving at the same rate "refreshing" the screen at the same number of passes per second, that gives progressive scanning the advantage, because it scans a complete picture 'frame', not half a picture 'field'. It produces fewer dots and lines, but at twice the speed. So now it's a question of timing. As ABC's FAQ touches on: The number of lines of resolution in progressive and interlaced pictures is not a clear cut comparison. *In the time it takes 720p to draw 720 lines, 1080i draws only 540 lines. And by the time 1080i does draw 1080 lines, 720p has drawn 1440 lines.*


While the posting may be old, (I will look for newer sources) I find it still relevant.


----------



## Kansas Zephyr (Jun 30, 2007)

phat78boy said:


> http://www.hometheaterblog.com/hometheater/2004/11/true_hdtv.html
> 
> While the posting may be old, (I will look for newer sources) I find it still relevant.


The number of LINES you quote is correct.

However, the total number of PIXELS for any given amount of time is still greater with 1080i.

720p is 1280 x 720 that's 921,600 pixels scanned 60 times per second.

1080i is 1920 x 1080 - that's 1,036,800 pixels per field (half of the full frame's 2,073,600, delivered over 1/30th of a second), scanned 60 times per second.
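The arithmetic is easy to check. A quick Python sketch of the per-pass pixel counts quoted above:

```python
# Pixels delivered in each 1/60th-of-a-second pass, per format.
W_720P, H_720P = 1280, 720
W_1080I, H_1080I = 1920, 1080

pixels_720p_per_pass = W_720P * H_720P            # a full progressive frame
pixels_1080i_per_pass = W_1080I * (H_1080I // 2)  # one 540-line interlaced field

print(pixels_720p_per_pass)   # 921600
print(pixels_1080i_per_pass)  # 1036800
```

Even a single 1080i field carries more pixels than a whole 720p frame, which is the point being argued here.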


----------



## P Smith (Jul 25, 2002)

"720p has drawn 1440 lines"
That's not a valid comparison - those lines do not belong to one scan, and they do not increase the number of pixels, i.e. PQ.


----------



## Kansas Zephyr (Jun 30, 2007)

P Smith said:


> "720p has drawn 1440 lines"
> That's not a valid comparison - those lines do not belong to one scan, and they do not increase the number of pixels, i.e. PQ.


+1


----------



## phat78boy (Sep 12, 2007)

P Smith said:


> "720p has drawn 1440 lines"
> That's not a valid comparison - those lines do not belong to one scan, and they do not increase the number of pixels, i.e. PQ.


You are correct, but if fewer lines are being drawn, wouldn't that affect PQ? Less refresh means less data being seen, correct?


----------



## Kansas Zephyr (Jun 30, 2007)

phat78boy said:


> You are correct, but if fewer lines are being drawn, wouldn't that affect PQ? Less refresh means less data being seen, correct?


See post #128

1080i still sends more data per field or "half frame" than 720p sends each frame.


----------



## phat78boy (Sep 12, 2007)

Kansas Zephyr said:


> See post #128
> 
> 1080i still sends more data per field or "half frame" than 720p sends each frame.


We will have to agree to disagree. The way I view it is that while 1080i has more pixels, it is only changing half of those pixels at any one instant. So the pixels in actual motion are only half of its total output, giving it fewer actual pixels in motion than 720p.

I respect your view, just don't share it.


----------



## cygnusloop (Jan 26, 2007)

Sigh....


----------



## Kansas Zephyr (Jun 30, 2007)

phat78boy said:


> We will have to agree to disagree. The way I view it is that while 1080i has more pixels, it is only changing half of those pixels at any one instant. So the pixels in actual motion are only half of its total output, giving it fewer actual pixels in motion than 720p.
> 
> I respect your view, just don't share it.


Dude.

720p is 1280 x 720 that's *921,600* pixels scanned 60 times per second.

1080i is 1920 x 1080 - that's *1,036,800* pixels per field (half of the full frame's 2,073,600, delivered over 1/30th of a second), scanned 60 times per second.

These are the number of pixels "changed" in the exact same amount of time (1/60th of a second), for each format.

In two "scans", or 1/30th of a second.

720p will "light" 1,843,200 pixels. (2 full frames)
1080i will "light" 2,073,600 pixels (2 fields = 1 frame)

I'm not "picking the best format". I can see both sides, so I don't take one.

But, these are the numbers.


----------



## cygnusloop (Jan 26, 2007)

Hi again...

I've had some developments with my HDTV that might be of interest to some of you that have been following this thread.

Due to a strange, intermittent flicker issue with my Samsung HLS5687, an issue that multiple light engine replacements were unable to resolve, they offered an exchange. When I wasn't happy with the newer model that they were offering as an exchange (as I didn't consider it really a like for like replacement), they have issued a refund, and will be picking the TV up soon. (And this was 13 months into a 15 month warranty - props to Samsung CS for that.)

So the short of it is that I have a new HDTV. Much of my participation in this thread has been due to the research that I have been doing as I was deciding on a potential replacement. So here's the verdict...

Sony KDS-60A3000 SXRD. This HDTV is a 60" 1080p/120Hz native display, and much of why I chose it is related to the topics that we have been discussing in this thread.

There has been some discussion (and confusion) here about how some of the new 120Hz displays deal with 1080/24p source material. As I was still in "research mode", and trying to separate fact from speculation, I didn't want to post what I thought I was learning about the performance of this (and other) HDTV's.

Now that the research is done, I feel pretty comfortable sharing what I have learned. This particular Sony, and a few other Sony 120Hz LCD panels, plus a small handful of other HDTV's, do exactly as we were hoping with a 1080/24p source.

The Sony, with its "MotionFlow" circuitry turned off (more on that in a moment), and a 1080/24p source, will simply repeat each film frame 5 times. A true 5:5 pulldown (yippee!). There is some debate about whether it does 5:5 or actually 4:4 (thereby going into a 96Hz mode), but either way the refresh rate is an integer multiple of 24Hz. It has been confirmed that it DOES NOT simply accept a 24Hz input, telecine it to 30fps/60i, and double that to 120Hz as most 120Hz sets currently on the market do. Of course, if the source is 60Hz, it is converted to 120Hz by simply showing each frame twice.
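To make the difference between the two cadences concrete, here is a small illustrative Python sketch. The function names are mine, and the telecine function is the usual frame-level simplification (real 3:2 telecine alternates fields, not whole frames):

```python
# Frame-repetition cadences for showing 24fps film on fixed-rate displays.

def pulldown_5_5(frames):
    # 5:5 pulldown: every film frame shown 5 times -> even cadence at 120Hz.
    return [f for f in frames for _ in range(5)]

def telecine_3_2(frames):
    # 3:2 telecine (frame-level sketch): frames alternately shown 3 and 2
    # times -> uneven cadence at 60Hz, the source of telecine judder.
    out = []
    for i, f in enumerate(frames):
        out.extend([f] * (3 if i % 2 == 0 else 2))
    return out

film = ["A", "B", "C", "D"]   # 4 film frames = 1/6 second at 24fps
print(pulldown_5_5(film))     # 20 refreshes, every frame held equally long
print(telecine_3_2(film))     # 10 refreshes, frames held 3-2-3-2
```

The even 5:5 cadence is why a multiple-of-24Hz display looks smoother on film material than a set that telecines 24p to 60i first.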

Now, on to the "MotionFlow" features. The first part of this feature, Motion Enhancer, does precisely what we were speculating. With a 24Hz source, it interpolates the motion and generates 4 unique new frames between the original frames. Whether or not this is a pleasing effect seems to depend on the source material, and the viewer. It has a "standard" and a "high" setting, and it seems for most, that the "high" is just too over the top. Many comments on it say things like it is "too real" or "kinda spooky" and sometimes that it just looks too artificial. For 60Hz source material, it simply interpolates a single frame between original 60Hz frames. The motion enhancer seems to, perhaps, be more useful for 60Hz than for 24Hz.

The second part of the "MotionFlow" feature is the Motion Naturalizer. This feature simply inserts black or dark frames as part of the 5:5 frame repetition. This is done in an attempt to better simulate the effect of the shutter closing on a real film projector. It seems that many of the "It's just gotta look like film" crowd really appreciate this feature. Others say that it just makes the picture look darker.

Anyway, the limited viewing I was able to do in the store, coupled with my research convinced me that this would be a nice upgrade to my now defunct Sammy. This article is a nice review on this set and gets into the 24Hz processing capabilities a bit.

This thread over at AVS Forums gives a (surprisingly small) list of current displays that support multiples of 24Hz refresh rates, that the author has confirmed properly process a 1080/24p source. The same author also maintains a similar thread at Blu-ray.com. Having "done my homework", I have a good degree of confidence in this author's findings, but as with all things on the internet, YMMV, and one should always do their own research.

Once I've had the weekend to gather some of my own impressions, I would be happy to share them with anyone who is interested. Please feel free to send me a PM.


----------



## phat78boy (Sep 12, 2007)

Kansas Zephyr said:


> Dude.
> 
> 720p is 1280 x 720 that's *921,600* pixels scanned 60 times per second.
> 
> ...


I'll try one last time. If you were to freeze-frame a 1080i picture, how many lines are being changed at that one moment? Half, or 540. If you were to do the same for 720p, it would be all 720. That is my point. It is not about whether 1080i refreshes all its lines in a certain time period; it's about how many are changed at one instant. If the screen is moving fast enough, say sports, those lines are constantly behind each other. I know, it's just a fraction of a second. With 720p, however, there is no behind. They are all fully synced.

I honestly don't have a preference. I like to have the broadcast come through however it is sent. 1080i or 720P...they both look darn good to me.


----------



## veryoldschool (Dec 10, 2006)

While specs are specs, they are watched by the human eye. Interlacing was used at a speed the human eye couldn't detect when it was first developed. Film is only 24 frames per second.
Kind of makes one wonder what some are seeing.


----------



## phat78boy (Sep 12, 2007)

While faster than the human eye, constant motion or panning on an interlaced display will produce slight jaggies and artifacts that are apparent. I'm not saying it's bad enough to make you want to return your TV, but it has been noted by various reviewers on various TVs. Some TVs also do a good job of hiding these flaws.

720p, while lower resolution, seems to have a smoother feel for me. 1080i is more crisp, once again for me.


----------



## cygnusloop (Jan 26, 2007)

veryoldschool said:


> ...Interlaced was used at a speed the human eye couldn't detect when it first was developed....


I don't quite agree with this statement. The human eye/brain is capable of temporal resolution far in excess of 24Hz, 60Hz, or even 120Hz. It's just that the brain can be convinced to cooperate, and let one enjoy the illusion of smooth motion.


----------



## furjaw (Jul 29, 2007)

720p is obsolete.
All LCD TVs 40" or greater being made today are 1080p.


----------



## cartrivision (Jul 25, 2007)

phat78boy said:


> I'll try one last time. If you were to freeze-frame a 1080i picture, how many lines are being changed at that one moment? Half, or 540. If you were to do the same for 720p, it would be all 720. That is my point. It is not about whether 1080i refreshes all its lines in a certain time period; it's about how many are changed at one instant. If the screen is moving fast enough, say sports, those lines are constantly behind each other. I know, it's just a fraction of a second. With 720p, however, there is no behind. They are all fully synced.
> 
> I honestly don't have a preference. I like to have the broadcast come through however it is sent. 1080i or 720P...they both look darn good to me.


Your "one last try" only further demonstrates your lack of understanding of the basic concepts being discussed here. In a freeze frame *NO LINES* are being changed "at one instant", which is why the vertical resolution (i.e. # of lines) of one still 540-line *field* (which you incorrectly called a "frame") of a 1080i signal means nothing. The fact remains that when viewing continuous *frames* of video, 1080i is not only changing more pixels than 720p in any given time period, but also providing 50% more lines of horizontal resolution and 50% more lines of vertical resolution than 720p.
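The "50% more" figures are straightforward to verify. A tiny Python check:

```python
# 1080i vs 720p: resolution ratios in each direction.
h_ratio = 1920 / 1280   # pixels per line (horizontal resolution)
v_ratio = 1080 / 720    # lines per frame (vertical resolution)
print(h_ratio, v_ratio)  # 1.5 1.5 -> 50% more in each direction
```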


----------



## pnyberg (Oct 31, 2007)

Wow, you guys really eat your wheaties. As a video engineer, I can say that yes there are professional cameras that are now adaptable for 1080p (Thompson/GVG are coming out with some for starters.) 

The problem is in the broadcast: OTA ATSC standards MANDATE 720p or 1080i. True, we are not OTA with D* (nor are cable and E*), but the FCC is not going to mandate a new standard anytime soon; they are too busy moving bandwidth around with HDTV as is, and getting rid of NTSC. There's little reason for the CATV and satellite carriers to want to do this either; the expense of retrofitting the current system would be prohibitive after companies like D* just went to MPEG-4 and 5-LNB dishes to include HD LILs.

Say what you will, and I like progressive scan too, but 1080p was something of a marketing ploy by consumer electronics mfgrs (remember "enhanced definition"?) to get you to spend a little more on really high end electronics to enjoy with your BR/HD-DVDs.

SACD anyone? Digital VHS?

1080p pro cameras will be used and downconverted for broadcast at either 720p or 1080i. Not to mention a concert video shoot I worked on last year with the Smashing Pumpkins that was shot in HD at 24p (film style) and will go straight to DVD.

It all boils down to what works for you. For me the scan rate is ok; it's the compression rates that need improvement, but D* has done a good job thus far.


----------



## cartrivision (Jul 25, 2007)

furjaw said:


> 720p is obsolete.
> All LCD TVs 40" or greater being made today are 1080p.


Despite the wide availability of 1080p-capable TVs, 720p is not at all obsolete. It has certain advantages over 1080i, and it has more than a 2-to-1 advantage over 1080p in its required data rate... but at the expense of having less resolution than what 1080i and 1080p deliver.
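The "more than 2 to 1" figure checks out on raw, uncompressed pixel rates (compression changes the absolute bitrates but not the ratio). A quick Python sketch:

```python
# Raw (uncompressed) pixel rates: 720p60 vs 1080p60.
rate_720p60 = 1280 * 720 * 60     # pixels per second
rate_1080p60 = 1920 * 1080 * 60   # pixels per second
print(rate_1080p60 / rate_720p60)  # 2.25 -> more than 2 to 1
```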


----------



## cartrivision (Jul 25, 2007)

pnyberg said:


> Wow, you guys really eat your wheaties. As a video engineer, I can say that yes there are professional cameras that are now adaptable for 1080p (Thompson/GVG are coming out with some for starters.)
> 
> The problem is in the broadcast: OTA ATSC standards MANDATE 720p or 1080i. True we are not OTA with D* (as is cable and E*), but the FCC is not going to mandate a new standard anytime soon;


DirecTV is not bound to any OTA ATSC standards. If they chose to, they could broadcast 1080p and deploy receivers capable of receiving and reproducing 1080p video. It's just that the added bandwidth requirements would greatly outweigh any perceptible improvement in picture quality, so it's unlikely that they will ever broadcast 1080p programming.


----------



## HoTat2 (Nov 16, 2005)

furjaw said:


> 720p is obsolete.
> All LCD TVs 40" or greater being made today are 1080p.


You should keep in mind what was mentioned near the beginning of this long thread, which I see confirmed by pretty consistent opinions in various technical reviews on this issue: unless your 1080p set is at least 55 inches or greater, you will have a very difficult time distinguishing the difference in PQ between a conventional 720p HD set and the new generation of 1080p displays, if you can truly notice any real differences at all. Therefore 720p is hardly "obsolete" as you say.


----------



## phat78boy (Sep 12, 2007)

cartrivision said:


> Your "one last try" only further demonstrates your lack of understanding of the basic concepts being discussed here. In a freeze frame *NO LINES* are being changed "at one instant", which is why the vertical resolution (ie. # of lines) of one still 540 line *field* (that you incorrectly called a "frame") of a 1080i signal means nothing. The fact remains that when viewing continuous *frames* of video, 1080i is not only changing more pixels than 720p in any given time period, but also providing 50% more lines of horizontal resolution, and 50% more lines of vertical resolution compared to 720p.


"Freeze frame" is a well-known phrase for taking one particular point in time. I was not implying that anything in a "freeze frame" was changing. What I am saying is that in that "freeze," only half of what you are seeing is current on 1080i. The other half is slightly behind. My reasoning is not so much about resolution. Obviously 1080i is higher resolution. My reasoning is that at any one point you are looking at 1080i, it only has 540 lines of new resolution.

720P always has all of its lines refreshed at the same instant. None of them is "behind" another. So while some people may enjoy the higher resolution, others prefer a more fluid refresh as opposed to a higher resolution.


----------



## cygnusloop (Jan 26, 2007)

^^^
Strengths and weaknesses.
Advantages and disadvantages.
Very old and tired debate...


----------



## cartrivision (Jul 25, 2007)

phat78boy said:


> "Freeze frame" is a well-known phrase for taking one particular point in time. I was not implying that anything in a "freeze frame" was changing. What I am saying is that in that "freeze," only half of what you are seeing is current on 1080i. The other half is slightly behind. My reasoning is not so much about resolution. Obviously 1080i is higher resolution. My reasoning is that at any one point you are looking at 1080i, it only has 540 lines of new resolution.
> 
> 720P always has all of its lines refreshed at the same instant. None of them is "behind" another. So while some people may enjoy the higher resolution, others prefer a more fluid refresh as opposed to a higher resolution.


While what you say about a more fluid refresh is true, in reality that more fluid refresh does virtually nothing in terms of providing a better looking picture, even in a fast moving sports situation. Some of the worst video of sports I've ever seen on DirecTV has been on ESPN (which is 720p) and some of the best has been on the NFL Network channel (which is 1080i). Any possible interlace artifacts that might or might not be perceptible in 1080i video are insignificant in comparison to the very obvious compression artifacts that are often visible on ESPN. I think that some broadcasters choose 720p more because its lower resolution gives it a slightly lower required bitrate, which can help PQ when the bitstream must be compressed to fit within a limited bandwidth allocation available with typical satellite, digital cable, or OTA transmission of the signal.

The bottom line is that even with sports programming, differences between interlaced and progressive video would be very hard to see, and for the most part they are insignificant and/or lost in a video signal that contains very obvious compression artifacts.


----------



## gregjones (Sep 20, 2007)

cartrivision said:


> The bottom line is that even with sports programming, differences between interlaced and progressive video would be very hard to see, and for the most part they are insignificant and/or lost in a video signal that contains very obvious compression artifacts.


Furthermore, the quality of the picture is often much more the product of the equipment used than of the transmission format. Crappy equipment produces a crappy picture at any resolution. The posters above are right: football coverage is an excellent example of picture quality. CBS and NFL Network are widely regarded as having an excellent picture, and ESPN much less so. This runs contrary to the 720p/1080i argument. Making the right decisions along the workflow from the camera to the TV is what gives a better product. The resolution is only one part of that.


----------



## gregjones (Sep 20, 2007)

phat78boy said:


> "Freeze frame" is a well-known phrase for taking one particular point in time. I was not implying that anything in a "freeze frame" was changing. What I am saying is that in that "freeze," only half of what you are seeing is current on 1080i. The other half is slightly behind. My reasoning is not so much about resolution. Obviously 1080i is higher resolution. My reasoning is that at any one point you are looking at 1080i, it only has 540 lines of new resolution.
> 
> 720P always has all of its lines refreshed at the same instant. None of them is "behind" another. So while some people may enjoy the higher resolution, others prefer a more fluid refresh as opposed to a higher resolution.


I have tried. I understand what you are trying to convey but you are missing the math. There are always more pixels in 1080i every second. Your argument is not valid.

Imagine finding two $10 bills versus nine $1 bills. Your argument says you'd be happier with all of the $9 than with half of the $20, based solely on the fact that you'd rather have 100% of something instead of 50%.

That is my last attempt. You are free to prefer 720p or 1080i. I have never argued for one or the other. But the idea that there are more pixels in 720p is objectively wrong. Not up for discussion whatsoever.


----------



## sarfdawg (Jan 21, 2007)

cartrivision said:


> While what you say about a more fluid refresh is true, in reality that more fluid refresh does virtually nothing in terms of providing a better looking picture, even in a fast moving sports situation. Some of the worst video of sports I've ever seen on DirecTV has been on ESPN (which is 720p) and some of the best has been on the NFL Network channel (which is 1080i). Any possible interlace artifacts that might or might not be perceptible in 1080i video are insignificant in comparison to the very obvious compression artifacts that are often visible on ESPN.


I'm glad I'm not crazy. I'm on my second HDTV - one being an RPTV CRT, and now an LCD - and ESPN is by far the worst picture for sports of any that are out there. ABC HD (the mothership of ESPN) is just as bad. For my money, CBS HD is head and shoulders above all others in HD sports.

By the way, thanks for all the great lessons you all have given me. I appreciate the information you have provided.


----------



## MIMOTech (Sep 11, 2006)

Kansas Zephyr said:


> I'm not taking a side, but the argument over 720p v. 1080i is:
> 
> 720p is 1280 x 720 that's 921,600 pixels scanned 60 times per second.
> 
> ...


pat ....


----------



## Rob55 (Sep 14, 2006)

Technically, 1080p/24 is one of the 18 formats that comprise the ATSC standard. Whether they ever broadcast it or not is a whole other question.


----------



## tnflyboy (Dec 9, 2007)

Due to some problems with my HR20-700, the tech that came out today swapped me out with an HR21. I was very willing to change, but am still not convinced it was the right thing to do. In conversation, the tech said that with a software upgrade in the future, 1080p would be possible on the HR21.

Not sure if he knew what he was talking about, but thought I would pass along his comment.


----------



## veryoldschool (Dec 10, 2006)

tnflyboy said:


> Due to some problems with my HR20-700, the tech that came out today swapped me out with an HR21. I was very willing to change, but am still not convinced it was the right thing to do. In conversation, the tech said that with a software upgrade in the future, 1080p would be possible on the HR21.
> 
> Not sure if he knew what he was talking about, but thought I would pass along his comment.


Software won't fix the hardware limitation, so I wouldn't be holding my breath for it.


----------



## sandman207 (Nov 20, 2007)

veryoldschool said:


> Software won't fix the hardware limitation, so I wouldn't be holding my breath for it.


+1


----------



## gregjones (Sep 20, 2007)

tnflyboy said:


> Due to some problems with my HR20-700, the tech that came out today swapped me out with an HR21. I was very willing to change, but am still not convinced it was the right thing to do. In conversation, the tech said that with a software upgrade in the future, 1080p would be possible on the HR21.
> 
> Not sure if he knew what he was talking about, but thought I would pass along his comment.


He was incorrect. It may have been out of ignorance or incompetence, but he was not correct.


----------



## l123 (Sep 18, 2007)

Here is another try at illuminating those who believe that a higher number always means more.

One needs to distinguish between "hardware" capabilities (the actual screen matrix resolution, sometimes called native resolution) and software capabilities (the listed programming capabilities).
I will use examples to illustrate the concepts. It is assumed that no software conversion is done.

Eg 1. A TV with a matrix native resolution of 540 lines is capable of displaying 480i, 480p, 540i, 540p, 720i, and 1080i, but it cannot do 720p.
Eg 2. A TV with a matrix native resolution of 720 lines can do all of Eg 1 plus 720p.
Eg 3. A TV with a matrix native resolution of 1080 lines can do everything, including 1080p.

Therefore it is obvious that the TV with the 720-line matrix is better than the 540-line one, even though both are capable of displaying 1080i programming.

Another way: if your TV set's screen is 25 inches high, a 540-line matrix has 21.6 pixels/inch and a 1080-line matrix has 43.2 pixels/inch.
When you watch 1080i programming (on a 540-line matrix) you are only seeing 540 lines every 1/30 second, and each pixel is 0.046 inch tall; but when you watch 1080p programming (on a 1080-line matrix) your pixel size is 0.023 inch.
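Those pixel-pitch numbers follow directly from the 25-inch screen height used in the example. A quick Python check:

```python
# Pixel density and pixel height for a screen 25 inches tall
# (the example figures from the post above).
screen_height_in = 25.0

for lines in (540, 1080):
    ppi = lines / screen_height_in       # pixels per inch (vertical)
    pixel_in = screen_height_in / lines  # height of one pixel, in inches
    print(f"{lines}-line matrix: {ppi:.1f} px/in, pixel {pixel_in:.3f} in")
```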

So no matter what you do, it is better to have the 1080-line matrix regardless of the programming watched. With one note - scaling software is still required to watch old over-the-air programs or SD on high-resolution hardware.

All the above examples are particularly important for LCD displays. For plasma, the fact is that a 768-line screen competes very well with a 1080-line screen, even in large sizes. But this is not true with LCD.

As to human perception, acuity, etc. - we are all different. My dear friend fought me tooth and nail that she did not need more than a 21-inch, lowest-possible-resolution LCD, until she had her cataract removed. 
*But it is reasonable and economically justified to buy a 1080-line matrix for a TV above 40".* - and I do as I say.

And for all who do not see the difference - save your money and be happy for those who do see the difference.


----------



## l123 (Sep 18, 2007)

As to the possibility of 1080p in the future from satellite or OTA digital: the standard is there. 
As to the fact that no sat receiver today from DirecTV does 1080p - so what?
By paying $5.00 per month, in 12 months we all pay the cost of the receiver and more. So if and when they decide, they can simply sell us another receiver. But of course it is more economical for them that we pay the $5 forever.

And please do not tell me this is a lease - it is not - it is a rip-off.


----------



## P Smith (Jul 25, 2002)

Well, not that brutal a rip-off, but a gentle riiiip.


----------



## Maruuk (Dec 5, 2007)

I can attest to the vast superiority of 1080i over 720p on my 42" 1080p set. It's like night & day detail-wise. And I can detect no motion smoothness factor between the two resolutions. It's a fantasy. On a 1080i golf broadcast, I can see the blades of grass, the faces in the crowd, the clarity is stunning. 

In 720p, say in a Fox NASCAR or football presentation, the people on the sidelines are a blur, there's no detail in the crowds, even the players in their lineup stances in football turn into a slightly mushy blur, no detail. I've seen 1080i football and each player in the wide-shot before the snap is crisp and clear. 

I'm not saying 720p is anything like the horrors of SD! Migod, let's not even talk about SD. But at least on my set, there's a world of difference between 1080i and 720p. 720p looks like television, 1080i starts to fool your brain into thinking it's reality. You're in the stands.

Sure, in one- or two-shots with the sportscasters' heads very big in the frame, 720p looks pretty sharp. But as soon as you show the wide shot of a sporting event, the individuals on the field start to mush badly in 720p. To my mind, it ain't really HD - more like ED.

Retail chains are blowing out 720p sets like mad; everybody's going to 1080p as the prices continue to drop. Viewers are voting with their wallets: 720p is crap. 1080i rules. Not to mention, as Blu-ray players drop below $200 by Christmas, the full jaw-dropping benefits of 1080p will be mind-blowing to a wider and wider audience.

BTW, quick question: is the 1080i we get actually 1440 x 1080i as opposed to 1920 x 1080i? I had read that a couple of places but don't get why they would cheese out on bandwidth/pixels.


----------



## l123 (Sep 18, 2007)

Maruuk said:


> I can attest to the vast superiority of 1080i over 720p on my 42" 1080p set. It's like night & day detail-wise. And I can detect no motion smoothness factor between the two resolutions. It's a fantasy. On a 1080i golf broadcast, I can see the blades of grass, the faces in the crowd, the clarity is stunning.
> 
> In 720p, say in a Fox NASCAR or football presentation, the people on the sidelines are a blur, there's no detail in the crowds, even the players in their lineup stances in football turn into a slightly mushy blur, no detail. I've seen 1080i football and each player in the wide-shot before the snap is crisp and clear.
> 
> ...


Well, Maruuk - you almost got it.
Since you do have a 1080 native-resolution matrix, your 1080i is displayed using many, though not all, of the 1080 lines you have. 720p has to be rescaled - ergo the mush. In addition, because of the timing issues - interlacing 540-line fields versus a single flash of 720 lines - 1080i will do better on static shots, but on dynamic shots 720p is better than 1080i. Also, interlaced pictures are almost always slightly blurred to minimize flickering.
Because 720p is better than 1080i, particularly for dynamic shots, ESPN broadcasts in 720p. On the other hand, Discovery, which has mostly static shots, uses 1080i.
Conclusion: always buy 1080p native if you can.


----------



## Yoda-DBSguy (Nov 4, 2006)

They did in fact circulate an HR21 Pro spec sheet which touted a 1080p resolution as well as double the hard drive capacity of the regular retail model. This was later retracted as a typo (which you can see posted here on the forums if you do a quick search). However, it wasn't just a typo, as the picture also depicts a 1080p resolution option.

In any event, the model isn't available thus far.

HOWEVER, you can get an upscaler (since that is all the proclaimed model would be doing anyway). A lot of higher-end A/V receivers, such as Denon's, have 1080p upscaling capabilities for both analog and digital inputs, so everything run through one would be output to your TV at 1080p.

Again, Denon is just one of the brands incorporating this feature in part of their lineup. I am not only a dealer for them, but personally have the AVR-3808, which does just as stated above.

I've attached the original HR21 Pro product sheet as well as Denon's AVR-3808CI sheet (in PDF format) for those interested.


----------



## Maruuk (Dec 5, 2007)

It's kind of like: if a tree falls in the forest, do I know or care? I don't deny the technical realities of the faster 720p full-screen refresh rate, but if I literally can see no blur or motion artifacts at 1080i, and the picture detail is dramatically enhanced, then do I care? And I'm talking hockey, which is nothing but high-speed action. Hockey looks WAAAY better in 1080i than 720p.

Fox needs to rethink its 720p national scheme. Even many of their own local RSNs are broadcasting NHL hockey in full 1080i, with no motion blur or pixelating. A stunning image that stands up to speed just fine. The tree may fall, but I don't care.


----------



## Earl Bonovich (Nov 15, 2005)

Yoda-DBSguy said:


> They did in fact circulate an HR21 Pro spec sheet which touted 1080p resolution as well as double the hard drive capacity of the regular retail model. The 1080p claim was later retracted as a typo (you can see it posted here on the forums if you do a quick search). However, it wasn't just a typo, as the picture also depicts a 1080p resolution option.
> 
> In any event the model isn't available thus far.
> 
> ...


That original publication in the one trade magazine was incorrect about the 1080p... 
I confirmed that shortly after someone made the page available here in the photos.

The HR21Pro will not have 1080p support.


----------



## P Smith (Jul 25, 2002)

Earl, that wasn't a typo in a publication, but a LABEL on the face plate of the model.


----------



## Earl Bonovich (Nov 15, 2005)

P Smith said:


> Earl, that wasn't typo in a publication, but a LABEL on face plate of the model.


Did I say typo?

I said the publication was incorrect... and that includes the photo.

I can guarantee 110% that the production HR21 Pro models will not have 1080p support.
And their front bezels will say 1080i, not 1080p.


----------



## veryoldschool (Dec 10, 2006)

Here is the pdf for it: http://www.valueelectronics.com/images/pdf/HR21 pro_SpecSheet.pdf


----------



## seemenewd (Dec 19, 2007)

veryoldschool said:


> Software won't fix the hardware limitation, so I wouldn't be holding my breath for it.


I'm curious what chipset the HR21 is using so that I can get datasheets for it and verify that indeed the limitation is in hardware rather than just being something that software hasn't enabled.

A direct link to the part spec .PDF would be even better.

Thanks in advance...


----------



## veryoldschool (Dec 10, 2006)

seemenewd said:


> I'm curious what chipset the HR21 is using so that I can get datasheets for it and verify that indeed the limitation is in hardware rather than just being something that software hasn't enabled.
> 
> A direct link to the part spec .PDF would be even better.
> 
> Thanks in advance...


I've never pulled the cover to read what the chips are, but a few posters have listed the Broadcom chip numbers: a Broadcom 7038 CPU plus a Broadcom 7411 decoder.
Here is what I found with a search for Broadcom:
http://www.dbstalk.com/search.php?searchid=2968094


----------

