# Any networks moving to 1080p anytime soon?



## n3ntj

Since there are now TVs available that offer 1080p resolution, has anyone heard of any networks planning on offering 1080p programming? Right now, all we can use 1080p for is watching Blu-Ray movies. I can see 1080p really being nice for sports, due to their fast motion, and even movies. I would expect a network moving to 1080p would be expensive since some new equipment would be needed.


----------



## kevinturcotte

I doubt we'll see anything _broadcast_ in 1080p for quite a while. The cost and the bandwidth required are just too much.


----------



## rudeney

Just because we have 1080p TVs and some 1080p content via Blu-ray and on-demand, there is no real reason to expect 1080p broadcasts. Even without 1080p content, a 1080p screen is still worth having. Basically, with a 1366x768 native screen, everything has to be rescaled from 720p or 1080i. With a 1080p screen, the 1080i broadcasts only need to be deinterlaced, so there isn't any need for rescaling.


----------



## bonscott87

TV stations are just now starting to go HD in the first place. Let's get that done first before thinking about 1080p.


----------



## Grentz

bonscott87 said:


> TV stations are just now starting to go HD in the first place. Let's get that done first before thinking about 1080p.


No kidding!

And 1080p is not that much better; the major jump is from SD to HD, period. Let's get that going before worrying about 1080p, which has just been marketed to death since HD-DVD/Blu-ray.

Yes, it is better, but it is not THAT much better in a lot of situations, and just having HD instead of SD is much more important IMO.


----------



## kevinturcotte

They'd also have to again send out 2 signals wouldn't they? 1080p for those that have the 1080p tuner, and 720p or 1080i for those that don't have a 1080p tuner.


----------



## spartanstew

n3ntj said:


> Since there are now TVs available that offer 1080p resolution,


Not sure what your definition of "now" is, but 1080p TVs have been around for a few years.


----------



## TomCat

kturcotte said:


> They'd also have to again send out 2 signals wouldn't they? 1080p for those that have the 1080p tuner, and 720p or 1080i for those that don't have a 1080p tuner.


Any ATSC tuner will work, including the very oldest one sold on August 6th, 1998. The tuner doesn't even know whether it's 1080p or 1080i or 720p or 480i. All the tuner sees is an RF signal modulated as 8VSB. Only after the signal is demodulated, demuxed, and decoded from MPEG back into full baseband HD is the flag telling the display what format it is framed in accessed to differentiate between the various formats, and that is well after the tuner has already done its job.
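That receive chain can be sketched as a toy pipeline. Everything here is illustrative (the stage names, the dictionary layout, the flag value); it is not a real ATSC API, just a model of the point that the format flag only surfaces after the tuner's work is done:

```python
# Toy model of the ATSC receive chain described above. The tuner/
# demodulator never inspects the picture format; the format flag is
# only readable after MPEG decode. All names are hypothetical.

def demodulate_8vsb(rf):
    # The tuner sees only modulated RF and yields a transport stream.
    return rf["transport_stream"]

def demux(ts):
    # Pick the video elementary stream out of the transport stream.
    return ts["video_pes"]

def mpeg_decode(pes):
    # Only at this stage does the format flag become visible.
    return pes["frames"], pes["format_flag"]

rf_signal = {"transport_stream": {"video_pes": {
    "frames": ["frame0", "frame1"], "format_flag": "1080i30"}}}

frames, fmt = mpeg_decode(demux(demodulate_8vsb(rf_signal)))
print(fmt)  # '1080i30' -- known only after the tuner has done its job
```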


----------



## TomCat

kturcotte said:


> I doubt we'll see anything _broadcast_ in 1080p for quite awhile. The cost and the bandwidth required is just too much.


To make such a statement, it is really necessary to differentiate between 1080p60 and 1080p24.

Actually, stuff is broadcast as virtual 1080p24 all of the time. Any encoder that is fed 24 fps content (or is fed 24 fps original content that has 3:2 pulldown) automatically reverts to "film mode" (which would be every time a movie is broadcast in HD). There is an automatic 2:3 pullup process done just before encode (if necessary) and the frames are encoded as progressive frames without pulldown (or true 1080p24). The local decoder after the 8VSB tuner can recognize this mode and re-add 3:2 pulldown after decoding (assuming a broadcast format of 1080i30) which also happens all of the time.

While the 1080p24-encoded stream is technically sent with a flag indicating that it is to be decoded as interlaced content or 1080i30, any 1080p display can easily reconstitute it completely transparently into true 1080p24 (plus 3:2 pulldown, which may add a tiny smidgen of judder), which means essentially that you are receiving the exact same quality as 1080p24 and it does not suffer interlace error as typical 1080i content normally does. A 720 set reconstitutes it into 720p, also without the interlace error that would remain for typical 1080i30 content. So it is sent in progressive format with "progressive" quality (meaning no interlace error), flagged as interlaced (1080i30) and reconstituted as 1080p24 (once your display deinterlaces it and displays it progressively).

The re-adding of pulldown is only necessary to maintain consistency with the interstitial material, such as commercial breaks, which are typically 1080i30. Otherwise there would be an ugly glitch as the decoder transitions from 1080p24 to 1080i30 and back. The pulldown also decreases the 24 fps flicker factor, which likely offsets any negative aspects of the added judder and is therefore actually an overall improvement. But essentially, every time a 1080i station broadcasts a movie, if you have a 1080p set you will get the very same quality you would get had you bought a 1080p hard copy on Blu-Ray.
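The 3:2 cadence described above is easy to sketch. This is a minimal model, not anything from a real decoder: `pulldown_32` and `inverse_telecine` are hypothetical helper names, and the A/B/C/D frames follow the conventional telecine notation:

```python
# Sketch of 3:2 pulldown: mapping 24 progressive film frames per
# second onto 60 interlaced fields per second, and the inverse
# ("film mode") recovery a decoder can perform.

def pulldown_32(frames):
    """Repeat frames in a 3,2,3,2,... field cadence."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2  # alternate 3 and 2 fields
        fields.extend([frame] * repeats)
    return fields

def inverse_telecine(fields):
    """Collapse repeated fields back to the unique film frames."""
    unique = []
    for f in fields:
        if not unique or unique[-1] != f:
            unique.append(f)
    return unique

fields = pulldown_32(list("ABCD"))  # 4 film frames -> 10 fields
print(fields)            # ['A','A','A','B','B','C','C','C','D','D']
print(len(fields) * 6)   # 60: a second of 24 fps film fills 60 fields
print(inverse_telecine(fields))  # ['A','B','C','D'] recovered intact
```

The round trip is lossless, which is the whole point: nothing but the original 24 progressive frames survives the cadence.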

What is difficult to do (and impossible so far within a 6 MHz bandwidth) is to transmit 1080p60. That is the format that takes more bandwidth to do properly. And that is also the only format that would really improve over 1080p24, 1080i30, or 720p60, as far as motion artifacting goes (which would indeed make it really nice for sports). So far, even all DOD 1080p is really only 1080p24, and I would not really even expect much 1080p60 content to be considered, at least for now, as the _de facto_ format world-wide is 1080p24 for content acquisition, specifically because any other format can be easily extracted from it, making conversion universal.

So if you are talking specifically about 1080p60, you are absolutely right--I would not expect to see content broadcast or sent by DBS/FIOS/cable either, but 1080p24? We actually get that all the time.


----------



## Tower Guy

n3ntj said:


> Has anyone heard of any networks planning on offering 1080p programming?


When there is no motion from field 1 to field 2 of an interlaced image it doesn't matter if the display is interlaced or progressive.

Prime time dramas and sitcoms on CBS and NBC are shot in 24P, converted to 1080i 30 frames, and displayed by your TV as 1080p.


----------



## HD AV

TomCat said:


> But essentially, every time a 1080i station broadcasts a movie, if you have a 1080p set you will get the very same quality you would get had you bought a 1080p hard copy in Blu-Ray.


Dude, you have seriously lost it. There is no compression with Blu Ray, the bitrates are much higher providing much higher quality and detail that blows away MPEG2 broadcast TV. And most, not all, 1080P TVs do not properly implement 3:2 pulldown, which is not necessary with one that will accept 1080P/24. I think you may want to have your glasses checked.


----------



## TomCat

HD AV said:


> Dude, you have seriously lost it. There is no compression with Blu Ray, the bitrates are much higher providing much higher quality and detail that blows away MPEG2 broadcast TV. And most, not all, 1080P TVs do not properly implement 3:2 pulldown, which is not necessary with one that will accept 1080P/24. I think you may want to have your glasses checked.


Thanks for the free consultation, but maybe later. Actually, you may want to have your facts checked. And I can help you with that:

1) All HD is compressed. That includes Blu, which uses the very same MPEG-4 compression scheme that DTV uses. Since uncompressed HD has a bit rate of about 1.485 Gb/s, a 30 GB disc could hold less than 3 minutes of video were Blu-Ray uncompressed. So you are grossly misinformed on that count. And you can look that up. Most HD is compressed in the very act of acquisition.
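That "less than 3 minutes" figure is easy to verify; a quick back-of-the-envelope check, assuming decimal gigabytes and the nominal 1.485 Gb/s uncompressed HD-SDI rate:

```python
# How long would a 30 GB disc last at the uncompressed HD rate?
disc_bits = 30 * 8 * 10**9   # 30 GB disc expressed in bits
rate_bps  = 1.485 * 10**9    # uncompressed HD, bits per second
seconds   = disc_bits / rate_bps

print(round(seconds))        # ~162 seconds
print(seconds / 60 < 3)      # True: under 3 minutes of video
```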

2) "Detail" refers to how sharp the picture is, which is directly tied to resolution. All else held equal (focus, lens quality, imager quality, etc.), 1080p or 1080i resolution is exactly the same regardless of what medium it might be delivered to you in, which means the detail in 1080i OTA TV is precisely the same as that in Blu-Ray. If you are seeing more "detail", then you are simply fooling yourself. There isn't any more to see. Also, resolution and bit rate are two completely different things that typically do not affect each other. There are some low-bit delivery schemes that effectively lower resolution, but that is not the case in OTA and cable, and thankfully is no longer the case for DBS.

3) While that means that higher bit rates do not provide higher detail (which, as we have established, is instead a function of resolution), higher bit rates also do not necessarily provide higher quality in any other way. I know that's a hard one to swallow, and an inviting conclusion that would be very easy to jump to. In a case where there are plenty of bits to go around, such as in an OTA broadcast at 14.5 Mb/s, increasing the bit rate will not buy you any increase in quality whatsoever. Shocking (to those who are misinformed or who have not been able to resist jumping to that conclusion), but completely true. Only when lower compression techniques use trade-offs that compromise quality to prevent equivalent artifacting at those much-lower bit rates is there necessarily any reduction in quality. It is a common yet highly flawed misconception that raising bit rates, or having a higher bit rate on one medium as compared to another, will automagically yield better quality. As someone who gets paid handsomely to work with compression algorithms on a daily basis, I think I can safely say that it just doesn't work that way.

That said, I will agree that some HD delivered by bit-starved systems will not track motion with the same level of artifacting as will HD delivered without equivalent bit starving. And true, some OTA stations do that. In that case, "Transformers" from an ABC station with two sub channels may not look quite as good during action sequences as a Blu-Ray copy will. But the difference in the action sequences may not be that significant even for the most bit-starved frames, and 99% of all frames with motion will look nearly indistinguishably the same in either case, and all frames with low or no motion will look identical in either case. But then that is not the hair-splitting argument I was speaking of, and is somewhat beyond the scope of the thread and the discussion we all were having.

4) Pulldown is not all that difficult to do. Especially with digital circuitry, which makes storing and repeating frames pretty simple, and is exactly what pulldown requires. If a display manufacturer can't do pulldown properly, then they have no business even being in the business.

Not only that, but for the case I posted about, which is 1080p24 delivered in a 1080i30 format (well, 29.97 if you want to split hairs), it is not the display that does the pulldown; it is the decoder. The ability of the decoder to do pulldown is part of the ATSC spec, and part of "film mode," which is the mode invoked when 1080p24 content is sent. Every ATSC decoder conforms to that spec and includes this ability. All your TV has to do is accept the 1080i30 content that is reconstituted WITHIN the last stage of the decoder and display it progressively as 1080p, which is not only what the set is designed to do, but exactly what it does with all content (displays it as 1080p60). And that means it does exactly the same thing when fed 1080p24 directly from Blu--it displays it as 1080p60. And that would imply pulldown done in the display, not the decoder. If, as you say, some sets might not do pulldown properly, then it would actually be the Blu content that is at risk, not content that was already pulled down during decoding via ATSC.

Of course it is important to distinguish 1080p60 as a display format from 1080p60 as an acquisition format, which I mention only because most folks confuse the two and incorrectly impart the benefits of the acquisition format to the display format. For those who only could get into community college, I can break that down: 1080p60 acquisition format = good. 1080p60 display format = possibly not so good, depending on the original content.

For acquisition, it implies that there are 60 _unique_ 1920x1080 fields recorded progressively each second, with no interlace error. As a display format, it only implies that the raster displays 60 1920x1080 fields progressively each second. It _does not_ imply that the fields are unique (which they won't be with pulldown) _or that they will not include interlace error_ (which they will if acquired as interlaced content). Bottom line, 1080p60 as an acquisition format is significantly better than other formats due to more frames and no interlace error (except for 720p), while a 1080p display is somewhat better than a 768 or 720 display, only for completely different reasons, and not for the reasons enjoyed by the 1080p60 acquisition format.

But then regardless of all of that, if you are thick enough to actually buy a set that doesn't do reinterlace or pulldown properly (after all, it _IS_ 2008), then I probably would have a hard time drumming up sympathy for you.


----------



## davring

After reading countless discussions on this topic, yours is the first that actually has made some sense. Thanks TomCat. ( I skipped CC)


----------



## TomCat

Tower Guy said:


> When there is no motion from field 1 to field 2 of an interlaced image it doesn't matter if the display is interlaced or progressive...


All displays other than CRTs are progressive by nature. They have no choice but to display progressively, because they _can only_ display progressively, which is why you never see any "1080i" displays at Best Buy. A 1080p set is not designed as 1080p because that is some wonderful improvement to display technology (although the hype masters would love for us to think that this was the intent all along); it is designed as 1080p because modern flat-panel displays have no capability or choice to display in interlaced mode, which they simply can't support.

But I get your drift, I think. The result of taking both fields of an interlaced frame and displaying them progressively (interlacing them into one field, one frame) is virtually identical to displaying the equivalent originally-progressive frame, well, "progressively". That eliminates any interlace flicker factor, which leaves only interlace error. And as you say, if there is no motion, there can be no interlace error.

That said, television is not radio with pictures. By their very nature television images move most of the time, just as things do in the real world. Even if, theoretically speaking, still pictures have no interlace error under 1080i or 480i, interlace error is still very significant, because in most video there is motion.


----------



## Stewart Vernon

TomCat said:


> All displays other than CRTs are progressive by nature. They have no choice but to display progressively, because they _can only_ display progressively, which is why you never see any "1080i" displays at Best Buy. A 1080p set is not designed as a 1080p set because that is some wonderful improvement to display technology (although the hype masters would love for us to think that this was the intent all along), it is designed as 1080p because there is no capability or choice to display in interlaced mode on modern flat panel displays, which it can't possibly support.


Just to be technically accurate...

It is all but impossible to display a progressive image natively on an interlaced imaging device because that device simply cannot scan fast enough or scan consecutive lines accurately. This is why interlaced even exists in the first place.

However, a progressive-native device most certainly can display a native interlaced image. Granted they might not typically do it, but there is no reason to not be technically capable. It is a relatively easy thing to display interlaced on a native-progressive device.

To say that they don't is usually accurate, but to say that they can't is not accurate.


----------



## Cholly

+1
There's no reason a 1080p device can't display interlaced. If the circuitry exists, interlaced scan can be done. Flat panels are not inherently progressive. Scanning method is determined by support circuitry.


----------



## kw2957

I don't think this will be a possibility for at least a minimum of 5-7 years, and that's being extremely optimistic. IMO, 1080p is primarily a gimmick created by Sony and a few others designed to dupe uneducated viewers into believing that they are in fact seeing a better picture than a 1080i/720p TV. Granted, the picture is a little better but it is nothing to drop your jaw over.


----------



## BattleZone

Another issue that we have been living with since the 1930s that is just starting to be addressed for the first time is the "24 frames displayed across 60 frames" issue.

The vast majority of HDTVs, and all TVs before them made for the US market, are 60 Hz refresh-rate TVs. 24 does not go into 60 evenly, so anything shot at 24 frames per second has ALWAYS had to be shown with 3:2 cadence on TV.

Just in the last two years have there been consumer-level TVs that are capable of switching to a refresh rate that is an even multiple of 24 Hz, which allows the TV, finally, to display 24 fps sources at 24 fps. In modern TV lingo, we refer to this as a TV that can not only *accept* a "1080/24p signal" (many fixed 60 Hz TVs can _accept_ these signals), but can render them properly at 1/24th of a second (actually, at a multiple).
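The "even multiple of 24 Hz" test is trivial to express; a minimal sketch (the function name is made up for illustration):

```python
# Which display refresh rates can show 24 fps film without 3:2 judder?
# A rate works evenly when it is an integer multiple of 24 Hz.

def even_multiple_of_24(hz):
    return hz % 24 == 0

for hz in (60, 72, 96, 120):
    print(hz, even_multiple_of_24(hz))
# 60 is False (needs the 3:2 cadence); 72/96/120 are True
# (each film frame is simply shown 3, 4, or 5 times).
```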

It is really only these newer TVs that get the full benefit of a 1080/24p signal. There are many TVs that can correctly display 1080/60p (great for video games, computer use, and a few Blu-Ray titles) that can't do 1080/24p "correctly" (i.e., without the 3:2 cadence that causes judder).

Anyway, to get back to the original question: 1080/60p broadcasting would require scrapping existing ATSC tuners and hardware, and a doubling of broadcast bandwidth, neither of which is likely to happen in the next 20-30 years. Heck, HDTVs have been around for a decade, and we are still in the transition period, which has at least another 5 years to go before most folks have HD in their home, and before most networks are sourcing all NEW content in HD.


----------



## Tower Guy

TomCat said:


> In most video there is motion.


True, but misleading. There is no motion between field one and two when the show is shot on film.


----------



## harsh

Tower Guy said:


> True, but misleading. There is no motion between field one and two when the show is shot on film.


Pulldown may introduce changes from one field to the next.


----------



## Tower Guy

harsh said:


> Pulldown may introduce changes from one field to the next.


A smart progressive display drops any intermediate frames that happen to have been interlaced.


----------



## TomCat

Tower Guy said:


> True, but misleading. There is no motion between field one and two when the show is shot on film.


That's a different issue. When I speak of "video" I am speaking of what is displayed on a video screen, such as on any HDTV display available at Best Buy, and that includes film telecined to video. I am not talking about how film projectors repeat frames to decrease the flicker factor by displaying 24 fps content at 48 fps. That is somewhat analogous to multiple fields per frame as is the concept for interlaced video, but is still actually a very different thing. If you want to get anally technical about it, there actually is no "motion" whatsoever in film itself, or in video for that matter, since both are a series of still pictures. But to say that, even though it is the truth, would be exceptionally misleading, as clearly there is perceived motion, which is why a series of images each devoid of motion, used as a system to create "moving pictures," was invented in the first place.

So if we limit the discussion to what video as displayed is (which I am quite surprised to see has been necessary) it is then not at all "misleading" to suggest that almost all video has motion, as that is simply a matter of fact. Still pictures in video are but a very tiny fraction of that video.


----------



## TomCat

harsh said:


> Pulldown may introduce changes from one field to the next.


Since pulldown only repeats the fields or frames that were already there, you're going to have to explain how that might be possible.


----------



## TomCat

HDMe said:


> Just to be technically accurate...
> 
> It is all but impossible to display a progressive image natively on an interlaced imaging device because that device simply cannot scan fast enough or scan consecutive lines accurately. This is why interlaced even exists in the first place...


Ironically, I disagree that this is accurate at all, technically or otherwise. Interlace was designed "in the first place" due to limitations of the day, not limitations of today. Had devices of the 1930's had the capabilities of devices of this millennium, there might never have been a need to develop interlace. Also, to be "technically accurate," devices are not "interlaced"; the images they display are.



HDMe said:


> ...However, a progressive-native device most certainly can display a native interlaced image. Granted they might not typically do it, but there is no reason to not be technically capable. It is a relatively easy thing to display interlaced on a native-progressive device.
> 
> To say that they don't is usually accurate, but to say that they can't is not accurate.


I don't recall the term "native" being used for displays other than in regards to resolution, so I don't think we have any frame of reference to understand what you might mean by "native" as referring to a device being capable of displaying one format or the other, which makes arguments to that effect pretty meaningless.

If a display has the capability as a laboratory curiosity to display interlaced, yet still has no user handles to make it do that even if moved to your living room, then it still "can't" display interlaced in any usable fashion, simply by definition. There is really very little theoretical difference between "don't" and "can't", especially when the bottom line is that they won't.

Until you can produce evidence of a flat-panel display that is a consumer item at a store like Best Buy that can actually display in the interlaced mode, only then would you be correct, since that is the subset of the not-much-larger universe of displays that we are limited to having and therefore speaking about, rather than some pie-in-the-sky device that nobody actually can buy. Since I don't think you can do that, then saying that they "can't" display in interlaced mode or that they "have no choice" but to display progressively seems to be a pretty-darned on-the-money assessment of the state of affairs regarding modern displays, and that then makes what I said completely accurate and what you are claiming, both inaccurate and unproven.

But I am certainly happy to be proven wrong. I always like learning new things. Dazzle me.


----------



## TomCat

kw2957 said:


> I don't think this will be a possibility for at least a minimum of 5-7 years, and that's being extremely optimistic. IMO, 1080p is primarily a gimmick created by Sony and a few others designed to dupe uneducated viewers into believing that they are in fact seeing a better picture than a 1080i/720p TV. Granted, the picture is a little better but it is nothing to drop your jaw over.


"Primarily a gimmick created by Sony..." might be a little harsh. It is a legitimate ATSC standard created by, you guessed it, the ATSC, and NOT by Sony or anyone else. Sony and others simply are taking unfair advantage of the fact that 1080p as a display format is confused with 1080p as an acquisition/delivery format. That is the gimmick.

And you are right: the difference between 1080i and 1080p as a delivery format, even 1080p60, is so small as to not even be that noticeable, probably less noticeable than the small difference between 720- or 768-native sets and 1080p sets. And to claim that 1080p24 has benefits beyond 1080i worth hyping borders on the criminal.

It is this very-small payoff coupled with the very-steep price in bandwidth and infrastructure that will keep 1080p60 delivery from ever really happening in the foreseeable future, IMHO.


----------



## Tower Guy

TomCat said:


> And to claim that 1080p24 has benefits beyond 1080i worth hyping borders on the criminal.


Let's take two hypothetical situations:

#1

Shoot film at 24p, transfer it to 1080i, encode it at 1080i, transmit it as 1080i, display it on a 1080p LCD with a 2:3 pulldown detector.

#2

Shoot film at 24p, transfer it to 1080p, encode it at 1080p, transmit it as 1080p, display it on a 1080p LCD with a 2:3 pulldown detector.

The two displays would look exactly the same.

Hence for prime time dramas and theatrical film presentations, there is no benefit whatsoever to a 1080p transmission system. The benefit of progressive has already been achieved while using 1080i encoding.


----------



## Stewart Vernon

TomCat said:


> Ironically, I disagree that this is accurate at all, technically or otherwise. Interlace was designed "in the first place" due to limitations of the day, not limitations of today. Had devices of the 1930's had the capabilities of devices of this millenium, there might have never even been a need to develop interlace.


I'm confused, since you agreed with what I said but talk as if you are disagreeing. IF, in the beginning of TV, they had the capability to do progressive then we would likely have never had interlaced scanning. What I said was factually accurate in that "in the first place" very much has to do with the limitations of the past.

Once invented, however, and used long enough... interlaced has a place in many people's homes and can't just be obsoleted overnight. It also has its uses as it turns out.



TomCat said:


> Also, to be "tecnically accurate", devices are not "interlaced", the images they display are.


Well... that isn't accurate either... so we're both wrong. Images aren't interlaced either, any more than the devices... To be the most technically accurate (at least until someone else corrects us both): the method of displaying an image on some types of devices is interlaced.



TomCat said:


> If a display has the capability as a laboratory curiosity to display interlaced, yet still has no user handles to make it do that even if moved to your living room, then it still "can't" display interlaced in any usable fashion, simply by definition. There is really very little theoretical difference between "don't" and "can't", especially when the bottom line is that they won't.


If you want to argue semantics, then you should lead off with that... When you say "can't" instead of "doesn't" you should qualify it. If the reason they "can't" is because the manufacturers do not provide the firmware/input necessary then you should say that... but when you definitively say "Progressive TVs cannot display an interlaced image" then you should mean that... which you obviously didn't.

If your computer needs a firmware update to support more memory... Would it be more correct to say your computer can't support more memory? Or that it currently doesn't support more memory because of a lack of firmware support? Some computers can't support more memory no matter what you do... while others just weren't designed with that need in mind, but could support it with a little upgrade.

There's no reason why a progressive display can't support interlaced display if the firmware were there to do it... so I would not say it "can't". I would say that it currently doesn't... BUT for that matter, how would you know whether or not it doesn't?


----------



## n3ntj

The main reason for my OP was having to do with sports. Only ESPN, ABC, and Fox use 720p. The others use 1080i.

Would there be no better PQ of watching a sporting event in 1080p compared to watching it in 1080i?


----------



## BattleZone

n3ntj said:


> The main reason for my OP was having to do with sports. Only ESPN, ABC, and Fox use 720p. The others use 1080i.
> 
> Would there be no better PQ of watching a sporting event in 1080p compared to watching it in 1080i?


Yes, 1080/60p would look better for high-action content than 1080/60i. But it would require TWICE the bandwidth, and given that all TV providers already have problems of not having enough bandwidth, that's a huge problem.
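The "TWICE the bandwidth" figure follows directly from raw pixel rates; a rough sketch, ignoring blanking intervals and compression entirely:

```python
# Raw pixel-rate comparison: 1080i sends 60 half-height fields per
# second, while 1080p60 sends 60 full-height frames per second.

def pixels_per_second(width, lines_per_picture, pictures_per_second):
    return width * lines_per_picture * pictures_per_second

rate_1080i = pixels_per_second(1920, 540, 60)   # 60 fields of 540 lines
rate_1080p = pixels_per_second(1920, 1080, 60)  # 60 frames of 1080 lines

print(rate_1080p // rate_1080i)  # 2: double the raw payload
```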

Then there are issues with the HD equipment used to originate the game. All of that equipment was purchased based around the 720p standard and would need to be replaced at great expense in order to go to 1080p.

So, while it would be better, it wouldn't be "better enough" to get everyone to spend more money and for providers to cut other channels to make room for 1080p. It ain't gonna happen.


----------



## Tower Guy

n3ntj said:


> The main reason for my OP was having to do with sports. Only ESPN, ABC, and Fox use 720p. The others use 1080i.
> 
> Would there be no better PQ of watching a sporting event in 1080p compared to watching it in 1080i?


It would be slightly sharper in vertical resolution, and the horizontal resolution would be the same as 1080i. But I doubt that most eyes could see the difference.

1080p sports using MPEG2 would need about 30 Mb/s of bandwidth to look good. That's far more than the 19.39 Mb/s used for over-the-air DTV.

1080p sports using MPEG4 would fit easily in the 19.39 Mb/s allocated to ATSC. The ATSC approved MPEG4 AVC (Advanced Video Coding) just last week. Unfortunately, that would make all existing DTVs obsolete. Existing DTV decoders don't do MPEG4.


----------



## TomCat

Tower Guy said:


> It would be slightly sharper in vertical resolution, and the horizontal resolution would be the same as 1080i...





n3ntj said:


> ...Would there be no better PQ of watching a sporting event in 1080p compared to watching it in 1080i?


It would be no sharper at all in vertical resolution for static or slowly-moving images (as that is a fixed ceiling determined by the number of scan lines, which is the same), but for images with considerable motion the effective vertical resolution of 1080i is reduced by interlace error to a perceived resolution much _below_ that of 720p, which makes 1080p comparatively better than 1080i for motion (in that it has better perceived vertical resolution).



Tower Guy said:


> ...1080p sports using MPEG4 would fit easily in the 19.39 mb/s allocated to ATSC. The ATSC standard approved MPEG4 AVC (Advanced Video Coding) just last week. Unfortunately, that would make all existing DTVs obsolete. Existing DTV decoders don't do MPEG4.


And that is why it was approved for ancillary services, and not for the main HD service most stations provide. For instance, it could be used as a quasi movie-on-demand service. It won't obsolete legacy tuners, but will provide new services to new tuners (which have not even been designed yet).

There is a standard called ATSC M/H, colloquially referred to as ATSC II, which should be approved by the end of the year. It is designed to add features to the existing standard, but features that also won't necessarily be enjoyed by legacy ATSC tuners, other than a possible improvement in multipath rejection (which could solve borderline reception issues for some locations). It is primarily a way to target mobile devices like the iPhone, and the AVC incorporation may also play a role in that.

One problem with that standard is it has significant overhead, which will stretch the ability to provide HD and multiple channels in a 6 MHz bandwidth even more than current standards do. IOW, some of the bits will go into that overhead, leaving less for legacy tuners, potentially affecting quality in a negative manner.


----------



## TomCat

HDMe said:


> I'm confused, since you agreed with what I said but talk as if you are disagreeing. IF, in the beginning of TV, they had the capability to do progressive then we would likely have never had interlaced scanning. What I said was factually accurate in that "in the first place" very much has to do with the limitations of the past.


My best guess would be that your confusion stems from misinterpreting what I said. I never said they did not have progressive at the dawn of television; in fact they did. Interlace was invented around 1932 or so, primarily to reduce the flicker that progressive scanning _of that day _had, by doubling the field rate. It was an early form of analog "compression" in that it doubled the field rate without increasing the payload rate. If they had been able to achieve a field rate that high using progressive without increasing the payload rate, _THEN _there never would have been a need for interlace.

Displays (for the consumer world) from before the 70's were all interlaced for that reason. But modern displays can increase the field rate _at the display itself,_ which kind of removes that benefit of interlace, since displayed scan rate is now independent of transmitted field rate.



HDMe said:


> ...Well... that isn't accurate either... so we're both wrong  Images aren't interlaced either, any more than the devices... To be the most technically accurate (at least until someone else corrects us both): The method of displaying an image on some types of devices is interlaced...


Would you accept the definition from Wikipedia.org?


> Interlace is a technique of improving the picture quality of a video signal...


While this is a truncated quote, the entire webpage as well as this excerpt seems to indicate that as I said, images are interlaced, and not devices. Or maybe Wikipedia is wrong too?


HDMe said:


> ...If you want to argue semantics...


I don't. I'd rather have a hot-lead enema, frankly. I only will do either under protest, or when dragged there by those such as yourself (not to imply that you might be motivated to give hot-lead enemas, of course  ), and I am even now trying vainly to extract myself from it. (IOW, don't expect this discourse to continue much beyond this).



HDMe said:


> ...There's no reason why a progressive display can't support interlaced display if the firmware were there to do it... so I would not say it "can't". I would say that it currently doesn't... BUT for that matter, how would you know whether or not it doesn't?


Does it really matter "how" I might know? What seems to matter is the actual fact of the matter which is that none of them do. There may indeed be no reason, as you claim, that interlace is not supported. Fair enough. But the obvious elephant in the room is that since none of them do, there might indeed be a pretty compelling reason why they don't, and that reason probably is that they can't. I'm not sure what parsing that any further actually buys us.


----------



## TomCat

Tower Guy said:


> Let's take two hypothetical situations;
> 
> #1
> 
> Shoot film at 24p, transfer it to 1080i, encode it at 1080i, transmit it as 1080i, display it on a 1080p LCD with a 2:3 pulldown detector.
> 
> #2
> 
> Shoot film at 24p, transfer it to 1080p, encode it at 1080p, transmit it as 1080p, display it on a 1080p LCD with a 2:3 pulldown detector.
> 
> The two displays would look exactly the same.
> 
> Hence for prime time dramas and theatrical film presentations, there is no benefit whatsoever to a 1080p transmission system. The benefit of progressive has already been achieved while using 1080i encoding.


I agree nearly completely, with one exception: when something is shot as 1080p24, it is typically telecine'd as 1080p24 (to do otherwise compromises it by inserting interlace error). It is rarely if ever transferred to 1080i; instead it is flagged as 1080i30, which means it is transmitted in "film mode" (sent as 1080p24 flagged to be restored to 1080i30). The decoder itself handles pulldown, and the resultant frames are displayed as 1080p60. That is pretty close to your example #1, but not exact. It is, however, how the system works when presented with 1080p24 content.

The entire operation of flagging as 1080i30 and adding pulldown is only to accommodate the interstitial video, which is already 1080i30, and to avoid major glitching when going from the 1080p24 content to the 1080i30 content. Theoretically, you could send as 1080p24 and display as 1080p24, but not practically.
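The 2:3 pulldown cadence involved here can be sketched in a few lines (a simplified model; real encoders flag repeat fields rather than literally duplicating them):

```python
# Simplified model of 2:3 pulldown: each group of four 24 fps film frames
# (A, B, C, D) is spread across ten interlaced fields (five 30 fps frames)
# following the repeating 2,3,2,3 cadence.

def pulldown_2_3(frames):
    """Map film frames onto fields using the repeating 2,3 cadence."""
    cadence = [2, 3]
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * cadence[i % 2])
    return fields

print(pulldown_2_3(list("ABCD")))
# -> ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

Over a full second the cadence turns 24 film frames into exactly 60 fields, which is why a decoder can later discard the repeats and recover clean 24p.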


----------



## reddice

I am kind of new to this, but why film at 24 fps? Wouldn't it be smoother and better to film at 30 or 60 fps?


----------



## Jim5506

reddice said:


> I am kind of new to this, but why film at 24 fps? Wouldn't it be smoother and better to film at 30 or 60 fps?


The film industry used the slowest frame rate it could without introducing visible flicker into the projected image (i.e., use the least film stock for the picture).

Go much below 24 fps and the picture flickers; go much above it and you do not gain any picture quality, and thus waste film.


----------



## Stewart Vernon

TomCat said:


> Would you accept the definition from Wikipedia.org?
> While this is a truncated quote, the entire webpage as well as this excerpt seems to indicate that as I said, images are interlaced, and not devices. Or maybe Wikipedia is wrong too?


Here's what I find interesting about Wikipedia, and please don't take this as a personal smack against you.

On internet forums, people are routinely called upon to "prove" their claims by quoting a source (usually an internet one)... Your average Joe posting in a forum like this is thus deemed untrustworthy on his own merits unless he can provide proof. Fair enough... BUT Wikipedia is essentially just another forum where your average Joe can post unproven stuff, and unless/until corrected it will stay there... So average Joe can post on Wikipedia, then quote himself to provide "proof" that he is right.

Essentially, while I might use Wikipedia for casual interest... I wouldn't take what is "said" there as better than any other internet forum unless I could prove it elsewhere.

And for the record, Wikipedia may be a bit off-base based upon the snippet you quoted 



TomCat said:


> I don't. I'd rather have a hot-lead enema, frankly. I only will do either under protest, or when dragged there by those such as yourself (not to imply that you might be motivated to give hot-lead enemas, of course  ), and I am even now trying vainly to extract myself from it. (IOW, don't expect this discourse to continue much beyond this).


Actually, I'd rather argue semantics  As much as I hate semantics, given those two choices semantics seems more pleasant by comparison!



TomCat said:


> Does it really matter "how" I might know? What seems to matter is the actual fact of the matter which is that none of them do. There may indeed be no reason, as you claim, that interlace is not supported. Fair enough. But the obvious elephant in the room is that since none of them do, there might indeed be a pretty compelling reason why they don't, and that reason probably is that they can't. I'm not sure what parsing that any further actually buys us.


Incidentally, my "you" didn't mean you specifically... but rather the generic "you" that refers to anyone. But that aside... and it might be a semantic argument... sometimes such things get dispersed as fact, and that is part of why so much confusion exists about digital TV resolutions, interlaced vs. progressive, and so forth, so I try to nip some things where I can.

Even in the cases where it is hard to tell the difference in progressive vs interlaced, it is still fair to say that a proper progressive display is better than the otherwise equivalent interlaced display... In other words, if you had the choice and ALL other things were equal, progressive is better... so there is no reason why a progressive display would be used to display an interlaced image unless there were no other choice really.

If you always give me the choice of $100 or $50, I'll take $100... but that doesn't mean I'm refusing the $50 or that I can't accept $50.... Same for the concept here. No reason technically that an interlaced image can't be displayed on a progressive display. In fact, I've done this on computer displays (you can still display something interlaced, though there is not usually any reason to do so)... so just because something doesn't usually happen, isn't cause to say that it can't happen.

That's really all I was going for in my minor correction, since I otherwise pretty much agreed with what you were saying.


----------



## TomCat

HDMe said:


> Here's what I find interesting about Wikipedia, and please don't take this as a personal smack against you.
> 
> On internet forums, people are routinely called upon to "prove" their claims by quoting a source (usually an internet one)... Your average Joe posting in a forum like this, is thus deemed untrustworthy on his own merits unless he can provide proof. Fair enough... BUT Wikipedia is essentially just another forum where your average Joe can post unproven stuff, and unless/until corrected it will stay there.. So average Joe can post on Wikipedia, then quote himself to provide "proof" he is right.
> 
> Essentially, while I might use Wikipedia for casual interest... I wouldn't take what is "said" there as better than any other internet forum unless I could prove it elsewhere.
> 
> And for the record, Wikipedia may be a bit off-base based upon the snippet you quoted ...so just because something doesn't usually happen, isn't cause to say that it can't happen.
> 
> That's really all I was going for in my minor correction, since I otherwise pretty much agreed with what you were saying.


So I guess your answer to my question "would you accept Wikipedia as a source" is probably "no", even though your "off-base" characterization of it (meaning Wiki is off-base) is suspiciously lacking support so far. Wikipedia is widely regarded as a typically very solid and accurate source; the open-source nature it enjoys has rarely ever been an issue, and is exactly why it is so successful as the premier _de facto _"internet encyclopedia". World Book doesn't enjoy scrutiny and vetting outside their institution, either, while Wikipedia does. OK, then it's your turn to provide a better source that does support your allegation, don't you think? And I am not trying to be adversarial either, just enjoying the healthy debate. 

I agree that somewhere somehow someone may have made a modern display work in interlaced mode at some time, probably, but they also may not have, as there really is nothing yet in evidence supporting that assertion. My argument was not so much "can't" as "don't", and that in a world where they never "do", that's just about as good as "can't". But I do feel I am allowed to characterize it as "can't", seeing as how the end result is the same, which is that each and every one of them "don't". To get any more specific than that is irrelevant.


----------



## TomCat

Jim5506 said:


> The film industry used the slowest frame rate it could without introducing visible flicker into the projected image (i.e., use the least film stock for the picture).
> 
> Go much below 24 fps and the picture flickers; go much above it and you do not gain any picture quality, and thus waste film.


Actually, higher frame rates do indeed provide real gains in quality, which is why there are 60 fps frame rates in ATSC, such as 720p. There is considerable motion artifacting at 24 fps, especially on pans, which we have all seen, and higher frame rates also reduce the "wagon wheels going backward" strobe effect, which is another artifact.

What I find really fascinating is Roger Ebert's assertion, which very well may be factual, that 24 fps is more likely to react with brain-wave frequencies to allow viewers to suspend their disbelief psychologically than would a faster rate such as 30 or 60 fps. Roger attributes some of the dream-like and enveloping quality (read: magic) of film to that factor, and he may just be on to something. It's certainly a very intriguing notion, at the least.


----------



## Stewart Vernon

TomCat said:


> So I guess your answer to my question "would you accept Wikipedia as a source" is probably "no", even though your "off-base" characterization of it (meaning Wiki is off-base) is suspiciously lacking support so far. Wikipedia is widely regarded as a typically very solid and accurate source; the open-source nature it enjoys has rarely ever been an issue, and is exactly why it is so successful as the premier _de facto _"internet encyclopedia". World Book doesn't enjoy scrutiny and vetting outside their institution, either, while Wikipedia does. OK, then it's your turn to provide a better source that does support your allegation, don't you think? And I am not trying to be adversarial either, just enjoying the healthy debate.


You're actually providing an excellent example of my point. You would ask me to prove Wikipedia is unreliable, but are willing to trust Wikipedia blindly where I might very well be a contributor. So if I am random-unknown-guy you would trust me unquestioningly via Wikipedia but as specific-guy on this forum I have to prove myself. That seems like an odd conundrum.

I offer to you that IF the Wikipedia community model is so accurate and reliable, why are not all things done this way? Why not have Doctors go to Wikipedia instead of Medical Reference Libraries?

I'm not wholesale bashing Wikipedia... but I wouldn't just blindly trust it either. The fact is that most information you will read there and take for being gospel is posted by the same folks you won't trust in any other internet forum... so I remain completely confused by its acceptance as proof sometimes.



TomCat said:


> I agree that somewhere somehow someone may have made a modern display work in interlaced mode at some time, probably, but they also may not have, as there really is nothing yet in evidence supporting that assertion. My argument was not so much "can't" as "don't", and that in a world where they never "do", that's just about as good as "can't". But I do feel I am allowed to characterize it as "can't", seeing as how the end result is the same, which is that each and every one of them "don't". To get any more specific than that is irrelevant.


The thing is... the limitation was the inability to progressive-scan at fast enough rates... so once they could do that, there is usually not a reason to do interlaced... but it pretty much has to be capable of it.

You have to learn to crawl before you learn to walk... but most folks don't go back to crawling much once they learn to walk and run... but the capability is still very much there even if it goes unused.


----------



## Cholly

Jim5506 said:


> The film industry used the slowest frame rate it could without introducing visible flicker into the projected image (i.e., use the least film stock for the picture).
> 
> Go much below 24 fps and the picture flickers; go much above it and you do not gain any picture quality, and thus waste film.


In actuality, when film is projected, each frame is shown twice before advancing to the next frame. This fools the eye into perceiving 48 fps, and is thus faster than eye response time. To the eye, flicker then doesn't exist.
Home movies used to be shot at 16 fps. Once again, each frame is projected twice, giving the appearance of 32 fps. Think image1/blank/image1/blank with advance/image2/blank/image2/blank with advance/image3/etc.
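That double-shutter pattern is easy to sketch (a toy model; the labels and the `flashes_per_frame` parameter are illustrative only):

```python
# Toy model of a projector's double shutter: the film advances at one rate
# (24 fps, or 16 fps for old home movies), but the shutter flashes each
# frame twice, doubling the on-screen flash rate and suppressing flicker.

def shutter_sequence(frames, flashes_per_frame=2):
    """Return the on-screen sequence of flashes and blanking intervals."""
    seq = []
    for frame in frames:
        seq.extend([frame, "blank"] * flashes_per_frame)
    return seq

print(shutter_sequence(["image1", "image2"]))
# -> ['image1', 'blank', 'image1', 'blank', 'image2', 'blank', 'image2', 'blank']
```

Note that the frame content still changes only 24 (or 16) times a second; only the flash rate is doubled, which is why flicker improves but motion rendition does not.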


----------



## woj027

TomCat, HDMe, and others

I am hoping you two will continue your discussion here. I find it very interesting and informative. I really appreciate how articulate and full of good information you both are, and that although you have some different views, this hasn't become a pissing match like many other threads have degraded into.

I'm still a bit uninformed and confused. I have some questions, and maybe they are misguided, but hopefully you, or others, can help me understand this all better.
In layman's terms:

1. What are the different means of recording HD? (Is this the correct answer: 720i24, 720i30, 720i60, 1080i24, 1080i30, 1080i60, 1080p24, 1080p30, 1080p60, others?)

2. How many different types of common HD feeds (signals transmitted) are there? (Same as above?)

3. If I have been understanding the posts correctly, the answers to #1 and #2 are the same. That doesn't necessarily mean that the HD feed is being transmitted in the same format it was recorded in? (1080p60 could actually be transmitted in 720i24, or a different format?)

4a. OK, now that I've assumed the HD feed could be transmitted in all types of formats, my HDTV (if designed using today's TV standards; I don't want to get into the can't/doesn't discussion) may very well take that HD feed transmitted in 720i24 and convert it to some other type of viewing image (what I actually see). My TV could convert that to 480p, 1080i, 1080p, and so on? Correct?

4b. OK, using that same 720i24 feed, my TV could also convert (pull down) that image as 3:2, 16:9 (or another standard format: pillar, letterbox, stretch), correct?

5. How does MPEG2 and MPEG4 compression affect all of these different types of feeds?

6. So if I really do understand all of this (which I'm pretty sure I don't), the image displayed on my HDTV has many variables affecting it before it even gets shown on my TV:
(A) Format in which the live action was recorded.
(B) Format in which the recorded information was transmitted through to my TV
(C) Format of compression that was used to get the information to me faster?
(D) Technology available within my TV and/or receiver (cable, sat, OTA) to convert the compressed, formatted signal
(E) The method of viewing the signal on my HDTV 
(i) can be affected by the pulldown format (3:2, 16:9)
and (ii) can be affected by the format in which I request my HD TV to display (480i, 480p, 720i, 720p, etc.)

7. After all that, what's the best (undefined word) format to view: something that is transmitted (telecine'd) OTA, received by my HDTV and its decoder, pulled down into (3:2, 16:9), and displayed in 1080p (using today's technology)?
(Is it an event recorded in 1080p60, transferred in 1080p24, in MPEG4 format?)

8. Finally, assuming my assumptions above are all correct, couldn't a spreadsheet be made (with many fields) to describe all the possible ways to record, transmit, compress, receive, pull down, and show on a specific display, and have a rating for each?


OK, where did I lose it?


----------



## Tower Guy

woj027 said:


> TomCat , HDMe, and others
> 
> I am hoping you two will continue your discussion here. I find it very interesting and informative.


Try reading the documents available here.

http://www.atsc.org/standards/
and
http://www.atsc.org/standards/practices.php


----------



## Stewart Vernon

woj027 said:


> I'm still a bit uninformed, confused. I have some questions, and maybe they are misguided but then hopefully you, or others can help me understand this all better.


Tower Guy provided a couple of links worth visiting... Ultimately to get all the answers you probably need to do a little reading & research, because while the questions you asked are relatively simple ones... the answers can be complex!



woj027 said:


> 1. What are the different means of recording HD?


I'm not aware of any 720i sources. Doesn't mean that there aren't or couldn't be... but I'm not aware of any. Technically, I'm only aware of 720p and 1080i or 1080p recording sources (used by movie/TV producers)... however, for 1080i/1080p cameras not all of them have been 1920x1080. I've heard of 1440x1080 and I think 1280x1080 source cameras as well.

There are also higher resolution digital cameras in use for developing masters, and those are downconverted for broadcast to the "standard" HD resolutions. It's always worth noting too that film (particularly 35/70mm) has a higher resolution than HD and than some of the 4K digital cameras as well... so anything "filmed" can be converted to HD as long as any special effects from post-production were generated at a high enough resolution.



woj027 said:


> 2. How many different types of common HD feeds (signals transmitted) are there? (Same as above?)


Again, this becomes muddy water. OTA (ATSC standard) says only 720p, 1080i at proper resolutions. I believe 24/30/60 fps are covered and also 1080p at 24fps or 30fps as well... Satellite/cable has additional resolutions covered like those 1280x1080 and 1440x1080... BUT, even if a broadcast is 1920x1080 that doesn't mean the image itself is that resolution... since a 1440x1080 camera could have been originally used and then converted to 1920x1080 for broadcast.

It gets complicated in a hurry!



woj027 said:


> 5. How does MPEG2 and MPEG4 compression affect all of these different types of feeds?


Both are lossy compression methods, meaning that some of the original quality is lost. You can look up the MPEG2 and MPEG4 algorithms online to get a better idea of how each differs in the way it uses key frames and determines changes from frame to frame... but suffice it to say a good bit is lost from the original uncompressed source, out of necessity, in order to actually transmit via satellite/cable/OTA. There'd be no reasonable way to get it to your TV otherwise.

In theory, an MPEG4 vs. MPEG2 comparison should result in one of the following scenarios:

1. MPEG4 can yield better video/audio quality than MPEG2 at the same bandwidth.
OR
2. MPEG4 can yield quality equal to MPEG2 at lower bandwidth usage.

So it ultimately depends on how they are used... Typically satellite/cable will eventually go for lower bandwidth at the same quality in order to get more channels online... rather than maximizing quality at a fixed bandwidth.
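As a back-of-the-envelope sketch of that tradeoff (the per-stream bitrate and the ~2x AVC advantage below are illustrative assumptions, not measured figures):

```python
# Rough arithmetic only; the per-stream bitrate and the efficiency factor
# are illustrative assumptions, not measurements.
ATSC_RATE_MBPS = 19.39        # fixed ATSC channel payload
MPEG2_HD_MBPS = 15.0          # assumed bitrate for one MPEG-2 HD stream
AVC_EFFICIENCY = 2.0          # assumed MPEG-4 AVC advantage over MPEG-2

mpeg4_hd_mbps = MPEG2_HD_MBPS / AVC_EFFICIENCY
print(int(ATSC_RATE_MBPS // MPEG2_HD_MBPS), "MPEG-2 HD stream(s) fit")   # -> 1
print(int(ATSC_RATE_MBPS // mpeg4_hd_mbps), "MPEG-4 HD stream(s) fit")   # -> 2
```

The same arithmetic is why carriers tend to pocket the savings as extra channels rather than spend it all on quality.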

I tried to hit the highlights where I had the best answers.


----------



## woj027

Thanks to both of you. I have really enjoyed reading, and re-reading, and re-reading this thread. It's one of those threads that has all sorts of information, and some exciting commentary as well.

I'm mostly curious for these answers so I can communicate a solid understanding to friends.

Or as a famed politician might say;
"I can talk with Joe Sixpack and he can understand HDTV with me."


----------



## TomCat

HDMe said:


> You're actually providing an excellent example of my point. You would ask me to prove Wikipedia is unreliable, but are willing to trust Wikipedia blindly where I might very well be a contributor. So if I am random-unknown-guy you would trust me unquestioningly via Wikipedia but as specific-guy on this forum I have to prove myself. That seems like an odd conundrum.
> 
> I offer to you that IF the Wikipedia community model is so accurate and reliable, why are not all things done this way? Why not have Doctors go to Wikipedia instead of Medical Reference Libraries?
> 
> I'm not wholesale bashing Wikipedia... but I wouldn't just blindly trust it either. The fact is that most information you will read there and take for being gospel is posted by the same folks you won't trust in any other internet forum... so I remain completely confused by its acceptance as proof sometimes...


I get it. Wiki not credible. I disagree, but let's move on. I would hate to get distracted by the bashing and so lose sight of the fact that there are plenty of credible sources in the world, and that you still have not provided one that supports your position. I think we both know what that really means.


----------



## TomCat

Cholly said:


> In actuality, when film is projected, each frame is shown twice before advancing to the next frame. This fools the eye into perceiving 48 fps, and is thus faster than eye response time. To the eye, flicker then doesn't exist.
> Home movies used to be shot at 16 fps. Once again, each frame is projected twice, giving the appearance of 32 fps. Think image1/blank/image1/blank with advance/image2/blank/image2/blank with advance/image3/etc.


But that is a later improvement. Film was originally also shown at 24 fps, and the doubling was a later development. Flicker is also a function of both screen size and brightness (which I found surprising), so it became more of a problem as movies were shown on larger screens with brighter projectors, which is about when this technique was first employed.

And while it reduces flicker, it does not improve motion artifacting at all. If the film were actually _shot _at 48 fps, then it would.


----------



## TomCat

woj027 said:


> ...1. What are the different means of recording HD? (Is this the correct answer 720i24, 720i30, 720i60, 720i24, 720i30, 720i60, 1080i24, 1080i30, 1080i60, 1080p24, 1080p30, 1080p60, others?)...
> 
> my HD TV (if designed using todays TV Standards I don't want to get into the Can't, Doesn't discussion) may very well take that HD feed transmitted in 720i24 and convert it to some other type of viewing image (that is what I see) My TV could convert that to 480p, 1080i, 1080p, and so on? Correct?
> 
> 4b Ok using that same 720i24 feed, my TV could also covert (pulldown) that image as 3:2, 16:9 (or another stardard format Pillar, Letterbox, Stretch) correct?...
> 
> 7. After all that, what's the best (undefined word) format to view something that is transmitted (telecine'd) OTA received by my HDTV' and its' decoder, pulled down into ?( 3:2, 16:9) and displayed in 1080p (using today's technology)
> ...


There is 720p60, 1080p24, 1080p60, 1080i30, and 480i30. These are approved transmission formats for ATSC. There are also the 1/1000 variants using 29.97 and 59.94 fps. For acquisition, there are consumer variants which probably include 720p24, and there are DVB variants such as 1280x1080i30 and 1440x1080i30. There are no interlaced versions of 720 that I know of. The perceived resolution of 1080i30 and 720p60 is actually about the same, which is why both are viable. An interlaced 720 would not have the same perceived resolution, which is probably why it doesn't exist. Most stuff is shot at 1080p24 or 1080p60. FOX shoots "American Idol" at 720p60, and that is also becoming more prevalent.

But how you display them is typically tied to the native resolution and scanning scheme of your display. A 1080p display scales everything to 1080, and deinterlaces all interlaced content to progressive, no matter how it is shot or how it arrives. It could be argued that the best way to display something is in the original format, but there are conditions and gotchas there, and usually it doesn't really matter if something gets cross-converted along the way.
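The fixed-pixel behavior described above can be sketched as a tiny decision function (hypothetical helper and step names; a real TV folds these steps into one scaler chip):

```python
# Sketch of what a 1080p fixed-pixel panel does to any incoming format:
# deinterlace if the source is interlaced, then scale anything that isn't
# already 1920x1080 to the native raster.

NATIVE = (1920, 1080)

def display_pipeline(width, height, scan):
    """scan is 'i' or 'p'; returns the processing steps applied."""
    steps = []
    if scan == "i":
        steps.append("deinterlace to progressive")
    if (width, height) != NATIVE:
        steps.append(f"scale {width}x{height} -> {NATIVE[0]}x{NATIVE[1]}")
    return steps or ["display natively"]

print(display_pipeline(1280, 720, "p"))   # -> ['scale 1280x720 -> 1920x1080']
print(display_pipeline(1920, 1080, "i"))  # -> ['deinterlace to progressive']
print(display_pipeline(1920, 1080, "p"))  # -> ['display natively']
```

Only native-resolution progressive input skips both steps, which is the "original format" case where no conversion happens at all.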

Pulldown only happens to content that was first pulled up. Often though, content that is originally 1080p24 will have the equivalent of pulldown created at the decoder in order to keep it compliant with interstitial 1080i30 material.

Was anyone aware that the NFL Network transmits everything in 1080p24? I guess they have had such a rich history with film that they find 720p and 1080i30, which virtually every other channel uses, unacceptable.


----------



## Stewart Vernon

TomCat said:


> I get it. Wiki not credible. I disagree, but let's move on. I would hate to get distracted by the bashing and so lose sight of the fact that there are plenty of credible sources in the world, and that you still have not provided one that supports your position. I think we both know what that really means.


I'm not here to prove anything, and I agree I don't want to bicker especially when it would seem we mostly agree on everything aside from this one aspect. My only point of contention really is that there is no evidence to suggest that a progressive display cannot display an interlaced image. I can think of no reason why they can't, and you have not provided any evidence yourself that says they can't. You've said (and cited a source) that says they mostly don't... but don't and can't are different beasts.

Logically I can't think of any reason why interlaced cannot be displayed on a progressive monitor.. and as I said, I've done it myself on my LCD computer monitor. You can too, there are interlaced JPG files you can save and they get displayed that way on your monitor. Granted, with a fast enough computer it completes faster than you can sometimes see... but it is still doable.

Doing interlaced on progressive is easy... progressive on interlaced, however, is virtually impossible thus the need for improved technology.


----------



## Cholly

TomCat said:


> But that is a later improvement. Film was originally also shown at 24 fps, and the doubling improvement a later development. Flicker is also a function of both screen size and brightness (which I found surprising), so became more of a problem as movies were shown on larger screens with brighter projectors, which is about the time this technique was first employed.
> 
> And while it reduces flicker, it does not improve motion artifacting at all. If the film were actually _shot _at 48 fps, then it would.


While in the army in the 50's and later at NBC in Chicago, I worked with Simplex 35 mm projectors, which had shutters that operated at 48 fps, while the frame advance was 24 fps, so I wouldn't call it a "later development". I believe that the 48 fps projection concept dates back to the early days of sound. As to the 16/32 fps concept, that also dates back to the thirties.


----------



## TomCat

Cholly said:


> While in the army in the 50's and later at NBC in Chicago, I worked with Simplex 35 mm projectors, which had shutters that operated at 48 fps, while the frame advance was 24 fps, so I wouldn't call it a "later development". I believe that the 48 fps projection concept dates back to the early days of sound. As to the 16/32 fps concept, that also dates back to the thirties.


I think that only supports what I said. Movies have been around since the time of Edison, have they not? That would make even the "thirties" much later, so I'm a bit puzzled why characterizing this as a "later development" has been problematic for you. It seems we are arguing in favor of the same point, which probably means that we are not arguing at all, but agreeing.


----------



## TomCat

HDMe said:


> I'm not here to prove anything...


Well then, congratulations must probably be in order. You've been very successful in not proving anything as well as in not supporting your allegation.



HDMe said:


> ...don't and can't are different beasts.
> 
> Logically I can't think of any reason why interlaced cannot be displayed on a progressive monitor.. and as I said, I've done it myself on my LCD computer monitor. You can too, there are interlaced JPG files you can save and they get displayed that way on your monitor. Granted, with a fast enough computer it completes faster than you can sometimes see... but it is still doable...


You seem to be very, very confused about the basic way that these things work. All computer monitors ALSO only work in the progressive mode. It doesn't matter what the source material is, it matters what the display mode of the monitor itself is.

And if you have any modern computer operating system, there is typically a control panel or some other utility to interface with the various available parameters of the monitor. There are numerous refresh rates, there are bit levels, there are pixel resolutions. Pick the combo that works for you. But not one of them offers an ability to change from progressive to interlaced. And there is a very good reason why, which is the same reason why modern flat-panel HDTV displays also don't have that capability.

They can't. That's right, I said it, can't. Not don't, which while slightly different is every bit as good a reason since the outcome is exactly the same. Not don't, can't. Can not.

So do I need to provide proof that something can't be done, when history, absent any proof from you or anyone else, shows that it never has been? I hardly think so. But since indeed it does not occur, the burden is probably on those who claim it can be done to prove that it can.

"Interlaced JPG"? I have grave doubts about that being a possibility as well, since the term "interlaced" refers to how something is displayed rather than how it exists. Interlacing implies a difference in time between when one part of an image is displayed and when the other is displayed; that is its basic core definition. A JPG is a file format and refers to a type of file, and a file exists, all parts of it, all at the same time. So, "interlaced" being a temporal state, or an adjective describing one in this case, means that an "interlaced JPG" is a completely ludicrous concept, not like a "warm thought" or a "cold war", which can only be understood as metaphors, not as a thought that is actually warm or a war that is actually cold. In the same way, JPGs can't be "interlaced". Only the display of the file or video image, which CAN be a temporally-divergent process, can be interlaced, not the file or image itself, which can't exist partially at one time and partially at another. I can't believe anyone should have to even begin to explain something this basic.

And again, all of this is an argument I never wanted to be a part of. "Don't" is perfectly acceptable as a reality to me, even if "can't" is only theoretical (a concept I vehemently disagree with). I live in the real world, where "don't" means it doesn't happen. "Can't" really isn't important, because if they "don't" display in interlaced mode (which they don't) then I "can't" get them to. Which means to me, that they "can't".

And, absent proof, which you have had at least 3 chances to bring us already, I am then through discussing it. Period. I "don't" want to anymore, whether I "can't" or not.


----------



## Stewart Vernon

Sorry, TomCat. I've tried to be polite and don't want to get into a personal battle with you... but you haven't provided any proof yourself except to say "I haven't seen it so it can't be done". That's not proof.

But again, I don't need to prove anything other than use common sense. Don't does not equal can't, and just because something doesn't happen (or at least not often) doesn't mean it can't.

I did make one misstep, though... I should have said interlaced GIF. There is a GIF format that stores its rows in an interlaced order to give essentially a "preview" of a graphic before it has finished loading. The interlacing is encoded into the file itself. Oddly enough, the same feature in the JPG format is called "progressive", which just adds confusion.
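For the curious, the four-pass row ordering that interlaced GIF uses can be sketched in a few lines of Python. The function name is mine, not anything from a real decoder; it just shows the coarse-to-fine order in which rows arrive:

```python
def gif_interlace_order(height):
    """Row order in which an interlaced GIF stores (and a decoder
    paints) image rows: four passes of increasing density."""
    passes = [(0, 8), (4, 8), (2, 4), (1, 2)]  # (start row, step) per pass
    order = []
    for start, step in passes:
        order.extend(range(start, height, step))
    return order

# For an 8-row image, rows arrive coarse-to-fine:
print(gif_interlace_order(8))  # [0, 4, 2, 6, 1, 3, 5, 7]
```

Every row appears exactly once, but early passes paint a rough version of the whole image, which is where the "preview" effect comes from.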

The real point is... A display is just a display. A display itself is really neither progressive nor interlaced. The method of the scan determines whether the resulting image is a progressive one or an interlaced one.

There is a limiting factor in the scan and refresh rates of older CRT technology that essentially prevented progressive scanning, and that is the origin of interlaced display technology. As improvements happened, we finally got to progressive. By the time plasma, LCD, DLP, etc. arrived, there was no need to make them interlaced because the technology for progressive scan already existed.

BUT, it is a simple matter of software inside any of these devices to perform an interlaced scan rather than a progressive one. Any system that can perform a 60fps progressive scan could certainly perform 30fps of interlaced scanning (two interlaced fields equalling one complete frame).
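The two-fields-make-one-frame relationship described above can be sketched in Python. This is a toy "weave" deinterlace with made-up row labels rather than real video data, just to show how a top field (even rows) and a bottom field (odd rows) combine:

```python
def weave(top_field, bottom_field):
    """Weave deinterlace: interleave a top field (even rows) and a
    bottom field (odd rows) into one full progressive frame.
    Fields are lists of rows; the frame has len(top)+len(bottom) rows."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)  # even row, from the top field
        frame.append(b)  # odd row, from the bottom field
    return frame

# Two 2-row fields weave into one 4-row frame:
print(weave(["row0", "row2"], ["row1", "row3"]))
# ['row0', 'row1', 'row2', 'row3']
```

Going the other way (splitting one frame into two fields that are scanned out at different times) is exactly what an interlaced scan does.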

The only reason they don't do it, is because why would you? You can put your car in neutral and push it everywhere to save gas, but you wouldn't do it because it wouldn't make sense to have a car that can move itself and not use it that way... but you wouldn't say you can't just because you don't.

Again, it's just common sense here relating to the technology... and the more you argue what I initially took as a mistake on your original post, the less I feel you actually know about what you are talking about.


----------



## TomCat

It seems to me that the simple fact that no non-CRT display you and I could walk into any BestBuy and point to CAN display in interlaced mode is more than proof enough that all of them CAN'T.

I can hardly even believe I'm still deep in this nightmare, but as long as I am let me try one final simple example: Most of us here are familiar with both the HR10-250 and the HR2x HD DVR platforms. One can decode both MPEG-4 as well as MPEG-2 encoded content (the HR2x), and the other can only decode MPEG-2 content (the HR10). Isn't it much more accurate to say that the HR10-250 "CAN'T" decode MPEG-4 content than to say that it "DOESN'T"? Would not that same argument also apply to every available non-CRT HD or computer display ever for sale in the USA? That it is not a matter of that they "DON'T" support interlaced scanning, but that they "CAN'T"?

The reason they can't, is because they are not designed to and do not have the firmware that instructs them how to. The reason the HR10-250 can't decode MPEG-4 is for exactly the same reason, which is that it is not designed to nor does the firmware have that capability. It can't decode MPEG-4, just like a modern display CAN'T display in interlaced mode, and both CAN'T, for pretty much the same exact reason. And I'm really sorry, but there just is no wiggle room for "but technically they could" in either case, even if that somehow mattered. They CAN'T, they won't, and they never will.

What could be more simple to understand than that?

Holding vainly onto the microscopically-thin theoretical point that they "might" be able to under certain laboratory-curiosity circumstances is just as silly as it is meaningless, and does nothing to further your standing in any of this, in fact it does quite the opposite.

And thanks for the cheap shot. Since civility has been all but discarded, I guess I can stop holding back on how I see things as well. There was no mistake in my original post, yet you still keep clinging to the now-faded hope that you can somehow argue a point that seems to be obviously much thinner than John McCain's campaign platform. The reason you feel less and less that I actually know what I am talking about is only because you just don't (and apparently, "can't") understand what I am talking about: simple concepts with simple answers, which everyone else seems to have no trouble with. Maybe it's a blessing that you can't see how pitiful this is and how it appears to everyone else, but the fact remains that it is still indeed pitiful.

I have been nothing but consistent. You have been nothing but consistently stubborn in your refusal to face the facts, and have provided no supporting proof whatsoever, while proof of my position is all around us. If you could only get out of the way of your own ego you might actually learn something. History has shown me that folks who have these issues seldom learn much at all, and typically never change. It's fine to go through life that way if that is your choice, but you take the chance of being called out if you try to post wacky ideas as if they are established facts. Believe me, being the one who calls that out is hardly a position I am at all comfortable being in, but it's still a part of duty and due diligence as a forum member, and I probably won't ever shrink in the face of such duty, even as distasteful as I might find it.

I'm probably also taking a chance that you can understand and distinguish frankness, which is what this is, from rudeness or attack, which it is not. But then that is no longer my problem, and I wish you well and hope you find both yourself and true understanding one day.

Peace, but I'm out. You're on your own.


----------



## Stewart Vernon

TomCat said:


> I can hardly even believe I'm still deep in this nightmare, but as long as I am let me try one final simple example: Most of us here are familiar with both the HR10-250 and the HR2x HD DVR platforms.


Since this is a non-provider-specific forum, I don't know that most of us are familiar with DirecTV receivers. I know I'm not, as I am a Dish customer and haven't had DirecTV in many years, since way before HD... so I can't say much about the specifics of their equipment.



TomCat said:


> One can decode both MPEG-4 as well as MPEG-2 encoded content (the HR2x), and the other can only decode MPEG-2 content (the HR10). Isn't it much more accurate to say that the HR10-250 "CAN'T" decode MPEG-4 content than to say that it "DOESN'T"? Would not that same argument also apply to every available non-CRT HD or computer display ever for sale in the USA? That it is not a matter of that they "DON'T" support interlaced scanning, but that they "CAN'T"?


MPEG compression/decoding and displaying on a monitor are not even as similar as apples and oranges, so I'm not sure what the relevance here is. Some hardware genuinely can't decode MPEG4 in real-time because it lacks the horsepower, no matter what you update in the firmware. In that case "can't" would absolutely apply... but Dish, for example, has some MPEG2 channels that they flagged for their VIP receivers only... so receivers that could get those channels don't, not because of a lack of capability but because of a conscious decision by Dish to prevent it. In that case, "don't" is more accurate because it isn't impossible, just not enabled.



TomCat said:


> And thanks for the cheap shot. Since civility has been all but discarded, I guess I can now not hold back on how I see things as well. There was no mistake on my original post, yet you still keep clinging to the now-faded hope that you can somehow argue a point that seems to be obviously much thinner than John McCain's campaign platform.


Not sure what political platforms have to do with anything here at all.

I really don't even understand what the argument is about anyway. If you truly understand how interlaced scanning works and how progressive scanning works, then there really is nothing to discuss. If you don't, then nothing I say is going to be of value.


----------



## Jim5506

All digital displays are designed to accept 480i, 480p, 720p and 1080i input and internally convert it to match the frequency and resolution of the attached screen, whether it be plasma, LCD or whatever.

Now manufacturers are adding the capability to handle 1080p to the menu.

Nearly ALL of these digital displays "publish" their data to the screen progressively and these screens are built to display progressive ONLY, they CANNOT accept an interlaced output from their electronics, but they don't get one, so don't worry.

In actuality, nearly ALL digital displays are natively progressive, but their electronics can receive and correctly interpret (de-interlace) an interlaced source. The recent difference is that the brains have been upgraded and improved to correctly accept 1080p and use it.
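As a toy illustration of the conversion pipeline described above (the table and function names here are my own, not real display firmware): whatever format arrives, the electronics deinterlace if needed, rescale to the panel's fixed native resolution, and scan out progressively.

```python
# Common broadcast input formats: (width, height, interlaced?)
ACCEPTED = {"480i": (640, 480, True), "480p": (640, 480, False),
            "720p": (1280, 720, False), "1080i": (1920, 1080, True),
            "1080p": (1920, 1080, False)}

def to_native(signal, native=(1920, 1080)):
    """List the processing steps a 1080p panel would apply to a signal."""
    w, h, interlaced = ACCEPTED[signal]
    steps = []
    if interlaced:
        steps.append("deinterlace")  # pair fields into full frames
    if (w, h) != native:
        steps.append(f"rescale {w}x{h} -> {native[0]}x{native[1]}")
    steps.append("progressive scan-out")  # the panel only paints progressively
    return steps

print(to_native("1080i"))  # ['deinterlace', 'progressive scan-out']
```

Note that for a 1080i input into a 1080p panel, only deinterlacing is needed and no rescaling occurs, which is the point rudeney made earlier in the thread.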


----------



## TomCat

HDMe said:


> ...I really don't even understand what the argument is about anyway. If you truly understand how interlaced scanning works and how progressive scanning works, then there really is nothing to discuss. If you don't, then nothing I say is going to be of value.


Anyone who ever took basic communications classes at a community college knows how interlaced and progressive scanning work; it's usually taught in the first week of classes for any communications degree. And it's not brain surgery or rocket science, it's really pretty simple. Plus, you're not exactly dealin' with a chimp here, so don't even begin to try to imply that I don't understand it.

Since you seem puzzled by this, I will explain what the argument is, even though it's not my argument, it's yours. Unfortunately I can't _ s p e a k v e r y, v e r y s l o w l y _ as I would if we were having an actual conversation, so you will have to really concentrate. Here goes. Try to keep up. You seem to want to cling desperately to a semantic argument regarding the philosophical difference between "can't" and "don't" as it applies to modern flat-panel displays' inability to display in interlaced mode, probably because you painted yourself very firmly into a corner on that a couple of posts ago and haven't got the stones to admit it.

I, on the other hand, am perfectly happy with either that they "can't" or that they "don't", because the end result is that they "don't", whether they "can" or not, making whether they "can" or not a totally moot point. Your take on this is exactly as loony an argument as it sounds, and a thin excuse for you to smokescreen the real issue, which is that it really doesn't matter if they "can" if you can't produce an example of one that "can" that isn't some laboratory curiosity. Which you can't.

The only thing you have said so far that seems to be even close to accurate is the last 9 words of your post.


----------



## TomCat

Jim5506 said:


> All digital displays are designed to accept 480i, 480p, 720p and 1080i input and internally convert it to match the frequency and resolution of the attached screen, whether it be plasma, LCD or whatever.
> 
> Now manufacturers are adding the capability to handle 1080p to the menu.
> 
> Nearly ALL of these digital displays "publish" their data to the screen progressively and these screens are built to display progressive ONLY, they CANNOT accept an interlaced output from their electronics, but they don't get one, so don't worry.
> 
> In actuality Nearly ALL digital displays are natively progressive in nature, but their electronics can receive and correctly interpret (de-interlace) an interlaced source. The difference recently is that the brains have been upgraded, improved, to be able to correctly accept 1080p and use it.


Well put, counselor. You've said it all. I rest my case.


----------



## Stewart Vernon

Is there any reason why a 2-month old thread has been revived simply to take personal shots? I thought the thread resolved itself already and had forgotten about it myself.


----------



## n3ntj

I am almost sorry I created the thread... I was simply curious to know if any networks may eventually move (anytime soon) to a 1080p format from whatever they are using now (either 720p or 1080i), since most brands offer 1080p sets... I got my answer, which was 'probably not'... Calm down. ;-)


----------



## harsh

There will be little interest in doing 1080p network broadcasts until broadcasters are given a 1080p standard to work with. As it is, none of the existing OTA and CATV equipment supports it, so it would be another transition to get there, with new tuners and additional channels to carry the new content.


----------

