# Why not in 1080p



## 1953 (Feb 7, 2006)

We just watched Imitation Game from DirecTV. Checking the movie's resolution, I found it to be 720p. The highest I could set the resolution to was 1080i. Why did DTV show the movie in 720p and not in 1080p? My next question is why I couldn't change the resolution to 1080p. In TV Resolutions all boxes are checked. And of course our Sony HX750 is 1080p.

Irritated. Look forward to your comments.


----------



## sigma1914 (Sep 5, 2006)

It needs to be a 1080p title which is indicated on some PPV titles.


----------



## peds48 (Jan 11, 2008)

According to this

(screenshot of the movie's DirecTV listing)

The movie is available in 1080p. Is your receiver connected with HDMI?


----------



## inkahauts (Nov 13, 2006)

And you need to make sure you select the 1080p version to watch as well.


----------



## jimmie57 (Jun 26, 2010)

1953 said:


> We just watched Imitation Game from DirecTV. Checking the movie's resolution, I found it to be 720p. The highest I could set the resolution to was 1080i. Why did DTV show the movie in 720p and not in 1080p? My next question is why I couldn't change the resolution to 1080p. In TV Resolutions all boxes are checked. And of course our Sony HX750 is 1080p.
> 
> Irritated. Look forward to your comments.


Check the settings on your TV and make sure it is set to use 1080p/24 (True Cinema). The CNET web site shows that your set supports this resolution.

Your manual shows these are available. I have not found whether they are automatic or you have to choose. Still looking.
*Video (2D): 1080p (30, 60 Hz), 1080/24p, 1080i (60 Hz), 720p (30, 60 Hz), 720/24p,*
*480p, 480i, PC Formats*

In the manual it shows you have a choice of Cinema 1 or Cinema 2. You will have to see it on screen to know which one to choose. http://docs.esupport.sony.com/imanual/NA/EN/hx750/c_picscrn_picmode.html

Configuring Various Settings, Selecting Picture Mode.

[Cinema 1]

Provides film-based content for a cinema-like environment.

[Cinema 2]

Provides film-based content for basic home use.


----------



## dpeters11 (May 30, 2007)

jimmie57 said:


> Check your settings in your TV and make sure it is set to use 1080p/24 ( True Cinema ). The Cnet web site shows that it does this resolution.


Sony says it does 1080p/24.

http://store.sony.com/55-class-54.6-diag.-sony-led-hx750-internet-tv-zid27-KDL55HX750/cat-27-catid-EOL-Sony-HDTVs


----------



## 1953 (Feb 7, 2006)

peds48 said:


> The movie is available in 1080p. Is your receiver connected with HDMI?


Yes


----------



## jimmie57 (Jun 26, 2010)

1953 said:


> Yes


I added to post #5.


----------



## 1953 (Feb 7, 2006)

inkahauts said:


> And you need to make sure you select the 1080p version to watch as well.


I know we paid $5.99, not $4.99. I did not check to see if it was being shown in 1080p.


----------



## 1953 (Feb 7, 2006)

jimmie57 said:


> I added to post #5.


Scene Select is set to -

[Auto (24p Sync)]
Automatically selects "Cinema" for 24Hz signal content. Behaves as "Auto" for all other signals.


----------



## 1953 (Feb 7, 2006)

I now believe we selected an HD version, but not the 1080p HD version, of Imitation Game. This is something I will check more closely in the future.


----------



## 1953 (Feb 7, 2006)

Are there any television shows routinely broadcast in 1080p?


----------



## litzdog911 (Jun 23, 2004)

1953 said:


> Are there any television shows routinely broadcast in 1080p?


No. 1080p is not an official HD broadcast format. Only 720p and 1080i.


----------



## 1953 (Feb 7, 2006)

Thank you. So now there's 4K and we're not yet receiving 1080p. My my.


----------



## jimmie57 (Jun 26, 2010)

1953 said:


> Thank you. So now there's 4K and we're not yet receiving 1080p. My my.


Your TV will probably need replacing before we get any 4K broadcasts from the satellites. Just an opinion, no facts to back it up.
If I bought a new TV today it would probably be a 1080p LED. Mine is a 2009 Samsung 1080p LCD and I love it. I possibly will not replace it until it dies one day.


----------



## Rich (Feb 22, 2007)

1953 said:


> Thank you. So now there's 4K and we're not yet receiving 1080p. My my.


That's always puzzled me too.

Rich


----------



## Rich (Feb 22, 2007)

jimmie57 said:


> Your TV will probably need replacing before we get any 4k broadcasts from the satellites. Just an opinion, no facts to back it up.
> If I bought a new TV today it would probably be a 1080p, LED. Mine is a 2009 LCD Samsung 1080p and I love it. _*I possibly will not replace it until it dies one day.*_


At least you have a chance of the TV dying. My plasmas will undoubtedly outlive me.

Rich


----------



## WestDC (Feb 9, 2008)

1953 said:


> Thank you. So now there's 4K and we're not yet receiving 1080p. My my.


I run my receiver output at 1080i, and it connects through an Onkyo 609 that upscales it to my Sony at 1080p. So I'm getting 1080p even if the source is 1080i.


----------



## James Long (Apr 17, 2003)

litzdog911 said:


> No. 1080p is not an official HD broadcast format. Only 720p and 1080i.


There is an FCC approved standard for OTA 1080p (it is required on ATSC TV tuners). I have seen no sign of any OTA broadcaster using the standard - nor do I expect any OTA broadcaster to do so.


----------



## peds48 (Jan 11, 2008)

1953 said:


> Are there any television shows routinely broadcast in 1080p?


Channel 125, The Screening Room.


----------



## TomCat (Aug 31, 2002)

James Long said:


> There is an FCC approved standard for OTA 1080p (it is required on ATSC TV tuners). I have seen no sign of any OTA broadcaster using the standard - nor do I expect any OTA broadcaster to do so.


That is true. ATSC has always had this capability, from the beginning. But it was approved in the hope that technology would somehow make it practical someday. It won't; it has been superseded by 4K/ATSC 3.0 and will likely never be used.

There is a good reason, which is that TV stations are lucky to have 6 MHz allocated to them. If you compress 1080p60 enough to fit in a 6 MHz channel, the resulting artifacts ruin the picture quality, so it's a tradeoff that makes no sense for broadcast. As it is, they have to compress 1080i/720p by roughly 100:1 just to get it to you, which means that about 99% of the original data never makes it to the transmitter and is reconstructed by the decoder in your STB.
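That ~100:1 figure can be sanity-checked with back-of-the-envelope arithmetic. A rough sketch (the numbers are assumptions, not from the thread: 1080i30 with 4:2:2 chroma, 10-bit samples, ATSC's ~19.39 Mbps payload, and a ~12 Mbps main-program share once subchannels take their cut):

```python
# Back-of-the-envelope compression ratio for ATSC broadcast HD.
# Assumed numbers (not from the thread): 1080i30 with 4:2:2 chroma,
# 10-bit samples (20 bits/pixel), active pixels only.

def raw_bitrate_bps(width, height, frames_per_sec, bits_per_pixel):
    """Uncompressed video bitrate in bits per second."""
    return width * height * frames_per_sec * bits_per_pixel

raw = raw_bitrate_bps(1920, 1080, 30, 20)   # ~1.24 Gbps uncompressed

atsc_mux = 19_390_000      # total ATSC 8-VSB payload, bits/s
main_program = 12_000_000  # typical share once subchannels take their cut

print(f"{raw / atsc_mux:.0f}:1 vs the full mux")         # 64:1
print(f"{raw / main_program:.0f}:1 vs the main program")  # 104:1
```

Against the whole mux the ratio comes out near 64:1; give the main program only part of the mux and it lands right in the ~100:1 ballpark quoted above.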

First, 1080p as a display format in a TV is very different from 1080p as a pixel-map resolution. The display format has ramifications for perceived resolution; the pixel map has none, and is simply an arbitrary choice of target resolution. If you confuse the two meanings, you will have an even harder time understanding why each is or is not important. It does not help that both are referred to as "1080p"; they are not the same thing.

Second, 1080i30 and 720p60 have essentially the same perceived resolution in double-blind studies, meaning that viewers actually can't tell them apart. The small advantage of 720p is smoother motion, while the small advantage of 1080i is a bit more resolution on static, non-moving video; neither is actually perceived. If you can't tell them apart, which is better?

720p30, common on the internet, is not quite as good as 1080i30, for obvious reasons, but you have to be sitting 7.8 ft or less from a 60-inch set to really take advantage of that; most people sit far enough back that any flavor of 720p or 1080i looks essentially the same.

Finally, 1080p24 may not be as good as 1080i30, because it has poorer motion performance, so the perception that 1080p (which in broadcast terms is always 1080p24) is better than 1080i30, or 720p60 for that matter, is a false perception, and basically a marketing hook more than something that actually matters.

1080p24 is best, and actually a bit superior, when used with interpolated frames, but most folks don't even like that, dubbing it the "soap opera effect". But guess what those interpolated frames do: they turn 1080p24 into virtual 1080p60, or 1080p120, or 1080p240, or 1080p480, maybe even 1080p960.
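The cadence arithmetic behind that is simple: 120/24 = 5, so a set interpolating to 120 Hz has to synthesize 4 frames between each pair of 24p originals. A toy sketch (the function name is made up, and a linear blend over numeric "frames" stands in for real motion-vector interpolation):

```python
# Toy sketch of 24p -> 120 Hz motion interpolation. Real TVs estimate
# motion vectors; here a linear blend between numeric "frames" stands in.
def interpolate_24_to_120(frames):
    out = []
    for a, b in zip(frames, frames[1:]):
        for k in range(5):            # 120/24 = 5: original + 4 synthetic
            t = k / 5
            out.append(a * (1 - t) + b * t)
    out.append(frames[-1])            # last original has nothing to blend toward
    return out

one_second = interpolate_24_to_120(list(range(24)))
print(len(one_second))   # 116: five per pair across 23 pairs, plus the last frame
print(one_second[:6])    # [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
```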

So, is 1080p better? Probably really not.


----------



## peds48 (Jan 11, 2008)

TomCat said:


> That is true. ATSC has always had this capability, from the beginning. But it was approved in the hope that technology would somehow make it practical someday. It won't; it has been superseded by 4K/ATSC 3.0 and will likely never be used.
> 
> There is a good reason, which is that TV stations are lucky to have 6 MHz allocated to them. If you compress 1080p60 enough to fit in a 6 MHz channel, the resulting artifacts ruin the picture quality, so it's a tradeoff that makes no sense for broadcast. As it is, they have to compress 1080i/720p by roughly 100:1 just to get it to you, which means that about 99% of the original data never makes it to the transmitter and is reconstructed by the decoder in your STB.
> 
> ...


Awesome post!


----------



## TomCat (Aug 31, 2002)

Thanks.

One thing I left out is motion judder, something that will be an artifact in 1080i30 when the content originates at 24 fps, and not an issue at all with 1080p24. But judder is an artifact that we essentially grew up with, and that most of us find natural and not something that detracts from the viewing experience. The motion blurring in interlaced content sort of makes the judder less apparent, too. But technically, in that aspect, 1080p24 can be perceived as better than 1080i30. "Technically" better, but possibly not better as a viewing experience.

Actually, the absence of flicker and judder that comes with interpolated frames is exactly why the "soap opera effect" appears as it does. Probably because judder and flicker were so common as most of us grew up, many have rejected the smoother, judder-free experience that interpolated frames bring. It is supposed to be an improvement, and technically it definitely is, but folks naturally associate smooth, judder-free motion with "video": video shot for analog TV was shot at 30 fps, so it needed no 3:2 pulldown and had no judder. They likewise associate the judder that 3:2 pulldown adds to movies shot on 24fps film (along with film's higher flicker threshold) with a "movie" experience rather than a "soap opera" experience.

So the emotional component and what folks are used to and what they expect plays into the experience heavily. Clinging to what we expect is a legacy holdover; something we just have had a hard time adjusting to. No judder and a high refresh rate are better, technically, but for many, not acceptable.

Roger Ebert had a theory that 24fps was closer to brain wave frequencies that induced a dreamlike state of mind that brought the viewer into the story better while higher refresh rates and absence of 24-frame 3:2 pulldown judder were not associated as well, so film, simply by its built in artifacts, made the viewing experience more immersive than 30fps video. There seems to be nothing to support this, but it certainly is an interesting idea.


----------



## slice1900 (Feb 14, 2013)

TomCat said:


> That is true. ATSC has always had this capability, from the beginning. But it was approved in the hope that technology would somehow make it practical someday. It won't; it has been superseded by 4K/ATSC 3.0 and will likely never be used.
> 
> There is a good reason, which is that TV stations are lucky to have 6 MHz allocated to them. If you compress 1080p60 enough to fit in a 6 MHz channel, the resulting artifacts ruin the picture quality, so it's a tradeoff that makes no sense for broadcast. As it is, they have to compress 1080i/720p by roughly 100:1 just to get it to you, which means that about 99% of the original data never makes it to the transmitter and is reconstructed by the decoder in your STB.


The ATSC standard for 1080p allowed the use of MPEG4, so bandwidth would be fine. But that makes it even "more" incompatible with existing ATSC tuners, since many would be unable to handle MPEG4 encoding.

While 1080p60 is better, it isn't enough of an improvement over the existing 720p60, which every ATSC tuner and HDTV can already handle, to be worth the incompatibility. It remains to be seen whether broadcasters will even adopt ATSC 3.0 to go 4K, which is a much larger step. There is little benefit to them, because they aren't going to get more ad revenue for 4K programming. Maybe they can get cable/satellite providers to pay more for a 4K feed, but they're already getting pushback on the fee increases they're seeking today.


----------



## Shades228 (Mar 18, 2008)

TomCat said:


> That is true. ATSC has always had this capability, from the beginning. But it was approved in the hope that technology would somehow make it practical someday. It won't; it has been superseded by 4K/ATSC 3.0 and will likely never be used.
> 
> There is a good reason, which is that TV stations are lucky to have 6 MHz allocated to them. If you compress 1080p60 enough to fit in a 6 MHz channel, the resulting artifacts ruin the picture quality, so it's a tradeoff that makes no sense for broadcast. As it is, they have to compress 1080i/720p by roughly 100:1 just to get it to you, which means that about 99% of the original data never makes it to the transmitter and is reconstructed by the decoder in your STB.
> 
> ...


Next thing you know you're going to tell me that my 10000000000000000000:1 contrast ratio is meaningless as well.


----------



## jimmie57 (Jun 26, 2010)

Shades228 said:


> Next thing you know you're going to tell me that my 10000000000000000000:1 contrast ratio is meaningless as well.


I found this article very interesting; looking at the specs on several TV brands, they do not even list a dynamic contrast ratio today.
http://www.cnet.com/news/contrast-ratio-or-how-every-tv-manufacturer-lies-to-you/

and this from Wikipedia.
http://en.wikipedia.org/wiki/Contrast_ratio


----------



## miss_my_utv (Jul 25, 2007)

TomCat said:


> <snip>
> 
> So, is 1080p better? Probably really not.


Thanks for the educational post.

Some supporting info on the topic:

http://www.cnet.com/how-to/1080i-and-1080p-are-the-same-resolution/

http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/


----------



## Rich (Feb 22, 2007)

Shades228 said:


> Next thing you know you're going to tell me that my 10000000000000000000:1 contrast ratio is meaningless as well.


Or that direct current has phases. :rolling:

Rich


----------



## TomCat (Aug 31, 2002)

Shades228 said:


> Next thing you know you're going to tell me that my 10000000000000000000:1 contrast ratio is meaningless as well.


CR is actually very important. HDR is one of the things new TVs will be bringing us, and some say it is more important than the rez of 4K.

I feel that a good handle on gamma control is even more important, because it lets you extend the blacks and darks giving not only good CR, but better visibility of the darks.

Just to set the record straight, you do not see any characterization by me, either in my posts or in your quote of my posts, that refers to anything as meaningless. Overhyped, maybe. Perceived as more valuable than it really is, pretty definitely. You will never see me get on the "bash" bandwagon simply to make myself look smart, because smart don't do that. Ferschizzel.

BTW, the Vizio 55" 4K display I've seen at a couple of WalMarts is pretty amazing.


----------



## Delroy E Walleye (Jun 9, 2012)

I definitely find something very off-putting about 30fps video (see my whiny rants about Super Bowl halftime shows). If motion is going to be "dumbed-down," I guess it "feels" better (to me, at least) if it's taken all the way down to 24 rather than leaving it at 30.

_Interlaced_ 30 fps with full motion being properly rendered at 60 *fields* per second definitely looks more natural for video content, whether blurry or not.

I have had the exquisite experience of seeing 1080p60, and have wondered in the past why it's rarely used. There's plenty of info out there from fans of HFR, along with downloadable files that can be tried on various displays.

I do, however, understand why premium channels and the like stick with 1080i, for both bandwidth _and_ appearance reasons. I've no quarrel with this. Filmed content looks just fine and natural, as does video, plus they never need to switch between the two.


----------



## Laxguy (Dec 2, 2010)

Interjection: You can see 1080 @ 24fps on Channel 125. (If your TV supports that.)


----------



## Delroy E Walleye (Jun 9, 2012)

Laxguy said:


> Interjection: You can see 1080 @ 24fps on Channel 125. (If your TV supports that.)


Useful for checking your 1080p connection, but not much else. My guess is that some (not all) of it is crappily down-converted from 1080i content. It should look much better than it does. Some of it is unwatchably herky-jerky!

It makes the HR graphics look great, though. Also better than the low-def version.


----------



## Laxguy (Dec 2, 2010)

Really? They seem to be trailers from recently released movies. None was unwatchable, no artifacts. 

What TV did you watch on?


----------



## NR4P (Jan 16, 2007)

I never bothered with 125 till today. Went there; it looks very good. The TV reports 1080p/24.

But go to 100, which is the exact same content at 1080i/60, and it doesn't look like HD.

Tried it on two different DVRs and two different sets, and 100 is lousy. Even the DirecTV Cinema logo in the upper right is blurred.

So now I question the source content for 100 and 125, which is identical.


----------



## TomCat (Aug 31, 2002)

Delroy E Walleye said:


> I definitely find something very off-putting about 30fps video (see my whiny rants about Super Bowl halftime shows). If motion is going to be "dumbed-down," I guess it "feels" better (to me, at least) if it's taken all the way down to 24 rather than leaving it at 30.
> 
> _Interlaced_ 30 fps with full motion being properly rendered at 60 *fields* per second definitely looks more natural for video content, whether blurry or not.
> 
> ...


1080i just sort of evolved into a standard for delivery. It became convenient mostly because the standard for acquisition became 1080p60, a format from which any other common format can be derived very easily and without much degradation. It also used to be very difficult and expensive to cross-code to other formats.

But a delivery entity such as a network or station has to pick a format and stay with it. Changing formats on the fly results in long, ugly glitches and potentially harmful transients at the transmitter. You can't have a 1080i program and run 720p or 480i commercials in the middle of it; everything has to be converted to the same output format. So there is one good reason why they stick with it: they have to.

The only net using 1080p24 that I know of (sure, there may be sat channels with some such content) is NFL Network, and they probably chose that format because they have a huge NFL Films library of 24 fps content. But it means they have to broadcast everything in 1080p24 and cross-code commercials and backhauls to 1080p24. And it probably does them no real good, because most MVPDs that carry them simply convert the entire feed to 1080i30 anyway, so that they can insert _their_ 1080i commercials and promos without the bother of cross-coding every single clip.

Del, what you find "offputting" about 30 fps video is probably not related to judder, because only 24 fps content converted to 30 fps content has judder, and that usually is limited to things originally shot on film. So it is unclear what you mean. All live video is probably either 1080i30 or 720p60, while all film is probably 24 fps pulled down to 1080i30 or 720p60 for broadcast, and therefore does have judder.

And I agree that 1080i30 looks natural and pretty good, regardless of "blurring". What I mean is that once things begin to move in interlaced video, the H rez goes completely to hell. But really, that is not a problem. If you look at something in your field of vision and then quickly look at something to your left or right, the motion of what goes to your retinas due to changing what you are looking at is blurry anyway. So it is natural to us that moving video has a certain blur to it. For that moment between focusing on one point and another, resolution is not important, or even possible, and most people use that as an opportunity to blink because it is unimportant data. So having really "smooth" motion isn't even natural, which may be why so many people find _that _offputting.

But I am a fan of interpolated frames, and I am happier when a slow pan across a landscape does not have a lot of flicker and judder. If I were running things, I would mandate that all TVs have interpolated frames, and that all broadcasts be in 1080p24. That allows the source to use less compression, gives the viewer virtual 1080p60 or 1080p120 identical to real 1080p60/120, and frees up bandwidth for subchannels, even possibly in HD.


----------



## Laxguy (Dec 2, 2010)

I wonder if Del's TV doesn't handle 24fps very well.....

Del?


----------



## slice1900 (Feb 14, 2013)

TomCat said:


> But I am a fan of interpolated frames, and I am happier when a slow pan across a landscape does not have a lot of flicker and judder. If I were running things, I would mandate that all TVs have interpolated frames, and that all broadcasts be in 1080p24. That allows the source to use less compression, gives the viewer virtual 1080p60 or 1080p120 identical to real 1080p60/120, and frees up bandwidth for subchannels, even possibly in HD.


I for one am very glad you are not running things, if you would mandate the abomination of interpolated frames!


----------



## inkahauts (Nov 13, 2006)

We really wouldn't have all these problems if they had just stuck with exactly two ATSC formats, 1080p60 and 480p60, and been done with it. I've always felt that having two high-definition formats was a terrible, terrible idea. I'm glad they are attempting to avoid that with UHD.


----------



## Shades228 (Mar 18, 2008)

TomCat said:


> CR is actually very important. HDR is one of the things new TVs will be bringing us, and some say it is more important than the rez of 4K.
> 
> I feel that a good handle on gamma control is even more important, because it lets you extend the blacks and darks giving not only good CR, but better visibility of the darks.
> 
> ...


It was more of a joke about the advertised contrast ratio vs. the actual contrast ratio, which is almost impossible to quantify in a marketing context. Contrast ratio is extremely important, and unfortunately something the marketing machine has destroyed as a piece of relevant information.


----------



## KyL416 (Nov 11, 2005)

NR4P said:


> But go to 100 which is the exact same content at 1080i/60, doesn't look like HD.


The HD feed is only available via channel 125; if you're getting 1080i on channel 100, it's because your receiver is upconverting the 480i SD feed.


----------



## slice1900 (Feb 14, 2013)

inkahauts said:


> We really wouldn't have all these problems if they had just stuck with exactly two ATSC formats, 1080p60 and 480p60, and been done with it. I've always felt that having two high-definition formats was a terrible, terrible idea. I'm glad they are attempting to avoid that with UHD.


That was never an option. When ATSC was ratified, MPEG4 did not exist and almost every TV (HD or SD) sold was a CRT that could not do progressive scan.


----------



## inkahauts (Nov 13, 2006)

They most certainly could have done it. They made the standard before the TVs came out and the channels were allocated.

As for progressive-scan CRTs: I sold some. They had amazing picture quality. But there were never a lot of them because progressive scan was considered an upscale feature. If it had been the norm, everyone would have done it. Instead, companies used it for just the high-end stuff, if they did it at all.


----------



## slice1900 (Feb 14, 2013)

If they had mandated 1080p60, they couldn't have fit a decent-quality picture into the 6 MHz RF channel without higher-order modulation, which would have required a better SNR to get a picture at all. Are you suggesting they should have gone to 8 MHz channels like DVB-T? Likewise, they could have mandated progressive-scan formats only, but that was a lot more costly to implement in a CRT, especially at full HD resolutions, which is why it was only seen on high-end models.

Doing either would have made the ATSC rollout a lot more troublesome, all to save a bit of format hassle and get a slightly better picture. If they were going to mandate something to reduce format pain, they would have been _much_ better off making 480i/480p 16:9 only.

4K may repeat that mistake with the 21:9 aspect ratio, though hopefully that will be for Blu-ray / movie streaming only and no one will broadcast at that aspect ratio.


----------



## NR4P (Jan 16, 2007)

KyL416 said:


> The HD feed is only available via channel 125, if you're getting 1080i on channel 100 it's because your receiver is upconverting the 480i SD feed.


Good point.
It was not obvious to me that it was an SD channel, but I'm glad I stated that it didn't appear to be HD, so at least I knew what I was seeing.


----------



## Delroy E Walleye (Jun 9, 2012)

Laxguy said:


> I wonder if Del's TV doesn't handle 24fps very well.....
> 
> Del?


My TV handles all formats, and very well, I might add. 720p (and especially D*'s crappy 480i) looks much better when I let the TV convert it from the "native" setting, rather than the HR.

1080p24 looks excellent, whether BD or D* PPV. My complaint with the "screening room" is their handling of the source material. I'll admit it seems to have improved in the last month or so, but I believe it should look much better than it does. It's nowhere near the PQ of the actual movies.

I can only theorize that folks who can't perceive the herky-jerkiness of some of the badly converted trailers must have sets with motion interpolation. My lower-priced plasma has no such feature, so what I see is what's being transmitted.


----------



## Delroy E Walleye (Jun 9, 2012)

TomCat said:


> --<snip>--
> 
> Del, what you find "offputting" about 30 fps video is probably not related to judder, because *only 24 fps content converted to 30 fps content has judder*, and that usually is limited to things originally shot on film. So it is unclear what you mean. All live video is probably either 1080i30 or 720p60, while all film is probably 24 fps pulled down to 1080i30 or 720p60 for broadcast, and therefore does have judder.
> 
> ...


I must disagree with the part of the sentence in bold. There_ is_ such a thing as video being juddered to 30fps. It's done plenty these days.

I think that the problem I have is that for me, 24fps and 60fps feel "natural," while 30fps contains judder at a frequency that I find personally off-putting. I don't want to further confuse the issue of fields vs frames, but full-motion (live) interlaced TV looks natural (motion-wise) because the screen is being "painted" 60 times per second, naturally "interpolating" the motion, as it always has.

I guess it's hard to describe it in words. There's no way for me to really demonstrate the difference between 24, 30 and 60fps to show what I mean.

I guess the thing for me to do is to make sure the next set I buy has frame rate interpolation in it that can be turned on and off, so that I can engage it when such "offending" 30fps content is displayed, and disengage it when watching "filmed" 24fps content.

I think you may have touched upon something with Ebert's theory of frequency. Kind of makes sense to me.


----------



## Laxguy (Dec 2, 2010)

Delroy E Walleye said:


> My TV handles all formats, and very well, I might add. 720p (and esp D*'s crap 480i) look much better letting the TV convert them from "native" setting, rather than the HR.
> 
> 1080p24 looks excellent, whether BD or D* PPV. My complaint with the "screening room" is their handling of the source material. I'll admit it seems to have improved in the last month or so, but I believe it should look much better than it does. It's nowhere near the PQ of the actual movies.
> 
> I can only theorize that folks not being able to perceive "herky-jerkyness" of some of the badly-converted trailers must have sets with motion interpolation in them. My lower-priced plasma has no such features, so what I see is what's being transmitted.


You say that almost like it's a bad thing to have a set that doesn't show jerky-jerky pictures!  So, I can't agree with your first sentence.

At the same time, you may have watched a segment different from what I've seen. But when you're comparing the trailers (at 1080p@24, I believe) to the actual movies, the movies are mostly at 1080p@24 too, no? Or is there something different?


----------



## Delroy E Walleye (Jun 9, 2012)

Laxguy said:


> You say that almost like it's a bad thing to have a set that doesn't show jerky-jerky pictures!  So, I can't agree with your first sentence.
> 
> At the same time, you may have watched a segment different from what I've seen. But when you're comparing the trailers (at 1080p@24, I believe) to the actual movies, the movies are mostly at 1080p@24 too, no? Or is there something different?


No, quite the opposite! It really *is* a good thing to have a set that can correct horrible picture errors. Mine doesn't (nor did my older CRT HD sets, although they displayed very nice 480, both i and especially p).

As I've said, my current set displays excellent 1080p24 pictures (when properly encoded, such as BD or D*PPV). The problem I experience with the "screening room" is with their lousy encoding/downconverting or whatever they're doing to some of those trailers to make them play at 24fps. A few of the frames in the badly-converted ones actually appear torn!

And you bring up another reason why I have a hard time understanding this: the actual original source material is (or at least *should* already _be_) 24fps! I can only theorize that some of the studios are providing 30 or 60 Hz trailers and they're just not translating properly into that particular channel's 24fps encoders.


----------



## Laxguy (Dec 2, 2010)

Hmmmm. How about posting a specific trailer where it blocks or stutters or shudders (whatever it does), and we can compare notes. I am watching the Inherent Vice trailer- it is not full on HQ HD, but no artifacts, either. The Imitation Game is all right, too.


----------



## Delroy E Walleye (Jun 9, 2012)

Yeah, [Edit: _Most Violent Year_, not _Inherent Vice_] I think is one of the two or so that don't seem to have motion artifacts. _Penguins_ is the other one. The rest of them seem to go in and out between smooth and jerky, with _Exodus_ being one of the worst by far. Downright ridiculous, especially for what's trying to be a "grand spectacular" production. It's to LOL at! Some of the worst motion occurs in the grandest shots.

Now granted, there are a few clips in _Exodus_ where the motion is slowed intentionally for effect (which looks even stupider when it's not smooth).

If you have time and want to try an experiment with your HR, try advancing frame by frame while paused. It should advance one frame for every button push if properly rendered. Note how many times frames seem to be duplicated. This should never happen in shots containing any motion in a 24fps signal. It's even more noticeable if there are moving graphics under the picture (the DirecTV-added stuff): they _do_ move once for every button push, while some of the movie frames above them are duplicated.
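That frame-step test amounts to counting repeated frames. A toy sketch (the helper name is made up, and numbers stand in for frames):

```python
# Sketch of the paused frame-step test: in a true 24p stream every
# frame-advance shows a new frame; repeats betray a bad conversion.
# count_duplicate_steps is a made-up helper; numbers stand in for frames.
def count_duplicate_steps(frames):
    return sum(1 for a, b in zip(frames, frames[1:]) if a == b)

clean_24p = [0, 1, 2, 3, 4, 5]               # every press advances
bad_conversion = [0, 0, 1, 2, 2, 3]          # frames 0 and 2 shown twice
print(count_duplicate_steps(clean_24p))      # 0
print(count_duplicate_steps(bad_conversion)) # 2
```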

If trying this experiment during Exodus clips, note the disappearing and reappearing shadows in a couple of shots. I believe this may have been what caused me to think some frames were actually "tearing."

Something that just occurred to me is that if they need to duplicate this channel in SD, they're doing it backwards: it's the low-def that should be adapted to the 24fps, not the other way around. Maybe this channel should just be 720p or 1080i. Like I've said, it's good for checking whether your set can get the 1080p signal, but not much else (for me, at least). If I were DirecTV and wanted to showcase my 1080p PPV movies, I think I'd get this fixed.


----------



## TomCat (Aug 31, 2002)

Delroy E Walleye said:


> I must disagree with the part of the sentence in bold. There_ is_ such a thing as video being juddered to 30fps. It's done plenty these days...


You may be disagreeing with something I never said or ever alluded to.

As you must know, the term "judder" as commonly used typically refers to a perceived artifact caused by 3:2 pulldown, where alternate frames are scanned two times and three times in order to get the refresh rate from 24fps up to 60 fields per second. On a pan or slow zoom this makes the motion look "jerky," because one image (scanned 3 times) appears to stay on the screen longer than the next (scanned 2 times); the flicker rate itself is a steady 60 fields per second, but the motion appears not to be. Since 24 does not divide evenly into 60, you typically need a 120Hz refresh rate to avoid it: 24 does divide evenly into 120, eliminating the need for 3:2 pulldown, and therefore eliminating judder.
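The 2:3 cadence is easy to see in a toy Python sketch (frame numbers standing in for images; a model of the cadence only, not real video processing):

```python
def pulldown_3_2(frames):
    """Expand film frames into fields with a 3:2 cadence: alternate
    frames are shown 2 and 3 times, so 24 frames become 60 fields."""
    return [f for i, f in enumerate(frames)
            for _ in range(2 if i % 2 == 0 else 3)]

fields = pulldown_3_2(list(range(24)))  # one second of film
print(len(fields))   # 60 fields per second
print(fields[:5])    # [0, 0, 1, 1, 1] -- the uneven 2:3 pattern
```

The uneven 2-then-3 repetition in the output is exactly what the eye perceives as judder on a pan.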

Since virtually all commercially broadcast linear video is either 30fps or 60fps, there is no 3:2 pulldown, and consequently no judder, when converting between them. There could be if content were acquired at 24fps, but professionally only film is generally acquired that way, so you are unlikely to see judder artifacts outside of the typical 3:2 pulldown used to convert film to 30 or 60fps for distribution or playback.

And one of the issues with 1080p24 video is that support for it is less than universal: some TVs and STBs simply cannot pull it down in a conventional way, which is exactly why some sets and STBs don't accept 1080p24 at all. The ATSC spec does not require tuners to convert from 1080p24, because that was never a broadcast format, so it is sometimes pot luck whether a TV or STB handles it properly.

So the most common place you might see judder created, other than when 24fps content is broadcast, is when you receive a 1080p24 stream and push it to a TV that lacks a 120Hz refresh rate, through an Apple TV or something like that. This causes judder from the pulldown done directly in the device between your source and your TV, just like the pulldown applied to 24fps movies when they are converted to 30fps for broadcast. It also means that any tiny advantage of 1080p24 as a source format (other than file size and compression efficiency) has just been fully negated for the viewer.

But then I actually never said that video was not "juddered to 30fps." Even though characterizing "juddered" as a process is incorrect, that is usually exactly what happens: 24fps content pulled down to 30fps is where the judder comes from, so the common process that creates judder results in 30fps. But "judder" is not a process; it is a perceived artifact resulting from a process.

"Juddered" video is nearly always from a 24fps source, and "juddered" video is nearly always "to" 30fps. So it makes very little sense that you might disagree with me saying something regarding video not being "juddered" to 30fps, because it very nearly always is. It only makes sense if you possibly did not understand what I was saying, because what I said did not include that video was not "juddered to 30fps"; it is.

And here is what I think is the most interesting thing about the pulldown process. When the content is 24fps, pulldown is necessary to convert it to the 60-field rate for broadcast. This yields 60 separate images per second: 3 identical images for each even film frame and 2 for each odd one, for a total of 60 images from the 24 frames in a second of film.

But rather than do the pulldown before transmit, ATSC transmits just the original 24 images, at 24fps, with a metadata flag invoking "film mode"; the other 36 images are created directly after the decoder in your tuner instead. That eases compression requirements, producing better quality than if all 60 fields were broadcast. 24 images are sent, but 60 fields/frames, including pulldown, are displayed.

Now, your 120Hz-capable HDTV effectively doubles those 60 images to 120. The question is, what does it do with them? If it simply spits them out serially, the judder remains: instead of 2 images for even frames and 3 for odd frames, it becomes 4 vs 6. For it to remove the judder, it has to be smart enough to pull the video back up (reversing the 2:3 cadence) to the 24 original frames, then create 5 images for each of those frames and spit them out in serial order; or it has to create 4 interpolated images between each of those 24 images. But at a minimum it has to pull the video back up first. The next question is, does every TV that boasts a 120Hz refresh rate actually do this? Or not?
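The difference between the two 120Hz strategies can be demonstrated with a toy Python sketch (frame numbers as stand-ins for images): naively doubling the 60Hz cadence leaves runs of 4 and 6 identical images, while rebuilding from the original 24 frames gives uniform runs of 5.

```python
from itertools import groupby

def run_lengths(seq):
    """Lengths of runs of identical consecutive images."""
    return [len(list(g)) for _, g in groupby(seq)]

frames = list(range(24))                      # one second of film
fields_60 = [f for i, f in enumerate(frames)  # 3:2 pulldown to 60Hz
             for _ in range(2 if i % 2 == 0 else 3)]

naive_120 = [f for f in fields_60 for _ in range(2)]  # 2:3 doubled to 4:6
smart_120 = [f for f in frames for _ in range(5)]     # 5 images per frame

print(sorted(set(run_lengths(naive_120))))  # [4, 6] -- judder survives
print(set(run_lengths(smart_120)))          # {5} -- even motion
```

Both sequences are 120 images long, but only the second has the uniform cadence that reads as smooth motion.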

It also has to be able to distinguish field order. If it identifies the second field as the first (it doesn't know which is the first field in a frame and guesses wrong), 40% of the images created will be blurred composites of two dissimilar fields: 3 of every 5 images will be good and 2 will be a blurry mess, resulting in a perceived loss of resolution on motion even worse than on normal 1080i30.


----------



## slice1900 (Feb 14, 2013)

TomCat said:


> Now, your 120Hz-capable HDTV effectively doubles those 60 images to 120 images. The question is, what does it do with them? If it simply spits them out serially, the judder that is created remains; instead of 2 images for even fields/frames and 3 images for odd fields/frames, that becomes 4 vs 6, and the judder remains. For it to remove the judder, it has to be smart enough to pull the video back up (2:3) to the 24 original frames and then create 5 images for each of those frames, and then spit those out in serial order. Or it has to create 4 interpolated images between each of those 24 images, but at a minimum it has to pull the video back up first. The next question is, does every TV that boasts an 120Hz refresh rate actually do this? Or not?


Every TV sold today that can display 1080i (i.e., all of them) is capable of reverse 3:2 pulldown. Otherwise, when video pulled down from 24fps and broadcast on a 1080i station was deinterlaced, some frames would be blurred together. Maybe if you go back more than 5-10 years some sets did this, but it is irrelevant when discussing 120fps-capable displays.

The reason it must be capable of this (given that it avoids that blurring) is that reverse pulldown has to happen before deinterlacing. After deinterlacing, the set pulls the video back down for display at 60fps; but if it is capable of 120fps it doesn't need to, and can instead display the same frame five times in a row (or make it look awful by interpolating frames).
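As a toy sketch of why recovering the original frames first matters: collapsing runs of identical fields gives back the 24 film frames exactly. (Real cadence detectors match interlaced fields rather than whole duplicate images, but the round-trip idea is the same.)

```python
def pulldown_3_2(frames):
    """3:2 pulldown: alternate frames shown 2 and 3 times."""
    return [f for i, f in enumerate(frames)
            for _ in range(2 if i % 2 == 0 else 3)]

def reverse_pulldown(fields):
    """Collapse runs of identical fields back into film frames --
    a toy stand-in for real cadence detection."""
    out = []
    for f in fields:
        if not out or out[-1] != f:
            out.append(f)
    return out

film = list(range(24))
print(reverse_pulldown(pulldown_3_2(film)) == film)  # True
```

Once the 24 originals are recovered, the set is free to repeat each one five times for 120Hz, with no cadence error left to display.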


----------



## Delroy E Walleye (Jun 9, 2012)

Well then, TC, maybe "judder" isn't exactly the right term for me, but I always thought it was. For me it means I can perceive 60Hz (live video) as smooth-as-real-life motion, while 30Hz I perceive as "judder." I don't necessarily believe that pulldown is always the source of my complaints (with the exception of the 1080p24 "screening room" apparently doing it incorrectly).

Perceived, or real, 24Hz (film) for me is "natural," like watching a movie in a theater, while the overused-these-days 30Hz (by which I mean the number of times per second the picture changes, more and more common in television) I find *annoying*. I guess I can understand why it's used: being exactly half the 60Hz TV signal, no pulldown is needed. I'm guessing it's also easier to stream over the net and play on devices other than TV sets. To me, 24fps "filmed" content has looked just fine in broadcast for decades.

If one wants to stay overly technical, even the theater, with its old-fashioned 24fps projector, is still "flickering" its shutter at a much higher rate, even though the film is only transported through the mechanism at 24fps. While this hasn't much to do with my complaints about 30Hz, one could compare the projector's shutter to display refresh rates, FWIW.

I'm just saying I find a 30Hz rate of frame change "uncomfortable," and I'm with you on interpolation in these instances. Believe me, my next set is going to have it! But I don't think I'd be using it on 24fps content, just the "juddered to 30" stuff. As I've said, 24fps seems more "natural" to me than 30, and there might be something to that Ebert theory.

To Slice: Thanks for your reference. It reminds me of my experience with earlier, inexpensive capture devices that exhibited horrible artifacts when trying to capture NTSC video containing 24fps content. Apparently they de-interlaced the video while it was being captured, and some frames would contain annoying double images. (I never understood why it had to de-interlace at all; I would've been quite happy to capture it as-is, like a DVD recorder does.)

I guess exact definitions of "judder" are technically different. I always thought it referred (in the video realm) to anything less than full-motion 60Hz. Motion pictures (with the possible exception of HFR) have always contained it, whether in the theater or displayed on TV. It's just that we're "used to it" at 24Hz and not 30. Well, me anyway...

I have noticed that some programs seem to alternate between what I'll just call 60Hz and 30Hz within the same broadcast (watch an older episode of _American Pickers_, for example). I always thought they were doing that for some kind of production effect. But more and more programs seem to be just sticking with the 30Hz frame-change rate. I guess it's now the viewer's responsibility either to use a set that has interpolation, or to view on a different device (like a phone or tablet) where 30Hz isn't quite as annoying.

Thanks for the responses, and allowing me to "vent" here.


----------



## Laxguy (Dec 2, 2010)

I've now seen the trailer for Exodus, and yes, it is bad in the second half especially. 

What little I knew about judder I have forgotten, but I kinda suspect the source material as the cause of the missing frames, or the over-repeating of existing frames.


----------



## Rich (Feb 22, 2007)

Laxguy said:


> I've now seen the trailer for Exodus, and yes, it is bad in the second half especially.
> 
> What little I knew about judder I have forgotten, and I kinda suspect the source material as the cause of the missing frames- or over repeating of existing frames.


My computer tells me this:

_ jud·der_
_(especially of something mechanical) shake and vibrate rapidly and with force._

That term also seems to apply to the shaky picture I see on LCDs from time to time.

Rich


----------



## slice1900 (Feb 14, 2013)

A good example of judder is watching the credits roll at the end of a film shown on TV. You don't see it as often anymore because many channels run the credits at turbo speed in a small window, and others may have fixed the judder during encoding by dropping the duplicate frames in the credits, since it is so noticeable there, and because that fix speeds up the credit roll by 25%.

Watch on something like ThisTV or Grit, where they still show the credits normally, and you can still see judder sometimes. The credits jerk a bit, almost looking like they get stuck for a split second as they scroll upward, because six times a second a frame is doubled, so the smooth motion you would have seen at 24fps is lost.
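Both numbers above (six doubled frames a second, and the 25% speedup from dropping them) fall straight out of the frame counts:

```python
film_fps = 24
display_fps = 30                # 24fps content pulled down to 30fps

dupes_per_second = display_fps - film_fps
print(dupes_per_second)         # 6 -- a frame is doubled six times a second

# Drop the dupes but keep playing at 30fps, and the roll runs fast:
speedup = display_fps / film_fps
print(speedup)                  # 1.25 -- credits run 25% faster
```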

Judder is definitely NOT what Delroy seems to think it is.


----------



## Delroy E Walleye (Jun 9, 2012)

I guess it just makes my head _feel_ like Rich's definition when I see 30Hz frame rate on TV.

Try watching some older BBC productions with horizontally rolling credits. Those could make one downright nauseous trying to read them. It probably has something to do with 50-to-60Hz standards conversion. It doesn't look good even when it's done well.


----------

