# 60Hz vs 240Hz



## jlsohio (Dec 24, 2007)

I have a Samsung series 8000 TV. I have an HR21/700 box. The TV reports the incoming signal as 60Hz.

Question: is the 60Hz a function of how the broadcasters send the signal, or a function of the box that I have? Do any broadcasters broadcast at 120 or 240 Hz?


----------



## Davenlr (Sep 16, 2006)

Satellite video comes in two flavors, 24 FRAMES PER SECOND, and 60 FRAMES PER SECOND, seen usually as 1080/24 or 1080/60.

Your TV refreshes the screen either 24 times per second, 60 times per second, 120 times per second, or 240 times per second. It can also process the signal, for example taking those 24 frames per second, doing math on them, and repeating or interpolating them to fill its native rate; to show those 24 frames evenly, with each one held for the same length of time, the TV needs a refresh rate divisible by 24, such as 120 Hz.

Basically, the signal you cannot change. The TV you can. 60 Hz is OK for normal TV. 120 Hz will make sports look better, and allow the TV to display 24 frame/sec VOD downloads without judder. 240 Hz just does what 120 Hz does twice as fast.

Select the rate on your TV that looks best to you while still delivering a picture. Since 24 won't divide into 60 evenly, you cannot usually display 24 frame-per-second video cleanly on a 60 Hz TV.
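Davenlr's divisibility point can be sketched in a few lines of Python (a toy illustration, not anything a TV actually runs):

```python
# Which panel refresh rates can show 24 fps film with every frame held
# for the same number of refreshes? Only rates evenly divisible by 24.
FILM_FPS = 24

for refresh_hz in (60, 72, 96, 120, 240):
    if refresh_hz % FILM_FPS == 0:
        print(f"{refresh_hz} Hz: hold each film frame {refresh_hz // FILM_FPS}x (no judder)")
    else:
        print(f"{refresh_hz} Hz: not divisible by {FILM_FPS}, needs uneven pulldown")
```

60 Hz is the only rate in that list that fails the test, which is exactly why 24 fps content on a 60 Hz set needs an uneven 3:2 cadence.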


----------



## dpeters11 (May 30, 2007)

A lot of the reviews I've seen suggest there are diminishing returns as well: a big difference going from 60 to 120, a fairly small difference from 120 to 240, though the gap can be larger when it's a 3D TV.


----------



## BattleZone (Nov 13, 2007)

Higher refresh rates for LCD TVs allow for several things, not all of which are necessarily implemented by the manufacturer.

- More refreshes of the screen reduce the motion "blur" or "smear" artifacts common to LCD panels, which are due to the pixels taking longer to change. This was the primary reason for higher refresh rates, but as pointed out, there are diminishing returns as refresh rates are increased.

- "Motion Enhancement", called AMP by Samsung and MotionFlow by Sony, is where a processor in the TV "creates" intermediate frames in between the actual frames delivered by the source device to smooth out motion. This is similar to how animation is drawn: a lead artist will draw a character at the beginning of a motion (Frame 1), and another frame at the end of that motion (Frame 10). Those two frames will then be sent to lower-paid artists called "Tweeners" who draw the frames in between.

Motion Enhancement gives you the "looking through a window" effect, which most people agree is a positive thing when watching something like a live sports broadcast, but a negative when watching movies. As such, you'll need to get used to turning it on and off depending on what you are watching.

- True 24-frame support. Nearly all film is shot at 24 frames per second, and both directors and many movie viewers feel strongly that 24 fps gives film a very distinct feel that is destroyed when converted or changed. With SDTVs, which were fixed at 60 fields per second (in the US; 50 for much of the rest of the world), all film had to be converted using a 3:2 pulldown cadence, causing "judder" as alternating frames were displayed for a longer duration than their neighbors.

With modern HDTVs capable of refresh rates other than 60 Hz, it is possible to offer true 24 fps capabilities by using a refresh rate that is evenly divisible by 24, such as 120 Hz (divisible by both 60 and 24) or 240 Hz. Plasmas, which generally can't refresh at such high rates, but also don't need to due to a difference in display technology, may offer refresh rates of 72 or 96 Hz in addition to the standard 60 Hz, to accomplish the same goal of being able to display 24 fps content at an even 24 fps.

This is done by simply repeating each frame refresh-rate ÷ 24 times. For example, a 120 Hz panel will repeat each frame of a 24 fps movie 5 times: 5 x 24 = 120 frames per second. Your eye can't tell the frames are being repeated, so it just seems that each frame is on the screen for 1/24th of a second. The result is judder-free 24 fps content on your home TV. For a plasma running at 72 Hz, each frame would be repeated 3 times: 3 x 24 = 72 frames per second.
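That repeat arithmetic can be sketched in Python (hypothetical helper names; it assumes a panel rate that divides evenly by 24):

```python
FILM_FPS = 24

def displayed_frames(refresh_hz, film_frames):
    """Hold every film frame for refresh_hz // 24 refreshes; valid only
    when the panel rate is an even multiple of the film rate."""
    assert refresh_hz % FILM_FPS == 0, "rate must be an even multiple of 24"
    repeats = refresh_hz // FILM_FPS
    return [f for f in film_frames for _ in range(repeats)]

one_second = list(range(FILM_FPS))             # 24 film frames
print(len(displayed_frames(120, one_second)))  # 120: each frame shown 5x
print(len(displayed_frames(72, one_second)))   # 72: plasma case, each 3x
```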

Note that this display mode is also called different things by different manufacturers, often some variation of "Cinema Mode". More importantly, note that virtually all TVs that support this mode require their Motion Enhancement to be turned OFF. And some lower-end 120 Hz and even 240 Hz TVs do NOT offer correct 24 fps support, even though they probably could have, so a higher-than-60 Hz refresh rate isn't itself a guarantee that 24 fps content is properly supported.


----------



## bpratt (Nov 24, 2005)

Actually, DirecTV broadcasts 3 different signals: 1080i, 1080p and 3D. 1080i is broadcast as 60 half-frames (fields) per second; they broadcast all the odd lines, then all the even lines, so the actual full frame rate is 30 frames per second. 1080p is 24 full frames per second. I'm not sure about 3D, but I believe it is currently 2 pictures of 1080i broadcast alternately.

If you own a 60 Hz TV, it will show 1080i with no change to the way it is broadcast. Some TVs will accumulate the odd and even fields and show the accumulated picture 2 times. 1080p, or 24 frames per second, is displayed by showing the first frame 2 times, the second frame 3 times, the third frame 2 times, the fourth frame 3 times, and so on. Doing this creates a thing called judder, which some people can see, but most cannot.

The advantage of a 120 Hz TV is that it will display 1080p by showing each frame 5 times, thus eliminating judder. There really is no advantage for 1080i, because these TVs just show the accumulated 1080i frames 4 times.

Many of the 240 Hz TVs have a processor which can do motion smoothing. On these sets a 1080i accumulated frame is shown 8 times, and a 1080p frame is shown 10 times. Let's say an object is moving across the screen (like a football) at 1 inch per frame. Normally, the first frame received would show the football, and the second frame received would show the football 1 inch to the right. Since each frame is going to be displayed 8 or 10 times, the processor can do motion smoothing: it displays the first frame, then moves the football 1/8 or 1/10 inch to the right on each subsequent refresh until the next frame arrives. The end result is that 240 Hz TVs give you a much smoother picture.
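The football example amounts to linear interpolation between source frames. A toy sketch in Python (a stand-in for what a motion-smoothing processor does; real implementations estimate motion vectors per block of pixels, not per object):

```python
def interpolated_positions(pos_a, pos_b, refreshes):
    """One position per panel refresh, sliding linearly from the object's
    position in one source frame toward its position in the next."""
    return [pos_a + (pos_b - pos_a) * i / refreshes for i in range(refreshes)]

# Football at x = 0.0 in one source frame, x = 1.0 inch in the next.
# A 240 Hz set holding a 24 fps frame for 10 refreshes can instead nudge
# the ball 1/10 inch per refresh:
print(interpolated_positions(0.0, 1.0, 10))
```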

3D requires either 120 Hz or 240 Hz. New sets are being developed which run at 480 Hz. These sets allow 3D and motion smoothing to be shown.


----------



## P Smith (Jul 25, 2002)

_"Actually, DirecTV broadcasts 3 different signals. They broadcast 1080I, 1080P and 3D."_

Actually you mean two: MPEG-2 and MPEG-4.

Within those two, it's a matter of controlled conversion by the STB to the TV output: remember those 480/720/1080 LEDs?
It is not a *broadcast* per se.


----------



## bpratt (Nov 24, 2005)

P Smith said:


> _"Actually, DirecTV broadcasts 3 different signals. They broadcast 1080I, 1080P and 3D."_
> 
> Actually you mean two - MPEG-2 and MPEG-4 .
> 
> ...


MPEG-2 and MPEG-4 define how the transmission is compressed and have nothing to do with the TV refresh rates we are discussing in this thread.


----------



## P Smith (Jul 25, 2002)

That's why you wasted time and space bringing that sentence up.


----------



## HoTat2 (Nov 16, 2005)

As far as I can tell, DirecTV broadcasts 2D HD images in 1080P/24 Hz, 1080i/60 Hz, and 720P/60 Hz formats.

3D HD I think is 1080P/24 Hz Side-by-Side (SbS) for film material, and 1080i/60 Hz SbS for video.

What I don't know is whether, as is done with SD, DirecTV actually eliminates the redundant fields or frames to save bandwidth when broadcasting film material in regular 2D HD, really transmitting film sources at 1080i/48 Hz or 720P/48 Hz (FX HD channel, etc.), with the receiver simply adding the redundant fields or frames back in based on the repeat flags or something.


----------



## veryoldschool (Dec 10, 2006)

HoTat2 said:


> What I don't know is whether, as is done with SD, DirecTV actually eliminates the redundant fields or frames to save bandwidth when broadcasting film material in regular 2D HD, really transmitting film sources at 1080i/48 Hz or 720P/48 Hz (FX HD channel, etc.), with the receiver simply adding the redundant fields or frames back in based on the repeat flags or something.


I think this is done at the program provider and they're the ones sending out the "HD" signal, so any upconverting is done before DirecTV gets it.
Think back to HD locals that were still using an SD camera, they added the side bars.


----------



## Max Mike (Oct 18, 2008)

The NTSC broadcast television signal is 29.97 interlaced frames of video per second, or rounded up, 30 frames a second. Each frame consists of 2 fields (each with half the lines of the picture's full resolution): 2 x 30 = 60. 480p, 720p and 1080p/60 broadcast 60 full non-interlaced picture frames. So you only have a max of 60 individual pictures coming to the TV each second, no matter the refresh rate of the TV. A 120Hz or 240Hz TV either repeats frames or creates interpolated frames to fill out its increased refresh rate.

Digital signals in theory could be broadcast at higher rates.

Contrary to what they say and think, many and probably most people cannot see the difference between 60Hz and 120Hz, and virtually no one really sees the difference between 120Hz and 240Hz. As stated above, 120Hz does offer real benefits when watching 1080P/24 content.

The biggest benefit of 240Hz screens is not the 240Hz refresh rate but the fact that 240Hz TVs tend to have panels with better contrast, blacks, etc. Simply put, 240Hz panels tend to have better pictures because they are better panels picture-wise, with or without 240Hz.


----------



## veryoldschool (Dec 10, 2006)

Max Mike said:


> The NTSC broadcast television signal is 29.97 interlaced frames of video per second, or rounded up, 30 frames a second. Each frame consists of 2 fields (each with half the lines of the picture's full resolution): 2 x 30 = 60. 480p, 720p and 1080p/60 broadcast 60 full non-interlaced picture frames. So you only have a max of 60 individual pictures coming to the TV each second, no matter the refresh rate of the TV. A 120Hz or 240Hz TV either repeats frames or creates interpolated frames to fill out its increased refresh rate.
> 
> Digital signals in theory could be broadcast at higher rates.
> 
> ...


Let's not bring reality in here or the marketing departments will need to find something else to hype. :lol:


----------



## HoTat2 (Nov 16, 2005)

veryoldschool said:


> I think this is done at the program provider and they're the ones sending out the "HD" signal, so any upconverting is done before DirecTV gets it.
> Think back to HD locals that were still using an SD camera, they added the side bars.


I understand that, but what I was questioning was whether or not DirecTV actually removes the redundant fields or frames when broadcasting film-based material in HD, as they do with SD, thereby reducing the field/frame rate by ~20% to save on bandwidth.

Not the up-conversion of 4:3 material to an HD format.


----------



## veryoldschool (Dec 10, 2006)

HoTat2 said:


> I understand that, but what I was questioning was whether or not DirecTV actually removes the redundant fields or frames when broadcasting film-based material in HD, as they do with SD, thereby reducing the field/frame rate by ~20% to save on bandwidth.
> 
> Not the up-conversion of 4:3 material to an HD format.


I'd say MPEG-4 is removing the redundant parts, whereas MPEG-2 is not.


----------



## Rich (Feb 22, 2007)

BattleZone said:


> Higher refresh rates for LCD TVs allow for several things, not all of which are necessarily implemented by the manufacturer.
> 
> - More refreshes of the screen reduce the motion "blur" or "smear" artifacts common to LCD panels, which are due to the pixels taking longer to change. This was the primary reason for higher refresh rates, but as pointed out, there are diminishing returns as refresh rates are increased.
> 
> ...


Damn! That was a FINE post! Thanx for the clear, easy to understand info.

Rich


----------



## hasan (Sep 22, 2006)

(much snipped)



BattleZone said:


> - "Motion Enhancement", called AMP by Samsung and MotionFlow by Sony, is where a processor in the TV "creates" intermediate frames in between the actual frames delivered by the source device to smooth out motion. This is similar to how animation is drawn: a lead artist will draw a character at the beginning of a motion (Frame 1), and another frame at the end of that motion (Frame 10). Those two frames will then be sent to lower-paid artists called "Tweeners" who draw the frames in between.
> 
> Motion Enhancement gives you the "looking through a window" effect, which most people agree is a positive thing when watching something like a live sports broadcast, but a negative when watching movies. As such, you'll need to get used to turning it on and off depending on what you are watching.


That is an outstanding post, a *very* clear and concise explanation, much better than many other attempts I've seen to explain it. Good job, BattleZone!

I would add:

Vizio calls their motion processing: Smooth Motion Technology

As you noted my Sammy calls it AMP (Auto-Motion Plus).

Of the motion processing implementations I have looked at, I like the Samsung the best, although that's simply a matter of personal preference. I like the Vizio as well, but it is not as "striking" as the Samsung (set to defaults...there are a ton of variables that you can play with). I have not seen the Sony implementation. Lower end Sanyo 120 hz machines also have some smoothing, but no controls, and while better than not having it, I find their approach unimpressive.

What I have heard from the consumers with respect to motion smoothing is that they either love it, or hate it. I had one person tell me that it gave them vertigo. My wife referred to it as "super-realistic" on our Sammy.

I very much like AMP myself. I also agree that the jump from 60 to 120 is readily observable if you have the right program material and know what to look for, and anything above 120 Hz is confounded by so many other variables that I can't tell what is what (like another poster noted, differences in the panel itself produce all sorts of "perceptions").

Thanks for taking the time to make such an informative post.


----------



## BattleZone (Nov 13, 2007)

Can you tell I get asked those questions all the time?


----------



## P Smith (Jul 25, 2002)

Time to reveal the truth about the 480 and 960 (!) Hz refresh rates.


----------



## Rich (Feb 22, 2007)

BattleZone said:


> Can you tell I get asked those questions all the time?


I could tell that you know what you're talking about and are able to dumb it down enough for me to understand. That alone takes a certain talent.

Rich


----------



## BattleZone (Nov 13, 2007)

I'm originally from an IT background, running Tech Support or Help Desk call centers, so a key part of those jobs is taking complex technical information and finding a way to make non-techie users understand it. Of course, that is probably why I tend to over-use analogies, but, hey, I do what works.


----------



## Rich (Feb 22, 2007)

BattleZone said:


> I'm originally from an IT background, running Tech Support or Help Desk call centers, so a key part of those jobs is taking complex technical information and finding a way to make non-techie users understand it. Of course, that is probably why I tend to over-use analogies, but, hey, I do what works.


It's still a thing most people don't understand how to do. I struggled with it for a while. Then I read someplace that if you can write at a level that fifth graders can understand, it makes it a lot easier to get your points across. And, let's face it, the whole point of posting is to get your point across. Hard to help people that don't have the slightest idea what you're talking about.

Rich


----------



## BattleZone (Nov 13, 2007)

The hard part is finding the right balance, because the audience here has a huge range of knowledge levels about technical satellite and A/V topics. Some know more than I do about many of the subjects, and some know nearly nothing, with most folks somewhere in between. You don't want to talk over anyone, but you also don't want to over-simplify too much or seem condescending in any way (which I'm not trying to be at all). Hopefully I'm able to find the right middle ground most of the time.


----------



## TomCat (Aug 31, 2002)

HoTat2 said:


> As far as I can tell DirecTV broadcast 2D HD images in 1080P/24 Hz, 1080i/60 Hz, and 720P/60 Hz formats.
> 
> 3D HD I think is 1080P/24 Hz Side-by-Side (SbS) for film material, and 1080i/60 Hz SbS for video.
> 
> What I don't know is whether, as is done with SD, DirecTV actually eliminates the redundant fields or frames to save bandwidth when broadcasting film material in regular 2D HD, really transmitting film sources at 1080i/48 Hz or 720P/48 Hz (FX HD channel, etc.), with the receiver simply adding the redundant fields or frames back in based on the repeat flags or something.


First, there is no 1080i/60 (the second number refers to frames per second) but that's OK, at least everyone else got something wrong in their posts, too.

1080i for US TV is 1080i30, which means 1080 lines of 1920 pixels each, divided into two fields of 540 lines each, at 30 frames per second and 2 fields per frame (so a field rate of 60 fields per second), displayed field-sequentially ("interlaced"). To provide continuity with legacy TV, the frame rate is actually 29.97. That came about in 1953, when they needed the frame rate of NTSC to be an exact submultiple of the chroma subcarrier rate of 3.58 MHz, and it is still used today for legacy reasons. A technique called "drop frame", which skips a couple of frame numbers every minute or so (strictly, it skips timecode labels rather than discarding pictures), is used so that video longer than 2 minutes does not play out with timecode slightly longer than real time, due to the discrepancy between the frame rate and the expected rate of 30 fps.
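For the curious, the drop-frame arithmetic can be sketched in Python. This is the standard SMPTE counting scheme as I understand it; note again that it skips timecode *numbers* (the ;00 and ;01 labels at most minute boundaries), never actual picture frames:

```python
def drop_frame_timecode(frame_count):
    """Convert a running frame count at 29.97 fps to drop-frame timecode.
    Two timecode numbers are skipped at every minute boundary except
    minutes divisible by 10 (18 numbers dropped per 10 minutes)."""
    frames_per_min = 30 * 60 - 2      # 1798 labels in a "dropped" minute
    frames_per_10min = 30 * 600 - 18  # 17982 labels per 10 minutes
    tens, rem = divmod(frame_count, frames_per_10min)
    if rem > 2:
        frame_count += 18 * tens + 2 * ((rem - 2) // frames_per_min)
    else:
        frame_count += 18 * tens
    f = frame_count % 30
    s = (frame_count // 30) % 60
    m = (frame_count // 1800) % 60
    h = frame_count // 108000
    return f"{h:02d}:{m:02d}:{s:02d};{f:02d}"

print(drop_frame_timecode(0))      # 00:00:00;00
print(drop_frame_timecode(1800))   # 00:01:00;02  (;00 and ;01 skipped)
print(drop_frame_timecode(17982))  # 00:10:00;00  (10th minute not dropped)
```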

For those same legacy reasons, all US 720p60 video runs at 59.94 fps. It is the fact that 720p has twice as many frames compared to 1080i30 that causes it to be the best format for motion (other than 1080p60, which no one yet uses). This is the same theory behind higher frame rates such as 120 and 240 both with frame interpolation being an "improvement" even over 720p or 1080p60 (although the stronger benefit is that 120 and 240 can be divided by 24 with the result being an integer, meaning they can also remove pulldown judder, something you can't do with the typical 60Hz scan rate).

I use quotes around "improvement", because many are of the opinion that changing the 24 fps rate of movies is anything but an improvement. "Too many" frames gives video what is known as the "soap opera" look, which many do not care much for. Roger Ebert claims the 24 fps rate is closer to mimicking the rate of brain waves that allow the suspension of disbelief and therefore greater immersion of the viewer psychologically within the drama. I defer to Roger. But on the other hand a SuperBowl at 24 fps might not be as good as FOX's 720p60, if motion artifacts get in the way, and for most, they do. That is a great debate.

Also, LCD displays do not "blur", technically speaking. Video frames refresh about once every 17 ms, and the response time of an LCD pixel is about 4 ms, so LCDs have no trouble refreshing at least as fast as video itself refreshes. But they do appear to blur compared to plasma displays, because the duty cycle of LCD pixels is "always on", while plasma pixels are "on-off", meaning each pixel is blanked for part of the duty cycle. Plasma advocates feel that this allows a better match to human persistence of vision (or, IOW, allows their retinas to refresh quicker, as persistence of vision is where the blur effect actually comes from). I'm not sure I completely buy that, because the real world is "always on" just like LCDs are, and we have no blur issue with the real world. Another great debate.

As to your question, MPEG uses a technique called "film mode". When there are redundant frames in 30fps or 60 fps content due to pulldown (film mode is inactive for 30 or 60 fps content that does not have pulldown), MPEG algorithms such as MPEG-2 or MPEG-4 detect that, remove those redundant frames before encoding, and flag it so that the decoder (in your STB) knows this is the case. Only 24 (or 48) frames of content are encoded, sent, then decoded. The metadata flag tells the decoder to duplicate frames at the STB in order to restore the pulldown that was in the original. It then appears to the viewer exactly as it would had the redundant frames been sent as well. It does indeed save bandwidth (or at least incurs less artifacts for the same bandwidth) by not sending those redundant frames, and that is exactly why it is done.

It is not a coding option, it is automatic. If there is pulldown, film mode invokes automatically. If there is no pulldown, it is disabled.
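The film-mode round trip can be caricatured in Python: telecine adds the redundant copies, the encoder strips them and flags the cadence, and the decoder rebuilds it. (A toy whole-frame sketch; real MPEG film mode works on repeat-field flags in the bitstream, not frame-by-frame comparison.)

```python
def telecine(film_frames):
    """3:2 pulldown: alternate film frames are held for 2 and 3 display
    periods, turning 24 fps into 60 Hz output."""
    out = []
    for i, f in enumerate(film_frames):
        out.extend([f] * (2 if i % 2 == 0 else 3))
    return out

def inverse_telecine(displayed):
    """Collapse runs of identical frames back to the 24 fps original,
    which is all the encoder actually needs to transmit."""
    out = []
    for f in displayed:
        if not out or out[-1] != f:
            out.append(f)
    return out

one_second = list(range(24))
sent = inverse_telecine(telecine(one_second))
print(len(telecine(one_second)), len(sent))  # 60 24
```

The 36 redundant copies per second never hit the encoder, which is the bandwidth saving HoTat2 asked about.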


----------



## NR4P (Jan 16, 2007)

TomCat said:


> I use quotes around "improvement", because many are of the opinion that changing the 24 fps rate of movies is anything but an improvement. "Too many" frames gives video what is known as the "soap opera" look, which many do not care much for. Roger Ebert claims the 24 fps rate is closer to mimicking the rate of brain waves that allow the suspension of disbelief and therefore greater immersion of the viewer psychologically within the drama. I defer to Roger. But on the other hand a SuperBowl at 24 fps might not be as good as FOX's 720p60, if motion artifacts get in the way, and for most, they do. That is a great debate.


Now that I have a TV with all these modes, I see what I've been reading about for the past few years. Watch a movie on my 240Hz TV and the soap opera effect is terrible. Put it in Theater mode, and it looks the way film is expected to look. I think they call it CineMotion.

With NFL Sundays, having Motionflow on really smoothed out the pixelation in those very fast-moving scenes on CBS that I used to see on my older 60Hz 1080p TV (I know, 1080i OTA). 720p was definitely better.

These features are worthwhile if you're willing to spend a lot of time setting them up and using them at the right time.


----------



## DodgerKing (Apr 28, 2008)

Will a 60Hz TV actually do 1080p/24? Will the 1080p setting on the Direct boxes work on a 60Hz TV?


----------



## HoTat2 (Nov 16, 2005)

TomCat said:


> First, there is no 1080i/60 (the second number refers to frames per second) but that's OK, at least everyone else got something wrong in their posts, too.
> 
> 1080i for US TV is 1080i30, which means 1080 lines of 1920 pixels each, divided into two fields of 540 lines each, at 30 frames per second and 2 fields per frame (so a field rate of 60 fields per second), displayed field-sequentially ("interlaced").


Well ... confusingly, I've actually seen them written both ways. Even my own TV set's "info" button displays a 1080i input signal as "1920 x 1080i @60Hz."

The way I understood this, at least in the U.S., was that the "i" or "P" immediately following the vertical resolution number sets the context for the number that follows. So an "i" means it refers to the field rate, and the "P" means it's the frame rate. Though after review I do see that whether the second number intends to refer to the field or frame rate, it is more customary to leave out the slash symbol such as "1080i60," "1080P60," "1080i30," "1080P30," or "720P60."



> To provide continuity with legacy TV, the frame rate is actually 29.97. That came about in 1953 when they needed the frame rate of NTSC to be an exact submultiple of the chroma subcarrier rate of 3.58 MHz, and is still used today for legacy reasons. ...


Yes, that's the old "Color Lock" specification of the NTSC standard, where all video timing sync references (line, field, and frame) are derived from the 3.58 MHz subcarrier to ensure backward compatibility and noninterference with monochrome TV sets after the addition of the chroma signal.



> A technique called "drop frame", which skips a couple of frame numbers every minute or so (strictly, it skips timecode labels rather than discarding pictures), is used so that video longer than 2 minutes does not play out with timecode slightly longer than real time, due to the discrepancy between the frame rate and the expected rate of 30 fps. ...


"Play out," you mean as in today's file based video servers?

But yes I've been familiar with the "drop frame" principle for decades starting with the old SMPTE time code readers which used it.



> For those same legacy reasons, all US 720p60 video runs at 59.94 fps. It is the fact that 720p has twice as many frames compared to 1080i30 that causes it to be the best format for motion (other than 1080p60, which no one yet uses). This is the same theory behind higher frame rates such as 120 and 240 both with frame interpolation being an "improvement" even over 720p or 1080p60 *(although the stronger benefit is that 120 and 240 can be divided by 24 with the result being an integer, meaning they can also remove pulldown judder, something you can't do with the typical 60Hz scan rate). ... *


Yes true, though on the issue of judder I must admit that I have a very difficult time seeing it even when intentionally looking for it. Other than maybe a slight jerkiness in the motion of slow camera pans across a scene, the most visible appearance of judder I notice is only during the credit scrolls at the end of movies or other film based material.

And who really cares about that?



> As to your question, MPEG uses a technique called "film mode". When there are redundant frames in 30fps or 60 fps content due to pulldown (film mode is inactive for 30 or 60 fps content that does not have pulldown), MPEG algorithms such as MPEG-2 or MPEG-4 detect that, remove those redundant frames before encoding, and flag it so that the decoder (in your STB) knows this is the case. Only 24 (or 48) frames of content are encoded, sent, then decoded. The metadata flag tells the decoder to duplicate frames at the STB in order to restore the pulldown that was in the original. It then appears to the viewer exactly as it would had the redundant frames been sent as well. It does indeed save bandwidth (or at least incurs less artifacts for the same bandwidth) by not sending those redundant frames, and that is exactly why it is done.
> 
> It is not a coding option, it is automatic. If there is pulldown, film mode invokes automatically. If there is no pulldown, it is disabled. ...


OK thanks;

So I take it that, whether it's SD or HD, the TV broadcaster's equipment's compliance with the MPEG standards requires it to automatically switch to "film mode" and encode only 24 fps for any film-based programming?


----------



## HoTat2 (Nov 16, 2005)

DodgerKing said:


> Will a 60Hz TV actually do 1080p/24? Will the 1080p setting on the Direct boxes work on a 60Hz TV?


Yes, all recently manufactured 60 Hz HDTV sets are, or certainly should be, capable of accepting 1080P/24 Hz nowadays.

However, they must of course add a 3:2 pulldown frame sequence to the signal to pad it up to 60 fps for display, and thus reintroduce judder.

So other than the good feeling it may give one that they can actually receive it, displaying a 1080P/24 Hz signal on a 60 Hz set is really self-defeating because of the 3:2 pulldown addition requirement.


----------



## Rich (Feb 22, 2007)

BattleZone said:


> The hard part is finding the right balance, because the audience here has a huge range of knowledge levels about technical satellite and A/V topics. Some know more than I do about many of the subjects, and some know nearly nothing, with most folks somewhere in between. You don't want to talk over anyone, but you also don't want to over-simplify too much or seem condescending in any way (which I'm not trying to be at all). Hopefully I'm able to find the right middle ground most of the time.


I think you've found out how to get your points across. The way you use pictures helps a lot too.

Rich


----------



## Rich (Feb 22, 2007)

HoTat2 said:


> Yes, all recently manufactured 60 Hz HDTV sets are, or certainly should be, capable of accepting 1080P/24 Hz nowadays.
> 
> However, they must of course add a 3:2 pulldown frame sequence to the signal to pad it up to 60 fps for display, and thus reintroduce judder.
> 
> So other than the good feeling it may give one that they can actually receive it, displaying a 1080P/24 Hz signal on a 60 Hz set is really self-defeating because of the 3:2 pulldown addition requirement.


By the way, when you upscale a standard DVD, you don't get 1080/24p, you get 1080/60p. There is such a thing.

Rich


----------



## HoTat2 (Nov 16, 2005)

rich584 said:


> By the way, when you upscale a standard DVD, you don't get 1080/24p, you get 1080/60p. There is such a thing.
> 
> Rich


Well yeah sure ...

And not just with SD DVDs: all 1080P 60 Hz flat panels will up-convert any input to their native display resolution and frame rate of 1080P60, just as 120 or 240 Hz sets up-convert to their native 1080P120 and 1080P240 respectively.

It's just that no one currently broadcasts in a 1080P60 format. It's used in some broadcast studios and other production facilities for in-house distribution over HD-SDI cables (for instance, the so-called "3G" interface, named for the ~3 Gbps data rate the uncompressed format requires). But no one actually broadcasts this standard yet, even with MPEG compression.


----------



## TomCat (Aug 31, 2002)

DodgerKing said:


> Will a 60Hz TV actually do 1080p/24? Will the 1080p setting on the Direct boxes work on a 60Hz TV?


They are usually capable of accepting 1080p24, but the only way they can display it is by repeating frames in a 3:2 cadence, because 24 does not divide into 60 evenly. It does divide into 120 evenly, however, and most TVs that boast 1080p24 capability have a 120 Hz mode or some other mode such as 48, 96, or 240 Hz, all of which can display it evenly, without inducing pulldown judder artifacts.

The 1080p24 setting will work with many 60Hz TVs, but will have the same pulldown artifacts that the 1080i version (of original 24 fps content) would have. IOW, you will not enjoy what is the only benefit of 1080p, although it is a very small one, for the most part.

Motion judder is really only visible on slow pans and zooms; fast motion (or no motion) makes it virtually undetectable. We have been living with 3:2 pulldown on NTSC for our entire lives, and no one ever really complained about it hurting their enjoyment of the movies they were watching.

I forgot to mention that there is also another small benefit of 1080p24, which is that motion does not destroy H rez (as 1080i does) due to interlacing (720p also has this benefit, along with twice as many frames). On the way to your set an interlace step may be added, either in the TV or in the DVR's 1080i output mode, but deinterlacing in the set will hopefully reverse any artifacts that would create (as long as the content was acquired in a progressive mode, which includes most if not all movie telecine processes). Some Vizio sets don't deinterlace consistently, so this may cause some brief line-dicing on them.


----------



## TomCat (Aug 31, 2002)

HoTat2 said:


> Well ... confusingly I've actually seen them illustrated in both ways. Even my own TV set's "info." button displays a 1080i input signal as "1920 x 1080i @60Hz."


It is confusing. The refresh rate of your TV is 60 Hz (59.94, actually). The field rate of 1080i30 is 60 Hz, but the frame rate of 1080i30 is 30 Hz. 1080i displays field 1 for 1/60th of a second followed by field 2 for 1/60th of a second, which comprises a single frame and takes a total of 1/30th of a second, making the frame rate 30 fps.



> The way I understood this, at least in the U.S., was that the "i" or "P" immediately following the vertical resolution number sets the context for the number that follows. So an "i" means it refers to the field rate, and the "P" means it's the frame rate. Though after review I do see that whether the second number intends to refer to the field or frame rate, it is more customary to leave out the slash symbol such as "1080i60," "1080P60," "1080i30," "1080P30," or "720P60."


Speaking as a broadcasting professional, the second number always refers to the frame rate. The Standard Handbook of Video and Television Engineering by Jerry Whitaker and Blair Benson, which is the bible of the industry (as well as numerous other publications, including the ATSC A/53 standard adopted by the FCC), lists the 18 allowable permutations of ATSC broadcasting, and there is no 1080p30 nor 1080i60 among them. If you are seeing something different from this, you can probably assume it is not correct. There are some non-square-pixel formats used outside of ATSC broadcasting, however, such as the 1280x1080 format that DTV used before their MPEG-4 era. ATSC's HD formats use only square pixels, in part because non-square pixels can be incompatible with some computer formats.



> "Play out," you mean as in today's file based video servers?


It would apply to that but also to analog or digital tape, as long as the format was NTSC.



> on the issue of judder I must admit that I have a very difficult time seeing it even when intentionally looking for it. Other than maybe a slight jerkiness in the motion of slow camera pans across a scene, the most visible appearance of judder I notice is only during the credit scrolls at the end of movies or other film based material.
> 
> And who really cares about that?


A lot of folks think that they do, especially since it became a marketing issue. More correctly, they have been told by the marketers that they care about it, and some are gullible enough to assume it is important. It honestly is not all that important, at least IMHO.



> So I take it that whether its SD or HD, the TV broadcaster's equipment compliance with the MPEG standards requires it to automatically switch to "film mode" and encode only 24 fps for any film based programming?


Yes sir. It is my understanding that this is universal to both MPEG-2 and MPEG-4 codecs, and may be one of the few issues the committees did not argue all that much about.

But broadcasters did not use MPEG for analog NTSC, so for them it only applies to HD or SD upconverted to HD, which since the analog shutoff in 2009 means everything, including the SD versions of their channels on cable and DBS (although the cable SD channels are usually converted back to analog NTSC). For DBS, they have been MPEG from day one (DTV originally used an enhanced version of MPEG-1 in 1994, followed quickly by MPEG-2) so it applies to all of their content that was originally 24 fps.


----------



## TomCat (Aug 31, 2002)

rich584 said:


> By the way, when you upscale a standard DVD, you don't get 1080/24p, you get 1080/60p. There is such a thing.
> 
> Rich


I can't argue with that. There is such a thing as 1080p60 when referring to a display format, and that is the same exact display format that most "1080p" sets have used since 2005. But that is not the same as 1080p60 the content format, which implies many benefits beyond 1080i30 or 720p60 that the simple display format 1080p60 does not enjoy. They are two very different things that just happen to have the same name.

(Of course if you upscale 1080p24 to 1080p60, you can only do it by adding duplicate frames in a pattern that creates pulldown judder on playback, so since your TV can do the very same thing there is really not much point in doing that.)

My name is Tom and Tom Cruise's name is also Tom. Like 1080p60, the name can stand for two different things. In our case, one of us is a very rich and famous movie star, and one of us is a not very rich or famous broadcasting Engineer.


----------

