# 1080P



## 325xia (Oct 28, 2006)

I have a 1080p Samsung HDTV. Yet when I try to test 1080p from my DirecTV HD DVR, it says "This TV does not support the DirecTV 1080P Signal". Is there something I need to do in order to get the two to communicate correctly? I have a Samsung LN-S4695D TV and an HR20/700 DirecTV receiver.

Thanks


----------



## sarfdawg (Jan 21, 2007)

Your TV may not support 1080p/24; it may only accept 1080p/60. I'm in that boat as well. If you try to view 1080p content via DirecTV, chances are you will only see about 1/4 of the picture you should. If you turn 1080p off in your DirecTV receiver's settings, you will notice that the 1080i picture of what should be 1080p programming looks exceptionally sharp. Not 1080p, but it may be the best you can do.


----------



## 325xia (Oct 28, 2006)

Okay, thanks. I sure wonder why I was told I was getting a TV that would support 1080p in the future. What a joke. Why did I spend the extra cash for something I can't use through DirecTV? I guess Blu-ray is all my 1080p is good for. I also wonder why DirecTV chose to go with this method of 1080p. I'm sure a lot of people are experiencing the same thing you and I are.


----------



## LarryFlowers (Sep 22, 2006)

As the production year for this set is 2005, it is unlikely to support 1080p/24.

1080p/24 most closely matches the frame rate that movies are shot at. Most sets built now fully support that frame rate, and the majority have since around 2007.



325xia said:


> Okay, thanks. I sure wonder why I was told I was getting a TV that would support 1080p in the future. What a joke. Why did I spend the extra cash for something I can't use through DirecTV? I guess Blu-ray is all my 1080p is good for. I also wonder why DirecTV chose to go with this method of 1080p. I'm sure a lot of people are experiencing the same thing you and I are.


----------



## 325xia (Oct 28, 2006)

I understand what you are saying. I bought my HDTV back in the fall of '06. I'm not about to buy another one just to utilize 1080p/24.


----------



## veryoldschool (Dec 10, 2006)

325xia said:


> I guess Blu-ray is all my 1080p is good for... I'm sure a lot of people are experiencing the same thing you and I are.


Sony screwed me also by not disclosing "which" 1080p.
Blu-ray is also 1080p/24, but the player upconverts this to 1080p/60.
The chips in the STB can't convert/output 1080p/60.


----------



## 325xia (Oct 28, 2006)

^^^Good to know. Thanks.


----------



## veryoldschool (Dec 10, 2006)

325xia said:


> ^^^Good to know. Thanks.


 1080p programs are still a "plus" for us though. The bit-rate can be better used for PQ [ranges from 2-16 Mb/s] and the STB converting this to 1080i has given [me] some great looking images.


----------



## Kojo62 (Aug 9, 2007)

Sadly, I have the same issue. My Sharp LCD does not support 1080p/24, only 1080p/60. So I know how you feel, 325xia.

I didn't even know to _look_ for 24fps support when I bought my TV back in 2007. I thought I had really done my HDTV homework, too.

So it was quite a shock when my set failed D*'s "Incredible Hulk" 1080p test video last year. It had always been able to handle the 1080p signals from my Xbox 360 and upconverting DVD player, so I never knew it was lacking in anything.

Hopefully, downconverted 1080i programs won't look too bad on it, although I can't verify that since I haven't bought any of the 1080p content from D* yet.


----------



## barryb (Aug 27, 2007)

Here you will see the specs on your TV, along with refresh rates (on the left):

http://www.hdtvexpert.com/pages_b/Samsung_LN-S4695D.html

As others have pointed out.... 1080p/60 is the culprit.

I have a Sony similar to what veryoldschool has, so I am in the same boat with you guys.


----------



## thekochs (Oct 7, 2006)

veryoldschool said:


> The chips in the STB can't convert/output 1080p/60.


I was wondering how DirecTV was able to add 1080p support in a firmware upgrade...guess this now makes sense....1080p/24, not /60. Too bad, but hey, I'm still running 720p on WXGA (1366x768) TVs.


----------



## BattleZone (Nov 13, 2007)

In 2005, exactly ZERO TVs from any manufacturer supported 1080/24p, and in 2006, exactly ONE TV did (a single high-end Pioneer Pro model). It wasn't until 2007 that 1080/24p support was added by most (but not all) manufacturers. It isn't brand-specific.

Most HD content (both video and film transfers) is shot at 24 frames per second, which is the speed that pro film cameras have always used. Blu-Rays are virtually all mastered at 1080/24p, as is all "1080p" VoD content. It's the best format for most content. The issue is that no manufacturer wanted to pay to include 1080/24p support until there were source devices that could output in that format, and that didn't happen until Toshiba released their first HD-DVD player in 2006.


----------



## sarfdawg (Jan 21, 2007)

I hate to be a DirecTV party pooper (I LOVE my DirecTV). That said, I don't even think about purchasing a movie on DirecTV now. I have the lowest priced Netflix plan with Blu-Ray, and it comes to about $11/mo. At $6/movie on DirecTV, it is pointless. If you don't like only one movie at a time, 2 and 3 at a time are not much more expensive. The picture on Blu-Ray is 1080p no matter what, and I'm getting way more bang for the buck. 

Again, I LOVE DirecTV, but I don't get terribly worked up about not having a 1080p/24 set.


----------



## dcowboy7 (May 23, 2008)

Technology just can't be kept up with.

HDTVs had a 60 Hz refresh rate...then 120 Hz...now even that's old....the new models out now have 240 Hz.

It's impossible to keep up with the latest/greatest stuff.


----------



## veryoldschool (Dec 10, 2006)

dcowboy7 said:


> Technology just can't be kept up with.
> 
> HDTVs had a 60 Hz refresh rate...then 120 Hz...now even that's old....the new models out now have 240 Hz.
> 
> It's impossible to keep up with the latest/greatest stuff.


 120 is used because it works for both 1080/60 [2x] & 1080/24 [5x] and 240 is just "must be better" than 120.
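The arithmetic behind that: 120 Hz is the least common multiple of 24 and 60, so a 120 Hz panel can show every frame of either format a whole number of times. A quick Python check (illustrative only, not anything a TV runs):

```python
from math import lcm  # Python 3.9+

# 120 Hz is the smallest refresh rate that both common frame rates
# divide evenly, so no uneven frame repeats are ever needed.
assert lcm(24, 60) == 120

print(120 // 60)  # each 60 fps frame is shown 2x
print(120 // 24)  # each 24 fps frame is shown 5x
```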


----------



## Mike Bertelson (Jan 24, 2007)

veryoldschool said:


> 120 is used because it works for both 1080/60 [2x] & 1080/24 [5x] and 240 is just "must be better" than 120.


Of course it's better...it's a bigger number.... :lol:

It is odd how some people have problems even though they have TVs that are supposed to display 1080p/24.

When I got my new TV I went into Setup and added 1080p under the resolutions tab. Other than me selecting 1080p, does the HR2x have any idea what the TV is capable of?

Mike


----------



## 325xia (Oct 28, 2006)

Thanks for the input and especially the info on my TV! So, does anything offer 1080p/60??? (i.e., Xbox, Blu-ray, Dish, etc.) From what I'm hearing here, there is nothing.


----------



## DogLover (Mar 19, 2007)

MicroBeta said:


> Of course it's better...it's a bigger number.... :lol:
> 
> It is odd how some people have problems even though they have TVs that are supposed to display 1080p/24.
> 
> ...


If you are connected with HDMI, that information is transmitted back to the DVR.


----------



## Stuart Sweet (Jun 19, 2006)

325xia said:


> Thanks for the input and especially the info on my TV! So, does anything offer 1080p/60??? (i.e., Xbox, Blu-ray, Dish, etc.) From what I'm hearing here, there is nothing.


As far as I know there's nothing that is really encoded in 1080p/60, no.


----------



## 325xia (Oct 28, 2006)

^^^ Ok. Thanks. I guess there is no way to see how well my TV can look. Ha!


----------



## veryoldschool (Dec 10, 2006)

Stuart Sweet said:


> As far as I know there's nothing that is really encoded in 1080p/60, no.


 Video Games, or to put it another way 1080p/60 is a PC output.


----------



## Rich (Feb 22, 2007)

Stuart Sweet said:


> As far as I know there's nothing that is really encoded in 1080p/60, no.


My Sony BD player upscales standard DVDs to 1080/60p (which is, according to Sony, the correct way to "spell" either 1080/24p or 1080/60p). Didn't know that until I got the BD player and hit the "Display" button and got the message that 1080/60p was being output to the TV. Beautiful PQ for a standard, upscaled DVD, by the way. For a BD it says "1080/24p".

I did post a list of all the TVs that were capable of 1080/24p. Don't remember what thread it was in, but it's out there somewhere. I've seen it reproduced in several other threads.

Rich


----------



## Rich (Feb 22, 2007)

I found the list. Here is a link.

Rich


----------



## Rich (Feb 22, 2007)

325xia said:


> I have a 1080p Samsung HDTV. Yet when I try to test 1080p from my DirecTV HD DVR, it says "This TV does not support the DirecTV 1080P Signal". Is there something I need to do in order to get the two to communicate correctly? I have a Samsung LN-S4695D TV and an HR20/700 DirecTV receiver.
> 
> Thanks


Your TV is NOT on the list I just posted. At least I couldn't find it.

Rich


----------



## Rich (Feb 22, 2007)

barryb said:


> Here you will see the specs on your TV, along with refresh rates (on the left):
> 
> http://www.hdtvexpert.com/pages_b/Samsung_LN-S4695D.html
> 
> ...


Huh. Sammy says 1080p/60. Who's correct, Sony or Sammy? Sony says 1080/60p. Must be a standard somewhere, no? Like with the eSATAs: some were calling them ESATAs (which is easier to type), then someone came along and posted the standard, which is eSATA (almost as easy to type, but not your average acronym's spelling).

Rich


----------



## Rich (Feb 22, 2007)

325xia said:


> ^^^ Ok. Thanks. I guess there is no way to see how well my TV can look. Ha!


I bought that Sony BD player for about $300, and if all I did was watch standard widescreen DVDs at 1080/60p (think I'll stick with Sony's spelling until someone posts a standard) it would be worth every cent. I have several upscaling DVD players, Sammys and Sonys (the Sonys are much better and much cheaper), and none of them comes close to the picture pumped out by the Sony BD player when upscaling standard DVDs.

Might not be quite what you want, but it is a viable 1080/60p source. At least it's something. Damn shame that they did that to you guys, especially when they knew 1080/24p was inevitably coming.

Rich


----------



## veryoldschool (Dec 10, 2006)

rich584 said:


> Huh. Sammy says 1080p/60. Who's correct, Sony or Sammy? Sony says 1080/60p. Must be a standard somewhere, no? Like with the eSATAs: some were calling them ESATAs (which is easier to type), then someone came along and posted the standard, which is eSATA (almost as easy to type, but not your average acronym's spelling).
> 
> Rich


Remember who writes this crap.
480i [interlaced]
480p [progressive]
1080i [interlaced]
1080p [progressive]
1080p/60 [60 frames/sec]
1080p/24 [24 frames/sec]


----------



## -Draino- (May 19, 2008)

My TV says it supports 1080P/60 /30 /24

I am trying to test 1080p using something from DirecTV...because I don't have a Blu-ray player.

Is there any FREE content on DTV that I can test with?


----------



## CCarncross (Jul 19, 2005)

Stuart Sweet said:


> As far as I know there's nothing that is really encoded in 1080p/60, no.


Funny you should mention that....several BD titles I recently purchased are in fact 1080p/60 titles.

Firefly: The Complete Series
Stevie Nicks: Soundstage
Heart: Alive in Seattle

I'd have to check, but I suspect a lot of TV series will be 1080p/60 titles; of course, all my BD movies seem to be 1080p/24. Which is probably another reason all the VOD PPV movie content will be 1080p/24. I don't believe anything besides movies is being offered in 1080p.


----------



## Mike Bertelson (Jan 24, 2007)

-Draino- said:


> My TV says it supports 1080P/60 /30 /24
> 
> I am trying to test 1080p using something from DirecTV...because I don't have a Blu-ray player.
> 
> Is there any FREE content on DTV that I can test with?


I'm pretty sure that all the 1080p content is PPV.

At least I couldn't find any free 1080p when I looked last week so I could be wrong...but I don't think so. :grin:

Mike


----------



## mikeinthekeys (Feb 10, 2007)

My Vizio only displays 1080/50p or 1080/60p. Recently after a software download (that I can't discuss here) I tried to play a VOD episode of Weeds. The receiver prompted me if I wanted to see this in 1080p, I said yes, and after that, the list of available resolutions on the Misc Options page showed the usual 480 - 1080i options PLUS 1080/50p and 1080/60p.
I watched the program and it looked great. The TV has a button to display what it was showing and it said 1080p. The receiver had the last two lights on indicating 1080p. I'm not sure why I erased this episode... would love to see if it is repeatable. 
Now after the most recent download, the 1080p options do not appear in Misc Options.
Any ideas on what could have accounted for this? A relic of MRV perhaps.
Apologies if this is in the wrong forum, I am aware of the rules, but this thread seems to be right on topic to address my issue.


----------



## Rich (Feb 22, 2007)

veryoldschool said:


> Remember who writes this crap.
> 480i [interlaced]
> 480p [progressive]
> 1080i [interlaced]
> ...


It's the BD player that says 1080/60p, not the manual. Now I gotta check the manual...Yup, they use the same format. This manual is a lot better than the Panny BD player's was. Must be a standard format somewhere. I must go and search for one. Be back later.

Rich


----------



## veryoldschool (Dec 10, 2006)

rich584 said:


> It's the BD player that says 1080/60p, not the manual. Now I gotta check the manual...Yup, they use the same format. This manual is a lot better than the Panny BD player's was. Must be a standard format somewhere. I must go and search for one. Be back later.
> 
> Rich


 If any of your DVRs have the misc options under setup, DirecTV is following "the standard" with 1080p/60 & 1080p/24


----------



## Rich (Feb 22, 2007)

veryoldschool said:


> If any of your DVRs have the misc options under setup, DirecTV is following "the standard" with 1080p/60 & 1080p/24


Microsoft uses the format 1080/24p too. Back to hunting for more...Too much like work, I'll stick with Sony and Microsoft's format of 1080/XXp. Nasty search, couldn't narrow it down or find a table of standards. As long as we can understand it, I guess either way is acceptable.

Rich


----------



## veryoldschool (Dec 10, 2006)

rich584 said:


> Microsoft uses the format 1080/24p too. Back to hunting for more...
> 
> Rich


 I'll go with "common use" as used here:
http://www.engadgethd.com/tag/1080p24


----------



## Rich (Feb 22, 2007)

Did your Vizio say 1080/60p or 1080p/60? If you remember.

Rich



mikeinthekeys said:


> My Vizio only displays 1080/50p or 1080/60p. Recently after a software download (that I can't discuss here) I tried to play a VOD episode of Weeds. The receiver prompted me if I wanted to see this in 1080p, I said yes, and after that, the list of available resolutions on the Misc Options page showed the usual 480 - 1080i options PLUS 1080/50p and 1080/60p.
> I watched the program and it looked great. The TV has a button to display what it was showing and it said 1080p. The receiver had the last two lights on indicating 1080p. I'm not sure why I erased this episode... would love to see if it is repeatable.
> Now after the most recent download, the 1080p options do not appear in Misc Options.
> Any ideas on what could have accounted for this? A relic of MRV perhaps.
> Apologies if this is in the wrong forum, I am aware of the rules, but this thread seems to be right on topic to address my issue.


----------



## Rich (Feb 22, 2007)

veryoldschool said:


> I'll go with "common use" as used here:
> http://www.engadgethd.com/tag/1080p24


Why is there so much confusion about this? There should be a standard somewhere. As long as we can understand each other, I guess it doesn't matter. Your link even has Sonys on it, and they are listed as 1080p/XX. Curious...

Rich


----------



## litzdog911 (Jun 23, 2004)

-Draino- said:


> ...
> Is there any FREE content on DTV that I can test with?


The first 5 minutes of any PPV movie are free.


----------



## cartrivision (Jul 25, 2007)

325xia said:


> Thanks for the input and especially the info on my TV! So, does anything offer 1080p/60??? (i.e., Xbox, Blu-ray, Dish, etc.) From what I'm hearing here, there is nothing.


Don't know the specifics about the various game boxes, but I think some of them do output at 1080p/60, and all Blu-ray players output at 1080p/60 even though the source material is coming off the disc at only 24fps or 30fps.


----------



## cartrivision (Jul 25, 2007)

litzdog911 said:


> The first 5 minutes of any PPV movie are free.


All the 1080p PPV in the "Movies Now" folder (or whatever it's called these days) now give you about an 8 minute free preview. I don't know if all PPVs are now the longer 8 minutes or some are still at 5 minutes.


----------



## dcowboy7 (May 23, 2008)

litzdog911 said:


> The first 5 minutes of any PPV movie are free.


Not the adult PPV.


----------



## -Draino- (May 19, 2008)

dcowboy7 said:


> Not the adult PPV.


 Figures!!!


----------



## mikeinthekeys (Feb 10, 2007)

rich584 said:


> Did your Vizio say 1080/60p or 1080p/60? If you remember.
> 
> Rich


Sorry I missed your post... the manual never uses either format; they say 1080p in the specs, and then in other separate references state that they support 60 Hz. The on-screen display shows only 1080i or 1080p.


----------



## Rich (Feb 22, 2007)

mikeinthekeys said:


> Sorry I missed your post... the manual never uses either format; they say 1080p in the specs, and then in other separate references state that they support 60 Hz. The on-screen display shows only 1080i or 1080p.


That's what my Panny TVs do too. Unfortunate, but better than nothing. My Sony BD player does tell me exactly what the output is, or I would have never known.

Rich


----------



## mikeinthekeys (Feb 10, 2007)

I don't know if you saw my recent post on this strange thing...
When viewing a VOD episode of Weeds, when I started it, the HR20-700 prompted me that the program was available in 1080p and did I want to view it that way. Of course I did, and the display (according to the TV) showed 1080p. Unfortunately I didn't save it... not sure why, but I tried to download it again and didn't get a repeat of the 1080p. At first the bottom third of the screen was distorted, but it cleared up after a few seconds and looked spectacular (hard to say if it was really better than 1080i, but it did look good). And the TV thought it was getting a progressive signal, so I'm pretty sure the HR was putting out a progressive signal (the last two indicator lights on the box were on). I have never seen anything like this, and I have not purchased a 1080p movie to see what happens.


----------



## ticmxman (Aug 28, 2007)

rich584 said:


> That's what my Panny TVs do too. Unfortunate, but better than nothing. My Sony BD player does tell me exactly what the output is, or I would have never known.
> 
> Rich


Well, I put my new Panasonic 1080p TC-P46S1 plasma to the 1080p test by downloading and watching a movie last night. It passed the test with a great picture.
So at least until I get a BD player I have some 1080p content available. But the 24-hour time limit will push me to get a BD player pretty quick. Any suggestions?
Tic


----------



## Ken_F (Jan 13, 2003)

cartrivision said:


> Don't know the specifics about the various game boxes, but I think some of them do output at 1080p/60, and all Blu-ray players output at 1080p/60 even though the source material is coming off the disc at only 24fps or 30fps.


Eventually, all DirecTV DVRs will too.

DirecTV (and Dish Network) DVRs are limited by the existing Broadcom DVR SoCs, which all support a maximum of 1080p24 output. Earlier this year, Broadcom announced its first DVR SoCs with full 1080p60 support; these are expected to ship by the end of the year, and we should see them in new DVRs next year.


----------



## Rich (Feb 22, 2007)

ticmxman said:


> Well, I put my new Panasonic 1080p TC-P46S1 plasma to the 1080p test by downloading and watching a movie last night. It passed the test with a great picture.
> So at least until I get a BD player I have some 1080p content available. But the 24-hour time limit will push me to get a BD player pretty quick. Any suggestions?
> Tic


I got the Panny 605 BD player from Costco. It comes with an HDMI cable, which is the only difference from a Panny 60. Couldn't get the sound to work right, called Panny tech service, they said it must be a bad box, and I took it back. Bought a Sony BDX 1 and couldn't be happier.

Rich


----------



## mikeinthekeys (Feb 10, 2007)

In re-reading this thread... there is a way to test your TV's ability to play 1080p for free. Select one of the 1080p choices in Direct Cinema that shows a green dot with checkmark. Those movies are already on your drive. It will play for around 5 minutes... enough for you to see the results. The last two resolution lights on your front panel will light, and you can check your TV for what it is getting (if it allows that).

In my case, my Vizio displays these movies just fine, though spec'd for 1080p/50 and 1080p/60.


----------



## Rich (Feb 22, 2007)

mikeinthekeys said:


> In re-reading this thread... there is a way to test your TV's ability to play 1080p for free. Select one of the 1080p choices in Direct Cinema that shows a green dot with checkmark. Those movies are already on your drive. It will play for around 5 minutes... enough for you to see the results. The last two resolution lights on your front panel will light, and you can check your TV for what it is getting (if it allows that).
> 
> In my case, my Vizio displays these movies just fine, though spec'd for 1080p/50 and 1080p/60.


Doesn't it tell you what it is receiving? No "Info" button? I thought the Vizios had some way of telling you exactly what they are receiving. My Panny plasmas only tell me 1080p, but my Sony BD player tells me what it is outputting: 1080/60p for regular DVDs and 1080/24p for Blu-ray.

And that 1080/60p is a lot better than 1080i. I've got Sony and Sammy upscalers, but the Sony really upscales regular DVDs much better.

Rich


----------



## ticmxman (Aug 28, 2007)

rich584 said:


> I got the Panny 605 BD player from Costco. It comes with an HDMI cable, which is the only difference from a Panny 60. Couldn't get the sound to work right, called Panny tech service, they said it must be a bad box, and I took it back. Bought a Sony BDX 1 and couldn't be happier.
> 
> Rich


Thanks, I'll be checking out the Sonys


----------



## Rich (Feb 22, 2007)

ticmxman said:


> Thanks, I'll be checking out the Sonys


Doubt if you'll find that model. I would expect the Sony BD players to keep evolving into better players.

I went through just about every VCR and ended up using Sonys only. My first VCR was a Sony Betamax, and I had about twelve Sony VHS units by the time I discovered DVRs.

Anyhow, for about twelve years I tried every high-end VCR that I could find. The Mitsubishi VCRs were easy to use and had a great remote. They did everything but pump out a picture that equaled the Sony's. I never saw a VCR that put out a better picture than a Sony.

I hope to have the same experience with the Sony BD players. (Yes, I know the PQ is equal on all BD players; it's the other things about the Sony BD players that I like, such as a manual you can read and understand.) I do wish they could stream Netflix as some BD players do.

Rich


----------



## ThePrisoner (Jul 11, 2009)

I must say that I'm enjoying D*'s 1080p movies. I still miss the lossless audio of Blu-ray, but if I want to see a movie that's a long wait in my Netflix queue, I can deal with it. Last night I watched Watchmen VOD, and I'll probably go out sometime today and pick up the Blu-ray. I'm currently using a Panasonic BD30, which I purchased in March '08. It has been flawless, although I'm thinking of getting the Oppo BDP-83. Oh, my display is a Panasonic 50PZ800U THX plasma, capable of 1080p/24.


----------



## V'ger (Oct 4, 2007)

The following is specifically for digital transmission over HDMI.

For 24 fps source material (i.e. film), MPEG compression reduces the transmission over the satellite or internet down to two frames. To get 24p or 60p is just a matter of how many times each frame is repeated across the HDMI cable.

1080i has the same number of pixels, and if you deinterlace the fields in the TV, it too can be boiled down to two frames or four fields. If the DVR interlaces the video perfectly and the TV deinterlaces it properly, the TV will show the image pixel for pixel, as if the DVR could put out 1080p/60 natively. So, for 1080p/24 source material only, there is no difference between 1080i and 1080p/60. For 1080p/60 source material there would be a great loss of detail if you were forced to watch a program in 1080i.

There is a small quality difference for sets whose panels can display 1080p/24 at a true multiple of 24 fps (i.e. 48, 72, or 120 Hz), because every frame gets repeated the same number of times. To go from 24 fps to 60 fps, one frame is repeated three times and the next is repeated twice (or fields in lieu of frames for 1080i). The advantage of repeating frames an equal number of times is that it gives a movie-theater-like effect on pans and scrolling end credits, where fine detail would otherwise appear to start and stop (called judder).
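The cadence described above can be sketched numerically (a toy Python illustration, not code any TV actually runs):

```python
# 24 fps -> 60 Hz: frames alternate between 3 and 2 repeats (the 3:2
# pulldown cadence), so motion advances unevenly -- the source of judder.
cadence_60 = [3 if i % 2 == 0 else 2 for i in range(24)]
assert sum(cadence_60) == 60   # 12*3 + 12*2 fills one second at 60 Hz

# 24 fps -> 120 Hz: every frame is shown exactly 5 times, so the
# cadence is uniform and pans stay smooth.
cadence_120 = [120 // 24] * 24
assert sum(cadence_120) == 120
```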

Now, DirecTV 1080p movies downloaded from the internet have better multipass compression applied to them, so action spots in the movie get more bandwidth when needed, versus the realtime compression used for anything transmitted over the satellite. So there is an advantage for people who watch 1080p/24 movies in 1080i.

Some TVs will accept 1080p/24 and scale it to 1080p/60 because the display panel only does 60 Hz. So you have to read the specs carefully, as I am sure some marketing types will stretch the truth and call their sets 1080p/24 compatible even though the TV internally upconverts to 60 Hz.

Finally, some TVs change brightness, contrast, and other parameters when switching from 24p to 60p, so in some instances 60p might look better to the end user. Some sets may not deinterlace the 1080i signal properly and will not give a pixel-for-pixel-perfect representation of the original image. So, in general, it is always best to watch a program at its native resolution to keep scaling in the DVR or TV set from altering the picture.


----------



## TomCat (Aug 31, 2002)

V'ger said:


> The following is specifically for digital transmission over HDMI.
> 
> For 24 fps source material (i.e. film), MPEG compression reduces the transmission over the satellite or internet down to two frames. To get 24p or 60p is just a matter of how many times each frame is repeated across the HDMI cable.


I hardly think this could be anywhere close to accurate. MPEG compression does not reduce the number of frames in any way unless the content has pullup. For typical video at 30 (29.97) fps, 30 (29.97) fps are encoded and later decoded. For 24 fps, 24 fps are encoded/decoded. MPEG compression reduces the amount of data by removing redundant data and data that falls outside of psychovisual perception limitations of human vision. The number of frames encoded is left alone, with the exception of "film mode".

In film mode, content that has 2:3 pullup is detected and the redundant frames are not transmitted. This means that if there are 60 fps only 24 are encoded, transmitted, and decoded (the rest are copies of other frames anyway, and are therefore 100% redundant). But 3:2 pulldown is added in the decoder (the missing frames are recreated from the transmitted frames simply by clocking them out of the decoder buffer more than once--repeating them. This is done whenever the "film mode" metadata flag is set during encoding) so that content will still display with the 60 frames that it originally had, even though only 24 are transmitted.
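That film-mode round trip can be modeled in a few lines (a toy sketch with made-up names, just to illustrate the idea; real encoders signal this with metadata flags rather than comparing frames):

```python
def telecine(frames24):
    """Apply 3:2 pulldown: repeat film frames to fill a 60-frame stream."""
    out = []
    for i, frame in enumerate(frames24):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

def encode_film_mode(frames60):
    """Drop the redundant repeated frames; keep a flag saying we did so."""
    uniques = [f for i, f in enumerate(frames60) if i == 0 or f != frames60[i - 1]]
    return uniques, True  # payload + "film mode" flag

def decode(frames, film_mode):
    """Restore the original 60-frame cadence by re-repeating frames."""
    return telecine(frames) if film_mode else frames

film = list(range(24))                        # one second of film, 24 unique frames
sent, flag = encode_film_mode(telecine(film))
assert len(sent) == 24                        # only 24 frames transmitted, not 60
assert decode(sent, flag) == telecine(film)   # display cadence fully restored
```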

In any case, intermediary compression has nothing at all to do with a difference in original fps and displayed fps. Those are always the same. 60 frames may be reduced to 24 between encode and decode, but certainly not to any rate below that and the original frame rate is always restored after decode.

HDMI is not involved in this process and has nothing at all to do with it, nor is it affected by it. It is something totally unrelated. HDMI is a protocol for delivery of digital video in an uncompressed state, long after remote MPEG encoding, transmission, and local MPEG decoding are over with. The HDMI transmit chip inside the DVR has no earthly idea what may have happened with the signal prior to receiving it for formatting to HDMI, nor does it care. Its one purpose in life is to take streaming, now-uncompressed digital video from the output of the MPEG decoder and transmit it over a cable to an HDMI receiver chip, and that is all.



V'ger said:


> 1080i has the same number of pixels...
> 
> ...If the DVR interlaces the video perfectly and the TV deinterlaces it properly, the TV will show the image pixel for pixel, as if the DVR could put out 1080p/60 natively. So, for 1080p/24 source material only, there is no difference between 1080i and 1080p/60. For 1080p/60 source material there would be a great loss of detail if you were forced to watch a program in 1080i.


The same number of pixels as what?

Neither 1080i30 nor 1080p24 can be reconstituted by any DVR or STB to "as if it could put out 1080p/60 natively". Neither has the pixel rate or sheer data rate required for 1080p60, which is also the reason there is no 1080p60 content in the consumer world (no transmit infrastructure can handle it and no download speed is practical for it). Both have an obviously lower frame rate, meaning significant motion artifacting not present in 1080p60. 1080i30 also has interlace error that is not reversible. 1080p60 content can indeed be interlaced and deinterlaced without inducing interlace error, but interlace error already in the content (such as 1080i30, which is what most HD is) will remain when your 1080p set deinterlaces it to 1080p60. You can't make chicken salad out of chicken $#!+, and you can't make 1080p60 out of either 1080i30 or 1080p24. Neither is quite as good.

But on the other hand, even 1080p60 is not really a significant improvement over 1080i30 or 720p60, and 1080p24 certainly isn't. The amount of detail is fixed by the pixel map, which is exactly the same for 1080i and 1080p (all flavors), so they all, including 1080i30, 1080p24 and 1080p60, start out with the exact same amount of detail, and maintain that for static objects. The small advantage that 1080p60 has (an advantage shared by 720p60) is that 1080i shows a perceived reduction in detail when there is motion.

Is that a big deal? Hardly. Real human vision suffers from the exact same loss of perceived detail when there is motion across the field of vision, so we would barely notice the difference anyway, and it would seem perfectly natural. The stuttering motion of 1080p24 is much less natural and more noticeable, because it does not emulate the real life reduction of perceived detail as does 1080i30. 1080p60 and 720p60 suffer much less from that due to the higher frame rate.

This in some ways makes 1080p24 the worst of all choices, rather than the best or an improvement over 1080i30 or 720p60. And since an isolated frame of 1080i30 is exactly the same as one of 1080p24, it really makes one wonder what the heck people think they are seeing that they describe as "better" or "richer" or "more colorful" or with "more detail" (we've already established as a hard fact that there is no more color or detail in 1080p24). Placebo effect is more likely what they are experiencing (if I convince you that this will be better, that's how you will perceive it, whether it is or not).

And there seems to still be great confusion over the fact that the display rate 1080p60 and the content format 1080p60 happen to have the same nomenclature. They are two completely different things relating to two completely different processes, which just happen to go by the same name. A 1080p60 TV simply means that it has a native pixel map of 1920x1080 pixels and displays progressively at 60 fps. It does not mean anything more. It does not imply that it will display content that originates in the 1080p60 acquisition format and it does not imply any of the benefits of that format at all. Joe Schmoe and Joe the plumber are both called "Joe" because they both have the same first name, but they are still two different people. A distinction between 1080p60 the display format and 1080p60 the content format is really not that hard of a distinction to make, and doing so could save folks a lot of grief and allow people not to waste so much time and effort chasing their tails over it.

Does anyone remember the Firesign Theatre TV ad from the 70's implying that used cars had air conditioning (when they really didn't) by claiming that they had "factory air-conditioned air from our air-conditioned factory"? A brilliant marketing claim, if you are a complete huckster, that is. Current 1080p claims are almost as misleading.

Bottom line, 1080p60 TVs are everywhere. 1080p60 content doesn't really even exist. More importantly, the promise that 1080p60 content holds (which is pretty thin itself) is not implied by any 1080p60 TV. Blu-Ray movies that claim they are 1080p60 are telecine'd from movies that originally have a 24 fps frame rate, so claiming them as true 1080p60 content is nothing more than a lie.


----------



## P Smith (Jul 25, 2002)

Progress won't stop at 1080p24 for film-sourced material - there are 4K cameras now, new compression methods coming, new standards for high bitrates... We will see it! True 1080p60.


----------



## Maruuk (Dec 5, 2007)

So are all D* VOD HD 1080p movies running at 24fps? I watched one on my 2007 Vizio set for 1080p and it looked fine, though I doubt it had the new 24fps feature. And what about the new films like the latest Star Wars which was shot in HD video, not film, at 60fps and I presume they want to keep it that way since that's how they project it in theaters with HD projectors.


----------



## Rich (Feb 22, 2007)

Maruuk said:


> So are all D* VOD HD 1080p movies running at 24fps? I watched one on my 2007 Vizio set for 1080p and it looked fine, though I doubt it had the new 24fps feature. And what about the new films like the latest Star Wars which was shot in HD video, not film, at 60fps and I presume they want to keep it that way since that's how they project it in theaters with HD projectors.


I posted a list a while ago that had all the TVs that would play in 1080/24p. And those that wouldn't.

Rich


----------



## BattleZone (Nov 13, 2007)

Maruuk said:


> And what about the new films like the latest Star Wars which was shot in HD video, not film, at 60fps and I presume they want to keep it that way since that's how they project it in theaters with HD projectors.


Star Wars Ep 1 was shot on film, with a few shots done with an early prototype Sony CineAlta F-900, with 1440x1080 resolution. Ep 2 was filmed with a Panavised production CineAlta F-900, and Ep 3 was filmed with the newer CineAlta F-950, which shoots in 1920x1080. In all cases, the camera shoots 16:9 (1.77:1), and the resulting video was cropped to 2.35:1, thus using only 817 of the 1080 horizontal lines. And in all cases, the video was 24p, which is the native format of the CineAlta.
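The 817-line figure follows straight from the aspect-ratio arithmetic; a quick sketch:

```python
# Cropping a 16:9 (1.78:1) 1920x1080 frame to a 2.35:1 'scope ratio:
# the width stays 1920, so the active picture height shrinks accordingly.
width, height = 1920, 1080
scope = 2.35
active_lines = round(width / scope)  # lines actually carrying picture
letterbox = height - active_lines    # lines lost to the black bars

print(active_lines, letterbox)  # 817 263
```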

http://en.wikipedia.org/wiki/CineAlta


----------



## TheRatPatrol (Oct 1, 2003)

I have a 2008 Panasonic 85U and it supports 1080p/24 via 3:2 pulldown. Is that normal?

Thanks


----------



## jediphish (Dec 4, 2005)

Directv still needs to allow a "force 1080p/24" option like there is on the PS3.

I have a Pioneer Elite Pro1130HD that will "unofficially" accept and process a 1080p/24fps signal (confirmed in use with a PS3). It downrezzes to 768p but plays virtually judder-free because it repeats each of the 24 frames 3 times at a 72hz rate. This is the only way to get "3:3" in HD on this set because its cadence detection and repeat mode is normally only set to work on 480 material. It otherwise accepts 720p/60 and 1080i/30(60) material (displaying at 60hz).

Because its acceptance of 1080p/24 is unofficial, the box for 1080p is greyed out and I cannot check it on the D* resolutions screen. I imagine this is a result of the HDMI handshake confirming a maximum of 1080i acceptance "officially."

Again, a "force" option should be available, even if it is a backdoor.


----------



## P Smith (Jul 25, 2002)

> I have a 2008 Panasonic 85U and it supports 1080p/24 via 3:2 pulldown. Is that normal?


For accepting those satellite movies at 1080p24 it would be OK, but 5:5 conversion on the latest 120 Hz models will produce a smoother picture (as will the rarer 96 Hz or at least 72 Hz models).
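The cadence difference being described can be sketched like this (`cadence` is a hypothetical helper just to show the repeat pattern): at 60 Hz, 24 fps frames get an uneven 3-2-3-2 repeat (3:2 pulldown), while 120 Hz or 72 Hz displays can give every frame equal screen time (5:5 or 3:3):

```python
# Sketch of how 24 fps frames map onto display refreshes.
def cadence(fps, refresh_hz, frames=4):
    # Distribute refresh_hz refreshes per second across fps frames;
    # an uneven split is what produces the judder of 3:2 pulldown.
    repeats, shown = [], 0
    for f in range(1, frames + 1):
        target = int(f * refresh_hz / fps + 0.5)  # cumulative refreshes
        repeats.append(target - shown)
        shown = target
    return repeats

print(cadence(24, 60))   # [3, 2, 3, 2]  -> 3:2 pulldown, uneven motion
print(cadence(24, 120))  # [5, 5, 5, 5]  -> 5:5, every frame equal time
print(cadence(24, 72))   # [3, 3, 3, 3]  -> 3:3 on a 72 Hz display
```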


----------



## veryoldschool (Dec 10, 2006)

jediphish said:


> Directv still needs to allow a "force 1080p/24" option like there is on the PS3.
> 
> I have a Pioneer Elite Pro1130HD that will "unofficially" accept and process a 1080p/24fps signal (confirmed in use with a PS3). It downrezzes to 768p but plays virtually judder-free because it repeats each of the 24 frames 3 times at a 72hz rate. This is the only way to get "3:3" in HD on this set because its cadence detection and repeat mode is normally only set to work on 480 material. It otherwise accepts 720p/60 and 1080i/30(60) material (displaying at 60hz).
> 
> ...


I can force the receiver to output to my 1080p/60 Sony, by going into the setup menu and selecting the 1080p resolution. When I check this option, the receiver outputs this and [if I could see it] has the same message asking me to press "OK" [select] as with other resolutions. I can "fake it" by pressing the remote during this time, even though I can't see the message.
Fairly useless for me since the TV has the unsupported signal message, but if your TV will show an image, you should have no problems "manually" selecting the 1080p resolution in the setup menu.


----------



## skyboysea (Nov 1, 2002)

325xia said:


> I have a 1080P Samsung HDTV. Yet, when I try and Test the 1080P from my DirecTV HD DVR, it says "This TV does not support the DirecTV 1080P Signal". Is there something I need to do in order to get the two to communicate correctly? I have a Samsung LN-S4695D TV and a HR20/700 DirecTV Receiver.
> 
> Thanks


325xia,

I saw this thread just today. I have the same set you have and it does support 1080p/24. Just make sure you have the last firmware. See the 4x95/96 thread on AVSForum for more information.


----------



## Rich (Feb 22, 2007)

TheRatPatrol said:


> I have a 2008 Panasonic 85U and it supports 1080p/24 via 3:2 pulldown. Is that normal?
> 
> Thanks


Yes, I have one too.

Rich


----------



## jediphish (Dec 4, 2005)

veryoldschool said:


> I can force the receiver to output to my 1080p/60 Sony, by going into the setup menu and selecting the 1080p resolution. When I check this option, the receiver outputs this and [if I could see it] has the same message asking me to press "OK" [select] as with other resolutions. I can "fake it" by pressing the remote during this time, even though I can't see the message.
> Fairly useless for me since the TV has the unsupported signal message, but if your TV will show an image, you should have no problems "manually" selecting the 1080p resolution in the setup menu.


Problem for me is I cannot "select" the 1080p box on the setup screen, so getting to this next step is impossible. I assume this is because my TV does not tell the receiver that 1080p is a supported format. So, the box is grayed out ... I can't even cursor down to it using the arrow keys. The PS3 has a specific option to "force" 24fps output.


----------



## veryoldschool (Dec 10, 2006)

jediphish said:


> Problem for me is I cannot "select" the 1080p box on the setup screen, so getting to this next step is impossible. I assume this is because my TV does not tell the receiver that 1080p is a supported format. So, the box is grayed out ... I can't even cursor down to it using the arrow keys. The PS3 has a specific option to "force" 24fps output.


OK, it must be because mine is a 1080p TV. Even though it reports to the receiver that it's 1080p/60, I still have the option.
My Vizio doesn't have the option either, but it's only a 1368 x 768 panel.


----------



## mikeinthekeys (Feb 10, 2007)

veryoldschool said:


> OK, it must be because mine is a 1080p TV, even though it reports to the receiver that it's a 1080p/60, I still have the option.
> My Vizio doesn't have the option either, but it's only a 1368 x 768 panel.


I have been hesitant to post this because I can't explain what is happening. However, it is absolutely repeatable... so here goes:
When I reboot the HR20-700, it comes up with resolutions (on Misc Opt) of 480i, 720p, and 1080i only. However, when a BD is playing through an HDMI switch to my projector (output on the BD set for 1080p/24), if I turn the Vizio off and back on, the grayed-out box on the system/hdtv/resolutions page becomes clear and can be selected. Then, by doing that and pressing the info button, it says 1080p is available. Back on the Misc Opt page, 1080p/50 and 1080p/60 are added to the list of available resolutions. Several times, the Misc Opt page showed the added resolutions without my doing anything.

Playing a PPV movie in 1080p plays just fine on the Vizio (120hz) and it shows that it is playing a progressive picture... the screen displays 1080p but not the frame rate. I have gone through this process many times before reporting this, but I would love to know what any of you think is going on. Somehow it appears that the HDMI switch is communicating something back to the DVR even if it is not selected. Is this explainable?

Also, there are no noticeable artifacts (frames being dropped, or tearing) while viewing these PPV 1080p movies on the Vizio. The pictures may or may not be better than 1080i versions, but they look pretty good to me... and I'm pretty fussy about this.


----------



## Maruuk (Dec 5, 2007)

So are all the 1080p VOD movies on D* actually being presented at 24fps?


----------



## ColdCase (Sep 10, 2007)

Tom Cat tried to clear this up, but I think this thread has way too much misinformation in it to be rescued. It's just incredible; anyone reading this sequence should pretty much forget it. Go over to one of the HD sites, look at CNET reviews, do a wiki search on 1080p, and you will find much more credible, accurate and useful information. You should also be able to find out what your TV is up to. There is more to it than can be described in a short forum post. The AV forums have sections dedicated to TV models and technology, and the participants there may be much more knowledgeable about your video equipment than those here. You will get better answers there, I think.

As was mentioned elsewhere, there is a lot of marketing hype, especially over at Sony, that misled many into thinking they had just purchased the best thing since sliced bread and made them happy to pay extra for it... when in fact they have a poorly performing video system. It's near criminal in nature how customers are duped, IMHO.

Remember that a 1080 picture, whether delivered as 1080i or 1080p, has the exact same resolution; one does not provide more picture detail than the other. There are some differences in motion artifacts due to how the 1080 lines are delivered to you, but that depends on the source, compression applied by the channel, TV processing of the signal, etc. Currently 1080p60 would be ideal, but it has been way too expensive to distribute. 720p is another HD resolution, with not quite as much detail, but better than the 480 lines you get from a DVD.

There are exceptions, but movies are typically shot at 1080p/24, and video enthusiasts endeavor to deliver that video from the camera to your TV screen without modification. Blu-ray and HD DVD have the pure 1080p/24 on the disc; DirecTV, FiOS, and others have the pure 1080p/24 on their servers, but there is plenty of opportunity to screw it up along the way to your screen. There are Pioneer TVs that can pull the original 1080p/24 video out of a 1080i/60 delivery mechanism... but I digress.
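Pulling the original film frames back out of a 1080i/60 stream is called inverse telecine: after 3:2 pulldown, fields repeat in a fixed cadence, so the original 24 progressive frames can be recovered. A highly simplified toy sketch (each field is treated as a whole-frame label, which glosses over real top/bottom field matching):

```python
# Toy inverse telecine: 24p frames -> 3:2 telecined "60i" fields -> back to 24p.
def telecine(frames):
    fields = []
    for i, frame in enumerate(frames):
        # Alternate 3 fields / 2 fields per frame: classic 3:2 pulldown.
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

def inverse_telecine(fields):
    # Collapse each run of repeated fields back into one unique frame.
    frames = []
    for field in fields:
        if not frames or frames[-1] != field:
            frames.append(field)
    return frames

original = [f"frame{i}" for i in range(24)]  # one second of film
fields = telecine(original)
print(len(fields))                           # 60 fields per second
print(inverse_telecine(fields) == original)  # True -- film frames recovered
```

Real inverse telecine has to detect the cadence from field differences and survives edits that break the pattern; this just shows why recovery is possible at all.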

Typically sports and live TV are shot at 720p/60 or 1080i/30, depending on the broadcaster (25 and 50 fps are the European standards), and sports enthusiasts endeavor to deliver that to your screen without conversion or compromise. A lot of TVs do 720p or 1080i at 30 or 60 fps just fine; very few do 24fps right, and many Blu-ray players screw up the 24fps video before delivering it to your TV.

I'm guessing that DirecTV may also apply some compression to the 1080p/24 video it has on its servers in order to get it delivered to your box.

An upconversion never provides more detail than the source; the converter makes up, or guesses at, the missing information, some better than others, and some TVs convert better than DVD players. In the end, the casual viewer may not be able to tell the difference anyway.


----------



## ColdCase (Sep 10, 2007)

mikeinthekeys said:


> I have been hesitant to post this because I can't explain what is happening.


HDMI has a handshake between components so that both systems can negotiate the best video rates, but some companies don't implement it right, so you end up with much confusion. You can't always trust the resolution the TV is indicating, either. You are probably not getting 24fps directly; there is likely a conversion of some type along the way.
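For what it's worth, that handshake works roughly like this: the TV advertises its supported modes via EDID over HDMI's DDC line, and the source offers only modes on that list, which is why the 1080p box gets greyed out on some sets. A simplified model (the mode lists here are made up, and real EDID is a binary structure, not strings):

```python
# Simplified model of HDMI mode negotiation (hypothetical data, not a
# real EDID parser): the source enables only modes the sink advertises.
SOURCE_MODES = ["480i", "720p60", "1080i30", "1080p24", "1080p60"]

def allowed_modes(sink_edid_modes):
    # Anything the TV's EDID doesn't list gets greyed out in the menu.
    return [m for m in SOURCE_MODES if m in sink_edid_modes]

tv_2005 = ["480i", "720p60", "1080i30"]                        # older set
tv_2008 = ["480i", "720p60", "1080i30", "1080p24", "1080p60"]  # newer set

print(allowed_modes(tv_2005))  # ['480i', '720p60', '1080i30']
print(allowed_modes(tv_2008))  # all five; 1080p24 becomes selectable
```

A buggy EDID on either end produces exactly the confusion described above: the source falls back to whatever subset it can confirm.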

There are experts at the AVS forum that may be able to help you with your particular model of TV. ( http://www.avsforum.com/avs-vb/ ). Guys here know their DIRECTV, but not so much the nuances of various TV models.


----------



## BattleZone (Nov 13, 2007)

Maruuk said:


> So are all the 1080p VOD movies on D* actually being presented at 24fps?


Yes. All "1080p" VOD movies are formatted in 1080/24p, as are nearly all Blu-Ray movies (Blu-Rays have more bandwidth available, so they can use higher bitrates and more/higher-quality audio tracks as well). For those folks whose TVs don't accept 1080/24p signals, the receiver can convert to 1080/60i, which all 1080-capable TVs will accept. As long as the TV can deinterlace decently, there won't be a lot of difference.

TVs that can display 1080/24p content at 24 fps (using a refresh that's an even multiple of 24) will give the best possible display. Generally, any "motion enhancement" has to be turned OFF in order for this mode to work properly.


----------



## TomCat (Aug 31, 2002)

ColdCase said:


> ...Go over to one of the HD sights, look at CNET reviews. do a wicki on 1080p, and you will find much more credible, accurate and useful information. You should be able to also find what your TV is up to. There is more to it than be described in a short forum post. The AV forums have sections dedicated to TV models and technology, and the participants there may be much more knowledgeable about your video equipment than those here. You will get better answers there, I think...


I think you can find just as much misinformation from what are supposed to be "credible, accurate, and useful" sources. For instance, the manual for the HD DVR+ states that HDMI has better PQ than component. That is just not in any possible way a true statement. That is an assumption, a jump to a conclusion based on expectation, laziness, or a little of both. Whoever wrote that simply did not do their homework. They took a shortcut and jumped to an incorrect conclusion instead.

Most of what you will get here or nearly anywhere is poorly-informed opinion based on very little real understanding. What is often discussed here is not as simple as it might seem. But then you might also be surprised that some of the information here is as credible, accurate, and useful as it ever gets, if you know where to look and who to believe.

There is a huge difference between "understanding" complex concepts by virtue of a few accumulated hours of googling compared to understanding them due to a lifetime of working directly in the field. Those who fall into the latter category typically write above the heads of those in the first category who might even accidentally stumble on the information anyway, not because they are smarter or more intelligent, but because they have had the advantage of a lifetime of working towards that understanding, which like everything else in life that is complex, takes a good deal of time and effort. A few days spent on WebMD.com is nothing at all like 4 years of medical school at Northwestern University. The level of understanding is quite different, and the level of misunderstanding is also.

No one can be expected to really understand these concepts intimately if it is not their field of endeavor, just like I can't sit having a beer and watching the game and simultaneously give sage advice to Derek Jeter about how to hit with men on 1st and 2nd and 1 out if I have never swung a bat. These are just not simple concepts, and folks sometimes think they understand them when they really do not, and they sometimes forward incorrect info along, possibly even in good faith, making it all the more confusing to those attempting to understand.



ColdCase said:


> ......Remember that a 1080 picture, whether delivered 1080i or 1080p has the exact same resolution, one does not provide more picture detail than the other. There are some differences in motion artifacts due to how the 1080 lines are delivered to you, but that depends on the source, compression applied by the channel, TV processing of the signal, etc...


And here is a prime example of that. I agree with the statement about resolution (as it is pretty much exactly how I described it earlier), but compression has nothing at all to do with motion artifacts that are due to interlace, judder, pulldown, or other issues related to the differences between 1080p and 1080i. Compression will only affect the level of motion artifacts due to compression issues. While it is true that "there are some differences in motion artifacts" regarding 1080i and 1080p, and it is also true that the level of all motion artifacting can be partly due to "compression applied by the channel", one does not inform the other, and one is not due to the other. Motion artifacts from compression are a separate issue from motion artifacts from interlace, from judder, and from pulldown.

All of these different sorts of motion artifacts are distinct and separate, all can be traced to their individual causes, and none of them interact (other than to create a final aggregate level of reduction in PQ) with each other. To begin to understand this, you must be able to understand the physics behind each of them and realize why they occur from that viewpoint, which automatically reveals that they are all separate and distinct, rather than a big fuzzy mess of interrelated variables, which they are absolutely not. If you can't understand the physics of it, you will have a very difficult time understanding the distinctions revealed by that, and why they are indeed separate issues.

To sum up, compression (and its motion artifacts), while a bit more difficult to do without motion artifacts on interlaced rather than progressive content, has nothing whatsoever to do with judder, pulldown, or interlace-induced motion artifacts. They are separate issues. Speaking about them as if they are not separate or as if they do interact only tends to cloud the issue and prohibit true understanding.


----------



## P Smith (Jul 25, 2002)

TC, can we dig into your statement "_HDMI has better PQ than component. That is just *not in any possible way a true statement*_"? I would like to hear a technical explanation from your side; don't hesitate to go to the deepest technical level - I'll take it.


----------



## texasbrit (Aug 9, 2006)

P Smith said:


> TC, can we dig into your statement "_HDMI has better PQ than component. That is just *not in any possible way a true statement*_". I would like to hear technical explanation from your side; don't hesitate use any deepest technical level -I'll take it.


You'll probably get other replies on the subject, but I'll just refer you to this article http://www.bluejeanscable.com/articles/dvihdmicomponent.htm?hdmiinfo written by someone who has no stake in the answer.


----------



## ztrips (Nov 28, 2007)

texasbrit said:


> You'll probably get other replies on the subject, but I'll just refer you to this article http://www.bluejeanscable.com/articles/dvihdmicomponent.htm?hdmiinfo written by someone who has no stake in the answer.


So after reading that article, I have a question... Does component video support the same quality/detail as the "new" HDMI 1.3/deep color/etc. add-ons? I would assume that the analog connections have near-unlimited bandwidth to transport the additional data? Have component connections always had "deep color"?

Thanks.


----------



## P Smith (Jul 25, 2002)

texasbrit said:


> You'll probably get other replies on the subject, but I'll just refer you to this article http://www.bluejeanscable.com/articles/dvihdmicomponent.htm?hdmiinfo written by someone who has no stake in the answer.


:down:
Sorry, after reading that custom-made report, which seems intended to create doubts in the head of a technically inexperienced user (I see that as the main purpose of the article) instead of providing comparable facts, I see no technical basis for selecting component over DVI/HDMI.
Nope, it doesn't count as a valuable source to support that conclusion. I'm still waiting for his response.


----------



## mikeinthekeys (Feb 10, 2007)

texasbrit said:


> You'll probably get other replies on the subject, but I'll just refer you to this article http://www.bluejeanscable.com/articles/dvihdmicomponent.htm?hdmiinfo written by someone who has no stake in the answer.


Very instructive article... thanks for the reference.


----------



## TomCat (Aug 31, 2002)

P Smith said:


> TC, can we dig into your statement "_HDMI has better PQ than component. That is just *not in any possible way a true statement*_". I would like to hear technical explanation from your side; don't hesitate use any deepest technical level -I'll take it.


Well, it doesn't even have to be that technical, happily enough, because it is not really as complex as some of the other issues.

And, BTW, BlueJeans cable most definitely has a vested interest in swaying our understanding to their benefit, while I really have no dog in this fight. They sell cables and want you to buy them. Some of the articles I have read there are riddled with falsehoods and poorly-understood assumptions. I, on the other hand, am just after truth, something hard to come by on their web site. Sadly, they may even believe their own hype, but hype it still is.

The signal on component is really the same signal on HDMI. It is presented a little differently, in a different protocol and a different domain (analog rather than digital for component). But there is no real difference in any of the aspects that add up to PQ. The resolution is the same, the colorimetry is the same, and in the end (at viewing) everything about each and every pixel is the same. There is nothing about either component or HDMI that can degrade the signal over reasonable distances, so the end result is the same (for signals delivered by cable, sat, FIOS, or OTA).

Ironically, a signal that has never been digitized (a process that limits infinite measurements to a defined set of sampled points in time, a mere subset of the original visual information) would actually produce a better end result than HDMI, because HDMI implies a digitization process while component does not. But for all practical purposes, HD video as it comes to consumers has always been digitized at one point or another, erasing that potential advantage.

The only real functional difference between HDMI and component is that where the signal is converted to analog is different. If we assume that the input signal is HD from the HDD of your DVR, what we have is a SPTS, or digital Single Program Transport Stream, which is essentially a bit stream that carries a video stream and associated audio streams and embedded closed captioning and metadata. The video is in the MPEG format, while the audio is in the AC3 format.

The signal is presented to an MPEG decoder (let's ignore audio and the rest for the sake of simplicity of the argument for now, although HDMI does have the advantage of carrying audio as well) and is converted to an uncompressed digital video bit stream. Now it follows two main paths:

1) It is formatted to HDMI (which changes no aspects of the signal itself and merely places it in a protocol "wrapper") and is presented to an HDMI transmitter chip. It then goes through a HDMI cable to a HDMI receiver chip in your TV (which handles the "handshake" and removes the wrapper). The very first thing that happens after that is it is presented to a digital-to-analog converter and is processed from that point forward as analog YUV inside the TV. Your TV may claim "100% digital processing", and technically that is true. The processing is digital, but the signal being processed is actually in the analog domain for at least part of the time and usually for about 99% of the time. Hard to believe, but true.

2) The other path? The uncompressed bit stream from the decoder goes instead to a DAC chip inside the DVR which is functionally identical to the DAC inside your TV (it may even be the exact same chip). The output is HD component analog. At this point, the signal is virtually in the same state as the signal would be if it were converted by the DAC in option 1 above. The signal goes through 3 cables to your TV where it then follows the same exact path through the TV that the YUV signal coming from the DAC in your TV in option 1 above follows. It is essentially the same signal, processed from that point forward in the same exact manner.

As long as the cables are not too long, neither HDMI nor component cables will degrade the signal as it passes from DVR to HDTV. So the only difference is WHERE the signal is converted to analog. The process is functionally the same, and neither HDMI nor component degrades the signal, which is why the quality level is the same.

Version 1.3, deep color, etc., is just more hype. The colorimetry and other aspects of HD video are fixed at a particular quality level long before the signal ever reaches the viewer, at word lengths, bit levels, and bit rates that are not going to be improved by a newer protocol that can handle longer digital words, higher bit levels, and faster bit rates. So any "improvement" there is not realized by the viewer.

It's just that simple.


----------



## texasbrit (Aug 9, 2006)

The point I was making about bluejeans cable is that although they want to sell you cable, they don't really care whether you buy HDMI or component.

I agree with your post, in fact to expand on one thing, where you say "as long as the cables are not too long", as the engineer at one of my local TV stations pointed out to me, it's more of a challenge running HDMI over long distances and maintaining picture quality than it is to run component.
And your comments about the hype over 1.3 ( and now 1.4) are so true, but are completely encouraged by the industry. One poster with a 6ft cable run from a DirecTV DVR posted that he had been told that a 1.3 cable was essential if he were going to get maximum performance, and had been quoted "deep color" and new audio schemes as reasons why this was true. 
And of course the other people who are told that they need a more expensive cable because their LCD TV is 120Hz and the cable they have might not carry the 120Hz signals..... (!!)


----------



## P Smith (Jul 25, 2002)

Lets hold for a minute here: "_The very first thing that happens after that is it is presented to a digital-to-analog converter and is processed from that point forward as analog YUV inside the TV. Your TV may claim "100% digital processing", and technically that is true. The processing is digital, but the signal being processed is actually in the analog domain for at least part of the time and usually for about 99% of the time. Hard to believe, but true."_
I want proof of that claim. Could you give me some technical cues? What chips are they using? I would try to read the white papers and manufacturers' data sheets myself. And it would be nice if we could see some schematics of those HDTV sets.


----------



## TomCat (Aug 31, 2002)

texasbrit said:


> The point I was making about bluejeans cable is that although they want to sell you cable, they don't really care whether you buy HDMI or component...


I realize that, and I confess I took the opportunity to take a shot at them because I think their technical explanations are exceptionally fuzzy (my "nice word" for "lies") and it would be a disservice to readers to put a lot of stock in that. Different issue, I agree (but then I am nothing if not an opportunist  )



texasbrit said:


> ...I agree with your post, in fact to expand on one thing, where you say "as long as the cables are not too long", as the engineer at one of my local TV stations pointed out to me, it's more of a challenge running HDMI over long distances and maintaining picture quality than it is to run component...


Yes. And if you understand (again) the physics behind this, you will understand why. Digital signals are sensitive to time smear, something typically not a problem for most analog scenarios, while analog signals are sensitive to attenuation and interference, something typically not a problem for digital scenarios.

HDMI is meant for short-haul, and it is not robust enough to travel very far. In the application it is designed for, it doesn't need to be. But a long HDMI cable just won't work at all unless it is specifically designed to, or unless expensive reclocking of the signal is involved (which removes the time smear).

Long HDMI or digital cables suffer from the same problem that difficult OTA reception suffers from, which is intersymbol interference, due to time base errors once the cable gets too long. The signal will fall off the digital cliff in a very similar manner to falling off the digital cliff in OTA reception at great distance.

Nevertheless, if it does work at whatever distance, what you put into one end can be converted to analog in a manner that would provide an end product identical to if it were converted before becoming HDMI at all. Again, this is the same hallmark of digital OTA broadcasting that *IF* you receive the signal solidly, the PQ will be the same if you are 2 miles away from the transmitter or 90 miles away.

There is no PQ degradation over HDMI even at long distances, as there is (eventually) over component. While it may break down in its ability to maintain time base integrity, which causes digital processing (D-to-A) to become confused about which bits belong to which packets, thwarting that process, the information itself (if it can be extracted) is not affected at all by signal degradation. And the information is where the nature of PQ exists, not in the protocol, and not in the integrity of the packet structure. That's why digital technology exists, due to its high invulnerability to information degradation.

Analog signals attenuate gradually over distance, and the problem is that since chroma or color and luma or B&W are at different frequencies, they attenuate unevenly, which can reduce sharpness and wash out the color. But you can reamplify them using an equalizer circuit that can reverse the losses, and still get by pretty good, with an end product that is virtually the same as the original. As long as the cable is shielded well, interference is also not really an issue. For short cables, neither interference nor attenuation is a problem and again, information integrity is maintained.
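The cliff-versus-fade distinction above can be illustrated with a toy noise model (entirely made-up numbers): a digital link that decides each bit against a threshold recovers everything perfectly until noise exceeds the decision margin, then fails abruptly, whereas an analog link passes the noise through visibly at every level:

```python
import random

random.seed(1)

# Toy model: transmit each bit as +1 or -1, add uniform noise, and
# decide against a threshold of 0 at the receiver.
def digital_error_rate(noise_amp, trials=10000):
    errors = 0
    for _ in range(trials):
        bit = random.choice([-1, 1])
        received = bit + random.uniform(-noise_amp, noise_amp)
        if (received > 0) != (bit > 0):
            errors += 1
    return errors / trials

for amp in [0.5, 0.9, 1.1, 1.5, 2.0]:
    print(amp, digital_error_rate(amp))
# While the noise amplitude stays below the +/-1 decision margin, the
# error rate is exactly 0 -- a perfect picture regardless of degradation.
# Past the margin, errors appear suddenly: the "digital cliff". An analog
# link, by contrast, would show that same noise on screen at every level.
```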

As for "more challenging", I think that can be translated as "more costly". Either is pretty simple to do, but the cost of reclocking HDMI is far greater than the cost of using an EQ amp on analog.


----------



## P Smith (Jul 25, 2002)

I'm still puzzled: with overpriced cables like Monster already on the market, why don't we see fiber-optic solutions for HDMI/DVI [TMDS signaling] connections that would put the 'long cable, bad cable' issue to rest?


----------



## texasbrit (Aug 9, 2006)

P Smith said:


> I'm still puzzled: with overpriced cables like Monster already on the market, why don't we see fiber-optic solutions for HDMI/DVI [TMDS signaling] connections that would put the 'long cable, bad cable' issue to rest?


There are indeed solutions for HDMI over fiber. In fact, the HR21pro DVR has support for HDMI over fiber. See http://owlink.com/dli_fiberoptic_cable.htm


----------



## TomCat (Aug 31, 2002)

P Smith said:


> Let's hold on for a minute here: "_The very first thing that happens after that is it is presented to a digital-to-analog converter and is processed from that point forward as analog YUV inside the TV. Your TV may claim "100% digital processing", and technically that is true. The processing is digital, but the signal being processed is actually in the analog domain for at least part of the time and usually for about 99% of the time. Hard to believe, but true."_
> I want proof of that claim. Could you give me some technical cues? What chips are they using? I would read the white papers and manufacturers' data sheets myself. And it would be nice if we could see some schematics of those HDTV sets.


The first HD set I ever owned was a Sony Wega 60" GWIV, bought in October 2004. It is a microdisplay, and for its time it had arguably the best PQ I had ever seen.

It uses a Silicon Image SiI9993 HDMI receiver chip (or two, in some models). This chip incorporates a DAC as well. Granted, the SiI9993 has a digital output as well as an analog one (plus audio and HDCP data outputs), as it is a multipurpose chip. But Sony ignores the digital output in favor of the analog one.

In the attached block diagram (showing the input side of the GWIV) you can see the analog composite inputs 1-4 which go directly to an analog video switcher chip. You can see HD analog component inputs 5-6 which go to a mini-switch. The selected YUV output of that unit also goes to the video switcher chip.

What else is connected to that video switcher chip, which BTW is the source from that point forward for all video processing? Inputs 7 and 8, each of which is an HDMI receiver chip with an integrated DAC. The wiring connecting each HDMI chip to the analog video switcher chip is clearly marked YUV, the same marking given to the component inputs, meaning that the component signals and the originally-HDMI signals are now all YUV, all the same.

And the switcher chip is not capable of digital input. It is the router that selects among the various input signals when you punch different inputs on the remote, all of which are analog video by the time they reach the inputs on that switcher chip. That, and the lack of any other processing prior to this analog-only chip, proves that the outputs of the HDMI/DAC chips that connect to it are YUV analog component video, even though signals connected to the HDMI inputs (7-8) are ported into the set within the digital domain.

OK, when I first saw this it took me a while to accept this as well. It took me a day or two to process it, to answer the burning question "why would they want to do that?". I had just rolled off $3700 and was under the common misconception and expectation that processing was certainly all done within the digital domain. But I was mistaken.

There is a hard-bitten (and completely incorrect) position that digital is automagically better than analog. In truth, analog has its advantages depending on the application, and can be the better choice. There is another intractable bit of universal folklore that converting from A to D or D to A imparts significant errors. That is also not true. Signals can be routinely converted from D to A and back numerous times without incurring visible errors, which happens all the time in the SD world. As much as I intellectually understand that, the expectations, assumptions, and even the hype are so ingrained in all of us by marketing that there is still a strong emotional desire to hang onto these false constructs as if they were factual, ridiculous as they actually are. I was not immune to this feeling at first, either.

To understand why processing within the digital domain is more demanding, let's look first at how the volume of an audio track is raised in the analog world. As has been done for a century or more, a variable resistor is used to increase or decrease signal voltage. Nothing could be simpler.

To raise volume in the digital world, things are very different. Every sampled point is represented by a 16-32 bit digital word. That's a lot of ones and zeroes, and there is a new word 48,000 times every second per channel. The only way to raise the volume is to multiply each and every one of those digital words by a gain coefficient, changing them into new words that represent a higher level. IOW, a discrete binary mathematical operation has to be performed at least 48,000 times every second. To lower the volume? Multiply each word by a coefficient less than one.
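A minimal sketch of that per-word arithmetic (illustrative Python, not any product's actual DSP code; real processors multiply each sample by a gain coefficient and round the result back to an integer code):

```python
def apply_gain(samples, gain):
    """Scale every 16-bit PCM sample by `gain`, rounding back to an integer code."""
    out = []
    for s in samples:
        v = round(s * gain)
        # clamp to the signed 16-bit range so loud passages don't wrap around
        out.append(max(-32768, min(32767, v)))
    return out

one_second = [1000] * 48000           # one channel, one second at 48 kHz
louder = apply_gain(one_second, 2.0)  # 48,000 multiply-and-round operations for one volume change
```

Even this toy version has to touch every one of the 48,000 words for each second of mono audio, which is the point being made above.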

Changing the amount of bass or treble, or adding an effect such as reverb, is monumentally more complex than that, because a constant will not work: you have to derive a different coefficient for each binary word depending on where in the frequency spectrum it resides, which means you also need an algorithm that discovers what frequency a tone sits at by analyzing a series of samples on the fly, and then generates and applies the proper offset. That's 48,000 groups of calculations a second. Reverb requires delaying these words in a buffer and reapplying them over and over at different times, at decreasing levels over time (implying different calculations for each pass).
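The reverb case described above can be sketched as a feedback delay line (a toy illustration in Python, not any real processor's algorithm; `delay` and `decay` are made-up parameters for the sketch):

```python
from collections import deque

def simple_reverb(samples, delay, decay):
    """Feedback comb filter: re-inject a decayed copy of the output `delay` samples later."""
    buf = deque([0.0] * delay, maxlen=delay)  # circular buffer of the most recent output
    out = []
    for s in samples:
        y = s + decay * buf[0]  # mix in the echo from `delay` samples ago
        buf.append(y)           # store this result so it echoes again, quieter, next time
        out.append(y)
    return out

# A single impulse comes back every `delay` samples at decreasing level: 1.0, 0.5, 0.25, ...
echoes = simple_reverb([1.0] + [0.0] * 9, delay=3, decay=0.5)
```

Note that every input sample costs a multiply, an add, and buffer bookkeeping, repeated 48,000 times per second per channel.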

And that's just audio. And to keep that processing from accumulating compounded rounding errors, they usually process at 32 bits or even higher. Analog audio doesn't have to worry about any of those sorts of errors.

Video is much more complex than even that. First, the bit rate is now roughly 1.485 Gb/s (the HD-SDI rate), rather than a few hundred kb/s. And the sorts of operations needed are very demanding compared to "simple" digital audio.

Consider gamma correction. In the analog world, increasing gamma was historically done with simple passive LC filtering, which imparts extremely small, invisible errors. All you have to do is exponentially raise the luminance level above a particular point, typically in the middle of the grey scale (more for pixels higher in luminance than for those closer to the middle), and exponentially lower it below that point (again, more for those lower than for those closer to the middle). This is simple to do without creating errors in the output, because the analog luminance scale has infinite setting possibilities between 0 IRE and 100 IRE. It's an analog scale in the analog domain.

But to do it digitally, you again have to perform countless mathematical operations on a stream of binary coefficients far more complex than digital audio's, and at a much faster rate. You can also easily induce rounding errors that produce incorrect quantization values, and these compound with every calculation. In the digital domain there is no luxury of infinitely fine resultant values, as there is in the analog world. In digital, everything must conform to one of a fixed set of pre-determined quantization levels. If a calculation falls somewhere between levels, it must be rounded up or down to one of them, which introduces error that accumulates and compounds with every subsequent operation.
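As a toy demonstration of that forced rounding (plain Python, nothing like a real TV's pipeline), a single gamma-up/gamma-down round trip in 8 bits already fails to return many luma codes to their original values, because each step must snap to one of only 256 levels:

```python
def gamma_8bit(level, g):
    """Apply a gamma curve to one 8-bit luma code, rounding to the nearest level."""
    x = level / 255.0
    return round((x ** g) * 255)

# Apply gamma 2.2 and then its exact inverse, forcing a round at each step,
# and count how many of the 256 possible codes fail to come back unchanged.
changed = sum(
    1 for v in range(256)
    if gamma_8bit(gamma_8bit(v, 2.2), 1 / 2.2) != v
)
# `changed` is well above zero: rounding alone keeps many codes from round-tripping.
```

Chain several such operations and these one-level errors accumulate, which is the compounding effect described above.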

To do all of that inside a consumer HDTV is just not tenable. The industry has been successfully processing video in the analog domain since about 1939, so after 70 years they have that down pretty well. Emulating that in the digital domain, which is nearly impossible to do at the consumer level without your set costing $100K, just isn't practical.

Bottom line, it makes sense for many reasons to process in the analog domain inside an HDTV. The signal has to be converted to analog at some point anyway (our eyes are analog devices and can't decipher binary coefficients all that well), we have a rich history of doing analog successfully, it's much simpler and cheaper, interference and attenuation are not a factor inside such a controlled environment, and most importantly, we don't have to worry about accumulated rounding errors, which would devastate PQ. It's actually a no-brainer, which is why manufacturers do it universally. "100% digital processing" makes a lot of sense, too, but only if the information being processed exists in the analog domain.


----------



## P Smith (Jul 25, 2002)

Well, TC. 
I took time to read the documents at the SI site, and I found that the HDMI 1.3 Receiver IP Core and its 1.4 version no longer carry analog inputs/switches or DACs.
Seems to me the unnecessary conversion has finally been cut and all processing is done purely in the digital domain. The "SW part" of my EE core tells me that's for the good: we can quickly find software solutions for new methods, squash bugs, etc., rather than going through hoops making silicon changes for analog circuits.


----------



## TomCat (Aug 31, 2002)

That is most likely a version sold for professional applications. There are applications for which keeping things in the digital domain makes sense; otherwise the 2004 version of that chip would not have had the option to bypass the DAC (the option Sony wisely chose not to use in the Wegas). It's the same circuitry as the 2004 chip, with the costly DAC removed. Probably cheaper to build and more energy-efficient, but not an improvement in any other way.

And make no mistake, that will probably trickle down some day. Just as you can no longer find an HDD smaller than 500 GB to put in a DVR you are manufacturing, there will come a day when digital processing of consumer video can be done fast enough, and at a bit depth that keeps rounding errors as invisible as analog's are today, simply because the cost has fallen and it makes no sense for suppliers to include the DAC anymore. But today isn't that day. Not yet.

Until it is as cheap as doing it in analog, and until those tasks can be done better in the digital domain, which is currently definitely not the case, TV manufacturers have no reason to change. They have painted themselves into a corner with their own hype. They already have people convinced that "100% digital processing" means the information is in the digital domain, which it absolutely is not. And there is no way to display the information without converting to the analog domain, so that our eyes can interpret it as a representation of the analog world where we live, so "100% digital processing" will always be a lie.

If they were able to keep processing in the digital domain, what is the benefit to them? Analog already does it better and has virtually no visible drawbacks. How is digital going to improve on that, knowing what we already know, which is that in some applications, digital just is not better? And why would they do something they have already convinced us they have done, wiping out any marketing advantage it might have had? If the perception is that everything is in the digital domain already (which could not be more of a lie), what are they going to hype once it really is?

I find it ironic that newer SW will let us "squash bugs", as if that's an improvement. Analog technically doesn't have "bugs", but all software does, including the software used to go into space and back. And typically, the newer, the buggier. The word "bug" in this sense was popularized by early computing, and it is still a phenomenon that plagues us daily; we didn't need the word when everything was analog. Windows has over 100,000 lines of code that have been patched around, and it still crashes regularly if not preventatively rebooted. The "hoops" seem to be much more daunting in the SW world than the analog solutions ever were. It's not too late to undrink the Kool-Aid and realize that digital might not be quite the advance over analog we assume it is (or have been sold a bill of goods that it is), at least not in every case.


----------



## TomCat (Aug 31, 2002)

P Smith said:


> Well, TC.
> I...found that the HDMI 1.3 Receiver IP Core and its 1.4 version no longer carry analog inputs/switches or DACs...


So why would an HDMI chip designed to handle digital signals need an analog input?


----------



## P Smith (Jul 25, 2002)

My mistake; I misread the audio-input part of that '04 diagram. Sure, there was no analog *video* input on the SiI9993 or the newest chips.


----------



## DeanS (Aug 23, 2006)

Question: Are 1080p/24 video signals transmitted only via HDMI or can component cables carry this video signal?


----------



## BattleZone (Nov 13, 2007)

DeanS said:


> Question: Are 1080p/24 video signals transmitted only via HDMI or can component cables carry this video signal?


Component cables *could* carry 1080/24p, but in practice, they don't. I'm not aware of any consumer-level devices that support 1080/24p via component, though a few may exist.


----------



## P Smith (Jul 25, 2002)

TC, would you have the right-hand part of that Sony block diagram? 
I'm starting to question myself: was that sloppy design forced by some analog input?
If they struggled at the time to accept an analog video signal at the right of the diagram, that would explain why the video DAC was added after the SiI9993.
The newest LCD/plasma models wouldn't incorporate such an awkward design. You can find that in the diagrams of HDMI receivers on the Silicon Image web site.


----------



## TomCat (Aug 31, 2002)

P Smith said:


> TC, would you have the right-hand part of that Sony block diagram?
> I'm starting to question myself: was that sloppy design forced by some analog input?
> If they struggled at the time to accept an analog video signal at the right of the diagram, that would explain why the video DAC was added after the SiI9993.
> The newest LCD/plasma models wouldn't incorporate such an awkward design. You can find that in the diagrams of HDMI receivers on the Silicon Image web site.


Sloppy? Awkward? I think you are going to have to qualify that. This is not a Hyundai or a Yugo we're talking about here, but the company that has shipped more top-quality TVs than anyone else. Whatever limitations their design might have had, somehow it still resulted in the best pictures anyone had ever seen at that point in time. The SXRDs from the next year were even better, and still have the best PQ I have ever seen. Forget all of the numerous other arguments and reasons supporting them that I have given you already; the fact alone that they used analog processing to do it proves without a doubt that analog processing makes perfect sense in this type of application.

That your TV processes in the analog domain might be surprising or unexpected, but once you begin to understand why, you will be happy they were enlightened enough to do it that way.


----------



## TomCat (Aug 31, 2002)

DeanS said:


> Question: Are 1080p/24 video signals transmitted only via HDMI or can component cables carry this video signal?


Component cables are designed for analog signals, meaning that they carry signal based purely on voltages. They have no earthly clue what the format of the video is that they are carrying (or even if it's really video), and are therefore completely agnostic about it, not discriminating between things they can't differentiate between in the first place.

Put any analog signal into one end of a (reasonably short) video cable of any flavor other than HDMI, and it will come out the other.


----------



## texasbrit (Aug 9, 2006)

TomCat said:


> Component cables are designed for analog signals, meaning that they carry signal based purely on voltages. They have no earthly clue what the format of the video is that they are carrying (or even if it's really video), and are therefore completely agnostic about it, not discriminating between things they can't differentiate between in the first place.
> 
> Put any analog signal into one end of a (reasonably short) video cable of any flavor other than HDMI, and it will come out the other.


Yes, the issue isn't the ability of component cables to carry 1080p; it's that component cannot support the HDCP copy-protection protocol (HDCP is digital, component is analog), and so the gods of the Motion Picture Association and others have decided they won't allow 1080p movies to be transmitted over component.


----------



## TomCat (Aug 31, 2002)

texasbrit said:


> Yes, the issue isn't the ability of component cables to carry 1080p; it's that component cannot support the HDCP copy-protection protocol (HDCP is digital, component is analog), and so the gods of the Motion Picture Association and others have decided they won't allow 1080p movies to be transmitted over component.


If you can get the video into the cable, nothing will stop it from working, so there is nothing inherent about component that will prevent this. But the stickler in that statement is "if you can get the video into the cable...". If they truly are preventing all 1080p24 content by requiring an HDCP handshake, which component can't do, that could prevent the signal from ever leaving the box if only component were connected.

The workaround might be to hook up both HDMI and component. The HR2x can output both component and HDMI at the same time, IIRC, so the HDMI destination could provide the handshake, and now both outputs are active. There is nothing about HDCP that can govern an analog output directly. Unless the STB is designed to shut off the component output when the HDMI output is active and seeing 1080p24 content, that should work.

The original question was "*CAN* component cables carry this signal?", and the answer is still "yes". If you want to change the question to "*WILL* component cables carry this signal if the movie lobby doesn't want them to?", the answer is "probably not".

My understanding, limited though it might be, was that the movie lobby didn't give a rat's hat about analog copies but guarded digital copies closely, so that would make me think there would be more resistance to allowing HDMI than component.


----------



## ToBeFrank (May 15, 2009)

TC, perhaps your argument would be stronger if you weren't showing a design that was 5 years old. That's an eternity in electronics.

Your reasoning for why they chose to use the analog output of the HDMI chip is nothing more than conjecture. It's likely the decision was made to simplify the hardware/software design.

The stuff about the math operations per second and floating point rounding errors also doesn't hold water.

Do you have signal processing software experience?


----------



## nneptune (Mar 30, 2006)

mikeinthekeys said:


> In re-reading this thread... there is a way to test your TV's ability to play 1080p for free. Select one of the 1080p choices in Direct Cinema that shows a green dot with checkmark. Those movies are already on your drive. It will play for around 5 minutes... enough for you to see the results. The last two resolution lights on your front panel will light, and you can check your TV for what it is getting (if it allows that).
> 
> In my case, my Vizio displays these movies just fine, though spec'd for 1080p/50 and 1080p/60.


Thanks. Tried it with my new Sony, and it was in glorious 1080p! Looked fantastic!


----------



## TomCat (Aug 31, 2002)

ToBeFrank said:


> TC, perhaps your argument would be stronger if you weren't showing a design that was 5 years old. That's an eternity in electronics.
> 
> Your reasoning for why they chose to use the analog output of the HDMI chip is nothing more than conjecture. It's likely the decision was made to simplify the hardware/software design.
> 
> ...


I don't know if it was that head injury, if you've gone off your meds AMA, or if you are still living over a Sherwin-Williams store in the summer heat, but you are obviously not paying attention, and I am running out of patience.

Here are the top five reasons you are all wet:

1) The design of a DAC is not something fluid. This is mature technology, and has been for some time. As I stated, the improvements in newer models are in the areas of power consumption and cost. What the DAC does to digitized information, and how it does it, has not needed improvement for some time, because it already does its job to a level of transparency below human perception.

IOW, there really is no room for improvement worth addressing regarding PQ. Digital goes in; a virtually artifact-free analog conversion comes out. That is what a DAC does. Enlighten us, Mr. Software Guru, on how you would improve on that and why, other than in cost and power consumption. Why would there be any reason to improve it in any other way? How would YOU do it, with all of your scary software skills? From the standpoint of preserving PQ, a 5-year-old DAC is every bit as good as a 5-day-old one. If not, prove to me how I'm wrong.

The design of how TV sets process video has also not moved forward dramatically in that time. Name one way, other than display-element technology, in which PQ has improved since 2004.

We're still waiting.

2) My "reasoning" was supported right here by very cogent arguments that neither you nor anyone else has yet been able to overturn. "Conjecture" implies argument without basis. My arguments have basis, provided right here in black and white and so far not disproven by you or anyone else. I think my version of reality stands, at least until someone presents equally cogent arguments that supersede it. Nothing you have said here is accompanied by support of any kind. Now THAT is the textbook definition of conjecture.

But I do agree with your wild-assed guess that "It's likely the decision was made to simplify the hardware/software design", so thanks for making my argument for me (consistency, anyone?). That is exactly why they did it. Processing video information while keeping it within the digital domain would have meant a more costly, more complex, less reliable, and prohibitively expensive circuit design, which it is doubtful could even match the quality of analog processing for that particular job.

Simplifying the hardware and software was a no-brainer. Only a fool would opt for something more complex and costly, with no payoff, in a mass-produced consumer product meant to make a profit. Rather than actually doing that, simply telling the sheep out there "Oh, we do all that digitally now" is enough to get them to believe it, even though it's a total lie. That doesn't cost the manufacturer a dime.

3) I said nothing about the number of math operations per second that video processing would take, so I'm not sure how I could be accused of being wrong about that. But I do know there are, at a bare minimum, 48,000 of them per second to process digital audio, which may be the example you appear to be attacking. So here's your opportunity to prove "that stuff about the math operations per second" wrong.

Never mind; I'll save you the bother. It's a well-known, accepted fact that there are 48,000 digital words per second representing the audio information in a single broadcast digital audio track, so it doesn't take a math whiz to understand that that is also the bare minimum number of operations needed to change their values, which is what a simple operation like changing the volume entails. It would be a great deal more than that for multi-channel audio, and I have no idea how much auxiliary support math must be done. The intent of the example was simply that there is a lot of math to be done in a short period of time, not to quantify it to a strict number. I can hardly be wrong about a number I never gave you.

I also know that the pixel rate and the color-space format can be used to calculate the minimum number of mathematical operations video processing takes, and that is in the millions per second rather than thousands. As I said, that is monumentally more challenging. Is something off about those facts that you would like to challenge? Maybe with some actual proof?
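The "millions per second" scale is easy to sanity-check from the active pixel count alone (a back-of-the-envelope illustration, not an exact operation count):

```python
# Active pixels in 1080-line video at 30 frames per second:
# 1920 columns x 1080 rows x 30 frames.
pixels_per_second = 1920 * 1080 * 30
print(pixels_per_second)  # 62208000 -- tens of millions of per-pixel operations, at minimum
```

Even one operation per pixel is over 62 million operations per second, versus 48,000 words per second for a single audio track.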

4) I said absolutely nothing about floating-point operations, yet you've found a way to attack me for that as well. Next thing you know you'll be demanding to see my birth certificate to check whether I'm truly an American citizen. Seems to be a lot of that going around.

What I said about rounding errors refers instead (as was spelled out in detail) to accumulated quantization rounding error, which is a real fact of life in recalculating or processing digital signals. It does not refer to floating-point rounding errors, which carry a very large number of places after the decimal point and are therefore comparatively very small. The distinction is not that fine; most folks would have picked up on it.

Quantization error in HD signals is comparatively very large because, for instance, there are only about 870 quantization levels to represent the entire luminance scale, and that means many calculations that fall between levels will have high error: they are not allowed to fall there and must instead be rounded to the nearest quantization-level value.

That means that for any digital processing of the information, nearly every recalculation of the luminance or chrominance of nearly every pixel is an *incorrect value*, unless by rare accident the result happens to fall directly on a quantization level. Fat chance. "Error", by definition. That eventually manifests as a visible reduction in PQ after accumulating through not all that many chained processes.
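The "about 870 levels" figure matches standard 10-bit broadcast video, where luma codes run from 64 (black) to 940 (white); a quick check (illustrative Python, not any decoder's code):

```python
# Broadcast 10-bit luma range: code 64 is nominal black, code 940 is nominal white.
levels = 940 - 64 + 1
print(levels)  # 877 -- roughly the "about 870" quantization levels cited above

# Any computed value that lands between codes must snap to the nearest one:
computed = 512.37
quantized = round(computed)  # 512, an error of 0.37 of a level from this single step
```

One step's snap error is small, but it is re-incurred at every chained recalculation, which is how the accumulation described above happens.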

I'll even break it down for you: Concatenated FP error = insignificant. Concatenated quantization error = significant. Was I talking about FP? Not at all. Was I referring to quantization error? Absolutely.

5) Finally, "doesn't hold water" is nothing more than an unsubstantiated cheap shot.

Simply saying I'm all wet without proving it doesn't feed the bulldog. So how about you put your scary reputation where your mouth is. Prove it. At least provide some argument, even if it _IS_ BS and I can pick it apart in my sleep. If history is any indicator, I sincerely doubt that you can. So far on this thread, other than unsubstantiated claims such as that, the silence is deafening. Ball's in your court. Prove it is "conjecture". Prove it "doesn't hold water". Prove me wrong. Otherwise, you're going to have to accept that I'm completely right, which incidentally, I couldn't give two $#!+s about.

I thought so.


----------



## ToBeFrank (May 15, 2009)

TomCat said:


> I don't know if it was that head injury, if you've gone off you meds AMA, or if you are still living over a Sherwin-Williams store in the summer heat, but you are obviously not paying attention, and I am running out of patience.


WOW. Based on this sentence I'll assume the answer to my question is a big fat NO and not bother to read the rest of your reply. Cheers!


----------



## veryoldschool (Dec 10, 2006)

TomCat said:


> Simply saying I'm all wet without proving it doesn't feed the bulldog.


:lol:
Gee, this seems like it wasn't that long ago that "I was here".
I wonder what's in common?


----------



## DeanS (Aug 23, 2006)

Thanks for the response regarding component cables carrying 1080p/24 content. My specific issue is the purchase of a new receiver with HDMI connections for my home theatre, and a problem I ran into playing back 1080p/24 content from D*. Specifically, there is a lip-sync error. I attribute this to the fact that I do not currently have a digital optical audio connection between the LCD TV and the new amplifier. I ran into the same problem with my old amplifier and rectified it by making that connection.

Now, however, with the HDMI output from the HR-20 routed through the receiver, how do I correct this issue? The receiver (Yamaha RX-V65) needs to be on "HDMI2" input since that is the input used by the HR-20. There is an "AV1" input on the receiver that is potentially available (currently occupied by an older standard-def DVD player with component and optical digital output). Would simply removing this component and connecting the HR-20's component and optical audio output to "AV1" on the back of the new receiver fix this issue? I think somehow the optical audio output from the LCD TV to the receiver is what is really needed to address this issue, but I am at a loss on how to do this with this new receiver.


----------



## dhkinil (Dec 17, 2006)

Not sure this is the appropriate thread, but what content besides PPV or D* Cinema is available through D* at 1080p? I am getting a new TV which supports 1080p at 60, 30, and 24, and I plan to hook it up to an H20, which does not support 1080p, but I can move some things around so that it would be available.

Thanks, and if this is not the right thread, which is?


----------



## LarryFlowers (Sep 22, 2006)

There are no other sources... no broadcaster or cable channel uses 1080p/24, so at this time it is limited to movies. I wouldn't look for that to change anytime in the near future.


----------



## dhkinil (Dec 17, 2006)

LarryFlowers said:


> There are no other sources... no broadcaster or cable channel uses 1080p/24 so at this time it is limited to movies. I wouldn't look for that to change anytime in the near future.


I assume your response was aimed at me, so thanks,


----------



## pinegein (May 13, 2007)

I am looking to get a new TV. How would I know if it supports 1080p/24? Thanks


----------



## Mike Bertelson (Jan 24, 2007)

pinegein said:


> I am looking to get a new TV. How would I know if it supports 1080p/24? Thanks


I think the listed specs and maybe a search here or google are about all there is.

I can't think of anything else.

FWIW, my TV supports 1080/24p (see link in my signature)

Mike


----------



## pinegein (May 13, 2007)

Thanks. I looked through the specs but didn't see anything, so I got this one: a Philips 47" 47PFL5704D F7 120Hz 2ms 1080p LCD HDTV at $999 plus tax. Seems like a good deal. I still hope it supports 1080p/24, though.


----------



## veryoldschool (Dec 10, 2006)

pinegein said:


> Thanks. I looked through the specs but didn't see anything, so I got this one: a Philips 47" 47PFL5704D F7 120Hz 2ms 1080p LCD HDTV at $999 plus tax. Seems like a good deal. I still hope it supports 1080p/24, though.


The 120Hz would suggest it does support 1080p/24.
This is because 120 Hz is an even multiple of both 60 Hz and 24 Hz, which is the key: 60 Hz content gets shown twice per panel refresh cycle, and 24 Hz content five times.
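The arithmetic behind that is easy to check (Python 3.9+ for `math.lcm`):

```python
import math

# 120 is the least common multiple of the 24 Hz film rate and the 60 Hz video rate,
# so both frame rates divide into a 120 Hz refresh evenly.
print(math.lcm(24, 60))      # 120
print(120 // 60, 120 // 24)  # 2 5 -- each 60 Hz frame shown twice, each 24 Hz frame five times
```

Any refresh rate that is a common multiple of both (120, 240, ...) can show 24 fps film without uneven 3:2 pulldown cadence.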


----------



## Mike Bertelson (Jan 24, 2007)

pinegein said:


> Thanks. I looked through the specs but didn't see anything, so I got this one: a Philips 47" 47PFL5704D F7 120Hz 2ms 1080p LCD HDTV at $999 plus tax. Seems like a good deal. I still hope it supports 1080p/24, though.





veryoldschool said:


> The 120Hz would suggest it does support 1080p/24.
> This is because 120 Hz is an even multiple of both 60 Hz and 24 Hz, which is the key: 60 Hz content gets shown twice per panel refresh cycle, and 24 Hz content five times.


To continue what VOS is saying, 240Hz, which is popular these days, is also a multiple of 24Hz & 60Hz.

Even multiples are the key...or get a plasma...let's see the backlash on that one. 

Mike


----------



## P Smith (Jul 25, 2002)

And a new line is slowly coming - 480 Hz.


----------



## ToBeFrank (May 15, 2009)

pinegein said:


> Thanks I looked though the specs but did not see any thing so I got this one Philips 47 47PFL5704D F7 120Hz 2MS 1080p LCD HDTV at 999 plus tax seems like a good deal. I still hope it supports 1080/24 though.


It looks like it does not do it natively. The Philips site says it does 1080p/60 and has 3/2 pulldown: http://www.consumer.philips.com/consumer/en/us/consumer/cc/_productid_47PFL5704D_F7_US_CONSUMER/LCD-TV+47PFL5704D-F7


----------



## mhammett (Jul 19, 2007)

dcowboy7 said:


> Technology just cant be kept up with.
> 
> hdtvs had a 60hz refresh rate....then 120hz....now even thats old....the new models out now have 240hz.
> 
> Its impossible to keep up with the latest/greatest stuff.


I don't get excited over TV refresh rates. The content is generated at 24 or 30 frames per second... 120 is pointless.


----------



## cygnusloop (Jan 26, 2007)

mhammett said:


> I don't get excited over TV refresh rates. The content is generated at 24 or 30 frames... 120 is pointless.


Except that 120 is the smallest number evenly divisible by both 24 and 30, which makes 120 kind of a magic number. Now, the merits of 240Hz, 480Hz, etc... that may be another matter.


----------



## mhammett (Jul 19, 2007)

I guess I've just been used to monitors that can display different refresh rates natively. ;-)


----------



## Mike Bertelson (Jan 24, 2007)

cygnusloop said:


> Except that 120 is smallest number evenly divisible by 24 and 30. Which makes 120 kind of a magic number. Now, the merits of 240Hz, 480Hz, etc... that may be another matter.


IIRC, it's the whole reason for 120Hz. :scratchin

Mike


----------



## pinegein (May 13, 2007)

Thanks for all the answers and the research that was done.


----------



## cygnusloop (Jan 26, 2007)

mhammett said:


> I guess I've just been used to monitors that can display different refresh rates natively. ;-)


Fair point. 
There are a few (mostly plasma/CRT) displays out there that have a variable refresh. For example the KUROs are capable of 60Hz or 72Hz, IIRC. I'm sure there are a few more.


----------



## TomCat (Aug 31, 2002)

MicroBeta said:


> IIRC, it's the whole reason for 120Hz. :scratchin
> 
> Mike


Mike, I am willing to bet that you are correct that this was the _original_ reason, because it does indeed provide the benefit of the ability to remove judder from content that has 3:2 pulldown (which is what most movies and filmed content displays with).

But I am also willing to bet with long odds that the reason for 240 or 480 is simply more marketing one-upsmanship, as it provides no benefit whatsoever.


----------



## TomCat (Aug 31, 2002)

LarryFlowers said:


> There are no other sources... no broadcaster or cable channel uses 1080p/24 so at this time it is limited to movies. I wouldn't look for that to change anytime in the near future.


There is indeed 1080p24 content available live from DirecTV 24/7 on _The NFL Network_. As a matter of fact, that is their native format.

_The NFL Network_ allows TV stations to carry certain in-market games as broadcast events (we carried one last Thanksgiving, in fact) using _NFL_ as the backhaul. One requirement was a crossconversion of their content to the native format of the station (from their native 1080p24 to our 720p60). All content must be broadcast at the same format to prevent reacquiring resolution at changes between broadcast media elements (which would glitch every TV every time we went to a commercial), and that required us to crossconvert them to match us for that broadcast.

The network relies on a huge library of old _NFL Films_ content, all shot in original film format at 24p. Rather than convert that to something more common such as 1080i30 or 720p60, they opted instead to crossconvert all other interstitial programming and commercials, which likely is produced at 1080i30 or 720p60, to 1080p24, for the same reason we had to crossconvert them to match us. So this channel also _transmits_ no judder, even on filmed content.

Also, you can pause their content on DTV and frame-advance, and every frame is unique, meaning there is no pulldown. Note that you will see a frame repeat on occasion, and that is a transmission phenomenon (a repeated frame replacing a corrupted frame) which is common. Your DVR may state "1080i resolution original format", and that is technically true, as 1080p24 does indeed have the same _resolution_ as 1080i.

Ironically, this means that TVs incapable of native 1080p24 will actually add the pulldown back in, _actually adding judder_, as they need to pull it down to match the 60 fps native refresh rate of the display.

This calls into question the wisdom of NFL doing this at a point where 95% of the TVs receiving them are incapable of reproducing their native format without adding pulldown and judder. Seeing as how crossconversion is not something done ahead of time to library content, and is instead a live transparent function using a single piece of equipment that simply sits in the program stream and does the same thing all day long, they could have just as easily stayed with a conventional format until 1080p24-capable sets were commonplace, and then made the switch.


----------



## veryoldschool (Dec 10, 2006)

TomCat said:


> Mike, I am willing to bet that you are correct that this was the _original_ reason, because it does indeed provide the benefit of the ability to remove judder from content that has 3:2 pulldown (which is what most movies and filmed content displays with).
> 
> But I am also willing to bet with long odds that the reason for 240 or 480 is simply more marketing one-upsmanship, as it provides no benefit whatsoever.


 Ding, ding, ding,.. we have a winner here.

With 24, 30, & 60 frame rates, any refresh rate higher than 120 would only repeat the same frames, with no motion/movement between the repeats; the point of 120 is simply being able to display all of those frame rates evenly.
The human eye is still the end receiver.


----------



## Garyunc (Oct 8, 2006)

TomCat said:


> There is indeed 1080p24 content available live from DirecTV 24/7 on _The NFL Network_. As a matter of fact, that is their native format.
> 
> _The NFL Network_ allows TV stations to carry certain in-market games as broadcast events (we carried one last Thanksgiving, in fact) using _NFL_ as the backhaul. One requirement was a crossconversion of their content to the native format of the station (from their native 1080p24 to our 720p60). All content must be broadcast at the same format to prevent reacquiring resolution at changes between broadcast media elements (which would glitch every TV every time we went to a commercial), and that required us to crossconvert them to match us for that broadcast.
> 
> ...


English translation for this please


----------



## TomCat (Aug 31, 2002)

veryoldschool said:


> Ding, ding, ding,.. we have a winner here.
> 
> With 24, 30, & 60 frame rates, having refresh rates any higher, would only be repeating the same frame, with no motion/movement between them, other than being able to display all frame rates [120 being the common].
> The human eye still is the end receiver.


Well, here's what I find exciting about that:

The technology is moving toward the capability to actually interpolate and create unique frames between the actual frames. IOW, I as a broadcaster send 24 unique frames per second, and your TV is now smart enough to take frames 1-24 and create 36 new unique frames to place in between them, not just repeats of the original frames, but creating new frames which show moving images where they really would exist were I sending you 60 original frames.

This means no motion artifacts, or motion artifacts at the level of 720p60 or 1080p60. It also means lower data rates of 1080p24 rather than the currently prohibitive rates of 1080p60, while providing 1080p60 quality. This frees broadcasters to add more sub channels without sacrificing as much quality, and allows DTV to do the same. Everybody wins.

I can only wish that my current TV lasts until that technology matures and becomes ubiquitous. But we may see that even before 3D, and in fact it may be a key component of 3D. It would also make the current advantages of 720p over 1080i a moot point, and you will see all 720p outlets then move to 1080p24, possibly even the 1080i30 outlets as well.

48 fps is about all that the average human eye can detect, but there are some who are more sensitive to it than others, just like there are folks who can't stand DLP because they can see the rainbows that are invisible to the vast majority of us. This "critical flicker frequency" is affected by brightness (flicker on brighter images is more detectable), by image size (bigger screens or shorter seating distances aggravate it), by whether there is screen blanking time and how much, etc., and it is different for foveal and peripheral vision. So 60 fps would probably make artifacts invisible to everyone under all circumstances. Anything above that (in particular 240 and 480) is just wasted.
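As a rough illustration of the interpolation idea (purely hypothetical: real sets estimate motion vectors per block of pixels; here each "frame" is reduced to a single number so only the timing logic shows):

```python
# Toy sketch: resampling a 24 fps sequence to 60 fps by linear interpolation,
# creating in-between values instead of just repeating source frames.

def interpolate_to_60fps(frames_24):
    """Resample a 24 fps sequence (scalar 'frames') to 60 fps."""
    out = []
    duration = (len(frames_24) - 1) / 24.0   # seconds of source video
    n_out = int(duration * 60) + 1           # 60 fps sample count (inclusive)
    for i in range(n_out):
        t = i / 60.0                         # output timestamp (seconds)
        pos = t * 24.0                       # fractional position in source
        lo = int(pos)
        hi = min(lo + 1, len(frames_24) - 1)
        frac = pos - lo
        out.append(frames_24[lo] * (1 - frac) + frames_24[hi] * frac)
    return out

# 25 source frames = 1 second of 24p -> prints 61 (inclusive endpoints)
print(len(interpolate_to_60fps(list(range(25)))))
```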


----------



## TomCat (Aug 31, 2002)

Garyunc said:


> English translation for this please


That wasn't a shot, was it, Gary? A little fun at my expense? I'm cool with that. :lol: Can't you see me just ROTF and LMAO? Leno has nothing on you.

The first line _is_ the translation. The rest is just supporting info. I apologize if I'm not skilled enough to dumb it down any more than I already have for you. Maybe someone else would like to try.


----------



## Greg Alsobrook (Apr 2, 2007)

*Moderator Note:*

I've deleted quite a few posts in this thread that are borderline personal attacks. Please keep the discussion on topic.

Thanks!


----------



## paulman182 (Aug 4, 2006)

TomCat said:


> There is indeed 1080p24 content available live from DirecTV 24/7 on _The NFL Network_. As a matter of fact, that is their native format.


Then why does neither the HR20 nor TV give the 1080p indication on NFL Network, as they do when I watch 1080p PPV movies that DirecTV pushes to my hard drive?


----------



## veryoldschool (Dec 10, 2006)

paulman182 said:


> Then why does neither the HR20 nor TV give the 1080p indication on NFL Network, as they do when I watch 1080p PPV movies that DirecTV pushes to my hard drive?


Because the 1080p/24 gets converted to 720p/1080i by the broadcaster.
We have yet to see any "broadcast" 1080p/24 programming.


----------



## Impala1ss (Jul 22, 2007)

I could use some help with an issue that really confuses me regarding the resolution of my Mitsubishi 73" 73837 DLP.

I have DirecTV and have the HD DVR HR20-700. In its menu I can set it to pass through 480i, 480p, 720, 1080i, and 1080p to the tv.

If I pass through 480i/p does the TV upgrade it to 1080i? If it does, is the TV rescaling the resolution?

If I set the DVR to only pass through 1080i/p, is the DVR doing the upscaling? What does the TV do then?

Where does "Native" on/off come into play here?

Is there a "best" way to set the DVR?


----------



## veryoldschool (Dec 10, 2006)

Impala1ss said:


> I could use some help with an issue that really confuses me regarding the resolution of my Mitsubishi 73" 73837 DLP..
> 
> I have DirecTV and have the HD DVR HR20-700. In its menu I can set it to pass through 480i, 480p, 720, 1080i, and 1080p to the tv.
> 
> ...


Let me answer these "backwards":
"Best" is what you like.
"Native" means the DVR will output the program in its "native" resolution [if you have it selected in the setup menu].
With 480i/p, "something" is going to scale the image to fit your screen, or you'd see a postage stamp size image.
So with native "on", your TV is doing the scaling and with it "off", you're manually selecting "a resolution" for the receiver to output, and the receiver is doing the scaling.
"Which is better", is something only the viewer will know.


----------



## Garyunc (Oct 8, 2006)

TomCat said:


> That wasn't a shot, was it, Gary? A little fun at my expense? I'm cool with that. :lol: Can't you see me just ROTF and LMAO? Leno has nothing on you.
> 
> The first line _is_ the translation. The rest is just supporting info. I apologize if I'm not skilled enough to dumb it down any more than I already have for you. Maybe someone else would like to try.


Sorry man I did not mean it to be a shot. Just the explanation was very technical for me.

Does this mean that 1080P will be possible in the future for network programming and if so how far down the road would this be?


----------



## Impala1ss (Jul 22, 2007)

veryoldschool said:


> Let me answer these "backwards":
> "Best" is what you like.
> "Native" means the DVR will output the program in its "native" resolution [if you have it selected in the setup menu].
> With 480i/p, "something" is going to scale the image to fit your screen, or you'd see a postage stamp size image.
> ...


I'm still confused, Oldschool - If I set the DVR by checking 480i, Native on, it will send a 480 signal to the TV and the TV will upscale it to 1080? Where does the FORMAT button come in. I guess if the TV upscales it to 1080 the Format button won't work?

If I change the DVR to pass through only 1080i/p, the DVR is doing the scaling, correct?

Between these settings on the DVR, and Native on/off and the format button, I am still trying various combinations. I just like to know the tech. reasons for what I'm doing.


----------



## veryoldschool (Dec 10, 2006)

Impala1ss said:


> I'm still confused, Oldschool - If I set the DVR by checking 480i, Native on, it will send a 480 signal to the TV and the TV will upscale it to 1080?
> *yes*
> 
> Where does the FORMAT button come in. I guess if the TV upscales it to 1080 the Format button won't work?
> ...


It can be a bit confusing, until you understand "what is doing what".
"Format" is not quite the same as "resolution". 
Resolution [of course] is changing the "scale" of the image, which can be done by either the TV or the DVR.
Format is what to do with "the image": how to adjust [a given resolution] to fit either a 4:3 or 16:9 screen. If you set the receiver for a 4:3 TV, then format works on HD channels only. If you set the receiver for a 16:9 TV, then it works for SD channels.
Your options are pillarbox [or letterbox], stretch, crop, and original broadcast [no bars].
With native "on" the format button only cycles through formats and not resolutions.
With native "off" the format button cycles through each format, then the next resolution and formats. It will cycle through all resolutions and formats.

[trying to break this down]
Native on: the TV does the [scaling] resolution change.
Native off: The receiver does the [scaling] resolution change.

Format changes can be done by either the TV or the receiver.
Setting the TV format to "Full" [or whatever yours calls it] lets you use the receiver format options.
Setting the receiver format to "original format" lets you use your TV's format options.

As you can see both the TV and the receiver gives you similar options, so you need to try all of them and see which you like best.
Some find their TV does the same [or worse] than the receiver and tend to set the receiver to do the formatting/resolution changes.
Others [like myself] find their TV does a better job, so they have the receiver output "original resolution and format" and have their TV do all of the changes.

So do I have you totally confused now?


----------



## jediphish (Dec 4, 2005)

veryoldschool said:


> It can be a bit confusing, until you understand "what is doing what".
> "Format" is not quite the same as "resolution".
> 
> Your options are pillarbox [or letterbox], stretch, crop, and original broadcast [no bars].
> ...


And, of course, the "original format" feature functions differently depending on whether you are connected via HDMI or component video. It works the way it's supposed to with HDMI.


----------



## veryoldschool (Dec 10, 2006)

jediphish said:


> And, of course the "original format" feature functions differently depending on whether you are connected via HDMI or Video Component. It works the way its supposed to with HDMI.


 Yeah, but "let's not go down that road" here. :nono: :lol:


----------



## TomCat (Aug 31, 2002)

veryoldschool said:


> Because the 1080p/24 gets converted into 720p/1080i by the broadcaster.
> We have yet to see any "broadcast" 1080p/24 programing.


NFL actually IS the broadcaster for DTV. The only ones converting that are OTA TV stations using NFL's main feed for the backhaul on an _ad hoc_ basis. If you are on 212, you are getting NFL direct, not rebroadcast from a local station.


----------



## TomCat (Aug 31, 2002)

Impala1ss said:


> ...If I pass through 480i/p does the TV upgrade it to 1080i? If it does, is the TV rescaling the resolution?
> 
> If I set the DVR to only pass through 1080i/p, is the DVR doing the upscaling? WHat does the TV do then?
> 
> Where does "Native" on/off come into play here?...


If you have a non-CRT FP TV, 480i "passed through" (native on) is upscaled (and deinterlaced) by the TV to 1080p60, which is the native resolution of a 1080p TV. If you set the DVR to put out some other fixed rez, such as 1080i, your DVR upscales it to 1080i and your TV deinterlaces it to 1080p.

Your TV must always display at a fixed native resolution, so it will always rescale anything not 1920x1080 to that resolution. It must also always display progressively, so it will always deinterlace interlaced content, unless it is an older CRT. 720p60 and 1080p24 content is about the only thing that your TV will not need to deinterlace. If you have an older (pre-2004) TV, you can probably avoid poorly-done deinterlacing in the TV by setting the DVR to 720p, but that of course is a trade off of a resolution hit.

Most TVs will also add pulldown to content that comes in at 24 fps, but some newer sets will avoid pulldown judder by using a refresh rate that is a multiple of both 24 and 30 (120 or 240), and others shift to 72 or 96 to accomplish much the same thing.

Modern scalers/deinterlacers are all about the same and all pretty transparent, so it really doesn't matter much where the scaling/deinterlacing takes place, but if your curiosity is simply academic, that should give you an idea where it takes place.
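For anyone curious what the 3:2 pulldown cadence mentioned above actually looks like, a toy Python sketch (frames are just labels here):

```python
# Sketch: 3:2 pulldown. Fitting 24 fps film onto a 60 Hz display means
# repeating source frames in an alternating 3,2,3,2 pattern -- the uneven
# cadence that shows up as judder on slow pans.

def pulldown_32(frames_24):
    """Map 24 fps frames onto 60 Hz display refreshes via 3:2 pulldown."""
    out = []
    for i, frame in enumerate(frames_24):
        repeats = 3 if i % 2 == 0 else 2   # A:3, B:2, C:3, D:2, ...
        out.extend([frame] * repeats)
    return out

# 4 film frames (1/6 s at 24 fps) become 10 refreshes (10/60 s = 1/6 s)
print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

A 120 Hz set can instead show every frame exactly 5 times, which is why it can drop the pulldown entirely.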


----------



## TomCat (Aug 31, 2002)

Garyunc said:


> Sorry man I did not mean it to be a shot. Just the explanation was very technical for me.
> 
> Does this mean that 1080P will be possible in the future for network programming and if so how far down the road would this be?


No worries. I've become a little thin-skinned what with all the daggers and arrows lately.

1080p24 and 1080p60 have both been valid choices since the original 18 formats were approved by the ATSC. They _could _have chosen either originally.

1080p24 was impractical because legacy content has always been 480i30, and you can't change frame rates on the fly without making every TV (and your transmitter) glitch like crazy, so it made no sense to use a format for broadcast that had a wacky frame rate. The solution of pulldown for films works very well, so most broadcasters stuck with 1080i or 720p, which have the same refresh rate as 480i. Also, crossconversion was expensive and artifact-ridden in 1999 making that a bad choice.

1080p60 has always been impractical because it takes twice the bandwidth for an equivalent level of artifacting to the other formats, and broadcasters were only doled out 6 MHz, or about 20 Mb/s of bandwidth, which is barely enough in the first place.

I can't predict when, but I believe we will see broadcasters convert to 1080p24 at some point. The interpolated frame technology is a real deal, and once it becomes common, that will drive things. But legacy sets that do not have the technology would tend to keep broadcasters from going there for some time.

What I really foresee is that 3D will drive this. 3D is going to be based on time-sharing frames between left and right eyes, implying a need for a faster refresh rate (so that it can be divided back in half for left and right eyes). It will also take a bit more bandwidth to send the delta or difference frames representing the "other" eye. The best way to do this is to create a backward-compatible system based on 24 or 30 fps for normal TVs, and a locally (at the set) reinterpolated 48 fps or 60 fps for 3D, so that the frame rate can be time-divided between left and right eyes.

I still can't comprehend how they will do this without some sort of glasses, but it is conceivable that they could build in persistence of vision into the glasses somehow to really smooth out any flicker.

Figure 5 years before 3D starts to be something networks start to latch on to. It all depends on complete compatibility with existing sets, which is doable. It just seems that the interpolative frame technology will be at the heart of it, so converting to 3D would probably coincide with moving to broadcasting 1080p24 and being able to watch that on newer TVs as true 1080p60, and probably in 3D for prime time.

As for broadcasters actually transmitting in 1080p60? I don't ever expect that to happen, as there are only 2 ways to do that:

1) more bandwidth, and since bandwidth = money, no one will do that

2) greatly-improved compression. The level it would have to be improved would be like about 3 or 4 times better than current MPEG-2, and even though MPEG-4, a 30% improvement, is readily available, legacy issues will keep even that out of broadcast TV probably forever.
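For a back-of-envelope feel for the bandwidth point, a quick Python calculation (my rough numbers: uncompressed 8-bit 4:2:0 video, and the nominal ~19.39 Mb/s ATSC payload in a 6 MHz channel):

```python
# Rough arithmetic: raw video rate vs. what an ATSC channel can carry.

def raw_rate_mbps(width, height, fps, bits_per_pixel=12):
    """Uncompressed video rate in Mb/s (4:2:0 8-bit = 12 bits/pixel)."""
    return width * height * fps * bits_per_pixel / 1e6

ATSC_PAYLOAD_MBPS = 19.39  # nominal payload of a 6 MHz ATSC channel

for label, fps in [("1080 @ 30 full frames/s", 30), ("1080p60", 60)]:
    raw = raw_rate_mbps(1920, 1080, fps)
    ratio = raw / ATSC_PAYLOAD_MBPS
    print(f"{label}: {raw:.0f} Mb/s raw, needs ~{ratio:.0f}:1 compression")
```

Doubling the frame rate doubles the raw rate, so 1080p60 roughly doubles the compression ratio the encoder has to hit in the same channel.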


----------



## Alebob911 (Mar 22, 2007)

Check to see if your receiver has a timing adjustment for your audio. By adjusting the timing you can eliminate the lip sync issue. Check to make sure DD is enabled on your HR2x. Using HDMI "should" sync the audio to the video since you're running it through your receiver. Hope that helps. I know how annoying that can be.


DeanS said:


> Thanks for the response regarding component cables carrying 1080p/24 content. My specific issue is the purchase of a new receiver with HDMI connections for my home theatre and an issue I ran into playing back 1080p/24 content from D*. Specifically, there is a lip synching error. I attribute this to the fact that I do not currently have a digital optical audio connection between the LCD TV and the new amplifier. I ran into the same problem with my old amplifier, and by making this connection, rectified it.
> 
> Now, however, with the HDMI output from the HR-20 routed through the receiver, how do I correct this issue? The receiver (Yamaha RX-V65) needs to be on "HDMI2" input since that is the input used by the HR-20. There is an "AV1" input on the receiver that is potentially available (currently occupied by an older standard-def DVD player with component and optical digital output). Would simply removing this component and connecting the HR-20's component and optical audio output to "AV1" on the back of the new receiver fix this issue? I think somehow the optical audio output from the LCD TV to the receiver is what is really needed to address this issue, but I am at a loss on how to do this with this new receiver.


----------



## DeanS (Aug 23, 2006)

Alebob...

Thanks for your suggestions. I'm pretty sure that DD is enabled on my HR-20 as I have been listening to DD all along. But I will check it again this evening, just to be sure. In addition, I have sent an email to Yamaha technical support regarding this issue and they may have some specific instructions or answers for me.

This new Yamaha receiver has a timing adjustment for video. However, I don't know how or what kind of adjustment to make. The manual that came with the receiver says that "HDMI Auto automatically adjust output timing of audio and video signals when a monitor that supports an automatic lip sync function is connected to this unit."
My monitor is a Sony KDL-40z410 LCD T.V. I don't think it supports this function. But you can leave the automatic function turned "off" in my case and enter a manual delay. The delay range you can set manually is 0 to 240ms. I have fiddled around with this a bit, and can't quite get it to work with D* 1080p/24 material. If I were somehow able to correct the lip sync problem with D* 1080p/24 material, would it screw up alternate audio sources (such as regular DD with 1080i or 720p programming)? 

Either way, not a huge deal for me, since I am renting Blu-ray discs to watch at home much more frequently, but I would like to download and watch a 1080p movie from D* now and then and be assured that the video and audio match up.

Thanks.


----------



## TomCat (Aug 31, 2002)

DeanS said:


> ...This new Yamaha receiver has a timing adjustment for video. However, I don't know how or what kind of adjustment to make. The manual that came with the receiver says that "HDMI Auto automatically adjust output timing of audio and video signals when a monitor that supports an automatic lip sync function is connected to this unit."
> My monitor is a Sony KDL-40z410 LCD T.V. I don't think it supports this function. But you can leave the automatic function turned "off" in my case and enter a manual delay. The delay range you can set manually is 0 to 240ms. I have fiddled around with this a bit, and can't quite get it to work with D* 1080p/24 material. If I were somehow able to correct the lip sync problem with D* 1080p/24 material, would it screw up alternate audio sources (such as regular DD with 1080i or 720p programming)?...


The manual delay is a delay to audio to match video.

Typically, lipsync issues for MPEG material are audio leading video (MPEG video can slow down letting audio get ahead under difficult compression scenarios). That delay may come and go, while some part of it may be permanent. Likewise, most FP displays have permanent delay in them in the video (another good reason among many why processing is done in the analog domain--if not, the delay would become intolerable, as nearly every process step would add up to another frame or 33 ms of delay to whatever video delay is already there).

If you feed audio to your AVR directly, that audio is not delayed (compared to the video going through the display), and audio will lead video even more, typically closer into the realm of perception by the viewer. After a certain amount of accumulated error, it becomes noticeable and then annoying. The trick is to minimize it by removing or compensating for as much as you can, which even if not perfectly accurate, will probably make it unnoticeable.

Many if not all but the cheapest displays also have a delay for audio, built-in on purpose so that the video delay will not add lipsync error, and normally not mentioned in the literature at all (they usually don't want to mention that the display has video delay or that the added audio delay compensates for that, as it might push buyers away, even though all monitors have delay).

To me, for that reason alone it makes sense to _use the display as the switchpoint for all sources that are "video with audio"._ IOW, feed your DVRs and DVD players, both their audio and their video, to the display first, feed the audio output of the display to your AVR, and use the TV remote to select the video sources.

This has three benefits:

1) There is less chance for lipsync error, because the audio is delayed through the display to match the natural video delay.

2) Switching sources at the display implies automatic matching of the audio source (audio-follow-video) so you don't have to also switch sources on the AVR.

3) Preswitching video/audio at the display leaves more inputs available on the AVR for audio-only sources (CD players, SiriusXM, whole-house audio servers, etc.)

Other than these issues there is also the issue that some sources just will have excessive lipsync error already. This was the case for many of the HD channels that actually launched on DTV, as they were new and they did not anticipate delay to the video. Once realized, it took time for them to budget for and acquire delay boxes to fix the problem.

The AVR manual delay is just that--one adjustment will affect everything the same amount all of the time. But there is a window--audio leading video is noticeable at about 20 ms, and audio trailing video is noticeable at about 50 ms (audio often trails video in nature, so a bit of delay seems natural). Your sources may have unequal video delay, but you can probably find a happy medium audio delay offset adjustment that will minimize the error beneath your perception level.

My AVR has only 0-100 ms delay. 80 ms seemed about right when DTV launched a bunch of channels last year, but I cut it back after they all got a bit better. I can't imagine needing 240 or that being necessary for 1080p content, but maybe it is. If you connect using the above method, however, that will probably remove a lot of the delay and the rest can be taken care of with the manual delay.
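Just to illustrate the windowing math (the 20 ms lead / 50 ms trail thresholds are the rough figures quoted above; the per-source video delays are made-up examples):

```python
# Sketch: finding one AVR audio-delay setting that keeps every source's
# lip-sync error inside the perceptual window described above.

LEAD_LIMIT_MS = 20    # audio ahead of video: noticeable beyond ~20 ms
TRAIL_LIMIT_MS = 50   # audio behind video: noticeable beyond ~50 ms

def usable_avr_delays(video_delays_ms, max_delay_ms=240, step=10):
    """Return AVR audio delays (ms) acceptable for all listed sources."""
    ok = []
    for delay in range(0, max_delay_ms + 1, step):
        # positive error = audio trails video; negative = audio leads it
        errors = [delay - vd for vd in video_delays_ms]
        if all(-LEAD_LIMIT_MS <= e <= TRAIL_LIMIT_MS for e in errors):
            ok.append(delay)
    return ok

# Hypothetical sources whose video paths delay the picture 60-100 ms:
print(usable_avr_delays([60, 80, 100]))   # [80, 90, 100, 110]
```

Any setting in that range keeps all three sources under the perception thresholds, which is the "happy medium" idea.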


----------



## texasbrit (Aug 9, 2006)

It would make a lot of sense to use the display as the switchpoint, but in general that isn't happening. Most TVs won't even output DD 5.1 over their "digital audio out" if the original audio came in over HDMI; all you get is PCM. If the TV had a DD 5.1 output which it could delay based on its own internal processing time, then HDMI 1.3 would not need the audio sync capability (which seems not to be widely adopted anyway).


----------



## TomCat (Aug 31, 2002)

texasbrit said:


> It would make a lot of sense to use the display as the switchpoint, but in general that isn't happening. Most TVs won't even output DD 5.1 over their "digital audio out" if the original audio came in over HDMI; all you get is PCM. If the TV had a DD 5.1 output which it could delay based on its own internal processing time, then HDMI 1.3 would not need the audio sync capability (which seems not to be adopted widely anyway)


Many TVs do have that capability, but this is just one more of a number of problems with digital audio. Even HDMI 1.0 was capable of 8 channels of digital audio (192 kHz/24-bit LPCM at 37 Mb/s) from the start in 2003, so there is no reason 5.1 could not be decoded in the TV and sent to the AVR as discrete digital audio. The pipe is there. Most just don't do that, even 6 years later.

I have found enough compelling reasons to avoid distributing audio as digital altogether. The only real possible advantage for using digital audio distribution is 5.1, but the advantages of 5.1 over converting 2.0 to Dolby PLIIx 7.1 surround are minimal and highly over-rated (and over-hyped). Sorry to rant off topic, but other than the inherent video delay problem with FP displays:

1) 5.1 is a small (yet growing) subset of available audio. All 2.0 audio (virtually all audio other than 5.1) can be successfully converted to Dolby PLIIx 7.1 surround, including all programs, all local news, all audio-only sources, even local commercials and promos, at a quality level of 5.1.

2) The limitations of PLIIx are not nearly as restrictive as older Dolby surround systems were. The steering logic is improved to about 30 dB of separation, which is significant enough to make the separation of 5.1 not really significantly an improvement over it. The frequency response of surround channels is also improved over earlier surround techniques. Nothing else about 5.1 is a significant improvement over PLIIx, neither inherently or in common implementation.

3) The promise of 5.1 is not that great in the first place. Dolby PLIIx 7.1 surround simulates it adequately. People think 5.1 rocks because they expect it to and salesmen tell them it will, and they believe it because when they first experience it they are hearing it on a better system than they are used to--either a store system or a system they just bought. They want to validate a large purchase decision they either made or hope to make by convincing themselves that 5.1 makes a lot of difference, but what typically makes a lot of difference is the other aspects of the new system, and not so much 5.1.

You will hear a difference in a few scenes of the few sources that use 5.1 the way it was designed to be used, and that can be impressive. But the difference between most 5.1 and proper matrixing of 2.0 to Dolby PLIIx 7.1 surround is typically quite minimal, and usually not compelling enough to justify the downsides of using 5.1.

When surround is at its most effective, the effect itself is, ironically enough, not noticed consciously; it contributes subconsciously to the quality of the viewing experience. If the effect is so distracting that it breaks the suspension of disbelief needed to enjoy the movie, what good is it? People who buy surround systems so they can consciously hear the surround channels are buying them for exactly the wrong reason. You are not supposed to consciously perceive the surround; it is supposed to envelop you subconsciously, the same way life does. But try telling a salesman that you want a system you won't consciously notice a difference on, and he'll think you're crazy. Surely you want something whose advantage is that it grabs your attention, right? Actually, if you buy surround for the correct reason, not at all.

4) The potential of 5.1 mastering is seldom realized, even on most 5.1 material. Audio mastering engineers have just not figured out how to make it reach that potential, and acquisition of audio at the recording site is typically not done very well either, except for big-budget Hollywood blockbusters. In many cases, what is sold as 5.1 is 2.0 source audio mastered to 5.1 using, guess what--Dolby PLIIx surround encoding.

5) Digital audio distribution, other than by HDMI, causes a lot of clocking problems (clicks and pops, especially on source changes). It is a regular forum complaint, and rarely is there a solution.

6) Other than the rudimentary tone controls in AVRs, there is no easy way to process audio for local distribution. Unless you are lucky enough to find the perfect expensive speakers or build the perfect acoustically-treated room, audio really needs equalization to sound good. It also needs automatic level control, and both are much easier to do in the analog domain.

7) Processing audio locally in the analog domain is perfectly adequate. Sadly, many assume that digital is automagically better and that analog is somehow inherently flawed by comparison, but that is a complete fantasy, however widely held. The potential advantages of digital do not really apply in this isolated situation, where control is easy and the environment is not hostile to audio signals.

Users have a long and successful history of processing audio in the home as analog (my father was a hi-fi stereo enthusiast as far back as 1958). It is a very mature field, and the issues threatening analog audio quality have long since been fixed. You can get an AVR for under $300 these days that will outperform the typical $1000 AVR of a decade ago, and even those AVRs sold as "digital" amps are not really digital, but still process in the analog domain. They are class D PWM designs, which is analog processing (the power amp would have to work on the PCM data directly to be truly processing in the digital domain).

I don't have a bone to pick with digital (only with it being over-hyped). Digital audio gets better all the time, and eventually my arguments will not be as strong as they are today. But as far as home enthusiasts using digital distribution goes, things are still in their infancy and there are still a lot of significant nagging problems.

It will get much better. HD will also get much better. Technical developments are complex and it takes a lot of time to fine tune them. When "color" came to TV in the 50's, it took a good 15 years before it got nailed down to where you didn't have to keep twisting the hue control each time you changed channels or the channel changed programs. But they eventually figured out how to control that, and by the 90's, hue controls had disappeared from the front of the TV, no longer needed. They still exist as menu items, but rarely are they needed at all anymore.


----------



## DeanS (Aug 23, 2006)

Tomcat:

Thank you for your excellent and very thorough response to my question regarding audio/video delay problems with my current setup. Luckily, in my case, the Sony KDL-40Z4100 outputs full DD 5.1 from its optical audio output. I will route this audio output to the Yamaha receiver and that should correct the sync issue.

FYI, it turns out that the optical audio input on the Yamaha receiver which I will use to make the new connection is matched with component video inputs and is not assignable to other inputs. I presume the 1080p/24fps video content from D* can pass through its component outputs and is not just limited to HDMI? I think that this is the only way I can get it to work on this receiver. 

Thanks for your help.


----------



## CCarncross (Jul 19, 2005)

I believe 1080P from D* is HDMI only IIRC. I do have a question for you: Your Sony actually passes on the 5.1 audio from its optical output when it receives it via HDMI? I thought most all tvs only gave 5.1 out of the optical if the source was from the internal ATSC tuner.


----------



## DeanS (Aug 23, 2006)

The answer is "Yes," the Sony actually passes the 5.1 audio from its optical output when it receives it via HDMI from the HR20. I don't use the internal ATSC tuner in the set itself.


----------



## cygnusloop (Jan 26, 2007)

CCarncross said:


> I believe 1080P from D* is HDMI only IIRC. I do have a question for you: Your Sony actually passes on the 5.1 audio from its optical output when it receives it via HDMI? I thought most all tvs only gave 5.1 out of the optical if the source was from the internal ATSC tuner.


My Sony KDS-60A3000 behaves just as you describe. It outputs 5.1 from the internal ATSC tuner, but only passes thru 2.0 from a 5.1 HDMI input. If the newer Sonys don't have this limitation, that is good news indeed.


----------

