# What HD Monitor Resolution to Get?



## Oompah (Feb 8, 2006)

I'm in the tire-kicking stage of HD shopping now; long-time Dish subscriber (since 1998) but no HD equipment at all yet. I currently have a 501 DVR (great box!) and a 4000 from last century in another room. I may be interested in going to HD before the Germany '06 World Cup (on ABC & ESPN) this summer. Since I don't want to give up the DVR features, I figure I could replace both boxes with a ViP622.

I am familiar with NTSC (SD) and digital video in general, and the difference between progressive and interlaced scanning. I've been reading up on the new standards, but I have some questions about LCD and plasma monitor "native resolution" (the number of dots physically present on the screen) and the conversions necessary when the actual source is a different resolution. From reading, I find that some broadcasters (e.g. ABC, ESPN) use 720p (is it 60 or 30 fps? 60, I hope!) while others (CBS, Fox?) use 1080/30i (30 frames = 60 fields per second). These formats are downlinked from the satellite using whatever standard the originator provides [more or less - I read in the archives that E* was sometimes reducing horizontal resolutions over the satellite to save bandwidth, and the arguments over resolutions that ensued].
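The frame/field arithmetic behind these format labels is easy to check for yourself; a quick sketch (assuming the standard ATSC rates):

```python
# Frame vs. field rates for the two common broadcast HD formats.
# "720p" as used by ABC/ESPN is 1280x720 at 60 progressive frames/s;
# "1080i" is 1920x1080 at 30 frames/s, transmitted as 60 fields/s.

def fields_per_second(frames_per_second: int, interlaced: bool) -> int:
    """Interlaced video splits each frame into two fields (odd/even lines)."""
    return frames_per_second * 2 if interlaced else frames_per_second

print(fields_per_second(30, interlaced=True))   # 1080/30i -> 60 fields/s
print(fields_per_second(60, interlaced=False))  # 720/60p  -> 60 frames/s
```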

My understanding is that you set a ViP622 receiver's HD output to whatever standard you choose from those available (720p, 1080i, etc.), to match your monitor's capability. If some other resolution comes into the receiver, it is converted to the selected resolution. The receiver does not "pass thru" whatever resolution it receives. Is this correct?
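As a sketch of that understanding (purely illustrative; the format list and function names here are made up, not the 622's actual firmware behavior): whatever format arrives, the receiver converts it to the one output format you configured.

```python
# Hypothetical model of a fixed-output HD receiver: every incoming format
# is converted to the format configured in the setup menu. The names here
# are illustrative only; they are not Dish's actual API.

FORMATS = {"480i", "480p", "720p", "1080i"}

def receiver_output(incoming: str, configured: str) -> str:
    """The incoming format is never passed through unchanged; it is
    always scaled/converted to the configured output setting."""
    if incoming not in FORMATS or configured not in FORMATS:
        raise ValueError("unknown format")
    return configured

print(receiver_output("720p", "1080i"))  # -> '1080i'
print(receiver_output("480i", "1080i"))  # -> '1080i'
```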

I have looked at the HD offerings of the local big-box stores and specialty electronics and video retailers. In the 37" size I'm most interested in, the big-box stores carry mostly 1280 x 720 (720p) monitors with an occasional 1920 x 1080 (1080i or 1080p) monitor; the specialists have a larger proportion of 1080s.

Say I get a 1080i- or 1080p-capable monitor and set the 622 to deliver 1080i to it (that's the highest available, isn't it?). If I'm watching 1080i source material, everything should match and the PQ will be as good as possible, right? Now, suppose I'm watching a 720/60p ESPN feed. I presume the receiver converts this to a 1080/30i signal for transport to the monitor; this means scan conversion and interlacing a previously non-interlaced signal. Yuck. ESPN made a decision to go with progressive scan at somewhat lower resolution instead of interlaced for sound technical reasons (flicker, combing on stop-action), and doing this conversion would seem to nullify that.

My question: if I plan to watch a lot of - but not only - 720p source material, would I be better off with a 720-line monitor?

Even the people in the high-end video stores don't seem to have a clue.

Does any of this really matter?

Thanks for all the info. I have spent hours browsing the forums and archives and learned more here than everywhere else combined!


----------



## liferules (Aug 14, 2005)

Boy, that's a loaded question! There are many differing opinions on this issue. Personally, I don't think most people would notice a difference between 720p and 1080i. Some say 720p is better for fast-moving frames than 1080i... all I know is that 1080p is not being used by anyone, DVD or TV, so it would be of no benefit at this time.

Good luck!


----------



## Oompah (Feb 8, 2006)

Thanks for the reply, liferules. Yeah, I don't want to start a flame war over resolution and bitrates; I'm just trying to find some details that no one I've asked seems to know, or be forthcoming about.

I tend to buy stuff for the long haul (consider the vintage 4000 I'm still using that still works fine...) and want to be happy with what I choose for a while, anyway. Flat HD displays are still expensive...

The difference between computer screens with 768 lines and 1024 lines is very apparent. TV is more forgiving since the pictures are moving and you don't look at them as closely, but I'd hate to settle for 720 lines when 1080 are (sometimes) available, for not much more money. IOW, I'd like to get the best available within reason, but hate to think that I end up with the disadvantages of both reduced resolution (lower res source to start with then smeared by scan conversion) and introduced interlacing - a double whammy - in what I watch most now, for higher cost.

What to do? What to do? Maybe I'm worrying too much about too little.


----------



## rbyers (Jan 15, 2004)

The AVS Forums have a Plasma and LCD Flat Panel Display forum that should be a good resource for anyone mulling this question. Be forewarned that a lot of the posters on that forum are awfully particular (and sometimes dead wrong), just like on these forums. But, with a lot of reading, you can get a good sense of what to look for in a particular display technology.


----------



## liferules (Aug 14, 2005)

The other thing to keep in mind is that the decision may already be made for you if you're buying a big screen (42 inches or larger), as almost all of them are now 1080i from what I've seen lately in the stores...

Certainly if you're looking not to limit yourself, then 1080 would be the highest resolution, and if necessary the set could downconvert lower resolutions...


----------



## logicman (Feb 9, 2006)

Oompah,

I was where you are now about 2 months ago ...

1) If budget were no issue (yeah, right), then I would have purchased the Sony LCoS technology, which produces beautiful HD images and does decently with SD.
2) Projection systems are good and give you "true" theater effects, but the image quality is slightly below direct view systems.
3) Sony has the best image quality of all the brands (IMHO) but this is very subjective and you may find Samsung, Panasonic, etc. have the edge.
4) I honestly think that 1080i and 720p deliver nearly equivalent pictures, and these formats will dominate for the next few years. When 1080p is "mainstream" it will be an upgrade in quality, but probably not so drastic that you'd wish you had waited.
5) Look at 1080i and 720p HDTVs that you can afford and pick the one that you think looks best. Then enjoy it. That's what everyone here has done. It's been both a curse and a blessing for most of us.


----------



## voripteth (Oct 25, 2005)

Personally I'd recommend getting a set that has a native resolution of 1080. For me that meant a Sony SXRD that outputs a 1080p picture. That way if the incoming signal is 1080i or 720p I won't lose any detail. If I went with a set that could just generate 720p then all the 1080i signals would have to be downconverted.

IMHO 720 sets are "old" technology.
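voripteth's reasoning can be made concrete with a little arithmetic (a sketch; the helper name is made up, the raster sizes are the standard ones):

```python
# Classify what a fixed-pixel display must do with a given source:
# show it natively, upscale it, or downscale it (losing detail).
# Sizes are the standard HD rasters.

def conversion_needed(source: tuple, native: tuple) -> str:
    src_pixels = source[0] * source[1]
    panel_pixels = native[0] * native[1]
    if src_pixels == panel_pixels:
        return "native"
    return "upscale" if src_pixels < panel_pixels else "downscale (detail lost)"

panel_1080 = (1920, 1080)
panel_720 = (1280, 720)
print(conversion_needed((1280, 720), panel_1080))   # upscale
print(conversion_needed((1920, 1080), panel_720))   # downscale (detail lost)
print(conversion_needed((1920, 1080), panel_1080))  # native
```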


----------



## LtMunst (Aug 24, 2005)

liferules said:


> The other thing to keep in mind is that the decision may be already made for you if you're buying a big screen (greater than or equal to 42 inches) as they are almost all now 1080i from what I've seen lately in the stores...


They say 1080i only because they will accept a 1080i signal. You often have to dig deep into the fine print to find that most are still 720p native.


----------



## alindber (Jun 30, 2005)

IMHO, purchase the highest resolution you can afford. There are several systems with 1080p now available. Some will ask why... My answer would be that the new "DVD" devices, Blu-ray etc., will show discs at 1080p resolution. By summer, there will be several to choose from.

Myself, I would love the new Mitsubishi 82" 1080p system.


----------



## guruka (Dec 27, 2003)

For a fixed-pixel display (LCD, DLP, plasma), 1080 native = 1080p (1920x1080). That's the way to go. I would not buy a 720p native display today. Things are moving too fast.

.....G


----------



## bhenge (Mar 2, 2005)

LtMunst said:


> They say 1080i only because they will accept a 1080i signal. You often have to dig deep into the fine print to find that most are still 720p native.


Also remember that going from 720p to 1080i is downconverting (you lose information). 720p displays 720 lines every 1/60th of a second, 1080i displays 540 lines every 1/60th of a second (in some HD circles 1080i is actually called 540p). This is also why 720p tends to be better for fast-motion video (sports) and was adopted by ESPN and ABC. 1080p displays 1080 lines every 1/60th of a second. The only problem here is the lack of 1080p source material, which forces an upconversion of all existing signal formats, be it 480i, 480p, 720p, 1080i, whatever. Your PQ on a 1080p set will be more dependent on the built-in scaler that upconverts the signal than on virtually anything else (but it is always better to upconvert than downconvert). That said, if you can afford it, go 1080p native... 1080p sources will come and you will be ready for them (some PC and gaming systems are already there, I hear).


----------



## olgeezer (Dec 5, 2003)

In sets smaller than 40" it may be difficult if not impossible for the human eye to see the difference between 720p and 1080p. "p" refers to frames per second; "i" refers to fields per second. There currently is nothing broadcast in 1080/24p or 1080/30p, and there may not be for some time. HD DVD and Blu-ray can support 1080p, but I'm not sure when these discs will arrive or how much they will cost. There are many great sets out there. I personally like the SXRD, but who knows what else this year may bring. Get something soon; you're missing a lot of great TV.


----------



## jrb531 (May 29, 2004)

Since 1080i is really just 540 interleaved, then any set that does 720p should do 1080i?

-JB


----------



## Stewart Vernon (Jan 7, 2005)

bhenge said:


> Also remember that going from 720p to 1080i is downconverting (you lose information). 720p displays 720 lines every 1/60th of a second, 1080i displays 540 lines every 1/60th of a second (in some HD circles 1080i is actually called 540p).


This is wrong!

720p is not higher resolution than 1080i. So 1080i is NOT a downconvert of 720p in any way.

I wish people would quit getting this confused. Newbies are getting confused by erroneous information floating out there.


----------



## tomcrown1 (Jan 16, 2006)

1080p is a marketing gimmick, as this format will never come into being. One problem is the amount of bandwidth needed to do true 1080p. If you compress the 1080p format then it will be no better than 1080i or 720p. The inexpensive 1080p 37-inch Westinghouse LCD is one example of how bogus 1080p can be. The picture on the Westinghouse is nowhere near as good as that of the 37-inch Sharp LCD TV, which has a 720p native resolution. There are other factors in how good a picture a TV will give you, for example how good its color tracking is, how well it shows details in black backgrounds, etc. In general a more expensive set, like the 52-inch plasma from Pioneer, will give a better picture because of the better components inside.


----------



## Rogueone (Jan 29, 2004)

There was a very lengthy discussion in these forums a few months ago on this topic.

720p gives you the fewest pixels per second. 1080i gives you about 50% more pixels per second. 1080p could double that theoretically, but in actuality, can not put up any more "original" pixels than does 1080i, but it should be able to put them all up at once versus in 2 passes.

bhenge, your statement that 720p to 1080i is a downgrade is in error. This is not a matter of opinion. TV signals are 30 fps; movies are filmed at 24 fps. It doesn't matter how you do the math, 720p/30 only puts up 720 scan lines of original pixels for each frame of a program. 1080i puts up 1080 scan lines per frame, in 1/30th of a second, so 1080i is a much more defined image. It doesn't matter that only 540 of the 1080 were lit at any given time; the BRAIN processes them as a full 1080. The human eye/brain can NOT differentiate images faster than 30 fps, so even if 720p were working at 60, like 1080p might do as well, the unit has to put the same frame up twice before putting up the next new frame, since there are only 30 frames to put up in each second of TV (24 in the case of a movie). But as I understand it, 720p is 30, not 60.

1080 will be the best picture. 1080p could possibly be better than 1080i, but 1080p is not a supported mode according to the FCC docs linked in that other discussion. 1080i and 1080p will produce essentially the same picture, the same number of unique pixels per second. The difference is that 1080p should be smoother in fast action, if the source were 1080p, but there are no 1080/60p sources, not even Blu-ray. The "source" is still at best 1080/24p or 1080/30p. But the point is, what is the difference between a 1080/30p and a 1080/60i image? To the brain, none. It's still compiling a complete frame every 1/30th of a second. The phosphors/pixels in a 1080i system are designed to stay lit for 1/60th of a second (in 1080p they stay lit 1/30th, or it would have to draw the same frame twice at 1/60th), so as the 2nd line is drawn, the first is going out. Due to how the brain works, it still sees them both lit. It doesn't matter who you are, that is how your brain works, and why TV works as it does. The only flaw to interlaced is the drawing of the next frame, where the even lines are the old frame and the odds are the new frame. If your eye is sensitive enough, you MIGHT be able to see this. That is why fast action is always mentioned. In normal pictures the changes between frames are so slight you won't "see" the difference. But something like NASCAR is so fast it makes it easier to notice.
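Whatever position you take in this debate, the raw per-second pixel counts themselves are plain arithmetic (standard ATSC raster sizes assumed):

```python
# Pixels delivered per second for each format, using the standard raster
# sizes. An interlaced format delivers a full frame every 1/30 s (two
# fields of 540 lines each), so it counts as 30 complete frames/s.

def pixels_per_second(width: int, height: int, frames_per_second: int) -> int:
    return width * height * frames_per_second

rates = {
    "720p/60":  pixels_per_second(1280, 720, 60),   # 55,296,000
    "1080i/30": pixels_per_second(1920, 1080, 30),  # 62,208,000
    "1080p/60": pixels_per_second(1920, 1080, 60),  # 124,416,000
}
for name, rate in rates.items():
    print(f"{name}: {rate:,} pixels/s")
```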

*Are the flat panels really 1080?*
And as others pointed out, the bigger problem is that almost every HD set says it handles 1080i, but that does NOT mean it displays 1920x1080. Very few LCD and plasma sets are actually 1080, and those that are, are not the sub-$2000 models. The top-of-the-line models are the ones that might do 1920x1080, so if you want 1080, be careful. DLP is a good example of this as well. The older DLP is 720p. This year, DLP is selling a 1080p product. It's not really 1920x1080 mirrors, though. It's 960x1080. The horizontal mirrors "wobble" to reflect onto 2 spots at once. It's very fast, likely has no noticeable impact on image quality, and was likely a lot cheaper than trying to build a chip with 1920x1080 mirrors. Oh yeah, and you mentioned only 37"; at smaller sizes like this, you are less likely to see the benefits of 1080, as the pixels are so close together for both. As you reach 60" and bigger, 1080 makes a big difference in eliminating the "screen door" look.
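The wobulation arithmetic is easy to check: each of the 960x1080 mirrors covers two horizontal positions per frame, so the addressable grid matches the full 1080p raster:

```python
# A wobulated DLP chip with 960x1080 mirrors addresses two horizontal
# positions per mirror each frame, matching the 1920x1080 pixel grid.

mirrors = 960 * 1080
positions_per_mirror = 2
addressable = mirrors * positions_per_mirror
print(addressable == 1920 * 1080)  # True
```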

*Future considerations*
If you have to have a flat unit, you are limited in how inexpensive you can get. DLP could allow you to get in a little cheaper, but those units are going to be 20" or so deep. LCoS might be an option, but again, for a real 1080i display set, expect to be out of the entry-level prices and in the middle to upper pricing range. Also, depending on what is driving you to get HD now, keep in mind that all of the major plasma and LCD panel manufacturers (Matsushita, LG, Samsung, etc.) have announced in the past 2 months plans to push their existing plants to full production while bringing online new plants with the same capacity, so by summer, all of them will be producing roughly 4x the panels they were this past fall. Some of the same articles announcing these new plants also noted current store stocks are 50-75% overstocked. Obviously, if the makers are ramping up production and units are NOT selling well, prices are due to drop hard and fast by next Christmas. Also, these makers are just starting to push out higher volumes of 1080 panels, so by later this year, you could expect much better choices.

I would take that into consideration, and if you really want a better product for the long term without paying through the nose, I'd buy the entry-level 720p panel if you want HD now, then keep an eye on things around Christmas and this time next year to see if the TV you really "want" now is finally available at the price you want to pay. And by this time next year, SED should be out in force. From one article:


> "A Surface-conduction Electron-emitter Display (SED) is a flat panel display technology that uses surface conduction electron emitters for every individual display pixel. The surface conduction electron emitter emits electrons that excite a phosphor coating on the display panel, the same basic concept found in traditional cathode ray tube (CRT) televisions. This means that SEDs can combine the slim form factor of LCDs with the high contrast ratios, refresh rates and overall better picture quality of CRTs. Canon also claims that SED consumes less power than LCD displays."


and another I found reported this 


> "The Canon and Toshiba Joint Venture SED Inc. (Surface-conduction Electron-emitter Display) shows a Flat Panel Display with 100,000:1 contrast ratio.
> 
> We reported about a Samsung PDP TV, that has 10,000:1 and that was already huge.
> I would really like to see 100,000:1 contrast ratio first hand, but maybe it will make my eyes bleed. Who knows. "


so I'm holding onto my old 65" projection unit until there is an SED I can afford  (unless things change again like they always do haha)


----------



## LtMunst (Aug 24, 2005)

HDMe said:


> This is wrong!
> 
> 720p is not higher resolution than 1080i. So 1080i is NOT a downconvert of 720p in any way.
> 
> I wish people would quit getting this confused. Newbies are getting confused by erroneous information floating out there.


Just can't keep those 1080i=540p clowns down. :lol:


----------



## jrb531 (May 29, 2004)

LtMunst said:


> Just can't keep those 1080i=540p clowns down. :lol:


What I meant was that if a set is capable of doing 720 scan lines, then it should be able to do 1080i, because the set only displays 540 lines at a time, while in 720p it displays all 720.

Nothing to do with better picture or whatnot.

-JB


----------



## LtMunst (Aug 24, 2005)

jrb531 said:


> What I meant was that if a set is capable of doing 720 scan lines, then it should be able to do 1080i, because the set only displays 540 lines at a time, while in 720p it displays all 720.
> 
> Nothing to do with better picture or whatnot.
> 
> -JB


Nope. 1080i is 1080 lines. The cathode ray (if CRT) scans the first 540 on one pass and the 2nd 540 on the next pass to make up a full frame. It does not use the same lines. No way to fit into 720 lines without a downconvert.


----------



## jrb531 (May 29, 2004)

LtMunst said:


> Nope. 1080i is 1080 lines. The cathode ray (if CRT) scans the first 540 on one pass and the 2nd 540 on the next pass to make up a full frame. It does not use the same lines. No way to fit into 720 lines without a downconvert.


Don't think so.

The very reason for interleaving is to allow a larger resolution with lesser equipment.

If your monitor can do 720 lines in one pass surely it can do 540 interleaved.

This is why 1080i sets are cheaper to produce than 720p.

-JB


----------



## Stewart Vernon (Jan 7, 2005)

jrb531 said:


> Don't think so.
> 
> The very reason for interleaving is to allow a larger resolution with lesser equipment.
> 
> ...


More completely wrong info!

1080i scans 1080 lines. 720p scans 720 lines.

720p displays all 720 lines in one sweep of the screen. 1080i displays 540 in the first sweep, then 540 more in the 2nd sweep to complete the 1080 lines.

1080i is NOT 540 lines overlapping. It is interlaced scanning vs progressive. I hate to keep typing the same info in discussion after discussion...

For the sake of argument, let's talk about 720p as displaying lines 1-7 for a second. The monitor would display them as follows:

Line 1
Line 2
Line 3
Line 4
Line 5
Line 6
Line 7

All of the above would be done in one pass, progressive, to fill the screen with one full frame of info.

Now, let's talk about 1080i as displaying lines 1-10 (because it has more lines total than 720p does). The monitor would display them as follows:

First pass:
----------
Line 1
Line 3
Line 5
Line 7
Line 9

THEN, second pass:
----------
Line 2
Line 4
Line 6
Line 8
Line 10

All of the above would be done in two passes, interlaced as indicated, to fill the screen with one full frame of info.

This is done so fast that the human brain cannot detect it is happening. This is why movies and TV work. With the exception of some folks who are particularly sensitive and get headaches or have seizures (rare), we simply cannot see the interlacing happen.

Thus... end result... 1080i displays a higher definition image than 720p. Period. It just displays them differently.

Another example is dealing cards... Cards are dealt in an interlaced format! Each player does not get all 5 (or whatever depending on the game) at one time, but rather each player gets a card before anyone gets a 2nd card or a 3rd and so forth. It doesn't affect at all the number of cards dealt. Everybody gets all their cards.

1080i gives more information than 720p. It just displays differently.

The reason why 1080i is a supported format:

1. Because it works, and the human brain cannot see the interlacing!
2. It can be done using less bandwidth than a progressive picture would.

Sure, 720p at 60 frames might look better than 720p at 30 frames... but most of those 30 extra frames are identical information, not new data. So yes, 1080p, if it ever happened, might look better than 1080i... but the reality is, 24-30 frames per second is all that is needed for motion. Film is only 24 frames per second, and you don't hear folks in the movie theater complaining!
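The line-ordering example above can be sketched in a few lines of Python (a toy model of scan order, not of any real display hardware):

```python
# Simulate the scan orders described above: progressive paints every line
# in one pass; interlaced paints the odd-numbered lines in the first field,
# then the even-numbered lines in the second. Either way, every line is
# painted exactly once per frame.

def progressive_order(lines: int) -> list:
    return list(range(1, lines + 1))

def interlaced_order(lines: int) -> list:
    odd_field = list(range(1, lines + 1, 2))
    even_field = list(range(2, lines + 1, 2))
    return odd_field + even_field

print(progressive_order(7))   # [1, 2, 3, 4, 5, 6, 7]
print(interlaced_order(10))   # [1, 3, 5, 7, 9, 2, 4, 6, 8, 10]

# Interlacing changes the order, not the amount of information:
print(sorted(interlaced_order(1080)) == progressive_order(1080))  # True
```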


----------



## LtMunst (Aug 24, 2005)

jrb531 said:


> Don't think so.
> 
> The very reason for interleaving is to allow a larger resolution with lesser equipment.
> 
> ...


So not true. What is the pixel resolution of a 1920x1080i native display? Answer: 1920x1080.

A 1080i set does not scan 540 lines twice per frame. It scans 2 separate sets of 540 lines for each frame. If you cannot understand the fundamental difference between these 2 scenarios, then what can I say? :nono2::whatdidid


----------



## olgeezer (Dec 5, 2003)

On another note, what type of interlace is a pinnacle deal


----------



## LtMunst (Aug 24, 2005)

Technically, at any given point in time, your retina is only being struck by photons from a single pixel. That means that 1920x1080i is really 1x1p. :lol:


----------



## jrb531 (May 29, 2004)

LtMunst said:


> So not true. What is the pixel resolution of a 1920x1080i native display? Answer: 1920x1080.
> 
> A 1080i set does not scan 540 lines twice per frame. It scans 2 separate sets of 540 lines for each frame. If you cannot understand the fundamental difference between these 2 scenarios, then what can I say? :nono2::whatdidid


If the monitor could display 1080 lines all at once, it would be a 1080p.

The very reason most monitors are 1080 "i" is because they are capable of doing 540 in one pass but 720 is too much.

In fact a 720p set is more expensive than a 1080i set. Why? Because a 720p set has to be able to display 720 lines "at the same time" while a 1080i set only has to do half that number.

720p = 720 scan lines every 1/60th of a second = 60 frames refreshed per second, thus why fast-moving programs look better on 720p - because the picture is updated more.

1080i = 540 scan lines every 1/60th of a second

Even lines: pass #1
Odd lines: pass #2

so the full 1080 scan lines are painted only 30 times a second.

This is the very reason for the "i" = interlaced!

So stop saying that a 1080 "i" set can display 1080 scan lines at the same time - it cannot - only a 1080 "p" set can display 1080 lines at the same time, and not only are those sets expensive beyond belief (if you can find them), but current hardware has a hard time compressing that much data.

-JB


----------



## harsh (Jun 15, 2003)

rbyers said:


> The AVS Forums have a Plasma and LCD Flat Panel Display forum that should be a good resource for anyone mulling this question. Be forewarned that a lot of the posters on that forum are awfully particular (and sometimes dead wrong), just like on these forums. But, with a lot of reading, you can get a good sense of what to look for in a particular display technology.


What other people think is good is pointless. What they know is bad is worth observing and weighing.

I think the key is to find a reseller that will let you trade if you made the wrong decision. You can stir specifications and familiarize yourself with the technologies until you have no eyesight left. In the end, it comes down to what pleases you, given the programming that you watch. For those who watch Disney-provided programming, the answer is probably a 720-line set; otherwise, a 1080-line set has the potential to offer the best picture.

As was pointed out in a recent technology comparison, make sure you drag the whole household down to the store to see if they all agree. If one person in the house can't stand to watch the TV, you're going to regret it.


----------



## Rogueone (Jan 29, 2004)

jrb531 said:


> If the monitor could display 1080 lines all at once, it would be a 1080p.
> 
> The very reason most monitors are 1080 "i" is because they are capable of doing 540 in one pass but 720 is too much.
> 
> ...


JB, please stop showing how uneducated you are. Please take the time to read THIS; it shows how TV actually works.

Your statements are so screwed up it's almost funny, except someone is likely to think you know something you don't. Also, since I have a degree in communications that required learning how TVs work and how to repair them, um, I'm pretty sure I know what the heck I'm talking about, and I'm pretty sure you need to stop listening to teenagers who work at Best Buy (sorry, couldn't help the poke at the amazingly inaccurate statements that get made at stores like Best Buy).

here's the deal, JB and EVERYONE else who is too lazy to read thru that link above:
TV was created with a 30 frame per second requirement. Movie cameras operate at 24 fps (this was already told to you several times now). Why movies picked 24 fps, as best I've been able to conclude, was cost: it saved 6 frames of film per second but was still fast enough for our brains not to "see" the individual frames (like one of those picture books where you flip the pages to see the scene move).

Why TV is 30 and not 24 fps is really, really simple and based on how TV is powered: AC. You know, that plug in your wall? AC is 60 Hz, so by pulling that 60 Hz directly into the TV, and configuring the signal so that frame 1 goes from top to bottom in 1/60th of a second (each cycle of the AC wall power), the vertical element became very simple. In the beginning, technology wasn't sufficient to handle a 30 fps progressively scanned image, so engineers decided to take that 30 fps signal and make it a 60-cycle signal with half of each frame drawn on each sweep of the vertical beam. This reduced costs and made the TV easier to build. They came up with the interlaced concept: take the odd lines, paint them first in a single 60 Hz sweep, then paint the even lines.

Phosphors were picked that would stay lit just over 1/60th of a second, so line 1 is still barely lit as line 2 is drawn. So that you don't see a mixed image when the even lines are lit and the next frame's odd lines start, these phosphors are designed to basically stop glowing at the same time the matching line is being lit. Since this is occurring 60 times a second, the brain is seeing 30 unique images, which it in turn processes as a single, continuous, moving image. The brain is incapable of actually seeing the 30 individual frames.

Progressive scanning came into being due to PCs and text. The static, non-moving images of a computer at 60 Hz are so synchronized with the eye/brain and its 30 fps processing that most people saw flicker. This is why even bumping your display to 62 Hz or 70 Hz eliminates all flicker. So when DVDs came out, and digital TVs were designed, it was realized you could take the 480i DVD image, read it digitally, and before sending it to the TV, send each image as a full frame. But the TV still works on a 60 Hz system, and the phosphors are still designed to work only for 1/60th of a second. So the incoming 480p images couldn't be at 30 fps; each frame had to be sent twice, hence 60 fps, in order for the TV to display correctly.
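That frame-doubling step (each 30 fps frame sent twice to fill a 60 Hz display) is trivial to sketch:

```python
# Repeat each frame of a 30 fps sequence so a 60 Hz display gets an
# image on every refresh, as described for 480p above.

def double_frames(frames: list) -> list:
    out = []
    for frame in frames:
        out.extend([frame, frame])  # same frame on two consecutive refreshes
    return out

print(double_frames(["A", "B", "C"]))  # ['A', 'A', 'B', 'B', 'C', 'C']
```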

When HD is discussed, 720p is the equivalent of "poor man's" HD. The computer-equivalent image is roughly 1280x720. 1080i/p are 1920x1080 pixels, actual individual entities in the signal, and hopefully on the CRT, though some aren't large enough to actually have that many phosphors. That is a different issue than your claim of 1080i being only 540 lines.

Please keep in mind, everybody who is confused on this: there are NO MORE than 30 frames per second sent to your TV. NEVER more, and film will be 24 fps. Having a display that refreshes 60 times a second is useless, but is required due to having to work within the limitations of the phosphors on the inside of CRT tubes. And you'll notice almost no CRTs do 720p, and I would guess that 720p is 30 fps instead of 60 as a result. If 720p were 60 fps, then it could easily be viewed on a CRT (the phosphors have to be relit every 1/60th for progressive scanning). But LCD and plasma can work at 30 fps no problem, since the pixels are individually lit, not lit by a common, sweeping electron beam that won't return for 1/30th of a second.

If some sales guy told you 720p was higher definition than 1080i, sorry, you got misinformed by either a sleazy salesperson or, more likely, someone who didn't have a clue that they didn't know what they were talking about.


----------



## olgeezer (Dec 5, 2003)

And the reason this was done, is that original TV was developed as a B&W system. Because of bandwidth issues, when color was developed it had to fit in the same space. Color wheels were rejected, and an interlaced system was developed. It has served very well, and can do an excellent job with HD as well as SD in CRT displays.


----------



## Rogueone (Jan 29, 2004)

Oh, and while discussing how AC impacts your TV: the reason TVs were 4:3 originally was of course because movies were originally. But also, the horizontal scan rate is something like 15,575 Hz. The horizontal part of your TV (CRTs only) is driven by multiplying that 60 Hz signal from the AC to a speed that allows the electron beam to get from the left to the right and back in time for the second line to start. TV is actually some odd number like 525 lines, but some of those lines are intentionally unviewable. They carry other information not intended to be viewed. So 480 becomes the "viewable" portion of the image, and why we call it 480i and p, as that is about how many viewable lines are on normal TV.

So, when JB asks why a 720p set costs more than a 1080i set, there are 2 factors: 1) 1080i sets are typically CRT, which is still cheaper to make; 2) if you look at 720p and 1080i within the same technology, like LCD or plasma at both resolutions, you'll notice the 1080i is MUCH more expensive. It's easy to make a 720-res LCD glass; it's much more difficult and costly to make a 1080 piece of glass that is the same, say, 1.3" size as the 720.

And with CRTs, it all has to do with the horizontal scan rate. You see, for 480i the horizontal is only 15,575 Hz, which is just under 260 lines per pass. For 1080i, you need to do 540 lines per pass, or 32,400 Hz. This takes more expensive circuitry to boost the 60 Hz to 32,400. And for a 720p-based CRT, it has to do all 720 per pass, or 43,200 Hz. Again, more cost to make. If you look at high-end monitors for computers like SGIs, their monitors have to be able to sync over 100 kHz, which is why the high-end monitors cost so much more than the ones we see in stores. The 3-gun CRT projectors I used to install in the '90s were $20g and $25g based on being able to do 80k or 100+k horizontal signals.

That is what drives up costs the most in high-res CRTs. And of course, with those higher resolutions, you need more phosphors on the tube, a mask with more holes, etc.
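The scan-rate arithmetic in this post is easy to verify: horizontal frequency is just lines per vertical pass times passes per second (for 480i, the classic NTSC figure works out to 15,750 Hz at 262.5 lines per field):

```python
# Horizontal scan frequency = (lines painted per vertical pass) x
# (vertical passes per second). These reproduce the 32,400 and 43,200 Hz
# figures given above.

def h_scan_hz(lines_per_pass: float, passes_per_second: float) -> float:
    return lines_per_pass * passes_per_second

print(h_scan_hz(540, 60))    # 1080i field: 32,400 Hz
print(h_scan_hz(720, 60))    # 720p frame:  43,200 Hz
print(h_scan_hz(262.5, 60))  # 480i (525-line NTSC): 15,750 Hz
```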


----------



## Stewart Vernon (Jan 7, 2005)

LtMunst said:


> Technically, at any given point in time, your retina is only being struck by photons from a single pixel. That means that 1920x1080i is really 1x1p. :lol:


Hey, and if we are going to consider all factors... Consider that anything you see has already happened, since it takes time for the light to travel to your eyes... so you are always looking at the past!

Even live events are always looking into the past!


----------



## tomcrown1 (Jan 16, 2006)

http://www.sonyhdvinfo.com/article.php?filename=1080i-vs.-720p
Please read this link from Sony, as this is the true story behind 720p and 1080i


----------



## Rogueone (Jan 29, 2004)

HDMe said:


> Hey, and if we are going to consider all factors... Consider that anything you see has already happened, since it takes time for the light to travel to your eyes... so you are always looking at the past!
> 
> Even live events are always looking into the past!


dats a good one


----------



## liferules (Aug 14, 2005)

Rogueone said:


> here's the deal, JB and EVERYONE else who is too lazy to read thru that link above:


Your link doesn't work. I guess I'll have to be lazy...


----------



## Rogueone (Jan 29, 2004)

liferules said:


> Your link doesn't work. I guess I'll have to be lazy...


weird, how did the link I pasted get all jacked up like that? it's workin' now. And that is a great web site for simple layman's-type explanations of lots of stuff.


----------



## Rogueone (Jan 29, 2004)

tomcrown1 said:


> http://www.sonyhdvinfo.com/article.php?filename=1080i-vs.-720p
> Please read this link from Sony as this is the true story behind 720p and 1080I


pretty good read. but the guy is sorta wrong on one comment

"However, because the interlaced half-frames differ in time by 1/60sec, subjects moving rapidly will appear doubled or blurry if one "froze" the video as when hitting Pause (or taking a screen snapshot). When viewed normally, high speed motion will still appear to be very smooth and rest of the scene will be with high detail due to the high resolution."

The "frame" is sent comlete, there is no "shift" of 1/60th a second between the 2 halves of the same frame. 1 of the 30 frames is cut in half, and displayed in 2 parts. Where does one come up with the idea that the original would look "time shifted" as long as both halves are of the same frame? urg this type of thing drives me nuts 

But he is half right, he just stated it wrong. IF you were to pause at a point where the 2nd half of 1 frame and the 1st half of another were being drawn, THEN you'd MAYBE see his "time shifting". But you'd have to have some seriously fast motion if you think about it.

If you are watching normal TV like CSI or Lost or 24, etc., most of what you see is "normal" motion. Even in most sporting events, there is less fast action than normal action. Basketball, hockey and soccer sorta, car racing - these are sports with constant motion, but depending on camera angles, zoom distance, etc., the action may or may not be "fast" from a drawing perspective. A slap shot would definitely be "fast", or a camera looking down the start/finish line watching the cars whizz by in a blur, haha.


----------



## bhenge (Mar 2, 2005)

Hey, I didn't mean to restart an old war, but the discussion is great. When I made my comment, we were talking about the native resolution of displays, not which resolution provided a 'better' or higher quality picture once you construct a full frame of data 30 times per second (24 for film). The point I tried to make was that if you feed a 720p signal to a native 1080i display, you would need a scaler to downconvert the 720p signal to 540 lines every 1/60th of a second. In other words to display in 1080i, images in 720p must be "downconverted" by eliminating 180 lines (or 25%), every 1/60th of a second. This is because a native 1080i screen can only display (is buffer a better word?) 540 lines per 1/60th of a second to display 1080 lines per frame. 720p images, however, display 720 lines every 1/60th of a second. My older Pioneer Elite RPTV is native 1080i... it won't even display a 720p source because the TV's internal scaler cannot convert the signal, I need the scaler in my 811 to do that. If this is wrong, please let me know. ;-)


----------



## jrb531 (May 29, 2004)

Ok Mr. Insult answer this:

You are telling me that my good old Projection TV that does 1080i really is displaying a full 1080 scan lines "at one time" and for some reason it cannot do a lowly 720?

Hmmm it can do SD, VCR, DVD and even 1080i but not 720?

Why? You see I had always thought that the reason was that in 1080i the projection set only had to draw 540 lines per pass and this is why no 720 but I guess I am wrong.

Silly me.

LCDs have a fixed resolution, so an "LCD" 1080 does have 1080 lines - this I understand, which is why most LCDs only go up to 720. But what about CRTs and projection sets that do 1080i... do these all have 1080 lines of resolution?

-JB

Quote:

1080i indicates a frame composed of 1920x1080 pixels, usually at 60 interlaced fields per second. This means that there are actually 30 full complete 1920x1080 frames per second made up of two half-frames each 1/60th of a second. The half-frames alternate between the even-numbered horizontal lines and the odd lines. Upon viewing, the two half-frames are seen as a whole entire frame, although they differ in time by 1/60th of a second.

Hmmm, still seems like a 1080 "i" only draws 540 each pass. Guess I'm just too stupid to understand.


----------



## Oompah (Feb 8, 2006)

Thanks for all the replies, folks! I've been working late the last few nights and need to do more than skim over them to do them justice.

Please, gents... I *did not* want to restart some old arguments when I asked the question!

More later. I'm tired.


----------



## Rogueone (Jan 29, 2004)

bhenge said:


> Hey, I didn't mean to restart an old war, but the discussion is great. When I made my comment, we were talking about the native resolution of displays, not which resolution provided a 'better' or higher quality picture once you construct a full frame of data 30 times per second (24 for film). The point I tried to make was that if you feed a 720p signal to a native 1080i display, you would need a scaler to downconvert the 720p signal to 540 lines every 1/60th of a second. In other words to display in 1080i, images in 720p must be "downconverted" by eliminating 180 lines (or 25%), every 1/60th of a second. This is because a native 1080i screen can only display (is buffer a better word?) 540 lines per 1/60th of a second to display 1080 lines per frame. 720p images, however, display 720 lines every 1/60th of a second. My older Pioneer Elite RPTV is native 1080i... it won't even display a 720p source because the TV's internal scaler cannot convert the signal, I need the scaler in my 811 to do that. If this is wrong, please let me know. ;-)


what I keep trying to get you to understand is that is not the case, most of the time (sporting events maybe, normal HD it's not). 720/60p does seem to be the camera speed for most sporting events, but again, that is still useless during normal playback as the eye only sees 30 of them (but great for slow motion/stop action). normal TV is 24fps according to some data I'll post next.

In any case, the 1080i set isn't displaying half of one frame then half of another. This is where you keep stumbling. 1080i sets take a full 720p frame and simply reproduce the 2 halves in 2 passes. As you'll read shortly, you will see 1080 lines, not 540. So NO it is not accurate to say 1080i only displays 540 at a time. Now, sure the 1080i drops every other frame from a 720/60p since it can only show 30 fps, but again, you couldn't see more than 30 of the 60 being shown anyway
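For what it's worth, the "2 passes, same frame" idea is easy to sketch: weave deinterlacing just interleaves the two fields back into the full frame (a toy 6-line "image" here, nothing real):

```python
# Toy weave deinterlace: a progressive frame is split into two fields
# (odd and even lines) for transmission, then woven back together.
# If both fields come from the same source frame, the rebuild is exact,
# which is the point being argued above.

frame = [f"line{n}" for n in range(6)]   # stand-in for a tiny 6-line image

top_field = frame[0::2]        # lines 0, 2, 4 -- first pass
bottom_field = frame[1::2]     # lines 1, 3, 5 -- second pass

rebuilt = [None] * len(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

print(rebuilt == frame)        # -> True: no lines are lost in the weave
```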


----------



## Rogueone (Jan 29, 2004)

jrb531 said:


> Ok Mr. Insult answer this:
> 
> You are telling me that my good old Projection TV that does 1080i really is displaying a full 1080 scan lines "at one time" and for some reason it cannot do a lowly 720?
> 
> ...


I'm not insulting you, I am informing you that your understanding of the technology is in error.

your CRT does not do 720p because of the way the phosphors work. Here is a snippet from this LINK:

> Of the three HDTV formats that are at the bandwidth limit, two won out. ABC and FOX for the over-the-air (OTA) networks and ESPN and ESPN2 for the cable/satellite networks chose 720p/60fps, and NBC, CBS, PBS, WB, UPN and independents along with the other cable/satellite networks chose 1080i/30fps. Here is that darn interlaced format that is technically inferior again; why? In a word, or phrase, CRTs. The CRT-based television was based on an interlaced scan system from the beginning of TV. Basically, the cost to provide interlaced video on a CRT is much less than the cost to provide progressive video on a CRT. Additionally, the persistence of phosphors had evolved to a point where interlaced video was more than acceptable for HDTV use. It was either force HDTVs to be even more expensive than they are for CRT-based units or allow the interlaced format to spur on faster acceptance. Obviously the cost factor won out.
> 
> But what of the argument that with 1080-line interlaced video there are only 540 lines of video being displayed on the screen at any given time? If you have poked about on the internet much exploring this subject, I'm sure you have seen this claim. In short, it is a false claim. The claim of only 540 lines of video is based on the fact that the odd lines are scanned on one pass, or field, and then the even lines are scanned. What is forgotten here is a couple of things. First, on a CRT, the persistence of the phosphors I mentioned before. Persistence is the ability of a phosphor to glow for a time after the electron beam has moved on, sort of like the glow of a filament in an incandescent light for a while after the electricity is turned off. This persistence is what keeps the first 540 lines of video lit while the second 540 lines of video are being painted. Now it is true that the prior scan of video will not be as bright as the current scan, but our brains will average this out, which is why TV works for most humans and some dogs even. In short, the phosphors provide the deinterlacing on the screen itself.
> 
> Now move to fixed-pixel displays like plasmas, LCDs, LCoS, DLPs, SEDs, etc. and you have a completely different matter. These displays are progressive in nature, and any interlaced video fed into them will be deinterlaced by combining the two fields into a common frame for display. In the case of 1080i/30fps, a video memory image of 1920x1080 is created and then scaled to the resolution of the display and displayed at the refresh rate of the display. Since most, if not all, fixed-pixel displays refresh 60 times per second, each deinterlaced frame would be displayed twice. If the display has a resolution of 1920x1080 pixels, then the full HDTV resolution will be displayed, obviously not just 540 lines of video.

then there is this on the actual recording of the programming:

> Another issue to discuss when talking about the difference to the viewer between 1080i/30fps and 720p/60fps video is the source of the video. If the source of the interlaced video is the same frame for both the odd and even lines, such as it would be for movie frames and progressive cameras, the deinterlacing will reconstruct the progressive frame back to the original. Movies are shot at 24fps, and even if displayed on a 60fps display, the effective frame rate will still be 24fps, so having a 720p/60fps signal and corresponding display does not help at all. In fact, the efficiency is not as good, as a lot of data is redundant. The 1080i/30fps matches up for 24fps video with half as many redundant frames. Most prime-time HDTV shows are also 24fps, so the only case where the 60fps would offer an improvement would be when the source is also 60fps, such as sporting events.
> 
> Also there is the interlace artifact, where the object moves in the 1/60th of a second between the odd lines being scanned and the even lines being scanned. This was important back in the days of iconoscope cameras, which were interlaced in the capture the same as the CRT tubes used for display, because these cameras had the same constraints as CRTs as far as interlaced video is concerned. Modern CCD solid-state cameras use a matrix of pixels to capture the images as a full frame, and the pixels are shifted out of the captured image matrix electronically. No longer is it necessary to have a different frame between the odd and even scans, and these naturally progressive cameras are making the classic interlace artifact a thing of the past. Remember, if the two passes are made from a common frame capture, the reconstructed image will end up progressive, even if the transmission is interlaced.

Keep in mind that even if a sporting program is shot at 60fps, the brain can only see 30. So, when a 720p signal is "upconverted" (not downconverted), the "frame" is multiplied by 3, then divided by 2 (hence the term 3:2 or 2:3 pulldown, whichever it is). This transforms the frame from 720 to 1080, THEN the converting chip sends out the 1080 frame in 2 passes. It's still the same frame!! And since YOU can't see more than 30fps anyway, the fact that it tosses out every other frame is irrelevant, since you can't "see" those anyway. And when you do slow-mos, all 60 "frames" are still there and displayable, as the hard drive of the DVR didn't stop storing the 720p program. (This is a lot like video gaming, where there are video cards capable of running games at 300+fps now, but at 300fps the game isn't any smoother than at 35/45/55/60. For a long time, 60 has been the magic number for online play, as it allows for a 30fps drop in framerate before you "notice" stutter from the video card being overwhelmed.)

In both cases, you will "SEE" 30 frames per second, as that is ALL your brain can process. If the initial image was a 24p recording, as the notes state, you are repeating twice as many frames in 720p as in 1080i, and 1080i is displaying more than DOUBLE the pixels per second. For 60fps-recorded programs, while this leaves 720p only about 11% fewer pixels per second than 1080i, you can see all 100% of the 1080i pixels, while you can only "see" 50% of the 720p, since your brain can't detect every other frame. It's like the adage: if a tree falls in the forest and no one hears it, did it make a sound? The answer: it doesn't frackin' matter, cause if you WERE there to hear it, you would have. If a TV frame is shown on screen and your brain can't detect it, was it really there? hmm
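Just multiplying the format numbers out (raw pixel rates only; this says nothing about compression or what your eye actually does with them):

```python
# Raw pixels per second for each format, ignoring compression and
# ignoring what the eye/brain can actually resolve.

p720_60 = 1280 * 720 * 60    # 55,296,000 px/s
i1080_30 = 1920 * 1080 * 30  # 62,208,000 px/s (30 frames, sent as 60 fields)

shortfall = 1 - p720_60 / i1080_30
print(p720_60, i1080_30)
print(round(shortfall, 3))   # -> 0.111, i.e. 720p carries ~11% fewer px/s
```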

I'm not saying all this to say 720p looks bad, I'm saying it to help you understand your assertions about 1080i are wrong, and that except for programming recorded in 60fps, which is limited to some sporting events, 1080i will have the better picture. And in any case, 1080i will always have the higher-resolution picture. As to the question of why CRTs don't normally do 720p, that article explains it well. Expense.

I'll give a fast example. In 1995, I worked at a company that sold conference-room equipment. We sold 35" Mitsubishi tube "monitors". They were over $5000; cost was about $3500 to $4000. Circuit City et al. were selling the same 35"-sized sets for under $2000. Why the difference? The CC model only did TV signals; the one we sold could handle anything short of an SGI computer, so up to about 80 kHz horizontal (which is somewhere around 1600x1200 or maybe closer to 2000x-something). SGIs went over 100k, and it took extremely special monitors to handle SGI computers.

As the article stated about the phosphors as well, they are nowadays staying lit for nearly 1/30th of a second for interlaced TVs. What happens to the picture if you suddenly dump 720/60p into that set? It's blurry as hell. The phosphors would be lit twice as long as necessary, and the picture would blow chunks. So you'd have to do 720/30p or 1080i. And since 1080 only requires a horizontal/vertical combo speed able to draw 540 lines per pass, it's cheaper to make, and the same phosphors that work for 480i work for 1080i. Does it make any more sense now? 720/60p isn't possible on a 480i-based CRT. And 720/30p would cost more for an inferior picture in most cases


----------



## Rogueone (Jan 29, 2004)

this might be a fun read too. This is an FAQ page for a Panasonic HD camera that appears to be the type your local TV news crew might take into the field


----------



## olgeezer (Dec 5, 2003)

The quote seems a bit strange. Help me. I thought 720p was either 30 or 24 frames, and currently 1080i was 60 fields per second. I thought that 3:2 pulldown had to do with scaling of film. Line doubling would cause blurring of images on an interlaced display, so the frames are displayed in fields as such: the first 3 times, the second twice, the third 3 times, the fourth twice, and so on until the second is completed with 60 fields and 24 frames
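That 3-2-3-2 cadence works out exactly; here's a little Python sketch (my own toy function, just counting fields):

```python
# 3:2 pulldown: mapping 24 film frames onto 60 video fields per second
# by showing frames for 3 fields, then 2, alternating.

def pulldown_field_count(film_fps=24):
    """Total fields produced in one second of 3-2-3-2 cadence."""
    return sum(3 if i % 2 == 0 else 2 for i in range(film_fps))

print(pulldown_field_count())  # -> 60 (12 frames * 3 + 12 frames * 2)
```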


----------



## bhenge (Mar 2, 2005)

Rogueone said:


> In any case, the 1080i set isn't displaying half of one frame then half of another. This is where you keep stumbling. 1080i sets take a full 720p frame and simply reproduce the 2 halves in 2 passes. As you'll read shortly, you will see 1080 lines, not 540. So NO it is not accurate to say 1080i only displays 540 at a time. Now, sure the 1080i drops every other frame from a 720/60p since it can only show 30 fps, but again, you couldn't see more than 30 of the 60 being shown anyway


Thanks for all the time and effort to explain this stuff and try to get thru my thick skull. I do realize that in a native 1080i system that the display of a frame will be 1080i, not 540, my concern was what the electronics (scalers? interlacers? deinterlacers?) did with the 720p signal before displaying it on a 1080i device, (regardless of how many frames our brains can process). I had always thought that 3:2 (or 2:3) pulldown (or telecine) was the process of converting 24 frame per second film to 60 field per second (30 frames) video and did not realize that it was also used to interlace a 720p signal. Thanks. BTW, a quick look at pulldown can be read at:

http://en.wikipedia.org/wiki/3:2_pulldown


----------



## jrb531 (May 29, 2004)

Thanks for the more civil response.

I still have that one burning question in my head if you would:

Why does my Rear Projection HD set "only" display 480 and 1080i and not 720?

Surely if what you say is true about all these sets running 1080 as the "real" resolution then any set that can do 1080i can do 720p?

Why do we not see many 1080p even if the boxes cannot do them? If the resolution is really 1080 and not 540x2 then I see no difficulty having a set paint 1080p - isn't the issue with 1080p more with the tuner and not the set?

Thanks for your explanation of the current system. Some I knew, some was new to me but these burning questions are still in my mind.

-JB

P.S. Some might say that rendering in two passes "is" only doing half the resolution at one time, but I guess that is wrong


----------



## olgeezer (Dec 5, 2003)

That sounds like an RCA. All displays are supposed to display all 18 digital signals. In some cases this calls for some type of conversion. I do recall some RCA sets that were 720p sets that wouldn't display a 720p signal. Talk about strange.


----------



## tomcrown1 (Jan 16, 2006)

http://www.audioholics.com/techtips/specsformats/displaytechnologiesguide.php
Please read this link, as it is a very good article on all the displays available today.


----------



## jrb531 (May 29, 2004)

tomcrown1 said:


> http://www.audioholics.com/techtips/specsformats/displaytechnologiesguide.php
> Please read this link as it is a very good article on all the displays available today.


Quote from article:

Rear projection TVs typically utilize 7" CRT guns, with some of the higher-end models using 9" guns. 7" guns can typically resolve about 700-800 lines of resolution. The high end 9" guns can do upwards of 900 lines. Typical direct view televisions deliver just over 600 lines of resolution. Most RPTVs have at least 30Mhz of video amplifier bandwidth, which is good for just under 720p or 1080i. Better models have upwards of 75Mhz. Most direct view televisions have 20Mhz video amplifiers, with some higher-end units extending above 30MHz.

Here we go back to what I was saying and got blasted for LOL - At least on Rear Projection HD sets they cannot do a full 1080 in one pass so they must interlace two 540 line passes. Of course I was shot to all hell so I still must be understanding this wrong 

-JB

P.S. My Samsung 1080i set is about three years old so I'm pretty sure it only has 7" guns. Even the 9" guns, according to this article, cannot do 1080.


----------



## Rogueone (Jan 29, 2004)

bhenge said:


> Thanks for all the time and effort to explain this stuff and try to get thru my thick skull. I do realize that in a native 1080i system that the display of a frame will be 1080i, not 540, my concern was what the electronics (scalers? interlacers? deinterlacers?) did with the 720p signal before displaying it on a 1080i device, (regardless of how many frames our brains can process). I had always thought that 3:2 (or 2:3) pulldown (or telecine) was the process of converting 24 frame per second film to 60 field per second (30 frames) video and did not realize that it was also used to interlace a 720p signal. Thanks. BTW, a quick look at pulldown can be read at:
> 
> http://en.wikipedia.org/wiki/3:2_pulldown


well, the 3:2 pulldown I've seen all over likely came from film; I'd say that is correct. It's just that 3:2 also happens to be the ratio it takes to convert a 720-line image into a 1080-line image and back, so I might have hijacked the term since it seems to fit, haha. I hadn't noticed the 3:2 conversion term used many other places, and it seems like it's the same basic concept going on

(what follows is my 12-year-old understanding of this; if things have changed, someone please let me know hehe)

Anyway, if you have a 720-line image, how do you get it into a 1080-line image? It's like digital pictures, or CDs with oversampling, and the like. If you take the 720 and triple every pixel, you get an image with 2160 lines. Half of 2160 is 1080. (The same works for 1280: 1280x3=3840, and 3840/2=1920.) So in a digital world, you split the difference in each line. The 'chips' that do all this converting would need to look at pixels 1 & 2, determine if they were identical or not (1 & 2 will always be identical), then create the 1080 pixel 1 as a duplicate of 1 of them. Then it looks at pixels 3 & 4. 3 will be identical to 1 & 2, but 4 will be identical to 5 & 6, so the 'chip' needs to interpolate (I think that's the term used) the 2 lines, look at each pixel, and create a new pixel that is split between the two. Then the 3rd pixel is identical to 5 & 6. I'm likely oversimplifying this, but hopefully the general idea is there.

At least, this has always been my understanding for how digital data is handled thru compression and size changing etc.

Of course, going the other way, 1080 would get doubled to 2160, then 720 would need to pick off every 3rd pixel. So pixels 1 & 2 are the same, 3 & 4 are the same, 5 & 6 are the same from the 1080 image. 720 would take pixels 1, 4, 7, 10, etc. So right off, it either takes 2 of the 1080 pixels then drops the 3rd, or it would have to create an image of 720 lines and then go back and interpolate every line pair it had just generated into a new line (I doubt they do this, but if they did, it would be like: new line 1 is interpolated from 1 & 2, new 2 is 2 & 3, new 3 is 3 & 4). I'm pretty sure, like in photo resizing, the simplest and cleanest method is to simply drop every third pixel so that all the other pixels are identical to the original.

So, I'll say clearly, this is how I understand digital to work, and it's possible they've changed it over the years and I just never caught the message, haha. But I hope this makes sense and is still accurate. 1080 upconverts the 720 image and keeps 2/3rds of the pixels identical while needing to create 1/3rd of them as likely transitional pixels. 720, when downconverting 1080, is simply a copy of 2 out of every 3 pixels. And the "reasoning" behind this all is: at these resolutions and normal viewing distances, the human eye cannot absolutely detect the changes. It can sort of think it does, or maybe it'll think one is sharper and one softer, but unless you get close enough to "see" the pixels, you shouldn't be able to tell.
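Here's a toy 1-D Python version of that triple-then-halve idea (my own simplification; real scaler chips use fancier filters):

```python
# Toy line scaling at the 3:2 ratio discussed above:
# 720 * 3 / 2 = 1080 going up, and keeping 2 of every 3 lines coming down.

def upscale_3_2(lines):
    """720 -> 1080 style: repeat each line 3x, then average adjacent pairs."""
    tripled = [v for v in lines for _ in range(3)]     # e.g. 720 -> 2160
    return [(tripled[i] + tripled[i + 1]) / 2          # 2160 -> 1080
            for i in range(0, len(tripled), 2)]

def downscale_2_3(lines):
    """1080 -> 720 style: keep 2 of every 3 lines (drop each 3rd)."""
    return [v for i, v in enumerate(lines) if i % 3 != 2]

up = upscale_3_2([10, 20, 30, 40])   # 4 lines -> 6 lines, ratio 2:3
print(up)                            # -> [10.0, 15.0, 20.0, 30.0, 35.0, 40.0]
print(len(downscale_2_3(up)))        # -> 4 lines again
```

Note how 2 out of every 3 upscaled lines are copies of originals and 1 in 3 is an interpolated "transitional" line, which is the point made above.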

Of course, seeing the pixels is actually a problem when viewing LCD and plasma, as it's not that hard to see the "screen door" effect when a little too close, or when the size is a little too big (like my bud's 13'x7' projected 720p theatre; you just have to accept the fact you can see the tiny gaps between the LCD pixels so you can have that large an image haha). CRT has the same issue, but it seems the way the phosphors work, it's much less noticeable until you are much closer to the screen. And this would seem another reason why LCD and plasma would like to get to 1080 panels as quickly as they can; it'll reduce the screen-door look even more


----------



## Rogueone (Jan 29, 2004)

jrb531 said:
 

> Thanks for the more civil response.
> 
> I still have that one burning question in my head if you would:
> 
> ...


this is the last portion of the post on the 1st page


> As the article stated about the phosphors as well, they are nowadays staying lit for nearly 1/30th of a second for interlaced TVs. What happens to the picture if you suddenly dump 720/60p into that set? It's blurry as hell. The phosphors would be lit twice as long as necessary, and the picture would blow chunks. So you'd have to do 720/30p or 1080i. And since 1080 only requires a horizontal/vertical combo speed able to draw 540 lines per pass, it's cheaper to make, and the same phosphors that work for 480i work for 1080i. Does it make any more sense now? 720/60p isn't possible on a 480i-based CRT. And 720/30p would cost more for an inferior picture in most cases


phosphors have to "die out" just about the time they will be hit again by the electron beam. Since we are talking progressive, there would constantly be phosphors overly lit during each pass if you were using interlace-based phosphors (1/30th of a second light time). Keep in mind, there are 3 phosphors per pixel for CRT (red, green, blue) and 3 beams. If the first pass lights all 3 phosphors and the 2nd pass only lights red, what will you see on the TV? Strong red, with weaker blue/green mixing to whatever color their percentages of mix make - not the red you were supposed to see. For a 720/60p signal, the set would have to do 30p in order to not overwrite the phosphors like that.

And there is the expense of the electronics it takes to make 720 sweeps per pass instead of the 540 per pass for interlaced. With the same phosphor setup, the 720 pass would need to occur in 1/30th of a second, while the 2 540s would occur in 1/60th each, or 1/30th for the pair. Since CRTs need to be compatible with the 480i signal, they have to use the 480i phosphors, and 1080i uses the same phosphors. It would simply cost too much extra for the mainstream buyers to make the CRTs compatible with both 720p and 1080i


----------



## olgeezer (Dec 5, 2003)

I'm not sure I follow that.

http://www.tvtechnology.com/features/Tech-Corner/f_technology_corner-04.07.04.shtml


----------



## Rogueone (Jan 29, 2004)

olgeezer said:


> I'm not sure I follow that.
> 
> http://www.tvtechnology.com/features/Tech-Corner/f_technology_corner-04.07.04.shtml


haha i just saw that as well 

here is the basic CRT stuff from that link
*CRT RESOLUTION*Cathode ray tubes are completely analog devices, and unlike other display technologies we have discussed, they do not have discretely ad-dressed pixels that de-fine a native resolution. A color CRT has a matrix of red, green and blue phosphor dots, while a monochrome CRT has a continuous coating of a single [nom-inally] white phosphor over the entire screen. The resolution capability of a given color CRT depends on a number of factors-some of the important ones being the dot pitch (the distance between dot centers, which effectively defines the size of the dots themselves), how tightly the electron beam is focused and the electron beam's scanning speed. A phosphor dot is not a discrete pixel; the sweeping electron beam does not uniquely turn each individual dot on and off, as is the case with discrete pixel types of displays. Rather, as the beam scans across the face of the tube, the areas of phosphor that the beam strikes fluoresce with an intensity proportional to the beam's instantaneous amplitude. The size of the phosphor area illuminated by the beam depends on the focus of the electron beam and the distance between adjacent dots of the same color. If the electron beam is sufficiently well-focused, the tube will be able to resolve an area smaller than a single dot: We may think of a dot as a tiny area of continuous phosphor. If the electron beam's focus is sufficiently diffuse, on the other hand, it may simultaneously illuminate more than a single dot. The dot pitch plays a significant role in determining a color CRT's resolution capability, because it determines the distance between adjacent dots of the same color. If we consider just red dots, for example, one red dot is not directly adjacent to another red dot. There are, rather, green and blue dots that are to some degree positioned between the two red dots. The result is a gap between adjacent red dots, and the size of this gap influences the attainable resolution. 
The farther apart the red dots, for example, the less red resolution the screen is capable of displaying, and likewise for the green and blue dots. The smallest dot pitch found in a typical top-quality monitor tube is about 0.22 mm, or about 4.45 dots per millimeter, which is about 115 dots per inch. Such a display with a width of 36 inches has about 4,156 dots in a horizontal line. It is safe to say that it is possible for a CRT monitor to display the resolution of HDTV, either 1280 x 720 or 1980 x 1020, if the proper conditions of dot pitch, electron beam focus and scanning speed are met. It must also be pointed out that many CRTs, sometimes even those found in nominally HDTV displays, are not capable of displaying a resolution of 1920 x 1080.
​Then it concludes with this about Projection TV CRTs:CRT projectors are subject to the conditions and caveats enumerated above for direct-view CRTs, with the recognition that the tubes used for projection are smaller than those used for direct view. Projector CRTs have round imaging screens, and are monochrome tubes with a continuous phosphor coating across the screen, not phosphor dots separated by a shadow mask. CRT projectors have resolutions up to 1920 x 1080, but it should be noted that they are scanned at 1080i.
​I can't believe this didn't dawn on me before this clearly. A Direct view TV is greatly limited by it's "shadow mask". This mask is there to align the 3 beams onto specific phospur areas. The mask does create a maximum resolution, while not a "native" one. But, with projection sets, there are 3 CRT's each a single color. As explained, those CRTs don't need a mask, and the only limiting factors to resolution are the size of the individual phosphurs, the beam size and the speed it can draw. With our 1080i tv's, beam speed isn't an issue, it only has to go fast enough for 540 lines per pass, and considering these units are "1080" line sets, they do have to have beams tight enough and phosphurs small enough to allow for 1080 lines. Since it's only 1 color each, as noted, that isn't all that hard.

I've seen and heard that 800-line limit many times and never understood where it comes from. Now I think I do. It's either based on old technology before HDTV became mainstream, or that limit comes into play for a 7" CRT if it is a 3-color tube with a mask. Dot pitch is obviously an issue with single-tube sets, but dot pitch is nonexistent for 3-tube sets. This is another example of bad information being propagated by many people to the point it becomes fact and no one any longer remembers where the concept originated. If indeed these 7" CRTs were limited to 700 lines or so, wouldn't they just have made them 720p sets? If you sold a TV as 1080-line capable and it could only do 800, you would very shortly be involved in a class-action suit. Too many techie people out there would jump on the lie if that were the case.
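The dot-pitch numbers in the quoted article are easy to check with simple arithmetic (a 0.22 mm pitch actually works out to ~4.55 dots/mm rather than the article's 4.45, but the dots-per-inch and dots-per-line figures match):

```python
# Dot-pitch arithmetic from the quoted article: 0.22 mm between same-color
# dot centers -> dots per inch -> dots across a 36"-wide screen.

MM_PER_INCH = 25.4
pitch_mm = 0.22

dots_per_mm = 1 / pitch_mm              # ~4.55
dots_per_inch = MM_PER_INCH / pitch_mm  # ~115.5
dots_across_36in = 36 * dots_per_inch   # ~4156.4, the article's "about 4,156"

print(round(dots_per_inch), round(dots_across_36in))  # -> 115 4156
```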


----------



## harsh (Jun 15, 2003)

Rogueone said:


> The "frame" is sent comlete, there is no "shift" of 1/60th a second between the 2 halves of the same frame. 1 of the 30 frames is cut in half, and displayed in 2 parts. Where does one come up with the idea that the original would look "time shifted" as long as both halves are of the same frame? urg this type of thing drives me nuts


With respect to NTSC video, the article is correct. This is one of the unfortunate compromises of "crystal clear" digital television. Digital television at the SD level is strictly for the benefit of the delivery company and the viewer gets something that, at its worst, appears to be half the frame rate.

I can remember the first time I saw a live shot with a brand new digital ENG camera on the "local" evening news. Being from Oregon, it was raining. I will always remember how distracting it was to see the rain drops stop and start further down the screen. The waterfalls are a lot less restful to watch in digital.

There is a reason that HDTV demos typically avoid waterfalls and fountains.
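The field-timing point harsh is making is easy to sketch. With a true interlaced camera the two fields of one "frame" are exposed 1/60 s apart, so anything moving sits at different positions in the odd and even lines; with a progressive capture split into fields, both halves come from the same instant. A hypothetical one-dimensional toy model in Python:

```python
# Toy model: an object moving 2 pixels per field period (1/60 s).
# Progressive capture: both fields of a frame are sampled at the same instant.
# True interlaced capture: the second field is sampled one field period later,
# producing the "combing" offset between the two halves of one frame.

def object_position(t_fields: int, speed_px_per_field: int = 2) -> int:
    """Position of the moving object at field-time t (in field periods)."""
    return t_fields * speed_px_per_field

# Frame from a progressive source, split into fields afterward: both at t=0.
progressive_fields = (object_position(0), object_position(0))

# Frame from a true interlaced camera: fields sampled at t=0 and t=1.
interlaced_fields = (object_position(0), object_position(1))

comb_offset = interlaced_fields[1] - interlaced_fields[0]
print(comb_offset)  # 2 -> the two halves of one "frame" don't line up
```

So both posters are right about different sources: Rogueone's "one complete frame" case is the progressive capture, and harsh's rain drops come from the true interlaced capture.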


----------



## Rogueone (Jan 29, 2004)

Haha, interesting, harsh. 

Yeah, one of those articles mentioned how the old cameras recorded in interlaced as well, so I can see where, in a case like that, a 1080i set wouldn't look overly good, and why some still think it's 540 in reality. Because, with the wrong camera, it would technically be 60 different images. But thankfully there aren't gonna be too many of those around anymore. 

After all this conversing, I feel better about my RPTV, but I still hate the damn issues with convergence and edge focus. If not for the cost I'd just replace the damn thing with a DLP and be happy, especially one of the newer 1080p models. But I just don't have that kind of money to burn, so I'm gonna ride this sucker to its death!! Haha, still 1 or 2 years left on the 5-yr warranty, so that helps too


----------



## jrb531 (May 29, 2004)

Rogueone said:


> Haha, interesting, harsh.
> 
> Yeah, one of those articles mentioned how the old cameras recorded in interlaced as well, so I can see where, in a case like that, a 1080i set wouldn't look overly good, and why some still think it's 540 in reality. Because, with the wrong camera, it would technically be 60 different images. But thankfully there aren't gonna be too many of those around anymore.
> 
> After all this conversing, I feel better about my RPTV, but I still hate the damn issues with convergence and edge focus. If not for the cost I'd just replace the damn thing with a DLP and be happy, especially one of the newer 1080p models. But I just don't have that kind of money to burn, so I'm gonna ride this sucker to its death!! Haha, still 1 or 2 years left on the 5-yr warranty, so that helps too


Funny you mention that LOL

I've gotten so good at manual focus that I can do it in about 30 seconds 

The old Samsung auto-focus never worked after a tech replaced one of the guns, which set me back $300 - a few weeks past the 1-year warranty. Why oh why did I not get the extended 3-year warranty... oh wait, I remember... it was $350 for 3 years LOL... I guess they got my $300 out of me one way or another 

-JB


----------



## Rogueone (Jan 29, 2004)

Hahaha. I do miss the old NEC convergence menus I used to use. Back in the '90s I installed NEC 3-gun projectors in places like NIH and the CIA and some Navy conference room in Crystal City (2 in that one, cool place). And I needed to converge each one for each video type so it could store the settings. 

It would take 4 to 8 hours to get perfect; I liked it to be perfect. But unlike this stupid Mitsu, that sucker allowed you to adjust every cross point and every type of skew, keystone, etc. Oh, and I could focus the lenses as well, but then again, the RPTV doesn't have variable distances from its screen hehe

What I find worst with my Mitsu are the corners. I just can't get them perfectly adjusted, as it doesn't allow manual manipulation there. Thankfully most shows are forgiving, but football is funny because I've gotten the lines across the top and bottom a little out of whack, and there is a wavy effect to the screen that is most noticeable when the field is shot from the side, so the lines run vertically the width of the image. Too lazy to get into that secret menu and redo it all, and Sears hasn't a clue how to fix what I've messed up haha


----------



## Antknee (Oct 13, 2005)

I recently bought a 50" plasma. What I have realized is that you have to take a look at what HD sources you have available and how much you will actually watch. For me, even with the Dish HD pack, there isn't much I really want to watch once the initial coolness factor of HD wears off. 
If you live in a big market and can receive OTA HD or sat HD locals, great. But if you can't, or you mainly watch SD channels such as History, Sci-Fi, etc., you will be better off going with a smaller set, maybe 42", and maybe even EDTV. 
Why EDTV? Because its resolution more closely matches SD programming and even DVD. Keep in mind that DVD is not an HD source. SD programming is notorious for looking bad on HDTV sets, especially larger ones. Most reviews say that EDTV looks great and that it is hard to tell the difference unless the sets are side by side and showing high-res HD. 
In a few years, when HD is more prevalent, you can buy a brand new HD set. Also, maybe the resolutions will be more standardized. And you will have the EDTV as a nice second set.
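The scaling argument here can be put in numbers. DVD and NTSC SD carry 480 active lines, so an EDTV panel shows them with no vertical scaling at all, while HD panels have to interpolate lines that were never in the source. A rough sketch in Python (the panel figures are just typical examples):

```python
# Vertical scale factor needed to map 480 active lines (DVD / NTSC SD)
# onto some typical panel heights. EDTV displays DVD 1:1 vertically;
# HD panels must invent the extra lines by interpolation.

SOURCE_LINES = 480  # active lines in DVD / NTSC SD

panels = {"EDTV (852x480)": 480, "720p panel": 720, "1080-line panel": 1080}

factors = {name: lines / SOURCE_LINES for name, lines in panels.items()}

for name, f in factors.items():
    print(f"{name}: x{f:.2f}")
# EDTV (852x480): x1.00
# 720p panel: x1.50
# 1080-line panel: x2.25
```

The bigger the scale factor (and the bigger the screen), the more visible the scaler's guesswork, which is a fair part of why SD can look soft on a large HD set.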


----------

