720p vs. 1080i - Which is better?

 

New member
Username: Hilltom

Post Number: 1
Registered: Sep-04
Can anyone explain to me whether it is wise to buy an HDTV that only gets 480 and then skips to 1080i? Is 720p necessary? Should I really get one that does both, because I have a great deal on one that only gets 1080i. Thanks.
 

xvxvxvx
Unregistered guest
Currently, no display actually shows more than one native HD resolution. If you purchase a CRT, it will natively display 1080i. If you purchase a fixed-pixel display such as a DLP, LCD, or LCoS set, it will display 720p.

As for an HDTV that only gets 480, that does not make any sense. Please list the exact TV you are considering, including manufacturer and model number.

xvxvxvx
 

Omaha
Unregistered guest
I personally think 720p looks a lot smoother when it comes to fast action. You still get flicker with 1080i because it is interlaced. 720p draws all 720 lines per 1/60th of a second; 1080i only draws every other line (540 lines each pass) per 1/60th of a second. If you want to watch a lot of football, 720p is definitely the way to go if you can afford a fixed-pixel display (LCD, LCoS, DLP, D-ILA).
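
A quick back-of-the-envelope comparison of what each format delivers per pass (a rough sketch in Python using the standard ATSC frame sizes; treat it as illustration only):

    # 720p paints a full 720-line frame every 1/60 s; 1080i paints one
    # 540-line field every 1/60 s and needs two fields for a full frame.
    FIELD_RATE = 60  # passes per second for both formats

    p720_lines_per_pass = 720           # full progressive frame each pass
    i1080_lines_per_pass = 1080 // 2    # one interlaced field each pass

    print(p720_lines_per_pass * 1280 * FIELD_RATE)   # ~55.3 million pixels/s
    print(i1080_lines_per_pass * 1920 * FIELD_RATE)  # ~62.2 million pixels/s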
 

Anonymous
 
A 480p picture is ED (enhanced definition), not HD. If you have a progressive-scan DVD player, you can get 480p; otherwise you have 480i. I bet the TV Tom was talking about is a CRT HDTV that displays HD in 1080i (it converts 720p signals to 1080i as well) and SD in 480i. 720p is better for fast-moving pictures such as sports, but if it is not a big screen, you would hardly notice the difference.
 

New member
Username: Hilltom

Post Number: 2
Registered: Sep-04
Sorry, it doesn't only get 480; it also gets 1080i, and yes, sports is the main reason I want it. It is a 34" Panasonic CRT. Do they make CRTs that will take 720p, and is it that big a difference, worth paying a lot more for when I can get a good deal on the one that only receives 1080i? Is the flicker that bad?
 

New member
Username: Curbina

Post Number: 1
Registered: Mar-05
How can people say that 720p looks better than 1080i? I have a Sony KD-30XS955 (WEGA Super Fine Pitch CRT), and every Monday I watch "24" on FOX (720p) and then switch to "CSI: Miami" on CBS (1080i), and CSI looks much, much better than 24. Even the NFL football games on CBS looked better than the ones on FOX, even though a lot of people say 720p is better for sports; I don't see it. Just face it: 1920 x 1080 has more resolution lines than 1280 x 720. It is as simple as that. Even though the 1080 lines are interlaced and all you see are 540 lines each field, that is still more lines than 720p. 1920 x 1080 = 2,073,600 pixels; 1280 x 720 = 921,600 pixels; 1920 x 540 = 1,036,800 pixels.
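
For anyone who wants to check that arithmetic, here it is as a tiny Python sketch (purely illustrative):

    full_1080  = 1920 * 1080   # 2,073,600 pixels in a full 1080 frame
    full_720   = 1280 * 720    #   921,600 pixels in a full 720 frame
    field_1080 = 1920 * 540    # 1,036,800 pixels in one interlaced 1080i field
    print(full_1080, full_720, field_1080)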
 

Silver Member
Username: Dmwiley

Post Number: 168
Registered: Feb-05
Carlos, people can say that because it depends on the source, the display device and their visual acuity. In some situations, 720p does look better, despite the technical info you provide. I did not know that you could purchase a digital tv that only displays 480. Are you sure about this?
 

New member
Username: Vikingknut

Post Number: 10
Registered: Mar-05
Carlos,

Some of the reason you may see a better picture with 1080i is that your TV is CRT based. Unlike micro-display technologies such as LCD, DLP, and plasma, CRT televisions generally do not handle non-interlaced HD signals very well. Therefore, on CRT-based sets, people tend to prefer 1080i over 720p. In my case (a DLP set), 1080i almost always looks worse than 720p. Now you could argue that, with only 720 lines of native resolution, of course 720p will look better: it has the same number of lines as the display and is non-interlaced. However, there is more to it than that. It is my belief (albeit backed by many professional video folks) that even if you had a TV that supported 1080p (and therefore both 1080i and 1080p natively), the 720p picture would look better on any program with even a moderate amount of movement. Sports, for sure, and even most movies. Material that stays fairly static on the screen (beautiful vistas, talking heads, etc.) may look better in 1080i.

Now, as I said in the beginning, this will probably not be the case for those watching CRT-based HDTVs.
 

frankie nj
Unregistered guest
I just bought a Mitsubishi WD-52327 DLP TV. The only HDTV format it supports is 1080i. The manual says a 720p signal must be converted by the DTV receiver before the TV can display it. I am getting Comcast HDTV service, but I'm unsure whether the box is capable of the conversion. Am I screwed? Do I need a special conversion component? Would appreciate some help.
 

paulm
Unregistered guest
Frankie,

Usually the cable provider's set top box will allow you to switch between resolutions - the DCT5100, for example, can go between 1080i, 720p, 480p and 480i.
 

Anonymous
 
I'm kinda' new at this. Why would you want your HDTV to display in 480i or 480P if you have 720 and 1080 as options?
 

Silver Member
Username: Paul_ohstbucks

Post Number: 723
Registered: Jan-05
It depends on the signal source.

A low-def source will always be 480 regardless of what your TV is capable of. If you send it a junk signal, that's exactly what the TV will output.
 

HD Guy
Unregistered guest
I have another reason to display 480i or 480p. My Hitachi 57S700 offers two display modes (16x9 and 16x9 zoom) for 720p or 1080i inputs and five modes (16x9, 16x9 zoom, 4x3, 4x3 stretch and 4x3 zoom) for 480i/p inputs. I would choose the 480p options when watching SDTV material and use the 4x3 mode because I don't like distorted pictures.
 

Rob in Orlando
Unregistered guest
Thanks guys, I posted as Anonymous at 11:18 am.

The reason I ask is as follows: my Samsung 32" flat panel LCD HDTV accepts the four resolutions (480i, 480p, 720p, 1080i). I have a Scientific Atlanta 8300HD set top box from Bright House (Time Warner). In the box's options, I can select the resolutions that my TV can display. However, when I have all four resolutions checked, it seems to flicker and jump a lot when switching from a resolution on one channel to a different resolution on another channel. It flickers so much, it almost looks like it is damaging the display. Once the new channel comes up everything is fine. I can tell by the box that it does switch to match the resolution that the channel is broadcast in unless I uncheck one of the resolutions.

What is my strategy here? Should I just select 1080i and not select the other three resolutions, or should I select all four and just deal with the momentary jumping and flickering? Maybe just select 1080i and 720p? Are there pros and cons?

Thanks for reading, I'm sure I don't sound very technical. I appreciate any help.
 

Silver Member
Username: Paul_ohstbucks

Post Number: 730
Registered: Jan-05
Rob.......

Yes, your best bet is to leave it on one mode as the default rather than have your TV detect and change resolutions all the time.

Set it at either 1080i or 720p (whichever mode you prefer) and just leave it that way.
 

Jetson
Unregistered guest
Rob:

I have the Scientific Atlanta 3250HD, and I have set my TV to accept 1080i and 480p (because my TV won't display 720p).

In your case I would set the TV to accept 1080i, 720p, and 480p. This will allow you to watch your HD channels in the mode they are broadcast in (i.e., your TV won't upconvert ESPN HD to 1080i).

Also, your non-HD "digital" channels are broadcast in 480p, and your analog signals can be converted from 480i to 480p, so choosing 480i as an option is worthless. The 480p picture of your analog channels will look much better than 480i.

Bottom line, try setting your STB to output 1080i, 720p, and 480p, and see how that works. You will still get the flicker, but it only lasts a brief second, and you probably don't jump back and forth between HD and non-HD that much.

Hint: Make sure you select 480p "standard" or "4:3". Since these channels are broadcast in 4:3 and not 16:9, your TV will automatically stretch the picture to fit the screen (assuming you have a 16:9 TV). This also lets you use the "aspect" button on your TV remote to change the screen size.

If you choose 480p widescreen, the box will output your HD channels in 480p; at least it did on my TV. I learned this the hard way.
 

Troy Heagy
Unregistered guest
"CRTs do not display non-interlaced signals very well."

.

Funny. My CRT seems to display my non-interlaced computer screen just fine. (I think your statement is in error.)
 

Rob in Orlando
Unregistered guest
Jetson:

I really appreciate you and Paul's help. It is great to talk to people who understand all this stuff! Most of the time, I feel like I'm educating Bright House.

Based on what you both said, I have it set to 720P and 1080i only. I actually do a lot of jumping back and forth between HD and non-HD, so this helps prevent the constant flickering. What if anything am I losing by not having 480P selected?

Thanks again!
 

fx
Unregistered guest
"Funny. My CRT seems to display my non-interlaced computer screen just fine."

Actually it doesn't at all. Your CRT display downscales the progressive signal, interlaces it and then displays it in an interlaced format.

xvxvxvx
 

Troy Heagy
Unregistered guest
Nonsense.

My Magnavox computer monitor (CRT) displays up to 1600 x 1280 resolution, progressive scan. And it looks beautiful. "CRTs do not display non-interlaced signals very well," is a false statement.

troy
 

HD Guy
Unregistered guest
Although most CRT based HDTV's have 1080i as native, they CAN display progressive format (e.g., 480P, or similar for DVD).
 

Jetson
Unregistered guest
Rob:

By not selecting 480p, you lose the ability to display the picture in its native format, which reduces viewing quality. My TV (Panasonic 47WX53) automatically "fills" the screen when watching a non-HD channel. If you only select 1080i and 720p, you will have to use your Comcast remote to "stretch" the picture when watching non-HD channels. Usually a TV does a better job of filling the screen than an STB does of stretching it to remove the bars on the sides.

I would try it each way for about two weeks and make a decision based on what works best for you. Remember, it takes time for your eyes to adjust to your TV after you make adjustments, so give it some time.
 

Silver Member
Username: Joe_c

Oakwood, Ga

Post Number: 230
Registered: Mar-05
Dale, do you prefer 720p over 1080i?
 

Anonymous
 
Question, if anybody can help:

I have a Panasonic 60" LCD HDTV monitor.
The Comcast HD box goes DVI out to HDMI in.
I can choose one output setting from the box; should I choose 720p or 1080i?
When I choose 720p, my HD programs are in 720p;
when I choose 1080i, my HD programs are in 1080i.
Should I just manually switch them all the time, like for sports, etc.?
Sorry if I ask a silly question.
 

RoyBrown
Unregistered guest
"I have Panasonic 60" LCD HDTV monitor"
Your TV is probably 720p native, so you should probably select 720p. This way you will see 720p channels natively. For 1080i, the selection really just affects which device does the conversion from 1080i to 720p. If you select 1080i, the TV does the conversion. If you select 720p, the cable box does the conversion. The output quality will depend on which device has the better scaler. Changing selections based on channel sounds like a pain, so I'd be inclined to leave it on 720p all the time unless you notice a significant difference in quality.

720p and 1080i are pretty close in native quality as far as I can tell, but either one will look worse converted than the other displayed natively. There is no reason a CRT can't display 720p and 1080i natively the way computer monitors do, but my understanding is that most HD CRT televisions display 1080i natively and convert 720p to 1080i or 480p. This might explain Carlos's issue.
 

New member
Username: Greggiebuffet

Alpharetta, Ga USA

Post Number: 1
Registered: May-05
I have the Sony KP-46WT500 and am concerned that some of the 720p HDTV signals are being downconverted to 480p, since only 1080i input is displayed at 1080i. Does this mean I'm not getting true HDTV? I have Comcast with their Motorola two-tuner HDTV/DVR. Any suggestions as to what I should do or be looking for?
 

fx
Unregistered guest
Set the internal settings of your DCT-6412 to output 1080i if that is what you truly believe. However, if it is like most Sony CRTs, it will not even accept a 720p input, much less downconvert it to 480p.

You are probably confusing it with an SD broadcast on a channel that only occasionally transmits an HD signal. Be sure to understand that not all programming is broadcast in HD, even on a channel that calls itself the HD channel.

xvxvxvx
 

gEmRN
Unregistered guest
I have a Panasonic PT-50LC13 50" HDTV set. It does 480p, 720p, 786p (or something like that), and 1080i. I just want the BEST picture and sound.

1. Which resolution is the BEST? 1080i is interlaced, isn't it? So shouldn't it be not as good?

2. I just bought a hi-def DVD player a month ago that has a DVI cable... will this be obsolete soon? I heard there will be HD DVD technology by Toshiba and Blu-ray by Sony... will I need to get a new DVD player, or will the one I have be able to play the new format?

3. Sometimes when I watch normal TV (not hi-def channels) the show says "WIDESCREEN: BROADCAST IN HI DEFINITION WHERE AVAILABLE," but on my TV the show looks just as shitty as the rest of the shows. Why?
 

New member
Username: Phialpha

North Canton, OH

Post Number: 4
Registered: Apr-05
gEmRN,
1) The 720 vs. 1080 argument just never dies. There are fanboys who will argue until they are blue in the face for each format, but does it really matter? Just be glad you have HD. Seriously, watch a game in SD and you'll wonder how you ever enjoyed TV before HD. Your set has a native resolution of 1280 x 720, which means any HD signal you feed it will look great. End of story. Its inputs accept 480/720/1080 signals, and it converts everything to 720p. Is 1080i better? Is 720p better? It's HD - don't worry about the rest!

2) Your DVD player probably converts the DVD's 480p signal to a higher-resolution format by way of a built-in line doubler or scaler. Yes, there are two new DVD formats on the way, both with higher capacity and therefore more information to display. There are rumors of combining both standards into one (which would be better for consumers), but as of Thursday, May 19th, 2:15 PM EST, there are two formats. Yes, it will probably be a rehash of Betamax/VHS for the consumer market. Your DVD player will not be able to play Blu-ray/HD DVD discs unless those discs are encoded with today's DVD standard. HD DVD might be backwards compatible, but your DVD player will not be able to decode the full HD DVD standard.

3) Tell us more about your cable/satellite system. What market (city) are you in? Are you using an HD set top box? Are you using cableCARD? Are you tuning over the air? Are you tuned to the SD channel when there is an HD channel available? These are things we need to know.
 

iworkforcable
Unregistered guest
Greg (May 11) - not many broadcasters use 720p. The box will display it if it's coming to you in that format, like FOX HD. Look on the back of your TV; it sometimes says what it supports, like 1080/480, which means no 720p is supported.
 

Ron JL
Unregistered guest
Just to thank everyone for this great discussion, you have solved several of my problems just from reading here. RJL...
 

Anonymous
 
I have a computer monitor that displays up to 2048x1536 @ 85 Hz, and that is progressive scan. The level of detail in games is great. I think a lot of people assume CRTs can't do progressive scan because they remember their older standard-scan TVs. Also, computer monitors are multi-scan with no single native resolution, meaning that no matter what resolution you set, you still get a clean picture.

A few questions, though:

1. Are CRT-based HDTVs multi-scan also, and do they have the ability to display everything at its native resolution, or do they force (upconvert/downconvert) everything to a preferred native resolution the way LCDs, DLPs, and plasmas have to?

2. I have noticed a few large 22" CRT monitors have DVI inputs now (IBM). Does this allow for a clearer picture at high resolutions (something the VGA interface had trouble with)?

3. Where are the slim CRT computer/HDTV monitors? Were any ever announced?

4. Why are the resolutions on CRT- and LCD-based HDTVs so much lower than on their computer-monitor brothers - 1920x1200 progressive or more, compared to 1400x768 or less, respectively? I have never understood that, as I want to buy a 32-34" set with a full 1920x1200p or better.

5. Can a computer monitor with just a DVI input (Apple Cinema 30" or 23", or the IBM 22" CRT with DVI) be used to receive HDTV through cable, satellite, or over the air? And how?

Thanks
 

Anonymous
 
I was just playing around with my 15" Sony CRT monitor and a Radeon 9200 PCI video card on an old computer, and lo and behold, this small 13.8"-viewable 0.25mm Trinitron monitor shows 1920x1080 progressive at 60 Hz, crystal clear. WOW. Unfortunately the computer is too slow to show any HDTV content, but I never knew it could do that. The box said either 1024x768 or 1280x1024 MAX.
Cool. Wish I had a powerful computer to see how full HDTV would look on such a small display.
 

New member
Username: Phialpha

North Canton, OH

Post Number: 7
Registered: Apr-05
Anonymous, 5/30/05, 2:12 AM,
5. Can a computer monitor with just a DVI input (Apple Cinema 30" or 23", or the IBM 22" CRT with DVI) be used to receive HDTV?
Yes. You can use a cable box, a satellite receiver, or an HDTV tuner with a DVI output and plug it into the display.
 

fx
Unregistered guest
To Anonymous AKA RT Joby,

Still searching for that unicorn CRT?

xvxvxvx
 

New member
Username: Avernus

Post Number: 3
Registered: Jul-05
I noticed something funny...

I think 720p is technically a "better picture"...

I know this may sound silly, but when I went into my menu options on my Comcast HD-STB, I scrolled through the 480i, 480p, 1080i and 720p...

And they technically scroll through settings from lowest to highest...and 720p was the last option..

As I said, it may sound simple... and it doesn't matter... both display amazing quality in HD... but it's definitely food for thought for the people who like to argue over this type of stuff, lol
 

Gold Member
Username: Paul_ohstbucks

Post Number: 1670
Registered: Jan-05
Actually, TVs are either or..........not both. Regardless of available cablebox settings, your TV will only convert one or the other based on the type of technology built into your specific TV.
 

New member
Username: Avernus

Post Number: 4
Registered: Jul-05
"Actually, TVs are either or..........not both. Regardless of available cablebox settings, your TV will only convert one or the other based on the type of technology built into your specific TV."

My TV displays 720p and 1080i, depending on which setting I select on my cable box...

when I hit "display" on my TV and had 1080i on my box, the display said 1080i | 16:9 ... now, since I changed to 720p, it said 720p | 16:9

and the manual that came with the TV says it can do both of the settings...

I'm not sure if you mean it can only display one at a time...because that's obvious(at least to me *shrugs*) or if you mean that TV's usually only have 720p or 1080i settings and not allow the option...
 

Gold Member
Username: Paul_ohstbucks

Post Number: 1673
Registered: Jan-05
What is your TV Brand and model number?? From what I've seen, TVs are either 720p or 1080i.......not both.

 

New member
Username: Avernus

Post Number: 7
Registered: Jul-05
I swear I already replied to this...

but umm

sony grand wega kdf-42we655
 

fx
Unregistered guest
The Sony Grand Wega kdf-42we655 displays in 788p 100% of the time, period, finished, no other correct answer!!!

It will accept all inputs from 480i, 480p, 720p and 1080i and the display will show you what is being input but it always displays 720p (actually 788p x 1386 but most people call this 720p).

xvxvxvx

 

Gold Member
Username: Paul_ohstbucks

Post Number: 1685
Registered: Jan-05
I just looked up that model myself.......

It's a 720p model........LOL

AS fx says, it will absolutely NOT convert 1080i.

Sorry, but it's either or. I have yet to see a single TV that uses both technologies.
 

Wilson Pickett
Unregistered guest
I find that 1080i gives a crisper picture, but when in motion, as in helicopter shots over landscapes, there is blurring and artifacts. I assume my TV just doesn't have the response time.
If it becomes too annoying on certain programs, I will switch to 720p.
 

Gold Member
Username: Paul_ohstbucks

Post Number: 1690
Registered: Jan-05
I found it to be completely the opposite. With 720p, I noticed pixelation during motion sequences and could see lines (screen-door effect). My TV is 1080i, and when I went to my parents' over the holidays, it was especially noticeable on their 720p TV. While watching the NFL playoffs during the Christmas holidays, it was driving me nuts.

I've never ever detected that on my TV.
 

Wilson Pickett
Unregistered guest
[ ...(actually 788p x 1386 but most people call this 720p)]

fx-

Correct me if I'm wrong, but I think you're confusing signal resolution and screen resolution. The picture is encoded with 720 lines (progressive), and if your TV's display is 788 lines, then there is not a one-to-one match; there is a small distortion effect as the screen must scale only 720 lines of information up to fill 788.

This theory could explain some artifacts.
 

fx
Unregistered guest
"AS fx says, it will absolutely NOT convert 1080i. "

I did not say that at all. I said it will not display in 1080i; it does indeed convert all inputs to its native resolution of 720p/788p.

xvxvxvx
 

fx
Unregistered guest
"Correct me if I'm wrong, but I think you're confusing signal resolution and screen resolution. "

OK, you are wrong! I have not confused anything at all. The screen resolution is what is displayed no matter the input. All inputs to this display are converted with potential scaling artifacts.

What I wrote is that some people call all LCD and DLP displays 720p although none of them in reality are, they are mostly 768p or 788p displays and some scaling occurs at all times.

xvxvxvx
 

Gabriel
Unregistered guest
Hi, I recently purchased a Mitsubishi WS-55315 CRT set. I went ahead and gave Comcast a call and they came and installed the Motorola Dual Tuner DVR/HD cable box. I believe my TV only supports 1080i, and I have absolutely no complaint with the picture whatsoever. However, I hooked my DVD up to an S-Video connection, and I have been less than impressed with the picture quality. When I switch outputs to the S-Video input for my DVD, I see that it says the resolution is at 480i. My DVD player is NOT a progressive scan player; if I were to purchase a progressive scan DVD player would it automatically make the picture 480p? I ask b/c I really don't know how to mess with the TV input settings. Any help would be highly appreciated.
 

New member
Username: Avernus

Post Number: 8
Registered: Jul-05
ahaha wow....calm down people

but umm...all I know is that when I toggled between 1080i and 720p on my HD STB, it changed to whatever I selected on my TV when I hit display...

That's all...but it does do 1368x786 or something...I forget what I read yesterday, so yeah...

but wouldn't going to 1080i be possible when it interlaces 540 lines... blending the pictures? ...it would just downgrade it without utilizing its potential...

I'm no expert, I know a thing or two..but I'm still sketchy on this topic...all I knew is what the "display" on my TV said and changed to..

 

New member
Username: Avernus

Post Number: 9
Registered: Jul-05
"Hi, I recently purchased a Mitsubishi WS-55315 CRT set. I went ahead and gave Comcast a call and they came and installed the Motorola Dual Tuner DVR/HD cable box. I believe my TV only supports 1080i, and I have absolutely no complaint with the picture whatsoever. However, I hooked my DVD up to an S-Video connection, and I have been less than impressed with the picture quality. When I switch outputs to the S-Video input for my DVD, I see that it says the resolution is at 480i. My DVD player is NOT a progressive scan player; if I were to purchase a progressive scan DVD player would it automatically make the picture 480p? I ask b/c I really don't know how to mess with the TV input settings. Any help would be highly appreciated. "

You might want to purchase an upconverting DVD player that supports DVI or HDMI; you will feel a lot better about your TV when you get a DVD player to match its standards...

...at least until HD DVD and Blu-ray come out, haha.
 

Gold Member
Username: Paul_ohstbucks

Post Number: 1704
Registered: Jan-05
Gabriel,
Yes....


Get the progressive scan DVD player.
 

RR
Unregistered guest
Could use some expert help here: my projection TV is HDTV-upgradeable; as soon as I connect an HDTV DirecTV box to it along with the proper dish, I'm good to go. The problem is that it will display HDTV programs in 1080i, but programs broadcast in 720p will not show that way and will show in 480p instead. My question is this: how much different (or better) is 480p than a regular broadcast? I'm really doing this for the upcoming NFL season, and I have noticed that the national games are all 720p, meaning I will be seeing them at 480p. Is it still much better at 480p than a regular broadcast, thereby making it worthwhile for me? Thanks!
 

fx
Unregistered guest
Dear Donald,

One more time slowly just for you. :-)

When your tv shows 1080i on the screen it is telling you the input resolution, not the display resolution. Do you now understand? The input resolution, the input resolution, please repeat after me, the input resolution, not the displayed resolution.

xvxvxvx
 

New member
Username: Avernus

Post Number: 10
Registered: Jul-05
"Dear Donald,

One more time slowly just for you.

When your tv shows 1080i on the screen it is telling you the input resolution, not the display resolution. Do you now understand? The input resolution, the input resolution, please repeat after me, the input resolution, not the displayed resolution.

"

and one last time...even more slowly for you :-) I just stated what my television said...

some people need to get laid instead of arguing over a HD format ;)
 

Nicky G 2012
Unregistered guest
Maybe I can offer some perspective. I sell, as part of my job, high-def video production systems. A whole aspect of the equation these arguments tend to miss is the production and post-production stages, as well as broadcast (OTA, cable, satellite, etc.).

There is way more complexity than 720p vs. 1080i. What HD encoders are used, what production workflow is followed, what compressors are used... There are MANY factors in the HD production pipeline that make a difference to quality, well before the broadcast hits your TV set. Because of this, in real-world scenarios, no direct 720p vs. 1080i comparisons can even be made, despite which set you use and whether its native resolution matches that of the broadcast itself.

The "better" and "worse" broadcasts people talk about may have crappy compression (often the case, and varying greatly between over-the-air, cable, and satellite, as well as between channels, and even between different HD shows on the same channel). These qualities you find appealing or not appealing may have nothing at all to do with the resolution it was shot in, or progressive scan vs. interlaced playback.

Consider this: two groups of people could be issued the EXACT same set of production, post-production, broadcast, and playback equipment for HD, and come up with VERY different results in terms of picture quality.
 

Bronze Member
Username: Avernus

Post Number: 15
Registered: Jul-05
I agree with Nicky G... because the INHD channels I receive are crystal clear, with a better picture than what is shown on most other HD channels...

The broadcast plays a huge factor in the reception..
 

Agrees with Nicky
Unregistered guest
Way to go, Nicky. What people don't seem to understand is that they're often comparing apples and oranges here. A terribly produced HDTV signal is going to look bad whether it's 1080i or 720p, or vice versa.

I think the one thing we can all agree on is that either way, TV looks a whole lot better when it's HD. And, we all probably need to get laid more.
 

Agrees with Nicky
Unregistered guest
Right on Nicky. Bad HDTV broadcasts look bad whether they're 1080i or 720p, and it's often hard to know whether that's even the problem or if you're really noticing a difference in quality between the formats.

I think the real answer is, none of us would even notice either way if we would just stop thinking so damn hard about it.
 

Still Agrees with Nicky
Unregistered guest
Ugh. Thought it lost my post, retyped it as well as I could recall, and then they both wound up getting posted. It's a good illustration of just how poor my short-term memory is, however.
 

Kinnfrey
Unregistered guest
fx or other knowledgeable person,
LCD, plasma, and DLP sets have 768p, 788p, etc., and scaling occurs as you mentioned. I can explain this to people, but a few ask why the manufacturers don't make them 720p so there is no scaling on 720p sources like FOX and ESPN HD. Neither I nor anyone I've asked knows. Can you shed any light on this? Thanks



 

fx
Unregistered guest
I'll give you a hint: 768 x 1.33333 = ?

Some/most DLP displays are 1024x768 pixels.

A widescreen picture is presented in a 16:9 ratio, so a true DLP HDTV should have 1366x768 pixels or 1280x720 pixels.

SVGA = 800x600 pixels
XGA = 1024x768 pixels

TI chose the minimum progressive format, which also happens to be a 4:3 ratio, and scales both the vertical and horizontal resolutions for an HD input but requires no scaling for an NTSC input.

It costs much less to make a 1024x768 chip and scale than it costs to produce a true HD chip: about 786,000 mirrors compared to about 922,000 mirrors for 1280x720 (true 720p), which would then require scaling for any NTSC (4:3 ratio) inputs. Either way something must be scaled, and it makes more sense to scale with more data rather than less.
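
A rough sketch of that mirror-count and scaling arithmetic (illustrative Python; the chip sizes are the ones mentioned above, nothing else is assumed):

    print(768 * 4 / 3)       # 1024.0 -- the 4:3 hint above

    xga_chip = 1024 * 768    # 786,432 mirrors (4:3, cheaper to make)
    hd_chip  = 1280 * 720    # 921,600 mirrors (16:9, "true" 720p)
    print(xga_chip, hd_chip)

    # Fitting a 1280x720 HD input onto the 1024x768 chip touches both axes:
    print(1024 / 1280, 768 / 720)   # 0.8 horizontally, ~1.067 vertically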



Got it?

xvxvxvx

 

Kinnfrey
Unregistered guest
fx, your very informative answer deserves many thanks. I now have a clear understanding. -K
 

New member
Username: Insearchof

SAN FRANCISCO, CA United States

Post Number: 1
Registered: Aug-05
I too was confused about 720p and 1080i. I have a projector, a plasma, and a rear-projection TV that offers both. Comcast cable's Motorola DCT6400 allows you to switch between the two settings (I have it hooked up to all my displays), and I don't see a huge difference between 720p and 1080i. Here is a link I found that might help you all decide for yourselves which is better: http://alvyray.com/DigitalTV/Naming_Proposal.htm
 

Anonymous
 
fx,

I read a review of a Samsung 46" DLP set from cnet:

http://reviews.cnet.com/Samsung_HL_R4667W/4505-6484_7-31307908-2.html?tag=top

In the review it states this:

"All models in this series use Samsung's latest fifth-generation light engine, with a DLP chip that has a native resolution of 1,280x720. This means that they should be able to display the full detail of 720p material"

Is this just marketing speak, or are DLP TVs starting to come with a 720 native resolution, rather than the 768 or 788 that was previously mentioned?

Thanks,
 

fx
Unregistered guest
I wrote:

Some/most DLP displays are 1024x768 pixels.

So it is likely that, with increased production volume, TI is now producing 1280x720 chips for little to no additional cost. After all, it is the 5th-generation chip, so I consider it probable.

xvxvxvx


 

nick kufahl
Unregistered guest
Hi,

I am thinking of buying either a 1080i or 720p DLP TV, mostly for gaming and Xbox 360 play. Which one would be better for me? Thanks again.
 

Unregistered guest
What's the best resolution for gaming: 480p, 720p, or 1080i? Does it depend on the game or the TV?
 

Bronze Member
Username: D_singh

Post Number: 25
Registered: Sep-05
I find my 720p picture from Dish Network to be terrible (pixelation and lines all over the picture) on my 720p native LCD projector.

Weird, since people all say 720p is better than 1080i. I watch all my HD programming in 1080i.
 

fx
Unregistered guest
Dish compresses their HD content due to the limited number of transponders they have available. It is the fault of Dish that you see macroblocking and pixelation on your display. Inherently, 720p is good for fast-motion viewing and 1080i is best for increased detail.

xvxvxvx
 

Bronze Member
Username: D_singh

Post Number: 28
Registered: Sep-05
Apparently this is what Nicky G was talking about. Hopefully, with the release of MPEG-4, Dish can serve me the better picture I want, not that I need it. 1080i is still perfectly fine for me... although a 1080p picture is freaking awesome.
 

fx
Unregistered guest
MPEG-4 compresses the data more than MPEG-2, not less; therefore the signal will be worse, not better, assuming that Dish wants to add more channels. What you have now is most likely the best you will ever see with Dish, save in the unlikely event that they temporarily increase the bandwidth used for their HD channels after the MPEG-4 conversion. Keep in mind that for quite some time they will continue broadcasting the MPEG-2 data streams until all customers have new receivers capable of processing the MPEG-4 codec.

A 1080p display will not have any more detail using the same compressed signal of either 720p or 1080i. I'm sure you have heard of GIGO (garbage in, garbage out).

xvxvxvx
 

Bronze Member
Username: D_singh

Post Number: 30
Registered: Sep-05
Currently, Dish is broadcasting in MPEG-2. All customers have MPEG-2 receivers/decoders and when Dish decides to switch the compression, all receivers/decoders must be switched, right? Unless it will be optional which would make life terribly frustrating...

Anyway, doesn't satellite radio use MPEG-4 and as a result, customers are able to hear DVD audio? Unless I'm mistaken in my logic, I would think MPEG-4 would help one to better receive HD content...

Maybe this shitty picture is a result of conversion going on from digital to analog? I'm currently running component cables from the receiver to the projector. Think DVI may provide less conversion and less shittiness?

BTW, I was just saying 1080p was awesome in and of itself, sorry about the ambiguity.
 

fx
Unregistered guest
If MPEG-4 compression were used and the initial signal bandwidth either stayed the same or increased, the quality would increase. Video and audio are very different: you can transmit 30 audio channels compressed in the MPEG-4 format in the amount of bandwidth required to transmit a single HD data stream using the same compression techniques.

Not understanding the difference between video and audio compression is why many people think MPEG-4 will drastically help HD quality. It likely will not, but that is up to the implementation of the individual retransmission providers, which include sat companies, cable companies, and even the local and national broadcast companies. Due to the drastic increase in multicasting, I expect HD to stay the same quality at best and likely continue to degrade as time goes on.

xvxvxvx
 

Bronze Member
Username: D_singh

Post Number: 34
Registered: Sep-05
Thanks a lot man. You're quite educated in this field. I take it that HD is give and take...want more channels in HD? Sacrifice quality.

BUT, I would certainly hope that with current and advancing technologies that a fundamental problem like this can be fixed. Maybe decreasing the total bandwidth used with a larger compression index?
 

fx
Unregistered guest
Thanks for the compliment Mr Singh. As you compress more you lose data so there is a practical limit which MPEG4 will likely reach.

A few bits of info on compression:

Compression removes duplicate data from one field to the next. In scenes with little motion the data is compressed a great deal, since duplicate portions of each field/frame are eliminated and then reconstructed by the MPEG decoder on the receiving end. The more motion in a program, such as sports, the less each frame can be compressed; otherwise you will see jerky movement. When fast action is transmitted and the data is necessarily compressed less, you need increased bandwidth to keep the same quality. Unfortunately, nearly all data transmissions today use a fixed bandwidth.

It becomes a compromise of allowing enough bandwidth for the average bitrate of all the programming on a specific channel. If the transmitting entity allows the full 19 Mbps of bandwidth and MPEG-4 compression, you get the best possible picture on your display. Unfortunately, most sat companies use half that amount, so HD programming suffers.
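
To put those numbers in perspective, here is a rough illustrative calculation in Python (the 8-bit 4:2:2 figure of ~16 bits per pixel is an assumption used only for the estimate, not a quoted spec):

    # Uncompressed 1080i: 1920 x 1080 pixels, 30 frames/s, ~16 bits per pixel
    uncompressed_bps = 1920 * 1080 * 30 * 16    # ~995 Mbps
    full_channel_bps = 19_000_000               # the ~19 Mbps case above
    half_channel_bps = full_channel_bps / 2     # the "half that amount" case

    print(round(uncompressed_bps / full_channel_bps))   # ~52:1 compression needed
    print(round(uncompressed_bps / half_channel_bps))   # ~105:1, twice as aggressive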

If you have any local channels that multicast, you will see the difference. In my locale, the ABC, CBS, and NBC affiliates all multicast up to three data streams on each channel. This becomes most evident on prime-time HD programming.

I hope this clarifies matters somewhat.

Forgive any misspellings; I am too tired to proofread my post.

xvxvxvx
 

New member
Username: Maxman

Post Number: 1
Registered: Nov-05
Observations with Toshiba 30HF85 HDTV tube & Charter's HD DVR cable service:

After roughly a week of research and experimentation with the above combination, I've finally dialed into what I perceive to be the best recipe for output/input settings & cable configurations.

After the initial setup of the HD DVR box (Scientific Atlanta Explorer 8x00HD) and the Toshiba 30" widescreen tube, my reaction was one of disappointment.

With DVR output set to 1080i & component video inputs connections to TV, the contrast ratio of the HD programs seemed out of whack. Human faces were shadowed (one side bright, the other side dark) and the set seemed to be hunting for the correct brightness level as the scene changed. Not good!

After reading this post and other sources of info, along with a fair amount of trial and error, I have now achieved what I have to think is 'near perfect' for this combination of source and components.

Here's my recipe (in order of importance):
1) HDMI cable - this made a huge improvement to the contrast / brightness issue! The HDMI digital feed is clearly better than component, at least for this particular configuration. I attribute this to the TV's conversion capability being better than the DVR.

2) Set the DVR output to 720p - the perceived clarity and overall quality of the image is better when compared to the 1080i feed. Given that the Toshiba converts everything to 1080i anyway, I'm surprised at these results.

3) Adjust the TV's contrast, brightness, and color temperature settings to your liking.

...bake at 72 degrees on Discovery HD channel and the results are fantastic!

Thanks to all who contributed to this thread, especially those of you who posted accurate info ;)
 

Bubba Jimmy
Unregistered guest
Yo, I currently own a Mitsubishi WS-55313, and I just found out that the only formats it supports are 480i/480p/1080i. I can't find anywhere in the manual where it says it will upconvert 720p broadcasts to 1080i. Am I out of luck when it comes to 720p signals? I was really looking forward to the HD features of the Xbox 360. Does this question make any sense, or do I sound like a total noob?
 

New member
Username: Maxman

Post Number: 2
Registered: Nov-05
Yo Bubba, Xbox 360 supports 1080i and 720p...

"The high-definition support for Xbox 360 is dynamic, so even if a game is designed with 720p in mind, but your TV only supports 480p or 1080i, Xbox 360 runs the game in the best resolution your TV offers (up to 720p and 1080i)."
 

Bubba Jimmy
Unregistered guest
Excellent! *air guitar*

Thanks for the info, man. Also, would you please post the site that you got that info from? I would like to bookmark it.
 

New member
Username: Maxman

Post Number: 3
Registered: Nov-05
here ya go Bubba

http://www.xbox.com/en-US/hardware/xbox360/powerplay.htm
 

Ninjamurf
Unregistered guest
"The "better" and "worse" broadcasts people talk about may have crappy compression (often the case, and varying greatly between over-the-air, cable, and satellite, as well as between channels, and even between different HD shows on the same channel). These qualities you find appealing or not appealing may have nothing at all to do with the resolution it was shot in, or progressive scan vs. interlaced playback."

This really nailed it for me. In the past I always thought that the concerts on my local PBS-HD channel (San Diego) and the people filming the football games and other sporting events had somehow gotten hold of the "good" cameras. LOL. I was always amazed that concerts and sporting events always seemed to look that much better. Discovery HD is also a nice one for me. When there is nothing on TV I will just flip to Discovery HD, mute the sound, and read the paper. I look up every once in a while and say, "ooo, pretty." It makes sense knowing that some stations care more (or can afford to) and take the extra step to give us truly beautiful HD.

(Another one I have to mention is CSI:Miami. I always record this one in HD. I believe that the people making this show actually CARE about the quality. Upon returning from every commercial break they almost always have some absolutely stunning vistas of Miami.)

Another question that I think is going to become relevant soon, if it isn't already: what is 1080p going to do to this landscape? The TVs are already here, with Toshiba, Mitsubishi, and Samsung already making 1080p DLP sets for fairly reasonable prices (which I'm assuming will only continue to come down). How about the transmissions? Does anyone know if anything is even transmitted in 1080p? Or if anyone is even PLANNING to transmit 1080p? I get kind of giggly thinking about this "next step" and wonder if I should wait a bit for 1080p sets to come down in price and broadcasters to start sending 1080p signals. Am I waiting for a pipe dream, though? Any thoughts would be appreciated.
 

Bronze Member
Username: D_singh

Post Number: 37
Registered: Sep-05
Ninjamurf: read all posts by fx from Nov. 2 onwards.
 

Ninjamurf
Unregistered guest
Okay...re-read all of fx (and yourself D) and it seems that he is talking about the compression techniques being an issue with Dish in particular? That they don't have the bandwidth to send the "best" picture possible so it really doesn't matter if they go to 1080p. If they are barely sending enough data to get you to 1080i then they will fall short with 1080p. I'm wondering about the future. I have cable (Time Warner in San Diego) and I'm not sure if they are better or worse?

I will agree with fx's assumption that the signal will most likely get worse for the NEAR future as more and more channels are broadcast in HD and start sucking up bandwidth. I think that the signal will get better after that, however, as more and more companies "catch up", compression techniques get better, and we rid the world of the 27 non stop QVC channels to free up some bandwidth! I'm wondering if anyone has any thoughts on how long that may take? When will we start coming out of this "dip" in signal quality and how much better will 1080p be when compared to 720p/1080i?
 

fx
Unregistered guest
Ninja,

I was directly referencing D. Singh's sat company, but the principle remains the same for all data transmission. Everyone, including cable companies, has a finite amount of bandwidth available; compression techniques will not change this, and neither can a cable company improve an already degraded video signal. FOX, ABC, NBC, CBS, UPN, ESPN, and others determine how good the HD signal can be at its best. As do Discovery HD, INHD1, and INHD2, which have the very best quality available, period. They all transmit full-bandwidth, uncompressed video to the cable companies and sat companies alike. What is done with it afterwards is up to your provider.

An (expensive) alternative is a C-band dish and receiver. With that large satellite dish you can get unedited, uncompressed video unavailable from other sources. A good example is the talk in the broadcast booth during commercial breaks; it all becomes available on C-band if unencrypted. Pretty interesting if you have the space and money to devote to C-band. One downside is that more and more programming is encrypted than in years past.

xvxvxvx
 

New member
Username: Ninjamurf

San Diego , Ca USA

Post Number: 2
Registered: Nov-05
So you don't think there will be any further advancements, fx? We're stuck with 720p/1080i for life? Compression techniques won't improve, allowing more transmission within the same bandwidth? Broadcast companies won't use better and better feeds and find ways to transmit them? I understand you are much more knowledgeable in this area than I am, but you sound like the guy who said, "Color TV? Not possible. The technology barely exists to transmit black-and-white images." "The moon? Are you joking? Those Wright brothers barely got that thing off the ground." Or Bill Gates and his "640K is all the memory anyone will ever need, so we don't need to advance past that." Mr. Moore (and his law) would be very disappointed.

I can't imagine that the human spirit will be content to be "stuck" with the current resolutions and signal quality/bandwidth issues. People will always continue to work to better the system, whether that requires a radical change in the system ("Okay broadcasters, all digital by 2006") or merely improving the system itself (getting rid of the "useless" channels to free up bandwidth, better compression/decompression, etc.), so I can't imagine we will be stuck at this point for very long. Human innovation shall overcome!

That said, I'm going to give it some time before I rush out and buy a 1080p set. (And yet the craving is soooooo bad. I want one, I want one, I want one.)
 

fx
Unregistered guest
Mike,

You are mistaking my understanding of the limited broadcast spectrum available and the manner in which data is processed and transmitted for pessimism.

I never said 720p or 1080i is the best it can or will be. I did explain why there will not be 1080p broadcast data in the foreseeable future. There will likely be 1080p material available on both the Blu-ray and HD DVD formats in 2006, but this has nothing to do with broadcast signal transmission.

What is better than all the above will be holographic projection which I see coming as the next breakthrough and 1080p will be skipped entirely as a viable broadcast medium. It is already possible to have a holographic projector just not financially feasible yet. Give me a quantum computer, a couple of billion dollars and 10 years and I'll have one ready for you to pickup. Cash only please, no checks will be accepted. :-)

xvxvxvx
 

Aria plasma
Unregistered guest
Hi,

I have a 42" Plasma TV purchased from www.aria.co.uk.
the resolution capabilities are; VGA, SVGA, XGA, SXGA, UVGA, UXGA, DTV(480i, 480p, 720p, 1080i)
I need to know if this is HDTV capable, as I will order the new SKY+ box when it is available if so.

Thanks
 

New member
Username: Ninjamurf

San Diego , Ca USA

Post Number: 3
Registered: Nov-05
"Give me a quantum computer, a couple of billion dollars and 10 years and I'll have one ready for you to pickup. Cash only please, no checks will be accepted. :-) "

I'll see what I can scrape together! 10 years though...not sure I can wait that long! LOL
 

Unregistered guest
Yo guys, I am looking at buying a DLP rear-projection TV which is 106 cm. It says it has 720p system scan. I would like to know if I can hook my computer up to it without any loss in quality as long as the display settings are at 1280x768. Also, this TV is an HDTV. Help would be appreciated.
 

Unregistered guest
Oh, here's the site I should be buying the TV from. Thanks again: http://www.dse.com.au/cgi-bin/dse.storefront/43854b0b00ffb6dc273fc0a87f9c0713/Product/View/G4283
 

Anonymous
 
Can't tell which DLP you are buying, but if it's a Samsung it's a piece of crap. It uses only half of the 1080-line picture, making for crummy detail.
 

jaybo
Unregistered guest
Purchased the Samsung HLR5678W. Great, clean, crisp, bright picture. No problems.
 

Anonymous
 
You got suckered because you didn't do your research. All Samsung DLPs use a cheap processor that does not weave the 1080 lines together; it only uses half the lines. You don't have to believe me; it's common knowledge and Samsung will not deny it.
 

jaybo
Unregistered guest
Purchased the Samsung HLR5678W. Great, clean, crisp, bright picture. No problems.
 

Brad C. Johnson
Unregistered guest
"you got suckered cause you didnt do your research"
Obviously, you have not done your research. You are describing bob deinterlacing which is present on all 720p micro displays be it dlp, plasma, lcd, etc., that use a 1080i source. Post some digital pics showing the difference between bob and weave deinterlacing. You cannot because one cannot see the difference without specific 1080i test patterns.
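
For anyone lost in the jargon, here is a minimal sketch of the two deinterlacing strategies being argued about (illustrative Python with NumPy and made-up field data; it is not any vendor's actual processing):

    import numpy as np

    def weave(top_field, bottom_field):
        # Interleave two 540-line fields into one 1080-line frame
        frame = np.empty((top_field.shape[0] * 2, top_field.shape[1]), dtype=top_field.dtype)
        frame[0::2] = top_field      # odd scan lines come from the top field
        frame[1::2] = bottom_field   # even scan lines come from the bottom field
        return frame

    def bob(field):
        # Line-double a single 540-line field to 1080 lines (half the detail)
        return np.repeat(field, 2, axis=0)

    top = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
    bottom = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
    print(weave(top, bottom).shape)  # (1080, 1920), built from both fields
    print(bob(top).shape)            # (1080, 1920), the bottom field is discarded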
 

houston mike
Unregistered guest
I think I have some concept of these resolutions, but I could use some help. Using an LCD HDTV that supports multiple formats. I'm not a purist on SD channels, so I want to see the image fill the screen. My TV's "fill" function only stretches to fill the screen using 480p widescreen or 480i wide (not standard). I have the explorer STB by TWC.

My questions... should I enable just 480p wide, 720p, 1080i or is 480p standard better for some reason?

Is there some way to have my TV detect the incoming signal and use that... I am set to "fixed" for format because I don't understand what upconvert 1,2 are or "pass-through".

Thanks
 

fx
Unregistered guest
"Obviously, you have not done your research. You are describing bob deinterlacing, which is present on all 720p microdisplays (DLP, plasma, LCD, etc.) that use a 1080i source."

Incorrect, as tested by Perfect Vision:

Samsung - two DLP rear projectors failed; the others passed
Sony - all of their LCD rear projectors failed; the LCD flat panels passed
All JVC, Hitachi, Pioneer, and Toshiba sets passed

The SXRD series passed, as did all LCoS-based displays.


"Post some digital pics showing the difference between bob and weave deinterlacing. You cannot, because one cannot see the difference without specific 1080i test patterns."

This is funny stuff, you stating that discarding 50% of the information cannot be noticed. Pretty dumb thinking.


Several links:

http://www.100fps.com/

http://www.100fps.com/why_bobbing.htm

http://www.hthoma.de/video/interlace/index.html


xvxvxvx
 

Brad C. Johnson
Unregistered guest
"This is funny stuff, you stating that discarding 50% of the information cannot be noticed."
If this is so apparent, why is the general public still unaware? Only a small segment of HDTV buyers, including you, knew of the bob deinterlacing "scam" well before Gary wrote his now-infamous article. You might as well argue the blue-hue problem on the latest SXRD displays: it is there but not visible to the naked eye. The point is, the average HDTV viewer is (or should I say was) blissfully unaware of 1080i as 540. Jaybo, for example, is completely happy with his purchase. I stand by my challenge to show a difference between bob and weave deinterlacing to the general public. DLPs with wobulation and bob deinterlacing show just as amazing a picture as SXRD 1080p with weave deinterlacing in side-by-side tests with the general public.
 

Unregistered guest
I have bought a Samsung LE32T51 HD-ready LCD TV. This is my first HD TV, and I am really satisfied with what I think is a superior machine. I needed to get an HD TV for my Xbox 360 console, which supports 720p and 1080i. I later found out that the TV I bought only supports 720p. Am I now stuck with an HD TV with a lower-quality screen for my Xbox 360? Or does the progressive vs. interlaced motion issue still apply against a true 1080i source?
 

fx
Unregistered guest
"Only a small segment of hdtv buyers, including you, knew of the bob deinterlacing 'scam" well before Gary wrote his now infamous article.

The point is, the average hdtv viewer is (or should I say was) blissfully unaware of 1080i as 540. Jaybo, for example, is completely happy with his purchase. I stand by my challenge to show a difference between bob and weave deinterlacing to the general public.
"

I agree with all of the above, Brad. It is hard to know what you are missing when, one, you cannot see what is missing and, two, you don't know that it is missing in the first place.

"You cannot because one cannot see the difference without specific 1080i test patterns."

Now this is the quote I disagreed with. Anyone that has seen both displays will know the difference.

xvxvxvx
 

Silver Member
Username: Cableguy

Deep in the ... U.S.

Post Number: 517
Registered: Mar-05
fx,

Thanks for the links above... that is some pretty neat s**t. Information stored for future reference. I'm a little curious, though: some of the pictures looked a bit like ordinary motion artifacts. Is there any difference between run-of-the-mill motion artifacts and this "bob and weave" effect? I read the page and it refers to some of the picture examples as deinterlacing artifacts, which is why I asked. Thanks again for the links.
 

Brad C. Johnson
Unregistered guest
"Now this is the quote I disagreed with. Anyone that has seen both displays will know the difference."
I envy your eyesight. Even with my coke-bottle-thick glasses, I am unable to spot displays using 540p bobbing; I would need the SMPTE 133 pattern. Then again, try as I might, I also cannot detect rainbows, even on six-segment, slower color-wheel DMD devices.
 

fx
Unregistered guest
Cableguy,

The best answer to your question is at the below link:

http://www.videsignline.com/howto/showArticle.jhtml?articleID=171201964

It also elaborates on the current discussion of discarding the even or odd numbered fields in a 1080i signal.

xvxvxvx
 

Silver Member
Username: Cableguy

Deep in the ... U.S.

Post Number: 518
Registered: Mar-05
thanks fx
 

gEmRN
Unregistered guest
"The Sony Grand Wega kdf-42we655 displays in 788p 100% of the time, period, finished, no other correct answer!!!" Wrongo! It is 768p 100% of the time. Famous Sony misprint. Source: Panasonic tech, level 3.

 

.50 rocks!
Unregistered guest
On the Xbox 360 I don't know which setting is better, 1080i or 720p. I'm still confused about which is better. First people were saying 1080i was better; now everyone is saying 720p is better because of this 1080i problem? OK, I'm a noob at this, so I don't want to screw up and get the wrong TV. Can anyone re-explain which is better, 1080i or 720p?
 

12345
Unregistered guest
Mate, 720p is smoother than 1080i, so I would recommend 720p for a 360.
 

.50 rocks!
Unregistered guest
Thanks, man. This stuff was driving me crazy. 720p being smoother sounds cool.
 

crazy.viking
Unregistered guest
Currently there aren't many HDTVs that support 1080p. If you are watching on a small TV, then 720p is definitely the way to go, but on a large TV 1080i is somewhat better.
 

New member
Username: Mperk1

Post Number: 1
Registered: Dec-05
I'm considering the Samsung 30-inch CRT HDTV.
It supports 480 and 1080i, but not 720p.
I'll be upgrading to Comcast's digital service.
Am I better off with a set that takes 720 or does this one make sense? The price is certainly attractive.
Thanks for any advice or suggestions.
 

New member
Username: Eric_hill

Palo Alto, CA USA

Post Number: 1
Registered: Dec-05
Just because you send 1080i (1080 vertical lines of image) to your TV, it doesn't mean your TV can display all of those lines.

In fact, most plasmas these days don't display more than 800 vertical lines. Unless yours does, I don't see the benefit of encoding video as 1080i (1080 vertical lines) just to have your TV take it back down to fewer than 800. Now, if you have one of those fancy plasmas that support more than 1000 lines, then you may get some benefit from 1080i.

In my case, I have a 50" Panasonic (TH-50PHD8UK) and its display supports 768 vertical lines. So I use 720p to get the progressive-scanning benefits (less flickering); the 720 progressive lines are converted into the 768 lines that are physically shown on the screen.

I played around comparing HD channels from my Comcast digital HD box using both the 1080i and 720p settings. I found that 720p was noticeably better. Perhaps my TV doesn't handle converting 1080i into 768 lines very well.
 

fx
Unregistered guest
Eric,

You are confusing horizontal and vertical resolution. Any HD CRT is capable of displaying 1080 lines; it is the 1920 dots of horizontal resolution that may not be completely rendered by the display circuitry.

The reason a plasma doesn't display 1080 lines is that it is a 720p-native-resolution display; it converts any input to 720p. And again, no one should care about that portion of the 1080 x 1920.

xvxvxvx
 

rayman27
Unregistered guest
Honestly, my Xbox 360 looks brighter and clearer using 1080i; the 720p setting looks much darker and less detailed.
 

New member
Username: Danadane

St. anthony, Iowa U.S.A

Post Number: 1
Registered: Dec-05
I have an Xbox 360 and a high-def TV, and I want the best picture for my games. Which resolution is better for gaming, 1080i or 720p?
My TV only supports 480 and 1080i, though; I just want to know which one would look better. Please help, thanks.
 

Unregistered guest
Wow - what a Thread!

fx - I am sure I am just one of multitude that appreciates your time spent on this thread. Thank you.

Note to you gamers: the 360 will send whatever signal you want; just run it at your set's native resolution and it will be fine. If your native resolution is 720p (i.e., you have a flat panel), run 720p. All things being equal, 1080i is better, but all things are rarely equal (see rant below).

Incidentally, I understand the HD DVD industry is going 1080, not 720, and 1080p panels just got a lot cheaper.

***RANT***
About an earlier post -- Brad Johnson -- you may know a lot about this HD stuff, who knows, but what you're saying actually kind of smells a bit. IMHO the number one problem with HD is that I can go out with my cellphone, take some footage, re-edit it in 1080i, and say, "Check it out -- HD!"

The problem, to be a little less sarcastic, is that all HD cameras were NOT created equal, and it only starts there. I have a 1280x1024 projector (a freaking steal on eBay at $600 -- woohoo!), and my father has one of the finest HDTVs made, the 65" Mitsubishi. On many signals his FAR superior television looks just a little better. But I have seen some footage -- it wasn't much, but he nabbed it on his HD VCR and played it for me -- and my jaw hit the floor; and understand, I'm used to pretty darn good HD. Maybe someone with coke-bottle-thick glasses shouldn't be judging the merits of HD, but ANYONE can tell when everything clicks.

I think in terms of math, so forgive the analogy (although I think it describes this perfectly). Imagine the process of creating an HD picture as a series of percentages multiplied together, with your actual TV then capping what you see.

Example 1:
Subject material: 98%
Camera : 70%
Editing: 80%
Broadcast: 97%.
A substandard HDTV: 60%

Result: 98% x 70% x 80% x 97% = 53.2%, which falls under the 60% cap, so the viewer sees 53.2%. Obviously, on the very best TV the image would be roughly the same.

Example 2:
Subject material: 98%
Camera : 98%
Editing: 98%
Broadcast: 97%

In this case, with the exact same material (say a football game) and broadcaster (say NBC), the result is 91.3%. A great television with a 90% cap, compared to the 60% cap, would be 1.5 times as good (1.5 x 60 = 90). But they're both 1080! They're both NBC! They're both the same football game! Weakest link. Limiting factor. It's not what is the same -- it's what isn't that matters. I just listed a few factors; I imagine there are just as many more I didn't list.
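If it helps, here is the same back-of-the-envelope math as a tiny Python sketch (the stage percentages are the made-up numbers from the examples above, nothing measured):

```python
# Toy model of the "chain of percentages" idea: multiply the stage qualities
# together, then cap the result at what the display can actually show.
def perceived_quality(stages, display_cap):
    """Product of stage qualities (0-1), limited by the display's cap."""
    q = 1.0
    for s in stages:
        q *= s
    return min(q, display_cap)

example1 = perceived_quality([0.98, 0.70, 0.80, 0.97], display_cap=0.60)  # ~0.532
example2 = perceived_quality([0.98, 0.98, 0.98, 0.97], display_cap=0.90)  # chain ~0.913, capped at 0.90
print(example1, example2)
```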

The problem is, the manufacturers know this better than anyone, and that's why they can sell a 540-line TV as 1080 and much of the public is too ignorant to know the difference. It is rare to see a picture where everything is firing on all cylinders.
*end rant*

Sorry for the rant -- just my 2 cents.

PPS - I am hoping HD DVD will change that, and that 1080p will be something we get to see more of. When good material is readily available at 1080p, I'll get a 1080p device; until then, I guess my hunk of junk will do.
 

Brad C. Johnson
Unregistered guest
SPeach,
It would be most difficult to explain the error in your thinking, since you have no comprehension of simple bob deinterlacing of a 1080i signal on a 720p display. Leave the advanced thinking to others and save your completely incorrect rants for the less informed, lest you embarrass yourself even more.
 

MythicZohar
Unregistered guest
I have a general question. Are power filtration boxes like those made by Monster or Panamax needed? I am running my cable connection through the Panamax product. Is this helping the signal quality?
 

New member
Username: Magjammer

Post Number: 1
Registered: Jan-06
I read reference to this higher-up in the thread, but it never got answered... I recently switched from Dish Network to Time Warner. Using my receiver's (Explorer 3250HD) "auto" scaling / conversion capabilities there is an obvious and bothersome flicker. This did not occur with Dish Network's 811 receiver or if I choose not to use the 3250 upconversion-1 feature; opting to manually adjust the scaling between HD and non-HD signals. Will this flickering damage my Hitachi rear projection TV?

Thanks,
-Mike
 

Unregistered guest

Michael,

" opting to manually adjust the scaling between HD and non-HD signals. Will this flickering damage my Hitachi rear projection TV?"


No, it shouldn't damage your rptv.

You know, we as consumers would be better able to assess which cable/satellite companies and which consumer products suit our needs if manufacturers listed which internal scaler (model #) is used in each device. You'd think a $4,000 HDTV would have a higher-end internal scaler than a $200 DVD player or a $400 sat/cable receiver, right? What I've observed (through use and reading) is that IT VARIES.

I own one of the best HDTVs available today:
Sony's KDF-60XS955. I'm using the Explorer 8300HF (HD DVR) through Time Warner with U2V600HDs (24k gold-plated 4-foot Monster HDMI cables).

Would you believe the DVR's scaler is higher quality than the one in my XS? Of course, there are high-end EXTERNAL scaling devices (ranging from about $2,500 to tens of thousands of dollars), usually used with high-end projectors, that will kick azz compared to the ones previously mentioned (e.g., the Lumagen Vision Pro video scaler).
But most consumers would get better display results just from knowing what, exactly, we're working with.

In the meantime, I've gotten displays with fewer artifacts using the scaler (processor) in the set-top box (STB) and the DVD player. (FYI, the highest-rated upconverting DVD player, hands down, is the OPDV971H by Oppo, thanks to its processor.) Its digital output (DVI) beat 39 other DVD players from 11 other companies, including Panasonic, Pioneer, Sony, Toshiba, Samsung, LG, Onkyo, Teac, Oritron, Harman Kardon, and McCormack. The price range of these products was $199 to $13,000. Guess who sells for $199? . . . Oppo.

http://www.projectorcentral.com/oppo_opdv971h_dvd_player.htm


A Winning Digital DVD Player
David Colin - July 7, 2005
ProjectorCentral.com

Every once in awhile a new company arrives on the scene with an exceptional product. Such is the case with Oppo Digital and the introduction of its OPDV971H digital DVD player, also referred to as the Oppo 971 in this article.

So who is Oppo? They certainly are not a household name and with good reason as Oppo is a newly created North American branch of BBK Electronics of China, a giant in the manufacture of consumer electronics, employing 12,000 people worldwide. So while Oppo is a new name, the company and the technology behind it are quite substantial.

Oppo's task in North America is to build the Oppo brand by focusing on quality and customer support. You will not find this product at a local store just yet. Oppo wants to build demand by first establishing its reputation. To do this, it wants some firsthand experience with its customers and the fastest way to get it is to sell direct. (Update September 12, 2005: Oppo has begun to sell direct to some selected dealers. See Where to Buy below.)

So what makes the Oppo OPDV971H DVD player so hot? While its form factor is sleek with a low profile and uncluttered look, its digital performance is without equal at apparently any price based on the 2005 DVD Player Benchmark shootout by Secrets of Home Theater and High Fidelity that ended last month. The Oppo OPDV971H digital output (DVI) beat 39 other DVD players from 11 other companies including Panasonic, Pioneer, Sony, Toshiba, Samsung, LG, Onkyo, Teac, Oritron, Harmon Kardon, and McCormack. The price range of these products was $199 to $13,000. Guess who sells for $199 . . . Oppo.

As an earlier post mentioned, another factor in flicker/artifacts is the conversion that must take place when sending a 720p signal to a 768/788-line native display panel. The TV set has to scale the signal to fill 768/788 lines of display space with 720 lines of video information, which can cause unwanted artifacts and a bit of delay. So the flickering is, more than likely, caused by the processing.

 

CJB
Unregistered guest
Hi, I started reading the above comments and have become really interested in all this HDTV stuff. Unfortunately, I am extremely busy right now and don't have time to read through all of these pages of responses (so forgive me if I ask a question that was answered somewhere above, or questions that happen to be very similar to each other). Here are my questions:

1) What do all the words and abbreviations mean (LCD, DLP, plasma, threads, CRT, LCoS, tubes, and any others that would be good to know but I didn't list)?

2) How do all of these relate to and affect your experience?

3) Using this info, could you please give a basic rundown one last time on 720p, 1080i, and 1080p?

4) Would an HD projector (NOT a rear-projection TV) be as good as a normal HDTV (and which HDTVs -- plasma, DLP, ... -- give the best quality for picture/motion/color)?

5) Many of my friends (and I, if I had more time) are/would be interested in this mainly for video gaming purposes (Xbox 360)... which TVs would be the optimal pick for such an experience?

If anyone could answer these questions with a good deal of confidence/knowledge or professional info, it would be GREATLY appreciated.

thank you
 

Silver Member
Username: John_s

Columbus, Ohio US

Post Number: 619
Registered: Feb-04
CJB, you can start by consulting a good audio video glossary like:

http://www.soundandvisionmag.com/article.asp?section_id=10&article_id=859&page_number=1

Then you could read some articles on HDTV:

http://www.soundandvisionmag.com/article.asp?section_id=10&article_id=854&page_number=1

And then some stuff on surround:

http://www.soundandvisionmag.com/article.asp?section_id=10&article_id=856&page_number=1

Help yourself.
 

INHD1
Unregistered guest
CJB

You may also find this website helpful:

webopedia.com
 

CJB
Unregistered guest
Hey John, thank you so much for the links... they appear to have about everything i need!
 

CJB
Unregistered guest
INHD1 webopedia is great too. thanks for directing me... i know i asked for a lot so this is all very helpful!
 

New member
Username: Booker21

Post Number: 8
Registered: Jan-05
Hello guys
I need some advice. I don't want to restart the 1080 vs. 720 war, but I have the chance to buy a CRT TV which is 1080i, I think.

It's a great deal and it will be my first HDTV.
My only "doubt" is the 1080 scaling: the TV supports 480i/480p/720p/1080i, but it upscales a 720p signal to 1080i, so I don't get real 720p.

My question is: is it still worth it to jump to this TV from my analog 480i TV, even if I can't get real 720p?
My main use will be Xbox 360 games and DVDs.
 

Unregistered guest
I read all this and am still confused. I'm a wedding videographer and I want to get a new camera -- an HD camera. I've been looking at Canon's XLH1, which is 1080i, and JVC's GYHD100U, which is 720p. I can't decide. Can anyone help me decide?
 

Bob Pio
Unregistered guest
I have a Westinghouse 32" LCDHDTV $999 special- Resolution: 1366x768, Vertical scan lines (Native Mode): 768, Max Resolution: 1080i. Connected via DVI cable to a Comcast/Motorola DCT6412 Dual tuner DVR cable box.

My Cable box menu allows for only one output 720p or 1080i. I DVRed the Super Bowl and have been playing it back in the two different modes for comparison. Is this a true comparison or am I watching the same thing? Frankly I can't tell the difference, is this because I have a lower end HDTV? When comparing the two output mode settings while watching live HDTV I think I am seeing production quality differences more than 720p/1080i differences? I am not interested in taping up a piece of paper with every channels brodcast format, and changing the menu setting every time I change the channel. So what should I leave it on.

I will be purchasing a DVD player soon, (using the PS2 as one right now, not good) My TV only has 1 DVI input (redundant?) Should I just swap the DVI cable from the cable box to the DVD when I use it, or should I use componet cables? Is there a quality loss?

Thank you,
Bob P.


 

Anonymous
 
fx.Just shut off your damn computer and get laid!
 

fx
Unregistered guest
Thanks for adding your useful comments to this discussion Anon. It has been 55 days since I've posted in this thread. Do you miss me that much or are you just a fanatic?

xvxvxvx
 

dude101
Unregistered guest
OMG

Most of the posts over here are so stupid!!!!

All HDTVs can only do 1080i or 720p, THEY CANNOT DO BOTH; they rescale the signal if they get the other one.
 

Bob Pio
Unregistered guest
Yes, I understand the display is the same. My question is which scaler is better, the TV's or the STB's. I have been switching back and forth, but without a side-by-side comparison it is hard to tell the difference.

I HAVE A NEW PROBLEM!!!!!

As described in my earlier post, I have been switching my cable box output between 1080i and 720p to see if I notice a difference. I had no problems with this for several days. NOW, when I switch the cable box to output 720p, I usually get only the top half of the picture, zoomed to fill the entire screen. This is an intermittent occurrence. The menu screen does it as well; I can go through a progression like this:

TV on, cable box off, hit menu, and the output menu is displayed just fine.
When I switch the output to 720p, the screen flickers and then displays only the top half of the menu screen, zoomed in to fill the entire screen. When I switch back to 1080i, the screen flickers and the menu displays just fine. When I switch back to 720p, it flickers and then displays the menu just fine. There is no rhyme or reason as to why it works only sometimes; two weeks ago it worked fine all the time, and I changed nothing.

Is this a problem with the STB or my TV? has anyone seen this before?

Thank you,
Bob P.
 

fx
Unregistered guest
Dude101 wrote: All HDTV's can only do 1080i or 720p THEY CANNOT DO BOTH, they rescale the signal if they get the wrong one.

That is correct for all newer HDTVs, but some older (high-end) HD CRT displays would accept and display either input in its original format. Many Loewe models (the Aconda, for one) would do so, and the early, expensive Sony HD CRTs did as well.

xvxvxvx
 

Silver Member
Username: D_singh

Post Number: 105
Registered: Sep-05
Dude101, no. fx is right. I also have an older HD rear-projection that only accepts 1080i signals and 480p, not 720p, I've tried and I've looked. It's a Toshiba Cinema series 65".
 

New member
Username: Boa

Post Number: 1
Registered: Mar-06
"My question is what scaler is better the TV or the STB."

That's been my question too, since I got the new Sony 1080p SXRD (60" -- I know, video in a box :-)) and the Scientific Atlanta 8300HD. Sorry, I know this thread is old, but I just now came across it in a Google search; maybe/hopefully it can be resurrected.

Like someone was saying above, it depends on the quality of the scaler/line-doubler/chip in each device, i.e., YOUR TV and YOUR STB.

After massive research, I finally found that the chip used in the 8300HD is the "Wonder" by ATI (S Korea). As I understand it, Sony, for this SXRD at least, makes its own chip(s).

My final verdict after much trial and error: whether I set the output on the 8300HD to only 1080i, or to 1080i, 720p, AND 480p, I CAN'T tell ANY damn DIFFERENCE!!!

Hope this helps (and any further research/comments on which "should" look better would be much appreciated). ;-)

Boa
 

New member
Username: Aanddg

Post Number: 1
Registered: Mar-06
Help -- I'm on my third TV. I have the Toshiba 30HF85. I notice that when I'm on 1080i, my HDTV channels flicker and shift from light to dark (especially on action movies and sports), and my HD NBC channel is blue. My cable company was here and rewired inside and out. All component cables are in the right places. When I switch to 720p it's fine. On my first set I bought a $45 HDMI-to-DVI cable, and the blue NBC went away and the flickering diminished, but it was still there. At the same time, the picture quality was not as good with the HDMI cable. Is this something where the technology just isn't perfected yet? Please, any advice would help. Thanks :-)
 

New member
Username: Cassjov30

Post Number: 1
Registered: Mar-06
I want to know: is there another way to get my local channels on my HDTV without renting that box the cable company rents out?
 

Bronze Member
Username: Formerly_fx

Dallas, Tx

Post Number: 39
Registered: Mar-06
David, it depends; we need more information. Do you live in the US? Does your TV have a built-in ATSC tuner? If it does, do you have an antenna? If not, are you willing to purchase an OTA receiver and an antenna (and assume the obsolescence factor, as well as the initial cost and maintenance upkeep) rather than rent the cable box from your cable company? Finally, are you sure all your local channels are being broadcast in HD?

Please answer these questions and I can followup with further suggestions.

xvxvxvx
 

New member
Username: Tengen

Post Number: 1
Registered: Apr-06
I have the Samsung SyncMaster 930B computer monitor. I am going to hook up my Xbox 360 to it, and I am wondering: should I input 1080i or 720p? I thought this question might be different considering the resolution of a computer monitor. Any feedback would be appreciated, thank you.
 

New member
Username: Hotstuff619

San diego, California Usa

Post Number: 4
Registered: Mar-06
I have two HDTVs: #1) a Sony 34-inch CRT XBR and #2) a Sharp Aquos 32-inch LCD. Here are my insights on which format is better, 720p or 1080i. Whenever I use my Sharp LCD, my Xbox 360 set to 720p does provide much smoother gameplay compared to being set to 1080i. However, when I view OTA HD content, I personally prefer my Sony XBR over the Sharp LCD; the picture on the Sony CRT does seem a little more detailed in 1080i compared to viewing HD programs in 720p on my Sharp Aquos LCD. I view sports programs in HD often, and I seriously cannot distinguish any advantage that the LCD has over the CRT when it comes to the fast motion that several others have posted about before. So that is basically my two cents: 720p for gaming, 1080i for HD sports/movies.
 

New member
Username: Noidea123

Post Number: 2
Registered: Mar-06
I have a Pioneer Elite 43-inch plasma, and I've been trying to figure out whether to use 720p or 1080i. My provider is Dish Network. So far I've been leaving it on 720p, because in situations where text scrolls across the screen it looks less choppy than with 1080i, and I can't find any other reason supporting either resolution.

My concern now is that when watching, say, a basketball game in HD, players in the background look a little choppy. This occurs in those wide shots that are taken essentially from midcourt and are primarily used throughout the game. Although I can't quite say for sure, it seems to happen with both input resolutions. Is this just a natural occurrence for objects that are out of focus, or is some other problem at hand? Close-ups look perfect in either resolution, but I guess there is rarely fast-moving action during a close-up shot anyway.

any info would be appreciated.
 

Bronze Member
Username: Formerly_fx

Dallas, Tx

Post Number: 68
Registered: Mar-06
Not every camera at a game is an HD camera; you will get some SD shots as well. This sounds like what you are describing on the wide camera angles.

Without seeing exactly what you are describing it is impossible to be sure.

xvxvxvx
 

New member
Username: Jasony

Post Number: 1
Registered: Apr-06
I have a question about my TV and the Xbox 360. It's a 32-inch Philips TV, model number 32pt740h37a (not LCD, DLP, or flat screen). On the back of my TV it says 480p/1080i for the component video inputs. Which one would be better for gaming? I have the HD cables plugged straight into the TV; do I need some sort of HD receiver, or is it OK to run the HD cables from my Xbox 360 to the back of my TV? Thank you for any help.
 


New member
Username: Noidea123

Post Number: 3
Registered: Mar-06
Thanks for the reply. That's an interesting point, but based on the overall quality I would say it is an HD shot. Also, this is the main camera angle, so I would assume it is HD. And if it were SD, it wouldn't be 16:9, right? I guess they could stretch it, but that wouldn't really make sense for the main camera angle.

It's not the quality that is the problem, but the focus/sharpness/choppiness of objects in the background.
 

Bronze Member
Username: Formerly_fx

Dallas, Tx

Post Number: 71
Registered: Mar-06
I cannot understand how the quality can be acceptable while at the same time the picture is out of focus, not sharp, or choppy. I am now 99% sure it is an SD wide-angle shot you are describing. And yes, they will stretch the SD shot; actually, they upconvert it before transmission, but you still have less data from an SD camera than from an HD or film source.
 

New member
Username: Noidea123

Post Number: 4
Registered: Mar-06
Thanks for the reply.

I'm pretty sure it's an HD shot, but I could be wrong. I'm talking about the main camera angle for any basketball game on ESPN, ABC, CBS, etc. Have you known these to be SD?

The picture as a whole is not out of focus or choppy; it's just the objects in the background (players on the far side of the court).
 

Bronze Member
Username: Formerly_fx

Dallas, Tx

Post Number: 72
Registered: Mar-06
Thanks for the clarification on what you meant by acceptable yet out of focus. Yes, I have seen SD cameras used for the full-court shots (in fact, most of them are SD). Keep in mind HD is still in its infancy. This is the first year portable wireless handheld HD cameras have been used by the networks, and they are still heavy, bulky, and expensive. Golf tournaments are a very good example of mixing HD and SD camera shots. Even most NFL games still use some SD cameras; other than the Super Bowl, they all use SD cameras in some instance or another.

xvxvxvx
 

New member
Username: Fearhopelove

Post Number: 1
Registered: Nov-06
I just recently purchased a Sharp Aquos 32" and have finally seen the difference between 1080i and 720p. I basically stole a "high-end" HDMI cable and have it connecting my TV to a lower-end Philips DVD player.
I noticed 720p looked rougher, so I stepped in close to the TV, replayed the scene in 1080i, and could not believe how I had been deceived! 1080i was so clean by comparison.
I thought maybe the moving parts of the shot would benefit more from progressive. Nope, wrong again... 1080i was CLEARLY superior in all respects.

So, yes I have an LCD and 1080i is clearly better than 720p on it.

So don't take anyone's word or mere theoretical explanation of why one is better. Turn on your tv and play around and figure out for yourself which is better on your set up.
 

New member
Username: Anacortes

Post Number: 1
Registered: Nov-06
I just bought a Sony KDL-32S2000 LCD TV. I'm connected to basic cable and have found the hi-def channels. The screen size changes when I'm viewing a 1080i program vs. a 720p one: the 720p fills the whole screen, while the 1080i looks more like 4:3 format even though it says 16:9. I can fill the screen by changing the screen mode (e.g., Zoom Wide), but the picture quality goes down. Any idea why this is?
thanks
 

Silver Member
Username: Formerly_fx

Dallas, Tx

Post Number: 186
Registered: Mar-06
" Any idea why this is?"

Yes: the channel is not transmitting in 1080i at the moment you are looking at it. Not all programming on HD channels is broadcast in HD. If it were, you could not have changed the zoom feature; it would have been fixed.

xvxvxvx
 

Silver Member
Username: Cableguy

Somewhere on... U.S.

Post Number: 867
Registered: Mar-05
In other words, John, the programming you are watching is being broadcast in SD (standard def), hence the pillar bars to the left and right of the screen. This is most common on local network HD channels, especially local news broadcasts and commercials. Since SD is 4:3 and not 16:9, you get the bars; as Scooby indicated, you can zoom to eliminate them, but the originating source isn't always HD. It's best to leave them as is so you're not cropping true 16:9 pictures. Since the bars are embedded at the source, your TV will report it as a 1080i/16:9 picture even though it really is 480i/4:3. Remember, when you zoom or zoom wide, it will distort the 4:3 nature of the picture; combined with the fact that it is an SD picture, it's not going to look nearly as good as a true (non-upconverted) 1080i or 720p HD picture.
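For what it's worth, here is a quick sketch of the pillar-bar geometry (my own arithmetic, assuming a standard 1920x1080 16:9 frame, not anything from the broadcaster):

```python
# A 4:3 picture scaled to the full height of a 16:9 HD frame only fills part
# of the width; the rest is the black pillar bars embedded at the source.
frame_w, frame_h = 1920, 1080          # 16:9 HD frame
content_w = frame_h * 4 // 3           # 4:3 content at full height -> 1440 wide
bar_each_side = (frame_w - content_w) // 2
print(content_w, bar_each_side)        # 1440 pixels of picture, 240-pixel bar per side
```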
 

New member
Username: Username20061110

Post Number: 1
Registered: Nov-06
Here is some insight (and more than you probably wanted to know) about HD & SD sports production - coming from a guy who has been in the television sports production industry for many years before HD....
In most cases, HD sports broadcast productions don't mix HD and SD formats: all cameras shooting the field or court are matched for an HD-only or SD-only production. However, HD productions are often down-converted after (or before) the transmission leaves the truck for broadcast to SD subscribers. HD cameras send the video signal to the truck via glass fiber cables; SD cameras send their signals via copper triax cables. The truck sends the video signal to a nearby satellite uplink truck, or via other landline transmission, to be distributed by the networks and cable providers. More and more sports production trucks are going HD, because transmissions can always be down-converted for SD subscribers and the demand from HD subscribers keeps growing.

Also, in most cases at this time, HD camera operators shoot for a 4:3 audience, because if the HD feed is down-converted the SD audience still sees a properly composed shot (i.e., all 10 players on the basketball court from the 'game camera'), while the HD audience sees a wider composition. SD productions are rarely if ever up-converted to HD (there are occasions where tape from a previous SD broadcast is played back during a live HD broadcast). When HD sports production began, there was an HD truck alongside an SD truck, with one SD camera operator working next to each HD operator (it got a little crowded).

Unfortunately, with satellite up/down-link compression, redistribution, modulation, additional distribution, etc., the final display on HD or SD televisions appears very compressed (and time-delayed) regardless of whether you subscribe to HD or SD content (especially with dish services).

Also unfortunately, too many subscribers (most of the sports bars in the U.S.) view 4:3 NTSC video stretched on a 16:9 display, which just isn't the right way to view a 4:3 sports broadcast!
:o)
-B
 

New member
Username: Scharbag

Post Number: 1
Registered: Nov-06
Here is some reading that might interest you.

http://alvyray.com/DigitalTV/Naming_Proposal.htm

Remember that you MUST compare apples to apples. ALL flat-panel displays are made of a FIXED number of pixels. This cannot be altered, no matter what signal you input to them. If a screen was made to display 1080i NATIVELY, then it will not do a great job with 720p, because the electronics within the TV have to convert the picture from 720p to 1080i. This is much like running your LCD computer monitor at a non-native resolution: it will look fuzzy compared to the native resolution.

The rub is that 1080p will be king (if they figure out how to send it to us without breaking the rules of broadcasting), but right now there is very little 1080p material out there. This is the problem: either I buy a native 720p TV because I like sports and want to eliminate flicker, and then 1080p looks bad because it is downsampled to fit my smaller pixel area, OR I buy a native 1080i TV and watch 1080i (and native 1080p, IF the set is TRUE 1080p) natively and have an upsampled but fuzzy 720p picture. The second choice is better for the future; the first is better for now. BUT, in practice, 1080i is not even broadcast at 1080i: "***1080/30i is defined to be 1080 lines of 1920 pixels each delivered every 1/30th of a second (540 of them at a time), but it is implemented by its practitioners as 1035 lines of 1440 pixels each. Even Joel Brinkley of the New York Times - strongly in the Interlacers camp - uses these numbers. Of course, the format should be called 1035/30i in this case (or, by the renaming proposal, 518/60i)." So if you buy a TV now, hoping to display NATIVE 1080p (1920x1080) in the future, you will not be able to display 1080i (1440x1035 as implemented) or 720p (1280x720) natively.

As to whether 1080i is better than 720p, I would give my vote to 720p. 1080i should really be called 540i, because we only get 540 lines every 1/60th of a second, vs. 720 lines per 1/60th of a second with 720p (rough numbers in the sketch below). Yes, 1080i has more pixels per frame, but 720p eliminates interlace flicker, which can be annoying in high-motion situations.
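Here are those rough numbers as a small Python sketch (the 1440x1035 figure is the one quoted from the article above; everything else is the standard format definition, so treat this as an illustration):

```python
# Lines and pixels delivered per 1/60th of a second for each format.
formats = {
    "720p  (full frame every 1/60 s)":    (1280, 720),
    "1080i (one field every 1/60 s)":     (1920, 540),
    "1080i as implemented (1440x1035)":   (1440, 1035 // 2),  # ~517 lines per field
}
for name, (width, lines) in formats.items():
    print(f"{name}: {lines} lines, {width * lines:,} pixels per 1/60 s")
```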

This whole mess is another example of big business JUMPING into the fray without looking at what is below. Now we will all have to be frustrated with the new BS about Blu-ray and HD DVD!! Bahh, why can it not be simpler!!
 

New member
Username: Lisa1018

Post Number: 1
Registered: Dec-06
Hi all,
I am new to this board, but it looks helpful for the question I have. My son has an Xbox 360, and he just got a Philips Magnavox 15MF605T that is HD-ready. I followed some advice in these threads and have it all hooked up correctly. The Xbox is set to HD and the inputs are in the correct spots; however, when you hit the HD input, it still says 480p, not 720p. My son says that 480p is not HD, so what am I doing wrong? From everything I read, even though this TV is only HD-ready, I don't need a tuner if I don't care about getting stations in HD. My son only wants to play his games in HD. So can anyone help??? Please!!!! Thanks!
 

New member
Username: Fearhopelove

Post Number: 2
Registered: Nov-06
Lisa, that problem is not the TV but merely the Xbox settings.
Tell your son to check the display settings under the "System" tab, where he can increase the resolution to 720p, 1080i, or 1080p.

Tell him not to give up so easily, too. hah.
 

New member
Username: Rattler14

Post Number: 1
Registered: Sep-07
I have a question (like many of you): We just bought two new DVD players, 1) a Sony RDR-VX525, HDMI output 720p/1080i, and 2) a Philips DVP5982, 1080p with HDMI. Our LCD TV is a Vizio 47-inch 1080p liquid crystal display. Which DVD player is better for our TV? I'm not really that interested in why; a lot of this is above my head. I'm just looking to find out how we can get the best picture.
 

New member
Username: Fearhopelove

Post Number: 3
Registered: Nov-06
Tim, since you already bought both of them, why not connect them both and then decide which is better for you?

Granted, the logical choice would seem to be the 1080p-capable player. However, it would be best to try them both and then choose. Besides video resolution, you'd be able to compare the audio, the unit's display, the setup menus, and the feel of the remote to get the one that suits you best.

If you have two HDMI cables, you can set up each player to its best setting and then compare them side by side by switching inputs on the TV.

But if you don't have two and don't feel like picking up an additional one, just grab a couple of movies, try out scenes from each on each DVD player, and then make your choice.

I apologize for any redundancy. hah
 

New member
Username: Luisfc1972

Post Number: 1
Registered: Dec-07
I AM CONFUSED!!!!!!!!!!!!!

I own a Sony Grand Wega KDF-E60A20 and a Toshiba A3 (which I will receive tomorrow), and I have Dish Network high-definition service.

I want to bash my head in over 720p vs. 1080i. kasjdf;alskjadiopjfasdo

Anyway, I've done TONS of research, but I still have no idea what to do. I read about 50 posts in this thread before I decided I should just ask.

I believe my television supports 720p and 1080i.

I mostly want to watch movies. Sure, I will catch an HD broadcast here and there, but I am most interested in watching HD movies.

For the best picture and sound I always thought to just set it to 1080i, but then I keep reading that 720p would be smoother, with a little less detail. I have also read that the naked eye can't tell the difference between 720p and 1080i.

Now I am not so sure. If my main interest is watching DVD movies, should I set my Toshiba A3 HD DVD player to 720p?

Also, I can't find anywhere in my television's menus where I can set 720p or 1080i as the default.

GOD HELP US ALL
 

New member
Username: Luisfc1972

Post Number: 2
Registered: Dec-07
I forgot to mention that my television's native resolution is 1280 x 720 (Cinema Black Pro), if that helps.
 

Gold Member
Username: John_s

Columbus, Ohio US

Post Number: 1972
Registered: Feb-04
"i forgot to mention my television is Native Resolution: 1280 x 720 Cinema Black Pro, if that helps"

It certainly does --- the Toshiba A3 owner's manual states:

"You may find that setting the output resolution of the player to match as closely as possible the native resolution of your display provides the best picture performance (e.g., 720p for 720p and 1080i for 1080i or 1080p)."

That means you should go with 720p for your 720 display. That doesn't mean you would get a bad picture with 1080i. Toshiba is simply saying that 720p is probably best for your 1280 x 720 display. I've got my new A3 set to 720p, but when I get time I will try 1080i to compare.
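If you want the rule of thumb spelled out, here is a tiny sketch (the mapping is my own shorthand based on the advice above, not something from the Toshiba manual):

```python
# Pick a sensible player output mode for a panel's native line count.
def pick_player_output(native_height):
    """Return a player output mode that roughly matches the panel."""
    if native_height >= 1080:
        return "1080i/1080p"
    if native_height >= 720:   # e.g. 1280x720 or 1366x768 panels
        return "720p"
    return "480p"

print(pick_player_output(720))   # -> 720p, as recommended for a 1280x720 display
```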

BTW, are you ready for the firmware update?

http://www.highdefforum.com/showthread.php?t=56607

I don't think you have to do this right away. The player will play movies OK, but I believe this update is for advanced functions available on some discs.
 

New member
Username: Luisfc1972

Post Number: 3
Registered: Dec-07
Thank you. I updated the firmware to 1.3 and everything seems fine; it looks good. My "300" HD DVD didn't want to play at first, but I cleaned it and all is good. That kind of bugged me, though.

Question: with my old Philips 5982 upconverting player I was able to choose an audio setup like Action, Drama, etc. I can't seem to find that option on the Toshiba A3, and I read the manual. Am I missing something, or is there no option for that on the A3?
 

Gold Member
Username: John_s

Columbus, Ohio US

Post Number: 1973
Registered: Feb-04
"my 300 hd dvd didnt want to work at first but i cleaned it and all is good. that kind of bugged me though."

My "300" played without a problem. I rented the "Bourne Ultimatum" HD DVD and no problem there either. What did you clean the disc with?

There are no simulated surround settings on the A3 that I know of. That kind of thing is usually done in the receiver rather than in the player. You should be hearing downmixed Dolby Digital 5.1 from the A3's optical output, unless you are using an HDMI 1.3 receiver.
 

New member
Username: Tech101

Post Number: 1
Registered: Oct-10
Paul, or John, or someone who knows HD -- can you help?
I have a Scientific Atlanta 8300HD box that was hooked up to a flat screen; it never displayed HD, and the picture was perfect. Now the cable company (Cablevision) has switched all channels to display HD 1080i, and the picture is really distorted on the screen. I do NOT want HD. They gave me back an old cable box that does 480, but that displays really blurry. No one can figure out why the box was displaying a normal picture before the channels were switched permanently to HD. Could it have been displaying 720? The cable guy said he couldn't get that option on the box. I can't get the good picture back, and it's driving me crazy and hurts to watch! The 480 is blurry and the HD is too distorted. How could I have had an HD box hooked up to a non-HD flat screen for a year with no problem? What can I try??
 

Platinum Member
Username: Plymouth

Canada

Post Number: 15067
Registered: Jan-08
Welcome to eCoustics, juls!

You can try a good ATSC tuner box with a picture-size option!

When you talk about a distorted picture, is it the size of the picture that is causing your problem?
 

New member
Username: Tech101

Post Number: 2
Registered: Oct-10
Thank you! It's not the size; it's literally just either really blurry or distorted from HD.
 

Platinum Member
Username: Plymouth

Canada

Post Number: 15072
Registered: Jan-08
OK, that is normal because the TV is not HD-ready!

Buy an HD ATSC tuner with an SD output, and you will have a very good picture at 480i.
 

Gold Member
Username: John_s

Columbus, Ohio US

Post Number: 2791
Registered: Feb-04
"OK, that is normal because the TV is not HD-ready!"

Plymouth is most likely right, but that can be easily confirmed if we know the model number of the TV in question.

"How could I have had an HD box hooked up to a non-HD flat screen for a year with no problem? What can I try??"

Change the video hookup from what it is now to something else. You'll have to be more specific about how the cable guy hooked up their latest box to your old TV. You say you have a non-HD flat screen??
 
