Progressive Scan at 60fps


I know that in general progressive scanning doubles the number of horizontal lines drawn on the screen per second. An interlaced signal draws half the lines in 1/60 sec and the other half in the next 1/60 sec, so a full frame takes 1/30 sec. A progressive signal displays the whole picture in 1/60 sec, which means progressive-scan displays run at 60fps. Now, my question is mostly about video games. For anyone who plays video games on an HDTV with progressive scanning, I am curious whether there is a difference between running a game at 30fps and 60fps. Anyone who plays games knows there is a huge difference between seeing a game at 30fps and at 60fps. But what effect does progressive scanning have on it?

All TVs (except a new model, by Panasonic? that does 72Hz for 3:3 pulldown of DVDs, which is a different story) refresh the screen 60 times per second.

Now, an interlaced signal has 60 fields per second (30 full frames), and the TV shows the even lines of frame 1 in step 1/60, the odd lines of frame 1 in step 2/60, then the even lines of frame 2 in step 3/60, and so on and so forth...

A progressive signal has 60 full frames per second (in progressive video, the terms frame and field are equivalent), and a complete frame is shown in each 1/60 of a second.
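The difference between the two scan schedules can be sketched in a few lines. This is a minimal illustration, not a real video pipeline: the 6-line "display" and the step numbering are made-up assumptions, chosen only to show which scanlines reach the screen in each 1/60 s step.

```python
LINES = 6  # hypothetical tiny display, for illustration only

def interlaced_step(step):
    """Interlaced: alternate even-line and odd-line fields each 1/60 s."""
    parity = step % 2  # 0 = even field, 1 = odd field
    return [y for y in range(LINES) if y % 2 == parity]

def progressive_step(step):
    """Progressive: every line of the frame is drawn in each 1/60 s step."""
    return list(range(LINES))

for step in range(4):
    print(f"t={step}/60 s  interlaced={interlaced_step(step)}  "
          f"progressive={progressive_step(step)}")
```

Over any two consecutive steps, interlaced covers all six lines once (a full frame every 1/30 s), while progressive covers all six lines in every single step.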

As for video games, which can stutter and lag, frames will drop out, but the TV will still receive duplicates from the game system. A game system will tend to send out 60 full frames per second, and an interlaced TV will then display the even lines of frame 1 in step 1/60, the odd lines of frame 2 in step 2/60, the even lines of frame 3 in step 3/60, and so on. This is what causes some motion artifacts, like line shimmer when an edge is nearly horizontal.
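That re-interlacing step can be sketched as follows. This is an assumed simplification, not how any particular console or TV is implemented: frames are toy 4-line lists, and each "field" keeps only the lines of one parity, so half of every incoming frame is never displayed, which is where the shimmer on near-horizontal edges comes from.

```python
def interlace_output(frames):
    """For incoming full frame n, keep only the even lines when n is even
    and the odd lines when n is odd; the other half of the frame is
    discarded, never reaching the screen."""
    fields = []
    for n, frame in enumerate(frames):
        parity = n % 2
        fields.append([(y, row) for y, row in enumerate(frame) if y % 2 == parity])
    return fields

# Toy frames: "f0l2" means frame 0, line 2.
frames = [[f"f{n}l{y}" for y in range(4)] for n in range(3)]
fields = interlace_output(frames)
# fields[0] holds the even lines of frame 0, fields[1] the odd lines of
# frame 1, fields[2] the even lines of frame 2, and so on.
```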

So, to answer your question, progressive will look a little better, removing motion artifacts and the like. But as far as stuttering and frames dropping out, that is all down to the programming by the game developer, etc...

The great thing about a true progressive source, like a computer, is that there are NO motion artifacts, because the source has never been interlaced. Computers are capable of rendering hundreds of frames per second (only useful if the refresh rate is set the same). There are 60 unique fields in 60i video, but they aren't full resolution. When upconverted to 60p you get full resolution, but the upconversion algorithm either guesses (cheap TVs), averages the odd and even lines (most TVs), or does motion estimation (Faroudja and ATI video processors). As good as they are, and as close as they get, they cannot get it exactly right.
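The "averages the odd and even lines" approach most TVs use can be sketched like this. It is only a rough illustration of line interpolation, assuming made-up one-number-per-scanline luma samples rather than real image rows, and it is not the algorithm of any specific TV.

```python
def deinterlace_average(field_lines, parity, height):
    """Rebuild a full frame from one field: keep the real captured lines,
    and fill each missing line with the average of its neighbours."""
    frame = [0.0] * height
    for y in range(height):
        if y % 2 == parity:
            frame[y] = field_lines[y]  # real captured line
        else:
            above = field_lines.get(y - 1)
            below = field_lines.get(y + 1)
            neighbours = [v for v in (above, below) if v is not None]
            frame[y] = sum(neighbours) / len(neighbours)  # interpolated guess
    return frame

even_field = {0: 10.0, 2: 20.0, 4: 30.0}  # even lines of one field
print(deinterlace_average(even_field, parity=0, height=5))
# Line 1 becomes (10+20)/2 = 15.0 and line 3 becomes (20+30)/2 = 25.0 --
# plausible values, but only a guess at what the missing lines really held.
```

That last comment is the whole point of the post above: the interpolated lines are estimates, which is why even good deinterlacers "can not get it exactly right."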

Many people can't tell - many people buy a progressive-scan DVD player, connect it to a TV with a built-in line doubler, and can't tell them apart. If you can, watch a computer game running from a native progressive source. The two may look similar, but high-motion scenes may show lumpy, jagged-edge artifacts.



