I know simply hooking up a DVI cable would be easier than obtaining a transcoder for component input and using PowerStrip, but are the results the same? What would the difference be?
Another concern- This computer is going to be used mainly for playing games.
Many computer games run full-screen and force the screen resolution to whatever the game is set to display. This can usually be set to standard computer resolutions: 800x600, 1024x768, etc.
Are HDTVs capable of displaying these resolutions with simply a DVI connection? If not, then a question about PowerStrip:
I know it lets you set a custom resolution, but do games override that custom resolution when they are run? Would this change in resolution (assuming it happens) scramble the display, or would the picture just be cut off / bordered by blank space?
Any input is appreciated.
Your best bet is to use DVI. DVI forces the scaling to happen on the source device rather than the display device, so you should be able to display any resolution up to the display's native resolution without having to use PowerStrip or a transcoder. DVI also has lower noise because it's digital. If you used PowerStrip, you would need to create custom settings for all of the standard resolutions.
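For context on what "custom settings" means: a custom resolution in PowerStrip boils down to a set of video timings, i.e. the active pixels plus blanking intervals and the resulting pixel clock, and you'd need one such set per resolution. A rough sketch of the arithmetic (the blanking figures here are illustrative placeholders, not real CVT/GTF values, and this isn't PowerStrip's actual calculation):

```python
# Illustrative sketch of how a custom resolution reduces to timing numbers.
# The blanking values are placeholder assumptions, not real CVT calculations.

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=30):
    """Estimate the pixel clock: total pixels per frame times refresh rate."""
    h_total = h_active + h_blank   # active width plus horizontal blanking
    v_total = v_active + v_blank   # active height plus vertical blanking
    return h_total * v_total * refresh_hz / 1e6

# A 1280x720 @ 60 Hz mode like one you might enter for an HDTV:
print(round(pixel_clock_mhz(1280, 720, 60), 1))  # → 64.8
```

Every full-screen resolution a game might switch to would need its own entry like this, which is why letting DVI and the video card handle scaling is the simpler route.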
Hope that helps.