DVI to HDMI

 

New member
Username: Mcphee

Edina, MN United States

Post Number: 1
Registered: Nov-06
I have a new Samsung LCD HDTV. It has two HDMI inputs. My Comcast box is a Scientific Atlanta 3250HD with a DVI output. My TV does not recognize the HDMI input when connected; the option is "grayed out" in the menu.

I have disconnected and reconnected everything. I have tried as many scenarios as I can think of: rebooting the box while connected to the TV, while disconnected, etc.

I called Samsung; they suggest it is the box or the cable.

I called Comcast; they suggest it is the TV or the cable.

Both were gracious enough to not suggest operator error, although it is certainly a possibility.

Does anyone have any advice or similar experience? Before buying new cables and carting my TV in for service, I would really appreciate some help.

Thanks
 

Silver Member
Username: Tommyv

Rowlett, Texas

Post Number: 123
Registered: Aug-06
I see this over and over. DVI to HDMI from cable boxes often does not work. Your best bet is component video (red, green, blue). I think DVI to HDMI often does not look good anyway. I did the comparison with my Denon DVD player because it has both a DVI and an HDMI output. The picture from the DVI output looks grainy and noisy on my TV; the picture from the HDMI output looks amazing.
 

New member
Username: Potentialjvcowner

Post Number: 10
Registered: Sep-06
Tom, your comment makes no sense. The video portion of HDMI carries EXACTLY the same signals as DVI. Guy, effectively the only difference between DVI and HDMI is the physical connector. The electrical signals are IDENTICAL ... the same, exactly the same; there is no conversion, just a different connector (an HDMI connector, however, can carry the audio signals as well as the video signals, while DVI is video only).

[But a DVI-I output can carry both analog and digital video signals, while HDMI is digital video only. However, this really only applies to computers.]

As to why it doesn't work (connecting a DVI output to an HDMI input using a DVI-to-HDMI cable), there are two possible reasons (both sketched in the example after the list):

1. The signal formats are incompatible. The digital signal from the source (cable box) may be of a resolution or refresh rate that the TV set can't display.

2. HDCP: The source (cable box) may only work with a TV set (or digital VCR) that has HDCP (High-bandwidth Digital Content Protection), and your TV may not have HDCP.
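
Here is a rough, purely illustrative sketch of those two checks. The mode names, the supported-mode list, and the HDCP flags are invented for the example; they are not what any real cable box or TV actually reports.

# Purely illustrative sketch of the two failure modes above. The mode names,
# the "supported" list, and the HDCP flags are made up for the example --
# they are NOT what any real cable box or TV reports.

tv_supported_modes = {"480p60", "720p60", "1080i60"}   # hypothetical display
tv_supports_hdcp = False

stb_output_mode = "1080p60"    # reason 1: a format the display can't show
stb_requires_hdcp = True       # reason 2: the box insists on copy protection

if stb_output_mode not in tv_supported_modes:
    print("No picture: the display does not accept", stb_output_mode)
elif stb_requires_hdcp and not tv_supports_hdcp:
    print("No picture: the HDCP handshake cannot complete")
else:
    print("Picture should appear")

Either branch ends in a blank screen, which is why the symptom alone doesn't tell you which of the two problems you have.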
 

Silver Member
Username: Tommyv

Rowlett, Texas

Post Number: 129
Registered: Aug-06
Barry, I understand that, buddy, and it is exactly what I always thought. It wasn't until I got my TV and first hooked up my Denon player with a DVI-to-HDMI cable I had lying around, and then later with an HDMI-to-HDMI cable, that I saw a difference. It is very apparent; whatever the reason is, I can see it clearly. It could be something with the player or the cable, I don't know, but it is definitely there. I also read a review of that Oppo player with the DVI output where a guy was complaining about noise in the picture over DVI, and he actually preferred the component output. It sounded similar to what I was seeing on mine.

In my experience, Comcast boxes' DVI outputs are incompatible with HDMI displays more often than not. I always suggest people go with component cables instead.
 

New member
Username: Carz10

Post Number: 1
Registered: Nov-06
I just got a Sony LCD KDL40V2500. I am not an expert with this, but I had the same problem with a DVI-to-HDMI cable. I have Time Warner Cable and use a Pioneer HD box. The picture on HD, digital, and analog channels was not as sharp. I went back to the component hookup and the picture was significantly better. I returned the cable.
 

Bronze Member
Username: Malakaii

Post Number: 25
Registered: Oct-06
A DVI input is only good for a computer. If you want to connect your box, you should use HDMI.
 

Bronze Member
Username: Rysa3

Houston, Texas

Post Number: 50
Registered: Nov-06
Hi Barry - actually, DVI and HDMI are not exactly the same as far as bit-rate capabilities go. Obviously the impetus for HDMI has more to do with content protection and handshake establishment, but DVI and HDMI are NOT exactly the same. "Exactly the same as far as the video signal" is totally false.

In any case, the recommendation for component is the correct one, as HDMI is fraught with problems right now and color accuracy issues abound, as has been clearly delineated and explained in writing, including in the January 2006 issue of Widescreen Review magazine.

Most of the movies we watch were recorded using component cables, FYI. Truth is a beatch, ain't it?
 

Bronze Member
Username: Potentialjvcowner

Post Number: 22
Registered: Sep-06
Malakaii, that's not true. DVI and HDMI are electrically the same as far as the video is concerned. Further, since very few devices have both DVI and HDMI, you don't normally have a choice of which one to use, and saying "you should use HDMI" is kind of pointless when one of the devices in question has no HDMI interface. But since they are the same except for the connector, it's easy enough to get a DVI-to-HDMI adapter or cable. That doesn't mean it will work (see below), but if you look around you will find people whose HDMI-to-HDMI connections also don't work, because HDMI vs. DVI isn't the issue.

Marc, sorry, but I have to disagree.

Both DVI and HDMI can carry signals covering a wide range of resolutions and refresh rates (as can, for that matter, analog VGA), but the electrical signals for video are the same. Exactly the same (electrically, not necessarily in terms of data content, which isn't a function of the interface type).

As for HDCP (copy protection), that's a software issue; there are no extra signal lines associated with it. You can do HDCP over a DVI connector, and a few new computer video cards do (but most computer video cards don't yet ... not because their output is DVI, but simply because computers don't yet commonly support HDCP).

The major differences between DVI and HDMI are simply that HDMI also carries audio, and of course the physical connector is different (although I won't argue that FAR more HDMI devices support HDCP than do DVI devices). But as far as the video is concerned, the signals are the same.

If it doesn't work, it's either because of HDCP issues or it's because the resolution and/or refresh rate being supplied by the source are simply not supported by the display.

As to HDCP, the point of HDCP is that a video source (the STB in this case) won't supply a signal unless it and the signal recipient (display or, possibly, digital VCR or disc recorder) establish a secure connection, consistent with the DRM allowed by the signal provider, using HDCP protocols. I'd suspect an HDCP problem if the STB were HDMI (which usually, but not always, supports HDCP) and the display were DVI (which usually doesn't support HDCP ... but it CAN, if it's a late model). But here, the source is DVI and the monitor is HDMI. Thus, an HDCP issue is, if not completely impossible, unlikely.

Consequently, I believe the most likely issue here is that the STB is sending out a signal format (resolution and/or refresh rate) that the display doesn't support. When the source is a computer, you can possibly fix that by changing the resolution and/or refresh rate in Windows display properties. But when the source is a set-top box, you are pretty much stuck if it doesn't work with your display, unless the STB has a setup menu that allows changing the signal format parameters of its digital output.
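
For what it's worth, here is some back-of-the-envelope pixel-clock arithmetic for the common broadcast formats, using the standard CEA-861 total (active plus blanking) timings; the 165 MHz figure is the single-link pixel-clock ceiling shared by DVI and pre-1.3 HDMI. Treat it as a rough sketch, not a spec quote. All of these formats fit comfortably on either interface, which is why a blank screen usually comes down to the display rejecting the mode (or to HDCP), not to the link itself.

# Rough pixel-clock arithmetic for common broadcast formats, using the
# standard CEA-861 total (active + blanking) timings. Both single-link DVI
# and pre-1.3 HDMI top out at a 165 MHz pixel clock, so all of these fit
# on either interface.

SINGLE_LINK_LIMIT_MHZ = 165.0

formats = {
    # name: (total width, total height, frames per second)
    "720p60":  (1650,  750, 60),
    "1080i60": (2200, 1125, 30),   # interlaced: 30 full frames per second
    "1080p60": (2200, 1125, 60),
}

for name, (w, h, rate) in formats.items():
    clock_mhz = w * h * rate / 1e6
    ok = "fits" if clock_mhz <= SINGLE_LINK_LIMIT_MHZ else "exceeds"
    print(f"{name}: ~{clock_mhz:.2f} MHz pixel clock ({ok} the 165 MHz limit)")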
 

Bronze Member
Username: Rysa3

Houston, Texas

Post Number: 51
Registered: Nov-06
Hi Barry. Thank you for your response. Now let me school you a little bit...

DVI is limited to 24-bit color depth. And in fact, the human eye cannot resolve individual colors beyond this. However, with the increased resolution of HDTV or hi-def signals (1920 x 1080), the human eye CAN tell the difference in overall quality of color between 24-bit video data and higher levels.

HDMI 1.3 CAN carry 30-, 36-, and 48-bit color depth, greatly increasing the quality of color that can be displayed. DVI cannot.
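
To put numbers on that (simple arithmetic only; the data rates assume 1080p at 60 Hz and count active pixels, ignoring blanking, so they are approximate):

# Arithmetic behind the bit-depth claim above. Assumes 1080p at 60 Hz and
# counts only active pixels (blanking ignored), so the rates are approximate.

for bits_per_pixel in (24, 30, 36, 48):
    colors = 2 ** bits_per_pixel
    pixel_rate = 1920 * 1080 * 60          # active pixels per second at 1080p60
    gbps = pixel_rate * bits_per_pixel / 1e9
    print(f"{bits_per_pixel}-bit color: {colors:,} colors, "
          f"~{gbps:.1f} Gbit/s of video data at 1080p60")

# 24 bits comes out to 16,777,216 colors and roughly 3.0 Gbit/s;
# 36 bits comes out to about 68.7 billion colors and roughly 4.5 Gbit/s.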

The video characteristics are NOT the same between DVI and HDMI, and HDMI DOES possess a distinct video quality advantage.

At a more basic level, DVI cable is limited to about 15 feet, and you had better get a device to measure what's really being transmitted. So anyone setting up a home theater with a front projector had better have a REALLY small room, or that DVI cable isn't going to reach!!

DVI was a nice interim step as far as video display technology goes, and it will be commonly found for quite a while, but it has its limits and disadvantages for hi-def home theater, and I wouldn't voluntarily sink money into a new purchase involving DVI.

Class dismissed!
 

Silver Member
Username: Tommyv

Rowlett, Texas

Post Number: 136
Registered: Aug-06
All I know is that I have had nothing but problems with DVI. Everything HDMI I have used has worked, and of course component video has too. I always avoid anything DVI whenever possible based on my negative experience with it.
 

Bronze Member
Username: Chico

Post Number: 74
Registered: Sep-05
Are all HDMI cables the same, or are the name brands (Monster) better?
 

Bronze Member
Username: Potentialjvcowner

Post Number: 34
Registered: Sep-06
Neither; they are not all the same, but some of the name-brand cables are ridiculous. Do yourself a favor: the cables at www.monoprice.com are extremely good and extremely inexpensive (get the cheap ones, not the "w/net jacket" ones ... you will be stunned at how good they are).