Is more $$ always better?

 

New member
Username: Intrepidz

Post Number: 2
Registered: Jan-06
I have spent some time reading up on what receiver to replace my fried Yamaha with. One thing I really agree with is what someone said about not hooking his video up to the receiver to use it as a switch. I think you will lose picture quality doing this. It already costs enough to buy high-dollar cables from the DVD player to the TV. Why buy extra ones from the DVD player to the receiver to the TV? Then you have to have your receiver turned on all the time.

I don't have some of the high-dollar speakers some of you have.
I have Yamaha bookshelf speakers. I paid $150.00 for both of them. I also have Yamaha rear speakers, $100.00 for both of them as well, and a Bose center speaker on top of my 65" MIT RPTV. They sound fine for movies and DVDs. I don't listen to cranked-up music anymore; well, sometimes in my truck.

I did go all out back in 1993 and got a complete Yamaha system for $1800.00. It was bad a** at the time, but now it is old and outdated.
$1800.00 back then was a LOT of money, so I really only want to buy what I need. I don't need my receiver to have a lot of video inputs I'm NOT going to use.
I want a kick a** receiver but don't want to spend too much on something I'm not going to hear on my old Yamaha speakers.
I have the funds to pay for high-dollar stuff, but I'm not sure if it is really worth it. Do you guys really think a $1000.00 system is three times better than a $300.00 system?
I'm really asking you for help on this, so please don't take my question the wrong way.
I mainly watch Dish and cable, and sometimes watch DVDs.
What receiver should I be looking at, and why?
 

Gvenk
Unregistered guest
An advantage of the newer HDMI/DVI switching is that you don't have the signal loss that would happen in analog switching (transmission/switching delays are negligible compared to processing delays). The main advantage of routing everything through the receiver is to simplify usage via a single remote. Some receivers are also capable of HD upconversion, which might help on some sources.

Your question about money is an easy one. As with most things, the answer is no, of course. But this doesn't mean you can do better or equal with $200 what you can do wisely with $1000. You can put together a better system as you pay more because of the choices you have, and you are more likely to put together one that you will like than if you were constrained by a smaller budget. If you aren't careful, you can also spend $1000 on something that doesn't sound any better than a good $200 system.

On the other hand, if a Panny XR55 connected to the Bose speakers sounds great to you, why spend any more? It depends on what you are looking for and what you are satisfied with.
 

New member
Username: Intrepidz

Post Number: 3
Registered: Jan-06
My MIT TV has DVI in, but I have not been using it. The only thing I knew of that had DVI out was some of the newer DVD players. So some of the new receivers have a DVI plug on the back? That sounds pretty cool; I like the idea of a receiver being able to help the picture quality. You're also saying that HD upconversion will give me a better picture?
My Yamaha receiver playing DVDs in DTS sounded really great when I first got it. I could not believe how cool movies sounded in DTS. And you are right that being satisfied with less is possible for some of us. I have worked in the refineries for 15 years and my hearing is not as good as when I started.

A buddy of mine built a new house in 1996. He put in a killer sound system with B&W speakers and a Denon receiver. He told me at the time it was around $10K total, including the installation of everything. It sounds really good, but not really great. I never told him, but the room was not set up right.
He had in-ceiling speakers and the room was an open floor plan. I think you can get the picture.
So, I'm thinking just spending a lot of money doesn't mean you end up with the kick a** system you were hoping for.
I'm off to Best Buy and will be looking at the new HDMI/DVI stuff.

Thanks
 

Gvenk
Unregistered guest
ok then go to Best Buy you fool, that's where you belong with your Panny and Bose!
 

Gvenk
Unregistered guest
Unfortunately some people seem to have no problem using other people's handles to post, like the one above. Very sad what this board has become.
 

Gold Member
Username: Artk

Albany, Oregon USA

Post Number: 2610
Registered: Feb-05
Register, Gvenk. It ain't foolproof, but it helps.
 

Gvenk
Unregistered guest
I want to but am afraid of forgetting my password.
 

Silver Member
Username: Nuck

Parkhill, Ontario Canada

Post Number: 998
Registered: Dec-04
Friggin' helpless, these kids.
 

New member
Username: Intrepidz

Post Number: 4
Registered: Jan-06
I was lucky enough to catch a manager, and we talked about a few of the new receivers and some of the older ones on the market. I will share some of the things we talked about with you. You can disagree with any of this and that's OK with me. I'm just sharing the information for those who would like to know.

7.1 = THX (7) speaker sound: 2 front, 2 back, 1 sub, and 2 centers. Only some DVDs have THX sound. How many? I'm not sure. But if you don't have 2 center speakers you are wasting your money.
Video in receivers
I asked why all of these receivers have video in them. He said years ago some of the movie companies were going to add some of the sound in the video signal. This was only done for a very short time on VHS tapes. But none of the receiver companies wanted to be left behind on something new, so they all added video to their receivers at that time.
HDMI/DVI
I asked him whether the newer HDMI receivers really help the picture when using a DVI cable. He said he has asked the manufacturers the same question, and their answer was: we make audio equipment, NOT video equipment. So does it help? Let's say it helps NOT to lose picture quality. Again, this is his opinion!


Older equipment
He told me some of the older receivers are worth having repaired. Most of the newer equipment on the market today is NOT as good as what was made just a few years ago. He said most of the manufacturers are cutting costs at the expense of quality.

I KNOW SOME OF YOU WILL DISAGREE WITH THE ABOVE, and that is fine with me.
I'm just sharing what we talked about.
 

New member
Username: G_venk

Post Number: 2
Registered: Jan-06
The problem with talking to someone anywhere is that he is no better or worse than any other person here. What helps is to think about what they are saying and whether it makes sense.

The reason there is video in receivers these days is that the home theater market is growing, and a large percentage of buyers have a single place to listen to both music and video. They would like a reasonably integrated system with a single remote control rather than buying multiple pieces (which is precisely the reason you buy a receiver in the first place).

THX has nothing to do with how many channels. It has to do with certification of equipment as having certain features and capabilities, as determined by the people who provide the certification.

DVI can only do audio while HDMI can do both audio and video. So you can connect DVI and HDMI ends with special cables, but only video will be propagated and you will need to find a separate audio route (which in some cases makes things difficult, because when you use HDMI on certain components, they insist that the audio come through the same cable).

It's OK to share what you heard, but this person doesn't seem very knowledgeable.
 

the real Gvenk
Unregistered guest
This is despicable!!!
 

New member
Username: G_venk

Post Number: 3
Registered: Jan-06
Indeed, since you can no longer masquerade with my old handle. :-)

The posts speak for themselves.

Go away troll.
 

Gold Member
Username: Frank_abela

Berkshire UK

Post Number: 1132
Registered: Sep-04
G Venk

DVI can only do video...

I believe HDMI, which does both audio and video, has limitations in terms of its audio quality due to the very small size of the plug and the way the wires are laid out in it.

Regards,
Frank.
 

New member
Username: G_venk

Post Number: 4
Registered: Jan-06
Oops, you are right. I meant to say DVI can only do video, as is clear from the sentence following it.

About the HDMI audio limitations, that is a curious urban legend that has persisted in many circles (especially luddite ones). The format of transmission is digital, so any problem with the cable and plug design will result in dysfunctional audio, not degraded audio (i.e., you don't get lower quality but rather drops and breaks). I don't know if this is what you mean by audio quality. If the audio is working, then you are getting the quality that the source had, or it becomes unusable. It isn't an issue in the usual audio terms of less clarity, imaging, noise, etc.

In terms of the theoretical capabilities, the HDMI specs are designed for multi-channel (8), 24-bit, 192 kHz sampling WITHOUT compression, while coax and optical can carry only 16-bit, 48 kHz stereo without compression, or 16-bit, 48 kHz multi-channel with lossy compression (which degrades audio quality). So there is no reason at all why HDMI would result in lower audio quality.
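A quick back-of-the-envelope calculation (my own sketch, using just the raw PCM numbers above; framing and packet overhead are ignored) shows how far apart those two limits are:

# Raw PCM payload arithmetic for the formats mentioned above.
def pcm_bitrate(channels, bits_per_sample, sample_rate_hz):
    # Raw PCM bit rate in bits per second.
    return channels * bits_per_sample * sample_rate_hz

hdmi_audio = pcm_bitrate(8, 24, 192_000)  # HDMI spec maximum
spdif_pcm = pcm_bitrate(2, 16, 48_000)    # coax/optical uncompressed limit

print(hdmi_audio / 1e6)  # ~36.9 Mbit/s
print(spdif_pcm / 1e6)   # ~1.5 Mbit/s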

Either the audio works at the quality provided by the source or it doesn't work at all (typically due to problems with very long cable runs).

If you know something I don't I will be happy to hear it.
 

Silver Member
Username: Gman

Mt. Pleasant, SC

Post Number: 760
Registered: Dec-03
From the start, HDMI was defined to carry 8 channels of 192 kHz, 24-bit uncompressed audio, which exceeds all current consumer media formats. From the beginning it was able to carry DVD-A, but Sony didn't allow it to carry SACD. HDMI can carry any flavor of compressed audio format such as Dolby or DTS. (Such compressed formats are the only multi-channel or high-resolution audio formats that can be carried across the older S/PDIF or AES/EBU interfaces.) The fact that the vast majority of HDMI products shipped are two-channel TVs that don't support more than two-channel audio doesn't make this any less the case. Most existing HDMI sources can output any compressed stream, and the newer sources can output uncompressed 6-channel, 96 kHz audio from a DVD-Audio disc. There are several A/V receivers on the market that can accept and process the 6- or 8-channel audio from HDMI, and more are expected to be available shortly.

While HDMI is backward compatible, if you have a component with only HDMI 1.0 and others at 1.1 or higher, it will only perform as well as the weakest link. HDMI 1.0 won't pass SACD. HDMI 1.2 will pass SACD. And if you hook up HDMI to DVI you will be limited to video only.

Many manufacturers have made receivers that only processed two channels from the HDMI 1.0 and 1.1 spec, even though it was capable of multi-channel delivery from the beginning. More receivers are entering the fray with the ability to pass and process HDMI 1.1 and later with up to 5-8 channels.

HDMI has about 5 Gbit/s of bandwidth, of which only around half is needed to pass a 1080p or lower HDTV signal. That leaves it with plenty of room to pass up to 8 channels of audio.
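For a rough sanity check of those numbers (my own arithmetic, counting active pixel data only and ignoring blanking intervals and TMDS coding overhead, so the exact fraction depends on what you count):

# Back-of-the-envelope HDMI link budget.
link_capacity = 4.95e9                 # HDMI 1.x TMDS capacity, bits/s
video_1080p60 = 1920 * 1080 * 60 * 24  # 1080p, 60 fps, 24-bit color
audio_max = 8 * 24 * 192_000           # 8 ch, 24-bit, 192 kHz PCM

print(video_1080p60 / 1e9)            # ~2.99 Gbit/s of video
print(audio_max / 1e6)                # ~36.9 Mbit/s of audio
print(video_1080p60 / link_capacity)  # video uses roughly 60% of the link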

My current preference is using i-Link (FireWire) for all audio from a universal DVD player and HDMI for video. When I have enough equipment with the latest iterations of HDMI, designed to pass multi-channel HDMI, then I might feel comfortable switching to HDMI audio too. I also use the i-Link on the receiver for hooking up D-VHS.

When purchasing HDMI equipment, you must either read the manual or get good information on whether the manufacturer is passing the multi-channel specification. Many early providers didn't pass all channels.
 

Gold Member
Username: Frank_abela

Berkshire UK

Post Number: 1136
Registered: Sep-04
Thanks for the responses. My understanding is that the proximity of the cabling in the plug is that much closer on the audio side, such that crosstalk (digital crosstalk) can occur, giving a perception of lower quality. Yes, the cable can accept 24-bit/192kHz, but it can be corrupted in transit without the receiving DAC knowing about it.

Regards,
Frank.
 

New member
Username: G_venk

Post Number: 5
Registered: Jan-06
That is indeed more of an issue with the plug manufacture than the standard, and if that interference occurs, it wouldn't be a subtle effect. As in most digital formats, the encoding uses check bits that make larger portions of the stream invalid if there is local corruption that cannot be corrected with redundancy bits. The probability of such interference creating a valid but different stream (and so unknown to the DAC) is very small. So the DAC will know about it. However, what DACs do with that information is then up to the design of that DAC and the surrounding logic. The most likely effect will be more like "hiccups" in the audio rather than any degradation of quality as might happen with lossy compression or a smaller sampling rate.
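To put a number on "very small" (my own illustration, assuming check values behave as if uniformly distributed, which is roughly true for a good CRC): random corruption slips past an n-bit check with probability about 2^-n.

# Chance that random corruption still passes an n-bit check.
for n_bits in (8, 16, 32):
    p_undetected = 2.0 ** -n_bits
    print(n_bits, "bits: ~1 in", int(1 / p_undetected))
# 8 bits: 1 in 256; 16 bits: 1 in 65,536; 32 bits: 1 in 4,294,967,296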

Most people who have digital TVs would have witnessed the above at some point in their TV viewing.

The analogy with video is that rather than seeing grainy, blurred video or lack of detail or effects on color, corruption in digital video streams will manifest itself in jitters or pixelation effects on the screen.

Some component circuitry may try to circumvent such corruption in the stream with extrapolation, etc., leading to a perceived "loss of quality" rather than hiccups, but that wouldn't be universal. A good HDMI cable works or it doesn't.

Of course, that wouldn't prevent a whole industry from cropping up selling HDMI cables with oxygen-free copper and/or "properly polarized ions" that make the sound more open, image better, feel more live, etc. :-)
 

Silver Member
Username: Chitown

Post Number: 653
Registered: Apr-05
Correct. (I wrote this before, but the system had an SQL malfunction and wouldn't post.)

That's hard to do in a digital transmission, Frank. Usually any digital transmission has a checksum routine that is embedded into the signal. The transmitter encodes it and the receiver decodes it. If the receiver's checksum fails because the packet is corrupted, then the whole packet is rejected. So yes, the packets can get corrupted, although over such short distances that is hard to do, but the receiver does know about it, and it won't result in lower quality video. It would be much harsher, such as blackouts on the screen like you might see on a digital cable channel.
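A minimal sketch of that encode/verify flow (my own toy example; real links use hardware CRCs rather than this additive checksum):

# Toy packet checksum: the sender appends a checksum, the receiver
# verifies it and rejects the whole packet on a mismatch; there is
# no "slightly degraded" output in between.
def checksum(payload):
    return sum(payload) % 256

def send(payload):
    return payload + bytes([checksum(payload)])

def receive(packet):
    payload, check = packet[:-1], packet[-1]
    if checksum(payload) != check:
        return None            # corrupted: drop the packet entirely
    return payload

packet = bytearray(send(b"video frame data"))
packet[3] ^= 0x40              # simulate interference flipping one bit
print(receive(bytes(packet)))  # None: rejected, not "grainier"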



 

Gold Member
Username: Frank_abela

Berkshire UK

Post Number: 1138
Registered: Sep-04
Hmmm, but that doesn't explain why, for example, one HDMI cable can come up with a grainy over-saturated picture and another cable will come up with a more neutral (natural) picture. I had exactly this effect just recently running between a Naim DVD5 and a Pioneer PDP436XDE. A relatively inexpensive but well made HDMI cable gave a relatively fuzzy, over-bright, over-saturated picture which also suffered from motion artefacts whereas another, more expensive cable gave a great picture. And yet it's all digital...

regards,
Frank.
 

New member
Username: G_venk

Post Number: 7
Registered: Jan-06
Two possible reasons. One possibility is that it wasn't done with a double-blind test (the blindness refers to not knowing about the cable, not to watching the screen, of course). :-)

The other possibility is that the video circuitry is compensating for errors in the stream from a bad cable via extrapolation and other smoothing techniques. The result would be the same as lossy compression or undersampling. Such errors are more likely to arise from bad soldering/crimping/connectivity than from EM interference between the inner cables.

The advantage of digital is that unlike analog cables, where degradation can be gradual over a wide spectrum and any loss in the cable is a loss of information, a wide range of digital cables of differing quality can all give the exact same picture/audio because there is no bit loss. Detecting 0s and 1s has a lot more tolerance for degradation of electrical signals than detecting a practically unlimited number of analog levels, and moreover, redundancy bits can recreate lost information in some encodings.
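A toy illustration of that tolerance (my own sketch, with made-up voltage numbers): the same noise that visibly shifts an analog level leaves a thresholded digital bit untouched.

import random
random.seed(1)

def noisy(v, sigma=0.05):
    return v + random.gauss(0, sigma)

# Analog: the received level IS the information, so noise = distortion.
print(noisy(0.500))         # e.g. ~0.507 -- the information is altered

# Digital: anything above 0.5 decodes as 1, anything below as 0.
bits_in = [1, 0, 1, 1, 0]
levels = [noisy(1.0 if b else 0.0) for b in bits_in]
bits_out = [1 if v > 0.5 else 0 for v in levels]
print(bits_out == bits_in)  # True -- the noise is absorbed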

So it is necessary to have a minimum level of quality in HDMI cables which differ a lot in the way the plugs are attached to the wires. But once above a certain level, improving the cable with voodoo magic provides no further improvement (and hence no additional business for videophile magic enhanced cables). It just means that one should avoid no-name, generic branded HDMI cables (just as one would avoid such cables for ethernet, printer cables, etc.)

Going back to your original thought: first, the audio/video circuitry WILL know about any errors induced by the cable, so anything you notice/hear will be due to conscious design decisions about handling errors. Second, the design of the HDMI plug does not HAVE to result in losses because of wire density/routing if it is designed and fabricated properly. The computer/telecom industry uses far denser packing of cables to carry data without problems. The HDMI plug isn't all that different in wire density from a typical RJ-45 ethernet plug.
 

Silver Member
Username: Eramsey

South carolina United States

Post Number: 406
Registered: Feb-05
I agree with Stof: a digital transmission is based on the "all or nothing" principle. Either you get a good signal or you get an unusable one. Nowhere is this more evident than with the purchase of an HDTV with a built-in tuner, which I helped a friend of mine set up recently. He had bought an off-air antenna, and after moving and adjusting it for some time we were able to tune in the local HD stations. We found, basically, what I already knew: we got either a good signal with a great picture or nothing at all. There is no "ghosting" as with analog reception.

When a disc is read, be it DVD or CD, it is scanned in its entirety and stored in memory in the source component. During playback the player compares the playback signal to the scanned signal stored in memory; hence this is called a "signal comparator," which most people refer to as a "bit checksum." This is also how a source component copes with errors on the disc, which are responsible for corrupted data, not cables, which are typically (and incorrectly) blamed for signal loss. The possibility of "crosstalk" between two different types of signals at widely different bandwidths within a cable is nil. Several test discs, both audio and video, containing errors are available to check the quality of player error correction. This is exactly how they test for this in audio and video magazines.

The plug for HDMI is standardized, and a typical HDMI cable will have one twisted pair of conductors for audio and one for video, with at least one copper braided shield, often two or more. In a more expensive cable there will be two twisted pairs, two for audio and two for video. Perhaps this is where Frank is noticing a difference.

My mother bought an HDTV about three years ago and a matching progressive-scan DVD player of the same brand. When I connected them for her, I first used the in-box unshielded component cables and the picture was poor. The color had a noticeable "bleed" and the picture was dark and somewhat subdued. I then disconnected the in-box cables and bought a cheap ($20) but adequately shielded set, and there was a noticeable change in the picture. Colors were more natural with no bleed, and the brightness was restored. This lack of shielding is where most issues of poor picture quality with video cables come from.
 

Gold Member
Username: Frank_abela

Berkshire UK

Post Number: 1147
Registered: Sep-04
A cursory look over the cheaper of the two cables indicated that the cable was well made. This is a 'value'-oriented brand, but generally speaking their products are better quality than expected for the price. It is interesting, I think, that you can get such variation.

Regards,
Frank.
 

Gold Member
Username: Nuck

Parkhill, Ontario Canada

Post Number: 1033
Registered: Dec-04
However, if transferring pure digital signals, and sending the expected signal means nothing to the receiving unit, I highly doubt that the stuff would ever work as an all-or-nothing.
The receiver plays only what it is told, and I doubt the purity of transmission is ever so complete, checksums included, that the receiving component would not play the information it received.
My printer quit printing red. I changed the cable. Now it is fine.

The receiving component must have some flexibility in delivering as compared to the input, like 1-bit MASH players that 'fill in the holes'?
 

New member
Username: G_venk

Post Number: 9
Registered: Jan-06
You are correct that most receiving components are designed to "interpolate"/"extrapolate" or otherwise "fill in the holes"... to a limit. They have to account for the possibility that the source itself is faulty. Typically they give up after a certain level.

Playing a bad DVD can show all of these effects of errors in digital data, terminating in "Bad Media" errors at the extreme. The same thing happens with bad transmissions.

It is not easy to judge the quality of a cable for digital transmission by visual inspection (except when there are visible breaks) as anyone who has worked with ethernet cables will tell you.

In digital encoding, the transmission does not have to be pure (it seldom is) to transfer information without loss. Even without checksums, there is resilience. As a simple (trivialized) example, consider an analog signal where the amplitude (or frequency, depending on the encoding) of the waveform corresponds to the information. Any variation from the input value is a "distortion" that is taken as valid, unless some redundant transmission is used to detect/correct such transmission losses.

Binary encodings, on the other hand, assign a large spectrum of such amplitude (or frequency) values to 0 or 1, and small transmission losses/failures around the input value will keep the received value within the allowed spectrum, so it is still received correctly as 0 or 1. Only when there is a very large disruption will the received value toggle between 0 and 1, which is typically detected via checksums such as CRCs and sometimes corrected with redundant coding. If analog cables experienced such disruptions, they would sound terrible.
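A minimal sketch of redundancy recreating lost information (my own toy example; real links use far more efficient error-correcting codes than this 3x repetition):

# 3x repetition code: each bit is sent three times; the receiver takes
# a majority vote, so any single flipped copy is corrected silently.
def encode(bits):
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(stream):
    return [1 if sum(stream[i:i + 3]) >= 2 else 0
            for i in range(0, len(stream), 3)]

sent = encode([1, 0, 1, 1])
sent[4] ^= 1         # interference flips one transmitted bit
print(decode(sent))  # [1, 0, 1, 1] -- recovered exactly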

Because of this resilience, digital transmissions can pack more bandwidth and tolerate more crosstalk (and hence higher wire density) etc.
 

Silver Member
Username: Chitown

Post Number: 665
Registered: Apr-05
Nuck, your printer problem actually proves the point here. Any problem in the cable will cause a substantial (very noticeable) A/V disturbance (such as losing the color red), not a vague loss of picture resolution.



 

Silver Member
Username: Eramsey

South carolina United States

Post Number: 409
Registered: Feb-05
The only thing important in a cable is the integrity of the conductor. Except for optical digital and various other light-based transmission mediums, most cables have a copper or other metal conductor. The signal through this type of cable, one with a conductive metal, will be a voltage whether the signal is analog or digital; the only way for the cable to disrupt the signal is if the conductor has a break. This break in the wire of course has to be substantial enough to cause a rise in impedance, which will affect the voltage of the signal and also cause a rise of current through the wire. As long as the wire retains 90% of its diameter, no changes from an electrical standpoint should occur with respect to the signal.

I think in Nuck's case the cable has many small-diameter conductors, because this is a multi-pin connector, 24 pins I think. I would say for sure that the wire the signal for "red" is sent through is likely broken. Provided the cable is intact and is not of a poor/incorrect design, these will be the only ways a conductive cable will cause an error in the signal. Errors in a signal are otherwise normally caused by the sending and/or receiving equipment or the software itself.
 

New member
Username: G_venk

Post Number: 10
Registered: Jan-06
Eric, there is no separate wire for "red" in printer cables (unlike VGA cables to a monitor). What is sent to the printer is a sequence of printing commands (for example in PS or PCL) which might include vector drawing instructions (in which case the output is likely to be distorted rather than missing a color) or bitmaps, in which case cable-induced problems can lop off high- or low-level bits (or the error-correction attempts at the printer can result in such), which might leave a lack of information in one or more colors. This is why, with a bad cable to the printer, one program may print perfectly while another shows problems.
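As a toy illustration of "lopping off" a high bit (my own made-up example, not how any particular printer driver works): in an 8-bit-per-channel bitmap, losing bit 7 of the red byte halves the red intensity.

# One pixel as 8-bit R, G, B values. Clearing the high bit of the red
# byte (as a flaky data line might) turns full red into half red.
r, g, b = 0xFF, 0x40, 0x10
r_corrupted = r & 0x7F       # bit 7 lost in transit
print(r, "->", r_corrupted)  # 255 -> 127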

Unfortunately, cables are the source of 90% of the problems in digital transmissions. Note that a cable is not just a conducting wire; it contains soldered/crimped connectors, and the connections to the corresponding receptacles in the source or destination, all of which are less than 100% reliable. In addition, the cables themselves are subject to EM interference and crosstalk (unless shielded properly), which affects the transmission via induced currents, etc.

I think we are all saying the same things to the original question posed though. :-)
 

Silver Member
Username: Eramsey

South carolina United States

Post Number: 410
Registered: Feb-05
G Venk, I'm well aware of problems arising in a cable that lacks RFI/EMI shielding; did I not discuss this on this thread previously? Don't take this as an adverse tone, but I really don't know if we are saying the same thing/on the same page. Correct, the entire cable is not conductive, just the wire inside it. A cable that is not optical/light-based will have a conductor inside of it. I don't know of any other way to send a voltage signal through a cable; you've lost me there. If it doesn't have a conductive metal, what the hell else could it be?!

I agree problems are more commonplace with cables of multi-stranded wire, but this is much more problematic with computer systems than audio equipment, which primarily uses cables with single conductors. Also, since a computer printer cable is 24-pin as I suggested, how do you know there is not a specific pinout on the transmission IC from the CPU for the operation of "red" to the printer? Do you have factual info that would negate this? Please don't fear being too technical, as I'm an industrial electrician who writes PLC programs.
 

Bronze Member
Username: G_venk

Post Number: 12
Registered: Jan-06
Eric, this is going OT. There isn't a printing IC in the computer (you have the wrong picture of how this works).

Take a look through http://www.interfacebus.com/Design_Connector_1284.html
It has links to the pinouts of the common type of connectors.

Computers talk a language, most commonly PostScript or PCL, to a printer via the printer cable. This language is encoded in a sequence of bytes and sent in parallel or serially depending on the printer cable. These bytes are then interpreted by the print engine inside the printer. Typically only 8 of the pins are used for the data in these byte transfers, which encode the printer commands sent by the computer.

We agree that digital transmissions are more "all or nothing" than analog transmissions, whose output degrades gradually with the level of distortion.

I was just pointing out that a cable consists of a number of components: the connecting wire, the shielding, the end connectors attached to the wire, and the mechanism that mates those connectors to the corresponding receptacles on the components. All of these, not just a "break" in the conductor, are prone to introducing noise into the electrical signals being transmitted, due to QC problems in fabrication, interference, etc.

But I think we both agree: digital transmission is more resilient to such noise than analog transmission, so digital transmission tends to work correctly even with some noise in the cable. But once the quality of transmission falls below a certain threshold, the error-correcting and smoothing/interpolation capabilities of the DAC are exceeded and you get large visible/audible results.

Frank could be right in noticing some degradation of quality if the errors in digital cables are within the interpolation capabilities of the DAC, but this is not because the DAC doesn't know about the errors; rather, it is trying to compensate for detected errors. But good quality cables that aren't "broken" don't exhibit these kinds of errors, so there is no advantage in getting "better" cables above a certain level of quality.
 

Silver Member
Username: Eramsey

South carolina United States

Post Number: 412
Registered: Feb-05
G Venk: Actually, an analog signal contains higher levels of noise than a digital signal, because the signal does not correct itself and analog is linear, "constantly changing," whereas digital is discrete, only one set voltage, either on or off. With analog, signal timing requirements are a bit more liberal than they are with digital, but digital can maintain much tighter control over signal purity.

Unbalanced connectors (RCA) can and do, in rare circumstances, have noise problems, i.e. ground loops. This amount of noise is the subject of considerable debate among the audio and scientific communities. Other than in the case of a ground loop, the amount of noise possible is far, far below the audible threshold. Balanced audio connectors, aka XLR, offer an advantage over RCA due to the fact that they actually have a true ground, with a separate pin dedicated to it. In an RCA cable the shield serves as the ground, as it is connected at one end of the cable; this is, in practice, an incomplete ground, as it only grounds one piece of equipment, not both, and this is where problems can arise.

The IEEE 1284 connector you referred me to (my apology for not knowing its type) is a 25-pin connector, so I was close. The shell serves as a ground, and at least one pin, probably two, serve as grounds also. You say that only 8 pins are used for data transmission; worst case, this leaves 15 pins unaccounted for. These other pins would have to be used for transmission of the signal and/or communication; this is too many to be "inert." I'm aware of serial and parallel transmission, as these are the two types of data transmission.

You are of course correct that in both analog and digital signals there is a marginal percentage of error, and as long as the signal stays within this limit there should not be a problem, i.e. audible distortion; no signal is or has to be 100% perfect. Even a well-trained ear can't hear less than about three percent distortion from an amplifier. No cable, provided it is not damaged, will cause that much distortion. Most digital transmissions, at least those made by home audio equipment, are probably over 98% uncorrupted; in the real world, this is close enough to be considered perfection.

I do agree with you completely about cables, as I buy sensibly designed, reasonably priced cables. A cable that costs $50 will seldom offer less performance, particularly a noticeable difference, than the same type of cable from another brand costing two or even three times as much; I know this from experience. This rare and fleeting difference rarely justifies such an increase in spending. In the case of audio components and speakers, more $ almost always is better, as this equates to better build quality, which is certainly not a noticeable difference for the average system, and rarely even for high-end systems; more money in a system budget is better spent on better quality speakers or equipment. I have nothing against expensive cables; people should do what they want with their money. Just beware of bogus claims based on pseudo-science and things contrary to established principles of electricity and physics. Don't expect miracles either, but it's all right; the desire for audio splendor is a noble pursuit.

We also haven't discussed jitter, which is sometimes audible, although rarely, and is definitely caused by electronics, not cables. Sorry to be long-winded, but if you cared enough to read it and contemplate my viewpoint, I thank you.
 

Silver Member
Username: Eramsey

South carolina United States

Post Number: 413
Registered: Feb-05
Correction: better build quality is more noticeable for any system and is a tangible difference worth paying more $ for.
 

Bronze Member
Username: G_venk

Post Number: 13
Registered: Jan-06
Eric, like I said, we have been saying the same things about digital transmissions in different ways, along with Stof, so there is no need to prolong this. :-)

You will need to educate yourself a bit about printer cabling in a different forum. It is not of much relevance here in this thread.
 

Silver Member
Username: Eramsey

South carolina United States

Post Number: 414
Registered: Feb-05
I'm sorry if I seemed to prolong this, but it was not really an argument, just an interesting discussion, at least to me, and this is an open forum. Thanks for the suggestion, but I'm rather content with my level of computer knowledge. In my line of work, sure, a person has to be computer literate, but a computer is a mere tool for record keeping (maintenance schedules) and for writing PLC programs using special software, as I said before. Other than that, I probably spend more time using an O-scope and a multimeter than a PC. I can run just about any Microsoft program without difficulty, so this is satisfactory for me. Your level of computer knowledge seems a cut above mine. I wonder, are you a programmer, or do you work in the industry? My only point was that while computer cables and wired networks often have problems, the likelihood of such with home audio is much, much less, whether the signal is analog or digital. No matter what programmers, software and hardware engineers call the communication language between a computer and its connected equipment, it is still binary in its most simplistic form. Best of luck.