I'm looking at numerous receivers for home theater, mostly for DVD viewing and occasional music. For similar prices, I can find Pioneer, Kenwood, Sony, etc. at 100 W per channel, whereas higher-end Denon and H/K models at the same price run only 60-70 W per channel. Does that make much difference if I'm only watching movies and listening to CDs, and not trying to wake up the neighborhood? Is it worth buying a Denon with less wattage over a Pioneer with more, if the other features are the same? Also, what exactly does a "high-current" receiver mean? If a Denon is high-current and a Sony is not, is that better? Does it depend on the speakers? Sorry for the long question; any help is appreciated.
Some of the lower-end receivers spec high power, but if you look at the fine print it's rated at 1 kHz only, or from 40 Hz-20 kHz, or at 1% THD. That simply means the receiver doesn't have the damping factor (needed for good bass) or the continuous current (also needed for good bass) to play the full audio spectrum. The power supply could be weak. The amp could be weak near DC (20 Hz is VERY close to DC as far as amps are concerned). A high-current design also lets an amp drive unusual loads consistently: woofers typically have impedance responses between 4 and 22 ohms with poorly designed crossovers, and piezo tweeter impedance easily rises above 20 ohms. A high-current design will drive these loads without rising distortion or clipping. That's when amps start to sound different.
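To see why low-impedance dips demand "high current", you can run the numbers from P = I² × R: delivering the same power into a lower impedance requires proportionally more current from the output stage. A rough back-of-envelope sketch (the helper name and the 100 W figure are illustrative, not from the post):

```python
import math

def rms_current(power_w, impedance_ohms):
    """RMS current an amp must supply to deliver power_w into a load.
    From P = I^2 * R, we get I = sqrt(P / R)."""
    return math.sqrt(power_w / impedance_ohms)

# Same 100 W of output across the 4-22 ohm swing mentioned above:
# the 4-ohm dip demands well over twice the current of the 22-ohm peak.
for ohms in (4, 8, 22):
    print(f"{ohms:>2} ohms: {rms_current(100, ohms):.1f} A")
```

An amp whose power supply can't source that extra current into the 4-ohm dip will clip or distort there, which is one reason amps with identical 1 kHz ratings can sound different on real speakers.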
Some of the more expensive receivers also offer additional features, such as dual-zone output, binding posts, component video switching or wide-band upconversion, bass management, independent record-out control, better DACs (useless in an amp that only goes to 20 kHz), universal remotes, better DSP, 7.1 sound, and more digital ins and outs. If that's what you need, then they are worth the extra money.
Hope this helps.
To determine if an amp will give you enough watts, you need to know your speakers' sensitivities and determine how loud you want your system to play.
If your speakers have a sensitivity rating of 85, for example, that means that feeding one watt into them will produce 85 dB (measured one meter away). If you want to play them at 86 dB, then you need 2 watts. 87 dB requires 4 watts. 88 dB = 8 watts. 89 = 16 watts. 90 = 32 watts. And so on.
If you are using speakers that have high sensitivity, for instance mine are rated around 94, then you do not need very many watts to reach a high volume. I have a Yamaha receiver rated at 70 watts, and it plays louder than I need it to.
Remember, watts aren't everything. Some of the most expensive amps only deliver 4 or 8 watts.
The wattage progression is correct, but the dB progression is wrong: doubling the watts gives +3 dB (2×W = +3 dB). So the correct progression is 1 W = 85 dB, 2 W = 88 dB, 4 W = 91 dB, 8 W = 94 dB, 16 W = 97 dB, 32 W = 100 dB, and so forth. That means speakers that are 6 dB more sensitive need only 1/4 the power to drive.

To simplify: better-quality receiver amps use more stringent testing methods to arrive at their wattage ratings, and cheaper ones don't. Sony can list 500 watts on their little dream system and not be sued for false advertising, because they can use whatever testing method they want to get that rating. They are required to disclose the test conditions, though, so you can find the frequency range, distortion level, and number of channels driven in the manual. Bose is smart and doesn't report their power at all, so they don't have to tell you about the distortion.
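The corrected progression follows from the decibel-power relation: the power needed to hit a target SPL is 10^((target − sensitivity)/10) watts, given sensitivity in dB SPL at 1 W / 1 m. A quick sketch (the function name is just for illustration):

```python
def watts_needed(sensitivity_db, target_db):
    """Watts required to reach target_db, for a speaker rated
    sensitivity_db (dB SPL at 1 W / 1 m). Every +3 dB doubles power:
    P = 10 ** ((target - sensitivity) / 10)."""
    return 10 ** ((target_db - sensitivity_db) / 10)

# Reproduces the corrected progression for an 85 dB speaker:
for db in (85, 88, 91, 94, 97, 100):
    print(f"{db} dB -> {watts_needed(85, db):.0f} W")
# 85 dB -> 1 W ... 100 dB -> 32 W
```

This also shows the sensitivity payoff: a 94 dB speaker like the one mentioned above needs roughly 1/8 the power of an 85 dB speaker to reach the same volume, which is why a modest 70 W receiver can play plenty loud on it.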