Bronze MemberUsername: Kano
Post Number: 59
Most good receivers are rated using test tones covering the full range from 20 Hz to 20,000 Hz. As the volume is turned up, the level of distortion is measured. Typically, when distortion reaches about 0.05% to 0.07%, they stop and record the power output at that point, and the volume control's 0 dB reference is set there.
Some manufacturers test with all channels driven at once; others test one channel at a time, which gives an inflated, inaccurate power rating.
Some manufacturers rate power at a higher distortion threshold. Sony, for example, rates its standard receivers at 0.6% distortion (almost ten times more distortion) and its ES line at 0.15%.
1) Why isn't there an industry standard for power ratings, so customers aren't blindsided by an all-in-one Onkyo system that claims 1000 W total power but is measured one channel at a time with a 1 kHz test tone?
2) In cases where a higher distortion threshold is allowed when rating the power, don't those receivers experience clipping as they near 0 dB, possibly damaging the speakers?
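To make the clipping-vs-THD connection concrete, here's a rough Python sketch (NumPy assumed) of the idea behind these measurements: feed a 1 kHz test tone through a stage, take the FFT, and compare the energy in the harmonics to the fundamental. The hard-clip and the specific numbers are my own illustration, not any manufacturer's actual bench procedure, but it shows why an amp pushed past its rails produces a sharp jump in THD.

```python
import numpy as np

def thd_percent(signal, fs, f0, n_harmonics=10):
    """Estimate total harmonic distortion (%) of a tone at f0 Hz via FFT."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    def peak(f):
        # take the strongest bin near f to tolerate windowing spread
        idx = np.argmin(np.abs(freqs - f))
        return spectrum[max(idx - 2, 0):idx + 3].max()
    fundamental = peak(f0)
    harmonics = np.sqrt(sum(peak(f0 * k) ** 2
                            for k in range(2, n_harmonics + 1)))
    return 100 * harmonics / fundamental

fs, f0, n = 48000, 1000, 48000          # 1 second of a 1 kHz tone
t = np.arange(n) / fs
clean = np.sin(2 * np.pi * f0 * t)       # amp within its limits
clipped = np.clip(1.2 * clean, -1, 1)    # driven 20% past the rails

print(f"clean tone THD:   {thd_percent(clean, fs, f0):.3f}%")
print(f"clipped tone THD: {thd_percent(clipped, fs, f0):.3f}%")
```

The clean tone measures essentially 0% THD, while the mildly clipped one jumps to several percent, far beyond the 0.05% to 0.07% thresholds above. That's the flattened waveform that's hard on tweeters.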
Silver MemberUsername: Elitefan1
Post Number: 796