Amp Sensitivity

 

Bronze Member
Username: Kdryden

Post Number: 96
Registered: Dec-09
Hey guys, I want to adjust my amp settings to minimize distortion. My amp lets me adjust the sensitivity according to voltage output. Do I adjust the amp's sensitivity according to the rated output power or the birth sheet's peak power?

Thanks in advance
 

Silver Member
Username: Skdooley

Roanoke, VA Usa

Post Number: 233
Registered: Oct-09
I would adjust it to the rated output. For example, if it's a 1000 W RMS amp at 1 ohm but the birth sheet says more, just set it to the 1000 W RMS. That's the optimum way to reduce distortion and the chance of clipping.
 

Bronze Member
Username: Kdryden

Post Number: 97
Registered: Dec-09
Now I've got a procedure to do this, can you confirm it's right? Turn the volume on the head unit up to 3/4 with a 1 kHz sine wave playing; with the speakers disconnected, measure the output voltage from each channel with a multimeter and increase the voltage sensitivity on the amp until it matches the target output voltage.

So if an amp is rated 50 watts @ 4 ohms, then from P = V^2/R, V = sqrt(50 × 4) ≈ 14.14 volts?

So if I'm measuring 12 V, I should turn the sensitivity up until the meter reads 14.14 V, right?
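The target-voltage math above can be sketched out for any rated power and load. This is just a quick illustration of rearranging P = V²/R; the function name `target_voltage` is made up for the example.

```python
import math

def target_voltage(rated_power_w: float, impedance_ohms: float) -> float:
    """Unclipped RMS output voltage for a given rated RMS power and load.

    From P = V^2 / R, it follows that V = sqrt(P * R).
    """
    return math.sqrt(rated_power_w * impedance_ohms)

# 50 W RMS into a 4-ohm speaker:
print(round(target_voltage(50, 4), 2))  # 14.14 V
```

So for 50 W into 4 ohms you'd turn the gain up until the multimeter reads about 14.14 V AC across the channel.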
 

Silver Member
Username: Skdooley

Roanoke, VA Usa

Post Number: 236
Registered: Oct-09
Use a 50 Hz sine wave; you can find test tones online for free. Depending on the voltage output from your HU, choose the high or low input setting on the amp.
 

Bronze Member
Username: Kdryden

Post Number: 98
Registered: Dec-09
I can't use 50 Hz because the settings are for the speakers, not the sub, and they're cut at 80 Hz.