Silver Member Username: Alias747MN Post Number: 655 Registered: Apr-05 | Ok, why do they say here that it takes 96 amps of current to power a 600 wrms amp? That seems totally off to me. If you were running at 12v, wouldn't you only need 50 amps? I understand Ohm's law (obviously), but I don't know where they are coming from. http://www.the12volt.com/info/recwirsz.asp |
Gold Member Username: CarguyPost Number: 4281 Registered: Nov-04 | IW, that's because they're assuming the amp is only 50% efficient. By your logic, the amp would have to be 100% efficient. That's not possible, because some power is always lost as heat. |
Silver Member Username: Alias747MN Post Number: 661 Registered: Apr-05 | Thanks Isaac, I didn't think about that right away. My amp is around 80% efficient, so I don't think I'll be pulling that much current anytime soon! |
Platinum Member Username: GlasswolfWisteria, Lane USA Post Number: 10239 Registered: Dec-03 | it's explained in the paragraph at the top. they figure based on class AB amplifiers, which are 50-60% efficient. they go by 50% to be safe. if you're using a class D amp, figure 80% efficiency. you can work backward from that chart: halve the chart's figure to get the 100%-efficiency rating, then divide that by 0.8 (in other words, add 25%) to get your 80%-efficiency figure. |
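To make the math in this thread concrete, here's a minimal sketch of the calculation the chart is presumably doing: current draw = RMS power / (supply voltage × efficiency). This assumes a nominal 12 V supply; the chart may use a slightly different voltage, which would explain 96 A instead of the 100 A this gives for 600 wrms at 50% efficiency.

```python
def current_draw(rms_watts, voltage=12.0, efficiency=0.5):
    """Estimate amplifier supply current in amps.

    efficiency=0.5 is the conservative class-AB assumption
    the chart appears to use; pass 0.8 for a class-D amp.
    """
    return rms_watts / (voltage * efficiency)

# 600 wrms, class AB at 50% efficiency: 100 A (close to the chart's 96 A)
print(current_draw(600))
# same amp if it were class D at 80% efficiency: 62.5 A
print(current_draw(600, efficiency=0.8))
```

This also shows why the original 50 A estimate was off: 600 / 12 = 50 A only covers the power delivered to the speakers, not the extra current the amp wastes as heat.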