Bob Masta
Guest
On Mon, 14 Apr 2008 22:10:49 -0700 (PDT), Delsol
<Aaronofkent@gmail.com> wrote:
The old "impedance matching" thing almost certainly doesn't apply
here, because the amp specs are not based on output impedance.
(Unless this amp has an output transformer... which pretty much went
out with vacuum tubes.) Modern solid state amps have output
impedances that are near zero at low frequencies.
The "8-16 ohms" rating is more likely based on power handling
capability of the output stage. If you use a lower impedance load, it
takes more current at lower voltage to get the same power, which
means the output stage has to dissipate more power. Like a pair of
voltage regulators with changing setpoints, the positive and negative
output devices have to pass the output current while dropping the
voltage difference between the supply voltage and the output voltage.
Let's take a simple example: Say the supply is 40 VDC and at some
instant the positive output device of the amp has to deliver 50W into
its load. (No RMS here, just instantaneous power.) So with an 8 ohm
load the output is 20V to give 20^2/8 = 50W, and the device itself has
40-20 = 20V across it, while the current through it is the same as the
current through the load = 20/8 = 2.5A. So the device has to
dissipate 2.5 * 20 = 50W.
Now consider a 4 ohm load. First of all, you need to turn the volume
control down to get the same 50W into the load, since now the output
voltage must be 14.14 since 14.14^2/4 = 50W. So the device now has
40-14.14 = 25.86V across it. But its current is 14.14/4 = 3.535 and
the power it dissipates is 25.86*3.535 = 91.4W.
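If you want to replay the arithmetic above, here's a quick Python
sketch. The function name and structure are my own; the numbers are
straight from the two cases worked out in the post.

```python
# Instantaneous dissipation in one output device while it delivers
# p_load watts into r_load from a v_supply rail (per the example above).
def device_dissipation(v_supply, p_load, r_load):
    v_out = (p_load * r_load) ** 0.5   # output voltage needed for p_load
    i_out = v_out / r_load             # load current = device current
    v_drop = v_supply - v_out          # voltage dropped across the device
    return v_drop * i_out

print(device_dissipation(40, 50, 8))   # 8 ohm case: 50.0 W
print(device_dissipation(40, 50, 4))   # 4 ohm case: about 91.4 W
```

Same 50W into the load either way, but the device roasts at nearly
twice the dissipation with the 4 ohm load.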
So if the power rating of the amp is based on the power that the
output devices can handle, you'd need to derate it significantly to
use a 4 ohm load. Some amps list separate power ratings for 4 ohms
(and sometimes 2 ohms). If the 4 ohm rating is less than the 8 ohm
rating, the output devices are the limiting factor. If the amp is more
sturdily built (a pro amp, or better grade home amp) the 4 ohm rating
will be higher, indicating that the limit is more likely just the
supply voltage. (+/-40V gives 28VRMS, which into 8 ohms gives 100W,
or into 4 ohms gives 200W.)
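The supply-limited figures in that parenthetical are just V^2/R with
the RMS value of the rail voltage. A one-liner to check them
(idealized: it assumes lossless output devices, which real amps
aren't):

```python
# Max continuous sine power limited only by the supply rails:
# V_rms = V_peak / sqrt(2), then P = V_rms^2 / R.
def supply_limited_power(v_rail, r_load):
    v_rms = v_rail / 2 ** 0.5
    return v_rms ** 2 / r_load

print(supply_limited_power(40, 8))   # 100 W
print(supply_limited_power(40, 4))   # 200 W
```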
But this amp wasn't rated for 4 ohms, so you may have to derate it.
And you'd be taking a chance that someday somebody might crank the
volume up too high.
Best regards,
Bob Masta
DAQARTA v3.50
Data AcQuisition And Real-Time Analysis
www.daqarta.com
Scope, Spectrum, Spectrogram, FREE Signal Generator
Science with your sound card!
The thread being replied to:

On Apr 14, 3:11 pm, mrdarr...@gmail.com wrote:
So I'm looking at this ($149.99 at Sears)
http://www.sonystyle.com/webapp/wcs/stores/servlet/ProductDisplay?cat...
Manual says 8-16 ohm impedance, and I'm wondering how bad will it be
if I connect my 4-ohm subwoofers to it?
Are we talking Blue Smoke here?
Michael

<Aaronofkent@gmail.com> wrote:
No blue smoke, but you won't be sending the maximum power to the
speakers (and thus hurting yourself in the long run).
Check out "maximum power transfer", it basically says that impedance
in must equal impedance out for power to be transferred effectively.
http://en.wikipedia.org/wiki/Impedance_matching