Thanks Wayne. That's what I thought; I just wanted to verify before my next question. Why on earth are preamps and amps designed with so much gain that input signals need to be cut by up to a factor of 1,000 just to keep the power amp from overloading? My understanding is that most active preamps are built with about 20 dB of gain, while most power amps have gains between 30 and 40 dB. That means the 2 V signal from a CD player gets attenuated by 40 to 60 dB at the volume control just so it can pass through the two gain stages at normal listening levels. That seems very inefficient to me. Shouldn't designers aim for a system that uses as much of the original signal as possible, i.e. an amp where gain and power are better matched? (Rough numbers below.)
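Here's a quick back-of-the-envelope sketch of what I mean. The 35 dB amp gain, 8 ohm load, and roughly 1 W average listening level are just assumptions for illustration, not anyone's actual spec:

```python
import math

def db(ratio):
    """Convert a voltage ratio to decibels."""
    return 20 * math.log10(ratio)

# Assumed figures: 2 V RMS CD output, 20 dB preamp gain, 35 dB power amp
# gain, and ~1 W average into an 8 ohm speaker as "normal" listening level.
source_v  = 2.0
pre_gain  = 20.0   # dB
amp_gain  = 35.0   # dB
speaker_r = 8.0    # ohms
listen_w  = 1.0    # watts

needed_v  = math.sqrt(listen_w * speaker_r)   # ~2.83 V at the speaker
needed_db = db(needed_v / source_v)           # total gain actually required (~3 dB)
chain_db  = pre_gain + amp_gain               # gain sitting in the chain (55 dB)
atten_db  = chain_db - needed_db              # thrown away at the volume pot (~52 dB)

print(f"Gain required: {needed_db:.1f} dB")
print(f"Gain in chain: {chain_db:.1f} dB")
print(f"Attenuation at volume control: {atten_db:.1f} dB")
```

So on those assumptions the system only needs about 3 dB of gain, yet the chain provides 55 dB and the volume control burns off the other ~52 dB.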
Technically, the 2 V output from a CD player is enough to drive speakers to "normal" listening levels, provided the impedances are matched properly and enough current is available. So high gain in amps is not necessary with modern source components. If the source signal is used whole, just over 20 dB of gain is all it takes to clip a 100 W amp (quick check below).
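Quick check on that 20 dB figure, assuming a 100 W rating into 8 ohms and the 2 V RMS CD standard:

```python
import math

# Assumed: 100 W rated amp into 8 ohms, 2 V RMS source (typical CD player spec).
rated_w  = 100.0
load_r   = 8.0
source_v = 2.0

clip_v  = math.sqrt(rated_w * load_r)          # ~28.3 V RMS at rated power
gain_db = 20 * math.log10(clip_v / source_v)   # ~23 dB

print(f"Voltage at clipping: {clip_v:.1f} V")
print(f"Gain needed to clip from 2 V: {gain_db:.1f} dB")
```

That works out to about 23 dB, so "just over 20 dB" to reach full power from a 2 V source.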
What are your thoughts on the following system for making the most use of the source signal:
- Preamp based on a unity-gain buffer: no gain, just buffering for impedance matching.
- Low-gain power amp whose input sensitivity sits at line level (rough sensitivity numbers after this list).
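To put numbers on that second point, here's a small sketch of the input sensitivity implied by different amp gains. The 100 W / 8 ohm rating and treating "line level" as the 2 V RMS CD standard are assumptions on my part:

```python
import math

# Assumed: 100 W into 8 ohms; "line level" taken as the 2 V RMS CD standard.
rated_w = 100.0
load_r  = 8.0
clip_v  = math.sqrt(rated_w * load_r)   # ~28.3 V RMS at rated power

for gain_db in (23, 26, 30, 35, 40):
    sensitivity = clip_v / 10 ** (gain_db / 20)   # input voltage for full power
    print(f"{gain_db} dB gain -> input sensitivity {sensitivity:.2f} V")
```

On those assumptions, a 23 dB amp hits full power at exactly 2 V, while typical 30 to 40 dB amps clip on only 0.3 to 0.9 V, which is why so much of the source signal ends up being attenuated away.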
Gar.