I have been thinking about distortion in tubes versus transistors. I thought transistors distorted less than tubes, but that what distortion they did produce was higher-order. That would be fine in most cases, but once it gets loud enough to hear it's really bad, because high-order harmonics are so much more objectionable to the ear than low-order ones. Even though tubes measure higher distortion, it's mostly low-order, so it isn't as offensive.

Then a friend told me something that floored me, and I am looking for more information about it. He said that transistors often distort more than tubes, and that it isn't the device but the feedback that makes the difference. In other words, the reason transistor amps measure lower distortion isn't the transistors at all; it's the large amount of negative feedback they use, which cancels the distortion, at least in theory. The problem is that this can't work in all situations, such as when the amplifier nears clipping: the feedback loop stops working because there is no excess gain left. The same thing can happen in other situations too, like a fluctuating load.
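If I understand the textbook theory right, closed-loop distortion is roughly the open-loop distortion divided by the loop gain, 1 + A*beta, and near clipping the stage's incremental gain A collapses, so the division stops happening. To convince myself I put together a rough numerical sketch in Python. Everything in it is my own assumption, picked just to make the mechanism visible: the open-loop stage is a memoryless tanh soft-clipper with gain A = 1000 and +/-1 V "rails", and beta = 0.099 gives a closed-loop gain of about 10. It's a toy, not a real amplifier model (no frequency dependence, no output stage), so please check my reasoning:

import numpy as np

# All values here are my own assumptions, chosen only for illustration:
A, BETA, VCLIP = 1000.0, 0.099, 1.0   # open-loop gain, feedback fraction, "rail" voltage

def open_loop(v):
    """Memoryless nonlinear stage: slope ~A near zero, saturating at +/-VCLIP."""
    return VCLIP * np.tanh(A * v / VCLIP)

def closed_loop(x):
    """Solve y = open_loop(x - BETA*y) per sample by vectorized bisection."""
    lo = np.full_like(x, -2.0 * VCLIP)   # these brackets straddle the root:
    hi = np.full_like(x, 2.0 * VCLIP)    # output magnitude never exceeds VCLIP
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        above = open_loop(x - BETA * mid) > mid   # root lies above mid
        lo = np.where(above, mid, lo)
        hi = np.where(above, hi, mid)
    return 0.5 * (lo + hi)

def thd_percent(y, n_per, n_harm=20):
    """THD from an exact-bin FFT (n_per whole cycles, so no spectral leakage)."""
    spec = np.abs(np.fft.rfft(y))
    fund = spec[n_per]
    harm = np.sqrt(sum(spec[k * n_per] ** 2 for k in range(2, n_harm + 1)))
    return 100.0 * harm / fund

N_PER, N = 16, 4096
sine = np.sin(2 * np.pi * N_PER * np.arange(N) / N)
for frac in (0.2, 0.6, 0.9, 0.99):       # peak output as a fraction of the rail
    vo = frac * VCLIP
    # Input amplitudes chosen so both versions hit the same peak output vo:
    xin = (VCLIP / A) * np.arctanh(vo / VCLIP)
    y_ol = open_loop(xin * sine)                   # no feedback
    y_cl = closed_loop((xin + BETA * vo) * sine)   # with feedback
    print(f"{frac:4.0%} of clip:  open-loop THD {thd_percent(y_ol, N_PER):8.4f}%"
          f"   closed-loop THD {thd_percent(y_cl, N_PER):8.4f}%")

If the theory holds, the closed-loop THD at low levels should come out roughly 1/(1 + A*beta), about a hundredth, of the open-loop figure, and the two numbers should converge as the drive approaches the rail, which is exactly the failure near clipping my friend described.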
If that's true, it puts everything in a new light. Take a 3 watt SET amplifier and compare it with a 3 watt bipolar transistor amplifier, both running class A, one triode and the other solid state: will the tube amplifier distort less? Of course the tube and the transistor have to be suitably rated for this power level; for a fair comparison, one can't be driven hard while the other loafs. What do you think? Is my friend right?