> “distortion from an 8 tweeter array will be {insert equation and result here} less than a single tweeter operating at the same total output level.”
I've been trying to figure this out as well. I think that cone/dome drivers in an array would see their distortion drop for two reasons: the sensitivity of the array increases (from multiple drivers), and the drive signal going to each driver is reduced.
For the sensitivity portion (see the sketch after this list):
- one driver = (X)dB of distortion (pick whatever harmonic you'd like).
- two drivers = (X-3)dB of distortion (two drivers are 3dB more efficient, so they need to be driven with 3dB less signal to achieve the same level)
- four drivers = (X-6)dB of distortion
- eight drivers = (X-9)dB of distortion
- nine drivers = (X-9.5)dB of distortion
- twelve drivers = (X-10.8)dB of distortion
- sixteen drivers = (X-12)dB of distortion
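The pattern in those numbers is 10*log10(N): each driver in an N-driver array gets driven with 10*log10(N) dB less signal for the same total output. A minimal Python sketch that reproduces the list (the idea that distortion tracks drive level dB-for-dB is my assumption, and it's exactly the part I'd like checked):

```python
import math

def drive_reduction_db(n_drivers: int) -> float:
    """Per-driver drive-level reduction (dB) for an n-driver array
    at the same total SPL as one driver, assuming array sensitivity
    rises 3 dB per doubling, i.e. 10*log10(N)."""
    return 10 * math.log10(n_drivers)

for n in (1, 2, 4, 8, 9, 12, 16):
    print(f"{n:2d} drivers: distortion ~ (X - {drive_reduction_db(n):.1f}) dB")
```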
With a ribbon array, you would have the reduced signal going to each individual ribbon, but the array sensitivity remains the same as one driver.
If you have an array of cone/dome drivers, you have the benefit of reduced signal PLUS the sensitivity increase.
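To put numbers on the comparison, here's a sketch under my reading of the two cases: the ribbon array only gets the drive split across N ribbons, while the cone/dome array gets the split plus the 10*log10(N) sensitivity gain. The split-plus-sensitivity model is my assumption, so correct me if it's wrong:

```python
import math

def per_driver_reduction_db(n: int, sensitivity_gain_db: float) -> float:
    """Per-driver power reduction (dB) relative to one driver at the
    same total SPL: the total drive drops by the sensitivity gain,
    then splits equally across n drivers (another 10*log10(n) dB each)."""
    split_db = 10 * math.log10(n)  # equal power split across the drivers
    return sensitivity_gain_db + split_db

for n in (2, 4, 8, 16):
    ribbon = per_driver_reduction_db(n, 0.0)                 # no sensitivity gain
    cone   = per_driver_reduction_db(n, 10 * math.log10(n))  # +3 dB per doubling
    print(f"{n:2d} drivers: ribbon -{ribbon:.1f} dB, cone/dome -{cone:.1f} dB per driver")
```

If that's right, the per-driver reduction for cone/dome arrays works out to 20*log10(N) dB, twice (in dB terms) the ribbon case.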
I know that distortion increases with drive level, and faster than linearly. This makes me think you get an increasing benefit as you continue to add drivers to an array.
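One way to see why the benefit could compound: in a simple weak-nonlinearity model (output = a1*x + a2*x^2 + a3*x^3), the k-th harmonic grows about k dB per dB of drive, so it falls (k-1) dB per dB of drive reduction relative to the fundamental. A toy sketch, where both the polynomial model and the 10*log10(N) per-driver cut are my assumptions:

```python
import math

def harmonic_drop_db(n_drivers: int, order: int) -> float:
    """Drop in a given harmonic, relative to the fundamental, when each
    driver's drive falls by 10*log10(n) dB. For a weak polynomial
    nonlinearity the k-th harmonic grows ~k dB per dB of drive, so it
    falls (k - 1) dB per dB relative to the fundamental."""
    drive_cut_db = 10 * math.log10(n_drivers)
    return (order - 1) * drive_cut_db

for n in (2, 4, 8, 16):
    print(f"{n:2d} drivers: HD2 -{harmonic_drop_db(n, 2):.1f} dB, "
          f"HD3 -{harmonic_drop_db(n, 3):.1f} dB (re: fundamental)")
```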
Can someone help me work through this theory?