It was interesting to see a mention of this in the manual. And you're absolutely right, phase (polarity) is not always maintained during recording. In fact, because I nearly always test my new amp/preamp designs in mono, I am painfully aware of just how bad the situation is.

Typically, when testing/tweaking new designs, I sum both channels of the source (usually CD) through resistors in order to capture all the music. However, I have several CDs for which this technique cannot be used. One example is a Yardbirds CD that appears to have been recorded with the L and R channels exactly 180 degrees out of phase. Either channel alone sounds fine, but when mixed, total output amplitude decreases about 9-10 dB. Several other discs suffer from audible frequency response anomalies when the channels are mixed into mono. This usually manifests itself as a significant reduction of high frequency content. When listening to a particular Steve Miller CD, for example, everything above 5 kHz or so virtually disappears when the channels are combined.

These defects are more difficult to detect when stereo transducers are used, because cancellation from out-of-phase signals is incomplete (or in the case of headphones, nonexistent). Nevertheless, this issue points out how poorly source material is being produced and processed these days, and how important it is to select programming carefully when evaluating a system.
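The cancellation effect is easy to demonstrate numerically. Here's a minimal sketch (not from the post itself; the 1 kHz test tone and resistive-summing model are my own illustrative assumptions) showing that summing an in-phase stereo pair to mono preserves the level, while a polarity-inverted pair cancels completely:

```python
import numpy as np

# Illustrative setup: one second of a 1 kHz sine as the "left" channel.
# Frequencies and sample rate are arbitrary choices for the demo.
fs = 44100                          # sample rate in Hz
t = np.arange(fs) / fs              # one second of time samples
left = np.sin(2 * np.pi * 1000 * t)

def mono_level_db(l, r):
    """Level of the (L+R)/2 mono mix in dB, relative to one channel's RMS."""
    mono = (l + r) / 2
    ref = np.sqrt(np.mean(l ** 2))   # RMS of a single channel
    rms = np.sqrt(np.mean(mono ** 2))
    return 20 * np.log10(rms / ref) if rms > 0 else float('-inf')

# Identical channels: the mono mix is the same signal, so 0 dB change.
in_phase = mono_level_db(left, left)

# Polarity-inverted right channel: the samples cancel exactly.
out_of_phase = mono_level_db(left, -left)
```

On a real disc the two channels are never exact inverses of each other, which is why the post reports a 9-10 dB drop rather than total silence, and why partially correlated material can lose just the high-frequency content where the channels happen to oppose.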
Sorry, I know this doesn't have much to do with the preamp under discussion; it just caught my attention.