The standard test used on analog magnetic tape recorders to determine the maximum output level (MOL) was defined to occur at the magnetization level at which a recorded 1 kHz sine wave reached 3% third-harmonic distortion. Third-harmonic distortion is simply a measurement of the amplitude of the third harmonic of the input frequency, which is the most prominent distortion component in analog magnetic recording systems. The third-harmonic level was used as a convenient figure of merit because the second harmonic is difficult to hear: it tends to reinforce the pitch of the fundamental. The third harmonic, by contrast, is easy to detect on pure tones (although less so on music), so it makes a good benchmark for comparing sound “off tape” with the original. The distorted tone has an edge to it, containing a component one octave and a fifth above the fundamental; for this reason the third harmonic is also called a musical twelfth (octave + fifth).

Here’s the interesting twist. This test was commonly abbreviated on specification sheets as “THD,” which, of course, was mistaken to mean “total harmonic distortion” instead of “third-harmonic distortion.” This in turn was mistakenly shortened to just “distortion,” so you still find old analog tape data sheets, and many textbooks, defining MOL as the point at which there is “3% distortion” instead of the correct reference to “3% third-harmonic distortion” – quite different things.
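As a rough sketch of what the measurement involves (this example is illustrative, not a description of any actual tape-test instrument), the snippet below synthesizes a 1 kHz tone with a 3% third harmonic added – the component one octave and a fifth above, i.e. at 3 kHz – and recovers that ratio from an FFT:

```python
import numpy as np

def third_harmonic_ratio(signal, fs, f0):
    """Amplitude of the 3*f0 component relative to the fundamental,
    estimated from the magnitude spectrum of a windowed FFT."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    fundamental = spectrum[np.argmin(np.abs(freqs - f0))]
    third = spectrum[np.argmin(np.abs(freqs - 3 * f0))]
    return third / fundamental

fs = 48000                       # sample rate, Hz
t = np.arange(fs) / fs           # one second of samples
f0 = 1000.0                      # the 1 kHz test tone

# Mimic tape at MOL: fundamental plus 3% third harmonic.
# 3*f0 = 2*f0 (an octave) times 1.5 (a fifth) - the "musical twelfth."
distorted = np.sin(2 * np.pi * f0 * t) + 0.03 * np.sin(2 * np.pi * 3 * f0 * t)

ratio = third_harmonic_ratio(distorted, fs, f0)   # approximately 0.03
```

Because both the fundamental and the harmonic fall on exact FFT bins here, the recovered ratio comes back at essentially the 3% that defines MOL; a real tape measurement would instead sweep the record level upward until the played-back tone reached that figure.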