Abbreviation for Total Harmonic Distortion plus Noise. A measurement of both the noise and the distortion added to a signal by a given circuit or piece of equipment. To arrive at the result, a single sine wave of known harmonic purity is passed through the unit under test and then patched back into a distortion measuring instrument. A measurement level is set; the instrument notches out the test frequency and passes the remainder through a set of filters adjusted for the bandwidth of interest (usually 20 Hz-20 kHz). What remains is noise (including any AC line [mains] hum, interference buzzes, etc.) plus all harmonics generated by the unit. This composite signal is measured using a true RMS detector voltmeter, and the result displayed. Often a curve is created by stepping through each frequency from 20 Hz to 20 kHz, at some specified level (often +4 dBu) and measurement bandwidth (usually 20 kHz; sometimes 80 kHz, which allows measurement of the lower harmonics of a 20 kHz signal). [Note that the often-seen statement that "THD+N is x%" is meaningless. For a THD+N spec to be complete, it must state the frequency, level, and measurement bandwidth.] While THD+N is a common audio test measurement, it is not the most useful indicator of a unit's overall performance. What it tells the user about hum, noise, and interference is useful; however, that information is better conveyed by the signal-to-noise (S/N) ratio specification. What it tells the user about harmonic distortion is not terribly relevant, simply because the distortion is harmonically related to the fundamental, so the distortion products tend to be masked by complex audio material. The various intermodulation distortion (IMD) tests are better indicators of sonic purity.
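The notch-and-measure procedure described above can be sketched numerically: notch out the fundamental, keep everything else within the measurement bandwidth, and take the ratio of residual RMS to total RMS. The following is a minimal illustration only (the sample rate, test tone, window, and notch width are assumptions for the sketch, not any instrument's actual implementation):

```python
import numpy as np

def thd_n(signal, fs, f0, bw=(20.0, 20000.0)):
    """Estimate THD+N: RMS of everything except the fundamental,
    relative to the RMS of the whole signal, within bandwidth bw."""
    n = len(signal)
    spec = np.fft.rfft(signal * np.hanning(n))
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    power = np.abs(spec) ** 2

    # Restrict to the measurement bandwidth (e.g. 20 Hz-20 kHz).
    band = (freqs >= bw[0]) & (freqs <= bw[1])
    # "Notch out" the fundamental: discard bins near f0. The +/-4-bin
    # width covers the Hann window's spectral leakage (an assumption).
    df = fs / n
    notch = np.abs(freqs - f0) <= 4 * df

    residual = power[band & ~notch].sum()   # harmonics + noise + hum
    total = power[band].sum()               # fundamental included
    return np.sqrt(residual / total)        # ratio; *100 for percent

fs = 48000
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
# 1 kHz test tone with a 1% second harmonic plus a little noise,
# standing in for the output of a hypothetical unit under test.
x = (np.sin(2 * np.pi * 1000 * t)
     + 0.01 * np.sin(2 * np.pi * 2000 * t)
     + 1e-4 * rng.standard_normal(fs))
print(f"THD+N = {100 * thd_n(x, fs, 1000):.2f}%")  # close to 1%
```

Sweeping `f0` over 20 Hz to 20 kHz and repeating the measurement at each step would yield the kind of THD+N-versus-frequency curve the entry describes; note that the result is only meaningful alongside the stated frequency, level, and bandwidth.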