Analog is superior in theory because it doesn't sample and doesn't quantize. When you record a sine wave in analog, you're getting a sine wave. In digital audio, you're getting a stair-step representation of a sine wave. As you increase the resolution, those stair-steps look more and more like a sine wave until they're nearly indistinguishable from it; still, you're never getting *exactly* that sine wave. So, yes, in theory analog is potentially higher fidelity. But, as I tried to illustrate with the guitar example in the earlier post, you're not really getting the whole sine wave in analog recording either.
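To make the resolution point concrete, here's a minimal sketch (not from the original posts; it only assumes numpy) that quantizes an ideal sine wave at a few bit depths and prints the worst-case quantization error. The error shrinks roughly by half for every extra bit, which is the sense in which the stair-steps get closer and closer to the original wave without ever matching it exactly:

```python
import numpy as np

fs = 48_000          # sample rate in Hz (assumed for illustration)
f = 1_000            # sine frequency in Hz
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * f * t)   # the "ideal" analog sine, amplitude 1

for bits in (8, 16, 24):
    levels = 2 ** bits
    # Quantize each sample to the nearest of `levels` evenly spaced steps in [-1, 1]
    step = 2.0 / (levels - 1)
    quantized = np.round(signal / step) * step
    error = signal - quantized
    print(f"{bits:2d}-bit: max quantization error ~ {np.max(np.abs(error)):.2e}")
```

At 8 bits the worst-case error is on the order of 10^-3 of full scale; at 24 bits it's on the order of 10^-8 — tiny, but never zero.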
I guess I assumed that analog audio was lossless because vinyl and tape have been described as infinitely uncompressed audio. Since digital data is expressed as bytes, essentially a series of switches, couldn't analog equipment theoretically be made infinitely more hi-fi than digital, given that analog signals are expressed as voltages (which can vary continuously) rather than switches (which are only on or off)?