
Issues in synchronizing gear with MTC and external word clock.

“This question has been bugging me for a long time, so I thought I’d finally ask. There are some MIDI interface/synchronizers that have internal word clock but are not capable of slaving to external word clock; the MTP/AV, for example. If you are using one of these units but wanted to use a different clock source as the master clock (given that there are many quality ones now available), when is it safe to take the MIDI interface/synchronizer out of the word clock chain but still use it as the SMPTE/MTC master for digital devices that are slaved to the new, improved external clock? Does this mean that the time code of the MIDI interface/synchronizer would be unresolved, and that drift would occur between the digital devices? I’m very confused.”

As relatively easy as it has become to deal with synchronization issues these days (compared to where things stood 10 or 15 years ago), there are still a surprisingly large number of gray areas like this.

First, we should clarify the capabilities of the MTP/AV. While it does not have a word clock input and can’t derive its clock from a word clock master, it can resolve to LTC and video black burst for external speed control. This doesn’t directly answer the question, but it’s important for overall clarity.

A high-quality master clock can be introduced into most digital studios and DAW systems simply by connecting it to the clock inputs of the devices in question. For example, if your DAW’s audio interface has a word clock input, you could connect it to the output of a high-quality clocking device and possibly achieve better-sounding results through the reduction of jitter and other clock-related anomalies. (This is a deep subject that goes well beyond this simple statement or this Tech Tip – we’re just trying to lay some groundwork here.)

This injected clock will take over the sample rate of the digital recording and thus control its timing. You are correct in observing that this creates a situation where a system potentially has two different sources of timing information: the newly injected word clock, and the timing information coming in via MTC from the MIDI interface/synchronizer. The rest really depends on how the software handles things, which means there are potentially as many answers to this question as there are programs available for digital audio recording. With this in mind, there are a few things to think about:

In many cases applications have some method of switching hardware between internal and external clocking. Whether you are using the clock coming in from the interface/synchronizer or from the high-quality word clock box, you will need to have your system set up to listen to external clock. When this is the case, the sample rate (read: recording speed) of your system will usually be dictated by this clock, whether it comes in over the word clock jack or through some digital audio input.

Since this clock may not be resolved to the frame rate of the MTC, what happens inside the software when the two begin to drift from one another? Understand that the software will usually not know they are drifting. There is no address or location information in word clock data, so the only concrete way for the software to detect drift between the speed of the clock and the speed of the time code is to compare the relative speed or phase (see WFTD Phase Lock) of the incoming signals. How your program reacts to this, if it measures it at all, is worth investigating.

Under most normal circumstances this is not a major problem, because any drift is likely to be minimal over relatively short durations, such as five-minute songs. Many software applications can reconcile the two in a way that is transparent to the user (they aren’t really reconciled, but graphically it can be made to appear so, assuming the discrepancy is very small).

Situations where it could be a problem include working with time code that has a lot of drift or changes of speed, such as an old analog tape machine. In those cases it is often better to simply resolve the DAW to the inconsistencies of the tape machine while they are synchronized together. Situations where MIDI parts must play along with digital audio files in very tight synchronization can reveal problems as well. In the case we’re examining here, MIDI will be following MTC for speed, while the audio follows the external word clock. Again, drift should be minimal, but if very tight synchronization is required you may have a problem. Keep in mind, however, that the timing resolution of MTC is very low compared to sample clock, so there is always going to be a small amount of slop there (although with the advent of technologies such as MTS this is greatly improved).
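To put rough numbers on that last point, here is a back-of-the-envelope sketch. MTC carries four quarter-frame messages per SMPTE frame, so at 30 fps it updates 120 times per second, while a 44.1kHz sample clock ticks tens of thousands of times per second. The comparison below is only an illustration, not a measurement of any particular system:

```python
# Compare MTC's timing grain with the sample clock's. MTC sends four
# quarter-frame messages per SMPTE frame; at 30 fps that is 120
# messages per second.

FRAME_RATE = 30            # SMPTE frames per second
QF_PER_FRAME = 4           # MTC quarter-frame messages per frame
SAMPLE_RATE = 44_100       # samples per second

mtc_grain_ms = 1000 / (FRAME_RATE * QF_PER_FRAME)   # ~8.33 ms between updates
sample_grain_ms = 1000 / SAMPLE_RATE                # ~0.023 ms per sample

print(f"MTC grain:    {mtc_grain_ms:.3f} ms")
print(f"Sample grain: {sample_grain_ms:.3f} ms")
print(f"MTC is roughly {mtc_grain_ms / sample_grain_ms:.1f}x coarser")
```

In other words, even a perfectly locked MTC stream only addresses time in chunks hundreds of times coarser than a single sample, which is the "slop" mentioned above.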
Also, if your project is long, such as a movie score, you will notice that the problem accumulates over time. What may be a slight discrepancy at five minutes could become substantial an hour into the piece. How the software handles this varies from program to program, but in many cases it can still be a workable situation so long as all the material is recorded and played back on the same system with the same synchronization sources. If everything stays the same, the manifestation of the discrepancy should stay about the same.

The problem will show up, however, after you’ve recorded a number of tracks and then start playback well into the piece. Now the error hasn’t had time to accumulate, yet you have tracks recorded with the cumulative error; they will sound out of time. This cumulative error problem can show up even if you only have audio tracks recorded, for the same reasons: instead of being out of time with MIDI, the audio will be out of time with the other audio or video source (the master). Further, if the session is moved from one system to another (with its own set of speed inconsistencies), you are likely to experience timing discrepancies. In fact, this is sometimes what causes users to discover the problem in the first place. Unfortunately, it’s too late to do much about it at that point.

One workaround for all of this is to break audio regions up into smaller chunks. Many programs trigger playback of separate regions based only on the MTC address, not on the cumulative time that has passed since playback began. This fundamental difference can make it possible to salvage material that is drifting (or may drift) out of time by breaking the audio up into small regions. Each region can begin playing at the proper MTC address and, if short enough, will not have time to drift audibly out of time.
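Both the accumulation and the region-splitting workaround come down to simple arithmetic. The 20 ppm clock error and 1 ms drift budget below are hypothetical figures chosen purely for illustration, not specs of any particular device:

```python
# Sketch of how a small clock error accumulates over time, and how
# short regions must be to stay inside a drift budget. A clock that is
# off by N ppm gains or loses N microseconds per second.

def drift_ms(error_ppm, elapsed_seconds):
    """Accumulated drift (ms) between two clocks differing by error_ppm."""
    return error_ppm / 1000 * elapsed_seconds   # ppm -> ms of drift per second

def max_region_seconds(error_ppm, budget_ms):
    """Longest region that stays within the drift budget; each region
    re-triggers at its own MTC address, which resets the error."""
    return budget_ms / (error_ppm / 1000)

print(f"Drift after 5 min:  {drift_ms(20, 5 * 60):.1f} ms")    # 6.0 ms
print(f"Drift after 60 min: {drift_ms(20, 60 * 60):.1f} ms")   # 72.0 ms
print(f"Max region at a 1 ms budget: {max_region_seconds(20, 1.0):.0f} s")
```

At 20 ppm the drift is only about 6 ms five minutes in, but grows to about 72 ms after an hour, which is why long-form material exposes the problem; keeping each region under the computed length resets the error before it becomes audible.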

In those cases where extremely tight or consistent timing is required across the board, or where you are working with long-form material, you are going to have to employ a method that allows all sources of timing information to be resolved together. This means you need a device that can generate MTC or LTC that is resolved to your high-quality word clock source. One such (relatively affordable) device is MOTU’s DTP (Digital Timepiece).

Finally, in situations where the MTC is coming from a device that is digital – let’s say ADATs, for example – your best bet is to resolve them to this high-quality word clock to begin with. That way the MTC they ultimately give you will be resolved, and they can benefit sonically from the better clock as well. Since the MTP/AV doesn’t really provide for synchronizing some devices (such as ADATs) in a way that’s conducive to this, you may be well served by stepping up to a more sophisticated synchronizer (such as the DTP). Your Sweetwater Sales Engineer can help you sort out the details and determine which solution is best.
