
Hi friends,

I have a problem:

I own an ADAT HD24 and an RME UFX.

Now I would like to record more than 24 tracks (around 30) simultaneously.

After recording, I use Pro Tools to mix all the tracks.

How do I synchronize these two devices so that I don't run into timing problems in post-production or mixing?

Is synchronizing via ADAT enough, or do I have to use MIDI Time Code?

jokeramik

Comments

Boswell Fri, 10/23/2015 - 03:06

There are two main components to synchronising two or more devices such as these.

Firstly, there is making sure they all run at the same sampling rate, so that when started together they stay in sync and do not drift. Fortunately, this is a relatively easy problem to solve, and in your case it would simply be a matter of using the UFX as the clock source for the HD24. This can be done either by running a BNC wordclock cable out of the UFX and into the HD24, or by taking an ADAT lightpipe from one of the ADAT outputs on the UFX to the ADAT 1 input on the HD24. You then select "wordclock" or "ADAT" clock on the HD24 accordingly. Assuming that you will be sampling using the HD24 converters, the BNC wordclock method is preferred. Note that if you are using the lightpipe ADAT clock, you do not have to send any audio down the lightpipe for it to work. An added benefit of using an external clock is that it gets round the small inaccuracy in the internal HD24 clock division at 44.1 and 88.2 kHz.
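
To put a rough number on why the shared clock matters, here is a quick back-of-the-envelope sketch in Python. The 50 ppm mismatch is purely an illustrative assumption, not a measured figure for either unit:

```python
# Rough illustration of clock drift between two unlocked recorders.
# The 50 ppm offset is an assumed, illustrative figure, not a measured spec.
sample_rate = 48000          # nominal sampling rate in Hz
ppm_offset = 50              # assumed mismatch between the two internal clocks
song_length_s = 5 * 60       # a five-minute song

rate_error = ppm_offset / 1_000_000            # fractional rate difference

drift_samples = sample_rate * song_length_s * rate_error
drift_ms = 1000 * song_length_s * rate_error

print(f"Drift after {song_length_s} s: {drift_samples:.0f} samples (~{drift_ms:.1f} ms)")
# -> roughly 720 samples, about 15 ms
```

That is around 15 ms of smear by the end of a five-minute song, which is exactly what locking the HD24 to the UFX clock avoids.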

Secondly, there is the problem of starting the recording on the separate devices at the same instant. BNC or ADAT clock synchronisation does not do this for you, as the clock is running all the time the units are powered up. MTC/MMC is indeed the standard way of achieving this, although there are other techniques such as dedicating a track on every recording unit to a time code which is recorded ("striped") along with the audio. Most of these techniques pre-date the advent of computer DAWs, and so do not take into account what can be achieved with a computer-based editor.

Given that your mixdown process will involve transferring the tracks from the HD24 to the DAW, there will need to be a small adjustment due to the different audio path delays through the two devices. I suggest that you record using BNC or ADAT clock to keep the device clocks the same, but not worry too much about accurate start times, and instead use the DAW to time-shift all the tracks from (say) the HD24 so they line up with the tracks from the UFX. Recording a single click or short tone burst on one of the active tracks of each device at the start of each song helps in doing this. If your DAW is able to output an MMC command to start the recording on the HD24 when you start the UFX recording, the subsequent time shifts needed to line up the tracks will be consistent.
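
If you would rather not line the slates up by eye, the offset can also be found programmatically. A minimal sketch of the idea in Python; the file names and the numpy/soundfile libraries are just assumptions for illustration, not part of any particular DAW workflow:

```python
# Minimal sketch: estimate the offset between two recordings of the same slate
# click so the HD24 tracks can be time-shifted to line up with the UFX tracks.
# File names and the numpy/soundfile libraries are assumptions for illustration.
import numpy as np
import soundfile as sf

ufx, sr_ufx = sf.read("ufx_slate_track.wav")     # slate as captured by the UFX
hd24, sr_hd24 = sf.read("hd24_slate_track.wav")  # the same slate from the HD24
assert sr_ufx == sr_hd24, "both devices should be locked to the same sample rate"

def mono(x):
    """Collapse a multi-channel array to mono for the correlation."""
    x = np.asarray(x, dtype=float)
    return x.mean(axis=1) if x.ndim > 1 else x

# Only correlate the first few seconds around the slate to keep it cheap
n = int(min(len(ufx), len(hd24), sr_ufx * 10))
a = mono(ufx)[:n]
b = mono(hd24)[:n]

# Peak of the full cross-correlation gives the relative lag in samples
corr = np.correlate(a, b, mode="full")
lag = int(np.argmax(corr)) - (n - 1)   # positive lag: HD24 slate occurs earlier in its file

print(f"Relative offset: {lag} samples ({1000 * lag / sr_ufx:.2f} ms)")
```

The sign of the reported lag tells you which set of tracks to nudge; zooming in on the slate transient in the DAW gets you to the same answer by hand.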

DonnyThompson Fri, 10/23/2015 - 06:23

Boswell, post: 433256, member: 29034 wrote: Recording a single click or short tone burst on one of the active tracks of each device at the start of each song helps in doing this. If your DAW is able to output an MMC command to start the recording on the HD24 when you start the UFX recording, the subsequent time shifts needed to line up the tracks will be consistent.

Bos is referring to a method known as "Slating", which goes way back to old film-making... you ever see the guy with the little clapboard, and just before the director says "action", the guy clicks the two pieces of the clapboard together? That's what is known as a "slate". It was meant to make editing easier so that all the different angle cameras would have a visual cue, so that the editor could "sync" all the camera angles.

So, taking that same principle and applying it to audio, as Bos mentioned, simply record a short burst of some kind into both of your recording devices... it could be a handclap, or even you simply saying "Slate". When you import the tracks into your DAW, you can look for that cue point on all of your tracks.

Side note: a word about MTC - depending on what device you are using, it might be beneficial to allow an empty measure at the top of the project; MTC has been known to occasionally present a bit of "start lag", and could possibly cut off your first bit of audio if you start right at 01:01:01:000. This mostly occurs when syncing external MIDI devices to a DAW or to another MIDI device, but it could happen with two recording devices as well. Personally, I'd go with MMC instead. Or, if you can, SMPTE.

Types of Synchronization:

MIDI Machine Control (MMC): Designed for controlling the transport on tape and multitrack machines. If the multitrack is the MMC master, pressing "play" on the multitrack will start the sequencer; or, the other way around, if you make the sequencer the MMC master, pressing play there will start the multitrack. MMC really just controls the transport and sends a locate point, not the timing; it's just a switch that says "go here, then start, stop, rewind, fast forward" (see the sketch after this list for what these messages look like on the wire). What you do to complete the picture is send MTC from the MMC slave.

SMPTE (Society of Motion Picture and Television Engineers) timecode is still in use in film and video today, and is embedded right into digital video so no tape tracks are required. In the days of tape recorders, we used SMPTE and striped it onto the last track of the tape machine. On playback, a standard audio cable took that output to a MIDI interface, which controlled the sequencer from it.

MTC stands for MIDI Time Code: a series of MIDI messages that tells other devices what time it is at any given moment, in hours, minutes, seconds and frames. MTC is best understood as SMPTE timecode converted to travel down a MIDI cable. It is sent in quarter-frame intervals as MIDI System Common messages.
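
For the curious, both of these message types are short enough to look at byte by byte. Here is a rough sketch using the mido Python library; the port name is just a placeholder, and sending a single quarter-frame like this only shows the message format, not a working sync generator:

```python
# Rough sketch of the MMC and MTC messages described above, using mido.
# The output port name is a placeholder; substitute your own MIDI interface.
import mido

out = mido.open_output("My MIDI Interface")   # placeholder name, assumption

# MIDI Machine Control is Universal Real-Time SysEx: F0 7F <device> 06 <cmd> F7.
# mido's sysex "data" field excludes the framing F0/F7 bytes.
DEVICE_ID = 0x7F                    # 0x7F addresses "all devices"
MMC_STOP, MMC_PLAY = 0x01, 0x02     # transport command bytes
out.send(mido.Message("sysex", data=[0x7F, DEVICE_ID, 0x06, MMC_PLAY]))  # "play"

# MIDI Time Code quarter-frame messages (System Common, status 0xF1) carry the
# hours/minutes/seconds/frames value a nibble at a time, eight messages per
# two frames of timecode. One piece of it, for illustration only:
out.send(mido.Message("quarter_frame", frame_type=0, frame_value=0))
```

A full MTC stream would cycle frame_type 0 through 7 continuously; the point here is just that MMC is a one-shot command while MTC is a running clock.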

FWIW
-d.

jokeramik Sat, 03/12/2016 - 03:07

DonnyThompson, post: 433258, member: 46114 wrote: Bos is referring to a method known as "Slating"...

Many thanks!

jokeramik

bouldersound Sun, 03/13/2016 - 23:56

The most critical thing is to sync the word clocks. Slating is the next most critical thing. Do those two things and you can sync tracks up pretty easily. Timecode is not necessary or all that helpful, and it won't fix a clocking mismatch if the HD24 tracks get imported to PT. If things aren't clocked together they will drift and it may be noticeable.

Special note about the HD24 (standard version, not the XR): the clock is known to be inaccurate at 44.1k. If you are going to record at 44.1k and transfer to another device for mixing you really need to clock it externally.

Boswell Mon, 03/14/2016 - 03:48

bouldersound, post: 437111, member: 38959 wrote:
Special note about the HD24 (standard version, not the XR): the clock is known to be inaccurate at 44.1k. If you are going to record at 44.1k and transfer to another device for mixing you really need to clock it externally.

Quick correction: the HD24 generates its clock on the mainboard, and this is unchanged between the standard HD24 and the XR, so the effect is present on both models.

The reason for the error is that the 54 MHz crystal used in the HD24 divides exactly down to 48 kHz and 96 kHz, but not exactly to 44.1 kHz and 88.2 kHz.
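
As a quick worked example, assuming a plain integer divider from that 54 MHz crystal (the exact divider arrangement inside the HD24 is an assumption on my part):

```python
# Why a 54 MHz crystal hits 48 kHz exactly but misses 44.1 kHz.
# Assumes a plain integer clock divider - a simplification for illustration.
crystal_hz = 54_000_000

for target_hz in (48_000, 44_100):
    ideal = crystal_hz / target_hz      # ideal divider (may be fractional)
    divider = round(ideal)              # nearest integer divider actually usable
    actual_hz = crystal_hz / divider    # sample rate that divider really produces
    ppm = 1_000_000 * (actual_hz - target_hz) / target_hz
    print(f"{target_hz} Hz: ideal divider {ideal:.4f}, integer {divider} "
          f"-> {actual_hz:.1f} Hz ({ppm:+.0f} ppm)")

# 48000 Hz: ideal divider 1125.0000, integer 1125 -> 48000.0 Hz (+0 ppm)
# 44100 Hz: ideal divider 1224.4898, integer 1224 -> 44117.6 Hz (+400 ppm)
```

On that assumption the HD24 runs about 400 ppm fast at 44.1 kHz, which works out to roughly a tenth of a second of drift over a five-minute song if it is left on its internal clock, hence the advice to clock it externally.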

Boswell Mon, 03/14/2016 - 08:24

One difference, apart from the very obvious audio quality difference between the HD24 and the HD24XR, is that the analogue outputs on the standard HD24 are impedance-balanced, whereas they are truly balanced on the XR. Impedance-balanced in this instance means that both output lines have the same resistors to ground, but the +ve output carries 100% of the signal and there is none on the -ve output. When fed into a real differential input, this arrangement still affords rejection of interference picked up on the connecting cable.