
Hi to all!

I've tested a lot of gear over the last year, and for a long time I just assumed that once my signal was in the digital domain it would be safe and no quality would be lost (via ADAT or S/PDIF).
But I'm asking myself: could the audio quality of a digital signal be altered when it hits the audio interface chipset, is sent to the computer, and is encoded into a WAV file?
If so, are some interface builders more successful at being transparent than others?
Of course, I know that clocks and jitter can make a difference.
What I'm asking is whether, even with a good clock (the same one across different setups), the chipset and how the interface was built could make a difference.

My master clock comes from a Mytek AD96, and I also have the UA 4-710 available to use as a master clock.
I will soon have to decide between using the RME 9652 and Multiface as main interfaces or a Focusrite Liquid Saffire 56.

Comments

Boswell Thu, 11/06/2014 - 09:59

This is a good question, but it's an area that's loaded with misconceptions and misinformation.

The detailed quality of a clock is only important at the point where signals cross the domain boundary, that is, the interface between analog and digital. This means that A-D and D-A converters should be clocked with the best-quality clock available in terms of stability, noise and jitter. It should be noted that sometimes a product's standard internal clock can out-perform a higher-quality external clock, simply because the external clock has to be delivered through cabling, connectors and interface circuitry to get to the point in the design where the selection is made between internal and external. Generally, only one device in a system can be running on internal clock with the others slaved to it, or none on internal clock if you are using a master clock box.
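To put a rough number on why the conversion point is where clock quality matters, here is a back-of-envelope sketch in Python. The 10 kHz tone and the 1 ns jitter figure are illustrative assumptions, not measurements of any unit mentioned in this thread; the underlying relation is just that a timing error dt moves a sample of a sine A*sin(2*pi*f*t) by at most its slope, 2*pi*f*A*dt.

```python
# Worst-case amplitude error caused by sampling-clock jitter at the converter.
# For a full-scale sine x(t) = A*sin(2*pi*f*t), a timing error dt shifts a
# sample by at most |x'(t)| * dt = 2*pi*f*A*dt.
import math

def jitter_error_lsbs(f_signal_hz, jitter_s, bits=24):
    """Worst-case jitter-induced error, expressed in LSBs of an n-bit word."""
    a = 1.0                                 # full-scale amplitude (normalised)
    err = 2 * math.pi * f_signal_hz * a * jitter_s
    lsb = 2.0 / (2 ** bits)                 # one LSB spans 2/2^n of full scale
    return err / lsb

# Example: a full-scale 10 kHz tone sampled with 1 ns of clock jitter
print(f"{jitter_error_lsbs(10_000, 1e-9):.0f} LSBs at 24-bit")   # ~527 LSBs
```

Hundreds of LSBs of worst-case error from a single nanosecond of jitter is why the converters deserve the cleanest clock in the rack.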

Once the data is in digital form, the quality of the clock is largely unimportant as long as it's good enough that no data corruption occurs. In principle, noise and jitter on a clock within one bit period cannot change the data. In practice, a designer likes to keep the clock as clean as possible to allow for perturbations due to external effects such as mobile phone interference or mains-borne noise conducted through the power supply.
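As a sketch of how much margin there is once the signal is purely digital, here are the commonly quoted line rates for the two interfaces in question at 48 kHz (assumed figures, not measurements of any particular unit), compared against a deliberately poor clock:

```python
# Jitter only corrupts data on the wire if it approaches the width of one bit
# cell. Line rates below are the commonly quoted figures at 48 kHz.
RATES = {
    "S/PDIF ( 3.072 Mbit/s)": 3.072e6,    # 64 slots/frame * 48 kHz
    "ADAT   (12.288 Mbit/s)": 12.288e6,   # 256 slots/frame * 48 kHz
}
JITTER_NS = 5.0                           # a deliberately poor clock

for name, rate in RATES.items():
    bit_ns = 1e9 / rate
    print(f"{name}: bit period {bit_ns:6.1f} ns, "
          f"{JITTER_NS} ns jitter = {100 * JITTER_NS / bit_ns:4.1f}% of a bit")
```

Even 5 ns of jitter, terrible by converter standards, is only a few percent of a bit cell, which is why the data itself arrives intact.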

Storage of data as files on disk or other storage media is a very different matter, as they are simply lists of numbers with no clock involved. However, one of the slightly shocking things I found recently, when carrying out a detailed examination of the design and performance of a relatively wide range of audio interface products and their drivers, is how many of them do not guarantee to pass digital data to and from a computer in unmodified form. By this I mean that if you were to send 24-bit digital data words to an interface over an S/PDIF or ADAT connection, store them in some way on the computer and then replay them via a similar output route, the 24-bit words being replayed may not exactly match the original 24-bit words sent. I'm not in a position to name any names, but they include some whose products are accepted as being at the top of their bracket. That's not to say that a bit or two of difference at the 24-bit level is going to be audible, but the point is that there was a difference, and one has to ask why. It has nothing to do with clocking, by the way.
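A transparency test of this kind can be sketched with nothing beyond the Python standard library. The snippet below writes a 24-bit WAV containing a known, reproducible pseudo-random pattern; the idea would be to play it out of one digital port, re-record it through another, and check that every word of the capture matches (a comparison sketch appears further down the thread). The file name and parameters are placeholders.

```python
# Write a 24-bit mono WAV whose samples are a known deterministic sequence,
# for use as the source signal in a digital loopback transparency test.
import wave, random

def write_pattern(path, n_samples=48_000, seed=1234):
    rng = random.Random(seed)             # reproducible pseudo-random words
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(3)                 # 3 bytes per sample = 24-bit
        w.setframerate(48_000)
        frames = bytearray()
        for _ in range(n_samples):
            word = rng.randrange(-2**23, 2**23)   # full 24-bit signed range
            frames += word.to_bytes(3, "little", signed=True)
        w.writeframes(bytes(frames))

write_pattern("pattern_24bit.wav")
```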

pcrecord Thu, 11/06/2014 - 10:25

pcrecord, post: 420797, member: 46460 wrote: What I'm asking is whether, even with a good clock (the same one across different setups), the chipset and how the interface was built could make a difference.

Sorry for quoting myself..
It's exactly what I was asking. So I'm not crazy for thinking that the process of transferring digital audio, encoding it onto a drive, and then playing it back may not be 100% accurate (for some interfaces).

Boswell, post: 420804, member: 29034 wrote: That's not to say that a bit or two difference at the 24-bit level is going to be audible,

So you're saying that even if there is a difference, we shouldn't be able to hear it. I'll stop being paranoid then ;)

audiokid Thu, 11/06/2014 - 11:18

Boswell, post: 420804, member: 29034 wrote: By this I mean that if you were to send 24-bit digital data words to an interface over an S/PDIF or ADAT connection, store them in some way on the computer and then replay them via a similar output route, the 24-bits being replayed may not exactly match the original 24-bit words sent.

Interesting.
I'm curious how you captured and analyzed the results. Did you record the clock itself and then null multiple passes against each other, as we do when matching up transient peaks? If so, were the changes random across the timeline, or consistent in places where precise bit loss occurred? How did you actually do this?

Boswell Thu, 11/06/2014 - 11:50

Nothing so interesting, as it turned out. I used an HD24 as the source and destination of the digital data sent via ADAT, and transferred the files to a computer from the HD24 drive. It was then simply a matter of pasting the sent words as decimal numbers into one column of a (very long) spreadsheet, the received words into another column, and flagging any differences. I had to perform some other statistics on the results, but I'm not able to say much about that aspect of it.
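For anyone who would rather skip the very long spreadsheet, the same comparison can be sketched in a few lines of Python (file names are placeholders; both files are assumed to be 24-bit mono WAVs of equal length):

```python
# Flag every sample index where the received 24-bit word differs from the
# sent word - the Python equivalent of the two-column spreadsheet comparison.
import wave

def read_24bit(path):
    with wave.open(path, "rb") as w:
        raw = w.readframes(w.getnframes())
    return [int.from_bytes(raw[i:i + 3], "little", signed=True)
            for i in range(0, len(raw), 3)]

sent = read_24bit("sent.wav")
received = read_24bit("received.wav")

diffs = [(i, s, r) for i, (s, r) in enumerate(zip(sent, received)) if s != r]
print(f"{len(diffs)} of {len(sent)} samples differ")
for i, s, r in diffs[:10]:                # show the first few mismatches
    print(f"sample {i}: sent {s}, received {r}, delta {r - s}")
```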

In the case of simple comparisons, I was not really looking for consistent small differences that would be heard as very low-level noise, but searching for random lone larger differences that might point to a design or implementation problem in that particular piece of gear. The best results were when there were absolutely no differences. Surprisingly, more of the cheap units achieved this than the higher-spec devices. I guess this was due to their not having internal DSPs that the data had to pass through, probably with gain factors that were very nearly but not quite unity.
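That closing guess is easy to illustrate: a DSP gain that is very nearly, but not exactly, unity silently rewrites some 24-bit words when the result is rounded back to integers. A quick sketch, with a gain value invented purely for illustration:

```python
# Count how many 24-bit words survive a near-unity gain followed by rounding.
GAIN = 0.999999                           # illustrative, not a measured figure

words = range(-2**23, 2**23, 997)         # sparse sample of the 24-bit range
changed = sum(1 for w in words if round(w * GAIN) != w)
print(f"{changed} of {len(words)} test words altered by a gain of {GAIN}")
```

Small words round back to themselves while larger ones do not, which is exactly the kind of scattered, very-low-level difference a word-for-word comparison would flag.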
