
What does clock jitter sound like?

Discussion in 'Converters / Interfaces' started by MrEase, Jan 20, 2016.

  1. MrEase

    MrEase Active Member

    Why I started this thread!


    Recently there have been several threads where the "sound" of clock jitter has been mentioned. These discussions seem to get "lost" in the diversity of the forum and points are often repeated. Hence I thought it would be worthwhile starting this thread to collate information into a single “reference” source and hopefully debunk some of the common misconceptions about clock jitter. Hopefully others will contribute.


    OK that's why I've done this. I hope others agree and, if not, perhaps another method could be proposed.
     
  2. MrEase

    MrEase Active Member

    As I have said elsewhere, I do not think that there is such a thing as a characteristic "sound" produced by clock jitter, which makes it all the more difficult to identify. In fact I doubt that, without specific effort, anyone has really "heard" the effects of clock jitter.


    Of course to back this up I need to produce some sort of evidence for these comments. First I will refer to the "What is clock jitter?" thread which is a "sticky" at the top of this "Converters & Interfaces" Sub-Forum. Here I spent some time explaining the sort of amplitudes we could expect from given amounts of clock jitter and also how I believe that the usual "audio" method of measuring clock jitter differs from other more conventional (and meaningful) measurements used in other fields. As seems “normal” in the audio world, the measurement method employed presents the best possible figure of merit whilst giving almost no useful information about the jitter characteristics and most certainly does not give the worst-case timing errors a given clock will produce.


    First, let's make it clear what we mean by a “clock” in the electronic context. This is not a true “clock” in the sense of conveying time and date; it is more accurately described as a reference frequency, which allows many different signals to be synchronised. This frequency usually takes the form of a simple sine or square wave. For digital use, we naturally use a square wave.


    This frequency, like everything else electronic, is subject to a certain amount of noise and error which, with careful design, can be minimised but never eliminated. In the context of recording or replaying audio this noise ONLY has an effect at the A-D or D-A interface, where it causes small timing errors from the “ideal” sampling time. Inevitably this has the potential to affect the quality of our recordings.
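    As a rough illustration of the levels involved (a back-of-envelope sketch with illustrative numbers of my own, not figures from the thread): for a sine wave at frequency f, the worst-case error from a sampling-time offset t_j is set by the signal's maximum slew rate, so the error sits about 20·log10(2·π·f·t_j) dB below the signal itself:

```python
import math

def worst_case_jitter_error_db(f_hz, t_jitter_s):
    """Worst-case error of a sampled sine, relative to its own amplitude.

    A sine A*sin(2*pi*f*t) slews fastest at its zero crossings, where the
    slope is 2*pi*f*A per second.  Sampling t_jitter_s late there gives an
    error of about 2*pi*f*A*t_jitter_s, so the error relative to the signal
    amplitude is 2*pi*f*t_jitter_s -- the amplitude A cancels out.
    """
    return 20 * math.log10(2 * math.pi * f_hz * t_jitter_s)

# Illustrative numbers: a 10 kHz tone sampled with 1 ns of timing error.
print(round(worst_case_jitter_error_db(10_000, 1e-9), 1))  # → -84.0 (dB)
```

    Note that the amplitude cancels out: the error stays a fixed number of dB below the signal, a point that comes up again later in this thread.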
     
  3. MrEase

    MrEase Active Member

    To evaluate what any clock jitter will “sound” like is not straightforward, as the noise is not necessarily random and will certainly not be consistent from one sound-card model to another. It will also vary depending on whether a sound-card is slaved to another (or to a “master” clock) or uses its own internal clock.


    Let’s make a start by trying some sort of analogy. Clock jitter is actually a basic measure of what is called the “phase noise” of the reference clock, and this noise has a spectrum – just as we can see the audio spectrum with a plug-in. This noise spectrum is what can (and does) vary between sound-cards.


    Essentially what the jitter does is “phase modulate” the audio signal. Phase modulation is very similar to frequency modulation (as per FM broadcasts) and it is valid to use this as an analogy. With an FM broadcast, the audio is used to modulate (vary) the frequency of a radio “carrier” wave, which is then received and duly “demodulated” to reproduce the audio. In the case of clock jitter, the clock phase noise (jitter) is the audio source and its noise is modulated on to the audio signal – which in this analogy is the carrier wave. Whilst the analogy is not exact, it is close enough, and I hope it is sufficient to explain why the spectrum of the phase noise (jitter) is modulated on to the wanted signal. It should also be clear that different characteristics of phase noise will create different modulation “sounds”. Hence the characteristic of the phase noise is just as important, if not more so, than the jitter specs we see in the specification sheets.
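    The FM analogy can be demonstrated numerically. The following sketch (my own illustration, with arbitrarily chosen frequencies and an absurdly large 1 µs jitter so the effect is easy to measure) samples a 1 kHz tone at instants wobbled by a 200 Hz sinusoidal jitter, then inspects the spectrum with a plain DFT: sidebands appear at 1 kHz ± 200 Hz, just as FM/PM theory predicts:

```python
import cmath
import math

FS = 48_000        # sample rate, Hz
N = 4_800          # 0.1 s of samples -> 10 Hz bin spacing, all tones land on exact bins
F_TONE = 1_000     # audio tone, Hz (the "carrier" in the FM analogy)
F_JIT = 200        # jitter (phase-noise) frequency, Hz
T_JIT = 1e-6       # 1 us peak jitter: absurdly large, purely so the sidebands are visible

# Sample the tone at jittered instants t_n = n/FS + T_JIT*sin(2*pi*F_JIT*n/FS):
# this is exactly phase modulation with peak deviation 2*pi*F_TONE*T_JIT radians.
samples = [
    math.sin(2 * math.pi * F_TONE
             * (n / FS + T_JIT * math.sin(2 * math.pi * F_JIT * n / FS)))
    for n in range(N)
]

def dft_mag(freq_hz):
    """Magnitude of one DFT bin (a plain, slow DFT is fine at this size)."""
    k = round(freq_hz * N / FS)
    return abs(sum(x * cmath.exp(-2j * math.pi * k * n / N)
                   for n, x in enumerate(samples)))

carrier = dft_mag(F_TONE)
lower = dft_mag(F_TONE - F_JIT)    # sideband at 800 Hz
upper = dft_mag(F_TONE + F_JIT)    # sideband at 1200 Hz
quiet = dft_mag(1_700)             # a bin where nothing should appear

print(round(20 * math.log10(lower / carrier)))  # → -50 (dB below the tone)
```

    With random rather than sinusoidal phase noise, those discrete sidebands smear into a noise skirt around the tone, which is why the shape of the phase-noise spectrum, not a single jitter number, determines what you would hear.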


    Also I know some refer to clock jitter effects as distortion. Personally I think this can be misleading as most are familiar with harmonic distortion or intermodulation distortion. Whilst it is indeed a distortion of the audio wave, I prefer to think of it as an unwanted “modulation”. Perhaps I should refer to it as clock modulation distortion!


    I’ll try to explain more, later. In the meantime I would appreciate any input on whether you think this is a worthwhile exercise!
     
  4. Boswell

    Boswell Moderator Distinguished Member

    Yes, good move.
     
  5. DonnyThompson

    DonnyThompson Distinguished Member

    It's a great thread - well described and explained; and as far as I'm concerned, providing very pertinent - and valuable - information and definitions, in today's world of $6000 clocking devices.

    Other than that, I don't have anything else to add. I'm definitely not the guy who should be weighing in or giving advice on this subject.

    I can guarantee you that I will be following it, though. ;)

    -d
     
  6. audiokid

    audiokid Staff

    right on. (y)
     
  7. MrEase

    MrEase Active Member

    OK, so the site owner and two moderators seem to like the idea! Wow! (EDIT): Crikey, I see it's a sticky already! I hope that's a vote of confidence and also hope that I don't fall flat on my face! :unsure:

    I have to warn Boswell that I'll be hoping for the usual corrections whenever I make my careless slips. It'll be good to know you have my back! I also hope you'll chip in if you find any analogies I may think up for illustration are wide of the mark...

    As with the clock jitter thread, I'll do this in smaller chunks for two reasons. 1. Smaller chunks are easier to digest and 2. I don't have the time to do this all in one big "paper" which in any case would put many off.

    I actually have "paying" work to do right now but hope to do another instalment later today.
     
  8. MrEase

    MrEase Active Member

    As a side note, Boswell raised a point in the "Antelope Audio 10M" thread which is perhaps worth expanding on a little.

    What this means is that distortion due to clock jitter will always, for a given signal, be a fixed amount (usually measured in dB) below the peak music signal. This is quite easy to show (as I did in the "clock jitter" thread). This is not the same as the usual distortion measurements of harmonic or intermodulation distortion. With those distortions, as the signal rises in amplitude, the relative amount of distortion increases at a greater rate. In the case of intermodulation distortion, each 1 dB increase in peak amplitude results in a 2 dB increase in intermodulation distortion. Harmonic distortion does not have the same "straight line" consistency between differing levels, but it is true to say that as signal level increases, the relative distortion will too, though this may be very small until signals get quite large.

    Just a small point, but what Boswell says (as usual) is true! As he carefully notes though, his test will only indicate the possibility of jitter-induced modulation distortion and is not a definitive test. It will however discriminate quite well against intermodulation distortion.
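    The different scaling laws are easy to check with a few lines of arithmetic. In this sketch (my own numbers, with second-order intermodulation modelled as a simple x² nonlinearity of assumed strength K2), doubling the signal (+6 dB) moves the jitter error up by the same 6 dB, leaving its relative level unchanged, while the second-order product climbs 12 dB, i.e. 2 dB per 1 dB of level:

```python
import math

def db(x):
    """Convert an amplitude ratio to decibels."""
    return 20 * math.log10(x)

T_J = 1e-9    # timing error (s), illustrative
F = 1_000     # tone frequency (Hz), illustrative
K2 = 1e-4     # strength of an assumed second-order (x^2) nonlinearity

for amp in (1.0, 2.0):                            # doubling amplitude = +6 dB
    jitter_err = 2 * math.pi * F * amp * T_J      # slew-rate error: proportional to A
    imd2 = K2 * amp ** 2                          # second-order product: proportional to A^2
    print(f"signal {db(amp):+6.1f} dB | "
          f"jitter err rel. {db(jitter_err / amp):+6.1f} dB | "
          f"IMD2 rel. {db(imd2 / amp):+6.1f} dB")
```

    The jitter error's relative level is untouched by the 6 dB increase, while the x² product's relative level worsens by 6 dB (its absolute level rises 12 dB, i.e. 2 dB per 1 dB of signal).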
     
  9. DonnyThompson

    DonnyThompson Distinguished Member

    Okay... trying to understand here... bear with my ignorance...

    So then, distortion due to clock jitter - let's call it a "symptom" of jitter ... will always be of a fixed amount below the peak... but does the amplitude of a signal play a factor in the amount of jitter itself?
    Or am I bum-fuzzled on this topic as usual? ;)
     
  10. kmetal

    kmetal Kyle P. Gushue Well-Known Member

    I'm lost on this one. I think I lack the fundamentals.
     
  11. audiokid

    audiokid Staff

    Maybe we need to back up a bit and start over with what the 10M does and how any of this relates to sound.

    I see the 10M as a clock signal amplifier/ router/ junction box, stabilizer.

    Constant vs inconsistencies

    Let's ask (in layman's terms) if I am correct and then learn how this has anything to do with sound: distortion/shifting of bits that inevitably creates phase shift, with random peak bursts or drops as well.
    Related: Why is a dedicated PCIe interface more stable? Then there is USB, FW, MADI, AES/EBU, TB...
    Then there is the relation to the CPU (Mac, PC, desktop, laptop, iPad, iPhone).
    How stable the PSU is, and how that relates to/affects stability, which in turn affects sound.

    Plus, I'm also interested in whether various cable lengths affect latency? To my understanding, all the cables should be the same length?

    Am I close to where the others are lost?
     
  12. MrEase

    MrEase Active Member

    Sorry folks, I've been busy making a living! I had written a fair amount and realised that I'd been getting far too technical, so I started on a re-write - and having seen Donny's and Kmetal's posts, I guess I also need to take a little step back.

    Chris, I'm not sure how relevant your questions are to the thread as I'm not really intending to evaluate the 10M or various interfaces. I'm trying to explain how any degradation of audio occurs due to clock jitter and why no particular sound can be associated with jitter.

    @donny The errors due to clock jitter arise simply because of these facts: clock jitter will cause the audio to be sampled at slightly the wrong time. Let's take for instance a signal that is sampled 10 µs late (from "perfect" time). The sample that is measured will change in that 10 µs by an amount determined by the signal; let's say that the signal changes by 10 µV in 10 µs, then the error is simply 10 µV. Now if you increase the signal by 6 dB, the voltage doubles, so the error will now be 20 µV. That means that the error has changed by the same 6 dB that the signal was increased. Therefore, for this given example, the error remains at the same relative level to the signal itself. Note that the signal has no effect on the amount of jitter present.
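    MrEase's arithmetic can be checked on a concrete waveform. This sketch (the tone frequency and levels are my own illustrative choices) samples a sine 10 µs late near a zero crossing, then doubles the amplitude and confirms the error doubles too, leaving the error-to-signal ratio untouched:

```python
import math

F = 1_000        # tone frequency, Hz (illustrative)
T_LATE = 10e-6   # sampling 10 us after the "perfect" instant
T0 = 0.0         # sample near a zero crossing, where the signal slews fastest

def sample_error(amplitude):
    """Error from sampling amplitude*sin(2*pi*F*t) at T0 + T_LATE instead of T0."""
    ideal = amplitude * math.sin(2 * math.pi * F * T0)
    late = amplitude * math.sin(2 * math.pi * F * (T0 + T_LATE))
    return late - ideal

e1 = sample_error(1.0)    # error at the reference level
e2 = sample_error(2.0)    # error with the signal raised 6 dB (doubled)
print(round(e2 / e1, 6))  # → 2.0: the error doubled along with the signal
```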

    Does that example help?

    As before I realise that a few pictures may help but I'll need to work on setting this up.

    @kmetal Sorry if this is confusing but please bear with me, I would prefer if you understand what I'm talking about. What I've done so far is to try and present the analogy of an FM transmission that is similar to the way clock jitter can generate different "sounds", much like we recognise lots of different music or speech on an FM transmission. The phase noise (i.e. jitter) of a clock will cause a modulation on to our audio that carries the "signature" of that particular clock. That's about as far as I've got but what I want to explain is why different clocks (and slaved clocks) won't always sound the same. My first attempt is to really show what a good real world clock should do but also indicate what can go wrong in the design of these clocks. Maybe that's deeper than you want to go but I'll do my best to keep it understandable.

    To all, thanks for the feedback! Please keep it coming so I can try to pitch this so it's worthwhile - to write and read! There's no point if everyone glazes over and rolls their eyes! :)
     
  13. kmetal

    kmetal Kyle P. Gushue Well-Known Member

    Much appreciated! I'm willing to plunge as deep as you're willing to go, it just might take me longer to get there. I've grown fond of technical things over the years as they represent relative truths/untruths. Looking forward to more on this nasty little thing called jitter!
     
  14. audiokid

    audiokid Staff

    No worries, I actually think it's all relevant to the thread though. I only mention the 10M because those following this have also been following the 10M thread. The 10M isn't in any need of being judged. It's a beautiful product.

    Basically I was just expanding, to demystify some of what people think the clock is doing, when it could also be more about what it's not doing. Which is to do with why jitter happens, yes, no? Most likely my wording is not "worded" well.
     
  15. DonnyThompson

    DonnyThompson Distinguished Member

    From what I've been able to understand, the 10M is a super-uber-precise clock. And to that end, it does what it's supposed to do in the best way possible (i.e. the most accurate)... so it's not that I'm disputing the quality of the piece.

    The question, which I think has been on the minds of most here since it was first introduced for discussion, is whether a system that is already properly clocked to begin with requires something like the 10M.

    Am I correct in saying that those who are hearing a noticeable, audible improvement while using the 10M on their system are doing so because their clocking was off to begin with? At which point the 10M would indeed make a big difference; but at the same time, would it be possible that a less expensive clock could provide the same level of improvement, too?

    For the typical DAW user ( and I understand that "typical" is a loose description), let's say, someone like myself who works strictly with audio, of no more than 6 minutes in length, at 44/24 (or 32)... working with an average of 18 -24 tracks, and who isn't incorporating any sending of a signal to OB digital processing that needs to be returned to the DAW...
    I simply connect mics or instruments to a pre/i-o, ( like the Presonus 1818VSL ) and then route that signal via USB to my DAW. When I'm doing a final mix, I either render through the stereo bus ITB, or will use the 2 DAW method (on the occasion that I have the proper equipment at hand to do that). With that scenario, is there ANY reason why I would need any form of external clocking in place? And if so, why?

    I've also read (and heard) that having a highly accurate clock like the 10M might be valuable to certain specific workflows - video being one of the main types of production where it has been suggested as benefiting from an accurate external clocking device - where, over a length of time, "sync drift" can occur if the project is over a certain length...
    Is this true?

    Then we get into audio interfaces that are connected directly to the motherboard of a system; PCIe cards with formats like ADAT/Optical, MADI, etc., which are, presumably, the best current choices because the routing is so "fast" and so "direct" to the DAW, without USB or FW cables, which can't carry the same amounts of data at the same lightning-fast rates... is this a correct statement?

    I'm not trying to hijack Ease's thread here... I'm simply trying to understand the need for an expensive piece like the 10M; I'm not suggesting that it's all balloon juice, either. I'm sure that it does what it does in a very accurate way... but who actually needs a piece like this? Perhaps we could start there...
     
  16. kmetal

    kmetal Kyle P. Gushue Well-Known Member

    My question is essentially sort of like Donny's. When talking about interface clocks, what makes for better clocking, from something like a basic M-Audio to something like an Apogee Symphony? Is it the actual clock itself? Or is it the overall design and incorporation of the clock?

    And if you were running multiple digital pieces (converters/interfaces) like we are at the studio (Apogee Ensemble, Rosetta, MOTU), or a digital console with something like an SPX 90, is there a particular piece that would be better than the others to clock to? Or better than the internal clock from a DAW?

    Also, would a different DAW program, or a more or less powerful computer, affect the clocking? My (mis)understanding of that would be that as long as everything was in sync, 'marching to the same beat' or pulse, the DAW and more so the computer wouldn't have an effect.

    Again I'll mention, I know Dave said that when he clocked his Digi 002 or 003 to his Alesis HD hard disk recorder he noticed an improvement. Now, that Digi was notoriously mediocre, and conversion was the usual culprit. I believe he was on PT LE at the time.

    Which also brings up: does conversion affect jitter, or does jitter affect the actual conversion? Or is it again an overall design/integration situation?

    Are there situations where the 10M would/should show a significant improvement?
    Does it do anything relevant that a 'lesser' clock might not, be it external or internal?

    Lol, MrEase and Bos will have a whole book written by the time this thread tapers off. Thanks for your patience, guys. I feel like this is one of those topics you completely don't get until you all of a sudden get it completely, for me at least.
     
  17. MrEase

    MrEase Active Member

    @kmetal & @DonnyThompson

    Hi guys. A lot of your questions have already been covered in the "what is clock jitter?" thread. I know it's over 5 years since I did that but the facts have not changed in the interim. I cover on board clocks and slaved clocks in that thread and also evaluate the effect of "worst case" jitter scenarios. I know it's several pages but I hope it'll be worth your while sifting through it.

    What I'm trying to deal with here is going on from there. I don't really intend to re-hash the older material but to move on and concentrate on why a jitter problem could be hard to diagnose, due to there being no real characteristic sound. I do believe that some problems ascribed to clock jitter are erroneous: when clocking problems exist, they are often put down to jitter when other, greater problems exist that must be resolved first, before the finger is pointed at jitter.
     
  18. kmetal

    kmetal Kyle P. Gushue Well-Known Member

    Ok heard loud and clear, I'll read the thread and get up to speed.
     
  19. Boswell

    Boswell Moderator Distinguished Member

    I keep coming back to the point that a conversion clock (also called sampling clock) is different from a synchronising clock. They can, of course, have the same source, but what they have to do is different.

    The synchronising clock's job is to make sure the various digital gear or digital sections of gear have the same notion of what elapsed time is. If this is done, then audio sent as digital serial messages between the different pieces of equipment will be guaranteed to be sequential and unique, that is, it will not be lost or duplicated. Usually, each piece of gear has a time window somewhat less than a clock period in which new data should arrive in order to be treated correctly. It makes no difference to the audio quality of that signal exactly when in that time window the audio arrives. This makes it clear that some jitter can be tolerated on this clock with absolutely no difference to the audio result.

    The conversion clock's job is to define as precisely as possible the instants at which the sampling (input) or reconstruction (output) happens at the boundary between analog and digital sections of equipment. It does not govern what happens to the digital samples after sampling nor how digital samples arrive at the output prior to reconstruction. For the lowest conversion errors (in either direction) the sampling clock must have well-defined edges that are as repeatable in time as possible, and hence lowest jitter. Note this has no bearing on clock accuracy, that is, whether there is always a given number of sampling clock edges in some specified external period.

    Now this would all be relatively straightforward to think about if every piece of gear had a simple selection between its own internal clock source and an external clock input, and that whichever clock was selected acted as both the conversion clock and also the synchronising clock. Indeed, this was the way most digital gear operated up to a few years ago. Since then, the designers of much of the higher-end digital gear have adopted the policy that a properly-managed internal clock gives the best jitter figures, and that if external synchronisation is required, this is best done by long-term phase-locking the internal clock to the external source. This results in the internal clock taking on the exact frequency of the external clock, but keeping its superior short term jitter properties. Doing this neatly avoids jitter multiplication when cascading clocks through several pieces of gear. It has also largely removed the need for low-jitter external master clocks, frequency accuracy notwithstanding.
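    Boswell's description of phase-locking an internal clock to an external reference can be caricatured with a toy first-order loop (a deliberately simplified model of my own, not the actual circuitry he refers to): the internal phase is nudged toward the external phase with a small gain, so it takes on the external frequency in the long term while strongly attenuating fast jitter on the reference:

```python
import math

# Toy first-order phase-locked loop: the internal clock's phase is nudged
# toward the external reference's phase by a small gain G each cycle, so it
# acquires the reference frequency long-term while low-pass filtering (i.e.
# rejecting) the reference's fast phase jitter.
G = 0.01            # loop gain: small gain = narrow loop bandwidth
N = 4_000           # simulated clock cycles
JIT_AMP = 0.1       # peak phase jitter on the external reference (arbitrary units)
JIT_FREQ = 0.2      # jitter frequency in cycles per clock period (i.e. fast)

internal_dev = 0.0  # internal phase's deviation from the ideal jitter-free ramp
worst_internal = 0.0
for n in range(N):
    ext_jitter = JIT_AMP * math.sin(2 * math.pi * JIT_FREQ * n)
    internal_dev += G * (ext_jitter - internal_dev)  # pull toward the reference phase
    if n > N // 2:                                   # skip the start-up transient
        worst_internal = max(worst_internal, abs(internal_dev))

print(worst_internal < JIT_AMP / 50)  # → True: fast reference jitter is heavily attenuated
```

    Making the loop gain small (a narrow loop bandwidth) is what lets the internal oscillator keep its own superior short-term jitter while still following the external frequency in the long term.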

    I should declare my interest here to say that I have been one of the design engineers involved in this movement to use managed phase-controlled internal clocking. It's been exciting.
     
  20. audiokid

    audiokid Staff

    excellent.

    And right on Bos!
     
