Hi all
I was wondering what you learned gentlemen think about digital cables.

I just got an AKG C426, which sounds great when I'm listening through cans. However, when playing back recordings I've been disappointed. Things sounded very flat, the upper mids were what can only be described as hard, and I now understand what people mean by 'smeared'.

After a rather depressing evening on the Internet looking at the price of better converters, I decided to change cables and switched from my dirt cheap optical cable, to a more expensive coaxial cable and was VERY surprised by the difference. The sound appeared much less 'recorded'.

I'm rather cynical about stuff like this, especially where digital cables are concerned. The only logical thing I can think of is that the cheapo optical cable was causing errors. My reasoning is that the character of the mic's self-noise on recordings was different from what I heard while monitoring live, leading me to conclude that a layer of noise was being added somewhere, which I imagine would be consistent with random ones and noughts being mixed up with my signal as they replace the correct bits.

I'd be interested to know whether anyone here has noticed a difference when changing digital cables. I always assumed a bad cable would simply fail or at best cause dropouts.

Any ideas?

Cheers
John

Comments

Cucco Fri, 10/07/2005 - 05:17

Hey John!

Great question!

FWIW, I've been a 'Home Theater' magazine subscriber since the first publication. Back in the day, the magazine was run by Brent Butterworth, and Butterworth Labs would actually "TEST" equipment that came through. This was also back when they would give a product a rating of 59 out of 100!

They did a fantastic write-up on the differences in digital cables. The main focus of the article was comparing toslinks to toslinks and coax to coax. They did touch a little bit on the difference between the two types, though. (And they DIDN'T say "toslink uses fiber optics and light, therefore is obviously superior." They actually touched on jitter and timing issues and stability issues and so on.)

In the end, they were able to definitively show the difference between certain cables and were able to quantify those differences.

All that being said, this article set me into the frame of mind that maybe there is a difference in digital cables and that not all 01101001010001010010100 transmission streams are created equal.

Since then, I have actually subjectively compared digital cables and types and I have definitely found that digital cables DO in fact matter. (The computer dorks have known this for a long time. Anyone running a fiber network can tell you the quality of fiber-optic cable can make or break a network node.)

My favorite digital cables are from MIT and Kimber, though Monster makes some good ones too. (The weird thing with MC is that one of their series - pro or consumer, I don't recall which one - is far more expensive than the other, but it's the exact same cable as its other-market counterpart.)

J.

DavidSpearritt Fri, 10/07/2005 - 11:27

John

I am very confused by your post as to what you are asking. Can you clarify, please?

I just got an AKG C426, which sounds great when I'm listening through cans. However, when playing back recordings I've been disappointed.

So you mean, cans during recording direct from analog mixer, but "playback" is afterwards from the DAC-1??? What converters are you using during recording?

I decided to change cables and switched from my dirt cheap optical cable, to a more expensive coaxial cable and was VERY surprised by the difference. The sound appeared much less 'recorded'.

Between what and what? Are you using an optical cable between the DAW and the DAC1? Why not a professional quality AES cable? I use any old "decent" AES cable only for interconnects, Mogami or Canare.

In any case, the DAC1 will remove "all" jitter, but not correct gross errors; your optical cable or tx/rx connections must be damaged if you are getting gross errors between the DAW and the DAC. Your sound card does not have an AES output?

My reasoning is that the character of mics self noise on recordings was different from what I heard while monitoring live, leading me to conclude that a layer of noise was being added somewhere, which I imagine would be consistent with random ones and noughts being mixed up with my signal as they replace the correct bits.

Not sure about these conclusions. Errors have nothing to do with self-noise; I think you mean the recording noise floor. I think something is damaged in your setup.

ptr Fri, 10/07/2005 - 13:28

I never run optical cables when recording live, but I do use them at my secondary workstation for transferring DAT tapes to computer. In discussion with my local cable manufacturer (Supra, http://www.jenving…), they say that the optical conductor itself has very small loss factors (they use a medium-quality Japanese make), but that it is the termination that makes the difference (and that it's even more important with optical cables than with electrical ones).

An A/B blind listening test between a $5 cable from the local "RadioShack" equivalent (Clas Ohlsson) and the Supra X-ZAC Toslink ($120) proved the latter to give a wider spectrum and less haze. (I have not done any other comparisons, but I'm sure there are other alternatives on the market that will yield good results as well!)

To my mind, termination and plugs are the most important parts of a good cable!

/ptr

larsfarm Fri, 10/07/2005 - 14:59

ptr wrote: An A/B blind listening test

It ought to be possible to compare what is sent to what is received while still in the digital domain. You send a file and can capture a file. Then you can objectively determine the exact error measured in number of bad bits for a particular transfer over a particular cable. This result should match your listening tests...
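A minimal sketch of that comparison, assuming the sent and received audio have been captured as two raw PCM files of equal length (the file names here are just placeholders):

```python
# Count bit errors between a file as sent and as received over a digital cable.
# Assumes both captures are raw PCM of the same length and sample format;
# the file names below are placeholders.

def count_bit_errors(sent_path: str, received_path: str) -> int:
    with open(sent_path, "rb") as f:
        sent = f.read()
    with open(received_path, "rb") as f:
        received = f.read()
    if len(sent) != len(received):
        raise ValueError("captures differ in length; align them first")
    # XOR each byte pair: set bits mark positions where the streams disagree.
    return sum(bin(a ^ b).count("1") for a, b in zip(sent, received))

if __name__ == "__main__":
    errors = count_bit_errors("sent.pcm", "received.pcm")
    print(f"{errors} bit errors in the transfer")
```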

L

Cucco Fri, 10/07/2005 - 18:01

larsfarm wrote: [quote=ptr]An A/B blind listening test

It ought to be possible to compare what is sent to what is received while still in the digital domain. You send a file and can capture a file. Then you can objectively determine the exact error measured in number of bad bits for a particular transfer over a particular cable. This result should match your listening tests...

L

Agreed.

When I have 10 spare minutes, I'll get on it.

J.

John Stafford Fri, 10/07/2005 - 19:22

DavidSpearritt wrote:

My reasoning is that the character of mics self noise on recordings was different from what I heard while monitoring live, leading me to conclude that a layer of noise was being added somewhere, which I imagine would be consistent with random ones and noughts being mixed up with my signal as they replace the correct bits.

Not sure about these conclusions, errors has nothing to do with self noise, I think you mean recording noise floor. I think something is damaged in your setup.

Hi David
When I use a decent cable to the DAC-1, the familiar self-noise of the mic is apparent on playback (with little other significant noise) so this is not a recording noise issue. I'm not confusing self-noise with recording noise, nor am I talking about jitter. Perhaps I should have explained that the first thing I do when experimenting with any recording setup is to listen to how faithfully the character of the mic's self-noise is captured.

I think it is reasonable to assume that random errors in the data stream will manifest themselves as something resembling white noise. I have played around with adding noise to recordings, and the first thing to happen is that the perceived character of the noise on the recording changes (in this case the mic's self-noise); it is difficult to perceive two random noise sources as separate. BTW, if you take an audio file and add random single-bit errors, something similar happens, but I must admit it is a long time since I deliberately added errors to a file.
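For the curious, here is a rough sketch of that kind of experiment in Python, flipping random single bits in 16-bit PCM samples to simulate uncorrected transmission errors (the error rate is just an illustrative number, not a claim about any particular cable):

```python
import random
import struct

def corrupt_pcm16(samples, error_rate=1e-5, seed=0):
    """Flip one random bit in a small fraction of 16-bit samples to mimic
    uncorrected single-bit transmission errors."""
    rng = random.Random(seed)
    out = list(samples)
    for i in range(len(out)):
        if rng.random() < error_rate:
            bit = rng.randrange(16)          # any bit, LSB through sign bit
            raw = out[i] & 0xFFFF            # view sample as unsigned 16-bit
            raw ^= 1 << bit                  # flip the chosen bit
            out[i] = struct.unpack("<h", struct.pack("<H", raw))[0]
    return out

# Example: corrupt a second of digital silence and see how large the hits get.
clean = [0] * 48000
dirty = corrupt_pcm16(clean, error_rate=1e-4)
print(max(abs(s) for s in dirty))   # errors in high bits show up as big spikes
```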

I am just talking about what happens when I use the lowest quality optical cable it is possible to buy, and whether this cable is likely to cause data errors. This is the only optical cable I have, and I generally use higher quality cables, which happen to be coaxial. I plugged this crappy cable in a while ago, but then forgot about it. As I hadn't tried out a new mic in a while, I was alarmed that the difference between the live sound and the recorded one seemed greater than what I expected based on past experience.

John

John Stafford Fri, 10/07/2005 - 20:32

Cucco wrote:

In the end, they were able to definitively show the difference between certain cables and were able to quantify those differences.

All that being said, this article set me into the frame of mind that maybe there is a difference in digital cables and that not all 01101001010001010010100 transmission streams are created equal.

Since then, I have actually subjectively compared digital cables and types and I have definitely found that digital cables DO in fact matter. (The computer dorks have known this for a long time. Anyone running a fiber network can tell you the quality of fiber-optic cable can make or break a network node.)

J.

Hi Jeremy
This whole area is interesting. I remember being at a Nordost demo, and listening to some of their cables. Apparently a lot of their research is sponsored by mega corporations trying to improve data transmission. Some of their conclusions are interesting, but whether or not they are using their other work to get credibility from the audio community, their cables do make a difference. As I've said before, I'm not necessarily convinced that these changes in the sound brought about by their more expensive products are for the better, but that's a different issue....

I'd like to try out some more digital cables to see if I can tell one from the other at home. One concern is that I don't know about the best way to get digital in and out of my computer. I'm using an Edirol box to feed my DAC, but I don't see too many alternatives for a laptop (sorry David, you asked me what I was using. I use an Apogee Mini-Me through USB to get data onto my computer). The Edirol messes with the signal to set the copy protection bit, so I wonder if it also has a detrimental effect on the rest of the signal. I assume the components in this thing are the cheapest of the cheap. At least they can write drivers properly!

One thing I am curious about is the level of noise in digital cables. I wonder if a high level of background noise could cause some confusion at the other end. One of the things I remember Apogee saying was important in the design of the Big Ben was very clear articulation of the signal; the effect of sloppy rise times is often overlooked and can cause problems when devices are communicating. There were so many other issues as well that made for an interesting read.

It would be interesting to get a selection of cables and send directly from the ADC to the DAC. Unfortunately the Mini-Me doesn't have optical out, so I can't do this with my dirt cheap cable.

End of rant...
John

DavidSpearritt Sat, 10/08/2005 - 02:33

I think it is reasonable to assume that random errors in the data stream will manifest themselves as something resembling white noise.

Not sure that this is a safe, correct or useful assumption.

By definition, error correction in a digital stream is inaudible, because the data stream is restored perfectly, inaudibly, without your knowledge. Error correction will work happily away correcting a few errors here and there.

If you have errors that cannot be corrected, and they are dense in the stream and the results audible, then your digital transmission system is thoroughly broken.

There is no in-between with all this. That's the beauty of digital. In a healthy system, therefore, cables make no difference.

John Stafford Sat, 10/08/2005 - 23:28

DavidSpearritt wrote:

I think it is reasonable to assume that random errors in the data stream will manifest themselves as something resembling white noise.

Not sure that this is a safe, correct or useful assumption.

How else could highly random errors manifest themselves?

By definition, error correction in a digital stream is inaudible, because the data stream is restored perfectly, inaudibly, without your knowledge.

There is no error correction of data when passing digital audio through a cable using consumer S/PDIF. With professional S/PDIF and AES/EBU, error detection can be carried out so a receiving device can deal with a bad sample by removing it or muting it. The data stream is not restored perfectly.
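To illustrate the sort of per-sample check that is available: an AES3/S-PDIF subframe carries an even-parity bit over its payload, so a receiver can detect (but not repair) a corrupted sample and, for example, hold or mute it. A simplified sketch in Python, assuming the subframe has already been decoded into a 32-bit integer (no biphase-mark decoding or preamble handling here):

```python
def subframe_parity_ok(subframe: int) -> bool:
    """Even-parity check over time slots 4..31 of a 32-bit AES3/S-PDIF
    subframe (audio bits, V/U/C flags and the parity bit itself).
    The preamble in slots 0..3 is excluded."""
    payload = (subframe >> 4) & 0x0FFFFFFF          # slots 4..31
    return bin(payload).count("1") % 2 == 0

def handle_sample(subframe: int, previous_sample: int) -> int:
    """Mimic a receiver that holds the previous sample when a subframe
    fails parity, instead of passing the corrupted value on."""
    audio = (subframe >> 4) & 0xFFFFFF              # 24 audio bits, slots 4..27
    return audio if subframe_parity_ok(subframe) else previous_sample
```

This is only detection and concealment; nothing in the interface can reconstruct the original bits the way the error-correction layer on a CD can.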

John

larsfarm Sun, 10/09/2005 - 02:03

John Stafford wrote: How else could highly random errors manifest themselves?

Some errors will be in the least significant bit, some in the most significant bit, others in between. Some will be 1 instead of 0. Some 0 instead of 1. So, it is not just an even level hiss or sound at some barely audible level from randomness in the low order bits (like dither). It must also be random in volume. From a full throttle spike where it should be silent to silent where it should be full speed ahead.
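To put rough numbers on that: a single flipped bit changes a sample by 2^k, so the damage ranges from one LSB (buried far below the noise floor) up to half of full scale, depending on which bit is hit. A quick sketch for 16-bit samples:

```python
# Magnitude of a single-bit error in a 16-bit sample, per bit position,
# relative to full scale (32768). A hit on bit 0 is roughly -90 dBFS;
# a hit on the top bit is a full-scale spike.
import math

FULL_SCALE = 2 ** 15
for bit in range(16):
    error = 2 ** bit
    level = 20 * math.log10(error / FULL_SCALE)
    print(f"bit {bit:2d}: error = {error:6d}  ({level:6.1f} dBFS)")
```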

L

DavidSpearritt Sun, 10/09/2005 - 04:17

If one end of your cable sends a 1 and the other end says it's a zero, then your cable is broken. The voltage tolerances for data definition in the cable are very big, to remove the cable from the equation.

How many times have you had to change the cable joining your hard drive to your motherboard, your floppy drive cables, or your USB cables from camera to PC? The list goes on; these are very tolerant of manufacturing margins.

There is no clocking info going down the cable, just data words that are reclocked/synched at each end.

If you can hear errors with any cable, then the conclusion that should be drawn is that it's broken and faulty, and not necessarily because it's cheap quality, although sometimes these intersect.

An optical cable should sound the same as a coaxial S/PDIF cable going from DAW to DAC1, if nothing is broken.

If cables made a significant difference, where's the published gold standard on specs and choosing them, brands, recommendations etc. from the audio community?


Cucco Sun, 10/09/2005 - 07:41

Umm...

David,

I think you're a bit wrong on this one.

What gets transmitted is not what gets received. Not even with the best cable in the world is this true. 1s and 0s do get shuffled. The reasons for this are numerous. The most common in coax/AES/computer cables is electrical inductance within the cable itself. With fiber, it's often due either to imperfections in the fiber or to faulty or bad connectors. (Though, because of the gap between connector and receptor, there is inherently some data loss as light has a phenomenon called "fall-off.")

A device receiving a bit stream can reconstruct it even with thousands of errors in the stream. However, it too can make mistakes: it interprets what it has been programmed to interpret, and that interpretation can be wrong.

The reality is digital cables are different as is the quality of the cable. There is tons of research saying so and I don't know of one scientific finding which says otherwise. (Seriously, not one.)

So far, John has been pretty much spot on with his assessment.

J.

DavidSpearritt Mon, 10/10/2005 - 17:58

larsfarm wrote: [quote=John Stafford]How else could highly random errors manifest themselves?

Some errors will be in the least significant bit, some in the most significant bit, others in between. Some will be 1 instead of 0. Some 0 instead of 1. So, it is not just an even level hiss or sound at some barely audible level from randomness in the low order bits (like dither). It must also be random in volume. From a full throttle spike where it should be silent to silent where it should be full speed ahead.

L

Hit the nail on the head, larsfarm. It will never be just the LSBs giving errors; why would they be singled out?

anonymous Wed, 10/12/2005 - 09:45

One of the geniuses of teaching digital audio, John Watkinson, would say that if you notice any difference between digital cables, then your DAC is broken. (See "The Art of Digital Audio", Focal Press.)

Jitter is not added in a digital transfer and jitter is not added from the digital output of your gear. Any jitter in the signal is encoded at the AD converter used to make the recording. Your DAC should minimize any jitter that was encoded.

As long as the impedance of the digital cable is within spec, and your run length is within spec, there should not be any difference between these cables.

The cable either works or it does not work. There should be no gradations of quality.

Zilla Wed, 10/12/2005 - 12:06

Plush wrote: ...if you notice any difference between digital cables, then your DAC is broken.

Maybe so. I would then have to conclude that every DAC/digital receiver I have been exposed to would be categorized as "broken".

Plush wrote: ...should minimize any jitter ...should be not any difference between these cables ...should be no gradations of quality.

I, too, have to qualify my digital theory statements with "should" (i.e., expressing expectation or probability). I have to use "should" because my listening experience does not agree with what I intellectually would expect of results from different interconnects.

John Stafford Wed, 10/12/2005 - 20:35

One thing I neglected to mention about random changes in digital data through consumer S/PDIF.

For some reason, big spikes were far less apparent than the one in twenty-four errors one might expect. Maybe this has something to do with an attempt at error correction in the receiving devices by filtering out big spikes, given that sophisticated error correction as used on CDs is not an option with this format. If this is the case, it would of course depend entirely on the specific equipment used. Just for the record, this was 1997, using Common LISP Music to manipulate files to simulate transmission errors. Maybe a lot has changed since then.

Come to think of it, it was 16-bit in those days :wink:

It could be an interesting experiment to transmit a stream with a lot of deliberate errors in the Validity bit with none elsewhere, just to see how DACs handle this sort of thing.

As far as cables go, I don't suppose there will ever be universal agreement. Maybe we should call a truce :wink:

John

mdemeyer Wed, 10/12/2005 - 21:53

Gents,

Two totally different issues here:

Errors in Transmission

Should be really rare unless things are really broken. Just look at the waveforms on your DAW and see if there are errors. They will be random spikes and easy to see if you zoom in enough (in the time domain). As was pointed out (correctly), there is no mechanism to gravitate these errors to the LSB, so they will not just cause harmless noise. This cannot be a common problem, or we would all be recording crap when we use external A/Ds connected with AES connections.
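If you would rather not hunt for them by eye, a crude sketch of the same idea in Python, flagging sample-to-sample jumps far larger than the surrounding material (the threshold is just an illustrative guess; real programme material with sharp transients would need a smarter test):

```python
def find_spikes(samples, threshold=0.5):
    """Return indices where consecutive samples jump by more than
    `threshold` of full scale - a crude flag for isolated bit-error
    spikes in otherwise continuous material.
    `samples` are floats in the range -1.0 .. 1.0."""
    return [
        i for i in range(1, len(samples))
        if abs(samples[i] - samples[i - 1]) > threshold
    ]

# Example: a quiet passage with one corrupted sample.
quiet = [0.001] * 1000
quiet[500] = 0.9
print(find_spikes(quiet))   # -> [500, 501]
```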

Jitter

The S/PDIF and AES connections carry not only data but also clocking info. If you are using this interface as a data transfer mechanism (to move PCM data from an external A/D to your PC hard drive, for example), the clocking info is generally not used except to determine the average sample rate. In that case, it is purely a data transfer, and the jitter inherent in these interfaces is really not an issue.

But if you are sending the stream to a DAC for real-time playback, the DAC will (generally) derive the D/A playback clock in some way from the clocking info carried by the stream. DACs vary widely in the way they do this, up to and including some extreme approaches with fairly large amounts of memory and re-clocking.

Jitter on the S/PDIF and/or AES interface is a well documented issue, scientifically proven and measurable, and it will affect playback when feeding a DAC directly. This was known a very long time ago (at least in digital-age years). Different interface implementations (especially cheap TOSLINK implementations) can have limited bandwidth, which degrades the timing information carried by the stream. But this used to be a lot worse than it is today...
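As a rough illustration of why clock jitter at the DAC matters at all: a timing error converts to an amplitude error of roughly the signal's slew rate times the timing error, so it grows with frequency and level. A back-of-the-envelope sketch under that textbook assumption:

```python
import math

def jitter_error_dbfs(freq_hz: float, jitter_s: float) -> float:
    """Worst-case amplitude error of a full-scale sine reproduced with a
    clock that is off by `jitter_s` seconds: error ~= slew rate * dt,
    where the peak slew rate of sin(2*pi*f*t) is 2*pi*f (relative to full scale)."""
    error = 2 * math.pi * freq_hz * jitter_s
    return 20 * math.log10(error)

# A 10 kHz tone with 1 ns of jitter: roughly -84 dBFS of timing-induced error.
print(f"{jitter_error_dbfs(10_000, 1e-9):.1f} dBFS")
```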

Michael
(formerly with Wadia Digital)
