What is the difference between these two cables, and can I use a standard XLR interconnect -- from an interface's AES/EBU out to the digital AES/EBU in on a DAT recorder?

thanks.

-ptfigg.

Comments

tnjazz Tue, 10/11/2005 - 08:56

AES/EBU is digital
XLR mic cable is analog
Although the interconnect is the same.

AES/EBU cable is 110 ohm balanced.
Mic cable is typically 66-75 ohm (isn't it?)

BUT you could use a regular XLR as an AES/EBU cable (just like you could use a regular old RCA cable as a coax S/PDIF cable), just know it would be more susceptible to noise and interference.

Dirk

FifthCircle Tue, 10/11/2005 - 10:28

A lot of audiophile engineers will use digital cable instead of standard mic cable because they prefer the sound... I've always taken the approach that cables really don't have a sound, however.

AES on an XLR connector actually had 110 ohms chosen as its impedance because that is pretty close to the typical impedance of a standard mic cable. The "digital" cables perhaps get a touch tighter quality control on that spec, because digital audio is a bit more "finicky" in transmission than analog - that is, there is more that can corrupt a digital signal on its way down a cable than an analog one.

As far as the construction of an AES cable versus a microphone cable goes, they are basically identical.

To connect to your DAT machine, I'd just make sure the cable is under about 15 feet long if it is a microphone cable. I've sent signals further, but at longer distances you run a higher chance of having problems.

--Ben

TeddyG Wed, 10/12/2005 - 18:22

Cables don't have their own sound!?!?! That is heresy! Possibly even himesy!! You mean a 500 dollar 3 foot audio cable is no better than a 19 dollar audio cable??? I can see snake-oil sales pe---- I mean, fancy cable makers all over the world holding "going out of business" sales soon, just as a result of your ill-considered email!!!

For shame! You cad! How dare you! Etc., etc., etc.

Never fails to amaze me what people swallow whole,

Teddy G.

ghellquist Sat, 10/15/2005 - 08:42

Hmm. To put it very simply.

Digital information on a cable either arrives unchanged or changed. That is the digital part of things: a 0 either stays a 0 or becomes a 1. There is nothing else to choose from (I am simplifying a bit here; there is also timing).

An analog signal on a cable is always changed a little. It never comes out exactly the same at the other end.

Now, when you send digital information over a cable, the signal on the wire is actually analog. There are no digital cables as such; they are all analog. So what you are doing is introducing a bit of analog change to the signal. This change can be described as an error probability: you might say that if I send one million bits over a cable, about 5 of them will be changed. Every digital system has some kind of tolerance to these errors - it might just chug along happily, detect the error, or actually correct it.
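
A quick back-of-the-envelope sketch of that error-probability idea in Python - the bit error rate here is a made-up figure for illustration, not a measured property of any real cable:

```python
# Back-of-the-envelope take on the error-probability idea above.
# The bit error rate (BER) is a made-up illustrative figure.
BER = 5e-6  # assume 5 flipped bits per million sent

def p_error_free(n_bits: int, ber: float = BER) -> float:
    """Probability that a stream of n_bits arrives with no flipped bits."""
    return (1.0 - ber) ** n_bits

# One second of 2-channel, 24-bit, 48 kHz audio payload:
bits_per_second = 2 * 24 * 48_000
print(f"P(error-free second) = {p_error_free(bits_per_second):.2e}")
print(f"Expected flipped bits per second = {bits_per_second * BER:.1f}")
```

With that made-up BER, a completely error-free second is very unlikely, yet a dozen flipped bits out of 2.3 million is exactly the sort of thing a digital system's error tolerance is there to absorb.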

If you use the "wrong" cable for transmitting digital information, the probability of error might increase. How much depends on a lot of factors. The AES/EBU cable system is carefully designed to let you run very long cables, 100m (300 feet) or more, without getting many errors. If you use the "right" cable it will still have a very low error probability. Use a long cable of the "wrong" type and the error probability can be very high.

Now, if you use a very short cable, say a few meters, you will probably get away with just about any cable. The problem is that your system probably gives you no way to check. There is probably no "error meter" in the system; the only thing left for you is to listen. So if it sounds good, it is good, but it might still have quite a few errors.

My suggestion is to spend the extra few bucks on getting the "right" kind of cable. Some day it might make a difference.

Interesting web page
http://www.nt-instr…

Gunnar

hociman Sat, 10/15/2005 - 09:38

cables

Cables that are marketed as AES/EBU cables often have two (2) characteristics that cables marketed as microphone cables do not.

1. Lower capacitance.
2. Gold plated XLR connectors.

Personally, I buy Canare AES/EBU cable when I need to, but I have not used Monster, so I cannot comment on its worthiness (or lack thereof).

foxint Sat, 01/28/2017 - 22:42

Hi Guys - great comments.

Please excuse me - I am a novice home audio guy who mostly has Pro Audio stuff, due to the lack of BS and the honesty of you guys. Audiophiles tend to make a simple matter complex.

Can anyone correct me if I am wrong about AES/EBU cable vs "normal" balanced cable? Some say use AES/EBU cable. Great, but is there any physical difference in the cable (aside from the impedance)? From my limited understanding, the only real difference is that AES/EBU cable is more carefully made (?) and is 110 ohms - which, as I understand it, is achieved by using thinner wire.

Is this the simple answer? I am only a simple man. Sorry again, but I keep reading about cable and there is a lot of "stuff" hiding the simple answer.

Boswell Sun, 01/29/2017 - 10:11

foxint, post: 447048, member: 50348 wrote: Can anyone correct me if I am wrong about AES/EBU cable vs "normal" balanced cable? Some say use AES/EBU cable. Great, but is there any physical difference in the cable (aside from the impedance)? From my limited understanding, the only real difference is that AES/EBU cable is more carefully made (?) and is 110 ohms - which, as I understand it, is achieved by using thinner wire.

That's right. AES/EBU cable has closer control during manufacture to maintain its characteristic impedance along its length. In quality balanced audio cable, other parameters such as low microphony and screening out external hum fields are usually more important. Standard analogue audio use does not depend on the cable's characteristic impedance.
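
As a rough illustration of what "characteristic impedance" means here: for a near-lossless line it falls out of the cable's per-metre inductance and capacitance, Z0 = sqrt(L/C). A minimal sketch, with per-metre values that are assumptions picked purely to land near the AES/EBU figure:

```python
from math import sqrt

# Characteristic impedance of a (near-lossless) line: Z0 = sqrt(L/C).
# Per-metre values below are illustrative assumptions, not measured data.
L_per_m = 0.6e-6   # inductance, henries per metre (assumed)
C_per_m = 50e-12   # capacitance, farads per metre (assumed)

Z0 = sqrt(L_per_m / C_per_m)
print(f"Z0 = {Z0:.0f} ohms")  # ~110 ohms with these assumed values
```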

Paul Spoltore Sun, 01/20/2019 - 11:33

The big issue is matching the impedance at the terminal end - you need to absorb all the power of the digital transmitter's signal, or it will "reflect" back to the transmitter and may continue back to the receiver again, creating ringing on the cable. That's why many electronic chains have termination resistors (to draw unused signal power down to ground so that it does not create noise). This gets worse as cables get longer (it's all based on the time the signal takes to bounce back and forth, which depends on the length of the cable).
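
A minimal sketch of that reflection idea, assuming a 110 ohm line; the load values are hypothetical examples:

```python
# Voltage reflection at the far end of a line: the fraction of the wave
# that bounces back is gamma = (Z_load - Z0) / (Z_load + Z0).
# Load values below are hypothetical examples.

def reflection_coefficient(z_load: float, z0: float = 110.0) -> float:
    """Fraction of the incident voltage reflected at the termination."""
    return (z_load - z0) / (z_load + z0)

print(reflection_coefficient(110.0))  # matched: 0.0, nothing bounces back
print(reflection_coefficient(75.0))   # mismatched cable: about -0.19
print(reflection_coefficient(1e12))   # unterminated (open): about +1.0
```

A matched termination reflects nothing; the worse the mismatch, the bigger the bounce, and the longer the cable, the longer that bounce rattles around.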

Best answer - if it's digital and the engineers designed it to use AES/EBU cable, then use that cable. This is less of a problem going the other way (digital cable for analogue), because you may hear issues and can adjust for them with EQ or other filters. But most systems these days go digital somewhere, so it eventually becomes an issue.

cyrano Sun, 01/20/2019 - 14:49

Cable is just cable.

A good analog mic cable is twisted and shielded.

A good digital cable is twisted and shielded too.

A short cable will give identical results, whichever of the above you use.

Longer cables had better be the specified impedance for digital and low capacitance for analog. If not, analog will lose some highs and digital might not work at all.

Heck, even DMX cable will work over a short distance.

paulears Tue, 01/22/2019 - 06:03

There's so much confusing information for people new to this - it's the sort of topic that generates all kinds of misunderstanding.

Remember that screened cable is not even a requirement. Broadcasters in the US and the UK used plain old copper cable with no screen for years to feed their remote radio towers, on telecoms company twin-wire circuits. Somebody would shove a jack into a frame and make the circuit.

Cables have properties and they're not that difficult to explain. People mention impedance, and of course there is a resistive component too. The comment that when you shove audio down a cable it gets worse in "quality" is valid. However, the best case is simply that it comes out a tiny bit quieter - some of the input was wasted generating a tiny (and usually totally unimportant) bit of heat in the cable. Unless the loss is huge, that's a tweak to a gain knob. If the loss IS huge, then you'd need to boost it - again a small extra loss of quality, usually in signal-to-noise ratio. Cables also exhibit capacitance, which has an impact on frequency response - varying from can't-be-heard to terribly dull and bass-heavy.
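
To put a rough number on that capacitance effect: the source's output impedance and the cable's capacitance form a simple low-pass filter with corner frequency f_c = 1/(2*pi*R*C). A sketch with assumed, purely illustrative values:

```python
from math import pi

# The source's output impedance and the cable's capacitance form a simple
# RC low-pass filter with corner frequency f_c = 1 / (2 * pi * R * C).
# Both values below are illustrative assumptions, not measurements.
R_SOURCE = 600.0    # ohms, assumed source output impedance
C_PER_M = 100e-12   # farads of cable capacitance per metre, assumed

for metres in (3, 30, 300):
    f_c = 1.0 / (2 * pi * R_SOURCE * C_PER_M * metres)
    print(f"{metres:>3} m of cable -> corner at roughly {f_c/1000:,.0f} kHz")
```

With these guesses, only the very long runs pull the corner anywhere near the audible band - which squares with the "usually totally unimportant" point above.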

All these problems apply equally to analogue and digital. The only difference is how the audio suffers. In analogue, the audio gets duller and quieter. At some point, attempts to EQ and boost the signal fail because the signal-to-noise ratio is just too bad.

In digital, the audio is impacted differently. Digital audio is a conversion from an analogue signal into a stream of bits, and at the other end the conversion is reversed. We all know that how well this conversion is done has an impact on quality. We can argue the toss on MP3 vs lossless, but the upshot is that quality is usually fit for purpose. We still have exactly the same cable issues. We send waveforms with nice hard edges down the cable, but if we suffer signal level loss, is getting 4.5V instead of 5V going to be a killer? Probably not. Let's assume we are sending older digital audio down the cable: 48 thousand samples of it every second. Maybe the converters can cope with 4.5V instead of 5V, but at 4V they start to struggle. The cable might also have the frequency response problem from the capacitance and the length.

Digital systems can be quite clever. If one bit of that info gets wrongly re-converted, or even worse, is totally missing because it got lost - what happens? Usually the system detects the error. A bit is missing. The one that went before was a '1', so the error correction makes a guess and pops in a '1', even if it really should have been a '0'. We don't hear it. Systems right from the start were able to make up the missing data and we really didn't notice. At some point the designers' common sense says hang on - we're correcting so many bits that we have to stop - and the audio stream grinds to a halt.

Early DAT recorders would show you on a display how many errors per second were being corrected, and when mine was running it would show anywhere from none to maybe a few hundred every second. No clicks or pops, no dull-sounding audio and no break in the music. If it got above 400 or so, you knew you were on dodgy ground, because the next step was silence and a machine that would simply stop!
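
A toy sketch of that "make up the missing data" idea - a deliberate simplification, not how DAT's actual error correction works:

```python
from typing import Optional

# Toy concealment in the spirit described above: when a sample arrives
# flagged as lost, repeat the last good one. Real DAT machines use
# Reed-Solomon correction plus interpolation; this only illustrates
# the guessing idea.
def conceal(samples: list[Optional[int]]) -> tuple[list[int], int]:
    """Replace lost samples (None) with the previous good value."""
    out: list[int] = []
    concealed = 0
    last_good = 0
    for s in samples:
        if s is None:              # receiver flagged this one as lost
            out.append(last_good)  # guess: repeat what came before
            concealed += 1
        else:
            out.append(s)
            last_good = s
    return out, concealed

stream = [10, 12, None, 15, None, None, 20]
fixed, n = conceal(stream)
print(fixed, f"({n} samples concealed)")
# [10, 12, 12, 15, 15, 15, 20] (3 samples concealed)
```

A counter like `concealed` is essentially what those old DAT error displays were showing: how often the machine had to guess.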

Cable introduces changes. How the changes show up depends on what the audio is. Somebody figured out that the best digital interface treated the cable like an RF system, where the signal is optimised for the characteristics of the cable and the devices at either end - so they picked 110 ohm impedance. You can use any old balanced audio cable, but the best and least compromised cable will be 110 ohm. Using higher or lower impedance cable means a shorter run before issues start. Lighting DMX cable is also supposed to be 110 ohm, but loads of people, just like in audio, use anything.

In analogue or digital audio, video of both types and lighting control, all that really matters is using the most appropriate cable for the job. If you need distance, then proper, professional cable with known characteristics is required. For any of these uses over a couple of metres or so, ANY cable will work. Cable connectors are also nothing to get excited by. Specifications usually detail a connector type - AES/EBU digital specifies 3-pin XLR. DMX specifies 5-pin XLR, but thousands of bits of DMX kit use 3-pin connectors. It really isn't an issue. These XLRs are NOT constant-impedance connectors - there is no special 110 ohm version.

As for some cables performing better than others - they do, BUT based on physics, not cost. Your digital links may or may not force the receiver to error-correct. If they have to, then the sound HAS been compromised. Golden-eared people claim to hear it. I only hear the silence when error correction fails, or the weird artefacts - strange clicks or pops - as the receiving device tries hard to assemble a solid stream from missing bits! In analogue, poorer cable can let in interference, and it can make the HF drop in level. In digital the same things are happening, but digital reacts differently. Maybe the lack of HF means the cable is only good for 10 m, and fifteen metres makes the error correction give up? It is, though, exactly the same physics at play.

For those that get confused by this notion of capacitance having an effect, just remember what it does inside an electric guitar's tone control - more capacitance means a mellower tone; that's all a tone control is. Nobody says a mellow tone is "bad" and a bright tone is "good" - yet in a cable, some people do! You can make a mellow tone brighter with EQ - up to a point. You could, though people rarely do, make a bright tone mellow in an audio circuit.

Anyone got cables that work a bit like a microphone? The ones where you can tap them and hear the tap, or they rustle - some really bad ones can even work like a mic: yell at them with the volume up and you can hear a voice! Not magic, just capacitance at work again.

Cables do not care what goes down them. It's just electrons in a pipe. Don't think of them as magical. They have physical properties: toughness, thickness, the ability to resist being squashed or cut. They also let you put a voltage down them, and all they need to do is maximise the transfer - nothing more. Resistance, impedance and capacitance - that really is about it. Remember the old telephone company cables - they had an impedance too. In the UK it was 600 ohms. That was all people needed to know to let a couple of twisted bits of copper send broadcast-quality audio quite a way!
