
Now here's something very scary.
A friend of mine found out about the following, but I checked for myself and it is true...

Create a new session at 24-bit resolution, with, say, 8 tracks if you have a 192 interface. My session was in WAV format, running at 48kHz.
Arm all eight tracks from inputs one to eight.
You need an external tone generator with plenty of gain (we both used our console's 1kHz generator with an outboard SSL XLogic channel to get enough clean gain to reach 0dB full scale). I also need you to monitor the output of your system.

Feed the 1kHz tone to one input at a time and crank up the gain until you hear it distort. Check the session meters. Do this for every single input.
Strangely, you will find that several of the eight inputs distort before turning red on the session track meters. The others distort once you're in the red. If you switch the track I/O volume window to peak, you'll see that it distorts once you hit 0dBFS (that's reassuring), but on the misbehaving tracks it takes a whole lot of gain before you hit the red "LED". On the other hand, if you look at your 192 interface meters, they all turn red when they should (once you hit full scale).
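(For clarity, here is a rough sketch, in Python, of what I assume a track meter's clip indicator is supposed to do: take the block's peak, convert it to dBFS, and go red once it reaches full scale. The full-scale constant and the function names are just for illustration, not how Pro Tools actually computes it.)

```python
import math

FULL_SCALE = 2 ** 23 - 1  # assumed max positive value for 24-bit signed samples

def peak_dbfs(samples):
    """Return the peak level of a block of integer samples in dBFS."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return float("-inf")
    return 20.0 * math.log10(peak / FULL_SCALE)

def clip_light(samples, threshold_dbfs=0.0):
    """True if the block's peak reaches the clip threshold (0dBFS by default)."""
    return peak_dbfs(samples) >= threshold_dbfs

# A block that just touches full scale should light the clip indicator:
block = [0, 4_000_000, FULL_SCALE, -2_000_000]
print(round(peak_dbfs(block), 1), clip_light(block))  # 0.0 True
```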

This is very annoying. Let's say you're tracking drums with a whole bunch of tracks. Sometimes you need to trust the meters, because distortion can be hidden in the tracking mix, or happen on a single hit in the whole take, right? Well, you might be hitting full scale and beyond without the tracks turning red! How can you trust your system then?
I'm wondering if anyone else out there has witnessed the same thing. I have not checked other interfaces myself yet. My friend says the 96's inputs are responding OK. I wonder about the 888s, Digi 001 and 002...
Is this a bug? It seems there is a communication problem between the software and the analog inputs of the 192 interfaces...
Amptec, one of the most serious Digidesign dealers in Belgium, has been kind enough to swap interfaces, just to make sure we did not have faulty hardware. They also personally recalibrated the interfaces. It does not make any difference. One more detail: each 192 interface has a different set of tracks measuring wrong... hihihi...

Oh, and I forgot, I tried this at 16 bits and the system works fine...

Time to do a little research for yourself and share...

Can't wait for your replies. Good night!


Comments

Kev Fri, 09/30/2005 - 17:06

not a bug that I have ever found
I have very consistent results

first
it is very anal to want things to clip just as you get to 0dBFS.
DON'T get me wrong
I do understand what you are doing and do have things aligned to this accuracy.
anal is good

second
how are you measuring your levels ?
... that may seem like a patronising question but it is important.

I am in the process of writing a couple of articles for protoolsforum.com and for my own sites, totally related to levels and interfacing DAWs and specifically Digidesign stuff.
YES
I know it has all been done before, but I feel people still get things wrong, and for reasons that are based around the very simplest of issues.
People do overcomplicate this stuff, and often it is because they mix and try to match different level standards.
You have mentioned FS
this is good ... very good. FS tells the story inside your DAW ... no matter which DAW you own, it is an absolute.
we like FS
8-)

I won't try to short form it all here.
It will include interfacing to things like DigiBetas and the typical -14dBFS and -18 and -20dBFS ... and why.

My first suggestion is that you need a passive VU meter with +4 and 0dB settings and 600 ohm termination at least.
Very soon I'll be updating my site with a new page expanding the range of the DIY VU meter.
As I said above
FS is good, so we need a method to see the output of the 001 to 192 at FS ... upwards of 20dBu ... as high as 24dBu ???
how high can the 192 go ??
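To make the FS-to-dBu idea concrete, here's a tiny sketch; the -18dBFS = +4dBu calibration point in it is just an example, not the 192's actual alignment:

```python
def dbfs_to_dbu(level_dbfs, ref_dbfs=-18.0, ref_dbu=4.0):
    """Convert a digital level to the analog level it corresponds to, given a
    calibration point such as -18dBFS = +4dBu (an assumed example; substitute
    your own interface's alignment)."""
    full_scale_dbu = ref_dbu - ref_dbfs  # analog level that 0dBFS represents
    return level_dbfs + full_scale_dbu

# With -18dBFS = +4dBu, full scale lands at +22dBu;
# a -20dBFS reference would put it at +24dBu.
print(dbfs_to_dbu(0.0))                  # 22.0
print(dbfs_to_dbu(0.0, ref_dbfs=-20.0))  # 24.0
```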

search for
Kev's passive VU meter
and you should find things

I haven't answered your question, but experience tells me that until we have a common link, what I'm trying to say during these discussions will end in conflict.

You need a passive level measuring device and the ability to control the termination does matter.

Kev Sat, 10/01/2005 - 02:08

almost
.. but even with this equipment I feel a problem can go unseen.
To some extent we have a chicken and an egg issue here.

I feel that you do need an independent signal generation device and an independent measuring device to begin to check your system.

Having said that I think I can talk you through an information gathering session with only a simple passive VU meter and the signal generator in PT.
The basic ideas will carry through to any combination of system.

... perhaps I'll add the simple DMM (digital multimeter) ... not true RMS, but close enough, and I can also show you some mistakes a DMM can make.

anonymous Mon, 10/03/2005 - 00:55

Hello again,

It seems I did not make myself clear enough.

What I found out is that the meters inside Pro Tools react differently from input to input. I did not use a VU or a PPM or any other measurement device, nor did I use a scope or an ultra-clean signal generator. I didn't think I needed that as long as I was checking each input with a stable, identical signal....
It's not the AD converters, it's just the interfacing between the hardware and the software that seems a little weird...
The problem here is that I do not get the same response from the meters inside the Pro Tools session. Those are the meters I was using, along with the headroom meters available in the Pro Tools track windows (the pk mode).
I'm not saying we SHOULD reach 0dBFS, I'm just worried about the fact that maybe we can't use full scale. You can be tidy and careful in the way you manage your gain chain and recording levels, but on some occasions the unexpected can arise (hence the drum session example).
I love Pro Tools, I've used it for 10 years. I just want to verify what I found out and see what others think about it.
With accurate information, we can then check with Digi and make things better for the whole recording community...

So, has anyone checked their inputs yet? Remember, you need to set your PT session at 24 bits...

Oh, and Kev, I know all about the problems of interfacing equipment. I don't think you're patronising at all. Make sure you post a link to your article once it's written. Level standards.... If we all agreed (or at least understood 100% what we are talking about)... We can all use a little reading once in a while to learn or refresh our memories.

I don't think this discussion will end in conflict (not with me at least)

Kev Mon, 10/03/2005 - 13:53

tomtom wrote: ... I didn't think I needed that as long as I was checking each input with a stable, identical signal....

So, has anyone checked their inputs yet? Remember, you need to set your PT session at 24 bits...

correct ... but what did you use for an identical signal ?

Yes, I check metering at the beginning of each session when using the analog desk.

WHY is 24 bit so important for this test ?

I have not yet found a discrepancy with ProTools and any of my interfaces.
Using either the internal Signal Generator plug or an external physical unit.
I have AMIII, 001, 96iO and have owned 888 and 88/24.
The first 3 don't have physical trims but do have a software control panel. 888's have the trim pots.

Your test has involved a great deal of the SSL desk.

Try signal generator straight to interface and check one channel at a time.
Then use a DA (properly calibrated) and do all channels at once.
Use the passive VU meter at each desk output to check the desks calibration.

Verify it with a simple DMM. Even if the DMM incorrectly measures the actual RMS voltage due to frequency and waveform ... it is good enough for a comparative channel-to-channel test.

anonymous Mon, 10/03/2005 - 15:31

Kev, (and everybody else)

I have posted a screen capture of my Pro Tools session; it is accessible via an FTP site:
http://www.bleunuit.be/client
login is : metering
password is : default

So you all can SEE what I'm talking about.

To answer your questions :

By an identical signal, I meant a sine tone, 1kHz, uninterrupted.
It is sooo easy to hear a distorted tone even at low level...

24 bits is important, because there is no problem when you set the session at 16 bits.
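Just to illustrate why the session's word length could matter at all (purely a hypothetical sketch, not a claim about how Pro Tools or the 192 actually does its metering): full scale is a different number at 16 and 24 bits, so any clip check whose threshold isn't derived from the session's bit depth will misbehave at one depth or the other.

```python
def full_scale(bit_depth):
    """Maximum positive sample value for signed integer audio of a given bit depth."""
    return 2 ** (bit_depth - 1) - 1

def is_over(sample, bit_depth):
    """A clip check only makes sense if its threshold matches the session's word length."""
    return abs(sample) >= full_scale(bit_depth)

sample = 32767              # full scale in a 16-bit session...
print(is_over(sample, 16))  # True
print(is_over(sample, 24))  # False: the same value is roughly -48dBFS at 24 bits
```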

My test did not involve an SSL desk, just an XLogic channel to add gain between the analog outputs of my digital desk (a DM2000), which fall about 4dB short of hitting the analog inputs of a 192 interface hard enough to light up the red LEDs.

Now, did you see the picture? It shows that the same level (peaking at 0dB full scale) on two different inputs doesn't necessarily turn the red light on. Once again, I'm not questioning the DACs or ADCs, just the communication between the hardware and the visualisation of what is going on in the Pro Tools window.
One more detail: I was using the digital outs of the 192 to go back to the DM2000 for monitoring, so none of the 192's DACs are involved in the test.

I hope this clarifies my problem a little... I just have the feeling that I'm not making my point? What is wrong with me?

Heeeelp.

:wink:

Kev Mon, 10/03/2005 - 20:23

tomtom wrote: 24 bits is important, because there is no problem when you set the session at 16 bits.

!!
:shock:
did you say this earlier ?
This could be the most interesting part of the whole issue.
:roll:

Feed the 1kHz tone to one input at a time, crank up the gain, until you hear it distort.
Strangely, you will find out that several inputs out of 8 will distort before turning red on the session tracks meters.

I get the subtlety in your test.
You want to see the red light when you hear the distortion ?

I can see some issues with trying to get any system to calibrate this accurately all the time.
Yes it should
but I can see the possible variances.

:roll:
How is the SSL channel getting to both
192a/d 1 and 192a/d 2 at the same time ?? simple Y cord ?
OR
does the SSL have multiple outputs ?

I'll have to try this for myself in the way you have described.
note:
whenever I have tried these tests I've not had any significant differences, BUT it has been more about watching than listening to each channel individually. Output tests were done based on a test.wav fed to each output ... i.e. not a loop-through test.
Clearly you can't use PT and the interface itself to make the sine wave. You need solid, clean tone above the level that the 192 can handle.
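If anyone wants to repeat the test.wav output test, here's a rough sketch (Python, standard library only; the file name, the -0.1dBFS level and the 5-second length are just example choices) for generating a clean 1kHz tone at a known digital level, so the tone itself is never in doubt:

```python
# Rough sketch: generate a clean 1kHz sine at a chosen dBFS level as a
# 24-bit mono WAV, entirely outside the DAW.
import math, struct, wave

def write_sine_wav(path, freq_hz=1000.0, level_dbfs=-0.1,
                   sample_rate=48000, seconds=5.0):
    full_scale = 2 ** 23 - 1                        # max positive 24-bit sample
    amp = full_scale * (10 ** (level_dbfs / 20.0))  # peak amplitude for that level
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(3)                           # 3 bytes per sample = 24-bit
        w.setframerate(sample_rate)
        frames = bytearray()
        for n in range(int(sample_rate * seconds)):
            s = int(round(amp * math.sin(2 * math.pi * freq_hz * n / sample_rate)))
            # pack as signed 32-bit little-endian and keep the low 3 bytes
            frames += struct.pack("<i", s)[:3]
        w.writeframes(bytes(frames))

write_sine_wav("test.wav")
```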

Currently I only have the 96 so that will have to do.

anonymous Tue, 10/04/2005 - 00:32

Hi Kev,

I wish we did not have time differences, that would save us a lot of time.
For my test, I ran the tone one channel at a time, but for the screen capture I "cheated": I used a bridge in my patch to feed two inputs at a time (for the sole purpose of taking the picture; I would not do that for the test).

Actually, it's not that I want to see a red light when I hear distortion, it's the other way around: I wish I could see a red light when I hit 0dBFS or beyond. It's just what you expect from such pricey equipment. And efficient to work with...

I see your point about calibration and not trusting systems, but on my average day, I do not have time to check and re-check. Sometimes, you turn the studio on and off you go... (I have to)

We sent a message to Digidesign. If I get an answer, and possibly a fix, I will post it.

Kev Tue, 10/04/2005 - 13:56

that all sounds fine ... even the patch panel cheat.

these in-detail calibrations need only be done when significant changes are made to the system (software or hardware).
Yes I think this stuff is pricey enough to expect it to work quite reliably.

I worry that it might be just too narrow.

explain ...
When dealing with analog recorders like tape decks, it would be usual to start with a test tape and check the output levels, frequency response and metering.
Then use this metering to help check the inputs.

The important bit is that 1/2dB would have satisfied most people, and expecting a test tape across 2-inch 24-track at significantly less than 1/2dB error would be a little over the top unless it was spanking new.

Likewise the whole system would be set for inside 1/2dB as best you could.

Now we have digital, the metering reads to 0.1dB, and now we expect that accuracy.
Also in digital we have one given that all units share: 0dBFS.

The one trouble with that is ...
a 0dBFS sine wave can be clean
a 0dBFS sine wave can be clipped

not easy to measure the absolute point at which it goes over.
Even the term ... digital over ... doesn't actually mean anything cos it can't go over 0dBFS.
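One common, though arbitrary, convention is to count consecutive full-scale samples and call that an "over": a clean sine that just touches full scale won't trip it, a flat-topped one will. A rough sketch of the idea (not how Digi implements it, and the choice of three consecutive samples is an assumption):

```python
FULL_SCALE = 2 ** 23 - 1  # assumed 24-bit full scale

def count_overs(samples, consecutive=3):
    """Count 'overs' as runs of N consecutive full-scale samples; N itself
    is a design choice, which is exactly the ambiguity being discussed."""
    overs, run = 0, 0
    for s in samples:
        if abs(s) >= FULL_SCALE:
            run += 1
            if run == consecutive:
                overs += 1
        else:
            run = 0
    return overs

clean_peak = [0, FULL_SCALE, 0, FULL_SCALE, 0]        # touches FS, never flat-tops
clipped    = [FULL_SCALE, FULL_SCALE, FULL_SCALE, 0]  # flat-topped for 3 samples
print(count_overs(clean_peak), count_overs(clipped))  # 0 1
```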

Perhaps you would be more comfortable having the RED lights come on at -1dBFS or -3dBFS and just knowing that you have a little in reserve.
If your test above is clean to this point, then you have better confidence that the take is good.
I always have the Peak Margin Indicator displayed when track laying for this reason.

This is how I tend to build my own gear.
Have the RED indicator come on a little ahead of the real thing.
For my mic pres, I set them such that their max output into 600 ohms is higher than the max for the recorder or interface, so I always know I have a little headroom.
When it was analog recorders only, I would set the peak lights 3 to 6dB in front of that.
Now with DAWs like Pro Tools, I tend NOT to put lights or meters on the gear and use the MARGIN INDICATOR on the digital units or software.

I will be very interested in Digi's answer
AND
I'll find time over the weekend to do your test. I'll get a DA from here at work and try to do an 8-at-once test.

Last thought,
when doing your test, be sure that there is no additional gear normalled into the inputs or outputs via the patch panel. Some can be easily forgotten.

??? hope all that makes some sense.

Kev Thu, 10/06/2005 - 19:19

I performed a trial run of your test last night with an HD system and a 96i/o

I used a tone generator that does get a little edgy up at 22dBu or thereabouts.

SO I made an adapter lead and set the 96 for -10 input. Yes a compromise but it does give good clean tone past what the input to the 96i/o can handle in that mode.
The analog section of the 96i/o should be well inside headroom.

Session was 44.1k and 24bit.

The sig gen has two outputs that are calibrated with an external measuring device and both channels are close ... ultra close.

Two audio channels were set to inputs 1 and 2, and as the generator is brought up in level, the on-screen meters track perfectly together.

Using the margin indicator at the bottom of each track I can set the input to -0.2dBFS.
The 96i/o has LEDs and the yellow is on but not the red.

Lift the input to -0.1dBFS and wait.
The 96i/o does light up the channel 2 red LED.

interesting

wait and the red LED does go out ... and come back
the channel 1 red LED never comes on.

The sound of either channel is still good with no distortion edge.

Carefully lift the sig gen to give 0dBFS on each channel and both red LEDs on the 96i/o are on.

I can't be any more careful than that with an analog signal generator. Working to less than 0.1dB tolerances is not easy unless you have a lab-class test unit with numerical data entry.

The on screen red lights are not initially on but will eventually light.
This is as close to 0dBFS as I can do.

I'll leave it at that for now
BUT
let me ask you to think about the differences between

wait for it

-0dBFS
0dBFS
+0dBFS

yes a bit pedantic but I hope you get the point.

anonymous Fri, 10/07/2005 - 09:19

Hi Kev,

Good to see that someone eventually did the test. I wish you had a 192! If you find out about the -0dBFS/0dBFS/+0dBFS question, let me know.

Could +0dBFS perhaps mean that consecutive samples were clipped?
And -0dBFS be a reading between -0.1dBFS and 0dBFS?
I don't know.

My friend is having a baby right now. I don't think he cares about the Digi answer right now...
:D

Kev Fri, 10/07/2005 - 16:50

tomtom wrote: Could +0dBFS perhaps mean that consecutive samples were clipped?
And -0dBFS be a reading between -0.1dBFS and 0dBFS?

8-)
that is exactly the point

the makers and software writers have to make a choice at some point.
It could be based on consecutive samples, and even if it was,
you still need to choose at what point you flip to the next displayed margin number.
AND
is this calculation and result consistent in both the positive and negative directions?
... think about how a calculator rounds up or down.
Audio is AC and music is continuously changing. Test tones should be easy ??

We can see one decimal place on the PT margin indicator.
The dB reading is only a presentation to us and may not have a direct bearing on the samples and calculations being made internally.
We don't know if the 192's red LEDs are driven by the same calculations ... or if the unit does its own internal calculation.

My final point is that it may just be too tight a tolerance to expect all indicators to flip correctly, and at the same time, as you pass from -0.2 to -0.1 and then to -0dBFS.
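A toy illustration of that (Python, assumed 24-bit full scale, not Pro Tools' actual rule): a peak just under full scale and a peak exactly at full scale land on the same one-decimal readout apart from the sign, and "+0" can't be shown at all.

```python
import math

FULL_SCALE = 2 ** 23 - 1  # assumed 24-bit full scale

def margin_display(peak_sample):
    """One hypothetical way a margin readout could be derived from a peak sample."""
    dbfs = 20.0 * math.log10(abs(peak_sample) / FULL_SCALE)
    return f"{dbfs:.1f} dBFS"

print(margin_display(FULL_SCALE - 40000))  # -0.0 dBFS: just under full scale
print(margin_display(FULL_SCALE))          #  0.0 dBFS: exactly full scale
# Only the sign separates the two readouts, and "+0" can never be shown,
# so exactly where each indicator flips is a design decision.
```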

SO my advice is to set the meters on the 192 to output and use the margin indicator on the PT mixer as your final clip indicator.
Aim for -6 to -3 dB margin generally on all takes, make choices on individual clips as required.
If in doubt, do another take
... and you will be fine.

Kev Mon, 11/07/2005 - 11:59

yes
please do
... a future fix seems to suggest a software issue and not hardware.

I have not opened a 96i/o yet, so I haven't seen whether there are any tech-set trims for the meter and pre-A-to-D sections.

I still think that aiming for this level of accuracy on ALL channels is difficult.
We never looked for sub-0.1dB accuracy in the past, and 0.5dB was close enough.