I have read a couple of posts and articles on how to phase-align two out-of-phase signals, e.g. a bass / guitar DI and mic signal or two guitar cab mic signals.
An easy and plausible way would be to zoom in on the waveforms of the two tracks and shift one of them so that the transients line up, as suggested here: Matching The Phase Of Mic & DI Signals (http://www.soundons…)
But then there's some people claiming that using a phase-alignment tool like the Little Labs IBP will yield better results, because time-aligning two tracks is not the same as phase-aligning them, see e.g.
(Dead Link Removed)
While it makes sense to me that time-aligning and phase-aligning are not necessarily the same, I don't see why I would want to use a phase-alignment tool in one of the above-mentioned scenarios. If I place two mics in front of a guitar cab, one of them a few inches farther away, the recording of the more distant mic will be delayed by a number of samples relative to the other one. Shifting the delayed track forward in my DAW so that the two tracks line up perfectly should be exactly the same as placing the mic a few inches closer to the cab so that the two mics are at exactly the same distance, no? In other words: time-aligning the transients of the two tracks should be the same as phase-aligning the two tracks. Surely this makes more sense than not time-aligning the tracks and only using an IBP?
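For scale, here's how I'd estimate the expected offset (a quick Python sketch, assuming roughly 343 m/s for the speed of sound and my 44.1 kHz session):

```python
# Rough sanity check: how many samples does a given mic-distance
# difference correspond to? (speed of sound ~343 m/s at room temperature)

def distance_to_samples(distance_m, sample_rate=44100, speed_of_sound=343.0):
    """Convert a path-length difference in metres to a delay in samples."""
    delay_seconds = distance_m / speed_of_sound
    return delay_seconds * sample_rate

# Example: a mic 0.5 m (about 20 inches) farther from the cab
print(distance_to_samples(0.5))  # ~64 samples at 44.1 kHz
```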
What do you say?
"Phase alignment" in this field of audio is usually a loose way
"Phase alignment" in this field of audio is usually a loose way of referring to "time alignment" in the context of the speed of sound and relative microphone positioning.
However, there is one area where there's a big difference, and that is in the blending of acoustic guitar pickup and microphone signals. The signals from most magnetic pickups are generated from the transverse vibration of the strings, whereas the microphone picks up a sound whose principal component is the front-to-back vibration of the guitar body. The physical distance from guitar body to microphone can be compensated by time delay of the pickup signal, but the difference between microphone and pickup characteristics results in a built-in 90 degree phase-shift which no amount of time alignment can compensate for.
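One way to see why no time shift can substitute for that built-in offset: a fixed phase angle corresponds to a different time offset at every frequency. A minimal sketch:

```python
# A constant 90-degree phase shift is not a time delay: the equivalent
# time offset depends on frequency (90 degrees = a quarter period).

for f in (100.0, 1000.0):            # Hz
    period = 1.0 / f
    delay_for_90_deg = period / 4
    print(f"{f:6.0f} Hz: 90 deg = {delay_for_90_deg * 1000:.3f} ms")

# 100 Hz -> 2.5 ms, 1000 Hz -> 0.25 ms: no single track nudge can
# apply 90 degrees to every frequency at once.
```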
I was very surprised with the sonic difference when I first used the phase-compensation control on the microphone channel of my Audient Mico relative to the guitar pickup channel (via the DI input). I used an X-Y display to look at the phase relationships, and was gratified to find that a good display alignment (greatest energy on a diagonal) corresponded with an improved sound relative to the un-compensated signals. I can only believe that the designers at Audient know all about this, and that's why they included the continuously-variable phase control, but they keep very quiet about it and I've never come across published articles or papers on the topic.
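For anyone who wants to try the X-Y check without hardware, it's just one channel plotted against the other. A toy sketch with sine-wave stand-ins for the two channels (hypothetical signals, not the Audient's actual display):

```python
# X-Y (Lissajous) phase check: energy along the diagonal means the two
# signals are roughly in phase; a circle/ellipse suggests a ~90 deg offset.
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 0.02, 2000)
f = 220.0                                        # arbitrary test tone
mic = np.sin(2 * np.pi * f * t)                  # stand-in for the mic channel
pickup = np.sin(2 * np.pi * f * t + np.pi / 2)   # 90 deg off, like the pickup

plt.plot(mic, pickup, linewidth=0.5)
plt.xlabel("mic")
plt.ylabel("pickup")
plt.title("X-Y phase display (90 deg offset -> circle)")
plt.axis("equal")
plt.show()
```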
The example with the acoustic guitar pickup and mic is an interesting one. So in this scenario you would recommend first time-aligning the signals and then using a phase-correction tool to correct for the remaining 90° phase shift?
What about electric bass di and mic signals? In the past, I just moved the mic track back in time a little until I was satisfied with the result. Do you see the need for a phase-correction tool in this case?
bouldersound, post: 382414 wrote: A given change in time alignment will cause different amounts of phase error depending on what frequency you measure.
This is exactly what I don't understand. Let me give you an example. I have here two tracks of an acoustic guitar, recorded with a close mic and a more distant mic. The second track is delayed by 73 samples.
If I listen to both tracks simultaneously it sounds kind of hollow. If I nudge the first track by 73 samples the waveforms line up very well and I get a nice full guitar sound.
Why would this time-alignment cause any phase errors? And why would I need a phase-correction tool in addition to the time-alignment?
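For what it's worth, I do understand where the hollow sound itself comes from: summing a track with a 73-sample-delayed copy of itself acts as a comb filter. A quick check of where the notches land (assuming my session's 44.1 kHz sample rate):

```python
# Summing a signal with a copy delayed by 73 samples makes a comb filter;
# nulls fall where the delay equals an odd number of half-periods.
fs = 44100
delay = 73 / fs                          # ~1.655 ms
nulls = [(2 * k + 1) / (2 * delay) for k in range(5)]
print([f"{f:.0f} Hz" for f in nulls])    # ~302, 906, 1510, 2114, 2719 Hz
```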
You were likely not time-aligning the tracks properly in the first place. If the "average" frequency range is 180 degrees out of phase between the two microphones then you will get phase cancellation. If the "average" frequency range is just a very small nudge out, it can thicken things slightly. If they are more or less together then you get a clean, reinforced sound.
Now, the jpg you posted is an FFT representation of SPL and not of phase alignment, which is better represented by spectral analysis. The FFT image can help you align the mics' time relationship, which will definitely affect the average phase alignment. The rule of thumb for clean delay for classical music would be roughly 3 ms per meter or yard. This gets you in the ballpark to start with. Then you can dial in your delay by ear if your system is capable of real-time adjustment of FX, or via a hardware delay if you are completely analog.
So right away, if you are adjusting via a given sample nudge, you will be doing far more math to come up with a time-based delay, which is what a sample nudge is. Also, if your mics are only 1 meter apart then you might not want to add any delay, or keep it to 3 ms or less if you do. Remember that a lot of armchair home-studio engineers have more time to experiment than they have practical real-world experience.
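To put numbers on that: a sample nudge and a time-based delay are the same thing in different units, and the conversion is one line each way (sketch assuming 44.1 kHz):

```python
# Converting between a sample nudge and a time delay, both directions.
fs = 44100

def samples_to_ms(n, fs=fs):
    return 1000.0 * n / fs

def ms_to_samples(ms, fs=fs):
    return round(ms * fs / 1000.0)

print(samples_to_ms(73))     # ~1.66 ms
print(ms_to_samples(3.0))    # ~132 samples, i.e. the 3 ms/metre rule of thumb
```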
And I would adjust things with a plugin delay rather than try to use a nudge feature on a DAW track. Caveat: I started with hardware analog and digital delays, so that was always my workflow even before computers. FWIW, this just came up with regard to a classical piano recording a couple of weeks ago. You might search for audiokid's thread.
gunsofbrixton, post: 382468 wrote: The example with the acoustic guitar pickup and mic is an interesting one. So in this scenario you would recommend first time-aligning the signals and then use a phase-correction tool to correct for the remaining 90° phase shift?
Yes, but I always calculate the amount of time alignment from the distance measured with a tape measure, not from how the waveforms look on a screen. I should also have said that the phase shift I end up with is never exactly 90 degrees. There's a band of angles in which the sound is better than with no correction, but within the band the sound changes without there being an obviously "correct" value. It's a matter of choosing a suitable sound. Guitars vary, so a value determined for one is not going to be right for another.
gunsofbrixton, post: 382468 wrote: What about electric bass di and mic signals? In the past, I just moved the mic track back in time a little until I was satisfied with the result. Do you see the need for a phase-correction tool in this case?
I've not used this phase-correction technique for bass guitars, even when I've miked one that had enough acoustic output to make it worthwhile. Acoustic (upright) basses are a completely different case, since they don't (usually) have magnetic pickups.
gunsofbrixton, post: 382468 wrote: And why would I need a phase-correction tool in addition to the time-alignment?
The simple answer to this is that a magnetic pickup is sensitive to the velocity of string movement in a plane parallel to the face of the guitar. The acoustic output is largely generated by displacement of the face of the guitar in a plane at right angles to the face. The velocity of a vibrating string is 90 degrees out of phase with its displacement. In addition, the way the acoustic energy is transmitted from the string to the face of the instrument involves a phase change, so the result is not a simple 90 degree correction. You don't need phase correction, but I have found in the majority of cases it gives a better sound to an acoustic guitar recording, or, at least, more sonic options for the engineer to call on.
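If you want to verify the velocity/displacement relationship numerically, here's a rough sketch with an arbitrary 110 Hz "string" (the derivative of a sine is a cosine, i.e. a 90 degree lead):

```python
# For a sinusoidally vibrating string, velocity leads displacement
# by 90 degrees: d/dt sin = cos.
import numpy as np

f = 110.0                                   # arbitrary string frequency
t = np.linspace(0, 2 / f, 1000)
displacement = np.sin(2 * np.pi * f * t)    # what drives the guitar top
velocity = np.gradient(displacement, t)     # what the pickup senses
velocity /= np.max(np.abs(velocity))        # normalise for comparison

# velocity matches the sine shifted a quarter period earlier, i.e. a cosine:
print(np.allclose(velocity, np.cos(2 * np.pi * f * t), atol=0.01))  # True
```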
gunsofbrixton, post: 382470 wrote: Why would this time-alignment cause any phase errors?
It wouldn't. It would fix the problems caused by arrival time and change phase interactions caused by the two sources acting like filters. There are still differences between the two signals, but since they started out as different signals I wouldn't call the remaining differences errors. If they weren't different then there would be no point using both.
gunsofbrixton, post: 382470 wrote: And why would I need a phase-correction tool in addition to the time-alignment?
You don't if it sounds good. But try it and listen.
Another thing to think about is identified in your original question regarding distances.
Say, hypothetically that you have a mic on the front of a guitar cab and on the rear. By definition and by nature, they will be opposite in phase from each other. Phase aligning should be as simple as flipping a polarity. However, this isn't the case since distances (even in micrometers) will alter the phase relationship between two signals. Therefore, simply phase aligning may not solve the problem. Phase and time alignment may, however.
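Here's a toy model of that two-step picture, with a polarity flip plus a small residual delay standing in for the rear mic (a pure sketch; real mics differ in far more than polarity and arrival time):

```python
# Front/rear cab mics, toy model: inverting the rear mic fixes polarity,
# but the residual path-length difference still needs a time nudge.
import numpy as np

rng = np.random.default_rng(0)
front = rng.standard_normal(48000)
rear = -np.roll(front, 5)            # inverted and 5 samples late (toy model)

flipped = -rear                      # step 1: fix polarity
aligned = np.roll(flipped, -5)       # step 2: fix the time offset
print(np.allclose(aligned, front))   # True: both steps were needed
```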
So then that leads to your next issue. Distance. You ask about having one mic further back than another then moving it forward a set amount of time in the DAW and wouldn't that be the same as having the mic closer. The short answer is - no. Because the mic picks up a different sound (higher ratio of diffuse to direct sound than if placed closer). Time aligning the two mics will ensure that the frequencies hitting both mics arrive at the same time. If the two mics are on identical horizontal and vertical planes and there is no external force acting on the sound waves as they travel to the mics, then time aligning is all you'd need to do. Since we don't live in a reality where that's possible, there will be mitigating factors that affect the relative phase to each mic picking up a signal.
Therefore, time aligning and phase aligning may be required separately. In practice, I do both steps as one. I typically will drag the track until it aligns appropriately close, then I'll zoom in and adjust the tracks so that they are close in phase. Unless they are picking up the EXACT same thing, they'll never be 100% in time/phase with each other, but that's ok. Close counts in horseshoes, hand grenades and time/phase alignment.
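If you'd rather have the computer suggest the coarse drag amount, the peak of the cross-correlation between the two tracks gives it to you. That's not my manual workflow above, just the same idea in code (a generic sketch):

```python
# Estimate the coarse offset between two tracks from the
# cross-correlation peak.
import numpy as np

def estimate_lag(a, b):
    """Return how many samples b lags a (positive = b arrives later)."""
    corr = np.correlate(a, b, mode="full")
    return (len(b) - 1) - int(np.argmax(corr))

# Toy check: a noise track and a copy of it 90 samples late
rng = np.random.default_rng(1)
x = rng.standard_normal(4000)
y = np.roll(x, 90)                  # crude stand-in for the later mic
print(estimate_lag(x, y))           # 90
```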
Cheers-
J
Cucco, post: 382486 wrote: Say, hypothetically that you have a mic on the front of a guitar cab and on the rear. By definition and by nature, they will be opposite in phase from each other. Phase aligning should be as simple as flipping a polarity.
I think using the word phase interchangeably with polarity confuses the issue. The back of a speaker cone will have opposite polarity and some phase deviation compared to the front. Though phase and polarity interact they are different. Polarity is purely in the amplitude domain, only has two states* and it's totally independent of frequency. Phase is purely in the time domain, has infinite possible states and is directly tied to frequency/wavelength. I would change "phase" to "polarity" and "aligning" to "matching" in the sentences above.
*When amplitude only has one dimension.
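A toy example of the difference: for a lone sine wave a polarity flip is indistinguishable from a half-period time shift, but add a second frequency and no time shift will reproduce the flip.

```python
# Polarity vs. phase: a flip equals a half-period shift for one sine,
# but not for a signal containing more than one frequency.
import numpy as np

fs = 48000
t = np.arange(fs) / fs
sine = np.sin(2 * np.pi * 100 * t)
half_period = fs // 200                       # half of a 100 Hz cycle
print(np.allclose(-sine, np.roll(sine, half_period), atol=1e-6))   # True

two_tone = sine + np.sin(2 * np.pi * 150 * t)
print(np.allclose(-two_tone, np.roll(two_tone, half_period)))      # False
```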
Cucco, post: 382486 wrote: You ask about having one mic further back than another then moving it forward a set amount of time in the DAW and wouldn't that be the same as having the mic closer. The short answer is - no. Because the mic picks up a different sound (higher ratio of diffuse to direct sound than if placed closer). Time aligning the two mics will ensure that the frequencies hitting both mics arrive at the same time. If the two mics are on identical horizontal and vertical planes and there is no external force acting on the sound waves as they travel to the mics, then time aligning is all you'd need to do. Since we don't live in a reality where that's possible, there will be mitigating factors that affect the relative phase to each mic picking up a signal.
That makes a lot of sense.
I'm frequently zooming in down to the sample with parallel tracks of the same source. Sometimes I'll make one lead or lag to tweak the overall quality of the sound, monitoring both tracks in mono. Sometimes a little leading or lagging will bolster the sound more than perfect alignment will. Doing everything scientifically doesn't always equate to musicality or listenability, but it can certainly get you into the ballpark.
I'm always lagging the pack
Mx. Remy Ann David
Time is absolute. Phase references a particular frequency. A given change in time alignment will cause different amounts of phase error depending on what frequency you measure. They are like two sides of one coin. A correction tool could work back from phase measurements to find the time misalignment.
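To put numbers on that, take an arbitrary 0.5 ms misalignment and look at the phase error it causes at a few frequencies (phase in degrees = 360 × frequency × time offset):

```python
# Same time misalignment, different phase error per frequency.
dt = 0.0005                      # 0.5 ms time error, chosen for illustration
for f in (50, 100, 500, 1000, 2000):
    print(f"{f:5d} Hz -> {360 * f * dt:7.1f} degrees")
# 50 Hz is barely touched (9 deg) while 1 kHz is fully inverted (180 deg).
```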
Time alignment of the sort you're describing is really so simple that it's stupid to let a piece of software do it for you. I do it all the time. Usually it takes a few seconds. When it takes more time it's with tracks that have so little in common that any alignment will be approximate anyway, and I doubt software would be any better at guessing in that situation.