
Phase Considerations

Discussion in 'Recording' started by TheDaMHatter, Nov 19, 2006.

  1. TheDaMHatter

    TheDaMHatter Guest

    The more I learn about recording, the more I realize that phase considerations are a make or break point.
    I have several questions about phase, all general, independent of specific setups.

    1) Besides listening, how can you really tell whether you're in phase or out of phase? For example, does anyone think of the wavelength of the sound sources they're recording and use that as a guideline for mic placement?

    2) Any time you're using more than one microphone to record something, it seems that at certain frequencies you're going to get phase cancellation. How do you work around this?

    3) How come, after recording, it seems like all phase considerations are over? In mixing, it seems like you don't need to worry about arranging audio tracks so that they're in phase.

    As you can probably tell, I don't have much experience in this field. I'm just a physics student, so mostly I look at things from that perspective.

    Thanks in advance
  2. natural

    natural Active Member

    Jul 21, 2006
    There are 2 issues here: one is phase, the other is polarity.
    1- Unless you're recording a sine wave, it's pretty near impossible to measure the wavelength in any way that will negate any phase problems.
    Even measuring the fundamental leaves all those harmonics to deal with.
    Basically, in this case, it doesn't really matter. We hear things in and out of phase all the time. Every time you move your head, you change the phase relationship of things. That's just life.
    In an audio environment, I don't know if there's a way to tell whether a single mic is firing the speaker on the positive or negative side of the wave. In this case, we simply invert the polarity and make a judgement call on whether it sounds better or not.
    With 2 mics on the same source, there are top secret techniques to determine whether they're in the same polarity as each other. If not, then reversing the polarity of one will correct the situation, but again, I don't know that there's a way to tell if they are now both positive polarity or negative. That they are the same as each other seems to be enough, generally.
    Although it is said that it sounds better if the speaker fires forward on the positive side of the wave. Others will be along shortly to debate this further, I'm sure.
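    One rough, not-so-secret version of that two-mic polarity check can be sketched in code: the sign of the normalized correlation between two (already time-aligned) mic tracks hints at whether one should be flipped. This is only an illustration with made-up function names, not a technique described by the poster.

    ```python
    import numpy as np

    def same_polarity(a: np.ndarray, b: np.ndarray) -> bool:
        """Rough polarity check: a strongly negative correlation between
        two mics on the same source suggests one should be flipped.
        Assumes the tracks are already time-aligned."""
        a = a - a.mean()
        b = b - b.mean()
        corr = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
        return corr >= 0.0

    # Toy example: the second "mic" is the first one inverted.
    t = np.linspace(0.0, 1.0, 1000)
    top = np.sin(2 * np.pi * 5 * t)
    bottom = -top
    print(same_polarity(top, bottom))  # False -> flip one mic
    ```

    In practice you would run this on a short excerpt around a strong transient, where the two mics' signals are most alike.
    
    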

    2- When using multiple mics, phase issues do create some concerns. Engineers use another top secret rule to reduce this to manageable conditions (research the '3 to 1 rule').
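    The arithmetic behind the 3-to-1 rule is easy to check: by the inverse-distance law, a mic at least 3x as far from a source picks it up roughly 9.5 dB quieter, which keeps the resulting comb filtering largely inaudible. A minimal sketch (free-field assumption, made-up function name):

    ```python
    import math

    def level_drop_db(d_near: float, d_far: float) -> float:
        """Level difference from the inverse-distance law in a free
        field: each doubling of distance costs about 6 dB."""
        return 20 * math.log10(d_far / d_near)

    # 3-to-1 rule: mic B at 3x the distance of mic A from the source,
    # so the bleed into B arrives about 9.5 dB down.
    print(round(level_drop_db(1.0, 3.0), 1))  # 9.5
    ```
    
    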

    3- Your statement here is not exactly true. Phase errors at best result in coloration of the sound. So the decisions you make about mic placement on, say, a guitar amp, if the mics end up 40 or 50 degrees out of phase, might make the sound more nasal than if they were more coherent. On bass tracks it can create a muddy sound that just can't be cleaned up later.
    At worst, some sounds can disappear completely when summed to mono. Entire mixes can sound thin or dull depending on what is in or out of phase and by how much.
    But the beauty of all this is, that it doesn't really matter if things are in or out of phase, as long as you know what you're doing and you like the results.
    In art, the normal rules of physics don't necessarily apply.

    hope this helps
  3. SharkFM

    SharkFM Active Member

    Dec 28, 2010
    I hit this signal phase issue head on with my drum sound, and solved it today. Phase comes into play when two or more signals of the SAME source overlap improperly! This robs your tracks of power and also creates unwanted, unnatural effects. Phase issues crop up when recording with multiple mics, or when using a direct DI signal plus a mic for guitar. In my case I used 8 mics on the drum kit and it got royally screwed up until I fixed it.

    The overhead and front left/right mics are about 2 m away from the most potent source of sound, the snare, which leads to about a 5 ms delay. So: snare POP!!...the 57s on top/bottom get the signal right away; 4-5 ms later the overheads pick up the POP. Big issue. A phaser effect works right in that range, 2-8 ms or so. So when I summed those signals together my snare was robbed; it sounded really thin and wimpy. I was having kick problems too, and cymbal phasing. What a mess!
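    The delay figure above follows directly from the distance; a one-liner confirms the 2 m path lands right in phaser territory (function name is just for illustration):

    ```python
    SPEED_OF_SOUND = 343.2  # m/s, dry air at ~20 degrees C

    def delay_ms(distance_m: float) -> float:
        """Arrival-time delay for sound travelling distance_m metres."""
        return distance_m / SPEED_OF_SOUND * 1000.0

    # Overheads ~2 m from the snare, as described above:
    print(round(delay_ms(2.0), 2))  # 5.83 ms
    ```
    
    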

    OK, this was solved so simply. I just zoomed in as far on the timeline as REAPER would let me, took a typical snare hit event (usually without cymbals, as clean as I could find), then lined up all the drum tracks to be dead on in sync on that one snare hit. So all the mics are synced, pulling and pushing together. Sweet. Done, that simple.
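    That manual zoom-and-nudge step can also be automated: cross-correlating each track against a reference around the snare hit gives the offset in samples. A sketch of the idea, assuming the tracks are numpy arrays at the same sample rate (names are made up, not REAPER API):

    ```python
    import numpy as np

    def best_lag(reference: np.ndarray, track: np.ndarray) -> int:
        """Lag (in samples) that best aligns `track` to `reference`,
        found by full cross-correlation. A positive result means the
        track arrives late and should be shifted earlier."""
        corr = np.correlate(track, reference, mode="full")
        return int(np.argmax(corr) - (len(reference) - 1))

    # Toy check: a click delayed by 50 samples.
    ref = np.zeros(1000); ref[100] = 1.0
    late = np.zeros(1000); late[150] = 1.0
    print(best_lag(ref, late))  # 50
    ```

    On real drum tracks you would correlate a short window around one clean snare hit, just as the post describes doing by eye.
    
    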

    Now I have a ton of drum power and a natural, clean sound. Man, I'm stoked; it sounds awesome.
  4. BobRogers

    BobRogers Distinguished Member

    Apr 4, 2006
    Blacksburg, VA
    You can do a test recording and compare the waveforms in your DAW. That way you can see the phase (really time) shifts. The only physical measurement that I know of people doing is putting mics at exactly equal distances from a source (check out the Recorderman method for micing drums). You seem to be hinting at putting mics at different distances, but making the difference an integer multiple of a wavelength. Of course, the problem is that this will work at some frequencies and be out of phase at others (comb filtering).
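    The comb-filtering point can be made concrete: when a signal is summed with a delayed copy of itself, cancellation notches appear wherever the delay equals an odd number of half-periods. A small sketch of that formula (function name is illustrative only):

    ```python
    def notch_frequencies(delay_s: float, count: int = 3) -> list:
        """First few comb-filter notches when a signal is summed with
        a copy of itself delayed by delay_s seconds. Cancellation
        occurs at f = (2k + 1) / (2 * delay)."""
        return [(2 * k + 1) / (2 * delay_s) for k in range(count)]

    # A 1 ms path difference (roughly 34 cm at the speed of sound)
    # notches 500 Hz, 1.5 kHz, 2.5 kHz, and so on up the spectrum.
    print(notch_frequencies(0.001))  # [500.0, 1500.0, 2500.0]
    ```

    This is why spacing mics for one wavelength can't fix everything: whatever the delay, some frequencies reinforce while others cancel.
    
    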
    Equal distances or the three-to-one rule are the standard workarounds that I know. Once you have tracked, you can nudge the tracks in time to get the phase to line up. Another thing here: don't be too obsessive about this. Small phase differences are one big reason that stereo pairs give a realistic stereo image.
    As indicated - you can nudge tracks in time to make things align during mixing. You are always better off doing things right during tracking, but there are still things you can do during mixing. There are even toys that can help you with this.
  5. SharkFM

    SharkFM Active Member

    Dec 28, 2010
    A couple of notes for the Theory of Phase Relativity: the speed of sound is 343.2 metres per second, and pretty much independent of frequency. You can calculate the exact displacement required on a track just from the distance to the sound source. So I don't think there is a need to fret over frequency, just time.
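    Turning that distance into a track displacement is one line of arithmetic once you fix a sample rate; a sketch (function name is made up):

    ```python
    SPEED_OF_SOUND = 343.2  # m/s, pretty much independent of frequency

    def samples_offset(distance_m: float, sample_rate: int = 44100) -> int:
        """Track displacement, in samples, for a given mic distance."""
        return round(distance_m / SPEED_OF_SOUND * sample_rate)

    # A 2 m overhead-to-snare distance at 44.1 kHz:
    print(samples_offset(2.0))  # 257 samples
    ```
    
    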

    I think for precision you must do time alignment after tracking. I had 9 mics on my kit.

    Kick Pr.
    Snare Pr.
    Tom Pr.
    Overhead / L & R Ambients.

    The first step was to align the ambients with the snare. The furthest mic from the snare required the most shift, of course.
    Less critical, but I did align the kick and tom groups with the snare group as well; just a minor shift was required, since they are only 1 m apart at most.

    However, phase across the drum is something that I noticed. What I tried to do here was set the polarity of the two signals so that the initial strike results in the speaker popping out instead of the two signals working against each other. But the shape and formation of the waves was less consistent, so it was harder to come up with a final track positioning on this.
  6. Davedog

    Davedog Distinguished Member

    Dec 10, 2001
    Pacific NW
    Phase coherency, especially with drums, is one of the main reasons I NEVER put the overheads on the same 2-bus as the rest of the kit. It's also why I always eliminate as much of the bleed as physically possible before mixing. When close-micing a kit, there will always be a timing issue between the overheads (and any room mics you may have up) and the close-mic'd pieces. This is why it's important to eliminate the bleed physically before doing so electronically.

    When you time-align the overheads with the snare and kick, be sure to cut the offending frequencies first, and you'll find much less of a shift that has any real impact on the overall sound. The lower the frequency, the longer its wavelength, so a given time offset amounts to a smaller phase shift there (the speed of sound itself is essentially the same at all frequencies).

    In determining phase problems between two mics on a single source, be aware of the complexity of the signal involved. Something rich in harmonic content is going to have many cancellations as well as boosts, as opposed to a purer signal. Paying attention to the root frequency of a signal, and to other sources that are similar or harmonically similar in content, will usually guide you to the type of mic, the style of placement, and the conditions in which you can effectively capture that source.

    Some things are simply better captured in a dead environment, while others need room to breathe.

    Most of this is simply making qualified decisions from the outset.
