At a recent meeting with my supervisor, she made the point that it’s easy to forget that all of our representations of sound—for example, the image of the sine wave, the oscilloscope and the spectrograph—are exactly that: representations, enormously abstracted from the real-world nature of sound. Teaching the mechanics of phase, using these representations to show how sound pressure accumulates and cancels, is easy enough in the isolated context of the classroom. In terms of demonstration, it’s also pretty fun, as students seek out the peaks and troughs, the spatial logic, of two sine waves colliding in the room. Yet the ubiquity and complexity of sonic phase relationships in the real world is virtually unfathomable. It’s like trying to trace the paths of every photon that hits your eye.
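For anyone who wants to see the arithmetic behind that classroom demo, here’s a minimal sketch (Python and NumPy are my own choice here, not anything from the seminar) of two identical sine waves summed in phase and then half a cycle out of phase: the first doubles, the second all but vanishes.

```python
import numpy as np

sr = 48000                     # sample rate in Hz
t = np.arange(sr) / sr         # one second of time
f = 440.0                      # an arbitrary test tone

a = np.sin(2 * np.pi * f * t)              # reference wave
b_in = np.sin(2 * np.pi * f * t)           # identical phase
b_out = np.sin(2 * np.pi * f * t + np.pi)  # shifted by half a cycle

print(np.max(np.abs(a + b_in)))   # ~2.0: the pressures accumulate
print(np.max(np.abs(a + b_out)))  # ~0.0: the pressures cancel
```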
Phase is usually associated with space, especially in the context of a room or a recording studio, where different phase characteristics can be found as one moves a microphone around the space. Indeed, the spatial understanding of phase largely suffices in the context of music production, especially in multi-microphone situations like drum tracking. How the recordist deals with phase alignment through their mic placement can make or break a recording session. What intrigues me about phase is its slippery relationship with time, the inertia of imperceptibly small instants, and how sound waves hitting your eardrums at very slightly different times can lead to dramatically different perceptions of sound.
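That temporal slipperiness is just as easy to sketch. In the toy example below (the half-millisecond offset is an arbitrary illustration, not a measurement of anything), a sine wave summed with a copy of itself delayed by 0.5 ms cancels completely at 1 kHz, where the delay is half a period, and doubles at 2 kHz, where it’s a whole period: the same imperceptible instant of time produces opposite results depending on frequency, which is the comb filtering that haunts multi-mic drum sessions.

```python
import numpy as np

sr = 48000
t = np.arange(sr) / sr
delay = int(sr * 0.0005)       # 0.5 ms, roughly 17 cm of extra path in air

def delayed_sum(freq):
    """Sum a sine wave with a copy of itself delayed by `delay` samples."""
    x = np.sin(2 * np.pi * freq * t)
    y = np.concatenate([np.zeros(delay), x[:-delay]])
    return x + y

# RMS level of the summed signal (a lone sine has an RMS of ~0.707)
print(np.sqrt(np.mean(delayed_sum(1000.0) ** 2)))  # ~0: half a period late, cancels
print(np.sqrt(np.mean(delayed_sum(2000.0) ** 2)))  # ~1.41: a whole period late, doubles
```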
The shotgun microphone embodies phase as a temporality, as a means of cancelling sound from the sides and the back of the microphone, making it a mandatory piece of kit for virtually all location/film/broadcast recordists. Using a shotgun mic for field recordings doesn’t necessarily lead to realistic recordings, because of the complex phasic relationships inherent in its design, but through its nature as an isolating device it can be conceived as a way of exposing intentionality and performativity in field recording. Through this configuration, the shotgun mic becomes an improvisatory instrument.
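For a rough sense of how the tube pulls this off (what follows is a crude first-order model with made-up dimensions, not the design of any real microphone), you can treat the slots along the interference tube as a row of delayed copies of the same wavefront: on-axis they reach the capsule in phase and reinforce, off-axis their path lengths spread out and they partially cancel.

```python
import numpy as np

c = 343.0                              # speed of sound in air, m/s
slots = np.linspace(0.0, 0.25, 12)     # illustrative slot positions along a 25 cm tube
freq = 4000.0                          # a frequency where the tube is effective

def tube_gain(theta_deg):
    """On-axis-normalised gain of a naive delay-and-sum model of the slots."""
    theta = np.radians(theta_deg)
    extra_path = slots * (1.0 - np.cos(theta))   # extra distance per slot at this angle
    phases = np.exp(-2j * np.pi * freq * extra_path / c)
    return np.abs(phases.sum()) / len(slots)

for angle in (0, 45, 90, 135):
    print(angle, round(tube_gain(angle), 3))     # 1.0 on-axis, far less off-axis
```

The directionality is bought entirely with time: nothing in the model points anywhere, it just delays.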
On my first day at Bogong, pointing a shotgun mic at some arbitrary point in the rapids of the Kiewa River, I realised that I could be pointing anywhere and still expose some new timbral nuance. The recording quickly became a bizarre improvised performance of me swinging the mic around, creating lunging rhythms out of the sound of the river, sculpted by the phasic characteristics of the mic. It was one of the more physical improvisations I’ve ever done, a departure from the relatively micro-physical improvisations of the piano or laptop.
On paper, this practice sits uncomfortably close to the really tedious notion of “improvising with nature,” something I’ve been accused of substantiating through my first EP, Bythorne. Musicians like David Rothenberg, whose practice includes improvising “with” nature, strike me as perversely humanist, and the practice carries some pretty dangerous assumptions, chiefly that birds will wilfully evaluate human sound on aesthetic grounds. There’s also a touristic, solipsistic implication in using improvisation as validation for having an effect/impact on the environment (this idea is central to another project I’m working on; more on that another time). I haven’t read his book, and I assume his framework is more nuanced than that, but I mention it only to distance myself from it!
The actual recording of this improvisation is difficult to pick out as a river once the mic-swinging begins. I made another recording of this mic-swinging material played back through a speaker underneath Junction Dam, with its gloriously muddy reverberation, and it becomes even harder to associate it with the river. Here, considerations of phase are so mediated that they’re practically negligible in their complete disarray, rendering the source recording virtually unrecognisable. Phase and time become entangled yet diffuse.
As a performance of which I’m the only audience member, this experiment is intriguing, but I worry that it doesn’t translate into interesting music or sound. Explorations of phase in this sense don’t carry well into recorded music, possibly because the indeterminacy of the playback situation (headphones, hi-fi speakers, car stereo, etc.) means that foregrounding phase as a compositional device would come across differently, or be lost altogether, from one system to the next. Moreover, the acousmatic condition of field recording, and the tacit understanding that field recordings are edited and manipulated, suggest that this particular timbre could have been added in the studio rather than employed in the moment of recording/performance.
The American musician and sound artist Stephen Vitiello once said something to the effect of “music is about time, and sound art is about space.” Resituating phase as a fuzzy, inertial conception of the instant of time gives some modulation to this problematic binary. More work needs to be done to articulate this concept more thoroughly; stay tuned.