The Sound within the Orchestra

The sound a musician experiences inside the orchestra is clearly different from the sound impression of the audience. Not only the instruments but the entire body of sound is designed to resonate with the concert hall and enter into a symbiotic connection with the room. As a listener in the hall, one hears the result of this complex connection. The central factor is the distance to the instruments, which makes the sound soft and warm.

Many orchestral musicians are downright addicted to the direct sound of their own instrument and of those sounding in their immediate neighbourhood: an immediacy that gets you going. It shows not only in the higher level of incidental noise (e.g. from the bow stroke or the keys), but also in the attack of each individual note. What the air evens out over distance are the small peaks and impulses, technically called transients, which are exciting to the ear.

Why? One possible reason: we humans are evolutionarily trained to react particularly quickly to, say, cracking branches in the forest. Transients put us into a state of alert, which we perceive as heightened energy. Pop music knows how to exploit this and places microphones as close to the sound source as possible. In classical music, a more relaxed distance is usually preferred.

What would happen if listeners were able to hear the orchestral musician’s sound experience through media?

In a 360° video research project with the Munich Symphony Orchestra and the conductor Kevin John Edusei, who thankfully let me into their rehearsal, I devoted myself to the question of the sound inside the orchestra. Clarence Dadson from Design4real, Johannes Müller, Leo Waidosch, Fons Adriaenson, Martin Rieger, Roland Bachmann from Sennheiser and Christian Lehmler from Bavaria Musikstudios helped me a lot. Thanks for the great support!

Conductor Kevin John Edusei with imaginary VR goggles (all Photos: Mathis Nitschke)

360° Video / Virtual Reality

360° video viewed through virtual-reality goggles is the ideal medium for such direct insights into the action, so it is not surprising that several experiments with orchestras were undertaken during the VR hype of 2016.

In the eagerness of the hype, however, the sound was hardly thought about. Different camera perspectives were captured for the first time, but the same “master” stereo mix was applied across all of them (e.g. here). This has many practical advantages, above all the balanced mix of conductor and sound engineer, and it matches the usual practice that close-ups in an orchestral video recording are not given a perspective-specific sound either. In the “teleportation medium” of virtual reality, however, I always found this wrong. I want to be at the destination acoustically as well, in such a way that the listening impression stays spatially stable when the head moves.

On other occasions exactly this spatiality has been experimented with, unfortunately often with doubtful results (e.g. here). The reason: according to theory, synthetic binauralization (the artificial creation of a spatial location in three-dimensional space via headphones) requires signals that are as isolated as possible, which is hard to achieve in an orchestral recording, since the brass, for example, is almost as loud in all the other microphones as in its own. In my first experiments I also used directional microphones, but they produce a constricted sound that you don’t really want to hear (which is why so-called omnidirectionals are almost always used). Here, too: garbage in, garbage out. If the sound coming from the microphone is already wrong, further processing won’t make it better.
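A toy calculation illustrates how bleed destroys the level differences a binaural renderer needs. The signals and the bleed factor below are invented for illustration; the point is only the arithmetic of channel separation.

```python
import numpy as np

rng = np.random.default_rng(0)
brass = rng.standard_normal(48000)  # synthetic stand-in for the brass signal
oboe  = rng.standard_normal(48000)  # ...and for a neighbouring instrument

def rms(x):
    return float(np.sqrt(np.mean(x ** 2)))

# Each spot mic picks up its own instrument plus bleed from the other.
# bleed = 0 would mean perfect isolation; in an orchestra it is high.
bleed = 0.7
spot_brass = brass + bleed * oboe
spot_oboe  = oboe  + bleed * brass

# Pan the two spots to opposite sides (a crude stand-in for a binaural
# renderer placing them left and right of the head).
left, right = spot_brass, spot_oboe

# Channel separation for the brass: with isolated spots it would be
# infinite; with this bleed it collapses to about 3 dB, leaving the
# renderer almost no level difference to build a spatial image from.
sep_db = 20 * np.log10(rms(brass) / rms(bleed * brass))
print(round(sep_db, 1))  # ~3.1 dB
```

With only ~3 dB of separation, placing the two spots at opposite ears barely moves the perceived image, which matches the doubtful results described above.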

Most of the time, however, the sound was simply handled carelessly or unknowingly, e.g. left in plain mono (as here, for example). In the end, though, it was probably the poor picture quality of the GoPro camera rigs of that era and the fading VR hype that brought the experiments to an early end. Today, good and practicable 360° cameras are finally available.

The Z Cam S1 360° camera, generously provided by Clarence Dadson, Design4real

Research question

Research is better done without hype. My main question was whether the microphone techniques required for a large, open-sounding classical recording (omnidirectional microphones and A-B stereophony) could be brought into the Ambisonics format required for 360° video (a coincident format, comparable to M-S or X-Y stereophony) without sound losses from comb-filter effects.
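The comb-filter worry can be made concrete. When a spaced (A-B) pair is folded into a coincident format, an off-axis source reaches the two capsules with a time offset, and summing the two signals notches out whole frequency bands. A minimal sketch with assumed numbers (48 kHz sample rate, roughly 23 cm of path difference):

```python
import numpy as np

fs = 48000   # sample rate in Hz (assumed)
d  = 32      # inter-capsule delay in samples (~0.67 ms, ~23 cm path difference)

# Summing a signal with its delayed copy has the magnitude response
# |H(f)| = |1 + exp(-2j*pi*f*d/fs)|: a comb with deep periodic notches.
f = np.linspace(0, fs / 2, 4097)
H = np.abs(1 + np.exp(-2j * np.pi * f * d / fs))

# The first notch sits at fs / (2 * d) = 750 Hz and repeats every 1500 Hz,
# i.e. right in the musically critical range.
first_notch = f[np.argmin(H[:256])]   # search below 1500 Hz
print(first_notch)  # 750.0
```

Coincident (X-Y or M-S) pairs avoid these time offsets by construction, which is why Ambisonics, itself a coincident format, is attractive here; the open question was whether the spaced omnis could be blended in without producing such notches.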

I also wanted this open sound image to be rotatable around the whole head without the sound degrading in between. The rotatability should not call attention to itself: in everyday life you don’t notice either that the auditory impression changes with every small movement of the head. Nevertheless, I expect the instruments to stay in place. The flute sitting to my right should be audible from behind when I turn left towards the oboe.
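This head-stable behaviour is exactly what the Ambisonics representation provides: the whole sound field is counter-rotated against the head as a unit. A minimal first-order, horizontal-only sketch (component naming follows B-format W/X/Y; normalization conventions and the vertical axis are ignored for brevity):

```python
import numpy as np

def encode_fo(theta):
    """Encode a unit-level source at azimuth theta (radians,
    0 = front, positive = to the left) into first-order W, X, Y."""
    return np.array([1.0, np.cos(theta), np.sin(theta)])

def rotate_yaw(wxy, alpha):
    """Rotate the entire sound field by alpha around the vertical axis.
    For head tracking, alpha is minus the head yaw: turning the head
    left rotates the field right, so sources stay put in the room."""
    w, x, y = wxy
    return np.array([w,
                     x * np.cos(alpha) - y * np.sin(alpha),
                     x * np.sin(alpha) + y * np.cos(alpha)])

# The flute sits 90 degrees to my right (azimuth -90). I turn my head
# 90 degrees left towards the oboe, so the field rotates by -90:
flute  = encode_fo(np.deg2rad(-90))
turned = rotate_yaw(flute, np.deg2rad(-90))
# turned is approximately [1, -1, 0]: the flute now sits straight behind.
```

Only the playback rotation changes; the recorded B-format channels themselves stay untouched, which is why the sound should not degrade "in between" head positions.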

By using omnidirectional microphones, I accept that the spatial definition becomes more diffuse than would be possible with directional microphones. After all, I believe that for this position in the middle of the orchestra the volume balance is even more important than the spatial positioning. When I sit next to the oboe, I want it to sound as loud as I expect it to. The decisive challenge is not to lose the overall sound of the orchestra.

Rehearsal of the Munich Symphony Orchestra in the Bavaria Music Studios: Claude Debussy, “La Mer”, 2nd movement

Approach

For this 360° experiment I was interested in the position among the first woodwinds, who sit close together in a square. It is a central position in the orchestra that offers a slightly elevated view over the strings, while the brass and percussion, rising towards the rear, breathe down one’s neck. Certainly not a balanced position, but that is exactly what I liked about it.

The usual main microphone above the conductor makes no sense from this perspective. Instead, a Sennheiser Ambeo VR Mic at the camera position recorded the sound from all directions in such a way that it can later be assigned to the direction it came from.
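The Ambeo is a tetrahedral microphone: four cardioid capsules whose raw signals (A-format) are matrixed into the direction-carrying B-format. The sums and differences below are the standard first-order conversion; real converters such as Sennheiser's own plugin additionally apply frequency-dependent correction filters, which are omitted here.

```python
import numpy as np

def a_to_b(flu, frd, bld, bru):
    """Convert tetrahedral capsule signals (A-format) to first-order
    B-format. Capsules: Front-Left-Up, Front-Right-Down,
    Back-Left-Down, Back-Right-Up."""
    w = flu + frd + bld + bru   # omnidirectional pressure
    x = flu + frd - bld - bru   # front/back figure-of-eight
    y = flu - frd + bld - bru   # left/right figure-of-eight
    z = flu - frd - bld + bru   # up/down figure-of-eight
    return np.array([w, x, y, z])

# Sanity check: an identical signal on all four capsules carries no
# directional information and lands entirely in the W channel.
print(a_to_b(1.0, 1.0, 1.0, 1.0))  # [4. 0. 0. 0.]
```

It is this W/X/Y/Z representation that later lets the mix be rotated against the head movement in the VR player.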

This Ambisonics main microphone is supported by the spot microphones (omnidirectional!) which are also common in regular recordings.

The “outriggers”, omnidirectional microphones on the front flanks of the orchestra that add a special width to the sound image, were extended by another pair on the rear flanks. Quadraphonic outriggers, so to speak.

The 24 channels, recorded in Samplitude, were mixed in Reaper with spatial-audio and Ambisonics plugins from DearVR, Facebook360, IEM and BlueRipple.

Prototype

Sadly I can’t make the prototype public for various reasons. Please contact me if you are interested in it.

Discussion

I am basically very satisfied with the result. I consider the primary goal, an “unphased” sound image, to have been achieved. The open sound impression of the omnidirectional microphones survives the Ambisonics processing, which I do not take for granted. I would, however, like to see sharper localization.

One objective problem is the sound of the strings. They might not even need to be much louder, but the sheer size of the section is not conveyed. I should have given the brass’s spot microphones to the strings instead; two microphones per group would probably be right. I thought I needed the spots on the brass to pinpoint their location, but that did not work out.

The overall frequency response is also somewhat disappointing. The weight of the orchestra is largely determined by the room, and this room has little volume to offer. The close miking also undermines the resonance system between orchestra and room. Normally one could simulate a larger room with a reverberator, and I did, but here that contradicted the visual impression of the 360° video. The binauralization also takes away bass. In any case, there is still room for improvement.

Outlook

In my future projects I would like to work more with the sound from within the orchestra; my interest in it has only grown. The decisive difference from this research project will be that users can move through the orchestra’s sound field themselves (6DoF), determining the balance of the instruments through their own position in it.

The purpose of this 360° video experiment was to find out, from a static position, what the challenges and qualities of this listening position are. These experiences will serve me well in mastering the even harder task of interactive walkability.