
Hello everyone! Great news! IOSONO 3D sound teamed up with Sydney Opera House to present a new version of the opera "Die tote Stadt" by Korngold.
Alongside our focus on cinema sound, we thought it would be interesting to hear how an orchestra sounds when reproduced with state-of-the-art 3D sound technology.

Korngold’s “Die tote Stadt” shows what opera can look (and sound) like in the 21st century. As the orchestra pit was too small for the big orchestra, it plays live in a studio while the music is streamed in real time to the Opera Theatre. Our 3D sound system then creates an acoustical 3D picture of the orchestra.
With your eyes closed, you can hardly tell the orchestra isn't actually there, and individual instruments can be localized by both the audience and the singers on stage :-)

Check out our website for more information on this one-of-a-kind project. Hope you enjoy! http://iosono-sound.com/
Best, Katja


Comments

RemyRAD Sun, 07/08/2012 - 20:44

I looked at your link. How was the sound system that reproduces the orchestra actually set up and placed in the opera house? How many speakers/drivers were involved? This sounds like it might be similar to a concept I have been proposing for nearly 25 years. And to date, no takers. Though my concept did not include an actual symphony orchestra. It actually involves a sophisticated sample library along with a specially designed control interface. The conductor would still be in complete real-time control, and would still be able to function as a conductor and not as a knob-twiddling, button-pushing electro-mutant.

Without money you can't travel
Mx. Remy Ann David

anonymous Thu, 07/12/2012 - 06:20

The orchestra was reproduced using non-continuous arrays of loudspeakers driven by our audio processor, which incorporates the IOSONO algorithm. The algorithm was fed with the signals of approximately 60 spot microphones that were used to record the orchestra and choir. The orchestra itself was placed in a separate room.
The position from which a sound should radiate and be perceived is set in an object-based manner (comparable to a PC desktop: each icon represents a virtual sound source that is fed with one of the mic signals).
The IOSONO algorithm calculates the driving function for every loudspeaker to reproduce the wave field of the recorded instrument. We arranged the virtual sound sources the same way the microphones had been placed, to reassemble the body of the orchestra.
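For readers wondering what a "driving function" boils down to in practice, here is a minimal sketch of object-based rendering, assuming a simplified point-source model (per-speaker delay from distance over the speed of sound, plus 1/r attenuation). This is not the actual IOSONO renderer; the function name render_object, the positions, and the numbers are purely illustrative.

```python
# Minimal sketch of object-based rendering to loudspeaker feeds.
# NOT the IOSONO algorithm -- just a simplified point-source model:
# each speaker gets the source signal delayed by distance/c and
# attenuated with 1/r.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
SAMPLE_RATE = 48000      # Hz

def render_object(signal, source_pos, speaker_positions):
    """Turn one virtual source (a spot-mic signal placed at source_pos)
    into one feed per loudspeaker. Positions are [x, y] in metres."""
    feeds = []
    src = np.asarray(source_pos, dtype=float)
    for spk in speaker_positions:
        dist = np.linalg.norm(src - np.asarray(spk, dtype=float))
        delay = int(round(dist / SPEED_OF_SOUND * SAMPLE_RATE))  # samples
        gain = 1.0 / max(dist, 0.1)      # crude 1/r distance attenuation
        feed = np.zeros(len(signal) + delay)
        feed[delay:] = gain * signal     # delayed, attenuated copy
        feeds.append(feed)
    return feeds

# Illustrative use: one spot-mic signal placed where the first violins sat,
# rendered to a small frontal loudspeaker array.
violin_mic = np.random.randn(SAMPLE_RATE)   # 1 s placeholder signal
violin_pos = [-3.0, 5.0]                     # metres, made up
speakers = [[-6.0, 0.0], [-2.0, 0.0], [2.0, 0.0], [6.0, 0.0]]
feeds = render_object(violin_mic, violin_pos, speakers)
```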
My guess regarding your concept is that you are proposing to reproduce the signal for every instrument/sample, with the desired directivity, at the position where you want the sound to be localized? Is that correct?
Please note that in our concept the object-based mix and the actual reproduction setup are independent of each other. The loudspeaker driving functions are rendered in real time for the actual loudspeaker arrangement and mix.
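To illustrate that independence with the hypothetical render_object sketch above: the object-based mix (mic signals plus virtual source positions) stays untouched, and only the loudspeaker list handed to the renderer changes when the system is installed in a different room.

```python
# Same object-based mix, different reproduction setup: only the loudspeaker
# arrangement passed to the (hypothetical) renderer above changes.
theatre_speakers = [[-8.0, 0.0], [-4.0, 2.0], [0.0, 3.0], [4.0, 2.0], [8.0, 0.0]]
theatre_feeds = render_object(violin_mic, violin_pos, theatre_speakers)
```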

