VR Panning Tool for Fulldome Production | Final Master's Project

This work presents a prototype of an alternative VR interface for panning sound sources in Ambisonics, aimed at the production of fulldome content. Compared to the conventional user interfaces of most spatial audio tools, it gives the user a more comprehensive understanding of the positions of all sources and their spatial relations. Additionally, video can be played back simultaneously, which allows sources to be positioned precisely on visual elements in the footage.

NOTE: Because the application is primarily designed for use in a physical speaker dome, the binaural positioning of the sound sources heard in the video does not match what is seen on screen, as no head tracking is applied to these sources. The audio reflects the experience from a fixed position under a physical speaker dome, as in a planetarium.


There are now numerous hardware and software solutions for spatial audio production in different formats, designed for various playback systems. Their interfaces often make precise positioning of sound sources difficult, as they usually offer only a small, two-dimensional depiction of the room; this becomes especially apparent when sources have to be positioned relative to moving imagery and the presentation takes place in an environment as large as a planetarium.

With that in mind, the goal of this project was to develop a virtual reality interface that gives the user a comprehensive overview of the spatial distribution of the sound sources and their positions relative to each other through visual feedback, and that also allows exact source positioning in relation to the moving picture. The result is the prototype of a VR application for 3D panning that serves as a graphical extension of Reaper and two Ambisonics encoders of the IEM Plug-In Suite, bringing their functionality into a virtual dome in which the video material can be viewed simultaneously.

Although the tool can also be used at home with headphones and the IEM BinauralDecoder, it is tailored to the specific setup and workflow of the Immersive Audio Lab (IAL) at the HAW Hamburg, which is equipped with a dome-shaped 33.2 speaker array. Besides a multitude of channel- and object-based formats, the primary playback format is Higher Order Ambisonics (HOA). Together with Reaper and the IEM Plug-In Suite, hemispherical Ambisonics audio can be produced up to the 7th order.
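For orientation on what "7th order" implies: the number of channels in a full-sphere Ambisonics mix grows quadratically with the order, as (N+1)². The short snippet below is purely illustrative (it is not part of the tool) and simply prints the counts relevant to the comparison further down.

```cpp
#include <iostream>

// Purely illustrative: full-sphere Ambisonics of order N carries (N+1)^2 channels.
// 3rd order -> 16 channels, 7th order -> 64 channels. Hemispherical (mixed-order)
// setups may get by with fewer components, but the quadratic growth is what
// drives the achievable spatial resolution.
int main() {
    for (int order : {1, 3, 5, 7}) {
        std::cout << "Order " << order << ": "
                  << (order + 1) * (order + 1) << " full-sphere channels\n";
    }
    return 0;
}
```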

Immersive Audio Lab at the HAW Hamburg (Bottom speaker ring is not installed in this picture)

dearVR SPATIAL CONNECT, which is built around the same concept of controlling spatial audio in a virtual environment with 360° video playback, only supports Ambisonics up to the 3rd order. It could therefore not take full advantage of the speaker array in the IAL and the spatial resolution that 7th order Ambisonics makes possible.

More Info Coming Soon

To keep the structure as modular as possible during development, the architecture of the tool is separated into different building blocks, each with its own set of functionalities. This should make future expansion or modification of the application easier. The tool itself does not process any audio signals; it only serves as a graphical interface for operating the DAW and the two encoders via the HTC Vive Pro's HMD and controllers. Inspiration for how to realize such a tool was naturally drawn from dearVR SPATIAL CONNECT, but the concept was adjusted to fit the typical workflow for fulldome and planetarium content in the IAL.
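The exact control path is not detailed here, but both Reaper and the IEM plug-ins can be remote-controlled via OSC over the network, which is a natural fit for this kind of external interface. The sketch below is only a minimal illustration of that idea, not the project's actual code: the port, the message paths such as /StereoEncoder/azimuth, the use of degrees as units, and the plain POSIX sockets are all assumptions.

```cpp
// Hypothetical sketch: push one source position to an encoder's OSC receiver over UDP.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Pad a string with null bytes to a multiple of 4, as the OSC encoding requires.
static void appendPadded(std::vector<char>& buf, const std::string& s) {
    buf.insert(buf.end(), s.begin(), s.end());
    const size_t pad = 4 - (s.size() % 4);   // always at least one null terminator
    buf.insert(buf.end(), pad, '\0');
}

// Append a 32-bit float in big-endian byte order (OSC network order).
static void appendFloat(std::vector<char>& buf, float v) {
    uint32_t bits;
    std::memcpy(&bits, &v, sizeof(bits));
    bits = htonl(bits);
    const char* p = reinterpret_cast<const char*>(&bits);
    buf.insert(buf.end(), p, p + 4);
}

// Build a minimal OSC message with a single float argument and send it via UDP.
static void sendOscFloat(int sock, const sockaddr_in& dest,
                         const std::string& address, float value) {
    std::vector<char> packet;
    appendPadded(packet, address);   // e.g. "/StereoEncoder/azimuth" (assumed path)
    appendPadded(packet, ",f");      // type tag: one float argument
    appendFloat(packet, value);
    sendto(sock, packet.data(), packet.size(), 0,
           reinterpret_cast<const sockaddr*>(&dest), sizeof(dest));
}

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(9000);                     // assumed OSC port of the encoder
    inet_pton(AF_INET, "127.0.0.1", &dest.sin_addr); // DAW running on the same machine

    // Example: place a source at 45 deg azimuth, 30 deg elevation (degrees assumed).
    sendOscFloat(sock, dest, "/StereoEncoder/azimuth", 45.0f);
    sendOscFloat(sock, dest, "/StereoEncoder/elevation", 30.0f);

    close(sock);
    return 0;
}
```

In practice, messages like these would be sent continuously while a source is grabbed and moved with a controller, so the encoder parameters in the DAW follow the hand movement in the virtual dome.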

My Roles

Everything

Audio Software

Reaper

IEM Plug-In Suite

Game Engine

Unreal Engine 4

Platform

PC

HTC Vive Pro

Year Of Production

2020 / 2021