Alongside announcements of new Creative Cloud applications, Lightroom CC, and machine learning features powered by Sensei, Adobe is today previewing an entirely new tool it calls Project SonicScape, shown for the first time at Adobe MAX 2017.
Project SonicScape is an experimental feature that builds on Adobe’s recently expanded support for immersive 360-degree and VR experiences. Adobe Premiere Pro now offers real-time VR playback and VR-enabled motion graphics templates. Project SonicScape goes a step further, letting you edit 3D audio inside VR through an immersive visual interface.
The magic begins with capturing 360-degree video and audio. Spatial audio imported into Project SonicScape is shown onscreen, with different frequencies represented by colored dots. This new visualizer moves beyond the traditional waveform by showing you not just the audio frequencies themselves, but also their position in 3D space. By clicking and dragging, you can reposition sounds in space to change their perceived direction.
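Adobe hasn’t detailed the format SonicScape works with, but 360-degree video workflows commonly carry spatial audio as first-order ambisonics (B-format). As a rough illustration of the idea, the sketch below, which assumes that format, encodes a mono source at one direction and re-encodes it at another, the kind of operation a drag in the visualizer would drive. The function names and parameters here are illustrative, not Adobe’s API.

```python
import numpy as np

def encode_foa(mono, azimuth_deg, elevation_deg):
    """Encode a mono signal into first-order ambisonics (ACN/SN3D B-format).

    Returns four channels (W, Y, Z, X) whose relative gains place the
    source at the given direction on the listener's sphere.
    """
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = mono * 1.0                      # omnidirectional component
    y = mono * np.sin(az) * np.cos(el)  # left/right
    z = mono * np.sin(el)               # up/down
    x = mono * np.cos(az) * np.cos(el)  # front/back
    return np.stack([w, y, z, x])

# A 440 Hz test tone, one second at 48 kHz.
sr = 48000
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 440 * t)

# Original capture direction: 30 degrees to the left, level with the listener.
original = encode_foa(tone, azimuth_deg=30, elevation_deg=0)

# Dragging a sound's dot in the visualizer maps to re-encoding it at a
# new direction -- here, behind and above the listener.
repositioned = encode_foa(tone, azimuth_deg=150, elevation_deg=45)

print(original.shape, repositioned.shape)  # (4, 48000) each
```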
In its current state, Project SonicScape isn’t built into any specific Creative Cloud application; it serves as a standalone tech demo showcasing what recent advances in VR make possible. Adobe hasn’t provided a timeline for a release, but we’ll keep you up to date on any new developments.