Last April, Adobe took its virtual reality capabilities into new territory, announcing it would build VR editing capabilities into its Premiere Pro software. The new capabilities include auto-detection of VR footage, along with affordances that let editors assign properties to sequences, track the head-mounted display, and seamlessly publish to specific platforms such as YouTube and Facebook. You can’t blame any company for wondering about the future of VR. So what changed?
Bronwyn Lewis, Adobe product manager for Video Editing, brings up Corridor Digital’s “Where’s Waldo” VR video. “They had this wipe, it was like a diagonal wipe, from the sky.”
You could not have had this conversation with the Creative Cloud Video Team, or very many people at all, even two years ago.
“Making VR content actually requires a huge amount of technical skill,” says Laura Williams Argilla, Adobe director of Services and Workflows for Creative Cloud Video. “That often conflicts with creative intent, because you have to have both skill sets.” Brian Williams, senior computer scientist for Premiere Pro at Adobe, is making sure that the ability to create content doesn’t exclude people who are more creative than technical.
For filmmakers, the blending of creative and technical aptitude has been beneficial. But there’s a limit, one with more elements than the kind of rigs you use or the headsets they will eventually populate. At the center of all of that is software. Until recently, the pioneers of virtual reality storytelling, especially in live action, were using the digital equivalent of baling wire and duct tape to tell their stories. For the Video Team, it was hearing multiple times that video creators were using Premiere to edit VR that spurred them into action. It turned out not to be the easiest sell.