Author – Hugh Gormley, Project Manager (Video Innovation), Learning Futures
What’s in the Box?
Learning Futures is always up for a challenge! This year we partnered with OMC (Office of Marketing & Communications) and the Red Zone to provide an Immersive (360/VR) experience suitable for the Commonwealth Games (and focused on showcasing our talented student Athletes). This post takes a personal look at the learning ‘on the ground’ garnered through this experience, and some of the innovative approaches employed along the way (warning…this might get a little ‘techie’).
Oranges or mandarins?
One of the hardest things about pitching any new media project is simply conveying how your idea differs from a traditional approach.
In the case of 360˚ video, you’re pitching snippets of ‘actual’ reality, pieced together to make a creative work. There is nowhere to easily hide lights, microphones, passers-by or errant directors. The camera crops nothing from a scene, nor adds anything in terms of zooms, pans or focus-pulls. In our case, it renders reality, warts and all, in beautiful / brutal 6k video at 60 frames per second. The video is stereoscopic too, meaning that when you view it in any headset, everything appears to be 3D, just like you’re there.
These factors are often unknown quantities to client partners and talent. As you describe for example ‘a kind of carousel menu where four athletes will be standing around the viewer, ready to tell their story’ YOU know what you mean, how it will look, sound and most importantly ‘feel’, but most people have not yet tried any kind of VR at all. The gap between presenting in 360˚ / 3D and traditional media means they’re still thinking of delicious oranges when you’re talking about exotic Mandarins. They’re similar, not necessarily better, and it is hard to explain the difference until everyone has a taste.
This presents a delicious conundrum where, if you have your partner’s trust, you can both ignite your creativity and allow audiences to sample a new medium. Mmm… Mandarins.
You need a Plan B…or more
This exciting project came to Learning Futures in the last flush of preparation for the Commonwealth Games… and because we are the people that we are, we let our imaginations run away from us at first, even with the hardest of ‘hard deadlines’ approaching quickly. Our Learning Futures Dean and Director, Professor Alf Lizzio, tasked us with undertaking the project to learn everything we could. We took this literally, making sure we pushed the equipment and our skills to the limit.
Our partners (Office of Marketing and Communications/Red Zone) were the best kind – keen, experimental, trusting, and with high standards. We pitched for the stars, roping in five (5) Griffith student athletes and committing to technical challenges like recording 360˚ on a green screen and building boutique camera rigs to shoot underwater.
OMC and Learning Futures both went straight into full production mode; organisational tasks were divided up, athletes arranged, favours called in, and test footage shot.
Our first recording session at Griffith Film School (GFS) couldn’t have gone worse. Our camera refused to acknowledge any media being plugged into it.
Athletes stood around patiently as we tried to get things back on track, and even our back-up camera refused to play ball!
This was one of my worst ever days in the studio, and it just goes to show that with anything new and cutting edge, you just don’t know what you don’t know.
Still, we managed to take each one of our student athletes off to interview them separately in the audio suite, forming the bed of each story that you hear in the finished video.
Fit-out for Purpose
Our next challenge was location-based. Unfortunately, that had been the last day the GFS sound stage was available for our purposes prior to Trimester 1, so we needed another solution that would be more flexible and accommodate our athletes’ training, work and study commitments. We were already very conscious of the time we could ask of them, and of the mere weeks remaining until the start of the Games, so we expanded the green screen wall in our ‘not yet built out’ Immersive Studio to include a floor.
We took the innovative approach of purchasing 12 square metres of interlocking green camping tiles (from Bunnings) to give us just enough space to have the athletes walk toward the camera on something resembling head-to-toe chroma-key green. The student athletes could now tell us when they were available, come to the studio, get changed nearby, record a full-length shot with us for ten minutes and leave again. We’d spent considerable time double-checking camera operations to ensure there wouldn’t be a repeat of the issues experienced in our first attempt. Lighting was made to be ‘set-and-forget’ for when the athletes arrived, and this time, by comparison, everything went perfectly.
We could have shot all of this with traditional cameras, but the decision was taken early to use the Insta360 Pro camera to capture our talent with the same colour, resolution and wide-angle lens distortion as our ‘real world’ shooting. This also gave us the benefit of 120fps stereo filming that comes through so clearly in the finished product when the athletes are ‘virtually’ standing around the viewer.
The real-world shooting allowed us to learn a great deal too.
In the Gym
Shooting in gyms full of mirrors (which exacerbated the usual issues of shooting 360 video) was nothing compared to trying to corral the ‘Joe Public’ gym-goers who had just come to work out and couldn’t care less about walking through our shot.
The 360˚ camera is, quite plainly, 360˚ and thus a simple shot can be ruined by someone walking past the ‘back’ of the camera, straight behind your main action. It’s also hard to tell if this has happened, because you’re invariably hiding with the Producer behind any cover available with only a 1 frame per second wifi stream for monitoring. We found the wifi monitoring connection to be weak at best and constantly requiring reconnection. In the future, I will try taking a pre-configured router onto location to enhance signal and save time keeping the connection between camera and mobile device intact.
In the Swim
Australians love swimming, and I really wanted to put the viewer into the pool so the first shot of swimming would be a 3D swimmer tearing through their vision and revealing that they are watching the swimming from underwater. This would require another camera called the Insta360 One; itself the little sister to the Insta360 Pro we used on the rest of the production. It is small, unable to do 3D, yet has an optional underwater housing, and I thought if I put two (2) of them side by side (about the distance of the average person’s eyes) I could manage a stereoscopic image from the camera – at least directly in front and directly behind the viewer.
Unfortunately, on the day this experiment failed, as I hadn’t considered the effects of underwater refraction and how it would influence the stitching of parallel lines at the extreme left and right of the camera; one of the camera batteries also died for no apparent reason. I’m convinced the principle was sound though, and I will endeavour to prove it at another time.
On the day, we went to our plan B, shooting underwater with just a single camera. The refraction problem is still evident, but by positioning the camera thoughtfully we found we could steer the viewer’s gaze away from this annoying problem.
The underwater shoots proved challenging, jumping in and out of the pool each time we needed to start or stop recording, while also directing our talent swimmers to get in and out of the pool to change, or to repeat shots over and over again. We overshot here on purpose because we couldn’t preview and didn’t know how each shot would turn out. Still, nice work if you can get it, bombing the pool on a sunny summer’s day!
Pulling it all together
With everything now shot, it was time to get stitching.
Most of our footage used 6 x 5k cameras stitched together by powerful computers using ‘new optical flow interpolation’ to produce a 3840 x 3840 pixel stereoscopic master at 60 fps. We were lucky to realise early on how time-consuming this would be, and changed tack, splitting the footage amongst the 4 computers bought for the upcoming GU Immersive Lab that contained the latest NVIDIA GeForce 1080 graphics cards. Even so, this process took a whole weekend to complete, though to be fair, we also had the computers double the frame rate of much of this footage (again, using the time-consuming optical flow retiming interpolation process) so that we could seamlessly halve its speed later.
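The logic behind that frame-rate doubling is worth spelling out. A minimal sketch (illustrative numbers only, not our actual pipeline code):

```python
# Why retime 60 fps footage to 120 fps before slowing it down?
# Playing footage at half speed without interpolation leaves only
# half the unique frames per second, which stutters on a 60 fps timeline.

def playback_fps(source_fps: float, speed: float) -> float:
    """Unique frames per second after a speed change, with no interpolation."""
    return source_fps * speed

# Half-speed on native 60 fps footage: only 30 unique frames/sec (judder).
print(playback_fps(60, 0.5))   # 30.0

# Doubling to 120 fps first (via optical-flow interpolation) restores
# a full 60 unique frames/sec at half speed: seamless slow motion.
print(playback_fps(120, 0.5))  # 60.0
```

The weekend of rendering buys you the freedom to ramp speed later in the edit without any visible stutter.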
The next step was a surprise in that our standard desktop editing software, Adobe Premiere, seemed unwieldy and struggled to make the editing process convivial. It also wouldn’t allow headset previews on the Mac. It was time to make another change, and the solution came in the shape of Apple’s Final Cut Pro X, which I read had recently been ‘optimised’ for 360˚. I grabbed a HTC Vive headset and plugged it into my Mac Pro, and (much to the chagrin of my Windows compatriots) everything just clicked. I’m no great fan of FCPX for traditional editing (although I think that’s just because I’m getting older and less keen to change the habits of a lifetime), but for 360˚ video, it has some fantastic tools and makes the process much easier. I particularly like its ability to instantly switch between the full resolution 3840×3840 60fps files and the low resolution proxy files that FCPX creates for you, allowing you to just work and then only process the high res footage at the end.
Let’s put those file sizes in context. A High Definition (HD) frame is 1920×1080 pixels at 25fps. A 4k frame doubles both dimensions (3840×2160) and therefore has 4 times as many pixels. Our footage is two 4k frames, one atop the other, and thus 8 times the HD footage. It’s also at least 60fps (and some shots were 120fps) and therefore 2.4 times more again. So, in every second of 360˚ video footage we shot, there is a minimum of 19.2 (8 x 2.4) times more information than the standard, traditional HD footage we are typically used to working with! This manipulation of super large file sizes was one of the biggest challenges we had to overcome.
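For the sceptical, here is that arithmetic as a quick sanity check, using the article’s own figures (two 4k frames per stereo pair, 60fps minimum versus 25fps HD):

```python
# Data-rate comparison: stereoscopic 360 footage vs standard HD.
hd_pixels = 1920 * 1080          # one HD frame
four_k_pixels = 3840 * 2160      # one 4k frame (double width and height)
stereo_pixels = 2 * four_k_pixels  # two 4k frames, one per eye

pixel_ratio = stereo_pixels / hd_pixels   # 8.0 times the pixels
fps_ratio = 60 / 25                       # 2.4 times the frames per second

print(pixel_ratio)               # 8.0
print(pixel_ratio * fps_ratio)   # 19.2 times more information per second
```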
Now happily ensconced in the editing process, I was conscious of the ‘KISS – Keep it Simple, Stupid’ principle, but at the same time keen to see how far I could push editing and transitions to create an entertaining experience.
However, raising the bar doesn’t mean pushing right to the bleeding edge, so designing the experience through trial and error was important. We employed traditional tricks like cutting a lap of the swimming pool into 4 parts to convey speed and yet compress time. The viewer doesn’t move (avoiding potential motion sickness), but sees ‘reality’ selectively cut in order to speed it up. Slow motion is also used to ramp from normal to slow and back to normal speed. On average, each story contains about 12 – 14 cuts but only 3 – 4 scenes to limit the possibility of confusion or simulation sickness.
Further to this, we quickly found that managing the orientation of the footage was something of a mission.
For those who might read this to inform development practice, note that orientation is important throughout your entire production, but expect to review every orientation before finalising. You will likely want to reorient each shot so that the viewer begins facing the same direction they were facing at the end of the previous shot; however, if you make a change anywhere further down your timeline, you’ll need to adjust every preceding and subsequent shot to compensate, especially if you want your experience to loop. My advice is to keep everything roughly reoriented to allow for meaningful playback preview and feedback, but plan for an overhaul tune-up at the end, no matter how frustrating this might be.
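To see why one change ripples through the whole timeline, consider this minimal sketch (a hypothetical helper, not part of any real editing API): each shot’s reorientation must absorb the accumulated facing direction of everything before it.

```python
# Sketch: why reorienting 360 shots is cumulative.
# Each shot ends with the viewer facing some yaw (degrees off-centre).
# The rotation applied to each shot must carry the sum of all previous
# end-yaws, so every cut lands where the viewer is already looking.

def cumulative_offsets(end_yaws):
    """Given each shot's native end-facing yaw, return the rotation
    to apply to each shot so that every cut is seamless."""
    offsets = []
    carry = 0.0
    for yaw in end_yaws:
        offsets.append(carry % 360)
        carry += yaw  # the viewer now faces this much further around
    return offsets

# Three shots whose action ends facing 90°, 180° and 45° off-centre:
print(cumulative_offsets([90, 180, 45]))  # [0.0, 90.0, 270.0]

# Re-edit shot 2 so it now ends at 120° instead of 180°, and every
# later offset changes too -- hence the final "overhaul tune-up":
print(cumulative_offsets([90, 120, 45]))  # [0.0, 90.0, 210.0]
```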
Also, it’s imperative that you give the viewer a chance to look around and find the action. This can be achieved using cueing techniques like audio panning, or by dissolving to the next shot while the viewer is already likely to be panning their head from the previous one.
Special effects could have their own article, and I won’t get into them here, other than to say that nearly everything I knew about compositing had to be questioned to complete this project!
From the simplest ideas to the most complicated, nothing translated easily from the flat, traditional composition techniques I know well into the wrapped equirectangular frame of 360˚. With frame sizes as large as these, my processes had to be broken down and re-established. I now know of some great tools within After Effects (albeit somewhat unwieldy in themselves) for translating the flat to the warped and back again, to help the motion graphics artist visualise. Next time!
So, how did we go?
The experience was provided as part of the ‘Red Zone’ activation at the Commonwealth Games Swimming event. It was reportedly standing-room only with long queues prevalent throughout the entire Games schedule.
The experience will be available in the Campus-based Red Zones shortly, and there are always ideas for iteration and enhancement. There are opportunities to extend the VR component to enable viewers to select the athlete they wish to hear about, and possibilities for an Augmented Reality experience, where the athletes appear on a flat surface for selection, and elements of the 360˚ video then play.
Given more time (or our time again), the experience would have included spatial audio / sound and potentially more graphics production, especially in relation to text.
In all, the project took (in ‘man’ hours) 41.5 hours of shooting, about 80 hours of post-production and about 60 hours of planning and test shooting. All to deliver a 5 minute experience!
It takes a Village!
None of this would have been possible without Learning Futures colleagues Bradley Harrison Producing, Espen Dammen recording the audio and his ever-present tech support in the studio, or our partners from the Griffith Film School (particularly Dean Chircop, Brett Wiltshire, Curtis Sullivan and Alex Waller) who helped immensely to pull everything together during our fateful day in the GFS studio.
Of course, none of it would have happened without Anne Brandt and Emily Kuntz (OMC / Red Zone) and the support and vision of Professor Alf Lizzio (Learning Futures).
Immerse Yourself and meet our Athletes
There are two ways you can engage with the Commonwealth Games Immersive 360 Experience:
- 360 view on your Browser | Use your mouse to navigate the 360 environment via YouTube
- View via Google Cardboard or a range of VR Headsets (eg. Daydream, Vive) for a more immersive experience…get ready to start boxing!
To see more of the Red Zone, and other ways our Learning & Teaching expertise at Griffith is articulated in this emerging technologies context, visit the site here.