In no particular order, the three technology advances that make the Solar Odyssey production even feasible are: large sensors, syncing double-system production by matching audio waveforms, and the recognition of the importance of metadata. Alternate camera and lighting mounts, LED lighting, and laptops powerful enough for primary production also play important roles.
I make no claim to be a camera geek. I rely on advice from my friends Carey Dissmore and Steve Oakley, among many other IMUGgers: always believe what they say about lenses over me! That said, I love working with beautifully shot images, which was the main reason I stopped shooting my own material many years ago. The images I’m seeing off the NEX-7s and FS100 are just beautiful. Here’s why I decided on them.
But, like so much of the technology we’re using, large sensor cameras were “big boy” toys. Even ⅔” cameras were significantly more expensive than their small sensor brethren. But in the few years since I wrote the first edition of the HD Survival Handbook I’ve seen affordable cameras transformed with the release of both DSLRs and the new crop of large sensor, and affordable cameras: Sony’s FS100, Panasonic’s Micro 4/3 camera, Canon’s C300 and the upcoming Blackmagic Design camera. These cameras produce noticeably nicer images with seriously improved dynamic range, making lighting less of an issue.
The FS100 is significantly more sensitive in low light – so much so that it sees the world brighter than my naked eye does, and needs ND in bright light since none is built in. For reality production, that offsets the less-ergonomic aspects of the camera. The NEX-7 shoots video that looks like what I see – not quite the full dynamic range, but under challenging conditions in Las Vegas during NAB it performed remarkably well.
That pairs nicely with the advent of LitePanels’ LED lighting. Because the cameras are so sensitive we don’t need massive amounts of light to “get picture”. We can use relatively small (and battery powered) lighting units that help fill shadow and control the image.
The most significant development, though, is the ability to synchronize separate audio and video recordings by matching audio waveforms. I credit Bruce Sharpe at Singular Software with first bringing the technology to market with PluralEyes. We may well end up buying PluralEyes for the Solar Odyssey if Final Cut Pro X’s synchronization proves to be too slow for us.
Having that feature right in the NLE, or in third-party software, means we avoid the complication of needing cameras that can be externally locked to timecode, plus the synchronizing hardware to keep everything on the same time of day. That’s huge for the Solar Odyssey project. I can comfortably say the speed advantage for those synchronizing with matching timecode and our Sync-N-Link or Sync-N-Link X is phenomenal, but it’s ironic that we can’t use our own software on this project.
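For the curious, the core waveform-matching idea can be sketched with a cross-correlation: slide one recording’s audio against the other and take the lag where the two waveforms line up best. This is only a minimal illustration of the principle, assuming clean simulated signals – it is not how PluralEyes or Final Cut Pro X actually implement their matching.

```python
import numpy as np

def find_offset(reference, other, sample_rate):
    """Estimate the lag of 'other' relative to 'reference' by
    cross-correlating the two waveforms and taking the peak."""
    corr = np.correlate(other, reference, mode="full")
    lag = int(np.argmax(corr)) - (len(reference) - 1)
    return lag / sample_rate  # offset in seconds

# Simulated double-system example: the camera track contains the same
# audio as the recorder track, but starts 0.5 s into it.
rate = 1000                               # samples per second (toy rate)
rng = np.random.default_rng(42)
recorder = rng.standard_normal(5 * rate)  # 5 s of "audio"
camera = recorder[rate // 2:]             # same audio, 0.5 s later start

print(find_offset(recorder, camera, rate))
# -0.5: the camera clip begins 0.5 s into the recorder's timeline,
# so shifting it by that amount lines the two recordings up.
```

Real implementations work on envelope or spectral features rather than raw samples, and cope with noise and drift, but the peak-of-correlation idea is the same.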
Having that technology available means we can record quality audio without needing direct connections to the camera. On board, six radio mics and two PZM mics feed a multitrack recorder on the Mac Mini; our metadata-based software will trim those recordings and pull out individual segments and people for editorial, and the results will then be synchronized in Final Cut Pro X. Off the boat, up to four people will be mic’d with a lapel mic feeding a Zoom H1 (very basic, but I want simplicity and basic functionality). Each person has a self-contained recording, again merged with the camera footage of the same event.*
Equally important is the recognition of metadata’s importance, and particularly our own thinking and development in that field. Solar Odyssey is an opportunity to test a new approach to “easier and earlier” logging that gets transmitted to the editor without additional effort.
But that wouldn’t work without the rise of cellular data and iPads. Without them, we’d have to use laptops on “set”, which would be nowhere near as convenient. With our planned system for handling metadata, multiple people can be logging the same, or different, events and the system will merge them appropriately. For those interested, we’re already working with a functional, but not pretty, beta.
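As a toy sketch of the merging idea: assume each logger emits timestamped entries, and the system groups them by timecode and field, combining values from different loggers for the same moment. The entry format and field names below are invented for illustration – the actual beta certainly differs.

```python
from collections import defaultdict

# Hypothetical entries as two iPad loggers might send them:
# (timecode_in_seconds, field, value). Purely illustrative.
logger_a = [(120.0, "people", "Sam"), (120.0, "keywords", "solar panel")]
logger_b = [(120.0, "people", "Alex"), (95.5, "keywords", "departure")]

def merge_logs(*logs):
    """Merge entries from several loggers, grouping by (timecode, field)
    and combining the values logged for the same moment."""
    merged = defaultdict(set)
    for log in logs:
        for timecode, field, value in log:
            merged[(timecode, field)].add(value)
    # Sort for stable, readable output.
    return {key: sorted(values) for key, values in merged.items()}

merged = merge_logs(logger_a, logger_b)
print(merged[(120.0, "people")])  # both loggers' entries combined
```

The point is simply that independent logs keyed to a common timeline can be combined mechanically, whether they describe the same event or different ones.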
Not that laptops don’t figure in the equation. The quad-core i7 17” MacBook Pro I’ll be working with for the production has more power available than any tower I’ve ever used. Ditto the Mac Mini we’re using as a central server: it runs Metalan Server to share media from the duplicate G-Tech towers, doubles as a multitrack audio recorder, and serves as the Final Cut Pro X ingest and proxy-generation workstation.
With the shift to using GPU power as well, these tools are many times more powerful than my last big production setup – and we managed to create interesting stories with those much-less-powerful tools!
It’s a great time to be in production and I’ve appreciated the challenge of rethinking production.
*Like many people, I questioned the term “Event” in Final Cut Pro X. Having now gone through the exercise, for our new software system, of naming each individual, self-contained part of the shoot, I find myself coming around to Event as the most accurate term. Someone at Apple is likely laughing at me right now, but I’m OK with that.