The present and future of post production business and technology | Philip Hodgetts

May 24, 2012

Three things that changed everything for Solar Odyssey

In no particular order, the three technology advances that make the Solar Odyssey production feasible are: large sensors, syncing double-system production by matching audio waveforms, and the recognition of the importance of metadata. Alternate camera and lighting mounts, along with LED lighting and laptop computers powerful enough for primary production, also play an important role.

I make no claim to be a camera geek. I rely on advice from my friends Carey Dissmore and Steve Oakley, among many other IMUGgers: always believe what they say about lenses over me! That said, I love working with beautifully shot images, which was the main reason I stopped shooting my own material many years ago. The images I’m seeing off the NEX-7s and FS100 are just beautiful. Here’s why I decided on them.

Like so much of the technology we’re using, large sensor cameras were, until recently, “big boy” toys. Even ⅔” cameras were significantly more expensive than their small sensor brethren. But in the few years since I wrote the first edition of the HD Survival Handbook, affordable cameras have been transformed by DSLRs and a new crop of affordable large sensor cameras: Sony’s FS100, Panasonic’s Micro 4/3 cameras, Canon’s C300 and the upcoming Blackmagic Design camera. These cameras produce noticeably nicer images with seriously improved dynamic range, making lighting less of an issue.

The FS100 is significantly more sensitive in low light (leading to the need for ND filters in bright light, as none are built in), seeing the world brighter than my naked eye does! For reality production that offsets the less ergonomic aspects of the camera. The NEX-7 shoots video that looks like what I see: not quite the dynamic range, but under challenging conditions in Las Vegas during NAB it performed remarkably well.

That pairs nicely with the advent of LitePanels’ LED lighting. Because the cameras are so sensitive we don’t need massive amounts of light to “get picture”. We can use relatively small (and battery powered) lighting units that help fill shadow and control the image.

The most significant development, though, is the ability to synchronize separate audio and video recordings by matching audio waveforms. I credit Bruce Sharpe at Singular Software with first bringing the technology to market with PluralEyes. We may well end up buying PluralEyes for the Solar Odyssey if Final Cut Pro X’s synchronization proves to be too slow for us.

Having that feature right in the NLE, or in third party software, means we avoid the complication of needing cameras that can be externally locked to timecode, and the synchronizing hardware to keep it all on the same time of day. That’s huge for the Solar Odyssey project. That said, I can comfortably say that the speed advantage for those synchronizing with matching timecode and our Sync-N-Link or Sync-N-Link X is phenomenal. It’s ironic that we can’t use our own software on this project.
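Neither PluralEyes nor Final Cut Pro X publishes its matching algorithm, but the underlying idea is well known: cross-correlate the camera’s scratch audio with the recorder’s track and take the correlation peak as the offset. Here’s a minimal sketch in Python (NumPy assumed; the function name and toy numbers are mine, not from any shipping product):

```python
import numpy as np

def find_offset_seconds(sig, ref, rate):
    """Estimate how many seconds `sig` lags `ref` (positive means the
    same sound arrives later in `sig`) via the cross-correlation peak."""
    corr = np.correlate(sig, ref, mode="full")
    # In "full" mode the lags run from -(len(ref) - 1) to len(sig) - 1.
    lag = int(np.argmax(corr)) - (len(ref) - 1)
    return lag / rate

# Toy example: a recorder track that starts 0.3 s after the camera audio.
rate = 1000.0
camera = np.random.default_rng(7).standard_normal(2000)
recorder = np.concatenate([np.zeros(300), camera])
offset = find_offset_seconds(recorder, camera, rate)  # ≈ 0.3
```

A real implementation would work on downsampled envelopes and use FFT-based correlation for speed, but the alignment principle is the same.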

Having that technology available means we can record quality audio without needing direct connections to the camera. On board, six radio mics and two PZM mics feed a multitrack recorder on the Mac Mini; our metadata-based software will trim that recording and pull out individual segments and people for editorial. These will then be synchronized in Final Cut Pro X. Off boat, up to four people will be mic’d with lapel mics feeding Zoom H1 recorders (very basic, but I want simplicity and basic functionality). Each person has a self-contained recording, again merged with the camera(s) footage of the same event.*

Equally important is the growing recognition of the value of metadata, and particularly our own thinking and developments in that field. Solar Odyssey is an opportunity to test a new approach to “easier and earlier” logging that gets transmitted to the editor without additional effort.

But that wouldn’t work without the rise of cellular data and iPads. Without them, we’d have to be using laptops on “set”, which would be nowhere near as convenient. With our planned system for handling metadata, multiple people can be logging the same, or different, events and the system will merge them appropriately. For those interested, we’re already working with a functional, but not pretty, beta.
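Our logging system isn’t public, so here is only a sketch of the merge idea under assumptions of mine (the entry fields, names, and duplicate rule are hypothetical): each logger’s entries carry time-of-day in and out points, and merging is a matter of pooling the streams, dropping exact duplicates, and sorting by time.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LogEntry:
    start: float   # seconds since midnight, on a shared time-of-day clock
    end: float
    logger: str    # which iPad the entry came from
    keyword: str   # e.g. "interview", "b-roll"

def merge_logs(*streams):
    """Pool entries from several loggers into one sorted timeline,
    dropping entries whose span and keyword duplicate an earlier one."""
    seen, merged = set(), []
    for stream in streams:
        for entry in stream:
            key = (entry.start, entry.end, entry.keyword)
            if key not in seen:
                seen.add(key)
                merged.append(entry)
    return sorted(merged, key=lambda e: (e.start, e.end))

bow = [LogEntry(10.0, 25.0, "bow iPad", "interview"),
       LogEntry(40.0, 55.0, "bow iPad", "b-roll")]
stern = [LogEntry(10.0, 25.0, "stern iPad", "interview"),
         LogEntry(5.0, 8.0, "stern iPad", "dock departure")]
timeline = merge_logs(bow, stern)  # three entries, sorted by start time
```

A production version would also have to reconcile near-duplicates whose in and out points differ by a second or two, but the pooling-and-sorting core is the same.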

Not that laptops don’t figure in the equation. The quad core i7 17” MacBook Pro I’ll be working with for the production has more power available than any tower I’ve ever used. Ditto the Mac Mini we’re using as a central server: it runs Metalan Server for sharing media from the duplicate G-Tech towers, acts as a multi-track audio recorder, and serves as the Final Cut Pro X ingest and proxy generation workstation.

With the shift to using GPU power as well, these tools are many times more powerful than my last big production setup, and we managed to create interesting stories with those much-less-powerful tools!

It’s a great time to be in production and I’ve appreciated the challenge of rethinking production.

*Like many people, I questioned the term “Event” in Final Cut Pro X. Having now gone through the exercise, for our new software system, of naming each individual, self-contained part of the shoot, I find myself coming around to Event as the most accurate term. Someone at Apple is likely laughing at me right now, but I’m OK with that.


9 comments

  • Marcus R. Moore · May 24, 2012 at 7:10 am

    Hey Philip. After a recently completed project, I’ve started impressing on my producer the cost and time benefits we could reap if we started employing some kind of digital slating.

    It seems crazy getting back a sheet of paper with good and bad takes, and that I sit here at 3am unable to be sure which b-roll shot is supposed to match the script.

    I’ve been railing for years against the duplication of effort that occurs when there isn’t a proper follow-through of information between production and post-production. It’s so wasteful.

    So I’ve started looking into a couple of these digital slating devices, most notably QR Slate. I wrote them looking for some more detailed info on how the slate data was translated into FCPX, and this is what I got back:

    “Right now, QRSlate imports the metadata as keywords. It sets a marker on the clapstick clap, and also sets a duration marker over the length of the action.

    We’re still working on the best practice for it, as it was mostly translated from our knowledge of commercial/film editorial in Avid and FCP7. So any suggests are strongly considered when updating the software. We’re also working on a customizing feature so people can import the metadata how best suits them.”

    I was wondering if you had any perspective on this? Are the APIs available in FCPX to really maximize this function right now? Have you tested any of these solutions? Or are you working on something of your own?

    • Author comment by Philip · May 25, 2012 at 7:16 am

      Sadly, right now, no slate-based tool can go to FCP X directly. They all require you to go to FCP 7, link the media, export XML and translate with 7toX.

      The limitation is in FCP X (and may be permanent?). If media is not online in the expected path when XML is imported, FCP X replaces offline clips with a gap clip that can never be linked to media.

      We’re hoping a small subset of what we’re doing with Logger (our working name) can turn into a utility that will make those apps work (we might even work with the folks that need it to incorporate it in their apps).

      • Marcus R. Moore · May 25, 2012 at 10:42 am

        That’s weird. QR Slate says they recently updated their desktop app to export to FCPX, and you can see it in the interface as a separate export option from FCPX on their website.

        I may have to bite the bullet and buy the thing to see how it really works.

        • Marcus R. Moore · May 25, 2012 at 10:43 am

          Separate option from FCP7, I mean. They clearly let you set the destination to export for either FCP7 or FCPX.

          • Author comment by Philip · May 25, 2012 at 8:54 pm

            Interesting. Unless they have you export an FCP X Event XML to merge with, it can’t possibly work. I’m not being equivocal about this; it’s fact.

          • Greg · May 26, 2012 at 10:00 am

            I haven’t used QR Slate, but here’s my current understanding of how it works (which is quite different to Movie*Slate). Info is put onto the slate and displayed to the camera. Then a QR code is generated with that info, which is also shown to the camera.

            After shooting, the media is imported into a desktop QR Slate app, which scans the media for the QR code. This is then interpreted and the metadata is embedded into the media file. (There also seem to be options for batch file renaming.)

            If the NLE can read the embedded metadata out of the media file, then the metadata will carry into it without any XML needed.

  • Author comment by Philip · May 26, 2012 at 10:37 am

    Ah, that makes sense. Although on the website FCP X gets only a mention on the home page.

    • Marcus R. Moore · May 26, 2012 at 4:28 pm

      Yeah, their original promo video has specific ingest demos for both AVID and FCP7. They added the FCPX export option to the desktop app a while later, but they haven’t updated their materials to demo it, unfortunately.

      What I can’t tell is whether it is actually able to embed the metadata onto the camera original files, or if it actually creates NEW media files.

      • Author comment by Philip · May 26, 2012 at 5:37 pm

        They should be able to embed it in any media file (well, almost any format). What interests me is what format they could possibly use that will show up in the searchable areas in FCP X. EXIF and IPTC are read by FCP X, but only into the metadata pane of the info pane, and aren’t accessible or searchable. XMP metadata would be great but is unsupported in FCP X.

        Curious. I suppose I could just ask, but where’s the fun in that! :)

