I recently wrote about AirPlay as a possible solution for external video monitoring from Final Cut Pro X. It would indeed be a cool monitoring solution, but not for critical work, as the signal is H.264-compressed and 8-bit.
With Matrox’s announcement today of their monitoring solution, which confirms categorically that there is no “broadcast quality video” output from Final Cut Pro X, I had to rethink what I thought they were thinking!
First, a little clarification. Larry Jordan asked me to summarize AirPlay. It’s still evolving from its initial incarnation as a way of distributing audio from iTunes to external speakers and audio devices, but it’s being expanded on iOS (and therefore eventually on OS X) to allow video to be pushed from your iPad (or ultimately laptop/desktop) to an AppleTV-connected display. Apple have a fuller explanation available.
So, while it’s a cool technology, it isn’t something one would use to accurately grade footage. Surely Apple couldn’t have ignored this. And they haven’t; I have.
What I have been missing is one of those major bullet points in the Supermeet sneak peek and subsequent release: ColorSync. ColorSync has a long history as a way to ensure that the color on a monitor accurately represents the color that will be printed. All devices have color profiles (essentially Look-Up Tables, or LUTs, that adjust the monitor to a ‘standard’).
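To make the profile idea concrete, here’s a minimal sketch of what a 1D calibration LUT does. This is purely illustrative (it is not ColorSync’s actual implementation, and the gamma-tweak LUT is made up for the example): each channel’s code value is looked up in a precomputed table that nudges the display toward the target standard.

```python
# Illustrative sketch of a 1D calibration LUT -- the kind of per-channel
# correction table a display profile applies. The gamma adjustment used
# to build the table here is an arbitrary example, not a real profile.

def build_lut(size=256, gamma=2.4 / 2.2):
    # Map each 8-bit input code to a corrected output code.
    return [round(((i / (size - 1)) ** gamma) * (size - 1)) for i in range(size)]

def apply_lut(pixel, lut):
    # Each channel is looked up independently: a 1D LUT can correct
    # per-channel response (gamma, black/white points) but not
    # cross-channel (hue) errors -- those need a 3D LUT or matrix.
    r, g, b = pixel
    return (lut[r], lut[g], lut[b])

lut = build_lut()
print(apply_lut((128, 128, 128), lut))  # mid-gray is pulled slightly darker
```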
How do you grade in Rec. 709 color space in Final Cut Pro X? You calibrate your display to Rec. 709 using ColorSync.
A nice explanation of how this works in Final Cut Pro X comes from Chris Kenny over at CreativeCow.net (Chris is responding to Clayton Burkhart; that’s the italicized text):
[Clayton Burkhart] “No, what makes no sense is Colorsync for video output, because Colorsync is an RGB system which an I/O system bypasses.
So if you expect to actually USE Colorsync for video, that by its very nature precludes YUV.
This does not mean of course that you cannot HAVE video output at the same time, it only means that Colorsync in reality will have little or no relation to your true YUV video image.
The reason that I have pointed this out is to show that Apple has almost exclusively concerned itself with the individual who creates WEB content which is RGB, not the video professional who creates for television, film, etc.”
Anything that can be represented in YUV can be represented in RGB. There’s no quality penalty for conversion if you do so in a sufficiently precise color space… and FCP X’s engine uses high-precision floating-point processing. You know what other app processes everything in a linear floating-point RGB color space? DaVinci Resolve. Would you like to argue that’s also not capable of accurate color output? Because my experience screening stuff I’ve graded with it in DI theaters says otherwise.
YUV, in the modern world, is best thought of as an internal implementation detail of some deliverable formats. With the precision of floating point, there’s no reason it needs to have anything in particular to do with how processing actually occurs.
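Chris’s precision argument is easy to demonstrate. The sketch below (my own illustration, not code from FCP X or Resolve) converts an R'G'B' triple to Y'CbCr using the Rec. 709 luma coefficients and back again: in floating point the round trip loses essentially nothing, which is why the RGB-vs-YUV distinction stops mattering for internal processing.

```python
# Rec. 709 luma coefficients (the third follows from the other two).
KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB

def rgb_to_ycbcr(r, g, b):
    # Standard derivation: luma is a weighted sum of the channels;
    # the chroma components are scaled color differences.
    y = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    # Exact algebraic inverse of the forward transform.
    r = y + cr * 2 * (1 - KR)
    b = y + cb * 2 * (1 - KB)
    g = (y - KR * r - KB * b) / KG
    return r, g, b

rgb = (0.25, 0.5, 0.75)
round_trip = ycbcr_to_rgb(*rgb_to_ycbcr(*rgb))
error = max(abs(a - b) for a, b in zip(rgb, round_trip))
print(error)  # on the order of machine epsilon, i.e. effectively lossless
```

Quantize to 8-bit integers between the two conversions, by contrast, and rounding error appears on every round trip; that, not YUV itself, is where the old “quality penalty” came from.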
FWIW, I consider Chris’s Nice Dissolve blog one to read on Final Cut Pro X.
So, whether we monitor remotely via an AppleTV and AirPlay, or use an attached monitor, color accuracy will be maintained by ColorSync rather than by specific monitors, which have been used because they offer a calibrated color space. That sounds familiar: similar things have been done in hardware by AJA and Blackmagic Design, whose converters take an SDI signal and convert it to DVI while adding a color LUT to display “video” color on a “computer” monitor.
I needed to think more different! Or pay attention to the big bullets in Apple’s PR!