
More on Final Cut Pro X’s monitoring solution

Even I get locked into legacy thinking from time to time!

I recently wrote about AirPlay as a possible solution for external video monitoring from Final Cut Pro X. It would indeed be a cool monitoring solution, but not for critical work, as the signal is H.264 compressed and 8-bit.

With Matrox’s announcement today of their monitoring solution, which confirms categorically that there is no “broadcast quality video” out of Final Cut Pro X, I had to rethink what I thought they were thinking!

First, a little clarification. Larry Jordan asked me to summarize AirPlay. It’s still evolving from its initial incarnation as a way of distributing audio from iTunes to external speakers and audio devices, but it’s being expanded on iOS (and therefore eventually on OS X) to allow video to be pushed from your iPad (or ultimately laptop/desktop) to an AppleTV-connected display. Apple have a fuller explanation available.

So, while it’s a cool technology, it isn’t something one would use to accurately grade footage. Surely Apple couldn’t have ignored this. And they haven’t; I have.

What I have been missing is one of those major bullet points in the Supermeet sneak peek and subsequent release: ColorSync. ColorSync has a long history as a way to ensure that the color on one monitor would accurately represent the color that would be printed. All devices have color profiles (essentially Look Up Tables, or LUTs, that adjust the monitor to a ‘standard’).

How do you grade in Rec 709 colorspace in Final Cut Pro X? You calibrate your display to Rec 709 colorspace using ColorSync.
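For the curious, “Rec 709” specifies particular red, green and blue primaries, a D65 white point, and a transfer function, and calibration means getting your display to match those targets. As a minimal sketch (in Python, with constants from the ITU-R BT.709 spec; the function name is my own), the transfer function looks like this:

    # Rec 709 (ITU-R BT.709) transfer function: maps scene-linear
    # light (0..1) to the encoded signal value.
    def rec709_oetf(linear):
        if linear < 0.018:
            return 4.5 * linear                  # linear segment near black
        return 1.099 * (linear ** 0.45) - 0.099  # power-law segment above it

    # Example: 18% grey encodes to roughly 0.41
    print(rec709_oetf(0.18))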

A nice explanation of how this works in Final Cut Pro X comes from Chris Kenny over at CreativeCow.net (Chris is responding to Clayton Burkhart initially – that’s the quoted text that follows):

[Clayton Burkhart] “No, what makes no sense is Colorsync for video output, because Colorsync is an RGB system which an I/O system bypasses.

So if you expect to actually USE Colorsync for video, that by its very nature precludes YUV.

This does not mean of course that you cannot HAVE video output at the same time, it only means that Colorsync in reality will have little or no relation to your true YUV video image.

The reason that I have pointed this out is to show that Apple has almost exclusively concerned itself with the individual who creates WEB content which is RGB, not the video professional who creates for television, film, etc.”

Anything that can be represented in YUV can be represented in RGB. There’s no quality penalty for conversion if you do so in a sufficiently precise color space… and FCP X’s engine uses high-precision floating-point processing. You know what other app processes everything in a linear floating-point RGB color space? DaVinci Resolve. Would you like to argue that’s also not capable of accurate color output? Because my experience screening stuff I’ve graded with it in DI theaters says otherwise.

YUV, in the modern world, is best thought of as an internal implementation detail of some deliverable formats. With the precision of floating point, there’s no reason it needs to have anything in particular to do with how processing actually occurs.

FWIW, I consider Chris’s Nice Dissolve blog one to read on Final Cut Pro X.
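To make Chris’s point concrete, here is a minimal Python sketch of the BT.709 RGB/YCbCr conversion in floating point (the coefficients are from BT.709; the helper names are mine). The round trip recovers the original values to within floating-point precision, which is exactly the sense in which the conversion carries no quality penalty:

    # BT.709 luma coefficients
    KR, KG, KB = 0.2126, 0.7152, 0.0722

    def rgb_to_ycbcr(r, g, b):
        y = KR * r + KG * g + KB * b
        cb = (b - y) / (2 * (1 - KB))    # scale so Cb spans -0.5..0.5
        cr = (r - y) / (2 * (1 - KR))    # likewise for Cr
        return y, cb, cr

    def ycbcr_to_rgb(y, cb, cr):
        r = y + 2 * (1 - KR) * cr
        b = y + 2 * (1 - KB) * cb
        g = (y - KR * r - KB * b) / KG
        return r, g, b

    # Round trip: error is on the order of 1e-16 in 64-bit floats
    print(ycbcr_to_rgb(*rgb_to_ycbcr(0.25, 0.5, 0.75)))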

So, whether we monitor remotely via AppleTV and AirPlay, or use an attached monitor, color accuracy will be maintained using ColorSync rather than specific broadcast monitors, which have been used because they have the advantage of a calibrated colorspace. That sounds familiar. Similar things have been done in hardware by AJA and Blackmagic Design, who have converters that take an SDI signal and convert it to DVI while adding a color LUT to display “video” color on a “computer” monitor.
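Conceptually, what those converter boxes do with a LUT is just a remapping of levels on the way to the display. Here’s a sketch, assuming a simple one-dimensional per-channel LUT (real hardware often uses 3D LUTs, and these sample values are invented purely for illustration):

    import numpy as np

    # Five-point calibration LUT for one channel: input level -> output level.
    # These numbers are made up; a real LUT comes from measuring the display.
    lut_in = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    lut_out = np.array([0.0, 0.22, 0.47, 0.74, 1.0])

    def apply_lut(channel):
        # Linear interpolation between the LUT's sample points
        return np.interp(channel, lut_in, lut_out)

    pixels = np.array([0.1, 0.4, 0.9])
    print(apply_lut(pixels))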

I needed to think more different!  Or pay attention to the big bullets in Apple’s PR!

44 replies on “More on Final Cut Pro X’s monitoring solution”

Rethinking this whole problem yesterday, I thought about something that might be the answer to the current absence of video output. I think that all third-party hardware will be abstracted by AV Foundation. Thus, FCP would never have drivers to address specific output cards; AV Foundation would take care of that. It would make sense to do it that way, since it would make the I/O card available to any application. Then FCP or any other app would only have to open a stream to AV Foundation to output to any device, or to technologies like AirPlay.
The reason I think it isn’t available right now is that maybe the AV Foundation that is bundled with FCPX right now isn’t complete. It would be some kind of subset of the whole thing. Lion probably has some pretty deep implementation of AVF that wouldn’t be possible to snap off and package with an app on Snow Leopard.
If I’m somewhat right, we should see the solution coming soon after Lion’s release.

While that’s possible, it would require the application to implement the hooks for AV Foundation to hook into. It appears they are simply not there, so no matter what AV Foundation might want to do, Final Cut Pro X isn’t going to cooperate on output. It’s theoretically plausible for Final Cut Pro X to (privately) call AirPlay for “couch” monitoring that someone else might hook into, but I don’t think there’s anything in AV Foundation that would help without the application’s full support. I think it’s pretty clear that Apple’s solution for monitoring is a computer monitor that has been calibrated with ColorSync to the profile you want to work in.

However, I agree that there’s more coming in Lion.

Well… Of course, my thoughts are just speculation, but from a software architecture standpoint it would be logical to abstract that (most hardware) layer at the OS level. Now it’s also possible that FCP doesn’t have the “output” module in its code yet. I just cannot believe they will keep it the way it is! 🙂

I’m saying that there is no architectural provision for video out. Period. FCP X is the way it was conceived, and that will continue.

A ColorSync-calibrated monitor – calibrated to Rec 709 – is Apple’s solution, I believe. The comment from Matrox, I believe, is definitive. There is no broadcast output (and there never will be) because there’s no need when you can calibrate the monitor you’re working on to whatever colorspace/ColorSync profile you want to work in. If the calibration is accurate, it’s as accurate as a calibrated external broadcast monitor.

Philip,
That’s what I was asking you at Larry’s blog.
Isn’t it possible to have broadcast quality output from FCPX, given AV Foundation and ColorSync?

Sure, you can have broadcast quality output on your computer monitor using ColorSync. There’s no need for any external monitoring, and support for external monitoring has not been designed into Final Cut Pro X. The use of AV Foundation is irrelevant. ColorSync your monitor to whatever color space you want to accurately work in. Done. Calibrate to Rec 709 and you’re working on the equivalent of a broadcast monitor (in 8 bit).

That’s a current OS X limit, not an FCP X one. Most monitors (including broadcast) aren’t more than 8 bit, and few sources are more than 8 bit.

But for some applications 8 bit is a little coarse for gradient displays.

So really, unless you need it for baseband I/O like tape-based ingest and output, there would be no reason to use an AJA/BMD/Matrox card or device with FCPX. Just use a second display right off the video card of the Mac. The fly in the ointment for me, though, is that you have to choose between your second display housing the Event Browser or the Viewer. You wouldn’t be able to dedicate two displays to the FCPX UI and then a third to playback (which would be my preference). Of course, there’s nothing stopping Apple from adding triple display support to FCPX. The new 27″ iMacs have twin Thunderbolt ports, after all…

If ColorSync will solve the accurate monitoring on external non-broadcast monitors issue, then the only issue is 8bit. Only high end compositors and motion graphics designers (film, commercial) will need 10bit monitoring. Sounds like progress.

In other words… as my dad used to say, it’s like going to Cleveland to make change for a dollar.

Phil, you said “How do you grade in Rec 709 colorspace in Final Cut Pro X? You calibrate your display to Rec 709 colorspace using ColorSync.”

I’m a user of FCPX who falls into what you would call their target audience, and I’m just educating myself on the technical ins and outs. If I were to grade and require accurate monitoring on an iMac using FCPX, how do I calibrate my iMac to Rec 709? Do I need some special hardware to do that, or is there a Rec 709 ColorSync profile I need to download, or is it built into the Displays System Preference?

Thanks to anyone who can answer this.

There isn’t a Rec 709 ColorSync profile shipping on my MacBook Pro, but profiles are independent files. Each profile has to be made for each monitor model for accuracy. This is probably a great third-party opportunity: selling ColorSync profiles for common monitors to calibrate to Rec 709. Generating the profile will require specialized tools. When you have such a file, it goes into the appropriate location (which I don’t know from memory) and then it would appear in Displays to be selected for that display. Of course, if your output is not broadcast television – web or anything not viewed on an actual TV set – then this is irrelevant and you should grade for computer colorspace.
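For what it’s worth, ColorSync profiles on OS X live, to the best of my knowledge, in a small set of standard directories, and a profile dropped into the user-level one should then show up in the Displays preference pane. A quick Python sketch that lists what’s installed:

    import glob
    import os

    # Standard ColorSync profile locations on OS X (to my knowledge)
    profile_dirs = [
        "/System/Library/ColorSync/Profiles",
        "/Library/ColorSync/Profiles",
        os.path.expanduser("~/Library/ColorSync/Profiles"),
    ]

    for d in profile_dirs:
        for path in sorted(glob.glob(os.path.join(d, "*.icc"))):
            print(path)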

Obviously the problem there is the “Wild West” of monitors available and how they come calibrated. The smart best practice would still be to calibrate to a “standard”. At least then you have something to fall back on when your client complains that the images on his monitor look “dark” or some such thing. You obviously need to have that conversation with the client – “What’s the end delivery for this job?” – but people should have been asking that question for years now.

At first [and still to some extent] I was a bit peeved at the lack of external monitor support in FCPx. I still think they need the ability to push out JUST the final image to a second [or even third] display. I do think ultimately there could be big advantages to this if ColorSync works. For the cost of a good consumer TV and my Matrox MXO2 Mini, I could buy a 30″ display that will have no problem with viewing material at any frame rate, or at sizes larger than 1080.

I think longer term, the question becomes: will there be a distinction between a computer display and a TV?

You won’t need to calibrate your monitor to Rec.709. You profile your display in its natural state and ColorSync does the negotiation between that and the video’s colour space. For more info read up on how Photoshop colour manages.

I commented on this in a different blog post, but I think it’s worth saying again that being able to set ColorSync to Rec 709 is only one step in the process. Computer monitors and computer graphics cards are designed to work with progressive images running at whole frame rates. How accurately will they deal with interlaced video running at a fractional frame rate (e.g. 1080i 29.97)? How accurately can the computer monitor be calibrated, and how long will that calibration hold (will it drift in a matter of hours, days, weeks)?

There’s a reason broadcast monitors are different from computer monitors. And even with computer monitors there are ones targeted towards consumers and ones targeted towards professionals (graphic artists, photographers, etc.).

Now, before anyone jumps on me for being an elitist or a hater, I’d just like to add that if FCP 10 can really turn my computer display into the equivalent of, say, an FSI broadcast monitor then that’s great. That would actually bring me one step closer to having FCP 10 be functional for me, and it could save me a few grand on my home studio.

I agree with you, but taking the long view, how many more years are we going to have to worry about interlaced content and delivery? Its reason for existing is mostly to support legacy equipment and delivery that’s becoming less and less relevant.

Someone w/more b’cast engineering knowledge could answer it better, but my guess is that interlaced and fractional frame rates will stick around until all content is delivered via the internet (and that brings up its own set of problems).

Legacy equipment includes things like communications satellites that are not easy to upgrade (if upgradeable at all). One of the hurdles of switching over to HD b’casts in the first place was waiting until technology could squeeze an HD signal into the same amount of bandwidth that an SD signal took up. Everything has to be made compatible w/the weakest link in the chain.

AFAIK all channels that broadcast HD do so in either 720p60 or 1080i60 with MPEG-2 compression (for which there isn’t even a 1080p60 broadcast standard). Even 24p content is carried in a 60i stream.
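For reference, the way 24p rides inside a 60i stream is 2:3 pulldown: alternating film frames are held for two and then three video fields, so 24 frames become 60 fields each second. A small Python sketch of the cadence (the function is mine, just to illustrate):

    # 2:3 pulldown: map 24p film frames onto 60i video fields.
    def pulldown_fields(num_frames):
        fields = []
        for i in range(num_frames):
            repeat = 2 if i % 2 == 0 else 3    # A gets 2 fields, B gets 3, ...
            fields += [i] * repeat
        return fields

    # Four film frames -> ten fields, i.e. five interlaced video frames
    print(pulldown_fields(4))    # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]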

When all TVs are basically HTPCs (home theater PCs) w/screens then I think standards won’t really be applicable anymore because you’ll be able to playback everything as long as you have the correct codec/player installed.

Of course, a family of five concurrently using a single connection for everything is going to consume a lot of bandwidth. I recently read that Netflix is downgrading the quality of its streaming service in Canada to help keep people from hitting their monthly bandwidth caps.

ColorSync is great – if only the color correction in FCPX didn’t suck so much.

But what of frame-sync?

Even on a very powerful GPU there is poor frame-sync when using a “second display” to show TV frame rates; certainly for 25fps or 24fps the signal stutters and jumps to keep in time. Even on video from a GTX285! It’s not really pronounced, but it’s there, and it’s REALLY annoying on all my Macs. It’s almost like the 1080i50 or 1080p25 signals are still based around the 60Hz Mac OS display signal…
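For what it’s worth, that hunch is easy to check with arithmetic: 60 / 25 = 2.4, which isn’t a whole number, so on a 60Hz refresh each 25fps frame has to be shown for an uneven mix of two and three refreshes. A quick Python sketch of the resulting cadence (my own back-of-envelope calculation, just to illustrate the stutter):

    import math

    # How many display refreshes does each source frame get when
    # material at `fps` plays on a display refreshing at `refresh_hz`?
    def refresh_cadence(fps, refresh_hz, num_frames):
        ratio = refresh_hz / fps
        return [math.floor((i + 1) * ratio) - math.floor(i * ratio)
                for i in range(num_frames)]

    print(refresh_cadence(25, 60, 10))
    # -> [2, 2, 3, 2, 3, 2, 2, 3, 2, 3]: that uneven 2/3 pattern is the judder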

Regardless. How are we supposed to create motion GFX or edit when we don’t even get a true representation of the motion, or of the colorspace for that matter? (Referring to 8-bit Mac OS X color accuracy.)

I also don’t want my mouse pointer or desktop spreading over onto my viewing monitor, or taking up GPU memory that would be best applied to the FX rendering.

If, like me, you use a Wacom Tablet, you can limit the area you use on screen to one display, but then it’s two sets of keyboard shortcuts to get back into full screen mode to watch the juddering playback in 8-bit!

You can’t even use the skim tool when in full screen mode – which means the “FCPX style” of editing has to be done with the 1080 viewer picture cropped top and bottom when at 100% in the viewer window.

Plus I’ve noticed that FCPX really lags in performance when running on 2 monitors under Snow Leopard (not running Lion yet).

I’ll stop whinging now, but Apple have really made a total boob of this one. I hope they have something far more accurate for external monitoring/second displays in Lion, or at the very least create a proper video signal output from the Mac OS via GPUs, if that is what we have to use to monitor! Maybe even a dedicated monitor control from FCPX.

I’ve been through the technical details with FSI about what makes their monitors more accurate and stable. Most of it has to do with the technology of the actual screen itself. It must physically be able to show true blacks, true colors, and be stable over time and not degrade.

But by what Philip is saying, the debate about plasma vs LCD and which creates a truer black is dead, and never made sense in the first place. All computer monitors are created equal in their ability to faithfully reproduce true colors? Then the DreamColor monitor was simply marketing hype?

Sorry Philip, but the physics that go into a real broadcast monitor prove that not just any computer monitor can do the same thing with a simple ColorSync profile installed. If that were true, why wasn’t this being done a couple of years ago?

Not even a Matrox MXO2 with a high-end LCD display can physically reproduce an image 100% equivalent to an FSI monitor fed from a Kona card. This all sounds like pretzel logic and marketing hype. It simply defies the physics of what makes a broadcast monitor different. I simply don’t see you putting forth the “science” to make this whole “ColorSync” story make logical sense.

I don’t think anyone is claiming that a calibrated computer monitor is the equivalent of an FSI, but do you know what an incredibly tiny, tiny, tiny percentage of FCP 7 systems have any monitor in that class connected? FSI counts sales in hundreds; Apple counts in millions.

If you need that color accuracy then you’ll be using a different tool than FCP X.

“How do you grade in Rec 709 colorspace in Final Cut Pro X? You calibrate your display to Rec 709 colorspace using ColorSync.” I think that’s a statement that very strongly implies you can duplicate a broadcast monitor on your computer monitor. All due respect, but you do seem to be insisting that a broadcast monitor is not needed for color-accurate work. And yes, there are more hobbyists than broadcast stations using FCP, but is Apple telling those broadcast stations they don’t care about them anymore? I’m a tad confused about how FCP X is in any way a “pro” app, and why ColorSync even matters to a hobbyist.

I think you’re making the same mistake Philip has been pointing out for weeks, that there’s nothing between “Pro” film and broadcast users, and “hobbyists” who don’t need any kind of colour accuracy. I live and work in that huge market, the majority market.

For critical color work, say in Resolve, a broadcast calibrated monitor is essential. But again that’s a tiny % of people that need to do that, and from Resolve you can monitor on the broadcast monitor.

FCP X otoh is more focused on those delivering to screens that are NOT a TV, and they need to color correct for computer screen colorspace. It’s horses for courses.

And not all computer screens are the same, so ColorSync only tells you what looks good on your specific make/model of computer screen. Again, the pro market is left behind. So, let’s call consumer app a consumer app and move on. ColorSync still makes no technical sense.

I think you’re missing the point that there are many types of professional work that are never going to go near a TV broadcaster, or even a TV set or film screen. That work is now the vast majority of the work done in production and post production. They may not be full-time editors, but they are professionals nonetheless, as I’ve attempted to explain repeatedly here. Maybe you missed reading those explanations, or is it that you’re stubbornly refusing to face that reality?

If people are coloring for web distribution only, then what is the point of using ColorSync to try and match Rec 709 in FCP 10? Why not just calibrate the monitor to sRGB, for example, and grade based off that?

Of course, doesn’t this lead to the bigger issue of the digital video world lacking the same color management standardization that exists in the digital image world?

ColorSync provides a way to match color across devices to the profile you want to use. Most production these days is not going to broadcast, and so those people should be calibrating to sRGB (via ColorSync, to maintain accuracy on the device).
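Worth noting: sRGB and Rec 709 share the same primaries and D65 white point; where they differ is the transfer function, so the same linear light encodes to different values. A small Python comparison, with constants taken from the two specs:

    def srgb_encode(linear):
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * (linear ** (1 / 2.4)) - 0.055

    def rec709_encode(linear):
        if linear < 0.018:
            return 4.5 * linear
        return 1.099 * (linear ** 0.45) - 0.099

    # Same linear light, different encoded values:
    print(srgb_encode(0.18), rec709_encode(0.18))    # ~0.46 vs ~0.41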

ColorSync is now a widely recognized and used standard, cross-platform and way beyond Apple. The print industry uses it to solve much more complex imaging problems: previewing, on an RGB monitor, the result of reflected light off white paper printed with CMYK inks. And it works, and has been working for 20 years.

My opinion is that FCP X will be the dominant tool for people editing video as part of their profession – the vast majority. It may evolve into a tool that supports the less common workflows of studio film and broadcast/episodic television, but right now it doesn’t support important parts of the workflow for those editors.

Philip said, “I think you’re missing the point that there are many types of professional work that are never going to go near a TV broadcaster, or even a TV set or film screen. That work is now the vast majority of the work done in production and post production.”

Well, okay… A pro gets paid. I’ll give you that. But there are pros and there are pros, and those of us in the latter category will likely do work in TV (or that might one day show up on someone’s TV, if you’re lucky), and not to be arrogant about it (which of course means I’m about to get arrogant about it), my stuff gets seen. That teal shirt needs to be teal.

Now, web content is growing, and it’s getting expensive, which is great. And some of it gets seen by a LOT of people, which is also great. But those of us in TV, delivering 26 to 52 episodes a year of programming at 28:30 into hundreds of thousands of households on good old fashioned TV sets… well… When you think about it like that you can understand why a few of us might be really, really ticked off.

I see where you’re coming from Philip, and I don’t disagree with anything you’ve said. I don’t think it’s stubbornness you’re witnessing here. The offering you’re talking about from Apple is great and interesting stuff, but it is of little consolation to those who make a good living in broadcast. There’s just some stuff we gotta have right now. If ColorSync really is the big ‘why’ we’re waiting for, what then?

I think by now it’s clear that those things broadcast needs are not in Final Cut Pro X, and are probably never going to be there – broadcast video out, for example, is simply not part of the design. I think it’s safe to assume that Final Cut Pro X is not focused on the 25,500 people working in film or TV as editors in the USA (as I quoted in the Who are Apple’s Customers for FCP X post) but on the other 1,974,500 of their current installed base. The reality is that no company is focused exclusively on “professional full-time editors working in film and television”. For Avid, NLE sales contribute 14% of net revenue – the rest comes from servers and major infrastructure items for TV stations. Adobe’s direct income from Premiere Pro is insignificant compared to their enormous business in high-volume document handling.

The segment of the total market for professional editing tools that works in broadcast TV and film is very small, and was never the major focus of Final Cut Pro. Remember, at V1 it supported NTSC DV only – just one workflow – and it was definitely NOT suitable for broadcast TV or studio film until it got to version 3 or 4. FCP was always a tool primarily for independents and the democratized editor/production base, which is now far more dominant than the markets that 20 years ago pretty much were production and post production. Times change.

You learn something new every day; I wasn’t aware that ColorSync was cross-platform. So, as long as people properly calibrate their monitors to sRGB, we (we as in video producers) don’t need to have embedded image profiles like with digital images?

I’m still wondering, though, where using ColorSync to load the Rec 709 color space fits in, if it won’t be a replacement for a broadcast monitor and if web-only delivery should be using sRGB.

Philip, it never ceases to amaze me (it shouldn’t, but it does!) how stupid, ignorant, arrogant, self-obsessed, half-witted, slow, behind-the-curve and just plain dumb these so-called ‘professionals’ really are. They are so wrapped up in their own little cocoons that they don’t see what’s going on around them, and they don’t see what’s approaching them. Quite simply, they are getting what they deserve. These last three weeks have been a real eye opener for me. A lot of these ‘professionals’ are way past their best. Good luck and good riddance! 🙂

I get that Apple are pushing ColorSync on a Mac monitor for semi-critical FCPX colour monitoring. What are we supposed to do to monitor interlace, though? Monitoring on a progressive monitor simply isn’t acceptable for many users editing from a mix of sources – including file-based downloads.

If you’re working with interlaced material, and delivering for interlaced playout (all UK broadcasters still require 1080/50i HD or 576/50i SD masters) you really need to see stuff like field dominance errors. Often the only way of seeing these is on a CRT or a decent broadcast panel with decent de-interlacing.

Not being able to output an HD-SDI or SDI (HDMI at a push but hardly ideal) broadcast interlaced signal whilst you’re editing is a real dealbreaker.

I’ve yet to see a broadcast on-line edit suite without a broadcast monitor (and most have decent waveform monitors / vector scopes in circuit all the time – rather than taking up FCP screen real-estate)

If you absolutely need to monitor interlace on an interlaced monitor, you’re generally out of luck, since no CRTs are being sold any more (due to lead removal requirements) and the only choices are LCDs and plasmas.

FCP X allows viewing of interlace in its Viewer, but if you have a critical need for a distinct video output then FCP X is not appropriate for your needs. Once again, the broadcast on-line suite is NOT the target market for Final Cut Pro X.

When you talk of ‘calibrating to Rec.709’, I think you actually mean softproofing with Rec.709.

Some clarification:

I’ve been interested in how one might use a computer monitor to do critical color work for some time now and posted some stuff on Creative Cow.

Here’s the deal:

For “true” interlaced monitoring, you can’t do this.

However, for progressive monitoring of any frame rate, you generally can get a “smooth enough” playback to do color work on as long as your media drives and throughput are fast enough to support the format you are working on.

That said, you run into these limitations and needs:

1) OS X only supports 8-bit monitoring right now, and even at the graphics card level there are very few graphics cards that support 10-bit or greater output.

2) You will have to use a probe/colorimeter and specialized software to calibrate your particular monitor, which will generate a specific ColorSync profile for your monitor on your system; this is what graphic designers use to accurately calibrate their monitors for print. And most likely you will work in Rec 709 for broadcast TV.

You can’t use a “generic” ColorSync profile or the default one that OS X uses when you connect a monitor (or the one it uses as a default for your iMac / MacBook (Pro)), because these are not profiled to the correct color space for viewing.

At this point I could not find ANY documentation on how FCP X uses ColorSync. So it is possible that the app itself adjusts the color to compensate for whatever ColorSync profile is loaded in OS X, so as to display the video image in Rec 709 or the appropriate video space for your source material, but I doubt this is the case. Typically, you will have to create a Rec 709 profile for your gear yourself for any application’s signal to show up properly on your monitor.

Finally, a word on Color accuracy and such:

A broadcast monitor will ALWAYS render a better quality image for color work. This is the difference between a Toyota (a good computer monitor) and a Ferrari (a good broadcast monitor). It’s simply impossible to get the same performance from $500 in hardware as from $5,000 in hardware.

But if your ColorSync profile is calibrated appropriately for your hardware and in the correct color space (Rec 709), you will be viewing colors that are accurate “within reason” of what they should look like.
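“Within reason” can even be quantified: calibration software typically reports its error as a delta E between the color you asked the display to show and the color the probe measured. A minimal Python sketch of the simple CIE76 form (modern tools usually use the more elaborate CIEDE2000 formula; the sample numbers here are invented):

    import math

    # CIE76 delta E: Euclidean distance in CIELAB space. Below ~1 is
    # generally considered imperceptible; 2-3 is barely noticeable.
    def delta_e_cie76(lab1, lab2):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

    target = (50.0, 10.0, -10.0)     # L*, a*, b* we asked the display to show
    measured = (50.5, 10.8, -9.1)    # what the probe actually read (invented)
    print(delta_e_cie76(target, measured))    # ~1.3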

I think it is possible to grade a TV show or even a film this way, but I would not hesitate to check my output on a higher-end system as well.

Finally, proper color correction requires not only the right monitor spec, but also the appropriate viewing conditions: lighting, neutral grey surroundings, etc. It’s for these and other technical reasons that I wouldn’t even bother calibrating my monitor to sRGB for the web. I mean, no one “out there” is going to be watching your shows in anything close to an ideal situation. So why try to enforce a standard that is never going to hold on the delivery end?

That’s my 2 cents…

I have an honest question. I keep reading all the time that people are obsessed with having a properly calibrated broadcast monitor for their edit station because using anything else is so inaccurate it borders on crap. My question is, especially in regards to content for broadcast: do people realize what 95% of everyone is watching this content on? TVs so poorly calibrated that hardly any discernible trace of the original color presentation remains. Almost every home I have ever walked into has a big screen TV locked into Vivid mode, set to “Very Cool” coloring, with frame interpolation active no less, because “the colors are awesome and everything is so smooth!” I know there are a small minority of home theater buffs in the world with properly calibrated TVs, but for nearly your entire audience, this couldn’t be further from the truth. In fact, most computer monitors I encounter have much more accurate color calibration, as people don’t generally have a horrid Vivid mode to activate, and frame interpolation generally isn’t an option.

I have friends who do independent FX and coloring work for TNT shows who only ever use two 30″ Apple Cinema Displays. Granted, they went through the effort of calibrating them well, but if it looks good on their displays they are generally very satisfied. In my earlier points I am not trying to say quality is irrelevant, but rather that a decently calibrated computer monitor, with attention to detail when grading, will yield results so close that 99% of those watching the content will never notice the difference.

The one point that gets made that I have respect for is with regard to interlaced footage and how it ultimately ends up, but for the large amount of content now captured in fully progressive modes, why is the concept of a $5000 broadcast monitor still regarded as so damn necessary? I just can’t wrap my mind around it, given that nearly everyone watching the content absolutely destroys their display settings anyway.

The primary reason for calibrated monitors, i.e. calibrated to an objective standard, is so that what people watch is *consistent*. If they like vivid, then every show will be equally vivid. If we didn’t have these standards, one show would be vivid, another green, another a little red, and so on. By correcting to an objective standard we have consistency.

Philip,

Having no idea how ColorSync is implemented in FCPX, or any insight from Apple on where it’s headed, how do you come up with such wild conclusions? You don’t even know if they will be using a 3x3x3 matrix or a 64x64x64 3D LUT. Currently Apple says they support ColorSync… in what capacity? None of us knows. I looked through the FCPX help section and found nothing. I emailed Larry Jordan and he knows nothing. If you can enlighten me with facts, please do. From the people that have tried ColorSync in FCPX so far, all I’ve seen are bad reports.
