Last September, in my What should Apple do with Final Cut Pro article, one of the bullet points was:
Better media and metadata management.
Right in there alongside 64 bit, use of all processors and use of the GPU. I immediately qualified myself:
Ok, there probably aren’t that many people clamoring for better metadata management, but it’s a significant part of better media management, and crucially important for the future of automation in post production.
Then toward the end of that article, under the heading of what I thought Apple should do, other than what everyone expected, I said:
More metadata automation. Well, part of me hopes they won’t because that’s my field, but it would be nice to see source metadata being used to auto-populate Titles or Master Templates (like iMovie for iPhone does).
Truthfully, I was indulging in some wishful thinking. I still don’t think we’ll get auto-populating Titles or Master Templates – at least not with Final Cut Pro X v1 – but I am very pleasantly surprised at how far Apple have “come around to my way of thinking”.
OK, let’s call it parallel development then, as I’m fairly sure that Apple had their metadata-centric rewrite well under way by the time I was writing, but it is gratifying to have one’s position validated. For a company that didn’t really show much sign of “getting” metadata with Final Cut Pro 1-7, they have certainly embraced it for Final Cut Pro X.
Final Cut Pro X is entirely metadata based. You find clips initially via their date/format/source metadata, and each of those metadata categories gets its own Smart Collection. From there, each time a keyword is applied to a clip, a new Keyword Collection is created. Add the keyword to another clip (keywords autocomplete, even for multiple keywords at once, just like our prEdit does) and it goes into the same “bin”, whether the keyword is applied to the full clip or to a range. Or drag the clip or range to a Keyword Collection and have the keyword applied.
Media organization/location, Bins and Subclips are all metadata functions in Final Cut Pro X. There’s no need to assign a Bin for imported footage (a.k.a. Log and Capture/Transfer target bin) because all media finds its way into Keyword Collections and Smart Collections based on metadata.
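For the code-minded, here’s a rough sketch of that idea in Python – purely illustrative and my own invention, not Apple’s actual data model. A keyword applied to a clip, or to a range within it, is just metadata attached to the clip, and a “collection” is nothing more than a query over that metadata, so the second tagged clip automatically turns up alongside the first.

```python
# A minimal sketch of keyword-range metadata driving clip organization.
# Names and structure are illustrative only, not FCP X internals.

class Clip:
    def __init__(self, name):
        self.name = name
        self.keyword_ranges = {}  # keyword -> list of (start, duration) in seconds

    def add_keyword(self, keyword, start, duration):
        """Applying a keyword to a range (or the whole clip) is just metadata."""
        self.keyword_ranges.setdefault(keyword, []).append((start, duration))


class Library:
    def __init__(self, clips=None):
        self.clips = clips or []

    def collection(self, keyword):
        """A keyword 'collection' is never a stored bin; it's a query
        computed from the keyword metadata whenever it's needed."""
        return [c for c in self.clips if keyword in c.keyword_ranges]


# Tag two clips and both turn up in the same derived collection.
interview = Clip("Interview A")
interview.add_keyword("CEO", start=12.0, duration=30.0)
broll = Clip("B-roll 014")
broll.add_keyword("CEO", start=0.0, duration=8.5)

library = Library([interview, broll])
print([c.name for c in library.collection("CEO")])  # ['Interview A', 'B-roll 014']
```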
I also suspect that audio routing replicating the functionality of fixed tracks will use metadata, so that similarly tagged clips are always output to the same channels: dialog, music, effects, etc.
To be more precise, we’re talking about two different types of metadata. I’ve written extensively about the Six Types of Metadata we use in postproduction: initially identifying four – Source, Added, Derived and Inferred – and later adding Analytical and Transform. You can find out more in my Mastering Metadata for Postproduction webinar, which I did for Larry Jordan and Associates back in March.
In Final Cut Pro X, Apple are using Source, Added and Derived Metadata. They also use Source and Derived metadata for automation, beyond simply creating Bins automatically (as Smart Collections).
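For reference, here’s the taxonomy laid out as a trivial bit of Python – the labels are mine and purely illustrative, nothing from Apple’s own APIs:

```python
from enum import Enum, auto

# The Six Types of Metadata, encoded only so the taxonomy is easy to refer to.
class MetadataType(Enum):
    SOURCE = auto()      # comes with the file from the camera: format, frame rate, date shot
    ADDED = auto()       # entered by a person: keywords, log notes
    DERIVED = auto()     # calculated using a non-human external source: GPS, faces, transcription
    INFERRED = auto()    # one of the original four types
    ANALYTICAL = auto()  # later addition to the taxonomy
    TRANSFORM = auto()   # later addition to the taxonomy
```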
The information that comes with the file from the camera is clearly Source Metadata: camera, format, frame rate, codec, date shot, transfer date and other information, all there with no additional work. Interestingly, there appears to be more camera metadata available than there are columns to display it in Final Cut Pro X. In the NAB sneak peek we caught a brief look at the optional columns for the Browser, and there were no columns for some metadata I know is available from the camera. Maybe our Metadata Explorer isn’t dead after all?
Obviously keywords are Added Metadata: they have to be added by someone. Keywords, and especially Range-based Keywords, provide much more flexible media sorting than Clips and Subclips ever could.
More interesting is that Apple have gone beyond those two types of metadata – something I’d expect any modern NLE to embrace one way or another. Apple have already embraced Derived Metadata. My definition:
Derived Metadata is calculated using a non-human external information source and includes location from GPS, facial recognition, or automatic transcription.
In Final Cut Pro X Apple derive:
- Facial detection;
- Shot detection (plausibly derived from the size of any faces, and amount of sky area in the picture, among other Source criteria);
- Stabilization and rolling shutter correction based on detecting and removing the artifacts (saved purely as metadata);
- Automatic neutral white balance, based essentially on the auto-white balance technologies that are available (saved purely as metadata);
- Audio cleanup based on the detection of common source problems and applying a correction.
These functions, being metadata driven, are optional: turn them off, or have them performed only when you need them (though I would let them run in the background during ingest and just turn off the automation when it doesn’t work in my favor).
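To make “saved purely as metadata” concrete, here’s another hypothetical Python sketch – the field names are invented for illustration, not anything out of FCP X itself. The analysis results live alongside the clip, and switching a correction on or off is a metadata change, never a re-render of the source media.

```python
# Hypothetical illustration: derived analysis stored beside the clip, never baked in.

class DerivedAnalysis:
    def __init__(self):
        self.face_count = None               # from face detection
        self.stabilization_transform = None  # e.g. per-frame correction data, stored not rendered
        self.white_balance_gain = None       # e.g. (r, g, b) gain multipliers
        self.audio_hum_detected = False      # a flagged common source problem


class AnalyzedClip:
    def __init__(self, media_path):
        self.media_path = media_path       # the source media itself is never modified
        self.analysis = DerivedAnalysis()  # filled in by background analysis during ingest
        self.apply_stabilization = False   # toggling these is a metadata change,
        self.apply_white_balance = False   # not a re-render of the source


clip = AnalyzedClip("/Volumes/Media/clip001.mov")
clip.analysis.white_balance_gain = (1.04, 1.00, 0.93)  # stored, not baked into the pixels
clip.apply_white_balance = True                        # turn the correction on only when it helps
```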
Metadata-driven automation is something that highly established editors well into their careers tend not to want, while those in short-term, faster-turnaround jobs tend to appreciate it more. When we demonstrate our First Cuts software, the response from editors with traditional experience is generally somewhat negative, but the response from those earlier in their careers, or working to tight turnarounds, is more “Thanks for providing these great tools”.
You see, this metadata stuff isn’t just theory to me. I live in it every day. Every one of our Assisted Editing applications manipulates or uses metadata to make editing easier: to take the boring out of post.
We believe in metadata. I believe that the automations Apple have already built into Final Cut Pro X are just the beginning. The focus on metadata is good for us, as one of the problems we’ve faced in the past has been having our metadata needs force-fitted into Final Cut Pro’s earlier, inflexible metadata structures.
Now, with metadata embraced so comprehensively in Final Cut Pro X, perhaps it will make it easier for people to understand our goal of taking the boring out of post by automating as much as possible using the metadata provided or derived/inferred.
Comments
23 responses to “Why did Apple base Final Cut Pro X on Metadata?”
This is one of the things I’ve been waiting to see in FCP for a long time as well. There is only so much you can do with file name and folder level organization.
It’s not so much that editors don’t want or care about metadata, but let’s be honest: what good is metadata other than for locating desired media? At the end of an editing day, I don’t care about metadata as much as I do my output quality, an area where metadata cannot help. I do value metadata when I am trying to find a particular piece of footage, but other than location assistance, what good is metadata really? Perhaps this is what should be preached about rather than the chest thumping that you were right. Just an opinion.
What good is metadata, really? Didn’t you just read the article that answers that question? I answer a hell of a lot of the “what good is metadata” question in this article and in the linked articles. You can also still pick up the 2010 Supermag http://www.supermeet.com/supermag/ with my “Mundane and Magic Future of Metadata” – that’s still free.
Support for metadata was one of the first things that leaped out at me when watching the Apple demo.
The fact that a lot of editors don’t seem to get the power and potential of metadata isn’t particularly surprising. If it weren’t for the fact that I am a photographer in my free time, I probably wouldn’t fully get it either. When you have 10,000 photos to track, metadata becomes critical. And once you get used to metadata, you extrapolate back to how it can speed up just about any media workflow.
And video regions as metadata? Hallelujah! Goodbye sub-clips and good riddance.
For my line of work (movie trailers), breaking down the source material into every possible little detail is a very tedious but absolutely essential part of every project.
Looks like Apple had us trailer editors in mind when they came up with their smart bins and heavy use of metadata. I couldn’t be more excited.
Now, let’s hope they also thought of a way in FCP X for more than one editor to collaborate on a project. The fact that there doesn’t seem to be a traditional “project file” any more scares me. I know, Philip keeps talking about FCP X being a version 1…
Only, this is not the first time Apple’s released a professional NLE. Expectations are quite a bit higher this time around…
Actually, this is the first time Apple has designed and released a professional NLE. Final Cut Pro as we know it was designed at Macromedia as a cross-platform application before Apple bought it. The current FCP is not a great OS X app, having started on OS 9.
I’m sure sharing projects is something Apple have considered, but right now we don’t know how. Project migration or sharing is likely to be a V1 feature because, to do it right, it would have to be core to the application. What I don’t think we’ll have is open access directly to the Core Data “persistent data store” (a.k.a. database). That could lead to quick and dramatic project corruption from just one bad update command.
> … what good is metadata other than trying to locate desired media? At the end of an editing day, I don’t care about metadata as much as I do my output quality, an area that metadata cannot help.
Have to disagree, Patrick. Indeed, as Philip notes above using the demonstrated functions of FCP X as an example, “derived metadata” can be used for automatic shot stabilization, white balance and audio clean-up, etc. These are all processes which have the potential to directly and positively affect the quality of one’s output. Combine this with appropriate leverage of additional source and derived metadata during the edit itself, and you have the possibility of reaching the “end of an editing day” (or at least the first pass) much more quickly, thereby allowing you more time to spend on the online/finishing process. Again, this clearly affords the potential for a direct and positive effect on the quality of one’s output.
Just my 2c
Andy
As a Final Cut Server consultant, integrator and administrator, I find larger, higher-end production facilities become so entrenched in metadata after moving to centralized shared storage, it’s not funny. After learning FCSvr at the first Apple T3 for it, I began to see the light, and large projects in my own studio stopped being too difficult to manage. Metadata is everything, hands down. Great insights, Philip. I wish more of my FCP students would “get it” when I talk about media management and metadata. They usually email me six months later with huge projects (usually documentaries), frazzled because they can’t track or find anything easily. “Media Management Is Metadata Management”, remember that from class? Hello? Thanks for writing about this topic more, Philip!
Philip…great post and you deserve credit for being spot on about metadata which now seems simply so obvious.
While I’m very eager for FCP X, I am concerned about losing the bin/folder structure. I do not think metadata replaces that…on the contrary, I believe it enhances it to the Nth degree. Just like in editing it’s always helpful to have a wide shot, then a medium, then a close-up; the same goes for file structure in projects. It’s good to have that quick 30,000 ft view. I hope that’s not thrown out entirely. Or perhaps I’m simply missing something that is going to seriously transform the way I work…all for it if so.
Lastly, I don’t know if you follow Steve Rubel, but you should. I think you’d like him. And I bring him up specifically as I’d like to offer “two cents” about your new website. Not crazy about it, to be honest…particularly the way stories seem to be presented arbitrarily, or based on some page-view algorithm.
For someone who follows your blog often it’s not fun having to scroll through your page in case you posted something new that for some reason or another didn’t make it to the top of the page.
Linear thinking, I know…but when your new post comes out I simply want to see it first thing, as I’ve already read or scanned the previous ones. Food for thought, and one person’s viewpoint to add to others if you have them.
Anyway, back to Steve. He switched to Tumblr and describes it here…it’s fantastic and I think would serve your blogging needs very well from the little I know of how you operate…
http://www.steverubel.me/post/6070334427/why-i-adopted-a-scorched-earth-policy-dismantled-two
My essay pieces are “stuck” to the top of the home page until I write another essay piece. Reference pieces go in normal historic series. It’s a normal function of a WordPress blog btw. Anyway, opinion noted. Thanks for the reference to Steve Rubel. Not that I’m taken with the idea of Tumblr.
Hi Christopher,
Quick usability hint for this blog theme: click on “Toggle posts” in the top right and you’ll get a list of front page headlines. Any story can have its heading clicked on to open into a new page, or click the up-and-down arrow button next to the headline to reveal just that item.
Short posts will have the full text on the home page, whereas longer essay posts will have an excerpt followed by a (More…) link to view the full content.
Cheers,
Greg
Not quite metadata – but I haven’t seen (or I just missed) any talk of FCP X being more aspect-ratio and frame-size agnostic – the way Motion and other pixel manipulation programs are.
We do a lot of digital signage work with non standard screen display formats:
1. HD vertical (HD monitors placed vertically, so 1920 is the vertical axis)
2. Double HD – like video walls, only without scaling (3840w by 1080h typical)
Final Cut CAN technically handle it but was never envisioned for it – even with our Frankenrig Mac towers and RAIDs, the render times for multi-minute timelines run to hours for our “Double HD” in-store displays.
Any thoughts on FCP X treating “non standard” screen resolutions as normal?
Thanks for all your diligence.
No idea on “non standard” resolutions but my reading of AV Foundation is that it shouldn’t have size dependencies. So, in theory, FCP X will be truly resolution independent, but whether that’s the case or not, we’ll discover later this month.
….taking the boring out of post by automating as much as possible
Good and bad.
As you say it is the less experienced younger editors that tend to embrace fully the metadata and associated automations.
The trouble is that to be fully aware of how and why things are as they are, in terms of what works and how you get there, you must learn about all the boring things.
Walk then run.
I’ve heard that argument over and over again since the original NLEs, when people advocated that you couldn’t learn an NLE until you’d learnt linear editing, so you got a good grounding in timecode, color framing, horizontal phase and all those technologies we no longer have to worry about (OK, TC we still worry about). Automation takes over everywhere. Sure, some will want to do it the manual way for control, but they will be the minority. A very small minority.
You talk about source date here and on page 10 of your book, but I have tried to get that info from a DV tape file that I imported into FCP X and can’t find it. I’ve tried the extensive lists and they come up blank. I know the data is there because CatDV sees it. How do you see it?
Thanks
It was my friend Carey Dissmore that discovered the date from DV footage he captured. I have to do some DV Capture soon, so I’ll test it out and let you know.
I’ve just been capturing some DV footage from LAFCPUG 2006 and the file name(s) (after stop/start detection) are the date and time of the original file. This is capturing directly into FCP X.
Importing tape from a camera works fine for creation date, for both DV and HDV. New clips are created for each sequence on the tape.
My problem is with files that I originally captured in the “pre-X” FCP. The DV file has the info in it, because CatDV can see it. For HDV, I assume it is hiding there as well, but I don’t have a program that looks into HDV metadata.
Have you tried a custom metadata view in the Info pane/Info tab? Any metadata recovered and stored should be displayed there. If the original capture didn’t capture it (and FCP 5.1.2 onward did), then it’s not available.
No idea about HDV – not even sure if it has date metadata.
It doesn’t show up anywhere in the full metadata custom view on previously captured files. It works just fine with newly imported footage from the camera for both DV and HDV. It is odd Apple is ignoring the actual creation date in the DV file and using the computer file creation date.
I think this is one more example of FCP X being written without any regard for past FCP use. You can’t import previous projects and now you can’t get all the data on previously captured tape. It can only be used fully if you start from scratch.
Did some more checking, and it seems only media captured with Final Cut Pro X has the capture date in the metadata (as well as in the file name and clip name). I checked really old DV footage from FCP 1.2.5 and even older PAL footage. What I haven’t got an example to check is a DV file captured with Final Cut Pro 5.1.2 or later. That’s when they started embedding source metadata into the QT metadata in the media file. It never got fully implemented (i.e. it never made it into the interface as a feature), but the interest in metadata goes back a few versions.
The date is in the file, but you have to create a custom metadata set (or add that criterion to a current metadata preset). This is in the Info pane/Info tab area: bottom left is a gear wheel, and Edit Metadata Set would be your selection. It’s Creation Date that you want to add.
I tested that with current captures and it does show up there. I haven’t looked at it on old files but I will when I’m back home.