During a recent thread here where I “infamously” suggested Apple should drop Log and Capture for the next version of FCP, one of the topics that came up was the use of metadata. Most commenters (all?) seemed, by my reading, to feel that reel name and TC were the “essence” of metadata.
And yet, if we look at the most recent work of Apple’s Chief Video Architect (apparently for both pro and consumer applications), Randy Ubillos, we see that Location metadata is a requirement for the application. According to Apple’s FAQ, if you don’t allow iMovie for iPhone to access your location metadata:
Because photos and videos recorded on iPhone 4 include location information, you must tap OK to enable iMovie to access photos and videos in the Media Library.
If you do not allow iMovie to use your location data, then the app is unable to access photos and videos in the Media Browser.
You can still record media directly from the camera to the timeline but, without the Location metadata, you’re pretty much locked out of iMovie for iPhone for all practical purposes.
There is no location metadata from tape capture! There’s not much from non-tape media right now either, although some high-end Panasonic cameras have an optional GPS board. However, P2 media (both DVCPRO HD and AVC-I) as well as AVCCAM all have metadata slots for latitude and longitude.
Now, I’m NOT saying that Apple should force people to use metadata – particularly if it’s non-existent – and this type of restriction in a Pro app would be unconscionable. I merely point out that this shows the type of thinking within Apple. In iMovie for iPhone they can create a better user (consumer) experience because they use Location metadata for automatic lower-third locations in the themes.
Where I think it’s a little relevant is as a counterpoint to some of my commenters: building an app that’s reliant on metadata is a different app from one relying on simple reel name and TC numbers.
10 replies on “How serious is Apple about metadata?”
I think you’ve got it around the wrong way though — if iMovie is not given permission to access location data (a security control within the iPhone) then it can’t use the media that holds that information, because the iPhone’s security model doesn’t permit it to. At least that’s my interpretation.
And I genuinely understand that there’s more to metadata than timecode and reel (which I imagine iMovie on the iPhone couldn’t care less about) – my point in “the other” thread was simply that retaining support for those pieces of metadata doesn’t inhibit support for more complex metadata.
Maybe Dylan, maybe. OTOH, iMovie for iPhone *uses* the location metadata in every theme. The theme would be broken without it, so unless you allow access to the Location (as a security measure or whatever) then you can’t use the media. Period.
No second rate experiences because location is lacking! Maybe as I post more examples you’ll see why you can have a tape & TC focus, or a real metadata focus. I frankly don’t consider reel name and TC to be truly useful metadata *in a modern workflow*. In the older workflows yes, they were essential. In a modern workflow….
Reel name and Timecode contribute nothing to First Cuts, Finisher, or our next piece of software. All work with metadata to take dull work off the editor. I guess over time you’ll “get it”, and when you truly understand the metadata-based future then I think you’ll understand why it’s hard to accommodate the past (and indeed actively spend money to reproduce it).
What did you think of my article in the Supermeet magazine? (That’ll be a requirement for my next response to you. 🙂 )
Looking at the API docs, the application does have to be authorised for ‘Location Services’ before it can read any data. And since core functions utilise that data, then yup, it has to be able to read it or it won’t work, I guess.
I’ll look for the Supermeet mag article soon.
Hey Philip, I’ll read your Supermeet Mag article when I can get a copy. At present I’m not comfortable signing up with no way to opt out of sponsor contact.
As for your other comments there – I don’t think the case for FCP is an all-or-nothing one… It can support TC/Reel data within a larger metadata structure. Clips without additional metadata don’t benefit from that, but support for “older workflows” (which are still very real in many professional post environments) can be retained.
In the case of iMovie here – I don’t have an iPhone (4 or otherwise) but from what I’ve found in the SDK documentation it’s not possible to read location metadata from files without the app having access to “Location Services” – so if the app is to support any location-based functions then it has to have that data. I guess that’s a decision the developers have chosen to take, rather than create alternate use cases for situations where that information is not available to the application.
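The gating behaviour described above can be sketched as a simple decision function. To be clear, the names here are hypothetical stand-ins for illustration, not Apple’s actual Location Services API:

```python
# Illustrative model of the permission gate described above. These names
# are hypothetical stand-ins, not Apple's actual Location Services API.
AUTHORIZED = "authorized"
DENIED = "denied"

def can_access_media_library(location_auth):
    # iMovie-style policy: assets in the Media Library carry location
    # metadata, so reading them at all requires location authorization.
    # There is no fallback path for unauthorised access.
    return location_auth == AUTHORIZED
```

The point is that the developers chose a single code path (authorised or nothing) rather than building an alternate, degraded experience for clips without location data.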
I already do “get” what metadata can offer, and I’ll continue to keep an eye on what options are available. But I see no benefit in excluding other situations when it’s not necessary. If my clips only have reel and TC then I will not be able to benefit from workflow enhancements that rely on other data, but those can still exist alongside the ability to support clips with less data.
I guess that’s where we differ. While the next FCP might include L&C, if it’s core to FCP it will be a compromise on a truly metadata-based app for the future. Of course, my proposal was that it always be supported, but in a 3rd-party app.
And I think you make my point: the developers (i.e. Randy Ubillos, developer of Premiere and FCP, iMovie ’09 and iMovie for iPhone, and lead on Aperture 1) preferred to make the app useless without location data rather than support an “alternative workflow”. Why would he support “alternative workflows” in the pro app under his guidance as Chief Video Architect?
Eventually your source will have metadata. I’d rather Apple plan for the next 20 years than support the last twenty! Ironically, while everyone was telling me how “not dead” tape was, over on the IMUG list two people in two days said they were pulling their last BetaSP deck (their only tape machine) out of the rack and selling it because it hadn’t been used in a year, and another said they’d only delivered digitally in the last year.
As for supporting both… I see the data points, I comment.
So, what do you think of our First Cuts application then?
Well, an iPhone editing application is very different in purpose and scope from FCP. It’s in a closed environment to start with, with basically a single purpose. All iMovie for the iPhone needs to be able to do is edit video recorded on the iPhone. It can be very limited in function and make strong usage assumptions based on its limited scope. FCP can’t… Or at least I don’t think it should.
But what I still don’t understand is how retaining native support for tape within the application will limit its ability to work with metadata-rich media. Surely if tape I/O support is made available in a third-party tool then there’d still be media being used within the application that only has minimal metadata (Reel and TC, basically).
Some development would be required to provide the tools to work with that media, but all of the (minimal) metadata required for a tape-based workflow can be represented within a more metadata-aware structure.
As for First Cuts, I’ve never had the opportunity to use it – I work day-to-day in Avid.
Without metadata the tape-source content would perform badly in a properly designed metadata-centric app.
Ah, you work in Avid, that explains your attitude to any real use of modern metadata 🙂 Media Composer is a great app and, until recently, so mired in the past that it was failing. They’ve turned that around with AMA but had to build a whole new architecture into the app to bypass tape and TC.
But Media Composer 5, for all its strengths (and there are many), is a very traditional application for traditional workers.
Not sure why you even care about FCP. FWIW I expect the next release of FCP to be future-focused rather than past-focused, and that may well mean letting go of great support for tape, to some degree.
First Cuts, btw, is the first innovation in editing since Media Composer 1, 20 years ago. The computer does part of the “creative” process. There’ll be more of it soon. Like, real soon.
Yes of course, tape-originated footage misses out on the benefits of enhanced metadata, unless it can be added somehow, but it can co-exist in the same system… And while FCP is still widely used in professional situations where tape is still an important consideration it would seem foolish to drop it.
Avid has been very tape-centric, no doubt, because that’s what its core users have needed until recently. They have responded well with AMA (and with powerful P2 and XDCAM support before that). But I’d argue that MC has been a better supporter of non-tape metadata than FCP the entire time, mainly because of its broad support for film workflows. While timecode and reel have always been important, Media Composer has also been tracking other associated metadata (keycode, aux timecodes, etc.) to facilitate film work. FCP has never been quite as flexible in that respect.
My views on tape support are because I work with tape every day. While some jobs are more file-based than others, at some point almost everything I do involves tapes, input or output. Every single master I make ends up on a tape (Digibeta, HDCAM, SR, whatever). Every network I’ve ever dealt with requires tape for programme delivery. At this stage, in New Zealand, the only thing I could finish and deliver on file is a commercial. But even then I’d usually retain a tape copy, because tapes are pretty easy to store reliably (files, for all their benefits, are still pretty easy to lose without dedicated backup processes, which not many people really have).
Do I care what happens with FCP? Not really. I don’t use it often. But I still think that support for tape is relevant to its use.
As for First Cuts – it does seem very interesting. I remember reading about it when it came out. I’d hoped something similar might be possible with the use of ScriptSync… I’d love to be able to take a transcript that had been married to footage with ScriptSync, reorganise the transcript contents (as index cards perhaps) and use that to then generate a rough cut. A great quick way to build the desired story narrative.
I understand the potential of metadata in our work, but I’m also very pragmatic and I understand that the way we work seldom moves as quickly as technology allows. So yeah, FCP should support these advances as much as possible, but I don’t see it necessary to close the door on other workflows at the same time. It’s a hill, not a cliff.
Something to “take a transcript that had been married to footage with ScriptSync, reorganise the transcript contents (as index cards perhaps) and use that to then generate a rough cut. A great quick way to build the desired story narrative”? Well, it won’t be with ScriptSync, but stand by. Real soon. But sadly (for you) it will be FCP and PPro only, as Avid is very closed to developers, with a binary project format and no XML representation. There’s always Automatic Duck though.
Also (I just can’t stop!) – as you noted in the SuperMag article (see, I read it), the current metadata from cameras isn’t a whole lot more useful than tape and TC anyway. Lens data, date and time, white balance – none of that is all that helpful in the editing process. It’s largely going to be “Derived” and “Added” metadata that will be most beneficial, and those things can be applied to tape-sourced media too.
As long as “reel” and “timecode” are simply a couple of metadata keys they aren’t going to inhibit more-complex utilisation. And software from third parties could, in fact, add meaningful metadata to that footage.
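To illustrate that point, here is a minimal sketch of clips as plain key–value metadata (my own hypothetical model, not FCP’s actual schema): tape-sourced clips carry only reel/timecode keys, file-based clips can carry more, and richer tools simply check for the keys they need and degrade gracefully when they’re absent.

```python
# Hypothetical unified clip-metadata model, not FCP's actual schema.
# A tape-captured clip: only the minimal keys exist.
tape_clip = {"reel": "042", "tc_in": "01:00:10:00"}

# A file-based clip: same minimal keys, plus richer camera metadata
# (e.g. the latitude/longitude slots that P2 and AVCCAM media provide).
p2_clip = {
    "reel": "CARD_0007",
    "tc_in": "14:22:05:12",
    "latitude": "-36.8485",
    "longitude": "174.7633",
}

def location_label(clip):
    """A location-aware feature: returns a label if the clip has GPS
    metadata, or None so the clip is still usable without it."""
    if "latitude" in clip and "longitude" in clip:
        return f'{clip["latitude"]}, {clip["longitude"]}'
    return None
```

Under this kind of model, “reel” and “timecode” are just two keys among many; supporting them costs the metadata-rich features nothing, and third-party tools could add further keys to tape-sourced clips later.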
If you find yourself in Auckland I’d be happy to buy you a beer, and maybe I can join your Metadata Think Tank 🙂