Category: Metadata
I first wrote about derived metadata back at the end of January 2009. Derived metadata uses computer analysis to derive metadata from the video source. There are now technologies for speech-to-text, meaning extraction, facial detection, facial recognition, emotion detection, image recognition, and more. One company has been accumulating these somewhat diverse technologies: Apple.
One of the Final Cut Pro X features that really resonates with me is Keyword Ranges, and by extension, Keyword Collections. I realize now that this enchantment is because Keyword Ranges are a very pure embodiment of Content Metadata. I also realize that I’d been simulating this approach in other software for as long as I can remember. To understand better, we’ll need to take a little trip to the past.
At the current stage of technology development, we are largely limited to adding Content Metadata manually. If we want people, the scene, or the action described, we need to add Keywords or Notes to achieve that. I don’t expect that to be the case in the future. Technologies from Clarifai and Google give us clues to the future.
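To make that concrete, here is a minimal sketch of deriving candidate Keywords from a single frame grab using Google’s Cloud Vision REST API. The endpoint and field names come from Google’s public documentation, but the file name, API key, and score threshold are illustrative assumptions, and writing the labels back as clip Keywords is left to your NLE’s metadata tools.

```python
import base64
import requests

VISION_URL = "https://vision.googleapis.com/v1/images:annotate"

def derive_keywords(frame_path, api_key, min_score=0.75):
    """Send one frame grab to the Vision API and return likely keywords."""
    with open(frame_path, "rb") as f:
        content = base64.b64encode(f.read()).decode("ascii")

    body = {
        "requests": [{
            "image": {"content": content},
            "features": [{"type": "LABEL_DETECTION", "maxResults": 10}],
        }]
    }
    resp = requests.post(VISION_URL, params={"key": api_key}, json=body)
    resp.raise_for_status()

    labels = resp.json()["responses"][0].get("labelAnnotations", [])
    # Keep only confident labels; these become candidate Keywords for the clip.
    return [label["description"] for label in labels if label["score"] >= min_score]

# Hypothetical usage: derive_keywords("clip_0042_frame.jpg", "YOUR_API_KEY")
# might return labels like ["Boat", "Water", "Sky"] for a frame from the water.
```

Run per clip (or per scene change) and you get exactly the kind of Content Metadata we currently type in by hand.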
I was saddened, but not really surprised, by this week’s announcement that Adobe were pulling Speech-to-Text transcription from Premiere Pro, Prelude and AME. As Al Mooney says in the blog post:
Today, after many years of work, we believe, and users confirm, that the Speech to Text implementation does not provide the experience expected by Premiere Pro users.
I am saddened to see this feature go. Even though the actual speech-to-text engine was somewhat hit or miss, there was real benefit in the ability to import a transcript (or script) and lock the media to the script. So it’s probably worth keeping the current version of Premiere Pro (or one of the other apps) for the syncing function, as the apps will continue to support the metadata if it’s in the file.
Coincidentally, we recently had a feature request for a transcription-based workflow in Final Cut Pro X. When asked how he’d like it to work, the requester (unintentionally) described the workflow in Premiere Pro!
In fact, I’d almost implore Adobe to keep the ability to import a transcript and align it to the media using a speech analysis engine. That way the industry would have an alternative to Avid’s Script Sync auto-alignment tools (previously powered by Nexidia), which are currently unavailable in Media Composer. The ability to search hundreds of media files with transcripts – by word-based content – is extremely powerful for documentary filmmakers.
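To illustrate what that alignment buys you – and this is emphatically not Adobe’s or Nexidia’s engine, just a toy sketch – given a known-good transcript and hypothetical (word, start-time) pairs from any speech-to-text engine, a simple sequence match is enough to hang timecodes on the script, which is what makes word-based search of media possible:

```python
import difflib
from typing import List, Optional, Tuple

def align_transcript(transcript: str,
                     stt_words: List[Tuple[str, float]]) -> List[Tuple[str, Optional[float]]]:
    """Give each transcript word a start time by matching it against
    (word, start_seconds) pairs from a speech-to-text engine."""
    t_words = transcript.lower().split()
    s_words = [w.lower() for w, _ in stt_words]

    # Unmatched words keep None; a real tool would interpolate these.
    aligned: List[Tuple[str, Optional[float]]] = [(w, None) for w in transcript.split()]
    matcher = difflib.SequenceMatcher(a=t_words, b=s_words, autojunk=False)
    for block in matcher.get_matching_blocks():
        for k in range(block.size):
            aligned[block.a + k] = (aligned[block.a + k][0], stt_words[block.b + k][1])
    return aligned

# Even hit-or-miss recognition yields enough matches to anchor the script:
stt = [("hello", 0.4), ("um", 0.9), ("world", 1.3), ("again", 1.9)]
print(align_transcript("Hello world again", stt))
# [('Hello', 0.4), ('world', 1.3), ('again', 1.9)]
```

The recognizer doesn’t need to be accurate everywhere; the trusted transcript supplies the words, and the engine only has to supply enough timing anchors.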
And yes, there is the Nexidia-powered Boris Soundbite, but there is one problem with this waveform-matching approach: there is no content metadata, nor anything (like text) we can use to derive content metadata.
This week I sat down with Larry Jordan and Michael Horton and talked about – what else? – metadata: what it is, why we need it, how we get it, and how we use it. It was a good interview, and probably my clearest explanation of what metadata is.
You can hear the interview here:
and read the transcript, courtesy of Take 1 Transcription, here:
Over the weekend, Chris Fenwick interviewed me for his FCPX Grill podcast about the importance of logging, metadata and Lumberjack System. The podcast is available now and it is a good conversation.
What makes it meta, though, is that Alex Gollner logged the conversation and put it up on his website: metadata (logging) about a conversation on metadata. It doesn’t get much more meta than that!
According to “Metadata is the New Gold,” a recent webinar produced by the Media & Entertainment Services Alliance and the Hollywood IT Society, metadata is more important than ever. Each of the presenters had specific takeaways.
Around two years ago I was preparing for a trip to Florida to start a journey producing a reality TV series about a solar-powered boat trip. There were some challenges, specifically the lack of time to log footage and then edit (and still get some sleep). I need my sleep, so I had the idea of logging in real time as we shot. Finally, the outcome of that challenge is available to everyone: Lumberjack System is live.
Lumberjack and Producer’s Best Friend at BOSCPUG fcp.co/final-cut-pro/…
As part of their October 2013 Creative Cloud updates, Adobe have released Prelude LiveLogger, an extension to the Prelude family of metadata entry tools, for real-time, on-location logging. Like us, Adobe realized that there are situations where there is simply no time to log between the shoot and the need to edit. Adobe offers the example of a football match where plays are tagged in real time during the game, and a highlight reel pulled from Prelude moments after the shoot.
Our need was to log, shoot and edit as we undertook the Solar Odyssey, realizing that the only way to allow time for sleep was to log as we shot.