Speech-to-Text: Recent Example
A 46-minute interview transcribed with only 15 of 4,100 words needing correction. That’s better than 99.6% word accuracy.
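As a quick sanity check on those numbers, word-level accuracy is simply correct words over total words. A minimal sketch:

```python
# Word-level accuracy for the transcription example:
# 15 corrections needed out of 4,100 transcribed words.
total_words = 4100
corrections = 15

accuracy = (total_words - corrections) / total_words
print(f"{accuracy:.2%}")  # prints "99.63%"
```

By this measure the error rate is about 0.37%, i.e. roughly one corrected word per 270.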
It seems machine learning is powering more and better extraction of all kinds of metadata.
This is the first time I’ve taken a deep look at a TV show and worked out what I think would be the perfect metadata workflow from shoot to edit bay. I chose Pie Town’s House Hunters franchise because it is so thoroughly built on an (obviously winning) formula, which I thought might make it easier for automation or Artificial Intelligence approaches.
But first a disclaimer. I am in no way associated with Pie Town Productions. I know for certain they are not a Lumberjack System customer and am also pretty sure they – like the rest of Hollywood – build their post on Avid Media Composer (and apparently Media Central as well). This is purely a thought exercise built around a readily available example and our Lumberjack System’s capabilities.
What technologies took my interest in 2017, and what will happen to them in 2018.
Lumberjack System will be previewing something extremely exciting at the FCP X World Event at IBC.
The report isn’t clear on exactly how Watson’s “AI” is being used, but the article says they are “now curating the biggest sights and sounds from matches to create ‘Cognitive Highlights,’ which will be seen on Wimbledon’s digital channels.”
Apparently Watson cognitive services are being used to recognize a significant moment, then pull it together with crowd cheers and social media comments into a two-minute video.
The AI platform will literally take key points from the tennis matches (like a player serving an ace at 100 mph), fans’ cheers and social media content to help create up to two-minute videos. The two-week tourney at the All England Lawn Tennis and Croquet Club, complete with a Google Doodle to celebrate Wimbledon’s 140th anniversary, began Monday.
Apple have made another facial recognition acquisition, this time a machine learning startup.
With Lumberjack System we don’t focus enough on Story Mode. Of late, Transcript Mode and Magic Keywords have taken the main focus, and of course the primary real-time logging and pre-editing tools are well known by now.
But Story Mode is ultimately more valuable if the project continues beyond a one- or two-day shoot. Story Mode lets us send Lumberjack-logged Final Cut Pro X Events or Libraries back to the Lumberyard app to create string-outs from all the footage.
It proved very valuable on a recent project: extracting the conversations about Final Cut Pro X from nearly 20 episodes of Lunch with Philip and Greg for an upcoming documentary.
Introducing FinderCat: simple media organization in the macOS Finder from your FCP X Keywords.