Since starting work on Lunch with Philip and Greg, I’ve battled a little with the multicam, largely because I’m using it in an atypical way, though I suspect setups like mine will become more common in the future.
My solution was a set of Automator actions, triggered by Function keys, each activating an AppleScript that first switches the mode to Video Only (for angles 1 or 2) or Audio Only (angles 3, 4 and 5) before switching to the angle. It reduces a lot of repetitive strain injury potential!
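For the curious, the workflow described above can be sketched in code. This is a hypothetical illustration only: it generates the kind of AppleScript an Automator action could run, first switching the multicam mode and then the angle. The FCPX menu path and the angle keystroke used here are assumptions, not confirmed shortcuts; the real details are in the tutorial.

```python
# Hypothetical sketch: generate the AppleScript an Automator action might run.
# Step 1: switch the multicam mode (Video Only vs Audio Only) via the menus.
# Step 2: switch to the angle. Menu names and keystrokes are ASSUMPTIONS —
# verify them against your own FCPX menu bar and keyboard shortcuts.
def angle_script(angle: int) -> str:
    # Angles 1 and 2 carry picture; 3, 4 and 5 are audio-only sources.
    mode = "Video Only" if angle in (1, 2) else "Audio Only"
    return "\n".join([
        'tell application "System Events" to tell process "Final Cut Pro"',
        f'    click menu item "{mode}" of menu "Multicam" of menu item '
        '"Multicam" of menu "Clip" of menu bar 1',
        f'    keystroke "{angle}"  -- switch to angle {angle}',
        "end tell",
    ])

print(angle_script(3))
```

Wiring one such script to each Function key via Automator means a single keypress does both steps, which is where the repetitive-strain saving comes from.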
The tutorial is over at FCP.com, but here’s a little background.
We just published a new version of IntelligentAssistance.com. It was time! Each of our host apps – Final Cut Pro X, Premiere Pro CC and Final Cut Pro 7 – has its own tab on the home page, and customized app pages.
While I expect computers to take on more and more of the pre-editing functions, we’re retiring the Assisted Editing branding. It made sense when we first used it: back then we had books, training products and The Digital Production BuZZ, as well as the software.
The BuZZ went to Larry Jordan, we discontinued our training products, and the books rightly came here to my personal site.
So, Intelligent Assistance Software, Inc. makes metadata-based workflow tools for Final Cut Pro X, Premiere Pro CC and Final Cut Pro 7. That’s all we do. As a company.
Greg and I are also behind LumberjackSystem.com, of course.
Both this new site and LumberjackSystem.com are built using Adobe Muse, which I really like for this type of sales brochure website. In both sites, store functions are outside the Muse site.
While Augmented Reality and Virtual Reality often get conflated, they are very different beasts.
Avid Media Composer has always been great at tracking metadata. Without accurate timecode, Media Composer would have never become established. Over the years Avid have continued to add metadata support to the app.
Reading an Avid blog on Spanish broadcaster RTVE’s technical deployment in Rio, I was struck by this:
All of the cataloging and indexing process will be carried out by means of an autometadata software developed by the Corporación itself, which will enable the use of metadata provided by OBS and those selected by TVE’s documentary makers.
I’d love to know more about what the “Corporation” has done. Maybe we can set up a meeting for October when we’ll be back in Barcelona!
Spoiler alert: they use a lot of Avid technology!
Only a few days ago I wrote about smart APIs that developers can use to enhance their apps. Today, John Hauer on TechCrunch postulates that he could not find one job that won’t someday be dehumanized.
A few days ago I wrote about metadata’s application to distribution. A recent panel discussion at the Rights and Metadata Madness conference covered some of the challenges, with case studies from Rovi, MLB and Viacom outlining their metadata needs and the practices they’ve developed to deal with them.
The article is worth a read, but I’ll highlight the challenge outlined by Michael Jeffrey, VP of market solutions at Rovi:
A feature-length movie with a sports theme and containing content that includes music from other properties can have assets from 20-50 separate entities.
And each of those entities can have restrictions on what the maker of that movie can show, he said, adding that it’s possible you can’t show any beer cans or can’t use an actor in any promotions.
Now let’s add the formatting, duration, and other issues from my earlier post!
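To make the scale of the problem concrete, here is a minimal sketch of what rights metadata like Rovi’s has to track. The field names and restriction flags are my own illustrative inventions, not anything from Rovi; the point is that every licensed asset can carry its own restrictions, and a promo has to be checked against all of them.

```python
# Illustrative model of per-entity rights restrictions on a title's assets.
# All names here are hypothetical — the real schemas are far richer.
from dataclasses import dataclass, field

@dataclass
class Asset:
    owner: str                                       # entity that licensed the asset
    restrictions: set = field(default_factory=set)   # e.g. {"no_beer_cans"}

def promo_violations(assets, promo_flags):
    """Return (owner, restriction) pairs the planned promo would violate."""
    return [(a.owner, r) for a in assets for r in a.restrictions
            if r in promo_flags]

# A movie with assets from just three of its 20-50 entities:
assets = [
    Asset("MusicCo", {"no_promo_use"}),
    Asset("BeverageBrand", {"no_beer_cans"}),
    Asset("ActorAgency", {"no_actor_in_promos"}),
]

# The planned promo shows a beer can and features the actor:
print(promo_violations(assets, {"no_beer_cans", "no_actor_in_promos"}))
# → [('BeverageBrand', 'no_beer_cans'), ('ActorAgency', 'no_actor_in_promos')]
```

Multiply this by dozens of entities, then by every territory, format and duration variant, and the need for systematic metadata tooling is obvious.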
Google today launched a new API to help parse natural language. An API, or Application Programming Interface, is something developers can send data to and get a response back from. Natural language parsing is used to understand language that is available in computer-readable form (text). Google’s API joins an increasingly long list of very smart APIs that will understand language, recognize images and much more.
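The send-data-get-a-response pattern looks like this in practice. The sketch below builds the JSON body for the current v1 REST endpoint of Google’s Natural Language API (documents:analyzeEntities), which may differ from the version available when this was written; actually sending it requires a valid API key, so that part is shown but commented out.

```python
# Minimal sketch of calling Google's Natural Language API over REST.
# Endpoint and payload shape follow the current v1 API; an API key is assumed.
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder — supply your own key
ENDPOINT = ("https://language.googleapis.com/v1/"
            f"documents:analyzeEntities?key={API_KEY}")

def build_request(text):
    """Build the JSON body: the document to analyze plus its text encoding."""
    return {
        "document": {"type": "PLAIN_TEXT", "content": text},
        "encodingType": "UTF8",
    }

payload = build_request("Greg interviewed Philip over lunch in Burbank.")
print(json.dumps(payload, indent=2))

# To actually send it (requires a valid key):
# req = urllib.request.Request(ENDPOINT, data=json.dumps(payload).encode(),
#                              headers={"Content-Type": "application/json"})
# entities = json.load(urllib.request.urlopen(req))["entities"]
```

The response lists the entities the service recognized – people, places, organizations – each with a salience score, which is exactly the kind of content metadata this blog keeps coming back to.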
A lot has changed since I last wrote about Advances in Content Recognition late last year.
The Final Cut Pro X Creative Summit is on again in October this year.
Three days of cutting-edge training on the latest FCPX and Motion.
Hear directly from Apple Product Managers. Learn from top industry experts.
Apparently I slip in as an ‘industry expert’ with these sessions:
10:30am Saturday Using Transcripts in FCPX
10:30am Sunday Production Kit in a Bag
Metadata is one of the most useful tools we have, if we have the tools to use it! Aside from the obvious problems when no metadata, or insufficient metadata, is gathered during the shoot, other issues arise because the production chain doesn’t always include tools that make use of what has been gathered.
A recent student film was used as a template by the Entertainment Technology Center at USC to realize the long-hoped-for promise of production metadata, with some fairly ambitious goals.
The results are interesting and important, particularly considering that this is what I would categorize as Technical metadata, rather than Content metadata.
Although my focus is very much on metadata for production, and in particular Content Metadata, there’s a whole other area of metadata for distribution, built around the EIDR ID and fleshed out largely by Rovi. But there’s another area where metadata will likely have to be applied: distribution deliverables.
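For readers who haven’t met it, the EIDR ID is a DOI under the 10.5240 prefix: five groups of four hex digits plus a check character. The sketch below checks only that surface shape; the real check-character validation (an ISO 7064 algorithm) is deliberately omitted, and the sample ID is made up for illustration.

```python
# Illustrative sketch: does a string have the shape of an EIDR content ID?
# Format: 10.5240/XXXX-XXXX-XXXX-XXXX-XXXX-C (hex groups plus check char).
# This validates shape only — the ISO 7064 check-character math is omitted.
import re

EIDR_PATTERN = re.compile(r"^10\.5240/(?:[0-9A-F]{4}-){5}[0-9A-Z]$")

def looks_like_eidr(candidate: str) -> bool:
    """True if the string is shaped like an EIDR content ID."""
    return bool(EIDR_PATTERN.match(candidate))

print(looks_like_eidr("10.5240/7791-8534-2C23-9030-8610-5"))  # a made-up ID
print(looks_like_eidr("not-an-eidr-id"))
```

A stable, checkable identifier like this is what lets every deliverable variant of a title be tied back to the same registered work.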