The present and future of post production business and technology | Philip Hodgetts


Avid Media Composer has always been great at tracking metadata. Without accurate timecode, Media Composer would have never become established. Over the years Avid have continued to add metadata support to the app.

Reading an Avid blog on Spanish Broadcaster RTVE’s technical deployment in Rio, I was struck by this:

All of the cataloging and indexing process will be carried out by means of an autometadata software developed by the Corporación itself, which will enable the use of metadata provided by OBS and those selected by TVE’s documentary makers.

I’d love to know more about what the “Corporation” has done. Maybe we can set up a meeting for October when we’ll be back in Barcelona!

Spoiler alert: they use a lot of Avid technology!

A few days ago I wrote about metadata’s application to distribution. A recent panel discussion at the Rights and Metadata Madness conference brought together case studies from Rovi, MLB and Viacom, each outlining their metadata needs and the practices they’ve developed to deal with them.

The article is worth a read, but I’ll highlight the challenge outlined by Michael Jeffrey, VP of market solutions at Rovi:

A feature-length movie with a sports theme and containing content that includes music from other properties can have assets from 20-50 separate entities.

And each of those entities can have restrictions on what the maker of that movie can show, he said, adding that it’s possible you can’t show any beer cans or can’t use an actor in any promotions.

Now let’s add the formatting, duration, and other issues from my earlier post!

Google today launched a new API to help parse natural language. An API (Application Programming Interface) is a service that developers can send data to and get a response back from. Natural language parsing is used to extract meaning from language available in computer-readable form (text). Google’s API joins an increasingly long list of very smart APIs that can understand language, recognize images and much more.
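To make that concrete, here is a minimal sketch of calling the Google Cloud Natural Language API over REST, assuming the v1 `documents:analyzeEntities` endpoint and API-key authentication; the key and sample text are placeholders, not working credentials.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder: substitute a real Google Cloud API key
ENDPOINT = "https://language.googleapis.com/v1/documents:analyzeEntities"

def build_request(text):
    """Build the JSON body the Natural Language API expects."""
    return {
        "document": {"type": "PLAIN_TEXT", "content": text},
        "encodingType": "UTF8",
    }

def analyze_entities(text):
    """Send text to the API and return the parsed JSON response."""
    body = json.dumps(build_request(text)).encode("utf-8")
    req = urllib.request.Request(
        f"{ENDPOINT}?key={API_KEY}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Show the request payload; calling analyze_entities() requires a real key.
    print(json.dumps(build_request("Avid Media Composer tracks metadata."), indent=2))
```

The response lists the entities the service found (people, places, organizations) with salience scores, which is exactly the kind of "smart API" capability described above.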

A lot has changed since I last wrote about Advances in Content Recognition late last year.




Two companies: $8 billion of production

It’s hard to imagine, but between them Amazon and Netflix plan to spend $8 billion on original content in 2016.

Amazon is expected to spend $2 billion, subsidized by Prime subscribers.

Netflix’s subscribers will finance $6 billion in original programming.

Combined, that would fund 80 movies with a $100 million budget each. (Typically 400–500 movies are released by studios every year.)

Or 4,000 ‘hour’ television episodes, assuming a $2 million per-episode budget (some would be higher, some lower). At 23 episodes a season, that’s a season each for about 173 different shows.
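The back-of-envelope arithmetic above can be checked in a few lines (the per-movie and per-episode budgets are the rough assumptions stated in the text):

```python
total_budget = 8_000_000_000   # Amazon ($2B) + Netflix ($6B), 2016 estimates

movie_budget = 100_000_000     # assumed studio-scale feature budget
episode_budget = 2_000_000     # assumed per-'hour' episode budget
episodes_per_season = 23

movies = total_budget // movie_budget        # feature films the pool could fund
episodes = total_budget // episode_budget    # hour-long episodes instead
shows = episodes // episodes_per_season      # full 23-episode seasons

print(movies, episodes, shows)  # → 80 4000 173
```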

In this episode I get to spend a lot of time talking about the background of the Lumberjack System, in the context of the very unsexy topic of workflow, particularly automating the workflow. I share many of the background decisions related to Lumberjack System – our logging and pre-editing system for Final Cut Pro X – including why it’s limited to FCP X.

Other topics include automation; Digital Heaven’s announcement of SpeedScriber; how Lumberjack has developed based on user (and use) feedback; the post-NAB development of noteLogger; Prelude LiveLogger and the Premiere Pro ecosystem; NLE market shares; and how development resources are allocated.

Patently Apple reports Apple have filed for Trademarks for: ‘Mac – Works with iMovie’, ‘Mac – Works with Final Cut Pro X’ & Combo of Both.

This certainly isn’t the first time Apple have filed for “Works with” trademarks, and that’s what makes it interesting. Previously, trademarks of this type have been for Apple ecosystems, like iPhone, iOS, iPad, CarPlay, AirPrint, et al.

While I have no idea what it might mean – developers have no clues yet – it is interesting that iMovie and Final Cut Pro X are being considered as part of a larger ecosystem. For those who don’t know, these days iMovie is a version of Final Cut Pro X with a simplified interface.



Adobe sets new revenue, subscriber records

Adobe revealed record quarterly earnings on March 17th, with $1.38 billion in quarterly revenue ($5.5B annualized) and 4.252 million Creative Cloud subscribers (us included).

Of course, Adobe is strongest in document handling and photography but every subscriber has access to the entire Creative Cloud suite. I believe the integration between the video and audio apps is one of Adobe’s strengths in the creative TV, film and video space.

If I had to place an educated guess, I’d say there are more active Premiere Pro CC users than Media Composer active users, but not as many as Final Cut Pro X.

According to an article on MacRumors today, Apple is negotiating with Studios and Producers to create original programming for Apple TV. Two thoughts.

Apple have long created their own content by running music festivals and recording the performances.

It’s been a long time coming, but I think it was inevitable. Back in late 2009 I postulated on What if Apple or Google simply bypassed Networks and Studios? My conclusion then:

Clearly, either Google or Apple could destroy the existing content production industries without borrowing or risking their business. Just what leverage do the current middlemen really have?

It’s a strategy that’s working well for Netflix and Amazon.

Yet again the threat of movie piracy – that is, unauthorized distribution – has had no observable effect on an industry with higher attendance and higher revenues. Please destroy my businesses like this!

I first wrote about derived metadata back at the end of January 2009. Derived metadata uses computer analysis to derive metadata from the video source. There are now technologies for speech-to-text, meaning extraction, facial detection, facial recognition, emotion detection, image recognition, and more. One company has been accumulating these somewhat diverse technologies: Apple.
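One way to picture derived metadata is as a pipeline of analyzers that each contribute fields to a clip’s metadata. This is an illustrative sketch only: the analyzer functions below are stand-ins for real engines (speech-to-text, face detection, image recognition), and the clip path is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    path: str
    metadata: dict = field(default_factory=dict)

def speech_to_text(clip):
    # Stand-in: a real implementation would call a transcription engine.
    return {"transcript": "(transcript placeholder)"}

def detect_faces(clip):
    # Stand-in: a real implementation would run a face-detection model.
    return {"faces": []}

# Each analyzer derives one slice of metadata from the source media.
ANALYZERS = [speech_to_text, detect_faces]

def derive_metadata(clip):
    """Run every analyzer and merge its results into the clip's metadata."""
    for analyze in ANALYZERS:
        clip.metadata.update(analyze(clip))
    return clip

clip = derive_metadata(Clip("interview_01.mov"))  # hypothetical source file
print(clip.metadata)
```

The point of the structure is that new analysis technologies slot in as additional functions in the list, which is what makes one company accumulating all of them interesting.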

