The present and future of post production business and technology | Philip Hodgetts

A recent comment in an article on CNET.com caught my eye:

“If I owned a studio, I’d make movie theaters pay me,” says Dana Brunetti, producer of “House of Cards” and “The Social Network.”

Needless to say, I had to read the article. The first thing to note is that the comment was made in the context of a web-focused conference, so there may be an element of “playing to the audience”, but in essence the argument is that more online/web companies should follow Netflix (and Amazon, Google and Apple) into producing original content.

With online and technology-based companies already threatening traditional distribution methods, the impact would be huge: “Once Silicon Valley can create content as well,” said Brunetti, “they’ll own it soup to nuts.”

I can’t argue with that. More original production means more jobs in the industry. (And yes, more clients for my day job’s business.)

What appeals to me is the push for “per program” content purchases, as long as the pricing issue is solved: over a month, a la carte purchases of a limited selection of programming should cost no more than a full cable subscription does.

November 15, 2014

Metadata and Organization

This week I sat down with Larry Jordan and Michael Horton and talked – what else – metadata: what it is, why we need it, how we get it, and how we use it. It was a good interview and probably my clearest explanation of what metadata is.

You can hear the interview here:

http://www.digitalproductionbuzz.com/BuZZ_Audio/Buzz_141113_Hodgetts.mp3

and read the transcript, courtesy of Take 1 Transcription, at:

http://www.digitalproductionbuzz.com/2014/11/transcript-digital-production-buzz-november-13-2014/#.VGeUCoe287Q

Red Shark News reports on Disney Research:

Researchers working for the Mouse have developed a groundbreaking program that delivers automated edits from multi-camera footage based on cinematic criteria.

When you read how they’ve achieved it, I think it’s impressive, and very, very clever.

The system works by approximating the 3D space of the cameras in relation to each other. The algorithm determines the “3D joint attention,” or the likely center of activity, through an on-the-fly analysis of the multiple camera views. Based on this information, the algorithm additionally takes into account a set of cinematic preferences, such as adherence to the 180 degree rule, avoidance of jump cuts, varying shot size and zoom, maintaining minimum and maximum shot lengths, and cutting on action. The result is a very passable, almost human edit.
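Just to make the flavour of that concrete, here is a deliberately tiny sketch of rule-based angle selection. To be clear, this is my own toy illustration, not Disney’s algorithm, and every name and weight in it (AngleScore, autoSwitch, the penalty values) is invented: each camera gets a per-segment “attention” score, and simple penalties stand in for the cinematic preferences of minimum and maximum shot length plus a crude jump-cut check.

```swift
import Foundation

// Hypothetical per-segment score for one camera angle. In a real system the
// "attention" value would come from analysis like Disney's 3D joint attention;
// here it is just a number between 0 and 1.
struct AngleScore {
    let camera: Int
    let attention: Double   // how well this angle frames the likely centre of activity
    let shotSize: Double    // 0 = very wide ... 1 = very tight, used to vary shot sizes
}

// Greedily pick one camera per segment, with crude "cinematic" penalties:
// don't cut before a minimum shot length, don't hold past a maximum,
// and discourage cutting to a similar shot size (a stand-in for jump-cut avoidance).
func autoSwitch(segments: [[AngleScore]], minShot: Int = 3, maxShot: Int = 12) -> [Int] {
    var edit: [Int] = []
    var current = -1          // camera currently "on the line" (-1 = none yet)
    var held = 0              // how many segments the current camera has been held
    var sizeAtLastCut = 0.0

    for candidates in segments {
        func score(_ s: AngleScore) -> Double {
            var value = s.attention
            if s.camera != current {
                if held > 0 && held < minShot { value -= 1.0 }             // too soon to cut again
                if abs(s.shotSize - sizeAtLastCut) < 0.2 { value -= 0.5 }  // would read as a jump cut
            } else if held >= maxShot {
                value -= 1.0                                               // held this angle too long
            }
            return value
        }

        guard let best = candidates.max(by: { score($0) < score($1) }) else { continue }
        if best.camera != current {
            current = best.camera
            held = 0
            sizeAtLastCut = best.shotSize
        }
        held += 1
        edit.append(current)
    }
    return edit
}
```

Feed it one array of AngleScore values per second of footage and it returns a camera choice per second: the switched “live cut”, before any real editorial work begins.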

Perhaps it’s the very nature of research, but I’m not sure of the practical application. Maybe that’s the point of pure research.

Assuming the technology delivers, it’s rare that we want to take a multicam shoot and do a single, switched playback version – “live switching” after the fact, if you will. At least in my experience, the edit not only needs to switch multicam angles, but also to remove dross, tighten the presentation, add additional b-roll, and so on.

More often than not, my angle cuts are directed more by the edit I want than by a desire to simply pick the best shot at the time.

That said, this type of research is indicative of what can be done (and therefore almost certainly will be done): combine a good automated multicam edit with content metadata and perhaps you’d have a decent first pass that could be built on, finished and polished by a skilled editor. The point, as Larry Jordan puts it, is:

How do you save time every step of the production process, so that you’ve got the time that you need to make your films to your satisfaction?

Ultimately, the commercial versions of these types of technologies should be seen as tools editors can use to make more time for their real job: finessing, polishing and finishing the project; bringing it the heart that makes the human connection in storytelling.

Variety just posted an article on how many people had watched online gameplay of one game in one week: 75,000 players and 6 million individual viewers, who collectively watched 327 million minutes of gameplay. Watched. That’s just under an hour per viewer on average.

Six million people watching one game’s gameplay. That’s a decent network-sized audience these days. And that’s one game, for one week – admittedly a release week for the game.

Watching gameplay has attracted a huge audience, with very low production costs. While it’s not traditional production, the time spent watching gamers play video games erodes the time available for other forms of entertainment, specifically film and television!

Once upon a time it was easy to differentiate between film and TV production: film was shot on film, TV was shot electronically. SAG looked after the interests of screen actors (film) while AFTRA looked after the interests of television actors. That the two actors’ unions have merged is indicative of the changes in production technology.

As is noted in an article at Digital Trends, there is almost no difference between the technologies used in the two styles of production, so what are the differences? It comes down to two things, which are really the same thing.


Over on IndieGoGo there’s a project for MOX – an open source mezzanine codec for (mostly) post production workflows and archiving. The obvious advantage over existing codecs like ProRes, DNxHD and Cineform is that MOX will be open source, so there is significantly reduced risk that the codec might go away in the future, or stop being supported.

Technically the project looks reasonable and feasible. There is a small, but significant, group of people who worry that support for the current codecs may go away in the future. There’s no real evidence for this, other than that Apple has deprecated old, inefficient and obsolete codecs by not bringing them forward to AVFoundation.

I have more concerns about the long-term prospects of an open source project. History shows that many projects start strong, but ultimately it comes down to a small group of people (or one, in MOX’s case) doing all the work, and inevitably life’s circumstances intervene.

MOX is not a bad idea. I just doubt that it will gain and sustain the momentum it would need.

Because Final Cut Pro X – and other modern video apps – are built on frameworks from the core OS, those frameworks sometimes provide clues to Apple’s thinking. One that we care a lot about is AVFoundation, the modern replacement for QuickTime at the application and OS level. We’ve seen this in the transition from QuickTime Player 7, which is built on QuickTime (both QTKit and the older C API), to the current QuickTime Player, which is built on AVFoundation. Unfortunately AVFoundation has lacked many features that are essential for video workflows, so I watch the features added to AVFoundation as a way of understanding where video apps might go.

Firstly, there has been a massive update to AVFoundation in Yosemite, and it appears we get reference movies back.
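For context on what reference movies look like through AVFoundation, here’s a minimal sketch using the AVMovie/AVMutableMovie route. This is my own illustration rather than Apple sample code – the function name makeReferenceMovie is invented, and I’m assuming the current shape of the API, which may not match exactly what shipped in Yosemite.

```swift
import AVFoundation

// A minimal sketch: build a QuickTime-style reference movie that points at the
// media in an existing file rather than copying it. Illustrative only.
func makeReferenceMovie(from sourceURL: URL, to destinationURL: URL) throws {
    let source = AVURLAsset(url: sourceURL)

    // Start from an empty mutable movie.
    let movie = try AVMutableMovie(settingsFrom: nil, options: nil)

    // Insert the whole source asset WITHOUT copying sample data;
    // the new file only references the media in the original file.
    let wholeAsset = CMTimeRange(start: .zero, duration: source.duration)
    try movie.insertTimeRange(wholeAsset,
                              of: source,
                              at: .zero,
                              copySampleData: false)

    // Write just the movie header (track and sample tables) to disk.
    try movie.writeHeader(to: destinationURL,
                          fileType: .mov,
                          options: .addMovieHeaderToDestination)
}
```

The key is copySampleData: false – the written file contains only the movie header, and the media stays in the original file, which is the old QuickTime reference movie behaviour.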


Today, Adobe held a “family and friends” screening of Gone Girl on the Fox lot. It felt very much like Adobe’s formal debutante appearance in Hollywood – the world of studio films. For those who do not know, David Fincher’s Gone Girl was the first studio film edited on Adobe Premiere Pro. It must be a proud day for the entire Adobe Premiere Pro team, but I couldn’t help reflecting on what a great day it must be for Adobe’s Mike Kanfer, who has worked tirelessly to promote Premiere Pro within the Hollywood filmmaking community, to see this happen.

It’s a damned good film and you should go see it, although I was a bit squeamish in one part.

Anyway, Greg and I – aka Intelligent Assistance – were very proud to have been a small part of Adobe’s success story by providing the crucial Change List tool for Adobe Premiere Pro. We have yet to commercialize it as an Adobe Panel, but it’s coming.

We were also very pleased with a “Special Thanks” credit on the movie itself. It’s right at the end but we’re in good company.

This interview was recorded a couple of months back, and I’ve been waiting for it to come out, as I think it’s one of my three favorite interviews as a subject, along with recent appearances on That Post Show (Episode 13) and FCPX Grill. This one is interesting because it focuses more on my adventures in business than on production or post production. Apparently during the interview I talk about (from Scott’s description, not my own):

  • How to recover when forces outside of your control destroy your entire business model.
  • Moaning and complaining is a rather poor business approach.
  • To always be on the look-out for new opportunities.
  • When to tell your clients to go pound sand.
  • How to concentrate on things to advance your company’s goals.
  • That marketing is education.
  • What is the best way to eat a giant animal.

The podcast’s full name is The Video Crush with Scott Markowitz, and I’m in Episode 3.


September 9, 2014

Amsterdam Supermeet and Lumberjack System

Greg and I are heading for Amsterdam tomorrow for IBC, and especially for the Amsterdam Supermeet. This will be the first time we’ve had Lumberjack System at a Supermeet, and we’re going to celebrate by giving away 200 cute little keyring pouches that can carry a credit card, business cards, or a USB memory stick on your keyring.

Two of the pouches will carry vouchers for two (2) Envoy Pro EX 240GB SSD drives, courtesy of OWC/MacSales, which you’ll get instantly from OWC at the Supermeet.

Come check out Lumberjack System – if you edit with Final Cut Pro X, some easy logging in the field will pay huge dividends when preparing for post.

Visit our table at the #Amsterdam #SuperMeet. Save €5.00 on tickets. Buy using this link: https://amsterdam2014.eventbrite.com/?discount=lumbervip

