I was saddened, but not really surprised, by this week’s announcement that Adobe were pulling Speech-to-Text transcription from Premiere Pro, Prelude and AME. As Al Mooney says in the blog post:
Today, after many years of work, we believe, and users confirm, that the Speech to Text implementation does not provide the experience expected by Premiere Pro users.
I am saddened to see this feature go. Even though the actual speech-to-text engine was somewhat hit or miss, there was real benefit in the ability to import a transcript (or script) and lock the media to it. So it’s probably worth keeping the current version of Premiere Pro (or one of the other apps) for the syncing function, as the apps will continue to support the metadata if it’s in the file.
Coincidentally, we recently had a feature request for a transcription-based workflow in Final Cut Pro X. When questioned about how he’d like it to work, the requester (unintentionally) described the workflow in Premiere Pro!
In fact, I’d almost implore Adobe to keep the ability to import a transcript and align it to the media using a speech analysis engine. That way the industry would have an alternative to Avid’s Script Sync auto-alignment tools (previously powered by Nexidia), which are currently unavailable in Media Composer. The ability to search hundreds of media files by word-based content, via their transcripts, is extremely powerful for documentary filmmakers.
And yes, there is the Nexidia-powered Boris Soundbite, but there is one problem with its waveform-matching approach: it produces no content metadata, nor anything (such as text) from which content metadata could be derived.
A recent comment in an article on CNET.com caught my eye:
“If I owned a studio, I’d make movie theaters pay me,” says Dana Brunetti, producer of “House of Cards” and “The Social Network.”
Needless to say, I had to read the article. My first observation was that the comment was made in the context of a web-focused conference, so there may be an element of “playing to the audience”, but in essence the argument is that more online/web companies should follow Netflix (and Amazon, Google and Apple) into producing more original content.
With online and technology-based companies already threatening traditional distribution methods, the impact would be huge: “Once Silicon Valley can create content as well,” said Brunetti, “they’ll own it soup to nuts.”
I can’t argue with that. More original production means more jobs in the industry. (And yes, more clients for my day job’s business.)
What appeals to me is the push for “per program” content purchase, as long as the pricing issue is solved: a la carte purchases of limited programming should cost no more over a month than a full cable subscription.
Variety just posted an article on how many people had watched online game play (of one game) in one week. 75,000 players and 6 million individual viewers who collectively watched 327 million minutes of gameplay. Watched. That’s about an hour per viewer on average.
Six million people watching one game’s game play. That’s a decent network-sized audience these days. That’s one game for one week. Admittedly a release week for the game.
Watching game play has attracted a huge audience, with very low production costs. While it’s not traditional production, the time spent watching gamers play video games erodes the time available for other forms of entertainment, specifically film and television!
Once upon a time it was easy to differentiate between film and TV production: film was shot on film, TV was shot electronically. SAG looked after the interests of screen actors (film) while AFTRA looked after the interests of television actors. That the two actors’ unions have merged is indicative of the changes in production technology.
As is noted in an article at Digital Trends, there is almost no difference between the technologies used in the two styles of production, so what are the differences? It comes down to two things, which are really the same thing.
As Final Cut Pro X – and other modern video apps – are built on Frameworks from the core OS, those Frameworks sometimes provide clues to Apple’s thinking. One we care a lot about is AVFoundation, the modern replacement for QuickTime at the application and OS level. We’ve seen this transition play out since QuickTime Player 7, which is built on QuickTime (both QTKit and the older C API). Unfortunately AVFoundation has lacked many features that are essential for video workflows, so I watch the features added to AVFoundation as a way of understanding where video apps might go.
Firstly, there has been a massive update to AVFoundation in Yosemite, and it appears we get reference movies back.
Today, Adobe held a “family and friends” screening of Gone Girl on the Fox lot. It felt very much like Adobe’s formal debutante appearance in Hollywood – the world of studio films. For those who do not know, David Fincher’s Gone Girl was the first studio film edited in Adobe Premiere Pro. It must be a proud day for the entire Premiere Pro team, but I couldn’t help reflecting on what a great day it must be for Adobe’s Mike Kanfer, who has worked tirelessly to promote Premiere Pro within the Hollywood filmmaking community.
It’s a damned good film, you should go see it, although I was a bit squeamish in one part.
Anyway, Greg and I – aka Intelligent Assistance – were very proud to have been a small part of Adobe’s success story, by providing the crucial Change List tool for Adobe Premiere Pro. We’ve still to commercialize it into an Adobe Panel, but it’s coming.
We were also very pleased with a “Special Thanks” credit on the movie itself. It’s right at the end but we’re in good company.
This interview was recorded a couple of months back, and I’ve been waiting for it to come out, as I think it’s one of my three favorite times as an interview subject, along with recent interviews on That Post Show (Episode 13) and FCPX Grill. This one is interesting because it focuses more on my adventures in business than on production or post production. Apparently during the interview I talk about (from Scott’s description, not my own):
- How to recover when forces outside of your control destroy your entire business model.
- Why moaning and complaining is a rather poor business approach.
- Why you should always be on the look-out for new opportunities.
- When to tell your clients to go pound sand.
- How to concentrate on the things that advance your company’s goals.
- Why marketing is education.
- The best way to eat a giant animal.
The Podcast’s full name is The Video Crush with Scott Markowitz and I’m up in Episode 3.
Adobe have previewed their IBC video app presentations and have confirmed that they will be continually adding new features to the Creative Cloud.
FCP.co’s lead story today is good news for Apple and Final Cut Pro X: the BBC are adopting more than 1,000 seats of Final Cut Pro X for news. To be fair, the BBC seem to be adopting both Final Cut Pro X and Premiere Pro across their production units, and some remain on Media Composer, but News seems to be going exclusively Final Cut Pro X.
My Apple PR contacts tell me that Final Cut Pro X is also being used on “several popular daytime shows” as well.
In other good news, not reported (yet) on FCP.co, the French TF1 group have also adopted Final Cut Pro X. According to this Tweet, both Premiere Pro CC and Final Cut Pro X were tested, with Final Cut Pro X getting the gig.
Of course, come IBC I’m sure Adobe will share some of their new partners as well, and no doubt, increased Creative Cloud subscribers.
A new show in which we discuss 4K. http://www.theterenceandphilipshow.com/?p=546