CAT | Interesting Technology
A recent article and project demonstrate an increasing trend to automate certain types of production – generally, production that is highly predictable. One example uses new technology to build news videos from text articles; the other builds multiple videos from the same XML template.
These technologies are the latest in a series of developments in templatization and automatic editing. Naturally, metadata is at the heart of every automated process.
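As a sketch of the template idea – the XML skeleton and story data here are entirely hypothetical, not taken from either project – a few lines of Python show how one template plus per-story metadata can yield many video descriptions:

```python
import xml.etree.ElementTree as ET

# A hypothetical template: one <video> skeleton with {placeholder}
# fields that per-story metadata fills in for each generated cut.
TEMPLATE = """\
<video>
  <title>{title}</title>
  <voiceover>{summary}</voiceover>
  <lowerthird>{byline}</lowerthird>
</video>"""

stories = [
    {"title": "Market Update", "summary": "Stocks rose today.", "byline": "Newsroom"},
    {"title": "Weather", "summary": "Rain expected tomorrow.", "byline": "Weather Desk"},
]

def render(story):
    """Fill the template with one story's metadata, then parse the
    result back to confirm it is well-formed XML."""
    xml_text = TEMPLATE.format(**story)
    return ET.fromstring(xml_text)  # raises ParseError if malformed

videos = [render(s) for s in stories]
for v in videos:
    print(v.findtext("title"), "-", v.findtext("voiceover"))
```

The point is that once the creative decisions are encoded in the template, generating the nth video is pure metadata processing.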
For the Lunch with Philip and Greg project we shoot 4K and extract a 1080 frame from the larger image (or scale the image down). Working on the edit of the next episode to be published, I hit a moment that shows exactly why I find 4K such a big advantage for acquisition.
You can see from the multicam thumbnail why I wanted to crop this image, even though the shot on Greg is the best choice for that moment. And yes, cutting around eating is one of the early challenges of this project. We’re developing better strategies as we gain experience.
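The reframing latitude is easy to quantify. Assuming a UHD source (3840×2160) and an HD extraction (1920×1080) – my own illustration, not our actual tooling – a crop window can slide a full 1920 pixels horizontally and 1080 vertically, and the only real work is clamping the requested centre to legal bounds:

```python
def crop_window(src_w, src_h, out_w, out_h, cx, cy):
    """Return the top-left corner of an out_w x out_h crop centred as
    close to (cx, cy) as possible while staying inside the source frame."""
    x = min(max(cx - out_w // 2, 0), src_w - out_w)
    y = min(max(cy - out_h // 2, 0), src_h - out_h)
    return x, y

# UHD source, HD delivery: the crop can slide up to 1920 px horizontally
# and 1080 px vertically -- the "reframe in post" latitude 4K buys you.
x, y = crop_window(3840, 2160, 1920, 1080, cx=2800, cy=700)
print(x, y)  # -> 1840 160
```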
In the latest episode of the Terence and Philip Show, Terry and I discuss how to survive an ever-changing world and keep your career alive. The conversation was triggered by a discussion of a family history video project I’m undertaking (and will be writing about more shortly).
As Final Cut Pro X – and other modern video apps – are built on frameworks from the core OS, those frameworks sometimes provide clues to Apple’s thinking. The one we care most about is AVFoundation, the modern replacement for QuickTime at the application and OS level. We’ve seen this in the transition away from QuickTime Player 7, which is built on QuickTime (both QTKit and the older C API). Unfortunately AVFoundation has lacked many features that are essential for video workflows, so I watch the features added to AVFoundation as a way of understanding where video apps might go.
First up, there has been a massive update to AVFoundation in Yosemite, and it appears we get reference movies back.
This news item caught my eye.
“DJI has expanded the feature set of the Phantom 2 Quadcopter line by giving it the ability to map out a route and automatically fly it. Using Ground Station, shooters will be able to create a route full of way points in order to let the quadcopter fly automatically from one point to another, and allow the shooter to concentrate on getting the shot. And it’s just what the pilot ordered.”
It reminded me of some earlier thoughts on how these flying camera platforms – what I was then calling quadcopters but would now call drones – were going to evolve.
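To make the waypoint idea concrete, here is a small sketch – the coordinates are invented and this is not Ground Station's actual route format – representing a route as an ordered list of waypoints and measuring the planned path with the standard haversine formula:

```python
import math

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0  # mean Earth radius in metres
    lat1, lon1 = map(math.radians, a)
    lat2, lon2 = map(math.radians, b)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * R * math.asin(math.sqrt(h))

# A hypothetical route: the shot is planned as an ordered list of
# waypoints the autopilot flies while the operator works the camera.
route = [(37.7749, -122.4194), (37.7755, -122.4180), (37.7760, -122.4170)]

total = sum(haversine_m(route[i], route[i + 1]) for i in range(len(route) - 1))
print(round(total), "metres along the planned path")
```

Once the route is just data, it can be saved, repeated, and refined take after take – which is exactly what makes the shooting repeatable.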
One of the most interesting NAB announcements was the addition of some 70 additional editorial tools to DaVinci Resolve 11. This makes Resolve a very competent on-set DIT tool, a full featured NLE, and a world-class finishing tool. Mostly for FREE. What is interesting here is how this is likely to play out in the long term.
The latest release of Resolve – 10.1.3 – includes this gem:
- GPU debayer Preferences option for REDCODE RAW clips
I think it’s inevitable that NLEs get this GPU capability as well: it allows debayering of RED RAW files on the GPU instead of requiring a RED Rocket card for the task. Premiere Pro already uses the GPU for debayering Cinema DNG files, so I see a trend coming.
In fact Adobe’s David Hemly teased a real-time GPU debayer as part of a March Technology Preview (toward the end). Technology previews are not guaranteed to be part of a future release, but so far they always have been!
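For readers new to the term: debayering (demosaicing) reconstructs full RGB pixels from a sensor's Bayer mosaic, where each photosite records only one colour. This toy nearest-neighbour version – nothing like the sophisticated parallel interpolation Resolve or Premiere Pro run on the GPU, and using made-up sample values – shows the basic operation on an RGGB pattern:

```python
def debayer_nearest(raw):
    """Toy nearest-neighbour demosaic of an RGGB Bayer mosaic.

    `raw` is a list of rows of single sensor values. Each 2x2 cell
      R G
      G B
    becomes one RGB pixel: the red and blue samples are taken directly
    and the two green samples are averaged. Output is half resolution,
    which sidesteps the interpolation a real debayer would do.
    """
    rgb = []
    for y in range(0, len(raw) - 1, 2):
        row = []
        for x in range(0, len(raw[y]) - 1, 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    return rgb

mosaic = [
    [200, 120, 210, 130],
    [110,  40, 115,  50],
    [190, 118, 205, 125],
    [105,  35, 100,  45],
]
print(debayer_nearest(mosaic))
```

Doing this per pixel is embarrassingly parallel, which is why it maps so well onto a GPU – and why dedicated hardware like the RED Rocket becomes optional.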
Software that recognizes mood is apparently already used in call centers; it could also be used to derive metadata for pre-post logging (and story derivation). Imagine a keyword collection for “happy” or “stressed” or whatever mood happens to be demonstrated in the audio content. I’m not sure whether the technologies are related, but Affectiva has demonstrated emotion-detecting software in the past.
With speech-to-text, keyword extraction, and mood detection, the basic logging of reality and documentary footage could be done in pre-post and handed to the editor.
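As a hypothetical sketch of that hand-off – the clip data is invented, and the "mood" field is assumed to come from an external emotion classifier such as the call-centre software above – derived metadata maps naturally onto keyword collections:

```python
from collections import defaultdict

# Hypothetical per-clip logging metadata, as pre-post tools might emit it.
# The "mood" field is assumed to come from an audio emotion classifier.
clips = [
    {"name": "interview_01", "mood": "happy",    "keywords": ["family", "wedding"]},
    {"name": "interview_02", "mood": "stressed", "keywords": ["deadline"]},
    {"name": "broll_03",     "mood": "happy",    "keywords": ["beach"]},
]

def keyword_collections(clips):
    """Group clip names into collections keyed by mood and by keyword,
    mimicking FCP X-style keyword collections built from derived metadata."""
    collections = defaultdict(list)
    for clip in clips:
        collections[clip["mood"]].append(clip["name"])
        for kw in clip["keywords"]:
            collections[kw].append(clip["name"])
    return dict(collections)

cols = keyword_collections(clips)
print(cols["happy"])  # -> ['interview_01', 'broll_03']
```

The editor then starts from a bin of “happy” moments rather than a pile of unlogged media.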
After Terry Curren’s roundup of last year’s Hollywood Post Alliance Retreat I decided I should attend this year. While I was working on marketing for Lumberjack – our real-time location logging tool – I got an email from the HPA offering spaces in the demo room during the retreat. It was immediately obvious that this was the time and place to reveal what we’ve been working on for the last 8-9 months.
I have a strong interest – personally and professionally – in automating the boring parts of post-production away from humans to computers, extending to some of the basic string-outs. This seems to infringe on the “human” role in post-production, at least according to some of my associates. Lately, though, I’ve come across a whole range of stories about how traditionally human roles, like doctors (and assistant editors), can or will be automated out of existence. That’s led me to think about what the essential, non-automatable role of the human actually is. It’s not a simple question.