I cut a short 30-minute competition entry for a friend today: a relatively simple single-take green screen over a Pond5 background she purchased.
Except we used a bunch of technologies that were all non-existent just a few years ago.
Starting with some Blackmagic Design ProRes files, we:
- Sped up the talent by about 20% with no visible or audible artifacting
- Keyed out the green background using the built-in keyer in FCP X at its default settings
- Repositioned the talent to better fit the background shot
- Slowed down the background to 66% with no visible artifacting
- Applied a real-time blended mask and Gaussian blur on the background (over an unblurred duplicate, to simulate depth of field)
- Used the Color Board to reduce the exposure on her face, while using a mask so her eyes continued to sparkle
all in real time on a 2015 Retina 5K iMac and Final Cut Pro X.
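The keying step above is worth a closer look. FCP X's built-in keyer is far more sophisticated, but the core idea of a chroma key (each pixel's distance from the key colour, mapped to an alpha value with a soft edge in between) can be sketched in a few lines of toy Python. The threshold numbers here are arbitrary illustrations, not anything FCP X actually uses:

```python
# Toy chroma keyer: alpha from each pixel's distance to the key colour.
# Not FCP X's algorithm, just the basic idea; thresholds are arbitrary.

def key_alpha(pixel, key=(0, 255, 0), tight=60, soft=180):
    """Return alpha in [0, 1]: 0 = fully keyed out, 1 = fully opaque."""
    dist = sum((a - b) ** 2 for a, b in zip(pixel, key)) ** 0.5
    if dist <= tight:       # close to pure green: transparent
        return 0.0
    if dist >= soft:        # far from green: opaque
        return 1.0
    return (dist - tight) / (soft - tight)  # soft edge in between

def composite(fg, bg, alpha):
    """Blend a foreground pixel over a background pixel."""
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))
```

That soft ramp between the two thresholds, computed per pixel, per frame, is the kind of work that used to force a render before playback; doing it in real time is the part that's new.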
It wasn’t that long ago that applying a soft edge to a mask, any Gaussian blur, or any chroma key meant a render before playback.
As in the machine learning/AI field, video technologies keep getting better all the time.
Maybe I’m pushing this subject a bit hard, but I really believe we are on the cusp of a wide range of human activities being taken over by smart algorithms, also known as machine learning. Beyond the examples I’ve already mentioned, I found an article on how an “AI” saved a woman’s life, and another on how AI is being used to create legal documents for homeless (or about-to-be-homeless) people in the UK.
Posted by Philip in Interesting Technology
I’ve been talking about machine learning and smart APIs recently, where I think there is great potential to make pre-editing tasks much easier. But they are not without their downside. They are built on sample data sets to ‘train’ the algorithm. If that training set is not truly representative of the whole data set, then the results will go horribly wrong.
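To make that failure mode concrete, here is a deliberately contrived sketch (the data and the midpoint thresholding rule are invented for illustration; no real ML library is involved): a classifier trained on an unrepresentative sample of one class shifts its decision boundary and starts misclassifying perfectly ordinary examples.

```python
# Contrived illustration of training-set bias (all numbers invented):
# a one-feature classifier whose threshold is the midpoint of the two
# class means goes wrong when one class is sampled unrepresentatively.

def fit_threshold(class_a, class_b):
    """Decision threshold = midpoint of the two class means."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(class_a) + mean(class_b)) / 2

def classify(x, threshold):
    return "B" if x > threshold else "A"

# The full populations: class A clusters low, class B clusters high.
full_a = [1, 2, 3, 4, 5]
full_b = [6, 7, 8, 9, 10]

fair = fit_threshold(full_a, full_b)      # 5.5: separates the classes
skewed = fit_threshold([4, 5], full_b)    # 6.25: trained only on A's high end

# A genuine B example at 6 is classified correctly by the fair model
# but misclassified by the skewed one.
print(classify(6, fair))    # B
print(classify(6, skewed))  # A
```

The algorithm itself is identical in both cases; only the training sample changed, and that alone moved the boundary enough to get real examples wrong.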
Cory Doctorow at Boing Boing uses the Trump campaign as an example of how this can play out in ‘the real world’.
I recently commented on the importance of metadata for rights management during distribution. While cleaning my email inbox I revisited a story from late last year, on how over-the-top content providers (generally niche) can use metadata from social media and other sources to help grow their audiences.
A couple of recent articles have pointed to Artificial Intelligence writing, or contributing to, a screenplay. A narrative script. I find this fascinating, even though my own area of interest in applied AI is in non-scripted.
There is no doubt that computer algorithms – up to true AI – will be involved in production’s future. Smart people will work out how to master them.
Since starting work on Lunch with Philip and Greg, I’ve battled a little with multicam, largely because I’m using it in an atypical way, although I suspect setups like mine will become more common in the future.
My solution was Automator actions, triggered by function keys and activating an AppleScript, so that the mode is first switched to Video Only (for angles 1 or 2) or Audio Only (for angles 3, 4 and 5) before switching to the angle. It reduces a lot of repetitive strain injury potential!
The tutorial is over at FCP.com, but here’s a little background.
We just published a new version of IntelligentAssistance.com. It was time! Each of our host apps – Final Cut Pro X, Premiere Pro CC and Final Cut Pro 7 – has its own tab on the home page, and customized app pages.
While I expect computers to take on more and more of the pre-editing functions, we’re losing the Assisted Editing branding. It made sense when we first used it. Back then we had books, training products, The Digital Production BuZZ as well as the software.
The BuZZ went to Larry Jordan, we discontinued our training products and the books rightly came here to my personal site.
So, Intelligent Assistance Software, Inc makes metadata based workflow tools for Final Cut Pro X, Premiere Pro CC and Final Cut Pro 7. That’s all we do. As a company.
Greg and I are also behind LumberjackSystem.com, of course.
Both this new site and LumberjackSystem.com are built using Adobe Muse, which I really like for this type of sales brochure website. In both sites, store functions are outside the Muse site.
While Augmented Reality and Virtual Reality often get conflated, they are very different beasts.
Avid Media Composer has always been great at tracking metadata. Without accurate timecode, Media Composer would have never become established. Over the years Avid have continued to add metadata support to the app.
Reading an Avid blog on Spanish Broadcaster RTVE’s technical deployment in Rio, I was struck by this:
All of the cataloging and indexing process will be carried out by means of an autometadata software developed by the Corporación itself, which will enable the use of metadata provided by OBS and those selected by TVE’s documentary makers.
I’d love to know more about what the “Corporation” has done. Maybe we can set up a meeting for October when we’ll be back in Barcelona!
Spoiler alert: they use a lot of Avid technology!
Only a few days ago I wrote about smart APIs that developers can use to enhance their apps. Today, John Hauer on TechCrunch postulates that he could not find one job that won’t someday be dehumanized.