In September 2010 Apple purchased Swedish facial recognition company Polar Rose, and today we learn they’ve purchased Israeli startup RealFace: “a cybersecurity and machine learning firm specializing in facial recognition technology”.
The difference between the two purchases is that this latest one is based on machine learning.
…the startup had developed a unique facial recognition technology that integrates artificial intelligence and “brings back human perception to digital processes”. RealFace’s software is said to use proprietary IP in the field of “frictionless face recognition” that allows for rapid learning from facial features.
Another step towards our software identifying and labelling people in our media.
With Lumberjack System we don’t focus enough on Story Mode. Of late, Transcript Mode and Magic Keywords have taken the main focus, and of course the primary real-time logging and pre-editing tools are well known by now.
But Story Mode is ultimately more valuable if the project continues for more than a one- or two-day shoot. Story Mode lets us send Lumberjack-logged Final Cut Pro X Events or Libraries back to the Lumberyard app to create string-outs from all the footage.
This proved very valuable on a recent project: extracting the conversations about Final Cut Pro X from nearly 20 episodes of Lunch with Philip and Greg for an upcoming documentary.
2016 was a year of consolidation and growth for Greg and me: citizenship, green card, artificial intelligence and a house and yard dominated the year. 2017 looks like being another interesting and exciting year.
I’d like to introduce you to our first new piece of software in about two years: FindrCat. FindrCat is an easy-to-use app that converts your Final Cut Pro X Keywords into Finder Tags, so you can then filter and search for your media via the Finder. In a world of Media Asset Management (MAM) and Digital Asset Management (DAM), this is a ‘no M’am’ asset organization tool.
The biggest advantage is that the FCP X keywords now travel with the media files, and will return to FCP X as keywords when re-imported, on any system.
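Under the hood, a Finder Tag is just a list of strings stored as a binary property list in the file’s `com.apple.metadata:_kMDItemUserTags` extended attribute. As an illustration of the mechanism only (this is not FindrCat’s actual implementation), here’s a minimal Python sketch that encodes a set of FCP X-style keywords into that payload:

```python
import plistlib

def keywords_to_finder_tags(keywords):
    """Encode keyword strings as the binary plist that Finder stores
    in the com.apple.metadata:_kMDItemUserTags extended attribute.
    Each entry is a plain tag name (a color index can optionally be
    appended after a newline, e.g. "Interview\n6")."""
    return plistlib.dumps(list(keywords), fmt=plistlib.FMT_BINARY)

payload = keywords_to_finder_tags(["Interview", "B-Roll"])

# On macOS the payload would then be written to the file's extended
# attribute (e.g. via the xattr command or API). Decoding the payload
# round-trips back to the original keyword list:
assert plistlib.loads(payload) == ["Interview", "B-Roll"]
```

Because the tags live in the file’s own metadata, they travel wherever the file goes, which is exactly why they survive a round trip back into FCP X.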
I guess it won’t be any surprise that I have a lot of metadata entered in my Aperture photo library. In fact the lack of metadata support in Photos is the reason I can’t migrate there.
The real value of metadata is to help find photos, but sometimes the right piece of metadata is beyond value. For some paperwork related to my current husband’s ‘green card’ I needed the date of birth of my former wife. I could not remember it, but looking in my photo library, I found a photo taken on her 23rd birthday.
Of course I have the original date set on my photos, even those that I scanned from prints or slides. Because I had added the metadata when I knew it, I now had the all-important date I needed, and was able to file the paperwork.
Never underestimate the value of metadata!
As you probably all know, I have two day jobs heading Intelligent Assistance Software and Lumberjack System. We’re very proud of the work we’ve done through both companies. We make a decent income from them for sure, but what makes us particularly happy is when our tools get people’s work done faster. They get to go home to their families earlier, and production has less drudgery.
So it pleases us greatly when that gets recognized, as it did this trip.
In this latest episode of The Terence and Philip Show, Terry and I discuss metadata, my citizenship, smart APIs, Artificial Intelligence and more.
The extensive article by Steven Levy – The iBrain is Here – is a fascinating read on how Apple are using Machine Learning, neural networks and Artificial Intelligence across product lines. It’s well worth the time to read through, but this quote from Phil Schiller stood out:
“We use these techniques to do the things we have always wanted to do, better than we’ve been able to do,” says Schiller. “And on new things we haven’t been able to do. It’s a technique that will ultimately be a very Apple way of doing things as it evolves inside Apple and in the ways we make products.”
How might this all apply to editing? Speech-to-text; keyword extraction (just like Magic Keywords in Lumberjack System); sentiment extraction; image recognition; facial detection and recognition; speech-controlled editing (if anyone really wants that); and the list goes on.
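To make one of those concrete: at its very simplest, keyword extraction is just counting the significant words in a transcript. This toy Python sketch (nothing like the actual Magic Keywords algorithm, which is far smarter) shows the basic idea:

```python
import re
from collections import Counter

# A tiny illustrative stopword list; real systems use much larger ones.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is",
             "it", "that", "we", "on", "for", "at", "was", "so", "this"}

def extract_keywords(transcript, top_n=3):
    """Naive frequency-based keyword extraction: tokenize the text,
    drop stopwords, and return the most common remaining words."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(top_n)]

transcript = ("We shot the interview at the harbor. The harbor light "
              "was perfect for the interview, so the interview ran long.")
print(extract_keywords(transcript))  # most frequent non-stopword terms
```

Production tools layer part-of-speech tagging, phrase detection and domain models on top of this, but frequency counting is the seed of the idea.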
I’d like to believe the Pro Apps Team are working on this.
UPDATE: Ruslan Salakhutdinov is Apple’s first Director of AI.
I recently commented on the importance of metadata for rights management during distribution. While cleaning my email inbox I revisited a story from late last year, on how over-the-top content providers (generally niche) can use metadata from social media and other sources to help grow their audiences.
Avid Media Composer has always been great at tracking metadata. Without accurate timecode, Media Composer would never have become established. Over the years Avid have continued to add metadata support to the app.
Reading an Avid blog on Spanish Broadcaster RTVE’s technical deployment in Rio, I was struck by this:
All of the cataloging and indexing process will be carried out by means of an autometadata software developed by the Corporación itself, which will enable the use of metadata provided by OBS and those selected by TVE’s documentary makers.
I’d love to know more about what the “Corporation” has done. Maybe we can set up a meeting for October when we’ll be back in Barcelona!
Spoiler alert: they use a lot of Avid technology!