Apple has reportedly purchased machine learning company Turi. The expectation is that Apple will use the technology to improve its other products, most likely Siri. But what is machine learning, and how does it fit with the algorithms I’ve been talking about lately?
Since starting work on Lunch with Philip and Greg, I’ve battled a little with multicam editing, largely because I’m using it in an atypical way, although I suspect setups like mine will become more common in the future.
My solution was Automator actions, triggered by function keys, each activating an AppleScript so that the mode is first switched to Video Only (for angles 1 or 2) or Audio Only (for angles 3, 4 and 5) before switching to the angle. It reduces a lot of repetitive strain injury potential!
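As a minimal sketch of what those Automator actions do, here is the idea in Python driving `osascript`, rather than raw AppleScript, for readability. The keyboard shortcuts (Shift-1 and Shift-2 for the mode switch, plain number keys for the angle) and the angle-to-mode mapping are assumptions for illustration, not FCP X defaults; substitute whatever shortcuts your own setup uses.

```python
import subprocess

# My setup's convention (an assumption, not an FCP X default):
# angles 1-2 carry video, angles 3-5 carry audio.
VIDEO_ONLY_ANGLES = {1, 2}
AUDIO_ONLY_ANGLES = {3, 4, 5}

def build_applescript(angle: int) -> str:
    """Return an AppleScript snippet that first switches the multicam
    switching mode, then cuts to the given angle. The keystrokes here
    (Shift-1 = Video Only, Shift-2 = Audio Only) are placeholders."""
    if angle in VIDEO_ONLY_ANGLES:
        mode_keystroke = 'keystroke "1" using {shift down}'  # Video Only (assumed shortcut)
    elif angle in AUDIO_ONLY_ANGLES:
        mode_keystroke = 'keystroke "2" using {shift down}'  # Audio Only (assumed shortcut)
    else:
        raise ValueError(f"unknown angle: {angle}")
    return (
        'tell application "System Events" to tell process "Final Cut Pro"\n'
        f'    {mode_keystroke}\n'
        f'    keystroke "{angle}"\n'
        'end tell'
    )

def switch_angle(angle: int) -> None:
    """Run the script via osascript -- this is what an Automator
    'Run AppleScript' action bound to a function key amounts to."""
    subprocess.run(["osascript", "-e", build_applescript(angle)], check=True)
```

Binding each angle to its own function key then just means one small Automator action per angle, each one doing the equivalent of `switch_angle(n)`.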
The tutorial is over at FCP.com, but here’s a little background.
While Apple’s Eddy Cue says they’re not interested in original programming, that’s not 100% true. Cue said that Apple was only interested in the content space when the projects are tied to its existing products.
But it does not explain Planet of the Apps! I think Apple will be interested in original programming as soon as it’s seen to be strategically valuable. It’s a topic I’ve written on before, in a December 2009 article: What if Apple or Google simply bypassed Networks and Studios? That article does some rough financial calculations showing that Apple could fund as much TV show and movie production as they want!
I’m also not the only one who thinks there will be more Apple original programming in the future:
From Variety: Apple Eyes Move Into Original Programming
Fast Company (as part of a bigger piece) suggests Apple is covertly pursuing original television programming deals.
And in yesterday’s Q3 earnings report, Apple CEO Tim Cook said that we have to think about the Apple TV (current generation with tvOS) in a different way.
Cook: The introduction of the Apple TV and tvOS last October and the subsequent OS releases, what’s coming out this fall… think of that as sort of building the foundation for what we believe can be a broader business over time. I don’t want to be more precise than that, but you shouldn’t look at what’s there today and think: “we’ve done what we want to do.” We’ve built a foundation that we can do something bigger off of.
Given that Apple are slowly repositioning themselves to rely more on services revenue, and Apple TV is a foundation… Let the speculation begin.
This certainly isn’t the first time Apple have filed for “Works with” trademarks, and that’s what makes it interesting. Previously these types of trademarks have been for Apple ecosystems, like iPhone, iOS, iPad, CarPlay, AirPrint, et al.
While I have no idea what it might mean – developers have no clues yet – it is interesting that iMovie and Final Cut Pro X are being considered as part of a larger ecosystem. For those who don’t know, these days iMovie is a version of Final Cut Pro X with a simplified interface.
According to an article on MacRumors today, Apple is negotiating with Studios and Producers to create original programming for Apple TV. Two thoughts.
Apple have long created their own content by running music festivals and recording the performances.
It’s been a long time coming, but I think it was inevitable. Back in late 2009 I postulated in What if Apple or Google simply bypassed Networks and Studios? My conclusion then:
Clearly, either Google or Apple could destroy the existing content production industries without borrowing or risking their business. Just what leverage do the current middlemen really have?
It’s a strategy that’s working well for Netflix and Amazon.
I first wrote about derived metadata back at the end of January 2009. Derived metadata uses computer analysis to derive metadata from the video source. There are now technologies for speech-to-text, meaning extraction, facial detection, facial recognition, emotion detection, image recognition, and more. One company has been accumulating these somewhat diverse technologies: Apple.
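The pipeline idea can be sketched as follows. Every analyzer below is a stub standing in for a real technology (speech-to-text, face detection, image recognition); the names, return values, and clip handling are all illustrative, not any shipping API — the point is only the shape of a system that merges many analyses into one derived-metadata record.

```python
from typing import Callable

# Placeholder analyzers. In practice each would call an ML model or
# service; here they just return hypothetical results to show the shape.
def speech_to_text(clip: str) -> dict:
    return {"transcript": f"(words spoken in {clip})"}

def face_detection(clip: str) -> dict:
    return {"faces": 2}  # e.g. number of faces detected in the clip

def image_recognition(clip: str) -> dict:
    return {"labels": ["interview", "indoor"]}  # e.g. scene labels

ANALYZERS: list[Callable[[str], dict]] = [
    speech_to_text,
    face_detection,
    image_recognition,
]

def derive_metadata(clip: str) -> dict:
    """Run every analyzer over the source clip and merge the results
    into a single derived-metadata record."""
    record: dict = {"source": clip}
    for analyzer in ANALYZERS:
        record.update(analyzer(clip))
    return record
```

Adding a new technology (emotion detection, say) then just means appending another analyzer to the list, which is why a company accumulating these somewhat diverse technologies is interesting.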
You may have read that Randy Ubillos – Chief Architect, Video Applications at Apple – retired yesterday after 20 years with Apple. I’ve had the great privilege of meeting him from time to time, and I offer my hearty congratulations on his retirement, the strongest of best wishes for the future, and heartfelt thanks for largely making my career possible.
The question on everyone’s lips is “how does this affect Final Cut Pro X?” My honest thought is “not much”. There are concepts in Final Cut Pro X that clearly came from Randy’s mind, but so did the original Adobe Premiere (versions 1-4.2), the original Final Cut (Pro) (aka KeyGrip at Macromedia), Aperture, where he was lead architect, and iMovie ’08. There were other apps before that, and the full history can be found in Timelines 2 by John Buck.
Randy was also an important part of the team that developed Final Cut Pro X, but more in the role of supervising architect than as part of the detailed work of Product Marketing, App Design, and App Architecture. The people in those key roles remain in them, and I sense nothing that would affect or change the direction Apple are taking with Final Cut Pro X. Final Cut Pro X is in exceptionally good hands moving forward.
As I’ve written before, the tools of creative endeavor will always be part of Apple’s DNA, and therefore I expect we’ll see evolution of the tools over time, but never abandonment. There will always be professional and consumer-level audio, video and photography apps in Apple’s world.
Professionally, I’ve benefited from (writing an unreleased book about) Premiere Pro; from decades of Final Cut Pro classic; and from Final Cut Pro X. I still prefer Aperture over Photos but I’m keeping an open mind that the metadata functions in Photos will improve. For all that, I simply say “Thank you”.
Finally, a little bit of advice from Randy, on taking “holiday videos” (I may paraphrase slightly):
At each location, take out the video camera and shoot a shot. Now, put the camera away and enjoy your holiday and the location in the present.
A very subjective take on NAB 2015 because I spent very little time looking at tech! Instead my focus was on the FCPWORKS demo room and particularly my Lumberjack System presentation on Wednesday. But, of course, NAB is also about the socializing.
CNET are reporting that the February 25th episode (Season 6, Episode 16) was shot with an iPhone 6 and an iPad Air 2 (with a little assist from a MacBook Pro). An iPad Air 2 was my primary “camera” for my family history video shoot back in early January.
Out of the blue, Apple announces Final Cut Pro X 10.1.4, which includes some key stability improvements. There is also a Pro Video Formats 2.0 software update, which provides native support for importing, editing, and exporting MXF files with Final Cut Pro X. While FCP X already supported import of MXF files from video cameras, this update extends the format support to a broader range of files and workflows.
– Option to export AVC-Intra MXF files
– Fixes issues with automatic library backups
– Fixes a problem where clips with certain frame rates from Canon and Sanyo cameras would not import properly
– Resolves issues that could interrupt long imports when App Nap is enabled
– Stabilization and Rolling Shutter reduction work correctly with 240fps video
Jon Chappell of Digital Rebellion has noted that the support for MXF is much wider than just the Pro Apps. What is interesting is that the MXF components seem to be QuickTime based, rather than AV Foundation, probably for historic reasons.