What is Apple doing with QuickTime?
As expected, AV Foundation from iOS 4 will be added to Lion. My take is that this signals the end of QuickTime as we’ve known it. But it’s not only that there’s a new Framework for working with time-based audiovisual media – there’s a lot more to QuickTime than that, and it’s all the interactive and additional technologies in QuickTime that don’t appear to have a future: features that were important when QuickTime MOVs were Apple’s preferred distribution format.
It’s been a very long time since QuickTime was seen, in Apple’s view, as a distribution format. Apple threw its weight behind MP4 (a container largely based on MOV, but NOT the same) with the release of QuickTime 6 in 2002. Since then, distribution has been via MP4: originally using the MPEG-4 Simple Profile, and at QuickTime 7.0.2 with the H.264 codec.
The focus of QuickTime shifted to its use within the OS X ecosystem. There’s unlikely to ever be a 64-bit “QuickTime for Windows”: MP4 is for distribution.
Most people think of the QuickTime Player as “QuickTime,” but if you look carefully it’s a very small application for the power it has. That’s because “QuickTime” is really a series of system-wide, event-based media-handling routines. Your Mac won’t even boot without QuickTime!
The other thing that most people don’t know is that QuickTime handles way more than just “video” playback. It was planned as a very powerful multimedia tool. In many ways it took Flash until version 8 to reach feature parity with QuickTime 3. Most of the features I’m talking about – beyond playback of diverse codecs and formats – are for interactive media:
- QuickTime Vector Graphics (yes like Flash vectors)
- Wired Sprites (clickable interactivity – used extensively in our “Intelligent Assistants” of early this century)
- Sprite tracks (that can be animated around the screen)
- Flash Media (since removed for security reasons when Adobe didn’t “fix” security on older versions)
- Transitions (still available in Final Cut Pro 7)
- Filters (still available in Final Cut Pro 7) including filters like Clouds and Fire that required no media
- Text tracks (although most developers used Flash for text because QT Text didn’t cross platforms well)
- HREF tracks
- Media-free color sources
- SMIL – the basis of our 2001 technology demo that led eight years later to First Cuts.
- Streaming Track (RTSP)
- VR – both objects and panoramas
- Movie within Movie loading, which we used extensively in the Intelligent Assistant for Adobe After Effects – when you drilled down for more detail, that movie loaded within the parent movie.
It was all this functionality that attracted me to QuickTime as a technology, and as a company we certainly took advantage of these features in our Intelligent Assistants. For part of 1998-99 my email sig line included “Accidental QuickTime Evangelist”.
All in all, “QuickTime” – the original 32-bit C API (Application Programming Interface) – had more than 575 Classes and more than 10,000 Methods. (Although you don’t need to know what that means, for simplicity think of a Class as a related collection of Methods, while Methods are the actual functions you would use when writing code.) But sadly, 32-bit APIs have no future in Apple’s world, and Apple changed its mind on developing 64-bit Carbon APIs in 2007.
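For readers who’d like to see that Class/Method distinction concretely, here’s a minimal sketch. This is illustrative Python, not a real QuickTime API – the names are mine:

```python
# Illustrative only -- not a real QuickTime API. A Class groups
# related Methods together; the Methods are the actual functions
# you call when writing code.
class MoviePlayer:
    """A Class: a related collection of Methods for one job."""

    def __init__(self, path):
        self.path = path
        self.playing = False

    def play(self):   # a Method
        self.playing = True
        return f"Playing {self.path}"

    def stop(self):   # another Method
        self.playing = False
        return "Stopped"

player = MoviePlayer("demo.mov")
print(player.play())
```

So when I say the QuickTime C API had 575 Classes and 10,000 Methods, picture 575 collections like this one, each bundling the functions for one area of the media architecture.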
During the development of Tiger, Apple started work on a Cocoa version of QuickTime, with a new Framework (a collection of Classes) to replace the old C API. A few additional Classes and Methods were added to this QTKit Framework, which allowed QuickTime 7 to talk directly to Core Audio and Core Video. (These are the very low-level Frameworks that handle audio and video media.) The primary benefit was to allow graphics card acceleration through Core Video.
Even with the additional Methods added to QTKit in Leopard and Snow Leopard (with apparently more to be added in Lion), the QTKit Framework has 24 Classes with around 360 Methods, mostly covering relatively simple media playback. Anything not covered uses the older C APIs for backward compatibility.
(Many thanks to Chris Adamson’s excellent Mastering Media with AV Foundation for the comparison of Classes and Frameworks. I’m a regular reader of his blog, and follow him as @InvalidName on Twitter.)
I’ve covered before why Final Cut Pro couldn’t continue on the C APIs as it needed to go 64-bit, and why the QTKit Framework didn’t yet have all the functions that a 64-bit Final Cut Pro would need, particularly since so little work had been done on it.
Then I discovered that Apple was building a new framework in iOS for handling media, called AV Foundation, which seemed to handle the requirements of a professional NLE in 64-bit quite well. This seems to be the basis of the announced-but-not-seen Final Cut Pro 8 (or Final Cut Pro 2011), and that’s good. But even AV Foundation, as of iOS 4.2, has 56 Classes and 460 Methods. That’s more than double those in QTKit, but a long way short of the QuickTime C API’s 10,000-plus Methods!
Most importantly for our purposes in professional video, AV Foundation is time based, where QuickTime is event based. It’s an important distinction when your media playback timing has to be within video tolerances. (On a computer, 29.97 or 29.99 doesn’t much matter!)
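To see why those video tolerances matter, consider that NTSC video doesn’t actually run at 29.97 fps – the true rate is the rational number 30000/1001. A time-based framework can store time as an integer count over an integer timescale and stay frame-exact; treating the rate as the decimal 29.97 accumulates error. Here’s a sketch of the arithmetic (my illustration in Python, not Apple code):

```python
from fractions import Fraction

# The exact duration of one NTSC frame: 1001/30000 of a second.
FRAME = Fraction(1001, 30000)

# Whole frames in one hour at the true rate (30000/1001 fps).
n = 60 * 60 * 30000 // 1001          # 107,892 frames

exact = n * FRAME                     # rational arithmetic: frame-exact
approx = n * (1 / 29.97)              # decimal arithmetic: drifts

print(float(exact))                   # one hour of frames, exact timeline
print(approx)                         # the "29.97" timeline
print(abs(float(exact) - approx))    # a few milliseconds off per hour
```

A few milliseconds per hour is invisible on a computer screen, but it’s enough to put an edit on the wrong frame – which is why a professional NLE needs time-based media handling rather than event-based.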
I was resigned to the fact that all the interactivity, filtering, transitions, etc, etc, from QuickTime were gone. At least those parts not appearing in AV Foundation. Then I discovered Quartz Composer!
I had been vaguely aware of Quartz Composer in the past – after all, Noise Industries’ FxFactory and CHV’s QC Integration FX exist for the sole purpose of wrapping a Quartz Composer Composition inside an FxPlug plug-in for Final Cut Pro or Motion. But beyond that, I didn’t know much.
Then, as part of our regular evening entertainment of watching Apple Developer videos (yes, we are that geeky) there were some introductions to Quartz Composer! That’s when I realized that yes, those QuickTime technologies were dead, but that there was a much better replacement in Quartz Composer. Even better, Apple supplies an authoring tool.
There was a time when choosing the preferred interactive authoring environment was a sufficiently close call that GoLive had an Interactive QuickTime authoring tool when Adobe purchased it. I believe that one of the reasons Flash got traction over Interactive QuickTime was that Macromedia had a business model in selling the tools. For Interactive QuickTime, tooling was left to third parties: Totally Hip’s LiveStage Pro, Electrifier Pro and GoLive. (All of which we used.) There wasn’t a business model for QuickTime within Apple, and no authoring tool officially sanctioned by Apple.
But with Quartz Composer, Apple supplied a development tool. (You do need to download the free Developer Tools.) Quartz Composer is amazingly powerful. It’s node based and reasonably accessible to non-programmers – you don’t have to write any code of any kind, except for custom expressions. It can take data input and manipulate visuals. It can mix all types of media, including text, and generate text dynamically based on data inputs. Compositions can be used in Applications, run stand-alone, or be saved as a movie from QuickTime Player 7.
There are standard compositions that are shared system-wide for use in any application. Pixelmator’s filters are Quartz Composer Filters in the standard System location.
In iMovie ’11, the filters, transitions and templates are all Quartz Compositions! Each Trailer template is a single Quartz Composition. They are very cool.
So, two conclusions. Clearly the features not already added to QTKit for Lion will never be part of QuickTime moving forward. In its place is a much more flexible and powerful technology called Quartz Composer, integrated at the System level.
My other conclusion. I would be very, very surprised if – in addition to FXPlug – Final Cut Pro 8 (or 2011) did not support native Quartz Compositions as filters and transitions.