Categories
Apple Metadata Video Technology

How is Apple using metadata in iMovie for iPhone?

I was finally watching the Steve Jobs Keynote from WWDC on June 7. (I know, but this was our second try – we get talking about stuff, what can I say?) I got to the iMovie for iPhone 4 demo and was blown away by the creative use of source metadata.

At 58 minutes into the keynote, Randy Ubillos is demonstrating adding a title to the video he’s editing in iMovie, and iMovie automatically adds the location into the title. Not magic: it’s simply reading the location metadata stored with images and videos shot with an iPhone and using that to generate part of the title. This is exactly how metadata should be used: to make life easier and to automate as much of the process as possible.

Likewise, the same metadata draws a location pin on the map in one of the themes, exactly as it does in iPhoto.

In a professional application, that GPS data – which is coming to more and more professional and consumer video camcorders – could not only be used to add locations, but also to read what businesses are at the address. From that source and derived metadata (address and business derived from location information) we can infer a lot.
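To make the idea concrete, here’s a minimal sketch of the arithmetic involved (Python purely for illustration; the coordinates and function name are my own, and the reverse-geocoding step that would look up businesses is deliberately left out, since it needs an external service):

```python
# Hypothetical illustration: how an app could turn the GPS metadata stored
# with an iPhone clip into something human-readable. EXIF GPS tags store
# latitude/longitude as degrees/minutes/seconds plus a hemisphere reference;
# converting to decimal degrees is simple arithmetic.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

# Burbank, CA is roughly 34° 10' 48" N, 118° 18' 36" W
lat = dms_to_decimal(34, 10, 48, "N")
lon = dms_to_decimal(118, 18, 36, "W")
print(round(lat, 4), round(lon, 4))  # 34.18 -118.31
```

From there, a reverse-geocoding service maps the decimal coordinates to an address or business name, which is the derived metadata discussed above.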

Check out my original article on metadata use in post production; for a more detailed version, with some pie-in-the-sky predictions of where this is going to lead us, download the free Supermeet Magazine number 4 and look for the cover article, The Mundane and Magic Future of Metadata.

Categories
Business & Marketing Item of Interest Metadata

Seven hours from feature request to product update!

Seven hours from feature request to updated application released: Sync-N-Link now uses log notes from video *or* audio. http://bit.ly/aqcxN7

I love being a small independent software developer: it’s great to be able to respond to customer requests promptly – and it makes the software better. Incidents like this one today also make me appreciative of the communication tools we now have.

Sometime overnight (our time), a new customer in Belgium bought a copy of Sync-N-Link to sync rushes for 8 episodes of a new drama series. A few hours later he emailed to say that it was doing everything he expected, but their sound guy had entered metadata (log notes) into the sound clips, and Sync-N-Link (like Final Cut Pro itself) discards audio metadata in favor of the video metadata. (In a merged clip there is only room for one of each type of log note/metadata.) The feature request was that the metadata from the audio could be preserved instead of that from the video.
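Conceptually the change is tiny: a merged clip has one slot per log-note field, so the sync tool has to pick a winner for each field. This toy Python sketch (my own illustration, not Sync-N-Link’s actual code; the field names are made up) shows the before-and-after behavior:

```python
def merge_log_notes(video_notes, audio_notes, prefer_audio=False):
    """Build merged-clip log notes: one value per field, video wins by default."""
    # Start from the losing side, then let the winning side overwrite clashes.
    merged = dict(video_notes if prefer_audio else audio_notes)
    merged.update(audio_notes if prefer_audio else video_notes)
    return merged

video = {"Log Note": "MOS take", "Scene": "12A"}
audio = {"Log Note": "Boom clipped at head", "Take": "3"}

print(merge_log_notes(video, audio))                     # video wins: the old behavior
print(merge_log_notes(video, audio, prefer_audio=True))  # audio wins: the new option
```

Fields that exist on only one side survive either way; only clashing fields need a winner.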

A good request. The ever efficient Greg Clarke, after morning coffee, got to work. At around 1:30 pm (Pacific) an update was published, ready for download, with the feature added. Not quite seven hours from feature request to released software.

I love that we can do that.

If you use any of our software let us know what more you want it to do. We can be very responsive!

Categories
Apple Pro Apps

Why Apple Insider couldn’t be more wrong!

Today Apple Insider got the echo chamber of the Internet buzzing, with their post Apple scaling Final Cut Studio apps to fit prosumers by Prince McLean. It’s a great headline and I can’t blame Prince McLean and Apple Insider for running with it: it’s bound to get them a whole bunch of links.

However, they couldn’t be more wrong. Factually, they have the entire history of Pro Apps at Apple just plain wrong. That’s probably because Prince McLean isn’t exactly well known in the professional video communities, and because that history is only known by those who were paying attention at the time. (And also, Apple have definitely encouraged the inaccurate version of the Pro Apps history mistakenly quoted at Apple Insider.)

More on that in a minute. Aside from the factual errors in the history, I think they have had some data from an insider that they’ve totally misinterpreted and the true interpretation is incredibly positive for Final Cut Pro.

Now for the standard disclaimer. I’m not a rumor monger. I gather data from a lot of different places; have watched the professional video software industry closely on a day-to-day basis; and am very good at interpreting and interpolating meaning from the data points. And I do have a way-above-average history of accuracy in my predictions, something that cannot be said for Apple Insider (G5 Powerbook anyone? Where’s my Final Cut Extreme, Apple Insider?)

In the late 1990s Macromedia were going head-on against Adobe: whatever Adobe could do, they could do better. There was Freehand against Illustrator; Fireworks against ImageReady; Dreamweaver against GoLive; and there was to be KeyGrip against Premiere. In fact Macromedia snagged the three core members of the development team for Premiere 1-4.2, and they started work on KeyGrip. KeyGrip had evolved to become Final Cut by NAB 98, where it was being shown in a small demo room in the basement. That was my first exposure, and I still have the T-shirt (which fits a much younger man).

Macromedia suddenly decided to stop fighting Adobe and jump on this new thing called the Internet. Good call. So Macromedia had no need for Final Cut and were in fact shopping it around before NAB 98. Media100 – who had planned to use KeyGrip on PCs with their Vincent card, but became so frustrated with how far behind schedule it was that they went on to develop Finish instead – passed on buying Final Cut, probably because of that history. 1998 was the year that Media100 launched a Windows app. Premiere had gone cross-platform at 4.2, and Premiere 6 was developed for both platforms.

This was the year that it wasn’t looking all that good for Apple. NAB was PC all the way. Even Avid had endured the “we’re going only to PC” debacle/rumor/whatever.

Apple eventually purchased Final Cut about three weeks after NAB, in reality to ensure that there would continue to be a non-linear editing application on the Mac. I also believe that someone figured that Apple’s FireWire port (they developed the technology), combined with the iLink on Sony’s just-released DV cameras (in reality, also FireWire) and the new software, could sell some Macs. That was a smart move. When I saw Final Cut in March 98, it was working with some Targa dual-stream cards, and was not as robust as when Final Cut Pro was released at NAB 99. But Final Cut Pro had native FireWire/DV support: perfect with those new Blue and White G3 towers with native FireWire!

But Apple bought Final Cut Pro as a defensive (and marketing) move. I seriously doubt that there was a cohesive Professional Applications Strategy in 1999. Or 2000. But by NAB 2002 there had been some serious planning going on. By then (or shortly before) there was definitely a Pro Apps strategy in place. (If I recall correctly, largely attributable to Richard Kerris.)

I do know that the Final Cut Pro team were a whole lot more open then than they are now. It was a different time at Apple. I’m very confident, from conversations at that time, and when Apple went on the Pro Apps buying spree, that the strategy of a Pro Apps group came well after the Final Cut Pro purchase. When Apple saw how successful Final Cut Pro had become, and how valuable its nascent involvement in the professional film and television world was for selling iMacs with iMovie in the heartland, a Pro Apps strategy evolved.

And Apple went on a buying spree:

  • Emagic (Logic, Logic Pro, GarageBand and Soundtrack Pro have evolved from that purchase)
  • Prismo Graphics for “LiveType” (a Cocoa version of India Pro)
  • Nothing Real (Shake) and Silicon Grail
  • Astarte (DVD Studio Pro 1-1.5)
  • Spruce (DVD Studio Pro 2 onward)
  • The Motion team, who had previously created Combustion and its ancestors (well, they had just been let go from discreet, and Apple employed the whole team, so technically Motion was developed by Apple employees)

and so on.

Apple have poured a lot of money into the Pro Apps and in turn it’s made them a lot of profit on the software division. “Highly Profitable” according to one very reliable source.

So, to the substance of the Apple Insider rumor: is Apple turning Final Cut Pro into Final Cut Prosumer? Let’s consider some data points.

  1. Apple does not like to be second best in anything. Consider DVD Studio Pro. DVDirector, the product they purchased Astarte for, was released by Apple as DVD Studio Pro 1 – effectively DVDirector 2.0. There was a 1.5 release, but DVD Studio Pro was not getting the professional respect that Apple hoped for. (Was that polite enough?) So they purchased Spruce. Although the Spruce product was PC-only and immediately killed, Apple bought the best available, knowledgeable engineering team and abstraction-layer code. This became DVD Studio Pro 2, with the Pro Apps kit interface (the first app with that interface framework). They genuinely want Final Cut Pro – or its successor – to be a truly great application for their target market, which may not be senior editors on studio pictures!
  2. Apple derives a lot of benefit from the Pro Apps.
    1. The division is highly profitable. (If 500,000 users upgrade a version of the Studio at roughly $300 each, that’s $150 million.) Not iPod territory but respectably profitable. (And they do help sell some of those expensive Mac Pros.)
    2. The technology is now interwoven throughout their iApps.
    3. There is a huge marketing advantage from the Pro Apps, such that it’d be worth keeping them if they were only just profitable. Every time a documentary is nominated for an Academy Award edited on Final Cut Pro, Apple sell 10,000 copies of Final Cut Express and an iMac or MacBook Pro in the heartland – it’s aspirational but affordable.
  3. Apple are pushing all their applications to 64bit and to Cocoa. Final Cut Pro has a harder-than-most development path because of the history (cross platform app to OS 9 to OS X to Intel and now to Cocoa and 64bit).
  4. Apple need to come out with a very strong version at the next release. Avid have been very strong with their recent Media Composer releases, particularly with workflow features that editors appreciate (a better open timeline than Final Cut Pro, for example). Adobe have just released a version of Premiere Pro that leverages Apple’s hardware for performance far better than Final Cut Pro does. Apple know this.
  5. Apple has the financial resources to wait until something is right, rather than release a half-finished version.
  6. Apple does not leak. OK, I think I’ve substantiated that Randy Ubillos is back in a senior designer position (or more) but really, Apple employees don’t leak. They’re my worst source of information that isn’t necessarily public knowledge. Randy, for those who don’t know, was one of those original three that went from Adobe to Macromedia: he was the original designer of Premiere 1-4.2. He is also the lead designer for Aperture and iMovie 09 was almost a personal project before Apple picked it up.
    1. So Apple Insider have not had a review copy of any development version of Final Cut Pro (next); it’s almost certain they don’t have any substantial information at all, just a snippet. Perhaps a quick view of an interface or mockup? There isn’t anything substantial in the article.

Ok, given all that, here’s why I think Apple Insider are about as wrong as anyone could be. They got something: a tip or a sneak peek or something. The most likely thing that could lead to this type of misinterpretation is that they saw – or more likely someone visiting Apple saw – a screen supposedly from the next version of Final Cut Pro, and it looked, superficially, like iMovie. Combine that with Randy Ubillos’ move back to Final Cut Pro and the leap is obvious, but wrong.

Apple appear to be revising the Pro Apps kit from its original incarnation in 2002-03. We’ve seen hints of more HUD-like interface design in places (the white-on-black look of Motion’s floating palettes), and that look is very similar to iMovie 09. They’re looking for designers now. It’s likely that whatever the current interface design is, it’s not there yet, or they wouldn’t be hiring designers now!

Let me go out on a limb and say that it much more likely means that Final Cut Pro is getting a very thorough rewrite. Not just a 64-bit/Cocoa rewrite (one that hopefully takes advantage of modern OS X features) but a complete rethink.

When iMovie 09 was demonstrated at LAFCPUG, there were a lot of people who wanted iMovie features incorporated into Final Cut Pro. Not dumbing Final Cut Pro down to iMovie, but taking the best features of iMovie and incorporating them. While you’re at it, if nothing’s sacred in the current design, let’s take the best from Avid (metadata management – the groundwork has been happening since FCP 5.1.2 and the evidence is in the XML); the importance of performance from Adobe (strap in Grand Central Dispatch and OpenCL and make a showcase for Apple’s technologies); and the best of iMovie.

This actually makes me much more hopeful and positive for the next version of Final Cut Pro. It suggests that Apple are serious about rewriting and not just changing out the minimum possible. And if it looks a little like iMovie 09, that wouldn’t be all bad. (But could you borrow customizable interfaces from Adobe, please?)

That’s why I believe Apple Insider misinterpreted the snippet of information and that the opposite is true: Apple are serious about making the next release the killer release everyone is hoping for.

Above all else, I reserve the right to be wrong. It’s a guess: an intelligent guess, but a guess.

Categories
Interesting Technology Item of Interest

XDCAM continues to take over acquisition

XDCAM continues to take over low cost acquisition http://bit.ly/cRUGBz JVC, Sony and Canon have XDCAM cameras, and now there’s a standalone recorder.

The new FS-T1001 Camera Mount XDCAM EX recorder records to SxS in the same format as XDCAM EX devices for maximum compatibility. I guess it could also work with the SD card adapters for SxS that are available.

XDCAM EX is now the most popular recording format among new cameras: JVC’s GY-HM100 works with the 35 Mbit/sec version; Canon’s XF305 and XF300 cameras work with an MPEG-2 codec identical to the 50 Mbit/sec version of XDCAM; and of course Sony has the EX1 and EX3 recording XDCAM EX format.

Affordable acquisition formats have settled into two camps: the XDCAM EX world of 35 or 50 Mbit/sec long-GOP MPEG-2, or the AVCCAM/AVCHD world of long-GOP H.264 (MPEG-4). H.264 is approximately 4x more efficient than MPEG-2 – it delivers comparable quality at a fraction of the bitrate – but it takes more processing power to decode.
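For a sense of what those bitrates mean in practice, here’s a quick back-of-the-envelope storage calculation (a simple sketch; container overhead and audio are ignored):

```python
def gb_per_hour(mbit_per_sec):
    """Storage for one hour of video at a given bitrate, in gigabytes."""
    return mbit_per_sec * 3600 / 8 / 1000  # Mbit/s -> Mbit/hour -> MB/hour -> GB/hour

for label, rate in [("XDCAM EX 35 Mbit/s", 35),
                    ("XDCAM 50 Mbit/s", 50),
                    ("AVCHD 24 Mbit/s", 24)]:
    print(f"{label}: {gb_per_hour(rate):.2f} GB/hour")
# XDCAM EX 35 Mbit/s: 15.75 GB/hour
# XDCAM 50 Mbit/s: 22.50 GB/hour
# AVCHD 24 Mbit/s: 10.80 GB/hour
```

So even the 50 Mbit/sec flavor fits more than two hours on a 64 GB card, which is part of why these long-GOP formats took over affordable acquisition.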

Categories
Metadata

Why is metadata so important?

I have recently been scanning in slides and negatives from my young adult days, and I’m really wishing I’d entered more metadata at the time. That would have been too easy. Anyway, I thought it was worthwhile examining the different metadata I found/used in indexing these old slides.

Note: Apparently I believed I’d remember the people in my pictures forever! While I remember the people, names that are not noted somewhere have evaporated over the 30 or so years in between.

The first, and most common, source of metadata was “Added” metadata: notes made on the slide mount, a carrier, or a slide-sheet heading. Apparently in my early 20s I could write in 6-point text, which I’m having trouble reading with much older eyes. Regardless, Added metadata has proven to be the most valuable.

The brief added metadata is usually combined with some “Analytical” metadata. Analytical metadata is metadata we get from analyzing visual content in the image. For example, this picture:

was one of a group of 20 on the same sheet with the single word ‘Burbank’ in the heading. In fact only two of those images were in Burbank. But using Analytical metadata – the general location and the street sign near the traffic lights – the shot is clearly on Burbank Blvd, on the rise to the overpass over the 5 Freeway and railway lines. (The sign is for Front St.)

Coincidence number 1: this is less than a mile from where I now live.

The other very useful metadata was stamped on the slides themselves: the month and year of processing, which locks down an approximate time scale. Also useful was the fact that I’d numbered all my files sequentially from the beginning of 1973 (my year in Japan). That sequential metadata made it much easier to identify specific times, in combination with the stamped-on date.

A combination of Added and Analytical metadata led me to the discovery that most of my 1976 trip was spent in the West San Fernando Valley. The location of an awards ceremony was identified (by church name), and a street shot that was clearly looking across the SF Valley was also placed. Both turn out to be incredibly close to our Tarzana office and Woodland Hills home (2001-2005, before we moved to Burbank).

This jogged my memory that we had spent a lot of time in a school hall for rehearsal/training and enjoyed a close-by Denny’s. Taft High School (Ventura Blvd at Winnetka) has a Denny’s across the road.

So it is entirely possible that during my 1976 trip to the US, I spent the majority of my time around the area that was to become home a quarter of a century later; and some of the rest of my time near my 30-years-in-the-future home in Burbank.

Crazy. And without metadata I’d have never remembered.

Categories
Interesting Technology Item of Interest

$9.99 Video editing app for iPhone 3GS

$9.99 Video editing app for iPhone 3GS http://bit.ly/choXm9 Only for video shot on the phone but for newsgathering or mobile video blogging, it would be perfect.

Features:

  • Use the onboard camera from within the App, or import H.264 video or pictures already saved in the Camera Roll.
  • Edit with VeriCorder’s unique, multitrack, patent-pending editor.
  • Send smaller files ( < 10MB ) by email, or share larger files with your computer over a WiFi network.
  • Export completed video projects into the Camera Roll, for simple posting on YouTube or MobileMe accounts.
  • Software includes a multitrack audio editing suite, with advanced features, including volume curves, sound mixing, gain control and more.

Categories
Apple Distribution Media Consumption Video Technology

What is it with Flash?

I’ve just been reading my daily round of news, and there’s still more on the whole “Flash v HTML5” or “Flash v H.264” thing and I’m just arrogant enough to believe I can contribute something here.

Flash is an interactive player that produces a consistent result across browsers and platforms. That’s why publishers like it. But most Flash use is at a very basic level: a simple video player. That is also why early QuickTime interactive programmers liked to use Flash (yes, as a QT media type) for controls and text, as QT text did not display consistently across platforms.

Flash is a player and not a codec or file format. The current iteration of the Flash player plays:

  • the original “Flash video” format, which is sequential JPG files, up to 15,000 a movie
  • Sorenson Spark, the first real video codec for Flash; based on the very ancient H.263 videoconferencing codec it did not produce good video quality.
  • On2 VP6, a good, high quality codec now owned by Google with their purchase of On2. Still not a bad choice for Flash playback if you need to use an alpha channel for real-time compositing in Flash.
  • H.264 in MP4 or MOV (with limitations) format. Licensed from Main Concept (now owned by DivX).

Note that those same H.264/MP4 files can be played on Apple’s iDevices using the built-in player; or using the <video> tag supported by HTML5 in Safari or Chrome (and IE9 coming sometime).
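For simple playback that reaches Safari, Chrome and the iDevices, the markup really is minimal. A sketch (the file name is a placeholder, and the fallback content only shows in browsers without <video> support):

```html
<!-- Native playback via the HTML5 video element: no plug-in required.
     Browsers that can't play H.264/MP4 fall through to the inner content. -->
<video width="640" height="360" controls>
  <source src="clip.mp4" type="video/mp4">
  <p>Your browser doesn't support HTML5 video.</p>
</video>
```

That fallback slot is also where publishers commonly nest a Flash player, so one page can serve both worlds.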

Flash as a simple video player is probably dead in the water. Flash for complex interactivity and rich media experiences probably will continue for a while, at least until there are better authoring environments for the more complex interactivity provided in “HTML5”.

That brings me to HTML5, which is not a simple player but a revision of the whole set of HTML tags supported by browsers. It allows native video playback by the browser without a plug-in (the <video> tag); local storage (similar to Google’s temporary Gears offering, now replaced by HTML5 support); and a whole bunch of other goodies. Add to this CSS for complex display (and I mean complex – mapping video to 3D objects in the browser, for example); Javascript for interactivity and connectivity to remote servers/databases; and SVG (Scalable Vector Graphics) for creating graphic elements in a browser (useful for interface elements in rich media).

Javascript used to be very slow and not even comparable to the speed of interactivity possible in Flash, but over the last three years all Javascript interpreters have become massively faster, making complex software possible in the browser. (Check out Apple’s implementation of iPhoto-like tools in their Gallery – online version.)

Summing up: HTML5/CSS/Javascript is already very powerful. Check out Ajaxian for examples of what is already being done. For simple video playback, Flash is probably not the best choice; MPEG-4 with H.264 video and AAC audio probably is. For rich interactivity targeted at anything Apple, build it with HTML5/CSS/Javascript – it’s the only choice. It is also a powerful one: Apple’s iTunes Albums are essentially HTML5-based mini-sites; iAds are all HTML5/CSS/Javascript based and not lacking in rich interactivity or experience.

If you’re building a rich media application to connect with a web backend targeting mostly desktop computers, then Flash could still be the best choice.

For building Apps for iPhone and iPad: use the Xcode tools Apple provides for free. While Adobe might be complaining to the Feds looking for “anti-trust” sympathy, they won’t get it, as Apple is nowhere near dominant in any market – which has to be proven before the question of whether they have abused a monopoly position even arises. Apple are not the dominant smartphone manufacturer, nor the dominant MP3 player maker, nor the dominant tablet manufacturer. (Ok, they probably are dominant in MP3 players and tablets, but they are not, by definition, a monopoly, and Apple will work very hard to ensure they never are.)

Categories
Apple Video Technology

There’s no QuickTime on Apple’s Mobile Devices Either!

In the discussion about Flash-on-iDevices following yesterday’s post, it occurred to me that not only is there no Flash on the iPhone, et al., but there is no QuickTime either!

Not what QT was, at least. The iDevices support H.264 video and AAC audio, primarily in an MPEG-4 file wrapper (although some devices will play H.264/AAC in a MOV wrapper), which is really not what QuickTime has been. (More below.) Try playing a Sorenson video file on an iPad. What about QuickTime interactivity (Wired Sprites)? Ever seen a QT VR play on an iPhone?

Of course not. QuickTime is not supported on any Apple device other than desktop and laptop computers. I also believe that the QT I loved and evangelized heavily late last century is destined for the scrapheap. It’s been increasingly obvious since around 2001/2002 that Apple decided the future of web video was MP4: open standards. Initially they supported the MPEG-4 Simple Profile (just “MPEG-4” in Apple’s world) in QuickTime 6, and then H.264 – the Advanced Video Codec from MPEG-4 Part 10.

Now, a lot of MPEG-4 is adopted from QuickTime. Apple donated the QT container to the MPEG group for consideration as their container format. Because of that, MPEG-4 can do pretty much anything that QT could do, except there are very few implementations of anything beyond basic video playback. So with the QT container at the center of MPEG-4, it was easy for Apple to adopt and support this (at the time) evolving technology.

So QuickTime became the pre-eminent MPEG-4 player. When it came to the Apple TV, iPhone, iPod touch and now iPad, the decision was made to support only simple MP4 playback. When QuickTime X was announced, it referenced “the experience of iPhone video”, suggesting that QuickTime X was a different approach. Now that it has been released, it’s clear that QuickTime X will be the next generation of consumer-facing video playback.

So I expect that QuickTime X will never get the advanced features that QuickTime currently has. There’s no business model for it within Apple, which was always the problem with QuickTime. Frankly, the fact that Apple never provided a development environment is why Flash was able to “take over” so quickly. Remember that in QuickTime 6, Flash 5 was a supported media type. (Support was dropped because of security concerns with that version of Flash.) It took Flash until version 8 to equal all the features of QuickTime 3! (Seriously.)

Few people made use of the advanced features of QuickTime. Our Australian company was one of them, making all the movies for the DV Companion for Final Cut Pro, and most of the other Intelligent Assistants, with QuickTime wired-sprite animations so the file size was acceptable. We were in the era of small hard drives, after all. There was never a development environment from Apple; Totally Hip stepped up with the development environment we used (LiveStage Pro). Had there been a business model within Apple for QuickTime, then the story of the web would have been different.

The advanced features in QuickTime have had no development since, well, QuickTime 4. I believe, without proof, that there was a fundamental shift within Apple around that time to abandon the features they could get no return on, and make QuickTime the best MPEG-4 player, a great architecture for creating media, and the foundation of their total media strategy – without the advanced features, because by this time Flash had “won” the interactivity war.

Now we can have better interactivity using features from HTML5, Javascript and CSS, which are all web standards overseen by a body outside of one company. It’s not just Flash that won’t see the iDevices, but any resemblance to the old QuickTime won’t make it either.

And I’m OK with that. QuickTime – MOV distribution – served Apple well and continues to power their iLife applications and professional video and audio applications, but without the features that it had and no longer needs. Apple are always “good” at dumping technology that no longer meets their needs. I think it’s one of Jobs’ strengths.

I also believe Apple are being consistent by not allowing Flash: it’s on a par with their own technology also not getting on the platform.

Categories
Apple Pro Apps Interesting Technology Video Technology

What are my thoughts on NAB 2010?

By now you’ve likely been exposed to news from NAB – at least I hope so. If not, head over to Oliver Peters’ blog and read up on what you missed. Rather than rehash the news, I’d like to put a little perspective on it.

Digital Production BuZZ

The little show that I co-created nearly five years ago – after a successful five years with DV Guys (although I was only managing editor for the last three years of that show) – has now been the official NAB podcast for 2009 and 2010. Big props to Larry Jordan, Cirina Catania, Debbie Price and the amazing team they put together for NAB 2010. I filed some special reports, which you can hear among the more than 70 shows the team pulled together in the six days of NAB.

3D Everywhere

Whether it’s Panasonic’s “Lens to Lounge” or Sony’s “Camera to Couch”, 3D was everywhere – everywhere except actually being able to do something with all the 3D content we’re being pushed to produce. I’m aware that the top-grossing movies last year were 3D, and that 3D movies perform better than 2D. I just don’t see that as being relevant to my universe, where I don’t distribute my work through a major studio to 2000 cinemas.

So short of that, where’s the outlet for all the 3D? YouTube plays 3D (but is incredibly hard to monetize). The Blu-ray 3D spec is finalized, but no players, burners or encoders are shipping.

While I have no real quibble with the cinema experience – although films need to be designed for 3D, and shot with 3D in mind, to be successful 3D experiences (and few are) – I am very skeptical about 3D in the home, at least for the next couple of years. The problem of the glasses (I multitask a lot of the time while watching TV; what about visitors, or preparing dinner?) and the very different nature of 3D viewing seem to limit its future in the home to those who have dedicated home theaters and dedicated, monotasking viewing time.

The missing Apple

Of course, if you’re a regular reader you’ll know it came as no surprise that Apple wasn’t at NAB. They don’t do trade shows any more so it was highly unrealistic to expect anything at NAB this year, next year, or any year. When they have something to announce, they’ll announce it.

You’ll also be aware that I believe Apple is doing a lot of what they need to do with Final Cut Pro to make it the “awesome” release that Steve Jobs tells us it will be. Maybe 2011 some time, but more likely early 2012 for the next awesome Final Cut Studio release. Or whenever Apple is ready!

Avid Media Composer 5 and editing in the cloud

The new (current) management at Avid certainly appear to be right on track. Media Composer 3.5, 4 and now 5 have all been great releases. The more of this management team’s work that reaches the public, the more I see the company back on track.

In fact, hearing “interoperable” and “openness” sprinkled regularly into the press event and marketing materials seems slightly out of character for the old Avid, but is very welcome. Direct editing of QuickTime, HDSLR or RED media via AMA for quick-turnaround content is a huge advancement. Improvements to audio filters (and eventual round-tripping to a future version of ProTools) are long-standing requests from Avid’s customers. Even the “expensive” monitoring (output only) requirement has gone, thanks to support for an MXO Mini for monitoring. (I wish that had been an option back in January – it would have saved a client of mine about $18K!)

While only a “technology demonstration” at this point, Avid’s “edit in the cloud” (i.e. over the Internet or from a local server) looks like the real deal. Scott Simmons has a review of the demo over at Pro Video Coalition. Avid is back, and we like it.

Adobe CS5

I doubt there’s much to add to Adobe’s CS5 announcements. The Mercury Engine is a major step forward in performance and it will take the others a while to catch up. To be competitive Apple would have to rewrite FCP to 64 bit and then implement Grand Central Dispatch and OpenCL to deliver that level of performance (and that’s what I expect they’re doing). Adobe’s platform-agnostic code (at the core) has made it easier for them to move to 64 bit, and tight integration with Nvidia’s CUDA engine, on top of some mighty software optimizations, gives the performance boost.

The whole Master Collection is a must-have for post production for After Effects, Encore, Photoshop and Illustrator alone. Premiere Pro is a bonus and could well become the Swiss Army Knife of editing tools as it supports pretty much any format natively.

Pick of the show

The pick of the show for me is, without a doubt, Get: phonetic search for Final Cut Pro. Search your clips for specific words wherever they occur. It takes the exact opposite approach to Adobe’s transcription (although that can be boosted by feeding it a script in CS5): Get does not attempt to derive meaning from the waveforms that make up the audio. Instead it predicts what the waveform for your search terms should look like, then tries to match it in your media.
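That description is essentially template matching: synthesize a reference signal for the query, slide it along the recording, and keep the best-scoring position. This toy Python sketch illustrates the general idea only; it bears no resemblance to Get’s actual signal processing:

```python
def best_match(haystack, needle):
    """Slide `needle` along `haystack`; return (offset, score) of the best
    dot-product match. A crude stand-in for waveform template matching."""
    best_offset, best_score = -1, float("-inf")
    for offset in range(len(haystack) - len(needle) + 1):
        score = sum(h * n for h, n in zip(haystack[offset:], needle))
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset, best_score

# A "recording" with the query's predicted waveform embedded at offset 3
query = [1.0, -2.0, 1.5]
audio = [0.1, 0.0, -0.2] + query + [0.3, -0.1]
offset, score = best_match(audio, query)
print(offset)  # 3
```

Real phonetic search works on feature representations of audio rather than raw samples, but the search structure – predict, slide, score – is the same.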

It has certainly set my thinking cap buzzing. What we could do at Assisted Editing with this technology would be amazing – almost delivering my “magic future” for metadata I spoke of at my two presentations. But for now, Get is an amazingly powerful tool that every documentary filmmaker will want to be using.

Hardware trends you might have missed

Not many of the main news streams picked up on the trend toward multiple cards, or multi-channel cards, at this NAB. Obviously 3D-capable cards were announced (by AJA and Blackmagic Design), but AJA also announced that multiple Kona cards can co-exist in the one host computer; Blackmagic Design announced a dual-channel card; and Matrox promised a four-channel I/O card.

What we’ll be using this multi-channel capability for, I’m not quite sure, as no software supports it yet. Except that Blackmagic Design used to have a two-channel software switcher in their product range (although it seems to be missing from their website right now). A dual-channel Decklink card with a software switcher would make a very powerful and inexpensive studio or location tool with a Mac Pro – one that seriously undercuts dedicated switchers from Focus Enhancements or Panasonic.

$999 daVinci

Blackmagic Design almost deserve a post of their own for the NAB announcement (which you no doubt followed here) of the $999 software-only daVinci. Scott Simmons reminded me in a Tweet that I had accurately predicted a dramatic price drop for the daVinci system. What I didn’t predict was how far, and how fast, Grant Petty would drop the price. What I expected to come in at $60K was announced as a turnkey system for $30,000! I didn’t expect the software-only version at all, although in reality, with hardware, monitors, scopes and storage, that’s still likely a $20,000 investment – for what used to cost $300,000 or more.

This is, of course, consistent with everything that Grant Petty has done with Blackmagic Design. I remember the first Decklink announcement (on the DV Guys show) at under $1,000 and everyone wondered how the industry would cope. Those cards are now much more powerful, and even cheaper, and now we’re going down the same path with daVinci.

Friends, fun and the Future

For me, NAB is as much about friends as it is about the technology. It’s a time when my virtual communities intrude into real space. Once again, NAB proved to be two days too long and four nights too short. With about 20 parties happening Monday night and a similar number Tuesday, we need more nights to spread them over, and fewer days. I was done with the show floor by Tuesday afternoon and there were two days to run.

This year’s MediaMotion Ball was a great social event, as it always is, running on into the Adobe party afterwards. Tuesday’s Supermeet broke new ground with the “Three A’s” on stage together for the first time.

I made my contribution to the show via my Supermeet Magazine article, The Mundane and Magic Future of Metadata, which I also delivered as a presentation at the ProMax event and in the Post Pit on the show floor. The Supermeet Magazine should be available soon from Supermeet.com.

The future of post production automation is metadata. Check out the article and tell me what you think.

And that’s my NAB wrap for 2010. Other than to say: worst WiFi experience ever, at the Sahara. Expensive and slow. It’s time for broadband to be included in the price of a room, like air conditioning (which I didn’t use) and the television (which I used only to get the sign-up details for the Internet connection).

Categories
Interesting Technology Item of Interest

What about 3D at NAB 2010?

Throughout the Panasonic and Sony press events we were bombarded with 3D: Panasonic pushing “camera to couch” while Sony approaches the concept as “lens to lounge.” Both companies showed examples of their partners’ work in 3D.

Panasonic are at the forefront of the affordable 3D production race with the AG-3DA1, a camera that is shipping in very limited quantities and has a waiting list. At $21,000, this dual-lens, single-body camcorder records to AVCCAM and has integrated convergence control.

Sony had many partners in 3D production at the high end but nothing yet in their affordable product lines, although the implication was that these will be coming in the future.

Between the two companies we were exposed to a lot of 3D examples. My thoughts are very subjective, but I found that 3D worked well for gaming and sports – particularly the relatively slow-paced golf footage from this week’s Masters. What I found, and my associates confirmed, is that the fast-cut concert footage and entertainment features did not work so well, because of the slight disorientation at every edit.

While the cuts we use in traditional editing are analogous to the way the human visual system works, there is no real-world analog for jumping from place to place and view to view. This leads to a momentary disorientation at each edit, which takes the viewer out of the experience.

The other problem we noted collectively was that we got tired of graphics being “thrown out” of the screen to the audience.

One more thing: 3D content creators, STOP THROWING THINGS AT ME! Stop with the gratuitous “in your face” movements. If you must include something like that – a 3D Bono hand (from the U2 concert video), a 3D graphic, whatever – keep it close to the screen plane. When things come flying at you in real life, you react. With 3D I react the same way, and I don’t react positively to the program.

So, just stop it, OK?

Regardless of my impressions this is the year that 3D hit the mainstream at NAB. Will it still be prominent next year?

For 3D, mark me as skeptical.