I was finally watching Steve Jobs' keynote from WWDC on June 7. (I know, but this was our second try – we get talking about stuff, what can I say?) I got to the iMovie for iPhone 4 demo and was blown away by the creative use of source metadata.
At 58 minutes into the keynote, Randy Ubillos is demonstrating adding a title to the video he's editing in iMovie, and iMovie automatically adds the location into the title. It's not magic: iMovie is simply reading the location metadata stored with images and videos shot on an iPhone and using it to generate part of the title. This is exactly how metadata should be used: to make life easier and to automate as much of the process as possible.
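To make the idea concrete, here's a minimal sketch of the kind of conversion involved. Cameras store GPS position as degrees/minutes/seconds plus a hemisphere reference; an app like iMovie turns that into a usable coordinate (and, in Apple's case, a place name) for the title. The function name and the sample coordinates are my own illustration, not Apple's actual code:

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    # South and West hemispheres are negative by convention
    return -value if ref in ("S", "W") else value

# Hypothetical GPS metadata, as an iPhone might embed it
lat = dms_to_decimal(37, 19, 54.0, "N")
lon = dms_to_decimal(122, 1, 53.0, "W")
title_location = f"{lat:.4f}, {lon:.4f}"
print(title_location)  # → 37.3317, -122.0314
```

From a coordinate like this, a reverse-geocoding lookup gives the human-readable place name that ends up in the title.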
Likewise, the same metadata draws a location pin on a map in one of iMovie's themes, exactly as it does in iPhoto.
In a professional application, that GPS data – which is coming to more and more professional and consumer video camcorders – could be used not only to add locations but also to look up what businesses are at that address. From that source and derived metadata (an address and business derived from the location information) we can infer a lot.
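The lookup step can be sketched as a nearest-neighbour search over a places database. In practice an application would query a geocoding service; the tiny in-memory table and function names below are purely illustrative:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical places database: (name, latitude, longitude)
PLACES = [
    ("Apple Inc.", 37.3317, -122.0307),
    ("Moscone West", 37.7832, -122.4013),
]

def nearest_place(lat, lon):
    """Return the name of the closest known place to the given coordinate."""
    return min(PLACES, key=lambda p: haversine_km(lat, lon, p[1], p[2]))[0]

print(nearest_place(37.78, -122.40))  # → Moscone West
```

Once a clip's coordinate resolves to a business or venue name, that derived metadata can feed titles, search, and organization downstream.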
Check out my original article on metadata use in post production; for a more detailed version, with some pie-in-the-sky predictions of where this is going to lead us, download the free Supermeet Magazine number 4 and look for the article (featured on the cover) The Mundane and Magic future of Metadata.