Last night I had a conversation about a new type of touch-screen display that mounts on regular glass (I don’t know any more about it than that – I hope to get more information shortly and share it).
The discussion reminded me that in the earliest days of using NLEs (a Media 100, in my case) I had fantasies about editing in a 3D display environment: a virtual world where clips would float in space, grouped together in some logical order (these days I’d say grouped by metadata), and the editor could simply move clips around, stack them, and build the story along a virtual timeline – even composite by stacking clips.
I never really developed the idea beyond that trip of the imagination, but it does make me wonder whether some sort of surface like the one being proposed for regular glass – or even a 30″ Cinema Display-sized screen – that was a full touch-screen surface supporting gestures could make it work. Microsoft’s Surface would be close to the sort of experience I’m visualizing.
In thinking about it further, I realized that the sort of work we’ve been doing with metadata would tie in nicely. The metadata could be used to group and regroup clips organizationally, but also to suggest story arcs or generally assist the editor.
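At its simplest, that kind of metadata-driven grouping is just binning clips on whatever field the editor cares about at the moment, then rebinning on a different field a second later. A minimal sketch in Python – the clip names and metadata fields here are invented purely for illustration, not taken from any real NLE:

```python
from collections import defaultdict

# Hypothetical clip records; the field names are illustrative only.
clips = [
    {"name": "beach_wide.mov", "scene": "opening", "subject": "beach"},
    {"name": "beach_cu.mov", "scene": "opening", "subject": "beach"},
    {"name": "interview_1.mov", "scene": "middle", "subject": "interview"},
    {"name": "sunset.mov", "scene": "closing", "subject": "beach"},
]

def group_clips(clips, key):
    """Bin clips by any metadata field, so the same pool can be
    regrouped on demand (by scene, subject, location, etc.)."""
    groups = defaultdict(list)
    for clip in clips:
        groups[clip[key]].append(clip["name"])
    return dict(groups)

by_scene = group_clips(clips, "scene")      # groups for a story-arc view
by_subject = group_clips(clips, "subject")  # same clips, regrouped
```

The point is that the groupings aren’t fixed bins: the same clip pool can be rearranged instantly along any metadata axis, which is exactly what a spatial, gesture-driven interface would let the editor do by hand.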
It’s probably time for a new editing paradigm.
If not for a future version of FCP or Media Composer, then perhaps for iMovie?