While watching the LAFCPUG “X Night” videos I noted that Michael Wohl advises that “all edits should be in V1” and that not doing so is a sign of a failure to commit.  Similarly, reviewing some FCP 1 release videos, they once again (and again from Michael) seem to advocate a mostly single-track approach.  That would certainly parallel historic ways of working with video or film, where a single track, or an A/B configuration, was standard.
I think the use of tracks has evolved since then. One of the reasons people became concerned when Final Cut Pro X took away the traditional track layout was simply that those tracks had become valuable metadata. A typical audio example might be:
- Audio 1-4: Dialog, allowing for overlapping stereo channels
- Audio 5-8: Music, allowing for overlapping stereo
- Audio 9-12 (or more): Spot effects.
That’s an indicative example, not prescriptive.
Similarly, in the video tracks I’ve used layouts that had:
- On-camera footage on Video 1
- B-roll on Video 2 (and 3)
- Titles on Video 4 (and, if there was more than one language, one title track per language).
With that configuration one could easily turn off B-roll or titles to view the underlying video, or turn on only one language’s title track at a time.
Some of that track usage was a workaround for the lack of something like Final Cut Pro X’s Auditions feature, but that isn’t really a metadata-focused use, so it doesn’t count here.
Simply because they were there, tracks evolved from having no specific purpose to carrying very valuable metadata.
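To make the idea concrete, here is a minimal sketch in code: clips carry explicit role tags instead of relying on which track they happen to sit in, so “turn off the music” or “show only the English titles” becomes a filter on those tags. The role names and clip data are hypothetical, and this is not how Final Cut Pro X actually stores roles; it is only an illustration of tracks-as-implicit-metadata versus explicit tags.

```python
# Sketch: replace implicit track positions with explicit role tags,
# then "mute" or "solo" by filtering on those tags.
# Role names and clip data are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Clip:
    name: str
    role: str       # e.g. "dialog", "music", "b-roll", "titles.en", "titles.fr"
    start: float    # timeline position in seconds
    duration: float

timeline = [
    Clip("Interview A",    "dialog",    0.0, 12.0),
    Clip("Street B-roll",  "b-roll",    2.0,  4.0),
    Clip("Score cue 1",    "music",     0.0, 30.0),
    Clip("Lower third EN", "titles.en", 1.0,  5.0),
    Clip("Lower third FR", "titles.fr", 1.0,  5.0),
]

def enabled(clips, muted=(), title_language=None):
    """Return the clips that would play: drop muted roles, and if a title
    language is given, keep only title clips whose role matches it."""
    out = []
    for c in clips:
        if c.role in muted:
            continue
        if title_language and c.role.startswith("titles.") and c.role != f"titles.{title_language}":
            continue
        out.append(c)
    return out

# "Turn off music and B-roll" without caring which track they sat on:
print([c.name for c in enabled(timeline, muted={"music", "b-roll"})])

# "Show only the English titles" -- the role-based equivalent of
# enabling just one title track:
print([c.name for c in enabled(timeline, title_language="en")])
```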
Comments
5 responses to “How Tracks evolved from function to Metadata”
I got into several debates on the COW about this very subject. Traditionalists talk about tracks as if they were invented primarily for organizing media roles in the timeline. Video tracks were created for hierarchical compositing, and assigning audio tracks “roles” like you laid out above has always been the workaround for not having any other way to group audio within a timeline. Explicit roles are better than implicit ones.
FCPX needs some work on leveraging roles metadata to manipulate timelines with the same ease as turning off or exporting whole tracks, but the foundations are there and the concept is sound.
I do like the idea that’s been put forth that audio elements can be tagged with their assignments rather than the traditional approach of exporting them from specific tracks. That said, I do think there needs to be an explicit visual cue for audio that’s been tagged as part of a specific group (DIA, NAR, MUS, etc.). A simple colour assignment would make this work much better, and also ensure that no audio elements have been missed or mistagged.
Ironically, if the rumors are true, the Apple Logic team doesn’t seem to agree with the trackless philosophy.
– Oliver
Logic is for multitrack music production. It’s totally different from an NLE.
As a musician, I must have tracks to mix. Each instrument on its own track.
As an editor, I don’t need more than a few.
I am looking forward to seeing how audio metadata will replace tracks in exports entirely, and whether it’s faster to work with.
Yeah, no, this doesn’t work for me. I read this after commenting on your follow-up article, and it made me think a bit longer about it.
Tracks aren’t metadata… they are places where you put things. Unless you think everything is metadata. Like if we get rid of the boys’ washroom and the girls’ washroom, hey that’s just metadata, why don’t they share a washroom and I’ll just tag every person with a “boy” or “girl” flag and that’s just as good?
When I edit video, I need tracks.
Anyway, FCPX DOES have tracks. When you overlap audio, it moves stuff up and down to get out of the way. Also you can drag video above or below other video and it puts it on a different track.
The idea that tracks show lack of commitment is nonsense. Non-linear editing is all about lack of commitment – that’s what it’s for.
Razor and tape – now that’s commitment. Maybe FCP 11 will not be on a computer at all…