The Technology of Production
In early 2012 I went through a process of reducing production gear to a minimum, akin to trying to write a Haiku. I’m gearing up for a production trip to Australia to record interviews with my extended family during our quadrennial family reunion. It’s a new production Haiku with different solutions due to the inevitable march of technology, and the needs of this production.
I do not expect this project to ever reach broadcast, but there’s no reason not to have the best quality sound and picture I can, since these recordings should last for posterity. I am traveling alone, so it was important not to carry too much. Essentially I need a good multicam interview setup with excellent audio quality. I will shoot b-roll around the family reunion and some of the family sites.
Once upon a time it was easy to differentiate between Film and TV production: film was shot on film, TV was shot electronically. SAG looked after the interests of Screen Actors (film) while AFTRA looked after the interests of Television actors. That the two actors unions have merged is indicative of the changes in production technology.
As is noted in an article at Digital Trends, there is almost no difference between the technologies used in the two styles of production, so what are the differences? It comes down to two things, which are really the same thing.
Avid folk – hi Frank – have been promoting this survey result that purportedly shows that Media Composer is used more than all other NLEs. In fact the article starts off with:
Avid Media Composer remains the most popular editing system in production by a considerable margin. It’s the primary editing system of 70% of our respondents, proving that it still rules the roost despite the challenge from Apple and Adobe
That is hardly accurate if you really examine the subject. In fact, it borders on deliberately misleading.
This news item caught my eye.
“DJI has expanded the feature set of the Phantom 2 Quadcopter line by giving it the ability to map out a route and automatically fly it. Using Ground Station, shooters will be able to create a route full of way points in order to let the quadcopter fly automatically from one point to another, and allow the shooter to concentrate on getting the shot. And it’s just what the pilot ordered.”
It reminded me of some of my earlier thoughts on how these flying camera platforms (what I was then calling a quadcopter, but would now call a drone) were going to evolve.
The final post in my series arising from a recent Digital Production BuZZ segment with Larry Jordan and Michael Horton. Larry asked one final, very important question.
Larry Jordan: Because we are charged with delivering our projects on time and on budget, at what point should we resist change, like not being too close to the bleeding edge, and at what point should we embrace change?
Continuing the discussion from a recent Digital Production BuZZ show, Larry asked an excellent question:
Larry Jordan: I was just reflecting on the difference between improvements and changes and I realized that the tools that we use influence the stories that we tell and I was thinking back, again to when I was directing live TV, I would have the opportunity every so often to do a three videotape edit in a very expensive CMX room and the stories that I could tell with that videotape were limited by how much money I had and how much time in the CMX room. I couldn’t do graphically intensive tasks, I’d have to go off to an animation stand. Are we actually being blinded by the tools we’re using in terms of the stories that we can tell?
I have conflicting thoughts about 4K for production and distribution. At one level I’m convinced it’s being pushed on us by equipment manufacturers when there is no real demand; at another, I know from experience that there are some non-obvious advantages to 4K. But one thing is clear: the push to 4K is not about a push to improved quality.
In this episode Terence and Philip discuss the way the App Store works and the issues that arise for developers; business models and compatibility between versions in Media Composer; in-app purchases; Adobe panels; Flash; Creative Cloud; app interchange formats; reporting; and app ecosystems.
As part of their October 2013 Creative Cloud updates, Adobe have released Prelude LiveLogger, an extension to the Prelude family of metadata entry tools, for real-time on-location logging. Like us, Adobe realized that there are situations where there is simply no time to log between the shoot and the edit. Adobe offers the example of a football match where plays are tagged in real time during the game, and a highlight reel pulled from Prelude moments after the shoot.
Our need was to log, shoot and edit as we undertook the Solar Odyssey, realizing that the only way to allow time for sleep was to log as we shot.
I’m working on a side project for my friend Cirina Catania who’s trying to get a series running called After Action Stories about what happens to servicemen once they return from service. The pilot stories have revolved around Furnishing Hope that we shot back in May. One interview edit has been on hold until we were able to shoot some B-roll of Furnishing Hope in action. (They transform an empty apartment or house into a home in 3 hours.)
For the first time in my life there was more b-roll than I could use. Between us, Cirina and I captured over 320 video shots and hundreds of stills. The footage spoke to me and called me to put it together into something that Furnishing Hope could use to attract sponsors. Then something odd happened.