Over the last year I’ve managed to have some valuable insight into the direction Apple has been, and is, going with what became Final Cut Pro X, but of late the timing – June 21 – has got me thinking. One of the things that has bugged me is that Final Cut Pro X seems like only most of a story – that there are still “other shoes to drop”. Since I don’t know how to quit while I’m “ahead” on the forward-looking insight, here’s some more.
Since the next update is apparently coming “in a few weeks,” according to the FAQ posted by Apple, and will include the workflow/updated XML, I wonder why Final Cut Pro X was released on June 21. What comes to mind is that Apple wanted to make it very clear that Final Cut Pro X runs on Snow Leopard (and uses Snow Leopard technologies like Grand Central Dispatch), even as the company prepares for the launch of the next OS – Lion – in the middle of July.
I don’t think it’s coincidence that the first update appears to be timed for Lion. Before I get to the topic of the title of this post, a little context. As I wrote a while back in Why we want Final Cut Pro rewritten to Cocoa, when you program to OS-specific frameworks you get a lot of benefits “free”. In fact, Operating System upgrades often bring new features to existing applications, just because the OS changed and the application was written to take advantage of them.
Lion, I think, is a very important update for Apple. It harmonizes their two Operating Systems to a much greater degree than before. For a better understanding of the harmonization, you’ll need to sign up for a free developer account and watch some of the WWDC 11 videos. (But note, like me, signing up for that will put you under NDA, so neither of us can discuss specifics.)
It seems to me that Final Cut Pro X, although written on Snow Leopard and released so that it’s obvious that it runs well on Snow Leopard, is clearly designed with a look toward a Lion future. Right from AV Foundation – the underpinnings of the media engine in Final Cut Pro X – to the single window(ish) interface, to the autosaving documents.
So, I fully expect that a small update (or not even an update) to Final Cut Pro X on Lion will give us two Lion features immediately: versioning and a single-window interface, Lion style. The latter is clear in the way the window operates. The former is not quite as obvious, but the versioning feature comes automatically with Cocoa’s NSDocument in Lion.
Versions would go a long way to alleviating the issues surrounding duplicating Projects to lock off certain states or to perform versioning. (The issue is that render files are duplicated.) To be able to go back to any earlier state of a Project will be helpful. And thanks to Lion it’s a feature provided by the OS rather than added to the application.
So, that brings me to the topic. Final Cut Pro X does not seem to have the same architecture as Final Cut Pro 7 and earlier, where external monitoring “hooks” were in the app for third parties to link to and send signals out to standard video signal formats – Component analog or SDI (or HDMI). AJA has announced a Final Cut Pro X solution that is effectively a mirrored desktop. Many find this unsuitable for color grading because of quality issues.
My first thoughts were that Thunderbolt equipped broadcast monitors might be a solution, but last night it struck me that the solution was really obvious and totally OS dependent but would give Final Cut Pro X a “real” video output: AirPlay.
AirPlay has been expanding under iOS to include video features, and it’s my expectation that it will also come “back to the Mac” with Lion although it’s not an announced feature that I can find.
So, Final Cut Pro X gets an AirPlay output option in Lion so that the video and audio can be streamed wirelessly to an Apple TV (hockey puck version), and now there’s a real HDMI connector with the signal from Final Cut Pro X’s Viewer. For SDI there are HDMI-to-SDI converters available from AJA, Blackmagic Design and others. (Sorry if this blows a nice big surprise Apple has cooking for us, because this would be an amazing feature.)
And that’s how the Operating System solves a real problem for Final Cut Pro X. I wonder if there are solutions at the Operating System level for sharing projects?
[UPDATE] I clarified and expanded my thinking on monitoring.
50 replies on “How will Apple solve FCP X monitoring? [Updated 7/7]”
Philip, I think you’re right. The current version of Final Cut Pro X does not give those options on a system running Lion Golden Master. But I expect an update (like Apple told us) with the release of Lion.
I do believe Apple has much bigger plans for Lion and FCP X than most of us can imagine. I also think that FCP X is touch (future) proof… the skimmer would work perfectly for that (iMovie is already offering that on iOS).
I still have no good word for Apple’s handling of this situation and the transition, and I can imagine why businesses are upset. But I love FCP X so far and I can really see it shining after a few .1, .2, .3 updates in the near future.
I think AirPlay from FCPX would be dandy, but it is no better than AJA’s present beta kludge in terms of quality. Apple TV only supports 720p H.264 streams. It simply isn’t equipped with the ASICs to do more than that. Perhaps a new Apple TV in the pipeline with an A5 chip (like the iPad 2) could support 1080p, but that is still rather pedestrian when handled as H.264 and doesn’t address higher resolutions that could be supported by a Kona 3G, for instance. All the advanced color space guts that FCPX has are lost on the user without critical monitoring that can handle 10-bit RGB out.
I’ve been positing on the boards at Creative COW that it may take Lion and its public hooks for AVFoundation to open up FCPX I/O to drivers from AJA, BMD, and Matrox for their respective lines of I/O products. From your understanding of how AVFoundation works and what it does, is that at all relevant? Could someone write a driver to “see” a stream of video being generated by an application calling upon AVFoundation to do its thing (as FCPX does)?
Greetings from a lunch stop on the trip home.
While I think there would be great utility in AirPlay/FCP integration, I don’t think it will suffice for critical colorist-type monitoring, as AirPlay works by streaming iOS-compliant H.264 streams through the air to be decoded by the receiving device.
In a color grading workflow, your source footage is decoded (it might be compressed, uncompressed, raw, AVC-Intra etc.), then color corrections are applied, typically in 32-bit float space, and display is typically 10-bit 4:2:2 or full 4:4:4 RGB. That signal typically has NO compression applied in the process of applying those color corrections, and the image is passed to the monitor over HD-SDI in 4:4:4 pure baseband. There isn’t enough bandwidth to do that over AirPlay. I think it would have to get smooshed (technical term) into a compressed format to make the traversal to the display, and that is not likely to be deemed acceptable for most pros.
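A rough back-of-envelope sketch makes the bandwidth gap concrete (the numbers are my assumptions: 1080p at 29.97 fps, active picture only, no blanking or audio overhead, and an optimistic sustained Wi-Fi figure):

```python
# Uncompressed "baseband" bitrate vs. a realistic wireless budget.
# Assumptions: 1080p at 29.97 fps, active picture only (no blanking/audio).
def baseband_bitrate(width, height, fps, bits_per_pixel):
    """Uncompressed video bitrate in bits per second."""
    return width * height * bits_per_pixel * fps

fps = 30000 / 1001  # 29.97 fps
rgb_444 = baseband_bitrate(1920, 1080, fps, 30)    # 10 bits x 3 channels
ycbcr_422 = baseband_bitrate(1920, 1080, fps, 20)  # 10-bit 4:2:2

print(f"10-bit RGB 4:4:4:   {rgb_444 / 1e6:.0f} Mb/s")    # ~1864 Mb/s
print(f"10-bit YCbCr 4:2:2: {ycbcr_422 / 1e6:.0f} Mb/s")  # ~1243 Mb/s
print("Sustained 802.11n throughput: maybe ~100 Mb/s on a good day")
```

Even with generous assumptions, an uncompressed 10-bit signal is more than an order of magnitude beyond what the air can carry, which is why AirPlay would have to compress on the way to the display.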
No, not for critical colorist-type monitoring, but those folk aren’t going to be using FCP X when there are better options (Resolve, FilmLight). Keep in mind that one of my clients is doing broadcast and DVD work with an NTSC downconvert; another monitors on a consumer plasma (and they’re doing restoration); at the multi-seat film producer only one room had a broadcast monitor – all the edit bays had consumer plasma or LCD. I see AirPlay fitting in for those folk just nicely.
Why bake in 32bit float RGB color processing, 4K image handling, Pro Res, and the rest of FCPX’s top-end image processing support if Apple aren’t going to let anything more than an H.264 stream or extended desktop out the other end? FCPX may have a somewhat hobbled UI for serious color grading (I have to defer to Patrick Inhofer on that opinion), but it also has great bones upon which powerful things could be built. The shared rendering engine with Motion has tons of potential, and the big guns in the plug-in space are hard at work on porting their wares over to Motion 5, the conduit for all FCPX effects. I think there is every reason to want to have color-critical output from FCPX, just as there was from FCP7. Loads of legacy FCP edits have been cut on color-critical monitors and then handed over to grading on a dedicated tool like Resolve or Color. I agree AirPlay would be a novel and useful addition to the playback options, just like iChat Theater was in FCP7, but I’d be very disappointed if that’s the best we could ever get for monitoring FCPX.
Isn’t it plausible that, if nothing else, a driver for a Kona or Decklink card could slave to whatever the foremost AVFoundation playback stream is? So when FCPX is the active app, its playback is available to the card? And for that matter, when Motion 5 is the foreground app!
It’s possible but, I think, implausible that a Kona or Decklink card could grab whatever AV Foundation stream was foremost. Personally I do not expect FCP X to support direct video output, but we have to wait and see what Apple decides.
Why would Apple not go the Thunderbolt -> broadcast monitor route? Anyone who’s planning on working with RED footage, for example, (and we know that native support is coming) would want to have the option to monitor on a professional monitor through a proper direct video output. We have to be able to trust what we’re seeing on a high-end monitor – and I am not talking color grading here.
AirPlay doesn’t make sense to me since it’s a 720p consumer device and who knows what kind of playback delay we’re facing here.
I can’t imagine they didn’t think this one through…
I hope you’re wrong on this one. Never getting proper monitoring back might be the unforgivable dropped feature that has me joining the ranks of disaffected former FCP devotees currently making so much noise on the tubes. I can see virtually every other present shortcoming being reasonably addressed by third parties or by Apple as promised in their FAQ. However, proper monitoring is still an unknown, and Apple must participate in enabling support for such a feature, be it at the OS or app level. Indeed we wait.
Actually, if a client comes to me and asks me to grade their FCPx project *inside FCPx*… if they pay my rate – not a problem.
Except of course – I can’t actually *see* my work.
BTW – full bandwidth, 10-bit monitoring isn’t only for color grading… FCPx supports 2k and 4k – but not monitoring. Finishing editors who handle the final graphics, titles, and creating the final file for delivery also need to see what’s happening with their picture (example: Is the fringing around the text really there or just a display problem?).
While your insight into this AirPlay solution is interesting (and let’s assume they up it to 1080p monitoring for LAN playback) – at best it’s a client monitor feed. The finishing editor still needs to see something other than an h.264 approximation.
Frankly – I’m holding out for FCPx to be updated with hooks for Thunderbolt solutions.
Marcus, it’s probably not either or, but Thunderbolt would take away one of your work screens because, like the AJA solution, it’s a mirror of the desktop. Quality may or may not be good enough. Using AirPlay would allow a dual monitor setup with Viewer on a remote monitor.
FCP X is not for the niche Hollywood Pro where every editor needs a broadcast monitor (and frequently doesn’t have one, btw); it’s for the wider group of professionals not producing for broadcast. 720p monitoring will be fine.
It’s reasonable for Apple themselves to eschew building niche features for the relatively small high-end, but I’m hoping that means Apple will give eager third parties a way to do it instead (and do it better than Apple would ever care to). I’ll be very disappointed if FCPX and its exciting editorial innovations can never serve a wider user-base than it can today, including broadcast and Hollywood.
Hey Philip –
I just have to jump in here because I keep hearing you (and others) say this when talking about the direction FCPx is going. [WARNING – here comes one of those emotional reactions that you sometimes have a hard time understanding!]
I completely agree that it appears Apple has gone for the widest audience possible with FCPx. They’re sending all sorts of signals that higher-end workflows aren’t what’s important to them. However, to keep calling those workflows “niche Hollywood Pro” (or many other variations of that phrase) is extremely frustrating for me to hear.
I am a small business owner with three seats of Final Cut Pro (four if you count my personal laptop). I’m not Hollywood. I’m not niche. But I do want to do things the right way – and (speaking strictly to this example – there are many other places where this argument can be repeated) h.264 720 output is not the *right* way to do video monitoring. It might be adequate for many (or even most) people, it might be the way Final Cut is going to do it, but it’s not the RIGHT way to get accurate results. And wanting my software to do things the right way does not make me niche, nor Hollywood, nor elitist, nor a big company, nor yearning for the past, nor any of the things that people are accused of when they have legitimate concerns about the future of this very important piece of software.
Just to be clear – I think your ideas are spot on. You’ve been incredibly insightful about where FCP is going. I think there is a lot to like about the future of FCP (although I’m concerned that future won’t include my company because of issues like this). What I would love is if you were less cavalier with the idea that only “Hollywood” wants these higher end features from their editing software.
At the very least, help me out so that my staff can stop asking me what’s wrong (and who the hell is Philip) when I yell at my laptop…
I have to agree with Brent on this one, I’m in almost the exact same position. It’s sloppy thinking to want to define the debate in terms of “reactionary big hollywood” versus “forward thinkers”.
While the airplay idea is cool, it’s mostly just that. A cool idea. That doesn’t really address the need for proper external monitoring.
This seems such a fundamental function I’m having a hard time believing there isn’t a way to achieve it using AV foundation. Isn’t this new technology supposed to be more capable – not less?
But Philip, regardless of what anyone else thinks about it, you may still win the award for prognosticator of the year.
Phil has no idea what he is talking about.
Alistair, thank you for your comprehensive and insightful comment. I don’t mind contrary opinions but contentless posts without any substance are not welcome and this will be your last. I’d like you to explain exactly how I don’t know what I’m talking about. Then I might learn something. Right now I just think you’re a bit of a dick and not welcome here.
I’m an outsider (more involved in playback systems), but came across this from StudioDaily link…
I’m contemplating using the Thunderbolt version of Ultrastudio 3D from Blackmagic for certain “special venue” (non-broadcast) playback situations. The thought occurred to me that this thing might be a solution for your monitoring concern. I don’t see anything referencing FCP X on Blackmagic website, but it seems pretty close to what you are looking for.
I haven’t contacted Blackmagic yet, but what say you guys?
Blackmagic Design have not announced any support for FCP X, although their latest drivers/software do support capture for (but outside of) FCP X, and lay off to tape in the same situation. The question is not whether third parties will or can support FCP X for video output, but rather whether there are hooks in the software for those third parties to hook into. I think there are not, but we keep hoping.
I doubt the question is whether or not AV Foundation is capable, it’s whether or not Apple architected it into the application when they were designing it. Right now I don’t see any evidence of there being a “video pipe” like there was in FCP 7 for 3rd parties to hook into. I could be wrong on it.
And Brent, I’ll try and moderate the language but I struggle with the term for “professional use of video production and post production techniques but not working full time at the editing and not working using the techniques of Studio Film and Broadcast Television”. That group, however we think about them is the “vast majority” of Apple’s current FCP 7 customer base (according to Apple) and who they made FCP X for.
Pains me too – sales of our software to support those full-time “Hollywood Pros” have dropped to pretty much nothing, so it’s more pain for us than even most multi-seat users.
I think – when talking about these monitoring issues – you can talk about people who want ‘accurate’ monitoring.
I can attest it isn’t just the Broadcast and Film pros looking for accuracy.
I have at least one student in my training who bought the 10-bit FSI monitor even though 100% of his work is file based and will *never ever* hit broadcast / film.
80% of his work is web. The remaining is corporate installs.
He wants to know what his work *actually* looks like. Then, when his client says it’s too green or too red, he can be confident of his deliverable.
I would suggest that *anyone* who uses FCPx to deliver to a paying client has a vested interest in having a choice to accurately monitor their final product.
The accuracy of color is why Apple incorporated ColorSync into FCP X. With an appropriate color profile for your monitor you *can* see what it will look like on a broadcast monitor. (In 8 bit, true).
The solution is in a different place: ColorSync – not broadcast monitors.
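As a hypothetical illustration of that “in 8 bit, true” caveat: even with the color space handled correctly by ColorSync, the step between adjacent code values is four times coarser at 8 bits than at 10, which is where visible banding in subtle gradients comes from:

```python
# Smallest brightness increment representable at a given bit depth,
# on a normalized 0.0-1.0 scale.
def step_size(bit_depth):
    return 1.0 / (2 ** bit_depth - 1)

print(f"8-bit:  {2**8 - 1} steps, increment {step_size(8):.6f}")
print(f"10-bit: {2**10 - 1} steps, increment {step_size(10):.6f}")
print(f"Ratio: ~{step_size(8) / step_size(10):.1f}x coarser at 8 bits")
```

So a calibrated 8-bit display can be color-accurate on average yet still quantise a gentle sky gradient into visible bands that a 10-bit monitoring path would not.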
“the versioning feature comes automatically with Cocoa’s NSDocument in Lion.”
I too am hoping that a forthcoming FCPX update will enable Lion’s versioning (and full-screen UI) functionality in the app … at present, though, as far as I can see, FCP X is not an NSDocument-based app (unlike, for example, TextEdit), so the “free versioning” functionality may not be quite so “free” … either way, if it’s there it’ll be a good thing.
Interesting. My coding partner nailed it as an NSDocument-based app. What leads you to believe the opposite?
No special knowledge on my part to be sure, Philip … I have only limited experience in Cocoa coding, but in the Xcode projects I’ve played with, and in the limited playtime I’ve had in Lion GM, one characterization of an NSDocument style of app ‘that allows versioning’ is that they are those with a document-based interface. FCPX doesn’t have that – it has a single-window interface – and so would at the very least need a non-standard (at least non-Lion-standard) method of accessing version control, i.e. it would not seem to come as a direct borrow from Lion’s NSDocument control.
Hope that makes sense, am writing in a hurry whilst producers are poring over my latest cut (and making altogether too many notes for my liking)
Ah, but isn’t FCPX’s Project File a document Andy?
I think Andy’s right. It’s not a traditional NSDocument. What I was getting confused by was the nature of the save. Autosave is an option for NSDocument, and it’s that which pushes forward to Autosave and versioning on Lion.
Philip wrote: No, not for critical colorist type monitoring, but those folk aren’t going to be using FCP X when there are better options (Resolve, FilmLight).
Philip you’re missing the point. Critical color monitoring is essential for ALL professional edit suites. Not just the colorist station.
In our facility we have 7 workstations. One is a Davinci Resolve. The rest all run Apple Color. We finish projects on every single one of those 7 workstations.
So FCP X has to send out a proper video signal natively to work in a critical color environment which includes the edit suite. Airplay isn’t going to cut it.
Once again I’d like to express my admiration and thanks to all those experienced and bloody clever people who have taken the time to share their knowledge with us.
(Especially Philip and Walter Biscardi)
I agree that EVERYBODY should be able to look at their footage on a broadcast monitor, even if it’s a piece-of-crap. It’s what makes an edit suite into an edit suite. Without it you have some editing software, no matter how good it is.
What do you show your director/ agency/ clients?
I have had an external monitor for all of my editing days and will continue to have one for the rest of them thanks. Even on my first Lightworks which could not even add a title!
Right now I have a cheap CRT broadcast monitor, a pro LCD and a consumer LCD. If I could fit a pro Panasonic Plasma I would have it too.
I can’t believe there is actually a discussion about whether you can get by without a proper SDI or HDMI video output.
As I wrote to Walter, among my broadcast and film clients a broadcast monitor is a rarity.
I have a client with five seats of FCP that has just one broadcast quality monitor and they got that years after they finished many TV series because Color was out of house. All edit bays had consumer level monitors. As do two of my other clients. Lots of people producing for Broadcast TV, Disney and Warner without calibrated or broadcast monitors in their bays.
I’m sure everyone can go back and forth w/anecdotal evidence but that’s not really an accurate portrait of what’s going on across the board. My experience, even when I was working in the dub room of a small shop in Indiana, is that every bay typically has a broadcast monitor. Even at places where the content primarily went straight to DVD. But my experiences are different than Walter’s, which are different than Philip’s.
If FCP X is only suited for offline editing hopefully Apple will not just reinstate all the I/O features (EDL, XML, OMF, etc.,) but make them incredibly robust. An NLE that can’t send out a video signal and cannot play nice with other apps is totally useless in a professional environment, IMO.
And maybe that’s part of the greater plan for FCP X anyway. iTunes music is ‘good enough’. Apple’s video streams are ‘good enough.’ Maybe all Apple wants is a ‘good enough’ FCP X?
I know ‘the cloud’ and ‘new media’ all are big buzz words right now but look at the content that people are willing to pay for, the content that advertisers are willing to support, and it’s almost all TV shows and films. Successful ‘new media’ is largely just ‘old media’ in a new distribution channel.
In my case, I have to agree with Philip. I work for a high-end offline editing facility in Australia, that takes on a huge amount of commercials, short films and feature films, and none of the suites have broadcast quality monitors. All of the suites have prosumer or consumer monitors, connected via either composite SD video or component HD. Why? Because when you’re cutting in DNxHD36 and ProRes LT for offline, you don’t need colour accurate monitors – you just need something that will keep the client, directors, agency and producers happy. All of the colour grading is done at another facility, with a Resolve, Lustre, Baselight, etc. with hugely expensive broadcast quality monitors that engineers spend a great deal of time making sure that everything is properly calibrated.
HOWEVER – when I take my post production supervisor hat off, and put on my independent filmmaker hat – it’s absolutely vital that I have good monitoring available, because when you have only limited money, you have to do everything in house.
So for me – it’s extremely important to have SDI monitoring in a prosumer app (which I believe FCPX is), so that I can do EVERYTHING within the one app.
This is where I think Apple is confused. They have sold FCPX as a “one stop shop” where you can edit and FINISH within the one app (because as of TODAY, you can’t actually get content easily into and out of FCPX) – and yet a vital part of this workflow is accurate monitoring.
So really – as of TODAY, Final Cut Pro X is useless for professionals (as the export tools are non-existent), useless for prosumers (as you can’t accurately monitor in-app), and really only focussed towards people who want to step up from iMovie to a more professional platform (which is fine). So maybe Philip is right on the money (yet again). Maybe AirPlay is the solution for enthusiastic consumers who want a cheap Apple-branded monitoring solution? However, consumers aren’t going to need a client monitor – so I don’t see why the current dual-screen desktop won’t suffice.
It’s going to be a very interesting few months…!
Video monitoring needs to be full quality, super fast & frame/color accurate.
AirPlay does not fit that bill; yet it might be fairly useful, if it becomes available.
Until then, if FCPX is a one-stop pro shop, its video should be able to play externally.
Currently, most FCPX projects will be finished in FCPX, without Lustre or DaVinci, and therefore must monitor their video there.
Apple must keep this functioning, hopefully built in.
This is basic.
Do you think that under AV Foundation, this will be an issue?
I don’t think it’s an issue related to AV Foundation, I believe monitoring is a decision made in the architecture of the application. I also have to raise the question: If only 5% of customers are doing broadcast work, how essential is broadcast monitoring? I raise the question, not dictate the answer!
Personally I think it would be nice to have the choice of high-quality monitoring via (say) Thunderbolt, and simpler AirPlay for those whose output is going to computer screens.
A good question to raise indeed.
I believe more than 5% will want to work with 2 monitors and a video monitor with good color/response. That survey might tell us more…
Thank you Philip for your thoughts.
If you are right that Apple does not consider it important to provide a way to have accurate color monitoring output from FCP X, why did they even bother to integrate some of the Color features in FCP X? I mean, if you cannot tweak your pictures knowing exactly what you will get, what is the point?
Now, to respond to your point that there are better programs to do color correction – this is true… but if there is no way to share your FCP X timeline (like we did with Color before), how is that going to work?
Also it seems to me that over the years we have been wanting to do more with our NLE, not less as Apple seems to think… that means being able to use plug-ins like Magic Bullet to give us better control over the look of our stories without completely relying on an outside color grading phase.
I guess what I am trying to assess (like many others 🙂 is the direction Apple wants to go with FCP X… I am open to changes, but I don’t want to compromise the quality of my work… I can change my workflow if I know I will be more efficient and that I can get the same or a better looking film…
If the professional market is not important to Apple, why don’t they say so? I mean, if we really are a small part of Apple’s revenues, then stating that they are targeting the mass market should not hurt them, don’t you think? So I guess I am not sure I understand their game here…
I guess time will tell…
Colorsync. I missed it but just wrote a new post http://www.philiphodgetts.com/2011/07/more-on-final-cut-pro-xs-monitoring-solution/ on it.
Sharing of the FCP X Project will come with the workflow XML that has been publicly announced. The professional market is important to Apple. The 25,500 people in the US who work in Film and Television, not so much. As I wrote http://www.philiphodgetts.com/2011/07/who-are-apples-final-cut-pro-x-customers/
IMO, if you are doing anything that is going to be displayed on a TV it is advantageous to have accurate external monitoring. You need to make sure the text looks right, the colors look right, that interlacing is being handled properly, etc. Sure, you can probably get most of the way there without it, and most of the consumers out there wouldn’t be able to notice the difference on their own, but there is something to be said for craftsmanship. The devil is in the details, and getting the details right separates pro-looking results from amateur-looking results.
Color accurate, sure, but why does it have to be external if we can calibrate the display to Rec 709? See my more recent updated post.
If what FCP X displays in the Viewer is just as accurate as sending a baseband video signal out to a broadcast monitor, it’ll be a non-issue. But can it be?
It’s been my understanding that there are basically three points of failure in trying to do accurate monitoring inside the GUI on a computer monitor. First, the NLE is not designed to display the image at broadcast quality in the GUI. Second, the GFX card is designed to pass along a computer video signal but not a base band video signal. And, lastly, the display itself is designed to display a computer video signal and not a base band video signal.
How well will a 1080i 29.97 signal be displayed by FCP X and hardware that’s designed around progressive at 60.00?
1) FCP X is designed to display the image in whatever color space your monitor is calibrated for. That’s why it was a bullet point in the presentations.
2) There is no base band video signal to pass on.
3) The display needs to be calibrated with ColorSync – a standard part of the Operating System.
Color space is only one part of the equation though. Can the hardware and software properly display interlaced footage and non-integer frame rates? Right now AJA and Matrox are labeling FCP X’s Viewer as preview quality. Is the maximum quality, for lack of a better term, of FCP X’s Viewer something that can be changed in a future update?
What made the original MXO such a unique little box is that, via hardware and software, it was able to overcome the inherent differences between computer display systems (the GFX card and the monitors) and b’cast video display systems. Though the MXO was only qualified to work with a fairly narrow range of GFX cards and monitors.
Final Cut Pro X displays interlaced to the monitor as an option. Frame rates I am uncertain about – that would depend on the specifics of the display hardware I would expect.
Matrox do have more experience at this type of mirrored display output than anyone else.
But can it *properly* display interlacing? FCP ‘classic’ can display both fields in the Canvas but it looks nothing like it would when displayed on b’cast monitor or TV.
I’m not trying to be a negative nancy or anything; I’m just trying to get an accurate – as accurate as educated speculation can be – idea of what FCP X can and can’t do. Like with everything else, time will ultimately tell, right?
At 100% it played like regular interlaced video plays on a progressive display, which is what it would do on an external progressive display. Progressive displays (even of interlaced material) are now the norm for HD (as no HD CRT sets were ever sold, afaik).
Even more to your point Philip, progressive displays are now the norm for TV in general. The local WalMart has a huge display of beautiful LCD and Plasma flat screens. You’d be hard-pressed to find a CRT. We are in the kind of transition that happened when TV went from BW to Color. The CRT as “broadcast monitor” is rapidly going away because it will no longer be the “standard” it once was.
Phil – you’re very much misinformed if you think there weren’t any HD CRT broadcast monitors manufactured. If you go to most HD studios you’ll often still see an HD CRT in the lighting/vision area for critical work, though some very high-end LCDs from Sony are taking their place.
JVC and Ikegami still sell HD CRTs – and they’re not uncommon in edit suites.
As for your comments about progressive displays and interlaced sources – yes, LCDs, and most Plasmas, are natively progressive. However, how the interlaced picture is handled can vary massively between panels (as there are many different de-interlacing algorithms that can be used – some of which simply won’t display some errors/artefacts you need to monitor).
For broadcast work – you need to monitor on something with a similar performance to that used in the QA area of the broadcaster who is going to QA accept your material.
What de-interlacing algorithms is FCP X using? Really not sure I’d want to trust that over a decent CRT or a Sony BVM…
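To make concrete why the choice of algorithm matters, here’s a toy sketch (illustrative only – fields are represented as lists with one sample per scanline) of the two simplest strategies a panel might use:

```python
# Two minimal de-interlacing strategies on fields represented as
# lists of scanlines (one value per line for brevity).
def weave(top_field, bottom_field):
    """Interleave the fields: full detail on stills, combing on motion."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.extend([top_line, bottom_line])
    return frame

def bob(field):
    """Line-double one field: no combing, but half the vertical detail."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

print(weave([1, 3], [2, 4]))  # -> [1, 2, 3, 4]
print(bob([1, 3]))            # -> [1, 1, 3, 3]
```

Real panels blend these (often motion-adaptively, per pixel), which is exactly why an interlacing artefact visible on one display can vanish on another – and why a known-quantity reference display matters for QA.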
I certainly see the CRTs in use all over the place but I didn’t think they were still on sale. Could you provide a link to the sales pages?
See the part of my post where Final Cut Pro X is not suitable for your needs. I know that’s not comfortable but that is the reality. Without a “true” video output, FCP X will likely never meet your needs.
Ikegami sales pages here : http://www.ikegami.de/products/broadcast/hdtv_monitor/
They do 15″ and 19″ CRTs still, and were pushing them quite hard last time I saw them exhibiting at trade shows.
I know a number of broadcasters bought up as many of the final Sony and JVC CRT monitors as they could.
The high-end LCD (and now OLED) flat panels have pretty good de-interlacing in them – though they are still defeated by some interlaced content.
The possibility of an Apple HDTV should perhaps be considered in the context of this discussion:
If Apple took their cinema displays, added the electronics from an Apple TV but left the capability for being used as a display they’d have a killer product imho.