Canon 1D Mk IV Low Light Capabilities

I've been reading a lot about the amazing low light abilities of the Canon 1D Mk IV with its extended ISO settings that apparently can "see what our eyes can't even see". Intriguing. Our ASC charts for Incident Key Light don't go beyond ASA/EI 3200, where you'll find that to expose for 18% Reflectance Gray at a T stop of 1.4 you'll need 1 foot candle. These charts don't even account for readings below 1 foot candle, despite the fact that light meters can measure down to 1/10 of a foot candle. This makes sense from a film point of view, where the fastest motion picture stocks readily available are 500 ASA. Rating them as such and shooting at T1.4, you still need 5 foot candles for optimum exposure. Until recently, before the advent of digital cameras that can see in the dark, there was no real need to expand upon these tried and true charts.
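For what it's worth, those chart values line up with the standard incident-light exposure formula. Here's a minimal sketch, assuming the common footcandle-based meter calibration constant of roughly 25 and a 1/48 sec shutter (180 degrees at 24 fps) – both assumptions on my part, not values from the ASC charts:

```python
# Rough incident-light exposure check, assuming fc ~ C * T^2 / (EI * t)
# with calibration constant C ~ 25 and a 1/48 sec shutter (180 deg at 24 fps).

def required_footcandles(ei, t_stop=1.4, shutter=1.0 / 48, c=25.0):
    """Footcandles needed to expose 18% gray at the given EI and T stop."""
    return c * t_stop ** 2 / (ei * shutter)

print(required_footcandles(500))    # ~4.7 fc -- close to the chart's ~5 fc at 500 ASA
print(required_footcandles(3200))   # ~0.7 fc -- roughly the chart's 1 fc at EI 3200
```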

Open moonlight is about 0.1 foot candle, and the 1D Mk IV's expanded ISO of 102400 is 5 stops faster than ASA/EI 3200 (3200 × 2^5 = 102400). Using the ASC charts as a guide, at a T stop of 1.4 the 1 foot candle requirement drops 5 stops to roughly 1/32 of a foot candle, or about 0.03. If my math is correct, that should be enough light to actually expose, albeit a noisy exposure no doubt, in available moonlight. Has anyone out there had the opportunity to try this camera out in only available night time "light"?
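Pushing the same assumed formula from the sketch above out to the expanded ISO suggests moonlight actually leaves a couple of stops of headroom:

```python
import math

# Same assumed formula as above (C ~ 25, T1.4, 1/48 sec shutter), pushed to EI 102400.
needed = 25.0 * 1.4 ** 2 / (102400 * (1.0 / 48))   # ~0.02 fc required
moonlight = 0.1                                     # open moonlight, per the figure above

print(f"needed {needed:.3f} fc, available {moonlight} fc")
print(f"about {math.log2(moonlight / needed):.1f} stops over")   # roughly 2 stops of headroom
```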

Please someone weigh in. I'd love to hear from you guys more often :)

Phantom v12

I recently shot a promo for Starz Encore using the Phantom v12 and wanted to share a few thoughts regarding this sorely underutilized camera system. First of all, what's most amazing is that it delivers excellent quality 720p HD video at a shocking 6,900 fps. For this project we were mostly blowing up glass and ceramic objects, and at this extreme frame rate it's like you're seeing into the inner workings of the universal laws of physics. The minute beauty of it is just incredible.

Here is the clip:

http://www.starz.com/Promotions/FearFest

The stop motion segments were shot with a Canon 40D and the rest was shot on the v12. I have some additional footage that was not used in the spot that I'll be posting eventually.

If anyone reading this isn't familiar, Phantom is pretty much the go-to cine-style video camera for high-speed applications. Part of the reason Phantom is so usable for production professionals is one of my fav NYC rental houses, Abel CineTech, and their unique relationship with the camera's creator, Vision Research. Abel's staff has done a first-rate job of taking a camera that was designed mostly with military engineers in mind and turning it into something that slides seamlessly into the camera department. Just have a look at the new Phantom HD Gold for a perfect example of this.

Prior to this job, my experience had only been with the Phantom HD, the far more commonly used Phantom camera that shoots 1080p at around 1000 fps. The new HD Gold is the upgrade to the original HD and was created to rectify some of its well-known issues, most notably noisy, unstable blacks. Having not worked with the Gold, I was expecting the v12 to perform similarly to the HD, and it certainly did. When you're shooting at 6,000 fps you can expect to encounter a few additional curveballs though.

First of all, the v12 is an incredibly light sensitive camera. I was astonished by its flexibility. Granted, this was a tabletop shoot so it wasn't like I was lighting a massive set, but I had two 5k's going through 6x6's with Light Grid, shooting at 6,000 fps, and I was not fighting for stop. I would safely rate this camera at 640 ASA.

As with all high-speed videography, big lights are essential to reduce flicker. At such high frame rates, if you use any lamp smaller than a 2k, you will actually see the light cool slightly in between the cycles of our 60 Hz alternating current. Lamps bigger than 2k burn hot enough that their intensity is not noticeably diminished in between cycles – i.e., no flicker. 5k's are a good choice for such high frame rates because you can easily knock them down if you want to work at lower frame rates.

Lately I've been using the iPhone app PocketLD (Lighting Designer), by software designer Michael Zinman, to help spec out lighting packages for this sort of work. It will display the photometrics for tons of lamps from Arri, Mole, Kino Flo, etc. at any distance you input. Extremely useful for figuring out how close your light sources need to be to get a working stop at 6,000 fps – especially when you know that you're going to be exploding canisters of paint and really can't be any closer than 10 feet!
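To put rough numbers on the flicker issue, here's a back-of-the-envelope sketch. The only physics assumed is that a small lamp's output dips at every zero crossing of the mains, i.e. 120 times a second on 60 Hz power; the thermal inertia of a big 5k filament is what smooths those dips out:

```python
# How many captured frames span each brightness dip of a lamp on 60 Hz mains.
# A small lamp's output dips at every zero crossing, i.e. 120 times per second.

MAINS_HZ = 60
DIPS_PER_SECOND = 2 * MAINS_HZ  # two zero crossings per AC cycle

for fps in (24, 1000, 6000):
    frames_per_dip = fps / DIPS_PER_SECOND
    print(f"{fps:>5} fps: {frames_per_dip:5.1f} frames per brightness dip")

#    24 fps:   0.2 frames per dip -> each frame averages the ripple away, no flicker
#  1000 fps:   8.3 frames per dip -> the ripple starts to show on small lamps
#  6000 fps:  50.0 frames per dip -> you literally watch the filament cool and reheat
```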

The v12 has the same problem with the black portion of the video signal (pedestal) as the HD, if not worse. This is the most critical aspect of working with the Phantom. You must do a CSR, aka Current Session Reference, aka Black Balance, before every shot. Why the Phantom has such an issue with the blacks, I don't know. I do know that if you don't do it, you can count on weird blocky banding and discoloration on your shot that will no doubt render it totally useless. The nice thing is that the v12 has an internal capping shutter, so you can do the CSR remotely without having to cap the lens as you do on the HD (pain in the ass!). Love that feature. I did notice that the higher the frame rate, the worse the problem with the blacks became, and sometimes they would start to give out towards the end of the shot. This is not something I've encountered with the HD, and it's problematic because there's really nothing you can do to remedy it except trigger the shot so that the important moments are stored at the beginning of the buffer.

Other issues:

Blue Channel Weirdness: This seems to be a real CMOS issue and the Phantom camera is no exception. No matter how many times you black and white balance, it seems the blue channel is always a little boosted. We were doing some shots with a black curtain background that was very slightly reflective, and that was all it took for the blue channel to go haywire. The whole image went completely blue and doing black and white balances did nothing to remedy it. I switched the curtain out for black duvetyne with the matte side to camera and that seemed to fix it. The Phantom cannot deal with black, so if you're using a black background, make sure it is as close to non-reflective as you can get and keep spill off it!

White Clip: The director had a few shots he wanted to get on a seamless white background, so we lit it up and put the whites at around 100 IRE – not even clipped – and that seemed to overload the sensor. The image was taken over by horrible banding that black balancing had no effect on. It wasn't until the white was at just under 80 IRE that a black balance removed the banding.

CMOS Smear at very high frame rates: On some of our more reflective objects, at very high frame rates, I noticed some smearing in the highlights. Any highlight that clipped or approached clipping had a little smear to it. The bigger the highlight, the greater the smear. This must tie in with the aforementioned white issue. There is no white protection or KNEE on the Phantom camera, so when the whites go, they go hard and apparently make the sensor more vulnerable to the well-known issues related to CMOS sensors. My remedy was just to keep the specular highlights under control with lighting. Honestly it wasn't such a huge issue, but it was something that I noticed.

All in all, the v12 is incredible technology and I cannot wait to work with it again. Blowing stuff up at 6,000 fps is why I got into this business in the first place :)

LEGACY BAGGAGE

I've been researching this notion of Legacy Baggage and how it often, unfortunately, impairs the development of new technologies. Two aspects of video production in particular are victims of this: 2/3" sensors and 29.97/23.98 time code.

The size, 2/3", standard for broadcast camera sensors, lenses, etc. comes from the fact that prior to CCD technology, video cameras used a CRT pick up tube much like the one in your television. This pick up tube was 2/3" in diameter so the video lenses of the day were designed to work with this standard size. Due to the efforts of Sony, Ikegami, and others, eventually pick up tubes were phased out in favor of new CCD sensors. Given this opportunity to introduce a new broadcast video standard, the size of these new chips could have been anything but due to the existing equipment legacy, 2/3" was chosen so that all those thousands of video lenses out there could continue to work. It's the exact same situation with time code. Time code was originally a solid 30 fps. With the advent of color television back in the 1950's, the frame rate was slowed by 0.01% to become 29.97 which could accommodate analog color sync. Years later when 24 fps video production became a reality, the frame rate of 23.98 was introduced so that the new technology could fit into existing workflows. Now here we are again in the process of adopting a new 100% digital TV standard, where there are no analog related sync issues. HDTV broadcasts could quite easily utilize a solid frame rate and we would be done with 59.94/29.97/23.98 forever but instead, it was deemed easier and safer to make the new accommodate the old. Legacy Baggage.

Within reason, newer companies like RED and Vision Research have minimal existing equipment legacies to deal with and are therefore free to design with much less limitation. Take this new RED Digital Stills in Motion Camera idea. Unlike Canon, RED doesn't already have a pro video product line that would be rendered instantly obsolete by the introduction of this one product. (They do, however, have a certain 4K digital cinema camera that could be jeopardized – that is, if they don't continue to support and develop it in its own right.)

No legacy baggage = technological innovation

These are just some quick thoughts. It ties in with my earlier post on stills/video convergence. Just wondering where all this technology is leading us... I'd love to hear what other people have to say about it.