Color Correction
May 19, 2013
This short article began with the rather dry title, "Tracking CDL Through Post." As I began to write, my thoughts started to meander into the obvious and maybe not so obvious ancillary aspects of this topic. I now feel the more gestalt title, "Color Correction", actually seems appropriate. And please forgive me if this post comes off as very "101".
I do a fair job of keeping up with the blogs (RIP Google Reader, enter Feedly). Among the many great sites out there on the topic, I'm always reading about software and hardware tools, plug-ins, look building, and the technique of color correction, but very little about why it's necessary in the first place.
So why do we color correct?
Forget building Looks for a moment. And by that I mean digitally crafting the visual qualities - color, saturation, and contrast - of a shot instead of doing it the old-fashioned way - with optics, exposure, and photochemistry. At its very root, color correction is about camera matching and shot matching within a scene so as not to take the viewer out of the moment with an abrupt change in visual continuity. And this, more than anything else, is defined by color temperature.
A definition - Correlated Color Temperature (wikipedia):
The particular color of a white light source can be simplified into a correlated color temperature (CCT). The higher the CCT, the bluer the light appears. Sunlight at 5600K for example appears much bluer than tungsten light at 3200K. Unlike a chromaticity diagram, the Kelvin scale reduces the light source's color into one dimension. Thus, light sources of the same CCT may appear green or magenta in comparison with one another [1]. Fluorescent lights for example are typically very green in comparison with other types of lighting. However, some fluorescents are designed to have a high faithfulness to an ideal light, as measured by its color rendering index (CRI). This dimension, along lines of constant CCT, is sometimes measured in terms of green–magenta balance;[1] this dimension is sometimes referred to as "tint" or "CC".
Every camera sensor, every lens, in-front-of-the-lens filter, light source (most particularly the sun and sky!), and light modifier will produce or interpret Reference White (Chroma Free) in a slightly different way. In the case of lenses, a set like the Master Primes is remarkably well matched, whereas older glass like the Zeiss Superspeed Mk III's, for example, will have a lot of color temperature and even contrast shift from lens to lens. This being the case, we can say there is a significant amount of color temperature offset to contend with between all of our light-producing and image-reproducing devices.
Here's a 50mm Ultra Prime on an Arri Alexa with camera white balance set at 3300 -3, lens iris at T2.8. Below it is an Angénieux Optimo 24-290mm zoom lens set to 50mm on the same camera with the same exposure and white balance.
The Optimo zoom (bottom image) is much warmer and greener than the prime. If these lenses were both working in the same setup, color correction instantly becomes necessary, lest one angle look "correct" and the other either too warm or too cool in comparison.
All of these variables - optics, light sources, sensors, etc. - and their inherently different color temperatures often add up to cameras that don't match and shots within the same scene that are offset from one another along the warm-cool and green-magenta axes.
In this era of high-ISO cameras, color temperature offsets are most evident with heavy Neutral Density filters, often used to block as much as 8 stops. In my opinion, heavy ND's are the most destructive variable in contemporary digital imaging. Even with the best available filters, such as the Schneider Platinum IRND's, we're still seeing a lot of color temperature offsetting with filters over 1.2. The problem, it seems, is that most Neutral Density filters (either conventional or with Infrared Cut) do not retard the Red, Green, and Blue wavelengths of light in equal proportions. What we're left with after cutting a lot of stop with ND is more blue and green wavelength than red, which is vital to the accurate reproduction of skin tones. If this part of the picture information has been greatly reduced, it can be very challenging to digitally push life and warmth back into the subject's skin without introducing a lot of noise and artifacting.
Here's the 50mm Ultra Prime again.
And here with a Platinum IRND 1.2. The camera ISO, white balance, and exposure are the same. To get the stop needed to compensate for the ND, the light was brought closer to the chart rather than dimmed or scrimmed, so its color temperature was unaffected.
Comparing the two, they're really quite close. I've got to say, the Schneider Platinums are the best I've found. With other sets of IRND's, you'll see significant color temp offset even at ND .3, but with these at ND 1.2, there is only a very slight shift to green. But this is still something that will need to be corrected.
Here's IRND 1.5. We're starting to get increasingly cool and green.
IRND 1.8
IRND 2.1
And for comparison, back to our original, filter-less image.
After depriving the image of 7 stops of light with Neutral Density, we've unintentionally reduced some of our red channel picture information. At this point we can correct with camera white balance by swinging the camera to a warmer Kelvin and pulling out a little green. Or we can use digital color correction tools like LiveGrade at the time of photography or DaVinci Resolve in post production to match this shot with the scene. ND filters are but one variable among many when it comes to managing color temperature offsets spread across the camera and lighting.
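As a quick sanity check on those numbers, ND density, transmission, and stops all relate through simple math. A minimal sketch (the helper names `nd_transmission` and `nd_stops` are mine, not industry terminology):

```python
import math

def nd_transmission(density):
    """Fraction of light an ND filter of a given optical density passes."""
    return 10 ** -density

def nd_stops(density):
    """Stops of light lost: every 0.3 of density is roughly one stop."""
    return density / math.log10(2)

# An ND 2.1 passes under 1% of the light -- roughly 7 stops.
print(round(nd_transmission(2.1), 4))  # -> 0.0079
print(round(nd_stops(2.1), 1))         # -> 7.0
```

This is why the ND 2.1 image above is described as 7 stops down: 2.1 / 0.3 ≈ 7.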
Fortunately, there are numerous ways to deal with it.
In my opinion, these offsets can usually be solved most expediently with Camera White Balance (WB). Depending on the camera and how we're doing the recording, this WB setting is either "baked in" to the image or exists as metadata. In the case of the Arri Alexa, the orange-cyan (warm-cool) axis is represented in degrees Kelvin, with green-magenta adjustable in "+" or "-" points of color correction.
If you're working with the RED camera, the Redmote is great for wirelessly adjusting white balance when you need to.
Wireless remote operation of the Alexa is a desperately needed feature. The best we can do for now is the Arri RCU-4, better known as the "Assistant Panel".
This is a great little device that's chronically underutilized, as it gives you full remote access to the camera, unlike the web browser ethernet interface, which is very limited. The RCU-4 is powered through its control cable, which I've used successfully at lengths up to 150'. This device makes white balancing the Alexa incredibly fast and efficient, as it no longer needs to be done at the side of the camera.
Not to get too obvious with this.. Moving on.
Another approach is to manage color temperature by putting color correction gel - CTB, CTO, CTS, Plus Green, Minus Green - on light sources in order to alter those with undesirable color temperatures to produce the correct, color-accurate response. Color correction tools, digital or practical, do not necessarily apply to the creative use of color temperature. Having mixed color temperatures in the scene is an artistic decision, and one that can have a very desirable effect as it builds color contrast and separation into the image. Mixed color temperatures in the scene will result in an ambient color temperature lying somewhere between the coolest and warmest source. Typically in these scenarios, a "Reference White", or chroma-free white, can be found by setting the camera white balance somewhere around this ambient color temperature.
Identifying problematic light sources and gelling them correctly can be a very time- and labor-intensive process, and one that doesn't happen on the set as often as it should, so it's usually left to the digital toolset. There is now a whole host of affordable software that can be used on the set at the time of photography, like LiveGrade or LinkColor, or later in post production - such as Resolve, Scratch, Express Dailies, and countless others.
When we're talking about On-Set Color Correction, we're usually talking about ASC-CDL, or Color Decision List. CDL is a very useful way to Pre-Grade, or begin color correction at the time of photography. This non-destructive color correction data is very trackable through post production and can be linked to its corresponding camera media through metadata with an Avid ALE. When implemented successfully, the Pre-Grade can be recalled at the time of finishing and used as a starting point for final color. In practice, this saves an enormous amount of time, energy, and consequently.. $$$.
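For reference, the CDL itself is just a small per-channel formula - Slope, Offset, Power - followed by a global Saturation that uses Rec. 709 luma weights. A minimal sketch (the function name is mine; real implementations also deal with value ranges and log vs. video encodings):

```python
def asc_cdl(rgb, slope, offset, power, sat):
    """Apply ASC-CDL: per-channel Slope/Offset/Power, then global Saturation."""
    out = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = v * s + o
        v = max(v, 0.0)          # clamp negatives before the power function
        out.append(v ** p)
    # Saturation blends each channel toward Rec. 709 luma.
    luma = 0.2126 * out[0] + 0.7152 * out[1] + 0.0722 * out[2]
    return [luma + sat * (v - luma) for v in out]

# An identity CDL leaves the pixel untouched.
pixel = [0.5, 0.4, 0.3]
graded = asc_cdl(pixel, [1, 1, 1], [0, 0, 0], [1, 1, 1], 1.0)
print([round(v, 4) for v in graded])  # -> [0.5, 0.4, 0.3]
```

Because the whole grade reduces to ten numbers (3x Slope, 3x Offset, 3x Power, 1x Sat), it travels easily in an ALE, EDL, or XML.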
Here's one way an ALE with the correct CDL information can be generated in Assimilate Scratch Lab:
In the top level of Scratch, here's our old friend the Chip Chart. Hooray!
We've applied the standard Alexa Log to Video 3DLUT to these shots and as you can see, the first one looks pretty good but the rest suffer from various degrees of Color Temperature Offsetting.
At this point, if we Pre-graded on the set, we could load the correct CDL for each shot and be ready to output dailies.
In the lower left of the Matrix page is the LOAD button. Click it to go to this dialog window:
Here CDL from the set can be applied on a shot by shot basis. Once everything is matching nicely it's time to embed this work into metadata that can easily be tracked and recalled at a later time.
Select +CDL and click "Export EDL/ALE"
From the drop-down, select .ale, and then name your ALE something appropriate.
Now in Avid Media Composer, we're going to import this ALE to add ASC-CDL Slope, Offset, Power, and Sat (Gain, Lift, Gamma, and Saturation) values that will now be associated with their corresponding clips.
This post assumes a working knowledge of Media Composer. If you're not sure how to set up an Avid project, import media, make bins, and import an ALE, there are plenty of great tutorials out there.
Once you have the transcoded DNxHD media in the correct MediaFiles directory, import the ALE.
Click the "Hamburger" Icon in the lower left of the bin (I have no idea what this Selector tool is actually called but I've heard many an Assistant Editor refer to it as the Hamburger), and then select "Choose Columns".
Here we have the opportunity to select which columns show up in our bin. The ASC-CDL values are already embedded in the ALE we imported, but it's a good idea to verify them, which we can do at the bin level by turning on these columns. From the "Choose Columns" drop-down, select ASC_SOP (Slope, Offset, Power) and ASC_SAT (Saturation).
As you can see, all of the adjustments we made as CDL are now reflected in numeric values and are linked to their corresponding shot in the form of Avid metadata. ASC-CDL, while unfortunately limited in a lot of ways, really is a fairly universal interchange for color correction data and can be implemented quite easily.
What we really need is a way to recall these ASC-CDL values from the ALE in a software like LiveGrade making this color correction data even more interchangeable.
Another possible workflow is to generate the dailies in Resolve using CDL from the set. Once that CDL corresponds with a shot in Resolve, it can track with its shot all the way to finishing if the original Resolve project is used.
What's the best approach? All of the above. The right tool for the right task and no two projects are alike. That's why a DIT is hired in the first place, to consider the criteria and then advise the best course of action.
Update
on 2013-06-06 14:55 by Ben Cain
Just read this related article -
Content feels eerily familiar!
Rec.ommendations for Display Gamma
May 12, 2013
Old news really, as this white paper from the International Telecommunications Union (ITU) standardizing gamma for reference monitors in HDTV was released in March 2011! But the topic of Gamma Correction as it pertains to Display Gamma is one that comes up constantly in conversation with colleagues in both post and acquisition. So much so that as I was combing through old NegativeSpaces articles detailing monitor calibration, I felt that a post on Gamma for broadcast monitors was well warranted.
The ITU, which is located in Geneva, Switzerland, is the international governing body for broadcast, transmission, and television system standards. And like most international bodies, the ITU has no real authority and can only make rec.ommendations, such as Rec. 601 for digitally encoding interlaced analog video signals, and Rec. 709 for HDTV broadcast, which after only a few years of widespread implementation is about to be superseded by Rec. 2020, a guideline for UHDTV, higher-than-HD-resolution television systems (read 4K and beyond).
So soon? Yep.
Anyone involved with the creation of motion picture content should be aware of Display Gamma as a factor that has a profound effect on the way an image is perceived by the end user. And one that cannot be controlled by the content creator!
No point in reinventing the wheel here as a wonderful knowledge base is readily available at the excellent site, Cambridge in Colour, a learning community for photographers. This snippet from their article provides the best explanation and examples of Display Gamma I've been able to find.
Here's a link to the full article, Cambridge in Colour - Understanding Gamma Correction. Additionally, their article on Input and Output Tone Curves is an excellent companion to this topic.
Gamma is an inevitable part of the digital imaging process and can mean different things depending on where in the chain we're looking - file gamma, camera, display, etc. But as we see in the above example, what gamma affects is the midtones - that is, the grayscale which is neither black nor white but somewhere in-between. In all optical perception, the separation between the midtones and black or white is ultimately what defines the contrast. And as we can see in the simplest terms in the examples above, the exact same image seen at various display gammas will look profoundly different.
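Put in the simplest terms, a display turns a code value into light as roughly (code value) raised to its gamma, so the same midtone lands in a different place on a 2.2 and a 2.4 display while black and white stay put. A quick sketch:

```python
# Display light output is approximately (code value) ** gamma, so the
# same middle-gray pixel displays differently on each monitor.
code_value = 0.5  # a midtone, normalized 0-1

for gamma in (2.2, 2.4):
    luminance = code_value ** gamma
    print(f"gamma {gamma}: {luminance:.3f}")

# 0.5 ** 2.2 is about 0.218, while 0.5 ** 2.4 is about 0.189 --
# the 2.4 display renders the identical midtone noticeably darker,
# i.e. with more apparent contrast. Black (0.0) and white (1.0)
# come out the same at any gamma.
```

This is exactly why an image mastered on a 2.4 monitor looks slightly lifted when viewed on a 2.2 display: every midtone comes back a touch brighter.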
This topic is a doozy and one I have no intention of delving too deeply into. Any academic reading this will no doubt find what I've written overly simplistic, only scratching the surface of a very complex topic. That's ok. There are plenty of great scientific and very thorough bodies of work already out there on the topic. Being a technician for motion pictures that are primarily broadcast bound, my concern is Display Gamma for HD Television Systems. The problem is that the current "rulebook" for HDTV, Rec. 709, actually makes no mention of gamma. Because of legacy technologies - NTSC, sRGB, and others - we've arrived at a de facto "average" display gamma of 2.2. Despite this number never being specified by the ITU in Rec. 709, it is the display gamma you are likely to find on virtually any HDTV found in the home, on laptop and LCD computer displays, and on many tablets and smartphones. And again, this number of 2.2 is a ballpark figure, with many devices measuring at odd numbers - 2.15 for the iPad Mini, for example. The long and short of it is that Display Gamma is one of the most critical components in digital imaging as it pertains to the end viewer experience, and one that's in need of a universal standard.
Enter ITU-R BT. 1886
(Another way these recommendations are listed is ITU-R BT."#", with "BT" denoting "Broadcast Television". "ITU-R BT.1886" reads "Radiocommunication Sector of the International Telecommunication Union Recommendation for Broadcast Television #1886", the truncated version being BT. 1886 or simply Rec. 1886.)
Why this new recommendation? (from the white paper)
"This Recommendation specifies the reference electro-optical transfer function (EOTF) that the displays used in HDTV programme production should follow in order to facilitate consistent picture presentation. The reference EOTF is specified as a simple equation, with exponent function, based on measured characteristics of the Cathode Ray Tube (CRT)."
Because...
a) that reference displays play a crucial role in television programme production as they are used as reference for picture presentation;
b) that the characteristics of reference displays should be unified to ensure consistent picture presentation of programmes produced for use in broadcasting;
c) that historically, picture presentation characteristics were determined based on Cathode Ray Tube (CRT) characteristics, and the opto-electronic transfer characteristics were implicitly based on the CRT physical characteristics;
e) that CRT reference displays are no longer available;
f) that the electro-optical transfer function (EOTF) of CRT displays differs amongst manufacturers, amongst models, and amongst regions, as well as varying with the settings of contrast and brightness;
g) that for the consistency of picture presentation, it is desirable that newly introduced display technologies have an EOTF that closely matches that of the CRT;
h) that the reference EOTF for a non-CRT display has not been defined in any ITU-R Recommendation;
j) that Recommendation ITU-R BT.709, provides specifications for the opto-electronic transfer characteristics at the source, and a common electro-optical transfer function should be employed to display signals mastered to this format
Buried in Annex 1 (page 2) is this very important bit of business -
γ (Gamma): Exponent of power function, γ = 2.40
As of Rec. 1886, the ITU is now recommending a Display Gamma of 2.4 for reference monitors in HDTV Systems.
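The full equation in Annex 1 is a power function anchored to the display's measured white and black levels: L = a · max(V + b, 0)^γ, with a and b derived from those measurements. A minimal sketch (the 100-nit white and zero black are illustrative values I've chosen, not part of the spec):

```python
def bt1886_eotf(v, lw=100.0, lb=0.0, gamma=2.4):
    """Rec. 1886 reference EOTF: video level V (0-1) to screen luminance.

    lw/lb are the display's measured white and black luminance in cd/m2;
    with a true zero black the curve collapses to a pure 2.4 power function.
    """
    lw_g = lw ** (1.0 / gamma)
    lb_g = lb ** (1.0 / gamma)
    a = (lw_g - lb_g) ** gamma      # scaling from the white/black span
    b = lb_g / (lw_g - lb_g)        # lift contributed by the black level
    return a * max(v + b, 0.0) ** gamma

# On an ideal 100-nit display, a middle code value lands well below half:
print(round(bt1886_eotf(0.5), 1))   # -> 18.9
print(round(bt1886_eotf(1.0), 1))   # -> 100.0
```

Note that on a real display with a non-zero black level, the b term lifts the toe of the curve, which is the whole point of defining the EOTF from measured Lw and Lb rather than as a bare exponent.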
But what does this mean for acquisition and post production, where our work (for now) will almost universally be viewed on 2.2 displays?
The reality is, if you're monitoring and mastering at 2.4, contrast in those images will always seem slightly lifted on 2.2 displays. So for example, if the room we're coloring in has a Sony OLED using a gamma of 2.4, then our deliverables will look quite a bit different when seen on our client's 2.2-gamma iPad or MacBook Air. Rec. 1886 was needed but is far from universal implementation. Ask many a technician in the field or in post whether they've even heard of it and you would likely hear "no" more often than not. My own personal rule of thumb: in situations where I'm handling the deliverables myself and those deliverables are for 2.2 displays, I set my monitors to 2.2. When I'm working with a facility and a colorist whose monitor is set to 2.4, then my monitors are at 2.4. It's very important to be on the same page with whoever gets the stuff next, and it will be different from project to project. The Sony OLED's fortunately have several choices for User Display Gamma, which has made these monitors indispensable tools in today's digital imaging workflow chaos.
At the end of the day, Rec. 1886 is just that, another rec.ommendation, but a step in the right direction.
UPDATES:
My Wireless HD Video article (Cutting the Cord) has been updated with several more systems as per reader suggestions. I got great feedback on this post and the content was improved because of it. Thanks for that. I truly appreciate it, and it makes the hours I spend researching and writing a blog post totally worth it.
The new site is coming along great, albeit terribly slowly. Working TV hours, I have to be realistic about just how long these projects take. That said, if I can go live with the new version before 2014 shows up, I'll be feeling great. I've been writing this site in the form of HD Cinema on Blogger since 2007 and on this domain since 2010. For better or for worse, I can't shake this project, and it's something I'll probably continue to do for as long as I'm involved in this business. Looking forward to getting its next iteration out there.