
Why Does S-Log Recorded Internally Look Different To S-Log Recorded On An External Recorder?

I have written about this many times before, but I’ll try to be a bit more concise here.

So – you have recorded S-Log2 or S-Log3 on your Sony camera and at the same time recorded on an external ProRes recorder such as an Atomos, Blackmagic or similar device. But the pictures look different and they don’t grade in the same way. It’s a common problem. Often the external recording will look more contrasty, and when you add a LUT the blacks and shadow areas come out very differently.

 

Video signals can be recorded using several different data ranges. S-Log2 and S-Log3 signals are always Data Range. When you record in the camera, the camera adds metadata to the recording that tells your editing or grading software the material is Data Range. This way the edit and grading software knows how to correctly handle the footage and how to apply any LUTs.

However, an external recorder doesn’t add this metadata. It will record the Data Range signal that comes from the camera, but with nothing to say so. The ProRes codec is normally used for Legal Range video, so by default, unless there is metadata that says otherwise, edit and grading software will assume any ProRes recording to be Legal Range.

So what happens is that your edit software takes the file, assumes it’s Legal Range and handles it as a Legal Range file, when in fact the data in the file is Data Range. The recorded levels get scaled to incorrect values for processing, so when you add a LUT it will look wrong, perhaps with very dark shadows or very bright, over exposed looking highlights. It can also limit how much you can grade the footage.
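To see why that is so destructive, here is a minimal sketch, assuming standard 10 bit Legal Range reference levels (black at code value 64, white at 940) and approximate S-Log3 reference code values, of the kind of remapping that happens when a Data Range file is stretched as if it were Legal Range. It illustrates the principle, not the exact math any particular NLE uses.

```python
# Sketch: what an NLE does when it assumes a 10 bit file is Legal Range.
# Legal Range puts black at code value 64 and 100% white at 940; the NLE
# stretches that span out to fill 0-1023. If the file is really Data Range
# (as S-Log2 and S-Log3 always are), every code value lands in the wrong place.

def legal_to_full_10bit(cv):
    """Stretch the legal range (64-940) out to full range (0-1023), clipping the result."""
    stretched = (cv - 64) * 1023 / (940 - 64)
    return max(0, min(1023, round(stretched)))

# Approximate S-Log3 reference points in 10 bit: black around CV 95, 18% grey around CV 420.
for label, cv in [("S-Log3 black", 95), ("S-Log3 18% grey", 420), ("S-Log3 maximum", 1023)]:
    print(f"{label}: recorded at CV {cv}, misread as CV {legal_to_full_10bit(cv)}")

# Black sinks from 95 to about 36, mid grey barely moves, and the top of the curve
# is pushed past 1023 and clipped - so the image looks more contrasty and a LUT
# designed for the real S-Log3 levels no longer lines up.
```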

What Can We Do About It?

Premiere CC.

You don’t need to do anything in Premiere for the internal .mp4 or MXF recordings – they are handled correctly. But Premiere doesn’t handle the ProRes files correctly.

My approach for this has always been to use the legacy Fast Color Corrector filter to transform the input range to the required output range. Apply the Fast Color Corrector to the clip and use the input and output level sliders: set the output black level to CV16 (legal range black) and the output white level to CV235 (legal range white). If you do this you will see that the external recording now has almost exactly the same values as the internal recording. However there is some non-linearity in the transform, so it’s not quite perfect.

[Screenshot: Using the legacy Fast Color Corrector filter to transform the external recording to the correct range within Premiere.]
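In effect the Fast Color Corrector is squeezing the stretched signal back down again. Here is a hedged sketch of the idea in 8 bit numbers, since that is how the filter’s sliders are labelled – the filter itself works with more precision, and the slight non-linearity mentioned above means it isn’t a mathematically exact inverse:

```python
def legal_to_full_8bit(cv):
    """The stretch applied when software assumes a file is Legal Range (16-235 mapped to 0-255)."""
    return (cv - 16) * 255 / (235 - 16)

def full_to_legal_8bit(cv):
    """What setting output black to CV16 and output white to CV235 does: compress 0-255 back into 16-235."""
    return 16 + cv * (235 - 16) / 255

# The compression (approximately) undoes the unwanted stretch, so the original
# Data Range values survive and a LUT lands where it should:
for cv in (16, 90, 180, 235):
    roundtrip = full_to_legal_8bit(legal_to_full_8bit(cv))
    print(cv, round(roundtrip))   # each value comes back out as itself
```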

Now when you apply a LUT the picture and the levels are more or less what you would expect, and almost identical to the internal recordings. I say almost because there is a slight hue shift in Premiere that I can’t explain: in Resolve the internal and external recordings look pretty much identical with no hue shift, but in Premiere the hue is slightly different and I don’t know why. My recommendation – use Resolve, it’s so much better for anything that needs any form of grading or color correction.

DaVinci Resolve.

It’s very easy to tell Resolve to treat the clips as Data Range recordings. In the media bin, right click on the clip and under “Clip Attributes” change the input range from “Auto” to “Full”. If you don’t do this Resolve will assume the ProRes file to be Legal Range and will scale the clip incorrectly in the same way Premiere does. But if you tell Resolve the clip is Full range then it is handled correctly.
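If you have a whole bin of external recordings to fix, Resolve’s built-in scripting console can set the same attribute in bulk. This is a rough sketch only, meant to be run from Resolve’s own Python console (where the resolve object already exists); it assumes the clip property is named “Data Level” and accepts “Full”, which is how recent versions of the scripting API describe it – check the API readme for your version.

```python
# Rough sketch for Resolve's built-in Python console: flag clips in the media pool
# root folder as Full (Data) range. Assumes the "Data Level" clip property name
# used by recent versions of the Resolve scripting API.
project = resolve.GetProjectManager().GetCurrentProject()
root_folder = project.GetMediaPool().GetRootFolder()

for clip in root_folder.GetClipList():
    name = clip.GetName()
    # Only touch the external ProRes recordings, e.g. .mov files from the Shogun.
    if name.lower().endswith(".mov"):
        clip.SetClipProperty("Data Level", "Full")
        print("Set", name, "to Full range")
```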

Adobe still can’t get XAVC levels right!

I’m often asked at the various workshops I run why I don’t grade in Adobe Premiere. Here’s why – they can’t even get basic import levels right.

Below are two screen grabs. The first is from Adobe Premiere CC 2019 and shows an ungraded, as shot, HLG clip, shot with a Sony Z280 (love that little camera). Note how the clip appears grossly over exposed, with a nuclear looking sky and clipped snow – it doesn’t look nice. Also note that the waveform suggests the clip’s peak levels exceed 110%. Now I know for a fact that if you shoot HLG with any Sony camera, white will never exceed 100%.

[Screenshot: Incorrect levels with an XAVC clip in Adobe Premiere.]

The second screen grab is from DaVinci Resolve and shows the same clip. Note how in Resolve, although bright, the clip certainly doesn’t look over exposed as it does in Premiere. Note also that the level shown by the waveform no longer exceeds code value 869 (100% white is code value 940). These are the correct and expected levels; this is how the clip is supposed to look, not the utter nonsense that Adobe creates.
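For reference, those code values and the waveform percentages tie together like this (a simple sketch assuming normal 10 bit legal range levels, with code value 64 as 0% and 940 as 100%):

```python
def cv10_to_percent(cv):
    """Convert a 10 bit code value to a waveform percentage, with 64 = 0% and 940 = 100%."""
    return (cv - 64) / (940 - 64) * 100

print(round(cv10_to_percent(869), 1))   # ~91.9% - the peak Resolve shows for this HLG clip
print(round(cv10_to_percent(940), 1))   # 100.0% - where HLG 100% white should sit
```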

[Screenshot: The same XAVC clip in Resolve, with the levels now correct.]

Why can’t Adobe get this right? This problem has existed for ages and it really screws up your footage. If you are using S-Log and you try to add a LUT then things get even worse, as the LUT expects the correct levels, not these totally incorrect ones.

Take the SDI or raw output from the camera and record a ProRes file on something like a Shogun while recording XAVC internally, and the two files look totally different in Premiere – but they look the same in Resolve. Come on Adobe, you should be doing better than this.

If they can’t even bring clips in at the correct levels, what hope is there of being able to get a decent grading output? I can make the XAVC clips look OK in Premiere but I have to bring the levels down to do this. I shouldn’t have to. I exposed it right when I shot it so I expect it to look right in my edit software.

Whites, Super Whites and other Bits and bobs.

Do you know how your NLE is handling your video? Are your whites white, or whiter than white – or does this sound like a washing powder ad?

In the analog world you shot within the legal range of black to 100% white. It was simple, easy to understand and pretty straightforward: white was white at 100% and that was that. With digital video it all gets a lot more complicated, especially as we move to greater and greater bit depths and the use of extended range recording with fancy gamma curves becomes more common. In addition, computers are used more and more not just for editing but also as the final viewing device for many videos, and this brings additional issues of its own.

First let’s look at some key numbers:

8 bit data gives you 256 possible values, 0 to 255.

10 bit data gives you 1024 possible values, 0 to 1023.

Computers use value 0 to represent black and value 255 (or 1023 in 10 bit) to represent peak white.

But video is quite different and this is where things get messy:

With 8 bit video the first 16 code values are reserved for sync and other data. Zero or black is always code value 16 and peak white (100% white) is always code value 235, so the traditional legal black to white range is 16 to 235 – only 219 code values. Now, in order to get a better looking image with more recording range, many cameras take advantage of the code values above 235. Anything above 235 is “super white” or whiter than white in video terms – more than 100%. Cinegammas and Hypergammas take advantage of this extra range, but it’s not without its issues; there’s no free lunch.

10 bit video normally uses code value 64 as black and 940 as peak white. With SMPTE 10 bit extended range you can go down to code value 4 for undershoots and up to code value 1019 for overshoots, but the legal range is still 64-940. So black is always code value 64 and peak white always code value 940. Anything below 64 is a super black or blacker than black, and anything above 940 is brighter than peak white – a super white.
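Putting those numbers side by side as a quick reference sketch (the 8 bit and 10 bit legal levels line up because the 10 bit values are simply the 8 bit values multiplied by 4):

```python
# Quick reference for the ranges described above.
REFERENCE_LEVELS = {
    "computer / full range, 8 bit":  {"black": 0,  "white": 255},
    "computer / full range, 10 bit": {"black": 0,  "white": 1023},
    "video legal range, 8 bit":      {"black": 16, "white": 235},   # 219 usable code values black to white
    "video legal range, 10 bit":     {"black": 64, "white": 940},   # super whites live above 940
}

def legal_8bit_to_10bit(cv8):
    """8 bit and 10 bit video levels correspond by a factor of 4: 16 -> 64, 235 -> 940."""
    return cv8 * 4

print(legal_8bit_to_10bit(16), legal_8bit_to_10bit(235))   # 64 940
```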

At the moment the big problem with 10 bit extended range (SMPTE 274M 8.12), and also with 8 bit recordings that use the extra code values above 235, is that some codecs and most software still expect to see the original legal range, so anything recorded beyond that range – particularly below it – can get truncated or clipped. If the footage is converted to RGB, or you add an RGB filter or layer in your NLE, it will almost certainly get clipped, as the computer will take the 100% video range (16-235) and convert it to the 100% computer RGB range (0-255). So you run the risk of losing your super whites altogether. Encoding to another codec can also lead to clipping.

FCP and most NLEs will display super blacks and super whites, as these fall within the full 8 or 10 bit ranges used by computer graphics, but further encoding can be problematic because you can’t always be sure whether the conversion will use the full recorded range or just the black to white range. Baselight, for example, will only unpack the legal range from a codec, so you need to bring the footage into legal range before going into Baselight. As we can see, it’s important to be sure that your workflow is not truncating or clipping your recorded range back to the nominal legal or 100% range.
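As a hedged illustration of that RGB trap, this is the basic scaling a legal-range-to-RGB conversion applies and what it does to an 8 bit super white. Exact behaviour varies between codecs and NLEs, so treat it as the general principle rather than any particular application’s math:

```python
def video_to_computer_rgb_8bit(cv):
    """Map the 100% video range (16-235) onto the full computer RGB range (0-255), clipping."""
    rgb = (cv - 16) * 255 / (235 - 16)
    return max(0, min(255, round(rgb)))

# 100% white and a cinegamma super white both end up at 255 - the extra highlight
# information above CV 235 is gone after the conversion.
print(video_to_computer_rgb_8bit(235))   # 255
print(video_to_computer_rgb_8bit(250))   # 255 (clipped)
```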

On the other hand, if you are doing work where the full 0 to 255 (or 0 to 1023) range is used, then you often need to use the illegal video levels above 100% white to get whites to look white and not bright grey! There are so many different standards across different platforms that it’s a complete nightmare. Arri, for example, won’t allow you to record extended range ProRes on the Alexa because of these issues, while the Alexa’s HD-SDI output will output extended range.

This is also an issues when using computer monitors for monitoring in the edit suite. When you look at this web page or any computer graphics white is set at bit 255 or 1023 (a lot will depend on the gamma that the monitor is set to). But that would be a super white or illegal white for video. As a result “in-range” or legal range videos when viewed on a computer monitor often look dull as the whites will be less bright than the computers own whites. The temptation therefore is to grade the video to make the whites look as bright as the computers whites which leads to illegal levels, clipping, or smply an image that does not look right on a TV or video monitor. You really need to be very careful to ensure that if you shoot using extended range that your workflow keeps that extended range intact and then you need to remember to legalise you video back to within legal range if it’s going to be broadcast.