Sony Launches Venice II

Sony Venice II

 

Today Sony launched Venice II. It was perhaps not the best kept secret, given the many leaks over the last few weeks, but we now officially know that it’s called Venice II and that it has an 8K (8.6K maximum) sensor recording 16 bit linear X-OCN or ProRes to two built-in AXS card slots.

The full information about the new camera is here: https://pro.sony/en_GB/products/digital-cinema-cameras/venice2

Venice II is in essence the original Venice camera and the AXS-R7 recorder built into a single unit. But to achieve this the ability to use SxS cards has been dropped; Venice II only works with AXS cards. The XAVC-I codec is also gone. The new camera is only marginally longer than the original Venice camera body.



As well as X-OCN (the equivalent of a compressed raw recording) Venice II can also record 4K ProRes HQ and 4K ProRes 444. Because the sensor is an 8.6K sensor, the 4K 444 will be “real” 444 with a true red, green and blue sample at every position in the image. This will be a great format for those not wishing to use X-OCN. But why not use X-OCN? The files are very compact and full of 16 bit goodness, and I find X-OCN just as easy to work with as ProRes.

One thing that Venice II can’t do is record proxies. Apparently user feedback is that these are rarely used. I guess in a film style workflow where you have an on set DIT station it’s easy for proxies to be created on set. Or you can create proxies in most edit applications when you ingest the main files, but I do wonder if proxies are something some people will miss if they only have X-OCN files to work from.
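For those who do want offline proxies, here is a minimal sketch of how they could be generated in bulk with Python and ffmpeg. This is just my own illustration, not a recommended workflow: it assumes ProRes (or other ffmpeg-readable) masters, since ffmpeg cannot read X-OCN, and the folder names and proxy settings are placeholder choices.

```python
import subprocess
from pathlib import Path

SOURCE_DIR = Path("masters")   # hypothetical folder of ProRes master clips
PROXY_DIR = Path("proxies")
PROXY_DIR.mkdir(exist_ok=True)

for clip in sorted(SOURCE_DIR.glob("*.mov")):
    proxy = PROXY_DIR / f"{clip.stem}_proxy.mov"
    # Make a 1920-wide ProRes Proxy (prores_ks profile 0) copy of each clip,
    # passing the original audio through untouched.
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=1920:-2",
        "-c:v", "prores_ks", "-profile:v", "0",
        "-c:a", "copy",
        str(proxy),
    ], check=True)
```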

New Sensor:

There has been a lot of speculation that the sensor used in Venice II is the same as the sensor in the Sony A1 mirrorless camera; after all, the pixel count is exactly the same. We already know that the A1 sensor is a very nice and very capable sensor. So IF it were the same sensor, but paired with significantly more and better processing power and an appropriate feature set for digital cinema production, it would not be anything to complain about. But it is unlikely that it is the very same sensor. It might be based on the A1 sensor (and the original Venice sensor is widely speculated to be based on the A9 sensor), but one thing you don’t want on these sensors is the phase detection sites used for autofocus.

When you expand these very high quality images onto very big screens, even the smallest of image imperfections can become an issue. The phase detection pixels and the wires that interconnect them can form a very, very faint fixed pattern within the image. In a still photograph you would probably never see this. In a highly compressed image, compression artefacts might hide it (although both the FX6 and FX9 exhibit some fixed pattern noise that might in part be caused by the AF sites). But on a giant screen, with a moving image, this faint fixed pattern may be perceptible to audiences, and that just isn’t acceptable for a flagship cinema camera. So, I am led to believe that the sensors used in both the original Venice and Venice II do not have any AF phase detection pixels or wire interconnects. Which means these cannot be the very same sensors as found in the A1 or A9. They are most likely specifically made for Venice.
Also, most stills camera based sensors can only be read at 12 bit when used for video. Perhaps another key difference is that, when used with the cooling system in the Venice cameras, these sensors can be read at 16 bit at video frame rates rather than 12 or 14 bits.

The processing hardware in Venice II has been significantly upgraded from the original Venice. This was necessary to support the data throughput needed to shoot at 8.6K and 60fps as well as the higher resolution SDI outputs and much improved LUT processing. Venice II can also be painted live on set via both WiFi and Ethernet. So the very similar exterior appearance hides the fact that this really is a completely new camera.



My Highlights:

I am not going to repeat all the information in the press releases or on the Sony website here. But what I will say is I like what I see. Integrating the R7 into the Venice II body makes the overall package smaller. There are no interconnections to go wrong. The increase in dynamic range to 16 stops, largely thanks to a lower noise floor, is very welcome. There was nothing wrong with the original Venice, but this new sensor is just that bit better.

The default dynamic range split gives the same +6 stops as most of Sony’s current cameras but goes down to -10 stops. But with the very low noise floor that this sensor has, rating the camera higher than its 800 base ISO to gain a bit of extra headroom shouldn’t be an issue. Sample footage from Venice II shows that the way the highlights reach their limits is very pleasing.
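To put a number on that, here is the arithmetic as a tiny Python sketch. This is just my own illustration, assuming the quoted +6/-10 stop split at the 800 base: any headroom gained by rating higher comes straight out of the shadow range.

```python
import math

def headroom_shift(exposure_index, base_iso=800):
    """Stops of extra highlight headroom gained (and shadow range given up)
    by rating the camera at a higher EI than its base ISO."""
    return math.log2(exposure_index / base_iso)

# Rating the 800 base at 1600 EI buys roughly one extra stop above middle grey
# (+6 becomes +7) at the cost of one stop at the bottom (-10 becomes -9).
print(headroom_shift(1600))  # 1.0
```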

The LUT processing has been improved: you can now have 3D LUTs in 4K on SDIs 1&2, which are 12G, and at the same time in HD on SDIs 3&4, which are 3G, as well as on the monitor out and in the VF. This is actually quite a significant upgrade; the original Venice is a little bit lacking in the way it handles LUTs. The ART look system is retained if you want even higher quality previews than are possible with 33-point LUTs. There is also built-in ACES support with a new RRT, which makes the camera extremely easy to use for ACES workflows, and the 16 bit linear X-OCN is a great fit for ACES.



It retains the ability to remove the sensor head so it can be used on the end of an extension cable. However, a new extension cable, which won’t be available until some time in 2023, is needed before the Venice II head can be separated, so Venice 1 will still have a place for some considerable time to come.

The original Venice only takes the 6K sensor, but Venice II can take either the original 6K sensor or the new 8K sensor.



Moving the dual base ISOs from 500/2500 to 800/3200 brings Venice II’s lower base ISO up to the same level as the majority of other cinema cameras. I know that some found 500 ISO slightly odd to work with. This will just make it easier to work alongside other similarly rated cameras.

Another interesting consideration is that you can shoot at 5.8K with a Super 35mm sized scan. This means that 4K Super 35mm material will have greater resolution than it does from the original Venice or the many other S35 cameras that only use 4K of pixels at S35. There is a lot of very beautiful Super 35mm cine glass available, and being able to shoot with classic cinema glass while getting an uplift in image resolution is very appealing. Additionally there will be some productions where the shallower DoF of Full Frame may not be desirable, or where the 8.6K files are too big and unnecessary. I can see Venice II being a very nice option for those wishing to shoot Super 35.

But where does this leave existing Venice owners? 

For a start, the price of Venice 1 is not going to change; Sony are not dropping the cost. This new Venice is an upgrade over the original and more expensive (but the price does include the high frame rate options). My suspicion, though, is that Venice II will not be significantly more expensive than the cost of the current Venice + R7 + HFR licence. Sony want this camera to sell well, so they won’t want to make it significantly more as then many would just stick with Venice 1. The original remains a highly capable camera that produces beautiful images, and if you don’t need 8.6K the reasons to upgrade are fewer. The basic colour science of both cameras remains the same, so there is no reason why both can’t be used together on the same projects. Venice 1 can work with lower cost SxS cards and XAVC-I if you need very small files and a very simple workflow, while Venice II pushes you to an AXS card based workflow, and AXS cards are very expensive.

If you have productions that need the Rialto system and the ability to un-dock the sensor, then this isn’t going to be available for Venice II for some time. So original Venice cameras will still be needed for Rialto applications (it will be 2023 before Rialto for Venice II becomes available).

Of course it always hurts when a new camera comes out, but I don’t think existing Venice owners should be too concerned. If customers really felt they needed 8.6K then they would likely already have been lost to a Red camera and the Red ecosystem. And now that there is an 8K Venice option, it might help keep the original Venice viable for second unit, Rialto (for now at least) or secondary roles within productions shooting primarily in 8K.

I like everything I see about Venice II, but it doesn’t make Venice 1 any less of a camera.


My Exposure Looks Different On My LCD Compared To My Monitor!

This is a common problem and something people often complain about. The brightness of the LCD screen on their camera and the image on their monitor never seem to quite match. Or, after the shoot and once in the grading suite, the pictures look brighter or darker than they did at the time of shooting.

A little bit of background info: most of the small LCD screens used on video cameras are SDR Rec-709 devices. If you were to calibrate the screen correctly, the brightness of white on the screen would be 100 nits. It’s also important to note that this is the same level used for monitors designed to be viewed in dimly lit rooms, such as edit or grading suites, as well as TVs at home.

The issue with uncovered LCD screens and monitors is that your perception of brightness changes according to the ambient light levels. Indoors in a dark room the image will appear quite bright. Outside on a sunny day it will appear much darker. It’s why all high end viewfinders have enclosed eyepieces: not just to help you focus on a small screen, but also because that way you are always viewing the screen under the same, always dark, viewing conditions. It’s why a video village on a film set will be in a dark tent. This allows you to calibrate the viewfinder with white at the correct 100 nit level, and when it is viewed in a dark environment your images will look correct.


If you are trying to use an unshaded LCD screen on a bright sunny day you may find you end up over exposing as you compensate for the brighter viewing conditions. Or, if you also have an extra monitor that is either brighter or darker, you may become confused as to which is the right one to base your exposure assessments on. Pick the wrong one and your exposure may be off. My recommendation is to get a loupe for the LCD; your exposure assessment will then be much more consistent as you will always be viewing the screen under the same near ideal conditions.

It’s also been suggested that perhaps the camera and monitor manufacturers should make more small, properly calibrated monitors. But I think a lot of people would be very disappointed with a properly calibrated but uncovered display where white is 100 nits, as it would be too dim for most outside shoots. Great indoors in a dim room such as an edit or grading suite, but unusably dim outside on a sunny day. Most smaller camera monitors are uncalibrated and place white 3 or 4 times brighter, at 300 nits or so, to make them more easily viewable outside. But because there is no standard for this there can be great variation between different monitors, making it hard to know which one to trust under different ambient light levels.

New Arri-Look LUT

Sample frames: Arri-Look LUT V1, Arri Look V1 Sample 2 and s709 for comparison.

 

UPDATE – Some issues with the original version of the LUT were found by some users, so I have created a revised version, which is now linked below.

Arri Look LUTs are clearly very popular with a lot of Sony users, so I have created an Arri-Look LUT for the FX3/FX6/FX9/Venice that can be used to mimic the look of an Arri camera. It is not designed to pretend to be a real Arri camera, but rather to provide an image with the look and feel of an Arri camera, tailored to the Sony sensors.

As usual the LUT is free to download, but if you do find it useful I do ask that you buy me a coffee or other drink as a thank you. All contributions are always most welcome. Additionally, do let me know what you like or don’t like about this LUT, so I can consider which LUTs would be good to create in the future.

Click Here to download my Arri-Look LUT (latest version 2C).

And here is a warmer version, version 2B (it may be very slightly too warm).
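If you are curious how a 3D LUT like this actually works, here is a minimal sketch of loading a .cube file and applying it to a normalised RGB image in Python with NumPy. This is only my own illustration, not part of the LUT itself: the file name is a placeholder, and for brevity it uses a nearest-neighbour lookup where a real implementation would interpolate between the LUT’s grid points.

```python
import numpy as np

def load_cube(path):
    """Parse a .cube 3D LUT into an (N, N, N, 3) array indexed [blue][green][red]
    (in the .cube format the red component varies fastest)."""
    size, rows = 0, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.upper().startswith("LUT_3D_SIZE"):
                size = int(line.split()[-1])
            elif line[0].isdigit() or line[0] in "-.":
                rows.append([float(v) for v in line.split()])
    return np.array(rows, dtype=np.float32).reshape(size, size, size, 3), size

def apply_lut(image_rgb, table, size):
    """Apply the LUT to an image with values in the 0..1 range.
    Nearest-neighbour lookup only - real implementations interpolate."""
    idx = np.clip(np.rint(image_rgb * (size - 1)).astype(int), 0, size - 1)
    return table[idx[..., 2], idx[..., 1], idx[..., 0]]

# Hypothetical usage: table, n = load_cube("arri_look_v2c.cube")
```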

Click below to buy me a thank you drink if you like it and use it.


 



Northern Lights Photo and Video Tours Back On.

Captured on the first night at the cabins in 2018.

 

After having to skip a year, my Northern Lights tours are back on again starting January 2022. These trips are made for those who appreciate the beauty of nature. The Arctic is a spectacular place in so many ways, especially in winter when the low arctic sun skims along the horizon, providing golden hour light all day.

During the long nights when the sky is clear the Northern Lights come out to play. The cold air provides very clear viewing and most guests are blown away by the number of stars visible. It’s a photographer’s paradise.

For more information take a look at the tour page. If you are interested, send me a message. 

Northern Lights Expeditions.

Free Sony FX6 and FX3 Tutorial Videos

Hidden away in the Sony Alpha Academy are 6 tutorial videos that I made for the Sony Cinema Line cameras, most notably the FX6 and FX3. These videos mainly cover the FX6, but information on the FX3 (and FX9) is also included in several of the videos.

The 6 videos cover the following subjects:

FX6 – Scan Modes and Codecs (including information on recording media)
FX6/FX3 – What is S-Cinetone.
FX6 – How to use the Cine-EI mode to shoot S-Log3.
FX6/FX3 – Slow Motion and Timelapse.
FX6/FX3 – Exposure tools (covering waveform and histogram as well as Zebras)
FX6/FX3 – Post Production Stabilisation.

To watch these videos you will need to set up a free account with Sony. Then go to the Alpha Academy page linked below, scroll down to the Filmmaking section and open the My Sony Expert tab.

https://www.sony.co.uk/alphauniverse/alpha-academy/videos

DaVinci Resolve Frame Rendering Issue and XAVC

There is a bug in some versions of DaVinci Resolve 17 that can cause frames in some XAVC files to be rendered in the wrong order. This results in renders where the video appears to stutter or the motion jumps backwards for a frame or two. This has now been fixed in version 17.3.2, so all users of XAVC and DaVinci Resolve are urged to upgrade to at least version 17.3.2.

https://www.blackmagicdesign.com/uk/support/family/davinci-resolve-and-fusion

SDI Failures and what YOU can do to stop it happening to you.

Updated 22/01/2024.

Sadly this is not an uncommon problem. Suddenly, and for no apparent reason, the SDI (or HDMI) output on your camera stops working. And this isn’t a new problem either; SDI and HDMI ports have been failing ever since they were first introduced. This issue affects all types of SDI and HDMI ports, but it is more likely with higher speed SDI ports such as 6G or 12G because they operate at higher frequencies, and the components used are more easily damaged as it is harder to protect them without degrading the high frequency performance.

Probably the most common cause of an SDI/HDMI port failure is the use of the now near ubiquitous D-Tap cable to power accessories connected to the camera. The D-Tap connector is, sadly, very crudely designed. Not only is it possible to plug many of the cheaper ones in the wrong way around, but with a standard D-Tap plug there is no mechanism to ensure that the negative or “ground” connection of the D-Tap cable makes or breaks before the live connection. There is, however, a special but much more expensive D-Tap connector available that includes electronic protection against this very issue (although a great product, even these cannot totally protect against a poor ground connection) – see: https://lentequip.com/products/safetap

Imagine for a moment you are using a monitor that’s connected to your camera’s SDI or HDMI port. You are powering the monitor via the D-Tap on the camera’s battery as you always do and everything is working just fine. Then the battery has to be changed. To change the battery you have to unplug the D-Tap cable, and as you pull the D-Tap out the ground pin disconnects fractionally before the live pin. During that extremely brief moment there is still positive power going to the monitor, but because the ground on the D-Tap is now disconnected the only ground route back to the battery is via the SDI/HDMI cable and back through the camera. For a fraction of a second the SDI/HDMI cable becomes the power cable, and that power surge blows the SDI/HDMI driver chip or damages the camera’s motherboard.

After you have completed the battery swap, you turn everything back on and at first all appears good, but now you can’t get the SDI or HDMI output to work. There’s no smoke, no burning smells, no obvious damage as it all happened in a tiny fraction of a second. The only symptom is a dead SDI or HDMI.

And it’s not only D-Tap cables that can cause problems. A lot of the cheap DC barrel connectors have a center positive terminal that can connect before the outer barrel makes a good connection. There are many connectors where the positive can make before the negative.

You can also have problems if the connection between the battery and the camera isn’t perfect. A D-Tap connected directly to the battery might represent an easier route for power to flow back to the battery if there is corrosion on the battery terminals or a loose battery plate or adapter.

It can also happen when powering the camera and monitor (or other SDI connected devices like a video transmitter or timecode box) via separate mains adapters. The power outputs of most of the small, modern, generally plastic bodied switch mode power adapters and chargers are not connected to ground. They have a positive and negative terminal that “floats” above ground at some unknown voltage, and each power supply’s negative rail may be at a completely different voltage relative to ground. So again an SDI or HDMI cable connected between two devices powered by different power supplies will act as the ground between them, and power may briefly flow down the SDI cable as the SDI cable’s ground brings both power supply negative rails to the same common voltage. Failures this way are much less common, but they do still occur.

For these reasons you should always connect all your power supplies and power cables, especially D-Tap or other DC power cables, first. Avoid using adapters between the battery and the camera, as each adapter plate is another possible cause of trouble.

Then, while everything remains switched off, the very last thing to connect should be the SDI or HDMI cables. Only when everything is connected should you turn anything on. But beware – there is a myth that turning cameras and monitors off before plugging or unplugging is enough to stop this issue. This simply isn’t true, because power is fed to the monitor and camera even when they are switched off, so power loops and surges can still occur.

If unplugging or re-plugging a monitor (or anything else for that matter), turn everything off first. Do not connect or disconnect anything while any of the equipment is on. The greatest moment of risk is the moment you connect or disconnect any power cables, such as when swapping a battery where you are using a D-Tap to power any accessories.
So, if changing batteries, switch EVERYTHING off first, then disconnect your SDI or HDMI cables before disconnecting the D-Tap or other power cables. Seriously – you need to do this: disconnect the SDI or HDMI before changing the battery if the D-Tap cable has to be unplugged from the battery. Things are quite a bit safer if any D-Tap cables are connected directly to the camera or to a power plate that remains connected to the camera, as this way you can change the battery without needing to unplug the D-Tap cables, which reduces the risk of issues.

Also inspect your cables regularly and check for damage to the pins and the cable. If you suspect that a cable isn’t perfect – throw it away, don’t take the risk. I’ve seen plenty of examples of D-Tap cables where one of the wires has broken off the connector pins.

A great safety check is to turn on your monitor immediately after connecting the power, but before connecting any SDI or HDMI cables. If the monitor comes on OK, this is evidence that the power is correctly connected, and then you can connect the SDI or HDMI cable. However, while a really good idea, this only indicates that there is some power to the monitor; it does not ensure that the ground connection is 100% OK.
 

The reason Arri talk about shielded power cables is that most shielded power cables use connectors such as Lemo or Hirose where the body of the connector is grounded to the cable shield. This helps ensure that when plugging the power cable in it is the ground connection that is made first and the power connection after, and when unplugging, the power breaks first and the ground after. When using properly constructed shielded power cables with Lemo or Hirose connectors it is much less likely that these issues will occur (but not impossible).

Is this an SDI/HDMI fault?

No, not really. The fault lies in the use of power cables that allow the power to make before the ground, or the ground to break before the power – a badly designed power connector, often made as cheaply as possible. D-Tap was originally designed to be used with high power video lights; it wasn’t designed for delicate monitors, and the design will allow it to be plugged in the wrong way around if you force it.
 
Additionally it could be user error. I know I’m guilty of rushing to change a battery and pulling a D-Tap connector without first disconnecting the SDI on many occasions, but so far I’ve mostly gotten away with it (I have blown an SDI on one of my Convergent Design Odysseys).

If you are working with an assistant or as part of a larger crew do make sure that everyone on set knows not to plug or unplug power cables or SDI cables without checking that it’s OK to do so – and always unplug the SDI/HDMI before disconnecting or removing anything else.
 
How many of us have set up a camera, powered it up, got a picture in the viewfinder and then plugged in the monitor via an SDI or HDMI cable? Don’t do it! Plug and unplug in the right order – connect ALL power cables and power supplies first, check power is going to the camera and the monitor by turning them on, then finally plug in the SDI. When removing a battery, unplug the SDI/HDMI, power down the camera and only then remove the D-Tap from the battery.

New LUTs from Sony

 

I was asked by Sony to produce a couple of new LUTs for them. These LUTs were inspired by many recent blockbuster movies and have been named “Space Adventure” and “Super Hero”.

Both LUTs are available for free, and there is a link on the page below that will allow you to obtain them.

Rather than explain the two different looks here, go to this page on the Sony website: https://pro.sony/en_GB/filmmaking/filmmaking-solutions/full-frame-cinematic-look

Scroll down to where it says “Stunning Cinematic Colour” and there you will find a video called “Orlaith” that shows both LUTs applied to the same footage.

Orlaith is a Gaelic name and it is pronounced “orla”. It is the name of a mythical golden princess. The short film was shot on a teeny-tiny budget in a single evening with an FX3 and FX6 using S-Log3 and SGamut3.cine. Then the LUTs were applied directly to the footage with no further grading.