Category Archives: Shooting Tips

How To Live Stream With The Sony PXW-Z90 and NX80.

The Sony PXW-Z90 is a real gem of a camcorder. It’s very small yet packs a 1″ sensor, has real built-in ND filters and broadcast codecs, and produces a great image. On top of all that it can also stream live directly to Facebook and other similar platforms. In this video I show you how to set up the Z90 to stream live to YouTube; the process for Facebook is similar. Sony’s NX80 is very similar and can also live stream in the same way.

 

Rigging the PXW-FX9

In case you missed the live stream, I have uploaded the recording of my almost hour-long video with hints, tips and ideas for rigging the PXW-FX9. In the video I cover things like base plates, including VCT and Euro plate. I look at hand grip options, rod rails and matte boxes, as well as power options including V-mount adapters and the XDCA-FX9. Of course everything in the video is based on my own personal needs and requirements, but I think there is some good information in there for anyone looking to accessorize their FX9, whether for working from a tripod or handheld.

 

PXW-FX9 User Guide Now Available for FREE Download.

The PXW-FX9 user guide.

 

Sony have released the PXW-FX9 user guide that I wrote for them. The guide is in the form of a searchable PDF designed for reading on a mobile device, the idea being that you can keep it on your phone in case you need to reference it on a shoot. It’s not meant to replace the manual but to complement it and answer questions such as – what is S-Cinetone?

To download the guide go to the main Sony PXW-FX9 landing page and scroll down towards the bottom. There you should find a link that will take you to the guide download page as well as other resources for the FX9.

https://pro.sony/en_GB/products/handheld-camcorders/pxw-fx9

Look after Your Batteries In Lockdown.

I received this timely reminder from the guys at PAG Batteries and it contains important information even if you don’t have one of PAG’s excellent batteries. The main point is that you should not store lithium batteries fully charged.


If you are currently unable to work as a result of the global pandemic, then you need to make sure that your Li-Ion camera batteries are in good health when it becomes possible to return to work.
 
Batteries naturally self-discharge over time. If their state-of-charge is less than 10% before an extended period of inactivity, they could become difficult for you to recover.
 
It is also undesirable for batteries to be 100% charged for storage as this can damage the cells and lead to a shorter overall life.
 
PAG recommends that you charge your Li-Ion batteries to 50% (anywhere between 20% and 80% is desirable) prior to long term storage of more than 2 weeks. PAGlink batteries should also be in an unlinked state during this period.
 
 
PAGlink Sleep Mode for Storage
 
PAGlink Batteries can be put into Sleep Mode for long term storage, using the battery display menu system. It shuts down the internal electronics and greatly reduces battery self-discharge. The battery can be woken up with 2 presses of the display button.
 
Please refer to Section 6 of the User Guide for PAGlink Batteries.
 
 
 
PAGlink Gold Mount Batteries have an automatic sleep mode feature that initiates after 2 weeks of inactivity.
 
To maintain good battery capacity, it is beneficial to cycle batteries every 6 weeks (a discharge followed by a full charge).
 
If you have any queries regarding your PAG batteries, Sleep Mode and storage then please contact support via email: support@paguk.com
 
 
PAG would like to wish you all good health and a safe return to work when that day finally arrives.
 
The PAG team.
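To put PAG’s storage advice into numbers, here is a minimal sketch of the arithmetic, assuming a typical Li-Ion self-discharge rate of around 3% of capacity per month. That rate is my own illustrative assumption, not a PAG figure; check your battery’s data sheet for the real one.

```python
# Rough storage-time estimate for a Li-Ion battery.
# ASSUMPTION: ~3% of full capacity lost per month to self-discharge,
# treated as linear for simplicity; real rates vary by battery.
SELF_DISCHARGE_PER_MONTH = 3.0   # percentage points lost per month
RECOVERY_FLOOR = 10.0            # below this charge, recovery gets difficult

def safe_storage_months(start_charge_pct: float) -> float:
    """Months until the battery drifts below the recovery floor."""
    return max(0.0, (start_charge_pct - RECOVERY_FLOOR) / SELF_DISCHARGE_PER_MONTH)

print(safe_storage_months(50))   # ~13.3 months at the recommended 50% charge
print(safe_storage_months(15))   # ~1.7 months if stored nearly flat
```

Stored at the recommended 50%, a battery has around a year of headroom before it approaches the danger zone; stored nearly flat, it has only weeks.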

Streaming and Live Feeds.

With some difficult times ahead and the need for most of us to minimise contact with others, there has never been a greater need for streaming and online video services than now.

I’m setting up some streaming gear in my home office so that I can do some presentations and online workshops over the coming weeks.

I am not an expert on this, and although I did recently buy a hardware RTMP streaming encoder, like many of us I didn’t have a good setup for live feeds and streaming.

So, like so many people, I tried to buy a Blackmagic Design ATEM, which is a low cost all-in-one switcher and streaming device. But guess what? They are out of stock everywhere with no word on when more will become available, so I have had to look at other options.

The good news is that there are many options. There is always your mobile phone, but I want to be able to feed several sources including camera feeds, the feed from my laptop and the video output from a video card. 

OBS to the rescue!

The good news is that there is a great piece of open source software called OBS – Open Broadcaster Software – and its OBS Studio streaming application.

The OBS Studio software.

 

OBS is a great piece of software that can convert almost any video source connected to a computer into a live stream that can be sent to most platforms, including Facebook and YouTube. If the computer is powerful enough it can also switch between different camera and audio sources. If you follow the tutorials on the OBS website it’s pretty quick and easy to get it up and running.
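Under the hood, what OBS does is encode the video (typically to H.264) and push it to the platform’s RTMP ingest server. Just to illustrate that step, here is a minimal Python sketch that does the same thing with ffmpeg; the stream key is a placeholder and the encoder settings are illustrative assumptions, not recommendations:

```python
# Push a local video file to YouTube's RTMP ingest with ffmpeg - in
# essence the encode-and-send step that OBS performs for you.
# The stream key is a placeholder; bitrates are illustrative only.
import subprocess

STREAM_KEY = "xxxx-xxxx-xxxx-xxxx"  # placeholder - use your own key
INGEST_URL = f"rtmp://a.rtmp.youtube.com/live2/{STREAM_KEY}"

subprocess.run([
    "ffmpeg",
    "-re", "-i", "input.mp4",        # read the source at its native rate
    "-c:v", "libx264",               # H.264 video, as most platforms expect
    "-preset", "veryfast",
    "-b:v", "4500k",                 # ~4.5 Mb/s, a common 1080p streaming rate
    "-c:a", "aac", "-b:a", "128k",   # AAC audio
    "-f", "flv",                     # RTMP streams use the FLV container
    INGEST_URL,
], check=True)
```

OBS wraps all of this, plus source switching and compositing, in a GUI, which is why it is such a good starting point.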

So how am I getting video into the laptop that’s running OBS? I already had a Blackmagic Mini Recorder, which is an HDMI and SDI to Thunderbolt input adapter, and I shall be using this to feed the computer. There are many other options, but the BM Mini Recorders are really cheap and most dealers stock them, as does Amazon. It’s HD only, but for this I really don’t need 4K or UHD.

The Blackmagic Mini Recorder HDMI and SDI to Thunderbolt input adapter.

 

Taking things a step further, I also have both an Atomos Sumo and an Atomos Shogun 7. Both of these monitor/recorders have the ability to act as a 4 channel vision switcher. The great thing about these compared to the Blackmagic ATEM is that you can see all your sources on a single screen and you simply touch the source that you wish to go live. A red box appears around that source and it is output from the device.

The Atomos Sumo and the Shogun 7 can both act as 4 input vision switchers.

 

So now I have the ability to stream a feed via OBS from the SDI or HDMI input on the Blackmagic Mini Recorder, fed from one of 4 sources switched by the Atomos Sumo or Shogun 7. A nice little micro studio setup. My sources will be my FS5 and FX9, and I can use my Shogun as a video player. For workflow demos I will use another laptop or my main edit machine, feeding the video output from DaVinci Resolve via a Blackmagic Mini Monitor, which is similar to the Mini Recorder except that it is an output device with SDI and HDMI outputs. The final source will be the HDMI output of the edit computer so you can see the desktop.

Don’t forget audio. You can probably get away with very low quality video to get many messages across. But if the audio is hard to hear or difficult to understand then people won’t want to watch your stream. I’m going to be feeding a lavalier (tie clip) mic directly into the computer and OBS.

My main reason for writing this was really to show that many of us probably already have most of the tools needed to put together a small streaming package. Perhaps you can offer this as a service to clients that now need to think about online training or meetings. I was lucky enough to already have all the items listed in this article; the only extra I have had to buy is a second Thunderbolt cable, as I only had one. But even if you don’t have a Sumo or Shogun 7 you can still use OBS to switch between the camera on your laptop and any other external inputs. The OBS software is free and very powerful, and it really is the keystone to making this all work.

I will be starting a number of online seminars and sessions in the coming weeks. I do have some tutorial videos that I need to finish editing first, but once that’s done expect to see lots of interesting online content from me.  Do let me know what topics you would like to see covered and subject to a little bit of sponsorship I’ll see what I can do.

Stay well people. This will pass and then we can all get back on with life again.

Temporal Aliasing – Beware!

As camera resolutions increase and the amount of detail and texture that we can record grows, we need to be more and more mindful of temporal aliasing.

Temporal aliasing occurs when the differences between the frames in a video sequence create undesirable sequences of patterns that move from one frame to the next, often appearing to travel in the opposite direction to any camera movement. The classic example of this is the wagon wheels going backwards effect often seen in old cowboy movies. The camera’s shutter captures the spokes of the wheels in a different position in each frame, but the timing of the shutter relative to the position of the spokes means that the wheels appear to go backwards rather than forwards. This was almost impossible to prevent with film cameras that were stuck with a 180 degree shutter, as there was no way to blur the motion of the spokes so that it was contiguous from one frame to the next. A 360 degree shutter would have prevented this problem in most cases, but it’s also reasonable to note that at 24fps a 360 degree shutter would have introduced an excessive amount of motion blur elsewhere.

Another form of temporal aliasing that often occurs is when you have rapidly moving grass, crops, reeds or fine branches. Let me try to explain:

You are shooting a field of wheat and the stalks are very small in the frame, almost too small to discern individually. As the stalks of wheat move left, perhaps blown by the wind, each stalk will be captured in each frame a little more to the left, perhaps by just a few pixels. But in the video they appear to be going the other way. This is because every stalk looks the same as all the others, and in the following captured frame the original stalk may have moved, say, 6 pixels to the left. But now there is also a different stalk just 2 pixels to the right of where the original was. Because both stalks look the same, it appears that the stalk has moved right instead of left. As the wind speed and the movement of the stalks change, they may appear to move randomly left or right or a combination of both. The image looks very odd, often a jumbled mess, as perhaps the tops of the stalks appear to move one way while lower parts appear to go the other.
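The numbers in that example hide a simple bit of modular arithmetic. Here is a minimal sketch of it, using the hypothetical figures above (identical stalks every 8 pixels, a true shift of 6 pixels left per frame):

```python
# Why a repeating pattern can appear to move the wrong way.
# Hypothetical numbers: identical stalks every 8 px, true motion
# 6 px LEFT per frame (negative = left, positive = right).
spacing = 8
true_shift = -6

# Because every stalk looks identical, the eye matches each stalk to
# its nearest look-alike in the next frame, which wraps the true shift
# into the range +/- spacing/2.
apparent_shift = (true_shift + spacing / 2) % spacing - spacing / 2
print(apparent_shift)  # 2.0 -> the field appears to drift 2 px RIGHT
```

Any true shift larger than half the pattern spacing gets folded back like this, which is why the apparent direction flips back and forth as the wind speed varies.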

There is a great example of temporal aliasing here in this clip on Pond5 https://www.pond5.com/stock-footage/item/58471251-wagon-wheel-effect-train-tracks-optical-illusion-perception

Notice in the Pond5 clip how it’s not only the railway sleepers that appear to move in the wrong direction or at the wrong speed, but also how the stones between the sleepers look like some kind of boiling noise.

As with the old movie wagon wheels, one thing that makes this worse is the use of too fast a shutter speed. The more you freeze the motion of the offending objects or textures in each frame, the higher the risk of temporal aliasing with moving textures or patterns. Often a slower shutter speed will introduce enough motion blur that the motion looks normal again. You may need to experiment with different shutter speeds to find the sweet spot where the temporal aliasing goes away or is minimised. If shooting at 50fps or faster, try a 360 degree 1/50th shutter, as by the time you get to a 1/50th shutter motion is already as crisp as it needs to be for most types of shots, unless you are intending to do some form of frame-by-frame motion analysis.
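For anyone who wants to experiment, the relationship between frame rate, shutter angle and exposure time is straightforward. A quick sketch:

```python
def exposure_time(fps: float, shutter_angle: float) -> float:
    """Per-frame exposure time in seconds for a given shutter angle."""
    return (shutter_angle / 360.0) / fps

# Express the results as 1/x shutter speeds:
print(1 / exposure_time(24, 180))   # 48.0  -> 1/48th, the classic film look
print(1 / exposure_time(50, 360))   # 50.0  -> 1/50th, maximum blur at 50fps
print(1 / exposure_time(50, 180))   # 100.0 -> 1/100th, crisper but more aliasing risk
```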

Using User Files and All Files to Speed Up Switching Modes on the FX9.

Sometimes changing modes or frame rates on the FX9 can involve the need to change several settings. For example if you want to go from shooting Full Frame 6K at 23.98fps to shooting 120fps then you need to change the sensor scan mode before you can change the frame rate. One way to speed up this process is to use User Files or All Files to save your normal operating settings. Then instead of going through pages of menu settings you just load the appropriate file.

All Files save just about every single adjustable setting in the camera, everything from your white balance settings to LUTs to network settings to any menu customisations. User Files save a bit less. In particular, User Files can be set so that they don’t change the white balance. For this reason, for things like changing the scan mode and frame rate, I prefer to use User Files.

You can add the User File and/or All File menu items to the user menu. If you place them at the top of the user menu, they will be the very first items listed when you enter the camera’s menu system for the first time after powering it on.

Both User Files and All Files are found under the “project” section in the FX9 menu system. The files are saved to an SD card in the SD Card Utility slot. This means you can easily move them from one camera to another.

Before you save a file you first have to give it a name. I recommend that the name includes the scan mode, for example “FF6K” or “2KS35”, the frame rate, and whether it’s CineEI or not.

Then save your file to the SD card. When loading a User File, the “load customize data” option determines whether the camera will load any changes you have made to the user menu. “Load white data” determines whether the camera will load and overwrite the current white balance setting with the one saved in the file. When loading an All File, the white balance and any menu customizations are always loaded regardless, so your current white balance setting will be overwritten by whatever is in the All File. You can however choose whether to load any network user names and passwords.

How We Judge Exposure Looking At An Image And The Importance Of Viewfinder Contrast.

This came out of a discussion about viewfinder brightness where the complaint was that the viewfinder on the FX9 was too bright when compared side by side with another monitor. It got me really thinking about how we judge exposure when purely looking at a monitor or viewfinder image.

To start with, I think it’s important to understand a couple of things:

1: Our perception of how bright a light source is depends on the ambient light levels. A candle in a dark room looks really bright, but outside on a sunny day it is not perceived as being so bright. But of course we all know that the light being emitted by that candle is exactly the same in both situations.

2: Between the middle grey of a grey card and the white of a white card there are about 2.5 stops. Faces and skin tones fall roughly half way between middle grey and white. Taking that a step further, between what most people will perceive as black (something like a black card or a black shirt) and a white card there are around 5 to 6 stops, and faces will always be roughly 3/4 of the way up that brightness range, somewhere around 4 stops above black. It doesn’t matter whether that’s outside on a dazzlingly bright day in the desert in the Middle East or on a dull overcast winter’s day in the UK, those relative levels never change.
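Those relative levels are easy to verify with a couple of lines of arithmetic. A quick sketch, assuming a 5.5 stop black-to-white range (the middle of the 5 to 6 stop range mentioned above):

```python
# Where a face sits in the black-to-white brightness range, in stops.
# ASSUMPTION: a 5.5 stop black-to-white range (middle of the 5-6 stop
# range quoted above); stops are log2 brightness ratios.
black_to_white = 5.5                  # stops from black card to white card
grey_to_white = 2.5                   # stops from middle grey to white
face_above_grey = grey_to_white / 2   # faces sit about half way from grey to white

face_above_black = (black_to_white - grey_to_white) + face_above_grey
print(face_above_black)                   # 4.25 -> about 4 stops above black
print(face_above_black / black_to_white)  # ~0.77 -> roughly 3/4 of the way up
```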

Now think about this:

If you look at a picture on a screen and the face is significantly brighter than middle grey and much closer to white than middle grey, what will you think? To most people it will almost certainly appear over exposed, because we know that in the real world a face sits roughly 3/4 of the way up the relative brightness range and roughly half way between middle grey and white.

What about if the face is much darker than white and close to middle grey? Then it will generally look under exposed as relative to black, white and middle grey the face is too dark.

The key point here is that we make these exposure judgments based on where faces and other similar things are relative to black and white. We don’t know the actual intensity of the white, but we do know how bright a face should be relative to white and black.

This is why it’s possible to make an accurate exposure assessment using a 100 Nit monitor or a 1000 Nit daylight viewable monitor. Provided the contrast range of the monitor is correct, so that black looks black, middle grey is in the middle and white looks white, skin tones will be 3/4 of the way up from black and 1/4 down from white when the image is correctly exposed.

But here’s the rub: If you put the 100 Nit monitor next to the 1000 Nit monitor and look at both at the same time, the two will look very, very different. Indoors in a dim room the 1000 Nit monitor will be dazzlingly bright, meanwhile outside on a sunny day the 100 Nit monitor will be barely viewable. So which is right?

The answer is they both are. Indoors, with controlled light levels or when covered with a hood or loupe then the 100 Nit monitor might be preferable. In a grading suite with controlled lighting you would normally use a monitor with white at 100 nits. But outside on a sunny day with no shade or hood the 1000 Nit monitor might be preferable because the 100 nit monitor will be too dim to be of any use.

Think of this another way: take both monitors into a dark room and take a photo of each monitor with your phone. The phone’s camera will adjust its exposure so both will look the same, and the end result will be two photos where the screens look the same. Our eyes have irises just like a camera and do exactly the same thing: they adjust so that the brightness is within the range our eyes can deal with. So the actual brightness is only of concern relative to the ambient light levels.

This presents a challenge to designers of viewfinders that can be used both with and without a loupe or shade, such as the LCD viewfinder on the FX9, which can be used both with the loupe/magnifier and without it. How bright should you make it? Not so bright that it’s dazzling when using the loupe, but bright enough to be useful on a sunny day without the loupe.

The actual brightness isn’t critical (beyond whether it’s bright enough to be seen or not) provided the perceived contrast is right.

When setting up a monitor or viewfinder it’s the adjustment of the black level and black pedestal that alters the contrast of the image (the control for which is confusingly called the brightness control). This “brightness” control is the critical one, because if it raises the blacks by too much you make the shadows and mids brighter relative to white and less contrasty, so you will tend to expose lower in an attempt to have good contrast and a normal looking mid range. Exposing brighter makes the mids look excessively bright relative to where white is and to the black screen surround.

If the brightness is set too low, pulling the blacks and mids down, then you will tend to over expose in an attempt to see details and textures in the shadows and to make the mids look normal.

It’s all about the monitor or viewfinder’s contrast and where everything sits between the darkest and brightest parts of the image. The peak brightness (equally confusingly, set by the contrast control) is largely irrelevant because our perception of how bright it is depends entirely on the ambient light level; just don’t over drive the display.

We don’t look at a VF and think – “Ah that face is 100 nits”.  We think – “that face is 3/4 of the way up between black and white” because that’s exactly how we see faces in all kinds of light conditions – relative levels – not specific brightness.

So far I have been discussing SDR (standard dynamic range) viewfinders. Thankfully I have yet to see an HDR viewfinder, because an HDR viewfinder could actually make judging exposure more difficult. “White”, such as a white card, isn’t very bright in the world of HDR, and an HDR viewfinder would have a far greater contrast range than just the 5 or 6 stops of an SDR finder. The viewfinder’s peak brightness could well be 10 times or more brighter than the white of a white card. That complicates things, as first you need to judge and assess where white is within a very big brightness range. But I guess I’ll cross that bridge when it comes along.

Hot Pixels and White Dots From My New Camcorder (FX9 and many others).

So you have just taken delivery of a brand new PXW-FX9, turned it on and plugged it into a 4K TV or monitor – and, shock horror, there are little bright dots in the image – hot pixels.

First of all, don’t be alarmed. This is not unusual; in fact I’d actually be surprised if there weren’t any, especially if the camera has travelled by air freight.

Video sensors have millions of pixels and they are prone to disturbance from cosmic rays. It’s not unusual for some to become out of spec. So all modern cameras incorporate various methods of recalibrating or re-mapping those pesky problem pixels. On the Sony professional cameras this is called APR. Owners of the Sony F5, F55, Venice and FX9 will see a “Perform APR” message every couple of weeks as this is a function that needs to be performed regularly to ensure you don’t get any problems.

You should always run the APR function after flying with the camera, especially on routes over the poles, as cosmic ray levels are greater in these areas. Also, if you intend to shoot at high gain levels it is worth performing an APR run before the shoot.

If your camera doesn’t have a dedicated APR function, typically found in the maintenance section of the camera menu system, then often the black balance function will have a very similar effect. On some Sony cameras, repeatedly performing a black balance will activate the APR function.

If there are a lot of problem pixels then it can take several runs of the APR routine to sort them all out. But don’t worry, it is normal and it is expected, and all cameras suffer from it. Even if you have 1000 dead pixels, that’s still only a teeny tiny fraction (around 0.005%) of the 19 million pixels on the sensor.

APR takes just 30 seconds or so to complete. It’s also good practice to black balance at the beginning of each day to help minimise fixed pattern noise and set the camera’s black level correctly. Just remember to ensure there is a cap on the lens or camera body to exclude all outside light when you do it!

SEE ALSO: https://www.xdcam-user.com/2011/02/are-cosmic-rays-damaging-my-camera-and-flash-memory/

Struggling With Blue LED Lighting? Try Turning On The Adaptive Matrix.

It’s a common problem. You are shooting a performance or event where LED lighting has been used to create dramatic coloured lighting effects. The intense blue from many types of LED stage lights can easily overload the sensor and instead of looking like a nice lighting effect the blue light becomes an ugly splodge of intense blue that spoils the footage.

Well there is a tool hidden away in the paint settings of many recent Sony cameras that can help. It’s called “adaptive matrix”.

When adaptive matrix is enabled and the camera sees intense blue light, such as the light from a blue LED light, the matrix adapts and reduces the saturation of the blue colour channel in the problem areas of the image. This can greatly improve the way such lights and lighting look. But be aware that if you are shooting objects with very bright blue colours, perhaps even a bright blue sky, the adaptive matrix may desaturate them. Because of this, the adaptive matrix is normally turned off by default.

If you want to turn it on, it’s normally found in the camera’s paint and matrix settings, and it’s simply a case of setting adaptive matrix to on. I recommend that you turn it back off again when you don’t actually need it.

Most of Sony’s broadcast quality cameras produced in the last 5 years have the adaptive matrix function; that includes the FS7, FX9, Z280, Z450, Z750 and many others.