
Post-Production – Get those people/birds out of my shot

  • Writer: Travis Parkes
  • Apr 28, 2023
  • 4 min read

Whilst my project is very much VFX over story, one plot point that needed to stay consistent was the lack of people in the area, which is why I shot the project so early in the morning when there was nobody about. But as this is still a university and I am only a student, I could not control who was and wasn't in a shot. While I could usually wait for people to pass, sometimes the best take still had something or someone in the distance.

The first case of this is in shot 3, where a person can be seen moving in a room behind a far window.

This is a fairly simple shot to fix: freeze a frame where the woman is not in view and project that patch into the correct place for the entirety of the shot.

I used the ModelBuilder node to make sure the card was placed at the correct angle and position in 3D space.
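The "correct angle and position" matters because the camera only ever sees the card through a perspective projection. A minimal pinhole-projection sketch in plain Python (an illustration of the principle, not Nuke's actual ModelBuilder internals; the numbers are made up):

```python
def project_point(point, focal_length):
    """Project a 3D camera-space point onto the image plane (pinhole model)."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    return (focal_length * x / z, focal_length * y / z)

# The same offset twice as far from the camera lands at half the screen
# distance, which is why the card's depth and angle have to match the plate.
near = project_point((1.0, 0.5, 2.0), focal_length=50.0)  # (25.0, 12.5)
far = project_point((1.0, 0.5, 4.0), focal_length=50.0)   # (12.5, 6.25)
```

If the card sits at the wrong depth, the patch slides against the background as the camera moves, which is exactly what placing it correctly in 3D avoids.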

With the tracked, frozen window projected into place, the woman no longer appears in the shot.

Another issue I had was with the last shot. In the original version the camera moves to the right and follows the smaller robot on a walk to a wall. Unfortunately I had to cut that aspect of the short and needed the camera to simply hold on the two robots instead. That was easy enough: I reversed the clip to stretch it out. But a bird flies through the background, and once the clip is reversed the bird appears twice. The solution? To paint said bird out.

This was very simple, as the bird was in the distance against a blown-out sky. All I needed to do was track the bird and use that track to drive a roto shape plugged into a Keymix node, which pulled in another part of the sky. This made for a highly efficient fix.
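The Keymix step boils down to a per-pixel mix between the plate and a clean patch, driven by the roto alpha. A plain-Python sketch of that mix (an illustration only, not Nuke code; the tiny arrays are invented values):

```python
def keymix(plate, clean, mask):
    """Per-pixel mix: where mask is 1, use the clean patch; elsewhere keep
    the plate. This mirrors the idea of keying one image over another."""
    return [
        [c * m + p * (1.0 - m) for p, c, m in zip(prow, crow, mrow)]
        for prow, crow, mrow in zip(plate, clean, mask)
    ]

plate = [[0.9, 0.2], [0.9, 0.9]]   # 0.2 = the dark bird against bright sky
clean = [[0.9, 0.9], [0.9, 0.9]]   # a clean patch of sky from elsewhere
mask  = [[0.0, 1.0], [0.0, 0.0]]   # roto shape tracked onto the bird
result = keymix(plate, clean, mask)  # bird pixel replaced with clean sky
```

Because the sky was blown out and nearly uniform, any other patch of it is a convincing replacement, which is what made this fix so cheap.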

Those two were much nicer problems than the third. In shot 10 a man can be seen walking in the background; this was the best take of the scene and the issue was not noticed while filming.

The obvious fix was to paint and project as usual. However, this shot rotates around the characters, which means there is a large amount of parallax in the background: bushes and buildings sit at different depths, so a single projection distorts everything.

So I chose to go about this differently, with a more experimental technique: machine learning. I'm not completely new to Nuke's machine learning features, but they are still new and vary greatly in quality depending on the task. So I took a leap and trained a model on the clean-up of this shot.

But even building the dataset takes over 300 nodes.

The way the CopyCat node works is that it takes a frame from the original plate; here I have cropped to the area of focus for efficiency:

and, for that same frame, you give it a version of what you want the shot to look like:

I gave it 16 reference frames with the man painted out with the correct distortion. The CopyCat node then uses the GPU to compare the two images thousands of times: it works out the difference between them, tries to recreate it, measures how close it got, and uses that error to train itself.

This comparison process ran 80,000 times and generated a .cat file, which Nuke reads to process incoming data, in this case the cropped version of the original plate. The result was a very big "almost".
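That compare-measure-adjust loop can be illustrated with a toy example. Here the "model" is a single gain value fitted by gradient descent on squared error, standing in for CopyCat's actual network; every number is invented:

```python
# A toy version of the CopyCat idea: repeatedly compare the output with the
# reference, measure the error, and nudge the model to shrink it.
inp = [0.2, 0.4, 0.6]      # pixel values from the cropped plate
target = [0.1, 0.2, 0.3]   # the same pixels in the painted reference
gain = 1.0                 # the model's single parameter, starts untrained
lr = 0.1                   # learning rate

for step in range(1000):   # the real training ran ~80,000 steps
    # gradient of mean squared error between prediction and reference
    grad = sum(2 * (gain * x - t) * x for x, t in zip(inp, target)) / len(inp)
    gain -= lr * grad
# gain converges towards 0.5, the mapping the reference frames imply
```

A real network has millions of parameters instead of one, which is why it can learn "remove the man" from only 16 reference frames, and also why its output can still fall apart on fine detail.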

It understands what needs removing and broadly understands what to replace it with, but the digital artefacts are too significant, and the detail just isn't there. The CopyCat node simply isn't made for clean plating such as this. Luckily, I suspected this could be the outcome and made sure I could repurpose some of the training data if I did have to do a manual version.

And, as it turns out, the manual version wasn't as hard as I initially expected, so the machine learning technique was mostly a waste of time. Instead of projecting onto a card, I projected onto some proxy geometry so that the parallax in the shot was captured more accurately. That gave me multiple reference points on the shot that I could dissolve between, and since this area is defocused anyway, you can no longer see where the man used to be.
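Dissolving between reference points amounts to a linear blend between the projected patches. A minimal sketch with made-up patch values (an illustration of the blend, not the actual projection setup):

```python
def dissolve(patch_a, patch_b, t):
    """Linear dissolve between two projected reference patches.
    t=0 gives patch_a, t=1 gives patch_b; values in between blend the two."""
    return [a * (1.0 - t) + b * t for a, b in zip(patch_a, patch_b)]

ref_early = [0.4, 0.5, 0.6]   # clean patch projected from an early frame
ref_late = [0.6, 0.7, 0.8]    # clean patch projected from a later frame
mid = dissolve(ref_early, ref_late, 0.5)  # blends halfway between the two
```

Animating t across the shot hides the switch between references, and the background defocus covers whatever small mismatch remains.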

In an ideal world I wouldn't have to do any of this to get rid of people who weren't supposed to be there, but there was always going to be some kind of issue when filming, even with the number of test shoots I did to mitigate this.

