Nuke – GreenScreen Motion Graphic

During one of our green screen sessions, we filmed a set of two videos in our green screen studio and were then taught how to key these out using the keyer nodes and a garbage matte. We also learned about using these to build out a 3D scene and creating environment nodes based off of the videos. I originally wanted to green screen a different area, but had some issues getting the environment nodes into a 3D space, so I chose to come back to that task later with advice from my professor.

This shot had already been keyed and had a light wrap applied, so I went for the motion graphic of adding something to the screen when our filmed student points. I built out a small board using Constant, Ramp and Transform nodes, and used a switch between both constant and linear keyframing in the dope sheet.

A lot of the keyframing came down to alternating between copying keyframes and using a constant two-frame animation style. Unfortunately this isn't done through a simple Ctrl+C and Ctrl+V, but after a lot of headaches and Google searching I found ways to do both this and opacity keyframing.
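For anyone curious, here's the difference between the two keyframe styles in plain maths. This is a hypothetical helper I wrote to illustrate the idea, not Nuke's API: constant interpolation holds a key's value until the next key, while linear blends between them.

```python
# Hypothetical helper, not Nuke's API: evaluate a keyframed knob at a frame
# using either "constant" (hold) or "linear" interpolation.
def evaluate(keys, frame, interpolation="linear"):
    """keys: sorted list of (frame, value) pairs."""
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            if interpolation == "constant":
                return v0  # hold the previous key's value
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)  # blend towards the next key
```

The "constant 2 frame" style is just two keys side by side so the hold snaps instead of blending.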

I used some opacity keyframing on both the underlines and text on the small weather board.

I then set keyframes on the actual Merge node, using its mix option as an opacity.
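As I understand it, the mix knob just dissolves the merge result back towards the B input, which is why it works as an opacity control. A quick per-channel sketch with premultiplied inputs (my own toy version, not Nuke's code):

```python
# Sketch of a Merge (over) with the mix knob acting as opacity.
# Inputs are premultiplied; mix=0 gives back B, mix=1 the full over.
def merge_over_mix(a, b, a_alpha, mix):
    over = a + b * (1.0 - a_alpha)       # standard premultiplied "over"
    return b * (1.0 - mix) + over * mix  # mix dissolves back to B
```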

Personally, I really do not like using Nuke for almost anything motion-graphics related. While I did enjoy doing motion graphics again, the simplest tasks in Nuke take an absurdly long time even if you know exactly what you're doing, compared to something like After Effects. Coupled with the fact that After Effects has Illustrator compatibility through Creative Cloud, Nuke gives more control but at the cost of time. Doing something this simple in a motion-graphics style had me looking up a lot of different workarounds in Nuke for basic things, like using an opacity gizmo or not being able to copy keyframes easily, which bog down the process overall. While Nuke could be used for motion graphics, I would most likely end up using something like After Effects.

Below is the animation, though it has been converted to web resolution at 1280×720 and the bitrate has been dropped due to converting it in Media Encoder for web.

Term 2 – Nuke Week 5 Roto Work

This week we did similar point tracking to the previous week, but added roto work on top of it. We created roto projections and even made some 2D images appear 3D by creating lighting and using the ModelBuilder.

Term 2 – Nuke Weeks 3 and 4 3D Tracking

These weeks we worked on 3D tracking in scenes to clean out small patches and create 3D scenes in Nuke for implementation.

One quick thing we did beforehand was learning how to change the direction of pipes in Nuke, to make it easier to see where things connect.

We first learned how to do a lens undistort, as we would need to undistort the footage before tracking it. We applied a distort and then an undistort to get a clean undistorted version.

There is also a way to undistort that is less heavy than the LensDistortion node, but it is also a lot more complex.
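If I understand it right, the lighter approach is baking the warp into an STMap: a UV image whose red and green channels tell each output pixel where to sample from, which is cheap to replay every frame. A toy 1D version of that lookup (my sketch, not the node's implementation):

```python
# Toy 1D STMap lookup: "stmap" holds a normalized source position for each
# output pixel; sample the image there with linear filtering.
def apply_stmap(image, stmap):
    n = len(image)
    out = []
    for u in stmap:
        x = u * (n - 1)            # normalized position -> pixel space
        i = int(x)
        j = min(i + 1, n - 1)
        t = x - i
        out.append(image[i] * (1 - t) + image[j] * t)
    return out
```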

We learned about using the ScanlineRender node, working with 3D objects in Nuke's 3D view, and adding textures to those objects.

Afterwards we looked at making complex 3D scenes. These wouldn't replace Maya, but would be used to make simpler things and speed up smaller fixes instead of re-rendering.

We first created a track and gave the scene some treatment, which would help get the tracking of the scene done properly.

We then used the Camera Tracker node to get a proper track and create point clouds for our 3D Scene.

I then made a mask of the places I didn't want to track, tracked the shot and got a decent camera solve.

We then learned about using our point clouds and building geometry around them to get a decent scale of the scene. We also learned about using the ModelBuilder in the 3D scene.

We stopped and continued this the next week as we were having some small issues.

In the next class we refined what we were doing, got a more accurate track, tracked some geometry properly using the model builder, and implemented some patches onto the geo.

These would be some of the fundamental ways in which I would do cleanup on the Crypt project.

Term 2 – Nuke Weeks 1 and 2

In Nuke these weeks, we learned about removing tracking dots, using motion vectors and re-matching grain to create proper comps in Nuke.

We started by learning simple regraining when comping things into or out of shots. This entails making a denoised plate of the shot, then taking the noise difference between the plates and applying it onto the comped areas.

When we do our rotos for a scene, we use a frame hold to freeze our paint so it doesn't move around, but since at that point it is a static image, it needs grain reapplied to make it look real.

If you want your noise to match the original plate, you merge the denoised plate and the original with the operation set to minus. This gives you only the grain that was removed from the original plate.

After this you use a Tracker and a Transform matchmove to get a track of your patch to place over top of the original.

There were two ways to regrain this properly, one being to regrain it through the above method of getting the original grain, then using a plus merge and a Keymix to put it on my patch.

The other way was using F_ReGrain to get a sample of the original grain for the patch and then merging it to create the grain that way.
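Numerically, the first route comes down to very simple arithmetic: the minus merge isolates the grain, and the plus merge lays it back over the clean patch. A per-pixel sketch with made-up values:

```python
# Grain = original minus denoised; regrain = patch plus that grain.
def extract_grain(original, denoised):
    return [o - d for o, d in zip(original, denoised)]

def regrain(patch, grain):
    return [p + g for p, g in zip(patch, grain)]
```

So a flat patch picks up exactly the noise pattern of the plate around it.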

After this there were three ways of reviewing the grain implementation to make sure it looked real: using a blur and merge with a grade, using a grain check, and using QC Technical to check the edges of our patch.

After this we worked on comping something in and replicating the lighting data and luminance onto a patch. We did this by using a CurveTool and keying the luma data into a Grade node. This gives us our luma values and keyframes them into a grade to allow for light flickering.
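What the CurveTool is sampling is essentially a per-frame average luma; the Rec. 709 weighting below is the usual formula (my own sketch, not the node's exact code), and that per-frame number is what gets baked into the Grade's keyframes to match the flicker.

```python
# Per-frame average luma using Rec. 709 weights, the kind of value a
# CurveTool bakes out to drive a Grade node for flicker matching.
def rec709_luma(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def frame_luma(pixels):
    """pixels: list of (r, g, b) tuples for one frame."""
    return sum(rec709_luma(*p) for p in pixels) / len(pixels)
```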

After this we did tracking on a face to remove facial markers and add things onto the face.

I struggled a bit with getting the operations to work on the painting, but managed to get it working after getting a proper alpha out of it.

After this we learned about using UV maps, as they can be used to distort in certain ways and are a way to organize a texture and image and modify it later on.

Using the smear tool I managed to disrupt the UVs and make the texture non-linear.

After this we experimented with using Motion Vectors, which would help a lot with doing the tracking on the facial dots.

Next I did the homework, which involved adding two things onto the face and removing the tracking dots.

After a lot of tweaking, I realized I had both a regrain and a premult where they shouldn't be, and managed to get a decent clean plate working.

I had some issues with the face warping hard, but it was because I didn't have my input frame set to the correct reference frame. After this I applied some vector blur, reapplied the grain, and it looked much better.

Overall it could use a bit more mixing on parts of the face to look more realistic, but it works as a proof of concept and for learning motion vectors.

Term 2 – Machine building progression

We were tasked with building a machine to implement into a scene that we'll comp in using Nuke. I didn't really have much of an idea of what I wanted to do. I'd previously worked on some small inline engines while taking apart cars and lawn mowers, so I had an idea of some of what I could do with a design, but wanted to create something almost steampunk-like with my engine.

I put together a very rough mood board of what I wanted to create with my engine.

I designed a very rudimentary sketch of what I wanted the base of this to be. I'm definitely not the best artist when it comes to drawing sketches, and this only served to get my own ideas visualized a bit better.

I wanted to use some of my modelling knowledge in this and build upon it, and there were two main ways in which I wished to challenge myself. The first was a heat radiator at the front of the machine, with glowing coils animating realistic heat coursing through them, in the form of a car's grille that they would be visible through.

I'd made a lamp in the previous semester that used some of these same heated-coil techniques, though I wanted to refine this more and potentially make it more detailed with more texture.

The second was a tank of boiling water on the side, making this appear to be a water-powered steam engine: a circular brass tube with a windowed opening showing boiling water, with bubbles stirring each time a piston rotates.

Lastly, I wanted to attempt to use something like Substance Painter to texture the exterior of the machine. I'd previously learned a bit of Substance through work, and a thing that intrigued me was how it could be used to congeal textures like rust around the sides of objects or in small nooks. I wanted to make my machine a brass steam engine undergoing the effects of oxidization. Over time brass oxidizes and turns green, like the Statue of Liberty or the Parliament buildings in Canada, and I wanted to learn enough of Substance Painter to replicate this effect on my model.

I think my next step will be doing some modelling work in Maya to get the basics of its shape. Afterwards I will work on refining it down and tackling some of the challenges I have set myself.

I then worked on creating some shapes to play around with: a small box to house the heated coils that I would later have in the main machine's box, and a water tank next to it for it to feed into.

Next I experimented with making multiple types of gears and some bolts to run across the machine.

I also used a lot of Duplicate Special in this project, and it quickly became my bread and butter for keeping the machine symmetrical.

I also made a pipe, and used the mirror tool to create a duplicate and combine the two into one mesh.

I then started making a box with hot coils as the main base of the machine. I used emissions and gradients to create this effect, and added some more subdivisions to the box to get it smoothed properly.

I also created some small bolts that would act as a place to feed the tubes connecting the box and the water heater, and added in some lights to animate on them. Originally this panel was on the side of the box, but it was moved to in between so it could be seen better.

I also made a small gauge that would animate as the animation played, with two hands moving around, and covered it in a small glass panel which I used a deformer to get the shape of.

I then went back to arranging the gears and adding in some belt loops to animate later. At this point I also made a top box for the gears to sit on as a placeholder.

I then created some piping and used some deformers to get the pipes to connect and shape around each other. I also created a vent for the top pipe.

Next I created a light texture in Photoshop that I could put on the blinking lights, fixed the UVs on the lights, and applied a texture to each of them so they could be individually animated to blink on and off.

I then worked on making a bigger pinwheel to put one of my belts on, to give it a bit of difference from the others.

After that it was simply a matter of tweaking where the belt was placed.

At this point I also worked on making a proper shape for the base where the gears would sit.

I also decided to make a Newton's cradle at the top of the machine to animate later on. With this project I would usually make one thing, then go back to fix another, alternating between fixing and adding things to work towards a finished design.

A reminder to always save: forgetting to save before a crash easily cost me an hour here.

At this point I had my machine modelled; all that was left was to fix some deformers that kept breaking, then texture it and animate some small last things.

This is where I made the decision to put the panel in the middle as opposed to the side.

At this point I went on a spree of fixing UVs, as a lot of them were oddly stretched in portions or didn't look right and needed to be redone.

With the machine only facing one way, I decided to just rotate the pipes so their seams faced away, as they wouldn't be seen. To re-UV the pipes I would need to take off the deformers, take the bolts out of the shell using face selection only, re-UV the pipes as cylindrical, reapply everything, and apply a different texture to the bolts.
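For reference, cylindrical mapping boils down to angle-around-the-axis for U and height along the axis for V. A rough sketch of the projection (axis and orientation conventions vary between packages, so this is illustrative only):

```python
import math

# Rough cylindrical projection: U from the angle around the Y axis,
# V from the height along it.
def cylindrical_uv(x, y, z, height):
    u = (math.atan2(z, x) / (2.0 * math.pi)) % 1.0
    v = y / height
    return u, v
```

The seam lands where the angle wraps from 1 back to 0, which is why rotating the pipes so the seam faces away works.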

The main compartment was not as much fun to re-UV, and took a lot of work making segments and moving UVs around.

I then tweaked some of the UVs on the Newton's cradle piping as well.

At this point I put in a light, made some renders, and was happy with how it was coming along.

I fixed some of the wheel wells by smoothing them and adding edge loops, then replaced all the others with the fixed one.

I then tweaked the small tubes connecting the panel to the rest of the machine, using both bend deformers and simple sculpting tools.

I also noticed the UVs on the side of the panel were really stretched, and fixed this along with the initial water tank and heater UVs.

I then had to separate the textures of the piping and the bolts on the piping, and did this by selecting only the pipe’s faces.

At this point the machine was coming together a lot more and started looking a lot better.

I added a small ring for the glass to sit on, then fixed the UVs of the water tank one last time.

After this I had mostly fixed the tubes, but tweaked them a bit and added a texture that gave them an opaque outer plastic look with an inner red transmission to look like liquid.

I used some transmission objects to make it look like there was an opaque tube with liquid in it.

At this point in the project I unfortunately ended up getting sick yet again. As I had already had COVID earlier in the semester, this slowed my progress even further, but I tried to push through and keep working.

For the compositing portion, I did a lens undistort on the footage for tracking, so the tracked footage could be imported into Maya later on.

After tracking I got a point cloud and started to set up a scene using cards. Originally I had a fairly good solve with little error, but the ground plane made the cards and camera seem slightly warped to the side; this needed to be fixed before I imported to Maya.

I went back into Maya to fix some lingering issues, such as emissions on the lights needing to be animated, some textures needing to be fixed and other textures needing to be worked on.

I also originally had the animations running shorter than they needed to be, but this was fixed with some tweaking in the animation editor.

I managed to animate the coils similarly to the lights, by keyframing their emissions.

I then needed to animate the Newton's cradle. This took a lot of tweaking, as it constantly seemed too short, too long or unreal because of the way a Newton's cradle operates in real life, but after a lot of work in the graph editor I managed to be happy with it.

At this point I animated the belt textures, pulling around vertices similar to the belt animations we had done in Nick's class in the first two weeks.

Now comes a large issue I encountered throughout this project that caused me many headaches: Maya versions can be extremely finicky. I usually work between the school computers, which by default run Maya 2022, and my home computer, which runs 2019. One problem between the two is that sometimes textures won't relink properly, but more importantly, deformers will fully break and not retain any of their properties on an object. This caused many issues, and I ended up having to redo my deformers one last time.

After a lot of tweaking, I wasn't quite able to get an Alembic file working in Maya, as it wouldn't display point clouds, though after exporting an FBX out of Nuke I managed to get this working fine.

At this point it was a matter of tweaking lights to get the lighting detail correct.

I made some mesh lights and, by changing light colour tones, got an approximate feel for the colour tone and what I wanted my scene to look like when matched.

I then went through the process of adding AOVs to be able to control things in my export.

A problem I noticed was that some meshes needed to be grouped together to make IDing easier, so I combined the meshes of multiple objects in ways that made sense.

After this I set up the rest of the AOVs I imagined I would need to tweak later on when compositing, including reflections, AO and shadow mattes. This is when I also added shadow mattes to my ground planes so they would export only the shadows from these objects.

While doing this I ran into what is possibly one of the oddest Maya bugs I've ever seen, in which lights will not shine on certain surfaces, but only in a singular direction. Below shows a light with a new plane that light falls on normally; the light also hits the machine, but won't hit the back wall or anything behind it at all, while hitting everything in front of it, regardless of which way the area light was pointing. It was a frustrating bug, but I managed to fix it simply by adding in new planes.

At this point I was happy with how my scene was coming along in Maya and started exporting it to Nuke. A problem I created, unbeknownst to me at the time, was not combining the AOVs and instead rendering them out separately, which would turn out to be a large mistake.

At this point I needed a holdout of the wall so that the machine could go behind it. I used the same roto projection techniques from Week 5 of Gonzalo's class to create a projection camera on cards and then roto the geometry from there. I added a grade, checked the roto to make sure it looked decent, and then merged it with the rest of my machine.

At this point I merged in the AOVs of the machine, though since I had exported improperly, I couldn't properly tweak the grades and had an issue with the opacity of the machine.

Here are two videos showing some of the emission animations that are not visible in the final version.

At the end of the day, this was the final export i got of the machine in the scene with it comped in Nuke.

I'm not entirely happy with this machine. I feel that, given more time, I could have gotten a better comp, better colours and fewer errors when exporting; as it stands, it is missing emissions and has issues with opacity. Unfortunately, due to sickness and problems with group members, I had to focus a lot more time on the collaborative group project than on this machine, and I do wish I had more time to refine and polish it off, as I feel with just a bit more tweaking I could get it looking much more realistic in the scene.

Nuke – Week 10 Scenarios in Production and QCing

In this class we talked about the real-world aspects of VFX, such as the scenarios a VFX artist may encounter, how to handle them, and some of the pitfalls artists may fall into and how to avoid them.

We started with a talk about how critical it is to review your own work: regardless of the deadline or the stress, it's always important to re-check things, or it will cause problems later on. I've had this issue before on jobs where, due to time and stress, I did not check a graphic properly before handing it off to artists and had to redo entire files because of something I didn't check for. This is a lesson that needs to be learned and followed to create error-free shots and files when working with other artists.

Next we talked about the compositing workflow: temps, where files are created for clients so they can ask for removals or roto needed for a shot; trailers, where a trailer is created for the film or TV show; finals, where shots go through iterations asked for by the client, narrowing things down to a final shot to be approved; and lastly QC, or quality control, in which either a QC team or a VFX supervisor meticulously checks and scrutinizes a shot to give needed revisions.

We then discussed how to review and organize shots. Some of the software used for this is as simple as Google Docs, ftrack and Shotgun, used for sharing files, keeping track of projects and shots, and getting things quality checked.

We discussed some of the roles in production, such as line roles, which involve gathering information regarding problems with shots and passing it to the producer. The producer's role revolves around completing the project, managing the budget and setting deadlines.

Next we talked about VFX dailies: a meeting where everyone reviews shots and gives feedback and info to keep them going in the right direction.

Tech checking happens when someone goes through everything on a shot: checking all the notes, comparing versions, checking editorial, checking if there is any retime, making sure the shot has the latest CG and FX, making sure the matchmove and tracking are the latest version, writing personal comments if you have any before submitting, checking for alternatives, and doing a QC.

Nuke – 2D Tracking Practice

I started by attaching a Tracker to the original footage and then making a four-point CornerPin track to give the iPhone screen somewhere to lock on to.
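The CornerPin node solves a true perspective warp under the hood, but the core idea, mapping the unit square onto four tracked corners, can be sketched with a simpler bilinear version:

```python
# Bilinear sketch of a 4-point pin: map (u, v) in the unit square onto the
# quad c00 -> c10 -> c11 -> c01 (corners as (x, y), counter-clockwise from
# bottom-left). Nuke's CornerPin uses a full perspective transform instead.
def corner_pin(u, v, c00, c10, c11, c01):
    bx = c00[0] + u * (c10[0] - c00[0])  # bottom edge
    by = c00[1] + u * (c10[1] - c00[1])
    tx = c01[0] + u * (c11[0] - c01[0])  # top edge
    ty = c01[1] + u * (c11[1] - c01[1])
    return (bx + v * (tx - bx), by + v * (ty - by))
```

Keyframing the four corners from the tracker is what keeps the screen content locked on.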

Afterwards, I took the tracking data and applied it to the screen input before the Mirror and Reformat nodes were added in, as doing it after caused a lot of distortion.

Once this was tracked properly, I went about rotoing the fingers that go over top of the screen, though unfortunately the original video had an alpha baked in and this needed to be removed.

Afterwards I rotoed each finger, trying to feather the sides, as there's some motion blur whenever the fingers move that was causing a lot of green to come through when not feathered.

After this I had my alphas for both the screen and the fingers merged; next was simply putting the background under it.

There are some issues with the CornerPin track not going exactly where I want it to, and a little bit of fuzziness on the fingers, but I plan to fix this a bit with some help.

Nuke Week 7 – Colour Correction

This is what my Nuke tree looks like at the end, along with what I was meant to comp in and colour correct, so let's see how I got there and the end result.

Before doing my primary corrections, I started by putting in an Unpremult node to properly colour correct the edges. I also pulled up my colour scopes to correct more accurately and make sure I didn't have any whites or blacks clipping with any of the corrections.
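The reason for the Unpremult, as I understand it: colour maths on premultiplied pixels darkens the anti-aliased edge fringe, so you divide the alpha out, correct, then multiply it back. In numbers:

```python
# Divide alpha out before colour maths, multiply it back after.
def unpremult(rgb, a):
    return [c / a for c in rgb] if a > 0 else list(rgb)

def premult(rgb, a):
    return [c * a for c in rgb]
```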

I did my primaries by first slightly adjusting the exposure with an Exposure node, then moving on to a Grade node to adjust the highlights, shadows and midtones and give it some more overall contrast, as the original image looked fairly washed out.
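For reference, my understanding of what the Grade node computes (simplified, ignoring its clamping options) is a linear remap from blackpoint/whitepoint to lift/gain, followed by a gamma:

```python
# Simplified Grade node maths: linear remap then gamma.
def grade(v, blackpoint=0.0, whitepoint=1.0, lift=0.0, gain=1.0, gamma=1.0):
    a = (gain - lift) / (whitepoint - blackpoint)
    b = lift - a * blackpoint
    out = a * v + b
    return out ** (1.0 / gamma) if out > 0 else out
```

Gain mostly pushes the highlights, lift the shadows, and gamma the midtones, which is why one node covers all three ranges.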

I then added a ColorCorrect node to add an orange tint to the shadows and highlights of the plane, to simulate the orange sunset in the back hitting the plane and washing it out with orange tones, especially in the highlights. This gives it a little bit more of a red look, because I didn't want to blow out the colours too hard until I could use my HueCorrect node later.

The midtones felt kind of dark, so I took a Grade node and lightened them slightly.

After this I took a last stab at bumping up the orange tones by lightening them through a HueCorrect. Since I didn't want to blast out the midtones with the previous ColorCorrect node but wanted the colour to be more orange, I adjusted the oranges on the HueCorrect and toned down the blues and greens.

For now I think this looks better than it originally did, but I want to come back to it and try adding a light wrap and adjusting some more of the highlights to make it seem as though the light is hitting off the jet a bit more.

Nuke Week 6 – Roto Practice

In Gonzalo’s class we were given the task of doing some roto on around 100 or so frames of animation regarding a man running down a bridge where both the man and bridge were meant to be cut out.

I started by getting a roto for the ground and pillars on the bridge, as the camera isn't quite still and would need to be tracked. I made a track that worked fairly well; while there is movement on the camera, it's very subtle and doesn't move a whole lot.

I managed to get a roto of all parts of the bridge, and afterwards went about slightly adjusting and matching the tracker's movements onto the roto coordinates.

Afterwards I attempted to track portions of the running man's body to make rotoing slightly easier, as opposed to moving the roto every five or so frames, but this didn't work out in the end and I ended up scrapping the tracks.

I went about breaking the roto into different sections of the body to get a bit more control, starting with separating the legs into four different roto portions.

Afterwards I moved on to the chest and torso portions, to separate out the roto and get more overall control.

Lastly, I separated the arms and head from the roto, as I've learned before that using as few points as possible and splitting the roto into as many shapes as possible both gives more control and makes it easier to do, since you're not tweaking thousands of points.

This is what the alpha looked like on just the guy running, as well as what the mask looked like on him while moving.

Next I combined the alphas using a merge, and this was the result. I plan later on to do some more experiments refining the roto, feathering in places and getting both the feet and hands, but as this was mostly just to experiment with how Nuke's roto works, for now I'm content with it.

Nuke Week 5 – Interface and file formats + Davinci Color Correction Comparison

We discussed file formats and their relevance in Nuke, with EXR being the most prevalent format to use, and the software used in compositing, with After Effects and Nuke being the main programs and DaVinci Resolve Fusion as an option.

We also talked about the digital production pipeline, its main stages of pre-production, production and post-production, and a short overview of the roles in between.

I've used Nuke before, so relearning most of the controls was a bit of a refresher course for me, but after around a year since last use it was definitely a much-needed refresher.

Starting off the refresher was simply hitting Tab and importing a file through the Read node, then toggling the Viewer to see the footage using the 1 key.

Afterwards came the gain and gamma sliders, which affect the video from the top right. I'd never properly used these before, as I'd never really had a reason to and would usually add a node to do so, but having them around is nice and may save time over creating and plugging in new nodes.

Next was testing out and applying some blur nodes and keyframing them to create a Gaussian blur into frame. I’ve previously done this in after effects a lot of times, but a lot of my experience in Nuke came from Tracking using CaraVR, 3D Integration using merge nodes and 360 Stitching, so something as simple as doing a blur node felt fresh to me again.

Next I was taught about rearranging the panels in Nuke to fit your layout. I've accidentally messed up the layout and removed a pane I shouldn't have many times before, so finally being able to edit this is one of the most useful things I could learn. Knowing your interface and layout can truly make the difference in working both quickly and efficiently.

Next was creating a simple composition using the Merge tool and some grading and transform nodes. Back when I worked in Flame, I'd had it drilled into my head to keep my node graphs meticulously organized, which is something I want to keep doing going forward in Nuke.

After this it was simply a matter of using the Write node again and exporting the composition to a JPEG after fixing it up a little more.

And our little test composition was now finished!

With this proof of concept, I decided I wanted to test out the colour grading capabilities of Nuke. I personally use DaVinci Resolve for all of my compositions and have some footage that I previously colour graded in DaVinci, though it was graded incorrectly on a monitor with very low contrast. I no longer have the raw footage, but to get a little refresher on both Nuke and DaVinci, I decided to import different shots and see how they fared at colour correcting compared to each other.

First of all, these videos are both way too large and are MOVs, which probably won't play well in Nuke. With some of them clocking in at nearly half a gig, they definitely need to be compressed down.

They were converted to the H.264 codec, the resolution was halved, and the bitrate was dropped to around half of what it originally was. Now that the file size and codec have changed, these should play well in Nuke and DaVinci.

While attempting to find the histogram in Nuke to start correcting, I wanted to use the ColorCorrect node but accidentally put in a DeepColorCorrect node. I hadn't seen this before and looked on the Foundry website here: DeepColorCorrect (foundry.com) to find the difference. It seems deep colour correct is mainly used for getting a matte, so for now I'll stick with a regular ColorCorrect node.

I found a way on the Foundry site to use the histogram both as a panel to view in real time and as a node. The node doesn't seem to have much use for the way I'm editing, though, as it doesn't update in real time and only refreshes after switching to it following every edit with the ColorCorrect node.

Originally the video had a lot of issues with the highlights being very oversaturated and the reds being blasted to the point where the video looked very yellow- and green-tinted in places. I fixed these using three different nodes: a HueCorrect to bring out more of the reds, a ColorCorrect node to tone down the highlights and bring up the shadows, and a Grade node as a master correction to adjust gain and lift if needed.

This was the difference between the videos, the first being the colour corrected version and the second being the original.

Next was to use the same clip in DaVinci and see if I could get the same result or better. I did initial primaries adjusting the lift, gamma, gain and offset, then a secondary to try and bring back some of the red tones while toning down the yellows. This originally didn't work as well, and I had to almost brute-force remove some of the colour using the qualifier tool. With some more refining I feel I could have gotten it a bit more red, but it started to look close enough to what I was going for.

The end result was a little red-blasted in some of the highlights, but overall achieved the comparison I was going for.

At the end of the day these are two very separate programs, but I learned a lot more about Nuke and its ways of colour correcting from this small process. In terms of which is better for colour correction, I think DaVinci still comes out on top for making very detailed corrections, with its easy qualifier tools, access to many LUTs and power windows, while Nuke, being mainly used for compositing, can still provide some very quick and easy colour correction.