It was suggested in our previous mentor meeting that we add a lighting effect when the ingredients emerge from the computer screen. While we’re planning on reshooting this footage with an actual light on set, we wanted to experiment with doing it in Nuke. My initial thought was to manually paint in the light with rotos, but then I realized I could remove some of that guesswork by using a normal matte. I found a tool called DSINE, which has a Cattery implementation by Francisco Contreras. It worked very well for creating normals for the shot.
I then used a tool called PM_Matte which uses normals to create a matte of the surfaces facing a certain direction. I did some masking and blurring to clean up the matte.
The temporal consistency of DSINE was not the best, so I chose a single frame, applied a frame hold, and then used smart vectors to warp the held matte so it follows the motion in the plate.
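The frame-hold-plus-warp step boils down to resampling one held image through a per-pixel motion-vector field. This NumPy sketch shows the core backward-warp idea with nearest-neighbour sampling; it is a simplified illustration, not how Nuke's SmartVector/VectorDistort nodes are implemented:

```python
import numpy as np

def warp_by_vectors(image, vectors):
    """Backward-warp a held frame by a motion-vector field.

    image:   HxW array (a single channel of the held frame)
    vectors: HxWx2 array of (dx, dy) motion vectors per output pixel

    Each output pixel looks back along its vector into the held frame.
    Nearest-neighbour sampling and edge clamping keep the sketch short;
    a real warp would filter and concatenate vectors across frames.
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs - vectors[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - vectors[..., 1]).astype(int), 0, h - 1)
    return image[src_y, src_x]

# toy example: everything moved 1 pixel right, so sample 1 pixel left
image = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
vectors = np.zeros((2, 2, 2))
vectors[..., 0] = 1.0   # dx = 1 everywhere
warped = warp_by_vectors(image, vectors)
```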
It’s still rough, but I think it’s a good proof of concept. I want to research using these machine learning tools in Nuke more. I did not know until recently that other models can be brought into Nuke, and that’s something I’d like to learn more about.