Virtual Production Demos
In the past week, Emma and I did a couple of filming sessions using the lab's virtual production tools. In the first session, we focused on the fully virtual pipeline in Unreal Engine, using the Rokoko Smartsuit Pro v2 for body capture, the Intel RealSense T265 tracking camera as a virtual camera, and an iPhone 12 for Live Link facial capture. In our example scene we recorded each element separately and then combined them in Unreal. Notably, we did have another filming session before the discussion, in which we encountered multiple problems: a damaged suit wire, nDisplay interrupting the virtual camera, and the iPhone not connecting to the computer properly.
Emma first did the motion capture in the Smartsuit for both characters, then filmed the camera shots by physically moving the T265 in real space, and finally did the facial performance for both characters. During each of these steps we played back the current sequence in Unreal so Emma could act along with what had already been recorded; for example, while she captured the movement of the second character, the performance of the first character was playing on the projector. One thing that proved a little problematic was eye lines: Emma had to look at the projector screen to see where the other character was, then try to look in the same general direction. It may therefore be useful, when working with a single actor, to use the VR headset, which I recall from old documentation was meant to give the actor visual context for the scene.
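As a concrete illustration of the act-along playback step, here is a minimal sketch of how a previously recorded take could be opened and played from Unreal's Python editor scripting. The asset path is a placeholder, and the exact sequence names depend on how the takes are saved in our project.

```python
# Sketch: play back a previously recorded take so the actor can perform
# against it (run in the Unreal editor's Python console).
import unreal

SEQUENCE_PATH = "/Game/Takes/CharacterOne_Take"  # placeholder asset path

# Load the recorded Level Sequence and open it in Sequencer.
sequence = unreal.load_asset(SEQUENCE_PATH)
unreal.LevelSequenceEditorBlueprintLibrary.open_level_sequence(sequence)

# Rewind to the start and play it on the projector view while the next
# performance (second character, camera move, or face) is being captured.
unreal.LevelSequenceEditorBlueprintLibrary.set_current_time(0)
unreal.LevelSequenceEditorBlueprintLibrary.play()
```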
In our second session, we experimented with real people in virtual environments, using the projector screen as the backdrop. We built a makeshift light-focusing contraption out of duct tape to keep stray light off the projector screen while still lighting the subject. We tested the western scene and the meadow scene Sonya made for Emma in an earlier recording session. We also discussed ways to enhance the shot so there is less of a disconnect between the actor and the backdrop, which is very noticeable right now. Some interesting possibilities: a real smoke machine combined with virtual volumetrics, blurring the background so the camera only holds focus on the subject (see the sketch below), and other particulates such as fake snow.
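For the background-blur idea, one way to prototype it on the Unreal side, assuming the backdrop sent to the projector is rendered from a CineCameraActor (which may not match our current setup), is to give that camera a shallow depth of field so the environment renders soft while the real actor in front of the screen stays sharp. The focus distance and aperture below are placeholder values:

```python
# Sketch: narrow the depth of field on a CineCameraActor so the projected
# backdrop blurs out behind the live subject (editor Python console).
import unreal

# Grab the first CineCameraActor in the level (assumes one exists).
actors = unreal.EditorLevelLibrary.get_all_level_actors()
camera = next(a for a in actors if isinstance(a, unreal.CineCameraActor))
cine_component = camera.get_cine_camera_component()

# Focus roughly where the actor stands (~2 m) and open the aperture wide
# so the environment a few metres behind the focal plane falls out of focus.
focus_settings = cine_component.get_editor_property("focus_settings")
focus_settings.set_editor_property("focus_method", unreal.CameraFocusMethod.MANUAL)
focus_settings.set_editor_property("manual_focus_distance", 200.0)  # cm, placeholder
cine_component.set_editor_property("focus_settings", focus_settings)
cine_component.set_editor_property("current_aperture", 1.4)  # placeholder f-stop
```

The same effect could of course come from the physical camera's own lens when filming the actor against the screen; the script above is just a way to test the look of a defocused backdrop without re-rigging the real camera.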
-Tyson
