After entering Perception Neuron's Set Your World in Motion video contest in 2016, Studio Local, the makers of the short film Twenty One Points, won two Perception Neuron motion capture systems to complete the visual effects for the film, which was released the following year. This year, the New Zealand film team, led by Twenty One Points director Pete Circuitt, won the Best Visual Effects Award at France's Festival du court métrage de Clermont-Ferrand. Twenty One Points stars actor Josh Thomson as a quirky loner whose best friend is an imaginary robot named Gary (played by Rohan Glynn). Noitom had the chance to sit down with Pete Circuitt for a quick Q&A.
What was your inspiration for the story of Twenty One Points?
I was living in Mt. Albert, Auckland, where there was a woman who walked exclusively in reverse. Her local nickname was Backwards Lady. One day I was watching her as she back-strolled past my house and I thought . . . what does she do when she gets home? How does she watch TV? How does she get up to answer the phone? Alan became a character respectfully cut and spliced from all the unique people I've witnessed in all the communities I have lived in.
Would you consider making a feature-length version of Twenty One Points?
Yes, we are in talks about that now.
How do you view a motion capture solution like Perception Neuron in terms of helping indie filmmakers such as yourself create projects that may not have been possible in the past?
Absolutely crucial. It's the second mocap system we have developed but the first that is wireless. The difference between the wireless Neuron system and an optical system is vast, not just in how freely you can stage your capture area but in calibration times. Typically you can't stop a live-action shoot for 30 minutes while you calibrate, and PN requires just 10-20 seconds. Brilliant. (See this Twenty One Points shot breakdown.)
How did you find working with Perception Neuron? Were you able to create a good workflow? Did you find the system easy to work with both for your special effects team and the actor?
It's a wonderful system and I recommend it to everyone. The workflow is straightforward and robust. One factor not often talked about, but I think is sometimes relevant, is that the actor doesn't have to feel self-conscious in the kind of skintight bodysuit common to optical systems.
Did you also work with Unreal Engine for the effects? If not, what other VFX platforms did you work with alongside PN?
We didn't use Unreal in the film but it's something we are looking into now. For the film we cleaned the mocap in Maya, added animation layers where required, and used Redshift for rendering.
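For readers curious what that clean-up step can look like in practice, here is a minimal Maya Python sketch of the kind of pass described above: running an Euler filter over imported mocap rotation curves and then adding an animation layer for hand-keyed corrections. This is an illustrative sketch only, not Studio Local's actual pipeline; the joint name and layer name are hypothetical placeholders.

```python
# Illustrative mocap clean-up pass in Maya (Python).
# Assumptions: the mocap skeleton's root joint is named 'Hips' and a
# layer called 'mocap_adjust' is where corrective keys will go --
# both names are placeholders, not from the Twenty One Points production.
import maya.cmds as cmds

def clean_mocap(root_joint='Hips'):
    """Euler-filter imported mocap curves and add an animation layer
    for corrective keys on top of the capture."""
    # Select the skeleton hierarchy driven by the mocap data.
    cmds.select(root_joint, hierarchy=True)
    joints = cmds.ls(selection=True, type='joint')
    if not joints:
        return

    # Gather the animation curves on those joints and run Maya's
    # Euler filter to remove the gimbal flips raw mocap often contains.
    curves = cmds.keyframe(joints, query=True, name=True) or []
    if curves:
        cmds.filterCurve(*curves, filter='euler')

    # Create an animation layer so animators can add corrective keys
    # without touching the underlying captured motion.
    cmds.select(joints)
    cmds.animLayer('mocap_adjust', addSelectedObjects=True)

clean_mocap('Hips')
```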
To learn more about Twenty One Points and the team behind it, visit Studio Local.