Coco VR is Pixar’s stunning debut in virtual reality: an adventure into the beautiful world of the Disney-Pixar film Coco, which was released in theaters on November 22, 2017.
Coco VR is a film tie-in developed by Magnopus in cooperation with Disney-Pixar and Oculus. It is the first virtual reality experience created using a Pixar property, and it explores an expanded version of the world put forth in the film, supplementing its narrative. Magnopus started by creating a proof-of-concept project that would demonstrate three things:
- Pixar quality environments in VR
- Pixar animation in VR
- Multi-user networking
Viewers can choose either a single- or multi-player experience and follow the magical alebrije into the luminous world of Coco, filled with lovable characters and beautiful settings from the film.
Magnopus wanted to expand the possibility of what marketing material could be. VR is a new and exciting place, but it is also a personal space. Given the investment audiences make just to be present in that space, Magnopus wanted to make sure they delivered an experience that was not only worthy but lasting: something that could stand on its own and inspire generations of storytellers to come.
Working with Pixar assets gave Magnopus a massive head start in that direction; all they needed to do was get them working in VR. No small task, considering a single frame of the film takes 50 hours to render. Magnopus would need to get that down to 11 milliseconds, and still have headroom for multi-user logic.
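The gap between those two numbers is worth spelling out. A quick back-of-envelope calculation (assuming the Oculus Rift's 90 Hz refresh rate, which is where the ~11 ms figure comes from) shows the scale of the optimization problem:

```python
# Back-of-envelope frame-budget math. Assumes the Oculus Rift's 90 Hz
# refresh rate, which yields the ~11 ms budget quoted above.
FILM_FRAME_SECONDS = 50 * 3600      # one offline film frame: 50 hours
VR_REFRESH_HZ = 90                  # Rift target refresh rate

vr_frame_ms = 1000.0 / VR_REFRESH_HZ
speedup = FILM_FRAME_SECONDS / (1.0 / VR_REFRESH_HZ)

print(f"VR frame budget: {vr_frame_ms:.1f} ms")   # ~11.1 ms
print(f"Required speedup: {speedup:,.0f}x")       # 16,200,000x
```

Roughly a sixteen-million-fold speedup per frame, before any multi-user logic is accounted for.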
In the end, Magnopus completed the experience with five locations (as well as a few special nooks), three hero characters with synced audio and facial animation (as well as a few NPCs), and an end to end social experience for up to four simultaneous users.
- Meet Miguel, Ceci, and Hector in fascinating locations featured in the movie.
- Multi-player lets viewers bring their amigos along and have fun together!
- Express uniqueness and rock a new style in Ceci’s costume shop.
- Clown around and capture the moment in the photo booth.
- Take a gondola ride and see the gorgeous city from above.
- Live a little in the spotlight! The viewer is the star alongside Hector in a musical celebration of Dia de los Muertos!
Because Coco VR is a social experience, and Magnopus wanted to make sure user avatars matched user height, the studio had to come up with a scaling solution that would allow users to customize their height without breaking the character rig. This had to take into account arm length (so the IK systems would continue to work) and hand size (so their grab poses would still look right).
Ultimately, Magnopus decided on nudging the scale of the user’s bones depending on their height, which allowed the studio to keep everything they wanted while maintaining the emergent nature of the design.
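One way to picture that per-bone "nudge" is as a set of scale factors derived from the user's height: height-bearing bones take the full scale so the avatar's eye line matches the user, while arms and hands are only nudged partway so IK reach and grab poses stay plausible. The blend weights below are illustrative assumptions, not Magnopus's production values:

```python
def rig_scales(user_height_m, base_height_m=1.70):
    """Per-bone scale factors for a height-matched avatar.

    Height-bearing bones (legs, spine, head) take the full scale so the
    avatar's eye line matches the user. Arms and hands are only nudged
    partway toward that scale, preserving IK reach and grab-pose look.
    The 0.75 / 0.50 blend weights are illustrative, not production values.
    """
    f = user_height_m / base_height_m
    return {
        "legs":  f,
        "spine": f,
        "head":  f,
        "arms":  1.0 + 0.75 * (f - 1.0),  # partial nudge for IK reach
        "hands": 1.0 + 0.50 * (f - 1.0),  # gentler nudge for grab poses
    }

scales = rig_scales(1.85)  # a user taller than the base rig
```

Because only the height-bearing bones take the full factor, a tall user's avatar matches their eye line exactly while their hands grow only half as aggressively.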
Coco is a beautiful film. Reproducing the look in VR required Magnopus to create new tools for projecting environment textures. The standard UV-based workflow was too dense, and would not allow for rapid iteration. Magnopus wanted to take images straight out of RenderMan (rendered by Pixar in Emeryville) and plug them directly into Unity.
For this, Magnopus ended up writing their own shader, which could not only do what they wanted, but would adapt the projection according to the best possible angle for every face of the environment geometry.
With the number of environments that Magnopus wanted to include, they knew traveling from place to place would be a point of concern. The simplest solution would have been to create a series of instances, but that left them feeling like they were robbing themselves of the fundamental promise of the experience. They wanted to feel like they were there — that means every place needs to be connected.
In the end, Magnopus took the main environment – the Plaza – and used it as a hub to connect the other environments. This forced both the studio and the filmmakers to think about how all these places flowed together in a higher degree of detail than had ever before been necessary.
INGESTING FILM MODELS
Magnopus would be nowhere without the ability to take assets from the film and use them in the VR experience. This was most difficult with regard to environments; in one scene, for instance, every page of every book was modeled.
Optimizing these assets was an ongoing task that did not finish until their final pushes to source control.
Converting animation from Presto into something Unity would accept was a challenge by itself – one the VR studio thought they would fight the entire way. But Magnopus ended up solving it on their second try by reverse engineering a rig based on observing vertex deltas and configuring a cluster matrix based on the data.
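A toy version of that vertex-delta approach can show the idea, under the assumption that the film rig reduces to linear blend skinning. With two known bone transforms and the observed deformed position of a vertex, the blend weight falls out of a closed-form least-squares projection over the x, y, z components (function names here are hypothetical):

```python
def recover_weight(v_deformed, v_by_bone_a, v_by_bone_b):
    """Recover the linear-blend-skinning weight w for a vertex driven by
    two bones: v = w * A + (1 - w) * B, solved by least squares over x,y,z.

    A (resp. B) is where the vertex would land if driven fully by bone A
    (resp. bone B). This is a toy version of fitting a cluster matrix to
    observed vertex deltas, assuming a linear-blend-skinning model.
    """
    num = sum((v - b) * (a - b)
              for v, a, b in zip(v_deformed, v_by_bone_a, v_by_bone_b))
    den = sum((a - b) ** 2 for a, b in zip(v_by_bone_a, v_by_bone_b))
    w = num / den
    return max(0.0, min(1.0, w))  # clamp to a valid skin weight

# Example: a vertex deformed with true weight 0.3 toward bone A
A, B = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
v = tuple(0.3 * a + 0.7 * b for a, b in zip(A, B))
```

Solving this per vertex across a few observed frames is enough to rebuild a cluster matrix that Unity's skinning can consume.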
The exciting part was bringing in facial animation. Magnopus went old school and simply baked down every facial expression as a unique pose, then played that back as a blendshape sequence. Not the most efficient solution, but it has the benefit of being utterly flawless. Which is nice, if one is into that sort of thing.
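Played back naively, a pose-per-frame bake would snap between expressions; one reasonable sketch (an assumption, not Magnopus's shipped code) is to give only the two poses adjacent to the playhead nonzero weights and crossfade between them:

```python
def blendshape_weights(t_seconds, n_frames, fps=24.0):
    """Weights for a baked pose-per-frame blendshape sequence.

    Each film frame is baked as its own blendshape; playback crossfades
    between the two poses adjacent to the playhead so motion stays smooth.
    Returns a list of n_frames weights that always sums to 1.
    """
    playhead = min(t_seconds * fps, n_frames - 1)
    lo = int(playhead)
    hi = min(lo + 1, n_frames - 1)
    frac = playhead - lo
    weights = [0.0] * n_frames
    weights[lo] += 1.0 - frac
    weights[hi] += frac
    return weights
```

At most two shapes are ever active, so the per-frame cost stays flat no matter how many expressions were baked.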
Magnopus rendered every scene from multiple positions with a spherical lens, then built a shader where they could plug those images in. The shader would decide, based on view angle and world position and a little magic, which image to show the user – resulting in the least amount of stretching. They then combined that method with UV’d objects to fill in the patches and finish off the set dressing.
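The "little magic" per-face choice can be approximated like this (a hedged sketch, not the actual shader logic): for each face, pick the capture position that views it most head-on, i.e. maximize the alignment between the face normal and the direction from the face toward the camera, since a head-on projection stretches the texture least.

```python
import math

def best_projector(face_center, face_normal, camera_positions):
    """Pick the spherical-capture position that sees this face most head-on.

    For each capture point, score the alignment (dot product) between the
    face normal and the direction from the face toward the camera; the
    best-aligned camera stretches the projected texture the least.
    """
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    n = norm(face_normal)
    best_i, best_dot = -1, -2.0
    for i, cam in enumerate(camera_positions):
        to_cam = norm(tuple(c - f for c, f in zip(cam, face_center)))
        d = sum(a * b for a, b in zip(n, to_cam))
        if d > best_dot:
            best_i, best_dot = i, d
    return best_i

# A wall facing +x should be covered by the camera sitting along +x
cams = [(5.0, 0.0, 0.0), (0.0, 5.0, 0.0), (-5.0, 0.0, 0.0)]
```

In a real shader this selection would run per fragment (or be precomputed per face), but the scoring idea is the same.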
SCOUTING AND LAYOUT
Magnopus built an in-world camera for tech scouting. Each photo taken had a readout of the world coordinates of the image so they could reliably take notes and track changes natively. This helped them mutate the film sets into something appropriate for VR while staying true to the initial design.
A SHARED EXPERIENCE
Coco VR is best when experienced with a group. Building a multi-user experience meant that every bit of interaction (down to the eye saccades) should have some shared value. But, Magnopus also wanted to make sure users maintained agency over their in-world avatars. To prevent user harassment Magnopus did two things:
- Limit multi-user sessions to groups of connected friends
- Create a shader that would render molestation invisible
This allowed Magnopus to focus on interactions that would bring joy to the user experience, rather than constantly reacting to whatever potential crisis might be around the corner with each new feature.
Another approach was to make sure the social tasks revolved around cooperation rather than competition. Venturing into the Land of the Dead with one other person should feel like a date; with multiple people, it should feel like an adventure party.
Seeing your friends as skeletons puts everyone in a friendly mood. Suddenly users are doing things with the sole purpose of making other people laugh. If they could create a context for the audience to forget themselves for a moment and invest in a fantasy world, Magnopus would have done their job.
The dressing room is the first piece of social interaction in Coco, and there’s a lot happening here that is not entirely obvious.
First, the doll on the desk comes from the desire to present even solo users with some form of pseudo-social interaction in this bit. But the real heavy lifting is done by the clothing system.
Clothing is only accessible to the user it is assigned to. This prevents users from adding, and more importantly removing, clothing from an unwilling partner. The only exceptions to this rule are ornamental pieces like hats, glasses, and mustaches. Giving users the option to customize their appearance ensures the experience is different each time they log in, and doing it with these restrictions in place makes sure they will be satisfied with their base look, but also open to playful interaction.
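The ownership rule reads like a simple permission check. A minimal sketch (the item categories and function names are hypothetical, not Magnopus's actual API):

```python
# Ornamental pieces are fair game for playful interaction between friends.
ORNAMENTAL = {"hat", "glasses", "mustache"}

def can_modify(item_kind, actor_id, owner_id):
    """Clothing permission rule: only the avatar's owner may add or remove
    clothing, except ornamental pieces, which anyone in the session may
    place or remove."""
    return actor_id == owner_id or item_kind in ORNAMENTAL
```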
THE PHOTO BOOTH
From early in the process, Magnopus knew taking photos would be a core feature. But they wanted to make sure it was an activity the whole group could participate in simultaneously. Thus, the photo booth was born. This not only gave users a souvenir that was accessible from outside the experience, it also encouraged user-initiated promotion of the experience itself.
Cornskull was the last piece Magnopus added to the experience. They figured, why not? Users can remove their skulls, after all. Let’s go ahead and toss them around with a bit of purpose. There are also musical trees, paper airplanes, face painting, a magic hat, and sparklers!
LOOK AT SYSTEM
All of the hero characters in Coco VR will look at the player’s eyes no matter where they are in the scene. The potential of this tiny detail to contribute to immersion is incredible. Immediately, users are grounded as members of this world, and the characters suddenly appear to think and contemplate the user’s presence in their world.
Magnopus achieved this with an IK spine and dynamic eyeballs that would shift between fully IK-driven and fully animation-driven. To make sure the characters would only attempt to look at others with a natural bend of the spine or neck, the studio weighted each bone differently, and even determined the Look-At target by a series of dynamic attributes.
In a sense, each set of eyes really did “think” about what it was seeing. If something moves fast through a character’s peripheral vision, they look at it. But if something is noisy, they will turn to seek out the source.
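The per-bone weighting can be sketched as distributing the look-at rotation across the spine chain, with a clamp on each bone so the pose never bends unnaturally. The weights and limits below are illustrative assumptions, not the shipped values:

```python
def distribute_look_at(total_yaw_deg):
    """Split a look-at yaw across the spine chain so no single bone bends
    unnaturally. Per-bone (weight, limit) values are illustrative: the head
    does most of the turning, the chest the least.
    Returns the clamped yaw applied to each bone, in degrees.
    """
    bones = [("chest", 0.15, 20.0), ("neck", 0.35, 40.0), ("head", 0.50, 60.0)]
    applied = {}
    for name, weight, limit in bones:
        yaw = total_yaw_deg * weight
        applied[name] = max(-limit, min(limit, yaw))  # clamp to natural range
    return applied
```

For modest targets the full rotation is reached; for a target far behind the character, the clamps kick in and the character simply does not crane all the way around, which looks far better than an owl-like twist.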
By hijacking the clothing system, Magnopus were able to dynamically dress members of the crowd. This way, they only needed to dress them by hand once – from there, the system would take over to deliver something diverse and interesting. Magnopus ended up with only two female variants and two male variants, but the amount of clothing variations available does a great job of making it seem like much more.
But the system goes one step further. Magnopus extended this dynamic dress system to the cascading levels of detail as well. As users back further away, individual crowd members transition from fully animated 3D characters, to static 3D models, to 2D cards facing the viewer. Each character is profiled depending on whether it is sitting, standing, or conversing, and updates its static or animated pose based on that parameter.
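The cascade itself is a straightforward distance test; a minimal sketch (the thresholds are illustrative, not the shipped values):

```python
def crowd_lod(distance_m):
    """Cascading level of detail for a crowd member, by camera distance.
    Distance thresholds are illustrative, not the shipped values."""
    if distance_m < 8.0:
        return "animated_3d"   # fully animated skeletal character
    if distance_m < 25.0:
        return "static_3d"     # posed static mesh (sitting/standing/conversing)
    return "billboard_2d"      # camera-facing 2D card
```

Because each member's profile (sitting, standing, conversing) is known, the static-mesh and billboard tiers can still show a pose consistent with what the fully animated version was doing.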
During the stage sequence, each user is given the chance to take center stage. The spotlight shines on them and something very special happens: the crowd begins to mimic their actions.
Magnopus temporarily piped Oculus Touch tracking data into the hands of each crowd member as location and rotation targets. Then, using their existing, robust IK arm system, the studio was able to interpolate to those targets at varying speeds and with varying degrees of accuracy. That way, the crowd didn’t look like perfect robots, but like a natural and enthusiastic audience.
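One way to sketch that "varying speeds and accuracy" idea: each crowd member chases the user's tracked hand position with its own interpolation rate and a small personal offset, so the crowd mimics the user without moving in robotic lockstep. Everything below is a hedged toy model, not Magnopus's IK system:

```python
import random

def step_crowd_hands(hands, user_hand, dt, rates, offsets):
    """Move each crowd member's hand toward the user's tracked hand.

    Each member interpolates at its own rate toward the target plus a small
    personal offset, so the crowd follows the user without moving in robotic
    lockstep. Exponential smoothing: pos += (target - pos) * rate * dt.
    """
    for i, hand in enumerate(hands):
        target = [u + o for u, o in zip(user_hand, offsets[i])]
        for axis in range(3):
            hand[axis] += (target[axis] - hand[axis]) * min(1.0, rates[i] * dt)
    return hands

random.seed(7)
n = 4
hands = [[0.0, 0.0, 0.0] for _ in range(n)]
rates = [random.uniform(2.0, 6.0) for _ in range(n)]              # varying speeds
offsets = [[random.uniform(-0.05, 0.05) for _ in range(3)]        # varying accuracy
           for _ in range(n)]
for _ in range(300):                       # ~5 seconds at 60 Hz
    step_crowd_hands(hands, [0.5, 1.2, 0.3], 1 / 60, rates, offsets)
```

After a few seconds every hand settles near the user's pose, but each along its own path and to its own slightly imperfect target, which is exactly the "enthusiastic, not robotic" quality described above.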
This system ties into the Look-At system in a number of ways. Most importantly, the two must layer on top of one another, meaning Magnopus could not have a crowd member unrotate their spine to get a better view of the user if the dance required them to rotate their spine.
In the end, Magnopus feel this is a brave step in the direction of NPC interaction that does not require physical contact.
THAT’S A WRAP!
The project came together in just under one year of development. Magnopus’ crew started small for the prototyping phase, then quadrupled as they ramped into full production.
Disney-Pixar coordinated to demo Coco VR at the Dia de los Muertos festival in Los Angeles, as well as at Disney retail stores and film festivals across the country. The experience premiered in Mexico City, and in Stuttgart shortly thereafter.
Given the long lines and short demonstration times of these events, it was crucial to develop a demo mode of the experience that would take users efficiently through the major points.
This allowed Magnopus to accommodate a high volume of users, so everyone could walk away having tasted a piece of Coco VR.