Behind the Scenes of X-Men: Days of Future Past
Over a decade ago, 20th Century Fox brought Marvel’s X-Men to life on the big screen and, with it, ushered in a resurgence of comic book movies that have all but taken over the summer box office. Fast forward to today, and Bryan Singer has come back to direct his first X-Men film since X2, which was released 11 years ago. You’ve surely watched X-Men: Days of Future Past already; if you haven’t yet, you may want to bookmark this page and come back to it after the on-demand or Blu-ray release so you don’t get hit with any spoilers. Across the talented teams from the 11 different effects houses (including the in-house VFX team at Fox) around the world that worked on Days of Future Past, over 1,000 VFX shots were completed, with Digital Domain and MPC handling the bulk of these shots. For this behind-the-scenes sneak peek, we’ll take a look at some of the techniques and thought processes that went into creating these effects.
A majority of the shots the artists at Digital Domain (DD) were tasked with took place during the portion of the film set in 1973. And thanks to a major sequence being cut from the film, which reduced the number of shots that MPC (whom we’ll look at a bit later) had in the final cut, Digital Domain ended up being the studio responsible for the most shots in the film.
Digital Domain had to deliver more than one finished shot per day, or 428 shots over the 322 days (46 calendar weeks) they worked on the film. To save you from doing the math, that’s just over nine finished shots per week. To accomplish this, the 242 talented artists at Digital Domain put over 165,000 man hours into the project and utilized just under 20,000,000 processing hours.
RFK Stadium Sequence
For the Robert F. Kennedy Stadium sequence, the team at DD faced a number of challenges, most notably that the stadium is not only a massive asset but also one that’s easily recognizable by anyone who’s visited or seen the real stadium. To get it right, they started with a LIDAR scan of the stadium.
As with most 3D scans, this certainly wasn’t a one-stop solution out of the box. But it did give a great starting point for some of the big features. From there, the DD team took hundreds of photos for reference which they used, along with the LIDAR scan, to create the final asset.
In addition to helping with the modeling of RFK Stadium, the reference photos were really helpful in the texturing phase as well. It was during this phase that the team at DD ran into a somewhat unexpected challenge.
Lou Pecora, VFX Supervisor at Digital Domain, explains, “Our texture team put together quite a lot of real-world photography for [texturing], but we kind of had to grunge it up a lot more than the real stadium because it looked too fake when it looks too clean.”
Of course, the textures weren’t the only area where the stadium had to have a little extra added for visual appeal. While the actual RFK Stadium holds about 46,000 seats, the CG version used in Days of Future Past had 60,000 individually rigged seats that could be animated and flap around as Magneto interacted with the stadium.
After the stadium asset was created, they were left with a massive asset that would surely slow down any simulation or render. To resolve this challenge, they decided to chop up the stadium. “We decided to take the pie wedge approach,” says Pecora. “There was no way we were going to sim and render every piece of geometry in that scene. So we were only going to sim and render the wedges that were going to be on screen.”
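To make the pie-wedge idea concrete, here’s a toy sketch of that kind of angular culling: the stadium is split into wedges around its center, and only wedges overlapping the camera’s horizontal field of view are marked for simming and rendering. All names, the wedge count and the geometry here are illustrative assumptions, not Digital Domain’s actual pipeline.

```python
# Toy sketch of "pie wedge" culling: the circular stadium is divided into
# N angular wedges, and only wedges whose angular span overlaps the
# camera's horizontal field of view are kept for sim/render.

NUM_WEDGES = 16  # assumed wedge count, purely illustrative

def visible_wedges(camera_yaw_deg, fov_deg, num_wedges=NUM_WEDGES):
    """Return indices of wedges that fall inside the camera's view."""
    wedge_span = 360.0 / num_wedges
    half_fov = fov_deg / 2.0
    visible = []
    for i in range(num_wedges):
        wedge_center = (i + 0.5) * wedge_span
        # smallest signed angular distance from the camera direction
        delta = (wedge_center - camera_yaw_deg + 180.0) % 360.0 - 180.0
        if abs(delta) <= half_fov + wedge_span / 2.0:
            visible.append(i)
    return visible

# A 60-degree lens looking at 90 degrees only "sees" 4 of the 16 wedges:
print(visible_wedges(camera_yaw_deg=90.0, fov_deg=60.0))  # → [2, 3, 4, 5]
```

The payoff is the same one Pecora describes: everything outside the returned wedge list never enters the simulation or the render at all.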
To go about lifting the stadium, the DD team took a very practical approach. Because Magneto’s power is to affect metal, it made sense that if he were to lift the stadium he would do so wherever he could find metal in the guts of the stadium. So they built out simulations on the stadium as if Magneto were grabbing the metal substructures and the concrete rebar to move the stadium.
“In doing so, you create stress on the overall structure and we used that stress to drive tension maps that were used to drive the effects rigid body simulations,” Pecora explains. “The areas with the most tension would obviously break and crack and we would ultimately use those as source points for the dust and falling debris as the stadium is being ripped up out of the ground.”
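The stress-to-tension-map idea can be sketched in miniature: measure how much each edge of the deforming structure has stretched past its rest length, and treat the most stretched regions as the break points that seed dust and debris. Everything below (function names, the threshold, the 2D toy geometry) is an assumption for illustration, not DD’s actual setup.

```python
import math

def edge_tension(rest_positions, current_positions, edges):
    """Per-edge stretch ratio: 0 = at rest length, larger = under tension."""
    tension = []
    for a, b in edges:
        rest = math.dist(rest_positions[a], rest_positions[b])
        now = math.dist(current_positions[a], current_positions[b])
        tension.append(max(0.0, now / rest - 1.0))
    return tension

def debris_edges(tension, threshold=0.2):
    """Edges stretched past the threshold become dust/debris emitters."""
    return [i for i, t in enumerate(tension) if t > threshold]

rest = [(0, 0), (1, 0), (2, 0)]        # structure at rest
bent = [(0, 0), (1, 0.1), (2.5, 0)]    # structure being pulled apart
edges = [(0, 1), (1, 2)]

t = edge_tension(rest, bent, edges)
print(debris_edges(t))  # → [1]: only the heavily stretched edge breaks
```

In a production setup the same signal would be painted into a tension map and handed to the rigid body and particle systems, as Pecora describes.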
While it goes by quickly in the shot, if you look closely at the still frame above you can easily pick out varying shades of smoke and dust that play a big part in pulling off this effect. As with everything else, all of that was carefully planned and revised.
“Notice in particular the various shades and hues of smoke and dust,” says Pecora. “I thought my FX guys were going to kill me with the amount I went over and over with them about how badly we wanted to maintain as many hues and shades and tones in that smoke. In the real world, you’ll have dust that’s a little bit more brown or a little bit blue for the concrete or a little gray here and there. In order to achieve the scale and believability we really wanted to have all of those hues represented in there.”
Just as they did with the model and texturing of the stadium, the DD team had to push the simulations just a little bit beyond reality to get the look they needed. “We also found that real-world gravity is way too strong for this and everything looked a little fast,” said Pecora. “We ended up somewhere around fifty percent of real-world gravity to help achieve the epic scale we were going for.”
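The gravity trick has a tidy physical explanation: free-fall time scales as the inverse square root of gravity, so halving gravity stretches every fall by a factor of √2, which the eye reads as a much bigger object. A quick check of that, using standard kinematics (the specific numbers are just examples):

```python
import math

G_REAL = 9.81  # real-world gravity, m/s^2

def fall_time(height_m, gravity):
    """Free-fall time for a given drop height: t = sqrt(2h / g)."""
    return math.sqrt(2.0 * height_m / gravity)

t_real = fall_time(30.0, G_REAL)        # debris falling 30 m on Earth
t_film = fall_time(30.0, 0.5 * G_REAL)  # the same fall at 50% gravity

print(round(t_film / t_real, 3))  # → 1.414, i.e. sqrt(2) slower
```

That roughly 41% slowdown in every fall and tumble is what sells the stadium as enormous without the motion ever looking artificially retimed.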
1973 Washington D.C. and Sentinels
If you’ve already seen the film, you’ll know that after Magneto picks up RFK Stadium he transports it over Washington D.C. before dropping it around the White House. Creating a 1973 Washington D.C. environment and White House was another task the Digital Domain team worked on.
Just like with the RFK Stadium sequence, they started by gathering reference. Unlike the stadium, though, there were some aspects of the White House environment for which they couldn’t simply get photographic reference.
“We even made a call to the White House historian to get the types, varieties and sizes of the trees as they actually were during the Nixon administration in the early ’70s,” states Pecora.
To create the trees, the DD team developed a foliage pipeline using SpeedTree, V-Ray and a lot of proprietary tools that allowed their animators to have a fine level of control over the animations for the trees at various speeds.
Perhaps one of the most prominent aspects of the shots at the 1973 White House is the Sentinels that Magneto uses to cause mayhem. While creating the CG versions of these Sentinels was up to the DD team, they had quite a bit of reference from other effects houses to work with. The original concept image, which was approved by Bryan Singer, was created by the team over at Framestore. From this concept, the team at Legacy FX actually built a prop that, to date, is the largest prop they’ve ever built at a whopping 22 feet tall.
“It’s composed of many different materials and since it’s a real prop that Bryan Singer got fairly familiar with, as you can see here, we had to match every one of those materials pretty meticulously,” explains Pecora.
Once the Sentinel was modeled, the next step was to figure out how it would move. As any animator knows, the first step to accomplishing this is to find reference. In this case, that meant studying reference of mechanical parts and artificial limbs to see not only how they moved but also how the movement of one part would affect other parts on the robot.
“We also found this great video of a Boston Dynamics robot,” says Pecora. “It’s a fantastic movement study for our animators. Notice in particular not only that the body paneling is very similar to what our Sentinel has, but also there’s a lot of cable-imposed jiggle on this fellow as he bounces around. The big movements he’s making, a little unrefined and a little herky-jerky but he’s up to the push-up challenge here. He runs along and does a lot of stuff in exactly the kind of movements you’d expect reasonably from futuristic robots in the early ’70s.”
To help mimic this sort of effect for the Sentinels, DD’s rigging lead Hans Heymans built a jiggle and bounce system. Even though most of it was driven by rigid body simulations, the custom-built system allowed for the animators to control a curve for how much jiggle and bounce would be applied. This let the animators focus on the overall performance of the Sentinels without worrying about the technical details.
Because the real Sentinel prop built by Legacy FX didn’t actually move, it wasn’t until the DD team started doing range-of-motion tests on the Sentinels that they realized the original design would need to change. “Once he started moving, we realized we needed to change some of the panel thicknesses, some of the joint clearances and cable routing on his neck,” Pecora explains. “Also, some of the pieces ended up having to have function, using the shoulder ailerons and armpit fans, as we called them, to adjust his flight.”
But not all of the design changes were driven by movement. “Some of the design changes we had to incur were just to make the story more interesting,” says Pecora. “For instance, his gun is a mechanical gun and we needed to add motion to it since the real prop didn’t have that. So we came up with a somewhat believable ammunition delivery system with counter-rotating cartridges feeding the spinning barrel and a central drive motor controlling all the action.”
One of the big challenges the team at Digital Domain had to figure out for Mystique’s transformation was that transforming between Mystique and another character nearly always meant dealing with very different volumes. In true Hollywood fashion, the DD team needed to find a way to hide the transition between those very different volumes.
“We decided to base the mechanics of the Mystique transition on the idea of a magician flipping over a row of cards,” explains Pecora. “That’s something most people are familiar with and they’ve seen once or twice. As the cards, or in this case feathers, flip over in a row they reveal a pattern of color on the other side that ripples down across the surface of the body or face and it gave us a physically plausible foundation on which to build this effect.”
Another key aspect that had to be kept in mind was the volume changes that would be needed as the transition from Mystique to another character took place. “We wanted to make sure that we gave Bryan Singer the ability to change exactly where he wanted the leading edge at exactly what time,” states Pecora. “We had fantastic pre-viz but, as you know, when you really get going on things and you start seeing these feathers flip, changes are going to come.”
The effect was made possible through a feather rig which was designed by rigging lead David Corral. With the feather rig, animators were able to drag simple primitive shapes, such as spheres, over the model or animation. Depending on which shape the animator was manipulating, the shape would then drive either the leading or trailing edge of the feathers. In addition, the rig allowed for the ability to scale the feathers as needed. Because this rig was so fast and efficient, it allowed animators to crank out multiple iterations per day with alternate timings.
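One way a dragged sphere can drive a feather-flip wavefront is with a simple distance falloff: feathers inside the sphere are fully flipped, feathers in a band just outside it ease in, and distant feathers stay put. This is a minimal sketch of that idea only; the function names, falloff shape and values are assumptions, not David Corral’s actual rig.

```python
import math

def flip_amount(feather_pos, sphere_center, sphere_radius, falloff=0.5):
    """Return 0 (unflipped) to 1 (fully flipped), eased over a falloff band."""
    d = math.dist(feather_pos, sphere_center)
    if d <= sphere_radius:
        return 1.0
    if d >= sphere_radius + falloff:
        return 0.0
    # smooth cosine ease across the falloff band
    t = (d - sphere_radius) / falloff
    return 0.5 * (1.0 + math.cos(math.pi * t))

print(flip_amount((0.0, 0.0), (0.0, 0.0), 1.0))   # inside the sphere → 1.0
print(flip_amount((1.25, 0.0), (0.0, 0.0), 1.0))  # mid-falloff → 0.5
print(flip_amount((3.0, 0.0), (0.0, 0.0), 1.0))   # far away → 0.0
```

Because evaluating something like this is nearly free, an animator can scrub the sphere interactively, which lines up with Pecora’s point about cranking out multiple timing iterations per day.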
“The beauty of the feathers and that feather-flip rig allowed us to hide that ugly swelling transformation behind feathers,” says Pecora. “[The feathers] could come out small and go down big or come out big and go down small. However we needed them to act in order to hide the leading edge of the transition.”
It didn’t take long for the DD team to realize that the transition wasn’t their only challenge with Mystique’s transformation. Another major challenge was the simple fact that the two actors’ performances never seemed to line up. “Anytime that two performances didn’t line up, which was always, we ended up matching the other actor’s performance to Jennifer Lawrence’s,” explains Pecora. “So her performance was always the foundation that we used to build everything else around.”
One of the most extreme cases of matching performances came in Mystique’s transformation during the battle scene in the clip above. “She changes from a small, uniformed Asian general into the familiar unclothed blue lady whilst fighting off a couple of ill-prepared goons,” says Pecora. “In order to achieve this, a whopping 36 passes of motion-controlled stunt action were recorded for this shot. And out of those, about a baker’s dozen made the cut and provided bits and pieces that we could stitch together into one fluid motion. An arm from here, a hand from there, a toe, an earlobe stitched together gave us the necessary pieces that we could project onto our model, and ultimately finish the shot.”
While MPC was originally tasked with over 500 shots, one of their sequences ended up being cut so, in the end, they had 372 shots in the final film. Although they were responsible for the X-Jet and Cerebro sequences in Days of Future Past, a majority of the shots they handled had to do with the future portion of the film.
The process of creating the future Sentinels started as most characters do, in the concepting phase. Benoit Dubuc, Animation Supervisor at MPC, explains, “The initial brief that we received was that their presence needed to be purely physical. They’re never meant to speak or police, they’re plain and simply mutant-killers.”
To achieve this, the art department at MPC went through quite a bit of exploration to find the right look. You can find some of that concept art here. Something else the team at MPC had to keep in mind for the future Sentinel’s design that differed greatly from the ’70s version of the Sentinel is that they had been biogenetically engineered using Mystique’s DNA. This means not only that they’d have the ability to absorb and adapt to any mutant’s powers, but from a design side there needed to be a visual connection to Mystique.
A big part in achieving the Mystique connection came through covering the Sentinel’s body with what ended up being about 100,000 independent blades that, for the most part, acted very similar to the feathers on Mystique’s body. The final design, seen in the image above, was enough for MPC to start their work.
As we learned earlier, Digital Domain ran into some necessary design changes once they started working on the ’70s version of the Sentinels that were based off of Legacy FX’s 22 foot tall real-world prop. While MPC didn’t have a real-world prop to work off of, they also ran into some necessary design changes pretty early on. In this case, though, the primary issue in the design came with its height.
As you can tell from the concept above, the future Sentinels were originally intended to be pretty tall. While this actually matches both the ’70s Sentinels from earlier in the film and the original comic book interpretations of the Sentinels, the team at MPC had more than that to consider. “For practical and storytelling reasons it was decided that their ideal height would stand at about 12 feet tall,” says Dubuc. “As they’re a more advanced, powerful and efficient piece of technology, they can afford to be a little more compact. But the main reason is that there are quite a few hand-to-hand combats between the Sentinels and the mutants, and from a compositional standpoint it’s much more interesting to have them within the mutants’ reach.”
Once the concept was approved by Bryan Singer, MPC’s first task was to create a performance test. This was primarily as a study to help everyone figure out some of the general performance characteristics for the character. It’s through this test that the team at MPC learned they needed to figure out a solution for all of the blades on the Sentinel. Since the Sentinel’s design is based off of Mystique’s look, it shouldn’t be too surprising, then, that the solution for the blades ended up being very similar to the feather system that Digital Domain used for Mystique.
“Much like Mystique’s feather system, the rigging team gave us some end-point spheres on the puppet and the way these spheres would work is that any time they would intersect with the Sentinel body, the blades would lift,” states Dubuc. “So in animation, this allowed us to control the lift, control the timing and control the direction of the flutter however we wanted to.”
The next piece of the puzzle was to figure out the Sentinel’s head blast. For this, the rigging team split the Sentinel’s head into 17 panels and put controls on each of the fingers inside the head. Because there were so many fingers inside the head, the animation process for the head blast was incredibly time-consuming. So instead of animating it from scratch each time, the animation team essentially templated the animation by creating one base head blast animation. From there, on a shot-by-shot basis, they’d tweak the animation as needed.
As all of this was going on for the rigging team, the animation team had their own challenge to figure out: how the Sentinels would perform. Dubuc explains their first explorations, “The first thing that we initially explored was the use of motion capture. Motion capture is a great tool, we often use it at MPC, but I found that in this case the performance from the mo-cap ended up being a bit too fleshed out and a bit too organic.”
Even though the future Sentinels are a bit removed from their very obviously-robotic ’70s counterparts they are, for all intents and purposes, still giant robots. So that needed to come through in their performance. What the animation team at MPC found was that achieving a robotic look from mo-cap data basically meant stripping out a lot of the subtleties and nuances from the data. Essentially, this would end up being counter-productive since those subtleties are really one of the primary benefits of mo-cap.
But that wasn’t the only reason the MPC team decided not to use mo-cap for the Sentinels. “Another issue with mo-cap would’ve been the scale difference between the Sentinel and mutants,” says Dubuc. Since the Sentinels need to do a lot of hand-to-hand combat with mutants, capturing the scale difference between those mutants and the Sentinels in mo-cap would’ve been extremely difficult to do.
In the end, the animation team turned to one of many animators’ favorite tools to get the job done: reference. And when shooting that reference, the animators did their best to recreate each shot as closely as possible. Dubuc explains, “When filming our reference, essentially we try to match the plate’s composition and the timing of the actors as closely as possible to be able to carry this performance as smoothly as possible into Maya.”
Another source of reference the team at MPC had at their disposal was the pre-visualization done by The Third Floor. “Bryan Singer was heavily involved in the previz process, so on the shoot they actually stuck to the previz more so than probably any show that I’ve worked on,” says Dubuc. “So when you did get the turnovers, there were no surprises, and it also meant that we were able to carry over some of our animation dev work directly into the shots — which rarely happens.”
Of course, even with all of these references there were still plenty of challenges to overcome. For example, one of the big challenges came from the giant foam Sentinel that was used on set. The foam Sentinel had exactly the same proportions as the CG Sentinel, so it definitely had its purpose as something with the right proportions for the actors to aim at and look at. Unfortunately, though, giant foam Sentinels aren’t quite as menacing as their CG counterparts, so the foam Sentinels on set failed to give any sort of performance as the actors interacted with them.
A great example of this challenge was when Warpath jumps on a Sentinel during one of the futuristic fight sequences. When Warpath lands, the Sentinel reacts and flings him from his shoulders. “Our approach here, and for every shot like this, was to roto-animate the mutant,” says Dubuc. “Which meant that the Warpath rig, which was based on the exact proportions of the actor playing Warpath, would be animated to match the performance on the plate perfectly, with extreme spatial accuracy.”
Although this didn’t give the intended results the first time around, thanks to some classic VFX cheating the team at MPC was able to figure out a solution. Dubuc explains, “[Our first pass at] this instantly highlights one of the main issues which is that it looks like it’s Warpath driving the animation as opposed to the Sentinel driving the action. So as a cheat in this case to help us out, we had Warpath on a 2D card, which means he was extracted from the plate and from that point we were able to enhance his lateral movements. Thankfully this helped out quite a bit and it made the Sentinel look like he was man-handling Warpath, which was the point. Luckily these cheats go largely unnoticed in the final shot.”
In addition to the Sentinels, the team at MPC brought many of the mutants to life. Some of the most challenging of these were Blink, Sunspot and Iceman.
For Blink, the effect was not so much in the character itself but rather in her power to create portals. “So we’ve seen various interpretations of portals throughout the years, and the problem with portals is that everyone knows what a portal looks like,” says Dubuc. “And in this case, as is often the case, the clients wanted something new and fresh, something they hadn’t seen before. Which is no pressure at all [laughs].”
To tackle this challenge, the art department at MPC created some different variations of what the portal could look like. Through these concepts, the team realized they really needed to get a sense of energy flowing through the portal. In the final effect, there were multiple teams that worked together to get it to look right.
The first step was to block out the animation of the portal. This was done with a simple circular plane that was positioned and animated based on the character’s performance. The position and timing were then sent to MPC’s technical animation team who would use that data to create a mesh for the portal.
The portal mesh would amount to a blendshape that was based on the timing established in the blocking animation. On top of that, a cloth simulation was added to the mesh to give it a bit more of an organic feel. The final step of the portal creation process would be done by a compositor, who would bring in the mesh and project the portal texture onto the mesh.
“What’s interesting here is this never needed to go through lighting,” comments Dubuc. “This was entirely done in comp. So they’d merge all of the elements together and then this would achieve the final look, which was fairly successful.”
Another one of the mutants the team at MPC tackled was Sunspot. As the name implies, Sunspot’s power is to absorb and channel solar energy. The basic idea for Sunspot was to take NASA reference footage of the sun and scale it down to a human-sized character. As simple as that sounds, it proved quite tricky to get right and was only achieved through months of shader development work.
Although that look was certainly important, one of the main things Bryan Singer wanted MPC to get right about Sunspot was actually his flame. “He was after something closer to liquid or plasma than actual fire. We needed to avoid the flamethrower look at all costs, because Sunspot’s power needed to be reminiscent of solar flares.”
In the end, the FX team at MPC used Scanline VFX’s software, Flowline, to get the right look. “Ironically enough, during the comps stage we did sneak in some flamethrower elements just to ground this look into a more realistic context,” admits Dubuc.
While Blink and Sunspot were new characters to the X-Men films, Iceman is a returning mutant from previous entries in the franchise. For Days of Future Past, though, Bryan Singer wanted to put a fresh spin on Iceman and bring him a little closer to the character found in the comic books. A big part of this was having Iceman’s entire body freeze up during battle, along with seeing him ride on his ice slide.
One of the many challenges the team at MPC had to figure out for Iceman was how to get the actor’s facial expressions to be readable when he was entirely ice. Dubuc explains how they solved this challenge, “What we found was an important balance to strike was to have enough detail in the ice itself and combine that with a roughness on the surface of the ice. So this balance would get revised and tweaked on a shot-by-shot basis depending on the lighting of the plate or Iceman’s size in the frame.”
For Iceman’s slide, Shawn Ashmore, the actor who plays Iceman, was shot using a lot of wire work to swing around the room as if he were skating on an invisible ice slide. “This wire work was probably the best I’ve ever seen,” remarks Dubuc. “Once the Iceman puppet was roto-animated to match Iceman’s performance we never had to tweak the performance in animation.”
Because of the great wire work, the team at MPC was able to reverse engineer the ice slide. After roto-animating a 3D version of Iceman onto the actor’s performance on the original plate, they used Iceman’s position in space as the basis for modeling out the ice slide. Once the model was completed, it was handed off to the rigging team who essentially extracted a curve from the model and made the rig able to collapse on the curve. The end result was that the animators were able to organically reveal the ice slide to match Iceman’s performance.
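A collapse-on-curve reveal like the one described can be thought of as a simple parametric filter: every point of the slide geometry is bound to a parameter u along the extracted curve, and only the portion behind Iceman’s current parameter (plus a small lead so the ice forms just ahead of him) is shown. This is an illustrative sketch under those assumptions, not MPC’s rig.

```python
def revealed_points(slide_points, point_params, character_u, lead=0.05):
    """Keep slide points whose curve parameter is behind the character,
    plus a small lead so the slide appears to form just ahead of him."""
    return [p for p, u in zip(slide_points, point_params)
            if u <= character_u + lead]

# Five slide points evenly parameterized along the extracted curve:
points = ["p0", "p1", "p2", "p3", "p4"]
params = [0.0, 0.25, 0.5, 0.75, 1.0]

# With Iceman halfway along the curve, only the slide behind him exists:
print(revealed_points(points, params, character_u=0.5))  # → ['p0', 'p1', 'p2']
```

Animating `character_u` from the roto-animated performance is what makes the reveal feel organic: the slide always grows exactly where, and when, Iceman does.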
You can see an example of how Blink, Sunspot and Iceman’s effects came together for the final film in the clip below:
Rising Sun Pictures
Australia-based Rising Sun Pictures (RSP) certainly didn’t deliver the same number of shots as Digital Domain or MPC, but their 31 shots for Days of Future Past came during what many fans consider one of the more comical sequences in the film. That is, of course, the kitchen sequence where we’re all treated to Quicksilver’s sense of humor.
As they did for the rest of the film, The Third Floor provided previsualization for the kitchen sequence. This really helped set the tone of the comic relief for the shot that RSP would need to create. Adam Paschke, DFX Supervisor at Rising Sun Pictures, explains how they started the shot, “We knew we had to get on with the simplest part of that process first, which was really just populating it with all of the many objects that were around them on the set.”
With so many items on the set, the RSP team broke them all down into three categories. There were ceramics and glassware that they knew could break, the utensils that wouldn’t break and any organics in the scene. Rather than just take a few reference photos to create these ordinary objects, the team at RSP took photos of every prop on set using the same lighting as would be in the final shot. Since everything in the final shot ended up being a CG double of what was on the real set, the reference photos proved to be a huge help to make sure the digital versions of the props matched their real-world counterparts.
“All of these objects were pretty much non-subjective surfaces so they were pretty easy to get from point A to point B in look dev software,” explains Paschke. “And the good thing about that is that it set an increased level of where the effects would need to get in the final shot.”
After all of the props were created, the team at RSP created a global layout. This served as a base for the team to anchor their continuity. Once the layout was done, the next step was to go through each of the 31 shots in the sequence and grab frames with the lit set at the first, middle and last frame.
The purpose behind this was not only to help with continuity; out of doing this they also generated a heat map of exactly which areas of the set would be seen the most by the camera. This helped the RSP team know where they really needed to focus their efforts and which areas didn’t need as much attention.
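In spirit, that heat map is just a frequency count: for each sampled frame, tally every region of the set the camera sees, then rank regions by how often they appear. A toy version of that bookkeeping (the region names and sampling are made up for illustration):

```python
from collections import Counter

def coverage_heatmap(frames_visible_regions):
    """Count how many sampled frames each set region appears in."""
    heat = Counter()
    for regions in frames_visible_regions:
        heat.update(regions)
    return heat

# Hypothetical first/middle/last frame samples from one shot:
frames = [
    {"stove", "counter"},
    {"counter", "ceiling"},
    {"counter"},
]

print(coverage_heatmap(frames).most_common(1))  # → [('counter', 3)]
```

Regions at the top of the ranking get the full-detail treatment; regions that never score can stay rough.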
For lighting the room, they created HDR images to match the real-world set. “It really wasn’t complex, so it was really important that we didn’t over-complicate it,” Paschke states. “We were pretty much mocking out exactly what they had on set.”
“I guess that ended up making us more rely on Macbeth calibrated HDRI as well,” Paschke continues. “Everything was trusted and all of the disciplines from compositing right back through animation could trust it.”
The time scale was incredibly important to the story being told in the sequence. For that reason, from the outset the team at RSP planned for, and then built, a technical control for time. This control tapped into the idea that when Quicksilver slowed down or sped up, the water droplets falling around him would adjust accordingly. Although the practical shots were captured at a really high framerate, the team at RSP still had to slow things down even further in the final shot.
This idea of Quicksilver bending time while still interacting with other actors in the shot made it really difficult to maintain physical plausibility. After all, if Quicksilver was moving as quickly as he appears to be in the shot and he were to touch someone’s face, there’d be a good chance the shot would end up being a lot more gruesome than anyone would want it to be. Because of this, along with the nature of extreme slow motion to give the viewer plenty of time to see every little detail, the RSP team knew they couldn’t hide behind traditional movie magic tricks like motion blur or camera shakes.
To tackle this challenge of bending what would be physically plausible while still staying grounded in reality, the RSP team first turned to some real-world high-speed reference like the video above. One of the keys they wanted to get out of the reference was figuring out what aspects of the real-world footage could be transferred over to the digital world to give the sequence that feeling of reality.
“Our simulations were all based on their own real-time/real-world scale,” explains Paschke. “Basically all of our reference was either 3,200 frames per second or faster and none of that was slow enough for what we needed in our sequence. That included the tile work, the muzzle flashes, the muzzle break and the airborne droplets and all of the associated impact events that happen when they land.”
In a typical CG sequence, a normal workflow to get water droplets in a scene would be to emit droplet particles from a CG emitter, then, using a force of gravity, simulate each droplet falling and, once it hits an object, have it collide with that geometry. While that works for many scenes, with the sheer number of droplets and objects in this scene it meant there would be millions of collision events. The simulation time needed to accomplish that simply wasn’t something they could afford.
Instead, the team at RSP figured out a solution that would allow them to get the results they needed faster while still allowing them to maintain creative freedom. “We started developing a massive library of sometimes crude, sometimes highly detailed events,” says Paschke. “That was basically from airborne droplets such as crowning water interaction and other effects involved. So we started off with real-world physics where everything was based on real-world time and from there an effects artist would apply a user timecode per-event to control direction of the art and preserve continuity.”
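The per-event timecode idea boils down to a frame remap: each cached event stores its frames at real-world speed, and an artist-chosen time scale decides which cached frame plays at a given shot frame. Here’s a minimal sketch of that remap; the function name, parameters and numbers are hypothetical, not RSP’s actual tool.

```python
def event_frame(shot_frame, start_frame, time_scale, cache_length):
    """Map a shot frame to a frame in a pre-simulated event cache.

    time_scale < 1 slows the cached event down; > 1 speeds it up.
    Returns None when the event hasn't started or has already finished."""
    local = (shot_frame - start_frame) * time_scale
    if local < 0 or local >= cache_length:
        return None
    return local

# A droplet crown cached for 100 frames, placed at shot frame 10 and
# played at 10% speed, so it stretches across 1,000 shot frames:
print(event_frame(shot_frame=510, start_frame=10, time_scale=0.1,
                  cache_length=100))  # → 50.0
```

Because every event carries its own `start_frame` and `time_scale`, two splashes sitting side by side on a counter can run at completely different speeds, which is exactly the independent-time-scale control described next.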
The result of this technique meant RSP’s artists had independently-controlled time scales that they could place around the entire room to get the look they needed.
Another time-saver came in the approval process, where shots could be signed off based on crude, low-resolution simulations. “Perhaps you’d have every 5th frame or something like that and we’d approve that and we’d know it’d be good in the final simulation,” says Paschke. “That kind of meant more efficient gateways of approval for us and a cleaner commitment to coverage and shape qualities, as well as being able to control the timing factor. It allowed us to find more complex effects detailing in set dressing and we were able to be more playful with time scale and velocity gags as well and combine the two.”
The team at RSP took advantage of Houdini 13’s new VDB volumes to get this shot to work. “It gave us those three phases,” Paschke explains. “[They're] particularly suited to this kind of shot and that was particle simulation which really we’re very familiar with and we had the CHOPs to be able to pull into this show.”
With the water droplets looking right, one of the next challenges they had to tackle was the bullet trajectories through the scene. Again, the artists at RSP turned to real-world reference, this time to ballistic gel tests like the video above. “What we were looking at was that negative shape of the bullet passing,” comments Paschke. “Then we essentially took that and turned that into positive shapes. We got some awesome looking effects in the early tests and some of these effects actually made it into the final sequence.”
With the previsualization done by The Third Floor, the artists at RSP knew much of the sequence involved the camera following Quicksilver around the room. How this camera movement would affect the water droplets in the air was something they knew they needed to figure out. “We knew that we were going to encounter this obvious issue of the camera moving through this volume of droplets and fields of droplets were going to be impacting the camera or passing through the camera,” says Paschke.
They thought they found a solution in using standard frustum culling for all of the shots in the sequence, but as it turns out the initial tests didn’t reveal the entirety of the issue. “When you actually saw the shot content you could start reading the path of the camera carved out,” explains Paschke. “So we started implementing this idea of a rain curtain, which is essentially the same idea of shooting a crowd for live-action and passing a camera through that crowd where crowd members would just pass gently out of the frame but perhaps then pull back into frame. And it worked quite effectively and we managed to finally get it to work for the stereo as well.”
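A rough way to picture the rain-curtain fix, as a hedged sketch with invented values: instead of hard-culling droplets inside the camera frustum (which leaves a visibly carved-out path), each droplet near the camera is given a gentle lateral offset that eases smoothly back to zero with distance, so droplets slide out of frame and return just as a crowd parts around a moving camera.

```python
import math

def curtain_offset(droplet_dist, clear_radius=0.5, falloff=1.5):
    """Lateral push applied to a droplet, by distance from the camera.

    Returns clear_radius (full push) at the camera, easing smoothly
    to 0.0 once the droplet is clear_radius + falloff away.
    """
    if droplet_dist >= clear_radius + falloff:
        return 0.0                                   # far away: untouched
    t = max(0.0, droplet_dist - clear_radius) / falloff
    return clear_radius * (1.0 + math.cos(math.pi * t)) / 2.0  # cosine ease

for d in (0.0, 1.0, 3.0):
    print(d, round(curtain_offset(d), 3))
```

Because the offset is continuous rather than a hard on/off cull, there is no sudden pop when the camera sweeps past, a property that also matters for keeping the left- and right-eye renders consistent in stereo.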
While all of this work was being done, another team of RSP artists was working out how to stabilize the actors’ performances. “As rock solid as the performers would try to stand, they were obviously standing in a wet room for many hours; they’re holding their arms out, they’re getting tired,” states Paschke. “We knew that we had to have interaction with both the guards themselves and also on the wall so we had to have footprints registering.”
For this massive challenge, the artists at RSP anchored the plate to the interactions that Quicksilver had with the guards. From there it came down to a lot of hard work. “We just bit the bullet,” comments Paschke. “We sliced the plates up, we put it back through animations to a new performance and reprojected those textures and we got it in the bag with first-round iterations on these.”
As overly simplified as that may sound, it gets even more complex when you realize that the shots were all done in stereo. Overcoming this challenge was done with a pipeline that the artists at RSP developed with the help of Stereo D. “There was absolutely no way of post-converting a space with such a vast volume or field of droplets and micro-events,” says Paschke. “So we had Stereo D converting the plates early in the schedule, the production schedule, sending those over to us and then we did both mono comps as well as the stereo comps. We had plenty of camera blends as well and in stereo using multiple different projection formats that was quite a challenge. So all in all it was a huge learning curve for us. We were very fortunate to be a part of it.”
If this is the first time you’ve gotten a practical look into how the magic of VFX happens, then next time you’re watching this film, or any other film, we’d highly encourage you to stay through the credits to see all of the talented artists and studios who worked on it. Then be sure to look some of those artists or studios up when you get home.
As you can tell from the behind-the-scenes footage from the set in the video above, there are a lot of shots we didn’t have the chance to talk about that nonetheless had to be completed. And although we didn’t really go into the eight other effects houses, that certainly doesn’t discount the hard work they put into making Days of Future Past a success.