“I was the client contact and handled the on-set supervision for Scanline in Bulgaria,” explains Scanline VFX Supervisor Bryan Hirota. “Once principal photography wrapped I stayed in the LA office and supervised the teams there. Danielle was in Vancouver nearly full-time and supervised the teams in that location. Given her experience working in the Munich office for a number of years, it made sense for Danielle [Plantec] to oversee the work sent there as well. Stephan oversaw all Flowline simulation.” Orchestrating the visual effects for the stylish historic epic helmed by Noam Murro (Smart People) was Richard Hollander (WALL•E). “I’ve known Richard for many years, so it was easy to establish a dialogue with him while we were in Bulgaria during principal photography to make sure we were shooting the footage in a way that would optimize our efforts in post-production. Once we moved into post, John ‘DJ’ DesJardin [Batman v Superman] handled the production-side VFX supervision; he and I go back many years as well. Taking advantage of working out of the Los Angeles office of Scanline, I spent many an afternoon over at Warner Bros. going through in-progress shots and artwork to ensure the film moved forward in concert with the filmmakers.”
“Scanline worked on the first 300 film seven years ago, while we were still in Munich,” states Danielle Plantec, who serves as a Vice President and Visual Effects Supervisor at the Los Angeles facility of the German VFX company. “We did the storm shots with the ships crashing at the cliffs of the Hot Gates, and our shots were featured the following year in the Electronic Theater at SIGGRAPH. Consequently, when it turned out that the 300: Rise of an Empire (2014) screenplay had over 50 per cent more boats, battles and storms on the ocean, Chris de Faria and Anne Kolbe from Warner Bros. invited us to a brainstorming session in fall of 2011. Over the course of the months before the shoot started in July 2012, WB and Scanline refined the approach not only on how to shoot all this water action dry-for-wet in front of green screen, but we also started to develop the look of the four different battles: Circle, Fog, Fire and Final Battle.”
“All of Scanline’s water is simulated and rendered with our proprietary fluid simulation software Flowline,” explains Plantec. “Early on, before previs even started, we developed a Flowline real-time previs ocean rig for Third Floor. This allowed Third Floor to work with actual ocean caches and wind speeds that exist in reality. One of the biggest problems in physically based effects is when directors fall in love with a previs that will never actually work once scale and physical reality need to be taken into consideration. For example, if an artist animated the wave shape too fast, the spray from the wave crests would not be able to fall fast enough, creating huge unrealistic curtains of water in the air. The meshes created for the previs rig were 2D representations of simulations and were physically based. Thus, no matter what happened in the shot, we could take the previs file and simply ‘convert’ the rig to full 3D crashing, rolling waves that could interact with the boats and have foam and whitewater. Our setups also allowed us to transition seamlessly from full 3D water simulations in the foreground to simpler 2D simulations in the background. For every shot, simulation artists could use a level of detail suited to its needs.”
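The level-of-detail choice Plantec describes can be sketched in a few lines of Python. This is purely illustrative: none of the names below come from Flowline, and the distance cutoff is an assumed parameter. The idea is simply that water which must interact with boats, or sits close to camera, gets the expensive full 3D treatment, while distant water falls back to a cheaper 2D surface simulation.

```python
# Illustrative sketch only: a hypothetical LOD decision for water regions.
# Nothing here reflects Flowline's actual API; all names are invented.
from dataclasses import dataclass

@dataclass
class WaterRegion:
    name: str
    distance_to_camera: float  # assumed unit: metres
    interacts_with_boats: bool

def choose_sim_level(region: WaterRegion, cutoff: float = 50.0) -> str:
    """Pick full '3d' simulation when the water is near camera or must
    collide with geometry; otherwise use a flat '2d' surface sim."""
    if region.interacts_with_boats or region.distance_to_camera < cutoff:
        return "3d"
    return "2d"

shot = [
    WaterRegion("bow wave", 10.0, True),
    WaterRegion("mid ocean", 80.0, False),
    WaterRegion("horizon swell", 400.0, False),
]
for region in shot:
    print(region.name, "->", choose_sim_level(region))
```

In practice the decision would also weigh screen coverage and motion, but the foreground/background split is the principle Plantec outlines.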
“Co-owner and VFX supervisor Stephan Trojansky is also the mastermind behind Flowline,” remarks Danielle Plantec. “It started as a weekend project for him over 10 years ago and evolved into the FX system it is now. On the first 300 we worked on seven shots. Now we had to deal with roughly 700 shots. While we are constantly developing features to improve the realism of our simulations, a big part of our FX pipeline is centered around Flowline. Artists can trigger simulation and rendering of all simulation aspects with a single button, outputting everything needed for compositing. A single shot might consist of 15 simulation steps, like Ocean Solving, Basic 3D Fluid, Surface creation, 2D wakes, 2D flows, Spray, Foam, Bubbles, Mist, Oil, Fire, Explosions, and Wind; these simulations are distributed among many render nodes on the farm, acting as a big simulation cluster, not unlike how supercomputers work these days, where single problems get distributed among thousands of processors. Along with roughly 40,000 cores, we used 2 petabytes of disk storage just for simulations.”
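The “one button” dispatch Plantec describes can be sketched as fanning a list of simulation passes out to workers. This is a hedged stand-in only: `ThreadPoolExecutor` substitutes for a real render-farm scheduler, the pass functions are trivial placeholders, and a real pipeline would honor dependencies between passes (a surface can only be meshed after the fluid solve) rather than running everything in parallel.

```python
# Illustrative only: the simulation passes Plantec lists, dispatched to
# workers the way a farm scheduler might. Not Scanline's actual pipeline.
from concurrent.futures import ThreadPoolExecutor

SIM_PASSES = [
    "ocean_solving", "basic_3d_fluid", "surface_creation", "2d_wakes",
    "2d_flows", "spray", "foam", "bubbles", "mist", "oil",
    "fire", "explosions", "wind",
]

def run_pass(name: str) -> str:
    # A real pass would write a cache for compositing; here we just report.
    return f"{name}: cached"

def simulate_shot(passes=SIM_PASSES, workers=4):
    """The 'one button': submit every pass and collect the results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_pass, passes))

results = simulate_shot()
print(f"{len(results)} passes finished")
```

The farm analogy Plantec draws to supercomputing is exactly this shape: one logical job decomposed into many independent work items spread across nodes.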
“Given the physical limitations of the shoot, it was paramount that we had a game plan for the shots that required live-action components,” notes Bryan Hirota. “We were limited to having a single boat on stage; if it was a Greek trireme the entire boat fit, but if it was a Persian ship only a third would fit. You had the choice of getting the front or the back of the ship. Lastly, there was minimal use of motion-control as well [primarily for the forced perspective work to make Xerxes 8 feet tall], so understanding the shots and breaking down passes to photograph was something that would have been impossible without having good previs.” A trademark of Zack Snyder’s is the use of speed-ramping during action sequences. “When you’re applying a speed-ramp to a shot that is all in camera it’s not that much of an issue, other than doing some clean-up work on the artefacts that might be introduced by your retiming software of choice. When applying it to a shot that has many layers of character animation, photographic elements and simulations, things become exponentially more complex. Fortunately, we were able to implement a pipeline into our production to help out.”
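The core mechanic of a speed-ramp is a variable retime curve: each output frame advances the source clock by that frame’s speed factor, so the source time is the running sum of the curve. The sketch below shows just that arithmetic; it is a generic illustration, not Scanline’s retiming pipeline, and the curve values are invented.

```python
# A speed-ramp retime maps output frames to source times by accumulating
# a per-frame speed factor (1.0 = real time, 0.25 = quarter speed).
# Generic sketch; no relation to any studio's actual retime tools.

def source_times(speed_curve):
    """Integrate the speed curve: each output frame advances the
    source clock by that frame's speed factor."""
    t, times = 0.0, []
    for speed in speed_curve:
        times.append(t)
        t += speed
    return times

# Ramp from full speed into slow motion and back out.
curve = [1.0, 1.0, 0.5, 0.25, 0.25, 0.5, 1.0]
print(source_times(curve))  # -> [0.0, 1.0, 2.0, 2.5, 2.75, 3.0, 3.5]
```

Fractional source times are where the artefacts Hirota mentions come from: a retimer must interpolate between photographed frames, which is why shots with many CG layers instead re-render or re-simulate at the new timing.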
“Considering nearly every shot in the film was a potential visual effects shot, it was imperative that visual effects were involved with the shoot,” states Bryan Hirota. “For most of production there were two units going at the same time, and having a presence allowed us to make sure that for the sequences we were going to have to tackle, someone was there to offer suggestions and/or flag potential issues that might come up down the road.” Partial sets had to be integrated with the CG environments. “We spent the time to make sure we could re-create the various sets in post as needed by acquiring LiDAR data for each set, taking extensive photographic reference for each set, capturing HDRI data for each lighting setup and shooting grey/chrome balls for each camera setup. Simon Duggan (the DoP) did a great job in lighting the scenes to give us flexibility in post to seat the footage into the CG environments that would come later. For example, since we had a pretty good idea of where the fire would be during the fire battle, he worked out an interactive lighting rig that would provide the kind of directional lighting cues that would really aid in our final composites.”
“The Art Department provided great designs and reference for the various fleets,” states Bryan Hirota. “Patrick Tatopoulos [Dark City] had designs for the various Greek and Persian navies. Some of the designs were only fully detailed out to the actual physical sets that were going to be built, so we fleshed out the ships digitally and I’d run them by Patrick to make sure they were keeping within the aesthetic he had established. As far as the battles were concerned, there was some artwork that we used as a starting point. Stephan and the Flowline teams worked their magic to establish distinct stylized oceans for the various battles.” The stylized nature of the film allowed for more creative freedom. “Early on, while you’re in look development, it allows you to experiment with a number of wild ideas that you wouldn’t get to on a more traditional film. Care needs to be taken to avoid it devolving into a visual free-for-all, though. There was an established visual language from the first film and we really wanted to respect that and find ways to extend and expand upon it.”
Each sea battle had to be unique. “It was a definite goal to have all of the battles have visually distinctive ocean conditions,” reveals Bryan Hirota. “We spent a great deal of time working out the different ocean conditions and looks for the different conflicts.” Hirota remarks, “The ‘horse run’ was by far the single biggest challenge in the show. This required an enormous effort from all departments. From a photography standpoint the shot was composed of nearly a dozen individual takes. Each take was shot with how it would get handed over to the next in mind. For example, for the opening take where Themistokles runs and jumps on the horse, we laid down a dolly track and made sure the end framing was close to the start framing of the next series of takes, which were to be achieved by shooting Themistokles on a motion-controlled mechanical riding buck with a motion-controlled camera. At the end of the shot where he’s fighting a series of proto samurai, that fight was broken up into about five different takes that would have to be seamed together in post. The additional challenge was that the on-set protos were stuntmen in performance capture suits that we had to replace in post with digital protos. The costumes were too bulky to allow for the kind of combat orchestrated by stunt choreographer Damon Caro [Man of Steel], and the digital warriors also allowed for tweaking of their animation.”
“This shot was nearly 4000 frames with hundreds if not thousands of interacting elements, all subject to re-times,” adds Danielle Plantec. “To accommodate the retimes, we had to plan and work everything out, from animation to simulation, at 96 frames per second. This meant we were handling close to 16,000 frames for this shot. We started to replace each part of the previs: Themistokles with full digital cape and leg replacement, ships crashing together and splintering, full digital humans, full digital proto samurai with simulated cloth and hair, digital oceans and splashes with major retimes, a horse with muscle systems, fur, hair and blood, a burning ship, a flaming horse, the horse going underwater, practical characters interacting with full digital ones, and digital blood landing on the deck and accumulating [we even had them leave blood footprints as they stepped through blood].”
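Plantec’s numbers line up as a simple oversampling calculation: working at 96 fps against a 24 fps delivery rate gives 4x temporal headroom, so retimes down to quarter speed can pull real simulated frames rather than interpolated ones, and a roughly 4000-frame shot becomes roughly 16,000 simulated frames. A quick worked check (the delivery rate of 24 fps is an assumption, as the standard cinema rate):

```python
# Checking Plantec's arithmetic for the 'horse run' shot.
# 24 fps delivery is assumed (standard cinema rate); 96 fps and ~4000
# frames are the figures she quotes.
DELIVERY_FPS = 24
SIM_FPS = 96
SHOT_FRAMES = 4000  # approximate length at delivery rate

oversample = SIM_FPS // DELIVERY_FPS          # 4x temporal headroom
simulated_frames = SHOT_FRAMES * oversample   # ~16,000 frames
print(oversample, simulated_frames)  # -> 4 16000
```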
“Scanline built ships for the Athenian, Spartan, and Arcadian fleets, along with the Persian fleet, which we shared with the other vendors,” states Bryan Hirota. “Additionally, we built and shared a number of digital Greek and Persian soldiers and their various weapons and shields. We prepared all of those assets and sent them out, but didn’t have any back and forth, so it was quite straightforward from our side.” The 3D presentation of 300: Rise of an Empire added to the complexity. “The stereo conversion definitely added another step to our production pipeline. Once we finalized a shot we would then package up the scripts, all of the 2D layers, 3D camera and geometry caches to aid in the conversion process.” Hirota notes, “For me, a sequence that was a pleasant surprise was the bit where Themistokles visits Artemisia on her warship and there’s this enormous oversized moon in the sky. Early on, Zack Snyder was clear on his vision for the extended 300 universe, which was, to paraphrase him, ‘not a History Channel documentary on ancient Greece.’” A different cinematic moment stands out to Danielle Plantec. “For me, it was the underwater creature shots during the fire battle. It wasn’t originally planned for the film and was added quite close to delivery. It was a fantastic design and a really cool choreography. Everyone, from concept and look development to animation, simulation and compositing, had a blast with that one.”
Visual effects images © 2014 Warner Bros. Courtesy of Scanline VFX.
Many thanks to Danielle Plantec and Bryan Hirota for taking the time to be interviewed.
Trevor Hogg is a freelance video editor and writer who currently resides in Canada.