Alien Territory: The Making of Prometheus

Trevor Hogg chats with visual effects supervisors Richard Stammers, Charley Henley, Martin Hill, and Paul Butterworth; visual effects producers Michelle Eisenreich and Unjoo Byars; executive producer Jason Bath; creative director Ahmet Ahmet; 3D CG supervisor Lee Nelson; 3D design lead Dong Ho Lee; and Territory Founder / Director David Sheldon-Hicks…

“I worked with Ridley [Scott] on Robin Hood [2010] and he was impressed with the work we did at MPC,” recalls Visual Effects Supervisor Richard Stammers, who took a leave of absence from the London-based VFX facility. “Ridley asked me to come back to be the overall supervisor on Prometheus [2012], which was a pleasant surprise.” Working with Scott on the medieval tale was a different experience from the science fiction thriller. “We had great plates and it was about augmenting stuff that was already there; the visual effects in Robin Hood were supporting what he had shot as a film and what we added to it was more straightforward.” A whole new world needed to be brought to life for the second collaboration between the British filmmaker and Stammers. “There was an extensive design process, much of which was heavily unified through the work that Ridley’s Art Department did under Production Designer Arthur Max [Se7en].” The concept art and final cinematic image did not always mirror each other. “There are some things where you could look at the concept art and go, ‘Wow! That is like what we ended up shooting.’ But as a general overview you could look at the concept and say, ‘That’s definitely reminiscent of what was shot and the sets that were built.’” The initial designs influenced “the way the set decorating was done, the colour schemes, the cool clinical crispness of the interior of the ship, and the degraded exteriors of the planet’s surface.”

“All the way through pre-production we had this detailed art book which was all of the concept art relating to every individual set and location,” says Richard Stammers. “For any of the unknown details there was some form of artwork.” However, more design elements were required for the visual effects. “There were a number of things that weren’t conceptualized which had to be created by our visual effects vendors; some of that work related to the holographic effects.” In regards to the signature manner in which Ridley Scott expresses his ideas, Stammers remarks, “The ‘ridleygram’ is an excellent way of fast communication. Ridley would never just tell you something; he would draw and tell you it.” The sophistication of the sketches changed during the course of the production. “The detail of those would vary immensely depending on how much time he had and what it was relating to. While we were shooting they tended to be quite simple and basic; when we got into post-production he had enough time to do more complex drawings and we were able to go further with that. Sometimes we would give him printouts of frames that were in the edit; Ridley would draw over those to fill in all of the visual effects elements that he wanted to see in that shot and would give us an indication of what scale, size and position he wanted for those things. Ridley would always relate to the plate, which was an excellent way of briefing our vendors so they could say, ‘Okay, that’s that frame.’”

“The biggest challenge was creating the planet environment,” reveals Richard Stammers. “It was a big unknown even though there was lots of great concept work which gave a suggestion of what we needed to achieve. At the beginning we weren’t sure where we were going to go to shoot this. We reconnoitred Morocco as a place to go but then there was a lot of unrest in the Middle East so we ended up changing our location to Iceland. Ridley had always been drawn to this one place in Jordan called Wadi Rum. He had scouted there years before and it’s a famous location that was used in Lawrence of Arabia [1962]; there’s a reference to that in the film as well. It’s a stunning valley that has a sandy base to it, which Ridley didn’t want, but he liked the valley walls. Our solution was to get him to come into MPC, sit with our previs team and physically build what he wanted. We started with Google Earth and bought some digital elevation maps which enabled us to build the geometry of the valley based on satellite data within our CG Maya scene. We showed him the valley he liked and lined it up with a photograph he had drawn over of what he wanted to see. We positioned things in relation to that and did an interactive session with him where we literally scaled the valley walls and the elements within it. We put in the Prometheus spaceship, the alien Juggernaut spaceship [the derelict horseshoe ship that was in the original Alien film], all of the other structures in the valley and placed additional mountains around the background. We looked at stormy skies and created not only the scale but a little bit of the mood of the environment in a simple previs form; Ridley signed off on that as an idea.”

“We were able to create a previsualization of the whole landing sequence of the Prometheus and also some of the bigger action sequences later on in the film involving the Prometheus crashing into the Juggernaut spaceship and the debris storm which rains down afterwards,” says Richard Stammers. “Those were two big sequences we prevised based on that whole environment. The initial process was a useful thing to do and solved a lot of the problems; it also told me what my prerequisites would be when finding a location to shoot our actors on the ground somewhere.” The desired settings were discovered in the barren and volcanic regions of an island nation situated in the North Atlantic Ocean. “We chose to go to Iceland with our actors and shot there. I took a visual effects aerial unit to Jordan. We spent three days shooting plates, getting lots of stills and textures, and repeatable positions on the ground along the valley which we could shoot at different times of day; that way we could recreate the whole digital valley of what was in Jordan, put it into Iceland and match the lighting conditions. That in itself was a big undertaking and we tackled it methodically. We got all of the pieces we needed and were able to do a nice job of recreating what Ridley wanted from the valley and placing it in the surrounding areas of what we shot in Iceland. Everywhere we went in Iceland we knew that we were going to replace the entire background but the local ground around our actors would be the one thing that remained. We kept the quality of lighting that we got in Iceland because it had interesting and stormy skies. A lot of the time there were fast moving thick clouds and sometimes we had breaks of sunlight; it was the variation that Ridley wanted.”

“In our early previses we had the whole heat shield moment, the upper and lower atmosphere, into clouds and beyond,” states Richard Stammers. “It got simplified to the point where we only had to deal with the clouds but even that was a big undertaking. We shot aerial plates with our helicopter unit flying through clouds and shot those in stereo in hopes that we would get some great depth. Behind the clouds we added in matte paintings of the outer space star fields and this huge gas giant planet with a ring around it that dwarfs the moon which the Prometheus crew land on; that became part of the huge background feature. We also had other landscapes that we were revealing as we went through the clouds. There was quite a lot of integration work with the elements we had. The big wide shots were great matte paintings of cloudscapes that were created from complex photo montages of real clouds which we added animation to. Thankfully we managed to avoid having to create full CG clouds and the only atmospherics we added were the light wispy stuff that came close to the camera.”

“We did all the NASA research to find what space Ridley likes the look of,” says Richard Stammers, who used satellite photographs of Saturn and Jupiter to create the gas giant planet around which the moon that the Prometheus lands upon orbits; the images were utilized as texture references for the surface environment. “The [representation of the] human and alien technology was by far the biggest debate that was always going on during the course of pre-production and the design phase.” A major point of discussion was the cinematic depiction of holograms. “With the alien holograms we looked at ways of manifesting them as a physical object.” The idea was too extreme for the filmmaker. “Ridley would often say that things were too Star Trek, meaning it was too fantasy for what we felt would ground it into the level of reality he wanted to achieve; it was interesting to have a boundary. A good example of that is if you look at the controls of the Prometheus there are buttons, switches and joysticks all over the place as well as quite conventional monitors and screens in a lot of the consoles and desks.”

“There was only one point where we ever discussed doing anything as a miniature model and that was for the specific moment of the collision between the Prometheus and the Juggernaut,” remarks Richard Stammers. “We weren’t sure whether we were going to get a great result out of the CG approach so we had kept some money aside in our budget. I got convinced early on that we could probably approach it the CG way and spend the money on other resources that we needed. MPC demonstrated beautifully that they could create fantastic and realistic CG destruction with the work they had done on X-Men: First Class [2011]. I convinced Ridley that would be fine and to be honest, we didn’t look back at that point. The Prometheus has four structural landing engines and each of those has four landing feet. We had one of those 16 landing feet built for the interaction of the actors when they’re caught up in a storm. We were able to use that as some of our texture references. The garage that gets lowered from the underbelly of the Prometheus was built with a full interior, and the exterior of two sides could be used as well; that again gave us a great practical set to shoot with that our vehicles could physically drive out of. We had real Rovers and ATV vehicles that could be in the garage, a practical door that could open and close and a huge ramp the vehicles could drive over. We had discussed doing CG doors but thankfully the special effects team were able to step up to the mark. While the entire ship above was missing, it gave us a fantastic starting point for a lot of our work. Similarly, for the Juggernaut we had one small section built where Shaw, played by Noomi Rapace [The Girl with the Dragon Tattoo], flails out of one of the Orpheus doors. We had a sample panel which we took to Iceland and were able to use for a moment when she nearly gets squashed. Other than that the two spaceships were all CG.”

“The Prometheus and the Juggernaut are enormous,” notes Richard Stammers. “We realized how much of a challenge selling that scale and movement was going to be when we started previs. I wanted to animate them to be quite articulate rather than being slow heavy beasts barely moving. It was a fine line between having an agile motion, like when the Prometheus lands; you want to be able to get some dramatic motion to it but not have it take forever. We approached it as a combination between a Harrier Jump Jet and an Osprey, which is a twin-rotor tiltrotor craft that can do vertical landing and takeoff. We used those points of reference but they’re far smaller aircraft. We had to slow everything down by the right amount to get the level of weight and scale the Prometheus needed. Ridley always liked the idea that it would take off like a Black Hawk helicopter. Trying to factor all of those agile motions from a smaller aircraft into such a huge thing was a real challenge. The MPC animation team did a great job of finding that sweet spot. Similarly with the Juggernaut, it has no visible engines or form of thrust; trying to make it move in an interesting way that was believable was difficult. Lighting is often the key.” The changing reflections of the clouds moving across the large surface of the spaceship as well as the dappled sunlight helped to create the impression of movement. “One side might be in bright sunlight and the other side of it might be in shadow. It’s getting both kinds of scale in the lighting that makes a difference. We had to do that with the Prometheus too; we would end up having shadows of clouds moving over the surface to help with the scale of it.”

Developing the badly degraded Engineer hologram which documents the day-to-day life of the alien species involved a resourceful technique. “The technology we looked at for that, which Ridley loved so much, was live-action lidar scanning, where somebody had been captured with a lidar scanner and you don’t get a very good picture of them,” reveals Richard Stammers. “It’s an impression of motion; there are a lot of breakups and holes in it. We looked at an MRI style where it had a volume inside of it, where you might get a sense of the internal parts of the Engineer character in the holographic form so it wasn’t just the outer shell you were seeing. Ridley felt that was too much. At a certain angle you would be looking straight through the breaking up lines and it was reminiscent of a broken up old-fashioned television. Looking down from a higher vantage you’d see the contour lines of the 3D shape of the Engineer.” Another significant alien hologram appears in the pilot chamber with the star map known as the Orrery. “Ridley talked about the room being filled with frog spawn, and within that you might want to look at one particular sphere, like our solar system or the Milky Way or something like that, as a particular bit of detail.”

Prometheus features the dramatic scene known as the Med Pod Sequence where Noomi Rapace performs surgery on herself to remove her unborn alien child. “Ridley got sold on the idea of working with previs on this film for the first time,” states Richard Stammers. “Our First Assistant Director was printing out frames from the previs and that was the checklist of storyboards we had up on-set. It involved a great combination of some fantastic prosthetics and great animatronics. The Special Effects Department was able to provide a certain amount of motion to the med pod table. The motorized chair was based on a real surgical chair which could articulate in a number of ways. Much of it was completely in camera. The visual effects would come in and augment some of that where it was required. As soon as the actual surgery commences there were specific surgical tools that had to be generated to carry out the surgery. That was all done by Weta based on matching what we had done as previs. We had a prosthetic belly that was puppeteered from underneath that opened up, and CG spreaders were animated to match it; that was a good combination of CG and prosthetics working together. In a couple of cases we had to replace some of the prosthetic work because it was too gory. It worked out well.”

“MPC ended up picking up most of the work purely because they were dealing with a lot of environment shots,” explains Richard Stammers of the selection of the 10 VFX vendors responsible for Prometheus. “Every time we went outside on the planet’s surface that in itself locked them into a few hundred shots of exterior work. In the exterior work we needed to see the spaceships so they were also awarded the spaceships. And because of the spaceships they needed to do the space shots as well. It amounted to about 450 shots for MPC. We felt that Weta was our best choice for doing creature work and they took on board around 200 shots. Weta had some set extension work and did the reveal of the Space Jockey chair out of the floor of the pilot chamber, which is an iconic moment; they also did the Med Pod Sequence because that involved a little bit of creature work and CG augmented robotic arms. Fuel VFX in Sydney was responsible for creating much of the holographic technology. We chose them because they’re a very creative team with a strong design sense. Paul Butterworth, the visual effects supervisor there, has a creative eye and led a great team creating some fantastic and original looking ideas and concepts for some of the things that were challenging for Ridley’s Art Department to create. Our fourth vendor to come on board was Hammerhead, with the intention they would pick up a couple of key sequences; they did Weyland’s [Guy Pearce] briefing and Holloway’s [Logan Marshall-Green] and Shaw’s presentation to the crew. The two scenes were back to back and take place in one of the cargo areas of the Prometheus, involving two holographic presentations. They also did work relating to David, the android, when he gets his head ripped off. Hammerhead removed Michael Fassbender’s [Shame] body and left behind his performing severed head.”

“Ridley worked very closely with MPC, especially during the pre-production stage,” states Moving Picture Company VFX Supervisor Charley Henley. “We collaborated on previs shots for key scenes by taking his sketches and matching them in 3D. We would later review these shots with him in Maya or as a cut, tweaking the layout and animation interactively where necessary.” Henley explains, “Ridley would draw his own storyboards ready for the beginning of a day’s shooting. These were invaluable guides and would cover VFX shots as well as live action. For all CG shots we would match his drawings in layout. Throughout post Ridley would draw us frames where needed. After there was an edit of a scene we would print out a frame for each shot in the cut sequence, then he would sketch over the top where the ship should be in frame, or how the creature animation should play out, or what extra digital shots were needed. The shots often turned out to be almost identical to the ‘Ridleygrams’; he has a very accurate eye for lens and perspective. It was a great way of working and helped us get exactly what he wanted.”

“The biggest challenge was the goal we set ourselves,” reflects Charley Henley. “It was to do everything we could not to interfere with or limit Ridley’s directing process and creative freedom. This meant we had to be flexible to change right up to the last minute. We built a pipeline to allow, if need be, animation or even model changes to update after we were signed off in lighting and compositing of a shot. Everything was scalable. We would technically lay out a shot correctly and then Ridley would want to adjust the size of, say, the Dome and Ship just in that one shot if it gave a better composition or story point; flexibility was key. To help tackle this we decided to present the layout and animation stages to a much higher visual level. MPC Animation Supervisor Ferran Domenech did a great job on post-vis while Layout Lead Paul Arion took our layout pass to a new level. Traditionally these would be grey playblasts roughly matted over the BG plate. We pushed the texture, lighting, atmosphere and level of FX at this stage to be as presentable as possible. This was good for Production as we had good-looking temp versions for screening tests. By investing time on the look early on we actually got approvals faster and with more confidence. We found we could move on quicker into discussing lighting and finishing qualities.”

Bringing the spaceship Prometheus to cinematic life was a creative journey in itself. “It began with the concept drawings Ridley’s team had developed, which we matched to get the basic model,” states Charley Henley. “We approached texturing the surface using photographed panels from real aircraft and textures of dirt and tiles on buildings to help with scale. We then photographed and scanned the set piece of a foot of the Prometheus, which was about 30ft high. We then modeled to the scan and textured it with photo projections. We analyzed the previs and post-vis shots to find key angles and make sure we didn’t overwork areas that weren’t ever going to be seen. Our lead modeler Lisa Gonzalez and texture artists worked together to find a fine balance between modeling and displacement to make sure we could be efficient in lighting and rendering. Moreover, our Lighting Supervisor Daniele Bigi was keen to push ray traced and image based lighting and not bake in any shadows or occlusion. CG Supervisor Matt Middleton kept a close eye on all the extra details that we modeled in to make sure we were always sympathetic to the style set by the artwork and set constructions. There was a huge amount of design work needed to get to a finished level on any surface of the Prometheus, and credit goes to our model and texture teams for all the hard work they put in. We used references from all sorts of sources, from Arthur Max’s sets to NASA spacecraft and from the classic Nostromo model from the original Alien [1979] to modern buildings.”

“Steve Messing from Ridley’s concept team drew up the concept artwork for the Engineers’ ship,” remarks Charley Henley. “It clearly related back to the derelict in the original Alien but looked bigger and had a much higher level of detail. We knew this was going to be a huge challenge to recreate digitally as it would be seen from every angle and in close-up and wide shots. A Russian artist, Alex Kozhanov, also known as Gutalin, was commissioned to take the concepts to the next level using ZBrush. While the shape of the ship is fairly simple to create, the complexity lies in its details. To generate a renderable model we stripped down the design and modeled the pipes, holes, recesses and larger components. We then tried a relatively procedural approach to laying out the smaller level piping and textures but this didn’t work particularly well. The organic but precisely designed nature of the ship’s flowing lines meant all details had to be laid out by hand. Part of the Juggernaut was built practically by Production Designer Arthur Max’s team; this was the hole that Shaw lowers David out of and a piece that falls on Shaw after the crash. These gave us a template for MPC Texture Lead Caroline Delen to work from. We also referenced buildings, ships and submarines for scale and material qualities. The final CG Juggernaut was wonderfully detailed and looked great in the close-up shots but we still did a projected DMP pass on most of them.”

“We had two planets that could be viewed from space: the gas giant and LV-223, the moon the Prometheus lands on,” states Charley Henley. “We based the gas giant on a very small moon of Saturn called Enceladus that the Cassini space probe had photographed and Ridley had liked the texture of. The challenge was then to create a huge gas giant that encompassed the essence of this small ice moon, so we looked at textures of Saturn and Jupiter for inspiration and then photographed swirls of ink to simulate giant gas flow dynamics. We combined this with satellite images of Earth into a DMP, which was then projected into each shot.” Henley adds, “LV-223 and all the atmosphere was mostly projected matte paintings. Ridley’s meeting with NASA planetary experts encouraged him to go with stormy and turbulent weather for the outer atmosphere. We got the look from hurricanes on Earth seen from space and added lightning flashes in compositing. We also added subtle magnetic storms above the atmosphere which, combined with the gas giant backdrop, gave a nice otherworldly impression.”

“The whole Landing Sequence was one of the hardest scenes to nail,” admits Henley. “Ridley had gone on a recce to Morocco and had some great concept paintings based on photography of mountains there. These portrayed a dark, stormy surface of great scale and no foliage. We ended up with plates from the red stone mountains of Wadi Rum in Jordan and the grey desolation of volcanic Iceland. The scale had to be changed and the two environments we shot blended together. Compositing Supervisor Marian Mavrovic led his team adding atmosphere, clouds, lightning and cloud shadows where needed. With a reference to ancient land markings seen on Earth we drew out a pattern of lines across the main valley the Prometheus lands in. Selling the ship’s scale and interaction with this environment was the key; with CG FX we simulated contrails off the engines, atmosphere blowing past camera and huge dust clouds and stones being kicked up as the Prometheus touches down.” A spectacular moment is when the Prometheus collides with the Juggernaut. “We began the Crash Sequence by heavily researching real explosions and looking through numerous film references. Although the option was there to use miniatures, Richard had full confidence in using our proprietary destruction software Kali and going with a digital approach. We set about carefully plotting how the Prometheus would break up and be destroyed by impacting with the Juggernaut [the Engineer’s ship], using the earlier previs we had done with Ridley. Having worked out which parts of the ship would be destroyed we made them Kali compliant. Explosions were later added based on a fluid simulation and enhanced with elements of practical explosions.”

The planetary world plays a pivotal role in making the story believable on the big screen. “To create the alien planet environment we used projected photography of real locations combined with VFX work to give it that unearthly look,” says Henley. “We worked with an initial concept from Ridley, who had sourced a photo of a valley in Wadi Rum, Jordan and drawn some strange alien domes and markings onto the landscape to create the desired look. We went back to the Wadi Rum valley and shot our reference plates for our aerial shots. On the ground we took high dynamic range 360-degree images from the same spot at various times of day. From these we had a library for the backdrop mountains that could be used to match any lighting direction we had in the plates that were to be shot at a later date in Iceland. Several foreground green screen plates were also shot at Pinewood, and we used our digital environment to extend these. The concept was for the ground to be covered in strange rock pinnacles with the layout of a petrified forest. To build these, we scanned and texture photographed real rocks we found on location in Iceland and then built a handful of CG pinnacles. Environment Lead Julien Bolbach created a tool for naturally placing them across the landscape, which was then lit to match location plates shot in Iceland or green screen plates shot exterior on Pinewood’s backlot.”

The Engineers were not the only alien species to encounter the human explorers. “The Hammerpede was a collaboration between the practical creature FX and VFX,” says Henley. “Here, our main challenge was to seamlessly intercut between animatronics and CG. Neal Scanlan created a great animatronic Hammerpede that had a lot of control but was limited for extreme animations. The CG creature was built to mimic the translucent surface of the practical. We brought the animatronic into the office and broke down how it was built. There was an inner sculpted muscle layer covered with a silicone translucent skin. We copied this two layer approach exactly in the model and Lighting Artist Arturo Orgaz Casado lit it simulating the scattering of light through various thicknesses of outer skin. In all, we took over shots where the head opens up, the Hammerpede wraps around and breaks Milburn’s arm, the head re-growing and the internal helmet shots, and needed to perfectly match the look of the animatronic. The internal helmet shots required very accurate matchmove and roto animation. Shooting with stereo cameras is very unforgiving in these situations because the other eye picks up the smallest error. It makes you realize how much you get away with on a mono show.” CG was used to produce the fatal encounter between the character played by Logan Marshall-Green (Devil) and the alien creature within him. “As Charlie gets infected there is a progression of disease across his face. This was a digital make-up task, which is something we are doing more and more of now. To create the look, the actor was shot with three different states of make-up and we added animation to a number of shots to help enhance the effect. For the simpler shots we cleaned up the original make-up and then used animated mattes and warping in 2D to reveal the veins growing. The more complex shots needed full roto animation of Charlie’s head and blend shapes of 3D veins bulging and collapsing.”

“Almost every exterior shot on the planet involved a digital replacement,” states Charley Henley. “Foreground plates were either shot on the backlot at Pinewood Studios in front of a huge 270 degree green screen built on a wall of shipping containers, or on location in Iceland which, even though we were careful not to shoot the helmets with sky behind, was often harder to blend into and involved a lot of roto and edge fixing in compositing. But the Iceland locations helped us with the realism and there was great light in Iceland that we could match the CG to.” Digital effects were blended with the practical elements. “As the Juggernaut, hidden underground, starts to take off, a huge kilometer wide section of ground opens up; we called this the Silo. An articulated 20ft square of the silo was built on the backlot for the actors to interact with and the rest was effectively set extension. Our CG build was based on photography of the set piece and tuned to perfectly match it. We then used a combination of CG renders and projections to integrate this into the Iceland foreground plates.” Some software customization was required. “We have been developing some useful volumetric tools at MPC for the past few years and the timing was just right for Prometheus. These tools allowed our artists to more accurately work with the look of volumetric FX. It’s a tool that allows you to accurately visualize how CG clouds or smoke will look and light when rendered, but in real time. We used it extensively to help design the Sandstorm Sequence. But the majority of the work was done with systems we have developed over the years at MPC, such as our destruction tool Kali.”

“As we were doing all the environment work there was some crossover with Weta, which was doing foreground creature work, and with Fuel doing holograms on the Prometheus bridge where MPC generated the environment out of the window. It was a pretty clean divide,” observes Charley Henley. “Working on a stereo show, whether native [shot with stereo cameras] or post converted, always involves extra labor and complications. MPC was heavily involved in the stereo pipeline set-up for the production, having learnt from previous stereo shows such as Pirates of the Caribbean. Some of the hardest issues arise with camera matchmove and roto-animation; these disciplines require you to accurately re-create the camera and place objects in depth. With a single camera this is somewhat forgiving of small errors but with two cameras looking from slightly different perspectives you have to be extremely accurate. Lens and mirror distortion add to the challenge.”

“Some of the best things about Prometheus are how well Ridley’s greatest strengths marry with science fiction,” believes Charley Henley. “His wonderful talent for lighting, composition and style helps the production design, cinematography and visual effects to work together to create something both beautiful and tangible. To distinguish Prometheus from Ridley’s previous science fiction films, the advancement in what visual effects can offer today allowed Ridley to explore the spaceships and environments more thoroughly and with greater creative freedom than was possible while making the original Alien.” Henley notes, “For me one of the greatest things about working on Prometheus was the infectious enthusiasm of the entire team. For someone working in visual effects this was a dream project. While VFX contributes to and expands the possibilities of many genres, science fiction is where it has its greatest creative freedom. With Ridley Scott at the helm the MPC team was encouraged to contribute technically as well as creatively, and with great results.”

“We were first approached in December 2010,” states Weta Digital Visual Effects Supervisor Martin Hill. “Ridley had an early design for the Engineer and a maquette which he lit and put on a turntable. Ridley shot some footage and sent it to us to see if we could match it. We built the model digitally from scans, recreated the material properties of the skin and added a facial rig so that we could animate the model and bring it to life. The results gave Ridley the confidence to pursue using digital creatures for the film.” The Academy Award-nominated filmmaker provided the New Zealand-based VFX facility with a clear idea of what he sought to achieve cinematically. “Ridley has a really strong vision in terms of what he wants, but is also very open to having ideas presented to him; we were able to get involved with the design process and how creatures would move and look in some of the sequences. In some instances there was a creature that already had a maquette built and a design which we would match and enhance. We had to make sure it was able to articulate in a realistic way, where a puppet was more rigid or unnatural. Ridley would give us stylistic reference, saying for example that he liked the ferocity of a clip of a baboon, and our Animation Supervisor would create three to four motion studies based on similar reference, which we’d present then refine. In conjunction with Richard Stammers we were able to collaborate and design the vision that Ridley wanted.” Scott would also draw illustrations. “We had some great ones representing an alternative vision of the Trilobite [the large octopedal creature that fights the Engineer at the end of the film]. In these the creature was an amorphous blob that falls from the ceiling [Ridley described this motion as being ‘like a bucket of tripe’], then inverts its head through its body and grows tentacles. There was a lot going on in these little storyboards.”

“The diversity of the effects we needed was the biggest challenge,” remarks Martin Hill. “Almost every shot was bespoke in one way or another. Every Engineer shot was an escalation of the one before it, all the med pod shots were unique in their approach and each Trilobite shot had its own tracking, matchmoving or animation design issues. This made it difficult to achieve any economy of scale with the effects, as creating each new effect was its own unique challenge.” A pivotal task was the development of the alien race responsible for making the human race. “The Engineer was unique. Usually we would strive to make a digital character as anatomically accurate as possible in terms of its musculature, articulation, and the thickness and pliability of the fat under the skin. For continuity with the practical plate we had to make some compromises to match an actor in silicone prosthetics. For example, because the silicone was so thick, we needed to increase the depth of our subsurface vastly, which causes problems with light bleeding through areas like the bridge of the nose and the fingers, making him look waxy. To counter this we added an extension to our TDQ subsurface plugin written by Eugene D’Eon, which added internal blocking structures to the model. Our Creatures Supervisor Matthias Zeller had to augment our muscle system to make some of the muscle contractions and tendons less pronounced to match the performance on set. It was a fine balance between hitting a photo-real CG character and matching the actor with the prosthetics.”

The ritualistic killing of the Engineer results in a dramatic death scene. “For the opening sacrifice of the Engineer sequence, we needed a constantly evolving and escalating decay of the Engineer’s body,” explains Hill. “Taking the lead from Ridley’s idea of capturing as much as possible in camera, we indirectly used a lot of filmed elements that drove our procedural shaders to get natural motion into each of the effects. For the initial stages, where the vein and artery system is transporting the destructive black goo around the body, we took blocks of silicone, carved vein patterns into them and filmed inks and oils pumping through. These elements were processed in Nuke by Shader Writers Remi Fontan and Chris George. They were used to drive not only the colour and displacement but also, with an amount of procedural processing, the bruising and bursting capillaries under the skin, with the darkening flesh becoming more specular. For the next stage of the effect [where the skin becomes more leathery, dried out and less translucent] we filmed elements of cracking, drying clay and paint to drive the shader behaviour for the skin splitting that follows. It goes even down to the DNA level, where the Engineer’s DNA infection was created by combining eroded fish spines with the rendering of tooth enamel, with the motion element of melting polystyrene by Texture Artist Masaya Sasuki.”

Another Engineer meets an untimely demise at the end of the movie. “Animating Trilobites was down to the skill of our fantastic animation team led by Mike Cozens who hand animated each tentacle and the body of the creature to always feel like a continuously moving organic mass; it also solved the incredibly complex problem of making it fit with the engineer’s motion onset,” says Martin Hill. “Our camera department had to do a perfect matchmove of the Engineer for which we used three reference cameras. This was further complicated by the on-set lighting having erratically strobing banks of lights that traditional tracking software had a hard time with. Lee Bramwell’s camera team did a great job making sure the Engineer sat correctly in 3D space to give the animators a target to work with. With a tentacle creature, it is always a big challenge to create an animation rig that doesn’t collapse or twist, but still gives the animator full control of where the tentacles go. Also making sure that the stretching is even across the tentacle and that one section doesn’t get stretched more than others around it, which defies the elasticity of the creature and makes it look unnatural. On top of this where a tentacle is wrapping around the Engineer or pinned to the floor, it needs to fix there and be able to compress and deform against the surface it’s touching. Matthias Zeller, our Creature’s Supervisor used the layered deformer approach to the muscle rig that would fire the muscles in tension and relax them in compression; this was then passed through to a solver, which would wrinkle skin where it was more heavily compressed. Where it is stretched the skin would wrinkle along the tentacle. These same tension and compression attributes were passed along to the shading system so that when the skin was taught and tense it would become lighter and shinier. In compression it would become rougher and darker in the folds of the wrinkles. 
On top of that there was a secondary peeling/flaking skin which sat on top of the other deformers and was fixed in place at the base of the peeling skin of the underlying structure, but didn’t stretch with the main tentacle motion.”
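The tension/compression attribute Hill describes can be sketched very simply: compare each skin edge’s deformed length to its rest length, then let the sign of that ratio drive the shader. This is a from-scratch toy with invented mapping constants, not Weta’s actual rig:

```python
# Toy version of a tension/compression shading attribute: taut skin gets
# lighter and shinier, compressed skin gets rougher and darker in the folds.
# The constants here are invented for illustration (not Weta's pipeline).

def tension(rest_len: float, deformed_len: float) -> float:
    """Positive when stretched, negative when compressed, zero at rest."""
    return deformed_len / rest_len - 1.0

def shading_params(t: float) -> tuple:
    """Map a tension value to (specular, roughness)."""
    k = max(-1.0, min(1.0, 4.0 * t))      # exaggerate, then clamp to [-1, 1]
    specular = 0.5 + 0.4 * max(0.0, k)    # stretched: lighter and shinier
    roughness = 0.4 + 0.4 * max(0.0, -k)  # compressed: rougher folds
    return specular, roughness

# A tentacle section stretched by 10% versus one compressed by 10%:
stretched = shading_params(tension(1.0, 1.1))   # higher specular
compressed = shading_params(tension(1.0, 0.9))  # higher roughness
```

Evaluating this per vertex every frame is what lets the shading follow the animation automatically, the behaviour the quote attributes to the solver.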

“Ridley wanted the machinery to move in a disconcerting and sinister way,” states Martin Hill while speaking about the Med Pod Sequence. “We looked at reference of the motion of industrial machinery, car manufacturing robotic limbs, and also motion control cameras – the way the camera head can keep completely still and controlled while the rest of the machine’s limbs are moving furiously around. We augmented the motion slightly to make it a little bit more sinister, often having more tools than were necessary for the medical procedure shots to create some extra claustrophobia. To give the stapler more impact we added some elements from pneumatic drills, making the whole shaft move to make the tool seem more forceful and give it more weight. Alfred Murrle and Phil Leonhardt’s team digitally projected Noomi’s body in a 2½D way onto the matchmove geometry so that we could compress it where the stapler was punching into her; the staples take some time to settle back to their rest state. For the Med Pod Sequence we needed high-res geometry for her torso that we could cut open, stretch and deform as the Trilobite is pressing against her skin from the inside. We re-projected the plates back onto her torso and then re-applied the specular highlights to match the augmented motion. The Trilobite had to be digitally realized in placental sac form and when it escapes and thrashes around. We referenced a lot of embryonic motion for the twitching and convulsing of the creature in the sac and were careful to make sure our final CG creature could cut seamlessly with a practical model.”

“We did set extensions for the Ampule room [including most of the ampules], the disintegrating ceiling frescos and the Pilot’s Chair,” remarks Martin Hill. “One of the keys to matching the biomechanical look on set was making sure nothing was geometrically perfect; every part of the sets including the Pilot’s Chair was subtly distorted. Most of the environments had some graphite powder on them which gave a retro-reflective quality, so we needed to write custom shaders to match the material quality of this.” Having to deal with other VFX vendors involved with the project was not a problem. “We had a very collaborative approach with the other facilities. For example on the Pilot’s Chair shots, we created the Chair and Elephantine Suit, then passed Fuel our matchmoves, cameras and any geo they required to help them with their very cool Orrery effect.” The 3D aspect of Prometheus did not allow for visual shortcuts. “You lose a lot of 2D tricks when you’re working in stereo. All the matchmoves needed to be incredibly precise, referencing from multiple reference cameras; this was especially difficult for the Med Pod and Trilobite fight sequences where there are multiple layers in depth of contact points between the objects in the plate and the CG applied.”

“Ridley likes to scribble what are known as ‘ridleygrams’ – very simple hand drawings that articulate his ideas,” states Visual Effects Supervisor Paul Butterworth who works for Fuel VFX. “We didn’t get a lot of them but the ones we did cut straight to the chase of what he wanted to see and were very useful. The internal structure of the Pyramid as seen on the holotable was really pushed along by a ridleygram, as were the data spheres and targeting system aspects of the Orrery. Ridley also provided some references of artworks that he felt captured the mood of a scene and some key video reference for looks he liked. Richard [Stammers] worked with us mostly via Cinesync where he could provide direction verbally and workshop ideas in real-time. During the concept design and look development phase he was great at sifting through the multitude of ideas we were investigating and helping us focus on what was most relevant to Ridley’s ideas.” The Australian company was responsible for the alien star map scene. “One of the biggest challenges was to realize the complexities of the Orrery design that we came up with. We didn’t have to invent any technology to handle this but we needed to seriously upgrade our deep image tools. A wide shot of the Orrery has 80-100 million polygons and these tools gave artists the ability to reach in and manipulate millions of points in an interactive way, reducing or removing the need for expensive 3D rendering for minor changes; that was incredibly important. There was no way we could have handled the Orrery in a more traditional way – it would have comprised about 200 traditional layers and would have had to be completely re-rendered each time for even the smallest change, requiring weeks to turn around. The schedule wouldn’t have supported that.”
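The “deep image tools” Butterworth mentions refer to a compositing approach in which each pixel stores many depth-sorted samples instead of one flat RGBA value, so a sample at a given depth can be edited or deleted and the pixel cheaply re-flattened without a 3D re-render. The sketch below is a from-scratch toy of that idea, not Fuel’s actual tools:

```python
# Toy 'deep pixel': a list of (depth, rgb, alpha) samples flattened with the
# standard front-to-back 'over' operation. Editing one sample only requires
# this cheap re-flatten, not an expensive 3D re-render.

def flatten(samples):
    """samples: list of (depth, rgb, alpha) tuples -> flat (rgb, alpha)."""
    rgb, alpha = [0.0, 0.0, 0.0], 0.0
    for _, colour, a in sorted(samples, key=lambda s: s[0]):  # nearest first
        for i in range(3):
            rgb[i] += (1.0 - alpha) * colour[i] * a
        alpha += (1.0 - alpha) * a
    return tuple(rgb), alpha

pixel = [
    (2.0, (0.0, 1.0, 0.0), 1.0),  # opaque green surface behind...
    (1.0, (1.0, 0.0, 0.0), 0.5),  # ...a half-transparent red particle
]
full = flatten(pixel)                                 # both samples composited
edited = flatten([s for s in pixel if s[0] != 1.0])   # particle deleted in place
```

Scaled up to millions of samples per frame, this is what makes interactive tweaks to an 80-100 million polygon render feasible.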

“The Orrery was developed by our art department through traditional styleframes, taking on board aspects of the different concepts that Ridley liked, until we arrived at a final concept frame,” says Paul Butterworth. “Incidentally, this final frame was completed on-set as I sat next to Ridley at the monitor tent during the filming of the Engineer Running sequence. The script called for some sort of ‘magnification function’ in the centre of the orrery and we developed a styleframe that reasoned the Orrery used a logarithmic scale to represent its data as you moved from the centre [where you could view individual planets] to its extremities [where skeins of particles represented thousands of galaxies]. This was more about helping to create a set of rules by which the Orrery could work and we could use as a foundation in discussions. The multiple data spheres that appear within the orrery were born out of a ridleygram that referenced frog-spawn. The idea is that these are pre-selected parts of the universe that have been magnified to a certain extent and are ready to be moved into the centre of the orrery to be viewed even larger. The ‘gimbal rings’ are a reference to armillary spheres and Ridley had pointed us in that direction by supplying us with reference of an amazing Renaissance painting that included one, A Philosopher Lecturing on the Orrery by Joseph Wright of Derby. In our orrery, the rings hold DNA data of the galaxies or solar systems that they orbit.”
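The logarithmic-scale rule Butterworth describes can be sketched as a simple mapping from real distance to display radius. All the constants below are invented for illustration; the quote only establishes the principle:

```python
# Sketch of the Orrery's 'logarithmic scale' rule: distance from the centre
# of the display grows with the log of the real distance, so individual
# planets sit near the middle while thousands of galaxies compress toward
# the rim. All constants are hypothetical.
import math

R_DISPLAY = 1.0   # radius of the Orrery volume (arbitrary display units)
D_MIN = 1.0       # nearest represented distance (hypothetical unit)
D_MAX = 1.0e10    # farthest represented distance

def display_radius(distance: float) -> float:
    """Map a real distance onto [0, R_DISPLAY] logarithmically."""
    span = math.log10(D_MAX) - math.log10(D_MIN)
    t = (math.log10(distance) - math.log10(D_MIN)) / span
    return R_DISPLAY * max(0.0, min(1.0, t))

# Equal multiplicative jumps in distance become equal steps in the display:
# display_radius(1.0) -> 0.0, display_radius(1e5) -> 0.5, display_radius(1e10) -> 1.0
```

A rule like this is exactly the kind of “foundation for discussions” the styleframe provided: it fixes how magnified data spheres relate to the raw skeins at the extremities.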

“The development of the holographic Engineer characters was probably the trickiest of all our concept work,” states Paul Butterworth. “These characters are made from light and yet had to be mysterious and eerie. At times they needed to be an abstract volume of particles that you were unsure what it was, and at other times you needed to be able to glimpse the Engineer. Ridley was especially particular about the look of these as they are so important to the film. It was a challenge for us to get these right and we spent a lot of time both in concept artwork and 3D R’n’D to get there. Once we had a good general recipe however we found that the look of the Engineers could vary considerably from shot to shot depending on framing, lens and lighting – mainly in terms of the readability of their volume. Given it’s a stereoscopic film those issues really jumped out. So the Engineers were ‘sculpted’ on a shot by shot basis (by controlling which particles are seen) to ensure they read well in 3D.” Fuel VFX was also responsible for the holotable and the control desk. “Both these looks came together really quickly. The holotable was the first thing in look development and Ridley basically approved our first motion test as the final look – and that’s the look that’s in the film. This was in pre-production so he was able to take our look dev on set with him for those scenes. So we got off to a good start! The control desk energy FX was something that was on the cards for a while but only got the go-ahead mid-way through post. But we’d had some time to think about it. We did some concept styleframes which Ridley really liked and he picked one. We then turned that into a motion test with some variations and again he really liked those and selected his preferred version, and that’s what went into the film.”

“The look of the laser probes [the ‘pups’] was trickier to get right,” admits Paul Butterworth. “While we could come up with concept styleframes that everyone liked, the success of the shots really came down to the motion of the lasers, so the look development was pretty much entirely solved in 3D FX. There was quite a bit of investigation regarding how fast they scanned, the number of lasers and how much they flickered, and even how much contact with the set walls should be seen. At the end of the day they have a really cool look where the probes send out three forward-looking lasers that map the environment ahead of them, as well as a curtain-type laser that performs the finer scanning of their immediate location. Ridley really liked the pups and started adding them to other scenes, and we found ourselves with a number of new shots not too far out from delivery.” In regards to the 3D wall in Meredith Vickers’ [Charlize Theron] suite, Butterworth remarks, “The idea was that Vickers’ 3D wall was a perfect hologram, so it would look like you could literally step into it. So it didn’t need any concept design, but we did a lot of work to make the alpine landscape and wheat field believable. The alpine shot is a matte painting that Fuel dimensionalized and added 3D trees, CG atmosphere and particle snow to. We also dimensionalized the wheat field footage and re-created the moving wheat in Nuke.”

Background replacements and CG augmentation happen throughout Prometheus. “There was more than is probably apparent,” notes Paul Butterworth. “Fuel looked after the set extensions in the pilot’s chamber. This was a huge, impressive set which really only needed to be topped up and have a CG ceiling added in a few shots. We had a good lidar scan of the set that made modeling the remaining parts of the walls quite straightforward. There were quite a few occasions where we had to replace full walls to re-light them, so having the lidar was great. The ceiling was based on an art department sketch. This required some reinterpretation to fit onto the actual set but nothing major. The catacomb tunnels for the Running Engineers Sequence were significantly replaced simply because we had really complex light interaction requirements from the Engineers as well as the ‘tunnel activation’ particles that fill the space as the Engineers pass through.”

Some of the scenes involving Fuel VFX required collaboration with other VFX vendors. “We had some shared shots with both MPC and Weta but not a lot,” states Fuel VFX Executive Producer Jason Bath. “The work we were adding to the shots was usually completely different to what they were doing and the effects didn’t need to interact [e.g. MPC would comp the view outside the bridge of the Prometheus, while Fuel would be doing the holotable]. As Fuel had a lot of holographic and volumetric effects we tended to be last in the chain of any shared shots and delivered our work on top of final pre-comps that were pre-approved by Richard.” Paul Butterworth remarks, “The 3D stereoscopic aspect certainly did make life more complicated! Camera tracking, as expected, was much trickier, especially on the Orrery shots where we had CG objects close to camera as well as in the far distance. While most of the pitfalls of working stereoscopically are known, that doesn’t necessarily make them any easier to deal with – the solutions are mainly that you need to dedicate more time to the same tasks.” The technical difficulty was well worth the effort. “Make sure you see it in 3D!” says Bath. “It was a stereoscopic production from start to finish and from the footage we have seen as part of our contribution the stereography is beautiful.”

“In the case of the Linguist and Artifact Presentation, Production gave us some guidelines but wanted us to do a lot of the look development internally,” states Hammerhead VFX Producer Michelle Eisenreich. “We ended up going back and forth with Prologue as they were simultaneously doing look development of a similar effect, until we landed on looks that were related but not identical to each other. In contrast, there was extensive planning that went into the Weyland Briefing, not only for shooting the plates but in the background Mars matte painting. Our supervisor Jamie Dixon flew to London for a few days to work with Richard and Ridley while they shot the scene in the Hangar. They worked with a stereo motion control rig for the moving camera shots, shooting the Hangar set in one pass and Weyland’s office in a second pass. Production did a number of concepts on the look of the Mars terraforming / solar farm in the background. We went through several variations and once a select was made we proceeded to make a stereo matte painting that would work with the camera moves and in stereo.”

“For the Weyland Briefing scene we looked at hologram reference, NASA Mars images, and solar farm images; for the sky and cloud movement for Mars, Ridley had us reference the Orange TV commercial he directed,” explains Eisenreich. “The challenges in this sequence were both technical and creative. The main technical challenge was the moving camera shot where the hologram turns on and Weyland appears. The motion control move was complicated by the fact that it was also in stereo. If there was any shake in the arm of the rig holding the cameras, it was not necessarily the same shake or rotation in both cameras. Also the convergence on the different motion control plates did not match and had to be adjusted. The select on the Weyland element was shot in a different motion control pass than the rest of the elements. To correct this we had to matchmove Weyland, re-project him onto new geometry and render with the hero camera move. The hologram had to turn on in depth as well, requiring extensive, articulate roto for Weyland, Vickers, the dog and furniture in the room in both eyes. Creatively, we did many, many iterations playing with the transparency, the flicker, the chromatic aberration and the cat’s cradle effect before we hit the right balance that resonated with Richard and Ridley. Also of note, spatially the Mars BG appears further in depth than the back wall of the hangar, creating a nice holographic portal effect.”

“The main challenge with the Linguist shots was creating the feeling of depth within the box while not confusing the eye when viewing it,” remarks Michelle Eisenreich. “We tried various levels of detail before we struck the right balance. This was another instance where our work, MPC’s work and Prologue’s work were passed back and forth so we could maintain a related look, as if it came from similar technology, without needing to be identical.” Hammerhead was involved with a scene which has dire consequences for the android David. “In this sequence of shots, stereo issues were the main challenge. We were given a plate with Michael Fassbender lying in position, and a plate with the prosthetic head and torn neck placed as closely matching that position as possible. Of course despite the best efforts of production, the placement wasn’t identical. Normally on a non-stereo project, you can shift, move, and warp the two plates to match properly with relative ease. However, those tricks do not work as easily in stereo. Making the transition between the two pieces seamless in depth was the main challenge and required meticulous attention to detail and much iteration.”

“Richard and Visual Effects Producer Allen [Maris] provided us with concept art and descriptions from Ridley to get started,” states Michelle Eisenreich. “As each facility made progress that Ridley liked, their work would be shared with us and vice versa.” Dealing with 3D complicated the visual effects process. “Doing work in stereo is always more complicated, simply because we’re working with twice the material, from storage and rendering to delivery. It also makes shots that would be fairly straightforward with one eye, such as wire removals, more complex and time consuming. The final review process is also extended since once the work is finalized creatively it goes through a separate, technical review by a Stereographer. Items misaligned by a mere fraction of a pixel can be kicked back when dealing with stereo.” Eisenreich adds, “We are very proud of our work on Prometheus. Despite the challenges it was a great experience working with Richard, Allen [Maris] and Ridley.”

“As we got further into post the shot count grew [from 1000 to 1400] and some of the work needed to be rearranged,” remarks Richard Stammers. “Rising Sun came on board to deal predominantly with a storm sequence; they had a great ability to do good stereo work and picked up a number of other additional shots, like monitor composites and reflection fixes. The remainder of the shots involved smaller sequences and required specific companies we felt were good at tackling certain types of work. A good example was that we used Lola, which is based in Santa Monica, to do some facial enhancements to the Engineer characters that had been shot practically for the most part. The Engineers are huge creatures meant to be eight feet tall. We had worked with an actor who was seven foot one and had an incredible physique, and tried to make him bigger than he actually was; in the process of doing that, the prosthetics enlarged his head but the features of his face remained small. We needed to rebalance the scale of the features on his face and Lola were a great choice for that; they took the actor’s features in the prosthetics and rebalanced them to the size of his head so the eyes got bigger and wider, and scaled around the bridge of his nose. The studio liked what they had done with the prosthetics on the Engineer in the late sequence of the film so much that they wanted to follow that through with the Engineer we see at the beginning of the film.” Michelangelo’s famous sculpture of David served as the inspiration for the alien creators. “A picture of that was on the wall of the Art Department in the first iteration of the Engineer design as a key point of reference that Ridley liked.” Also working on the $130 million production were Luma Pictures, Territory, Prologue and Halon Entertainment, which provided previs.

“Richard gave clear concise notes and was a pleasure to work with,” states Luma Pictures VFX Supervisor Vincent Cirelli. “We received reference of what the design should look like from their art department. Along with design reference, we also received the footage that needed to be projected into our fluid volume.” The major assignment for the California-based VFX facility was the creation of a floating holographic screen. “For scenes in which the actor was partially inside the volume of the hologram, we were faced with creating detailed holdout geometry and matchmoves of the actors, so they would integrate properly within the CG fluid. To create the look of the distortion field, we used FumeFX, which we recently worked with Sitni Sati to implement inside of Maya. However, a very large portion of the overall look development was done inside of Nuke, by one of our very talented senior artists. One of the biggest challenges was integrating a light-emitting hologram with lighting that was shot practically.” A practical element proved to be useful. “I believe the use of a digital projector on-set — which projected an image across the walls and the actors — when combined with our CG light volume created a more interesting effect than if it had been CG alone.”

In regards to maintaining a uniform look with the other VFX vendors, Vincent Cirelli remarks, “We were provided certain elements such as the HUD and various references, but the final look of this particular effect was unique to this scene.” When it comes to operating in the realm of 3D moviemaking, Cirelli believes, “Stereo has complicated effects delivery more than resolution increases have, in my opinion; the reason being that every year our computing power grows, so does our storage, so we are able to accommodate the transfer speeds needed to render and play back larger frame formats. With stereo, however, many of the 2D cheats we used to employ in compositing simply don’t work in stereo. This being said, over the past couple years we’ve developed stereo tools that have allowed us a lot more flexibility, and I predict these tools will become so robust in the near future that stereo integration will be as fluid a workflow as compositing a show in 2D.”

“We came to the job and initial creative meetings with some assumptions,” states Territory Founder/Director David Sheldon-Hicks. “I thought the simple but robust screen designs of Alien were timeless in a way, more engineered than designed. Also the design of iconography and symbols by Ron Cobb was stunning. Ridley, however, had something else in mind; he explained that the freight ship in Alien was always meant to have a low tech feel. The Prometheus was a cutting-edge scientific research facility and as such, should display a more advanced computer visual language. We still referenced Ron Cobb’s work and some of the design elements from the Alien screens, but executed the work in a way that is relevant to our own time in film. Creative briefs were loose but inspiring. Ridley had a background in graphic design so I guess he knew how to brief people like us. Ridley wouldn’t talk about screen and HUD design from other films, but more emotional and artistic references; he has a great knowledge of art and photography and that came through all the time when discussing our screen designs. We were asked to look at using colours and textures from coral reefs. We took this reference of fluid organic nature and applied it not only to colour palettes but to animations and the way we dealt with data visualizations. It was a great springboard for us to do something new and fresh with computer screens for films and Ridley encouraged us to push this loose graphic approach across the entire ship.”

“The biggest challenge was technically achieving the node-based operating system we had devised,” states David Sheldon-Hicks. “We wanted the computer graphics to use lots of connected lines, almost like cables or tentacles. The movements and shapes of these node lines would be affected by the steady undulating rhythm of tabs, windows and widgets. To create these node strands in the way we wanted in After Effects proved to be a little tricky. At the time, we couldn’t find a plugin or work-around that would make it work. We had the option of going into 3D which would have worked, but based on our timelines wasn’t practical. In the end a good friend, Carl Fairweather, built us a plugin that gave us the solution. We had lots of Bezier handle controls that we could parent and weight to other objects’ movements. All our motion designers loved it and he developed it further as we started asking for more features.”
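The node-strand idea Sheldon-Hicks describes can be sketched with a cubic Bezier whose endpoints are pinned to two moving UI windows and whose handles are parented to those endpoints. Carl Fairweather’s After Effects plugin is not public, so this is a from-scratch illustration of the geometry only:

```python
# A node strand as a cubic Bezier: endpoints pinned to two window anchors,
# handles parented to the endpoints (weighted toward the other window, with
# a little droop) so the line flexes organically as the windows drift.

def bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t; points are (x, y) tuples."""
    u = 1.0 - t
    return tuple(
        u**3 * a + 3.0 * u**2 * t * b + 3.0 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def node_strand(win_a, win_b, slack=0.3, samples=16):
    """Sample a strand connecting two window anchors; slack adds droop."""
    dx = win_b[0] - win_a[0]
    handle_a = (win_a[0] + slack * dx, win_a[1] + slack)  # parented to win_a
    handle_b = (win_b[0] - slack * dx, win_b[1] + slack)  # parented to win_b
    return [bezier(win_a, handle_a, handle_b, win_b, i / (samples - 1))
            for i in range(samples)]
```

Re-evaluating the strand every frame with the windows’ new positions is what makes the lines follow the “steady undulating rhythm of tabs, windows and widgets” automatically.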

“Creatively we treated each area of the ship slightly differently,” explains Sheldon-Hicks. “For the bridge and medical screens we used the node-based, floating windows system. The coral reef reference comes in here, informing colour choices and animation treatments. All data visualizations were animated to have an organic rhythm to them, undulating and pulsing as the ship monitors the environment around it. Data often has a beautiful complexity to it, and we imagined the data streaming into a science ship would be immense and abstract. We would brainstorm many designs for each type of display. Not knowing which on-set screens will be shot throws up a challenge. With post screens you’re pretty certain where to put all your effort as you have a narrative and shot to work to. With on-set screens you have no idea if any of it will be shot, and if it is, how close they’ll go in on it. We ran through the script many times, thinking about what the characters are interested in. They’d be talking about weather on the planet’s surface, so we’d research a ton of NASA time-lapse visuals and start to think how our data aesthetic could be applied. We did a lot of visual research for just about every screen. We always strive for the work to be grounded in reality first and then add layers of designed UI afterwards. Otherwise it becomes a big UI mass without any real story of its own. We always researched content first, and then it’s time for the fun and expression, just to keep the work relevant.”

“The social area screens were also interesting, taking a push from Ridley’s interest in fine art such as Paul Klee and Rothko,” reveals David Sheldon-Hicks. “He wanted these to be much less about UI, and more abstract ambient textures. Funnily enough I find these screens work more closely with the original Alien film than a lot of the other stuff we all did. When the original Alien film came out the creative team would have been less constrained by preconceptions of high tech computer systems in general, but specifically in films. I imagine them tackling the problem without the constraint of a backlog of slick holographic UI and ultra-detailed tech to be influenced by. The screens were showing anything from food nutrition to ship oxygen levels. We abstracted this data as much as we could, playing with layering to move away from data-rich designs towards moody, moving textures. As designs they can look a little odd on their own, but when seen through the Dariusz [Wolski] lens they take on a completely different feel.”

“On-set screens are way harder than post screens for many reasons,” observes Sheldon-Hicks. “Technically you have to consider screen resolutions, light bouncing off screen surfaces, flicker if you’re dealing with CRTs and the extremely quick turnaround. All of these technical constraints inform the design approach in the same way the dimensions of a piece of paper set some boundaries for a traditional graphic designer. Ordinarily for film screens we go super high contrast, zero anti-aliasing for sharpness and simple animation loops so that all content can be left to run over multiple takes. We ignored pretty much all of that for this film. We had soft gradients, anti-aliasing, long narrative animations with no loops at all. Dariusz Wolski [Alice in Wonderland] is a particularly talented DOP, but also receptive to all the other creative teams working on the film. We did multiple shooting tests to ensure all the textural work we were putting into our designs worked. Dariusz worked with George Simons [the supervisor for screen graphics on the film], Territory and the rest of the team to make sure we were getting some great shots. Ridley also pushed us to ignore technical constraints, and got us to run tests with projecting onto glass so that he could shoot through the graphics onto the actors. I don’t know how much of a headache it was for MPC and the other post houses to key our projected graphics against the green screen. Sorry!”

“With post screens we get time to consider the interaction happening in shot and have more time to craft the design and animation,” states David Sheldon-Hicks. “This is where I wanted to push the node lines and any 3D work. We used particle clouds to describe the 3D forms, layering up many renders to build translucent colours. The cryo pod was interesting to design for as we had Fassbender’s character, David, interacting with the glass pod surface monitoring Shaw’s dreams. There was obviously no real world reference for this, so we went all out fantasy logic, ‘techie bollocks’ as it is sometimes called. We took all the interesting design elements from the several hundred other screens and applied some simple logic and composition to create a pleasing data visual. In this case, the content we’re focused on as audience and designer is the two actors. There’s no point in trying to upstage their performances or the cinematic moment; it’s a time for the UI to support an idea.”

“We sent some of our early designs out to Fuel VFX [they did an amazing job I must say] who used some of that design language on the holotable,” remarks David Sheldon-Hicks. “Other than that my communication was directly with the VFX producer on the post shots. We trust in the hierarchy that the design language will be consistent throughout. As our work was in shot on-set, it meant the designs would have been seen by all the post houses working on the shots, hopefully leading to some kind of consistency.” 3D was not an issue for the London-based design studio. “We’re a design-focused studio, so we would supply flat animated sequences for the other post houses to comp in. We had plenty of other things to worry about creatively without also taking on comping in stereo on this job. We’ve worked with stereo in the past and it can be an amazing tool, but on this job the majority of the on-set work we completed left the stereo workflow as a consideration for further down the pipeline.” Sheldon-Hicks admires the creative attitude of the filmmaker behind the camera. “Ridley’s method was to free up the artists he worked with to approach the project with open minds and a desire to craft considered visuals. The pleasure in the process and exploration is hopefully evident in the work on film.”

“Pietro Scalia [the film editor for the movie] and Richard Stammers initially asked us to design the ‘cube’ sequences,” states Prologue VFX Producer Unjoo Byars referencing the on board computer, managed by Michael Fassbender’s character David, which is constantly trying to communicate with other life forms. “Ridley, Pietro and Richard were very clear that in order for the designs to work a full understanding of the content and narrative of the scenes was essential. The designs should not be spurious or decorative technical displays but have purpose and meaning. The functioning system and database of imagery were the first considerations in driving the designs; an array of varied imagery of mankind’s cultural, scientific and sociological achievements, and the sum of our global history, had to be contained in a hierarchical display. Ridley communicated his initial visual ideas that were specific to colour, a golden hue, and a gelatinous membrane texture; a constantly changing display that forms through particles like the inverted movement of champagne bubbles. This gave us a good starting point for our initial designs: the practical placement, size and interaction within each scene.”

“There are always many challenges that are both technical and creative,” notes Prologue Creative Director Ahmet Ahmet who had to make sure that the content, composition, and configurations of the ‘Welcome Message Holographic Cube’ did not overshadow the dialogue of the actors. “Creating stereoscopic space that contained other stereoscopic imagery, we worked collaboratively with Pietro and Richard. Pietro edited the scenes, and once the initial visuals were comped in, he worked with us analyzing the cut for the balance of visuals and exposition.” Prologue 3D Design Lead Dong Ho Lee remarks, “There was a lot of visual research for content to find and condense the library of messages from graphical hieroglyphic content – ancient and modern, mathematical formulas, arts, and linguistics; they were all given compositional and space considerations within the stereoscopic cube. There was also design research for holographic displays and operating systems to enhance and inform the choreography of the content.”
Prologue collaborated with the main VFX vendor for the science fiction thriller. “Once the look of the architecture of the cube was established, the shared assets meant that the team could update and animate content within a set look,” says Ahmet. “Some of our shots were shared with MPC in London; they composited some of the other screen materials for the ship, and the shot would then be sent to us for final comping.”

“The Dream Sequence again obviously had specific narrative content but this time we were invited to experiment and explore the visuals of Elizabeth’s dream,” states Ahmet Ahmet. “The scanning conversion of the brain’s neuron activity by the computer was the premise for some of our design approaches for David’s voyeuristic incursion. Ridley and Pietro again referenced some form of particle animations; this was a perfect opportunity to play with the stereoscopic space to offset and add more dimension to the footage.” Prologue 3D CG Supervisor Lee Nelson observes, “Additional challenges were creating an abstract stereo look. We sort of broke the rules of stereo to force stylized perspectives of the depths of the shot. Using a method we call luminance pixel depth [lpd], we used the disparity of the two eyes to create a depth formula that offsets the pixel information in Z-depth; then by mapping the original depth to the luminance pixel depth we got the desired look.”
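Nelson leaves the exact lpd formula unstated, but the idea of remapping a shot’s true depth through pixel luminance before deriving stereo offsets can be sketched roughly. The following is a minimal illustration only, not Prologue’s actual pipeline; the function names, the blend formula, and the `strength` and `max_disparity_px` parameters are all assumptions.

```python
import numpy as np

def luminance_pixel_depth(z_depth, luminance, strength=0.5):
    """Hypothetical 'luminance pixel depth' remap: blend a shot's
    normalized Z-depth with its normalized luminance, so that
    brighter pixels pull toward a different apparent depth,
    producing a stylized, non-physical depth map."""
    # Normalize both inputs to the 0..1 range so they can be blended.
    z = (z_depth - z_depth.min()) / (np.ptp(z_depth) + 1e-8)
    lum = (luminance - luminance.min()) / (np.ptp(luminance) + 1e-8)
    # strength=0 keeps the true depth; strength=1 uses pure luminance.
    return (1.0 - strength) * z + strength * lum

def stereo_offsets(depth, max_disparity_px=12.0):
    """Convert the remapped 0..1 depth into per-pixel horizontal
    offsets (disparity) for shifting pixels in the second eye;
    nearer (lower-depth) pixels get larger offsets."""
    return (1.0 - depth) * max_disparity_px
```

In a real stereo pipeline the offsets would feed a per-pixel warp of one eye’s plate; here they simply demonstrate how a luminance-driven depth map could replace the measured one to stylize the perceived depth of a shot.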

“We were invited to design the Title Sequence, which was always intended to be a homage to Ridley Scott’s original Alien title sequence,” states Ahmet. “The main titles went through several stages. Pietro’s first cut extended the hieroglyphic segmented type to form over the entire opening. We then experimented with various durations and hieroglyphic segmentation of the type in homage to the original title sequence. In parallel, particular emphasis was focused on the look and dimension of the type, creating many renders to find the right balance of subsurface scattering and specular light transparency.” Dealing with 3D required precise work by Prologue. “The stereoscopic production of our shots, creating previews and updating submissions, was very specific. The creation and design considerations in stereo made it both challenging and exciting.”

“We were careful about how we divided up the work so that became less of an issue. That’s always the first point,” states VFX Supervisor Richard Stammers when addressing the issue of maintaining a unified look amongst the various visual effects vendors. “The second point was making sure that technically everybody was singing off of the same song sheet. We had a rigid colour pipeline and that helped things enormously. In terms of the look and design of the work, that was down to my and Ridley’s review process where we made sure it didn’t stray too far from things that had been conceptualized. There were certainly times where we had four or five different visual effects companies doing holographic effects and that alone is a challenge because you’ve got different interpretations. We made sure that certain designs Ridley had approved from one company ended up as a point of reference for some of the other companies.” Conceptual Artist Steven Messing, who had done some pre-production designs for the Art Department, was recruited to be the Visual Effects Art Director. “Ridley thought that was a great idea because he favoured Steven as someone who could visualize the planet exteriors and some of the interior spaces of the alien spaceship.”

“A lot of hard and good quality work is what it boils down to,” remarks Richard Stammers when discussing how to make CG augmentation seamless. “I was meticulous about making sure that we got enough references from where we were shooting. We always took the time to get good lighting and texture references wherever we could. Throughout much of the film nearly every shot had the potential to contain a visual effects element, so we got into the habit of gathering visual effects references every time we shot a scene. Sometimes it was overkill but, to be honest, we ended up with 1400 visual effects shots out of the 2200 shots in the final film, so well over half of the film was visual effects. Making sure that we were covered was definitely a good call. We always got good lighting references and that always makes a big difference to getting photo-real CG work to look good. It was a great inspiration for everybody to know that we’ve got to raise the bar here.” No new technology had to be invented. “For the most part everybody was working with existing tools and refining them to serve a new purpose.”

“Ridley was always trying to shoot as much as possible practically,” reveals Richard Stammers who agreed with the approach adopted by the British filmmaker. “It was a big team effort and we had a good creature team run by Neal Scanlan [Babe]. There’s a sequence where there’s a worm which we call the Hammerpede; one version we did as CG and another was done as live-action, puppeteered underneath the set through rubber gaskets. They puppeteered this snakelike movement with rods underneath the black goo and were able to use their animatronic rigs to rear the head of the snake up out of the puddle. There were a lot of things like that which could easily have fallen into the Visual Effects Department: ‘That’s too difficult. You guys do it.’ But there was so much stuff where we said, ‘No. Hang on. We can do this practically.’ At every stage of the way Neal Scanlan’s team was able to provide a lot of creature, prosthetics and animatronics work that filled huge gaps in what was required in a particular scene. We would work out a plan that would involve a combination of both of us to achieve what Ridley wanted. What’s great is that we were able to intercut between CG, prosthetics or live-action in many of the scenes and you can’t see the difference.”

“We shot it in native stereo,” states Richard Stammers who, along with Ridley Scott, was embarking on 3D filmmaking for the first time. “It was certainly a big learning curve to take on a 3D film of this nature but we had the experience of a fantastic Director of Photography called Dariusz Wolski who had completed Pirates of the Caribbean 4 [2011] prior to coming onto Prometheus; through his great eye we got a comfortable stereo experience. When it came to the holographic effects we were able to get even more immersive.” A standout moment for Stammers is when Michael Fassbender explores the Orrery, as the visual effect was placed right around the camera. “All of that was well considered during the time of the shoot. I worked carefully with Dariusz’s stereographer [James Goldman] to make him aware of what effect was going into every shot because quite often we were shooting an empty plate with nothing in the foreground and the temptation was to go, ‘Well, I’m going to crank up the stereo just so whatever is in the background reads as stereo.’ I would have to go and say, ‘We’re going to put this huge effect right in front of the lens so you need to shoot this in a way that’s going to be suitable for that.’ We would always work off a plan of how we would approach these shots knowing what the end result was going to be once the visual effects were completed.”

“I’m definitely proud of pretty much everything,” says Richard Stammers. “For me, there are always two key things that stand out. One is the Med Pod scene because it’s a great scene and I find it a nice combination of being a bit squeamish and gory. There is a nice reaction there from the audiences and it’s a good combination of great CG and animatronics. I think that works well. Then, as a bigger action sequence, the moment after the Prometheus and the Juggernaut crash, when the Juggernaut comes falling out of the sky, crashing down onto the planet surface, and starts to roll and chase Shaw and Vickers; that scene to me stands out as a great moment. Visually it’s stunning and there is some fantastic audio that plays in it as well. The 3D works nicely in that moment too. Everything seems to fall together and creates a nice, gritty look. There are plenty of things flying around in the atmosphere, debris exploding everywhere, and there’s ash in the air so you get a nice immersive moment of high drama.”

As for the debated cinematic connection between Prometheus and Alien, Richard Stammers believes, “To me the link is the inside of the pilot chamber, seeing the Space Jockey chair coming out.” The visual effects supervisor for the picture adds, “There are a lot of people who have been expecting to see an Alien film even though it’s been quite carefully marketed as not being an Alien prequel, to make sure that people don’t expect to see the classic xenomorph. We’ve seen that creature in so many films that followed the original and all of the sequels. Ridley felt we couldn’t go there again because it’s not scary; people know what that looks like. Ridley had to find a fresher start to be original. It’s not the same spaceship and planet that are explored in the original Alien film. However, it’s the same race of aliens responsible for creating the xenomorph, and we get a little taster of that at the end of the film which provides the origins of what it would look like. There’s that small link but Prometheus works as a standalone film.”

“This to me is the biggest project I’ve worked on to date where I’ve overseen all of the visual effects,” notes Richard Stammers. “I’m not a control freak but I like to understand all of the processes involved and try to get the best out of every aspect of every step. With this one you tend to have to look at the broad strokes of things and rely heavily on the creative people who you asked to take on a role. You delegate that work to other people whom you can trust. All of the visual effects supervisors who worked for the individual visual effects houses provided excellent direction to their teams based on the feedback I was able to give them from what Ridley was able to give me. The process of passing on all of that information worked well and thankfully, I was supported by a great visual effects production team with fantastic coordinators and producers who allowed me to concentrate on what I needed to concentrate on.”

Production stills © 2012 Twentieth Century Fox.

VFX images © 2012 Twentieth Century Fox. All rights reserved. Images courtesy of Twentieth Century Fox, MPC, Weta Digital, Fuel VFX, Hammerhead, Rising Sun Pictures, Luma Pictures, Territory and Prologue.

Visit the official websites for Prometheus, MPC, Weta Digital, Fuel VFX, Hammerhead, Lola VFX, Rising Sun Pictures, Luma Pictures, Territory and Prologue, and be sure to check out Trevor’s Ridley Scott filmmaker profile, Hard To Replicate.

Many thanks to Richard Stammers, Charley Henley, Martin Hill, Paul Butterworth, Michelle Eisenreich, Unjoo Byars, Jason Bath, Ahmet Ahmet, Lee Nelson, Dong Ho Lee and David Sheldon-Hicks for taking the time to be interviewed.

Trevor Hogg is a freelance video editor and writer who currently resides in Canada.
