The Lego Movie (TLM) pre-production was at full steam. A key design directive from directors Phil Lord and Chris Miller was that everything had to be made with LEGO bricks, everything. Production Supervisors, R&D and TDs at Animal Logic figured out a very neat and efficient pipeline and process for asset creation, modeling, shading and layout. Production had to be efficient because the budget was rather small, particularly tight in light of the scale of such an ambitious project. The problem was rendering it. The REYES renderer had been a great powerhouse up to that point. REYES is a streaming architecture: it was designed to process scene geometry once, and in order, so that it could render a lot of it within a very small memory budget. The REYES geometry and shading representation was bloated compared to other architectures, precisely because the renderer was supposed to retain only a very small amount of it in memory at any given time. Time passed, requirements changed. REYES was an aging architecture; there was only so much gas left in that tank. As far as we know, only path tracing can deliver a certain quality, and tracing rays requires the entire geometry data set to be in memory. If the future was more raytracing, then the REYES rendering architecture was at the end of its life.
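To make that memory trade-off concrete, here is a toy Python sketch, entirely illustrative and unrelated to either renderer's actual code: a streaming REYES-style pass can load, shade and discard each primitive in turn, while a ray tracer must keep the whole scene resident because any ray can hit any surface.

```python
def stream_render(primitives):
    """REYES-style streaming: process primitives one at a time,
    so peak residency stays at a single primitive."""
    peak = 0
    resident = []
    for prim in primitives:
        resident.append(prim)           # load one primitive
        _ = f"shaded({prim})"           # dice + shade it
        peak = max(peak, len(resident))
        resident.pop()                  # discard it immediately
    return peak

def raytrace_render(primitives):
    """Ray tracing: the full scene must stay in memory so that
    secondary rays can intersect any surface."""
    resident = list(primitives)         # everything loaded up front
    for prim in resident:
        _ = f"traced({prim})"
    return len(resident)

scene = [f"brick_{i}" for i in range(10_000)]
assert stream_render(scene) == 1           # bounded residency
assert raytrace_render(scene) == 10_000    # whole scene resident
```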
TLM was really running ahead of the trend. REYES had no real support for instancing. Photorealistic plastic is fundamentally a mix of subsurface scattering, reflections and refractions. Point-based techniques for indirect illumination could not cope with the sheer geometric density of landscapes and cities made of LEGO geometry. The options? Go back to modeling and texturing the traditional way, where models are made of simple surfaces, textured and displaced to look like interconnected bricks. The scale of the project and the production budget didn't allow for that, not even in the ballpark. A substantial portion of the production budget was dedicated to creating and texturing a complete catalog of individual bricks, reproduced in exquisite detail. Assets and props were budgeted to be designed and assembled in hours with minimal to no custom texturing work needed, instead of assets modeled in days and shaded in weeks. The dynamic look of the movie, where every single brick could be dislodged from the stage and reconnected as part of the animation, also made the traditional approach an incompatible proposition.
I was brought into the project at a crisis point. The crew could achieve the required look only at a tiny scale. The pre-production lighting crew was scrambling to come up with a practical approach to shot lighting: while the aesthetic part of the cinematography had already been figured out, the technique to deliver it had not. How to split scenes into many layers just to make things render? And trust me, these were savvy people. The look was gorgeous, but at hundreds of hours per frame (wall-clock). The biggest challenge with the REYES renderer was producing convincing refractions at scale on LEGO geometry. Clear bricks were common and their internal parts required many ray bounces to render convincingly. In a raytracer, overall memory usage is not affected by the maximum depth of a light path, but on a REYES engine that was not the case: REYES with complex shading used a huge amount of memory at each extra ray depth, because raytracing was part of grid shading and it was recursively stacking tens of GBs in the shading grids. To reduce memory pressure, the rendering bucket size and shading grid size had to be very small, drastically impacting the efficiency of the architecture. A ray depth of 20 was common to achieve the desired result. Just one of those clear bricks in the scene was enough to make a shot a nightmare to render. This was why shots had to be split into passes covering extremely small subsets: less geometry in memory means more of it available for shading grids. It was a setup impractical to manage and composite back together, certainly not at that weekly shot delivery quota. With the production time allowances of 6-7 years earlier and current hardware it might have been possible… still a nightmare, but possible. Engineers in the R&D departments were hard at work trying different ways to fake refraction; they were brave and creative, and they spent a lot of time on the challenge, but unfortunately none of it produced visually convincing results and the tech had to be abandoned.
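A back-of-the-envelope sketch of that difference, with made-up numbers rather than production measurements: recursive grid shading keeps every parent grid alive while a deeper bounce is being shaded, so memory grows linearly with ray depth, while an iterative path tracer reuses a single path state no matter how deep it goes.

```python
# Illustrative numbers only: an assumed per-grid working set,
# not a measurement from the TLM pipeline.
GRID_BYTES = 4 * 1024 * 1024

def reyes_refraction_memory(depth):
    """Recursive grid shading: each extra bounce shades a fresh grid
    while all parent grids stay live, so memory grows with depth."""
    if depth == 0:
        return GRID_BYTES
    return GRID_BYTES + reyes_refraction_memory(depth - 1)

def path_tracer_memory(depth, ray_state_bytes=256):
    """Iterative path tracing: one path state is extended in place,
    so memory is independent of depth."""
    state = ray_state_bytes
    for _ in range(depth):
        pass  # extend the path; nothing new is allocated
    return state

# At the ray depth of 20 used on clear bricks, the recursive scheme
# holds 21 grids at once; the iterative one still holds one state.
assert reyes_refraction_memory(20) == 21 * GRID_BYTES
assert path_tracer_memory(20) == path_tracer_memory(1)
```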
We had brainstorming sessions where we laid on the table everything we could think of. Nothing was buying us significant margin. All the options were falling too short, pointing to a failure point just a little further down the line. We considered trying another renderer, but that would have meant re-pipelining, typically something that takes 2-4 years to deploy. We had a few months to figure out a solution before shot lighting was expected to ramp up. The only option left was outlandish, and it was from my list: to adopt Glimpse as a ray server for the REYES engine. Not a new concept in our industry, but certainly something not seen in the wild for almost a decade. The idea was to compute shadows, indirect illumination and subsurface scattering by embedding Glimpse inside RenderMan as a plugin. Basically the REYES engine would compute primary visibility and shading, which was what it excelled at, while letting the modern raytracer in Glimpse do the heavy lifting. Thankfully RenderMan was a very flexible engine, allowing that sort of massive hacking.
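A minimal sketch of the ray-server split, under assumed names (RayServer, shade_grid and the constant indirect term are hypothetical, invented for illustration, not the actual Glimpse API): the REYES side keeps primary visibility and surface shading, and delegates occlusion and indirect-illumination queries to the ray tracer, which holds the full scene.

```python
class RayServer:
    """Stands in for Glimpse: owns the full scene and answers
    ray queries on behalf of the REYES engine."""
    def __init__(self, blockers=frozenset()):
        self.blockers = blockers  # (point, light) pairs in shadow

    def occluded(self, point, light):
        """Shadow-ray query: is this light blocked at this point?"""
        return (point, light) in self.blockers

    def indirect(self, point):
        """Indirect-illumination query (a constant placeholder here)."""
        return 0.25

def shade_grid(grid_points, lights, ray_server):
    """REYES-side shading of one grid: direct lighting is gated by
    shadow queries, and indirect light is fetched from the server."""
    out = []
    for p in grid_points:
        direct = sum(0.0 if ray_server.occluded(p, l) else 1.0
                     for l in lights)
        out.append(direct + ray_server.indirect(p))
    return out

server = RayServer(blockers=frozenset({("p0", "key")}))
assert shade_grid(["p0", "p1"], ["key"], server) == [0.25, 1.25]
```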
I didn’t know Luke Emrose at the time. He joined the company as a TD to work on TLM after having developed production tech for Happy Feet 2. During the brainstorming session Luke volunteered to bridge Glimpse to RenderMan in a pilot project named FrankenGlimpse. Suddenly there were two of us! What I described earlier as “the two unlikely underdogs” who pulled the project off.
As you can imagine there was a lot of skepticism in production; plenty of people saw reasons why this was not a good idea. We convinced production to give us 2 weeks of time to complete a feasibility test. The biggest problem was that Glimpse was only capable of rendering from within Maya; it was not an Ri-compliant renderer (unlike the well-known ray servers of another age, such as the Blue Moon Rendering Tools, which could read RIB files). The vast majority of the LEGO geometry was generated in Ri Procedurals, following a rather traditional RenderMan pipeline… all data Glimpse had no access to, in a pipeline that couldn’t be changed in time.
Luke began working on a really complex RiFilter to intercept geometry as RenderMan was ingesting it and pass it on to Glimpse. I started working on a low-level renderer API in Glimpse for Luke to use. I didn’t want to compromise the live rendering characteristics of the Glimpse core design by shaping it too much to Ri quirkiness, so a low-level API to the renderer was necessary. The two weeks budgeted were not nearly enough to complete the pilot project, and we asked for more time. I needed some visual material to demonstrate, some LEGO production assets rendered in Glimpse, to convince production to commit to the effort and buy us the time we needed to prove that our solution would work. What really sold Glimpse to production was a picture I made of Emmet’s bedroom (the scene at the very beginning of the movie); in the room there was an Emmet entirely made of clear plastic. The frame took less than a minute to complete, with very convincing-looking clear plastic. If you think about it, it wasn’t much, but at that moment, in that pipeline and with those assets it was, it definitely was. The lighting supervisor was blown away. That was the sign production needed. The development project was officially green-lit.
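The interception idea can be sketched like this (a hypothetical, heavily simplified stand-in for the real RiFilter, which had to handle far more than a single geometry call): the filter sits between the scene description and the REYES engine, copying every primitive into Glimpse before deciding whether to pass it downstream. A capture-only mode, where REYES receives no geometry at all, mirrors what was eventually used to render full frames in Glimpse alone.

```python
class GeometryFilter:
    """Hypothetical Ri-style filter: intercepts geometry on its way
    into the REYES engine and mirrors it into the ray tracer."""
    def __init__(self, reyes_scene, glimpse_scene, capture_only=False):
        self.reyes_scene = reyes_scene      # downstream REYES engine
        self.glimpse_scene = glimpse_scene  # the ray-server's scene
        self.capture_only = capture_only    # full-Glimpse frame mode

    def declare_geometry(self, prim):
        self.glimpse_scene.append(prim)     # Glimpse always gets it
        if not self.capture_only:
            self.reyes_scene.append(prim)   # REYES still shades it

# Hybrid mode: both engines see the geometry.
reyes, glimpse = [], []
GeometryFilter(reyes, glimpse).declare_geometry("brick_2x4")
assert reyes == ["brick_2x4"] and glimpse == ["brick_2x4"]

# Capture-only mode: REYES renders black frames quickly,
# Glimpse does all the light transport.
reyes, glimpse = [], []
GeometryFilter(reyes, glimpse, capture_only=True).declare_geometry("brick_2x4")
assert reyes == [] and glimpse == ["brick_2x4"]
```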
We continued for months, Luke and me, sitting side by side at the edge of the lighting department, bouncing ideas off each other day in and day out. Luke was finessing the never-ending corner cases of the RiFilter and figuring out how to make the two renderers coexist in memory, avoiding a two-step process of data conversion plus binary dump followed by a second scene reload and the actual rendering. I was implementing subdivision surfaces, texture support and a C++ replica of the rather complex LEGO material. It was great to have a partner in crime. Everything I couldn’t do he could do better, and vice versa. Soon we deployed Glimpse to production and shot lighting started to flow.
We figured we could do a lot more: the new tech allowed the R&D pipeline TDs to switch a substantial chunk of the modeling, animation and tech-check review renders over to Glimpse. We made special integrators able to render a predictive look for any asset without any lighting rig. With the new tech, animation reviews were taking 15 seconds per frame to render, while the scene loading time was much, much longer than that, highlighting how the bottlenecks were drifting elsewhere in the software stack. Still it was a massive improvement, freeing up farm quota now made available to final lighting.
We had time and we could improve our tech. Subsurface scattering was the next big challenge. The polygonal density of LEGO bricks was producing immense point clouds, yet geometry in the distance couldn’t sustain the point density needed to produce any scatter at a given mean free path. Diffuse approximations weren’t giving great results either, especially with illumination from the back, and on thin LEGO plates the scatter was completely missing. So we set out to replace more of the shading calculations by adding more features to Glimpse. A few other engineers were brought in for short initiatives, or to crack the math of the inversion of some sampling functions. A bit at a time we replaced more and more of the REYES renderer with the Glimpse back-end. It was not just the render time that was positively impacted; the render quality went up too. We kept on improving our tech until eventually Glimpse could produce final frames of LEGO assets. This was achieved by letting the RiFilter capture the entire scene data, preventing any of it from loading in the REYES renderer, which in turn produced black frames quickly, while Glimpse did all the light transport calculations and saved its own frame buffers, hardcoded AOVs and all. In the last few weeks of shot lighting we were still adding features to reach that milestone. A handful of the complex shots in the final battle sequence were rendered entirely in Glimpse. The quality in the details was good enough that the art director couldn’t really tell which was which during digital dailies.
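Why diffusion-style approximations break on thin plates can be reduced to a rule of thumb (the threshold below is my own illustrative assumption, not a published constant): classical diffusion assumes the medium is many scattering mean free paths thick, which a thin LEGO plate lit from behind simply isn't.

```python
# Rule-of-thumb validity check for a diffusion-style SSS approximation.
# The minimum thickness (in mean free paths) is an assumption made up
# for illustration; real criteria depend on the scattering model used.
def diffusion_valid(thickness, mean_free_path, min_mfps=3.0):
    """True when the slab is thick enough, measured in mean free
    paths, for a diffusion approximation to be plausible."""
    return thickness / mean_free_path >= min_mfps

assert diffusion_valid(10.0, 1.0) is True    # chunky brick: fine
assert diffusion_valid(1.5, 1.0) is False    # thin plate: breaks down
```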
Project delivered on time and on budget: TLM, an unexpected pop culture phenomenon.
The end of the project didn’t stop Luke and me. The next production was likely not a LEGO production, and I knew no one in the lighting crew wanted to go back to the previous tech. On the flip side, FrankenGlimpse was too hacked and ad hoc; it wouldn’t have survived a second production. Lighters and surfacing artists working on TLM didn’t experience interactive rendering because all the geometry processing was still going through RIB and RiFilters, which as an interface weren’t really capable of the kind of interactivity the new renderer was capable of. We knew we had a much more promising tech just waiting to be deployed, and we needed time and resources to pull it off. It is one thing to hardcode a renderer to deliver within a strict set of requirements, another to make something ready for any production need. We needed more engineers.
In the next post I will tell you about some of the events that followed, from FrankenGlimpse to a fully featured production renderer.