Life stories: the Glimpse Renderer – part 2

The Lego Movie (TLM) pre-production was at full steam. A key design directive from directors Phil Lord and Chris Miller was that everything had to be made with LEGO bricks, everything. Production Supervisors, R&D and TDs at Animal Logic figured out a very neat and efficient pipeline and process for asset creation, modeling, shading and layout. Production had to be efficient because the budget was rather small, particularly tight in light of the scale of such an ambitious project. The problem was rendering it. The REYES renderer had been a great powerhouse up to that point. REYES is a streaming architecture: it was designed to process scene geometry once, and in order, so that it could render a lot of it within a very small memory budget. Its geometry and shading representation was bloated compared to other architectures, precisely because the renderer was supposed to retain only a very small amount of it in memory at any given time. Time passed, requirements changed. REYES was an aging architecture; there was only so much gas left in that tank. As far as we know, only path tracing can deliver a certain level of quality, and tracing rays requires the entire geometry data set to be up in memory. If the future was more raytracing, then the REYES rendering architecture was at the end of its life.

TLM was really running ahead of the trend. REYES had no real support for instancing. Photorealistic plastic is fundamentally a mix of subsurface scatter, reflections and refractions, and point-based techniques for indirect illumination were not coping with the sheer geometric density of landscapes and cities made of LEGO geometry. The options? Go back to modeling and texturing the traditional way, where models are made of simple surfaces, textured and displaced to look like interconnected bricks. The scale of the project and the production budget didn’t allow for that, not even in the ballpark. A substantial portion of the production budget was dedicated to creating and texturing a complete catalog of individual bricks, reproduced in exquisite detail. Assets and props were budgeted to be designed and assembled in hours with minimal to no custom texturing work needed, instead of assets modeled in days and shaded in weeks. The dynamic look of the movie, where every single brick could be dislodged from the stage and reconnected as part of the animation, also made that an incompatible proposition.

I was brought into the project at a crisis point. The crew could achieve the required look only at a tiny scale. The pre-production lighting crew was scrambling to come up with a practical approach to shot lighting: while the aesthetic part of the cinematography had already been figured out, the technique to deliver it had not. How to split scenes into many layers just to make things render, and trust me, these were savvy people. The look was gorgeous, but at hundreds of hours per frame (wall-clock). The biggest challenge with the REYES renderer was producing convincing refractions at scale on LEGO geometry. Clear bricks were common and their internal parts required many ray bounces to render convincingly. The memory usage of a raytracer is largely unaffected by the maximum depth of a light path, but on a REYES engine that was not the case: REYES with complex shading used a huge amount of memory at each extra ray depth. Raytracing was part of grid shading, and it was recursively stacking tens of GBs in the shading grids. To reduce memory pressure, the rendering bucket size and shading grid size had to be very small, drastically impacting the efficiency of the architecture. A ray depth of 20 was common to achieve the desired result. Just one of those clear bricks in the scene was enough to make a shot a nightmare to render. This was why shots had to be split into passes covering extremely small subsets: less geometry in memory means more of it available for shading grids. A setup impractical to manage and composite back together, not at that weekly shot delivery quota. With the production time allowances of 6-7 years earlier combined with current hardware it might have been possible… still a nightmare, but possible. Engineers in the R&D department were hard at work trying different ways to fake refraction. They were brave and creative and spent a lot of time on the challenge, but unfortunately none of it produced visually convincing results and the tech had to be abandoned.
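To give a feel for the scaling problem, here is a back-of-envelope sketch with made-up numbers (grid size, per-point footprint and grid counts are illustrative assumptions, not production figures): if every extra bounce keeps another generation of shading grids alive, memory grows roughly linearly with ray depth, and at depth 20 it adds up fast.

```cpp
#include <cstdio>

int main() {
    // All figures below are illustrative assumptions, not measurements.
    const double pointsPerGrid = 16 * 16;   // one REYES shading grid
    const double bytesPerPoint = 8 * 1024;  // P, N, derivatives, shader state, AOVs...
    const double gridsInFlight = 512;       // grids alive across buckets/threads
    const int    maxRayDepth   = 20;        // the depth clear bricks needed

    // If each bounce re-shades on fresh grids while the parent grids are
    // still held, the footprint stacks up once per ray depth.
    const double gb = 1024.0 * 1024.0 * 1024.0;
    double perDepthGB = pointsPerGrid * bytesPerPoint * gridsInFlight / gb;
    printf("~%.2f GB per ray depth, ~%.0f GB at depth %d\n",
           perDepthGB, perDepthGB * maxRayDepth, maxRayDepth);
    return 0;
}
```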

We had brainstorming sessions where we laid on the table everything we could think of. Nothing was buying us significant margin. All the options fell too short, pointing to a failure point just a little further down the line. We considered trying another renderer, but that would have meant re-pipelining, typically something that takes 2-4 years to deploy. We had a few months to figure out a solution before shot lighting was expected to ramp up. The only option left was outlandish and it was from my list: to adopt Glimpse as a ray-server for the REYES engine. Not a new concept in our industry, but certainly something not seen in the wild for almost a decade. The idea was to compute shadows, indirect illumination and subsurface scatter by embedding Glimpse inside RenderMan as a plugin. Basically the REYES engine would compute primary visibility and shading, which was what it excelled at, while letting the modern raytracer in Glimpse do the heavy lifting. Thankfully RenderMan was a very flexible engine, allowing that sort of massive hacking.
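Conceptually the ray-server split looks like the sketch below. Every name here is invented for illustration (RayServer, trace and so on are not the actual Glimpse or RenderMan interfaces): the REYES side keeps primary visibility and direct surface shading, while any query that needs to follow a ray through the full scene is forwarded to the embedded path tracer, which is the one holding all the geometry in memory.

```cpp
struct Vec3 { float x, y, z; };

struct RayResult {
    Vec3 radiance;  // light carried back along the traced ray
    bool hit;
};

// Hypothetical interface the embedded raytracer exposes to the host
// renderer. None of these names are the real Glimpse API.
class RayServer {
public:
    virtual ~RayServer() = default;
    // Full path-traced transport from a shading point, scene fully in memory.
    virtual RayResult trace(const Vec3& origin, const Vec3& dir, int maxDepth) = 0;
    // Shadow/occlusion query toward a light.
    virtual float occlusion(const Vec3& point, const Vec3& lightDir) = 0;
};

// Called from the REYES side per shading sample: primary visibility and
// direct shading stay in REYES, secondary transport is delegated.
Vec3 shadeSecondary(RayServer& server, const Vec3& P, const Vec3& R)
{
    // e.g. a refraction continuation: path traced in Glimpse instead of
    // recursing inside the REYES shading grids.
    RayResult r = server.trace(P, R, /*maxDepth=*/20);
    return r.hit ? r.radiance : Vec3{0.0f, 0.0f, 0.0f};
}
```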

I didn’t know Luke Emrose at the time. He had joined the company as a TD to work on TLM after developing production tech for Happy Feet 2. During the brainstorming session Luke volunteered to bridge Glimpse to RenderMan in a pilot project named FrankenGlimpse. Suddenly there were two of us! The “two unlikely underdogs” I described early on, who pulled the project off.

As you can imagine there was a lot of skepticism in production about whether this was a good idea. We convinced production to give us two weeks to complete a feasibility test. The biggest problem was that Glimpse was only capable of rendering from within Maya; it was not a Ri-compliant renderer (unlike the known ray-servers from another age, such as the Blue Moon Rendering Tools, which could read RIB files). The vast majority of the LEGO geometry was generated in Ri Procedurals, following a rather traditional RenderMan pipeline… all data Glimpse had no access to, in a pipeline that couldn’t be changed in time.

Luke began working on a really complex RiFilter to intercept geometry as RenderMan was ingesting it and pass it on to Glimpse. I started working on a low-level renderer API in Glimpse for Luke to use. I didn’t want to compromise the live rendering characteristics of the Glimpse core design by shaping it too much around Ri quirkiness, so a low-level API to the renderer was necessary. The two weeks budgeted were not nearly enough to complete the pilot project and we asked for more time. I needed some visual material to demonstrate, some LEGO production assets rendered in Glimpse, to convince production to commit to the effort and buy us the time we needed to prove that our solution would work. What really sold Glimpse to production was a picture I made of Emmet’s bedroom (the scene at the very beginning of the movie); in the room there was an Emmet made entirely of clear plastic. The frame took less than a minute to complete, with very convincing-looking clear plastic. If you think about it, it wasn’t much, but at that moment, in that pipeline and with those assets, it was, it definitely was. The lighting supervisor was blown away. That was the sign production needed. The development project was officially green-lit.
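Reduced to its bones, the interception pattern looked conceptually like this. The real PRMan Rif API has far more entry points and different signatures, and the Glimpse ingest call is a hypothetical stand-in; this is just the shape of the idea: sit between the RIB stream / Ri Procedurals and the renderer, mirror each geometry call into Glimpse, then let REYES ingest it as usual.

```cpp
#include <vector>

// Stand-ins for Ri types and the Glimpse ingest API (all names assumed).
using RtInt   = int;
using RtFloat = float;

namespace glimpse {
    // Hypothetical low-level scene ingest entry point; a stub here,
    // standing in for whatever appends to the raytracer's scene.
    inline void addPolygons(const std::vector<RtInt>& nverts,
                            const std::vector<RtInt>& verts,
                            const std::vector<RtFloat>& P)
    { (void)nverts; (void)verts; (void)P; }
}

// One interception hook, in the spirit of a Rif geometry callback: the
// filter sees each PointsPolygons-style call before the renderer does,
// mirrors the data into Glimpse, then decides whether to forward it.
bool onPolygons(const std::vector<RtInt>& nverts,
                const std::vector<RtInt>& verts,
                const std::vector<RtFloat>& P)
{
    glimpse::addPolygons(nverts, verts, P);  // copy into the raytracer
    return true;                             // forward to REYES unchanged
}
```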

We continued for months, Luke and I, sitting side by side at the edge of the lighting department, bouncing ideas off each other day in and day out. Luke was finessing the never-ending corner cases of the RiFilter and figuring out how to make the two renderers coexist in memory, avoiding a two-step process of data conversion plus binary dump followed by a second scene reload and the actual rendering. I was implementing subdivision surfaces, texture support and a C++ replica of the rather complex LEGO material. It was great to have a partner in crime. Everything I couldn’t do he could do better, and vice versa. Soon we deployed Glimpse to production and shot lighting started to flow.
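Of the plastic ingredients, the clear dielectric response is the simplest to show. Below is a toy sketch of a smooth clear material (the actual LEGO material was far more layered than this; Vec3, the sampling entry point and the constants are all mine): picking between mirror reflection and refraction by the Fresnel term is the core of what makes clear bricks read as plastic.

```cpp
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };
static Vec3 operator*(float s, const Vec3& v) { return {s*v.x, s*v.y, s*v.z}; }
static Vec3 operator-(const Vec3& a, const Vec3& b){ return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Schlick's approximation to the Fresnel reflectance of a dielectric.
float fresnelSchlick(float cosTheta, float ior) {
    float r0 = (1.0f - ior) / (1.0f + ior);
    r0 *= r0;
    return r0 + (1.0f - r0) * std::pow(1.0f - cosTheta, 5.0f);
}

// Sample one outgoing direction for a smooth clear dielectric.
// wo: unit direction toward the viewer, n: unit surface normal.
Vec3 sampleClearPlastic(const Vec3& wo, const Vec3& n, float ior,
                        std::mt19937& rng)
{
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    float cosTheta = dot(wo, n);
    float eta = 1.0f / ior;  // assume the ray is entering the plastic

    // Total internal reflection check via Snell's law.
    float sin2T = eta * eta * (1.0f - cosTheta * cosTheta);
    if (sin2T >= 1.0f || u01(rng) < fresnelSchlick(cosTheta, ior)) {
        return 2.0f * cosTheta * n - wo;            // mirror reflection
    }
    float cosT = std::sqrt(1.0f - sin2T);
    return (eta * cosTheta - cosT) * n - eta * wo;  // refraction
}
```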

We figured we could do a lot more: the new tech allowed the R&D pipeline TDs to switch a substantial chunk of the modeling, animation and tech-check review renders over to Glimpse. We made special integrators able to render a predictive look for any asset without any lighting rig. With the new tech, animation reviews were taking 15 seconds per frame to render, while the scene loading time was much, much longer than that, highlighting how the bottleneck was drifting somewhere else in the software stack. Still, it was a massive improvement, freeing up farm quota now made available to final lighting.
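A “predictive look” integrator can be as simple as the sketch below (names and constants invented, not the actual Glimpse integrators): shade every hit from the eye with a headlight term plus a small ambient floor, so review renders look consistent without a single light in the scene.

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };
static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 operator*(float s, const Vec3& v) { return {s*v.x, s*v.y, s*v.z}; }

// What the tracer hands back for a primary hit.
struct Hit { Vec3 albedo; Vec3 n; };

// Review shading: no lighting rig, just a camera headlight plus a flat
// ambient floor so unlit areas never go fully black.
Vec3 shadeReview(const Hit& hit, const Vec3& toEye)
{
    float headlight = std::max(0.0f, dot(hit.n, toEye));
    const float ambient = 0.15f;  // illustrative constant
    return (headlight + ambient) * hit.albedo;
}
```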

We had time and we could improve our tech. Subsurface scatter was the next big challenge. The polygonal density of LEGO bricks was producing immense point clouds, yet geometry in the distance couldn’t reach the point density needed to produce any scatter for the given mean free path. Diffuse approximations weren’t giving great results either, especially with illumination from the back, and on thin LEGO plates scatter was completely missing. So we approached replacing more of the shading calculations by adding more features to Glimpse. A few other engineers were brought in for short initiatives, or to crack the math of the inversion of some sampling functions. A bit at a time we replaced more and more of the REYES renderer with the Glimpse back-end. It was not just the render time that was positively impacted; the render quality went up too. We kept on improving our tech until eventually Glimpse could produce final frames of LEGO assets. This was achieved by letting the RiFilter capture the entire scene data, preventing any of it from loading in the REYES renderer, which in turn produced black frames quickly, while Glimpse did all the light transport calculations and saved its own frame buffers, hardcoded AOVs and all. In the last few weeks of shot lighting we were still adding features to reach that milestone. A handful of the complex shots in the final battle sequence were rendered entirely in Glimpse. The quality in the details was good enough that the art director couldn’t really tell which one was which during digital dailies.
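The “black frames” trick is easy to sketch, again with invented stand-ins for the Rif layer and the Glimpse ingest API, as in the earlier sketch: the filter mirrors everything into Glimpse as before, but stops forwarding geometry to REYES, which then races through an empty scene while Glimpse writes the real frame buffers.

```cpp
#include <vector>

using RtInt   = int;
using RtFloat = float;

namespace glimpse {
    // Hypothetical low-level ingest entry point (stub), as before.
    inline void addPolygons(const std::vector<RtInt>& nverts,
                            const std::vector<RtInt>& verts,
                            const std::vector<RtFloat>& P)
    { (void)nverts; (void)verts; (void)P; }
}

// Capture-only variant of the geometry hook: Glimpse receives the whole
// scene, REYES receives nothing and produces a black frame in seconds.
bool onPolygonsCaptureOnly(const std::vector<RtInt>& nverts,
                           const std::vector<RtInt>& verts,
                           const std::vector<RtFloat>& P)
{
    glimpse::addPolygons(nverts, verts, P);  // full scene goes to Glimpse
    return false;                            // never forward to REYES
}
```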

Project delivered on time and on budget: TLM, an unexpected pop culture phenomenon.

The end of the project didn’t stop Luke and me. The next production was likely not going to be a LEGO production, and I knew nobody in the lighting crew wanted to go back to the previous tech. On the flip side, FrankenGlimpse was too hacked and ad hoc; it wouldn’t have survived a second production. Lighters and surfacing artists working on TLM didn’t experience interactive rendering because all the geometry processing was still going through RIB and RiFilters, which as an interface weren’t really capable of the type of interactivity the new renderer was capable of. We knew we had much more promising tech just waiting to be deployed, and we needed time and resources to pull it off. It is one thing to hardcode a renderer to deliver within a strict set of requirements, another to make something ready for any production need. We needed more engineers.

In the next post I will tell you about some of the events that followed, from FrankenGlimpse to a fully featured production renderer.

5 thoughts on “Life stories: the Glimpse Renderer – part 2”

  1. These stories are really interesting. Can’t wait for the next part! Just out of curiosity, how do you think the final version of Glimpse (just before you joined Pixar) stacked up against RenderMan after it moved to RIS? Of course, RM is built by some of the best minds in the industry but by then did Glimpse still do something different that RenderMan lacked?

    1. Hi there, when I left Animal Logic (and Glimpse) in 2015, Glimpse and RenderMan were two completely different beasts. RenderMan was not interactive yet, and the new RIS architecture in R20 was still very young and not feature-complete like it is today. So at that point in time, Glimpse was probably more capable for the subset of features requested by production at Animal Logic. Had another studio adopted custom tech such as a proprietary renderer, they would have found all sorts of kinks and corners that would have made the tech not good for them. It’s kind of like having a custom-made piece of furniture: it is perfect for you and your kitchen, it is durable and very high quality; but if you move it to somebody else’s house it may not fit, and they may not find it ergonomic. 5 years have passed, Glimpse and RenderMan have both evolved and matured at different paces; without a direct comparison and fresh data at hand, who knows how they would compare today.

  2. I am really loving these posts – as someone who has followed a similar path of being thrown into situations where having to solve challenging production problems leads to unplanned/fortuitous changes in career direction, I find they really resonate, and it is truly inspiring to rise to the challenge of doing the impossible! As I find myself wanting to code more and more in the rendering and graphics domain, what I struggle with most is the underlying math concepts. I’m curious what your path was to (re)learn what sometimes feels like an insurmountable domain. Did you learn the math ad hoc as it became necessary, or take a more structured approach? Thanks for taking the time to post these!

    1. Hi Gary, structured learning is always good when you feel you need to build a foundation in something you know practically nothing about. Take calculus for example: you need to understand it in order to read most papers related to graphics. Then there is a chance you forget most of it because you may not practice it frequently enough (it happens to me) and have to relearn bits and pieces again and again. But there are so many fields at the intersection of photo-realistic rendering that I find it close to impossible to approach learning in a structured way while also carrying on with your daily job and other aspects of life. So you may end up learning the bits and pieces that you need, when you need them.
