
Digital Pyrotechnics at Industrial Light and Magic


March 26, 2014, GPU Technology Conference, San Jose, CA—Olivier Maury of Industrial Light and Magic described how the studio uses and modifies its tools for generating digital pyrotechnics. Plume is a GPU-based tool that provides integrated fluid simulation and rendering.

The tool has been used to produce some 6,000 to 7,000 Plume effects for smoke, fire, dust, and other physical explosion effects. The improved performance reduces iteration times and increases productivity. Earlier versions were CPU-based, could only produce partial particle renders, and took too long. By moving the calculations to GPUs, the new pipeline can simulate and render all of the elements of an explosion or other pyrotechnic effects.

The tool set has a constant compute pattern, with linear memory for the simulation data and the texture cache for data locality. The overall design tried to balance simplicity with performance. The solvers use a shaped grid to account for changes in velocity and temperature as the simulations progress.
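As a rough illustration of the linear-memory layout, the sketch below stores each solver field as a flat array indexed by a simple 3D-to-1D mapping. The resolution and field names are assumptions for illustration; Plume's actual data layout is not public.

    # Minimal sketch of storing solver fields in linear (flat) memory.
    # Grid resolution and field names are illustrative only.
    import numpy as np

    NX, NY, NZ = 128, 128, 128          # voxel resolution of the simulation grid

    def flat_index(i, j, k):
        """Map a 3D cell coordinate to an offset in linear memory."""
        return (k * NY + j) * NX + i

    # One flat array per field, matching a linear-memory layout.
    velocity_u = np.zeros(NX * NY * NZ, dtype=np.float32)
    velocity_v = np.zeros(NX * NY * NZ, dtype=np.float32)
    velocity_w = np.zeros(NX * NY * NZ, dtype=np.float32)
    temperature = np.zeros(NX * NY * NZ, dtype=np.float32)

    # Example: seed a hot cell in the middle of the grid.
    temperature[flat_index(NX // 2, NY // 2, NZ // 2)] = 1000.0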

The inputs to the solvers are Python sources, geometric primitives, and a particle grid. The multi-grid pressure solver uses a small grid to minimize boundary issues and differences. The artists' Python code runs on the GPU: it is parsed into an abstract syntax tree and lowered to byte code that is fed to the CUDA kernel. This code is used for turbulence, reheating, and local field modifications. In addition, there is an iterated orthogonal projector for collision detection.
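The front-end idea of parsing artist Python into an abstract syntax tree and lowering it to byte code can be sketched with CPython's standard tooling. This is only an illustration: the modify_field example and the use of CPython byte code are assumptions, and Plume's pipeline feeds its own byte code to a CUDA kernel rather than running CPython.

    # Sketch of the front-end step only: parse artist-written Python into an
    # abstract syntax tree, then lower it to byte code. CPython's byte code
    # stands in for whatever Plume actually ships to its CUDA kernel.
    import ast
    import dis
    import textwrap

    artist_source = textwrap.dedent("""
        def modify_field(velocity, temperature):
            # hypothetical local field modification written by an artist
            return velocity * 0.98 + temperature * 0.01
        """)

    tree = ast.parse(artist_source)            # source text -> abstract syntax tree
    code = compile(tree, "<artist>", "exec")   # abstract syntax tree -> byte code
    dis.dis(code)                              # inspect the generated byte code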

One reason for the Python interface is to be artist friendly: the artists already use a lot of Python in their work, so the tool lets them stay in the Python code space. The volume renderer is a one-pass ray marcher that maps the simulation data to color and opacity fields and maps render-time procedural details to texture coordinates. The physically-based tools free up time for the rest of the rendering and compositing work.
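A minimal sketch of a one-pass ray marcher is shown below: it steps along a ray through a density volume and maps each sample to color and opacity with front-to-back compositing. The transfer function, step count, and emissive color are illustrative assumptions, not Plume's shading.

    # Minimal one-pass ray marcher over a density volume, mapping samples to
    # color and opacity with front-to-back compositing. Illustrative only.
    import numpy as np

    def ray_march(density, origin, direction, steps=256, step_size=1.0):
        color = np.zeros(3)
        transmittance = 1.0
        pos = np.array(origin, dtype=np.float64)
        d = np.array(direction, dtype=np.float64)
        d /= np.linalg.norm(d)
        for _ in range(steps):
            i, j, k = int(pos[0]), int(pos[1]), int(pos[2])
            if 0 <= i < density.shape[0] and 0 <= j < density.shape[1] and 0 <= k < density.shape[2]:
                rho = density[i, j, k]
                opacity = 1.0 - np.exp(-rho * step_size)        # map density to opacity
                sample_color = np.array([1.0, 0.6, 0.2]) * rho  # map density to an emissive color
                color += transmittance * opacity * sample_color
                transmittance *= 1.0 - opacity
                if transmittance < 1e-3:                        # early termination
                    break
            pos += d * step_size
        return color

    # Example: a small volume with a dense blob in the center.
    vol = np.zeros((64, 64, 64))
    vol[28:36, 28:36, 28:36] = 0.5
    print(ray_march(vol, origin=(32, 32, 0), direction=(0, 0, 1)))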

The current GPU farm comprises 128 Quadro FX 5800 boards, each with 4 GB of RAM. Job scheduling is handled through OBAQ, but there is no optimal solution for balancing scale against immediacy. Health checks relieve stability concerns, and all of the GPUs are on a reboot schedule to minimize crashes during run time. The overall intent is to reduce delays for pick-ups, but managing the GPU farm is still difficult.
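A hypothetical health check for a render node might look like the sketch below, where a failed driver query flags the node for the reboot schedule. OBAQ's actual checks are not public; the command, timeout, and policy here are assumptions.

    # Hypothetical health-check sketch for a GPU render node: if nvidia-smi
    # fails to respond, the node is flagged for the reboot schedule.
    import subprocess

    def gpu_healthy():
        """Return True if the driver responds to a basic nvidia-smi query."""
        try:
            result = subprocess.run(["nvidia-smi"], capture_output=True, timeout=30)
            return result.returncode == 0
        except (OSError, subprocess.TimeoutExpired):
            return False

    if __name__ == "__main__":
        if not gpu_healthy():
            print("GPU check failed: flag this node for reboot before the next job.")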

The artists and IT have developed templates for explosions, dust, smoke, and sticky fire. These templates bypass the large iteration counts for big elements and let the artist view the effect of a setup on a partially rendered template. This gives a quick look at the work while the GPU farm finishes the render; if the artist feels that the effect is not working, they can quit the render early.
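A template in this sense might amount to a named set of solver presets plus a cheaper preview configuration, as in the hypothetical sketch below; the parameter names and values are invented for illustration.

    # Hypothetical effect templates: named presets that pin down solver settings
    # so an artist can preview a reduced version before the full farm render.
    TEMPLATES = {
        "explosion":   {"resolution": (256, 256, 256), "buoyancy": 2.0, "fuel": 1.0},
        "dust":        {"resolution": (192, 192, 192), "buoyancy": 0.3, "fuel": 0.0},
        "smoke":       {"resolution": (256, 256, 384), "buoyancy": 0.8, "fuel": 0.0},
        "sticky_fire": {"resolution": (128, 128, 128), "buoyancy": 1.5, "fuel": 2.0},
    }

    def preview_settings(name, scale=0.5):
        """Return a reduced-resolution copy of a template for a quick preview pass."""
        base = TEMPLATES[name]
        res = tuple(int(n * scale) for n in base["resolution"])
        return {**base, "resolution": res}

    print(preview_settings("explosion"))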

The tools also allow for directed flows and better water effects. One insight the team developed is that for large water effects, they can pair up high-velocity particles to reduce the computational requirements.
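The pairing idea can be sketched as a greedy merge of nearby high-velocity particles into single representatives, as below. The thresholds and the greedy strategy are illustrative assumptions, not ILM's algorithm.

    # Sketch of pairing for large water simulations: nearby high-speed particles
    # are merged into one representative particle, cutting the particle count
    # where the flow is fast. Thresholds and strategy are illustrative only.
    import numpy as np

    def pair_fast_particles(positions, velocities, speed_threshold=5.0, radius=0.1):
        """Greedily merge pairs of nearby high-speed particles into single particles."""
        speeds = np.linalg.norm(velocities, axis=1)
        fast = [i for i in range(len(positions)) if speeds[i] > speed_threshold]
        used = set()
        new_pos, new_vel = [], []
        for a in fast:
            if a in used:
                continue
            # find one unmerged fast neighbour within the pairing radius
            for b in fast:
                if b != a and b not in used and np.linalg.norm(positions[a] - positions[b]) < radius:
                    new_pos.append((positions[a] + positions[b]) / 2.0)   # average the pair
                    new_vel.append((velocities[a] + velocities[b]) / 2.0)
                    used.update((a, b))
                    break
        keep = [i for i in range(len(positions)) if i not in used]
        out_pos = np.array(list(positions[keep]) + new_pos)
        out_vel = np.array(list(velocities[keep]) + new_vel)
        return out_pos, out_vel

    # Example: ten random particles, some moving fast enough to be paired.
    rng = np.random.default_rng(0)
    pos = rng.random((10, 3))
    vel = rng.normal(0.0, 6.0, size=(10, 3))
    print(pair_fast_particles(pos, vel, radius=0.5)[0].shape)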

The render time for a large scene can be about an hour when it includes pyrotechnics, dust, smoke, and deep object drill-outs (e.g., a solid object moving through smoke). The effects for "The Avengers" did not use pairing, just the natural swirls, with single and multiple scattering for the details. For "Battleship" they included temperature and particle pairs to help generate the particle shadows; additional elements were the shock wave, fire swirls, and smoke.

The high-performance, single-step simulation and render allows junior artists to use the latest tools and helps non-technical artists get to the best results. Overall, productivity has gone up by a factor of 10. The new tools also allow data to be reused over longer periods and on other projects.
 

