Render Storage Performance on a Budget for All-Digital Dome Productions

Step through the doors of the California Academy of Sciences Morrison Planetarium in Golden Gate Park, and you could end up anywhere in the universe—from the center of the Milky Way Galaxy to the streets of San Francisco during the 1906 earthquake, or smack-dab inside the double-helix strand of a DNA molecule. While these educational forays through earth and space typically last just 25 minutes, the show’s own journey to the Planetarium’s 75-foot-diameter projection screen spans more than a year—that’s the time it takes for engineering and production staff to create the imagery that so convincingly transports you to new worlds of adventure.

Challenge: Quality Imagery Despite Unpredictable Workloads and a Fixed Budget

During the production process, digital artists develop and process huge amounts of scene data—175TB in the case of Earthquake, the Planetarium’s Summer 2013 release. Computer-generated content is combined with high-resolution photography and live-action film into 3D frameworks for projection onto the large-format screen of the Planetarium dome. Tilted at 30 degrees to allow a shared perspective, the dome immerses audiences in the show experience.

California Academy of Sciences Morrison Planetarium in Golden Gate Park

Rendering content to create the high-quality imagery for a single show in the dome requires hundreds of rendering nodes and massive, high-speed storage resources to support the compute workload. In general, the more compute and storage resources you apply to the job, the more artists can iterate on a scene, and the higher the quality and realism of the resulting imagery. But, as is the case with its commercial counterparts in the entertainment business, the non-profit Academy’s Visualization Studio must constantly balance visual impact against the realities of limited budgets, schedules, and artistic resources. 


A small production crew of six creates and produces content for the Planetarium’s theater shows, the Academy’s bi-monthly Science Today documentary series, and all of the digital exhibits throughout the aquarium, natural history museum, and four-story tropical rainforest. Planetarium and Production Engineering Manager Michael Garza explains the team’s unique challenges: “Content development includes taking live-action footage, but for Planetarium shows, the balance of content is computer-generated. The dome screen is six times the surface area of a typical movie theater, and current camera and lens technologies cannot produce adequate image data to fill it. For that reason we rely primarily on computer-generated imagery (CGI)—but CGI requires significantly more time and resources to create.”

During production of the Earthquake show, Garza’s Visualization Studio team utilized dedicated IT infrastructure for developing the CGI. Six Linux file servers with some 200TB of capacity service twelve artist workstations (six Linux-based and six Apple Macintosh systems) and a large rendering farm from Boxx Technologies. Applications include Houdini, the high-end 3D animation software from Side Effects Software, which the team used to convert actual 1906 earthquake data (supplied by Lawrence Livermore National Laboratory) into visualizations for the show.

Garza points out that although visual effects (VFX) computations were spread across the large render farm, performance became an issue. Queue bottlenecks began to negatively impact image quality and the production schedule. “Our storage systems couldn’t deliver the aggregate bandwidth required to service the render farm, particularly during peak demand,” Garza explains. “Although we had a lot of storage capacity and performance, it was spread across isolated systems. So for example, if we were working on multiple scenes stored on a single filer, the system could not keep pace with render-farm requests. The only way to improve performance was to distribute the load manually, moving data across multiple filers. But copying terabyte-size files takes time—moving 2TB, for example, could take two days. Those lengthy processes pushed revision turnaround out two weeks past our target of 24 hours, reduced artist productivity, and ultimately threatened the quality of the show’s final imagery.”
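The arithmetic behind that two-day figure is worth making explicit. The sketch below is a hypothetical back-of-envelope calculation in Python; the roughly 12 MB/s sustained rate is an assumption inferred from the quoted numbers (2TB in about two days from a filer that is also serving production render traffic), not a measured value.

    # Back-of-envelope check on the copy times described above. The sustained
    # throughput figure is an assumption inferred from the quoted numbers.

    def copy_time_days(size_tb: float, sustained_mb_per_s: float) -> float:
        """Wall-clock days needed to move size_tb at a sustained rate (decimal units)."""
        size_mb = size_tb * 1_000_000                   # 1 TB = 1,000,000 MB
        return size_mb / sustained_mb_per_s / 86_400    # 86,400 seconds per day

    if __name__ == "__main__":
        print(f"{copy_time_days(2, 12):.1f} days")      # ~1.9 days for a 2TB move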

The production team needed more storage bandwidth, but financial constraints limited options. Garza says, “Show budgeting starts as early as two to four years ahead of production. Over the last several years of economic uncertainties, the Planetarium has asked us to deliver more and higher-quality content with increasingly smaller budgets. And the final production budget is rigid—we can’t redefine it mid-project, even if we encounter changing compute or storage requirements.”

Solution: Avere Systems Storage for Render Power + Value

“We considered solutions from BlueArc, EMC Isilon, and NetApp, all of the common storage vendors in the VFX and entertainment industries,” continues Garza. “But all were priced out of our range, and none gave us the ability to expand incrementally. We couldn’t afford a large initial purchase or to completely replace all of our existing storage. Nor could we afford large capital outlays at future performance thresholds.

“Avere was the only solution that met every requirement—we could achieve our performance objectives with a smaller purchase footprint, leverage existing storage assets, and be assured of incremental scalability. Avere was an across-the-board win, delivering a better solution in every category.”

Today a three-node Avere FXT 3200 Edge filer cluster makes it possible for the Academy’s visual effects production team to deliver more visual impact in less time and at a lower cost. The Edge filers accelerate I/O to render nodes by automatically placing active data—like texture files—on solid-state media, meeting the demands of the Visualization Studio’s complex workloads and effectively eliminating queue bottlenecks. The clustered solution delivers high-speed I/O to some 400 rendering nodes while ensuring maximum responsiveness to artist workstations. The FXT Edge filers meet current performance requirements and protect the Planetarium’s investment: Avere can provide linear performance scaling to millions of operations per second and more than 100GB-per-second throughput to support future growth.
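The underlying idea of placing active data on fast media can be pictured with a minimal read-cache sketch. The Python below illustrates the general technique only; it is not Avere’s implementation, and the names (HotDataCache, read_from_core, CACHE_BYTES) are hypothetical.

    from collections import OrderedDict

    # Minimal sketch of hot-data read caching: serve recently used files (for
    # example, texture files) from fast media, fall back to the slower core
    # filer on a miss, and evict the least recently used entries when the
    # cache budget is exceeded. Illustrative only; not Avere's implementation.

    CACHE_BYTES = 512 * 1024**3          # assumed fast-media budget (512 GiB)

    class HotDataCache:
        def __init__(self, capacity_bytes: int = CACHE_BYTES):
            self.capacity = capacity_bytes
            self.used = 0
            self.entries = OrderedDict()             # path -> file bytes

        def read(self, path: str, read_from_core) -> bytes:
            if path in self.entries:                 # hit: serve from the fast tier
                self.entries.move_to_end(path)
                return self.entries[path]
            data = read_from_core(path)              # miss: fetch from the core filer
            self.entries[path] = data
            self.used += len(data)
            while self.used > self.capacity:         # evict least recently used data
                _, evicted = self.entries.popitem(last=False)
                self.used -= len(evicted)
            return data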


Benefits: More Impact, Less Cost

10X More Render Power

Garza says that the Avere deployment was a “seamless drop-in, with no disruptions. Our artists were not even aware of the switch—except for the notable performance improvements. The cluster runs flawlessly and has eliminated the downtime we occasionally experienced with our original Core filer infrastructure.”

The Avere cluster accelerates I/O performance for routine rendering processes and also ensures adequate bandwidth to handle peak demand when artists are working with files up to 50MB with as many as 50 images in each. Garza reports impressive results: “We expected the Avere solution to deliver a 2-5X performance improvement—but in direct-comparison testing running some of our most demanding shots, actual results show a 10X boost. In cases where we could previously run just two or three simultaneous rendering processes, we’re now running 20-30 processes.”
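A rough way to see why concurrency scales with storage bandwidth: each render process needs a sustained read rate, so the number of processes a shared store can feed is roughly aggregate bandwidth divided by per-process demand. The figures in the sketch below are assumptions chosen only to mirror the before-and-after concurrency Garza describes; they are not measured values from the Academy’s environment.

    # Rough model: render concurrency is capped by aggregate storage bandwidth.
    # All figures are illustrative assumptions, not measurements.

    def max_concurrent_renders(aggregate_mb_per_s: float, per_process_mb_per_s: float) -> int:
        return int(aggregate_mb_per_s // per_process_mb_per_s)

    if __name__ == "__main__":
        per_process = 100                                  # assumed MB/s read per render process
        print(max_concurrent_renders(300, per_process))    # ~3 on a single saturated filer
        print(max_concurrent_renders(3000, per_process))   # ~30 with 10X the aggregate bandwidth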

Avere performance allows the Visualization Studio team to meet the shifting requirements of an unpredictable VFX workflow. “This architecture gives us flexibility to handle changes in demand as the show develops,” Garza comments. “In our business, iterations equal quality. With the Avere cluster in place, artists can run more iterations to add more detail and increase the complexity of the overall show. Another benefit of faster turnaround on revisions is that we’re able to review progress daily while scenes and objectives are fresh. And of course we have unprecedented room to grow. Because of our screen size, each show that we develop contains up to 6X more data than a Hollywood production, and our data doubles with every show. The seamless scalability of the Avere solution ensures we can meet ongoing demand, and we’ll be able to grow incrementally, without having to make large, unanticipated capital expenditures.”

One-Tenth the Cost

“We estimate that it would have been a 10-fold increase in cost to build out a comparable system with a competing vendor,” adds Garza. “We would have spent more on software licensing and would have had to throw away existing storage. Instead we’re continuing to leverage that investment. We’ve always used low-cost drives, but we’re now replacing 750GB disks with high-density 2TB drives and using that capacity as Tier 2 or nearline storage. With more capacity in our Core filers we can keep more data online. We also have more flexibility to re-use assets for promotional purposes and can keep all our show data online for up to nine months past opening.

“Using Avere has helped us reduce operating costs—we spend less time on data management, and we’re able to manage backup and archiving independently of production. That gives us more administrative flexibility and has eliminated the performance hits that backup processes used to inflict on production systems.”
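Garza’s point about using high-density drives as Tier 2 or nearline capacity, with archiving handled outside the production path, can be pictured with a simple age-based sweep. The script below is a hypothetical sketch; the mount points and the nine-month threshold are assumptions drawn from the description above, not an actual Academy workflow.

    import os
    import shutil
    import time

    # Hypothetical age-based tiering sweep: demote show assets that have not
    # been read for roughly nine months from primary storage to a nearline
    # (Tier 2) mount. Paths and threshold are illustrative assumptions.

    PRIMARY = "/mnt/primary/shows"
    NEARLINE = "/mnt/nearline/shows"
    MAX_AGE_SECONDS = 9 * 30 * 86_400        # ~nine months

    def tier_down(primary: str = PRIMARY, nearline: str = NEARLINE) -> None:
        now = time.time()
        for dirpath, _, filenames in os.walk(primary):
            for name in filenames:
                src = os.path.join(dirpath, name)
                if now - os.path.getatime(src) > MAX_AGE_SECONDS:
                    dst = os.path.join(nearline, os.path.relpath(src, primary))
                    os.makedirs(os.path.dirname(dst), exist_ok=True)
                    shutil.move(src, dst)    # demote cold file to the nearline tier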

Immersive Imagery

The California Academy of Sciences Planetarium distributes shows internationally, with productions currently running in four countries. In San Francisco, as many as ten daily showings let audiences in the 300-seat dome experience breathtaking tours through time and place. The immersive video technology creates unique opportunities to engage audiences in educational experiences.


“The Academy’s focus is on education,” Garza states, “and our data-driven visualizations make the experience entertaining. We’re competing for audience time, so Planetarium shows must be interesting and compelling. Without Avere technology, continued economic pressures would likely have forced us to scale back. With the Avere solution in place, artists can focus their efforts on content rather than simply trying to get their jobs done. As a result, they’re creating even more engaging productions with the immersive imagery that brings audiences back.” 

About California Academy of Sciences

The California Academy of Sciences is a world-class scientific and cultural institution based in San Francisco, California. The Academy’s newest facility, a 400,000-square foot structure, houses an aquarium, a planetarium, a natural history museum, and a four-story rainforest. Located in Golden Gate Park, the facility is also home to the Academy’s staff of world-class scientists, an education department that provides a wide range of student and teacher services, and an extensive science library with more than 28 million specimens and artifacts. The mission of the California Academy of Sciences is to explore, explain and sustain life. www.calacademy.org