
3D News

We’re fortunate enough to have another fine 3D video from New Media Film Festival to share with you here on 3DVisionLive—a pop music video from Italy called “The Way,” which you can view here. Even better, New Media Film Festival has provided an interview with one of the co-directors of the video, Edoardo Ballanti, which provides insights on how the video was created and the vision behind it. Enjoy! (Alice Corsi also co-directed the video.) What was the Inspiration behind “...
The Fall Photo Contest received nearly 100 images – thanks to all who entered! The contest called for your best “nature” shots, with the only other requirement being that they had to be true stereo images. Submissions ranged from shots of spiders in gardens to artistic approaches to tasteful nudes. As before, members were invited to vote for the winner by tagging images in the contest gallery as favorites. Without further ado, the winner is: Autumn Goodbye to Summer This...
In driver 334.89, NVIDIA introduced a new proprietary rendering mode for 3D Vision that enables us to improve the 3D experience for many key DirectX 10 and 11 games. This mode is now called “3D Compatibility Mode”. We have continued to iterate on this feature in beta driver 337, increasing game support and adding a toggle key to enable/disable the mode. Games with 3D Compatibility Mode will launch in this mode by default. To change the render mode back to standard 3D Vision...
3DVisionLive’s first-ever short-form 3D video contest received 14 entries that showed a great deal of diversity, ranging from video game captures to commercial-style clips to raw captures of pets or people doing cool things (such as bashing each other with swords). During judging we laughed, we cried (okay, maybe not), and we simply scratched our heads… But seriously: thank you to all who participated, and we hope to see more of your content uploaded to the site for all to...
The submission period for the Fall Photo Contest is now closed, and we are happy to report we’ve received nearly 100 images from our members for consideration. And, once again, we’re opening the judging process to our community as well to help us determine the winners. The full gallery of images may be seen by clicking the link above. Between now and February 10th (11:59 PST), please view all of the images in the gallery and place your votes for the ones you’d like to win by...

Recent Blog Entries

In Bee Movie, an animated feature from 2007, a friendship between a bee (voiced by Jerry Seinfeld) and a young woman (Renee Zellweger) leads to the world’s bee population reclaiming the honey it produces.

A decade later, a real young woman with a self-described “penchant for cute, round things” — working with NVIDIA engineers and GPU-powered deep learning — may help to minimize the impact of a destructive parasite and lead to domesticated bees being returned to the almond-shaped hive design that serves them so well in the wild.

Jade Greenberg, a 17-year-old junior at Pascack Hills High School in Bergen County, N.J., zeroed in on honey bees and the causes of colony collapse as the subject of a research project for her molecular genetics class.

Eventually, Greenberg focused on the threat posed by Varroa mites, a parasite thought to be one of the most frequent causes of collapses of domestic hives. Her research has led her to postulate that the long-accepted design of the Langstroth hive — the cabinet-like standard since its introduction in the 1850s — is a major reason Varroa mites have become such a serious problem.

The classic Langstroth hive design.

NVIDIA, GPUs and deep learning came into the picture when Jade’s father, a solutions engineer at Kinetica, teamed with Jacci Cenci, an NVIDIA engineer, and they started applying their companies’ respective technologies to the problem. Linking sensors and cameras to a convolutional neural network, Cenci’s team began collecting data on hive conditions such as weight, humidity, temperature and population.
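
The post doesn’t include the team’s code, but the collection loop it describes — periodically logging hive weight, humidity, temperature and an estimated population alongside a camera frame for the convolutional network — might look roughly like the hedged sketch below. The sensor helpers, paths and sampling interval are placeholders for illustration, not APIs from the actual project.

```python
import json
import time
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Placeholder readers -- stand-ins for whatever scale, humidity/temperature
# sensor, camera and population estimator the hive is actually instrumented with.
def read_weight_kg() -> float: return 38.2        # replace with real sensor read
def read_humidity_pct() -> float: return 62.0     # replace with real sensor read
def read_temperature_c() -> float: return 34.5    # replace with real sensor read
def estimate_population() -> int: return 41_000   # replace with real estimate
def capture_frame(path: str) -> str: return path  # replace with real camera capture

@dataclass
class HiveSample:
    timestamp: str
    weight_kg: float
    humidity_pct: float
    temperature_c: float
    population: int
    frame_path: str  # image later scored by the mite-detection network

def collect_once(frame_dir: str = "frames") -> HiveSample:
    ts = datetime.now(timezone.utc).isoformat()
    return HiveSample(ts, read_weight_kg(), read_humidity_pct(),
                      read_temperature_c(), estimate_population(),
                      capture_frame(f"{frame_dir}/{ts}.jpg"))

if __name__ == "__main__":
    # Append one JSON record per minute; rows like these feed the dashboards
    # and the neural network described in the post.
    with open("hive_log.jsonl", "a") as log:
        while True:
            log.write(json.dumps(asdict(collect_once())) + "\n")
            log.flush()
            time.sleep(60)
```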

Ramping Up Detection

A variety of deep learning and machine learning technologies — including NVIDIA’s Jetson TX2 development kit, an NVIDIA DGX Station, TensorRT (a high-performance deep learning inference optimizer) and the Microsoft Cognitive Toolkit deep learning framework — combine to rapidly detect and warn against the presence of Varroa mites.

The stack of NVIDIA hardware and software is able to optimize, validate and deploy trained neural networks for inferencing in the field, alerting beekeepers to potential infestations sooner.
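
The post doesn’t show how the trained network actually reaches the field. As a rough sketch only — the model file name, input format and exact calls are assumptions, and TensorRT’s Python API has changed between versions — converting a trained model exported to ONNX (an export route the Microsoft Cognitive Toolkit supports) into an optimized TensorRT engine could look like this:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path: str = "mite_detector.onnx") -> bytes:
    """Parse an ONNX model and build a serialized TensorRT engine."""
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.FP16)  # lower precision for faster edge inference
    return bytes(builder.build_serialized_network(network, config))

if __name__ == "__main__":
    # The serialized plan is what would ship to a Jetson for inference in the field.
    with open("mite_detector.plan", "wb") as f:
        f.write(build_engine())
```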

“If the weight decreases, the hive could be sick and bees are leaving. If the hive is heavy, it could mean lots of swarming, or high humidity might be present, which increases the odds of mite infestation,” said Cenci.
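
Those heuristics translate directly into simple alert rules over the collected telemetry. The thresholds below are invented for illustration; only the direction of the rules — falling weight suggests a sick hive, a heavy hive plus high humidity raises mite risk — comes from the quote above.

```python
def hive_alerts(weight_kg: float, prev_weight_kg: float, humidity_pct: float,
                weight_drop_kg: float = 1.0,   # illustrative thresholds only
                heavy_kg: float = 40.0,
                humid_pct: float = 80.0) -> list[str]:
    """Flag conditions a beekeeper may want to check, per the rules quoted above."""
    alerts = []
    if prev_weight_kg - weight_kg > weight_drop_kg:
        alerts.append("Weight falling: hive may be sick or bees may be leaving.")
    if weight_kg > heavy_kg and humidity_pct > humid_pct:
        alerts.append("Heavy hive with high humidity: elevated risk of mite infestation.")
    return alerts

# Example: a hive that lost 1.5 kg overnight and is sitting at 85% humidity.
print(hive_alerts(weight_kg=42.0, prev_weight_kg=43.5, humidity_pct=85.0))
```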

Armed with this extra information, which is converted into useful charts and graphs using Kinetica’s GPU-accelerated insight engine, Greenberg moved steadily from simply studying the problem to crafting a solution.

“We have better ways of collecting data, and we have better ways of observing bees in more detailed ways,” she said. “I’m surprised that this industry hasn’t moved past something designed in the 19th century. It’s time for a change.”

Fighting Mites with Hive Design

Greenberg, who explains her work in a video below, has been using the data collected with the NVIDIA technology to guide her efforts to design a better hive. She’s learned that the different sizes and shapes of the entrances, the contrast with the natural almond shape of wild hives, and the fact that the queen is separated from the rest of the colony are potentially fatal flaws of the Langstroth hive design.

She also suggested that the larger spaces that bees occupy allow other mite-infected insects, such as moths, to enter the hive and become trapped, leading to further infestation.

In other words, AI is enabling Greenberg to pull back the curtain on the Langstroth hive’s failings, which may have been underestimated before now.

“It tells us in what ways the Langstroth hive is failing us when it comes to Varroa mite infestation,” she said.

It also is helping Greenberg refine her design, which, to be viable in the commercial beekeeping arena, must improve hive health while preserving the commercial capabilities of the Langstroth hive.

Greenberg and her bee hive design were recently awarded first alternate in the engineering category at the Nokia Bell Labs North Jersey Regional Science Fair. She’s also a finalist in the Intel International Science and Engineering Fair in May.

Her work and the technology backing her up will also be the subject of a session at the upcoming GPU Technology Conference in San Jose. Kinetica dashboards presenting the hive data will be rendered on an NVIDIA DGX Station AI supercomputer.

The post Why Deep Learning May Prove to Be the Bee’s Knees appeared first on The Official NVIDIA Blog.

The slithery spotted eel isn’t really swimming over you, and you’re not surrounded by a school of light blue fish. But it sure feels that way as you plunge into Expedition Reef, the new planetarium show at the California Academy of Sciences.

The San Francisco museum’s GPU-powered coral reef experience immerses viewers in a dramatic 3D video re-creation of reefs as they live, reproduce and struggle to survive in an increasingly challenging environment.

The reef teems with life — 50 species of coral, sea turtles, seaweed, algae and more than 5,000 individual fish — captured in fine detail in what the academy describes as the world’s most accurate digital reef. The film is showing daily in San Francisco until March 2019 and is being licensed to planetariums around the world.

“We wanted to bring corals to life in a way you haven’t seen in other productions,” said Ryan Wyatt, senior director of Morrison Planetarium and science visualization at the academy. “These complex ecosystems demand a highly realistic approach to help people engage with them.”

Plenty of Fish in the Sea

The film posed unprecedented challenges for the museum’s visualization studio, which relied heavily on NVIDIA Quadro GPUs, the same technology used by movie studios to create their dazzling special effects.

Making the reef and its inhabitants look real meant capturing a vast amount of detail that included everything from the reflection of light on the water to the rough texture of the coral to the gaudy colors of the tropical fish. In addition to the thousands of plants and animals, studio artists had to reproduce movement as creatures swam, swayed or floated in the ocean currents.

“We’ve produced shows with photorealistic environments before, but none have been this complex, with this much detail and variety,” said Michael Garza, the museum’s senior planetarium and production engineering manager.

A sea turtle swims above a colorful coral reef in the California Academy of Sciences’ GPU-powered 3D experience.

Into the Blue with GPUs

The film’s reefs are 3D reconstructions from more than 100,000 underwater photos shot by researchers and collaborators around the world. The museum’s visualization studio transformed these two-dimensional photos into 3D models, with help from NVIDIA Quadro GP100 and P6000 GPUs. The production team then used GPU-accelerated rendering software to turn the models into a movie.
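
The post doesn’t name the photogrammetry software the studio used to turn overlapping 2D photos into 3D geometry. Purely to illustrate what that kind of pipeline involves, here is a sketch that drives the open-source COLMAP command-line tools (feature extraction, matching, sparse reconstruction) from Python; COLMAP and the paths are stand-ins, not the academy’s actual toolchain.

```python
import os
import subprocess

def reconstruct(image_dir: str, work_dir: str) -> None:
    """Sparse 3D reconstruction from overlapping photos via the COLMAP CLI."""
    db = os.path.join(work_dir, "database.db")
    sparse = os.path.join(work_dir, "sparse")
    os.makedirs(sparse, exist_ok=True)
    steps = [
        # Detect keypoints in every photo (GPU-accelerated when available).
        ["colmap", "feature_extractor", "--database_path", db,
         "--image_path", image_dir],
        # Match features across photo pairs.
        ["colmap", "exhaustive_matcher", "--database_path", db],
        # Estimate camera poses and triangulate a sparse point cloud.
        ["colmap", "mapper", "--database_path", db,
         "--image_path", image_dir, "--output_path", sparse],
    ]
    for cmd in steps:
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    reconstruct("reef_photos", "reef_recon")
```

The resulting camera poses and point cloud would typically then be meshed and textured before being handed to the GPU-accelerated rendering software described above.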

“Quadro acceleration allowed us to process large swaths of undersea surveys into realistic virtual coral reefs,” Garza said. It also produced a 10x improvement in rendering performance, he said, and it drove new 32-inch, 4K monitors at artists’ workstations. The monitors were critical in letting artists make creative choices and iterate more frequently.

The museum also relied on the NVIDIA Quadro Virtual Workstation to manage resources within limited space, allocating multiple GPU resources to a single large task or to several small tasks, Garza said.

Hope for Coral Reefs

The show unfolds on the planetarium’s 75-foot, 180-degree screen with an awesome display of beautiful multicolored reefs and flashy fish darting past.

But the film isn’t just about beauty.

It’s about the vital role coral reefs play in the world’s ecosystems, why they’re imperiled and what scientists at the California Academy of Sciences and elsewhere are doing to save them. It also aims to inspire viewers to do their part by consuming fewer resources.

“Most people won’t get to visit coral reefs,” said Elizabeth Babcock, dean of education at the academy. “We want to use digital tools to spark people’s imaginations and create an emotional connection to reefs.”

In addition to the planetarium shows in San Francisco and elsewhere, the museum plans to offer an HD version and lesson plans that teachers can use in their classrooms. For more information on how the museum created Expedition Reef, see the video below.

Learn more about NVIDIA GPU-accelerated rendering solutions during industry- and technology-focused sessions at the GPU Technology Conference, March 26-29 in San Jose. Register now.

* The main image for this story pictures a moray eel. It is provided courtesy of the California Academy of Sciences.

The post No Barrier to This Reef: Dazzling Film Brings Coral to Life in GPU-Powered Museum Show appeared first on The Official NVIDIA Blog.