
3D News

For the last few years we’ve worked with the National Stereoscopic Association to support the 3D Digital Showcase photo competition featured at the NSA’s annual conventions. The images from this past year’s showcase are now live for everyone to view. We really enjoy the diversity of images submitted by 3D artists and enthusiasts to this event, and this gallery is certainly no different. You’ll see everything from close-ups of insects to people juggling fire. Simply put,...
In driver 334.89 NVIDIA introduced a new proprietary rendering mode for 3D Vision that enables us to improve the 3D experience for many key DirectX 10 and 11 games. This mode is now called “3D Compatibility Mode”. We have continued to iterate on this feature in driver 344.11, increasing game support and adding some new interface elements. You can get the new driver at www.geforce.com/drivers or via the update option in GeForce Experience. With the release of 344.11, new 3D...
We’re fortunate enough to have another fine 3D video from New Media Film Festival to share with you here on 3DVisionLive—a pop music video from Italy called “The Way,” which you can view here. Even better, New Media Film Festival has provided an interview with one of the co-directors of the video, Edoardo Ballanti, which provides insights on how the video was created and the vision behind it. Enjoy! (Alice Corsi also co-directed the video.) What was the Inspiration behind “...
The Fall Photo Contest received nearly 100 images – thanks to all who entered! The contest called for your best “nature” shots, with the only other requirement being that they had to be true stereo images. Submissions ranged from shots of spiders in gardens to artistic approaches to tasteful nudes. As before, members were invited to vote for the winner by tagging images in the contest gallery as favorites. Without further ado, the winner is: Autumn Goodbye to Summer This...
In driver 334.89 NVIDIA introduced a new proprietary rendering mode for 3D Vision that enables us to improve the 3D experience for many key DirectX 10 and 11 games. This mode is now called “3D Compatibility Mode”. We have continued to iterate on this feature in beta driver 337, increasing game support and adding a toggle key to enable/disable the mode. Games with 3D Compatibility Mode will launch in this mode by default. To change the render mode back to standard 3D Vision...

Recent Blog Entries

For the fourth consecutive year, our Tesla Accelerated Computing Platform helped set new milestones in the Asia Student Supercomputer Challenge, the world’s largest supercomputer competition.

Each year, the brightest minds from universities around the world compete to find out who can build the fastest, most efficient supercomputer.

At ASC16, GPUs once again powered the winning team, and helped another break the performance record on an industry-standard supercomputing benchmark.

Huazhong University of Science and Technology Earns Top Spot

Some 175 teams from universities in North America, South America, Africa, Asia, Europe and Oceania participated in ASC16. Of these, 16 advanced to the final round held last week at Huazhong University of Science and Technology in Wuhan, China.

Armed with their custom-built systems, finalists competed on six different supercomputing application benchmarks within a 3,000W system power limit.

The benchmarks included the surface wave numerical model, MASNUM; the material simulation software, ABINIT; the High Performance Conjugate Gradients benchmark; and ABySS, a de novo, parallel sequence assembler. Teams also had to train a deep neural network for speech recognition.

The final test was LINPACK, the industry benchmark used to measure the performance of the world’s most powerful supercomputers, like the Titan system at Oak Ridge National Laboratory.
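LINPACK scores come from timing a large dense linear solve and dividing a standard operation count by the elapsed time. As a rough illustration only (plain NumPy on a CPU, not the official HPL code the teams ran), the measurement could be sketched like this:

```python
import time
import numpy as np

def linpack_style_gflops(n=2000, seed=0):
    """Time a dense solve of Ax = b and report Gflop/s, LINPACK-style.

    This is a simplified sketch, not the official HPL benchmark: HPL runs
    a distributed LU factorization across the whole cluster, while this
    times a single-node NumPy/LAPACK solve.
    """
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)

    t0 = time.perf_counter()
    np.linalg.solve(A, b)  # LU factorization + triangular solves
    elapsed = time.perf_counter() - t0

    # Conventional HPL operation count for solving an n-by-n dense system.
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    return flops / elapsed / 1e9

print(f"{linpack_style_gflops():.1f} Gflop/s")
```

The 12.03-teraflop record below corresponds to about 12,030 Gflop/s on this scale, sustained across the team's full eight-GPU system.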

Thanks to our Tesla K80 GPU accelerators, Huazhong University beat out the field of competitors, grabbing the highest overall score.

The victorious team from the Huazhong University of Science and Technology.

Zhejiang University Breaks LINPACK Record

The team from Zhejiang University used a system with eight NVIDIA Tesla K80 GPUs to establish a new record on LINPACK.

Clocking in at a remarkable 12.03 teraflops – 12 trillion floating-point operations per second – Zhejiang’s system surpassed the previous record of 11.92 teraflops.

Nanyang Technological University achieved that record at ASC15. Sun Yat-sen University set a record of 9.27 teraflops using Tesla K40 GPUs at ASC14.

NVIDIA GPUs continue to help computer scientists, researchers and engineers around the world tackle massive computational challenges. Through competitions like ASC, they’re preparing new generations of experts to address tomorrow’s toughest problems.

The post GPU-Powered Systems Take Top Spot, Set New Record in Student Supercomputer Competition appeared first on The Official NVIDIA Blog.

Marc Gyongyosi isn’t your average college student. The junior computer science major at Northwestern University’s McCormick School of Engineering has thrown himself into the world of lightweight robotics in a way that reaches far beyond the classroom.

Not only has Gyongyosi spent the past two years working with BMW’s robotics research department on developing robotic systems to help factory workers, he’s also involved in two startups. One of those, MDAR Technologies, is working on 3D vision systems for autonomous vehicles.

But it’s his work with the second company, IFM Technologies, which he founded, that landed him on a stage at our annual GPU Technology Conference.

IFM has been working on an autonomous drone that can be reliably operated indoors. Most drones today fly only outdoors because a) they’re too large and clunky to be safely flown indoors, and b) the GPS systems they rely on don’t work indoors. Further complicating the market for outdoor drones is the fact that the FAA must approve them for flight. That’s not the case with indoor drones.

Gyongyosi looked at that convergence of facts and determined that there’s a huge potential market for a commercially available indoor drone. He told GTC attendees that he estimates there are multi-billion-dollar opportunities in areas such as warehouse analytics, utility analysis, insurance inspections, and commercial real estate and construction.


 

And make no mistake, he’s not in this just to identify those opportunities; he wants to seize them. “We don’t want to just be a research project,” Gyongyosi said during his talk. “We want to be something that goes from problem to solution.”

His solution, however, has presented technical challenges. To start with, he’s had to find an alternative to the GPS built into outdoor drones. He said others have tried motion capture or radio beacons as GPS substitutes, but because he’s trying to keep IFM’s drone small and light, he didn’t want the extra weight. Those options also tend to be expensive and require constant calibration.

Similarly, other drones rely on onboard sensors to detect physical objects around them and avoid collisions. But that approach also presented a major space challenge on IFM’s small drone, as the amount of data that has to be processed is enormous.

“The processing power you need onboard is large,” he said. “That’s why these platforms are very large.”

To combat these issues, Gyongyosi did two things: First, he opted to mount a single camera on the IFM drone, sacrificing stereoscopic vision but preserving space and keeping the weight down. Then, he chose to incorporate feature tracking that operates somewhat like sensors, but uses data from the camera instead.
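Camera-only feature tracking of this kind can be sketched in a few lines. The version below is a deliberately naive illustration (gradient-based corner picking plus sum-of-squared-differences block matching), and both function names are hypothetical; a real drone pipeline like IFM's would use far more robust, GPU-accelerated methods such as pyramidal optical flow:

```python
import numpy as np

def detect_corners(img, k=50, margin=8):
    """Pick the k pixels with the strongest gradient magnitude as features,
    ignoring a border of `margin` pixels so tracking windows stay in-bounds."""
    gy, gx = np.gradient(img.astype(float))
    score = gx**2 + gy**2
    score[:margin, :] = 0
    score[-margin:, :] = 0
    score[:, :margin] = 0
    score[:, -margin:] = 0
    idx = np.argsort(score.ravel())[-k:]
    return np.stack(np.unravel_index(idx, img.shape), axis=1)  # (k, 2) rows of [y, x]

def track(prev, nxt, pts, win=4, search=3):
    """Track each [y, x] feature from `prev` into `nxt` by exhaustive SSD
    block matching within +/- `search` pixels. Out-of-bounds candidates
    are skipped via the shape check."""
    moved = []
    for y, x in pts:
        y, x = int(y), int(x)
        patch = prev[y-win:y+win+1, x-win:x+win+1].astype(float)
        best, best_d = (y, x), np.inf
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cand = nxt[y+dy-win:y+dy+win+1,
                           x+dx-win:x+dx+win+1].astype(float)
                if cand.shape != patch.shape:
                    continue
                d = np.sum((cand - patch)**2)
                if d < best_d:
                    best_d, best = d, (y + dy, x + dx)
        moved.append(best)
    return np.array(moved)
```

Displacements between tracked features in consecutive frames are what let the drone estimate its own motion without GPS, which is why the tracking rate (measured in Hz) matters so much.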

When the performance of that configuration came up short of his expectations, he turned to the GPU, specifically the NVIDIA Jetson TK1, which is now part of the vehicle’s physical design.

The results speak for themselves. The GPU processes the data nearly four times as fast as a CPU, and the feature-tracking rate nearly doubled, from 5.5 Hz to 9.8 Hz. It also improved accuracy and freed enough space that Gyongyosi was able to add a second camera, mounted at a 45-degree angle to the first, trading stereoscopic sight for a larger field of view.

To further illustrate the potential impact of IFM’s design, Gyongyosi pointed to the colossal failure that is Berlin’s long-planned futuristic airport, a project that was supposed to open years ago but remains non-operational after design flaws were found in the fire detection system during inspection.

Gyongyosi believes indoor drones could have prevented the fiasco by detecting the issue long before inspection, and he hopes IFM’s drones will be performing such tasks soon.

 

The post Inside Job: Student Turns to GPUs to Create Drones for the Great Indoors appeared first on The Official NVIDIA Blog.