
3D News

For the last few years we’ve worked with the National Stereoscopic Association to support the 3D Digital Showcase photo competition featured at the NSA’s annual conventions. The images from this past year’s showcase are now live for everyone to view. We really enjoy the diversity of images submitted by 3D artists and enthusiasts to this event, and this gallery is certainly no different. You’ll see everything from close ups of insects to people juggling fire. Simply put,...
In driver 334.89 NVIDIA introduced a new proprietary rendering mode for 3D Vision that enables us to improve the 3D experience for many key DirectX 10 and 11 games. This mode is now called “3D Compatibility Mode”. We have continued to iterate on this feature in driver 344.11, increasing game support and adding some new interface elements. You can get the new driver at www.geforce.com/drivers or via the update option in GeForce Experience. With the release of 344.11, new 3D...
We’re fortunate enough to have another fine 3D video from New Media Film Festival to share with you here on 3DVisionLive—a pop music video from Italy called “The Way,” which you can view here. Even better, New Media Film Festival has provided an interview with one of the co-directors of the video, Edoardo Ballanti, which provides insights on how the video was created and the vision behind it. Enjoy! (Alice Corsi also co-directed the video.) What was the Inspiration behind “...
The Fall Photo Contest received nearly 100 images – thanks to all who entered! The contest called for your best “nature” shots, with the only other requirement being that they had to be true stereo images. Submissions ranged from shots of spiders in gardens to artistic approaches to tasteful nudes. As before, members were invited to vote for the winner by tagging images in the contest gallery as favorites. Without further ado, the winner is: Autumn Goodbye to Summer This...
In driver 334.89 NVIDIA introduced a new proprietary rendering mode for 3D Vision that enables us to improve the 3D experience for many key DirectX 10 and 11 games. This mode is now called “3D Compatibility Mode”. We have continued to iterate on this feature in beta driver 337, increasing game support and adding a toggle key to enable/disable the mode. Games with 3D Compatibility Mode will launch in this mode by default. To change the render mode back to standard 3D Vision...

Recent Blog Entries

Self-driving cars will make transportation easy. But building an autonomous vehicle is arduous.

That’s why we’ve partnered with AutonomouStuff, a supplier of components used within autonomy systems, to make DRIVE PX on Wheels available to researchers and developers working to revolutionize transportation.

DRIVE PX on Wheels is a kit available in three flavors — advanced, basic and custom — that makes the transition to autonomy as smooth as possible. Each comes with a vehicle configured by AutonomouStuff with sensors and our end-to-end autonomous driving platform, ready to go on day one, so developers can focus on creating their own self-driving solutions.

DRIVE PX installed in the trunk of the car.

Enabling Self-Driving Solutions

The advanced kit begins with a Ford Fusion, which is loaded with an NVIDIA DRIVE PX AI car computer, as well as cameras, LIDAR, radar, navigation sensors and a drive-by-wire system.

The NVIDIA DriveWorks software development kit (SDK), which comes pre-installed on DRIVE PX, gives developers a foundation for building applications, including computationally intensive algorithms for object detection, map localization, and path planning. AutonomouStuff installs DRIVE PX so that developers can instantly access the reference applications, tools and library modules in the SDK.
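As a conceptual sketch only (this is not DriveWorks API code; every name below is a hypothetical placeholder), an application built on such an SDK typically strings those modules together into a sense-localize-plan loop:

# Conceptual autonomy loop. All names are hypothetical placeholders,
# NOT the DriveWorks API.

def read_sensors():
    """Grab the latest camera / LIDAR / radar frame."""
    return {}

def detect_objects(frame):
    """Perception: find vehicles, pedestrians and signs in the frame."""
    return []

def localize(frame):
    """Match observations against the map to estimate the car's pose."""
    return (0.0, 0.0, 0.0)

def plan_path(pose, objects):
    """Choose a safe trajectory given the pose and detected objects."""
    return [pose]

def send_to_drive_by_wire(path):
    """Hand the trajectory to the steering, throttle and brake actuators."""
    pass

def autonomy_step():
    frame = read_sensors()
    objects = detect_objects(frame)
    pose = localize(frame)
    send_to_drive_by_wire(plan_path(pose, objects))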

The advanced kit offers an extensive array of sensors and peripherals for data acquisition, storage, and development of applications ranging from Level 3 to Level 5 autonomy. Instead of spending months setting up sensors or calibrating software, developers can jump straight to building their own applications.

The Advanced kit features a Ford Fusion.

Customized Kits for Specialized Development

A basic kit is also available. It includes a Polaris GEM compact electric car, as well as DRIVE PX, DriveWorks, cameras, navigation sensors and a drive-by-wire system. This entry-level kit is an affordable option for many development needs. It’s ideal for applications like low-speed campus shuttles at hospitals, colleges, factories and other similar sites.

The Basic kit includes the Polaris GEM.

For specific research and development needs, AutonomouStuff will configure a custom kit for any vehicle. This provides flexibility in the placement and choice of sensors. From the advanced kit to the basic kit, there is a wide spectrum of possible sensor configurations for a development vehicle.

Several universities and startups are using DRIVE PX on Wheels, making it easier than ever to use our self-driving car platform to combine surround vision, sensor fusion and artificial intelligence. Each kit provides the hardware, software, and tools developers need to get started. Visit AutonomouStuff to learn more.

Photos courtesy of AutonomouStuff.

The post Autonomous Vehicle Development Now Speedier with NVIDIA DRIVE PX on Wheels appeared first on The Official NVIDIA Blog.

Few moments are more magical than slipping on a headset and being instantly transported to an immersive virtual world.

To help bring such experiences to more people, we’re showing some of NVIDIA Research’s latest work to heighten the magic of VR and AR at next week’s SIGGRAPH computer graphics conference in Los Angeles.

We’ll present work in two areas: what researchers call “varifocal displays,” which give users the ability to focus more naturally while enjoying VR and AR experiences; and haptics, which enhances VR and AR with touch and feel. This represents the latest in a growing body of research we’ve shared over the past decade at industry events such as SIGGRAPH, as well as academic venues.

Enhancing Focus in AR and VR

We’re demonstrating a pair of techniques that address vergence-accommodation conflict. It arises when our eyes, accustomed to focusing on objects at different depths in 3D space, are shown stereo images whose parallax cues imply depth but which sit on a flat screen at a constant optical distance. Both techniques tackle this by varying the focus of virtual images in front of a user, depending on where they’re looking.
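To put rough numbers on the conflict, here is a minimal sketch (illustrative only, not NVIDIA's implementation; the interpupillary and screen distances are assumed example values). The eyes converge at the depth implied by the stereo parallax, while accommodation stays locked to the display's fixed optical distance, and the mismatch is usually expressed in diopters (1/metres):

import math

# Illustrative vergence-accommodation mismatch. All values are assumptions,
# not figures from NVIDIA's demos.
IPD_M = 0.063          # assumed interpupillary distance, metres
SCREEN_DIST_M = 1.5    # assumed fixed optical distance of the display, metres

def vergence_angle_rad(distance_m, ipd_m=IPD_M):
    """Angle the two eyes rotate toward each other to fixate at distance_m."""
    return 2.0 * math.atan(ipd_m / (2.0 * distance_m))

def conflict_diopters(virtual_dist_m, screen_dist_m=SCREEN_DIST_M):
    """Gap between where the eyes converge and where they must focus."""
    return abs(1.0 / virtual_dist_m - 1.0 / screen_dist_m)

# A virtual object rendered 0.4 m away: vergence says "near",
# but focus must stay on the screen at 1.5 m.
print(math.degrees(vergence_angle_rad(0.4)))  # ~9 degrees of convergence
print(conflict_diopters(0.4))                 # ~1.8 D mismatch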

The first, Varifocal Virtuality, is a new optical layout for near-eye display. It uses a new transparent holographic back-projection screen to display virtual images that blend seamlessly with the real world. This use of holograms could lead to VR and AR displays that are radically thinner and lighter than today’s headsets.

This demonstration makes use of new research from UC Berkeley’s Banks lab, led by Martin Banks, which offers evidence that our brains use what a photographer would call chromatic aberration (the colored fringes that appear at the edges of an object) to help work out where an image sits in space.

Our demonstration shows how to take advantage of this effect to better orient a user. Virtual objects at distances the user isn’t fixating, and which therefore shouldn’t be in focus, are rendered with a sophisticated simulated defocus blur that accounts for the internal optics of the eye.

So when a user looks at a distant object, it will be in focus, while a nearby object they’re not looking at will appear blurred, just as in the real world. When the user looks at the nearby object, the situation is reversed.
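As a rough picture of how that rendered blur can be sized, a thin-lens model of the eye makes the angular diameter of the retinal blur circle roughly the pupil diameter times the defocus in diopters. The sketch below is illustrative only; the pupil size and distances are assumed values, and it ignores the chromatic aberration cue the actual demo exploits:

# Thin-lens approximation of defocus blur (illustrative, not the demo's renderer).
PUPIL_DIAM_M = 0.004   # assumed ~4 mm pupil

def blur_angle_rad(focus_dist_m, object_dist_m, pupil_diam_m=PUPIL_DIAM_M):
    """Approximate angular diameter of the blur circle for an object at
    object_dist_m while the eye is focused at focus_dist_m."""
    defocus_diopters = abs(1.0 / focus_dist_m - 1.0 / object_dist_m)
    return pupil_diam_m * defocus_diopters

# Fixating a distant object (3 m) leaves a nearby one (0.4 m) visibly blurred...
print(blur_angle_rad(3.0, 0.4))   # ~0.0087 rad
# ...and the situation reverses when fixation shifts to the near object.
print(blur_angle_rad(0.4, 3.0))   # same magnitude, now on the far object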

The second demonstration, Membrane VR, a collaboration between the University of North Carolina, NVIDIA, Saarland University and the Max Planck Institutes, uses a deformable membrane mirror for each eye that, in a commercial system, could be adjusted based on where a gaze tracker detects the user is looking.

The effort, led by David Dunn, a doctoral student at UNC and an NVIDIA intern, allows a user to focus on real-world objects that are nearby or far away while still seeing virtual objects clearly.

For example, a label displaying a person’s name might appear to the user to actually sit just above that person’s head, creating an experience that blends the virtual and real worlds more seamlessly. (To learn more, read the award-winning paper Dunn co-authored on this technique.)
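To make the control idea concrete, here is a toy sketch of the step a commercial system might run each frame: take the fixation distance reported by the gaze tracker and set the varifocal element's optical power so the virtual image lands at that depth. Everything here is an assumption for illustration; the actual membrane-mirror control described in Dunn's paper is more involved.

# Hypothetical gaze-driven varifocal control step (not code from the paper).
MIN_POWER_D, MAX_POWER_D = 0.0, 3.0   # assumed adjustable range, diopters

def target_power_diopters(fixation_dist_m):
    """Place the virtual image at the viewer's fixation depth: power = 1/distance."""
    power = 1.0 / max(fixation_dist_m, 1e-3)   # guard against zero distance
    return min(max(power, MIN_POWER_D), MAX_POWER_D)

print(target_power_diopters(0.4))    # near fixation -> 2.5 D
print(target_power_diopters(10.0))   # far fixation  -> 0.1 D (nearly relaxed)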

New Ideas in Haptics

We’re also showing off two new techniques that use fluid elastomer actuators (small air chambers) to provide haptic feedback, enhancing VR and AR by connecting what you see on your display to what you feel in your hand. Both were created by Cornell University in collaboration with NVIDIA.

One is a prototype VR controller that lets VR users experience tactile feedback while they play, relaying a sense of texture and changing geometry. Its soft skin can safely provide force feedback, as well as simulate different textures and materials.

The second is a controller that changes its shape and feel as you use it. So, a foam sword — the kind you might wave around at a sporting event — feels soft and squishy, yet can transform, in a moment, into a katana that feels longer and firmer in your grip.

We’ve integrated these novel input devices with our VR Funhouse experience. You’ll feel a knock when you whack-a-mole with a mallet in the game, or a kick when you fire at plates in a shooting gallery with antique revolvers.
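As a purely hypothetical illustration of the kind of glue involved, mapping a physics-engine impact to a pressure target for an air-chamber actuator might look like the sketch below. The scaling constants and the interface are invented for illustration, not the Cornell/NVIDIA controllers' actual API:

# Hypothetical mapping from a game impact to an actuator pressure target.
MAX_PRESSURE_KPA = 35.0   # assumed safe upper bound for the soft actuator

def impact_to_pressure_kpa(impulse_ns, stiffness_scale=1.0):
    """Scale a physics impulse (newton-seconds) to a chamber pressure,
    clamped to the assumed safe limit of the soft skin."""
    pressure = 5.0 + 60.0 * stiffness_scale * impulse_ns
    return min(pressure, MAX_PRESSURE_KPA)

print(impact_to_pressure_kpa(0.6, stiffness_scale=1.0))  # mallet hit: clamps at 35 kPa
print(impact_to_pressure_kpa(0.1, stiffness_scale=0.3))  # foam sword: ~6.8 kPa, squishier feel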

Learn More

There is a lot of work left to be done to take ideas like these to market and make VR and AR more comfortable for users. But, from optics to haptics, NVIDIA is committed to solving the industry’s hardest technology problems in order to drive mass adoption of VR and AR.

Come see our latest ideas on display at SIGGRAPH’s Emerging Technologies exhibit. And don’t forget to stop by our booth for demonstrations of how you can put technologies such as AI and VR to work.

The post NVIDIA Inventions Promise to Make Augmented Reality More Comfortable appeared first on The Official NVIDIA Blog.