

Recent Blog Entries

Superior Street is one of Duluth, Minnesota’s oldest and most iconic streets, home to family businesses, shops and cafes.

Superior Street today in downtown Duluth, Minnesota.

Now virtual reality is helping with a long-awaited revitalization project, thanks to the work of LHB, a local architecture, engineering and planning firm.

Everyone from Minnesota’s lieutenant governor to the general public is interested in Superior Street’s redevelopment. Key to gaining support for the project was how LHB used VR to provide an immersive, realistic rendering that displayed the aesthetics, lighting, ambience and sightlines.

For many, it’s difficult to visualize how a finished project will look in real life from 2D drawings of the design. The ability to experience the scene at real-world scale, from every viewpoint, made for far better-informed decision making.

Superior Street revitalized as rendered in VR.

VR also helped streamline the design review process and control costs because potential design issues were spotted before construction started.

For LHB, delivering a high-quality VR experience began by generating designs in high-end design and modeling applications. They used topography information from AutoCAD Civil 3D and modeled details such as curbs and utilities with Autodesk Revit.

To create realistic nighttime views, LHB modeled streetlamps and other light sources with photometric accuracy for each lamp and bulb type, all with the help of NVIDIA Quadro GPUs. The data was rendered in real time using VR applications such as Revizto, then brought together in Fuzor, a turnkey VR platform for the AEC industry.

The Tech Behind the Turnaround

Running Fuzor with NVIDIA Quadro GPUs offers unique benefits, such as access to VR SLI, enabled by NVIDIA VRWorks. With VR SLI, dual Quadro GPUs can each render one eye, dramatically accelerating performance and resulting in a smoother VR experience.

It all adds up to LHB delivering a more compelling experience far earlier than traditional design tools allow.

“Traditional fixed-angle renderings still have their place, but having VR allows our clients to freely explore projects in progress. This forces us to consider materials and other design elements earlier in the process so the VR experience is realistic. This also allows the client to make more informed decisions earlier,” says Dan Stine, BIM administrator for LHB.

“The payoff is client buy-in and spotting potential issues before starting construction, and that’s invaluable,” Stine says.

Read the full case study.

The post How VR Is Helping Revitalize Downtown Duluth appeared first on The Official NVIDIA Blog.

The world’s largest automotive supplier, Bosch, provided a massive stage today for NVIDIA CEO Jen-Hsun Huang to showcase our new AI platform for self-driving cars.

Speaking in the heart of Berlin to several thousand attendees at Bosch Connected World — an annual conference dedicated to the Internet of Things — Huang detailed how deep learning is fueling an AI revolution in the auto industry.

The small AI car supercomputer was unveiled yesterday in the opening keynote address by Bosch CEO Dr. Volkmar Denner, who focused on how his company, which had €73 billion ($77.6 billion) in revenue last year, is pushing deeper into the areas of sensors, software and services.

“I’m so proud to announce that the world’s leading tier-one automotive supplier — the only tier one that supports every car maker in the world — is building an AI car computer for the mass market,” said Huang, speaking in the main theater of the glass-roofed, red-brick exhibition center.

NVIDIA’s Huang and Bosch’s Hoheisel reveal the Bosch AI Car Computer.

“It blows my mind where this industry is going and where this strategy is going,” said Dr. Dirk Hoheisel, who sits on Bosch’s management board, responsible for mobility solutions.

First Adoption of Xavier Technology

The collaboration with Bosch represents the first announced DRIVE PX platform incorporating NVIDIA’s forthcoming Xavier technology. Xavier can process up to 30 trillion deep learning operations a second while drawing just 30 watts of power.
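As a quick sanity check on those figures, the quoted numbers work out to roughly one trillion deep learning operations per second for every watt consumed:

```python
# Back-of-envelope check on the Xavier figures quoted above:
# 30 trillion deep learning operations per second at 30 watts.
ops_per_second = 30e12  # 30 TOPS
power_watts = 30.0

ops_per_watt = ops_per_second / power_watts
print(f"Efficiency: {ops_per_watt:.0e} ops/s per watt")  # 1e+12
```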

That processing power is needed to achieve what the automotive industry calls “Level 4 autonomy,” in which a car can drive on its own, without human intervention. Analysts project that the number of cars with various levels of autonomy will grow to 150 million vehicles by 2025.

NVIDIA’s Huang said his company will deliver technology enabling Level 3 autonomous capabilities (in which a car can drive on its own but still needs a driver to intervene under various conditions) by the end of this year, and Level 4 capabilities by the end of 2018.

Huang noted that a wide range of leading brands are working on autonomous solutions — from traditional carmakers like Audi, Ford and BMW, to new competitors like Tesla, and technology innovators like Waymo, Uber and Baidu.

Such vehicles will require unprecedented levels of computing power, given the profound complexity of self-driving. Hand-coded software can’t possibly anticipate the nearly infinite number of things that can happen on the road, Huang said in his keynote.

Cars that stray from their lanes, objects that fall onto the roadway, rapid shifts in weather conditions, deer that dart across the road. The permutations are endless.

While cars on the road now are capable of detecting vehicles in front of them and braking when needed, the requirements for autonomous driving are dramatically more demanding, Huang said.

Instead, deep learning makes it possible to train a car to drive, and ultimately to perform far better, and more safely, than any human behind the wheel.

“We’ve really supercharged our roadmap to autonomous vehicles,” Huang said. “We’ve dedicated ourselves to build an end-to-end deep learning solution. Nearly everyone using deep learning is using our platform.”

Huang noted that the company’s massive commitment, which began five years ago and has thousands of engineering-years of effort behind it, has put NVIDIA at the center of the AI revolution. It’s working with every significant cloud service provider, with researchers worldwide, and with companies in nearly every sector.

Accelerating the AI Pipeline

Deep learning plays a vital role throughout the entire computational pipeline of a self-driving vehicle, enabling it to grow smarter with experience. The pipeline involves:

  • Detection — understanding the world around the vehicle;
  • Localization — using what’s perceived to create a detailed local map;
  • Occupancy grid — building a real-time 3D environment around the vehicle;
  • Path planning — determining how to proceed along the mapped route;
  • Vehicle dynamics — calculating how to drive smoothly.

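The stages above can be sketched as a single processing loop. Everything here is illustrative: the class and function names are hypothetical stubs for each stage, not NVIDIA's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class Frame:
    """One tick of raw sensor input (hypothetical container)."""
    camera: object = None
    radar: object = None
    lidar: object = None
    ultrasonic: object = None


def detect(frame):
    """Detection: find vehicles, pedestrians and obstacles around the car."""
    return []


def localize(detections):
    """Localization: use what's perceived to place the car on a local map."""
    return (0.0, 0.0, 0.0)  # (x, y, heading)


def build_occupancy_grid(detections, pose):
    """Occupancy grid: mark which parts of the surrounding 3D space are blocked."""
    return {}


def plan_path(grid, pose):
    """Path planning: choose how to proceed along the mapped route."""
    return [pose]


def vehicle_dynamics(path):
    """Vehicle dynamics: turn the planned path into smooth control commands."""
    return {"steer": 0.0, "throttle": 0.0}


def pipeline_step(frame):
    """Run one frame of sensor data through all five stages in order."""
    detections = detect(frame)
    pose = localize(detections)
    grid = build_occupancy_grid(detections, pose)
    path = plan_path(grid, pose)
    return vehicle_dynamics(path)


print(pipeline_step(Frame()))  # {'steer': 0.0, 'throttle': 0.0}
```

Each stage consumes the previous stage's output, which is what lets experience gathered at any one stage (better detections, say) improve everything downstream.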
Consider the processing horsepower required to make sense of the ocean of data streaming in from a car’s array of sensors: cameras, radar, lidar and ultrasonics. This is where deep learning comes in. A deep neural network is first developed and trained in the data center; the NVIDIA DRIVE PX system then runs it in the vehicle, understanding everything happening around the car in real time.

From Cloud to Car

Companies are using the power of the GPU in the cloud as well. NVIDIA HGX-1 is a new standard for AI supercomputing, designed for deep learning in the data center and for use across all major industries.

AI Car Revolution

Many cars on the road today have some basic safety features, known as advanced driver assistance systems (ADAS). These systems are often based on smart cameras and offer basic detection of obstacles and identification of lane markings. These capabilities can help carmakers increase their New Car Assessment Program safety ratings.

While a stepping stone toward safer cars, ADAS is still a long way from a self-driving car, and the amount of processing required for an autonomous vehicle is orders of magnitude greater. Huang put the incremental processing requirement at a factor of at least 50.

And that doesn’t include the addition of an AI co-pilot. Introduced at CES two months ago, NVIDIA’s AI co-pilot technology will act as an AI assistant in the vehicle, as well as provide safety alerts of potential hazards outside the car. By monitoring the driver as well as a full 360 degrees around the car, the system works to keep the occupants of the vehicle safe.

“Of course, our goal someday is that every single car will be autonomous,” Huang said. “But for the path to then, we’ll have AI that will be your co-pilot, will be your guardian, and look out for you.”

Powered by deep learning, AI co-pilot can recognize faces to automatically set specific preferences in the car depending on the driver. The system can also see where the driver is looking, and detect expressions to understand the driver’s state of mind. Combining this information with what is happening around the car enables the AI co-pilot to warn the driver of unseen potential hazards.
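One way to picture how those signals might combine is a fusion step that warns only about hazards the driver's gaze suggests they haven't noticed. This is a toy sketch with hypothetical names and a deliberately simple rule, not NVIDIA's actual co-pilot logic:

```python
def copilot_alerts(driver_gaze, hazards):
    """Warn only about hazards outside the driver's current gaze direction.

    driver_gaze: where the driver is looking ("left", "right", "ahead", ...).
    hazards: dicts with a "kind" and the "direction" they approach from.
    """
    return [
        f"Warning: {h['kind']} approaching from the {h['direction']}"
        for h in hazards
        if h["direction"] != driver_gaze  # driver likely hasn't seen these
    ]


alerts = copilot_alerts(
    driver_gaze="left",
    hazards=[
        {"kind": "cyclist", "direction": "right"},
        {"kind": "truck", "direction": "left"},  # driver is already looking here
    ],
)
print(alerts)  # ['Warning: cyclist approaching from the right']
```

The point of the fusion is the filter: monitoring the driver and the surroundings together lets the system stay quiet about what the driver already sees and speak up about what they don't.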

In addition, the system has the ability to read lips. So even if the radio is cranked up, the car can understand a driver’s instructions.

The post NVIDIA and Bosch Announce AI Self-Driving Car Computer appeared first on The Official NVIDIA Blog.