
3D News

3DVisionLive’s first-ever short-form 3D video contest received 14 entries that showed a great deal of diversity, ranging from video game captures to commercial-style clips to raw footage of pets or people doing cool things (such as bashing each other with swords). During judging we laughed, we cried (okay, maybe not), and we simply scratched our heads…. But seriously: thank you to all who participated, and we hope to see more of your content uploaded to the site for all to...
The submission period for the Fall Photo Contest is now closed, and we are happy to report we’ve received nearly 100 images from our members for consideration. And, once again, we’re opening the judging process to our community as well to help us determine the winners. The full gallery of images may be seen by clicking the link above. Between now and February 10th (11:59 PST), please view all of the images in the gallery and place your votes for the ones you’d like to win by...
With the holidays drawing near, NVIDIA would like to say a quick thank you to all the 3DVisionLive members for sharing so many outstanding 3D images throughout the year and for continuing to provide a supportive and fun environment for 3D enthusiasts. We look forward to seeing what 2014 brings to the site, and with that said, here are a few of our favorite holiday-themed images to add an extra dimension to that holiday spirit. See stereo 3D on photos.3dvisionlive.com See...
Time to break out those 3D video cameras, folks, and show us your inner Peter Jackson – or Cameron or Spielberg or whichever director you admire. 3DVisionLive is pleased to announce its first-ever Short-form 3D Video Contest. And don’t worry, we really don’t expect Orcs or Goblins in your video (but that would be cool). What’s a short-form video, you ask? Well, it’s simple really: short-form videos are generally defined as videos of less than 5 minutes in length – and we’re...
We’re fortunate to be able to host Elysian Fields here on 3DVisionLive for all of you. Winner of a number of accolades, including multiple “Best Animated 3D Short Film” awards, Elysian Fields is hard to watch without being drawn into its world. The short was brought to us by Susan Johnston, Founder/Director of the New Media Film Festival, who was also kind enough to provide us with the following interview with Elysian Fields’ creator, Ina Chavez. Enjoy!

Recent Blog Entries

Superior Street is one of Duluth, Minnesota’s oldest and most iconic streets, home to family businesses, shops and cafes.

Superior Street today in downtown Duluth, Minnesota.

Now virtual reality is helping with a long-awaited revitalization project, thanks to the work of LHB, a local architecture, engineering and planning firm.

Everyone from Minnesota’s lieutenant governor to the general public is interested in Superior Street’s redevelopment. Key to gaining support for the project was how LHB used VR to provide an immersive, realistic rendering that displayed the aesthetics, lighting, ambience and sightlines.

For many, it’s difficult to visualize how the final project will appear in real life when viewing 2D drawings of the design. The ability to experience the scene in a real scale from every viewpoint using VR made for far better informed decision making.

Superior Street revitalized as rendered in VR.

VR also helped streamline the design review process and control costs because potential design issues were spotted before construction started.

For LHB, delivering a high-quality VR experience began by generating designs in high-end design and modeling applications. They used topography information from AutoCAD Civil 3D and modeled details such as curbs and utilities with Autodesk Revit.

To create realistic nighttime views, LHB made streetlamps and other light sources photometrically accurate for each lamp and bulb type — all with the help of NVIDIA Quadro GPUs. The data was rendered in real time using VR applications such as Revizto, then brought together in Fuzor, a turnkey VR platform for the AEC industry.

The Tech Behind the Turnaround

Running Fuzor with NVIDIA Quadro GPUs offers unique benefits, such as access to NVIDIA VRWorks enabled VR SLI. With VR SLI, dual Quadro GPUs can render one eye each — dramatically accelerating performance and resulting in a smoother VR experience.
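
The idea behind VR SLI's per-eye split can be sketched conceptually. This is not VRWorks or NVAPI code — `render_eye` and the GPU IDs are stand-ins — just a minimal illustration of rendering the two stereo views concurrently, one per device, instead of one after the other:

```python
# Conceptual sketch (not actual VRWorks/NVAPI calls): VR SLI assigns each
# eye's render work to its own GPU so both views are produced in parallel.
from concurrent.futures import ThreadPoolExecutor

def render_eye(eye, gpu_id):
    """Stand-in for submitting one eye's draw calls to a specific GPU."""
    # A real renderer would set a GPU affinity mask here and replay the
    # frame's command stream with that eye's view/projection matrix.
    return {"eye": eye, "gpu": gpu_id, "frame": f"{eye}-eye image"}

def render_stereo_frame():
    # One eye per GPU, rendered concurrently rather than sequentially.
    with ThreadPoolExecutor(max_workers=2) as pool:
        left = pool.submit(render_eye, "left", 0)
        right = pool.submit(render_eye, "right", 1)
        return left.result(), right.result()

left, right = render_stereo_frame()
```

With a single GPU the two eye renders would serialize; splitting them across two devices is what yields the near-doubled frame rate described above.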

It all adds up to LHB creating a more compelling experience much earlier than with traditional design tools.

“Traditional fixed-angle renderings still have their place, but having VR allows our clients to freely explore projects in progress. This forces us to consider materials and other design elements earlier in the process so the VR experience is realistic. This also allows the client to make more informed decisions earlier,” says Dan Stine, BIM administrator for LHB.

“The payoff is client buy-in and spotting potential issues before starting construction, and that’s invaluable,” Stine says.

Read the full case study.

The post How VR Is Helping Revitalize Downtown Duluth appeared first on The Official NVIDIA Blog.

The world’s largest automotive supplier, Bosch, provided a massive stage today for NVIDIA CEO Jen-Hsun Huang to showcase our new AI platform for self-driving cars.

Speaking in the heart of Berlin to several thousand attendees at Bosch Connected World — an annual conference dedicated to the Internet of Things — Huang detailed how deep learning is fueling an AI revolution in the auto industry.

The small AI car supercomputer was unveiled yesterday in the opening keynote address by Bosch CEO Dr. Volkmar Denner, who focused on how his company, which had €73 billion ($77.6 billion) in revenue last year, is pushing deeper into the areas of sensors, software and services.

“I’m so proud to announce that the world’s leading tier-one automotive supplier — the only tier one that supports every car maker in the world — is building an AI car computer for the mass market,” said Huang, speaking in the main theater of the glass-roofed, red-brick exhibition center.

NVIDIA’s Huang and Bosch’s Hoheisel reveal the Bosch AI Car Computer.

“It blows my mind where this industry is going and where this strategy is going,” said Dr. Dirk Hoheisel, who sits on Bosch’s management board, responsible for mobility solutions.

First Adoption of Xavier Technology

The collaboration with Bosch represents the first announced DRIVE PX platform incorporating NVIDIA’s forthcoming Xavier technology. Xavier can process up to 30 trillion deep learning operations a second while drawing just 30 watts of power.
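
The efficiency implied by those two figures is worth spelling out — a quick back-of-the-envelope calculation from the numbers quoted above:

```python
# Efficiency implied by the quoted Xavier figures: 30 trillion deep
# learning operations per second within a 30-watt power budget.
deep_learning_ops_per_s = 30e12   # 30 TOPS
power_watts = 30                  # 30 W

ops_per_watt = deep_learning_ops_per_s / power_watts
print(f"{ops_per_watt / 1e12:.0f} TOPS per watt")  # → 1 TOPS per watt
```

That is one trillion deep learning operations per second for every watt drawn — the kind of power efficiency an in-vehicle computer needs.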

That power is needed to achieve what the automotive industry refers to as “Level 4 autonomy,” where a car can drive on its own, without human intervention. The number of cars with various levels of autonomy will grow to a total of 150 million vehicles by 2025, analysts project.

NVIDIA’s Huang said his company will deliver technology enabling Level 3 autonomous capabilities (in which a car can drive on its own but still needs a driver to intervene under various conditions) by the end of this year, and Level 4 capabilities by the end of 2018.

Huang noted that a wide range of leading brands are working on autonomous solutions — from traditional carmakers like Audi, Ford and BMW, to new competitors like Tesla, and technology innovators like Waymo, Uber and Baidu.

Such vehicles will require unprecedented levels of computing power, due to the profound complexity of self-driving. Hand-coded software can’t possibly anticipate the nearly infinite number of things that can happen along the road, Huang said in his keynote.

Cars that stray from their lanes, objects that fall onto the roadway, rapid shifts in weather conditions, deer that dart across the road. The permutations are endless.

While cars on the road now are capable of detecting vehicles in front of them and braking when needed, the requirements for autonomous driving are dramatically more demanding, Huang said.

By contrast, deep learning can enable us to train a car to drive, and ultimately to perform far better — and more safely — than any human could behind the wheel.

“We’ve really supercharged our roadmap to autonomous vehicles,” Huang said. “We’ve dedicated ourselves to build an end-to-end deep learning solution. Nearly everyone using deep learning is using our platform.”

Huang noted that the company’s massive commitment — which started five years ago, with thousands of engineering-years of effort behind it — has put NVIDIA at the center of the AI revolution. It’s working with every significant cloud service provider, with researchers worldwide and with companies in nearly every sector.

Accelerating the AI Pipeline

Deep learning plays a vital role throughout the entire computational pipeline of a self-driving vehicle, enabling it to grow smarter with experience. This involves:

  • Detection — understanding the world around the vehicle;
  • Localization — using what’s perceived to create a detailed local map;
  • Occupancy grid — building a real-time 3D environment around the vehicle;
  • Path planning — determining how to proceed along the mapped route;
  • Vehicle dynamics — calculating how to drive smoothly.
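
The five stages above form one per-frame loop. The skeleton below is a hypothetical illustration — every function name and data shape is invented for clarity, not drawn from NVIDIA's actual DRIVE software — showing how each stage feeds the next:

```python
# Hypothetical pipeline skeleton; stage names follow the list above,
# but all data structures here are toy stand-ins.
def detect(sensor_frame):
    """Detection: find objects in the raw sensor data."""
    return list(sensor_frame["objects"])

def localize(detections, hd_map):
    """Localization: place the vehicle on a detailed local map."""
    return {"pose": hd_map["origin"], "landmarks": detections}

def build_occupancy_grid(localization):
    """Occupancy grid: mark which cells around the vehicle are occupied."""
    grid = [[0] * 5 for _ in range(5)]
    for i, _ in enumerate(localization["landmarks"][:5]):
        grid[i][i] = 1  # toy placement of each detected landmark
    return grid

def plan_path(grid, route):
    """Path planning: keep only route waypoints whose cells are free."""
    return [wp for wp in route if grid[wp[0]][wp[1]] == 0]

def vehicle_dynamics(path):
    """Vehicle dynamics: turn the planned path into smooth controls."""
    return {"steer": 0.0, "throttle": 0.3 if path else 0.0}

def drive_frame(sensor_frame, hd_map, route):
    detections = detect(sensor_frame)
    loc = localize(detections, hd_map)
    grid = build_occupancy_grid(loc)
    path = plan_path(grid, route)
    return vehicle_dynamics(path)
```

In the real system each stage is backed by trained deep neural networks and runs on DRIVE PX hardware; the point here is only the data flow from sensors to control commands.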

Consider the processing horsepower required to make sense of the ocean of data that streams in from a car’s array of sensors, including cameras, radar, lidar and ultrasonics. This is where deep learning comes in. By first developing and training a deep neural network in the data center, the NVIDIA DRIVE PX system becomes able to understand everything happening around the car in real time.

From Cloud to Car

Companies are using the power of the GPU in the cloud as well. NVIDIA HGX-1, the new AI supercomputer standard, is designed for deep learning in the data center and for use across all major industries.

AI Car Revolution

Many cars on the road today have some basic safety features, known as advanced driver assistance systems (ADAS). These systems are often based on smart cameras and offer basic detection of obstacles and identification of lane markings. These capabilities can help carmakers increase their New Car Assessment Program safety ratings.

While a stepping stone to making cars safer, ADAS is a long way from a self-driving car, and the amount of processing required for an autonomous vehicle is orders of magnitude greater. Huang put the additional processing at 50 times greater, at a minimum.

And that doesn’t include the addition of an AI co-pilot. Introduced at CES two months ago, NVIDIA’s AI co-pilot technology will act as an AI assistant in the vehicle, as well as provide safety alerts of potential hazards outside the car. By monitoring the driver as well as a full 360 degrees around the car, the system works to keep the occupants of the vehicle safe.

“Of course, our goal someday is that every single car will be autonomous,” Huang said. “But for the path to then, we’ll have AI that will be your co-pilot, will be your guardian, and look out for you.”

Powered by deep learning, AI co-pilot can recognize faces to automatically set specific preferences in the car depending on the driver. The system can also see where the driver is looking, and detect expressions to understand the driver’s state of mind. Combining this information with what is happening around the car enables the AI co-pilot to warn the driver of unseen potential hazards.
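
Once a face-recognition model has identified the driver, applying preferences reduces to a lookup. The sketch below is purely illustrative — the driver names, settings, and defaults are all invented — showing the last step of that flow, after the deep learning model has done the recognizing:

```python
# Toy illustration of driver-recognition-driven cabin preferences.
# All names and settings here are hypothetical.
PREFERENCES = {
    "alice": {"seat_position": 3, "radio": "jazz", "mirror_angle": 12},
    "bob": {"seat_position": 5, "radio": "news", "mirror_angle": 9},
}

DEFAULTS = {"seat_position": 4, "radio": "off", "mirror_angle": 10}

def apply_driver_preferences(recognized_driver):
    # In the real system, `recognized_driver` comes from a face-recognition
    # DNN; unrecognized drivers fall back to neutral defaults.
    return PREFERENCES.get(recognized_driver, DEFAULTS)
```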

In addition, the system has the ability to read lips. So even if the radio is cranked up, the car can understand a driver’s instructions.

The post NVIDIA and Bosch Announce AI Self-Driving Car Computer appeared first on The Official NVIDIA Blog.