
3D News

For the last few years we’ve worked with the National Stereoscopic Association to support the 3D Digital Showcase photo competition featured at the NSA’s annual conventions. The images from this past year’s showcase are now live for everyone to view. We really enjoy the diversity of images submitted by 3D artists and enthusiasts to this event, and this gallery is certainly no different. You’ll see everything from close-ups of insects to people juggling fire. Simply put,...
In driver 334.89 NVIDIA introduced a new proprietary rendering mode for 3D Vision that enables us to improve the 3D experience for many key DirectX 10 and 11 games. This mode is now called “3D Compatibility Mode”. We have continued to iterate on this feature in driver 344.11, increasing game support and adding some new interface elements. You can get the new driver from the NVIDIA driver download page or via the update option in GeForce Experience. With the release of 344.11, new 3D...
We’re fortunate enough to have another fine 3D video from New Media Film Festival to share with you here on 3DVisionLive—a pop music video from Italy called “The Way,” which you can view here. Even better, New Media Film Festival has provided an interview with one of the co-directors of the video, Edoardo Ballanti, which provides insights on how the video was created and the vision behind it. Enjoy! (Alice Corsi also co-directed the video.) What was the Inspiration behind “...
The Fall Photo Contest received nearly 100 images – thanks to all who entered! The contest called for your best “nature” shots, with the only other requirement being that they had to be true stereo images. Submissions ranged from shots of spiders in gardens to artistic approaches to tasteful nudes. As before, members were invited to vote for the winner by tagging images in the contest gallery as favorites. Without further ado, the winner is: Autumn Goodbye to Summer This...
In driver 334.89 NVIDIA introduced a new proprietary rendering mode for 3D Vision that enables us to improve the 3D experience for many key DirectX 10 and 11 games. This mode is now called “3D Compatibility Mode”. We have continued to iterate on this feature in beta driver 337, increasing game support and adding a toggle key to enable/disable the mode. Games with 3D Compatibility Mode will launch in this mode by default. To change the render mode back to standard 3D Vision...

Recent Blog Entries

Search just got smarter, thanks to AI and NVIDIA GPUs.

Microsoft’s Bing now lets you search for images within images. You can even buy items you find there.

Let’s say you’re a “Fast and Furious” fan, and want to trick out your ride with the gear you’ve seen on the big screen. Or you’re remodeling your living room, and see a photo with a glittering chandelier that would add just the elegance you crave.

Whatever has caught your eye in a photo, just draw a box around it. Bing’s Visual Search then displays photos similar to your selection, shows where to buy the item and, in many cases, what it will cost.

“We want to go way beyond the search box,” said Meenaz Merchant, who leads the Visual Search Group at Bing.

Merchant and his team will demonstrate Visual Search this week in Honolulu at IEEE’s Computer Vision and Pattern Recognition (CVPR) conference, the premier annual computer vision event. If you’re attending the conference, you can also attend a keynote talk on July 23 by Microsoft’s Harry Shum, executive vice president for the company’s Artificial Intelligence and Research Group.

How Bing Knows a Louis Vuitton

This new type of search is for more than shopping. You can upload your own photos or select anything online — an apple pie, a waterfall, a hotel — and Bing Image Search returns similar photos and tags that describe the search engine’s notion of what’s in the picture. Once you select a lookalike image, you may see useful information like recipes for apple pies, the location of the waterfall or the name of the hotel.

Visual Search is powered by deep learning so the more it’s used, the more accurate it will become. For now, the related images sometimes look visually similar but aren’t really the same thing. Search for that sporty yellow dress Emma Stone wore in “La La Land” and you’ll get more yellow dresses, but they’re all evening gowns.
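The “lookalike” behavior described above is typically implemented as nearest-neighbor search over learned image embeddings: a neural network maps each image (or selected region) to a vector, and visually similar images end up with nearby vectors. The sketch below is illustrative only — the random toy embeddings, the index layout and the function names are assumptions for demonstration, not Bing’s actual pipeline.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(query, index, k=3):
    # Rank every indexed image by similarity to the query embedding.
    scored = [(name, cosine_similarity(query, vec)) for name, vec in index.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

# Toy "index": image name -> 128-d embedding. In a real system these
# vectors would come from a trained CNN, not a random generator.
rng = np.random.default_rng(0)
index = {f"img_{i}": rng.normal(size=128) for i in range(100)}

# A query that is a lightly perturbed copy of img_7 should rank img_7 first.
query = index["img_7"] + rng.normal(scale=0.05, size=128)
print(most_similar(query, index, k=3)[0][0])  # prints "img_7"
```

At production scale the linear scan above is replaced by an approximate nearest-neighbor index, but the ranking principle — closest embedding wins — is the same.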

While Bing isn’t the first to introduce search within an image, its advantage, said Merchant, is the vast collection of images from the Bing search index.

“If someone’s holding a Louis Vuitton handbag, we can identify it because we’ve seen thousands of images of these,” he said.

Using Bing’s image-within-image search, you can select a portion of a picture, see similar images and find out what the picture depicts.

The Future of Search

Merchant and his team used images from the Bing search index, along with our GPUs, to train the deep learning algorithm for Visual Search. All of the images had been identified or, in deep learning parlance, labeled.

Researchers provide a detailed technical explanation of how Bing Image Search works and the image understanding technologies behind it in this blog post. In addition to the image search on its website and mobile app, Bing released an API so developers can build Visual Search into their apps.
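For developers, calling a visual-search-style REST API comes down to posting an image along with a subscription key. The sketch below only assembles the request pieces rather than sending them; the endpoint URL and header name follow Microsoft’s Cognitive Services conventions at the time of writing, but treat both as assumptions to verify against the current Bing Visual Search API reference.

```python
# Sketch of preparing a request for a Bing-style Visual Search endpoint.
# The endpoint and header name below are assumptions based on Cognitive
# Services conventions -- check the official API reference before use.
ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/images/visualsearch"

def build_visual_search_request(subscription_key, image_bytes):
    # Returns the pieces an HTTP client such as `requests` would send:
    #   requests.post(endpoint, headers=headers, files=files)
    headers = {"Ocp-Apim-Subscription-Key": subscription_key}
    files = {"image": ("query.jpg", image_bytes)}
    return ENDPOINT, headers, files

endpoint, headers, files = build_visual_search_request("YOUR_KEY", b"\xff\xd8...")
print(endpoint.endswith("/images/visualsearch"))  # prints True
```

The response is JSON describing visually similar images and, where available, shopping and entity information, which an app can render however it likes.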

Merchant noted his team continues work to make searching easier. “We’re working to get better at this,” he said.

Soon, Bing Image Search will help you pick objects without needing to draw a box, Merchant said. He said Bing also plans to connect the objects identified to the Bing Satori and Web graph knowledge repository so when the search engine identifies a celebrity, for example, you can get detailed information about that person. When it’s a product, Bing will show where you can buy it.

“Visual search is a fraction of text search today,” Merchant said. “Now that everyone is carrying a high-resolution camera, it’s going to explode over the next couple of years and maybe even surpass text.”

The post Get the Picture: How AI Helps Bing Search for Images Within Images appeared first on The Official NVIDIA Blog.

Some come to Hawaii with snorkels. Others come with scuba gear. And then there are those who come to dive really deep.

In the days ahead, some of the greatest minds in artificial intelligence will be in Honolulu for the annual Computer Vision and Pattern Recognition (CVPR) conference.

Many attendees were arriving Friday afternoon, and with the blue waters of the Pacific and the abundant pools beckoning from the hotel lobby, one could overhear laments from those unable to partake of such pleasures.

Tropical temptations aside, CVPR is serious business, and the growing importance of the show is evident in the 5,000 researchers, engineers and business leaders gathering here, nearly twice the number of just a year ago.

Despite the idyllic setting, they’ll spend the conference, which kicked off today, sharing research that pushes the frontiers of AI. As AI has moved from an academic specialty to the most powerful technology trend of our era, interest has broadened and once obscure figures now loom as industry celebrities.

So, it’s no surprise that this year’s CVPR is different. Past editions focused on where AI could go in the future. But computer vision has moved beyond future promise to a practical technology that’s begun disrupting industries from intelligent video analytics and self-driving cars to medical imaging and advertising.

CVPR 2017: NVIDIA Research Everywhere You Look

With NVIDIA technologies and partners playing roles in all these worlds, we’ll be visible everywhere at CVPR.

Members of our AI and computer vision research team were among the few selected to present their work at the main conference and workshops. Topics addressed include polarimetric multi-view stereo and dynamic facial analysis.

In addition, the team will run a special tutorial on the Theory and Application of Generative Adversarial Networks.

Other highlights include:

  • On Saturday night, NVIDIA will host a reception to recognize the work coming out of our NVIDIA AI Labs (NVAIL) program, which fuels the efforts of researchers at universities around the world. (We recently shared some of this work in a blog post.) The NVAIL reception is an opportunity for the research community to share its breakthroughs. In doing so, NVAIL partners can gain new perspectives on where to take their research, or dovetail their work with complementary efforts, accelerating future AI breakthroughs.
  • On Tuesday, NVIDIA will sponsor CVPR’s annual Women in Computer Vision dinner and workshop, with a goal of inspiring more women to join the field. With so many career opportunities emerging in computer vision, it’s a great time for women to jump in. The Women in Computer Vision event is a chance for women to network. They’ll also hear from guest speaker Shalini De Mello, a senior research scientist at NVIDIA.
  • NVIDIA’s Inception program has helped hundreds of startups accelerate their development of AI technologies. More than 20 of these companies will be at CVPR to present their research. NVIDIA employees will also have an eye out for cutting-edge research, with a goal of expanding the Inception roster of startups.
  • On Monday, NVIDIA’s Deep Learning Institute, which is on a mission to train 100,000 people in deep learning and neural networks this year, will host a sold-out workshop. For those who weren’t able to secure a spot, the institute offers online labs as an alternative.
Find Us at CVPR 2017

If you can’t get to any of these happenings, don’t despair. Come to the NVIDIA booth on the exhibition floor. We’ll point you in the right direction, whether it’s to find out more about any of our programs or to discuss career opportunities in computer vision. To stay up to date in real time, follow @NvidiaAI.

And even if you don’t need any direction, you can come by and check out some of the amazing technologies we’ll be showing off, including ISAAC (virtual robot simulator), TensorRT (deep learning inference optimizer), DGX Station (AI supercomputer) and our AI Co-Pilot.

Featured image credit: Alan Light, via Flickr.

The post Putting the AI in Hawaii: Surfing the Big Waves of Artificial Intelligence appeared first on The Official NVIDIA Blog.