
Easing Robotics, AI Development the Focus of NVIDIA Keynote at RoboBusiness

Before they can get full value from robotics and artificial intelligence, enterprises need to understand how the technologies are converging and how they can benefit. At RoboBusiness 2018, we’ll separate the hype from the reality and provide suppliers and end users with actionable information. Deepu Talla, vice president and general manager of Autonomous Machines at NVIDIA Corp., will discuss the importance of robotics and AI development to the future of industry.

Talla is responsible for deploying AI in factory robots, commercial drones, and video analytics at NVIDIA. Robotics Business Review recently spoke with Talla, who provided the following preview of his keynote address at RoboBusiness in Santa Clara, Calif.


 

This year, RoboBusiness includes four conferences to make it easier for you to find the information you need most. Whether you are involved in running a robotics business, designing products, or implementing robotics solutions in your company – we have a conference to meet your needs.

 


Q: How did you get into robotics and AI development? What first interested you about the field?

Talla: I’ve always been working at the cutting edge of technology. Back in graduate school, it was computing architecture and computing platforms. I like doing things that “move the needle” and excite me. I started working on mobile computing platforms before smartphones became ubiquitous.

The challenge was how to take desktop and laptop capabilities to a portable form factor. That work eventually led to the smartphone revolution.

Deepu Talla, vice president and general manager of Autonomous Machines at NVIDIA, will give examples of robotics and AI development at RoboBusiness 2018.

I like to work on challenging problems. Artificial intelligence and deep learning are going to transform every industry, similar to how electricity did 100 years ago. We believe the same kind of transformation will come from AI development, and robotics is the ultimate incarnation of AI.

Q: What is the current state of robots using machine learning?

Talla: They’re helping to solve problems in every industry. The ramifications are huge. We need a lot more autonomous machines, whether it’s for agriculture — reducing the amount of pesticides, or picking fruits and vegetables at the right time — or for inspecting bridges and power lines, manufacturing, or optimizing warehouses.

There are plenty of use cases, but all of these robotics applications, like humans, need to do three things. They need to sense and perceive the world, to think and reason about what’s happening, and to act. This happens in a continuous loop.

Step 1 is perception. With cameras, lidar, and radar, artificial intelligence can process data from ubiquitous sensors far more accurately than ever before.

Step 2 is reasoning, where AI development is showing tremendous promise. All of this needs to happen seamlessly and in real time, so we need tools to make software intelligent. How fast can you build these robots?

You need a lot of compute capability and the right computing architecture. NVIDIA’s software development kit provides tools for building robots quickly, rather than everyone programming to the bare metal.

Step 3 is actuation. Some of this is coming out of the research labs, around gripping and manipulation. You can’t necessarily test this in the real world because it’s not safe, the robots are too big, or it takes too long.

You want to train robots in a virtual world, much as fighter jet pilots train in simulators. When you’re confident, you can bring them into the physical world. We’ll look at case studies in my keynote.
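
The sense-think-act cycle Talla outlines can be pictured roughly as the loop below, sketched in Python. The class and method names here are hypothetical placeholders for illustration, not part of any vendor’s SDK.

import time

class Robot:
    """Illustrative sense-think-act loop; sensors, planner, and actuators are stand-ins."""

    def __init__(self, sensors, planner, actuators):
        self.sensors = sensors      # e.g., camera, lidar, radar wrappers with a read() method
        self.planner = planner      # reasoning component with a decide() method
        self.actuators = actuators  # motors, grippers, etc., each with an apply() method

    def step(self):
        # 1. Perceive: gather raw data from every sensor.
        observations = {name: sensor.read() for name, sensor in self.sensors.items()}
        # 2. Reason: decide what to do given the observations.
        action = self.planner.decide(observations)
        # 3. Act: send the chosen command to the actuators.
        for actuator in self.actuators:
            actuator.apply(action)

    def run(self, hz=30):
        # The loop runs continuously, ideally in real time, as Talla notes.
        period = 1.0 / hz
        while True:
            start = time.monotonic()
            self.step()
            time.sleep(max(0.0, period - (time.monotonic() - start)))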

Q: Is there a minimum level of autonomy that we need for the next generation of robots to be useful?

IoT and AI development are connected.

Talla: Think about where we will be in the future, and then work backward from there. Everything that moves will be autonomous in some sense. How much capability is needed will depend on the type of application — an autonomous car is different from an autonomous drone.

Even within a self-driving car, it would behave differently in different situations, such as on the highway or in a school zone.

It’s hard to quantify a minimum level of autonomy. The whole idea is that we want to increase productivity and make things safer. Anything that’s repetitive and can be done through arrays of sensors and computing will make our lives better over time. We’re barely at the beginning.

Q: Which industries could benefit most from AI-enabled robots?

Talla: Certain applications have gotten off the ground with robotics; others are still in the research stage.

We’re moving fast with machine vision for optical inspection in manufacturing, which has traditionally been done with banks of cameras on factory robots. These systems can detect defects better than humans can, and with AI development, accuracy increases significantly, reducing errors and increasing throughput.

Another example, where a lot is actually happening in startups, is last-mile delivery. Many folks are working on self-driving cars. It’s the “Amazon effect,” where everybody wants packages instantaneously, with multiple deliveries a day.

It’s not practical to keep humans in the loop for lots of packages or groceries, and you don’t want traffic congestion. Robots that stay off the street and travel on sidewalks could navigate to apartment complexes.

One example that’s a bit longer-term where fundamental problems need to be solved is anything that deals with robots that are flexible, have actuators to grip objects, and work with humans. There are several parallel problems to be solved for such service robots.

NVIDIA’s Talla spoke about robotics and AI development in the AI track at CES 2018.

Q: What breakthroughs have there been in robotics and AI development since you spoke at our AI track at CES 2018?

Talla: This industry is moving at a rapid pace — in computing, for example. For robots to come into the real world, transform industries, and make them more productive, there are a few things they must do.

First, for safety, we’re doing a lot of computation, and we need to do the same thing in different ways for resiliency, such as with cameras, lidar, ultrasonic, or radar for self-driving cars. We also want to process that sensor data in diverse ways, with both neural networks and traditional computing methods.
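
Talla’s point about diverse redundancy can be sketched simply: run independent perception pipelines on different sensors and act conservatively unless both agree the path is clear. The detector functions below are hypothetical stand-ins for illustration, not real APIs.

def neural_net_detects_obstacle(camera_frame: bytes) -> bool:
    # Placeholder for a learned perception model running on camera data.
    return b"obstacle" in camera_frame

def classical_detects_obstacle(lidar_scan, threshold_m=1.0) -> bool:
    # Placeholder for a traditional geometric check on range readings (meters).
    return min(lidar_scan, default=float("inf")) < threshold_m

def safe_to_proceed(camera_frame, lidar_scan) -> bool:
    # Conservative policy: stop if either independent pipeline sees a hazard.
    return not (neural_net_detects_obstacle(camera_frame)
                or classical_detects_obstacle(lidar_scan))

print(safe_to_proceed(b"clear road ahead", [5.2, 7.9, 12.4]))  # True
print(safe_to_proceed(b"obstacle ahead", [5.2, 0.4, 7.9]))     # False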

The underlying computing needs to change. Jetson Xavier is a platform to power the next generation of robots. It’s an order of magnitude greater in performance than its predecessors.

A lot of prototyping is done on PCs with graphics cards that can’t be taken into production environments. We’re bringing that capability to the palm of your hand. The sheer amount of computing capability has grown.

We’ve also seen new software capabilities and new algorithms that can be plugged into robot brains quickly.

With simulation, we’ve made progress toward training robots in a virtual world.

Is next year going to be greater than this one? I hope so!

Q: We’ve talked before about automation augmenting human capabilities. How can AI help robots work better and safer alongside people?

Talla: Most robots that are deployed today are fenced off. They’re not really interacting with humans or operating around us.

With the amount of compute power now, we can create virtual environments where you can simulate and design these things in a cost-effective and safe manner.

Think about last-mile delivery robots. Not only must they interact with humans, but they also need to stay aware of sidewalks and avoid colliding with people.

These delivery robots don’t have many degrees of freedom, but future robots working alongside humans present those kinds of longer-term challenges. We’re providing the tools for researchers to prototype and develop these robots.

Q: How important are interoperability and the emerging Industrial Internet of Things?

Talla: Some of these machines might be connected all the time, but others won’t be connected to the cloud continuously.

In terms of fusing intelligence with autonomous machines, robots will definitely interact with sensors that are not just on the machines themselves. They could be gathering data from sensors placed elsewhere that communicate with the robots.

The best example is smart cities and traffic cameras, where all of that data needs to be aggregated in a box. It might not have arms or legs, but it’s still a robot that needs to digest data and make real-time decisions.

Q: What does NVIDIA’s Isaac platform offer to the global robotics ecosystem? What sorts of things can the Jetson Xavier platform do?

Talla: NVIDIA Isaac is our overarching robotics platform, and within it are several elements. The first is the computing hardware underneath; that’s Jetson Xavier.

Then on top of that, you need tools for developers. They need a robot SDK, a framework they can build on top of. The Isaac SDK is that framework.

We also have a bunch of algorithms that we’re providing as a baseline for object recognition or simultaneous localization and mapping [SLAM], for example. We call these building blocks Isaac GEMS. Developers can use them as-is, write their own, or build on top of them.

The third element is the Isaac simulator, which sits on top of that and provides the virtual world.
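
Conceptually, the layering Talla describes — reusable building blocks assembled into an application that runs first in simulation and then on embedded hardware — might look something like the sketch below. The class and function names are hypothetical illustrations, not the actual Isaac SDK interfaces.

from dataclasses import dataclass

@dataclass
class BuildingBlock:
    """A reusable algorithm, e.g., object recognition or SLAM; developers can also write their own."""
    name: str

    def process(self, sensor_data: bytes) -> dict:
        # A real block would run a tuned algorithm on the embedded hardware.
        return {"block": self.name, "result": f"processed {len(sensor_data)} bytes"}

class RobotApp:
    """An application assembled from building blocks on top of a framework."""

    def __init__(self, blocks):
        self.blocks = blocks

    def tick(self, sensor_data: bytes, simulated: bool = False) -> dict:
        # The same application code can be exercised in a simulator first,
        # then deployed to the physical robot once it is validated.
        source = "simulator" if simulated else "hardware"
        return {"source": source,
                "outputs": [block.process(sensor_data) for block in self.blocks]}

# Compose stock building blocks, test in simulation, then deploy.
app = RobotApp([BuildingBlock("object_recognition"), BuildingBlock("slam")])
print(app.tick(b"\x00" * 1024, simulated=True))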

It’s a journey, not a destination — we’ll build what the developers ask for. Third parties are also building apps on the platform that developers can leverage.

The first version of Isaac at launch will have some level of functionality. Over time, we’ll upgrade the hardware, add more APIs, create more GEMS algorithms or building blocks. The simulator will add more scenarios.

The initiative will start with an indoor navigation type of scenario, but over time, that can be generalized to other scenarios where you can simulate all sorts of environments.

Q: Speaking of developers, what skills are most in demand? How can business, academia, and governments work together to fill that need?

Talla: During the past 30 years, I grew up programming and building computers. In the next 20 to 30 years, the kids will not be programming computers; they’re going to be teaching and building robots.

We need to build out the infrastructure and tools to make it easier. That’s why we’re providing the Jetson developer kit.

On top of that, we’re offering vertical platforms like Isaac to teach not only developers at companies and startups, but also to university and high school students.

Q: What are you hoping to see at RoboBusiness 2018, and what do you think attendees can learn?

Talla: Looking at the program and the breadth of industries represented, it’s amazing. I think of robotics both as a vertical and as a horizontal.

Just as AI is going to transform every industry, I can also think of robotics transforming agriculture, healthcare, service, and manufacturing.

I think in the future, everything is going to be robotics, and I think RoboBusiness will be a fun place to see that.

Q: What’s on your wish list for NVIDIA and robotics and AI development? What would you like to see them be able to do next?

Talla: We’re trying to provide the instruments and tools for the whole world of robotics. Everybody is going to be a robot teacher and programmer. We want to create the tools to jump-start the future.

My wish list is to make the Isaac initiative serve the needs of all developers.

Q: What’s your favorite fictional robot or AI?

Talla: The first thing that comes to mind is that the people close to me think I’m a robot.

Beyond that, the first robot I ever saw, back in the early ’80s, was in the TV series Giant Robo. As a kid growing up, the takeaway for me was that robots were cool. They were saving humanity, doing things that people couldn’t do.
