Universal Robots at Autodesk for human-robot collaboration in construction

At Autodesk’s Robotics Lab in San Francisco, Universal Robots cobots are proving it might be possible to automate portions of the construction industry.

The projects span human-robot interactions, machine learning, drawing, and smart assembly systems.

Some say robots are going to take over, and humans will have nothing to do. “But we wanted to explore the possibility of humans and robots working together to accomplish things that neither could alone,” says Heather Kerrick, senior research engineer at Autodesk’s Robotics Lab.

So her lab recently had collaborative robots and conference attendees build a pavilion out of raw bamboo and fiber string. Raw bamboo is an uneven, flexible material that varies in length and width. “At first we weren’t sure of the extent to which we could work with the robot … or help it understand the uncertainty and variability we were giving it,” explains Kerrick.

In fact, this question is core to the construction industry. “Manufacturing supply chains have much smaller tolerances … but in construction, tolerances are pretty broad. So we gave the robot sensors and decision-making abilities to act accordingly.”

The HIVE pavilion included winding stations in the lobby of the Autodesk University conference. At each station, attendees fastened three random pieces of bamboo onto a Universal Robots arm, which moved to hook fiber onto the bamboo tips, creating a unique, tumbleweed-like tensegrity element.

The Autodesk HIVE pavilion let Autodesk University conference attendees experience a seamless integration of robotic manufacturing, wearables, RFID tracking, and embedded intelligence. The HIVE project was a collaboration between the Autodesk Robotics Lab and the ICD at the University of Stuttgart.

The UR robots made precise movements and measurements that would’ve been difficult for a human to do onsite. No humans needed measuring tools or equipment, either; they simply went to the robot, got the needed part, and then took it to the construction site.

Collaboration between inexperienced personnel and a safe robot

Building the pavilion involved close collaboration between the conference attendees and the Universal Robots’ cobots. Force-limiting safety features stop the robot arms if they encounter obstacles.

“People with robotics experience were impressed with what we did with sensors, while people who had never worked with robotics took it for granted,” says Kerrick.

“Our setup included robot moves based on real-time sensor data, so the chance of the robot doing something unexpected is really high,” explains Kerrick. “But we wanted to engage people without experience with robots … providing them with a safe and fun experience,” she says. A larger industrial robot wouldn’t have engaged with the public in the same way. “With the Universal Robots cobots, we were able to be a little more daring … because we could trust that the robot wouldn’t break itself or pose a danger to others.”

Elsewhere at Autodesk, a cobot makes a cameo and demonstrates on-location feasibility

The ability to operate in an open space without safety guarding also landed the UR10 robot a cameo in Artoo in Love — a viral short film by Autodesk research engineer Evan Atherton. The film depicts an R2-D2 robot in a San Francisco park falling in love with a mailbox that it mistakes for another robot. A UR10 draws a portrait of the loving couple.

Of course, industrial robots aren’t often operated on rugged terrain.

“We had to bring a generator for filming, which was an interesting challenge,” explains Atherton. But he and his team calibrated the robot and wrote a program directing the arm to follow the paths of a vector drawing projected onto a canvas. “The UR10 was small, mobile, and safe for the job. We brought it onsite in a Pelican case. A traditional robot would’ve necessitated a forklift and a safety cage — so that never would’ve worked,” he says.
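The drawing program itself isn’t published, but the path-following idea Atherton describes can be sketched roughly: sample each path of the vector drawing as 2D points, map them onto the calibrated canvas plane in the robot’s base frame, and emit URScript linear moves. Everything below (the calibration numbers, tool orientation, and helper names) is an assumption for illustration, not Autodesk’s code.

```python
# Rough sketch: turn 2D vector-drawing paths into URScript linear moves,
# assuming the canvas plane has already been calibrated in the robot base frame.
# All numbers here (origin, scale, tool orientation) are placeholders.

CANVAS_ORIGIN = (0.40, -0.20, 0.10)   # canvas corner in the base frame, meters
MM_TO_M = 0.001                       # drawing coordinates are in millimeters
TOOL_ROT = "0.0, 3.14, 0.0"           # fixed tool orientation facing the canvas

def point_to_pose(x_mm: float, y_mm: float) -> str:
    """Map a 2D drawing point (mm) to a URScript pose on the canvas plane."""
    px = CANVAS_ORIGIN[0] + x_mm * MM_TO_M
    py = CANVAS_ORIGIN[1] + y_mm * MM_TO_M
    pz = CANVAS_ORIGIN[2]
    return f"p[{px:.4f}, {py:.4f}, {pz:.4f}, {TOOL_ROT}]"

def paths_to_urscript(paths) -> str:
    """Build one URScript program that traces each polyline with linear moves."""
    lines = ["def draw():"]
    for path in paths:
        for x, y in path:
            lines.append(f"  movel({point_to_pose(x, y)}, a=0.2, v=0.05)")
    lines.append("end")
    return "\n".join(lines) + "\n"

# Example: a small square sampled from a vector drawing, printed rather than sent.
print(paths_to_urscript([[(0, 0), (50, 0), (50, 50), (0, 50), (0, 0)]]))
```

Pen lifts between separate paths and the calibration step itself are omitted for brevity; in practice the generated script would be streamed to the controller over TCP, as described later in the article.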

Such on-location work (even on rugged terrain) is applicable to construction sites. “You don’t see many robots in construction today, as it isn’t feasible in these settings to have industrial robots that need to stay inside cages,” explains Atherton.

But the Autodesk research team recently fitted a UR cobot with a router, camera, projector, and machine-learning software to let it recognize human gestures and voice commands. Now the UR10 can be rolled up to a piece of drywall and project the image of a future outlet … once the projected outlet is positioned correctly, personnel can use a voice command to tell the UR10 to go ahead and cut it out. “We can put the robot on a cart and roll it around construction sites to help,” says Atherton.

Autodesk built voice-enabling software to let personnel simply tell the UR10 when to cut out the drywall.
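The voice software isn’t described in detail, but the interaction it enables can be sketched as a simple gated loop: keep listening until an explicit confirmation phrase is heard, then dispatch the pre-programmed cutting routine to the arm. Both helper functions below are hypothetical placeholders standing in for the speech recognizer and the robot-side routine.

```python
# Sketch of a voice-gated cut command; listen_for_command() and
# run_cut_program() are hypothetical stand-ins, not Autodesk's API.

def listen_for_command() -> str:
    """Placeholder recognizer: a real system would return the transcribed
    spoken phrase; here it simply reads from the keyboard."""
    return input("command> ").strip().lower()

def run_cut_program() -> None:
    """Placeholder: would stream the outlet-cutting routine to the UR10."""
    print("cutting projected outlet outline...")

def wait_for_cut_confirmation() -> None:
    # Ignore everything until the operator explicitly says the keyword,
    # so the router never starts on a misheard phrase.
    while True:
        phrase = listen_for_command()
        if phrase == "cut it out":
            run_cut_program()
            return
        print(f"heard '{phrase}', waiting for 'cut it out'")

if __name__ == "__main__":
    wait_for_cut_confirmation()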

Hard-coded robots yield to machine learning for bin picking and more

Another construction industry challenge now addressed in Autodesk’s research with the UR robots is smart assembly systems. Autodesk software architect Yotto Koga explains that assembly systems of today tend to be hard-coded and brittle. “They’re engineered so parts coming down the line must be in an exact position … and need tooling and fixtures customized to specific parts for the robot to assemble them,” says Koga.

The UR robots’ open architecture makes it simple to stream commands. Through it, researchers had a UR10 make a drawing by following the paths of a vector drawing projected onto a canvas.

That’s costly and problematic when things become misaligned, as fixtures can be damaged … and sometimes parts need to be swapped out. So Autodesk used UR cobots to investigate on-the-fly learning for flexible assembly. Now their Brick-Bot grasps Lego blocks, recognizing and handling more than 10,000 different bricks with very tight tolerances.

Brick-Bot executes bin picking, regrasping, and placement tasks using vision guidance … easily handling Lego parts in a jumble of sizes and colors. If a brick is grasped in the wrong position for placement, the UR10 performs a visual survey, then repositions and regrasps the brick until it sits correctly in the gripper. A second cobot, a UR5, uses vision to assist with final placement.
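The article doesn’t include Autodesk’s code, but the regrasp behavior it describes amounts to a small control loop: pick, inspect the grasp with the camera, regrasp until the brick sits correctly, then hand off for placement. The function names, data shape, and retry budget below are placeholders under that assumption.

```python
# Sketch of the pick / inspect / regrasp loop described above.
# pick_from_bin(), inspect_grasp(), regrasp(), and place() are hypothetical
# wrappers around the robot and vision system, not Autodesk's API.
from dataclasses import dataclass

@dataclass
class GraspCheck:
    ok: bool              # brick is seated correctly in the gripper
    offset_mm: float      # how far the brick sits from the expected pose

MAX_RETRIES = 5           # arbitrary retry budget

def bin_pick_and_place(pick_from_bin, inspect_grasp, regrasp, place) -> bool:
    """Pick a brick, correct the grasp with vision, then hand off for placement."""
    pick_from_bin()                         # UR10 grabs a brick from the jumble
    for _ in range(MAX_RETRIES):
        check: GraspCheck = inspect_grasp() # visual survey of the grasped brick
        if check.ok:
            place()                         # UR5 assists final placement with vision
            return True
        regrasp(check.offset_mm)            # set the brick down and grab it again
    return False                            # give up after the retry budget
```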

“Next we’ll actually start assembling designs — for example, a house out of Legos or a toy giraffe — and have the robot automatically build it,” explains Koga.

The UR cobots’ open APIs allow low-level control of the robots. For the Autodesk project, commands were streamed over a TCP connection so programmers could send code directly and bypass the robot’s own operating system.
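As a rough illustration of that kind of streaming (not Autodesk’s actual code): a UR controller’s secondary client interface executes URScript text sent to it over a plain TCP socket on port 30002, so a single move can be issued from a few lines of Python. The robot’s IP address below is a placeholder.

```python
# Minimal sketch: stream a URScript command to a UR controller over TCP.
# The secondary client interface (port 30002) executes URScript sent as text.
import socket

ROBOT_IP = "192.168.1.10"   # placeholder controller address
PORT = 30002                # UR secondary interface

# A single joint move expressed as one URScript string (angles in radians).
command = "movej([0.0, -1.57, 1.57, -1.57, -1.57, 0.0], a=1.0, v=0.5)\n"

with socket.create_connection((ROBOT_IP, PORT), timeout=5) as s:
    s.sendall(command.encode("utf-8"))
```

Larger programs can be sent the same way by wrapping the commands in a URScript function (def ... end), as in the drawing sketch earlier in the article.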

“We chose Universal Robots in part because it’s safe to work around. I connect the robot to my laptop and work next to it — and quickly iterate through experiments without worrying about safety protocols slowing things down,” says Koga.

The HIVE project benefited from the open architecture as well. “Building the HIVE meant working in different coding languages and environments across teams and devices. We were able to simplify all of the commands into a single string that we could send to the robot,” adds Kerrick. With larger industrial robots, there are often extra steps or extra software needed to sidestep the native controls built into the robot. The UR scripting language is also simple to learn.

If a high stack of bricks starts leaning over, the UR5 can swoop down with a camera, detect the leaning angle, and tell the UR10 to fix it by pushing the bricks back in position.

For more information, visit www.universal-robots.com or Autodesk’s www.todayinthelab.com.

