Uncaged industrial robots need advanced sensors, AI for work with humans
#5307 | Tingting Zhang (Keymaster)
The next wave of industrial robots will be uncaged: freed from their protective enclosures to collaborate safely with human workers, as long as sensors, compute, and AI align to deliver proactive safety.
Whether in a manufacturing setting or a warehouse, robots are becoming more collaborative thanks to advances in sensors and artificial intelligence (AI), both deemed critical to robot capabilities, productivity, and functional safety.
More robust automation empowered by AI, coupled with a labor shortage that is not going away, creates a big opportunity for more mobile, purpose-built robots in manufacturing and logistics.
The market for collaborative robots was pegged at $1.1 billion in 2022 with the potential to grow to $9.2 billion by 2028, according to one report, as automation takes on a new sense of urgency to offset a shortage of workers by taking on more complex tasks and improving safety.
While repeatable tasks for robots, such as welding, have been common in automotive manufacturing for some time, repeatability becomes a challenge in many use cases, such as parcel handling, or in any setting where the robot must share the space with people and other robots.
If industrial robots are to continue to take on greater roles in factories and warehouses, advances must be made so they can operate more autonomously, and safely.
Every robot needs a brain
No matter the environment, AI enables industrial robots to optimize task performance using real-time information captured by computer vision and sensors. Mix in machine learning and large data sets, and today’s robots can better navigate their environments autonomously. A familiar example is a Roomba vacuum’s ability to avoid doggy doo.
GPU maker Nvidia sees manufacturing and warehouse logistics as the sweet spot for many of its platforms to advance collaborative robots. “We want to help bring those robots out of the cage and move them closer to work in collaboration with their human partners,” Gerard Andrews, senior product marketing manager focused on the robotics developer community, told Fierce Electronics. That ability requires not only awareness of their surroundings through perception technologies such as sensors, but also a “brain” that makes sense of all the inputs so the robot can safely execute its assigned task, he said.
Nvidia’s Isaac Cortex, introduced last year, is a framework for programming collaborative robots so they can understand their environment while they execute a task. Andrews said earlier iterations of industrial robots only had to do simple tasks such as moving an item from one location to another. “They weren’t keeping track of the environment and obstacles and human partners and all of these things as they executed the tasks.”
They were also siloed off from people by protective cages erected on the shop floor: not to keep the robot in, but to keep people out and safe, he said. “If you want robots to be able to do more, instead of just being in a cage doing one thing, then they need to be kind of taking responsibility for the safety.” The onus has traditionally been on people to stay away from the robots.
Freeing robots from cages to roam a warehouse makes functional safety more complex. It’s still early days, Andrews said, but as collaborative robots become more integrated into factories and warehouses, some concepts are becoming clear: “You have reactive and proactive safety.” In a typical reactive scenario, a human comes within three meters of a robot’s safe operating zone and the robot yields to that person, he said. In a proactive scenario, a robot in aisle 3 of a warehouse knows there are six people in aisle 9 and adjusts its route based on where people are located anywhere in the warehouse. The latter scenario depends more on cameras throughout the warehouse guiding the robot’s behavior than on sensor technology on the robot itself.
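A minimal sketch of the two safety modes, assuming hypothetical data structures: the function names, aisle numbering, and camera-derived people counts are all illustrative, with only the three-meter reactive threshold taken from the example above.

```python
import math

SAFE_RADIUS_M = 3.0  # reactive threshold: yield when a human is within 3 m


def reactive_yield(robot_xy, human_xy, radius=SAFE_RADIUS_M):
    """Reactive safety: yield if a human enters the robot's safe operating zone."""
    dx = robot_xy[0] - human_xy[0]
    dy = robot_xy[1] - human_xy[1]
    return math.hypot(dx, dy) < radius


def proactive_route(candidate_aisles, people_per_aisle):
    """Proactive safety: choose the candidate aisle with the fewest people,
    based on facility-wide camera counts rather than onboard sensing."""
    return min(candidate_aisles, key=lambda a: people_per_aisle.get(a, 0))


# Reactive: a human 2 m away is inside the 3 m zone, so the robot yields.
print(reactive_yield((0.0, 0.0), (2.0, 0.0)))  # True
# Proactive: aisle 9 has six people, aisle 3 is clear, so route via aisle 3.
print(proactive_route([3, 9], {9: 6, 3: 0}))   # 3
```

The design difference the sketch highlights is where the data comes from: `reactive_yield` needs only the robot’s own sensing, while `proactive_route` consumes observations the robot could never make itself.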
Both scenarios illustrate the intersection of software, AI, and hardware, including sensors. Andrews said many mobile robot platforms use 2D lidar technologies. Last year Nvidia introduced Isaac Nova Orin, a compute and sensor platform that includes lidar, multiple cameras, and ultrasonic sensors. He said it is suitable for unstructured environments such as warehouses and draws from Nvidia’s work with autonomous vehicles.
Sensors enable robots to see more
The sophistication of sensors is crucial because they are what drives robotics, no matter the purpose of the robot, according to Paul Drysch, founder and CEO of PreAct Technologies. The cost of sensors is also coming down, meaning “a lot of use cases become doable.” Industrial robots are an order of magnitude easier than autonomous vehicles on the road, he said, because they operate in a more predictable environment: you’re not going to have oddball scenarios such as wind blowing a tree onto a road and causing every car to swerve.
Autonomous vehicles require long-range lidar, but PreAct’s focus is on near-field lidar, Drysch said. “In factory automation you generally don’t even need to see beyond three meters, max five meters. If you’re going at one, two, even three miles an hour, that’s still plenty of time for the robot to stop or change course.” PreAct’s lidar technology is also software-defined, Drysch said, with an architecture based on modulated waveforms that can be optimized for different applications via over-the-air updates, along with an application stack that resides on the edge. This flexibility means Tier 1 suppliers and OEMs can package one sensor for multiple use cases. Along with the use of off-the-shelf components, he said, this flexibility lowers costs.
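A quick back-of-envelope check of that claim, using the standard mph-to-m/s conversion; the function name and the 3 m range are illustrative, taken from the speeds and ranges Drysch cites.

```python
MPH_TO_MS = 0.44704  # exact conversion: miles per hour to meters per second


def time_to_obstacle(range_m, speed_mph):
    """Seconds before the robot reaches an obstacle first detected at range_m,
    assuming constant speed and no braking (a worst-case simplification)."""
    return range_m / (speed_mph * MPH_TO_MS)


for mph in (1, 2, 3):
    t = time_to_obstacle(3.0, mph)
    print(f"{mph} mph: {t:.1f} s to cover 3 m")
```

Even at 3 mph (about 1.34 m/s), the robot has over two seconds before reaching an obstacle first seen at three meters, which supports the point that near-field sensing leaves ample time to stop or change course.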
The company typically supports use cases where all the compute is done centrally in a cloud data center based on the sensor input, with commands sent back to the robot, but Drysch said there are some instances where a customer needs computation to happen within the robot so it can respond within milliseconds. “They’ll engage us to help develop an algorithm that sits within the lidar and can make those rapid-fire decisions.”
Even as various enabling technologies evolve, Drysch said the uses for robots tend to be case specific, and robots are increasingly traveling faster, hence becoming more productive. “They could be going easily three or four miles per hour. It doesn’t sound like a big deal, but that’s one third or one quarter of the time that it used to take to get parts from one spot to another.” He said robots are also getting more accurate and less likely to get stuck or run into anything. “Things like that will improve as the performance of the sensors gets better.”
Drysch expects sensor improvements to contribute to a big second wave of robots that will be built by many different companies in different niches, including startups. He is quite bullish on growth potential for robotics in the next couple of years, despite layoffs in the broader technology sector in recent months.
Those robots, and the startups that make them, could come out of Pittsburgh, a city with a history not only in the steel business but in robotics, too. Joel Reed, executive director of the Pittsburgh Robotics Network, said the city has been at the forefront of developing autonomous solutions and AI since the late sixties. The Robotics Institute (RI) at Carnegie Mellon University was launched in 1979 as the first robotics department at any U.S. university. Now the city is home to many startups, and large companies are setting up shop, he said. More than 100 companies in the robotics segment operate in Pittsburgh.
The Pittsburgh Robotics Network along with Innovation Works and other regional partners recently launched the Robotics Factory, a set of inter-related programs to create, accelerate and scale robotics startups in the Pittsburgh region. The accelerator program is open for startup companies in robotics, robotics-adjacent sectors, and other advanced technologies that are or can be related to robotics.
Reed said the Robotics Factory is part of a broader initiative to strengthen the region’s robotics and AI cluster. Aside from accelerating startup companies through funding and mentorship, it will help existing companies scale and develop the capacities necessary for what is predominantly a hardware business, he said.
Industrial robotics innovation has many moving pieces
The robotics ecosystem in Pittsburgh covers more than a dozen industry verticals, including construction robots and agriculture logistics. In terms of expertise, companies can be segmented into those that help enable autonomy (software and AI), those building hardware (everything from sensors to motors), and integrated systems, where integrators pull everything together. “That’s really where we’re going to start seeing growth over the next couple years,” Reed said.
This growth will put robotics far ahead of the days of robot arms inside protective cages on manufacturing floors. “They provided a high degree of precision and automation, but there wasn’t a lot of intelligence in those machines,” Reed said. Now there’s a much higher level of smarts and capabilities. “The biggest thing that has happened in the last five years is that you now see collaborative features where those machines could work side by side with humans.”
While collaboration opens a whole new range of applications, it’s not about replacing humans completely, Reed said, although more automation is necessary given labor shortages that are not expected to go away. “This is a problem that every industry is dealing with right now. Businesses have tried to figure out how to remain profitable and to continue to scale their operations with the declining workforce.”
The real economic value of collaborative robots is when you pair them with people, Reed said. “The industry now is pointedly working on leveraging human talent and creating more capacity for existing workforces,” he said. “The goal is not to replace humans… it’s to be able to do more with less.”
And it’s an incredibly complex industry. Software, machine learning, and AI platforms are driving much of the development, putting pressure on compute as well as on the sensitivity and capability of sensors. These work alongside many mechanical and electrical components: motors, electric drives, motion controllers, actuators, and other mechanical subsystems. “All of these components are coming together to help advance what’s possible,” Reed said. “The ultimate cost of these hardware platforms needs to come down.”
Nvidia’s Andrews is optimistic that compute-wise, the platforms are ready, as are the sensor suites, and he’s excited for the potential of a hybrid category of industrial robots that are mobile and can operate with robust manipulation capabilities. “That opens up all sorts of use cases on the manufacturing side and on the logistics side.”
By: Gary Hilson