Autonomous Mobile Robot Conference: Insight into a Rapidly Expanding and Evolving Ecosystem

The Robotic Industries Association (RIA) hosted its virtual Autonomous Mobile Robot Conference earlier this week (October 26 – 27). I personally kicked off research on the global mobile warehouse robotics market earlier this month and therefore chose to attend conference sessions that provided insights on the broader context of mobile warehouse robotics, namely the concepts of ecosystems, integration, and interoperability.

Autonomous mobile robot (AMR) interoperability found its way into many sessions, but it was the main topic of the presentation by Aaron Prather, Senior Advisor, Technology Planning and Research at FedEx. Aaron stressed the importance of interoperability guidelines for robotics industry growth and shared a couple of examples of organizations, such as Mass Robotics, that are currently addressing interoperability. He also announced a new project team with members from FedEx, Siemens, Yaskawa, Waypoint Robotics, and the University of Memphis, focused on enabling cross-vendor AMR communication and management to facilitate further adoption of AMRs. Currently, most AMR providers use their own proprietary robot fleet management software, which makes it difficult for end users to operate a mixed fleet of robots. But as the use cases for robotics expand, the interoperability of mixed fleets will become increasingly relevant to safe and productive work environments. Finally, Aaron noted a separate initiative: an RIA interoperability task force with a mission to survey the current landscape of interoperability initiatives and report its findings back to the RIA with suggested next steps. The task force will focus its efforts on four topic areas – fleet management, robot-to-robot communications, robot-to-cloud communications, and battery management systems (to participate, contact
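To make the mixed-fleet problem concrete, here is a minimal sketch of the kind of vendor-neutral abstraction a cross-vendor fleet manager implies. All class and method names here are hypothetical illustrations of the idea, not part of any published standard or vendor API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class RobotStatus:
    """Common status record, regardless of which vendor made the robot."""
    robot_id: str
    vendor: str
    battery_pct: float
    position: tuple  # (x, y) in facility coordinates

class FleetAdapter(ABC):
    """Vendor-specific adapter that translates a proprietary fleet API
    into one common interface a mixed-fleet manager can consume."""

    @abstractmethod
    def get_status(self) -> list[RobotStatus]: ...

    @abstractmethod
    def dispatch(self, robot_id: str, destination: tuple) -> bool: ...

class MixedFleetManager:
    """Operates robots from multiple vendors through their adapters."""

    def __init__(self, adapters: list[FleetAdapter]):
        self.adapters = adapters

    def all_statuses(self) -> list[RobotStatus]:
        # Flatten per-vendor status lists into one fleet-wide view.
        return [s for a in self.adapters for s in a.get_status()]
```

The adapter pattern is one plausible shape for such a guideline: each vendor keeps its proprietary software, and interoperability lives in a thin translation layer.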

A unifying architecture as the mechanism to scale AMR adoption over the next five years was the theme of Hans Lee's presentation, Crossing the [Robotics] Chasm. Hans is the CTO of Freedom Robotics, a provider of software infrastructure for modern robotics companies. Freedom Robotics' software includes capabilities for robot fleet monitoring, remote control and teleoperation tools, management dashboards, and cloud-based analytics. During the presentation, Hans framed his premise with the adage that systems complexity is exponential. He then walked through six hurdles that are impeding AMR adoption and offered solutions for addressing each. Most notably, Hans suggested that non-scalable deployments could be addressed by designing for scale up front through the use of a unified architecture. He included screenshots of user interfaces for various personas, including a dashboard view and a remote camera view of a warehouse from which a teleoperator can take over control of a robot. Also included was a framework for measuring and quantifying performance with metrics such as uptime, distance traveled, and idle time. I plan to learn more about the Freedom Robotics solution, as I see a valuable role for interoperability, whether through the creation of robust standards, a unifying architecture such as the one offered by Freedom Robotics, or a platform like the one provided by SVT Robotics.
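The metrics named in the talk (uptime, distance traveled, idle time) can all be derived from a time-ordered pose log. The sketch below shows one way to compute them; the data shape and function are my own illustration, not Freedom Robotics' implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class PoseSample:
    t: float      # seconds since shift start
    x: float      # meters, facility frame
    y: float
    moving: bool  # robot reported motion at this sample

def fleet_metrics(samples: list[PoseSample], shift_seconds: float) -> dict:
    """Compute illustrative performance metrics from one robot's pose log."""
    # Distance: sum of straight-line hops between consecutive samples.
    distance = sum(
        math.hypot(b.x - a.x, b.y - a.y)
        for a, b in zip(samples, samples[1:])
    )
    # Attribute each inter-sample gap to the earlier sample's motion state.
    moving_time = sum(
        b.t - a.t for a, b in zip(samples, samples[1:]) if a.moving
    )
    observed = samples[-1].t - samples[0].t if samples else 0.0
    return {
        "distance_m": distance,
        "idle_s": observed - moving_time,
        "uptime_pct": 100.0 * observed / shift_seconds if shift_seconds else 0.0,
    }
```

Even this toy version shows why a unified architecture helps: once every robot reports poses in a common format, one metrics pipeline serves the whole mixed fleet.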

The MiR AI Camera is an innovative complementary technology that is poised to take MiR robotic workflows to the next level. Lourenco Castro, an AI developer at Mobile Industrial Robots (MiR), presented the value proposition for this novel technology. The stationary camera and its AI software perform object detection and classification and communicate that information to MiR robots, with the goal of improving robot efficiency. A customer that installs the AI Camera is effectively retrofitting object detection capabilities onto its existing MiR robots; the camera is the equivalent of adding an additional sensor to each robot, and a new level of intelligence to the MiR fleet management system. Using the camera's object detections, robots can be informed ahead of time of blocked routes that are likely to remain blocked (inanimate objects) and reroute, or of blockages that are only temporary (a standing person) and continue along their initial path. Perhaps more novel, the automatic mission calling feature can detect a crate or similar object and call a robot to pick it up: the fleet management system automatically calls the right mission, or a load-compatible robot, based on input from the AI Camera.
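The reroute-versus-continue behavior described above reduces to a simple classification-driven decision. Here is a minimal sketch of that logic under my own assumptions; the function and enum are hypothetical and do not represent MiR's actual software.

```python
from enum import Enum

class Obstacle(Enum):
    INANIMATE = "inanimate"  # pallet, crate: blockage likely to persist
    PERSON = "person"        # likely to move on shortly

def route_decision(detections: list[Obstacle]) -> str:
    """Decide whether a robot should reroute around a reported blockage
    or continue on its initial path (illustrative logic only)."""
    if any(d is Obstacle.INANIMATE for d in detections):
        return "reroute"   # blockage expected to remain; replan now
    if any(d is Obstacle.PERSON for d in detections):
        return "continue"  # temporary blockage; keep the initial path
    return "continue"      # nothing detected; path is clear
```

The value of the stationary camera is that this decision can be made before the robot reaches the blockage, rather than after its onboard sensors discover it.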

These three sessions were just a sampling of the content available during the conference. For additional information or session content, go to the RIA Autonomous Mobile Robot Conference website.
