Smarter, more autonomous robots have been developed over the past five years thanks to advances in mobile computing, sensors, and AI. Many of these robots are now being deployed to assist in the fight against COVID-19, whether to help “flatten the curve” of cases or to augment human workers at companies providing essential services.
Over the last two decades, we’ve seen the evolution of hardware virtualization that resulted in the cloud as we know it today. It started with Virtual Machines and then expanded to include Software-Defined Networking (SDN) and Software-Defined Storage (SDS).
Software-defined networking is “an approach to networking that uses software-based controllers or application programming interfaces (APIs) to direct traffic on the network and communicate with the underlying hardware infrastructure.” (Source: VMware) Similarly, software-defined storage separates the management and provisioning of storage from the underlying physical hardware.
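The core idea behind both technologies can be sketched in a few lines of code: a software controller holds the desired configuration and pushes it down to the underlying devices, rather than each device being configured by hand. The classes and method names below are purely illustrative, not any vendor’s actual API.

```python
# Minimal sketch of the software-defined pattern: a controller
# (software) decides how traffic flows; devices (hardware stand-ins)
# merely carry out the rules pushed to them.

class Switch:
    """Stand-in for a physical network device."""
    def __init__(self, name):
        self.name = name
        self.rules = []  # forwarding rules installed by the controller

class SDNController:
    """Directs traffic by programming devices via a software API."""
    def __init__(self):
        self.switches = {}

    def register(self, switch):
        self.switches[switch.name] = switch

    def set_policy(self, switch_name, rule):
        # The controller, not the hardware, decides how traffic flows.
        self.switches[switch_name].rules.append(rule)

controller = SDNController()
controller.register(Switch("edge-1"))
controller.set_policy("edge-1", {"match": "10.0.0.0/24", "action": "forward:port2"})
```

The same separation of control plane (software) from data plane (hardware) underlies software-defined storage, with volumes and provisioning policies in place of switches and forwarding rules.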
As the COVID-19 pandemic continues to wreak havoc in countries around the world, there’s some reason for hope as the number of new cases in China drops to zero. This shows that with concerted effort by individuals, companies and governments, it is possible to limit the exponential spread of the virus.
For this post, I want to go beyond the day-to-day commentary on the robotics industry or our own business and take a look at longer-term trends shaping society. Needless to say, this is a personal perspective and pure speculation. Making predictions about what’s to come is easy; getting them right is hard, so take this with a grain of salt. My hope is that this will spur some healthy discussions and drive additional innovation.
The start of a decade is always a good excuse for predictions. After all, who’s going to remember 10 years later if you were right? For us at InOrbit, we are setting a goal and making a prediction that will determine the fate of our company and the broader robotics industry. We fully expect to be held accountable to it.
We recently got together as a team to work on our 2020 Vision (yes, pun fully intended) to guide us through the decade we are just kicking off. As part of this we have committed all our energy and passion behind a BHAG: a big, hairy, audacious goal. If you are not familiar with the concept of a BHAG (pronounced “bee hag”), it’s a term popularized by Jim Collins, of Good to Great fame, to capture a simple goal that helps align an entire organization.
Last week InOrbit was at ROScon, one of the most important technical conferences in robotics. It was great to see a growing and very engaged community tackle a wide variety of topics. Most of the energy in recent years has been around ROS2, and this year it was clear that it’s really happening. You could find robots on the exhibit floor running ROS2, and most of the talks tackled new capabilities in ROS2.
We noticed another, perhaps more significant change: there was a notable uptick in the number of talks and discussions regarding scalability of operations in the field and interoperability across robots. As ROS adoption grows, attention is shifting towards managing hundreds or thousands of robots outside the lab.
The opening keynote set the tone. Selina Seah, Director of the Centre for Healthcare Assistive & Robotics Technology (CHART) at Changi General Hospital, and Morgan Quigley, Chief Architect at Open Robotics, presented their ongoing work to coordinate navigation of heterogeneous robot fleets and, more broadly, to address the need for interoperability between complex and disparate technological systems, HIT, and infrastructure.
Our very own CTO/co-founder, Julian Cerruti, and one of the engineers on the InOrbit team, Florencia Grosso, presented some lessons learned working with robotics companies to deploy and operate fleets of ROS-based autonomous robots in production. Several other presentations mentioned fleet management, and the excellent panel on ROS at Scale delved into the details of how to handle large systems of robots in production settings.
Our goal is to put every robot in orbit around the cloud to help accelerate the adoption of robotics across industries. Today we are one important step closer to that goal.
At RoboBusiness, the premier commercial robotics trade show for business executives, we announced that the Qualcomm® Robotics RB3 development kit will have pre-integrated support for InOrbit’s cloud platform. This advanced development kit is based on the powerful Qualcomm SDA845 SoC, allowing robotics developers to create autonomous robots for the most challenging applications.
In Part 1 of this article, we discussed in detail some of the limitations of AI and autonomous systems. The key takeaway is that autonomy is relative, and there continues to be a need for human interaction and direction.
At InOrbit, we are harnessing the power of the cloud and the edge to bring automation and efficiency to the operation of distributed robot fleets. Our approach consists of four O’s, which of course we think of as concentric orbits.
Imagine that you need to fly to a distant location, a busy airport you’ve never visited. You are the nervous kind, so you want to make sure it will be safe. You talk to the airplane manufacturer, and walk away with confidence that the machine has been well designed. You verify the maintenance schedule and are satisfied that it is thoroughly tested.
Then you talk to the management at the airport you’re flying into, and they tell you that they cobbled together their air traffic control system from some old PCs that nobody was using and installed some taxi-dispatch software. Would you get on that plane?
Sadly, this is often the case in robotics. Since before we started InOrbit, and throughout the last two years, we have engaged with over 75 robotics companies and hundreds of people in the robotics space. We have talked to the C-suite setting strategy and to front-line operators doing triage, to robotics Ph.D.s and to self-identified robot babysitters. Across all these conversations, we had one overarching question: how do you manage robots after they leave the lab?
Robots are everywhere. They can be found in hospitals and hotels. On farms and construction sites. In brick-and-mortar retail stores and e-commerce distribution centers. In the air, in the sea, on the ground and even underground, as we saw recently in the DARPA Subterranean Challenge.
But what is a robot? There have been plenty of philosophical discussions on this, and probably no shortage of flame wars. We like this definition from IEEE:
A robot is an autonomous machine capable of sensing its environment, carrying out computations to make decisions, and performing actions in the real world.
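The definition above boils down to a sense-decide-act loop, which can be sketched in a few lines. This is an illustrative toy, not actual robot code; the sensor reading and actions are hypothetical placeholders.

```python
# Toy sketch of the sense -> decide -> act loop from the IEEE definition.

def sense(environment):
    """Sense the environment: read a distance measurement (in meters)."""
    return environment["obstacle_distance_m"]

def decide(distance_m, safety_margin_m=0.5):
    """Carry out a computation to make a decision from the sensed data."""
    return "stop" if distance_m < safety_margin_m else "drive"

def act(decision):
    """Perform the chosen action in the real world (stubbed out here)."""
    return f"robot executes: {decision}"

def control_step(environment):
    """One iteration of the autonomy loop."""
    return act(decide(sense(environment)))
```

Real robots run a loop like this many times per second, with far richer sensing (cameras, lidar) and decision-making (planners, learned policies) in the middle.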
So right there in the definition is the A-word: autonomy. The word comes from the Greek autos (“self”) and nomos (“law”), so something autonomous makes its own laws. Pretty cool, right?
However, autonomy is relative. We’re not just talking about being constrained by the laws of physics, but also by the limits of AI, sensing technology, computing power, servos, and so on. In essence, the guts and brains of robots can only go so far with current technology.