Another year, another round of predictions. Let’s start with some of the ongoing trends before getting into the less obvious ones.
Robotics technology is evolving rapidly and has made significant advances in recent years. Increasingly capable artificial intelligence and machine learning algorithms enable robots to perform tasks more autonomously and adapt more easily to changing conditions.
As these smart robots are no longer confined to factories, they are being deployed in a growing number of sites and applications. Robots are moving boxes and pallets in warehouses, scrubbing the floors of massive airport terminals and conducting inventory in retail stores.
Another trend is the increasing use of robots in service industries such as healthcare, education, and hospitality. These robots are designed to operate in unstructured environments and interact with people. Robots in these industries can improve efficiency and productivity, as well as the quality of service.
These trends are not new, but they will continue to accelerate over the next few years. They are also widely accepted and acknowledged, so not much to see here. Let’s try something different: here are some thoughts that the average person, outside the relatively small circle of roboticists working in this space, would likely find surprising.
In the course of a regular day, many of us encounter one or more robots, even if we don’t think of them in those terms. Of course, many of us have robotic vacuums at home; the more advanced of these use a scaled-down version of the technology found in autonomous forklifts and self-driving cars.
On many university campuses and downtown areas, you may see robots delivering food or groceries. More and more restaurants are using robot carts to deliver food to the table. As you travel, the floor at the airport terminal may have been cleaned by a robot and your hotel may deliver a toothbrush to your room using a robot. When you order goods through your favorite ecommerce website, there’s a good chance that robots took care of picking and packing your items.
If you visit a retail store, you may find multiple robots being orchestrated together to deliver a great consumer experience, from receiving shipments and replenishing stock on shelves to delivering items to buyers while they wait comfortably. Expect to hear more about these robot-enabled experiences in 2023.
The list goes on. However, most people may go about their day without thinking of or even recognizing many of these machines as robots. It’s OK; you’re not going to hurt the robots’ feelings. But that brings us to the next point.
The robotics industry has long been obsessed with creating human-like robots – aka androids or humanoid robots – that can walk on two legs and have two arms. I get it: this is an engineer’s version of the God complex. Many aspire to create something in their own image and are driven by the sheer complexity of the challenge.
Technology has advanced to the point where this is tantalizingly close to being feasible – today, these robots already work under some tight constraints, although they are still cost-prohibitive and would perform poorly in almost every real-world environment.
However, the issue isn’t just a matter of advancing the technology (more efficient batteries, smaller motors, lighter materials, smarter software, etc.) but a more fundamental one of design that is fit for purpose. Imagine having to clean that airport terminal every night. Would you really build an insanely expensive humanoid robot and hand it a mop and a bucket, or would you design a machine with a 200-liter tank and a 1.5-foot-diameter brush that can cover 10X the surface in 1/10th the time?
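Those two factors compound: covering 10X the area in 1/10th the time is a 100x difference in effective cleaning rate. A back-of-the-envelope sketch, using the illustrative figures above rather than measured data:

```python
# Back-of-the-envelope comparison of cleaning throughput, using the
# illustrative factors from the text (not measured data).
area_factor = 10   # purpose-built scrubber covers 10x the surface
time_factor = 10   # ...and does it in one tenth of the time

# Rate = area / time, so the two factors multiply.
rate_factor = area_factor * time_factor
print(rate_factor)  # 100: a two-order-of-magnitude gap in throughput
```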
In a recent LinkedIn thread, someone involved in the autonomous vehicle space pointed out that “humanoid robots are now where self-driving cars were in 2016” … to which another industry veteran quipped that “self-driving cars are still where they were in 2016.” Much more interesting than creating a marvel of engineering is thinking about how people and robots will interact; enter the next trend for 2023.
For a while in 2022, everyone was talking about the metaverse, even if nobody really knew what it meant. In 2023 and beyond, we will see more VR and AR devices introduced, and they will eventually become as common as the computer on our desk or the smartphone in our hand.
But what does this have to do with robots? Well, as it turns out, quite a bit. You see, modern robots and head-mounted displays use some of the very same tech to understand the 3D world we live in (or 4D if you add time). That means that advances in one area (whether better depth-sensing cameras, cheaper LiDARs, object recognition, or physics-driven simulations) drive progress in the other in a virtuous cycle.
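To make that shared tech concrete: both a robot’s perception stack and an AR headset routinely back-project a depth image into a 3D point cloud using a pinhole camera model. Here is a minimal sketch; the camera intrinsics and the tiny flat depth image are made-up values for illustration:

```python
import numpy as np

# Hypothetical pinhole camera intrinsics (focal lengths and principal
# point, in pixels) for a tiny 4x4 depth image.
fx, fy = 500.0, 500.0
cx, cy = 1.5, 1.5

# Pretend depth-sensor output: every pixel reports 2 meters.
depth = np.full((4, 4), 2.0)

# Back-project each pixel (u, v) with depth z into camera-frame 3D coordinates.
v, u = np.indices(depth.shape)   # row (v) and column (u) index grids
z = depth
x = (u - cx) * z / fx
y = (v - cy) * z / fy

points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
print(points.shape)              # one 3D point per pixel: (16, 3)
```

The same math, run on the same class of sensors, feeds a robot’s obstacle map and a headset’s world mesh, which is why improvements in either product category tend to benefit the other.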
Moreover, this allows people to “see” through the “eyes” of a robot as naturally as if they were there, and to interact with the physical world through the robot – possibly from a different continent.
This is not quite the experience in The Peripheral, a novel by William Gibson that was recently made into a streaming show, where people from our near future use quantum entanglement to control a robot that looks just like them 50 years into the future in another branch of the multiverse. However, the technology is here today and we will see more of this in the year ahead.
Overall, we can expect that some of these trends, which have been developing over the last decade, will accelerate and become increasingly commonplace during the year ahead. As technology tends to do, robotics will go mainstream and eventually become as much a part of the landscape as an always-on Internet connection.