Hands, Hurdles, and Headwinds: The Real Talk from the 2025 Robotics Summit & Expo
This year’s Robotics Summit exposed the gap between what robots can do — and what the world is letting them do.
Aaron’s Thoughts On The Week
“Any sufficiently advanced technology is indistinguishable from magic — until tariffs hit.”
This year’s Robotics Summit & Expo in Boston once again brought together an amazing mix of industrial leaders, early-stage innovators, and academic trailblazers. As someone who has attended this event since its inception, I have watched it grow year after year to the point where it is bursting at the seams in one of the largest venues in Boston. If you are in the robotics industry, this is clearly an event you must attend.
The Summit is a unique blend of talks and demos that thoroughly explores current developments in the robotics industry. It is not Automate, and it is not ICRA, which is why it is so successful. When I leave Automate, I know what robots are on the market and how they are used in various applications. If you are getting ready to buy robots for a new application, then Automate is the place to go. If you are conducting robot research or want to hear the latest and greatest in robot research, then ICRA is the place for you.
Where the Summit finds its sweet spot is in discussing the factors driving the robotics industry forward. What are the next new robotics technologies we need to focus on? What are the business challenges? What technical challenges must we address? All these questions are on the table at the Summit. This is where you hear from the leaders and experts in robotics about their perspectives on these topics and what the industry needs to do. For me, it sharpens my focus on the areas where I need to work to help advance the industry.
So what were my takeaways from the 2025 edition of the Summit? Let’s go through them.
Dexterous Manipulation Is Finally Getting Its Hands Dirty (But We’re Not There Yet)
At this year’s Robotics Summit & Expo, you’d be forgiven for thinking the spotlight would be on humanoid robots. However, as I navigated the keynotes, tech demos, and hallway chats, it became clear: dexterous manipulation is quietly becoming the must-watch capability in robotics — and it's what will ultimately determine whether humanoids can move beyond flashy demos and into real-world deployment.
So what is dexterous manipulation?
In simple terms, it’s a robot’s ability to use its "hands" to grasp, move, orient, and interact with objects in a flexible, human-like way. It’s not just about picking something up — it’s about doing so in cluttered environments, with unknown objects, and with the kind of subtlety humans take for granted (like twisting a cap, folding laundry, or plugging in a cable).
At the summit, we saw demos ranging from soft grippers that can handle produce to multi-fingered hands learning tasks via reinforcement learning. One of the more compelling use cases was in logistics, where robots are learning to pick irregular items out of bins — a deceptively difficult task that’s been a bottleneck for automation.
But here’s the core insight:
👉 Without dexterous manipulation, humanoids will never reach their potential.
Walking, seeing, and speaking are great — but if your robot can’t open a door, plug in a device, or handle tools, its use cases collapse quickly. In healthcare, eldercare, manufacturing, retail, and disaster response, hands matter more than legs.
Aaron Parness, the director of applied science for robotics and artificial intelligence at Amazon Robotics, explained in his keynote and fireside chat how Amazon is giving its robot arms a sense of touch so they can perform tasks in high-clutter, high-contact environments. When an audience member asked about the use of humanoids at Amazon, he said they have a role, but emphasized that many of Amazon’s existing workstations are fixed, with the product coming to the station. Those stations primarily require robot arms, and those arms need significantly more dexterous manipulation capability. Dexterity has been a priority for Amazon since the days of the Amazon Picking Challenge. The acquisition of Kiva Systems gave Amazon mobility; now its picking robots need to approach the dexterity of the human hand. Those picking robots do not need to be mobile today, but when they do, they will still need that dexterity. That is where Amazon stands on the topic.
As we move into a world of more general-purpose robots, dexterity isn’t a bonus — it’s a baseline. The humanoid revolution can’t scale without it, and that’s why this tech is finally getting the respect (and funding) it deserves.
Robots Are Starving for Physical Training Data
In the age of Large Language Models (LLMs) like GPT-4 and Gemini, it’s easy to assume that AI has limitless data to train on. And for text, that's mostly true — you can scrape billions of words, codebases, and documents from the internet. But for robots? The data famine is real — and it's one of the most pressing bottlenecks in robotics today.
As SK Gupta pointed out during his presentation, robotic learning is fundamentally different from language learning. LLMs benefit from massive, high-quality datasets built from decades of human knowledge, much of which is freely available online. But when it comes to teaching a robot to pour a drink, fold a shirt, or insert a key — that data doesn’t exist at scale. Worse, even when it does exist, it is highly specific, brittle, and does not transfer well between tasks, robots, or environments.
LLMs can hallucinate and still be useful. Robots? Not so much. A “close enough” answer from a chatbot leads to confusion — a “close enough” movement from a robot can mean broken fingers, shattered objects, or worse, real harm.
This is why the most effective robotic training today still relies heavily on teleoperation — having a human physically guide or remotely control the robot to demonstrate tasks. This generates the best-quality data because it directly captures human intent and dexterity. However, it is incredibly time-consuming, resource-intensive, and does not scale as effectively as internet scraping.
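To make that scale problem concrete, here is a minimal, hypothetical sketch of what logging a single teleoperated demonstration tends to look like. The RobotArm and TeleopDevice classes are stand-ins I made up for illustration, not any vendor’s actual API; the point is that every training sample requires a human in the loop, in real time, on real hardware.

```python
import json
import time

# Hypothetical interfaces, stand-ins for whatever arm and teleop device a lab
# actually uses; they are NOT a real vendor API.
class RobotArm:
    def read_state(self):
        # e.g. joint positions, gripper width, wrist force/torque readings
        return {"joints": [0.0] * 7, "gripper": 0.04, "wrench": [0.0] * 6}

    def apply_command(self, command):
        pass  # forward the operator's command to the hardware

class TeleopDevice:
    def read_command(self):
        # operator input, e.g. a 6-DoF twist plus a gripper action
        return {"twist": [0.0] * 6, "gripper_close": False}

def record_demo(arm, teleop, hz=30, duration_s=20):
    """Log one demonstration as a time-stamped list of (observation, action) pairs."""
    episode = []
    for _ in range(int(hz * duration_s)):
        obs = arm.read_state()
        act = teleop.read_command()
        arm.apply_command(act)
        episode.append({"t": time.time(), "obs": obs, "act": act})
        time.sleep(1.0 / hz)
    return episode

if __name__ == "__main__":
    demo = record_demo(RobotArm(), TeleopDevice())
    with open("demo_0001.json", "w") as f:
        json.dump(demo, f)
    # A 20-second demo at 30 Hz is only ~600 steps, and a human had to drive
    # every one of them, on real hardware, in real time.
```

Multiply that by every task variation, every object, and every robot embodiment, and the contrast with scraping the internet for text becomes obvious.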
Here’s the catch:
Every robot platform may require its own custom data.
Physical demonstrations are slow — one action at a time, real-time.
Hardware wear-and-tear makes mass-scale experimentation costly.
Environmental variation is hard to replicate and generalize across.
Despite advances in sim2real transfer and reinforcement learning, simulated data still struggles with the messy, unpredictable physics of the real world. The gap between “trained in sim” and “works in warehouse” remains a chasm — and one that data alone won’t solve without better infrastructure and incentives.
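For readers less familiar with sim2real, domain randomization is one of the standard techniques behind that “trained in sim” claim: perturb the physics the policy sees each episode so it cannot overfit to one idealized world. The sketch below is purely illustrative, with made-up parameter ranges and a toy stand-in for a simulator configuration rather than any real engine’s API.

```python
import random

# Toy stand-in for a simulator configuration; a real setup would pass these
# values into MuJoCo, Isaac Sim, or similar, but this is NOT a real engine API.
class SimEpisode:
    def __init__(self, friction, object_mass_kg, sensor_noise_std):
        self.friction = friction
        self.object_mass_kg = object_mass_kg
        self.sensor_noise_std = sensor_noise_std

def randomized_episode():
    """Sample one training episode with perturbed physics (ranges are made up)."""
    return SimEpisode(
        friction=random.uniform(0.4, 1.2),
        object_mass_kg=random.uniform(0.05, 0.8),
        sensor_noise_std=random.uniform(0.0, 0.01),
    )

# Training across many perturbed worlds keeps a policy from overfitting to one
# idealized simulation, but it still may not cover real contact dynamics,
# deformable objects, or warehouse clutter.
for i in range(5):
    ep = randomized_episode()
    print(f"episode {i}: friction={ep.friction:.2f}, "
          f"mass={ep.object_mass_kg:.2f} kg, noise={ep.sensor_noise_std:.4f}")
```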
“It struck me that we’re essentially trying to teach robots to run before they’ve learned to crawl—because no one wants to fund crawling.”
Until we solve the data problem, even the smartest robots will remain clumsy learners — limited not by their algorithms, but by the scarcity of real-world experience.
Why Everyone in Robotics Is Asking, ‘Are You OKAY?’
At the Robotics Summit & Expo, this phrase kept surfacing — half joking, half serious: “So… are you OKAY?”
Not about mental health (though that's fair game too), but about tariffs — and the ripple effects they’re having on nearly every robotics company in the ecosystem.
Behind the cutting-edge demos and investment buzz lies a much harsher reality: most of the key components that power robots — servos, actuators, sensors, control boards — are manufactured outside the U.S. And as tariffs rise and trade tensions escalate, hardware startups and scale-ups alike are feeling the pressure in supply chains, margins, and go-to-market timelines. I got confirmation from many vendors that their prices have gone up.
And yet very few wanted to say it too loudly on stage.
While reshoring and local manufacturing were floated as solutions, the reality is that rebuilding supply chains is slow, capital-intensive, and risky, especially in a climate where every investor wants faster returns.
In short, the tech may be futuristic, but the logistics are hitting hard in the here and now. And in every hallway conversation, that quiet question kept surfacing: “So… are you OKAY?”
My Closing Thoughts
As the 2025 Robotics Summit & Expo wrapped up, one thing became crystal clear: we’re no longer asking if robotics will change the world. We’re now deep in the trenches of how it actually happens — and how hard that journey really is.
This year’s conversations around dexterous manipulation reminded us that no matter how advanced the AI, it’s the hands, not just the brains, that will unlock the next frontier of real-world applications. Without touch, grip, and nuanced motion, our most humanoid-looking robots are just mannequins in motion.
Meanwhile, the data challenge underscored the fundamental difference between digital and physical AI. Robots don’t get to learn from YouTube tutorials or Reddit threads. They require physical experiences, and collecting that data is still too slow, expensive, and fragmented to keep pace with innovation. Solving this won’t just require better tools — it’ll take collaboration, shared infrastructure, and a willingness to invest in the unglamorous work of crawling before we sprint.
And then there’s the silent weight of tariffs, looming behind every production delay and BOM redesign. Everyone’s asking, “Are you OKAY?” because the answer is rarely a simple yes. Rising costs and trade constraints are now as integral to the robotics equation as AI models and actuator torque.
But that’s precisely why the Robotics Summit matters. It’s where the industry comes together not just to celebrate breakthroughs, but to confront the obstacles — technical, economic, and strategic — that we all face. It’s not just a showcase of what’s possible. It’s a reality check on what’s necessary.
And if this year’s summit is any indication, the industry isn’t backing down. We’re tackling more challenging problems, having more candid conversations, and laying the groundwork for a robotics ecosystem that can scale — with smarter hands, richer data, and more resilient supply chains.
See you in 2026 — and yes, I’ll be OKAY.
Robot News Of The Week
Hugging Face releases a 3D-printed robotic arm starting at $100
Hugging Face has launched the SO-101, a 3D-printable, programmable robotic arm that builds on its earlier SO-100 model. Priced from $100 (but often higher due to tariffs and assembly), it features better motors, easier assembly, and reinforcement learning capabilities. Developed with The Robot Studio and partners like WowRobo and Seeed Studio, the SO-101 can learn tasks like picking up objects. Hugging Face also acquired France-based Pollen Robotics and plans to sell its humanoid robot, Reachy 2, with open-source software for developer contributions.
Omron separates robotics business unit in ‘strategic step’
Omron has launched a dedicated global robotics organization to reinforce robotics as a core part of its automation strategy. This move enhances technical support, speeds decision-making, and strengthens expertise across Europe. The initiative includes new and existing Automation Centres and PoC labs in Annecy, Barcelona, Dortmund, and Stuttgart. A specialized European team will work closely with global R&D, aiming to boost innovation and meet evolving customer needs. Leaders say the new structure will improve service, support co-development projects, and help customers maximize their robotics investments through quicker, more targeted solutions.
ARX Robotics rides defence tech wave with €31M for battlefield robots
German defense tech startup ARX Robotics has raised €31M to scale production of its modular, autonomous ground vehicles and advance its Mithras OS, which upgrades existing military vehicles with AI and autonomous capabilities. Backed by NATO’s Innovation Fund, ARX aims to retrofit 50,000 NATO vehicles and become Europe’s leader in battlefield robotics. Its robots, resembling mini tanks, are customizable for tasks like mine-sweeping or casualty transport. ARX plans a £45M UK facility to produce 1,800 units annually. Its tech is already in use by several European militaries and was recently deployed in Ukraine.
Robot Research In The News
Delivery robots like those from Starship and Kiwibot rely on lidar and SLAM for navigation, but these systems consume large memory and computing resources. Zihao Dong, a PhD student at Northeastern University, developed a new algorithm called DFLIOM that reduces memory usage by up to 57% while maintaining accuracy. Unlike traditional approaches that prioritize more data, DFLIOM selectively extracts only essential information for 3D mapping. This advancement helps robots operate more efficiently across longer distances. The algorithm was tested using Northeastern’s Scout Mini robot, mapping campus areas with lighter processing loads and improved performance.
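The details of how DFLIOM decides what to keep are in the research itself, but the underlying idea of retaining only a small, informative subset of each scan is easy to illustrate. The snippet below is a deliberately simplified, generic voxel-grid thinning pass, shown only as a sketch of selective point retention; it is not the DFLIOM algorithm.

```python
import numpy as np

def voxel_downsample(points, voxel_size=1.0):
    """Keep roughly one point per cubic voxel of side voxel_size (metres).

    A generic thinning step used here only to illustrate selective point
    retention; DFLIOM's actual selection criteria are considerably smarter.
    """
    voxel_ids = np.floor(points / voxel_size).astype(np.int64)
    # keep one representative point per occupied voxel
    _, keep_idx = np.unique(voxel_ids, axis=0, return_index=True)
    return points[np.sort(keep_idx)]

if __name__ == "__main__":
    scan = np.random.uniform(-10, 10, size=(50_000, 3))  # fake lidar scan, metres
    kept = voxel_downsample(scan)
    print(f"kept {len(kept)} of {len(scan)} points "
          f"({100 * len(kept) / len(scan):.1f}%)")
```

Even this crude filter shows why selectivity matters: storing a small fraction of each scan instead of all of it is the difference between a map that fits on a delivery robot’s onboard compute and one that does not.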

Check Out Aaron On Standards Impact
From food delivery bots on busy sidewalks to autonomous systems managing herds in the fields, the robotics industry is evolving faster than ever. And standards are keeping pace.
Aaron joined David Walsh on the latest episode of Standards Impact to discuss the growing influence of robotics across sectors — and why smart standards are critical for scaling this technology safely and responsibly.
Give it a listen if you’re curious about how standards fuel innovation in robotics.
Robot Workforce Story Of The Week
In Southwestern PA, students are getting paid to train for the future of manufacturing jobs
Federal Build Back Better funds continue to boost Allegheny County’s workforce, including a robotics and automation training program at Robert Morris University (RMU). Backed by a $100,000 grant from Carnegie Mellon’s Block Center, RMU offers paid courses in industrial robotics, PLCs, and collaborative robotics. Aimed at veterans and local workers, the program upskills participants for advanced manufacturing jobs. Students earn $18/hour during training, with strong attendance and engagement reported. Lessons learned will be shared at an ASME conference this fall. The effort is part of a $62.7M regional strategy to revitalize post-industrial communities across Southwestern Pennsylvania.
Robot Video Of The Week
In robotics, dexterity and perception go hand-in-hand—literally. Boston Dynamics' Atlas manipulation test stand demonstrates a variety of grasping techniques, using reinforcement learning (RL) policies trained with NVIDIA Robotics' DextrAH-RGB.
Upcoming Robot Events
May 12-15 Automate (Detroit, MI)
May 17-23 ICRA 2025 (Atlanta, GA)
May 18-21 Intl. Electric Machines and Drives Conference (Houston, TX)
May 20-21 Robotics & Automation Conference (Tel Aviv)
May 29-30 Humanoid Summit (London)
June 9-13 London Tech Week
June 17-18 MTC Robotics & Automation (Coventry, UK)
June 30-July 2 International Conference on Ubiquitous Robots (College Station, TX)
Aug. 17-21 Intl. Conference on Automation Science & Engineering (Anaheim, CA)
Sept. 15-17 ROSCon UK (Edinburgh)
Sept. 27-30 Conference on Robot Learning (Seoul, KR)
Sept. 30-Oct. 2 IEEE International Conference on Humanoid Robots (Seoul, KR)
Oct. 6-10 Intl. Conference on Advanced Manufacturing (Las Vegas, NV)
Oct. 15-16 RoboBusiness (Santa Clara, CA)
Oct. 19-25 IEEE IROS (Hangzhou, China)
Oct. 27-29 ROSCon (Singapore)
Nov. 3-5 Intl. Robot Safety Conference (Houston, TX)
Dec. 11-12 Humanoid Summit (Silicon Valley TBA)