
RoboNinja’s AI-powered chef skills are blowing my mind

Published Feb 23rd, 2023 4:05PM EST
RoboNinja, an AI-powered chef robot. Image: Columbia University


Have you ever seen a robot perfectly cut around the pit of an avocado? I sure haven’t.

Today, AK took to Twitter to share a video of RoboNinja, an AI-powered robot that can cut cleanly through multi-material objects like mangos and avocados while accounting for hard pits. As you can see in the video, the robot cuts neatly around the pit of a mango. It’s pretty wild to see.

According to a post from Columbia University, RoboNinja is designed to master cutting objects that have a soft exterior and a hard interior. While previous techniques used “open-loop cutting actions to cut through single-material objects,” RoboNinja can detect a hard core and figure out how to work around it.

To achieve this, our system closes the perception-action loop by utilizing an interactive state estimator and an adaptive cutting policy. The system first employs sparse collision information to iteratively estimate the position and geometry of an object’s core and then generates closed-loop cutting actions based on the estimated state and a tolerance value. The “adaptiveness” of the policy is achieved through the tolerance value, which modulates the policy’s conservativeness when encountering collisions, maintaining an adaptive safety distance from the estimated core. Learning such cutting skills directly on a real-world robot is challenging.
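The loop the researchers describe can be illustrated in miniature. The sketch below is a toy, with made-up numbers and function names rather than anything from the RoboNinja project: a hidden core is localized from sparse collision feedback, and each cutting action keeps a tolerance-sized safety margin around the current estimate.

```python
# Toy 1D sketch of the closed-loop idea: estimate a hidden "core"
# interval from sparse collision feedback, then keep the knife a
# tolerance-scaled margin away from the estimate. All numbers and
# names are illustrative, not from the RoboNinja paper.

def estimate_core(collisions):
    """Sparse collision points -> conservative interval estimate of the core."""
    if not collisions:
        return None
    return (min(collisions), max(collisions))

def cutting_depth(x, core_estimate, tolerance):
    """Closed-loop action: cut to full depth unless the estimated core
    (inflated by the tolerance margin) lies under the knife."""
    full_depth = 1.0
    if core_estimate is None:
        return full_depth
    lo, hi = core_estimate
    if lo - tolerance <= x <= hi + tolerance:
        return 0.5  # shallow cut: stay clear of the estimated core
    return full_depth

# Simulated pass over the fruit: the true (hidden) core spans [0.4, 0.6].
true_core = (0.4, 0.6)
collisions = []
depths = []
for step in range(11):
    x = step / 10
    d = cutting_depth(x, estimate_core(collisions), tolerance=0.05)
    if true_core[0] <= x <= true_core[1] and d > 0.5:
        collisions.append(x)  # knife hit the core: record feedback
        d = 0.5               # retract to a safe depth
    depths.append(d)

print(depths)
# → [1.0, 1.0, 1.0, 1.0, 0.5, 0.5, 0.5, 1.0, 1.0, 1.0, 1.0]
```

A larger tolerance makes the policy more conservative (it backs off from a wider band around the estimate); a smaller one lets it cut closer to the core at the risk of more collisions, which is the trade-off the quoted passage describes.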

The team says it used a “differentiable cutting simulator” and a “low-cost force sensor” so the learned model could be deployed in real-world scenarios on objects with a range of geometries and materials.

Yet, existing simulators are limited in simulating multi-material objects or computing the energy consumption during the cutting process. To address this issue, we develop a differentiable cutting simulator that supports multi-material coupling and allows for the generation of optimized trajectories as demonstrations for policy learning. Furthermore, by using a low-cost force sensor to capture collision feedback, we were able to successfully deploy the learned model in real-world scenarios, including objects with diverse core geometries and soft materials.

Watching a robot learn how to cut fruit reminds me of some of the crazy videos you see of Boston Dynamics robots dancing or doing parkour. With AI and robotics coming together in more ways, we’re definitely heading toward something more intense than Bing losing its mind when you try to chat with it.

Joe Wituschek, Tech News Contributor

Joe Wituschek is a Tech News Contributor for BGR.

With over 10 years of experience covering technology, Joe reports on the industry's breaking news and writes opinion pieces and reviews.
