MIT Particle Simulator Improves Robots’ Abilities To Interact With Liquids and Solid Objects

[Image: MIT particle simulator. Credit: MIT/CSAIL]

MIT is at it again. After developing the RoCycle robot, capable of sorting trash by touch alone, researchers now want to get robots to predict how different items and liquids will react to their touch.

It’s well known that robots have a hard time grasping delicate items and cannot tell fragile objects or liquids from rigid ones. Most of the time, their approximations fall short and they end up squishing or destroying the objects altogether.

As has been the case before, the researchers developed a ‘learning-based’ simulation system that should, in theory, help robots work out how to handle the objects they interact with. This particle simulation is not very different from the way humans learn to grip by intuition.

“Humans have an intuitive physics model in our heads, where we can imagine how an object will behave if we push or squeeze it. Based on this intuitive model, humans can accomplish amazing manipulation tasks that are far beyond the reach of current robots,” said a graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). “We want to build this type of intuitive model for robots to enable them to do what humans can do.”

The researchers built a two-fingered robot called RiceGrip and tasked it with molding a piece of foam into a desired shape. The robot used a depth camera and object recognition to ‘understand’ the foam, eventually identifying it as a deformable material; it then added edges between the foam’s particles and reconstructed them into a ‘dynamic graph customized for deformable materials.’
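To make the idea concrete, here is a minimal sketch (not the authors’ actual code) of how a deformable object might be turned into such a graph: each particle observed by the depth camera becomes a node, and edges connect particles that sit within a chosen interaction radius. The function name and the 0.05 m radius are illustrative assumptions.

```python
import numpy as np

def build_particle_graph(positions: np.ndarray, radius: float = 0.05):
    """positions: (N, 3) array of particle coordinates, e.g. from a depth camera.
    Returns a list of (i, j) edges linking particles closer than `radius`."""
    edges = []
    n = len(positions)
    for i in range(n):
        # Distances from particle i to every particle.
        dists = np.linalg.norm(positions - positions[i], axis=1)
        for j in range(i + 1, n):
            if dists[j] < radius:
                edges.append((i, j))
    return edges

# Example: a small block of "foam" particles sampled on a 4x4x4 grid.
grid = np.stack(np.meshgrid(*[np.linspace(0.0, 0.1, 4)] * 3), axis=-1).reshape(-1, 3)
print(len(build_particle_graph(grid)))  # number of edges in the dynamic graph
```

Because the edges are recomputed from current particle positions, the graph can be rebuilt as the material deforms, which is what makes it “dynamic.”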

Having run through all the simulations, RiceGrip had a solid idea of how a touch would affect the particles it had identified as the foam. When the predicted and observed particles did not align, an error signal was sent to the model, which, in turn, adjusted its predictions to better match the actual physics of the material.
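The correction loop the article describes can be sketched as follows, assuming a PyTorch-style setup: a model predicts where the particles will move after a touch, the prediction is compared against the observed positions, and the resulting error signal updates the model. The `DynamicsModel` class and the random placeholder data are hypothetical stand-ins, not the researchers’ actual code.

```python
import torch
import torch.nn as nn

class DynamicsModel(nn.Module):
    """Toy predictor: maps current particle positions to the next time step."""
    def __init__(self, n_particles: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_particles * 3, 128), nn.ReLU(),
            nn.Linear(128, n_particles * 3),
        )

    def forward(self, positions):                 # (batch, N, 3)
        flat = positions.flatten(start_dim=1)
        return self.net(flat).view_as(positions)

n = 64
model = DynamicsModel(n)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    current = torch.randn(8, n, 3)                # observed particles (placeholder data)
    observed_next = current + 0.01 * torch.randn(8, n, 3)
    predicted_next = model(current)
    # Misalignment between prediction and observation is the error signal.
    loss = nn.functional.mse_loss(predicted_next, observed_next)
    optimizer.zero_grad()
    loss.backward()                               # error signal propagates back
    optimizer.step()                              # model adjusts to match real physics
```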

The robot won’t be making sushi any time soon, but the researchers are working to help RiceGrip and other robots better predict interactions in scenarios they don’t fully control or only have partial information about.

One example given was a pile of boxes. In the future, the team hopes the robot will be able to predict how the boxes will move and fall when pushed, even if it can only see the ones on the surface and not the hidden ones.

“You’re dealing with these cases all the time where there’s only partial information,” said Jiajun Wu, a CSAIL graduate student and co-author of the particle simulator paper. “We’re extending our model to learn the dynamics of all particles, while only seeing a small portion.”
