
D'Avella S., Fontana M., Vertechy R., Tripicchio P. (2022). Towards autonomous soft grasping of deformable objects using flexible thin-film electro-adhesive gripper. IEEE Computer Society [10.1109/CASE49997.2022.9926531].

Towards autonomous soft grasping of deformable objects using flexible thin-film electro-adhesive gripper

2022

Abstract

Autonomous robotic grasping is a fundamental skill for the next generation of robots. It is a challenging problem because many steps must succeed, from detecting the target location to selecting a grasp pose configuration that yields a stable grasp. Grasping fragile objects or objects with variable shapes is more complex than the traditional pick-and-place of rigid objects, and robots typically do not have a reliable sense of touch. In such cases, a retention action, which can be obtained by electro-adhesion, is preferable to a compression force in order to avoid damaging the objects. The proposed work presents a robotic grasping system that combines a gripper built with flexible thin-film electro-adhesive (EA) device technology and a vision pipeline based on an RGB-D camera, which detects the grasp pose configuration and tracks the target during the holding phase to check whether the task has been completed successfully. Thanks to the properties of the EA gripper, vision is the only perception cue needed to grasp the target without damaging it, since the gripper automatically adapts its shape to the surface of the target, delicately wrapping its two fingers around the object. Several tests have been carried out to assess the capabilities of the proposed robotic system in picking and placing deformable objects, comparing the EA gripper with a traditional parallel-jaw gripper. The gripper is particularly suitable for handling objects that offer two parallel flat surfaces for grasping. Future work will attempt to improve the grasping of non-flat objects.
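As a rough illustration of the kind of computation such a vision pipeline performs (not the paper's actual method, whose details are not given here), a grasp pose for a segmented RGB-D object point cloud can be sketched as the cloud centroid plus an in-plane gripper orientation from PCA of the object footprint. The function name and the whole approach are assumptions for illustration only:

```python
import numpy as np

def grasp_pose_from_points(points):
    """Toy grasp-pose estimate for a segmented object point cloud
    (N x 3 array in camera coordinates): the centroid gives the grasp
    position, and the first principal axis of the XY footprint gives
    the in-plane gripper orientation (yaw)."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Covariance of the footprint in the camera XY plane; its dominant
    # eigenvector is the longest axis of the object footprint.
    cov = np.cov(centered[:, :2].T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]   # longest footprint axis
    yaw = np.arctan2(major[1], major[0])     # gripper closes across this axis
    return centroid, yaw
```

A real system would additionally derive an approach direction from surface normals and validate the pose against gripper kinematics; this sketch only shows the position-and-orientation idea in its simplest form.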
IEEE International Conference on Automation Science and Engineering, pp. 1309-1314
D'Avella S.; Fontana M.; Vertechy R.; Tripicchio P.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11585/914959

Citations
  • Scopus: 5