Publication:
Object and relation centric representations for push effect prediction

dc.contributor.coauthor: Tekden, Ahmet E.
dc.contributor.coauthor: Asfour, Tamim
dc.contributor.coauthor: Uğur, Emre
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Erdem, Aykut
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.date.accessioned: 2024-12-29T09:41:24Z
dc.date.issued: 2024
dc.description.abstract: Pushing is an essential non-prehensile manipulation skill used for tasks ranging from pre-grasp manipulation to scene rearrangement; it requires reasoning about object relations in the scene, and pushing actions have therefore been widely studied in robotics. The effective use of pushing actions often requires an understanding of the dynamics of the manipulated objects and adaptation to the discrepancies between prediction and reality. For this reason, effect prediction and parameter estimation with pushing actions have been heavily investigated in the literature. However, current approaches are limited because they either model systems with a fixed number of objects or use image-based representations whose outputs are not very interpretable and quickly accumulate errors. In this paper, we propose a graph neural network based framework for effect prediction and parameter estimation of pushing actions by modeling object relations based on contacts or articulations. Our framework is validated both in real and simulated environments containing differently shaped multi-part objects connected via different types of joints and objects with different masses, and it outperforms image-based representations on physics prediction. Our approach enables the robot to predict and adapt the effect of a pushing action as it observes the scene. It can also be used for tool manipulation with never-seen tools. Further, we demonstrate 6D effect prediction in the lever-up action in the context of robot-based hard-disk disassembly.
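
Illustrative note: as a reading aid for the abstract above, the sketch below (in PyTorch) shows the general idea of graph-based push effect prediction, where scene objects or object parts are graph nodes and contact/articulation relations are edges. It is a minimal, hypothetical example: the class name, feature layouts, layer sizes, and toy inputs are assumptions for illustration and are not taken from the paper.

# Minimal sketch of one message-passing step over an object-relation graph.
# Nodes carry object/part features (e.g. pose and push parameters); edges carry
# relation features (e.g. contact vs. joint type); the output is a per-object
# predicted effect (e.g. a 6D pose change). All dimensions are illustrative.
import torch
import torch.nn as nn


class PushEffectGNN(nn.Module):
    def __init__(self, node_dim=7, edge_dim=3, hidden=64, out_dim=6):
        super().__init__()
        # Edge model: maps (sender, receiver, relation features) to a message.
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden))
        # Node model: maps (node features, aggregated messages) to a predicted effect.
        self.node_mlp = nn.Sequential(
            nn.Linear(node_dim + hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim))

    def forward(self, nodes, edge_index, edge_attr):
        # nodes:      (N, node_dim) object/part features
        # edge_index: (2, E) sender/receiver indices of contact or joint relations
        # edge_attr:  (E, edge_dim) relation features, e.g. relation-type one-hot
        send, recv = edge_index
        messages = self.edge_mlp(
            torch.cat([nodes[send], nodes[recv], edge_attr], dim=-1))
        # Sum incoming messages per receiving node.
        agg = torch.zeros(nodes.size(0), messages.size(-1))
        agg = agg.index_add(0, recv, messages)
        # Predict a per-object effect from node state plus aggregated messages.
        return self.node_mlp(torch.cat([nodes, agg], dim=-1))


# Toy usage: two contacting parts, one directed relation between them.
nodes = torch.randn(2, 7)
edge_index = torch.tensor([[0], [1]])        # part 0 -> part 1
edge_attr = torch.tensor([[1.0, 0.0, 0.0]])  # hypothetical "contact" one-hot
effects = PushEffectGNN()(nodes, edge_index, edge_attr)
print(effects.shape)  # torch.Size([2, 6])
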
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.openaccess: Green Submitted, hybrid, Green Published
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK; EU
dc.description.sponsors: This research has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement no. 731761, IMAGINE; supported by a TUBA GEBIP fellowship awarded to E. Erdem; and supported by a Tubitak 2210-A scholarship awarded to A.E. Tekden. The numerical calculations reported in this work were partially performed at TUBITAK ULAKBIM, High Performance and Grid Computing Center (TRUBA resources).
dc.description.volume: 174
dc.identifier.doi: 10.1016/j.robot.2024.104632
dc.identifier.eissn: 1872-793X
dc.identifier.issn: 0921-8890
dc.identifier.quartile: Q1
dc.identifier.scopus: 2-s2.0-85185165186
dc.identifier.uri: https://doi.org/10.1016/j.robot.2024.104632
dc.identifier.uri: https://hdl.handle.net/20.500.14288/23630
dc.identifier.wos: 1173506700001
dc.keywords: Push manipulation
dc.keywords: Effect prediction
dc.keywords: Parameter estimation
dc.keywords: Graph neural networks
dc.keywords: Interactive perception
dc.keywords: Articulation prediction
dc.language: en
dc.publisher: Elsevier
dc.relation.grantno: European Union [731761]
dc.relation.grantno: TUBA GEBIP fellowship
dc.relation.grantno: Tubitak 2210-A scholarship
dc.source: Robotics and Autonomous Systems
dc.subject: Automation and control systems
dc.subject: Computer science, artificial intelligence
dc.subject: Robotics
dc.title: Object and relation centric representations for push effect prediction
dc.type: Journal article
dspace.entity.type: Publication
local.contributor.kuauthor: Erdem, Aykut
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae