Publication:
Detecting human motion intention during pHRI using artificial neural networks trained by EMG signals

dc.contributor.coauthor: N/A
dc.contributor.department: N/A
dc.contributor.department: N/A
dc.contributor.department: N/A
dc.contributor.department: Department of Mechanical Engineering
dc.contributor.kuauthor: Şirintuna, Doğanay
dc.contributor.kuauthor: Özdamar, İdil
dc.contributor.kuauthor: Aydın, Yusuf
dc.contributor.kuauthor: Başdoğan, Çağatay
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Master Student
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Mechanical Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: N/A
dc.contributor.yokid: N/A
dc.contributor.yokid: 328776
dc.contributor.yokid: 125489
dc.date.accessioned: 2024-11-09T23:05:22Z
dc.date.issued: 2020
dc.description.abstract: With the recent advances in cobot (collaborative robot) technology, we can now work with a robot side by side in manufacturing environments. The collaboration between human and cobot can be enhanced by detecting the intentions of the human, making production more flexible and effective in future factories. In this regard, interpreting human intention and then adjusting the controller of the cobot accordingly to assist the human is a core challenge in physical human-robot interaction (pHRI). In this study, we propose a classifier based on Artificial Neural Networks (ANN) that predicts the intended direction of human movement by utilizing electromyography (EMG) signals acquired from human arm muscles. We employ this classifier in an admittance control architecture to constrain the human arm motion to the intended direction and prevent undesired movements along other directions. The proposed classifier and the control architecture have been validated through a path-following task utilizing a KUKA LBR iiwa 7 R800 cobot. The results of our experimental study with 6 participants show that the proposed architecture provides effective assistance to the human during the execution of the task and reduces undesired motion errors, without sacrificing task completion time.
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.sponsorship: Scientific and Technological Research Council of Turkey (TUBITAK) [EEEAG-117E645]. The Scientific and Technological Research Council of Turkey (TUBITAK) supported this work under contract EEEAG-117E645. Moreover, the authors thank Dr. O. Caldiran for the fruitful technical discussions during this study, and B. Oznalbant and B. Tosun for their initial support.
dc.identifier.doi: N/A
dc.identifier.isbn: 978-1-7281-6075-7
dc.identifier.issn: 1944-9445
dc.identifier.scopus: 2-s2.0-85095797289
dc.identifier.uri: https://hdl.handle.net/20.500.14288/8787
dc.identifier.wos: 598571700186
dc.keywords: Normalization methods
dc.keywords: Moment arms
dc.keywords: Surface EMG
dc.keywords: Muscle
dc.keywords: Manipulators
dc.keywords: Amplitude
dc.keywords: Sensors
dc.language: English
dc.publisher: IEEE
dc.source: 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
dc.subject: Computer science
dc.subject: Artificial intelligence
dc.subject: Engineering
dc.subject: Electrical and electronic engineering
dc.subject: Robotics
dc.title: Detecting human motion intention during pHRI using artificial neural networks trained by EMG signals
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: N/A
local.contributor.authorid: 0000-0002-4759-9706
local.contributor.authorid: 0000-0002-4598-5558
local.contributor.authorid: 0000-0002-6382-7334
local.contributor.kuauthor: Şirintuna, Doğanay
local.contributor.kuauthor: Özdamar, İdil
local.contributor.kuauthor: Aydın, Yusuf
local.contributor.kuauthor: Başdoğan, Çağatay
relation.isOrgUnitOfPublication: ba2836f3-206d-4724-918c-f598f0086a36
relation.isOrgUnitOfPublication.latestForDiscovery: ba2836f3-206d-4724-918c-f598f0086a36
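The abstract describes a pipeline in which EMG features drive an ANN classifier that predicts the intended motion direction, and an admittance controller then constrains motion to that direction. As a purely hypothetical sketch of that idea (none of the feature choices, weights, or function names below come from the paper; the classifier is reduced to a single illustrative layer), it might look like:

```python
# Hypothetical sketch, NOT the authors' implementation: EMG amplitude
# features -> direction classifier -> direction-constrained admittance.
import math

def rms(window):
    """Root-mean-square amplitude of one EMG channel window,
    a commonly used EMG feature (assumption: RMS features)."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def classify_direction(features, weights, biases):
    """Single-layer stand-in for the ANN: returns the index of the
    most activated direction class (e.g. 0 = x-axis, 1 = y-axis)."""
    scores = [sum(w * f for w, f in zip(row, features)) + b
              for row, b in zip(weights, biases)]
    return max(range(len(scores)), key=scores.__getitem__)

def admittance_velocity(force, direction, admittance=0.05):
    """Constrain motion to the predicted axis: project the applied
    force onto the unit direction vector, then scale by admittance."""
    dot = sum(f * d for f, d in zip(force, direction))
    return [admittance * dot * d for d in direction]

# Illustrative use with fixed, made-up numbers:
features = [rms([0.7, 0.9, 0.8]), rms([0.1, 0.2, 0.1])]   # two channels
cls = classify_direction(features, [[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0])
axes = {0: [1.0, 0.0], 1: [0.0, 1.0]}
v = admittance_velocity([2.0, 1.0], axes[cls])  # force off-axis component is rejected
```

The projection step is what "prevents undesired movements along other directions": any force component orthogonal to the predicted axis produces no commanded velocity.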
