Publication:
Detecting human motion intention during pHRI using artificial neural networks trained by EMG signals

dc.contributor.department: N/A
dc.contributor.department: Department of Mechanical Engineering
dc.contributor.kuauthor: Şirintuna, Doğanay
dc.contributor.kuauthor: Özdamar, İdil
dc.contributor.kuauthor: Aydın, Yusuf
dc.contributor.kuauthor: Başdoğan, Çağatay
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Mechanical Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: N/A
dc.contributor.yokid: N/A
dc.contributor.yokid: N/A
dc.contributor.yokid: 125489
dc.date.accessioned: 2024-11-09T12:13:06Z
dc.date.issued: 2020
dc.description.abstract: With recent advances in cobot (collaborative robot) technology, we can now work side by side with a robot in manufacturing environments. The collaboration between human and cobot can be enhanced by detecting the human's intentions, making production more flexible and effective in the factories of the future. In this regard, interpreting human intention and then adjusting the cobot's controller accordingly to assist the human is a core challenge in physical human-robot interaction (pHRI). In this study, we propose a classifier based on Artificial Neural Networks (ANN) that predicts the intended direction of human movement using electromyography (EMG) signals acquired from the human arm muscles. We employ this classifier in an admittance control architecture to constrain the human's arm motion to the intended direction and prevent undesired movements along the other directions. The proposed classifier and control architecture were validated through a path-following task using a KUKA LBR iiwa 7 R800 cobot. The results of our experimental study with 6 participants show that the proposed architecture provides effective assistance to the human during task execution and reduces undesired motion errors without sacrificing task completion time.
dc.description.fulltext: YES
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.description.sponsorship: Scientific and Technological Research Council of Turkey (TÜBİTAK)
dc.description.version: Author's final manuscript
dc.format: pdf
dc.identifier.doi: 10.1109/RO-MAN47096.2020.9223438
dc.identifier.embargo: NO
dc.identifier.filenameinventoryno: IR02547
dc.identifier.isbn: 9781728160757
dc.identifier.link: https://doi.org/10.1109/RO-MAN47096.2020.9223438
dc.identifier.quartile: N/A
dc.identifier.scopus: 2-s2.0-85095797289
dc.identifier.uri: https://hdl.handle.net/20.500.14288/1214
dc.keywords: Electromyography
dc.keywords: Human-robot interaction
dc.keywords: Motion estimation
dc.keywords: Neural nets
dc.keywords: Path planning
dc.keywords: Signal processing
dc.language: English
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.grantno: EEEAG-117E645
dc.relation.uri: http://cdm21054.contentdm.oclc.org/cdm/ref/collection/IR/id/9185
dc.source: 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
dc.subject: Biomedical science
dc.title: Detecting human motion intention during pHRI using artificial neural networks trained by EMG signals
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: N/A
local.contributor.authorid: N/A
local.contributor.authorid: N/A
local.contributor.authorid: 0000-0002-6382-7334
local.contributor.kuauthor: Şirintuna, Doğanay
local.contributor.kuauthor: Özdamar, İdil
local.contributor.kuauthor: Aydın, Yusuf
local.contributor.kuauthor: Başdoğan, Çağatay
relation.isOrgUnitOfPublication: ba2836f3-206d-4724-918c-f598f0086a36
relation.isOrgUnitOfPublication.latestForDiscovery: ba2836f3-206d-4724-918c-f598f0086a36
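The abstract describes an admittance controller that constrains the human's arm motion to the direction predicted by the EMG classifier. The following is a minimal sketch of that general idea only, not the authors' implementation: the virtual mass and damping values, the time step, and the force-projection scheme are all assumptions chosen for illustration.

```python
import numpy as np

def admittance_step(f_ext, v_prev, direction, m=5.0, c=20.0, dt=0.001):
    """One Euler step of the virtual admittance dynamics M*v_dot + C*v = F,
    with the external force projected onto the predicted motion direction.

    f_ext     : measured human interaction force (3-vector)
    v_prev    : previous reference velocity of the end effector
    direction : direction predicted by the intention classifier
    m, c      : hypothetical virtual mass and damping (illustrative values)
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)            # unit vector of intended direction
    f_proj = np.dot(f_ext, d) * d        # keep only the intended component
    v_dot = (f_proj - c * v_prev) / m    # virtual admittance dynamics
    return v_prev + v_dot * dt

# Example: a force partly off-axis is constrained to the x direction,
# so the off-axis components of the reference velocity stay at zero.
v = np.zeros(3)
for _ in range(1000):                    # 1 s of simulated interaction
    v = admittance_step(np.array([10.0, 4.0, 0.0]), v, [1.0, 0.0, 0.0])
print(v)                                 # velocity builds up along x only
```

With these illustrative parameters the reference velocity converges toward F/C = 0.5 m/s along the predicted direction, while any force component the classifier deems unintended produces no motion, which is the assistive behavior the abstract describes.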

Files

Original bundle

Name: 9185.pdf
Size: 966 KB
Format: Adobe Portable Document Format