Researcher: Şirintuna, Doğanay
Name Variants: Şirintuna, Doğanay
Search Results (now showing 1 - 5 of 5)
Publication (Metadata only): A variable-fractional order admittance controller for pHRI (IEEE Inc., 2020)
Authors: Patoglu, Volkan; Tokatli, Ozan; Başdoğan, Çağatay; Aydın, Yusuf; Şirintuna, Doğanay; Çaldıran, Ozan
Affiliation: Department of Mechanical Engineering, College of Engineering; Graduate School of Sciences and Engineering
Abstract: In today's automation-driven manufacturing environments, emerging technologies such as cobots (collaborative robots) and augmented reality interfaces can help integrate humans into the production workflow to benefit from their adaptability and cognitive skills. In such settings, humans are expected to work side by side with robots and physically interact with them. However, the trade-off between stability and transparency is a core challenge in physical human-robot interaction (pHRI). While stability is of utmost importance for safety, transparency is required to fully exploit the precision and ability of robots in handling labor-intensive tasks. In this work, we propose a new variable admittance controller based on fractional-order control to handle this trade-off more effectively. We compared the performance of the fractional-order variable admittance controller against a classical admittance controller with fixed parameters as a baseline and an integer-order variable admittance controller during a realistic drilling task. Our comparisons indicate that the proposed controller led to a more transparent interaction than the other controllers without sacrificing stability. We also demonstrate a use case in which an augmented reality (AR) headset augments human sensory capabilities to reach a target drilling depth that would otherwise not be possible without shifting the role of decision maker to the robot.
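As an illustration of how a fractional-order admittance law of this kind could be realized in discrete time, the sketch below uses a Grünwald-Letnikov approximation of the fractional derivative. The specific model f = m·D^α v + b·v, the parameter values, and the truncated-memory length are assumptions for illustration; the controller proposed in the paper is not reproduced here.

```python
import numpy as np

def gl_coeffs(alpha: float, n: int) -> np.ndarray:
    """Grünwald-Letnikov coefficients c_j = (-1)^j * binom(alpha, j),
    computed with the standard recursion c_0 = 1, c_j = c_{j-1} * (1 - (alpha + 1) / j)."""
    c = np.empty(n)
    c[0] = 1.0
    for j in range(1, n):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    return c

class FractionalAdmittance:
    """Maps a measured interaction force to a velocity command through the
    illustrative fractional admittance  m * D^alpha v + b * v = f,
    with the fractional derivative approximated over a truncated memory."""

    def __init__(self, m: float, b: float, alpha: float, dt: float, memory: int = 512):
        self.m, self.b, self.alpha, self.dt = m, b, alpha, dt
        self.c = gl_coeffs(alpha, memory)
        self.past_v = []  # past_v[0] is v_{k-1}, past_v[1] is v_{k-2}, ...

    def step(self, f: float) -> float:
        h_a = self.dt ** (-self.alpha)
        n = min(len(self.past_v), len(self.c) - 1)
        # j >= 1 terms of the GL sum approximating D^alpha v at the current step
        past = sum(self.c[j] * self.past_v[j - 1] for j in range(1, n + 1))
        v = (f - self.m * h_a * past) / (self.m * h_a + self.b)
        self.past_v.insert(0, v)
        del self.past_v[len(self.c) - 1:]  # truncate the memory
        return v
```

With alpha = 1 the update reduces to a backward-Euler discretization of the classical admittance m·dv/dt + b·v = f, so an integer-order controller is recovered as a special case.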
© 2020 IEEE.

Publication (Metadata only): Detecting human motion intention during pHRI using artificial neural networks trained by EMG signals (IEEE, 2020)
Authors: Şirintuna, Doğanay; Özdamar, İdil; Aydın, Yusuf; Başdoğan, Çağatay
Affiliation: Department of Mechanical Engineering, College of Engineering; Graduate School of Sciences and Engineering
Abstract: With the recent advances in cobot (collaborative robot) technology, we can now work with a robot side by side in manufacturing environments. The collaboration between human and cobot can be enhanced by detecting the intentions of the human, making production more flexible and effective in future factories. In this regard, interpreting human intention and then adjusting the controller of the cobot accordingly to assist the human is a core challenge in physical human-robot interaction (pHRI). In this study, we propose a classifier based on Artificial Neural Networks (ANN) that predicts the intended direction of human movement by utilizing electromyography (EMG) signals acquired from the human arm muscles. We employ this classifier in an admittance control architecture to constrain human arm motion to the intended direction and prevent undesired movements along other directions. The proposed classifier and the control architecture have been validated through a path-following task utilizing a KUKA LBR iiwa 7 R800 cobot.
The results of our experimental study with 6 participants show that the proposed architecture provides effective assistance to the human during task execution and reduces undesired motion errors, while not sacrificing task completion time.

Publication (Open Access): A novel haptic feature set for the classification of interactive motor behaviors in collaborative object transfer (Institute of Electrical and Electronics Engineers (IEEE), 2021)
Authors: Başdoğan, Çağatay; Şirintuna, Doğanay; Al-Saadi, Zaid Rassim Mohammed; Küçükyılmaz, Ayşe
Affiliation: Department of Mechanical Engineering, College of Engineering; Graduate School of Sciences and Engineering
Abstract: Haptics provides a natural and intuitive channel of communication during the interaction of two humans in complex physical tasks, such as joint object transportation. However, despite the utmost importance of touch in physical interactions, the use of haptics is under-represented in the development of intelligent systems. This article explores the prominence of haptic data for extracting information about the underlying interaction patterns in physical human-human interaction (pHHI). We work on a joint object transportation scenario involving two human partners, and show that haptic features, based on force/torque information, suffice to identify human interactive behavior patterns. We categorize the interaction into four discrete behavior classes. These classes describe whether the partners work in harmony or face conflicts while jointly transporting an object through translational or rotational movements.
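Force/torque-based haptic features of the kind this abstract refers to could look like the sketch below. The statistics chosen here (mean force/torque magnitude, interaction power, fraction of opposing-power samples) are illustrative stand-ins, not the feature set used in the article.

```python
import numpy as np

def haptic_features(wrench: np.ndarray, vel: np.ndarray) -> np.ndarray:
    """Illustrative feature vector from one window of interaction data.
    wrench: (N, 6) force/torque samples; vel: (N, 6) end-effector twist samples.
    Returns a fixed-length vector suitable for a behavior classifier."""
    force, torque = wrench[:, :3], wrench[:, 3:]
    power = np.sum(wrench * vel, axis=1)          # instantaneous interaction power
    feats = [
        np.linalg.norm(force, axis=1).mean(),     # mean force magnitude
        np.linalg.norm(torque, axis=1).mean(),    # mean torque magnitude
        force.std(axis=0).mean(),                 # force variability in the window
        power.mean(),                             # net power exchanged
        np.abs(power).mean(),                     # effort regardless of sign
        (power < 0).mean(),                       # fraction of samples where motion opposes force
    ]
    return np.asarray(feats)
```

Windows of such vectors, labeled with the four behavior classes, could then be fed to any off-the-shelf classifier.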
In an experimental study, we collect data from 12 human dyads and verify the salience of the haptic features by achieving a correct classification rate of over 91% using a Random Forest classifier.

Publication (Open Access): Towards collaborative drilling with a cobot using admittance controller (Sage, 2020)
Authors: Aydın, Yusuf; Şirintuna, Doğanay; Başdoğan, Çağatay
Affiliation: Department of Mechanical Engineering, College of Engineering; Graduate School of Sciences and Engineering
Abstract: In the near future, collaborative robots (cobots) are expected to play a vital role in the manufacturing and automation sectors. It is predicted that workers will work side by side in collaboration with cobots to surpass fully automated factories. In this regard, physical human-robot interaction (pHRI) aims to develop natural communication between the partners to bring speed, flexibility, and ergonomics to the execution of complex manufacturing tasks. One challenge in pHRI is to design an optimal interaction controller that balances the limitations introduced by the contradicting nature of the transparency and stability requirements. In this paper, a general methodology for designing an admittance controller for a pHRI system is developed by considering the stability and transparency objectives. In our approach, the collaborative robot constrains the movement of the human operator to help with a pHRI task while an augmented reality (AR) interface informs the operator about its phases. To this end, the dynamical characterization of the collaborative robot (LBR IIWA 7 R800, KUKA Inc.) is presented first. Then, the stability and transparency analyses for our pHRI task involving collaborative drilling with this robot are reported. A range of allowable parameters for the admittance controller is determined by superimposing the stability and transparency graphs.
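The idea of superimposing stability and transparency constraints to obtain an allowable parameter region can be sketched as follows. Both the transparency proxy (a lower bound on the admittance magnitude at a low frequency) and the stability proxy (a hypothetical lower bound on damping) are placeholders for illustration; the actual coupled-stability analysis and robot dynamics from the paper are not reproduced here.

```python
import numpy as np

def allowable_region(m_vals, b_vals, w_low=1.0, y_min=0.05):
    """Marks (m, b) pairs of the admittance Y(s) = 1/(m*s + b) that satisfy
    both constraints. Transparency proxy: |Y(j*w_low)| >= y_min, i.e. the robot
    still feels light at the frequencies of voluntary human motion.
    Stability proxy: b >= 2 / (1 + m), a hypothetical boundary standing in for
    the stability analysis of the coupled human-robot system."""
    ok = np.zeros((len(m_vals), len(b_vals)), dtype=bool)
    for i, m in enumerate(m_vals):
        for j, b in enumerate(b_vals):
            transparent = 1.0 / abs(1j * w_low * m + b) >= y_min
            stable = b >= 2.0 / (1.0 + m)
            ok[i, j] = transparent and stable
    return ok
```

Parameter sets picked from inside such a region could then be compared experimentally, as the paper does with three sets from its allowable range.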
Finally, three different sets of parameters are selected from the allowable range, and the effect of admittance controllers utilizing these parameter sets on task performance is investigated.

Publication (Open Access): Detecting human motion intention during pHRI using artificial neural networks trained by EMG signals (Institute of Electrical and Electronics Engineers (IEEE), 2020)
Authors: Şirintuna, Doğanay; Özdamar, İdil; Aydın, Yusuf; Başdoğan, Çağatay
Affiliation: Department of Mechanical Engineering, College of Engineering; Graduate School of Sciences and Engineering
Abstract: Open Access version of the record listed above under the same title; the abstract is identical.
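A minimal sketch of an EMG-to-direction pipeline in the spirit of these records: RMS features over short windows feed a classifier. A single softmax layer is used here as a stand-in for the ANN; the actual network architecture, EMG channels, and training procedure of the paper are not reproduced.

```python
import numpy as np

def emg_rms_features(emg: np.ndarray, win: int) -> np.ndarray:
    """Root-mean-square over non-overlapping windows, per channel.
    emg: (samples, channels) -> features: (windows, channels)."""
    n = (emg.shape[0] // win) * win
    blocks = emg[:n].reshape(-1, win, emg.shape[1])
    return np.sqrt((blocks ** 2).mean(axis=1))

class SoftmaxDirectionClassifier:
    """Single softmax layer mapping EMG features to one of n_class movement
    directions, trained by full-batch gradient descent on cross-entropy loss."""

    def __init__(self, n_feat, n_class, lr=0.1, epochs=300, seed=0):
        rng = np.random.default_rng(seed)
        self.W = 0.01 * rng.normal(size=(n_feat, n_class))
        self.b = np.zeros(n_class)
        self.lr, self.epochs = lr, epochs

    def fit(self, X, y):
        Y = np.eye(self.b.size)[y]                       # one-hot labels
        for _ in range(self.epochs):
            z = X @ self.W + self.b
            p = np.exp(z - z.max(axis=1, keepdims=True)) # stable softmax
            p /= p.sum(axis=1, keepdims=True)
            g = (p - Y) / len(X)                         # cross-entropy gradient
            self.W -= self.lr * X.T @ g
            self.b -= self.lr * g.sum(axis=0)
        return self

    def predict(self, X):
        return np.argmax(X @ self.W + self.b, axis=1)
```

The predicted direction could then gate an admittance controller so that motion is assisted along the intended axis and damped along the others, as the abstract describes.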