Researcher: Özdamar, İdil

Job Title: Master Student

First Name: İdil

Last Name: Özdamar

Name Variants: Özdamar, İdil

Search Results

Now showing 1 - 4 of 4
  • Publication
    Tactile feedback displayed through touchscreens via electrovibration
    (IEEE, 2020) Özdamar, İdil; Alipour, Mohammad; Chehrehzad, Mohammadreza; Başdoğan, Çağatay; Department of Mechanical Engineering
    Displaying tactile feedback through a touchscreen via electrovibration has many potential applications in mobile devices, consumer electronics, home appliances, and the automotive industry. However, this area of research is new: the electromechanical interactions between the human finger and a touchscreen under electrovibration, as well as the effect of the frictional forces arising from these interactions on our haptic perception, are not yet fully understood. The aim of this study is to investigate in depth the electromechanical interactions between the human finger and a touchscreen under electrovibration. In particular, we investigate the effect of the following factors on the frictional force acting on the finger and on the finger contact area: a) the amplitude and signal type (AC or DC) of the voltage applied to the conductive layer of the touchscreen, b) the magnitude of the normal force applied by the finger on the touchscreen, and c) the finger speed. The results of this study enable us to better understand the physics of the contact interactions between a human finger and a touchscreen actuated by electrostatic forces.
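    A minimal sketch of the parallel-plate capacitor model that is commonly used in the electrovibration literature to reason about results like these (this is a textbook approximation, not this paper's analysis; the contact area, permittivity, gap, normal load, and friction coefficient below are assumed placeholder values): the electrostatic attraction F_e = eps0 * eps_r * A * V^2 / (2 * d^2) adds to the finger's normal load, so the sliding friction becomes F_f = mu * (F_n + F_e).

    import numpy as np

    EPS0 = 8.854e-12  # vacuum permittivity (F/m)

    def electrostatic_force(v, area=1e-4, eps_r=3.9, gap=100e-6):
        """Parallel-plate estimate of the electrostatic attraction (N).
        area, eps_r, and gap are assumed placeholders, not measured values."""
        return EPS0 * eps_r * area * np.square(v) / (2.0 * gap**2)

    def friction_force(v, f_normal=0.5, mu=1.0):
        """Coulomb friction with the electrostatic load added to the normal force."""
        return mu * (f_normal + electrostatic_force(v))

    # Because F_e scales with V^2, an AC excitation modulates friction at twice
    # its frequency, while a DC voltage of the same amplitude only shifts the
    # friction level -- one reason signal type matters in the study above.
    t = np.linspace(0.0, 0.01, 1000)                 # 10 ms window
    v_ac = 100.0 * np.sin(2.0 * np.pi * 125.0 * t)   # 125 Hz, 100 V sinusoid
    v_dc = np.full_like(t, 100.0)
    print(friction_force(v_ac).max(), friction_force(v_dc).mean())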
  • Publication
    Detecting human motion intention during pHRI using artificial neural networks trained by EMG signals
    (IEEE, 2020) Şirintuna, Doğanay; Özdamar, İdil; Aydın, Yusuf; Başdoğan, Çağatay; Department of Mechanical Engineering
    With the recent advances in cobot (collaborative robot) technology, we can now work with a robot side by side in manufacturing environments. The collaboration between human and cobot can be enhanced by detecting the human's intentions, making production more flexible and effective in future factories. In this regard, interpreting human intention and then adjusting the cobot's controller accordingly to assist the human is a core challenge in physical human-robot interaction (pHRI). In this study, we propose a classifier based on Artificial Neural Networks (ANN) that predicts the intended direction of human movement from electromyography (EMG) signals acquired from the human arm muscles. We employ this classifier in an admittance control architecture to constrain the human arm motion to the intended direction and prevent undesired movements along other directions. The proposed classifier and control architecture were validated through a path-following task on a KUKA LBR iiwa 7 R800 cobot. The results of our experimental study with six participants show that the proposed architecture provides effective assistance to the human during task execution and reduces undesired motion errors without sacrificing task completion time.
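    As a rough illustration of the pipeline described above (an ANN mapping windowed arm-muscle EMG to an intended movement direction), here is a minimal sketch on synthetic data; the time-domain features, channel count, network size, and class labels are illustrative assumptions, not the paper's configuration.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler

    def emg_features(window):
        """Standard per-channel time-domain EMG features for one window of
        shape (n_samples, n_channels): mean absolute value, root mean
        square, and waveform length."""
        mav = np.mean(np.abs(window), axis=0)
        rms = np.sqrt(np.mean(window**2, axis=0))
        wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
        return np.concatenate([mav, rms, wl])

    # Synthetic stand-in data: 200 windows of 256 samples from 4 channels,
    # each labeled with one of three hypothetical intent classes.
    rng = np.random.default_rng(0)
    windows = rng.normal(size=(200, 256, 4))
    labels = rng.integers(0, 3, size=200)    # 0: +x, 1: +y, 2: rest

    X = StandardScaler().fit_transform(np.stack([emg_features(w) for w in windows]))
    clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
    clf.fit(X, labels)
    direction = clf.predict(X[:1])[0]        # fed to the admittance controller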
  • Publication (Open Access)
    Step-change in friction under electrovibration
    (Institute of Electrical and Electronics Engineers (IEEE), 2020) Delhaye, Benoit P.; Lefevre, Philippe; Başdoğan, Çağatay; Özdamar, İdil; Alipour, Mohammad; Department of Mechanical Engineering
    Rendering tactile effects on a touchscreen via electrovibration has many potential applications. However, our knowledge of the tactile perception of a change in friction, and of the underlying contact mechanics, is still very limited. In this article, we investigate the tactile perception and the contact mechanics for a step change in friction under electrovibration during relative sliding between a finger and the surface of a capacitive touchscreen. First, we conduct magnitude estimation experiments to investigate the role of normal force and sliding velocity on the perceived tactile intensity for a step increase and a step decrease in friction, called rising friction (RF) and falling friction (FF), respectively. To investigate the contact mechanics involved in RF and FF, we then measure the frictional force, the apparent contact area, and the strains acting on the fingerpad during sliding at a constant velocity under three different normal loads, using a custom-made experimental setup. The results show that the participants perceived RF as stronger than FF, and that both the normal force and the sliding velocity significantly influenced their perception. These results are supported by our mechanical measurements: the relative change in friction, the apparent contact area, and the strain in the sliding direction were all higher for RF than for FF, especially at low normal forces. Taken together, our results suggest that different contact mechanics take place during RF and FF due to the viscoelastic behavior of the fingerpad skin, and that these differences influence our tactile perception of a step change in friction.
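    To make the RF/FF stimulus concrete, the sketch below switches a lumped electrostatic load on or off mid-swipe so the friction force steps up (RF) or down (FF); the lumped constant, voltage, normal load, and friction coefficient are assumed for illustration and deliberately ignore the viscoelastic skin effects the article actually measures.

    import numpy as np

    def friction_trace(t, v_on, step_time, rising=True,
                       f_normal=0.5, mu=0.8, k_e=2e-5):
        """Coulomb friction with an electrostatic load switching at step_time.
        k_e lumps the capacitor geometry into F_e = k_e * V^2 (assumed value).
        rising=True  -> voltage switches ON  (rising friction, RF)
        rising=False -> voltage switches OFF (falling friction, FF)"""
        on = (t >= step_time) if rising else (t < step_time)
        f_e = k_e * v_on**2 * on.astype(float)
        return mu * (f_normal + f_e)

    t = np.linspace(0.0, 2.0, 2000)   # a 2 s swipe at constant velocity
    rf = friction_trace(t, v_on=150.0, step_time=1.0, rising=True)
    ff = friction_trace(t, v_on=150.0, step_time=1.0, rising=False)

    # One simple metric for the step: the relative change in friction,
    # which this idealized model makes identical for RF and FF.
    print((rf.max() - rf.min()) / rf.min())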
  • Publication (Open Access)
    Detecting human motion intention during pHRI using artificial neural networks trained by EMG signals
    (Institute of Electrical and Electronics Engineers (IEEE), 2020) Şirintuna, Doğanay; Özdamar, İdil; Aydın, Yusuf; Başdoğan, Çağatay; Department of Mechanical Engineering
    With the recent advances in cobot (collaborative robot) technology, we can now work with a robot side by side in manufacturing environments. The collaboration between human and cobot can be enhanced by detecting the human's intentions, making production more flexible and effective in future factories. In this regard, interpreting human intention and then adjusting the cobot's controller accordingly to assist the human is a core challenge in physical human-robot interaction (pHRI). In this study, we propose a classifier based on Artificial Neural Networks (ANN) that predicts the intended direction of human movement from electromyography (EMG) signals acquired from the human arm muscles. We employ this classifier in an admittance control architecture to constrain the human arm motion to the intended direction and prevent undesired movements along other directions. The proposed classifier and control architecture were validated through a path-following task on a KUKA LBR iiwa 7 R800 cobot. The results of our experimental study with six participants show that the proposed architecture provides effective assistance to the human during task execution and reduces undesired motion errors without sacrificing task completion time.
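    To sketch how the classifier's output could plug into the admittance control architecture mentioned above, here is a minimal, hypothetical example in which the measured interaction force is projected onto the predicted direction before the admittance law integrates it, so motion along the other axes is suppressed; the DirectionalAdmittance class, gains, and sensor values are illustrative assumptions, not the paper's controller.

    import numpy as np

    class DirectionalAdmittance:
        """First-order admittance M*a + B*v = f, applied only to the force
        component along the direction predicted by the EMG classifier."""

        def __init__(self, mass=10.0, damping=50.0, dt=0.005):
            self.m, self.b, self.dt = mass, damping, dt
            self.vel = np.zeros(3)

        def step(self, f_ext, intended_dir):
            u = intended_dir / np.linalg.norm(intended_dir)
            f_proj = np.dot(f_ext, u) * u      # keep only the intended component
            acc = (f_proj - self.b * self.vel) / self.m
            self.vel += acc * self.dt
            return self.vel                    # commanded end-effector velocity

    ctrl = DirectionalAdmittance()
    f_measured = np.array([4.0, 1.5, -0.5])    # N, e.g. from a wrist F/T sensor
    dir_from_ann = np.array([1.0, 0.0, 0.0])   # classifier output: move along x
    print(ctrl.step(f_measured, dir_from_ann))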