Publications without Fulltext
Permanent URI for this collection: https://hdl.handle.net/20.500.14288/3
Search Results (5)
Publication (Metadata only): An audio-driven dancing avatar (Springer, 2008)
Authors: Balci, Koray; Kizoglu, Idil; Akarun, Lale; Canton-Ferrer, Cristian; Tilmanne, Joelle; Bozkurt, Elif; Erdem, A. Tanju; Yemez, Yücel; Ofli, Ferda; Demir, Yasemin; Erzin, Engin; Tekalp, Ahmet Murat

We present a framework for training and synthesis of an audio-driven dancing avatar. The avatar is trained for a given musical genre using multicamera video recordings of a dance performance. The video is analyzed to capture the time-varying posture of the dancer's body, whereas the musical audio signal is processed to extract the beat information. We consider two different marker-based schemes for the motion capture problem: the first uses 3D joint positions to represent the body motion, whereas the second uses joint angles. Body movements of the dancer are characterized by a set of recurring semantic motion patterns, i.e., dance figures. Each dance figure is modeled in a supervised manner with a set of Hidden Markov Model (HMM) structures and the associated beat frequency. In the synthesis phase, an audio signal of unknown musical type is first classified, within a time interval, into one of the genres learned in the analysis phase, based on mel-frequency cepstral coefficients (MFCCs). The motion parameters of the corresponding dance figures are then synthesized via the trained HMM structures in synchrony with the audio signal, based on the estimated tempo information. Finally, the generated motion parameters, either the joint angles or the 3D joint positions of the body, are animated along with the musical audio using two different animation tools that we have developed. Experimental results demonstrate the effectiveness of the proposed framework.

Publication (Metadata only): Adaptive human force scaling via admittance control for physical human-robot interaction (IEEE Computer Soc, 2021)
Authors: Aydın, Yusuf; Al Qaysi, Yahya Mohey Hamad; Başdoğan, Çağatay

The goal of this article is to design an admittance controller that lets a robot adaptively change its contribution to a collaborative manipulation task executed with a human partner, improving task performance. This is achieved by adaptively scaling the human force based on the human's movement intention while attending to the requirements of different task phases. In our approach, the human's movement intentions are estimated from the measured human force and the velocity of the manipulated object, and converted to a quantitative value using a fuzzy logic scheme. This value is then used as a variable gain in an admittance controller to adaptively adjust the robot's contribution to the task without changing the admittance time constant. We demonstrate the benefits of the proposed approach in a physical human-robot interaction (pHRI) experiment built around Fitts' reaching movement task. The results show that there is (a) an optimum admittance time constant that maximizes human force amplification and (b) a desirable admittance gain profile that leads to more effective co-manipulation in terms of overall task performance.
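As an illustration of the variable-gain admittance scheme summarized above, the following Python sketch scales the measured human force before feeding it through a fixed-time-constant admittance model. This is a minimal sketch, not the authors' controller: the mass and damping values, the gain bounds, and the tanh-based stand-in for the fuzzy intention estimator are all assumptions.

```python
import numpy as np

def intention_gain(force, velocity, g_min=0.5, g_max=2.0):
    """Crude stand-in for the fuzzy intention estimator: raise the gain
    when force and velocity agree in direction (human drives the motion),
    lower it when they conflict (human resists). Bounds are assumed."""
    agreement = np.tanh(force * velocity)  # in (-1, 1)
    return g_min + 0.5 * (g_max - g_min) * (1.0 + agreement)

def admittance_step(v, f_human, m=5.0, b=20.0, dt=0.001):
    """One forward-Euler step of  m*dv/dt + b*v = alpha*f_human.
    The time constant tau = m/b stays fixed; only the gain alpha adapts."""
    alpha = intention_gain(f_human, v)
    dv = (alpha * f_human - b * v) / m
    return v + dv * dt

# Example: the robot's commanded velocity under a constant 10 N push
v = 0.0
for _ in range(2000):  # 2 s at an assumed 1 kHz control rate
    v = admittance_step(v, f_human=10.0)
print(f"steady-state velocity: {v:.3f} m/s")
```

Note that because only the gain alpha varies, the transient response (set by tau = m/b) is unchanged; the controller amplifies or attenuates the human's effort without altering how quickly the robot responds.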
Publication (Metadata only): Recognition of haptic interaction patterns in dyadic joint object manipulation (IEEE Computer Society, 2015)
Authors: KucukYılmaz, Ayse; Madan, Çığıl Ece; Sezgin, Tevfik Metin; Başdoğan, Çağatay

The development of robots that can physically cooperate with humans has attracted interest over the last decades. This effort requires a deep understanding of the intrinsic properties of interaction. Until now, many researchers have focused on inferring human intents in terms of intermediate or terminal goals in physical tasks. However, to work side by side with people, an autonomous robot also needs in-depth information about the underlying haptic interaction patterns that are typically encountered during human-human cooperation. To our knowledge, no study has yet focused on characterizing such detailed information; in this sense, this work is a pioneering effort to gain a deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human interaction dataset, which captures two humans collaboratively transporting an object in a haptics-enabled virtual environment. In light of the information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: in any cooperative task, the interacting humans either (1) work in harmony, (2) cope with conflicts, or (3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns, and then propose five different feature sets, comprising force-, velocity-, and power-related information, for the classification of these patterns. Our evaluation shows that, using a multi-class support vector machine (SVM) classifier, we can achieve a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing a set of the most informative features selected with the Minimum Redundancy Maximum Relevance (mRMR) method.
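A classification pipeline in the spirit of the study above can be sketched as follows. This is an assumed setup, not the paper's code: the feature matrix is synthetic, and scikit-learn's mutual-information ranking stands in for mRMR, which additionally penalizes redundancy among the selected features.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-ins for the force-, velocity-, and power-related
# features; the shapes and class count are assumptions, not the dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 30))    # 600 windows x 30 candidate features
y = rng.integers(0, 6, size=600)  # 6 interaction-pattern labels

# Select the 10 most informative features, then classify with an RBF SVM.
clf = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=10),
    SVC(kernel="rbf"),
)

# On this random data the score is chance-level (~1/6); with real,
# structured features the same pipeline yields a meaningful accuracy.
print(cross_val_score(clf, X, y, cv=5).mean())
```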
Publication (Metadata only): Tactile roughness perception of virtual gratings by electrovibration (IEEE Computer Society, 2020)
Authors: Vardar, Yasemin; İşleyen, Aykut; Başdoğan, Çağatay

Realistic display of tactile textures on touch screens is a major step toward bringing haptic technology to the wide range of consumers who use electronic devices daily. Since texture topography cannot be rendered explicitly by electrovibration on touch screens, it is important to understand how we perceive virtual textures displayed by friction modulation via electrovibration. We investigated the roughness perception of real gratings made of plexiglass and, for comparison, of virtual gratings displayed by electrovibration through a touch screen. In particular, we conducted two psychophysical experiments with ten participants to investigate the effect of spatial period and of the normal force applied by the finger on the roughness perception of real and virtual gratings at macro scale. We also recorded the contact forces acting on the participants' fingers during the experiments. The results showed that the roughness perception of real and virtual gratings differs. We argue that this difference can be explained by the amount of fingerpad penetration into the gratings: for real gratings, penetration increased the tangential forces acting on the finger, whereas for virtual ones, where skin penetration is absent, tangential forces decreased with spatial period. Supporting our claim, we also found that increasing the normal force increases the perceived roughness of real gratings while having the opposite effect for virtual gratings. These results are consistent with the tangential force profiles recorded for both real and virtual gratings. In particular, the rate of change in tangential force (dF_t/dt) as a function of spatial period and normal force followed trends similar to those obtained for the roughness estimates of real and virtual gratings, suggesting that it is a better indicator of perceived roughness than the tangential force magnitude.

Publication (Metadata only): A review of surface haptics: enabling tactile effects on touch surfaces (Institute of Electrical and Electronics Engineers (IEEE) Computer Society, 2020)
Authors: Giraud, Frederic; Levesque, Vincent; Choi, Seungmoon; Başdoğan, Çağatay

In this article, we review the current technology underlying surface haptics, which converts passive touch surfaces into active ones (machine haptics); our perception of tactile stimuli displayed through active touch surfaces (human haptics); their potential applications (human-machine interaction); and, finally, the challenges ahead in making them available in commercial systems. The article primarily covers the tactile interactions of human fingers or hands with surface-haptics displays, focusing on the three most popular actuation methods: vibrotactile, electrostatic, and ultrasonic.
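Returning to the electrovibration grating study above, its proposed roughness indicator, the rate of change of tangential force, is straightforward to estimate from recorded force data. The sketch below is illustrative only: the sampling rate, the smoothing window, and the synthetic force trace are all assumptions, not the study's recordings.

```python
import numpy as np

def tangential_force_rate(f_t, fs=1000.0, window=51):
    """Estimate dF_t/dt from a sampled tangential-force trace.
    fs (Hz) and the moving-average window are assumed values; light
    smoothing keeps finite differencing from amplifying sensor noise."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(f_t, kernel, mode="same")
    return np.gradient(smoothed, 1.0 / fs)  # derivative w.r.t. time

# Example with a synthetic trace standing in for recorded finger forces
t = np.linspace(0.0, 2.0, 2000)
noise = 0.02 * np.random.default_rng(1).normal(size=t.size)
f_t = 0.5 * np.sin(2 * np.pi * 3 * t) + noise
rate = tangential_force_rate(f_t)
print(f"peak |dF_t/dt|: {np.abs(rate).max():.2f} N/s")
```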