Researcher:
Küçükyılmaz, Ayşe

Job Title: PhD Student
First Name: Ayşe
Last Name: Küçükyılmaz
Name Variants: Küçükyılmaz, Ayşe

Search Results

Now showing 1 - 6 of 6
  • Publication
    Role allocation through haptics in physical human-robot interaction
    (Institute of Electrical and Electronics Engineers (IEEE), 2013) Küçükyılmaz, Ayşe; Sezgin, Tevfik Metin; Başdoğan, Çağatay; Department of Computer Engineering; Department of Mechanical Engineering
    This paper presents a summary of our efforts to enable dynamic role allocation between humans and robots in physical collaboration tasks. A major goal in physical human-robot interaction research is to develop tacit and natural communication between partners. In previous work, we suggested that the communication between a human and a robot would benefit from a decision making process in which the robot can dynamically adjust its control level during the task based on the intentions of the human. In order to do this, we define leader and follower roles for the partners, and using a role exchange mechanism, we enable the partners to negotiate solely through force information to exchange roles. We show that when compared to an “equal control” condition, the role exchange mechanism improves task performance and the joint efficiency of the partners.
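    The dynamic role allocation summarized above can be pictured as a shared-control blend. The sketch below is purely illustrative and not the paper's actual controller (the function names, gain, and threshold are assumptions): a scalar control level `alpha` mixes the partners' force commands, and a hypothetical update rule shifts control toward the human when their applied force signals disagreement, i.e., intent to lead.

    ```python
    # Illustrative sketch only; the paper's controller and negotiation
    # rules are not specified in this abstract.

    def blended_command(f_human, f_robot, alpha):
        """Combine partner commands; alpha = 1 gives the human full control."""
        return alpha * f_human + (1.0 - alpha) * f_robot

    def update_alpha(alpha, f_human, f_robot, gain=0.1, threshold=0.5):
        """Shift control toward the human when their force strongly
        disagrees with the robot's command (a crude proxy for intent)."""
        disagreement = f_human - f_robot
        if abs(disagreement) > threshold:
            alpha += gain   # human asserts intent: hand over control
        else:
            alpha -= gain   # agreement: the robot may lead again
        return min(1.0, max(0.0, alpha))
    ```

    In this toy model, "equal control" corresponds to holding `alpha` fixed at 0.5, which is the baseline condition the paper compares against.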
  • Publication
    Haptic negotiation and role exchange for collaboration in virtual environments
    (IEEE, 2010) Sezgin, Tevfik Metin; Başdoğan, Çağatay; Küçükyılmaz, Ayşe; Öğüz, Salih Özgür; Department of Computer Engineering; Department of Mechanical Engineering
    We investigate how collaborative guidance can be realized in multimodal virtual environments for dynamic tasks involving motor control. Haptic guidance in our context can be defined as any form of force/tactile feedback that the computer generates to help a user execute a task in a faster, more accurate, and subjectively more pleasing fashion. In particular, we are interested in determining guidance mechanisms that best facilitate task performance and evoke a natural sense of collaboration. We suggest that a haptic guidance system can be further improved if it is supplemented with a role exchange mechanism, which allows the computer to adjust the forces it applies to the user in response to his/her actions. Recent work on collaboration and role exchange presented new perspectives on defining roles and interaction. However, existing approaches mainly focus on relatively basic environments where the state of the system can be defined with a few parameters. We designed and implemented a complex and highly dynamic multimodal game for testing our interaction model. Since the state space of our application is complex, role exchange needs to be implemented carefully. We defined a novel negotiation process, which facilitates dynamic communication between the user and the computer, and realizes the exchange of roles using a three-state finite state machine. Our preliminary results indicate that even though the negotiation and role exchange mechanism we adopted does not improve performance on every evaluation criterion, it introduces a more personal and human-like interaction model.
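    The three-state finite state machine mentioned in the abstract could be sketched as follows. The state names, thresholds, and transition conditions here are assumptions for illustration only; the paper's actual machine may differ.

    ```python
    # Hypothetical three-state role-exchange FSM: the lead passes through
    # a negotiating state, resolved by how strongly the human pushes.

    HUMAN_LEADS, NEGOTIATING, COMPUTER_LEADS = "human", "negotiating", "computer"

    def step(state, human_force, threshold=1.0):
        """Advance the FSM one tick from the human's force magnitude."""
        if state == HUMAN_LEADS:
            # Releasing the device opens a negotiation.
            return NEGOTIATING if human_force < threshold else HUMAN_LEADS
        if state == COMPUTER_LEADS:
            # A strong push reopens the negotiation.
            return NEGOTIATING if human_force > threshold else COMPUTER_LEADS
        # Negotiating: whoever asserts (or yields) force takes (or cedes) the lead.
        if human_force > threshold:
            return HUMAN_LEADS
        if human_force < threshold / 2:
            return COMPUTER_LEADS
        return NEGOTIATING
    ```

    Keeping the negotiation as an explicit intermediate state, rather than toggling leadership directly, is one way to avoid rapid role oscillation when the force signal hovers near the threshold.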
  • Publication
    Conveying intentions through haptics in human-computer collaboration
    (IEEE, 2011) Sezgin, Tevfik Metin; Başdoğan, Çağatay; Küçükyılmaz, Ayşe; Department of Computer Engineering; Department of Mechanical Engineering
    Haptics has been used as a natural way for humans to communicate with computers in collaborative virtual environments. Human-computer collaboration is typically achieved by sharing control of the task between a human and a computer operator. An important research challenge in the field addresses the need to realize intention recognition and response, which involves a decision making process between the partners. In an earlier study [11], we implemented a dynamic role exchange mechanism, which realizes decision making by means of trading the parties' control levels on the task. This mechanism showed promise of enabling more intuitive and comfortable communication. Here, we extend our earlier work to further investigate the utility of a role exchange mechanism in dynamic collaboration tasks. An experiment with 30 participants was conducted to compare the utility of a role exchange mechanism with that of a shared control scheme where the human and the computer share control equally at all times. A no-guidance condition is considered as a base case to present the benefits of these two guidance schemes more clearly. Our experiments show that the role exchange scheme maximizes the efficiency of the user, which is the ratio of the work done by the user within the task to the energy spent by her. Furthermore, we explored the added benefits of explicitly displaying the control state by embedding visual and vibrotactile sensory cues on top of the role exchange scheme. We observed that such cues decrease performance slightly, probably because they introduce an extra cognitive load, yet they improve the users' sense of collaboration and interaction with the computer. These cues also create a stronger sense of trust for the user towards her partner's control over the task.
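    The efficiency measure described in this abstract, the ratio of the work the user does on the task to the energy the user expends, can be approximated from sampled force and velocity signals. This is a sketch under assumed definitions (signed power for work, absolute power for energy); the paper's exact formulation may differ.

    ```python
    # Sketch: user efficiency as useful (signed) work divided by total
    # energy expended, both approximated by discrete-time sums.

    def user_efficiency(forces, velocities, dt):
        """forces, velocities: per-sample user force and object velocity."""
        work = sum(f * v for f, v in zip(forces, velocities)) * dt          # signed work on the object
        energy = sum(abs(f * v) for f, v in zip(forces, velocities)) * dt   # total energy spent
        return work / energy if energy else 0.0
    ```

    Under this definition, efficiency is 1 when every unit of energy moves the object in the direction of the applied force, and drops when the user fights the motion, for example by opposing the computer's guidance.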
  • Publication (Open Access)
    Supporting negotiation behavior with haptics-enabled human-computer interfaces
    (Institute of Electrical and Electronics Engineers (IEEE), 2012) Küçükyılmaz, Ayşe; Sezgin, Tevfik Metin; Başdoğan, Çağatay; Department of Mechanical Engineering
    An active research goal for human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One approach that has been advocated to achieve this has been to build computer systems with human-like qualities and capabilities. In this paper, we present insight into how human-computer interaction can be enriched by endowing computers with behavioral patterns that naturally appear in human-human negotiation scenarios. For this purpose, we introduce a two-party negotiation game specifically built for studying the effectiveness of haptic and audio-visual cues in conveying negotiation related behaviors. The game is centered on a real-time continuous two-party negotiation scenario based on the existing game-theory and negotiation literature. During the game, humans are confronted with a computer opponent, which can display different behaviors, such as concession, competition, and negotiation. Through a user study, we show that the behaviors that are associated with human negotiation can be incorporated into human-computer interaction, and the addition of haptic cues provides a statistically significant increase in the human-recognition accuracy of machine-displayed behaviors. In addition to aspects of conveying these negotiation-related behaviors, we also focus on and report game-theoretical aspects of the overall interaction experience. In particular, we show that, as reported in the game-theory literature, certain negotiation strategies such as tit-for-tat may generate maximum combined utility for the negotiating parties, providing an excellent balance between the energy spent by the user and the combined utility of the negotiating parties.
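    The tit-for-tat strategy mentioned above, known from the game-theory literature, can be illustrated with a minimal concession rule: the agent concedes exactly as much as its opponent conceded in the previous round. The offer scale and function below are assumptions for illustration, not the game used in the paper.

    ```python
    # Toy tit-for-tat concession rule on a [0, 1] offer scale, where the
    # agent demands high values and the opponent offers low ones.

    def tit_for_tat(my_demand, opp_prev_offer, opp_offer):
        """Mirror the opponent's last concession; never reward a retreat."""
        concession = max(0.0, opp_offer - opp_prev_offer)  # opponent's move toward us
        return max(0.0, my_demand - concession)            # concede the same amount
    ```

    Mirroring concessions in this way rewards cooperation and punishes stalling, which is one intuition for why tit-for-tat tends to produce high combined utility for both parties.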
  • Publication (Open Access)
    Intention recognition for dynamic role exchange in haptic collaboration
    (Institute of Electrical and Electronics Engineers (IEEE), 2013) Küçükyılmaz, Ayşe; Sezgin, Tevfik Metin; Başdoğan, Çağatay; Department of Mechanical Engineering
    In human-computer collaboration involving haptics, a key issue that remains to be solved is to establish an intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, because they lack the adaptability, versatility, and awareness of a human, their ability to improve efficiency and effectiveness in dynamic tasks is limited. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction setup. In this paper, we investigate the utility of such a dynamic role exchange mechanism, where partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency in addition to the task performance. We show that when compared to an equal control condition, a role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which are used to display the state of interaction, allows the users to become aware of the underlying role exchange mechanism and utilize it in favor of the task. These cues also improve the user's sense of interaction and reinforce his/her belief that the computer aids with the execution of the task.
  • Publication (Open Access)
    The role of roles: physical cooperation between humans and robots
    (Sage, 2012) Moertl, Alexander; Lawitzky, Martin; Hirche, Sandra; Küçükyılmaz, Ayşe; Sezgin, Tevfik Metin; Başdoğan, Çağatay; Department of Computer Engineering; Department of Mechanical Engineering
    As recent robotics research achievements have softened the strict separation of the working spaces of humans and robots, close interaction of humans and robots comes rapidly into reach. In this context, physical human-robot interaction raises a number of questions regarding a desired intuitive robot behavior. The continuous bilateral information and energy exchange requires an appropriate continuous robot feedback. Investigating a cooperative manipulation task, the desired behavior is a combination of an urge to fulfill the task, a smooth instant reactive behavior to human force inputs, and an assignment of the task effort to the cooperating agents. In this paper, a formal analysis of human-robot cooperative load transport is presented. Three different possibilities for the assignment of task effort are proposed. Two proposed dynamic role exchange mechanisms adjust the robot's urge to complete the task based on the human feedback. For comparison, a static role allocation strategy not relying on the human agreement feedback is investigated as well. All three role allocation mechanisms are evaluated in a user study that involves large-scale kinesthetic interaction and full-body human motion. Results show tradeoffs between subjective and objective performance measures, indicating a clear objective advantage of the proposed dynamic role allocation scheme.