Publications without Fulltext
Permanent URI for this collection: https://hdl.handle.net/20.500.14288/3
3 results
Publication (metadata only): Using haptics to convey cause-and-effect relations in climate visualization (IEEE, 2008). Sen, Omer Lutfi; Başdoğan, Çağatay; Taşıran, Serdar; Yannier, Nesra.

We investigate the potential role of haptics in augmenting the visualization of climate data. In existing approaches to climate visualization, dimensions of climate data such as temperature, humidity, wind, precipitation, and cloud water are typically represented using different visual markers and dimensions such as color, size, intensity, and orientation. Since the number of dimensions in climate data is large and the data must be represented in connection with the topography, purely visual representations typically overwhelm users. Rather than overloading the visual channel, we investigate an alternative approach in which some of the climate information is displayed through the haptic channel in order to alleviate the perceptual and cognitive load of the user. In this approach, haptic feedback is further used to provide guidance while exploring climate data, enabling natural and intuitive learning of cause-and-effect relationships between climate variables. As the user explores climate data interactively under the guidance of wind forces displayed by a haptic device, he or she can better understand the occurrence of events such as cloud and rain formation and the effect of climate variables on these events. We designed a set of experiments to demonstrate the effectiveness of this multimodal approach.
Our experiments with 33 human subjects show that haptic feedback significantly improves the understanding of climate data and the cause-and-effect relations between climate variables, as well as the interpretation of variations in climate due to changes in terrain. © 2008 IEEE.

Publication (metadata only): Haptic negotiation and role exchange for collaboration in virtual environments (IEEE, 2010). Sezgin, Tevfik Metin; Başdoğan, Çağatay; Küçükyılmaz, Ayşe; Öğüz, Salih Özgür.

We investigate how collaborative guidance can be realized in multimodal virtual environments for dynamic tasks involving motor control. Haptic guidance in our context can be defined as any form of force/tactile feedback that the computer generates to help a user execute a task in a faster, more accurate, and subjectively more pleasing fashion. In particular, we are interested in determining guidance mechanisms that best facilitate task performance and arouse a natural sense of collaboration. We suggest that a haptic guidance system can be further improved if it is supplemented with a role exchange mechanism, which allows the computer to adjust the forces it applies to the user in response to his/her actions. Recent work on collaboration and role exchange has presented new perspectives on defining roles and interaction. However, existing approaches mainly focus on relatively basic environments where the state of the system can be defined with a few parameters. We designed and implemented a complex and highly dynamic multimodal game for testing our interaction model.
Since the state space of our application is complex, role exchange needs to be implemented carefully. We defined a novel negotiation process, which facilitates dynamic communication between the user and the computer, and realizes the exchange of roles using a three-state finite state machine. Our preliminary results indicate that even though the negotiation and role exchange mechanism we adopted does not improve performance by every evaluation criterion, it introduces a more personal and human-like interaction model.

Publication (metadata only): Conveying intentions through haptics in human-computer collaboration (IEEE, 2011). Sezgin, Tevfik Metin; Başdoğan, Çağatay; Küçükyılmaz, Ayşe.

Haptics has been used as a natural way for humans to communicate with computers in collaborative virtual environments. Human-computer collaboration is typically achieved by sharing control of the task between a human and a computer operator. An important research challenge in the field addresses the need to realize intention recognition and response, which involves a decision-making process between the partners. In an earlier study [11], we implemented a dynamic role exchange mechanism, which realizes decision making by means of trading the parties' control levels on the task. This mechanism showed promise of a more intuitive and comfortable communication. Here, we extend our earlier work to further investigate the utility of a role exchange mechanism in dynamic collaboration tasks.
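The three-state negotiation machine mentioned in the role-exchange abstract above is not specified in the listing; a minimal sketch of one plausible realization follows, where the state names, force threshold, and persistence counter are all illustrative assumptions rather than the paper's actual design:

```python
from enum import Enum, auto

class Role(Enum):
    COMPUTER_LEADS = auto()
    NEGOTIATION = auto()
    HUMAN_LEADS = auto()

class RoleExchangeFSM:
    """Hypothetical three-state role-exchange negotiation machine.

    A partner signals a desire to take (or cede) control by pushing
    against the current leader's guidance force; a sustained push
    completes the exchange, while backing off returns control to the
    current leader. States and thresholds are illustrative only.
    """

    def __init__(self, request_threshold=2.0, hold_steps=3):
        self.state = Role.COMPUTER_LEADS
        self.request_threshold = request_threshold  # opposing force (N) that opens negotiation
        self.hold_steps = hold_steps                # steps the push must persist
        self._counter = 0
        self._pending = None                        # role requested during negotiation

    def step(self, opposing_force):
        """Advance one control cycle given the user's force against the leader."""
        if self.state in (Role.COMPUTER_LEADS, Role.HUMAN_LEADS):
            if abs(opposing_force) > self.request_threshold:
                # A strong opposing push opens a negotiation for the other role.
                self._pending = (Role.HUMAN_LEADS
                                 if self.state is Role.COMPUTER_LEADS
                                 else Role.COMPUTER_LEADS)
                self.state = Role.NEGOTIATION
                self._counter = 1
        else:  # NEGOTIATION: the push must persist to complete the exchange
            if abs(opposing_force) > self.request_threshold:
                self._counter += 1
                if self._counter >= self.hold_steps:
                    self.state = self._pending
            else:
                # Request withdrawn: control returns to the previous leader.
                self.state = (Role.COMPUTER_LEADS
                              if self._pending is Role.HUMAN_LEADS
                              else Role.HUMAN_LEADS)
        return self.state
```

Keeping the machine to three states keeps the exchange legible to the user: control never flips instantly, so every hand-over is preceded by a perceptible negotiation phase.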
An experiment with 30 participants was conducted to compare the utility of a role exchange mechanism with that of a shared control scheme where the human and the computer share control equally at all times. A no-guidance condition was included as a baseline to present the benefits of these two guidance schemes more clearly. Our experiments show that the role exchange scheme maximizes the efficiency of the user, defined as the ratio of the work done by the user within the task to the energy she spends. Furthermore, we explored the added benefits of explicitly displaying the control state by embedding visual and vibrotactile sensory cues on top of the role exchange scheme. We observed that such cues decrease performance slightly, probably because they introduce an extra cognitive load, yet they improve the users' sense of collaboration and interaction with the computer. These cues also create a stronger sense of trust for the user towards her partner's control over the task.
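The efficiency measure defined in the last abstract (work done on the task divided by energy expended) can be illustrated with a simple discretization. The 1-D formulation and the function name below are assumptions for illustration, not the paper's exact definition:

```python
def user_efficiency(forces, displacements):
    """Efficiency = useful work done on the task / total energy expended.

    forces, displacements: per-step user force and hand displacement as
    signed scalars along the task axis (a simplified 1-D discretization).
    Work counts motion with the force as positive and against it as
    negative; energy counts every newton-meter of effort regardless of
    direction, so efficiency lies in [-1, 1].
    """
    work = sum(f * dx for f, dx in zip(forces, displacements))
    energy = sum(abs(f) * abs(dx) for f, dx in zip(forces, displacements))
    return work / energy if energy else 0.0
```

Under this reading, a user whose force always points along the motion scores 1.0, while fighting the device part of the time drives the ratio down, which is why the metric rewards smooth cooperation with the guidance forces.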