Research Outputs

Permanent URI for this community: https://hdl.handle.net/20.500.14288/2

Search Results

Now showing 1 - 7 of 7
  • Publication
    Dynamic haptic interaction with video
    (CRC Press-Taylor and Francis Group, 2015) Dindar, Nuray; Tekalp, Ahmet Murat; Başdoğan, Çağatay; Department of Electrical and Electronics Engineering; Department of Mechanical Engineering; Graduate School of Sciences and Engineering; College of Engineering
  • Publication
    Hapticolor: interpolating color information as haptic feedback to assist the colorblind
    (Assoc Computing Machinery, 2016) Carcedo, Marta G.; Chua, Soon Hau; Perrault, Simon; Wozniak, Pawel; Joshi, Raj; Fjeld, Morten; Zhao, Shengdong; Obaid, Mohammad; Department of Mechanical Engineering; KU Arçelik Research Center for Creative Industries (KUAR); College of Engineering
    Most existing colorblind aids help their users distinguish and recognize colors but not compare them. We present HaptiColor, an assistive wristband that encodes discrete color information into spatiotemporal vibrations to help colorblind users recognize and compare colors. We ran three experiments: the first found the optimal number and placement of motors around the wrist-worn prototype, and the second tested the optimal way to represent discrete points between the vibration motors. Results suggested that using three vibration motors and pulses of varying duration to encode proximity information in spatiotemporal patterns is the optimal solution. Finally, we evaluated the HaptiColor prototype and encodings with six colorblind participants. Our results show that the participants were able to easily understand the encodings and perform color-comparison tasks accurately (94.4% to 100%).
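The proximity coding the abstract settles on (three motors, with pulse duration signaling how far a color lies from the nearest motor) can be sketched as follows. This is a hypothetical reconstruction, not the authors' implementation: the motor layout, the 50-200 ms pulse range, and the function name are all assumptions.

```python
def encode_color_position(position: float, n_motors: int = 3):
    """Map a normalized color position in [0, 1) around the wrist to a
    (motor_index, pulse_duration_ms) pair.

    A position falling between two motors is signaled by the nearer
    motor, with longer pulses for larger offsets (proximity coding).
    All constants here are illustrative assumptions.
    """
    if not 0.0 <= position < 1.0:
        raise ValueError("position must lie in [0, 1)")
    segment = 1.0 / n_motors                       # span owned by each motor
    motor = round(position / segment) % n_motors   # nearest motor index
    offset = abs(position - motor * segment)       # distance to that motor
    offset = min(offset, 1.0 - offset)             # wrap around the wrist
    # Assumed 50-200 ms pulse range: a short pulse means "on the motor",
    # a long pulse means "halfway toward the neighboring motor".
    duration_ms = 50 + int(round(offset / (segment / 2) * 150))
    return motor, duration_ms
```

Under these assumptions, a color exactly on a motor produces a short 50 ms pulse there, while a color midway between two motors produces a long pulse from the nearer one.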
  • Publication
    HaptiStylus: a novel stylus for conveying movement and rotational torque effects
    (IEEE Computer Soc, 2016) Arasan, Atakan; Başdoğan, Çağatay; Sezgin, Tevfik Metin; Department of Mechanical Engineering; Department of Computer Engineering; College of Engineering
  • Publication
    Real-time finite-element simulation of linear viscoelastic tissue behavior based on experimental data
    (IEEE Computer Soc, 2006) Sedef, Mert; Samur, Evren; Başdoğan, Çağatay; Department of Mechanical Engineering; Graduate School of Sciences and Engineering; College of Engineering
  • Publication
    Real-time visio-haptic interaction with static soft tissue models having geometric and material nonlinearity
    (Elsevier, 2010) Peterlik, Igor; Matyska, Luděk; Sedef, Mert; Başdoğan, Çağatay; Department of Mechanical Engineering; Graduate School of Sciences and Engineering; College of Engineering
    Realistic soft tissue models running in real-time are required for the development of computer-based surgical training systems. To construct a realistic soft tissue model, finite element (FE) modeling techniques are preferred over the particle-based techniques since the material properties can be integrated directly into the FE model to provide more accurate visual and haptic feedback to a user during the simulations. However, running even a static (time-independent) nonlinear FE model in real-time is a highly challenging task because the resulting stiffness matrix (K) is not constant and varies with the depth of penetration into the model. We propose a new computational approach allowing visio-haptic interaction with an FE model of a human liver having both nonlinear geometric and material properties. Our computational approach consists of two main steps: a pre-computation of the configuration space of all deformation configurations of the model, followed by the interpolation of the precomputed data for the calculation of the nodal displacements and reaction forces that are displayed to the user during the real-time interactions through a visual display and a haptic device, respectively. For the implementation of the proposed approach, no a priori assumptions or modeling simplifications about the mathematical complexity of the underlying soft tissue model, size and irregularity of the FE mesh are necessary. Moreover, it turns out that the deformation and force responses of the liver in the simulations are heavily influenced by the selected simulation parameters, such as the material model, boundary conditions and loading path, but the stability of the visual and haptic rendering in our approach does not depend on these parameters. In addition to showing the stability of our approach, the length of the precomputations as well as the accuracy of the interpolation scheme are evaluated for different interpolation functions and configuration space densities.
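The two-step scheme described in the abstract (offline precomputation over a sampled configuration space, then cheap interpolation of the precomputed responses at haptic rates) can be illustrated in miniature. The sketch below stands in for the expensive nonlinear FE solve with a simple cubic force-depth law; the function names and the linear interpolant are assumptions for illustration, not the paper's actual interpolation scheme.

```python
import bisect

def precompute(response, depths):
    """Offline step: evaluate the expensive nonlinear response at sampled
    penetration depths. `response` stands in for a full FE solve."""
    return [(d, response(d)) for d in depths]

def interpolate_force(table, depth):
    """Online step: linear interpolation between the two nearest
    precomputed samples, cheap enough for a ~1 kHz haptic loop."""
    xs = [d for d, _ in table]
    i = bisect.bisect_left(xs, depth)
    if i == 0:
        return table[0][1]      # clamp below the sampled range
    if i == len(table):
        return table[-1][1]     # clamp above the sampled range
    (d0, f0), (d1, f1) = table[i - 1], table[i]
    t = (depth - d0) / (d1 - d0)
    return f0 + t * (f1 - f0)

# A nonlinear (cubic) force-depth law as a stand-in for the FE model.
table = precompute(lambda d: 2.0 * d + 5.0 * d**3, [i / 10 for i in range(11)])
force = interpolate_force(table, 0.25)
```

Densifying the sampled configuration space trades longer precomputation for smaller interpolation error, which mirrors the accuracy-versus-precomputation-length trade-off the abstract evaluates.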
  • Publication
    Sensation: Measuring the effects of a human-to-human social touch based controller on the player experience
    (Assoc Computing Machinery, 2016) Canat, Mert; Tezcan, Mustafa Ozan; Yurdakul, Celalettin; Tiza, Eran; Sefercik, Buğra Can; Bostan, İdil; Buruk, Oğuz Turan; Göksun, Tilbe; Özcan, Oğuzhan; Department of Electrical and Electronics Engineering; Department of Mechanical Engineering; Department of Computer Engineering; Department of Psychology; Department of Media and Visual Arts; KU Arçelik Research Center for Creative Industries (KUAR); College of Engineering; College of Social Sciences and Humanities; Graduate School of Social Sciences and Humanities
    We observe increasing interest in the use of full-body interaction in games. However, human-to-human social touch has not yet been implemented as a sophisticated gaming apparatus. To address this, we designed Sensation, a device for detecting touch patterns between players, and introduce Shape Destroy, a collaborative game designed to be played with social touch. To understand whether social touch meaningfully contributes to the overall player experience in collaborative games, we conducted a user study with 30 participants. Participants played the same game using i) Sensation and ii) a gamepad, and completed a set of questionnaires aimed at measuring immersion. The collected data and our observations indicated a significant increase in general, shared, ludic, and affective involvement. Thus, human-to-human touch can be considered a promising control method for collaborative physical games.
  • Publication
    Systematic asynchrony bug exploration for android apps
    (Springer-Verlag Berlin, 2015) Emmi, Michael; Özkan, Burcu Külahcıoğlu; Taşıran, Serdar; Department of Mechanical Engineering; Graduate School of Sciences and Engineering; College of Engineering
    Smartphone and tablet “apps” are particularly susceptible to asynchrony bugs. To maintain responsive user interfaces, events are handled asynchronously, and unexpected schedules of event handlers can result in apparently random bugs that are notoriously difficult to reproduce, even given the user-event sequences that trigger them. We develop the AsyncDroid tool for the systematic discovery and reproduction of asynchrony bugs in Android apps. Given an app and a user-event sequence, AsyncDroid systematically executes alternate schedules of the same asynchronous event handlers, according to a programmable schedule enumerator. The input user-event sequence is supplied either by user interaction or generated by automated UI “monkeys”. By exposing and controlling the factors that influence the scheduling order of asynchronous handlers, our programmable enumerators can explicate reproducible schedules harboring bugs. By enumerating all schedules within a limited reordering threshold, we maximize the likelihood of encountering asynchrony bugs, according to prevailing hypotheses in the literature, and discover several bugs in Android apps found in the wild.
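The idea of enumerating all handler schedules within a limited reordering threshold can be sketched generically. The inversion-count bound and every name below are assumptions chosen for illustration; AsyncDroid's actual enumerator is programmable and Android-specific, and this sketch only shows the bounded-enumeration pattern.

```python
from itertools import permutations

def inversions(schedule, original):
    """Count handler pairs executed in the opposite order from the
    originally observed schedule (a simple reordering distance)."""
    pos = {h: i for i, h in enumerate(original)}
    count = 0
    for i in range(len(schedule)):
        for j in range(i + 1, len(schedule)):
            if pos[schedule[i]] > pos[schedule[j]]:
                count += 1
    return count

def bounded_schedules(handlers, max_reorderings):
    """Yield every schedule within `max_reorderings` inversions of the
    observed handler order, starting with the observed order itself."""
    for perm in permutations(handlers):
        if inversions(perm, handlers) <= max_reorderings:
            yield list(perm)

# With a bound of 1, only the observed order and its single adjacent
# swaps are explored, keeping the search space small.
schedules = list(bounded_schedules(["onCreate", "onClick", "onResult"], 1))
```

Bounding the reordering distance is what keeps the exploration tractable: the full permutation space grows factorially, while the near-order schedules are, per the hypotheses the abstract cites, the ones most likely to expose real asynchrony bugs.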