Researcher:
Baytaş, Mehmet Aydın

Job Title: PhD Student

First Name: Mehmet Aydın

Last Name: Baytaş

Name Variants: Baytaş, Mehmet Aydın
Search Results

Now showing 1 - 8 of 8
  • Publication
    Hotspotizer: end-user authoring of mid-air gestural interactions
    (Association for Computing Machinery, 2014) Baytaş, Mehmet Aydın; Yemez, Yücel; Özcan, Oğuzhan. Department of Computer Engineering; Department of Media and Visual Arts; KU Arçelik Research Center for Creative Industries (KUAR); Graduate School of Social Sciences and Humanities; College of Engineering; College of Social Sciences and Humanities
    Drawing from a user-centered design process and guidelines derived from the literature, we developed a paradigm based on space discretization for declaratively authoring mid-air gestures and implemented it in Hotspotizer, an end-to-end toolkit for mapping custom gestures to keyboard commands. Our implementation empowers diverse user populations - including end-users without domain expertise - to develop custom gestural interfaces within minutes, for use with arbitrary applications.
  • Publication
    Labdesignar: configuring multi-camera motion capture systems in augmented reality
    (Assoc Computing Machinery, 2017) Fjeld, Morten; Baytaş, Mehmet Aydın; Yantaç, Asım Evren. Department of Media and Visual Arts; KU Arçelik Research Center for Creative Industries (KUAR); Graduate School of Social Sciences and Humanities; College of Social Sciences and Humanities
    We present LabDesignAR, an augmented reality application to support the planning, setup, and reconfiguration of marker-based motion capture systems with multiple cameras. LabDesignAR runs on the Microsoft HoloLens and allows the user to place an arbitrary number of virtual "holographic" motion capture cameras into an arbitrary space, in situ. The holographic cameras can be arbitrarily positioned, and different lens configurations can be selected to visualize the resulting fields of view and their intersections. LabDesignAR also demonstrates a hybrid natural gestural interaction technique, implemented through a fusion of the vision-based hand tracking capabilities of an augmented reality headset and instrumented gesture recognition with an electromyography armband. The source code for LabDesignAR and its supporting components can be found online.
  • Publication
    User interface paradigms for visually authoring mid-air gestures: a survey and a provocation
    (CEUR-WS, 2014) Baytaş, Mehmet Aydın; Yemez, Yücel. Department of Computer Engineering; KU Arçelik Research Center for Creative Industries (KUAR); Graduate School of Social Sciences and Humanities; College of Engineering
    Gesture authoring tools enable the rapid and experiential prototyping of gesture-based interfaces. We survey visual authoring tools for mid-air gestures and identify three paradigms used for representing and manipulating gesture information: graphs, visual markup languages, and timelines. We examine the strengths and limitations of these approaches and propose a novel paradigm for authoring location-based mid-air gestures based on space discretization.
  • Publication
    The effectiveness of mime-based creative drama education for exploring gesture-based user interfaces
    (Wiley, 2018) Ünlüer, Adviye Ayça; Cemalcılar, Zeynep; Baytaş, Mehmet Aydın; Buruk, Oğuz Turan; Yemez, Yücel; Özcan, Oğuzhan. Department of Psychology; Department of Computer Engineering; Department of Media and Visual Arts; KU Arçelik Research Center for Creative Industries (KUAR); College of Social Sciences and Humanities; Graduate School of Social Sciences and Humanities; College of Engineering
    User interfaces that utilise human gestures as input are becoming increasingly prevalent in diverse computing applications. However, few designers possess the deep insight, awareness, and experience regarding the nature and usage of gestures in user interfaces to the extent that they are able to exploit the technological affordances and innovate over them. We argue that design students, who will be expected to envision and create such interactions in the future, are constrained by habits formed around conventional user interfaces. Design students should gain an understanding of the nature of human gestures and how to use them to add value to UI designs. To this end, we formulated an 'awareness course' for design students based on concepts derived from mime art and creative drama. We developed the course iteratively through the involvement of three groups of students. The final version of the course was evaluated by incorporating the perspectives of design educators, an industry expert, and the students. We present the details of the course, describe the development process, and discuss the insights revealed by the evaluations.
  • Publication
    Viewfinder: supporting the installation and reconfiguration of multi-camera motion capture systems with a mobile application
    (Assoc Computing Machinery, 2017) Batis, Emmanuel; Bylund, Mathias; Fjeld, Morten; Baytaş, Mehmet Aydın; Çay, Damla; Yantaç, Asım Evren. Department of Media and Visual Arts; Graduate School of Social Sciences and Humanities; College of Social Sciences and Humanities
    We present ViewFinder, a cross-platform mobile application to support the installation and reconfiguration of marker-based motion capture systems with multiple cameras. ViewFinder addresses a common issue when installing or reconfiguring motion capture systems: that system components such as cameras and the host computer can be physically separate and/or difficult to reach, requiring personnel to maneuver between them frequently and laboriously. ViewFinder allows setup technicians or end users to visualize the output of each camera in the system in a variety of ways in real time, on a smartphone or tablet, while also providing a means to make adjustments to system parameters such as exposure or marker thresholds on the fly. The app has been designed and evaluated through a process observing user-centered design principles, and effectively reduces the amount of work involved in installing and reconfiguring motion capture systems.
  • Publication (Open Access)
    User interface paradigms for visually authoring mid-air gestures: a survey and a provocation
    (CEUR-WS, 2014) Baytaş, Mehmet Aydın; Yemez, Yücel; Özcan, Oğuzhan. Department of Media and Visual Arts; Department of Computer Engineering; College of Social Sciences and Humanities; College of Engineering
    Gesture authoring tools enable the rapid and experiential prototyping of gesture-based interfaces. We survey visual authoring tools for mid-air gestures and identify three paradigms used for representing and manipulating gesture information: graphs, visual markup languages, and timelines. We examine the strengths and limitations of these approaches and propose a novel paradigm for authoring location-based mid-air gestures based on space discretization.
  • Publication (Open Access)
    Towards materials for computational heirlooms: blockchains and wristwatches
    (Association for Computing Machinery (ACM), 2018) Fjeld, Morten; Baytaş, Mehmet Aydın; Coşkun, Aykut; Yantaç, Asım Evren. Department of Media and Visual Arts; KU Arçelik Research Center for Creative Industries (KUAR); College of Social Sciences and Humanities
    This paper explores the contrasting notions of "permanence and disposability," "the digital and the physical," and "symbolism and function" in the context of interaction design. Drawing from diverse streams of knowledge, we describe a novel design direction for enduring computational heirlooms based on the marriage of decentralized, trustless software and durable mobile hardware. To justify this concept, we review prior research; attempt to redefine the notion of "material;" propose blockchain-based software as a particular digital material to serve as a substrate for computational heirlooms; and argue for the use of mobile artifacts, informed in terms of their materials and formgiving practices by mechanical wristwatches, as its physical embodiment and functional counterpart. This integration is meant to enable mobile and ubiquitous interactive systems for storing, experiencing, and exchanging value throughout multiple human lifetimes; for showcasing the feats of computational sciences and crafts; and for enabling novel user experiences.
  • Publication (Open Access)
    iHDI 2020: interdisciplinary workshop on human-drone interaction
    (Association for Computing Machinery (ACM), 2020) Funk, Markus; Ljungblad, Sara; Garcia, Jeremie; La Delfa, Joseph; Mueller, Florian 'Floyd'; Baytaş, Mehmet Aydın. Graduate School of Social Sciences and Humanities
    Human-drone interaction (HDI) is becoming a ubiquitous topic in daily life, and a rising research topic within CHI. Knowledge from a wealth of disciplines - design, engineering, social sciences, and humanities - can inform the design and scholarship of HDI, and interdisciplinary communication is essential to this end. The Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020) aims to bring together diverse perspectives, advancing HDI and its scholarship through a rich variety of activities involving an assortment of research, design, and prototyping methods. The workshop intends to serve as a platform for a diverse community that continuously builds on each other's methods and philosophies, towards results that "take off."