Researcher:
Akman, Alican


Job Title: Master Student
First Name: Alican
Last Name: Akman
Name Variants: Akman, Alican

Search Results

Now showing 1 - 2 of 2
  • Publication
    Generation of 3D human models and animations using simple sketches
    (Canadian Information Processing Society, 2020) Sahillioğlu, Y.; Akman, Alican (Master Student); Sezgin, Tevfik Metin (Faculty Member); Department of Computer Engineering; Graduate School of Sciences and Engineering; College of Engineering
    Generating 3D models from 2D images or sketches is a widely studied and important problem in computer graphics. We describe the first method to generate a 3D human model from a single sketched stick figure. In contrast to existing human modeling techniques, our method requires neither a statistical body shape model nor a rigged 3D character model. We exploit Variational Autoencoders to develop a novel framework capable of transitioning from a simple 2D stick figure sketch to a corresponding 3D human model. Our network learns the mapping between the input sketch and the output 3D model. Furthermore, our model learns the embedding space around these models. We demonstrate that our network can generate not only 3D models, but also 3D animations through interpolation and extrapolation in the learned embedding space. Extensive experiments show that our model learns to generate reasonable 3D models and animations. © 2020 Canadian Information Processing Society. All rights reserved. (An illustrative code sketch of this pipeline follows the search results below.)
  • Publication (Open Access)
    Sketch-based interaction and modeling: where do we stand?
    (Cambridge University Press (CUP), 2019) Bonnici, Alexandra; Calleja, Gabriel; Camilleri, Kenneth P.; Fehling, Patrick; Ferreira, Alfredo; Hermuth, Florian; Israel, Johann Habakuk; Landwehr, Tom; Liu, Juncheng; Padfield, Natasha M. J.; Rosin, Paul L.; Akman, Alican; Sezgin, Tevfik Metin (Faculty Member); Department of Computer Engineering; Graduate School of Sciences and Engineering; College of Engineering
    Sketching is a natural and intuitive communication tool used for expressing concepts or ideas which are difficult to communicate through text or speech alone. Sketching is therefore used for a variety of purposes, from the expression of ideas on two-dimensional (2D) physical media, to object creation, manipulation, or deformation in three-dimensional (3D) immersive environments. This variety in sketching activities brings about a range of technologies which, while having similar scope, namely that of recording and interpreting the sketch gesture to effect some interaction, adopt different interpretation approaches according to the environment in which the sketch is drawn. In fields such as product design, sketches are drawn at various stages of the design process, and therefore, designers would benefit from sketch interpretation technologies which support these differing interactions. However, research typically focuses on one aspect of sketch interpretation and modeling such that literature on available technologies is fragmented and dispersed. In this paper, we bring together the relevant literature describing technologies which can support the product design industry, namely technologies which support the interpretation of sketches drawn on 2D media, sketch-based search interactions, as well as sketch gestures drawn in 3D media. This paper, therefore, gives a holistic view of the algorithmic support that can be provided in the design process. In so doing, we highlight the research gaps and future research directions required to provide full sketch-based interaction support.
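The first result above describes a Variational Autoencoder that maps a single 2D stick-figure sketch to a 3D human model and produces animations by interpolating in the learned embedding space. The paper's architecture is not reproduced on this page, so the PyTorch sketch below is only a minimal illustration of that idea under stated assumptions: the joint and vertex counts, the layer sizes, and the `interpolate_latent` helper are placeholders, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SketchToModelVAE(nn.Module):
    """Illustrative VAE mapping flattened 2D stick-figure joints to flattened
    3D mesh vertices. All dimensions are placeholders, not values from the paper."""

    def __init__(self, n_joints=15, n_vertices=1000, latent_dim=64):
        super().__init__()
        in_dim = n_joints * 2        # (x, y) per sketched joint
        out_dim = n_vertices * 3     # (x, y, z) per output mesh vertex
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(), nn.Linear(512, out_dim))

    def encode(self, sketch):
        h = self.encoder(sketch)
        return self.to_mu(h), self.to_logvar(h)

    def forward(self, sketch):
        mu, logvar = self.encode(sketch)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization trick
        return self.decoder(z), mu, logvar


def interpolate_latent(model, sketch_a, sketch_b, steps=10):
    """Walk the latent space between two sketches to produce animation frames."""
    with torch.no_grad():
        mu_a, _ = model.encode(sketch_a)
        mu_b, _ = model.encode(sketch_b)
        frames = [model.decoder((1 - t) * mu_a + t * mu_b)
                  for t in torch.linspace(0.0, 1.0, steps)]
    return torch.stack(frames)


# Example with random stand-in data: two "sketches" of 15 joints each (30 values).
model = SketchToModelVAE()
frames = interpolate_latent(model, torch.randn(1, 30), torch.randn(1, 30))
print(frames.shape)  # (10, 1, 3000): 10 frames of flattened mesh vertices
```

In this toy setup the interpolation between the two encoded sketches plays the role of the animation generation described in the abstract; the real system is trained on paired sketch and 3D model data rather than random tensors.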