Researcher:
Yeşilbek, Kemal Tuğrul

Job Title: Master Student
First Name: Kemal Tuğrul
Last Name: Yeşilbek
Name Variants: Yeşilbek, Kemal Tuğrul

Search Results

Now showing 1 - 4 of 4
  • Publication
    Sketch recognition with few examples
    (Pergamon-Elsevier Science Ltd, 2017) Yeşilbek, Kemal Tuğrul (Master Student, Graduate School of Sciences and Engineering); Sezgin, Tevfik Metin (Faculty Member, Department of Computer Engineering, College of Engineering; 18632)
    Sketch recognition is the task of converting hand-drawn digital ink into symbolic computer representations. Since the early days of sketch recognition, the bulk of the work in the field has focused on building accurate recognition algorithms for specific domains and well-defined data sets. Recognition methods explored so far have been developed and evaluated using standard machine learning pipelines and have consequently been built over many simplifying assumptions. For example, existing frameworks assume the presence of a fixed set of symbol classes, and the availability of plenty of annotated examples. However, in practice, these assumptions do not hold. In reality, the designer of a sketch recognition system starts with no labeled data at all, and faces the burden of data annotation. In this work, we propose to alleviate the burden of annotation by building systems that can learn from very few labeled examples and large amounts of unlabeled data. Our systems perform self-learning by automatically extending a very small set of labeled examples with new examples extracted from unlabeled sketches. The end result is a sufficiently large set of labeled training data, which can subsequently be used to train classifiers. We present four self-learning methods with varying levels of implementation difficulty and runtime complexity. One of these methods leverages contextual co-occurrence patterns to build a verifiably more diverse set of training instances. Rigorous experiments with large sets of data demonstrate that this novel approach based on exploiting contextual information leads to significant leaps in recognition performance. As a side contribution, we also demonstrate the utility of bagging for sketch recognition in imbalanced data sets with few positive examples and many outliers. (A minimal self-training sketch illustrating this self-learning loop appears after the result list below.)
  • Publication
    Sketch recognition with few examples (vol 69, pg 80, 2017)
    (Pergamon-Elsevier Science Ltd, 2021) Yeşilbek, Kemal Tuğrul (Master Student, Graduate School of Sciences and Engineering); Sezgin, Tevfik Metin (Faculty Member, Department of Computer Engineering, College of Engineering; 18632)
    N/A
  • Publication
    SVM for sketch recognition: which hyperparameter interval to try?
    (IEEE, 2015) Sezgin, Tevfik Metin (Faculty Member, Department of Computer Engineering, College of Engineering; 18632); Şen, Cansu (Master Student, Graduate School of Sciences and Engineering); Yeşilbek, Kemal Tuğrul (Master Student, Graduate School of Sciences and Engineering); Çakmak, Şerike (Master Student, Graduate School of Sciences and Engineering)
    Hyperparameters are among the most crucial factors that affect the performance of machine learning algorithms. Since there is no common ground on which hyperparameter combinations give the highest performance in terms of prediction accuracy, a hyperparameter search needs to be conducted each time a model is to be trained. In this work, we analyzed how similar hyperparameters perform on various datasets from the sketch recognition domain. Results have shown that the hyperparameter search space can be reduced to a subspace despite differences in dataset characteristics. (A sketch of such a restricted grid search appears after the result list below.)
  • Publication
    SVM for sketch recognition: which hyperparameter interval to try?
    (IEEE, 2015) Yeşilbek, Kemal Tuğrul (Master Student, Graduate School of Sciences and Engineering); Şen, Cansu (Master Student, Graduate School of Sciences and Engineering); Çakmak, Şerike (Master Student, Graduate School of Sciences and Engineering); Sezgin, Tevfik Metin (Faculty Member, Department of Computer Engineering, College of Engineering; 18632)
    Hyperparameters are among the most crucial factors that affect the performance of machine learning algorithms. Since there is no common ground on which hyperparameter combinations give the highest performance in terms of prediction accuracy, a hyperparameter search needs to be conducted each time a model is to be trained. In this work, we analyzed how similar hyperparameters perform on various datasets from the sketch recognition domain. Results have shown that the hyperparameter search space can be reduced to a subspace despite differences in dataset characteristics.
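
The self-learning idea described in the first abstract above can be illustrated with a short self-training loop: train on the few labeled sketches, pseudo-label the unlabeled ones the model is confident about, fold them into the training set, and retrain. This is a minimal sketch, not the authors' pipeline; the RBF-SVM base classifier, the 0.9 confidence threshold, and the fixed number of rounds are assumptions, and the ink-to-feature-vector step is omitted.

    import numpy as np
    from sklearn.svm import SVC

    def self_train(X_labeled, y_labeled, X_unlabeled, rounds=5, threshold=0.9):
        """Grow a tiny labeled set by pseudo-labeling confident unlabeled sketches."""
        X, y = np.array(X_labeled), np.array(y_labeled)
        pool = np.array(X_unlabeled)
        clf = SVC(kernel="rbf", probability=True).fit(X, y)
        for _ in range(rounds):
            if len(pool) == 0:
                break
            probs = clf.predict_proba(pool)             # class probabilities per unlabeled sketch
            confident = probs.max(axis=1) >= threshold  # accept only high-confidence pseudo-labels
            if not confident.any():
                break
            pseudo = clf.classes_[probs[confident].argmax(axis=1)]
            X = np.vstack([X, pool[confident]])         # extend the labeled set
            y = np.concatenate([y, pseudo])
            pool = pool[~confident]                     # shrink the unlabeled pool
            clf = SVC(kernel="rbf", probability=True).fit(X, y)  # retrain on the enlarged set
        return clf

The loop stops early once the unlabeled pool is exhausted or no prediction clears the threshold; the returned classifier is the one trained on the final, enlarged labeled set.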
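
The hyperparameter-interval observation in the two "SVM for sketch recognition" abstracts amounts to restricting the usual SVM grid search to a narrow subspace rather than sweeping the full range. The sketch below is illustrative only: the log-scale C and gamma intervals are assumed values, not the intervals reported in the paper, and features/labels stand for whatever sketch feature vectors and symbol classes a given dataset provides.

    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    # Search a narrow log-scale interval for C and gamma instead of a full-range sweep.
    param_grid = {
        "C": [2.0 ** k for k in range(-2, 6)],
        "gamma": [2.0 ** k for k in range(-8, 0)],
    }
    search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
    # search.fit(features, labels)   # features: sketch feature vectors, labels: symbol classes
    # print(search.best_params_)     # best (C, gamma) found inside the restricted interval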