Research Outputs

Permanent URI for this community: https://hdl.handle.net/20.500.14288/2

Search Results

Now showing 1 - 10 of 10
  • Publication
    A classification of concurrency bugs in Java benchmarks by developer intent
    (Association for Computing Machinery (ACM), 2006) Keremoğlu, M. Erkan (Researcher); Taşıran, Serdar (Faculty Member); Elmas, Tayfun (PhD Student); Department of Computer Engineering, College of Engineering; Graduate School of Sciences and Engineering
    This work addresses the issue of selecting the formal correctness criterion for a concurrent Java program that best corresponds to the developer's intent. We study a set of concurrency-related bugs detected in Java benchmarks reported in the literature. For each program, we determine whether race-freedom, atomicity, or refinement is the simplest and most appropriate criterion for program correctness. Our purpose is to demonstrate empirically that both the appropriate fix for a concurrency error and the selection of a program analysis tool for detecting such an error must be based on a proper expression of the designer's intent as a formal correctness criterion.
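As a hedged illustration of why the choice of criterion matters (this example is ours, not drawn from the paper's benchmark set), the following Java class is free of data races yet violates the atomicity the developer presumably intended:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch (not from the paper): race-free at the level of
// individual accesses, yet still buggy. Every field access happens inside a
// synchronized method, so a race detector reports nothing; the error only
// surfaces if the intended correctness criterion is atomicity of the
// composed check-then-act sequence.
class Inventory {
    private final Map<String, Integer> stock = new HashMap<>();

    public synchronized int count(String item) {
        return stock.getOrDefault(item, 0);
    }

    public synchronized void remove(String item) {
        stock.put(item, stock.get(item) - 1); // assumes the item is present
    }

    // BUG: count() and remove() are each atomic, but their composition is
    // not. Two threads can both observe count > 0 and both call remove(),
    // driving the stock negative. Race-freedom holds; atomicity is violated.
    public void buyIfAvailable(String item) {
        if (count(item) > 0) {
            remove(item);
        }
    }
}
```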
  • Publication (Open Access)
    Alpha-beta-conspiracy search
    (International Computer Games Association (ICGA), 2002) McAllester, David A.; Yüret, Deniz (Faculty Member); Department of Computer Engineering, College of Engineering
    We introduce a variant of alpha-beta search in which each node is associated with two depths rather than one. The purpose of alpha-beta search is to find strategies for each player that together establish a value for the root position: the max strategy establishes a lower bound and the min strategy an upper bound. It has long been observed that forced moves should be searched more deeply. Here we make the observation that in the max strategy we are only concerned with the forcedness of max moves, and in the min strategy we are only concerned with the forcedness of min moves. This leads to two measures of depth, one for each strategy, and to a two-depth variant of alpha-beta called ABC search. The two-depth approach can be formally derived from the theory of conspiracy numbers, and the structure of the ABC procedure is justified by two theorems relating ABC search to conspiracy numbers.
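The two-depth idea can be sketched as follows; the Position/Move interface, the forcedness test, and the simple 0/1 depth charge below are our own assumptions for illustration, not the published ABC procedure (which derives its depth accounting from conspiracy numbers):

```java
// Minimal two-depth alpha-beta skeleton. Position, Move, and isForced()
// are hypothetical; dMax and dMin are the remaining depths of the max and
// min strategies respectively.
interface Move {}

interface Position {
    boolean isTerminal();
    int evaluate();                 // static evaluation from max's viewpoint
    Iterable<Move> legalMoves();
    Position play(Move m);
    boolean isForced(Move m);       // e.g., an only-reply or checking move
}

final class AbcSearch {
    static int search(Position p, int dMax, int dMin,
                      int alpha, int beta, boolean maxToMove) {
        if (p.isTerminal() || (maxToMove ? dMax <= 0 : dMin <= 0)) {
            return p.evaluate();
        }
        if (maxToMove) {
            int best = Integer.MIN_VALUE;
            for (Move m : p.legalMoves()) {
                // Only max's depth is charged on max's moves; forced moves
                // cost less depth, so forcing lines are searched more deeply.
                int cost = p.isForced(m) ? 0 : 1;
                best = Math.max(best,
                        search(p.play(m), dMax - cost, dMin, alpha, beta, false));
                alpha = Math.max(alpha, best);
                if (alpha >= beta) break;   // beta cutoff
            }
            return best;
        } else {
            int best = Integer.MAX_VALUE;
            for (Move m : p.legalMoves()) {
                // Symmetrically, only min's depth is charged on min's moves.
                int cost = p.isForced(m) ? 0 : 1;
                best = Math.min(best,
                        search(p.play(m), dMax, dMin - cost, alpha, beta, true));
                beta = Math.min(beta, best);
                if (alpha >= beta) break;   // alpha cutoff
            }
            return best;
        }
    }
}
```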
  • Publication
    Foreword to the special section on Expressive 2015
    (Pergamon-Elsevier Science Ltd, 2016) Sezgin, Tevfik Metin (Faculty Member); Department of Computer Engineering, College of Engineering
    N/A
  • Publication (Open Access)
    From noon to sunset: interactive rendering, relighting, and recolouring of landscape photographs by modifying solar position
    (Wiley, 2021) Türe, Murat; Çıklabakkal, Mustafa Ege; Erdem, Erkut; Satılmış, Pınar; Akyüz, Ahmet Oğuz; Erdem, Aykut (Faculty Member); Department of Computer Engineering, College of Engineering
    Image editing is a commonly studied problem in computer graphics. Despite the presence of many advanced editing tools, there is no satisfactory solution for controllably updating the position of the sun in a single image. The problem is complicated by the presence of clouds, complex landscapes, and atmospheric effects that must be accounted for. In this paper, we tackle this problem starting with only a single photograph. After the user clicks on the initial position of the sun, our algorithm performs several estimation and segmentation steps to find the horizon, scene depth, clouds, and the skyline. After this initial process, the user can make both fine- and large-scale changes to the position of the sun: it can be set beneath the mountains or moved behind the clouds, practically turning a midday photograph into a sunset (or vice versa). We leverage a precomputed atmospheric scattering algorithm to make all of these changes not only realistic but also real-time. We demonstrate our results using both clear and cloudy skies, showing how to add, remove, and relight clouds, all the while allowing for advanced effects such as scattering, shadows, light shafts, and lens flares.
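To give a flavour of how a precomputed scattering solution can support interactive sun updates (the table layout and parameterization here are our assumptions, not the paper's implementation), consider a simple lookup keyed by view and sun elevation:

```java
// Hypothetical sketch: sky radiance precomputed into a 2-D RGB table indexed
// by view elevation and sun elevation, so moving the sun only re-samples the
// table instead of re-integrating scattering. Bilinear interpolation keeps
// the lookup smooth; real systems use higher-dimensional parameterizations.
final class SkyTable {
    private final float[][][] rgb;  // [viewBin][sunBin][3], built offline
    private final int viewBins, sunBins;

    SkyTable(float[][][] precomputed) {
        this.rgb = precomputed;
        this.viewBins = precomputed.length;
        this.sunBins = precomputed[0].length;
    }

    // Elevations normalized to [0, 1], e.g. from [-90 deg, +90 deg].
    float[] sample(double viewElev, double sunElev) {
        double v = viewElev * (viewBins - 1);
        double s = sunElev * (sunBins - 1);
        int v0 = (int) v, s0 = (int) s;
        int v1 = Math.min(v0 + 1, viewBins - 1);
        int s1 = Math.min(s0 + 1, sunBins - 1);
        double fv = v - v0, fs = s - s0;
        float[] out = new float[3];
        for (int c = 0; c < 3; c++) {
            double a = rgb[v0][s0][c] * (1 - fs) + rgb[v0][s1][c] * fs;
            double b = rgb[v1][s0][c] * (1 - fs) + rgb[v1][s1][c] * fs;
            out[c] = (float) (a * (1 - fv) + b * fv);  // bilinear blend
        }
        return out;
    }
}
```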
  • Publication (Open Access)
    Generalized Polytopic Matrix Factorization
    (Institute of Electrical and Electronics Engineers (IEEE), 2021) Erdoğan, Alper Tunga (Faculty Member); Tatlı, Gökcan; Department of Electrical and Electronics Engineering, College of Engineering; Graduate School of Sciences and Engineering
    Polytopic Matrix Factorization (PMF) is introduced as a flexible data decomposition tool with potential applications in unsupervised learning. PMF assumes a generative model in which observations are lossless linear mixtures of samples drawn from a particular polytope. Assuming that these samples are sufficiently scattered inside the polytope, a determinant-maximization-based criterion is used to obtain the latent polytopic factors from the corresponding observations. This article aims to characterize all polytopic sets that are suitable for the PMF framework. In particular, we show that any polytope whose vertex set has only permutation and/or sign invariances qualifies for the PMF framework. Such a rich set of possibilities enables elastic modeling of independent/dependent latent factors with combinations of features such as relatively sparse/antisparse subvectors and mixtures of signed/nonnegative components with optionally prescribed domains.
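In symbols, and with the caveat that the exact formulation below is our reconstruction from the abstract rather than a quotation from the article:

```latex
% Generative model and a determinant-maximization criterion for PMF,
% reconstructed from the abstract (notation is our assumption):
% observations X are lossless linear mixtures of latent samples S whose
% columns lie in a polytope P.
\begin{align*}
  X &= A S, \qquad S = [s_1, \dots, s_N], \quad s_j \in \mathcal{P}, \\
  (\hat{A}, \hat{S}) &= \arg\max_{A,\, S}\;
      \log\det\!\Big(\tfrac{1}{N} S S^{\mathsf{T}}\Big)
      \quad \text{s.t.} \quad X = A S,\; s_j \in \mathcal{P}\ \forall j.
\end{align*}
% The article's identifiability condition: recovery is possible when the
% polytope's vertex set has only permutation and/or sign invariances and
% the samples are sufficiently scattered inside P.
```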
  • Publication (Open Access)
    Highly efficient and re-executable private function evaluation with linear complexity
    (Institute of Electrical and Electronics Engineers (IEEE), 2022) Bingöl, Muhammed Ali; Kiraz, Mehmet Sabır; Levi, Albert; Biçer, Osman; Department of Computer Engineering; Graduate School of Sciences and Engineering
    Private function evaluation aims to securely compute a function f(x_1, ..., x_n) without leaking any information other than what is revealed by the output, where f is a private input of one of the parties (say Party_1) and x_i is a private input of the i-th party Party_i. In this article, we propose a novel and secure two-party private function evaluation (2PFE) scheme based on the DDH assumption. Our scheme introduces a reusability feature that significantly improves the state of the art. Accordingly, our scheme has two variants: one is utilized in the initial execution of the function f, and the other in its subsequent evaluations. To the best of our knowledge, this is the first and most efficient 2PFE scheme that enjoys a reusability feature. Our protocols achieve linear communication and computation complexities and a constant number of rounds (at most three).
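While the article's protocol is not reproduced here, the DDH assumption it relies on is the same one underlying ElGamal encryption, whose ciphertext rerandomization gives a feel for how encrypted material can be securely refreshed and reused across executions. The sketch below is purely illustrative, with demo-sized parameters, and is not the authors' 2PFE construction:

```java
import java.math.BigInteger;
import java.security.SecureRandom;

// Illustrative only: textbook ElGamal over a Schnorr group, showing ciphertext
// rerandomization under DDH. NOT the article's 2PFE protocol; it merely
// demonstrates the kind of DDH-based reuse such schemes build on.
final class ElGamalDemo {
    static final SecureRandom RND = new SecureRandom();

    public static void main(String[] args) {
        // Demo-sized safe prime p = 2q + 1 (use >= 2048 bits in practice).
        BigInteger q, p;
        do {
            q = BigInteger.probablePrime(256, RND);
            p = q.shiftLeft(1).add(BigInteger.ONE);
        } while (!p.isProbablePrime(64));
        // g generates the order-q subgroup: g = h^2 mod p for random h.
        BigInteger g;
        do {
            g = new BigInteger(p.bitLength() - 1, RND).modPow(BigInteger.TWO, p);
        } while (g.signum() == 0 || g.equals(BigInteger.ONE));

        BigInteger x = new BigInteger(q.bitLength() - 1, RND);  // secret key
        BigInteger h = g.modPow(x, p);                          // public key

        BigInteger m = g.modPow(BigInteger.valueOf(42), p);     // group element
        BigInteger r = new BigInteger(q.bitLength() - 1, RND);
        BigInteger c1 = g.modPow(r, p);
        BigInteger c2 = m.multiply(h.modPow(r, p)).mod(p);

        // Rerandomize: (c1*g^s, c2*h^s) encrypts the same m but is
        // unlinkable to the original ciphertext under DDH.
        BigInteger s = new BigInteger(q.bitLength() - 1, RND);
        BigInteger d1 = c1.multiply(g.modPow(s, p)).mod(p);
        BigInteger d2 = c2.multiply(h.modPow(s, p)).mod(p);

        // Decrypt: m = d2 / d1^x.
        BigInteger rec = d2.multiply(d1.modPow(x, p).modInverse(p)).mod(p);
        System.out.println("recovered == m: " + rec.equals(m));
    }
}
```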
  • Publication (Open Access)
    Location pairs: a test coverage metric for shared-memory concurrent programs
    (Springer, 2012) Muslu, Kıvanç; Keremoğlu, M. Erkan; Taşıran, Serdar (Faculty Member); Department of Computer Engineering, College of Engineering
    We present a coverage metric targeted at shared-memory concurrent programs: the Location Pairs (LP) coverage metric. The goals of this metric are (i) to measure how thoroughly a program has been tested from a concurrency standpoint, i.e., whether enough qualitatively different thread interleavings have been explored, and (ii) to guide testing towards unexplored concurrency scenarios. This metric was inspired by an access pattern known to lead to high-level concurrency errors in industrial software and in the literature. We built a monitoring tool to measure LP coverage of test programs. We used the LP metric for interactive debugging, and compared LP coverage with other concurrency coverage metrics on Java benchmarks. We demonstrated that LP coverage corresponds better to concurrency errors, is a better measure of how well a program is exercised concurrency-wise by a test set, reaches saturation later than other coverage metrics, and is viable and useful as an interactive testing and debugging tool.
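A minimal monitoring idea in the spirit of the metric follows, with the caveat that the instrumentation hook and the exact definition of a covered pair are our simplification, not the paper's tool:

```java
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Simplified sketch of an LP-style monitor: for each shared variable, remember
// the last (thread, code location) that touched it; when a *different* thread
// touches it next, record the ordered pair of code locations as covered.
// Real tools instrument accesses (e.g., via bytecode rewriting); onAccess()
// here is a hypothetical hook called by such instrumentation.
final class LpCoverageMonitor {
    private record Access(long threadId, String location) {}

    private final Map<String, Access> lastAccess = new ConcurrentHashMap<>();
    private final Set<String> coveredPairs = ConcurrentHashMap.newKeySet();

    // variable: identifier of the shared field or array slot;
    // location: source position of the access, e.g. "Foo.java:42".
    void onAccess(String variable, String location) {
        long tid = Thread.currentThread().getId();
        Access prev = lastAccess.put(variable, new Access(tid, location));
        if (prev != null && prev.threadId() != tid) {
            coveredPairs.add(prev.location() + " -> " + location);
        }
    }

    int coverage() { return coveredPairs.size(); }
}
```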
  • Publication
    Scalability and robustness of pull-based anti-entropy distribution model
    (Springer-Verlag Berlin, 2003) Özkasap, Öznur (Faculty Member); Department of Computer Engineering, College of Engineering
    There are several alternative mechanisms for disseminating information among a group of participants in a distributed environment. An efficient model is to use epidemic algorithms, which involve pair-wise propagation of information. These algorithms are based on the theory of epidemics, which studies the spreading of infectious diseases through a population. Epidemic protocols are simple, scale well, are robust against common failures, and provide eventual consistency. They have been utilized in a large set of applications, including resolving inconsistencies in distributed database updates, failure detection, reliable multicasting, network news distribution, scalable system management, and resource discovery. A popular distribution model based on the theory of epidemics is anti-entropy. In this study, we focus on the pull-based anti-entropy model used for multicast reliability as a case study, demonstrate its scalability and robustness, and give comparative simulation results discussing the performance of the approach on a range of typical scenarios.
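For concreteness, a toy round of pull-based anti-entropy might look like the following; the node structure and the per-key version digest are our assumptions for illustration:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

// Toy pull-based anti-entropy: in each round every node contacts one random
// peer, compares digests (key -> version), and pulls entries for which the
// peer holds a newer version. Updates spread epidemically and replicas
// converge toward eventual consistency.
final class AntiEntropyNode {
    final Map<String, Integer> versions = new HashMap<>(); // key -> version
    final Map<String, String> values = new HashMap<>();    // key -> payload
    private static final Random RND = new Random();

    void pullFrom(AntiEntropyNode peer) {
        for (Map.Entry<String, Integer> e : peer.versions.entrySet()) {
            String key = e.getKey();
            if (versions.getOrDefault(key, -1) < e.getValue()) {
                versions.put(key, e.getValue());        // pull newer entry
                values.put(key, peer.values.get(key));
            }
        }
    }

    static void round(List<AntiEntropyNode> group) {
        for (AntiEntropyNode n : group) {
            AntiEntropyNode peer = group.get(RND.nextInt(group.size()));
            if (peer != n) n.pullFrom(peer);
        }
    }
}
```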
  • Publication
    Special section on the 2011 joint symposium on computational aesthetics (CAe), non-photorealistic animation and rendering (NPAR), and sketch-based interfaces and modeling (SBIM)
    (Pergamon-Elsevier Science Ltd, 2012) Isenberg, Tobias; Asente, Paul; Collomosse, John; Sezgin, Tevfik Metin (Faculty Member); Department of Computer Engineering, College of Engineering
    N/A
  • Publication
    Transport protocol mechanisms for wireless networking: a review and comparative simulation study
    (Springer-Verlag Berlin, 2003) Kanak, Alper (Master's Student); Özkasap, Öznur (Faculty Member); Department of Computer Engineering; Graduate School of Sciences and Engineering; College of Engineering
    The increasing popularity of wireless services has triggered the need for efficient wireless transport mechanisms. TCP, the reliable transport-level protocol widely used in the wired network world, was not designed with heterogeneity in mind. The problem with adapting TCP to evolving wireless settings stems from its assumption that packet loss and unusual delays are mainly caused by congestion: TCP originally assumes that packet loss is rare. Wireless links, on the other hand, often suffer from high bit error rates and broken connectivity due to handoffs. A range of schemes, namely end-to-end, split-connection, and link-layer protocols, has been proposed to improve the performance of transport mechanisms, in particular TCP, in wireless settings. In this study, we examine these mechanisms for wireless transport and discuss our comparative simulation results for end-to-end TCP versions (Tahoe, Reno, NewReno, and SACK) in various network settings, including wireless LANs and wired-cum-wireless scenarios.
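The behavioural difference that drives such comparisons can be stated in a few lines. The step function below encodes only the textbook loss reactions of Tahoe and Reno (NewReno's partial-ACK handling and SACK's selective retransmission are omitted); it is a sketch of the standard behaviour, not the simulators' actual code:

```java
// Textbook congestion-window reactions on loss detection: both variants
// halve ssthresh, but Tahoe restarts slow start from 1 MSS even on triple
// duplicate ACKs, while Reno fast-recovers to the halved window. Timeouts
// send both back to slow start.
final class CongestionControl {
    enum Variant { TAHOE, RENO }
    enum Loss { TRIPLE_DUP_ACK, TIMEOUT }

    double cwnd = 1.0;        // congestion window, in segments
    double ssthresh = 64.0;   // slow-start threshold, in segments
    final Variant variant;

    CongestionControl(Variant v) { this.variant = v; }

    void onAck() {
        // Slow start below ssthresh, congestion avoidance above it.
        cwnd += (cwnd < ssthresh) ? 1.0 : 1.0 / cwnd;
    }

    void onLoss(Loss loss) {
        ssthresh = Math.max(cwnd / 2.0, 2.0);
        if (loss == Loss.TIMEOUT || variant == Variant.TAHOE) {
            cwnd = 1.0;               // back to slow start
        } else {
            cwnd = ssthresh;          // Reno fast recovery on 3 dup ACKs
        }
    }
}
```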