Research Outputs

Permanent URI for this community: https://hdl.handle.net/20.500.14288/2


Search Results

Now showing 1 - 3 of 3
  • Publication
    Bias, Type I error rates, and statistical power of a latent mediation model in the presence of violations of invariance
    (Sage, 2018) Olivera-Aguilar, Margarita; Rikoon, Samuel H.; Gonzalez, Oskar; MacKinnon, David P.; Sakarya, Yasemin Kisbu (Faculty Member, Department of Psychology, College of Social Sciences and Humanities)
    When testing a statistical mediation model, it is assumed that factorial measurement invariance holds for the mediating construct across levels of the independent variable X. The consequences of failing to address violations of measurement invariance in mediation models are largely unknown. The purpose of the present study was to systematically examine the impact of mediator noninvariance on the Type I error rates, statistical power, and relative bias in parameter estimates of the mediated effect in the single mediator model. The results of a large simulation study indicated that, in general, the mediated effect was robust to violations of invariance in loadings. In contrast, most conditions with violations of intercept invariance exhibited severely positively biased mediated effects, Type I error rates above acceptable levels, and statistical power larger than in the invariant conditions. The implications of these results are discussed and recommendations are offered. (A simulation sketch of the single mediator model appears after this list.)
  • Publication
    Code notes: designing a low-cost tangible coding tool for/with children
    (Association for Computing Machinery (ACM), 2018) Sabuncuoğlu, Alpay (PhD Student, Graduate School of Sciences and Engineering); Erkaya, Merve (Undergraduate Student, College of Social Sciences and Humanities); Buruk, Oğuz Turan (PhD Student, Graduate School of Social Sciences and Humanities); Göksun, Tilbe (Faculty Member, Department of Psychology, College of Social Sciences and Humanities); KU Arçelik Research Center for Creative Industries (KUAR)
    Programming has become an essential subject in today's education curriculum, and as a result, creating the right environments in which to teach it is increasingly important. In such environments, tangible tools enhance creativity and collaboration. However, because of their high prices, current tangible tools are out of reach for most students. We developed Code Notes as a low-cost, attainable tangible tool intended to motivate children and support programming education. Code Notes consists of an Android app and cardboard code cards ("code-cardboards") that teach basic programming concepts. We continue to develop the platform with insights gained from children. This paper shares the design phases of Code Notes and observations from our two-month programming project. We also present some future concepts for Code Notes that offer active and embodied interaction with the teaching material.
  • Publication
    The comparative regression discontinuity (CRD) design: an overview and demonstration of its performance relative to basic RD and the randomized experiment
    (Jai-Elsevier Science Inc, 2017) Tang, Yang; Cook, Thomas D.; Hock, Heinrich; Chiang, Hanley; Sakarya, Yasemin Kisbu (Faculty Member, Department of Psychology, College of Social Sciences and Humanities)
    Relative to the randomized controlled trial (RCT), the basic regression discontinuity (RD) design suffers from lower statistical power and a lesser ability to generalize causal estimates away from the treatment eligibility cutoff. This chapter seeks to mitigate these limitations by adding an untreated outcome comparison function that is measured along all or most of the assignment variable. When added to the usual treated and untreated outcomes observed in the basic RD, a comparative RD (CRD) design results. One version of CRD adds a pretest measure of the study outcome (CRD-Pre); another adds posttest outcomes from a nonequivalent comparison group (CRD-CG). We describe how these designs can be used to identify unbiased causal effects away from the cutoff under the assumption that a common, stable functional form describes how untreated outcomes vary with the assignment variable, both in the basic RD and in the added outcomes data (pretests or a comparison group's posttest). We then create the two CRD designs using data from the National Head Start Impact Study, a large-scale RCT. For both designs, we find that all untreated outcome functions are parallel, which lends support to CRD's identifying assumptions. Our results also indicate that CRD-Pre and CRD-CG both yield impact estimates at the cutoff that have a similarly small bias as, but are more precise than, the basic RD's impact estimates. In addition, both CRD designs produce estimates of impacts away from the cutoff that have relatively little bias compared to estimates of the same parameter from the RCT design. This common finding appears to be driven by two different mechanisms. In this instance of CRD-CG, potential untreated outcomes were likely independent of the assignment variable from the start. This was not the case with CRD-Pre. However, fitting a model using the observed pretests and untreated posttests to account for the initial dependence generated an accurate prediction of the missing counterfactual. The result was an unbiased causal estimate away from the cutoff, conditional on this successful prediction of the untreated outcomes of the treated. (A schematic simulation of the CRD-CG logic appears after this list.)
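
Single mediator model sketch. For the first entry above (Olivera-Aguilar et al., 2018), the following minimal Python simulation illustrates the observed-variable single mediator model behind the abstract's terminology. The paper itself studies a latent mediator; the variable names and parameter values below are illustrative assumptions, not the authors' code.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Single mediator model (observed variables):
    #   M = i_M + a*X + e_M
    #   Y = i_Y + c'*X + b*M + e_Y
    # The mediated (indirect) effect is the product a*b.
    a, b, c_prime = 0.4, 0.5, 0.2
    X = rng.binomial(1, 0.5, n).astype(float)          # binary independent variable
    M = a * X + rng.normal(0.0, 1.0, n)                # mediator
    Y = c_prime * X + b * M + rng.normal(0.0, 1.0, n)  # outcome

    def ols(y, *cols):
        # Least-squares coefficients, intercept first.
        Z = np.column_stack([np.ones_like(y), *cols])
        return np.linalg.lstsq(Z, y, rcond=None)[0]

    a_hat = ols(M, X)[1]       # slope of M on X
    b_hat = ols(Y, X, M)[2]    # slope of Y on M, controlling for X
    print(f"estimated mediated effect: {a_hat * b_hat:.3f} (true {a * b:.3f})")

In the latent version studied in the paper, the mediator is measured by multiple indicators, and intercept noninvariance means those indicators' measurement intercepts differ across levels of X; the sketch above omits the measurement model, which is exactly the layer where the paper's violations occur.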
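
Comparative RD (CRD-CG) sketch. For the third entry (Tang et al., 2017), the following schematic Python simulation shows the core CRD-CG move described in the abstract: fit the untreated outcome function on a comparison group observed along the whole assignment variable, then use it as the missing counterfactual for treated units away from the cutoff. The functional form, numbers, and estimator below are our own illustrative assumptions, not the Head Start data or the authors' analysis.

    import numpy as np

    rng = np.random.default_rng(1)
    n, cutoff, tau = 5_000, 0.0, 0.5   # sample size, eligibility cutoff, true effect

    def untreated_outcome(A):
        # Common, stable untreated functional form shared by both groups
        # (the abstract's key identifying assumption).
        return 1.0 + 0.8 * A + rng.normal(0.0, 0.5, A.shape)

    # Basic RD sample: units are treated iff A >= cutoff.
    A_rd = rng.uniform(-1.0, 1.0, n)
    treated = A_rd >= cutoff
    Y_rd = untreated_outcome(A_rd) + tau * treated

    # Comparison group (CG): untreated outcomes observed along all of A.
    A_cg = rng.uniform(-1.0, 1.0, n)
    Y_cg = untreated_outcome(A_cg)

    # Fit the untreated function on the comparison group, then use it as the
    # missing counterfactual for treated units away from the cutoff.
    beta = np.polyfit(A_cg, Y_cg, deg=1)
    counterfactual = np.polyval(beta, A_rd[treated])
    effect_away = float(np.mean(Y_rd[treated] - counterfactual))
    print(f"estimated effect away from cutoff: {effect_away:.3f} (true {tau})")

The unbiasedness of effect_away in this sketch hinges on the identifying assumption named in the abstract: a common, stable untreated functional form shared by the basic RD sample and the comparison group. If the two untreated functions diverged, the extrapolated counterfactual, and hence the away-from-cutoff estimate, would be biased.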