Publication:
Benefits of Online Tilted Empirical Risk Minimization: A Case Study of Outlier Detection and Robust Regression

dc.conference.date2025-08-31 through 2025-09-03
dc.conference.locationIstanbul
dc.contributor.coauthorYildirim, Yiğit E. (60197925500)
dc.contributor.coauthorDemir, Samet (58662508900)
dc.contributor.coauthorDoğan, Zafer (35101767100)
dc.date.accessioned2025-12-31T08:19:17Z
dc.date.available2025-12-31
dc.date.issued2025
dc.description.abstractEmpirical Risk Minimization (ERM) is a foundational framework for supervised learning but primarily optimizes average-case performance, often neglecting fairness and robustness considerations. Tilted Empirical Risk Minimization (TERM) extends ERM by introducing an exponential tilt hyperparameter t to balance average-case accuracy with worst-case fairness and robustness. However, in online or streaming settings where data arrive one sample at a time, the classical TERM objective degenerates to standard ERM, losing tilt sensitivity. We address this limitation by proposing an online TERM formulation that removes the logarithm from the classical objective, preserving tilt effects without additional computational or memory overhead. This formulation enables a continuous trade-off controlled by t, smoothly interpolating between ERM (t → 0), fairness emphasis (t > 0), and robustness to outliers (t < 0). We empirically validate online TERM on two representative streaming tasks: robust linear regression with adversarial outliers and minority-class detection in binary classification. Our results demonstrate that negative tilting effectively suppresses outlier influence, while positive tilting improves recall with minimal impact on precision, all at a per-sample computational cost equivalent to ERM. Online TERM thus recovers the full robustness-fairness spectrum of classical TERM in an efficient single-sample learning regime. © 2025 IEEE.
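
A minimal sketch of the mechanism the abstract describes, under one plausible reading (this is not the authors' code, and the paper's exact objective may differ): the classical TERM objective (1/t) log((1/N) Σ_i exp(t ℓ_i(θ))) collapses to the plain loss ℓ(θ) when evaluated on a single sample, since log(exp(t ℓ))/t = ℓ, which is why streaming TERM degenerates to ERM. Dropping the logarithm leaves a per-sample surrogate (1/t) exp(t ℓ(θ)) whose gradient is the ordinary ERM gradient scaled by exp(t ℓ(θ)). The Python sketch below applies that weighting in a single-sample SGD loop for linear regression with outliers; the function name, learning rate, clipping threshold, and synthetic data stream are all illustrative assumptions.

import numpy as np

def online_term_sgd(stream, dim, t=-1.0, lr=1e-2):
    # Illustrative per-sample "online TERM" update (assumed form, inferred from
    # the abstract; not the authors' implementation).
    theta = np.zeros(dim)
    for x, y in stream:                   # one (feature vector, target) pair at a time
        residual = float(x @ theta - y)
        loss = 0.5 * residual ** 2        # squared loss for linear regression
        # Tilt weight exp(t * loss): t < 0 down-weights high-loss (outlier) samples,
        # t > 0 up-weights hard/minority samples, and t -> 0 recovers plain SGD/ERM.
        weight = np.exp(np.clip(t * loss, -50.0, 50.0))
        grad = residual * x               # gradient of the squared loss w.r.t. theta
        theta -= lr * weight * grad       # ERM step rescaled by the tilt weight
    return theta

# Example usage on a synthetic stream with a small fraction of corrupted targets
# (assumed setup, for illustration only).
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])

def make_stream(n=2000, outlier_frac=0.05):
    for _ in range(n):
        x = rng.normal(size=3)
        y = x @ true_w + 0.1 * rng.normal()
        if rng.random() < outlier_frac:
            y += rng.normal(scale=20.0)   # adversarial outlier in the target
        yield x, y

w_robust = online_term_sgd(make_stream(), dim=3, t=-0.5)   # negative tilt: outlier-resistant
w_erm    = online_term_sgd(make_stream(), dim=3, t=1e-8)   # near-zero tilt: essentially ERM

The exponent t * loss is clipped before exponentiation only for numerical stability in this sketch; the key point is that each step costs the same as an ERM update plus one scalar exponential, consistent with the abstract's claim of per-sample cost equivalent to ERM.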
dc.description.fulltextYes
dc.description.harvestedfromManual
dc.description.indexedbyScopus
dc.description.publisherscopeInternational
dc.description.readpublishN/A
dc.description.sponsoredbyTubitakEuTÜBİTAK
dc.description.sponsorshipTürkiye Bilimsel ve Teknolojik Araştırma Kurumu, TÜBİTAK (124E063)
dc.identifier.doi10.1109/MLSP62443.2025.11204247
dc.identifier.embargoNo
dc.identifier.isbn9798331570293
dc.identifier.isbn9781467374545
dc.identifier.isbn9781728166629
dc.identifier.isbn9781538654774
dc.identifier.isbn9781509063413
dc.identifier.isbn9781728163383
dc.identifier.isbn9781728108247
dc.identifier.isbn9781509007462
dc.identifier.isbn9781467310260
dc.identifier.isbn9781479936946
dc.identifier.issn2161-0363
dc.identifier.quartileN/A
dc.identifier.scopus2-s2.0-105022128098
dc.identifier.urihttps://doi.org/10.1109/MLSP62443.2025.11204247
dc.identifier.urihttps://hdl.handle.net/20.500.14288/31446
dc.keywordsfairness
dc.keywordsonline learning
dc.keywordsoutlier detection
dc.keywordsrobustness
dc.keywordsTilted empirical risk minimization
dc.language.isoeng
dc.publisherIEEE Computer Society
dc.relation.affiliationKoç University
dc.relation.collectionKoç University Institutional Repository
dc.relation.ispartofIEEE International Workshop on Machine Learning for Signal Processing, MLSP
dc.relation.openaccessYes
dc.rightsCC BY-NC-ND (Attribution-NonCommercial-NoDerivs)
dc.rights.urihttps://creativecommons.org/licenses/by-nc-nd/4.0/
dc.titleBenefits of Online Tilted Empirical Risk Minimization: A Case Study of Outlier Detection and Robust Regression
dc.typeConference Proceeding
dspace.entity.typePublication
