Publication:
Neural Architecture Search for biomedical image classification: a comparative study across data modalities

dc.contributor.coauthor: Kus, Zeki
dc.contributor.coauthor: Aydin, Musa
dc.contributor.coauthor: Kiraz, Berna
dc.contributor.department: Department of Physics
dc.contributor.department: Department of Electrical and Electronics Engineering
dc.contributor.kuauthor: Kiraz, Alper
dc.contributor.schoolcollegeinstitute: College of Sciences
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.date.accessioned: 2025-05-22T10:35:31Z
dc.date.available: 2025-05-22
dc.date.issued: 2025
dc.description.abstract: Deep neural networks have significantly advanced medical image classification across various modalities and tasks. However, manually designing these networks is often time-consuming and suboptimal. Neural Architecture Search (NAS) automates this process, potentially finding more efficient and effective models. This study provides a comprehensive comparative analysis of our two NAS methods, PBC-NAS and BioNAS, across multiple biomedical image classification tasks using the MedMNIST dataset. Our experiments evaluate these methods based on classification performance (Accuracy (ACC) and Area Under the Curve (AUC)) and computational complexity (floating-point operation counts, FLOPs). Results demonstrate that BioNAS models slightly outperform PBC-NAS models in accuracy, with BioNAS-2 achieving the highest average accuracy of 0.848. However, PBC-NAS models exhibit superior computational efficiency, with PBC-NAS-2 achieving the lowest average FLOPs of 0.82 GFLOPs. Both methods outperform state-of-the-art architectures such as ResNet-18 and ResNet-50, as well as AutoML frameworks such as auto-sklearn, AutoKeras, and Google AutoML. Additionally, PBC-NAS and BioNAS outperform other NAS studies in average ACC (except MSTF-NAS) and achieve highly competitive average AUC results. We conduct extensive ablation studies to investigate the impact of architectural parameters, the effectiveness of fine-tuning, search space efficiency, and the discriminative performance of generated architectures. These studies reveal that larger filter sizes and specific numbers of stacks or modules enhance performance. Fine-tuning existing architectures can achieve nearly optimal results without running a separate NAS for each dataset. Furthermore, we analyze search space efficiency, uncovering patterns in frequently selected operations and architectural choices. This study highlights the strengths and efficiencies of PBC-NAS and BioNAS, providing valuable insights and guidance for future research and practical applications in biomedical image classification.
dc.description.fulltext: No
dc.description.harvestedfrom: Manual
dc.description.indexedby: Scopus
dc.description.indexedby: PubMed
dc.description.indexedby: WOS
dc.description.publisherscope: International
dc.description.readpublish: N/A
dc.description.sponsoredbyTubitakEu: N/A
dc.description.sponsorship: A. Kiraz acknowledges partial support from the Turkish Academy of Sciences (TUBA).
dc.identifier.doi: 10.1016/j.artmed.2024.103064
dc.identifier.eissn: 1873-2860
dc.identifier.embargo: No
dc.identifier.issn: 0933-3657
dc.identifier.quartile: Q1
dc.identifier.scopus: 2-s2.0-85214299056
dc.identifier.uri: https://hdl.handle.net/20.500.14288/29482
dc.identifier.uri: https://doi.org/10.1016/j.artmed.2024.103064
dc.identifier.volume: 160
dc.identifier.wos: 001399282300001
dc.keywords: Biomedical image classification
dc.keywords: MedMNIST
dc.keywords: Neural Architecture Search
dc.keywords: Opposition-based differential evolution
dc.language.iso: eng
dc.publisher: Elsevier
dc.relation.affiliation: Koç University
dc.relation.collection: Koç University Institutional Repository
dc.relation.ispartof: Artificial Intelligence in Medicine
dc.subject: Computer science
dc.subject: Artificial intelligence
dc.subject: Engineering
dc.subject: Biomedical engineering
dc.subject: Medical informatics
dc.title: Neural Architecture Search for biomedical image classification: a comparative study across data modalities
dc.type: Journal Article
dspace.entity.type: Publication
person.familyName: Kiraz
person.givenName: Alper
relation.isOrgUnitOfPublication: c43d21f0-ae67-4f18-a338-bcaedd4b72a4
relation.isOrgUnitOfPublication: 21598063-a7c5-420d-91ba-0cc9b2db0ea0
relation.isOrgUnitOfPublication.latestForDiscovery: c43d21f0-ae67-4f18-a338-bcaedd4b72a4
relation.isParentOrgUnitOfPublication: af0395b0-7219-4165-a909-7016fa30932d
relation.isParentOrgUnitOfPublication: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164
relation.isParentOrgUnitOfPublication.latestForDiscovery: af0395b0-7219-4165-a909-7016fa30932d
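
The abstract above reports classification accuracy (ACC) and multi-class area under the curve (AUC) as the two performance metrics. As a point of reference only, the following minimal Python sketch shows how these two metrics can be computed with scikit-learn on random stand-in predictions for a 9-class MedMNIST-style task (e.g. PathMNIST); it is an illustrative assumption, not code from the paper or from PBC-NAS/BioNAS.

import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
n_samples, n_classes = 1000, 9  # 9 classes mirrors e.g. PathMNIST

# Random ground-truth labels and stand-in model outputs (placeholders, not real predictions).
y_true = rng.integers(0, n_classes, size=n_samples)
logits = rng.normal(size=(n_samples, n_classes))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax over classes

acc = accuracy_score(y_true, probs.argmax(axis=1))  # top-1 accuracy
auc = roc_auc_score(y_true, probs, multi_class="ovr",  # one-vs-rest multi-class AUC
                    labels=np.arange(n_classes))
print(f"ACC={acc:.3f}  AUC={auc:.3f}")

The one-vs-rest setting is one common convention for multi-class AUC; the paper's exact averaging protocol across the MedMNIST sub-datasets may differ.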
