Publication:
MetaQA: combining expert agents for multi-skill question answering

Co-Authors

Puerto, Haritz
Gurevych, Iryna

Publication Date

2023

Language

en

Type

Conference proceeding

Abstract

The recent explosion of question-answering (QA) datasets and models has increased the interest in the generalization of models across multiple domains and formats by either training on multiple datasets or combining multiple models. Despite the promising results of multi-dataset models, some domains or QA formats may require specific architectures, and thus the adaptability of these models might be limited. In addition, current approaches for combining models disregard cues such as question-answer compatibility. In this work, we propose to combine expert agents with a novel, flexible, and training-efficient architecture that considers questions, answer predictions, and answer-prediction confidence scores to select the best answer among a list of answer predictions. Through quantitative and qualitative experiments, we show that our model i) creates a collaboration between agents that outperforms previous multi-agent and multi-dataset approaches, ii) is highly data-efficient to train, and iii) can be adapted to any QA format. We release our code and a dataset of answer predictions from expert agents for 16 QA datasets to foster future research of multi-agent systems. © 2023 Association for Computational Linguistics.
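
A minimal illustrative sketch (not the authors' released implementation) of the selection step the abstract describes: each expert agent contributes an answer prediction with a confidence score, and a meta-selector picks one answer by combining the agent's confidence with a question-answer compatibility signal. All identifiers and the toy compatibility function below are hypothetical placeholders.

# Illustrative sketch only; every name here is hypothetical, and the toy
# compatibility function merely stands in for MetaQA's learned scoring.
from dataclasses import dataclass
from typing import Callable, Sequence


@dataclass
class AgentPrediction:
    agent_name: str    # which expert agent produced the answer
    answer: str        # the agent's answer prediction
    confidence: float  # the agent's own confidence score in [0, 1]


def select_answer(
    question: str,
    predictions: Sequence[AgentPrediction],
    compatibility: Callable[[str, str], float],
) -> AgentPrediction:
    """Return the prediction with the highest combined score, where
    compatibility(question, answer) stands in for the learned
    question-answer compatibility cue mentioned in the abstract."""
    return max(
        predictions,
        key=lambda p: p.confidence * compatibility(question, p.answer),
    )


if __name__ == "__main__":
    # Toy compatibility: rough lexical overlap between question and answer.
    def overlap(question: str, answer: str) -> float:
        q, a = set(question.lower().split()), set(answer.lower().split())
        return (len(q & a) + 1) / (len(a) + 1)

    predictions = [
        AgentPrediction("extractive-agent", "Paris", 0.72),
        AgentPrediction("multiple-choice-agent", "London", 0.55),
    ]
    best = select_answer("What is the capital of France?", predictions, overlap)
    print(best.agent_name, best.answer)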

Description

Source:

EACL 2023 - 17th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference

Publisher:

Association for Computational Linguistics (ACL)

Subject

Computational linguistics, Natural language processing systems, Language modeling
