Transformer-based Swedish Semantic Role Labeling through Transfer Learning
Paper in proceedings, 2024

Semantic Role Labeling (SRL) is a natural language understanding task whose goal is to extract the semantic roles expressed in a sentence. English SRL has achieved state-of-the-art performance using Transformer architectures and supervised learning. However, this approach is not viable for smaller languages such as Swedish because of the limited amount of training data. In this paper, we present the first effort to build a Transformer-based SRL system for Swedish by exploring multilingual and cross-lingual transfer learning methods and leveraging the Swedish FrameNet resource. We demonstrate that multilingual transfer learning outperforms two different cross-lingual transfer models. We also identify differences between frames in FrameNet that can either hinder or enhance the model's performance. The resulting end-to-end model is freely available and will be made accessible through Språkbanken Text's research infrastructure.
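To illustrate the task setup described in the abstract, below is a minimal sketch of SRL cast as token classification over a multilingual Transformer encoder. The Hugging Face Transformers library, the xlm-roberta-base checkpoint, the BIO-style role labels, and the example sentence are all illustrative assumptions; this is not the authors' released model or training code.

    # Illustrative sketch only: SRL as token classification with a
    # multilingual encoder. Model choice, label set, and sentence are
    # hypothetical, not the paper's actual setup.
    import torch
    from transformers import AutoTokenizer, AutoModelForTokenClassification

    # Hypothetical BIO-style role labels for a FrameNet-like frame.
    labels = ["O", "B-Buyer", "I-Buyer", "B-Goods", "I-Goods"]

    tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
    model = AutoModelForTokenClassification.from_pretrained(
        "xlm-roberta-base", num_labels=len(labels)
    )  # the classification head is randomly initialized; fine-tuning on
       # role-annotated data would be required before predictions are useful

    sentence = "Hon köpte en bok"  # Swedish: "She bought a book"
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        predictions = model(**inputs).logits.argmax(dim=-1)[0]

    # Print one subword token per line with its predicted role label.
    for token_id, label_id in zip(inputs["input_ids"][0], predictions):
        print(tokenizer.convert_ids_to_tokens(token_id.item()), labels[label_id])

In a cross-lingual transfer setting, the same encoder would first be fine-tuned on role-annotated data in a high-resource language (e.g., English FrameNet data) and then applied or further adapted to Swedish; in the multilingual setting, data from several languages is pooled during fine-tuning.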

Keywords

FrameNet

Semantic Role Labeling

Transfer Learning

Authors

Lucy Yang Buhr

Dana Dannélls

Computing Science (Chalmers)

University of Gothenburg

Språkbanken Text

Richard Johansson

Computer Science and Engineering (Chalmers), Data Science

University of Gothenburg

2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation, LREC-COLING 2024 - Main Conference Proceedings

16762–16769
9782493814104 (ISBN)

Joint 30th International Conference on Computational Linguistics and 14th International Conference on Language Resources and Evaluation, LREC-COLING 2024
Hybrid, Torino, Italy

Subject Categories (SSIF 2025)

Natural Language Processing

Comparative Language Studies and Linguistics
