About me
Hi! I am Yuekun Yao, a PhD student in computational linguistics at Saarland University, advised by Alexander Koller.
My research focuses on structured prediction tasks in natural language understanding, including syntactic parsing, semantic role labeling, and semantic parsing. Currently I am especially interested in using sequence-to-sequence models to learn compositional generalization, a key ability that allows humans to produce infinitely many natural language sentences from a finite grammar.
Publications
Anything Goes? A Crosslinguistic Study of (Im)possible Language Learning in LMs
Xiulin Yang, Tatsuya Aoyama, Yuekun Yao, Ethan Wilcox
Preprint, 2025
Predicting generalization performance with correctness discriminators [paper]
Yuekun Yao, Alexander Koller
Findings of EMNLP 2024
Simple and effective data augmentation for compositional generalization [paper]
Yuekun Yao, Alexander Koller
NAACL 2024
SLOG: A Structural Generalization Benchmark for Semantic Parsing [paper]
Bingzhi Li, Lucia Donatelli, Alexander Koller, Tal Linzen, Yuekun Yao, Najoung Kim
EMNLP 2023
Structural generalization is hard for sequence-to-sequence models [paper]
Yuekun Yao, Alexander Koller
EMNLP 2022
Dynamic masking for improved stability in online spoken language translation [paper]
Yuekun Yao, Barry Haddow
AMTA 2020
ELITR non-native speech translation at IWSLT 2020 [paper]
Dominik Macháček, Jonáš Kratochvíl, Sangeet Sagar, Matúš Žilinec, Ondřej Bojar, Thai-Son Nguyen, Felix Schneider, Philip Williams, Yuekun Yao
IWSLT 2020