The Tenth Swedish Language Technology Conference (SLTC) will take place on 27th–29th of November 2024 at Linköping University, Sweden, with the main conference on Wednesday–Thursday, 27th–28th of November, and workshops on Friday, 29th of November. All researchers and students working on NLP and related fields are invited to participate.
Important Dates
- Workshop Proposal Submission Deadline: 23.8.
- Workshop Notification of Acceptance: 26.8.
- Extended Abstracts Submission Deadline: 15.9.
- Extended Abstracts Notification of Acceptance: 14.10.
- Camera-Ready Abstracts: 1.11.
- Conference: 27–28.11.
- Workshops: 29.11.
Workshops
SLTC 2024 traditionally hosts selected workshops on relevant topics. This year's workshops are:
- Computational Social Science and Language Technology
- Swedish Workshop on Conversational AI
- Towards the Positronic Brain: Workshop on Embodied Language Processing and Multimodal Interaction
- Applications of Universal Dependencies
Call for Extended Abstracts
Papers are invited on all theoretical, practical and applied aspects of language technology, including natural language processing, computational linguistics, speech technology and neighbouring areas. Papers can describe completed or ongoing research, as well as practical applications of language technology, and may be combined with system demonstrations.
The conference does not publish any proceedings but accepted contributions will be made available on the conference web page as extended abstracts. Hence, it is possible to submit abstracts related to work that has been, or will be, published elsewhere, as long as this is compatible with the conditions of the respective publication channels.
Invited Speakers
We will have two keynotes by our invited speakers:
- David Samuel is a PhD student in the Language Technology Group at the University of Oslo. His team's submission won the CoNLL 2023 Shared Task on training a language model on a fixed data budget of 10 or 100 million tokens, also called the BabyLM challenge. It outperformed much larger models trained on trillions of tokens. In this talk, David will share his insights on what is needed for successful pre-training with small corpora.
- Tiago Pimentel is a Postdoc at ETH Zürich and holds a PhD from the University of Cambridge. His work on information theory as a tool to understand linguistics and language models has been groundbreaking, with important applications such as probing representations for linguistic structure and improving sampling strategies in generative models.
Organizers
SLTC 2024 is organized by Linköping University and sponsored by WARA Media and Language.
Local organization committee:
- Lars Ahrenberg
- Arne Jönsson
- Marco Kuhlmann
- Jenny Kunz
Recent history: 2022 (KTH), 2020 (GU), 2018 (SU), 2016 (UmU), 2014 (UU), 2012 (LU), 2010 (LiU), 2008 (KTH), 2006 (GU).