arXiv:2602.13139

OpenLID-v3: Improving the Precision of Closely Related Language Identification -- An Experience Report

Published on Feb 13 · Submitted by Maria F on Feb 16
Abstract

AI-generated summary: OpenLID-v3 improves language identification accuracy for closely related languages and low-resource variants through enhanced training data, cluster merging, and noise detection mechanisms.

Language identification (LID) is an essential step in building high-quality multilingual datasets from web data. Existing LID tools (such as OpenLID or GlotLID) often struggle to identify closely related languages and to distinguish valid natural language from noise, which contaminates language-specific subsets, especially for low-resource languages. In this work, we extend the OpenLID classifier by adding more training data, merging problematic language-variant clusters, and introducing a special label for marking noise. We call this extended system OpenLID-v3 and evaluate it against GlotLID on multiple benchmarks. During development, we focus on three groups of closely related languages (Bosnian, Croatian, and Serbian; Romance varieties of Northern Italy and Southern France; and Scandinavian languages) and contribute new evaluation datasets where existing ones are inadequate. We find that ensemble approaches improve precision but also substantially reduce coverage for low-resource languages. OpenLID-v3 is available at https://huggingface.co/HPLT/OpenLID-v3.
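To sketch how a classifier like this slots into web-data curation, here is a minimal illustration, not the authors' pipeline: it assumes the checkpoint is a standard FastText binary, and the file name (`model.bin`), the OpenLID-style `__label__xxx_Scrp` label codes, and the noise-label spelling (`__label__zxx_noise`) used below are all hypothetical; the model card at https://huggingface.co/HPLT/OpenLID-v3 has the actual names.

```python
# Sketch: bucket web-crawled lines by predicted language, dropping noise.
# Assumptions (NOT confirmed by the paper page): the checkpoint is a
# FastText binary named "model.bin", labels follow the OpenLID-style
# "__label__xxx_Scrp" convention, and noise is marked with a
# hypothetical "__label__zxx_noise" label.
from collections import defaultdict

import fasttext
from huggingface_hub import hf_hub_download

MODEL_PATH = hf_hub_download(repo_id="HPLT/OpenLID-v3", filename="model.bin")
NOISE_LABEL = "__label__zxx_noise"  # hypothetical spelling of the noise label
MIN_CONFIDENCE = 0.5                # arbitrary threshold for this sketch

model = fasttext.load_model(MODEL_PATH)

def bucket_by_language(lines):
    """Group lines by predicted language, discarding noise and
    low-confidence predictions."""
    buckets = defaultdict(list)
    for line in lines:
        text = line.strip().replace("\n", " ")  # fasttext rejects newlines
        if not text:
            continue
        labels, probs = model.predict(text, k=1)
        label, prob = labels[0], float(probs[0])
        if label == NOISE_LABEL or prob < MIN_CONFIDENCE:
            continue
        buckets[label.removeprefix("__label__")].append(text)
    return buckets

buckets = bucket_by_language([
    "Dobar dan, kako ste?",  # likely a South Slavic variety
    "asdf1234 §§§ ~~~~",     # likely noise
])
for lang, texts in buckets.items():
    print(lang, len(texts))
```

Thresholding on the classifier confidence loosely mirrors the precision/coverage trade-off the abstract reports for ensembles: raising MIN_CONFIDENCE buys precision at the cost of coverage, which bites hardest for low-resource languages.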

Community

Language identification model

  • Supports 194 languages
  • High performance (FastText-based)
  • Fast and easy to use (see the minimal prediction sketch after this list)
  • Fully transparent: training data and per-language performance openly available
  • Used by HPLT
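
For a bare-bones prediction call, a sketch under the same assumptions as above (FastText binary named `model.bin`; the filename is illustrative, not confirmed):

```python
# Minimal usage sketch; the filename "model.bin" is an assumption.
import fasttext
from huggingface_hub import hf_hub_download

path = hf_hub_download(repo_id="HPLT/OpenLID-v3", filename="model.bin")
model = fasttext.load_model(path)

# Top-3 candidate languages with confidence scores.
labels, probs = model.predict("Hvordan har du det i dag?", k=3)
for label, prob in zip(labels, probs):
    print(label.removeprefix("__label__"), f"{float(prob):.3f}")
```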


Models citing this paper: 1

Datasets citing this paper: 0

Spaces citing this paper: 0

Collections including this paper: 1