Dutch NLU 🇳🇱

Last updated: 04/11/2024 17:09:43 CET
Model ID | Parameters (M) | Vocabulary size (K) | Context (tokens) | Commercial | Speed | Rank (lower is better) | CoNLL-nl | Dutch Social | ScaLA-nl | SQuAD-nl | CoNLL-nl version | Dutch Social version | ScaLA-nl version | SQuAD-nl version
intfloat/multilingual-e5-large 559 250 512 True 6,732 ± 1,273 / 1,633 ± 523 1.45 82.31 ± 2.14 / 86.91 ± 1.34 32.64 ± 2.91 / 49.90 ± 3.42 58.51 ± 4.12 / 78.17 ± 2.32 45.32 ± 1.91 / 57.53 ± 1.78 0.0.0 0.0.0 0.0.0 0.0.0
setu4993/LaBSE 470 501 512 True 25,418 ± 6,435 / 4,536 ± 1,452 1.53 82.02 ± 1.04 / 84.71 ± 0.59 33.99 ± 4.05 / 50.69 ± 4.23 60.77 ± 1.53 / 79.80 ± 0.87 41.55 ± 1.08 / 51.73 ± 1.18 0.0.0 0.0.0 0.0.0 0.0.0
intfloat/multilingual-e5-large-instruct 560 250 514 True 5,947 ± 1,301 / 1,129 ± 374 1.71 83.68 ± 1.57 / 87.38 ± 1.03 27.19 ± 6.73 / 45.37 ± 6.25 51.80 ± 10.91 / 75.02 ± 5.60 46.07 ± 1.33 / 58.70 ± 1.30 13.0.0 13.0.0 13.0.0 13.0.0
gpt-4-0613 (few-shot, val) unknown 100 8316 True 597 ± 197 / 93 ± 33 1.75 73.35 ± 2.61 / 56.00 ± 2.82 18.92 ± 2.78 / 40.80 ± 2.43 76.70 ± 2.39 / 88.16 ± 1.21 55.03 ± 1.96 / 76.47 ± 1.22 12.10.0 12.10.0 12.10.0 12.10.0
DTAI-KULeuven/robbert-2022-dutch-base 118 43 512 True 11,307 ± 2,134 / 2,580 ± 834 1.83 79.84 ± 1.41 / 84.42 ± 1.03 24.58 ± 5.35 / 42.93 ± 4.78 68.76 ± 1.47 / 83.77 ± 0.93 27.63 ± 1.32 / 36.98 ± 1.39 0.0.0 0.0.0 0.0.0 0.0.0
pdelobelle/robbert-v2-dutch-base 116 40 512 True 15,481 ± 2,820 / 3,708 ± 1,186 1.86 78.30 ± 1.97 / 83.07 ± 1.30 26.68 ± 2.90 / 44.41 ± 2.97 63.83 ± 3.09 / 80.68 ± 2.18 28.34 ± 1.30 / 37.79 ± 1.37 0.0.0 0.0.0 0.0.0 0.0.0
meta-llama/Meta-Llama-3-70B (few-shot, val) 70554 128 8317 True 312 ± 55 / 177 ± 51 1.89 72.91 ± 3.24 / 68.06 ± 4.62 19.08 ± 3.37 / 42.04 ± 2.31 54.33 ± 3.49 / 75.54 ± 2.31 63.99 ± 2.07 / 77.63 ± 1.16 12.7.0 12.7.0 12.7.0 12.7.0
ZurichNLP/unsup-simcse-xlm-roberta-base 277 250 512 True 34,520 ± 7,443 / 6,730 ± 2,224 1.91 78.45 ± 1.88 / 83.50 ± 0.85 22.67 ± 7.22 / 44.07 ± 6.51 54.92 ± 9.62 / 76.14 ± 5.00 31.82 ± 2.84 / 40.85 ± 3.02 0.0.0 0.0.0 0.0.0 0.0.0
intfloat/multilingual-e5-base 277 250 512 True 14,965 ± 2,890 / 3,322 ± 1,074 2.06 79.12 ± 1.90 / 83.05 ± 1.09 27.67 ± 2.85 / 44.90 ± 2.69 39.28 ± 12.28 / 67.90 ± 5.94 35.71 ± 1.70 / 46.63 ± 1.40 0.0.0 0.0.0 0.0.0 0.0.0
gpt-4-1106-preview (few-shot, val) unknown 100 128000 True 576 ± 221 / 81 ± 28 2.07 66.44 ± 2.18 / 56.97 ± 2.87 14.22 ± 3.26 / 33.41 ± 3.24 72.30 ± 2.26 / 85.96 ± 1.13 57.81 ± 1.23 / 74.51 ± 0.62 12.3.2 12.10.2 12.10.2 12.3.2
152334H/miqu-1-70b-sf (few-shot, val) 68977 32 32889 True 2,126 ± 676 / 319 ± 104 2.11 67.00 ± 3.69 / 56.41 ± 4.29 15.33 ± 4.14 / 36.14 ± 2.91 55.48 ± 4.37 / 77.55 ± 2.24 61.02 ± 1.67 / 76.87 ± 1.15 12.7.0 12.7.0 12.7.0 12.7.0
DTAI-KULeuven/robbertje-1-gb-non-shuffled 74 40 512 True 21,007 ± 3,892 / 4,922 ± 1,588 2.11 74.50 ± 1.61 / 81.38 ± 0.87 32.23 ± 1.76 / 50.11 ± 2.55 54.57 ± 1.72 / 75.82 ± 1.07 6.31 ± 0.28 / 11.55 ± 0.22 0.0.0 0.0.0 0.0.0 0.0.0
DTAI-KULeuven/robbert-2023-dutch-base 124 50 512 True 11,230 ± 1,939 / 2,750 ± 897 2.16 82.22 ± 1.28 / 86.32 ± 0.75 28.20 ± 3.75 / 44.38 ± 3.44 55.12 ± 11.67 / 76.12 ± 7.45 9.74 ± 0.34 / 44.34 ± 0.99 0.0.0 0.0.0 0.0.0 0.0.0
FacebookAI/xlm-roberta-large 559 250 512 True 17,897 ± 3,921 / 3,463 ± 1,141 2.19 83.49 ± 1.51 / 86.12 ± 1.21 8.82 ± 7.93 / 30.82 ± 4.71 64.80 ± 8.79 / 80.93 ± 6.29 50.72 ± 1.20 / 61.66 ± 1.16 0.0.0 0.0.0 0.0.0 0.0.0
meta-llama/Meta-Llama-3-70B-Instruct (few-shot, val) 70554 128 8317 True 1,673 ± 583 / 275 ± 85 2.19 74.64 ± 3.67 / 71.84 ± 4.01 18.90 ± 2.04 / 41.93 ± 1.60 49.54 ± 4.22 / 74.03 ± 2.52 44.77 ± 1.67 / 71.44 ± 1.30 12.7.0 12.7.0 12.7.0 12.7.0
DTAI-KULeuven/robbertje-1-gb-merged 74 40 512 True 21,027 ± 3,902 / 4,932 ± 1,591 2.20 72.51 ± 0.97 / 80.14 ± 0.74 32.26 ± 2.51 / 47.44 ± 2.78 50.00 ± 2.09 / 73.38 ± 1.43 5.97 ± 0.44 / 11.08 ± 0.47 0.0.0 0.0.0 0.0.0 0.0.0
google/gemma-2-27b-it (few-shot) 27227 256 8320 True 1,516 ± 257 / 480 ± 148 2.21 65.20 ± 1.76 / 53.16 ± 2.84 14.80 ± 0.93 / 36.86 ± 1.08 59.02 ± 1.53 / 79.44 ± 0.78 59.60 ± 1.51 / 74.99 ± 0.59 13.0.0 13.0.0 13.0.0 13.0.0
jhu-clsp/bernice 277 250 128 True 5,567 ± 450 / 2,483 ± 798 2.23 78.74 ± 1.42 / 84.14 ± 0.87 22.58 ± 5.79 / 41.55 ± 4.55 55.39 ± 2.71 / 76.38 ± 2.03 5.95 ± 3.06 / 7.23 ± 3.67 0.0.0 0.0.0 0.0.0 0.0.0
gpt-4o-2024-05-13 (few-shot, val) unknown 200 128000 True 916 ± 329 / 114 ± 38 2.28 76.75 ± 3.44 / 61.13 ± 4.40 10.80 ± 2.24 / 32.52 ± 2.18 56.26 ± 4.51 / 73.83 ± 3.26 55.55 ± 2.54 / 76.28 ± 1.13 12.10.0 12.10.2 12.10.2 12.10.0
DTAI-KULeuven/robbertje-1-gb-shuffled 74 40 512 True 20,616 ± 3,755 / 4,819 ± 1,542 2.29 73.55 ± 2.27 / 80.69 ± 1.31 26.02 ± 3.29 / 42.90 ± 2.60 57.03 ± 1.80 / 77.24 ± 1.09 6.64 ± 0.36 / 12.04 ± 0.28 0.0.0 0.0.0 0.0.0 0.0.0
microsoft/mdeberta-v3-base 278 251 512 True 20,637 ± 3,925 / 4,497 ± 1,502 2.29 84.47 ± 1.84 / 87.98 ± 1.21 5.16 ± 5.21 / 27.85 ± 3.29 71.23 ± 1.62 / 85.45 ± 0.83 46.43 ± 0.72 / 57.80 ± 0.84 0.0.0 0.0.0 0.0.0 0.0.0
google/rembert 575 250 512 True 11,736 ± 2,822 / 2,102 ± 677 2.35 75.49 ± 1.75 / 81.37 ± 1.31 4.79 ± 3.93 / 27.49 ± 2.37 66.47 ± 2.04 / 83.16 ± 1.01 55.70 ± 1.62 / 68.38 ± 1.47 12.6.1 12.6.1 12.6.1 12.6.1
gpt-3.5-turbo-0613 (few-shot, val) unknown 100 4095 True 921 ± 293 / 113 ± 37 2.37 68.96 ± 3.80 / 58.45 ± 3.71 8.81 ± 3.30 / 30.88 ± 2.25 58.95 ± 4.48 / 78.64 ± 2.32 55.57 ± 2.33 / 68.26 ± 1.85 0.0.0 0.0.0 0.0.0 0.0.0
DTAI-KULeuven/robbert-2023-dutch-large 354 50 512 True 5,444 ± 911 / 1,413 ± 457 2.38 81.05 ± 2.44 / 85.20 ± 1.69 16.35 ± 6.39 / 36.89 ± 5.03 65.18 ± 1.83 / 82.29 ± 0.90 11.44 ± 0.50 / 52.98 ± 1.31 0.0.0 0.0.0 0.0.0 0.0.0
Nexusflow/Starling-LM-7B-beta (few-shot) 7242 32 8192 False 5,876 ± 1,021 / 1,677 ± 546 2.40 64.47 ± 2.31 / 40.89 ± 2.81 13.83 ± 1.91 / 41.53 ± 1.23 45.69 ± 1.76 / 72.13 ± 1.39 58.03 ± 1.37 / 73.17 ± 0.58 12.5.2 12.5.2 12.5.2 12.5.2
upstage/SOLAR-10.7B-v1.0 (few-shot) 10732 32 4096 True 3,780 ± 906 / 799 ± 261 2.40 65.37 ± 1.61 / 46.10 ± 1.53 11.93 ± 1.80 / 34.67 ± 2.84 41.67 ± 1.53 / 69.81 ± 1.38 67.75 ± 0.62 / 78.01 ± 0.45 12.5.3 12.5.3 12.5.3 12.5.3
cardiffnlp/twitter-xlm-roberta-base 277 250 512 True 34,475 ± 7,465 / 6,712 ± 2,223 2.44 77.15 ± 1.38 / 81.92 ± 1.32 18.78 ± 6.76 / 37.09 ± 4.14 56.72 ± 3.83 / 77.53 ± 2.17 14.61 ± 4.26 / 20.91 ± 5.21 0.0.0 0.0.0 0.0.0 0.0.0
google/gemma-2-9b-it (few-shot) 9242 256 8320 True 2,062 ± 397 / 589 ± 178 2.45 52.62 ± 2.15 / 39.41 ± 1.72 11.78 ± 1.31 / 32.80 ± 0.87 59.23 ± 1.58 / 79.42 ± 0.88 55.78 ± 0.87 / 72.71 ± 0.70 13.0.0 13.0.0 13.0.0 13.0.0
mistralai/Ministral-8B-Instruct-2410 (few-shot) 8020 131 32768 True 5,821 ± 1,090 / 1,561 ± 506 2.45 66.51 ± 1.38 / 52.40 ± 2.62 13.55 ± 2.14 / 37.36 ± 1.32 34.46 ± 2.79 / 65.61 ± 2.58 59.23 ± 1.16 / 72.56 ± 0.80 13.0.0 13.0.0 13.0.0 13.0.0
meta-llama/Llama-2-70b-hf (few-shot, val) 68977 32 4221 True 1,892 ± 650 / 318 ± 105 2.47 66.50 ± 3.72 / 57.66 ± 3.78 7.82 ± 4.30 / 34.91 ± 2.53 49.55 ± 4.95 / 73.43 ± 3.38 65.26 ± 1.55 / 77.36 ± 1.41 12.7.0 12.7.0 12.7.0 12.7.0
sentence-transformers/paraphrase-xlm-r-multilingual-v1 277 250 512 True 20,154 ± 4,438 / 3,890 ± 1,256 2.49 70.59 ± 1.60 / 78.25 ± 1.22 21.37 ± 8.79 / 40.62 ± 7.64 45.86 ± 2.06 / 71.32 ± 1.40 5.20 ± 0.30 / 10.40 ± 0.38 0.0.0 0.0.0 0.0.0 0.0.0
google/gemma-2-9b (few-shot) 9242 256 8320 True 2,038 ± 406 / 566 ± 172 2.52 57.13 ± 2.73 / 36.21 ± 1.71 17.43 ± 2.17 / 40.83 ± 1.50 31.39 ± 5.53 / 56.70 ± 5.97 59.33 ± 1.35 / 73.56 ± 0.52 13.0.0 13.0.0 13.0.0 13.0.0
mistralai/Mistral-Nemo-Instruct-2407 (few-shot) 12248 131 1024128 True 7,095 ± 2,193 / 1,063 ± 344 2.52 66.57 ± 1.86 / 48.40 ± 2.67 10.10 ± 1.55 / 33.62 ± 2.04 40.31 ± 2.25 / 69.53 ± 1.51 59.99 ± 0.95 / 74.24 ± 0.63 13.0.0 13.0.0 13.0.0 13.0.0
meta-llama/Llama-3.1-8B (few-shot) 8030 128 131200 True 2,439 ± 459 / 703 ± 219 2.53 64.79 ± 1.96 / 45.48 ± 2.24 11.95 ± 2.83 / 37.12 ± 2.19 32.97 ± 2.68 / 58.52 ± 2.92 63.89 ± 1.06 / 74.73 ± 1.02 12.11.0 12.11.0 12.11.0 13.0.0
ibm-granite/granite-3.0-8b-base (few-shot) 8171 49 4224 True 5,116 ± 943 / 1,436 ± 472 2.55 53.21 ± 1.37 / 42.13 ± 2.58 13.72 ± 2.18 / 39.38 ± 2.08 40.85 ± 4.16 / 67.34 ± 3.66 62.41 ± 0.99 / 74.19 ± 0.56 13.0.0 13.0.0 13.0.0 13.0.0
meta-llama/Llama-3.1-8B-Instruct (few-shot) 8030 128 131200 True 2,411 ± 465 / 686 ± 215 2.55 69.76 ± 1.36 / 57.66 ± 1.36 15.51 ± 1.59 / 39.71 ± 1.21 37.58 ± 3.42 / 66.98 ± 2.22 41.26 ± 2.09 / 65.63 ± 0.90 13.0.0 13.0.0 13.0.0 13.0.0
DTAI-KULeuven/robbertje-1-gb-bort 45 40 512 True 31,087 ± 5,833 / 7,147 ± 2,339 2.62 66.74 ± 1.53 / 75.07 ± 0.86 24.93 ± 6.85 / 41.47 ± 4.05 37.19 ± 6.22 / 66.68 ± 3.26 5.23 ± 0.43 / 10.67 ± 0.44 0.0.0 0.0.0 0.0.0 0.0.0
meta-llama/Llama-2-70b-chat-hf (few-shot, val) 68977 32 4221 True 1,979 ± 621 / 320 ± 105 2.65 64.00 ± 3.52 / 48.94 ± 3.83 13.30 ± 3.75 / 30.50 ± 2.48 30.88 ± 4.62 / 59.62 ± 4.50 54.14 ± 1.55 / 70.96 ± 1.01 12.7.0 12.7.0 12.7.0 12.7.0
meta-llama/Meta-Llama-3-8B-Instruct (few-shot) 8030 128 8192 True 4,909 ± 1,215 / 978 ± 319 2.65 68.72 ± 1.81 / 54.89 ± 2.10 14.67 ± 2.51 / 41.36 ± 2.04 32.91 ± 2.56 / 64.93 ± 1.97 45.36 ± 1.31 / 67.50 ± 0.69 12.6.1 12.6.1 12.6.1 12.6.1
meta-llama/Meta-Llama-3-8B (few-shot) 8030 128 8192 True 4,687 ± 1,121 / 967 ± 313 2.67 62.26 ± 2.20 / 42.41 ± 2.02 10.45 ± 2.69 / 33.45 ± 1.99 30.30 ± 3.94 / 62.28 ± 2.89 62.99 ± 1.00 / 73.73 ± 0.98 12.6.1 12.6.1 12.6.1 12.6.1
yhavinga/Boreas-7B-chat (few-shot) 7242 32 32768 False 2,913 ± 459 / 1,129 ± 342 2.71 60.22 ± 1.55 / 38.72 ± 1.45 11.97 ± 1.80 / 35.17 ± 3.03 30.94 ± 4.81 / 62.66 ± 3.36 52.19 ± 1.96 / 67.52 ± 1.56 12.6.1 12.6.1 12.6.1 12.6.1
senseable/WestLake-7B-v2 (few-shot) 7242 32 32768 False 5,993 ± 1,028 / 1,742 ± 561 2.72 64.25 ± 2.23 / 46.52 ± 1.72 13.66 ± 1.99 / 39.45 ± 1.52 28.59 ± 1.48 / 61.24 ± 1.46 49.64 ± 0.86 / 68.04 ± 0.55 12.6.1 12.6.1 12.6.1 12.6.1
robinsmits/Qwen1.5-7B-Dutch-Chat (few-shot) 7719 152 32768 False 4,686 ± 1,131 / 996 ± 326 2.74 57.81 ± 2.68 / 47.15 ± 2.77 14.62 ± 2.25 / 41.08 ± 1.81 25.34 ± 2.37 / 54.46 ± 3.43 56.81 ± 1.44 / 70.49 ± 0.68 12.5.3 12.5.3 12.5.3 12.5.3
robinsmits/Qwen1.5-7B-Dutch-Chat-Sft-Bf16 (few-shot) 7719 152 32768 False 2,413 ± 463 / 700 ± 220 2.75 56.83 ± 2.31 / 46.81 ± 2.87 14.79 ± 1.96 / 41.48 ± 1.53 23.58 ± 2.69 / 50.85 ± 3.74 55.90 ± 1.80 / 70.07 ± 0.77 12.6.1 12.6.1 12.6.1 12.6.1
CohereForAI/aya-23-8B (few-shot) 8028 256 8192 False 5,608 ± 1,062 / 1,472 ± 479 2.77 60.81 ± 1.94 / 46.59 ± 3.32 7.90 ± 1.63 / 24.82 ± 0.95 31.12 ± 2.35 / 64.29 ± 1.88 63.00 ± 1.23 / 74.60 ± 0.67 13.0.0 13.0.0 13.0.0 13.0.0
CohereForAI/aya-expanse-8b (few-shot) 8028 256 8192 False 5,581 ± 1,066 / 1,471 ± 483 2.77 62.07 ± 1.67 / 37.68 ± 1.28 11.09 ± 1.67 / 31.37 ± 1.35 35.14 ± 2.33 / 66.66 ± 1.50 49.15 ± 1.48 / 68.82 ± 0.68 13.0.0 13.0.0 13.0.0 13.0.0
skole-gpt (few-shot) unknown 32 32768 False 3,583 ± 977 / 686 ± 231 2.78 62.16 ± 1.09 / 45.76 ± 2.07 8.92 ± 1.08 / 24.28 ± 0.76 32.76 ± 2.94 / 65.17 ± 2.79 56.87 ± 0.92 / 72.57 ± 0.85 13.0.0 13.0.0 13.0.0 13.0.0
ibm-granite/granite-3.0-8b-instruct (few-shot) 8171 49 4096 True 5,090 ± 937 / 1,423 ± 466 2.81 53.62 ± 2.29 / 40.51 ± 1.41 13.37 ± 1.25 / 36.94 ± 1.81 23.47 ± 1.79 / 60.17 ± 1.23 61.20 ± 1.02 / 72.98 ± 0.62 13.0.0 13.0.0 13.0.0 13.0.0
mlabonne/NeuralBeagle14-7B (few-shot, val) 7242 32 8192 False 2,549 ± 472 / 784 ± 245 2.82 63.53 ± 3.80 / 50.43 ± 2.90 11.25 ± 4.22 / 39.00 ± 3.14 27.76 ± 4.44 / 62.44 ± 2.43 50.94 ± 1.12 / 70.12 ± 0.96 9.3.2 9.3.2 9.3.2 12.5.2
sentence-transformers/quora-distilbert-multilingual 135 120 512 True 26,458 ± 5,992 / 5,274 ± 1,731 2.82 67.89 ± 1.61 / 74.48 ± 1.24 23.25 ± 6.95 / 44.88 ± 6.27 21.36 ± 7.80 / 59.50 ± 3.54 4.50 ± 0.39 / 9.94 ± 0.33 0.0.0 0.0.0 0.0.0 0.0.0
microsoft/xlm-align-base 277 250 512 True 14,744 ± 2,870 / 3,265 ± 1,053 2.83 78.85 ± 2.48 / 83.35 ± 2.28 11.80 ± 7.64 / 33.49 ± 6.73 14.56 ± 8.02 / 53.64 ± 5.14 42.08 ± 7.94 / 51.94 ± 9.08 0.0.0 0.0.0 0.0.0 0.0.0
ReBatch/Reynaerde-7B-Instruct (few-shot) 7248 33 32768 False 2,562 ± 487 / 782 ± 247 2.86 59.16 ± 2.29 / 42.33 ± 2.15 10.39 ± 1.44 / 28.74 ± 1.05 19.50 ± 1.96 / 55.52 ± 3.92 60.96 ± 1.24 / 72.79 ± 0.95 13.0.0 13.0.0 13.0.0 13.0.0
ibm-granite/granite-8b-code-base-4k (few-shot) 8055 49 4096 True 2,313 ± 423 / 682 ± 210 2.86 63.29 ± 2.51 / 52.18 ± 4.31 13.81 ± 1.66 / 36.99 ± 2.58 8.16 ± 1.97 / 44.29 ± 4.25 56.64 ± 0.68 / 66.29 ± 0.71 13.0.0 13.0.0 13.0.0 13.0.0
mistralai/Mistral-7B-v0.1 (few-shot) 7242 32 32768 True 2,657 ± 524 / 880 ± 278 2.86 58.15 ± 1.14 / 40.78 ± 1.91 7.94 ± 1.25 / 31.02 ± 3.45 25.41 ± 3.46 / 61.11 ± 2.36 62.56 ± 1.10 / 73.16 ± 0.93 9.1.2 9.1.2 9.1.2 12.5.1
ReBatch/Llama-3-8B-dutch (few-shot) 8030 128 8317 False 3,800 ± 1,275 / 566 ± 194 2.87 60.14 ± 2.00 / 44.91 ± 2.19 11.07 ± 1.98 / 34.77 ± 1.31 15.67 ± 3.75 / 40.14 ± 2.65 59.93 ± 1.17 / 71.20 ± 1.30 12.7.0 12.7.0 12.7.0 12.7.0
ReBatch/Reynaerde-7B-Chat (few-shot) 7248 33 32768 False 2,554 ± 483 / 781 ± 247 2.89 56.22 ± 2.46 / 38.04 ± 1.69 11.22 ± 1.85 / 30.99 ± 1.36 20.04 ± 1.67 / 55.38 ± 3.62 61.15 ± 1.01 / 72.89 ± 0.88 13.0.0 13.0.0 13.0.0 13.0.0
Rijgersberg/Mistral-7B-v0.1-chat-nl (few-shot) 7242 32 32768 False 5,907 ± 1,028 / 1,695 ± 549 2.89 56.73 ± 1.95 / 38.97 ± 1.84 11.08 ± 1.46 / 32.20 ± 1.43 19.41 ± 2.55 / 57.17 ± 2.38 58.91 ± 0.92 / 71.22 ± 0.72 12.5.2 12.5.2 12.5.2 12.5.2
gpt-4o-mini-2024-07-18 (few-shot) unknown 200 128126 True 1,171 ± 378 / 120 ± 39 2.89 68.20 ± 1.93 / 49.92 ± 2.53 5.65 ± 1.80 / 14.84 ± 2.18 31.15 ± 7.70 / 52.16 ± 7.03 53.17 ± 1.82 / 72.08 ± 0.90 12.11.0 12.11.0 12.11.0 12.11.0
sentence-transformers/stsb-xlm-r-multilingual 278 250 512 True 15,040 ± 2,953 / 3,417 ± 1,100 2.89 66.85 ± 1.32 / 72.84 ± 0.82 20.56 ± 1.44 / 39.67 ± 0.86 35.56 ± 1.76 / 66.00 ± 1.15 5.04 ± 0.46 / 10.13 ± 0.40 12.6.1 12.6.1 12.6.1 12.6.1
alpindale/Mistral-7B-v0.2-hf (few-shot) 7242 32 32768 True 1,841 ± 297 / 651 ± 193 2.90 56.76 ± 1.52 / 42.03 ± 1.98 7.11 ± 1.17 / 26.36 ± 2.97 23.55 ± 2.76 / 59.14 ± 3.18 61.89 ± 1.10 / 72.41 ± 1.08 12.5.2 12.5.2 12.5.2 12.5.2
mlabonne/AlphaMonarch-7B (few-shot, val) 7242 32 8192 False 5,340 ± 1,262 / 1,157 ± 375 2.90 64.71 ± 5.15 / 53.58 ± 3.82 11.14 ± 3.37 / 38.64 ± 2.36 25.22 ± 5.45 / 61.28 ± 2.51 46.34 ± 1.07 / 66.56 ± 1.49 12.5.2 12.5.2 12.5.2 12.5.2
ibm-granite/granite-3.0-2b-instruct (few-shot) 2634 49 4224 True 10,194 ± 2,403 / 2,193 ± 731 2.93 52.52 ± 1.62 / 44.69 ± 2.23 13.85 ± 1.90 / 36.43 ± 2.11 17.72 ± 1.86 / 57.31 ± 1.52 53.50 ± 1.16 / 67.02 ± 0.75 13.0.0 13.0.0 13.0.0 13.0.0
mistralai/Mistral-7B-v0.3 (few-shot) 7248 33 32768 True 4,120 ± 976 / 926 ± 306 2.93 56.52 ± 1.42 / 41.84 ± 1.84 7.02 ± 1.21 / 26.40 ± 2.96 23.41 ± 2.91 / 59.14 ± 3.11 61.90 ± 1.07 / 72.49 ± 1.05 12.10.4 12.10.4 12.10.4 12.10.5
mistralai/Mistral-7B-Instruct-v0.2 (few-shot) 7242 32 32768 False 634 ± 179 / 110 ± 35 2.94 55.56 ± 2.66 / 39.56 ± 2.13 12.37 ± 1.64 / 37.37 ± 1.35 21.50 ± 1.70 / 59.10 ± 1.32 50.77 ± 0.95 / 66.54 ± 0.79 9.3.1 9.2.0 9.3.1 12.4.0
Geotrend/distilbert-base-25lang-cased 108 85 512 True 26,099 ± 5,881 / 5,178 ± 1,665 2.96 75.02 ± 1.48 / 81.57 ± 0.76 7.45 ± 2.99 / 29.70 ± 1.94 45.28 ± 0.55 / 71.89 ± 0.59 20.18 ± 1.26 / 27.86 ± 1.48 0.0.0 0.0.0 0.0.0 0.0.0
ibm-granite/granite-8b-code-instruct-4k (few-shot) 8055 49 4096 True 5,617 ± 995 / 1,623 ± 540 2.99 60.72 ± 2.14 / 45.52 ± 2.46 12.38 ± 1.62 / 29.91 ± 1.91 10.96 ± 1.47 / 47.97 ± 3.45 51.20 ± 0.91 / 61.75 ± 0.63 13.0.0 13.0.0 13.0.0 13.0.0
google/gemma-7b (few-shot) 8538 256 8192 True 1,378 ± 260 / 387 ± 119 3.01 47.75 ± 2.33 / 35.64 ± 1.89 7.68 ± 0.61 / 26.25 ± 1.18 28.28 ± 2.48 / 62.81 ± 1.70 61.49 ± 1.15 / 73.19 ± 0.81 12.9.1 12.9.1 12.9.1 12.9.1
ibm-granite/granite-3.0-2b-base (few-shot) 2534 49 4224 True 10,187 ± 2,363 / 2,204 ± 737 3.01 47.28 ± 1.57 / 36.12 ± 1.72 12.12 ± 1.92 / 35.44 ± 1.80 12.74 ± 2.68 / 52.69 ± 2.88 60.36 ± 1.36 / 71.20 ± 0.77 13.0.0 13.0.0 13.0.0 13.0.0
microsoft/Phi-3-mini-4k-instruct (few-shot) 3821 32 4096 True 5,224 ± 1,371 / 1,063 ± 358 3.03 50.31 ± 1.94 / 41.54 ± 2.19 12.58 ± 1.62 / 36.56 ± 1.79 14.72 ± 1.84 / 50.23 ± 3.10 56.19 ± 0.80 / 66.72 ± 0.92 12.10.5 12.10.5 12.10.5 12.10.5
occiglot/occiglot-7b-eu5-instruct (few-shot) 7242 32 32768 False 2,088 ± 352 / 706 ± 214 3.04 53.78 ± 1.86 / 41.29 ± 2.07 7.78 ± 1.43 / 24.33 ± 1.57 16.23 ± 2.49 / 55.09 ± 3.18 63.09 ± 1.18 / 73.88 ± 0.72 12.5.2 12.2.0 12.3.1 12.4.0
RuterNorway/Llama-2-13b-chat-norwegian (few-shot) unknown 32 4096 False 3,254 ± 1,068 / 484 ± 173 3.06 57.66 ± 1.29 / 43.77 ± 2.78 8.41 ± 1.47 / 25.59 ± 1.30 16.93 ± 2.60 / 55.72 ± 3.35 56.29 ± 1.11 / 68.94 ± 0.81 9.3.1 9.3.1 9.3.1 9.3.1
meta-llama/Llama-2-13b-chat-hf (few-shot) 13016 32 4096 True 2,849 ± 622 / 723 ± 229 3.06 57.80 ± 1.53 / 39.43 ± 1.53 8.57 ± 1.27 / 29.84 ± 2.35 17.40 ± 1.54 / 57.26 ± 1.96 56.35 ± 0.85 / 69.69 ± 0.76 12.11.0 12.10.4 12.10.4 12.11.0
Rijgersberg/GEITje-7B-chat-v2 (few-shot) 7242 32 32768 False 5,908 ± 1,022 / 1,694 ± 551 3.07 42.12 ± 4.00 / 31.12 ± 1.86 11.06 ± 2.30 / 40.32 ± 1.64 19.71 ± 3.65 / 49.65 ± 4.28 59.19 ± 0.91 / 70.06 ± 0.82 12.5.2 12.5.2 12.5.2 12.5.2
google/gemma-7b-it (few-shot) 8538 256 8317 False 1,792 ± 249 / 668 ± 203 3.07 53.93 ± 2.71 / 47.48 ± 2.09 12.83 ± 2.37 / 34.00 ± 2.26 6.58 ± 3.36 / 48.51 ± 3.35 53.45 ± 2.52 / 67.47 ± 0.88 12.10.0 12.10.0 12.10.0 12.10.0
meta-llama/Llama-2-13b-hf (few-shot) 13016 32 4096 True 2,898 ± 637 / 736 ± 236 3.07 52.55 ± 1.64 / 43.32 ± 1.70 4.26 ± 2.09 / 28.32 ± 2.68 24.57 ± 3.54 / 54.94 ± 5.33 60.99 ± 0.95 / 72.74 ± 0.78 12.10.5 12.10.4 12.10.4 12.10.5
Rijgersberg/GEITje-7B (few-shot) 7242 32 32768 True 5,887 ± 1,087 / 1,600 ± 522 3.09 47.53 ± 1.90 / 32.42 ± 1.99 4.36 ± 2.96 / 28.11 ± 4.71 30.67 ± 4.45 / 63.78 ± 2.80 56.55 ± 0.70 / 67.56 ± 0.60 9.3.1 9.3.1 9.3.1 9.3.1
BramVanroy/GEITje-7B-ultra (few-shot) 7242 32 8192 False 2,475 ± 460 / 765 ± 238 3.10 42.20 ± 2.20 / 27.85 ± 1.11 12.78 ± 2.52 / 42.17 ± 1.91 18.23 ± 1.91 / 50.04 ± 2.54 53.41 ± 1.11 / 66.45 ± 0.46 10.0.1 10.0.1 10.0.1 12.4.0
Twitter/twhin-bert-base 278 250 512 True 11,514 ± 2,041 / 2,862 ± 918 3.11 74.03 ± 3.05 / 80.59 ± 2.24 9.53 ± 5.28 / 32.06 ± 4.17 39.12 ± 12.90 / 68.36 ± 6.85 7.71 ± 0.42 / 12.90 ± 0.39 0.0.0 0.0.0 0.0.0 0.0.0
occiglot/occiglot-7b-eu5 (few-shot) 7242 32 32768 True 2,219 ± 427 / 717 ± 224 3.12 51.31 ± 2.32 / 42.95 ± 2.58 7.41 ± 1.24 / 26.93 ± 1.56 13.04 ± 1.93 / 53.54 ± 2.70 59.28 ± 1.15 / 69.67 ± 0.95 12.5.2 12.1.0 12.1.0 12.1.0
Twitter/twhin-bert-large 560 250 512 True 9,707 ± 1,664 / 2,549 ± 831 3.13 77.35 ± 2.80 / 82.50 ± 1.87 6.55 ± 5.33 / 28.68 ± 3.64 18.25 ± 8.41 / 54.00 ± 5.57 28.37 ± 4.84 / 36.84 ± 5.92 0.0.0 0.0.0 0.0.0 0.0.0
microsoft/Phi-3-mini-128k-instruct (few-shot) 3821 32 131072 True 7,312 ± 1,668 / 1,609 ± 525 3.13 44.27 ± 2.14 / 34.47 ± 2.48 12.84 ± 1.82 / 36.64 ± 1.13 10.44 ± 1.58 / 48.93 ± 3.41 56.40 ± 1.14 / 68.02 ± 0.79 12.9.1 12.9.1 12.9.1 12.9.1
EuropeanParliament/EUBERT 93 66 512 True 20,070 ± 3,977 / 4,400 ± 1,435 3.14 49.54 ± 1.42 / 50.44 ± 1.10 14.86 ± 3.09 / 35.33 ± 1.77 27.90 ± 5.58 / 62.47 ± 3.34 20.65 ± 1.02 / 29.40 ± 1.29 0.0.0 0.0.0 0.0.0 0.0.0
Rijgersberg/GEITje-7B-chat (few-shot) 7242 32 32768 False 5,920 ± 1,028 / 1,696 ± 550 3.14 50.69 ± 1.67 / 35.96 ± 2.63 8.16 ± 1.68 / 27.37 ± 1.95 20.45 ± 2.12 / 59.00 ± 1.21 54.48 ± 0.86 / 66.71 ± 0.59 12.5.2 12.5.2 12.5.2 12.5.2
meta-llama/Llama-2-7b-chat-hf (few-shot) 6738 32 4096 False 2,643 ± 455 / 800 ± 247 3.14 50.23 ± 2.34 / 37.12 ± 3.30 10.07 ± 1.84 / 35.66 ± 2.24 14.73 ± 1.62 / 54.59 ± 2.24 53.42 ± 0.80 / 66.24 ± 0.84 9.3.1 9.3.1 9.3.1 12.4.0
BramVanroy/fietje-2b-chat (few-shot) 2775 50 2048 False 4,704 ± 1,015 / 1,185 ± 375 3.15 39.57 ± 2.74 / 31.81 ± 1.55 13.25 ± 2.12 / 39.92 ± 1.66 9.31 ± 1.92 / 50.99 ± 2.59 60.26 ± 0.62 / 71.03 ± 0.65 12.6.1 12.6.1 12.6.1 12.6.1
mistralai/Mistral-7B-Instruct-v0.1 (few-shot) 7242 32 32768 False 634 ± 179 / 110 ± 35 3.16 52.72 ± 2.58 / 33.51 ± 1.22 7.91 ± 2.16 / 27.82 ± 1.97 18.14 ± 2.10 / 55.42 ± 3.05 52.75 ± 0.88 / 67.15 ± 1.08 9.3.1 9.3.1 9.3.1 12.4.0
meta-llama/Llama-3.2-3B-Instruct (few-shot) 3213 128 131200 False 10,424 ± 2,641 / 2,081 ± 666 3.17 43.66 ± 2.01 / 40.23 ± 1.75 12.87 ± 1.32 / 37.23 ± 1.55 17.94 ± 3.48 / 55.88 ± 3.97 47.77 ± 1.82 / 64.44 ± 1.35 13.0.0 13.0.0 13.0.0 13.0.0
timpal0l/Mistral-7B-v0.1-flashback-v2 (few-shot) 7242 32 32768 True 5,054 ± 1,200 / 1,056 ± 339 3.17 54.56 ± 2.96 / 37.86 ± 2.49 8.43 ± 1.27 / 24.23 ± 0.94 10.99 ± 2.55 / 50.46 ± 4.17 55.91 ± 1.08 / 66.78 ± 1.13 12.5.3 12.5.3 12.5.3 12.5.3
BramVanroy/fietje-2b (few-shot) 2780 51 2048 True 4,804 ± 1,045 / 1,220 ± 392 3.19 33.92 ± 3.43 / 28.63 ± 2.42 13.39 ± 1.64 / 41.03 ± 1.87 6.75 ± 2.55 / 41.28 ± 2.37 58.57 ± 1.03 / 69.39 ± 0.79 12.6.1 12.6.1 12.6.1 12.6.1
Qwen/Qwen1.5-4B-Chat (few-shot) 3950 152 32768 False 4,347 ± 893 / 1,135 ± 365 3.19 42.52 ± 2.25 / 37.46 ± 3.08 14.68 ± 1.40 / 40.53 ± 1.64 4.07 ± 2.16 / 35.24 ± 1.77 55.18 ± 0.74 / 66.50 ± 0.80 12.5.2 10.0.1 12.1.0 12.5.2
TrustLLMeu/baseline-7-8b_1t-tokens_llama (few-shot) 7800 100 4096 True 6,197 ± 1,118 / 1,730 ± 577 3.19 48.24 ± 2.28 / 40.24 ± 1.90 11.37 ± 1.34 / 33.54 ± 1.83 10.73 ± 1.76 / 48.26 ± 4.65 54.83 ± 1.16 / 65.28 ± 1.11 13.0.0 13.0.0 13.0.0 13.0.0
BramVanroy/fietje-2b-instruct (few-shot) 2775 50 2048 False 4,710 ± 1,040 / 1,188 ± 383 3.20 36.50 ± 2.72 / 28.73 ± 1.41 13.70 ± 2.29 / 40.77 ± 1.77 4.81 ± 2.20 / 43.19 ± 1.45 60.63 ± 0.62 / 71.62 ± 0.59 12.6.1 12.6.1 12.6.1 12.6.1
ibm-granite/granite-3.0-3b-a800m-instruct (few-shot) 3374 49 4096 True 10,246 ± 3,021 / 1,629 ± 550 3.20 49.25 ± 2.57 / 36.48 ± 2.14 9.45 ± 1.76 / 39.66 ± 1.14 11.87 ± 2.68 / 47.32 ± 3.85 54.20 ± 1.44 / 67.04 ± 0.75 13.0.0 13.0.0 13.0.0 13.0.0
google/gemma-2-2b-it (few-shot) 2614 256 8320 True 5,374 ± 1,233 / 1,193 ± 377 3.23 40.58 ± 2.08 / 28.95 ± 1.56 11.17 ± 1.32 / 32.96 ± 1.96 19.63 ± 2.61 / 52.65 ± 2.73 49.30 ± 1.25 / 67.27 ± 0.73 13.0.0 13.0.0 13.0.0 13.0.0
ibm-granite/granite-3b-code-base-2k (few-shot) 3483 49 2048 True 2,732 ± 868 / 662 ± 238 3.24 50.88 ± 3.41 / 39.51 ± 3.29 12.39 ± 1.50 / 30.05 ± 1.80 3.31 ± 1.19 / 37.97 ± 4.40 48.44 ± 1.06 / 58.85 ± 0.75 13.0.0 13.0.0 13.0.0 13.0.0
meta-llama/Llama-2-7b-hf (few-shot) 6738 32 4096 True 930 ± 310 / 128 ± 43 3.24 40.49 ± 4.32 / 30.86 ± 2.27 7.10 ± 1.85 / 27.42 ± 1.76 18.66 ± 2.39 / 55.25 ± 3.77 59.92 ± 0.61 / 70.24 ± 0.75 9.2.0 9.2.0 9.2.0 12.5.1
01-ai/Yi-1.5-6B (few-shot) 6061 64 4224 True 2,867 ± 550 / 793 ± 253 3.30 51.18 ± 1.62 / 35.45 ± 1.88 9.23 ± 2.84 / 19.38 ± 4.37 1.99 ± 2.56 / 34.69 ± 1.59 54.66 ± 1.25 / 65.31 ± 1.06 13.0.0 13.0.0 13.0.0 13.0.0
AI-Sweden-Models/gpt-sw3-20b (few-shot) 20918 64 2048 True 1,875 ± 673 / 261 ± 91 3.32 35.30 ± 3.76 / 33.68 ± 1.80 15.67 ± 2.21 / 31.30 ± 4.51 1.76 ± 2.37 / 47.60 ± 1.68 45.05 ± 1.68 / 55.38 ± 1.66 9.3.1 9.3.1 9.3.1 9.3.1
BramVanroy/GEITje-7B-ultra-sft (few-shot) 7242 32 8192 False 5,979 ± 1,044 / 1,724 ± 559 3.32 39.41 ± 2.93 / 30.59 ± 1.59 7.00 ± 3.04 / 35.01 ± 3.72 16.10 ± 2.34 / 52.05 ± 3.60 53.02 ± 0.97 / 65.63 ± 0.72 12.5.2 12.5.2 12.5.2 12.5.2
01-ai/Yi-6B (few-shot) 6061 64 4096 True 6,435 ± 1,316 / 1,632 ± 549 3.35 46.34 ± 2.00 / 33.30 ± 1.78 8.96 ± 1.44 / 18.10 ± 2.39 0.88 ± 1.23 / 33.53 ± 0.48 55.33 ± 1.28 / 66.50 ± 0.94 9.3.2 10.0.0 10.0.0 12.5.1
ibm-granite/granite-3b-code-instruct-2k (few-shot) 3483 49 2048 True 9,059 ± 1,947 / 2,201 ± 728 3.35 48.53 ± 3.89 / 38.20 ± 2.92 10.15 ± 1.55 / 22.01 ± 1.44 4.88 ± 2.27 / 38.78 ± 3.56 45.38 ± 0.93 / 56.09 ± 1.05 13.0.0 13.0.0 13.0.0 13.0.0
sentence-transformers/distiluse-base-multilingual-cased-v1 135 120 512 True 34,042 ± 8,482 / 5,951 ± 1,950 3.35 58.67 ± 1.07 / 68.27 ± 0.94 17.82 ± 4.47 / 37.11 ± 2.75 9.27 ± 4.66 / 52.04 ± 4.01 2.17 ± 0.34 / 8.02 ± 0.41 0.0.0 0.0.0 0.0.0 0.0.0
meta-llama/Llama-3.2-3B (few-shot) 3213 128 131200 True 3,713 ± 877 / 836 ± 267 3.36 47.40 ± 3.29 / 33.11 ± 2.04 7.90 ± 1.98 / 30.71 ± 1.89 3.10 ± 1.93 / 34.24 ± 0.73 56.53 ± 1.48 / 68.47 ± 1.35 13.0.0 13.0.0 13.0.0 13.0.0
AI-Sweden-Models/roberta-large-1350k 354 50 512 True 5,744 ± 969 / 1,539 ± 492 3.38 73.03 ± 2.07 / 79.97 ± 1.63 3.65 ± 4.19 / 26.89 ± 2.64 2.00 ± 2.03 / 39.53 ± 4.47 42.85 ± 0.98 / 53.68 ± 0.89 10.0.1 10.0.1 10.0.1 10.0.1
ibm-granite/granite-3.0-3b-a800m-base (few-shot) 3374 49 4096 True 10,504 ± 3,028 / 1,678 ± 559 3.38 42.52 ± 3.31 / 33.08 ± 2.70 9.91 ± 1.71 / 35.24 ± 2.62 0.69 ± 2.82 / 36.10 ± 2.58 56.95 ± 1.18 / 66.87 ± 1.37 13.0.0 13.0.0 13.0.0 13.0.0
MaLA-LM/emma-500-llama2-7b (few-shot) 6738 32 4096 True 6,275 ± 1,193 / 1,755 ± 578 3.39 36.61 ± 3.37 / 31.91 ± 2.20 8.77 ± 1.80 / 25.31 ± 1.42 3.52 ± 2.07 / 35.34 ± 1.61 59.51 ± 0.97 / 70.33 ± 0.64 13.0.0 13.0.0 13.0.0 13.0.0
AI-Sweden-Models/gpt-sw3-20b-instruct (few-shot) 20918 64 2048 True 1,831 ± 587 / 268 ± 90 3.41 24.44 ± 1.62 / 25.02 ± 1.72 18.40 ± 2.14 / 40.21 ± 2.51 4.85 ± 2.01 / 49.10 ± 2.56 39.83 ± 1.08 / 52.69 ± 1.15 9.3.1 9.3.1 9.3.1 9.3.1
AI-Sweden-Models/roberta-large-1160k 354 50 512 True 14,014 ± 2,384 / 3,625 ± 1,146 3.42 70.92 ± 1.61 / 78.52 ± 1.23 3.50 ± 3.15 / 27.25 ± 2.24 2.06 ± 1.79 / 41.06 ± 5.11 41.40 ± 1.92 / 51.93 ± 2.09 10.0.1 10.0.1 10.0.1 10.0.1
Qwen/Qwen1.5-4B (few-shot) 3950 152 32768 True 3,248 ± 739 / 761 ± 252 3.42 35.74 ± 3.22 / 31.74 ± 2.24 12.55 ± 1.39 / 39.80 ± 1.38 0.23 ± 0.44 / 33.35 ± 0.31 51.30 ± 1.63 / 64.17 ± 0.87 12.5.2 10.0.1 12.1.0 12.1.0
allenai/OLMo-1.7-7B-hf (few-shot) 6888 50 4096 True 3,371 ± 876 / 561 ± 184 3.42 46.95 ± 2.32 / 36.13 ± 1.88 4.34 ± 2.10 / 19.37 ± 2.08 3.46 ± 1.91 / 41.32 ± 3.08 57.07 ± 0.89 / 68.14 ± 0.65 12.10.5 12.10.4 12.10.4 12.10.5
sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking 135 120 512 True 33,753 ± 8,349 / 5,937 ± 1,946 3.51 65.04 ± 1.07 / 70.94 ± 0.61 17.40 ± 6.56 / 39.25 ± 6.11 -0.95 ± 1.35 / 49.00 ± 0.64 3.94 ± 0.39 / 9.50 ± 0.40 12.6.1 12.6.1 12.6.1 12.6.1
google/gemma-2b-it (few-shot) 2506 256 8192 False 6,471 ± 1,142 / 1,961 ± 584 3.54 38.85 ± 3.77 / 32.18 ± 2.49 11.25 ± 1.90 / 28.36 ± 1.81 -2.27 ± 1.37 / 37.91 ± 2.26 45.95 ± 1.11 / 56.54 ± 0.95 12.5.2 12.1.0 12.1.0 12.4.0
ibm-granite/granite-7b-base (few-shot) 6738 32 2048 True 4,405 ± 1,098 / 1,032 ± 345 3.55 37.39 ± 3.37 / 32.77 ± 1.97 7.51 ± 1.57 / 19.22 ± 1.72 3.11 ± 0.88 / 50.54 ± 0.90 49.60 ± 1.28 / 61.06 ± 0.94 12.10.5 12.10.5 12.10.5 12.10.5
meta-llama/Llama-3.2-1B-Instruct (few-shot) 1236 128 131200 False 7,436 ± 1,846 / 1,508 ± 479 3.57 42.01 ± 2.06 / 37.16 ± 1.98 9.15 ± 1.70 / 32.55 ± 2.69 1.11 ± 2.15 / 36.71 ± 3.89 40.04 ± 1.61 / 53.75 ± 1.10 13.0.0 13.0.0 13.0.0 13.0.0
dbmdz/bert-base-historic-multilingual-cased 111 32 512 True 20,047 ± 4,407 / 3,844 ± 1,259 3.59 56.69 ± 1.80 / 68.42 ± 0.85 9.29 ± 3.04 / 30.73 ± 2.40 3.02 ± 1.45 / 50.08 ± 1.17 22.14 ± 1.13 / 31.59 ± 0.96 12.6.1 12.6.1 12.6.1 12.6.1
sentence-transformers/distiluse-base-multilingual-cased 135 120 512 True 19,206 ± 4,451 / 3,658 ± 1,187 3.60 56.98 ± 1.37 / 66.91 ± 1.60 9.66 ± 4.65 / 31.17 ± 3.34 19.37 ± 4.34 / 56.74 ± 3.13 3.11 ± 0.37 / 7.91 ± 0.25 0.0.0 0.0.0 0.0.0 0.0.0
stabilityai/stablelm-2-1_6b (few-shot) 1645 100 4096 True 7,259 ± 2,120 / 1,240 ± 432 3.60 36.58 ± 3.88 / 33.82 ± 2.87 6.32 ± 1.30 / 24.04 ± 1.14 4.01 ± 2.01 / 36.03 ± 1.61 52.81 ± 0.81 / 63.87 ± 1.27 12.10.8 12.10.8 12.10.8 12.10.8
RuterNorway/Llama-2-7b-chat-norwegian (few-shot) unknown 32 4096 False 10,890 ± 2,686 / 2,186 ± 750 3.63 35.49 ± 3.10 / 29.35 ± 2.75 11.36 ± 1.56 / 30.66 ± 3.68 2.52 ± 2.14 / 42.60 ± 4.80 37.49 ± 1.37 / 47.34 ± 1.53 9.3.1 9.3.1 9.3.1 12.5.2
google/gemma-2-2b (few-shot) 2614 256 8320 True 5,235 ± 1,226 / 1,154 ± 366 3.65 22.63 ± 4.98 / 22.71 ± 2.86 8.11 ± 1.55 / 28.07 ± 1.80 8.04 ± 1.79 / 48.95 ± 2.97 52.39 ± 2.14 / 65.22 ± 1.11 13.0.0 13.0.0 13.0.0 13.0.0
jpostma/DagoBERT 116 40 512 True 11,241 ± 2,115 / 2,565 ± 830 3.69 42.28 ± 1.41 / 47.68 ± 1.08 8.01 ± 2.88 / 31.60 ± 2.41 31.21 ± 1.62 / 64.82 ± 0.69 3.65 ± 0.33 / 9.49 ± 0.31 0.0.0 0.0.0 0.0.0 0.0.0
allenai/OLMo-7B (few-shot) 6888 50 2176 True 5,403 ± 1,133 / 1,294 ± 423 3.73 37.37 ± 2.22 / 30.45 ± 2.45 9.55 ± 1.82 / 23.90 ± 1.53 0.05 ± 1.35 / 35.78 ± 2.30 34.81 ± 1.54 / 46.37 ± 1.51 12.5.2 12.5.2 12.5.2 12.5.2
LumiOpen/Viking-13B (few-shot) 14030 131 4224 True 840 ± 79 / 400 ± 124 3.74 36.74 ± 3.36 / 32.36 ± 1.39 8.57 ± 2.44 / 34.17 ± 2.59 3.01 ± 1.94 / 46.03 ± 4.19 32.32 ± 1.55 / 40.73 ± 1.64 12.5.2 12.5.2 12.5.2 12.5.2
google/gemma-2b (few-shot) 2506 256 8192 True 6,087 ± 1,046 / 1,902 ± 563 3.81 16.90 ± 4.91 / 17.38 ± 4.30 9.95 ± 0.78 / 27.94 ± 1.43 0.41 ± 1.03 / 33.54 ± 0.32 49.15 ± 1.55 / 59.16 ± 1.44 12.5.2 12.1.0 12.1.0 12.1.0
Tweeties/tweety-7b-dutch-v24a (few-shot) 7391 50 1024 True 2,971 ± 423 / 1,351 ± 410 3.82 35.83 ± 3.06 / 29.15 ± 2.80 12.47 ± 2.51 / 40.41 ± 2.26 16.81 ± 3.31 / 53.38 ± 5.08 0.00 ± 0.00 / 14.18 ± 0.38 12.6.1 12.6.1 12.6.1 12.6.1
NorwAI/NorwAI-Mistral-7B (few-shot) 7537 68 4096 True 3,035 ± 503 / 911 ± 300 3.89 24.15 ± 5.73 / 26.49 ± 4.13 8.31 ± 1.56 / 20.06 ± 1.06 1.60 ± 1.71 / 41.51 ± 3.60 37.08 ± 1.76 / 49.32 ± 0.87 12.10.4 12.10.4 12.10.4 12.10.4
HuggingFaceTB/SmolLM2-1.7B-Instruct (few-shot) 1711 49 8192 True 15,971 ± 3,654 / 3,609 ± 1,197 3.91 31.84 ± 3.39 / 28.66 ± 1.77 1.56 ± 3.25 / 28.78 ± 2.60 5.05 ± 1.34 / 43.99 ± 4.14 40.55 ± 0.77 / 48.56 ± 0.95 13.1.0 13.1.0 13.1.0 13.1.0
meta-llama/Llama-3.2-1B (few-shot) 1236 128 131200 True 7,577 ± 1,884 / 1,555 ± 492 3.91 22.03 ± 4.43 / 19.22 ± 3.92 4.25 ± 2.95 / 26.57 ± 3.31 1.46 ± 1.83 / 42.29 ± 4.01 41.76 ± 1.92 / 52.60 ± 1.99 13.0.0 13.0.0 13.0.0 13.0.0
HuggingFaceTB/SmolLM2-1.7B (few-shot) 1711 49 8192 True 16,249 ± 3,690 / 3,689 ± 1,226 3.93 22.84 ± 5.42 / 25.11 ± 3.52 4.60 ± 2.12 / 29.94 ± 1.50 2.55 ± 1.41 / 40.88 ± 3.15 40.33 ± 1.19 / 48.35 ± 1.31 13.1.0 13.1.0 13.1.0 13.1.0
Qwen/Qwen1.5-1.8B-Chat (few-shot) 1837 152 32768 False 8,304 ± 1,846 / 1,933 ± 617 3.93 23.44 ± 5.09 / 25.00 ± 2.33 6.82 ± 1.82 / 30.97 ± 2.65 4.11 ± 1.73 / 43.70 ± 3.47 33.16 ± 1.61 / 46.66 ± 1.27 12.5.2 11.0.0 12.1.0 12.5.0
Qwen/Qwen1.5-0.5B-Chat (few-shot) 620 152 32768 False 11,740 ± 3,000 / 2,209 ± 721 4.02 18.66 ± 4.43 / 17.56 ± 4.28 8.59 ± 3.20 / 29.65 ± 5.10 0.34 ± 2.02 / 43.92 ± 3.15 26.74 ± 1.57 / 35.03 ± 2.14 12.5.2 11.0.0 12.1.0 12.5.0
3ebdola/Dialectal-Arabic-XLM-R-Base 277 250 512 True 12,783 ± 2,537 / 2,712 ± 885 4.04 44.46 ± 2.24 / 60.04 ± 1.09 8.39 ± 4.20 / 30.69 ± 2.83 2.07 ± 1.34 / 48.42 ± 1.31 4.30 ± 1.26 / 9.24 ± 1.13 0.0.0 0.0.0 0.0.0 0.0.0
dbmdz/bert-tiny-historic-multilingual-cased 5 32 512 True 78,027 ± 15,466 / 17,064 ± 5,335 4.04 41.38 ± 2.82 / 56.29 ± 1.61 8.45 ± 2.80 / 29.85 ± 1.86 1.55 ± 1.97 / 49.24 ± 1.16 4.40 ± 0.22 / 6.62 ± 0.38 0.0.0 0.0.0 0.0.0 0.0.0
Qwen/Qwen1.5-0.5B (few-shot) 620 152 32768 True 11,371 ± 2,924 / 2,122 ± 692 4.10 28.30 ± 3.90 / 28.67 ± 3.15 4.54 ± 2.76 / 26.53 ± 3.74 -0.42 ± 2.41 / 37.60 ± 3.89 20.81 ± 2.21 / 29.05 ± 2.31 12.5.2 10.0.1 12.1.0 12.1.0
sentence-transformers/distiluse-base-multilingual-cased-v2 135 120 512 True 33,247 ± 8,123 / 6,017 ± 1,977 4.10 49.82 ± 2.71 / 62.06 ± 1.69 2.70 ± 3.10 / 26.12 ± 2.40 6.60 ± 3.84 / 50.71 ± 1.92 2.13 ± 0.10 / 6.80 ± 0.37 12.6.1 12.6.1 12.6.1 12.6.1
allenai/OLMo-7B-Twin-2T (few-shot) 6888 50 2176 True 5,484 ± 1,125 / 1,317 ± 425 4.11 18.70 ± 5.76 / 19.58 ± 4.59 3.70 ± 1.69 / 17.91 ± 1.48 2.19 ± 2.08 / 45.43 ± 3.44 38.08 ± 1.07 / 48.44 ± 1.55 12.5.2 12.5.2 12.5.2 12.5.2
NorwAI/NorwAI-Llama2-7B (few-shot) 7033 68 4096 True 4,438 ± 1,128 / 1,028 ± 346 4.13 22.50 ± 2.27 / 24.09 ± 2.40 6.04 ± 1.51 / 18.08 ± 2.09 -0.61 ± 1.30 / 46.51 ± 2.55 26.96 ± 1.55 / 35.73 ± 1.87 12.10.4 12.10.4 12.10.4 12.10.4
sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2 118 250 512 True 29,201 ± 6,282 / 6,045 ± 2,027 4.13 59.61 ± 2.40 / 67.02 ± 1.16 0.00 ± 0.00 / 24.33 ± 0.14 -0.04 ± 1.84 / 48.65 ± 1.21 3.28 ± 0.31 / 9.04 ± 0.28 12.6.1 12.6.1 12.6.1 12.6.1
Qwen/Qwen1.5-1.8B (few-shot) 1837 152 32768 True 5,666 ± 1,328 / 1,256 ± 408 4.14 11.66 ± 6.46 / 15.15 ± 4.38 5.20 ± 1.78 / 35.43 ± 2.14 2.89 ± 1.91 / 41.36 ± 4.63 34.60 ± 2.17 / 48.83 ± 1.05 12.5.2 10.0.1 12.1.0 12.1.0
HuggingFaceTB/SmolLM2-360M (few-shot) 362 49 8192 True 22,023 ± 6,203 / 3,675 ± 1,231 4.16 20.95 ± 2.02 / 25.63 ± 1.96 6.84 ± 1.76 / 27.74 ± 5.49 -1.50 ± 1.30 / 34.07 ± 0.45 22.67 ± 1.77 / 30.14 ± 1.68 13.1.0 13.1.0 13.1.0 13.1.0
HuggingFaceTB/SmolLM2-360M-Instruct (few-shot) 362 49 8192 True 21,777 ± 6,115 / 3,617 ± 1,211 4.21 15.68 ± 5.54 / 22.21 ± 5.42 6.73 ± 2.20 / 27.67 ± 4.00 0.63 ± 1.05 / 43.48 ± 2.98 19.73 ± 1.42 / 27.47 ± 1.35 13.1.0 13.1.0 13.1.0 13.1.0
state-spaces/mamba-2.8b-hf (few-shot) 2768 50 32896 True 2,722 ± 495 / 766 ± 250 4.25 22.95 ± 1.99 / 23.05 ± 1.55 2.40 ± 1.70 / 22.89 ± 4.81 3.12 ± 1.51 / 45.58 ± 4.56 22.40 ± 1.73 / 32.09 ± 1.56 13.0.0 13.0.0 13.0.0 13.0.0
allenai/OLMo-1B (few-shot) 1177 50 2176 True 8,536 ± 1,926 / 1,940 ± 619 4.39 22.58 ± 5.05 / 26.82 ± 3.69 4.92 ± 2.71 / 19.51 ± 4.22 -1.27 ± 1.85 / 41.38 ± 3.59 6.64 ± 1.96 / 11.74 ± 1.62 12.5.2 12.1.0 12.1.0 12.1.0
HuggingFaceTB/SmolLM2-135M (few-shot) 135 49 8192 True 26,346 ± 7,812 / 4,082 ± 1,372 4.62 17.49 ± 2.94 / 18.59 ± 2.66 2.01 ± 1.88 / 15.88 ± 1.32 -0.02 ± 0.15 / 34.86 ± 2.12 0.53 ± 0.36 / 3.23 ± 0.51 13.1.0 13.1.0 13.1.0 13.1.0
HuggingFaceTB/SmolLM2-135M-Instruct (few-shot) 135 49 8192 True 25,602 ± 7,583 / 3,953 ± 1,325 4.69 15.82 ± 3.13 / 16.46 ± 2.61 -0.62 ± 1.55 / 16.18 ± 1.88 1.16 ± 1.38 / 34.30 ± 1.27 3.25 ± 1.17 / 5.89 ± 0.97 13.1.0 13.1.0 13.1.0 13.1.0
fresh-xlm-roberta-base 277 250 512 True 2,214 ± 94 / 1,494 ± 229 4.75 13.09 ± 1.68 / 16.25 ± 2.85 0.92 ± 2.11 / 25.39 ± 1.86 1.93 ± 1.37 / 40.76 ± 4.83 0.26 ± 0.09 / 2.70 ± 1.10 0.0.0 0.0.0 0.0.0 0.0.0
fresh-electra-small 13 31 512 True 7,840 ± 1,538 / 3,024 ± 438 4.78 11.66 ± 1.16 / 13.45 ± 1.20 0.00 ± 0.00 / 24.33 ± 0.14 -0.21 ± 1.89 / 35.79 ± 2.99 0.17 ± 0.04 / 0.17 ± 0.04 12.6.1 12.6.1 12.6.1 12.6.1
NorwAI/NorwAI-Mistral-7B-pretrain (few-shot) 7537 68 4096 True 3,024 ± 496 / 909 ± 301 4.91 3.80 ± 1.23 / 4.24 ± 1.19 0.97 ± 1.50 / 13.00 ± 2.52 -0.37 ± 0.55 / 33.40 ± 0.35 0.40 ± 0.25 / 2.98 ± 0.44 12.10.4 12.10.4 12.10.4 12.10.4
RJuro/kanelsnegl-v0.1 (few-shot) 7242 32 512 True 5,847 ± 1,029 / 1,640 ± 525 4.94 0.00 ± 0.00 / 0.00 ± 0.00 0.95 ± 1.17 / 9.87 ± 0.86 0.00 ± 0.00 / 33.34 ± 0.31 0.00 ± 0.00 / 5.43 ± 0.58 9.3.1 9.3.1 9.3.1 12.5.1
ssmits/Falcon2-5.5B-multilingual (few-shot) 5465 65 8192 True 7,692 ± 1,423 / 1,960 ± 644 4.94 0.00 ± 0.00 / 0.00 ± 0.00 0.00 ± 0.00 / 8.62 ± 0.30 0.00 ± 0.00 / 33.34 ± 0.31 0.00 ± 0.00 / 0.01 ± 0.00 13.0.0 13.0.0 13.0.0 13.0.0
ai-forever/mGPT (few-shot) unknown 100 1024 True 11,734 ± 3,124 / 2,174 ± 720 4.96 0.11 ± 0.21 / 0.27 ± 0.53 -0.67 ± 1.33 / 8.96 ± 0.37 -0.97 ± 1.56 / 34.83 ± 1.94 0.29 ± 0.21 / 1.56 ± 0.19 9.3.1 10.0.1 11.0.0 12.5.1
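Every score and speed cell in the rows above follows the same `mean ± err / mean ± err` pattern, pairing two measurements per dataset (speed cells use thousands separators, e.g. `6,732 ± 1,273 / 1,633 ± 523`). A minimal parsing sketch for such cells — the function name and regex are illustrative, not part of any leaderboard tooling:

```python
import re

# One cell = two "mean ± err" measurements separated by "/".
# Numbers may be negative, may carry thousands separators, and
# may or may not have a decimal part.
_NUM = r"(-?[\d,]+(?:\.\d+)?)"
CELL_RE = re.compile(rf"{_NUM} ± {_NUM} / {_NUM} ± {_NUM}")

def parse_cell(cell: str) -> tuple[float, float, float, float]:
    """Split a cell like '82.31 ± 2.14 / 86.91 ± 1.34' into
    (first_mean, first_err, second_mean, second_err)."""
    m = CELL_RE.fullmatch(cell.strip())
    if m is None:
        raise ValueError(f"unrecognised cell: {cell!r}")
    return tuple(float(g.replace(",", "")) for g in m.groups())
```

The same function handles score cells (`-0.95 ± 1.35 / 49.00 ± 0.64`) and speed cells, since both differ only in separators and sign.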