Icelandic NLG 🇮🇸

Last updated: 30/06/2024 17:28:45 CET

| Model ID | Parameters (M) | Vocabulary size (k) | Context | Commercial | Speed | Rank | MIM-GOLD-NER | ScaLA-is | NQiI | RRN | ARC-is | Winogrande-is | MIM-GOLD-NER version | ScaLA-is version | NQiI version | RRN version | ARC-is version | Winogrande-is version |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| gpt-4-1106-preview (few-shot, val) | unknown | 100 | 127999 | True | 576 ± 221 / 81 ± 28 | 1.15 | 86.37 ± 1.19 / 82.25 ± 2.73 | 43.03 ± 5.07 / 71.18 ± 2.64 | 37.26 ± 2.60 / 66.04 ± 1.95 | 69.61 ± 0.61 / 23.98 ± 1.17 | 89.09 ± 1.59 / 91.76 ± 1.19 | 72.03 ± 3.91 / 86.09 ± 1.96 | 12.5.1 | 12.10.2 | 12.5.1 | 12.5.1 | 12.10.8 | 12.10.2 |
| gpt-4o-2024-05-13 (few-shot, val) | unknown | 200 | 127999 | True | 916 ± 329 / 114 ± 38 | 1.19 | 81.19 ± 2.45 / 54.02 ± 5.60 | 51.10 ± 5.09 / 73.25 ± 3.42 | 29.64 ± 2.12 / 55.46 ± 1.12 | 68.25 ± 0.27 / 19.22 ± 0.51 | 91.27 ± 1.41 / 93.40 ± 1.09 | 70.85 ± 5.98 / 85.55 ± 3.05 | 12.10.0 | 12.10.2 | 12.10.0 | 12.10.0 | 12.10.8 | 12.10.2 |
| gpt-4-0613 (few-shot, val) | unknown | 100 | 8190 | True | 597 ± 197 / 93 ± 33 | 1.46 | 78.67 ± 2.16 / 59.54 ± 4.85 | 33.65 ± 3.19 / 60.56 ± 2.35 | 32.25 ± 2.01 / 60.12 ± 1.31 | 69.21 ± 0.25 / 22.50 ± 0.53 | 86.32 ± 1.50 / 89.65 ± 1.16 | 74.02 ± 4.17 / 86.80 ± 2.02 | 12.10.0 | 12.10.0 | 12.10.0 | 12.10.0 | 12.10.8 | 12.10.0 |
| gpt-3.5-turbo-0613 (few-shot, val) | unknown | 100 | 4095 | True | 921 ± 293 / 113 ± 37 | 2.89 | 69.59 ± 4.54 / 54.49 ± 4.31 | 7.28 ± 4.10 / 52.96 ± 2.00 | 28.50 ± 1.79 / 50.29 ± 1.79 | 67.10 ± 0.30 / 19.43 ± 0.48 | 49.88 ± 2.38 / 62.27 ± 1.75 | 18.61 ± 6.00 / 61.33 ± 2.93 | 0.0.0 | 0.0.0 | 0.0.0 | 0.0.0 | 12.10.8 | 12.1.0 |
| meta-llama/Meta-Llama-3-8B-Instruct (few-shot) | 8030 | 128 | 8192 | True | 4,909 ± 1,215 / 978 ± 319 | 3.24 | 61.69 ± 2.17 / 41.25 ± 3.12 | 6.10 ± 1.61 / 48.74 ± 3.05 | 31.52 ± 2.08 / 58.96 ± 1.57 | 66.98 ± 1.04 / 19.84 ± 1.97 | 25.16 ± 1.55 / 43.57 ± 1.26 | 1.50 ± 1.22 / 48.33 ± 1.21 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 | 12.10.8 | 12.6.1 |
| upstage/SOLAR-10.7B-v1.0 (few-shot) | 10732 | 32 | 4096 | True | 3,780 ± 906 / 799 ± 261 | 3.26 | 62.08 ± 2.27 / 51.09 ± 4.15 | 7.58 ± 1.03 / 44.38 ± 3.90 | 29.66 ± 3.02 / 56.60 ± 2.22 | 66.11 ± 0.85 / 18.74 ± 0.90 | 18.75 ± 1.02 / 38.96 ± 0.70 | 7.64 ± 1.91 / 49.40 ± 1.45 | 12.5.3 | 12.5.3 | 12.5.3 | 12.5.3 | 12.10.8 | 12.5.3 |
| meta-llama/Meta-Llama-3-8B (few-shot) | 8030 | 128 | 8192 | True | 4,687 ± 1,121 / 967 ± 313 | 3.33 | 48.70 ± 3.02 / 34.52 ± 2.66 | 7.49 ± 2.51 / 43.40 ± 4.41 | 29.56 ± 5.47 / 55.53 ± 5.79 | 66.34 ± 1.09 / 19.13 ± 0.96 | 26.78 ± 1.59 / 45.17 ± 1.12 | 7.41 ± 3.26 / 52.13 ± 1.97 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 | 12.10.8 | 12.6.1 |
| google/gemma-7b (few-shot) | 8538 | 256 | 8192 | True | 1,378 ± 260 / 387 ± 119 | 3.48 | 30.61 ± 4.61 / 25.80 ± 3.57 | 5.60 ± 1.68 / 38.26 ± 2.47 | 32.22 ± 3.19 / 55.22 ± 2.59 | 65.03 ± 1.82 / 18.15 ± 2.24 | 32.53 ± 0.82 / 48.03 ± 0.80 | 1.14 ± 1.26 / 39.92 ± 1.92 | 12.9.1 | 12.9.1 | 12.9.1 | 12.9.1 | 12.10.8 | 12.10.0 |
| senseable/WestLake-7B-v2 (few-shot) | 7242 | 32 | 32768 | False | 5,993 ± 1,028 / 1,742 ± 561 | 3.58 | 56.71 ± 1.98 / 46.71 ± 5.28 | 3.44 ± 2.02 / 50.18 ± 1.14 | 21.55 ± 2.79 / 54.79 ± 2.02 | 65.39 ± 0.80 / 18.24 ± 1.00 | 9.11 ± 0.92 / 32.06 ± 0.70 | 3.30 ± 2.81 / 44.40 ± 1.61 | 12.6.1 | 12.6.1 | 12.6.1 | 12.6.1 | 12.10.8 | 12.6.1 |
| mlabonne/NeuralBeagle14-7B (few-shot, val) | 7242 | 32 | 8192 | False | 2,549 ± 472 / 784 ± 245 | 3.59 | 49.86 ± 4.28 / 42.54 ± 5.03 | 1.26 ± 3.83 / 48.46 ± 2.37 | 22.48 ± 4.43 / 55.51 ± 2.89 | 65.60 ± 0.69 / 19.46 ± 0.80 | 5.19 ± 3.15 / 28.44 ± 2.45 | 12.90 ± 6.92 / 56.88 ± 3.57 | 9.3.2 | 9.3.2 | 12.5.2 | 9.3.2 | 9.3.2 | 12.1.0 |
| mhenrichsen/hestenettetLM (few-shot) | 7242 | 32 | 32768 | True | 5,160 ± 804 / 1,654 ± 516 | 3.64 | 50.82 ± 2.72 / 40.35 ± 4.51 | 0.99 ± 1.54 / 39.38 ± 3.81 | 25.74 ± 5.44 / 49.45 ± 5.29 | 61.72 ± 3.16 / 16.00 ± 1.82 | 10.78 ± 1.15 / 33.40 ± 0.91 | 3.94 ± 2.97 / 54.60 ± 1.62 | 12.5.2 | 12.3.2 | 12.3.2 | 12.3.2 | 12.10.8 | 12.3.2 |
| Nexusflow/Starling-LM-7B-beta (few-shot) | 7242 | 32 | 8192 | False | 5,876 ± 1,021 / 1,677 ± 546 | 3.65 | 49.20 ± 2.64 / 40.79 ± 4.46 | 4.45 ± 1.40 / 51.11 ± 0.87 | 24.61 ± 3.36 / 54.99 ± 2.36 | 63.74 ± 2.25 / 18.29 ± 1.40 | 8.45 ± 1.35 / 31.54 ± 1.03 | 1.14 ± 0.97 / 50.10 ± 0.82 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 | 12.10.8 | 12.5.2 |
| mistralai/Mistral-7B-v0.1 (few-shot) | 7242 | 32 | 32768 | True | 2,657 ± 524 / 880 ± 278 | 3.67 | 47.24 ± 2.54 / 37.77 ± 3.87 | 1.35 ± 1.70 / 39.37 ± 3.87 | 25.70 ± 5.36 / 49.31 ± 5.21 | 61.96 ± 3.10 / 16.11 ± 1.80 | 10.25 ± 0.96 / 32.89 ± 0.83 | 1.99 ± 2.95 / 54.48 ± 1.27 | 9.1.2 | 9.1.2 | 12.5.1 | 11.0.0 | 12.10.8 | 12.1.0 |
| mistralai/Mistral-7B-v0.3 (few-shot) | 7248 | 33 | 32768 | True | 4,120 ± 976 / 926 ± 306 | 3.67 | 44.68 ± 3.50 / 36.20 ± 4.20 | 0.12 ± 1.68 / 35.09 ± 1.17 | 25.52 ± 5.24 / 49.15 ± 5.21 | 61.40 ± 2.38 / 14.90 ± 1.60 | 10.25 ± 1.54 / 32.81 ± 1.22 | 5.24 ± 1.65 / 52.80 ± 2.41 | 12.10.4 | 12.10.4 | 12.10.4 | 12.10.5 | 12.10.8 | 12.10.4 |
| occiglot/occiglot-7b-eu5-instruct (few-shot) | 7242 | 32 | 32768 | False | 2,088 ± 352 / 706 ± 214 | 3.83 | 40.71 ± 2.93 / 34.57 ± 4.02 | 0.71 ± 2.00 / 36.90 ± 2.10 | 20.66 ± 3.67 / 45.91 ± 3.45 | 65.25 ± 0.97 / 19.09 ± 1.05 | 5.35 ± 1.32 / 28.11 ± 1.13 | 0.35 ± 2.49 / 51.16 ± 2.74 | 12.5.2 | 12.3.1 | 12.4.0 | 12.4.0 | 12.10.8 | 12.3.1 |
| mistralai/Mistral-7B-Instruct-v0.2 (few-shot) | 7242 | 32 | 32768 | False | 634 ± 179 / 110 ± 35 | 3.85 | 43.11 ± 2.23 / 29.34 ± 3.27 | 3.40 ± 1.87 / 48.75 ± 1.47 | 19.18 ± 3.69 / 49.62 ± 2.59 | 65.01 ± 1.51 / 18.34 ± 1.35 | 5.49 ± 1.98 / 28.73 ± 1.39 | 0.24 ± 0.71 / 38.95 ± 0.84 | 9.2.0 | 9.3.1 | 12.4.0 | 12.4.0 | 12.10.8 | 12.1.0 |
| timpal0l/Mistral-7B-v0.1-flashback-v2 (few-shot) | 7242 | 32 | 32768 | True | 5,054 ± 1,200 / 1,056 ± 339 | 3.88 | 36.47 ± 4.24 / 30.33 ± 3.70 | 2.54 ± 1.29 / 50.66 ± 0.62 | 18.66 ± 4.26 / 38.73 ± 3.66 | 63.68 ± 1.75 / 16.38 ± 1.24 | 5.12 ± 1.30 / 28.85 ± 0.99 | 8.30 ± 1.28 / 57.35 ± 0.75 | 12.5.3 | 12.5.3 | 12.5.3 | 12.5.3 | 12.10.8 | 12.5.3 |
| occiglot/occiglot-7b-eu5 (few-shot) | 7242 | 32 | 32768 | True | 2,219 ± 427 / 717 ± 224 | 3.92 | 40.08 ± 2.82 / 37.15 ± 4.07 | 1.59 ± 1.86 / 39.93 ± 4.19 | 15.98 ± 3.74 / 39.67 ± 3.36 | 62.55 ± 3.03 / 15.26 ± 2.31 | 5.98 ± 1.66 / 28.18 ± 1.30 | -0.51 ± 1.95 / 47.23 ± 2.39 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 | 12.10.8 | 12.2.0 |
| meta-llama/Llama-2-13b-hf (few-shot) | 13016 | 32 | 4096 | True | 2,898 ± 637 / 736 ± 236 | 3.93 | 36.56 ± 3.03 / 34.23 ± 3.45 | 1.73 ± 1.56 / 45.02 ± 2.05 | 21.97 ± 4.26 / 46.27 ± 3.94 | 63.50 ± 2.92 / 16.51 ± 2.19 | 5.60 ± 1.25 / 29.83 ± 1.00 | -5.80 ± 2.91 / 46.82 ± 2.65 | 12.10.5 | 12.10.4 | 12.10.5 | 12.10.5 | 12.10.8 | 12.10.4 |
| mistralai/Mistral-7B-Instruct-v0.1 (few-shot) | 7242 | 32 | 32768 | False | 634 ± 179 / 110 ± 35 | 3.93 | 36.04 ± 2.59 / 24.74 ± 2.79 | -0.36 ± 1.36 / 33.94 ± 0.32 | 18.06 ± 3.16 / 42.57 ± 2.89 | 62.80 ± 1.69 / 15.23 ± 1.01 | 5.44 ± 1.14 / 28.13 ± 1.06 | 6.35 ± 2.71 / 50.49 ± 1.57 | 9.3.1 | 9.3.1 | 12.4.0 | 12.4.0 | 12.10.8 | 12.1.0 |
| google/gemma-7b-it (few-shot) | 8538 | 256 | 8192 | False | 1,792 ± 249 / 668 ± 203 | 3.96 | 37.69 ± 3.97 / 34.52 ± 3.74 | 3.11 ± 1.49 / 48.48 ± 2.77 | 18.34 ± 2.07 / 43.26 ± 1.28 | 63.71 ± 1.04 / 16.63 ± 0.98 | 7.70 ± 1.44 / 28.98 ± 1.20 | -5.17 ± 2.94 / 53.84 ± 1.87 | 12.10.0 | 12.10.0 | 12.10.0 | 12.10.0 | 12.10.8 | 12.10.0 |
| meta-llama/Llama-2-7b-chat-hf (few-shot) | 6738 | 32 | 4096 | False | 2,643 ± 455 / 800 ± 247 | 3.98 | 41.10 ± 3.35 / 40.54 ± 3.19 | -1.07 ± 2.09 / 44.83 ± 2.20 | 16.13 ± 2.52 / 39.51 ± 1.98 | 62.30 ± 0.90 / 13.28 ± 1.36 | 3.16 ± 0.79 / 27.40 ± 0.79 | 1.84 ± 2.19 / 43.79 ± 0.73 | 9.3.1 | 9.3.1 | 12.4.0 | 12.4.0 | 12.10.8 | 12.1.0 |
| AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct (few-shot) | 7111 | 64 | 2048 | True | 2,383 ± 451 / 718 ± 221 | 4.03 | 19.39 ± 1.31 / 19.04 ± 1.36 | 0.01 ± 1.49 / 34.61 ± 0.73 | 20.92 ± 3.41 / 51.75 ± 2.10 | 66.55 ± 0.34 / 18.80 ± 0.51 | 6.02 ± 1.50 / 29.00 ± 1.28 | -2.37 ± 1.71 / 39.83 ± 3.57 | 12.7.0 | 12.7.0 | 12.4.0 | 12.4.0 | 12.10.8 | 12.7.0 |
| LumiOpen/Viking-13B (few-shot) | 14030 | 131 | 4097 | True | 840 ± 79 / 400 ± 124 | 4.03 | 26.28 ± 5.09 / 22.73 ± 3.35 | 2.17 ± 1.98 / 48.03 ± 2.64 | 22.65 ± 3.50 / 45.48 ± 2.99 | 62.07 ± 1.73 / 11.04 ± 1.91 | -0.07 ± 0.85 / 23.51 ± 0.95 | -2.92 ± 4.86 / 53.55 ± 3.32 | 12.10.5 | 12.10.5 | 12.10.5 | 12.10.5 | 12.10.8 | 12.10.5 |
| microsoft/Phi-3-mini-4k-instruct (few-shot) | 3821 | 32 | 4096 | True | 5,224 ± 1,371 / 1,063 ± 358 | 4.04 | 33.05 ± 4.29 / 29.75 ± 3.67 | 0.71 ± 1.18 / 34.80 ± 0.88 | 17.23 ± 2.51 / 39.88 ± 1.59 | 60.08 ± 1.45 / 13.80 ± 0.81 | 2.67 ± 1.35 / 27.04 ± 1.07 | 2.74 ± 2.28 / 53.18 ± 0.93 | 12.10.5 | 12.10.5 | 12.10.5 | 12.10.5 | 12.10.8 | 12.10.5 |
| bineric/NorskGPT-Llama-7B-v0.1 (few-shot) | 6738 | 32 | 4096 | False | 5,384 ± 879 / 1,746 ± 553 | 4.07 | 34.62 ± 4.64 / 33.25 ± 4.37 | -0.24 ± 1.43 / 33.75 ± 0.31 | 18.10 ± 1.85 / 43.52 ± 0.87 | 61.81 ± 0.98 / 15.04 ± 0.70 | 3.06 ± 1.01 / 28.01 ± 0.86 | -1.90 ± 2.28 / 44.34 ± 1.19 | 12.5.2 | 12.3.2 | 12.3.2 | 12.3.2 | 12.10.8 | 12.3.2 |
| meta-llama/Llama-2-7b-hf (few-shot) | 6738 | 32 | 4096 | True | 930 ± 310 / 128 ± 43 | 4.07 | 32.71 ± 2.77 / 32.17 ± 2.13 | 0.66 ± 1.75 / 40.36 ± 4.19 | 18.04 ± 4.05 / 41.40 ± 3.27 | 60.73 ± 3.02 / 14.02 ± 1.57 | 3.65 ± 1.33 / 26.91 ± 1.00 | -0.00 ± 2.41 / 44.93 ± 0.92 | 9.2.0 | 9.2.0 | 12.5.1 | 11.0.0 | 12.10.8 | 12.1.0 |
| allenai/OLMo-1.7-7B-hf (few-shot) | 6888 | 50 | 4096 | True | 3,371 ± 876 / 561 ± 184 | 4.08 | 27.44 ± 3.28 / 23.50 ± 3.29 | 1.46 ± 1.18 / 41.20 ± 3.25 | 17.26 ± 3.80 / 39.06 ± 3.00 | 58.30 ± 4.50 / 14.03 ± 2.00 | 2.14 ± 1.25 / 25.83 ± 1.11 | 2.43 ± 2.93 / 55.15 ± 1.40 | 12.10.5 | 12.10.4 | 12.10.5 | 12.10.5 | 12.10.8 | 12.10.4 |
| microsoft/Phi-3-mini-128k-instruct (few-shot) | 3821 | 32 | 131072 | True | 7,312 ± 1,668 / 1,609 ± 525 | 4.09 | 27.22 ± 3.65 / 24.21 ± 2.67 | 1.31 ± 1.28 / 39.67 ± 4.39 | 17.24 ± 2.72 / 41.15 ± 1.57 | 62.00 ± 1.66 / 15.80 ± 1.12 | 1.85 ± 1.18 / 26.88 ± 0.86 | 1.06 ± 2.84 / 47.19 ± 2.14 | 12.9.1 | 12.9.1 | 12.9.1 | 12.10.0 | 12.10.8 | 12.10.0 |
| LumiOpen/Viking-7B (few-shot) | 7550 | 131 | 4096 | True | 4,969 ± 1,109 / 1,134 ± 374 | 4.11 | 21.41 ± 5.01 / 19.94 ± 5.01 | 1.76 ± 1.62 / 42.86 ± 2.68 | 22.54 ± 1.74 / 44.93 ± 1.92 | 57.33 ± 2.62 / 10.47 ± 1.06 | 0.97 ± 1.29 / 26.73 ± 0.99 | -0.36 ± 2.40 / 44.44 ± 1.08 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 | 12.10.8 | 12.7.0 |
| norallm/normistral-7b-warm-instruct (few-shot) | 7248 | 33 | 4096 | True | 6,194 ± 949 / 1,967 ± 619 | 4.11 | 36.59 ± 3.56 / 27.50 ± 2.53 | 0.86 ± 2.41 / 36.44 ± 1.27 | 14.58 ± 2.13 / 37.44 ± 1.86 | 61.99 ± 1.16 / 15.07 ± 0.81 | 1.48 ± 1.63 / 23.23 ± 0.93 | -0.98 ± 2.63 / 56.13 ± 0.62 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 | 12.10.8 | 12.7.0 |
| tollefj/nordavind-7b-instruct-warm (few-shot) | 7248 | 33 | 2048 | False | 6,450 ± 961 / 2,082 ± 658 | 4.11 | 34.76 ± 4.42 / 23.42 ± 2.33 | 0.77 ± 1.05 / 39.63 ± 2.41 | 12.80 ± 2.37 / 30.77 ± 2.12 | 61.23 ± 1.78 / 15.53 ± 0.95 | 2.01 ± 0.96 / 25.83 ± 1.02 | -0.76 ± 3.69 / 53.64 ± 2.57 | 12.5.2 | 12.3.2 | 12.4.0 | 12.4.0 | 12.10.8 | 12.3.2 |
| HPLT/gpt-7b-nordic-prerelease (few-shot) | 7550 | 131 | 4096 | True | 5,404 ± 931 / 1,638 ± 542 | 4.14 | 27.96 ± 3.08 / 25.78 ± 3.20 | -0.00 ± 1.28 / 35.53 ± 1.87 | 23.17 ± 2.78 / 44.72 ± 2.82 | 55.57 ± 4.13 / 9.41 ± 1.58 | 0.94 ± 1.26 / 22.66 ± 0.57 | -2.72 ± 3.17 / 53.79 ± 1.42 | 12.5.2 | 12.3.2 | 12.3.2 | 12.3.2 | 12.10.8 | 12.3.2 |
| AI-Sweden-Models/gpt-sw3-1.3b (few-shot) | 1445 | 64 | 2048 | True | 4,608 ± 988 / 1,115 ± 354 | 4.17 | 1.42 ± 1.60 / 3.11 ± 1.85 | 0.75 ± 0.73 / 45.87 ± 2.20 | 23.33 ± 2.22 / 45.28 ± 1.58 | 64.23 ± 1.78 / 15.08 ± 2.03 | 0.40 ± 1.88 / 23.35 ± 1.22 | 0.68 ± 4.15 / 50.85 ± 2.65 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 | 12.10.8 | 12.7.0 |
| Qwen/Qwen1.5-4B-Chat (few-shot) | 3950 | 152 | 32768 | False | 4,347 ± 893 / 1,135 ± 365 | 4.18 | 25.65 ± 2.99 / 22.30 ± 2.30 | -0.35 ± 2.01 / 44.36 ± 4.13 | 14.46 ± 2.66 / 32.31 ± 1.66 | 62.11 ± 2.22 / 14.98 ± 1.53 | 4.50 ± 1.47 / 28.86 ± 1.09 | -1.89 ± 2.66 / 43.72 ± 0.92 | 12.5.2 | 12.1.0 | 12.5.2 | 12.1.0 | 12.10.8 | 12.1.0 |
| AI-Sweden-Models/gpt-sw3-356m-instruct (few-shot) | 471 | 64 | 2048 | True | 5,855 ± 1,373 / 1,223 ± 391 | 4.24 | 17.79 ± 1.18 / 18.12 ± 1.18 | 0.08 ± 0.15 / 33.73 ± 0.26 | 15.04 ± 2.53 / 34.77 ± 1.72 | 59.45 ± 1.99 / 12.89 ± 1.04 | 1.06 ± 1.26 / 22.59 ± 0.63 | 5.69 ± 2.26 / 56.71 ± 0.87 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 | 12.10.8 | 12.7.0 |
| 01-ai/Yi-6B (few-shot) | 6061 | 64 | 4096 | True | 6,435 ± 1,316 / 1,632 ± 549 | 4.28 | 0.00 ± 0.00 / 0.00 ± 0.00 | 2.12 ± 1.40 / 38.45 ± 2.47 | 16.91 ± 2.57 / 40.63 ± 2.83 | 60.02 ± 3.15 / 14.22 ± 1.52 | 4.35 ± 1.25 / 28.94 ± 1.09 | 0.72 ± 2.33 / 52.54 ± 2.18 | 9.3.2 | 10.0.0 | 12.5.1 | 12.0.0 | 12.10.8 | 12.1.0 |
| google/gemma-2b (few-shot) | 2506 | 256 | 8192 | True | 6,087 ± 1,046 / 1,902 ± 563 | 4.28 | 8.83 ± 5.85 / 9.93 ± 4.70 | 0.31 ± 1.95 / 45.42 ± 3.51 | 16.08 ± 2.91 / 37.41 ± 2.44 | 60.00 ± 2.62 / 13.07 ± 1.31 | 2.52 ± 1.20 / 26.23 ± 1.40 | 0.00 ± 2.53 / 56.42 ± 0.98 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 | 12.10.8 | 12.1.0 |
| Qwen/Qwen1.5-4B (few-shot) | 3950 | 152 | 32768 | True | 3,248 ± 739 / 761 ± 252 | 4.34 | 15.66 ± 5.89 / 15.78 ± 3.95 | -0.55 ± 1.06 / 39.57 ± 3.61 | 14.11 ± 3.08 / 34.56 ± 2.38 | 57.17 ± 3.07 / 11.73 ± 1.00 | 5.46 ± 1.45 / 29.23 ± 1.11 | -1.71 ± 3.79 / 50.88 ± 1.29 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 | 12.10.8 | 12.1.0 |
| google/gemma-2b-it (few-shot) | 2506 | 256 | 8192 | False | 6,471 ± 1,142 / 1,961 ± 584 | 4.35 | 20.49 ± 2.30 / 18.33 ± 1.40 | -0.01 ± 2.13 / 46.02 ± 2.71 | 10.95 ± 2.39 / 37.64 ± 0.75 | 59.16 ± 0.96 / 9.92 ± 1.05 | 0.45 ± 1.44 / 22.94 ± 0.76 | 0.62 ± 1.42 / 56.02 ± 0.95 | 12.5.2 | 12.1.0 | 12.4.0 | 12.4.0 | 12.10.8 | 12.1.0 |
| Qwen/Qwen1.5-1.8B-Chat (few-shot) | 1837 | 152 | 32768 | False | 8,304 ± 1,846 / 1,933 ± 617 | 4.45 | 14.15 ± 1.92 / 14.96 ± 2.11 | 0.78 ± 1.70 / 44.74 ± 3.57 | 7.80 ± 1.32 / 23.47 ± 1.64 | 57.27 ± 1.42 / 10.43 ± 0.97 | 1.62 ± 1.18 / 25.09 ± 1.06 | 1.92 ± 2.32 / 50.07 ± 2.68 | 12.5.2 | 12.1.0 | 12.5.0 | 12.5.0 | 12.10.8 | 12.1.0 |
| timpal0l/Mistral-7B-v0.1-flashback-v2-instruct (few-shot) | 7242 | 32 | 32768 | False | 5,172 ± 813 / 1,647 ± 518 | 4.45 | 24.98 ± 5.71 / 25.35 ± 4.78 | 1.18 ± 1.09 / 39.01 ± 2.76 | 8.52 ± 2.30 / 21.32 ± 2.25 | 39.94 ± 9.39 / 5.18 ± 1.53 | 4.83 ± 1.40 / 29.14 ± 1.10 | 4.70 ± 2.96 / 56.56 ± 0.97 | 12.5.2 | 12.3.2 | 12.3.2 | 12.3.2 | 12.10.8 | 12.3.2 |
| Qwen/Qwen1.5-1.8B (few-shot) | 1837 | 152 | 32768 | True | 5,666 ± 1,328 / 1,256 ± 408 | 4.46 | 12.26 ± 4.13 / 12.77 ± 3.60 | 0.94 ± 1.34 / 40.66 ± 3.73 | 6.31 ± 1.01 / 20.24 ± 2.02 | 55.32 ± 3.49 / 8.91 ± 1.05 | 3.65 ± 1.45 / 26.36 ± 0.92 | 1.13 ± 3.74 / 52.30 ± 2.26 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 | 12.10.8 | 12.1.0 |
| Qwen/Qwen1.5-0.5B-Chat (few-shot) | 620 | 152 | 32768 | False | 11,740 ± 3,000 / 2,209 ± 721 | 4.52 | 9.50 ± 3.17 / 9.41 ± 3.40 | 1.76 ± 1.62 / 38.51 ± 3.72 | 3.14 ± 0.71 / 17.84 ± 2.26 | 58.92 ± 1.57 / 10.09 ± 1.41 | -1.28 ± 1.48 / 24.82 ± 0.92 | 1.48 ± 3.56 / 53.95 ± 2.45 | 12.5.2 | 12.1.0 | 12.5.0 | 12.5.0 | 12.10.8 | 12.1.0 |
| Qwen/Qwen1.5-0.5B (few-shot) | 620 | 152 | 32768 | True | 11,371 ± 2,924 / 2,122 ± 692 | 4.55 | 16.20 ± 1.52 / 16.96 ± 1.71 | -0.57 ± 1.20 / 41.25 ± 3.51 | 3.31 ± 0.82 / 16.86 ± 2.98 | 56.00 ± 3.13 / 10.05 ± 0.73 | 1.96 ± 1.72 / 25.55 ± 1.37 | 0.85 ± 1.91 / 52.12 ± 2.92 | 12.5.2 | 12.1.0 | 12.1.0 | 12.1.0 | 12.10.8 | 12.1.0 |
| RJuro/kanelsnegl-v0.2 (few-shot) | 7242 | 32 | 512 | True | 1,373 ± 120 / 709 ± 172 | 4.64 | 23.67 ± 5.16 / 23.19 ± 4.37 | 0.00 ± 0.00 / 33.69 ± 0.28 | 0.00 ± 0.00 / 14.61 ± 2.02 | 50.54 ± 0.14 / 3.11 ± 0.06 | 0.00 ± 0.00 / 22.04 ± 0.48 | 0.00 ± 0.00 / 56.52 ± 0.89 | 12.7.0 | 12.7.0 | 12.7.0 | 12.7.0 | 12.10.8 | 12.7.0 |
| RuterNorway/Llama-2-7b-chat-norwegian (few-shot) | 6738 | 32 | 4096 | False | 10,890 ± 2,686 / 2,186 ± 750 | 4.67 | 9.48 ± 1.48 / 10.10 ± 1.44 | 0.07 ± 1.06 / 43.54 ± 3.63 | 1.04 ± 0.96 / 7.35 ± 3.52 | 55.16 ± 1.26 / 10.52 ± 1.13 | -0.80 ± 2.00 / 23.89 ± 0.86 | -0.16 ± 0.86 / 32.02 ± 2.77 | 9.3.1 | 9.3.1 | 12.5.2 | 12.4.0 | 12.10.8 | 12.1.0 |
| NorGLM/NorGPT-369M (few-shot) | unknown | 64 | 2048 | True | 19,896 ± 5,099 / 3,848 ± 1,251 | 4.93 | 1.68 ± 1.40 / 1.54 ± 1.28 | -1.38 ± 1.13 / 34.41 ± 2.16 | 0.08 ± 0.09 / 10.05 ± 2.08 | 44.02 ± 1.31 / 6.35 ± 0.43 | 0.15 ± 1.45 / 23.95 ± 1.29 | 0.28 ± 1.39 / 32.09 ± 2.15 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 | 12.10.8 | 12.5.2 |
| Sigurdur/icebreaker (few-shot) | 110 | 32 | 1024 | False | 48,619 ± 7,681 / 13,831 ± 4,404 | 4.94 | 0.00 ± 0.00 / 0.00 ± 0.00 | 0.00 ± 0.00 / 33.69 ± 0.28 | 0.00 ± 0.00 / 3.90 ± 0.28 | 44.80 ± 0.65 / 3.34 ± 0.08 | 0.23 ± 0.75 / 22.11 ± 0.48 | 0.38 ± 0.75 / 56.53 ± 0.89 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 | 12.10.8 | 12.5.2 |
| Sigurdur/icechat (few-shot) | 110 | 32 | 1024 | False | 49,558 ± 7,930 / 13,921 ± 4,425 | 4.97 | 0.00 ± 0.00 / 0.00 ± 0.00 | 0.00 ± 0.00 / 33.69 ± 0.28 | 0.00 ± 0.00 / 0.64 ± 0.34 | 42.46 ± 0.47 / 3.58 ± 0.45 | 0.00 ± 0.00 / 22.04 ± 0.48 | 0.00 ± 0.00 / 56.52 ± 0.89 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 | 12.10.8 | 12.5.2 |
| ai-forever/mGPT (few-shot) | unknown | 100 | 2048 | True | 11,734 ± 3,124 / 2,174 ± 720 | 5.30 | 0.00 ± 0.00 / 0.00 ± 0.00 | 0.00 ± 0.00 / 33.69 ± 0.28 | 0.00 ± 0.00 / 0.05 ± 0.03 | 17.11 ± 1.37 / 0.96 ± 0.09 | -0.02 ± 1.16 / 22.75 ± 0.50 | 0.47 ± 4.14 / 46.93 ± 3.13 | 9.3.1 | 11.0.0 | 12.5.1 | 12.0.0 | 12.10.8 | 12.1.0 |
| Sigurdur/jonas-hallgrimsson-gpt2 (few-shot) | 125 | 51 | 512 | False | 32,644 ± 3,887 / 11,289 ± 3,585 | 5.56 | 0.00 ± 0.00 / 0.00 ± 0.00 | 0.00 ± 0.00 / 33.69 ± 0.28 | 0.00 ± 0.00 / 0.00 ± 0.00 | 0.00 ± 0.00 / 0.00 ± 0.00 | -0.26 ± 0.90 / 22.02 ± 0.48 | -0.01 ± 1.21 / 55.08 ± 0.99 | 12.5.2 | 12.5.2 | 12.5.2 | 12.5.2 | 12.10.8 | 12.5.2 |
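
Each benchmark cell above packs two metrics, each reported as a mean with an uncertainty estimate ("primary ± err / secondary ± err"). Below is a minimal sketch of how those cells might be split into plain numeric columns, assuming the leaderboard's CSV export has been saved locally as `icelandic-nlg.csv` (a hypothetical filename) with the same column headers as the table; adjust the path and names to the actual export.

```python
import re

import pandas as pd

# Hypothetical local path for the leaderboard's CSV export; column headers
# are assumed to match the table above ("Model ID", "Rank", benchmark names, ...).
df = pd.read_csv("icelandic-nlg.csv")

# Matches cells of the form "86.37 ± 1.19 / 82.25 ± 2.73" (negative means allowed).
SCORE_RE = re.compile(
    r"(-?\d+\.?\d*)\s*±\s*(\d+\.?\d*)\s*/\s*(-?\d+\.?\d*)\s*±\s*(\d+\.?\d*)"
)


def split_scores(cell: str) -> tuple[float, float, float, float]:
    """Return (primary mean, primary err, secondary mean, secondary err) for one cell."""
    match = SCORE_RE.fullmatch(cell.strip())
    if match is None:
        raise ValueError(f"Unexpected score format: {cell!r}")
    primary, primary_err, secondary, secondary_err = map(float, match.groups())
    return primary, primary_err, secondary, secondary_err


benchmarks = ["MIM-GOLD-NER", "ScaLA-is", "NQiI", "RRN", "ARC-is", "Winogrande-is"]
for name in benchmarks:
    parts = [split_scores(cell) for cell in df[name]]
    df[f"{name} (primary)"] = [p[0] for p in parts]
    df[f"{name} (secondary)"] = [p[2] for p in parts]

# Example: the five best-ranked models with their primary MIM-GOLD-NER score.
print(df.nsmallest(5, "Rank")[["Model ID", "Rank", "MIM-GOLD-NER (primary)"]])
```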