Swedish NLG 🇸🇪

Last updated: 25 July 2024, 22:31:06 CET
Columns: Model ID | Parameters | Vocabulary size | Context | Commercial | Speed | Rank | SUC3 | SweReC | ScaLA-sv | ScandiQA-sv | SweDN | MMLU-sv | HellaSwag-sv | SUC3 version | SweReC version | ScaLA-sv version | ScandiQA-sv version | SweDN version | MMLU-sv version | HellaSwag-sv version

Each of the seven dataset columns reports two metrics per model, formatted as "score ± error / score ± error"; the Speed column likewise reports two measurements (short / long inputs). The trailing seven columns give the benchmark version used for each dataset.
gpt-4-0613 (few-shot, val) unknown 100 8191 True 597 ± 197 / 93 ± 33 1.19 76.86 ± 1.89 / 54.97 ± 4.44 79.19 ± 1.87 / 80.56 ± 1.82 80.93 ± 1.67 / 89.90 ± 0.93 53.81 ± 1.28 / 65.15 ± 1.11 67.83 ± 0.15 / 22.67 ± 0.39 72.53 ± 1.82 / 79.26 ± 1.44 85.67 ± 2.59 / 89.14 ± 2.05 0.0.0 0.0.0 0.0.0 12.9.0 9.3.2 9.3.2 9.3.2
gpt-4o-2024-05-13 (few-shot, val) unknown 200 127999 True 916 ± 329 / 114 ± 38 1.26 76.66 ± 2.11 / 60.34 ± 4.71 77.16 ± 2.65 / 79.22 ± 2.36 68.99 ± 4.33 / 83.37 ± 2.61 57.96 ± 1.35 / 67.71 ± 0.96 66.00 ± 0.29 / 16.97 ± 0.45 70.70 ± 1.32 / 77.97 ± 0.99 86.30 ± 2.23 / 89.65 ± 1.68 12.10.0 12.10.2 12.10.2 12.10.0 12.10.0 12.10.2 12.10.2
gpt-4-1106-preview (few-shot, val) unknown 100 127999 True 576 ± 221 / 81 ± 28 1.28 74.45 ± 3.09 / 49.97 ± 4.23 77.59 ± 1.38 / 78.78 ± 1.69 71.35 ± 3.10 / 83.98 ± 2.23 56.56 ± 1.39 / 66.76 ± 1.10 66.08 ± 0.14 / 17.19 ± 0.38 71.32 ± 1.56 / 78.48 ± 1.15 84.09 ± 2.99 / 88.01 ± 2.26 12.10.0 12.10.2 12.10.2 12.10.0 12.10.0 12.10.2 12.10.2
AI-Sweden-Models/Llama-3-8B-instruct (few-shot) 8030 128 8192 False 4,314 ± 1,202 / 776 ± 245 1.41 88.78 ± 1.05 / 85.30 ± 1.16 81.73 ± 0.71 / 80.87 ± 0.96 75.83 ± 0.49 / 87.67 ± 0.30 61.35 ± 0.70 / 65.90 ± 0.55 66.96 ± 0.09 / 23.17 ± 0.18 41.07 ± 0.83 / 55.49 ± 0.64 75.73 ± 0.72 / 81.68 ± 0.55 12.10.4 12.10.4 12.10.4 12.10.4 12.10.4 12.10.4 12.10.4
meta-llama/Meta-Llama-3-70B (few-shot, val) 70554 128 8192 True 312 ± 55 / 177 ± 51 1.46 74.61 ± 2.99 / 56.50 ± 6.30 78.61 ± 1.40 / 78.64 ± 1.53 63.20 ± 3.34 / 80.61 ± 2.52 61.98 ± 1.65 / 66.85 ± 1.42 67.60 ± 0.41 / 22.47 ± 0.82 61.55 ± 1.68 / 71.02 ± 1.21 66.21 ± 3.22 / 73.40 ± 2.77 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
152334H/miqu-1-70b-sf (few-shot, val) 68977 32 32764 True 2,126 ± 676 / 319 ± 104 1.84 62.96 ± 3.44 / 52.14 ± 4.04 75.25 ± 2.41 / 78.80 ± 1.96 53.28 ± 3.33 / 75.37 ± 1.80 56.42 ± 1.65 / 65.04 ± 1.17 67.60 ± 0.30 / 21.60 ± 0.76 53.56 ± 2.20 / 64.88 ± 1.66 59.70 ± 4.69 / 69.38 ± 3.70 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
meta-llama/Meta-Llama-3-70B-Instruct (few-shot, val) 70554 128 8192 True 1,673 ± 583 / 275 ± 85 1.88 77.06 ± 2.72 / 67.75 ± 5.69 53.56 ± 7.15 / 67.07 ± 3.93 47.50 ± 3.37 / 71.31 ± 2.69 46.86 ± 1.77 / 60.96 ± 1.04 68.25 ± 0.18 / 22.84 ± 0.44 61.31 ± 2.07 / 70.78 ± 1.60 66.73 ± 2.34 / 74.65 ± 1.78 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
gpt-3.5-turbo-0613 (few-shot, val) unknown 100 4095 True 921 ± 293 / 113 ± 37 1.91 73.04 ± 2.74 / 61.64 ± 3.63 72.77 ± 2.64 / 72.56 ± 2.45 58.06 ± 3.84 / 76.06 ± 2.51 58.02 ± 2.11 / 66.84 ± 1.38 66.92 ± 0.16 / 19.00 ± 0.28 40.73 ± 3.36 / 55.16 ± 2.75 50.51 ± 2.33 / 62.07 ± 1.95 0.0.0 0.0.0 0.0.0 12.9.0 11.0.0 0.0.0 0.0.0
meta-llama/Llama-2-70b-hf (few-shot, val) 68977 32 4096 True 1,892 ± 650 / 318 ± 105 1.91 64.76 ± 3.91 / 61.08 ± 5.41 75.46 ± 1.99 / 74.35 ± 3.70 43.27 ± 5.03 / 65.62 ± 4.94 63.04 ± 1.52 / 66.95 ± 1.31 68.43 ± 0.33 / 24.92 ± 0.74 46.16 ± 2.67 / 59.53 ± 2.04 50.41 ± 5.18 / 62.34 ± 3.86 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
four-two-labs/lynx-micro (few-shot) 2506 256 8192 False 10,162 ± 2,661 / 1,878 ± 623 1.93 83.64 ± 1.90 / 81.69 ± 1.52 78.57 ± 0.78 / 78.35 ± 0.90 66.83 ± 0.98 / 83.20 ± 0.51 58.84 ± 0.54 / 62.97 ± 0.52 64.40 ± 0.05 / 18.09 ± 0.11 20.47 ± 0.72 / 40.39 ± 0.55 49.92 ± 0.92 / 62.45 ± 0.69 12.10.2 12.10.2 12.10.2 12.10.2 12.10.2 12.10.2 12.10.2
gpt-4o-mini-2024-07-18 (few-shot) unknown 200 127999 True 1,171 ± 378 / 120 ± 39 2.08 67.38 ± 1.52 / 52.86 ± 4.83 65.51 ± 4.81 / 67.01 ± 3.84 29.41 ± 10.34 / 52.38 ± 8.76 55.67 ± 0.89 / 66.26 ± 0.48 65.70 ± 0.71 / 16.98 ± 0.75 50.06 ± 2.65 / 61.33 ± 2.36 64.58 ± 2.48 / 72.57 ± 2.06 12.11.0 12.11.0 12.11.0 12.11.0 12.11.0 12.11.0 12.11.0
timpal0l/sol (few-shot) 10732 32 4096 False 3,701 ± 876 / 771 ± 247 2.10 57.51 ± 2.30 / 37.74 ± 3.15 77.31 ± 1.01 / 70.55 ± 2.26 25.06 ± 5.02 / 49.04 ± 4.68 60.16 ± 1.77 / 67.43 ± 1.02 65.22 ± 0.19 / 18.86 ± 0.30 39.52 ± 0.58 / 54.47 ± 0.38 70.93 ± 1.29 / 78.03 ± 1.01 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
upstage/SOLAR-10.7B-v1.0 (few-shot) 10732 32 4096 True 3,780 ± 906 / 799 ± 261 2.13 59.65 ± 2.22 / 39.33 ± 3.33 77.48 ± 1.23 / 70.13 ± 2.81 16.94 ± 2.36 / 40.98 ± 1.82 62.65 ± 0.56 / 68.15 ± 0.56 65.19 ± 0.36 / 19.09 ± 0.55 39.82 ± 0.69 / 54.84 ± 0.53 68.87 ± 1.35 / 76.46 ± 1.04 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3
Nexusflow/Starling-LM-7B-beta (few-shot) 7242 32 8192 False 5,876 ± 1,021 / 1,677 ± 546 2.27 60.38 ± 1.60 / 36.17 ± 3.66 77.49 ± 0.98 / 72.07 ± 1.56 29.32 ± 2.34 / 54.43 ± 2.67 56.79 ± 0.83 / 65.84 ± 0.48 65.75 ± 0.16 / 20.23 ± 0.23 36.05 ± 1.10 / 51.86 ± 0.87 51.15 ± 1.71 / 63.20 ± 1.32 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
timpal0l/dolphin-2.9-llama3-8b-flashback (few-shot, val) 8030 128 8192 False 5,018 ± 1,216 / 996 ± 324 2.36 65.33 ± 2.38 / 46.88 ± 3.97 74.99 ± 3.45 / 76.76 ± 1.80 32.65 ± 5.08 / 61.25 ± 4.41 55.71 ± 1.34 / 64.54 ± 1.00 66.53 ± 0.29 / 19.24 ± 0.57 33.16 ± 2.11 / 49.26 ± 1.55 32.51 ± 2.97 / 48.24 ± 2.10 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
RJuro/munin-neuralbeagle-7b (few-shot, val) 7242 32 32768 False 2,493 ± 466 / 773 ± 243 2.37 62.96 ± 2.62 / 51.99 ± 5.66 77.13 ± 2.43 / 78.36 ± 1.88 15.73 ± 7.07 / 47.41 ± 5.31 58.43 ± 1.59 / 65.06 ± 1.19 67.58 ± 0.22 / 22.52 ± 0.52 32.54 ± 2.61 / 49.30 ± 1.93 34.94 ± 3.79 / 50.39 ± 3.23 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
timpal0l/BeagleCatMunin (few-shot, val) 7242 32 32768 False 2,495 ± 458 / 775 ± 244 2.37 50.53 ± 3.30 / 37.77 ± 4.38 77.37 ± 2.25 / 78.66 ± 2.43 27.84 ± 4.72 / 49.46 ± 4.52 59.98 ± 1.65 / 65.44 ± 1.38 67.89 ± 0.44 / 23.94 ± 0.68 34.80 ± 2.79 / 50.82 ± 2.10 36.65 ± 5.07 / 51.56 ± 4.13 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
timpal0l/Llama-3-8B-flashback-v1 (few-shot) 8030 128 8192 True 4,807 ± 1,152 / 979 ± 319 2.37 59.03 ± 2.04 / 41.99 ± 4.35 81.13 ± 0.94 / 80.80 ± 1.09 33.06 ± 3.65 / 61.21 ± 3.26 58.21 ± 0.67 / 64.01 ± 0.68 64.42 ± 0.69 / 18.01 ± 0.55 35.24 ± 0.83 / 50.70 ± 0.57 25.14 ± 2.02 / 42.54 ± 1.99 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
four-two-labs/orpo-llama-3-swe (few-shot) 8030 128 8192 False 4,974 ± 1,208 / 1,032 ± 342 2.38 60.93 ± 2.85 / 38.87 ± 3.50 79.74 ± 0.68 / 75.13 ± 1.85 26.02 ± 4.38 / 52.19 ± 5.44 59.84 ± 0.92 / 65.92 ± 0.82 64.99 ± 0.26 / 18.65 ± 0.32 36.35 ± 1.03 / 51.91 ± 0.77 27.22 ± 2.24 / 45.02 ± 1.74 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
meta-llama/Meta-Llama-3-8B (few-shot) 8030 128 8192 True 4,687 ± 1,121 / 967 ± 313 2.40 60.36 ± 2.84 / 39.37 ± 3.56 79.74 ± 0.75 / 75.11 ± 1.91 28.24 ± 4.19 / 55.29 ± 5.35 59.73 ± 1.13 / 65.72 ± 0.94 64.81 ± 0.24 / 18.56 ± 0.35 35.86 ± 0.90 / 51.39 ± 0.69 26.49 ± 1.89 / 44.41 ± 1.56 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
mlabonne/NeuralBeagle14-7B (few-shot, val) 7242 32 8192 False 2,549 ± 472 / 784 ± 245 2.40 61.25 ± 3.35 / 50.76 ± 5.94 76.03 ± 2.11 / 78.25 ± 1.95 16.28 ± 4.81 / 49.04 ± 3.60 50.96 ± 2.34 / 60.05 ± 1.18 68.35 ± 0.32 / 24.05 ± 0.66 32.30 ± 2.48 / 48.98 ± 1.96 38.78 ± 5.70 / 52.89 ± 4.91 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
meta-llama/Llama-2-70b-chat-hf (few-shot, val) 68977 32 3843 True 1,979 ± 621 / 320 ± 105 2.41 55.91 ± 3.25 / 39.73 ± 4.94 64.52 ± 3.15 / 70.51 ± 2.49 23.85 ± 7.34 / 56.89 ± 6.08 58.88 ± 1.51 / 65.82 ± 1.07 67.57 ± 0.24 / 21.77 ± 0.61 37.60 ± 3.30 / 52.46 ± 2.31 31.78 ± 2.98 / 47.11 ± 2.71 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
birgermoell/Flashback-Bellman (few-shot, val) 7242 32 32768 False 2,887 ± 403 / 1,144 ± 345 2.43 55.29 ± 3.95 / 41.59 ± 4.48 78.29 ± 1.83 / 78.77 ± 2.06 18.45 ± 3.00 / 46.38 ± 2.81 58.42 ± 1.64 / 63.83 ± 1.18 67.54 ± 0.48 / 23.67 ± 0.69 29.44 ± 2.34 / 46.95 ± 1.75 37.45 ± 3.61 / 52.85 ± 2.76 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
google/gemma-7b (few-shot) 8538 256 8192 True 1,378 ± 260 / 387 ± 119 2.43 43.68 ± 3.65 / 29.40 ± 3.10 77.72 ± 4.50 / 77.58 ± 4.13 36.25 ± 2.66 / 65.08 ± 2.92 58.62 ± 0.98 / 64.23 ± 0.88 64.44 ± 0.29 / 18.70 ± 0.57 39.94 ± 1.17 / 53.25 ± 0.89 25.96 ± 3.94 / 40.33 ± 3.17 12.9.1 12.9.1 12.9.1 12.9.1 12.9.1 12.9.1 12.10.0
merge-crew/da-sv-task-arithmetic (few-shot, val) 7242 32 32768 True 2,500 ± 469 / 762 ± 238 2.43 47.28 ± 3.05 / 34.01 ± 3.73 76.62 ± 2.52 / 78.04 ± 2.98 33.23 ± 4.72 / 61.29 ± 4.67 60.00 ± 1.69 / 64.62 ± 1.44 66.68 ± 0.43 / 21.83 ± 0.75 29.95 ± 1.74 / 47.23 ± 1.24 31.12 ± 3.68 / 47.85 ± 2.80 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
merge-crew/da-sv-slerp (few-shot, val) 7242 32 32768 True 2,467 ± 469 / 762 ± 244 2.44 46.57 ± 3.34 / 33.94 ± 3.73 76.53 ± 2.55 / 77.96 ± 3.04 33.43 ± 3.89 / 61.87 ± 4.02 59.87 ± 1.52 / 64.53 ± 1.41 66.76 ± 0.41 / 21.92 ± 0.79 28.89 ± 2.22 / 46.41 ± 1.55 30.36 ± 2.80 / 47.15 ± 2.26 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
timpal0l/BeagleCatMunin2 (few-shot, val) 7242 32 32768 False 2,477 ± 459 / 767 ± 241 2.44 60.87 ± 3.71 / 47.40 ± 5.32 73.72 ± 2.20 / 67.79 ± 2.37 6.78 ± 4.34 / 35.90 ± 2.11 58.75 ± 1.46 / 65.08 ± 1.15 68.06 ± 0.31 / 23.91 ± 0.64 33.71 ± 2.28 / 50.08 ± 1.68 41.45 ± 3.36 / 55.51 ± 2.69 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
AI-Sweden-Models/tyr (few-shot, val) 7242 32 32768 False 6,079 ± 1,051 / 1,760 ± 570 2.48 56.21 ± 2.49 / 44.78 ± 4.19 78.30 ± 1.71 / 79.80 ± 2.03 14.35 ± 5.65 / 48.69 ± 4.30 61.08 ± 1.47 / 65.72 ± 1.07 67.96 ± 0.40 / 24.14 ± 0.74 31.74 ± 2.48 / 48.52 ± 1.88 30.12 ± 2.07 / 47.58 ± 1.29 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2
timpal0l/Mistral-7B-v0.1-flashback-v2 (few-shot) 7242 32 32768 True 5,054 ± 1,200 / 1,056 ± 339 2.49 44.14 ± 2.40 / 29.77 ± 4.06 80.14 ± 1.11 / 80.19 ± 0.78 34.23 ± 2.23 / 65.29 ± 2.17 57.07 ± 1.56 / 62.52 ± 1.11 65.15 ± 0.31 / 18.72 ± 0.57 33.24 ± 0.85 / 49.69 ± 0.67 25.50 ± 2.25 / 43.44 ± 2.03 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3 12.5.3
RJuro/munin-neuralbeagle-SkoleGPTOpenOrca-7b (few-shot, val) 7242 32 32768 False 3,008 ± 429 / 991 ± 323 2.50 59.36 ± 2.75 / 47.08 ± 4.17 72.04 ± 3.27 / 63.83 ± 2.07 22.38 ± 7.17 / 54.70 ± 5.49 57.96 ± 2.00 / 64.06 ± 1.76 65.13 ± 0.34 / 19.26 ± 0.43 29.81 ± 2.25 / 47.46 ± 1.72 35.59 ± 3.75 / 51.76 ± 2.63 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
bineric/NorskGPT-Llama3-8b (few-shot) 8030 128 8192 False 3,695 ± 1,277 / 532 ± 183 2.50 63.19 ± 2.83 / 51.22 ± 3.61 76.06 ± 0.64 / 61.59 ± 0.77 5.34 ± 1.42 / 34.32 ± 0.56 56.70 ± 0.87 / 66.00 ± 0.59 66.25 ± 0.16 / 20.28 ± 0.33 36.23 ± 0.91 / 51.68 ± 0.73 43.60 ± 1.33 / 56.86 ± 1.15 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
birgermoell/Rapid-Cycling (few-shot, val) 7242 32 32768 False 2,346 ± 450 / 666 ± 249 2.51 53.66 ± 3.57 / 41.97 ± 4.83 77.72 ± 2.51 / 78.40 ± 2.65 16.22 ± 4.46 / 43.17 ± 3.88 59.75 ± 1.13 / 64.72 ± 1.04 67.57 ± 0.48 / 23.65 ± 0.72 27.24 ± 2.07 / 45.51 ± 1.53 32.04 ± 4.21 / 48.67 ± 3.11 9.3.2 9.3.2 9.3.2 12.5.2 9.3.2 9.3.2 9.3.2
merge-crew/da-sv-dare-ties-density-0.9 (few-shot, val) 7242 32 32768 True 2,443 ± 458 / 750 ± 240 2.51 46.61 ± 3.11 / 34.10 ± 4.61 76.38 ± 2.01 / 78.30 ± 2.42 34.16 ± 4.39 / 60.06 ± 4.67 58.77 ± 1.76 / 63.50 ± 1.47 66.77 ± 0.46 / 22.42 ± 0.84 29.77 ± 2.44 / 46.25 ± 1.64 25.38 ± 3.56 / 39.34 ± 3.89 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
mhenrichsen/danskgpt-chat-v2.1 (few-shot) unknown 32 32768 True 5,085 ± 998 / 1,306 ± 404 2.51 54.37 ± 3.04 / 42.16 ± 4.00 75.98 ± 1.15 / 74.44 ± 1.12 17.98 ± 1.97 / 56.01 ± 2.08 55.07 ± 0.74 / 64.24 ± 0.61 64.42 ± 0.09 / 14.42 ± 0.51 32.81 ± 0.91 / 49.28 ± 0.64 36.24 ± 1.44 / 51.96 ± 1.13 12.0.0 12.0.0 12.0.0 12.0.0 12.0.0 12.0.0 12.0.0
birgermoell/BeagleCatMunin-Flashback-Bellman (few-shot, val) 7242 32 32768 False 2,890 ± 401 / 1,155 ± 348 2.52 52.96 ± 3.45 / 41.51 ± 4.30 76.99 ± 2.37 / 76.84 ± 2.99 14.27 ± 4.36 / 40.60 ± 3.04 59.92 ± 1.64 / 64.87 ± 1.47 67.62 ± 0.42 / 23.90 ± 0.77 27.95 ± 2.57 / 45.86 ± 1.85 36.11 ± 3.54 / 51.60 ± 2.44 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
meta-llama/Meta-Llama-3-8B-Instruct (few-shot) 8030 128 8192 True 4,909 ± 1,215 / 978 ± 319 2.52 69.67 ± 1.30 / 52.94 ± 4.01 59.93 ± 4.70 / 67.54 ± 3.04 27.63 ± 3.19 / 60.85 ± 3.29 49.84 ± 1.61 / 60.85 ± 0.93 66.60 ± 0.07 / 19.13 ± 0.31 33.54 ± 1.40 / 49.20 ± 1.13 30.32 ± 2.27 / 45.96 ± 1.87 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
senseable/WestLake-7B-v2 (few-shot) 7242 32 32768 False 5,993 ± 1,028 / 1,742 ± 561 2.52 58.90 ± 1.34 / 42.48 ± 3.97 67.74 ± 2.79 / 71.89 ± 1.89 16.52 ± 2.55 / 46.30 ± 2.62 49.41 ± 1.21 / 59.91 ± 0.48 66.09 ± 0.17 / 19.64 ± 0.27 31.76 ± 0.89 / 48.64 ± 0.69 45.84 ± 1.47 / 59.27 ± 1.16 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
birgermoell/Munin-NeuralBeagle-NorskGPT (few-shot, val) 7242 32 32768 False 2,903 ± 407 / 1,157 ± 350 2.54 63.85 ± 2.67 / 47.77 ± 4.72 73.72 ± 2.98 / 62.83 ± 1.64 -0.56 ± 2.24 / 33.54 ± 1.03 60.10 ± 1.48 / 66.26 ± 1.19 68.11 ± 0.21 / 23.63 ± 0.56 27.79 ± 2.32 / 45.82 ± 1.61 42.43 ± 2.76 / 56.52 ± 2.13 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
birgermoell/WestLake-Munin-Cat-NorskGPT (few-shot, val) 7242 32 32768 False 2,856 ± 391 / 1,142 ± 342 2.54 63.85 ± 2.67 / 47.77 ± 4.72 73.72 ± 2.98 / 62.83 ± 1.64 -0.56 ± 2.24 / 33.54 ± 1.03 60.10 ± 1.48 / 66.26 ± 1.19 68.11 ± 0.21 / 23.63 ± 0.56 27.79 ± 2.32 / 45.82 ± 1.61 42.43 ± 2.76 / 56.52 ± 2.13 9.3.1 9.3.1 9.3.1 12.5.2 9.3.1 9.3.1 9.3.1
merge-crew/da-sv-ties (few-shot, val) 7242 32 32768 True 2,457 ± 451 / 757 ± 237 2.56 48.36 ± 3.07 / 34.48 ± 5.22 76.57 ± 2.19 / 78.11 ± 2.73 20.94 ± 5.55 / 44.72 ± 4.06 59.07 ± 1.90 / 63.87 ± 1.46 66.59 ± 0.50 / 22.19 ± 0.78 31.44 ± 1.94 / 47.30 ± 1.54 26.04 ± 3.42 / 38.83 ± 4.24 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
bineric/NorskGPT-Mistral-7b (few-shot) 7242 32 32768 False 2,443 ± 451 / 761 ± 237 2.57 58.40 ± 2.62 / 40.55 ± 3.65 74.30 ± 1.26 / 60.35 ± 0.41 0.00 ± 0.00 / 33.37 ± 0.27 59.16 ± 1.23 / 65.78 ± 0.72 65.36 ± 0.14 / 18.81 ± 0.17 35.01 ± 0.99 / 51.07 ± 0.70 43.72 ± 0.69 / 57.66 ± 0.50 9.3.1 9.3.1 9.3.1 12.5.1 11.0.0 9.3.1 9.3.1
merge-crew/da-sv-dare-ties-density-0.6 (few-shot, val) 7242 32 32768 True 2,515 ± 465 / 785 ± 247 2.58 45.12 ± 2.72 / 30.73 ± 4.55 78.74 ± 2.13 / 80.11 ± 2.64 19.74 ± 6.09 / 46.97 ± 5.83 60.15 ± 1.71 / 65.22 ± 1.28 66.41 ± 0.46 / 21.90 ± 0.70 31.24 ± 3.01 / 47.77 ± 2.19 22.30 ± 3.50 / 39.45 ± 2.60 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
Mabeck/Heidrun-Mistral-7B-chat (few-shot) 7242 32 32768 False 5,822 ± 1,283 / 1,336 ± 430 2.59 55.06 ± 2.38 / 41.39 ± 4.31 77.50 ± 0.90 / 73.87 ± 1.21 17.47 ± 2.33 / 47.73 ± 3.35 58.67 ± 0.96 / 64.58 ± 0.78 64.18 ± 0.24 / 18.13 ± 0.35 31.04 ± 1.08 / 48.29 ± 0.82 23.57 ± 1.68 / 42.37 ± 1.34 10.0.1 10.0.1 10.0.1 12.5.0 12.5.0 10.0.1 10.0.1
timpal0l/njord-alpha (few-shot) 7242 32 32768 True 5,431 ± 1,267 / 1,139 ± 365 2.62 48.19 ± 2.55 / 37.50 ± 3.62 79.95 ± 0.87 / 81.24 ± 0.64 32.85 ± 2.28 / 61.74 ± 3.05 57.39 ± 1.52 / 63.58 ± 1.19 65.95 ± 0.25 / 20.56 ± 0.37 25.32 ± 0.99 / 42.09 ± 1.04 14.55 ± 2.32 / 31.99 ± 2.16 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
danish-foundation-models/munin-7b-v0.1dev0 (few-shot) 7242 32 8192 True 6,113 ± 1,044 / 1,790 ± 579 2.63 47.10 ± 2.60 / 35.06 ± 3.65 73.05 ± 5.27 / 74.56 ± 4.19 30.29 ± 2.63 / 61.40 ± 3.22 57.39 ± 1.38 / 63.51 ± 1.04 64.69 ± 0.53 / 19.03 ± 0.41 27.40 ± 0.76 / 44.17 ± 0.82 21.08 ± 3.28 / 38.46 ± 2.96 12.5.2 12.4.0 12.4.0 12.4.0 12.4.0 12.4.0 12.4.0
mlabonne/AlphaMonarch-7B (few-shot, val) 7242 32 8192 False 5,340 ± 1,262 / 1,157 ± 375 2.65 60.53 ± 3.06 / 48.45 ± 5.19 67.03 ± 3.61 / 70.77 ± 1.95 15.10 ± 4.60 / 48.57 ± 2.91 42.46 ± 1.63 / 53.50 ± 1.40 67.94 ± 0.21 / 22.99 ± 0.24 27.51 ± 3.08 / 45.43 ± 2.37 42.29 ± 5.08 / 55.43 ± 4.50 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
AI-Sweden-Models/Llama-3-8B (few-shot) 8030 128 8192 True 4,141 ± 994 / 905 ± 299 2.66 36.45 ± 2.44 / 27.24 ± 2.29 81.12 ± 1.02 / 77.04 ± 2.74 26.80 ± 2.07 / 59.15 ± 2.72 58.16 ± 0.82 / 64.72 ± 0.72 66.09 ± 0.31 / 20.48 ± 0.47 29.01 ± 0.68 / 45.32 ± 0.67 15.92 ± 1.74 / 35.28 ± 1.24 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5
alpindale/Mistral-7B-v0.2-hf (few-shot) 7242 32 32768 True 1,841 ± 297 / 651 ± 193 2.67 48.96 ± 2.72 / 39.25 ± 3.69 78.90 ± 0.95 / 78.62 ± 1.08 10.82 ± 3.46 / 38.95 ± 3.80 58.91 ± 1.02 / 64.72 ± 0.76 64.78 ± 0.33 / 19.24 ± 0.42 34.52 ± 1.19 / 50.47 ± 0.93 20.96 ± 1.96 / 39.95 ± 1.66 12.5.2 12.5.1 12.5.1 12.5.1 12.5.1 12.5.1 12.5.1
meta-llama/Llama-2-13b-hf (few-shot) 13016 32 4096 True 2,898 ± 637 / 736 ± 236 2.67 54.52 ± 3.33 / 44.11 ± 5.29 78.45 ± 1.21 / 79.73 ± 0.97 21.55 ± 3.74 / 49.54 ± 4.41 59.71 ± 0.68 / 65.01 ± 0.64 64.59 ± 0.34 / 18.56 ± 0.39 25.51 ± 0.81 / 43.68 ± 0.60 14.97 ± 1.73 / 34.85 ± 1.34 12.10.5 12.10.4 12.10.4 12.10.5 12.10.5 12.10.4 12.10.4
mistralai/Mistral-7B-v0.3 (few-shot) 7248 33 32768 True 4,120 ± 976 / 926 ± 306 2.67 49.18 ± 2.71 / 39.25 ± 3.60 79.08 ± 0.77 / 78.81 ± 0.94 11.06 ± 3.55 / 38.96 ± 3.77 58.98 ± 1.04 / 64.79 ± 0.79 64.79 ± 0.32 / 19.30 ± 0.42 34.51 ± 1.13 / 50.46 ± 0.88 20.84 ± 2.19 / 39.88 ± 1.85 12.10.4 12.10.4 12.10.4 12.10.4 12.10.5 12.10.4 12.10.4
KennethEnevoldsen/munin_mistral-7b (few-shot, val) 7242 32 32768 False 2,543 ± 466 / 787 ± 247 2.68 52.34 ± 3.07 / 39.14 ± 4.60 77.66 ± 2.09 / 78.59 ± 2.41 6.00 ± 4.15 / 36.34 ± 2.20 60.16 ± 1.81 / 64.12 ± 1.59 65.54 ± 0.49 / 19.31 ± 0.71 31.83 ± 2.27 / 48.55 ± 1.67 20.55 ± 3.93 / 38.95 ± 3.23 12.5.2 12.3.1 12.3.1 12.3.2 12.3.2 12.3.2 12.3.2
Mabeck/Heidrun-Mistral-7B-base (few-shot) 7242 32 32768 True 3,823 ± 967 / 860 ± 280 2.68 48.43 ± 2.75 / 35.31 ± 2.80 79.43 ± 0.85 / 78.21 ± 1.69 17.37 ± 2.57 / 52.91 ± 4.93 57.05 ± 1.22 / 62.72 ± 0.89 63.81 ± 0.34 / 18.13 ± 0.31 31.72 ± 0.55 / 48.70 ± 0.45 15.69 ± 2.43 / 35.96 ± 2.03 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0
mhenrichsen/hestenettetLM (few-shot) 7242 32 32768 True 5,160 ± 804 / 1,654 ± 516 2.68 53.00 ± 2.53 / 39.09 ± 3.72 79.70 ± 0.65 / 79.45 ± 0.68 4.32 ± 2.19 / 34.43 ± 0.87 59.03 ± 1.03 / 64.74 ± 0.84 64.89 ± 0.28 / 19.31 ± 0.40 35.48 ± 0.99 / 51.54 ± 0.72 20.54 ± 2.14 / 39.66 ± 1.80 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2
mistralai/Mistral-7B-v0.1 (few-shot) 7242 32 32768 True 2,657 ± 524 / 880 ± 278 2.68 53.34 ± 2.55 / 40.48 ± 3.66 80.00 ± 0.70 / 79.80 ± 0.66 4.61 ± 2.18 / 34.51 ± 0.86 58.99 ± 1.05 / 64.65 ± 0.83 64.87 ± 0.31 / 19.30 ± 0.43 35.52 ± 1.01 / 51.52 ± 0.73 19.67 ± 2.31 / 38.98 ± 1.98 0.0.0 0.0.0 0.0.0 12.5.1 11.0.0 9.1.2 9.1.2
ThatsGroes/munin-SkoleGPTOpenOrca-7b-16bit (few-shot) 7242 32 32768 False 3,006 ± 479 / 1,053 ± 319 2.71 44.64 ± 1.66 / 31.30 ± 2.96 77.98 ± 1.01 / 72.79 ± 2.47 16.57 ± 2.58 / 51.86 ± 3.69 57.31 ± 0.92 / 63.73 ± 1.04 63.23 ± 0.35 / 15.35 ± 0.57 28.15 ± 0.90 / 45.69 ± 0.72 23.58 ± 1.41 / 42.30 ± 1.04 11.0.0 11.0.0 11.0.0 12.4.0 12.4.0 11.0.0 11.0.0
meta-llama/Llama-2-13b-chat-hf (few-shot) 13016 32 4096 True 2,849 ± 622 / 723 ± 229 2.73 49.90 ± 2.28 / 35.48 ± 3.44 77.19 ± 2.05 / 79.13 ± 1.43 14.67 ± 2.27 / 53.90 ± 2.24 57.12 ± 0.70 / 63.72 ± 0.73 66.25 ± 0.10 / 19.64 ± 0.19 24.40 ± 1.20 / 42.73 ± 0.90 19.30 ± 1.27 / 38.75 ± 1.05 12.10.5 12.10.4 12.10.4 12.11.0 12.11.0 12.10.4 12.10.4
danish-foundation-models/munin-7b-alpha (few-shot) 7242 32 32768 True 6,116 ± 1,049 / 1,784 ± 577 2.78 42.23 ± 2.44 / 30.30 ± 4.71 78.80 ± 0.93 / 75.28 ± 1.78 15.47 ± 1.79 / 54.26 ± 3.41 56.75 ± 1.15 / 62.43 ± 0.95 62.78 ± 0.76 / 16.74 ± 0.45 30.86 ± 1.12 / 47.83 ± 0.93 19.11 ± 2.74 / 38.55 ± 2.29 12.5.2 12.4.0 12.4.0 12.4.0 12.4.0 12.4.0 12.4.0
timpal0l/Mistral-7B-v0.1-flashback-v2-instruct (few-shot) 7242 32 32768 False 5,172 ± 813 / 1,647 ± 518 2.79 46.74 ± 4.30 / 33.57 ± 4.51 77.06 ± 1.82 / 79.02 ± 1.37 14.00 ± 1.59 / 53.89 ± 3.10 56.74 ± 0.52 / 63.45 ± 0.49 62.56 ± 0.85 / 15.85 ± 0.34 30.87 ± 1.35 / 47.77 ± 1.01 15.79 ± 1.57 / 35.66 ± 0.84 12.5.2 12.3.2 12.3.2 12.4.0 12.4.0 12.3.2 12.3.2
bineric/NorskGPT-Llama-13B-v0.1 (few-shot) 13016 32 4096 False 2,856 ± 645 / 709 ± 243 2.80 49.26 ± 2.31 / 36.92 ± 4.05 79.05 ± 0.80 / 77.87 ± 2.06 0.22 ± 0.43 / 33.38 ± 0.26 56.78 ± 1.02 / 63.61 ± 0.61 65.99 ± 0.07 / 19.57 ± 0.17 25.56 ± 0.86 / 43.67 ± 0.64 28.26 ± 1.97 / 45.89 ± 1.42 12.10.4 12.10.4 12.10.4 12.10.4 12.10.4 12.10.4 12.10.4
mistralai/Mistral-7B-Instruct-v0.2 (few-shot) 7242 32 32768 False 634 ± 179 / 110 ± 35 2.81 47.92 ± 2.66 / 33.00 ± 3.24 62.90 ± 2.44 / 70.61 ± 1.19 19.95 ± 2.24 / 56.49 ± 2.10 52.51 ± 0.36 / 61.42 ± 0.52 66.11 ± 0.18 / 19.64 ± 0.28 25.60 ± 1.10 / 43.53 ± 0.90 21.75 ± 1.61 / 40.57 ± 1.45 9.2.0 9.2.0 9.3.1 12.4.0 12.4.0 9.3.2 9.3.2
birgermoell/NeuralBeagle-Flashback (few-shot, val) 7242 32 32768 False 2,904 ± 405 / 1,155 ± 349 2.83 51.73 ± 4.51 / 40.50 ± 6.05 36.06 ± 3.31 / 53.46 ± 1.79 19.42 ± 5.08 / 46.92 ± 5.36 59.26 ± 1.66 / 64.40 ± 1.35 67.55 ± 0.53 / 23.64 ± 0.72 23.10 ± 2.38 / 42.58 ± 1.74 29.31 ± 5.03 / 47.11 ± 3.62 9.3.0 9.3.0 9.3.0 12.5.2 9.3.0 9.3.0 9.3.0
RuterNorway/Llama-2-13b-chat-norwegian (few-shot) unknown 32 4096 False 3,254 ± 1,068 / 484 ± 173 2.86 50.85 ± 2.44 / 39.65 ± 3.83 74.17 ± 2.12 / 76.62 ± 1.83 7.51 ± 1.94 / 37.81 ± 1.76 57.32 ± 0.63 / 63.28 ± 0.71 65.20 ± 0.45 / 19.06 ± 0.15 23.92 ± 0.88 / 42.25 ± 0.73 17.67 ± 1.53 / 37.32 ± 1.20 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1
occiglot/occiglot-7b-eu5-instruct (few-shot) 7242 32 32768 False 2,088 ± 352 / 706 ± 214 2.88 47.67 ± 2.81 / 36.91 ± 3.50 71.73 ± 2.40 / 74.97 ± 1.84 7.90 ± 3.20 / 41.24 ± 4.78 57.78 ± 0.79 / 64.48 ± 0.73 65.07 ± 0.34 / 19.59 ± 0.38 25.52 ± 1.30 / 43.68 ± 1.03 14.06 ± 1.68 / 35.12 ± 1.47 12.5.2 12.2.0 12.3.1 12.4.0 12.4.0 12.3.1 12.3.1
occiglot/occiglot-7b-eu5 (few-shot) 7242 32 32768 True 2,219 ± 427 / 717 ± 224 2.90 49.02 ± 3.23 / 41.69 ± 3.74 76.56 ± 1.52 / 78.16 ± 1.12 2.18 ± 2.34 / 36.26 ± 3.89 58.98 ± 0.95 / 63.65 ± 0.89 64.42 ± 0.45 / 18.79 ± 0.47 23.68 ± 1.41 / 42.15 ± 1.14 14.05 ± 1.60 / 34.81 ± 1.58 12.5.2 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0 12.2.0
neph1/bellman-7b-mistral-instruct-v0.2 (few-shot) 7242 32 32768 False 2,518 ± 463 / 779 ± 243 2.91 54.38 ± 2.92 / 39.66 ± 5.20 55.84 ± 2.51 / 66.96 ± 1.37 16.05 ± 2.15 / 54.22 ± 2.86 53.22 ± 0.88 / 61.85 ± 0.63 64.90 ± 0.14 / 16.99 ± 0.20 22.36 ± 1.17 / 41.14 ± 0.78 12.52 ± 1.41 / 33.90 ± 1.11 9.2.0 9.2.0 9.2.0 12.4.0 12.4.0 9.2.0 9.2.0
01-ai/Yi-6B (few-shot) 6061 64 4096 True 6,435 ± 1,316 / 1,632 ± 549 2.93 46.69 ± 2.39 / 32.97 ± 4.57 75.39 ± 1.06 / 71.95 ± 1.42 2.91 ± 2.80 / 35.26 ± 2.12 54.95 ± 0.86 / 60.77 ± 0.75 62.70 ± 0.76 / 17.52 ± 0.40 25.28 ± 0.72 / 43.71 ± 0.56 19.20 ± 1.18 / 38.76 ± 0.96 9.3.2 10.0.0 10.0.0 12.5.1 12.0.0 10.0.1 10.0.1
mistralai/Mistral-7B-Instruct-v0.1 (few-shot) 7242 32 32768 False 634 ± 179 / 110 ± 35 2.97 45.01 ± 2.11 / 27.59 ± 3.35 73.33 ± 1.98 / 76.19 ± 1.59 11.59 ± 3.45 / 40.89 ± 4.15 52.12 ± 1.42 / 59.29 ± 1.17 63.10 ± 0.60 / 18.05 ± 0.36 24.03 ± 1.09 / 42.32 ± 0.70 15.37 ± 0.71 / 35.78 ± 0.69 9.3.1 9.3.1 9.3.1 12.4.0 12.4.0 9.3.1 9.3.1
meta-llama/Llama-2-7b-hf (few-shot) 6738 32 4096 True 930 ± 310 / 128 ± 43 2.98 44.11 ± 4.26 / 31.64 ± 4.48 79.05 ± 1.08 / 75.52 ± 2.66 7.34 ± 3.19 / 43.83 ± 5.31 57.49 ± 0.95 / 63.16 ± 0.77 64.63 ± 0.39 / 18.68 ± 0.39 15.65 ± 0.55 / 36.32 ± 0.55 8.74 ± 1.34 / 29.87 ± 1.40 9.2.0 9.2.0 9.2.0 12.5.1 12.0.0 9.2.0 9.2.0
bineric/NorskGPT-Llama-7B-v0.1 (few-shot) 6738 32 4096 False 5,384 ± 879 / 1,746 ± 553 3.02 53.95 ± 1.89 / 42.16 ± 4.59 60.91 ± 2.35 / 59.47 ± 1.21 0.32 ± 0.62 / 33.39 ± 0.28 55.28 ± 0.62 / 63.41 ± 0.55 63.73 ± 0.18 / 15.64 ± 0.27 20.96 ± 0.77 / 40.70 ± 0.59 25.76 ± 1.39 / 43.71 ± 1.09 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2
merge-crew/da-sv-dare-ties-density-0.3 (few-shot, val) 7242 32 32768 True 2,461 ± 476 / 773 ± 248 3.07 32.37 ± 3.05 / 24.60 ± 3.81 75.33 ± 2.41 / 77.99 ± 2.58 12.73 ± 6.32 / 45.51 ± 7.43 53.05 ± 1.83 / 58.32 ± 1.46 64.74 ± 0.74 / 19.59 ± 0.87 15.60 ± 1.96 / 33.16 ± 1.77 9.81 ± 2.55 / 28.12 ± 2.70 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1 10.0.1
meta-llama/Llama-2-7b-chat-hf (few-shot) 6738 32 4096 False 2,643 ± 455 / 800 ± 247 3.07 39.72 ± 2.82 / 29.85 ± 2.99 66.18 ± 3.25 / 72.00 ± 1.75 6.74 ± 1.66 / 45.55 ± 4.31 54.05 ± 0.84 / 60.90 ± 0.82 65.92 ± 0.05 / 18.51 ± 0.18 17.73 ± 0.98 / 37.55 ± 0.69 12.85 ± 0.93 / 33.37 ± 0.90 9.3.1 9.3.1 9.3.1 12.4.0 12.4.0 9.3.1 9.3.1
allenai/OLMo-1.7-7B-hf (few-shot) 6888 50 4096 True 3,371 ± 876 / 561 ± 184 3.09 41.25 ± 2.07 / 32.87 ± 2.49 76.60 ± 0.98 / 64.63 ± 2.41 6.37 ± 2.08 / 49.55 ± 3.34 54.87 ± 0.64 / 61.35 ± 0.68 62.90 ± 0.52 / 17.15 ± 0.51 16.18 ± 0.78 / 34.88 ± 0.66 8.52 ± 1.90 / 29.96 ± 1.75 12.10.5 12.10.4 12.10.4 12.10.5 12.10.5 12.10.4 12.10.4
Qwen/Qwen1.5-4B-Chat (few-shot) 3950 152 32768 False 4,347 ± 893 / 1,135 ± 365 3.13 40.19 ± 2.97 / 31.88 ± 4.51 64.08 ± 2.44 / 69.62 ± 1.29 5.43 ± 2.02 / 38.32 ± 2.54 53.21 ± 1.08 / 59.57 ± 0.97 61.90 ± 0.87 / 17.34 ± 0.52 20.95 ± 0.97 / 40.87 ± 0.76 16.59 ± 1.45 / 36.76 ± 1.20 12.5.2 10.0.1 12.1.0 12.5.2 12.1.0 12.1.0 12.1.0
microsoft/Phi-3-mini-4k-instruct (few-shot) 3821 32 4096 True 5,224 ± 1,371 / 1,063 ± 358 3.13 46.15 ± 2.77 / 24.28 ± 3.74 67.17 ± 1.93 / 70.99 ± 1.64 5.30 ± 1.62 / 47.01 ± 3.23 51.12 ± 1.02 / 57.49 ± 0.81 59.20 ± 0.99 / 15.57 ± 0.62 21.33 ± 1.03 / 41.04 ± 0.78 16.12 ± 1.15 / 36.99 ± 0.85 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5
AI-Sweden-Models/gpt-sw3-40b (few-shot) 39927 64 1795 True 409 ± 53 / 182 ± 54 3.14 32.00 ± 3.27 / 17.02 ± 2.03 80.44 ± 0.54 / 77.81 ± 1.18 10.73 ± 2.53 / 51.37 ± 4.05 53.80 ± 0.86 / 60.53 ± 0.78 65.16 ± 0.50 / 19.50 ± 0.58 8.35 ± 0.95 / 30.38 ± 0.80 5.74 ± 1.30 / 28.98 ± 1.13 12.9.0 12.9.0 12.9.0 12.9.0 12.9.1 12.9.1 12.9.1
emillykkejensen/Phi-3-mini-4k-instruct-dansk (few-shot) 3821 32 4096 False 1,360 ± 179 / 566 ± 190 3.15 47.81 ± 2.60 / 27.94 ± 5.79 68.43 ± 2.12 / 68.82 ± 2.65 3.63 ± 1.46 / 43.69 ± 4.56 53.03 ± 0.62 / 58.80 ± 0.59 56.14 ± 0.47 / 11.16 ± 0.37 23.29 ± 1.03 / 42.63 ± 0.76 12.06 ± 1.23 / 33.77 ± 0.99 12.10.4 12.10.4 12.10.4 12.10.4 12.10.4 12.10.4 12.10.4
NorwAI/NorwAI-Mistral-7B (few-shot) 7537 68 4065 True 3,035 ± 503 / 911 ± 300 3.21 23.88 ± 7.28 / 17.99 ± 3.56 80.26 ± 0.89 / 77.89 ± 0.89 13.50 ± 2.27 / 52.55 ± 2.86 55.02 ± 0.96 / 60.74 ± 0.89 64.78 ± 0.63 / 18.68 ± 0.51 6.62 ± 0.88 / 27.61 ± 0.74 2.66 ± 1.04 / 26.32 ± 0.72 12.10.5 12.10.4 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5
microsoft/Phi-3-mini-128k-instruct (few-shot) 3821 32 130819 True 7,312 ± 1,668 / 1,609 ± 525 3.22 42.36 ± 1.67 / 21.33 ± 2.90 51.53 ± 6.32 / 62.14 ± 3.43 3.11 ± 1.60 / 47.93 ± 2.93 51.11 ± 0.96 / 57.21 ± 0.95 59.28 ± 0.86 / 15.52 ± 0.64 22.99 ± 1.08 / 42.27 ± 0.81 20.19 ± 0.99 / 40.11 ± 0.71 12.9.1 12.9.1 12.9.1 12.9.1 12.10.0 12.10.0 12.10.0
AI-Sweden-Models/gpt-sw3-20b (few-shot) 20918 64 2048 True 1,875 ± 673 / 261 ± 91 3.24 31.86 ± 5.09 / 21.95 ± 3.90 79.20 ± 1.03 / 79.87 ± 1.11 12.26 ± 1.97 / 46.90 ± 4.11 53.58 ± 0.97 / 60.28 ± 0.81 64.14 ± 0.46 / 18.76 ± 0.39 3.15 ± 0.80 / 27.43 ± 0.91 2.77 ± 1.26 / 26.43 ± 0.84 9.3.1 12.10.0 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1
google/gemma-7b-it (few-shot) 8538 256 8192 False 1,792 ± 249 / 668 ± 203 3.28 59.26 ± 2.00 / 52.73 ± 2.71 28.63 ± 1.24 / 50.95 ± 0.75 11.43 ± 1.88 / 53.31 ± 1.74 46.67 ± 1.97 / 53.24 ± 1.72 63.88 ± 0.30 / 18.58 ± 0.16 17.95 ± 1.18 / 36.41 ± 1.02 13.55 ± 1.32 / 33.39 ± 1.32 12.7.0 12.7.0 12.7.0 12.7.0 12.10.0 12.10.0 12.10.0
tollefj/nordavind-7b-instruct-warm (few-shot) 7248 33 2048 False 6,450 ± 961 / 2,082 ± 658 3.28 47.24 ± 3.36 / 24.94 ± 3.21 77.91 ± 1.42 / 76.08 ± 2.54 5.55 ± 2.55 / 48.57 ± 3.21 51.41 ± 0.74 / 57.55 ± 0.69 61.11 ± 1.02 / 17.57 ± 0.33 1.49 ± 1.11 / 25.90 ± 0.71 3.97 ± 0.92 / 27.45 ± 0.68 12.5.2 12.3.2 12.3.2 12.4.0 12.4.0 12.3.2 12.3.2
LumiOpen/Viking-33B@1000B (few-shot) 33119 131 4099 True 2,080 ± 700 / 331 ± 117 3.30 42.35 ± 1.51 / 28.31 ± 3.87 77.68 ± 1.11 / 78.86 ± 0.93 8.08 ± 1.69 / 50.52 ± 2.25 54.57 ± 1.25 / 60.34 ± 1.10 58.30 ± 1.94 / 14.29 ± 0.83 1.73 ± 1.04 / 24.98 ± 0.69 -0.32 ± 1.25 / 25.53 ± 0.82 12.9.0 12.9.0 12.9.0 12.9.0 12.9.0 12.9.0 12.9.0
NorwAI/NorwAI-Llama2-7B (few-shot) 7033 68 4096 True 4,438 ± 1,128 / 1,028 ± 346 3.31 24.98 ± 2.04 / 25.50 ± 1.92 79.36 ± 1.35 / 76.34 ± 3.44 5.75 ± 2.23 / 41.27 ± 4.75 54.74 ± 0.84 / 60.74 ± 0.65 64.60 ± 1.09 / 18.36 ± 0.56 3.83 ± 1.03 / 26.46 ± 0.77 4.40 ± 1.42 / 28.14 ± 1.05 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5
norallm/normistral-7b-warm-instruct (few-shot) 7248 33 2048 True 6,194 ± 949 / 1,967 ± 619 3.31 51.45 ± 3.13 / 26.49 ± 3.00 63.64 ± 3.74 / 65.08 ± 2.46 5.80 ± 1.74 / 51.04 ± 1.54 48.95 ± 1.00 / 57.09 ± 0.92 62.18 ± 0.57 / 16.32 ± 0.33 4.88 ± 0.59 / 25.13 ± 0.54 4.63 ± 1.09 / 27.29 ± 1.02 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
AI-Sweden-Models/gpt-sw3-6.7b-v2 (few-shot) 7111 64 2048 True 2,351 ± 448 / 707 ± 216 3.32 28.73 ± 3.63 / 20.43 ± 3.72 77.47 ± 1.36 / 78.60 ± 1.25 8.78 ± 2.01 / 42.28 ± 3.17 50.57 ± 0.94 / 56.51 ± 0.79 62.41 ± 0.85 / 16.45 ± 0.64 5.23 ± 1.02 / 28.63 ± 0.82 5.39 ± 0.81 / 28.86 ± 0.60 9.2.0 9.2.0 9.2.0 12.5.1 11.0.0 9.2.0 9.2.0
AI-Sweden-Models/gpt-sw3-20b-instruct (few-shot) 20918 64 2048 True 1,831 ± 587 / 268 ± 90 3.37 15.70 ± 1.54 / 14.65 ± 1.52 68.23 ± 3.81 / 71.17 ± 3.07 12.39 ± 1.39 / 50.99 ± 3.37 52.04 ± 0.97 / 60.86 ± 0.77 65.44 ± 0.22 / 19.75 ± 0.32 6.86 ± 0.91 / 29.83 ± 1.12 6.92 ± 0.75 / 28.96 ± 0.67 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1 9.3.1
LumiOpen/Viking-13B (few-shot) 14030 131 4097 True 840 ± 79 / 400 ± 124 3.39 31.55 ± 4.67 / 18.37 ± 2.73 78.66 ± 1.03 / 78.34 ± 1.13 5.69 ± 2.24 / 44.98 ± 3.55 52.93 ± 1.30 / 57.87 ± 1.27 60.05 ± 1.20 / 13.94 ± 0.43 1.32 ± 1.02 / 23.28 ± 0.65 0.35 ± 0.66 / 25.33 ± 0.66 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5
norallm/normistral-7b-warm (few-shot) 7248 33 2048 True 3,175 ± 456 / 1,186 ± 354 3.39 48.78 ± 5.08 / 26.81 ± 3.42 76.09 ± 1.23 / 74.78 ± 1.97 2.53 ± 2.80 / 47.37 ± 2.29 48.93 ± 0.97 / 55.09 ± 0.85 57.49 ± 2.27 / 16.17 ± 0.78 1.28 ± 1.28 / 23.12 ± 0.63 1.27 ± 0.61 / 25.74 ± 0.70 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0
stabilityai/stablelm-2-1_6b (few-shot) 1645 100 4096 True 7,259 ± 2,120 / 1,240 ± 432 3.42 38.00 ± 4.39 / 29.74 ± 5.04 75.15 ± 0.55 / 61.46 ± 0.82 1.04 ± 2.08 / 34.49 ± 1.17 53.11 ± 0.53 / 58.91 ± 0.32 55.63 ± 1.02 / 13.45 ± 0.61 8.72 ± 0.93 / 30.92 ± 0.72 3.19 ± 0.62 / 26.88 ± 0.62 12.10.8 12.10.8 12.10.8 12.10.8 12.10.8 12.10.8 12.10.8
AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct (few-shot) 7111 64 2048 True 2,383 ± 451 / 718 ± 221 3.47 14.58 ± 1.30 / 14.79 ± 1.27 56.60 ± 3.37 / 62.73 ± 3.61 10.92 ± 1.83 / 52.63 ± 2.98 50.18 ± 0.54 / 57.90 ± 0.53 64.89 ± 0.15 / 18.79 ± 0.22 6.16 ± 0.81 / 28.35 ± 0.97 10.90 ± 0.86 / 32.01 ± 0.54 9.2.0 9.2.0 9.2.0 12.4.0 12.4.0 9.2.0 9.3.1
google/gemma-2b (few-shot) 2506 256 8192 True 6,087 ± 1,046 / 1,902 ± 563 3.47 14.67 ± 4.71 / 14.85 ± 3.77 75.45 ± 1.10 / 64.08 ± 1.47 3.82 ± 1.23 / 44.81 ± 3.55 51.73 ± 0.88 / 57.35 ± 0.82 59.72 ± 1.46 / 15.26 ± 0.64 10.98 ± 0.98 / 31.92 ± 0.80 4.24 ± 0.47 / 27.53 ± 0.44 12.5.2 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
HPLT/gpt-33b-nordic-prerelease (few-shot) 33119 131 4099 True 501 ± 50 / 238 ± 69 3.51 33.61 ± 6.02 / 22.18 ± 4.32 76.75 ± 1.17 / 74.66 ± 1.20 1.66 ± 1.36 / 33.60 ± 0.30 50.68 ± 1.94 / 56.96 ± 1.73 56.37 ± 1.71 / 12.78 ± 0.63 -0.31 ± 0.91 / 25.20 ± 0.62 -0.04 ± 0.62 / 25.05 ± 0.50 12.9.1 12.9.1 12.10.0 12.10.0 12.10.0 12.10.0 12.10.0
ibm-granite/granite-7b-base (few-shot) 6738 32 2048 True 4,405 ± 1,098 / 1,032 ± 345 3.52 33.34 ± 3.41 / 30.50 ± 3.44 72.00 ± 1.15 / 69.12 ± 2.00 0.25 ± 1.72 / 43.46 ± 3.96 52.53 ± 1.03 / 57.96 ± 0.98 52.86 ± 1.75 / 11.38 ± 0.57 11.71 ± 0.83 / 32.71 ± 0.95 0.81 ± 0.88 / 25.27 ± 0.59 12.10.5 12.10.5 12.10.5 12.10.5 12.10.8 12.10.8 12.10.8
AI-Sweden-Models/gpt-sw3-1.3b-instruct (few-shot) 1445 64 2048 True 4,544 ± 1,000 / 1,106 ± 359 3.55 19.04 ± 2.67 / 19.98 ± 2.64 73.34 ± 1.34 / 68.41 ± 2.31 2.90 ± 1.74 / 44.43 ± 4.49 47.45 ± 0.58 / 54.69 ± 0.56 63.33 ± 0.86 / 17.11 ± 0.61 0.65 ± 1.12 / 25.94 ± 0.76 -0.18 ± 0.36 / 24.70 ± 0.60 12.5.2 9.3.1 12.1.0 12.4.0 12.4.0 12.1.0 12.1.0
HPLT/gpt-13b-nordic-prerelease (few-shot) 14030 131 4099 True 3,520 ± 736 / 823 ± 273 3.55 32.19 ± 4.64 / 24.93 ± 4.09 72.26 ± 6.90 / 72.58 ± 5.87 2.39 ± 1.29 / 48.49 ± 2.46 48.92 ± 2.28 / 53.44 ± 2.49 57.46 ± 1.64 / 13.21 ± 0.57 -0.49 ± 0.50 / 25.03 ± 0.45 0.50 ± 1.04 / 25.50 ± 0.72 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1 12.6.1
LumiOpen/Viking-7B (few-shot) 7550 131 4096 True 4,969 ± 1,109 / 1,134 ± 374 3.55 30.64 ± 4.19 / 23.90 ± 3.44 72.02 ± 3.18 / 72.36 ± 3.96 1.08 ± 1.36 / 38.63 ± 3.03 48.72 ± 1.05 / 54.59 ± 1.10 57.93 ± 2.32 / 13.16 ± 1.05 1.14 ± 0.79 / 22.39 ± 0.53 1.13 ± 1.01 / 25.61 ± 0.65 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0
NorwAI/NorwAI-Mistral-7B-instruct (few-shot) 7537 68 4065 False 3,027 ± 503 / 903 ± 296 3.58 20.97 ± 2.51 / 16.21 ± 2.03 77.76 ± 0.75 / 67.99 ± 2.31 2.35 ± 1.63 / 38.48 ± 1.53 28.65 ± 2.11 / 38.84 ± 1.85 63.75 ± 0.70 / 17.94 ± 0.22 9.17 ± 0.61 / 31.89 ± 0.41 3.98 ± 1.37 / 26.35 ± 1.13 12.10.5 12.10.4 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5
Qwen/Qwen1.5-4B (few-shot) 3950 152 32768 True 3,248 ± 739 / 761 ± 252 3.62 37.26 ± 4.28 / 29.89 ± 5.96 5.20 ± 7.35 / 30.65 ± 4.97 1.85 ± 1.54 / 33.71 ± 0.46 54.15 ± 0.58 / 60.15 ± 0.59 58.24 ± 1.76 / 16.02 ± 0.88 22.04 ± 0.60 / 41.36 ± 0.54 14.76 ± 1.28 / 35.27 ± 1.32 12.5.2 9.3.2 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
Qwen/Qwen1.5-1.8B-Chat (few-shot) 1837 152 32768 False 8,304 ± 1,846 / 1,933 ± 617 3.64 20.94 ± 3.73 / 18.26 ± 2.84 52.54 ± 3.33 / 60.44 ± 3.13 0.34 ± 1.22 / 36.61 ± 1.57 43.55 ± 1.14 / 50.53 ± 1.40 61.19 ± 0.69 / 15.92 ± 0.24 10.74 ± 0.92 / 32.65 ± 0.68 4.83 ± 0.62 / 28.76 ± 0.55 12.5.2 11.0.0 12.1.0 12.5.0 12.5.0 12.1.0 12.1.0
AI-Sweden-Models/gpt-sw3-6.7b (few-shot) 7111 64 2048 True 2,285 ± 443 / 671 ± 205 3.67 18.83 ± 6.41 / 17.59 ± 4.55 53.68 ± 10.39 / 58.92 ± 10.87 3.49 ± 2.20 / 46.13 ± 4.13 49.81 ± 0.70 / 55.99 ± 0.69 61.05 ± 1.33 / 15.89 ± 0.85 1.22 ± 0.65 / 26.19 ± 0.64 0.60 ± 1.34 / 25.62 ± 0.72 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0 11.0.0
AI-Sweden-Models/gpt-sw3-1.3b (few-shot) 1445 64 2048 True 4,608 ± 988 / 1,115 ± 354 3.68 6.08 ± 5.75 / 8.77 ± 4.46 71.38 ± 1.76 / 73.21 ± 1.18 1.17 ± 1.07 / 49.78 ± 0.86 45.55 ± 0.85 / 51.69 ± 0.79 60.11 ± 1.59 / 15.02 ± 0.84 2.20 ± 0.88 / 25.62 ± 0.86 0.67 ± 1.39 / 25.25 ± 0.51 9.3.1 9.3.1 9.3.1 12.5.1 11.0.0 9.3.1 9.3.1
HPLT/gpt-7b-nordic-prerelease (few-shot) 7550 131 4096 True 5,404 ± 931 / 1,638 ± 542 3.69 27.07 ± 6.33 / 25.24 ± 4.89 61.96 ± 2.69 / 67.81 ± 2.27 2.65 ± 1.46 / 40.25 ± 4.08 46.16 ± 0.91 / 52.35 ± 0.87 55.11 ± 1.21 / 12.07 ± 0.32 0.32 ± 0.43 / 21.99 ± 0.56 -0.00 ± 0.01 / 25.00 ± 0.77 12.5.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2 12.3.2
Qwen/Qwen1.5-1.8B (few-shot) 1837 152 32768 True 5,666 ± 1,328 / 1,256 ± 408 3.69 18.01 ± 6.41 / 18.55 ± 4.65 51.91 ± 4.78 / 59.44 ± 4.65 1.49 ± 1.95 / 40.76 ± 4.07 44.83 ± 0.63 / 51.87 ± 0.72 54.82 ± 1.62 / 14.43 ± 0.68 11.54 ± 0.73 / 32.55 ± 0.60 7.19 ± 1.40 / 29.76 ± 1.22 12.5.2 10.0.1 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
norallm/normistral-7b-scratch (few-shot) 7248 33 2048 True 3,192 ± 454 / 1,198 ± 357 3.83 13.79 ± 8.46 / 14.43 ± 7.23 71.59 ± 2.78 / 59.82 ± 1.71 -0.89 ± 1.22 / 43.82 ± 3.45 38.33 ± 1.79 / 44.00 ± 1.70 55.77 ± 0.83 / 14.15 ± 0.51 -0.39 ± 1.21 / 22.30 ± 0.78 -0.52 ± 1.01 / 25.20 ± 0.85 10.0.0 10.0.0 10.0.0 10.0.0 11.0.0 10.0.1 10.0.1
AI-Sweden-Models/gpt-sw3-356m-instruct (few-shot) 471 64 2048 True 5,855 ± 1,373 / 1,223 ± 391 3.85 14.84 ± 1.63 / 15.90 ± 1.71 59.00 ± 3.60 / 54.09 ± 1.46 0.06 ± 1.21 / 34.76 ± 1.15 34.37 ± 1.36 / 40.44 ± 1.53 61.28 ± 0.92 / 14.60 ± 0.79 0.48 ± 1.07 / 23.44 ± 0.67 0.33 ± 0.50 / 25.01 ± 0.76 12.5.2 9.3.2 12.1.0 12.4.0 12.4.0 12.1.0 12.1.0
mhenrichsen/danskgpt-tiny-chat (few-shot) 1100 32 2048 False 1,745 ± 978 / 686 ± 159 3.85 27.31 ± 4.23 / 26.33 ± 4.40 45.94 ± 12.82 / 55.94 ± 8.25 -0.97 ± 1.64 / 36.69 ± 2.34 35.57 ± 2.45 / 41.66 ± 2.41 55.79 ± 0.24 / 10.61 ± 0.29 0.14 ± 1.02 / 24.76 ± 0.75 0.52 ± 0.83 / 25.53 ± 0.62 9.1.2 9.1.2 9.1.2 12.4.0 12.4.0 9.1.2 9.1.2
allenai/OLMo-7B (few-shot) 6888 50 2051 True 5,403 ± 1,133 / 1,294 ± 423 3.87 37.36 ± 2.11 / 28.59 ± 3.03 72.08 ± 1.20 / 63.52 ± 3.36 -0.86 ± 1.61 / 33.84 ± 0.59 45.16 ± 0.96 / 51.46 ± 0.93 41.03 ± 0.33 / 4.86 ± 0.09 -0.83 ± 1.04 / 25.47 ± 0.54 -0.62 ± 0.73 / 24.51 ± 0.53 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
RuterNorway/Llama-2-7b-chat-norwegian (few-shot) unknown 32 4096 False 10,890 ± 2,686 / 2,186 ± 750 3.96 22.38 ± 3.00 / 22.09 ± 2.85 31.11 ± 12.17 / 36.84 ± 11.52 0.09 ± 0.67 / 33.42 ± 0.30 44.36 ± 1.34 / 50.14 ± 1.15 55.44 ± 0.79 / 12.95 ± 0.51 1.12 ± 0.42 / 25.27 ± 0.68 -0.91 ± 0.96 / 24.26 ± 0.64 9.3.1 9.3.1 9.3.1 12.5.2 11.0.0 9.3.1 9.3.1
allenai/OLMo-7B-Twin-2T (few-shot) 6888 50 2051 True 5,484 ± 1,125 / 1,317 ± 425 3.99 20.49 ± 7.78 / 19.50 ± 6.82 70.04 ± 2.28 / 60.77 ± 3.00 2.28 ± 1.77 / 36.86 ± 3.97 45.85 ± 1.19 / 51.08 ± 1.21 39.53 ± 0.34 / 5.71 ± 0.10 0.69 ± 0.90 / 24.20 ± 0.89 0.12 ± 1.51 / 24.97 ± 1.28 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
google/gemma-2b-it (few-shot) 2506 256 8192 False 6,471 ± 1,142 / 1,961 ± 584 4.04 33.51 ± 2.12 / 23.48 ± 2.69 43.97 ± 1.64 / 57.41 ± 1.18 0.53 ± 1.09 / 39.60 ± 1.99 39.39 ± 1.04 / 47.28 ± 1.02 40.55 ± 6.41 / 11.10 ± 1.63 11.06 ± 0.98 / 31.69 ± 0.81 1.03 ± 0.85 / 25.55 ± 0.60 12.5.2 12.1.0 12.1.0 12.4.0 12.4.0 12.1.0 12.1.0
NbAiLab/nb-gpt-j-6B-alpaca (few-shot) 6055 50 1024 False 2,607 ± 592 / 680 ± 208 4.07 13.28 ± 4.32 / 13.40 ± 2.95 60.17 ± 8.39 / 65.99 ± 4.66 1.52 ± 1.94 / 45.19 ± 3.80 37.23 ± 1.07 / 46.83 ± 0.82 46.68 ± 0.33 / 12.40 ± 0.17 -0.03 ± 1.31 / 23.73 ± 1.11 0.02 ± 0.88 / 25.04 ± 0.61 9.3.1 10.0.1 10.0.1 12.4.0 12.4.0 10.0.1 10.0.1
Qwen/Qwen1.5-0.5B (few-shot) 620 152 32768 True 11,371 ± 2,924 / 2,122 ± 692 4.07 28.96 ± 2.39 / 26.49 ± 3.14 26.58 ± 5.12 / 28.64 ± 5.35 -1.88 ± 1.46 / 35.45 ± 2.92 34.59 ± 1.06 / 40.95 ± 1.11 53.36 ± 1.44 / 12.82 ± 0.58 6.52 ± 1.02 / 28.83 ± 0.78 1.91 ± 1.30 / 26.10 ± 0.65 12.5.2 10.0.1 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
Qwen/Qwen1.5-0.5B-Chat (few-shot) 620 152 32768 False 11,740 ± 3,000 / 2,209 ± 721 4.11 18.57 ± 4.62 / 17.69 ± 4.61 40.23 ± 5.86 / 49.01 ± 4.77 0.21 ± 1.06 / 39.60 ± 3.61 29.49 ± 2.47 / 35.01 ± 2.72 53.29 ± 6.52 / 13.04 ± 1.68 2.59 ± 0.72 / 26.87 ± 0.72 -0.84 ± 1.01 / 24.44 ± 0.61 12.5.2 11.0.0 12.1.0 12.4.0 12.5.0 12.1.0 12.1.0
AI-Sweden-Models/gpt-sw3-356m (few-shot) 471 64 2048 True 5,758 ± 1,348 / 1,215 ± 391 4.13 23.77 ± 3.70 / 23.06 ± 3.46 34.29 ± 11.64 / 36.76 ± 7.46 1.57 ± 1.70 / 40.84 ± 1.99 33.70 ± 1.46 / 38.82 ± 1.54 51.36 ± 2.01 / 10.76 ± 0.54 -0.96 ± 1.08 / 21.85 ± 0.45 0.30 ± 0.48 / 25.10 ± 0.69 9.3.1 9.3.1 9.3.2 12.5.1 11.0.0 9.3.2 9.3.2
mhenrichsen/danskgpt-tiny (few-shot) 1100 32 2048 True 8,597 ± 1,983 / 1,926 ± 600 4.14 23.92 ± 6.88 / 22.42 ± 6.73 31.93 ± 14.68 / 43.80 ± 8.79 0.46 ± 1.91 / 43.45 ± 3.64 30.81 ± 2.73 / 35.67 ± 2.95 52.68 ± 0.76 / 11.19 ± 0.36 -0.85 ± 1.05 / 24.38 ± 0.51 -1.24 ± 0.90 / 24.30 ± 0.63 0.0.0 0.0.0 0.0.0 12.5.1 11.0.0 0.0.0 0.0.0
AI-Sweden-Models/gpt-sw3-126m-instruct (few-shot) 186 64 2048 True 7,717 ± 1,553 / 2,013 ± 625 4.25 23.05 ± 2.31 / 24.35 ± 1.99 12.47 ± 7.10 / 23.03 ± 8.78 0.08 ± 0.16 / 33.34 ± 0.30 20.43 ± 2.69 / 24.25 ± 2.67 59.80 ± 0.93 / 14.56 ± 0.36 0.72 ± 0.72 / 23.30 ± 0.96 0.11 ± 0.91 / 25.15 ± 0.81 9.3.2 9.3.2 11.0.0 12.4.0 12.4.0 11.0.0 11.0.0
allenai/OLMo-1B (few-shot) 1177 50 2051 True 8,536 ± 1,926 / 1,940 ± 619 4.32 29.39 ± 3.08 / 29.93 ± 3.14 38.95 ± 11.78 / 43.61 ± 8.46 -1.35 ± 1.76 / 40.70 ± 4.25 17.85 ± 3.77 / 20.30 ± 4.04 43.75 ± 0.28 / 4.67 ± 0.12 -0.22 ± 0.80 / 23.76 ± 0.84 0.75 ± 1.00 / 25.27 ± 0.56 12.5.2 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0 12.1.0
RJuro/kanelsnegl-v0.1 (few-shot) 7242 32 512 True 5,847 ± 1,029 / 1,640 ± 525 4.50 0.00 ± 0.00 / 0.00 ± 0.00 34.63 ± 9.69 / 40.92 ± 6.88 0.00 ± 0.00 / 33.30 ± 0.27 0.00 ± 0.00 / 8.92 ± 2.90 59.04 ± 0.07 / 10.84 ± 0.09 -0.25 ± 0.97 / 21.96 ± 0.57 0.08 ± 0.78 / 24.93 ± 0.77 9.3.1 9.3.1 9.3.1 12.5.1 11.0.0 9.3.1 9.3.1
RJuro/kanelsnegl-v0.2 (few-shot) 7242 32 512 True 1,373 ± 120 / 709 ± 172 4.56 0.00 ± 0.00 / 0.00 ± 0.00 28.62 ± 12.67 / 35.36 ± 8.35 0.00 ± 0.00 / 33.30 ± 0.27 0.00 ± 0.00 / 19.59 ± 6.84 58.16 ± 0.07 / 8.81 ± 0.07 0.47 ± 0.86 / 22.03 ± 0.59 0.71 ± 0.64 / 25.02 ± 0.72 10.0.1 10.0.1 10.0.1 10.0.1 11.0.0 10.0.1 11.0.0
NorwAI/NorwAI-Mistral-7B-pretrain (few-shot) 7537 68 4065 True 3,024 ± 496 / 909 ± 301 4.58 9.75 ± 3.30 / 9.18 ± 3.19 17.76 ± 4.89 / 28.16 ± 7.50 1.22 ± 0.95 / 43.54 ± 3.79 14.98 ± 2.49 / 18.46 ± 2.99 48.74 ± 1.72 / 10.30 ± 0.36 -0.62 ± 1.15 / 22.04 ± 0.67 0.99 ± 1.35 / 25.36 ± 0.92 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5 12.10.5
AI-Sweden-Models/gpt-sw3-126m (few-shot) 186 64 2048 True 8,958 ± 1,815 / 2,240 ± 696 4.61 5.66 ± 4.11 / 8.37 ± 3.24 8.15 ± 8.87 / 24.31 ± 7.12 -0.81 ± 1.16 / 36.81 ± 2.47 16.40 ± 2.88 / 19.18 ± 3.18 51.48 ± 1.14 / 10.63 ± 0.31 -0.49 ± 0.60 / 22.53 ± 0.75 1.17 ± 0.86 / 25.54 ± 0.87 9.2.0 9.2.0 9.2.0 12.5.1 11.0.0 9.2.0 9.2.0
NbAiLab/nb-gpt-j-6B-v2 (few-shot) 6051 50 1024 False 2,556 ± 580 / 681 ± 214 4.90 0.31 ± 0.55 / 0.29 ± 0.50 27.42 ± 12.16 / 38.74 ± 10.05 0.07 ± 1.06 / 35.80 ± 1.73 17.82 ± 11.21 / 31.12 ± 8.39 27.09 ± 0.29 / 6.80 ± 0.12 -0.67 ± 0.81 / 22.55 ± 0.71 0.86 ± 0.82 / 25.38 ± 0.51 9.3.1 10.0.1 11.0.0 12.5.1 11.0.0 11.0.0 11.0.0
peter-sk/gpt-neox-da (few-shot) 1515 50 1024 True 6,025 ± 1,442 / 1,342 ± 431 4.93 0.26 ± 0.16 / 0.26 ± 0.14 4.75 ± 2.54 / 27.85 ± 1.59 -0.60 ± 1.56 / 40.53 ± 2.93 0.06 ± 0.09 / 1.07 ± 0.35 41.84 ± 0.24 / 5.74 ± 0.09 -0.41 ± 1.39 / 24.48 ± 0.97 0.52 ± 0.81 / 25.32 ± 0.65 10.0.1 10.0.1 10.0.1 10.0.1 11.0.0 10.0.1 10.0.1
NbAiLab/nb-gpt-j-6B@sharded (few-shot) unknown 50 1024 True 2,630 ± 605 / 684 ± 217 4.98 0.01 ± 0.02 / 0.11 ± 0.12 33.50 ± 13.13 / 39.30 ± 11.93 -0.02 ± 0.60 / 34.92 ± 2.99 4.79 ± 3.55 / 18.06 ± 2.80 26.97 ± 0.41 / 6.56 ± 0.18 -0.11 ± 1.16 / 23.32 ± 0.92 0.56 ± 1.22 / 24.79 ± 0.91 9.3.1 10.0.1 10.0.1 12.5.1 11.0.0 10.0.1 10.0.1
NorGLM/NorGPT-369M (few-shot) unknown 64 1024 True 19,896 ± 5,099 / 3,848 ± 1,251 5.02 1.47 ± 1.90 / 1.32 ± 1.69 5.50 ± 4.49 / 28.77 ± 3.76 -2.19 ± 1.29 / 40.52 ± 3.02 0.10 ± 0.06 / 4.36 ± 0.44 37.40 ± 0.61 / 6.53 ± 0.13 -0.53 ± 1.01 / 24.38 ± 1.08 0.25 ± 1.22 / 25.23 ± 0.73 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2 12.5.2
ai-forever/mGPT (few-shot) unknown 100 1024 True 11,734 ± 3,124 / 2,174 ± 720 5.08 0.00 ± 0.00 / 0.00 ± 0.00 0.00 ± 0.00 / 19.32 ± 0.16 0.49 ± 1.29 / 39.12 ± 3.92 6.24 ± 3.13 / 7.85 ± 3.67 31.89 ± 0.27 / 2.03 ± 0.10 -0.37 ± 1.08 / 22.43 ± 0.55 0.36 ± 0.83 / 25.08 ± 0.77 9.3.1 10.0.1 11.0.0 12.5.1 12.0.0 11.0.0 11.0.0
Sigurdur/icebreaker (few-shot) 110 32 1024 False 48,619 ± 7,681 / 13,831 ± 4,404 5.11 0.00 ± 0.00 / 0.00 ± 0.00 -3.60 ± 3.63 / 20.29 ± 1.99 0.00 ± 0.00 / 33.30 ± 0.27 0.00 ± 0.00 / 0.05 ± 0.03 39.68 ± 0.08 / 1.23 ± 0.02 -0.20 ± 0.77 / 24.13 ± 0.67 -0.25 ± 0.67 / 24.68 ± 0.44 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0 12.7.0