https://git.altlinux.org/tasks/409907/logs/events.1.1.log
https://packages.altlinux.org/tasks/409907

subtask  name       aarch64  i586  x86_64
   #100  llama.cpp     5:51     -    5:06

2026-Mar-03 22:36:41 :: test-only task #409907 for sisyphus started by vt:
#100 build 8192-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Mar-03 22:36:39
2026-Mar-03 22:36:43 :: [x86_64] #100 llama.cpp.git 8192-alt1: build start
2026-Mar-03 22:36:43 :: [i586] #100 llama.cpp.git 8192-alt1: build start
2026-Mar-03 22:36:43 :: [aarch64] #100 llama.cpp.git 8192-alt1: build start
2026-Mar-03 22:36:50 :: [i586] #100 llama.cpp.git 8192-alt1: build SKIPPED
build/100/x86_64/log:[00:02:14] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:02:14] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Mar-03 22:41:49 :: [x86_64] #100 llama.cpp.git 8192-alt1: build OK
2026-Mar-03 22:42:34 :: [aarch64] #100 llama.cpp.git 8192-alt1: build OK
2026-Mar-03 22:42:43 :: 100: build check OK
2026-Mar-03 22:42:44 :: build check OK
2026-Mar-03 22:42:58 :: #100: llama.cpp.git 8192-alt1: version check OK
2026-Mar-03 22:42:58 :: build version check OK
--- llama.cpp-cpu-8192-alt1.x86_64.rpm.share    2026-03-03 22:43:01.454818841 +0000
+++ llama.cpp-cpu-8192-alt1.aarch64.rpm.share   2026-03-03 22:43:02.534828938 +0000
@@ -8,3 +8,3 @@
 /usr/share/doc/llama.cpp/README.md     100644  UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text
 /usr/share/doc/llama.cpp/docs  40755   directory
warning (#100): non-identical /usr/share part
2026-Mar-03 22:43:16 :: noarch check OK
2026-Mar-03 22:43:18 :: plan: src +1 -1 =21664, aarch64 +8 -8 =38437, x86_64 +10 -10 =39454
#100 llama.cpp 8018-alt1 -> 1:8192-alt1
 Tue Mar 03 2026 Vitaly Chikunov <vt@altlinux> 1:8192-alt1
 - Update to b8192 (2026-03-03).
2026-Mar-03 22:44:00 :: patched apt indices
2026-Mar-03 22:44:09 :: created next repo
2026-Mar-03 22:44:19 :: duplicate provides check OK
2026-Mar-03 22:45:01 :: dependencies check OK
2026-Mar-03 22:45:37 :: [x86_64 aarch64] ELF symbols check OK
2026-Mar-03 22:45:48 :: [x86_64] #100 libllama: install check OK
2026-Mar-03 22:45:53 :: [x86_64] #100 libllama-debuginfo: install check OK
2026-Mar-03 22:45:56 :: [aarch64] #100 libllama: install check OK
        x86_64: libllama-devel=1:8192-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-03 22:45:58 :: [x86_64] #100 libllama-devel: install check OK
2026-Mar-03 22:46:06 :: [aarch64] #100 libllama-debuginfo: install check OK
2026-Mar-03 22:46:15 :: [x86_64] #100 llama.cpp: install check OK
        aarch64: libllama-devel=1:8192-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-03 22:46:16 :: [aarch64] #100 libllama-devel: install check OK
2026-Mar-03 22:46:21 :: [x86_64] #100 llama.cpp-cpu: install check OK
2026-Mar-03 22:46:27 :: [aarch64] #100 llama.cpp: install check OK
2026-Mar-03 22:46:30 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Mar-03 22:46:38 :: [aarch64] #100 llama.cpp-cpu: install check OK
2026-Mar-03 22:46:47 :: [x86_64] #100 llama.cpp-cuda: install check OK
2026-Mar-03 22:46:54 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Mar-03 22:47:06 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2026-Mar-03 22:47:06 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2026-Mar-03 22:47:11 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2026-Mar-03 22:47:19 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Mar-03 22:47:20 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Mar-03 22:47:36 :: [x86_64-i586] generated apt indices
2026-Mar-03 22:47:36 :: [x86_64-i586] created next repo
2026-Mar-03 22:47:46 :: [x86_64-i586] dependencies check OK
2026-Mar-03 22:47:47 :: gears inheritance check OK
2026-Mar-03 22:47:48 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2026-Mar-03 22:47:48 :: acl check OK
2026-Mar-03 22:48:00 :: created contents_index files
2026-Mar-03 22:48:08 :: created hash files: aarch64 src x86_64
2026-Mar-03 22:48:11 :: task #409907 for sisyphus TESTED
_______________________________________________
Sisyphus-incominger mailing list
[email protected]
https://lists.altlinux.org/mailman/listinfo/sisyphus-incominger