On Thu, 2025-02-06 at 01:33 +0100, Petter Reinholdtsen wrote:
> I was sad to discover the server example is missing, as it is the
> llama.cpp program I use the most. Without it, I will have to continue
> using my own build.
I second this. llama-server is also the service endpoint for DebGPT.

I pushed a fix for ppc64el. The hwcaps mechanism works correctly for
power9, given that the baseline is power8:

(chroot:unstable-ppc64el-sbuild) root@debian-project-1 /h/d/llama.cpp.pkg [1]# ldd (which llama-cli)
	linux-vdso64.so.1 (0x00007fffa4810000)
	libeatmydata.so => /lib/powerpc64le-linux-gnu/libeatmydata.so (0x00007fffa4600000)
	libllama.so => /usr/lib/powerpc64le-linux-gnu/llama.cpp/glibc-hwcaps/power9/libllama.so (0x00007fffa4450000)
	libggml.so => /usr/lib/powerpc64le-linux-gnu/llama.cpp/glibc-hwcaps/power9/libggml.so (0x00007fffa4420000)
	libggml-base.so => /usr/lib/powerpc64le-linux-gnu/llama.cpp/glibc-hwcaps/power9/libggml-base.so (0x00007fffa4330000)
	libstdc++.so.6 => /lib/powerpc64le-linux-gnu/libstdc++.so.6 (0x00007fffa3fd0000)
	libm.so.6 => /lib/powerpc64le-linux-gnu/libm.so.6 (0x00007fffa3e90000)
	libgcc_s.so.1 => /lib/powerpc64le-linux-gnu/libgcc_s.so.1 (0x00007fffa3e50000)
	libc.so.6 => /lib/powerpc64le-linux-gnu/libc.so.6 (0x00007fffa3be0000)
	/lib64/ld64.so.2 (0x00007fffa4820000)
	libggml-cpu.so => /usr/lib/powerpc64le-linux-gnu/llama.cpp/glibc-hwcaps/power9/libggml-cpu.so (0x00007fffa3b20000)
	libgomp.so.1 => /lib/powerpc64le-linux-gnu/libgomp.so.1 (0x00007fffa3a90000)
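
For anyone who wants to double-check the hwcaps resolution on their own
machine, something along these lines should work (a sketch, assuming a
glibc new enough, >= 2.33, to support glibc-hwcaps, and that llama-cli
accepts --version):

	# Ask the dynamic loader which glibc-hwcaps subdirectories it
	# searches on this CPU, in priority order:
	/lib64/ld64.so.2 --help | grep -A5 'glibc-hwcaps'

	# Trace library resolution to confirm the power9 variants are
	# the ones actually picked up at run time:
	LD_DEBUG=libs llama-cli --version 2>&1 | grep glibc-hwcaps

On a power8-only box the loader should simply skip the power9
subdirectory and fall back to the baseline libraries.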