Package: python3-mpi4py
Version: 4.0.0-2
Severity: serious
Tags: ftbfs
Justification: FTBFS
Control: forwarded -1 https://github.com/mpi4py/mpi4py/issues/545
mpi4py 4 was building fine in experimental but shows test errors in unstable:

testTestAll (test_util_pkl5.TestPKL5World.testTestAll) ... ok
testWaitAll (test_util_pkl5.TestPKL5World.testWaitAll) ... ok
test_apply (test_util_pool.TestProcessPool.test_apply) ... [sbuild:00243] 1 more process has sent help message help-mca-bml-r2.txt / unreachable proc
[sbuild:00483] [[13078,42],0] ORTE_ERROR_LOG: Unreachable in file ../../../ompi/dpm/dpm.c at line 493
[sbuild:00242] [[13078,1],0] ORTE_ERROR_LOG: Unreachable in file ../../../ompi/dpm/dpm.c at line 493
Exception in thread Thread-6 (_manager_spawn):
Traceback (most recent call last):
  File "/usr/lib/python3.12/threading.py", line 1075, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.12/threading.py", line 1012, in run
    self._target(*self._args, **self._kwargs)
  File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/mpi4py/futures/_core.py", line 350, in _manager_spawn
    comm = serialized(client_spawn)(pyexe, pyargs, nprocs, info)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/mpi4py/futures/_core.py", line 1058, in client_spawn
    comm = MPI.COMM_SELF.Spawn(python_exe, args, max_workers, info)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "src/mpi4py/MPI.src/Comm.pyx", line 2544, in mpi4py.MPI.Intracomm.Spawn
    with nogil: CHKERR( MPI_Comm_spawn(
mpi4py.MPI.Exception: MPI_ERR_INTERN: internal error
[sbuild:00243] 1 more process has sent help message help-mca-bml-r2.txt / unreachable proc
[sbuild:00243] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:startup:internal-failure
[sbuild:00243] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal unknown handle
E: Build killed with signal TERM after 150 minutes of inactivity

The error is intermittent, indicating a race condition (the build may pass on retry). Upstream acknowledges that spawn is flaky under OpenMPI 4 (and likely fixed in OpenMPI 5), but the suggested workaround, MPI4PY_TEST_SPAWN=false, does not prevent the spawn tests from launching. A minimal reproducer sketch is appended after the system information below.

-- System Information:
Debian Release: trixie/sid
  APT prefers unstable-debug
  APT policy: (500, 'unstable-debug'), (500, 'unstable'), (1, 'experimental')
Architecture: amd64 (x86_64)
Foreign Architectures: i386

Kernel: Linux 6.10.6-amd64 (SMP w/8 CPU threads; PREEMPT)
Locale: LANG=en_AU.UTF-8, LC_CTYPE=en_AU.UTF-8 (charmap=UTF-8), LANGUAGE=en_AU:en
Shell: /bin/sh linked to /usr/bin/dash
Init: systemd (via /run/systemd/system)
LSM: AppArmor: enabled

Versions of packages python3-mpi4py depends on:
ii  libc6            2.40-2
ii  libopenmpi3t64   4.1.6-13.3
ii  mpi-default-bin  1.17
ii  python3          3.12.5-1

python3-mpi4py recommends no packages.

Versions of packages python3-mpi4py suggests:
ii  python3-numpy  1:1.26.4+ds-11

-- no debconf information
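
For reference, the failing spawn path can be exercised outside the test suite with something like the sketch below. This is an assumption-laden reproducer, not the packaged test code: the file name is hypothetical and the -c child payload merely stands in for the mpi4py.futures worker; the Spawn call itself mirrors the one in mpi4py/futures/_core.py (client_spawn) from the traceback above.

  # spawn_repro.py -- hypothetical minimal reproducer; run under the
  # build's MPI with: mpiexec -n 1 python3 spawn_repro.py
  import sys
  from mpi4py import MPI

  # Spawn child interpreters from COMM_SELF, the same call that raises
  # MPI_ERR_INTERN intermittently during the build. The child simply
  # connects back to the parent and disconnects.
  child = MPI.COMM_SELF.Spawn(
      sys.executable,
      args=['-c',
            'from mpi4py import MPI; MPI.Comm.Get_parent().Disconnect()'],
      maxprocs=2,
  )
  child.Disconnect()

Running this in a loop under OpenMPI 4 should occasionally reproduce the ORTE "Unreachable" failure; since the futures tests reach the equivalent Spawn call regardless of MPI4PY_TEST_SPAWN=false, setting that variable does not avoid it.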