Your message dated Mon, 29 Apr 2024 09:53:21 +0200
with message-id <zi9r8tnmmbgzw...@smaug.dr-blatt.de>
and subject line OpenMPI bug resolved
has caused the Debian Bug report #1069458,
regarding dune-uggrid: FTBFS on armhf: tests fail
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case it is now your responsibility to reopen the
Bug report if necessary, and/or fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact ow...@bugs.debian.org
immediately.)


-- 
1069458: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1069458
Debian Bug Tracking System
Contact ow...@bugs.debian.org with problems
--- Begin Message ---
Source: dune-uggrid
Version: 2.9.0-2
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: lu...@debian.org
Usertags: ftbfs-20240420 ftbfs-trixie ftbfs-t64-armhf

Hi,

During a rebuild of all packages in sid, your package failed to build
on armhf.


Relevant part (hopefully):
> make[5]: Entering directory '/<<PKGBUILDDIR>>/build'
> make[5]: Nothing to be done for 'CMakeFiles/build_tests.dir/build'.
> make[5]: Leaving directory '/<<PKGBUILDDIR>>/build'
> [100%] Built target build_tests
> make[4]: Leaving directory '/<<PKGBUILDDIR>>/build'
> /usr/bin/cmake -E cmake_progress_start /<<PKGBUILDDIR>>/build/CMakeFiles 0
> make[3]: Leaving directory '/<<PKGBUILDDIR>>/build'
> make[2]: Leaving directory '/<<PKGBUILDDIR>>/build'
> cd build; PATH=/<<PKGBUILDDIR>>/debian/tmp-test:$PATH /usr/bin/dune-ctest 
>    Site: ip-10-84-234-180
>    Build name: Linux-c++
> Create new tag: 20240420-0421 - Experimental
> Test project /<<PKGBUILDDIR>>/build
>     Start 1: rm3-tetrahedron-rules-test
> 1/3 Test #1: rm3-tetrahedron-rules-test .......***Failed    0.02 sec
> --------------------------------------------------------------------------
> Sorry!  You were supposed to get help about:
>     pmix_init:startup:internal-failure
> But I couldn't open the help file:
>     /usr/share/pmix/help-pmix-runtime.txt: No such file or directory.  Sorry!
> --------------------------------------------------------------------------
> [ip-10-84-234-180:3635872] PMIX ERROR: NOT-FOUND in file 
> ../../../../../../../../opal/mca/pmix/pmix3x/pmix/src/server/pmix_server.c at 
> line 237
> [ip-10-84-234-180:3635871] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to 
> start a daemon on the local node in file 
> ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 716
> [ip-10-84-234-180:3635871] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to 
> start a daemon on the local node in file 
> ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 172
> --------------------------------------------------------------------------
> It looks like orte_init failed for some reason; your parallel process is
> likely to abort.  There are many reasons that a parallel process can
> fail during orte_init; some of which are due to configuration or
> environment problems.  This failure appears to be an internal failure;
> here's some additional information (which may only be relevant to an
> Open MPI developer):
> 
>   orte_ess_init failed
>   --> Returned value Unable to start a daemon on the local node (-127) 
> instead of ORTE_SUCCESS
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> It looks like MPI_INIT failed for some reason; your parallel process is
> likely to abort.  There are many reasons that a parallel process can
> fail during MPI_INIT; some of which are due to configuration or environment
> problems.  This failure appears to be an internal failure; here's some
> additional information (which may only be relevant to an Open MPI
> developer):
> 
>   ompi_mpi_init: ompi_rte_init failed
>   --> Returned "Unable to start a daemon on the local node" (-127) instead of 
> "Success" (0)
> --------------------------------------------------------------------------
> *** An error occurred in MPI_Init
> *** on a NULL communicator
> *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
> ***    and potentially your MPI job)
> [ip-10-84-234-180:3635871] Local abort before MPI_INIT completed completed 
> successfully, but am not able to aggregate error messages, and not able to 
> guarantee that all other processes were killed!
> 
>     Start 2: test-fifo
> 2/3 Test #2: test-fifo ........................   Passed    0.00 sec
>     Start 3: testbtree
> 3/3 Test #3: testbtree ........................   Passed    0.00 sec
> 
> 67% tests passed, 1 tests failed out of 3
> 
> Total Test time (real) =   0.03 sec
> 
> The following tests FAILED:
>         1 - rm3-tetrahedron-rules-test (Failed)
> Errors while running CTest
> ======================================================================
> Name:      rm3-tetrahedron-rules-test
> FullName:  ./dune/uggrid/gm/rm3-tetrahedron-rules-test
> Status:    FAILED
> 
> JUnit report for CTest results written to 
> /<<PKGBUILDDIR>>/build/junit/cmake.xml
> make[1]: *** [/usr/share/dune/dune-debian.mk:39: override_dh_auto_test] Error 
> 1


The full build log is available from:
http://qa-logs.debian.net/2024/04/20/dune-uggrid_2.9.0-2_unstable-armhf.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20240420;users=lu...@debian.org
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20240420&fusertaguser=lu...@debian.org&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.

--- End Message ---
--- Begin Message ---
Hi,

This bug is actually the same as bug 1069433 [1] in src:openmpi. That bug has
been fixed with the upload of version 4.1.6-13 to unstable.

Closing it here.

Best,

Markus

[1] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1069433

--- End Message ---