Your message dated Sun, 18 Feb 2018 16:36:56 +0000
with message-id <e1enrxq-000cnx...@fasolo.debian.org>
and subject line Bug#888879: fixed in rheolef 6.7-6
has caused the Debian Bug report #888879,
regarding rheolef FTBFS on several architectures: test runs forever
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case it is now your responsibility to reopen the
Bug report if necessary, and/or fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact ow...@bugs.debian.org
immediately.)


-- 
888879: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=888879
Debian Bug Tracking System
Contact ow...@bugs.debian.org with problems
--- Begin Message ---
Source: rheolef
Version: 6.7-5
Severity: serious

https://buildd.debian.org/status/package.php?p=rheolef&suite=sid

...
      mpirun -np 1 ./form_mass_bdr_tst -app P2 -weight yz -I my_cube_TP-5-v2 left >/dev/null 2>/dev/null
      mpirun -np 2 ./form_mass_bdr_tst -app P2 -weight yz -I my_cube_TP-5-v2 left >/dev/null 2>/dev/null
      mpirun -np 3 ./form_mass_bdr_tst -app P2 -weight yz -I my_cube_TP-5-v2 left >/dev/null 2>/dev/null
      mpirun -np 1 ./form_mass_bdr_tst -app P2 -weight yz -I my_cube_TP-5-v2 right >/dev/null 2>/dev/null
      mpirun -np 2 ./form_mass_bdr_tst -app P2 -weight yz -I my_cube_TP-5-v2 right >/dev/null 2>/dev/null
E: Build killed with signal TERM after 150 minutes of inactivity


I've reproduced this on i386: two processes run forever
(aborted after 6 hours on a fast CPU) at 100% CPU.

Backtraces:

Thread 3 (Thread 0xf50ffb40 (LWP 29032)):
#0  0xf7ed6db9 in __kernel_vsyscall ()
#1  0xf70fabd3 in __GI___poll (fds=0xf47005d0, nfds=2, timeout=3600000) at 
../sysdeps/unix/sysv/linux/poll.c:29
#2  0xf5caed4a in poll (__timeout=3600000, __nfds=2, __fds=0xf47005d0) at 
/usr/include/i386-linux-gnu/bits/poll2.h:46
#3  poll_dispatch (base=0x578eb9c0, tv=0xf50f9bfc) at poll.c:165
#4  0xf5ca59e9 in opal_libevent2022_event_base_loop (base=<optimized out>, 
flags=<optimized out>) at event.c:1630
#5  0xf5c6b3bd in progress_engine (obj=0x578eb950) at 
runtime/opal_progress_threads.c:105
#6  0xf5df6316 in start_thread (arg=0xf50ffb40) at pthread_create.c:465
#7  0xf7105296 in clone () at ../sysdeps/unix/sysv/linux/i386/clone.S:108

Thread 2 (Thread 0xf5ac5b40 (LWP 29031)):
#0  0xf7ed6db9 in __kernel_vsyscall ()
#1  0xf71053fa in __GI_epoll_pwait (epfd=7, events=0x578ea930, maxevents=32, 
timeout=-1, set=0x0)
    at ../sysdeps/unix/sysv/linux/epoll_pwait.c:42
#2  0xf710569a in epoll_wait (epfd=7, events=0x578ea930, maxevents=32, 
timeout=-1)
    at ../sysdeps/unix/sysv/linux/epoll_wait.c:30
#3  0xf5ca199a in epoll_dispatch (base=0x578ea7a0, tv=0x0) at epoll.c:407
#4  0xf5ca59e9 in opal_libevent2022_event_base_loop (base=<optimized out>, 
flags=<optimized out>) at event.c:1630
#5  0xf5af23eb in progress_engine (obj=0x578ea7a0) at 
src/util/progress_threads.c:52
#6  0xf5df6316 in start_thread (arg=0xf5ac5b40) at pthread_create.c:465
#7  0xf7105296 in clone () at ../sysdeps/unix/sysv/linux/i386/clone.S:108

Thread 1 (Thread 0xf5b4fe00 (LWP 29002)):
#0  0xf7ed67f5 in ?? ()
#1  0xf7ed6b43 in __vdso_clock_gettime ()
#2  0xf7112961 in __GI___clock_gettime (clock_id=1, tp=0xffb74194) at 
../sysdeps/unix/clock_gettime.c:115
#3  0xf5cc3297 in opal_timer_linux_get_usec_clock_gettime () at 
timer_linux_component.c:197
#4  0xf5c669c3 in opal_progress () at runtime/opal_progress.c:197
#5  0xf74b5e05 in sync_wait_st (sync=<optimized out>) at 
../opal/threads/wait_sync.h:80
#6  ompi_request_default_wait_all (count=2, requests=0xffb742e4, statuses=0x0) 
at request/req_wait.c:221
#7  0xf750640d in ompi_coll_base_allreduce_intra_recursivedoubling 
(sbuf=0x57951030, rbuf=0x57a9b400, count=2, 
    dtype=0xf7565140 <ompi_mpi_unsigned>, op=0xf7573e60 <ompi_mpi_op_sum>, 
comm=0xf7569520 <ompi_mpi_comm_world>, 
    module=0x57976fa0) at base/coll_base_allreduce.c:225
#8  0xe991f640 in ompi_coll_tuned_allreduce_intra_dec_fixed (sbuf=0x57951030, 
rbuf=0x57a9b400, count=2, 
    dtype=0xf7565140 <ompi_mpi_unsigned>, op=0xf7573e60 <ompi_mpi_op_sum>, 
comm=0xf7569520 <ompi_mpi_comm_world>, 
    module=0x57976fa0) at coll_tuned_decision_fixed.c:66
#9  0xf74c5b77 in PMPI_Allreduce (sendbuf=0x57951030, recvbuf=0x57a9b400, 
count=2, 
    datatype=0xf7565140 <ompi_mpi_unsigned>, op=0xf7573e60 <ompi_mpi_op_sum>, 
comm=0xf7569520 <ompi_mpi_comm_world>)
    at pallreduce.c:107
#10 0xf7b476cf in boost::mpi::detail::all_reduce_impl<unsigned int, 
std::plus<unsigned int> > (comm=..., 
    in_values=0x57951030, n=n@entry=2, out_values=0x57a9b400) at 
/usr/include/boost/mpi/collectives/all_reduce.hpp:36
#11 0xf7b58fc0 in boost::mpi::all_reduce<unsigned int, std::plus<unsigned int> 
> (out_values=<optimized out>, n=2, 
    in_values=<optimized out>, comm=..., op=...) at 
/usr/include/boost/mpi/collectives/all_reduce.hpp:93
#12 rheolef::mpi_assembly_begin<std::multimap<unsigned int, unsigned int, 
std::less<unsigned int>, rheolef::heap_allocator<std::pair<unsigned int, 
unsigned int> > >, rheolef::disarray_rep<rheolef::index_set, 
rheolef::distributed, std::allocator<rheolef::index_set> >::message_type, 
rheolef::apply_iterator<std::_Rb_tree_iterator<std::pair<unsigned int const, 
unsigned int> >, rheolef::first_op<std::pair<unsigned int const, unsigned int> 
> > > (stash=..., first_stash_idx=..., 
    last_stash_idx=..., ownership=..., receive=..., send=...) at 
../../include/rheolef/mpi_assembly_begin.h:113
#13 0xf7b5a346 in rheolef::disarray_rep<rheolef::index_set, 
rheolef::distributed, std::allocator<rheolef::index_set> 
>::dis_entry_assembly_begin<rheolef::index_set_add_op<rheolef::index_set> > 
(this=0x57acab70, my_set_op=...)
    at ../../include/rheolef/disarray_mpi.icc:223
#14 rheolef::disarray<rheolef::index_set, rheolef::distributed, 
std::allocator<rheolef::index_set> >::dis_entry_assembly_begin (this=<optimized 
out>) at ../../include/rheolef/disarray.h:592
#15 rheolef::disarray<rheolef::index_set, rheolef::distributed, 
std::allocator<rheolef::index_set> >::dis_entry_assembly (this=<optimized out>) 
at ../../include/rheolef/disarray.h:594
#16 rheolef::geo_rep<double, rheolef::distributed>::set_element_side_index 
(this=<optimized out>, 
    side_dim=<optimized out>) at geo_mpi_get.cc:461
#17 0xf7b5f25a in rheolef::geo_rep<double, rheolef::distributed>::get 
(this=<optimized out>, ips=...)
    at geo_mpi_get.cc:965
#18 0xf7b60a48 in rheolef::geo_rep<double, rheolef::distributed>::load 
(this=<optimized out>, filename=..., comm=...)
    at geo_mpi_get.cc:989
#19 0xf7b3030a in rheolef::geo_load<double, rheolef::distributed> (name=...) at 
geo.cc:172
#20 0x56592bf8 in rheolef::geo_basic<double, rheolef::distributed>::geo_basic 
(comm=..., 
    
name="\360\265\225W\030\000\000\000\030\000\000\000\227@YV\320<ZV\002\000\000\000\000\060\bn\003\000\000\000y+YV@K\267\377\000\000\000\000\000\360\035\367\000\000\000\000\000\000\000\000\203w\002\367\000\360\035\367\000\360\035\367\000\000\000\000\203w\002\367\003\000\000\000\324K\267\377\344K\267\377dK\267\377\003\000\000\000\324K\267\377\000\360\035\367\352w\356\367\001\000\000\000\000\000\000\000\000\360\035\367\000\000\000\000\000\000\000\000Tl\324\tEf\254c",
 '\000' <repeats 12 times>, 
"\320K\267\377\000\200\355\367\354\202\355\367\350\210\355\367\003\000\000\000\320<ZV\003\000\000\000\310@YV\000\000\000\000\371@YV`+YV\003\000\000\000\324K\267\377"...,
 this=0xffb74ac0)
    at ../../include/rheolef/geo.h:1460
#21 main (argc=<optimized out>, argv=<optimized out>) at space_tst.cc:26




Thread 3 (Thread 0xf51ffb40 (LWP 29033)):
#0  0xf7fb7db9 in __kernel_vsyscall ()
#1  0xf71dbbd3 in __GI___poll (fds=0xf48005d0, nfds=2, timeout=3600000) at 
../sysdeps/unix/sysv/linux/poll.c:29
#2  0xf5d8fd4a in poll (__timeout=3600000, __nfds=2, __fds=0xf48005d0) at 
/usr/include/i386-linux-gnu/bits/poll2.h:46
#3  poll_dispatch (base=0x57eef9c0, tv=0xf51f9bfc) at poll.c:165
#4  0xf5d869e9 in opal_libevent2022_event_base_loop (base=<optimized out>, 
flags=<optimized out>) at event.c:1630
#5  0xf5d4c3bd in progress_engine (obj=0x57eef950) at 
runtime/opal_progress_threads.c:105
#6  0xf5ed7316 in start_thread (arg=0xf51ffb40) at pthread_create.c:465
#7  0xf71e6296 in clone () at ../sysdeps/unix/sysv/linux/i386/clone.S:108

Thread 2 (Thread 0xf5ba6b40 (LWP 29030)):
#0  0xf7fb7db9 in __kernel_vsyscall ()
#1  0xf71e63fa in __GI_epoll_pwait (epfd=7, events=0x57eee930, maxevents=32, 
timeout=-1, set=0x0) at ../sysdeps/unix/sysv/linux/epoll_pwait.c:42
#2  0xf71e669a in epoll_wait (epfd=7, events=0x57eee930, maxevents=32, 
timeout=-1) at ../sysdeps/unix/sysv/linux/epoll_wait.c:30
#3  0xf5d8299a in epoll_dispatch (base=0x57eee7a0, tv=0x0) at epoll.c:407
#4  0xf5d869e9 in opal_libevent2022_event_base_loop (base=<optimized out>, 
flags=<optimized out>) at event.c:1630
#5  0xf5bd33eb in progress_engine (obj=0x57eee7a0) at 
src/util/progress_threads.c:52
#6  0xf5ed7316 in start_thread (arg=0xf5ba6b40) at pthread_create.c:465
#7  0xf71e6296 in clone () at ../sysdeps/unix/sysv/linux/i386/clone.S:108

Thread 1 (Thread 0xf5c30e00 (LWP 29003)):
#0  0xf7fb77f5 in ?? ()
#1  0xf7fb7b43 in __vdso_clock_gettime ()
#2  0xf71f3961 in __GI___clock_gettime (clock_id=1, tp=0xffa8c4b4) at 
../sysdeps/unix/clock_gettime.c:115
#3  0xf5da4297 in opal_timer_linux_get_usec_clock_gettime () at 
timer_linux_component.c:197
#4  0xf5d479c3 in opal_progress () at runtime/opal_progress.c:197
#5  0xf7596e05 in sync_wait_st (sync=<optimized out>) at 
../opal/threads/wait_sync.h:80
#6  ompi_request_default_wait_all (count=2, requests=0xffa8c604, statuses=0x0) 
at request/req_wait.c:221
#7  0xf75e740d in ompi_coll_base_allreduce_intra_recursivedoubling 
(sbuf=0x5809b980, rbuf=0x580805b0, count=139, 
    dtype=0xf7646140 <ompi_mpi_unsigned>, op=0xf7655660 <ompi_mpi_op_max>, 
comm=0xf764a520 <ompi_mpi_comm_world>, module=0x57f74810)
    at base/coll_base_allreduce.c:225
#8  0xf1a05640 in ompi_coll_tuned_allreduce_intra_dec_fixed (sbuf=0x5809b980, 
rbuf=0x580805b0, count=139, dtype=0xf7646140 <ompi_mpi_unsigned>, 
    op=0xf7655660 <ompi_mpi_op_max>, comm=0xf764a520 <ompi_mpi_comm_world>, 
module=0x57f74810) at coll_tuned_decision_fixed.c:66
#9  0xf75a6b77 in PMPI_Allreduce (sendbuf=0x5809b980, recvbuf=0x580805b0, 
count=139, datatype=0xf7646140 <ompi_mpi_unsigned>, 
    op=0xf7655660 <ompi_mpi_op_max>, comm=0xf764a520 <ompi_mpi_comm_world>) at 
pallreduce.c:107
#10 0xf7c2862f in boost::mpi::detail::all_reduce_impl<unsigned int, 
boost::mpi::maximum<unsigned int> > (comm=..., in_values=0x5809b980, n=139, 
    out_values=0x580805b0) at 
/usr/include/boost/mpi/collectives/all_reduce.hpp:36
#11 0xf7c4019f in boost::mpi::all_reduce<unsigned int, 
boost::mpi::maximum<unsigned int> > (out_values=<optimized out>, n=<optimized 
out>, 
    in_values=<optimized out>, comm=..., op=...) at 
/usr/include/boost/mpi/collectives/all_reduce.hpp:93
#12 rheolef::geo_rep<double, rheolef::distributed>::get (this=<optimized out>, 
ips=...) at geo_mpi_get.cc:942
#13 0xf7c41a48 in rheolef::geo_rep<double, rheolef::distributed>::load 
(this=<optimized out>, filename=..., comm=...) at geo_mpi_get.cc:989
#14 0xf7c1130a in rheolef::geo_load<double, rheolef::distributed> (name=...) at 
geo.cc:172
#15 0x5658abf8 in rheolef::geo_basic<double, rheolef::distributed>::geo_basic 
(comm=..., 
    
name="\000\327\365W\030\000\000\000\030\000\000\000\227\300XV\320\274YV\002\000\000\000\000\240x1\003\000\000\000y\253XV\360\313\250\377\000\000\000\000\000\000,\367\000\000\000\000\000\000\000\000\203\207\020\367\000\000,\367\000\000,\367\000\000\000\000\203\207\020\367\003\000\000\000\204\314\250\377\224\314\250\377\024\314\250\377\003\000\000\000\204\314\250\377\000\000,\367\352\207\374\367\001\000\000\000\000\000\000\000\000\000,\367\000\000\000\000\000\000\000\000\276\"\333\071\257HBI",
 '\000' <repeats 12 times>, 
"\200\314\250\377\000\220\373\367\354\222\373\367\350\230\373\367\003\000\000\000\320\274YV\003\000\000\000\310\300XV\000\000\000\000\371\300XV`\253XV\003\000\000\000\204\314\250\377"...,
 
    this=0xffa8cb70) at ../../include/rheolef/geo.h:1460
#16 main (argc=<optimized out>, argv=<optimized out>) at space_tst.cc:26

--- End Message ---
--- Begin Message ---
Source: rheolef
Source-Version: 6.7-6

We believe that the bug you reported is fixed in the latest version of
rheolef, which is due to be installed in the Debian FTP archive.

A summary of the changes between this version and the previous one is
attached.

Thank you for reporting the bug, which will now be closed.  If you
have further comments please address them to 888...@bugs.debian.org,
and the maintainer will reopen the bug report if appropriate.

Debian distribution maintenance software
pp.
Pierre Saramito <pierre.saram...@imag.fr> (supplier of updated rheolef package)

(This message was generated automatically at their request; if you
believe that there is a problem with it please contact the archive
administrators by mailing ftpmas...@ftp-master.debian.org)


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA256

Format: 1.8
Date: Fri, 16 Feb 2018 12:19:06 +0100
Source: rheolef
Binary: librheolef1 librheolef-dev rheolef-doc rheolef
Architecture: source
Version: 6.7-6
Distribution: unstable
Urgency: medium
Maintainer: Debian Science Maintainers 
<debian-science-maintain...@lists.alioth.debian.org>
Changed-By: Pierre Saramito <pierre.saram...@imag.fr>
Description:
 librheolef-dev - efficient Finite Element environment - development files
 librheolef1 - efficient Finite Element environment - shared library
 rheolef    - efficient Finite Element environment
 rheolef-doc - efficient Finite Element environment - documentation
Closes: 888879
Changes:
 rheolef (6.7-6) unstable; urgency=medium
 .
   * d/rules:
     - autoconf/automake regeneration forced, since configure.ac is patched
     - dh_auto_test delayed to the next 7.0 upstream version (closes: #888879)
   * d/*.install: pkglibdir changed to usr/lib/*/rheolef for multiarch
   * d/control:
     - Build-Depends: autoconf, automake & libtool added
     - Description: updated
   * d/patch: remove skip_failing_tests.patch
   * d/README.Debian: completed
   * d/compat: decreased from 11 to 9, for smooth (old)stable package building
          also decreased in d/control for debhelper dependency in Build-Depends
Checksums-Sha1:
 c0ba1125cb6db039aa1dc4a5e37e6c7e1bae620d 2515 rheolef_6.7-6.dsc
 a80289fe9ac84a1b0860afe1e9cf9eb410cfb542 10196 rheolef_6.7-6.debian.tar.xz
 93faa0b9514a9d9cdf1e472b56fb5f8ffe4707c3 16001 rheolef_6.7-6_source.buildinfo
Checksums-Sha256:
 91b4162321864b0d68a14b25c00f4db750ddb58737de1c0a82ce3d38fe67fad3 2515 
rheolef_6.7-6.dsc
 ffb3f8d568e78e49bf36fde25e77780286f5b2cac5d7f2d4e0e157678451c5ac 10196 
rheolef_6.7-6.debian.tar.xz
 08b195e1498594777842884b0d1bd540ae18586bf3fa8398bd6b145f7afa2fc6 16001 
rheolef_6.7-6_source.buildinfo
Files:
 9d900c3c7c53b9d3f2f58fb51267a527 2515 math optional rheolef_6.7-6.dsc
 526aafac02b136c42eaad9b18e9dd7b8 10196 math optional 
rheolef_6.7-6.debian.tar.xz
 b1b400484b9b55db15f9b58a17484df8 16001 math optional 
rheolef_6.7-6_source.buildinfo

-----BEGIN PGP SIGNATURE-----

iQJCBAEBCAAsFiEE8fAHMgoDVUHwpmPKV4oElNHGRtEFAlqJpX8OHHRpbGxlYUBy
a2kuZGUACgkQV4oElNHGRtGsaA//exqlmDZj/R1GzgW3hd+//HowZqaVK46x5YJH
Q/XL28oA4rJaT7KEv9C8E/z+KUdb9Z4cbbyQIhYvEAGATPiM10t/Jjx4bgdx6KMJ
OQgRokZEAnUpO5mkXIZar5wljEQ11Vf7jH9tNEtXIA/gBa81djQqG171IHmy9nzm
Y4FU5F4M2ko8/guNRyNoysyQJlPvCz54H901n0XfXLzEh6Jc12QO/e18UdoiK+xN
jZUNnlZ0q1314pEDcfKiyijoz1WTT5iRStWeuMhVJ35ItT3JGGvPockNtk4TvbTL
5SqdDGXfn5/wy5jA6bFWytdwHsp6XfxMwmO4rDqY2oTweZbYLZkn98+fS9orSv+a
O3niEFatUmdEN76Fwd3L+/+HjuD/9aV3I6jH9gj11r4E25FIbqio0AeskRZxhk6O
7vbEcUFw4NIs8BJydsmgQGsgu0DMxnYbPjSnqqzLFrUgvMT2R2cBq52BqoaPLPsL
/7QgY90z4T3oAbijQw1c7zz2gDOLEVZuIFfIhyNXrjgF44dmfQFMCm06jlW6IrZn
d4cd+Pyx5iegmrTRiXjCXa5THlcxFRSvaQ86M++BwCo0LQovbupvCrHtAot5vEdp
x0aUv8FwoHWnxeYl8YFYj8Jiz9nEpSzcKs8K3MdC22fKNO1QZsgoqLkFpq4RDe7h
l3OZXKQ=
=syrR
-----END PGP SIGNATURE-----

--- End Message ---
