Package: src:joblib
Version: 1.3.2-2
Severity: important
Tags: ftbfs patch

Dear maintainer:

During a rebuild of all packages in unstable, your package failed to build:

[ please read the notes at the very end ]

--------------------------------------------------------------------------------
[...]
 debian/rules binary
dh binary --buildsystem=pybuild --with=python3
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
   dh_auto_build -O--buildsystem=pybuild
I: pybuild plugin_pyproject:129: Building wheel for python3.12 with "build" module
I: pybuild base:311: python3.12 -m build --skip-dependency-check --no-isolation --wheel --outdir /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12
* Building wheel...
running bdist_wheel
running build
running build_py
creating build
creating build/lib

[... snipped ...]

joblib/test/test_memmapping.py .s......................X................ [ 46%]
........                                                                 [ 47%]
joblib/test/test_memory.py .xx.......................................... [ 50%]
...................                                                      [ 51%]
joblib/test/test_missing_multiprocessing.py .                            [ 52%]
joblib/test/test_module.py ....                                          [ 52%]
joblib/test/test_numpy_pickle.py ....................................... [ 55%]
..........s............................................................. [ 60%]
................s..                                                      [ 61%]
joblib/test/test_numpy_pickle_compat.py .                                [ 61%]
joblib/test/test_numpy_pickle_utils.py ..                                [ 62%]
joblib/test/test_parallel.py ........................................... [ 65%]
........................................................................ [ 70%]
........................................................................ [ 75%]
...................XXX.............s.................................... [ 81%]
..................................................s.ss.s...s.ss.s...s.ss [ 86%]
.s...................................................................... [ 91%]
...........................................FFFFss....ss.....ssssssssssss [ 96%]
ssssssss.F.F.F.F.F.F...                                                  [ 98%]
joblib/test/test_store_backends.py .....                                 [ 98%]
joblib/test/test_testing.py .....                                        [ 99%]
joblib/test/test_utils.py .........                                      [100%]

=================================== FAILURES ===================================
_____________ test_nested_parallelism_limit[parallel_config-loky] ______________

context = <class 'joblib.parallel.parallel_config'>, backend = 'loky'

    @with_multiprocessing
    @parametrize('backend', ['loky', 'threading'])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_nested_parallelism_limit(context, backend):
        with context(backend, n_jobs=2):
            backend_types_and_levels = _recursive_backend_info()
        if cpu_count() == 1:
            second_level_backend_type = 'SequentialBackend'
            max_level = 1
        else:
            second_level_backend_type = 'ThreadingBackend'
            max_level = 2
        top_level_backend_type = backend.title() + 'Backend'
        expected_types_and_levels = [
            (top_level_backend_type, 0),
            (second_level_backend_type, 1),
            ('SequentialBackend', max_level),
            ('SequentialBackend', max_level)
        ]
>       assert backend_types_and_levels == expected_types_and_levels
E       AssertionError: assert [('LokyBacken...lBackend', 2)] == [('LokyBacken...lBackend', 1)]
E
E         At index 1 diff: ('ThreadingBackend', 1) != ('SequentialBackend', 1)
E         Use -v to get more diff

joblib/test/test_parallel.py:1621: AssertionError
___________ test_nested_parallelism_limit[parallel_config-threading] ___________

context = <class 'joblib.parallel.parallel_config'>, backend = 'threading'

    @with_multiprocessing
    @parametrize('backend', ['loky', 'threading'])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_nested_parallelism_limit(context, backend):
        with context(backend, n_jobs=2):
            backend_types_and_levels = _recursive_backend_info()
        if cpu_count() == 1:
            second_level_backend_type = 'SequentialBackend'
            max_level = 1
        else:
            second_level_backend_type = 'ThreadingBackend'
            max_level = 2
        top_level_backend_type = backend.title() + 'Backend'
        expected_types_and_levels = [
            (top_level_backend_type, 0),
            (second_level_backend_type, 1),
            ('SequentialBackend', max_level),
            ('SequentialBackend', max_level)
        ]
>       assert backend_types_and_levels == expected_types_and_levels
E       AssertionError: assert [('ThreadingB...lBackend', 2)] == [('ThreadingB...lBackend', 1)]
E
E         At index 1 diff: ('ThreadingBackend', 1) != ('SequentialBackend', 1)
E         Use -v to get more diff

joblib/test/test_parallel.py:1621: AssertionError
_____________ test_nested_parallelism_limit[parallel_backend-loky] _____________

context = <class 'joblib.parallel.parallel_backend'>, backend = 'loky'

    @with_multiprocessing
    @parametrize('backend', ['loky', 'threading'])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_nested_parallelism_limit(context, backend):
        with context(backend, n_jobs=2):
            backend_types_and_levels = _recursive_backend_info()
        if cpu_count() == 1:
            second_level_backend_type = 'SequentialBackend'
            max_level = 1
        else:
            second_level_backend_type = 'ThreadingBackend'
            max_level = 2
        top_level_backend_type = backend.title() + 'Backend'
        expected_types_and_levels = [
            (top_level_backend_type, 0),
            (second_level_backend_type, 1),
            ('SequentialBackend', max_level),
            ('SequentialBackend', max_level)
        ]
>       assert backend_types_and_levels == expected_types_and_levels
E       AssertionError: assert [('LokyBacken...lBackend', 2)] == [('LokyBacken...lBackend', 1)]
E
E         At index 1 diff: ('ThreadingBackend', 1) != ('SequentialBackend', 1)
E         Use -v to get more diff

joblib/test/test_parallel.py:1621: AssertionError
__________ test_nested_parallelism_limit[parallel_backend-threading] ___________

context = <class 'joblib.parallel.parallel_backend'>, backend = 'threading'

    @with_multiprocessing
    @parametrize('backend', ['loky', 'threading'])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_nested_parallelism_limit(context, backend):
        with context(backend, n_jobs=2):
            backend_types_and_levels = _recursive_backend_info()
        if cpu_count() == 1:
            second_level_backend_type = 'SequentialBackend'
            max_level = 1
        else:
            second_level_backend_type = 'ThreadingBackend'
            max_level = 2
        top_level_backend_type = backend.title() + 'Backend'
        expected_types_and_levels = [
            (top_level_backend_type, 0),
            (second_level_backend_type, 1),
            ('SequentialBackend', max_level),
            ('SequentialBackend', max_level)
        ]
>       assert backend_types_and_levels == expected_types_and_levels
E       AssertionError: assert [('ThreadingB...lBackend', 2)] == [('ThreadingB...lBackend', 1)]
E
E         At index 1 diff: ('ThreadingBackend', 1) != ('SequentialBackend', 1)
E         Use -v to get more diff

joblib/test/test_parallel.py:1621: AssertionError
_ test_threadpool_limitation_in_child_override[parallel_config-OPENBLAS_NUM_THREADS--1] _

context = <class 'joblib.parallel.parallel_config'>, n_jobs = -1
var_name = 'OPENBLAS_NUM_THREADS'

    @with_multiprocessing
    @parametrize('n_jobs', [2, -1])
    @parametrize('var_name', ["OPENBLAS_NUM_THREADS",
                              "MKL_NUM_THREADS",
                              "OMP_NUM_THREADS"])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_threadpool_limitation_in_child_override(context, n_jobs, var_name):
        # Check that environment variables set by the user on the main process
        # always have the priority.
        # Clean up the existing executor because we change the environment of the
        # parent at runtime and it is not detected in loky intentionally.
        get_reusable_executor(reuse=True).shutdown()
        def _get_env(var_name):
            return os.environ.get(var_name)
        original_var_value = os.environ.get(var_name)
        try:
            os.environ[var_name] = "4"
            # Skip this test if numpy is not linked to a BLAS library
            results = Parallel(n_jobs=n_jobs)(
                delayed(_get_env)(var_name) for i in range(2))
            assert results == ["4", "4"]
            with context('loky', inner_max_num_threads=1):
                results = Parallel(n_jobs=n_jobs)(
                    delayed(_get_env)(var_name) for i in range(2))
>           assert results == ["1", "1"]
E           AssertionError: assert ['4', '4'] == ['1', '1']
E
E             At index 0 diff: '4' != '1'
E             Use -v to get more diff

joblib/test/test_parallel.py:1879: AssertionError
_ test_threadpool_limitation_in_child_override[parallel_config-MKL_NUM_THREADS--1] _

context = <class 'joblib.parallel.parallel_config'>, n_jobs = -1
var_name = 'MKL_NUM_THREADS'

    @with_multiprocessing
    @parametrize('n_jobs', [2, -1])
    @parametrize('var_name', ["OPENBLAS_NUM_THREADS",
                              "MKL_NUM_THREADS",
                              "OMP_NUM_THREADS"])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_threadpool_limitation_in_child_override(context, n_jobs, var_name):
        # Check that environment variables set by the user on the main process
        # always have the priority.
        # Clean up the existing executor because we change the environment of the
        # parent at runtime and it is not detected in loky intentionally.
        get_reusable_executor(reuse=True).shutdown()
        def _get_env(var_name):
            return os.environ.get(var_name)
        original_var_value = os.environ.get(var_name)
        try:
            os.environ[var_name] = "4"
            # Skip this test if numpy is not linked to a BLAS library
            results = Parallel(n_jobs=n_jobs)(
                delayed(_get_env)(var_name) for i in range(2))
            assert results == ["4", "4"]
            with context('loky', inner_max_num_threads=1):
                results = Parallel(n_jobs=n_jobs)(
                    delayed(_get_env)(var_name) for i in range(2))
>           assert results == ["1", "1"]
E           AssertionError: assert ['4', '4'] == ['1', '1']
E
E             At index 0 diff: '4' != '1'
E             Use -v to get more diff

joblib/test/test_parallel.py:1879: AssertionError
_ test_threadpool_limitation_in_child_override[parallel_config-OMP_NUM_THREADS--1] _

context = <class 'joblib.parallel.parallel_config'>, n_jobs = -1
var_name = 'OMP_NUM_THREADS'

    @with_multiprocessing
    @parametrize('n_jobs', [2, -1])
    @parametrize('var_name', ["OPENBLAS_NUM_THREADS",
                              "MKL_NUM_THREADS",
                              "OMP_NUM_THREADS"])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_threadpool_limitation_in_child_override(context, n_jobs, var_name):
        # Check that environment variables set by the user on the main process
        # always have the priority.
        # Clean up the existing executor because we change the environment of the
        # parent at runtime and it is not detected in loky intentionally.
        get_reusable_executor(reuse=True).shutdown()
        def _get_env(var_name):
            return os.environ.get(var_name)
        original_var_value = os.environ.get(var_name)
        try:
            os.environ[var_name] = "4"
            # Skip this test if numpy is not linked to a BLAS library
            results = Parallel(n_jobs=n_jobs)(
                delayed(_get_env)(var_name) for i in range(2))
            assert results == ["4", "4"]
            with context('loky', inner_max_num_threads=1):
                results = Parallel(n_jobs=n_jobs)(
                    delayed(_get_env)(var_name) for i in range(2))
>           assert results == ["1", "1"]
E           AssertionError: assert ['4', '4'] == ['1', '1']
E
E             At index 0 diff: '4' != '1'
E             Use -v to get more diff

joblib/test/test_parallel.py:1879: AssertionError
_ test_threadpool_limitation_in_child_override[parallel_backend-OPENBLAS_NUM_THREADS--1] _

context = <class 'joblib.parallel.parallel_backend'>, n_jobs = -1
var_name = 'OPENBLAS_NUM_THREADS'

    @with_multiprocessing
    @parametrize('n_jobs', [2, -1])
    @parametrize('var_name', ["OPENBLAS_NUM_THREADS",
                              "MKL_NUM_THREADS",
                              "OMP_NUM_THREADS"])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_threadpool_limitation_in_child_override(context, n_jobs, var_name):
        # Check that environment variables set by the user on the main process
        # always have the priority.
        # Clean up the existing executor because we change the environment of the
        # parent at runtime and it is not detected in loky intentionally.
        get_reusable_executor(reuse=True).shutdown()
        def _get_env(var_name):
            return os.environ.get(var_name)
        original_var_value = os.environ.get(var_name)
        try:
            os.environ[var_name] = "4"
            # Skip this test if numpy is not linked to a BLAS library
            results = Parallel(n_jobs=n_jobs)(
                delayed(_get_env)(var_name) for i in range(2))
            assert results == ["4", "4"]
            with context('loky', inner_max_num_threads=1):
                results = Parallel(n_jobs=n_jobs)(
                    delayed(_get_env)(var_name) for i in range(2))
>           assert results == ["1", "1"]
E           AssertionError: assert ['4', '4'] == ['1', '1']
E
E             At index 0 diff: '4' != '1'
E             Use -v to get more diff

joblib/test/test_parallel.py:1879: AssertionError
_ test_threadpool_limitation_in_child_override[parallel_backend-MKL_NUM_THREADS--1] _

context = <class 'joblib.parallel.parallel_backend'>, n_jobs = -1
var_name = 'MKL_NUM_THREADS'

    @with_multiprocessing
    @parametrize('n_jobs', [2, -1])
    @parametrize('var_name', ["OPENBLAS_NUM_THREADS",
                              "MKL_NUM_THREADS",
                              "OMP_NUM_THREADS"])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_threadpool_limitation_in_child_override(context, n_jobs, var_name):
        # Check that environment variables set by the user on the main process
        # always have the priority.
        # Clean up the existing executor because we change the environment of the
        # parent at runtime and it is not detected in loky intentionally.
        get_reusable_executor(reuse=True).shutdown()
        def _get_env(var_name):
            return os.environ.get(var_name)
        original_var_value = os.environ.get(var_name)
        try:
            os.environ[var_name] = "4"
            # Skip this test if numpy is not linked to a BLAS library
            results = Parallel(n_jobs=n_jobs)(
                delayed(_get_env)(var_name) for i in range(2))
            assert results == ["4", "4"]
            with context('loky', inner_max_num_threads=1):
                results = Parallel(n_jobs=n_jobs)(
                    delayed(_get_env)(var_name) for i in range(2))
>           assert results == ["1", "1"]
E           AssertionError: assert ['4', '4'] == ['1', '1']
E
E             At index 0 diff: '4' != '1'
E             Use -v to get more diff

joblib/test/test_parallel.py:1879: AssertionError
_ test_threadpool_limitation_in_child_override[parallel_backend-OMP_NUM_THREADS--1] _

context = <class 'joblib.parallel.parallel_backend'>, n_jobs = -1
var_name = 'OMP_NUM_THREADS'

    @with_multiprocessing
    @parametrize('n_jobs', [2, -1])
    @parametrize('var_name', ["OPENBLAS_NUM_THREADS",
                              "MKL_NUM_THREADS",
                              "OMP_NUM_THREADS"])
    @parametrize("context", [parallel_config, parallel_backend])
    def test_threadpool_limitation_in_child_override(context, n_jobs, var_name):
        # Check that environment variables set by the user on the main process
        # always have the priority.
        # Clean up the existing executor because we change the environment of the
        # parent at runtime and it is not detected in loky intentionally.
        get_reusable_executor(reuse=True).shutdown()
        def _get_env(var_name):
            return os.environ.get(var_name)
        original_var_value = os.environ.get(var_name)
        try:
            os.environ[var_name] = "4"
            # Skip this test if numpy is not linked to a BLAS library
            results = Parallel(n_jobs=n_jobs)(
                delayed(_get_env)(var_name) for i in range(2))
            assert results == ["4", "4"]
            with context('loky', inner_max_num_threads=1):
                results = Parallel(n_jobs=n_jobs)(
                    delayed(_get_env)(var_name) for i in range(2))
>           assert results == ["1", "1"]
E           AssertionError: assert ['4', '4'] == ['1', '1']
E
E             At index 0 diff: '4' != '1'
E             Use -v to get more diff

joblib/test/test_parallel.py:1879: AssertionError
=============================== warnings summary ===============================
joblib/testing.py:22
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/testing.py:22: PytestUnknownMarkWarning: Unknown pytest.mark.timeout - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    timeout = pytest.mark.timeout

joblib/test/test_parallel.py:1652
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py:1652: PytestUnknownMarkWarning: Unknown pytest.mark.no_cover - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.no_cover

joblib/executor.py:105
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/executor.py:105: PytestCollectionWarning: cannot collect test class '_TestingMemmappingExecutor' because it has a __init__ constructor (from: .pybuild/cpython3_3.12/build/joblib/test/test_memmapping.py)
    class _TestingMemmappingExecutor(MemmappingExecutor):

.pybuild/cpython3_3.12/build/joblib/test/test_func_inspect.py::test_filter_args_2
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/test/test_func_inspect.py:131: UserWarning: Cannot inspect object functools.partial(<function f at 0x7ff88f8c3740>, 1), ignore list will not work.
    assert filter_args(ff, ['y'], (1, )) == {'*': [1], '**': {}}

.pybuild/cpython3_3.12/build/joblib/test/test_hashing.py: 2 warnings
.pybuild/cpython3_3.12/build/joblib/test/test_memmapping.py: 52 warnings
.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py: 109 warnings
.pybuild/cpython3_3.12/build/joblib/test/test_store_backends.py: 2 warnings
  /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=82879) is multi-threaded, use of fork() may lead to deadlocks in the child.
    self.pid = os.fork()

.pybuild/cpython3_3.12/build/joblib/test/test_memmapping.py: 51 warnings
.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py: 81 warnings
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/externals/loky/backend/fork_exec.py:38: DeprecationWarning: This process (pid=82879) is multi-threaded, use of fork() may lead to deadlocks in the child.
    pid = os.fork()

.pybuild/cpython3_3.12/build/joblib/test/test_memory.py::test_memory_integration
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/test/test_memory.py:104: UserWarning: Compressed results cannot be memmapped
    memory = Memory(location=tmpdir.strpath, verbose=10,

.pybuild/cpython3_3.12/build/joblib/test/test_memory.py::test_memory_integration
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/memory.py:132: UserWarning: Compressed items cannot be memmapped in a filesystem store. Option will be ignored.
    obj.configure(location, verbose=verbose,

.pybuild/cpython3_3.12/build/joblib/test/test_memory.py::test_memory_integration
  /usr/lib/python3.12/contextlib.py:137: UserWarning: mmap_mode "r" is not compatible with compressed file /tmp/pytest-of-buildd/pytest-0/test_memory_integration0/joblib/joblib/test/test_memory/test_memory_integration/<locals>/f/b69f9d78d7bc537482721c40ce38db0a/output.pkl. "r" flag will be ignored.
    return next(self.gen)

.pybuild/cpython3_3.12/build/joblib/test/test_numpy_pickle.py::test_joblib_pickle_across_python_versions
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/test/test_numpy_pickle.py:461: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    np.matrix([0, 1, 2], dtype=np.dtype('<i8')),

.pybuild/cpython3_3.12/build/joblib/test/test_numpy_pickle.py::test_joblib_pickle_across_python_versions_with_mmap
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/test/test_numpy_pickle.py:490: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    np.matrix([0, 1, 2], dtype=np.dtype('<i8')),

.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_nested_loop[threading-multiprocessing]
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/parallel.py:1332: UserWarning: Multiprocessing-backed parallel loops cannot be nested below threads, setting n_jobs=1
    n_jobs = self._backend.configure(n_jobs=self.n_jobs, parallel=self,

.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_nested_loop[threading-loky]
.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_nested_loop[threading-back_compat_backend]
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/parallel.py:1332: UserWarning: Loky-backed parallel loops cannot be nested below threads, setting n_jobs=1
    n_jobs = self._backend.configure(n_jobs=self.n_jobs, parallel=self,

.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_deadlock_with_generator[2-loky]
.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_multiple_generator_call[2-threading]
.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_multiple_generator_call[2-loky]
.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_multiple_generator_call_managed[2-threading]
.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_multiple_generator_call_managed[2-loky]
.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_multiple_generator_call_separated_gc[loky-True]
.pybuild/cpython3_3.12/build/joblib/test/test_parallel.py::test_multiple_generator_call_separated_gc[threading-False]
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/parallel.py:1762: UserWarning: 4 tasks which were still being processed by the workers have been cancelled. You could benefit from adjusting the input task iterator to limit unnecessary computation time.
    warnings.warn(msg)

.pybuild/cpython3_3.12/build/joblib/test/test_testing.py::test_check_subprocess_call_timeout
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/joblib/testing.py:58: UserWarning: Timeout running ['/usr/bin/python3.12', '-c', 'import time\nimport sys\nprint("before sleep on stdout")\nsys.stdout.flush()\nsys.stderr.write("before sleep on stderr")\nsys.stderr.flush()\ntime.sleep(10)\nprint("process should have be killed before")\nsys.stdout.flush()']
    warnings.warn(f"Timeout running {cmd}")

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED joblib/test/test_parallel.py::test_nested_parallelism_limit[parallel_config-loky] - AssertionError: assert [('LokyBacken...lBackend', 2)] == [('LokyBacken...lB...
FAILED joblib/test/test_parallel.py::test_nested_parallelism_limit[parallel_config-threading] - AssertionError: assert [('ThreadingB...lBackend', 2)] == [('ThreadingB...lB...
FAILED joblib/test/test_parallel.py::test_nested_parallelism_limit[parallel_backend-loky] - AssertionError: assert [('LokyBacken...lBackend', 2)] == [('LokyBacken...lB...
FAILED joblib/test/test_parallel.py::test_nested_parallelism_limit[parallel_backend-threading] - AssertionError: assert [('ThreadingB...lBackend', 2)] == [('ThreadingB...lB...
FAILED joblib/test/test_parallel.py::test_threadpool_limitation_in_child_override[parallel_config-OPENBLAS_NUM_THREADS--1] - AssertionError: assert ['4', '4'] == ['1', '1']
FAILED joblib/test/test_parallel.py::test_threadpool_limitation_in_child_override[parallel_config-MKL_NUM_THREADS--1] - AssertionError: assert ['4', '4'] == ['1', '1']
FAILED joblib/test/test_parallel.py::test_threadpool_limitation_in_child_override[parallel_config-OMP_NUM_THREADS--1] - AssertionError: assert ['4', '4'] == ['1', '1']
FAILED joblib/test/test_parallel.py::test_threadpool_limitation_in_child_override[parallel_backend-OPENBLAS_NUM_THREADS--1] - AssertionError: assert ['4', '4'] == ['1', '1']
FAILED joblib/test/test_parallel.py::test_threadpool_limitation_in_child_override[parallel_backend-MKL_NUM_THREADS--1] - AssertionError: assert ['4', '4'] == ['1', '1']
FAILED joblib/test/test_parallel.py::test_threadpool_limitation_in_child_override[parallel_backend-OMP_NUM_THREADS--1] - AssertionError: assert ['4', '4'] == ['1', '1']
= 10 failed, 1308 passed, 42 skipped, 2 deselected, 2 xfailed, 4 xpassed, 317 warnings in 84.09s (0:01:24) =
E: pybuild pybuild:389: test: plugin pyproject failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build; python3.12 -m pytest -k "not test_nested_loop_error_in_grandchild_resource_tracker_silent and not test_resource_tracker_silent_when_reference_cycles and not test_parallel_with_interactively_defined_functions_default_backend"
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.12 returned exit code 13
make: *** [debian/rules:15: binary] Error 25
dpkg-buildpackage: error: debian/rules binary subprocess returned exit status 2
--------------------------------------------------------------------------------

The above is just how the build ends and not necessarily the most relevant part.
If required, the full build log is available here:

https://people.debian.org/~sanvila/build-logs/202410/

About the archive rebuild: The build was made on virtual machines from AWS,
using sbuild and a reduced chroot with only build-essential packages.

If you cannot reproduce the bug, please contact me privately: I am
willing to provide ssh access to a virtual machine where the bug is
fully reproducible.

If this is really a bug in one of the build-depends, please use
reassign and affects, so that this is still visible in the BTS web
page for this package.


Notes:

The tests do not currently work since version 1.3.2-3, see Bug #1085691.
That's why I'm reporting this against version 1.3.2-2.

This happens 100% of the time for me when the machine has one CPU.
The trivial way to reproduce it is to boot with GRUB_CMDLINE_LINUX="nr_cpus=1".
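To see why a single CPU flips the expectation, the expectation logic of the failing test_nested_parallelism_limit reduces to roughly the following (a simplified, hypothetical mirror written for this report, not the actual joblib code; names are taken from the log above):

```python
# Simplified mirror of the expected nested-backend list computed by
# test_nested_parallelism_limit for a given CPU count.
def expected_levels(backend, n_cpus):
    if n_cpus == 1:
        # On a single CPU, the nested level should degrade to sequential.
        second, max_level = 'SequentialBackend', 1
    else:
        second, max_level = 'ThreadingBackend', 2
    top = backend.title() + 'Backend'
    return [(top, 0), (second, 1),
            ('SequentialBackend', max_level),
            ('SequentialBackend', max_level)]

print(expected_levels('loky', 1))
print(expected_levels('loky', 2))
```

Under this reading, the "actual" lists in the log match the n_cpus > 1 branch: even though the test saw cpu_count() == 1, joblib still created a ThreadingBackend at nesting level 1, which is exactly the mismatch reported at index 1.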

If this is not expected to work, I suggest applying the attached patch.

However, this used to work in bullseye (joblib 0.13.0-2), which also had
tests called "parallel".

Maybe this should be forwarded upstream to see what they think.

I'd like to backport the fix to bookworm-proposed-updates, but naturally only
after we know what the good fix is (or at least a fix the team considers
good enough).
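One note on the $(shell nproc) guard in the attached patch: nproc reports the processing units available to the calling process (it honours the CPU affinity mask), which is usually what matters inside a build chroot. In Python terms that is closer to len(os.sched_getaffinity(0)) than to os.cpu_count(), which is why taskset alone may not reproduce the nr_cpus=1 behaviour for code that consults os.cpu_count(). A quick Linux-only, stdlib-only sanity check:

```python
import os

# CPUs the kernel exposes; nr_cpus=1 on the kernel command line makes this 1.
total = os.cpu_count()
# CPUs this process is actually allowed to run on (affinity mask, Linux-only);
# this is what `nproc` and taskset-restricted runs reflect.
usable = len(os.sched_getaffinity(0))
print(total, usable)
```

On a machine booted with nr_cpus=1 both values are 1, so the guard fires regardless of which notion of "CPU count" joblib uses.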

Thanks.
commit 9fd5a14a4b255b86d03f3d9e62eff09c37143226
Author: Santiago Vila <sanv...@debian.org>
Date:   Mon Oct 21 18:11:57 2024 +0200

    Exclude parallel tests if the building machine has a single CPU

diff --git a/debian/rules b/debian/rules
index da1c5f0..b3d51f2 100755
--- a/debian/rules
+++ b/debian/rules
@@ -9,6 +9,10 @@ EXCLUDE_TESTS += and not test_resource_tracker_silent_when_reference_cycles
 # Until https://github.com/joblib/joblib/issues/1329 is open
 EXCLUDE_TESTS += and not test_parallel_with_interactively_defined_functions_default_backend
 
+ifeq ($(shell nproc), 1)
+  EXCLUDE_TESTS += and not test_parallel
+endif
+
 export PYBUILD_NAME=joblib
 export PYBUILD_TEST_ARGS_python3 := -k "$(EXCLUDE_TESTS)"
 
