Package: src:snakemake
Version: 7.32.4-3
Severity: serious
Tags: ftbfs

Dear maintainer:

During a rebuild of all packages in unstable, your package failed to build:

--------------------------------------------------------------------------------
[...]
 debian/rules binary
dh binary --with python3,sphinxdoc --buildsystem=pybuild
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
   dh_auto_build -O--buildsystem=pybuild
I: pybuild plugin_pyproject:129: Building wheel for python3.12 with "build" module
I: pybuild base:311: python3.12 -m build --skip-dependency-check --no-isolation --wheel --outdir /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_snakemake
* Building wheel...
running bdist_wheel
running build
running build_py
creating build
creating build/lib

[... snipped ...]

        for f in os.listdir(path):
            copy(os.path.join(path, f), tmpdir)
        # Snakefile is now in temporary directory
        snakefile = join(tmpdir, snakefile)
        # run snakemake
        if shellcmd:
            if not shellcmd.startswith("snakemake"):
                raise ValueError("shellcmd does not start with snakemake")
            shellcmd = "{} -m {}".format(sys.executable, shellcmd)
            try:
                if sigint_after is None:
                    subprocess.run(
                        shellcmd,
                        cwd=path if no_tmpdir else tmpdir,
                        check=True,
                        shell=True,
                        stderr=subprocess.STDOUT,
                        stdout=subprocess.PIPE,
                    )
                    success = True
                else:
                    with subprocess.Popen(
                        shlex.split(shellcmd),
                        cwd=path if no_tmpdir else tmpdir,
                        stderr=subprocess.STDOUT,
                        stdout=subprocess.PIPE,
                    ) as process:
                        time.sleep(sigint_after)
                        process.send_signal(signal.SIGINT)
                        time.sleep(2)
                        success = process.returncode == 0
            except subprocess.CalledProcessError as e:
                success = False
                print(e.stdout.decode(), file=sys.stderr)
        else:
            assert sigint_after is None, "Cannot send SIGINT when calling directly"
            success = snakemake(
                snakefile=original_snakefile if no_tmpdir else snakefile,
                cores=cores,
                nodes=nodes,
                workdir=path if no_tmpdir else tmpdir,
                stats="stats.txt",
                config=config,
                verbose=True,
                targets=targets,
                conda_frontend=conda_frontend,
                container_image=container_image,
                overwrite_resource_scopes=(
                    ResourceScopes(overwrite_resource_scopes)
                    if overwrite_resource_scopes is not None
                    else overwrite_resource_scopes
                ),
                **params,
            )
        if shouldfail:
            assert not success, "expected error on execution"
        else:
            if not success:
                print("Workdir:")
                print_tree(tmpdir, exclude=".snakemake/conda")
>           assert success, "expected successful execution"
E           AssertionError: expected successful execution

.pybuild/cpython3_3.12_snakemake/build/tests/common.py:259: AssertionError
----------------------------- Captured stdout call -----------------------------
Workdir:
snakemake-pmpd9u7i/
    Snakefile
    .snakemake/
        log/
            2024-07-23T024036.012256.snakemake.log
        locks/
        singularity/
        incomplete/
        metadata/
        auxiliary/
        shadow/
    expected-results/
        .gitignore
----------------------------- Captured stderr call -----------------------------
Building DAG of jobs...
Using shell: /usr/bin/bash
Provided cores: 3
Rules claiming more threads will be scaled down.
Job stats:
job      count
-----  -------
2            3
3            1
all          1
total        5

Resources before job selection: {'_cores': 3, '_nodes': 9223372036854775807}
Ready jobs (1)
Select jobs to execute...
Using greedy selector because only single job has to be scheduled.
Selected jobs (1)
Resources after job selection: {'_cores': 2, '_nodes': 9223372036854775806}

[Tue Jul 23 02:40:36 2024]
rule 3:
    output: test.in
    jobid: 2
    reason: Missing output files: test.in
    resources: tmpdir=/tmp

/tmp/snakemake-pmpd9u7i/Snakefile:4: UserWarning: pytools.persistent_dict 'mystorage': enabling safe_sync as default. This provides strong protection against data loss, but can be unnecessarily expensive for use cases such as caches. Pass 'safe_sync=False' if occasional data loss is tolerable. Pass 'safe_sync=True' to suppress this warning.
  storage = PersistentDict("mystorage")
Building DAG of jobs...
Using shell: /usr/bin/bash
Provided cores: 3
Rules claiming more threads will be scaled down.
Select jobs to execute...
[Tue Jul 23 02:40:36 2024]
Error in rule 3:
    jobid: 0
    output: test.in

RuleException:
ProgrammingError in file /tmp/snakemake-pmpd9u7i/Snakefile, line 26:
SQLite objects created in a thread can only be used in that same thread. The object was created in thread id 140638792650880 and this is thread id 140638741497536.
  File "/tmp/snakemake-pmpd9u7i/Snakefile", line 26, in __rule_3
  File "/usr/lib/python3/dist-packages/pytools/persistent_dict.py", line 783, in store
  File "/usr/lib/python3/dist-packages/pytools/persistent_dict.py", line 533, in _exec_sql
  File "/usr/lib/python3/dist-packages/pytools/persistent_dict.py", line 545, in _exec_sql_fn
  File "/usr/lib/python3/dist-packages/pytools/persistent_dict.py", line 531, in execute
  File "/usr/lib/python3.12/concurrent/futures/thread.py", line 58, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: .snakemake/log/2024-07-23T024036.012256.snakemake.log
unlocking
removing lock
removing lock
removed all locks
------------------------------ Captured log call -------------------------------
WARNING  snakemake.logging:logging.py:580 Building DAG of jobs...
WARNING  snakemake.logging:logging.py:580 Using shell: /usr/bin/bash
WARNING  snakemake.logging:logging.py:588 Provided cores: 3
WARNING  snakemake.logging:logging.py:588 Rules claiming more threads will be scaled down.
WARNING  snakemake.logging:logging.py:590 Job stats:
job      count
-----  -------
2            3
3            1
all          1
total        5

DEBUG    snakemake.logging:logging.py:586 Resources before job selection: {'_cores': 3, '_nodes': 9223372036854775807}
DEBUG    snakemake.logging:logging.py:586 Ready jobs (1)
WARNING  snakemake.logging:logging.py:580 Select jobs to execute...
DEBUG    snakemake.logging:logging.py:586 Using greedy selector because only single job has to be scheduled.
DEBUG    snakemake.logging:logging.py:586 Selected jobs (1)
DEBUG    snakemake.logging:logging.py:586 Resources after job selection: {'_cores': 2, '_nodes': 9223372036854775806}
INFO     snakemake.logging:logging.py:498
INFO     snakemake.logging:logging.py:492 [Tue Jul 23 02:40:36 2024]
INFO     snakemake.logging:logging.py:505 rule 3:
    output: test.in
    jobid: 2
    reason: Missing output files: test.in
    resources: tmpdir=/tmp
INFO     snakemake.logging:logging.py:512
WARNING  snakemake.logging:logging.py:580 Shutting down, this might take some time.
ERROR    snakemake.logging:logging.py:584 Exiting because a job execution failed. Look above for error message
WARNING  snakemake.logging:logging.py:580 Complete log: .snakemake/log/2024-07-23T024036.012256.snakemake.log
DEBUG    snakemake.logging:logging.py:586 unlocking
DEBUG    snakemake.logging:logging.py:586 removing lock
DEBUG    snakemake.logging:logging.py:586 removing lock
DEBUG    snakemake.logging:logging.py:586 removed all locks
________________________________ test_issue328 _________________________________

    def test_issue328():
        try:
            import pytools
>           run(dpath("test_issue328"), forcerun=["split"])

.pybuild/cpython3_3.12_snakemake/build/tests/tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

path = '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_snakemake/build/tests/test_issue328'
shouldfail = False, snakefile = '/tmp/snakemake-xyjl7tjt/Snakefile'
subpath = None, no_tmpdir = False, check_md5 = True, check_results = True
cores = 3, nodes = None, set_pythonpath = True, cleanup = True
conda_frontend = 'mamba', config = {}, targets = None
container_image = 'snakemake/snakemake:latest', shellcmd = None
sigint_after = None, overwrite_resource_scopes = None
params = {'forcerun': ['split']}
results_dir = '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_snakemake/build/tests/test_issue328/expected-results'
original_snakefile = '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_snakemake/build/tests/test_issue328/Snakefile'
tmpdir = '/tmp/snakemake-xyjl7tjt', f = 'in.txt', success = False

    def run(
        path,
        shouldfail=False,
        snakefile="Snakefile",
        subpath=None,
        no_tmpdir=False,
        check_md5=True,
        check_results=None,
        cores=3,
        nodes=None,
        set_pythonpath=True,
        cleanup=True,
        conda_frontend="mamba",
        config=dict(),
        targets=None,
        container_image=os.environ.get("CONTAINER_IMAGE", "snakemake/snakemake:latest"),
        shellcmd=None,
        sigint_after=None,
        overwrite_resource_scopes=None,
        **params,
    ):
        """
        Test the Snakefile in the path.
        There must be a Snakefile in the path and a subdirectory named
        expected-results. If cleanup is False, we return the temporary
        directory to the calling test for inspection, and the test should
        clean it up.
        """
        if check_results is None:
            if not shouldfail:
                check_results = True
            else:
                check_results = False
        if set_pythonpath:
            # Enforce current workdir (the snakemake source dir) to also be in PYTHONPATH
            # when subprocesses are invoked in the tempdir defined below.
            os.environ["PYTHONPATH"] = os.getcwd()
        elif "PYTHONPATH" in os.environ:
            del os.environ["PYTHONPATH"]
        results_dir = join(path, "expected-results")
        original_snakefile = join(path, snakefile)
        assert os.path.exists(original_snakefile)
        assert os.path.exists(results_dir) and os.path.isdir(
            results_dir
        ), "{} does not exist".format(results_dir)
        # If we need to further check results, we won't cleanup tmpdir
        tmpdir = next(tempfile._get_candidate_names())
        tmpdir = os.path.join(tempfile.gettempdir(), "snakemake-%s" % tmpdir)
        os.mkdir(tmpdir)
        config = dict(config)  # handle subworkflow
        if subpath is not None:
            # set up a working directory for the subworkflow and pass it in `config`
            # for now, only one subworkflow is supported
            assert os.path.exists(subpath) and os.path.isdir(
                subpath
            ), "{} does not exist".format(subpath)
            subworkdir = os.path.join(tmpdir, "subworkdir")
            os.mkdir(subworkdir)
            # copy files
            for f in os.listdir(subpath):
                copy(os.path.join(subpath, f), subworkdir)
            config["subworkdir"] = subworkdir
        # copy files
        for f in os.listdir(path):
            copy(os.path.join(path, f), tmpdir)
        # Snakefile is now in temporary directory
        snakefile = join(tmpdir, snakefile)
        # run snakemake
        if shellcmd:
            if not shellcmd.startswith("snakemake"):
                raise ValueError("shellcmd does not start with snakemake")
            shellcmd = "{} -m {}".format(sys.executable, shellcmd)
            try:
                if sigint_after is None:
                    subprocess.run(
                        shellcmd,
                        cwd=path if no_tmpdir else tmpdir,
                        check=True,
                        shell=True,
                        stderr=subprocess.STDOUT,
                        stdout=subprocess.PIPE,
                    )
                    success = True
                else:
                    with subprocess.Popen(
                        shlex.split(shellcmd),
                        cwd=path if no_tmpdir else tmpdir,
                        stderr=subprocess.STDOUT,
                        stdout=subprocess.PIPE,
                    ) as process:
                        time.sleep(sigint_after)
                        process.send_signal(signal.SIGINT)
                        time.sleep(2)
                        success = process.returncode == 0
            except subprocess.CalledProcessError as e:
                success = False
                print(e.stdout.decode(), file=sys.stderr)
        else:
            assert sigint_after is None, "Cannot send SIGINT when calling directly"
            success = snakemake(
                snakefile=original_snakefile if no_tmpdir else snakefile,
                cores=cores,
                nodes=nodes,
                workdir=path if no_tmpdir else tmpdir,
                stats="stats.txt",
                config=config,
                verbose=True,
                targets=targets,
                conda_frontend=conda_frontend,
                container_image=container_image,
                overwrite_resource_scopes=(
                    ResourceScopes(overwrite_resource_scopes)
                    if overwrite_resource_scopes is not None
                    else overwrite_resource_scopes
                ),
                **params,
            )
        if shouldfail:
            assert not success, "expected error on execution"
        else:
            if not success:
                print("Workdir:")
                print_tree(tmpdir, exclude=".snakemake/conda")
>           assert success, "expected successful execution"
E           AssertionError: expected successful execution

.pybuild/cpython3_3.12_snakemake/build/tests/common.py:259: AssertionError
----------------------------- Captured stdout call -----------------------------
Workdir:
snakemake-xyjl7tjt/
    Snakefile
    in.txt
    .snakemake/
        log/
            2024-07-23T024223.243912.snakemake.log
        locks/
        singularity/
        incomplete/
            aW5fX19zbmFrZW1ha2VfZHluYW1pY19fLnR4dA==
        metadata/
        auxiliary/
        shadow/
    expected-results/
        out.txt
----------------------------- Captured stderr call -----------------------------
Building DAG of jobs...
Using shell: /usr/bin/bash
Provided cores: 3
Rules claiming more threads will be scaled down.
Job stats:
job      count
-----  -------
merge        1
split        1
total        2

Resources before job selection: {'_cores': 3, '_nodes': 9223372036854775807}
Ready jobs (1)
Select jobs to execute...
Using greedy selector because only single job has to be scheduled.
Selected jobs (1)
Resources after job selection: {'_cores': 2, '_nodes': 9223372036854775806}

[Tue Jul 23 02:42:23 2024]
rule split:
    input: in.txt
    output: in_{*}.txt (dynamic)
    jobid: 1
    reason: Forced execution
    resources: tmpdir=/tmp

Subsequent jobs will be added dynamically depending on the output of this job
/tmp/snakemake-xyjl7tjt/Snakefile:4: UserWarning: pytools.persistent_dict 'storage': enabling safe_sync as default. This provides strong protection against data loss, but can be unnecessarily expensive for use cases such as caches. Pass 'safe_sync=False' if occasional data loss is tolerable. Pass 'safe_sync=True' to suppress this warning.
  storage = PersistentDict("storage")
Building DAG of jobs...
Using shell: /usr/bin/bash
Provided cores: 3
Rules claiming more threads will be scaled down.
Select jobs to execute...
Subsequent jobs will be added dynamically depending on the output of this job
[Tue Jul 23 02:42:23 2024]
Error in rule split:
    jobid: 0
    input: in.txt
    output: in_{*}.txt (dynamic)

RuleException:
ProgrammingError in file /tmp/snakemake-xyjl7tjt/Snakefile, line 21:
SQLite objects created in a thread can only be used in that same thread. The object was created in thread id 140306412494976 and this is thread id 140306361636544.
  File "/tmp/snakemake-xyjl7tjt/Snakefile", line 21, in __rule_split
  File "/usr/lib/python3/dist-packages/pytools/persistent_dict.py", line 576, in __getitem__
  File "/usr/lib/python3/dist-packages/pytools/persistent_dict.py", line 789, in fetch
  File "/usr/lib/python3/dist-packages/pytools/persistent_dict.py", line 533, in _exec_sql
  File "/usr/lib/python3/dist-packages/pytools/persistent_dict.py", line 545, in _exec_sql_fn
  File "/usr/lib/python3/dist-packages/pytools/persistent_dict.py", line 531, in execute
  File "/usr/lib/python3.12/concurrent/futures/thread.py", line 58, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: .snakemake/log/2024-07-23T024223.243912.snakemake.log
unlocking
removing lock
removing lock
removed all locks
------------------------------ Captured log call -------------------------------
WARNING  snakemake.logging:logging.py:580 Building DAG of jobs...
WARNING  snakemake.logging:logging.py:580 Using shell: /usr/bin/bash
WARNING  snakemake.logging:logging.py:588 Provided cores: 3
WARNING  snakemake.logging:logging.py:588 Rules claiming more threads will be scaled down.
WARNING  snakemake.logging:logging.py:590 Job stats:
job      count
-----  -------
merge        1
split        1
total        2

DEBUG    snakemake.logging:logging.py:586 Resources before job selection: {'_cores': 3, '_nodes': 9223372036854775807}
DEBUG    snakemake.logging:logging.py:586 Ready jobs (1)
WARNING  snakemake.logging:logging.py:580 Select jobs to execute...
DEBUG    snakemake.logging:logging.py:586 Using greedy selector because only single job has to be scheduled.
DEBUG    snakemake.logging:logging.py:586 Selected jobs (1)
DEBUG    snakemake.logging:logging.py:586 Resources after job selection: {'_cores': 2, '_nodes': 9223372036854775806}
INFO     snakemake.logging:logging.py:498
INFO     snakemake.logging:logging.py:492 [Tue Jul 23 02:42:23 2024]
INFO     snakemake.logging:logging.py:505 rule split:
    input: in.txt
    output: in_{*}.txt (dynamic)
    jobid: 1
    reason: Forced execution
    resources: tmpdir=/tmp
INFO     snakemake.logging:logging.py:512
WARNING  snakemake.logging:logging.py:580 Subsequent jobs will be added dynamically depending on the output of this job
WARNING  snakemake.logging:logging.py:580 Shutting down, this might take some time.
ERROR    snakemake.logging:logging.py:584 Exiting because a job execution failed. Look above for error message
WARNING  snakemake.logging:logging.py:580 Complete log: .snakemake/log/2024-07-23T024223.243912.snakemake.log
DEBUG    snakemake.logging:logging.py:586 unlocking
DEBUG    snakemake.logging:logging.py:586 removing lock
DEBUG    snakemake.logging:logging.py:586 removing lock
DEBUG    snakemake.logging:logging.py:586 removed all locks
=============================== warnings summary ===============================
tests/test_schema.py: 12 warnings
tests/tests.py: 6 warnings
  
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_snakemake/build/snakemake/utils.py:50: DeprecationWarning: jsonschema.RefResolver is deprecated as of v4.18.0, in favor of the https://github.com/python-jsonschema/referencing library, which provides more compliant referencing behavior as well as more flexible APIs for customization. A future release will remove RefResolver. Please file a feature request (on referencing) if you are missing an API for the kind of customization you need.
    from jsonschema import validators, RefResolver

tests/testapi.py::test_keep_logger
  /usr/lib/python3/dist-packages/stopit/__init__.py:10: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources

tests/testapi.py::test_lockexception
  /usr/lib/python3/dist-packages/_pytest/python.py:166: PytestReturnNotNoneWarning: Expected None, but tests/testapi.py::test_lockexception returned True, which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`?
    warnings.warn(

tests/tests.py::test_persistent_dict
  /tmp/snakemake-pmpd9u7i/Snakefile:4: UserWarning: pytools.persistent_dict 'mystorage': enabling safe_sync as default. This provides strong protection against data loss, but can be unnecessarily expensive for use cases such as caches. Pass 'safe_sync=False' if occasional data loss is tolerable. Pass 'safe_sync=True' to suppress this warning.
    storage = PersistentDict("mystorage")

tests/tests.py::test_issue328
  /tmp/snakemake-xyjl7tjt/Snakefile:4: UserWarning: pytools.persistent_dict 'storage': enabling safe_sync as default. This provides strong protection against data loss, but can be unnecessarily expensive for use cases such as caches. Pass 'safe_sync=False' if occasional data loss is tolerable. Pass 'safe_sync=True' to suppress this warning.
    storage = PersistentDict("storage")

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED .pybuild/cpython3_3.12_snakemake/build/tests/tests.py::test_persistent_dict
FAILED .pybuild/cpython3_3.12_snakemake/build/tests/tests.py::test_issue328
= 2 failed, 256 passed, 46 skipped, 90 deselected, 1 xfailed, 22 warnings in 1124.50s (0:18:44) =
E: pybuild pybuild:389: test: plugin custom failed with: exit code=1: python3.12 -m pytest -o faulthandler_timeout=60 -v /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_snakemake/build/tests/test*.py -k 'not test_cwl and not test_cwl_singularity and not test_url_include and not test_wrapper and not test_issue1083 and not test_github_issue78 and not test_container and not test_singularity and not test_singularity_conda and not test_convert_to_cwl and not test_report and not test_report_zip and not test_archive and not test_jupyter_notebook and not test_conda and not test_upstream_conda and not test_conda_custom_prefix and not test_script and not test_issue635 and not test_issue1093 and not test_default_resources and not test_default_remote and not test_remote and not test_output_file_cache_remote and not test_tibanna and not test_module_complex and not test_module_complex2 and not test_module_with_script and not test_module_report and not test_modules_meta_wrapper and not test_tes and not test_deploy_script and not test_deploy_hashing and not test_peppy and not test_modules_peppy and not test_report_display_code and not test_wrapper_local_git_prefix and not test_github_issue1396 and not test_az_batch_executor and not test_modules_prefix and not test_ancient and not test_github_issue1062 and not test_symlink_time_handling and not test_env_modules and not test_github_issue1460 and not test_rule_inheritance_globals and not test_prebuilt_conda_script'
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.12 returned exit code 13
make[1]: *** [debian/rules:59: override_dh_auto_test] Error 25
make[1]: Leaving directory '/<<PKGBUILDDIR>>'
make: *** [debian/rules:46: binary] Error 2
dpkg-buildpackage: error: debian/rules binary subprocess returned exit status 2
--------------------------------------------------------------------------------

The above is just how the build ends and not necessarily the most relevant part.
If required, the full build log is available here:

https://people.debian.org/~sanvila/build-logs/202407/
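For what it's worth, the ProgrammingError in both tracebacks is the standard sqlite3 thread-affinity check: a pytools PersistentDict (backed by SQLite) is created in one thread and then used from a worker thread. The behaviour can be reproduced with the standard library alone; the sketch below is illustrative only and is not taken from snakemake or pytools (the thread-local workaround shown is just one common pattern, not necessarily the fix the maintainers will choose):

```python
import sqlite3
import threading
from concurrent.futures import ThreadPoolExecutor

# A connection created in the main thread...
shared_conn = sqlite3.connect(":memory:")

def use_shared():
    # ...raises when used from a worker thread, because sqlite3's
    # check_same_thread option defaults to True.
    try:
        shared_conn.execute("SELECT 1")
        return "ok"
    except sqlite3.ProgrammingError as exc:
        return f"ProgrammingError: {exc}"

# One common workaround: give each thread its own connection.
local = threading.local()

def use_per_thread():
    if not hasattr(local, "conn"):
        local.conn = sqlite3.connect(":memory:")
    local.conn.execute("SELECT 1")
    return "ok"

with ThreadPoolExecutor(max_workers=1) as pool:
    shared_result = pool.submit(use_shared).result()
    per_thread_result = pool.submit(use_per_thread).result()

print(shared_result)      # ProgrammingError: SQLite objects created in a thread ...
print(per_thread_result)  # ok
```

This matches the tracebacks above, where persistent_dict.py dispatches the SQL through concurrent.futures and hits the same check.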

About the archive rebuild: The build was made on virtual machines
of type m6a.large and r6a.large from AWS, using sbuild and a
reduced chroot with only build-essential packages.

If you could not reproduce the bug please contact me privately, as I
am willing to provide ssh access to a virtual machine where the bug is
fully reproducible.

If this is really a bug in one of the build-depends, please use
reassign and affects, so that this is still visible in the BTS web
page for this package.

Thanks.
