Source: pytest-regressions
Version: 2.5.0+ds-2
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: lu...@debian.org
Usertags: ftbfs-20250414 ftbfs-trixie

Hi,

During a rebuild of all packages in testing (trixie), your package failed
to build on i386.


Relevant part (hopefully):
> make[1]: Entering directory '/build/reproducible-path/pytest-regressions-2.5.0+ds'
> # Test o'clock
> dh_auto_test
> I: pybuild base:311: cd /build/reproducible-path/pytest-regressions-2.5.0+ds/.pybuild/cpython3_3.13_pytest-regressions/build; python3.13 -m pytest tests
> ============================= test session starts ==============================
> platform linux -- Python 3.13.2, pytest-8.3.5, pluggy-1.5.0
> rootdir: /build/reproducible-path/pytest-regressions-2.5.0+ds
> configfile: tox.ini
> plugins: regressions-2.5.0+ds, typeguard-4.4.2, datadir-1.4.1+ds
> collected 72 items
> 
> tests/test_data_regression.py ........                                   [ 11%]
> tests/test_dataframe_regression.py ..............                        [ 30%]
> tests/test_file_regression.py ....                                       [ 36%]
> tests/test_filenames.py ...                                              [ 40%]
> tests/test_grids.py ..                                                   [ 43%]
> tests/test_image_regression.py ..                                        [ 45%]
> tests/test_ndarrays_regression.py .F....................                 [ 76%]
> tests/test_num_regression.py .................                           [100%]
> 
> =================================== FAILURES ===================================
> _______________________________ test_common_case _______________________________
> 
> ndarrays_regression = <pytest_regressions.ndarrays_regression.NDArraysRegressionFixture object at 0xf0b3c1a0>
> no_regen = None
> 
>     def test_common_case(ndarrays_regression, no_regen):
>         # Most common case: Data is valid, is present and should pass
>         data1 = np.full(5000, 1.1, dtype=float)
>         data2 = np.arange(5000, dtype=int)
>         ndarrays_regression.check({"data1": data1, "data2": data2})
>     
>         # Assertion error case 1: Data has one invalid place
>         data1 = np.full(5000, 1.1, dtype=float)
>         data2 = np.arange(5000, dtype=int)
>         data1[500] += 0.1
>         with pytest.raises(AssertionError) as excinfo:
>             ndarrays_regression.check({"data1": data1, "data2": data2})
>         obtained_error_msg = str(excinfo.value)
>         expected = "\n".join(
>             [
>                 "Values are not sufficiently close.",
>                 "To update values, use --force-regen option.",
>             ]
>         )
>         assert expected in obtained_error_msg
>         expected = "\n".join(
>             [
>                 "data1:",
>                 "  Shape: (5000,)",
>                 "  Number of differences: 1 / 5000 (0.0%)",
>                 "  Individual errors:",
>                 "          Index              Obtained              Expected            Difference",
>                 "            500    1.2000000000000002                   1.1   0.10000000000000009",
>             ]
>         )
>         assert expected in obtained_error_msg
>     
>         # Assertion error case 2: More than one invalid data
>         data1 = np.full(5000, 1.1, dtype=float)
>         data2 = np.arange(5000, dtype=int)
>         data1[500] += 0.1
>         data1[600] += 0.2
>         data2[0] += 5
>         data2[700:900] += 5
>         with pytest.raises(AssertionError) as excinfo:
>             ndarrays_regression.check({"data1": data1, "data2": data2})
>         obtained_error_msg = str(excinfo.value)
>         expected = "\n".join(
>             [
>                 "Values are not sufficiently close.",
>                 "To update values, use --force-regen option.",
>             ]
>         )
>         assert expected in obtained_error_msg
>         expected = "\n".join(
>             [
>                 "data1:",
>                 "  Shape: (5000,)",
>                 "  Number of differences: 2 / 5000 (0.0%)",
>                 "  Statistics are computed for differing elements only.",
>                 "  Stats for abs(obtained - expected):",
>                 "    Max:     0.19999999999999996",
>                 "    Mean:    0.15000000000000002",
>                 "    Median:  0.15000000000000002",
>                 "  Stats for abs(obtained - expected) / abs(expected):",
>                 "    Max:     0.18181818181818177",
>                 "    Mean:    0.13636363636363638",
>                 "    Median:  0.13636363636363638",
>                 "  Individual errors:",
>                 "          Index              Obtained              Expected            Difference",
>                 "            500    1.2000000000000002                   1.1   0.10000000000000009",
>                 "            600                   1.3                   1.1   0.19999999999999996",
>             ]
>         )
>         assert expected in obtained_error_msg
>         expected = "\n".join(
>             [
>                 "data2:",
>                 "  Shape: (5000,)",
>                 "  Number of differences: 201 / 5000 (4.0%)",
>                 "  Statistics are computed for differing elements only.",
>                 "  Stats for abs(obtained - expected):",
>                 "    Max:     5",
>                 "    Mean:    5.0",
>                 "    Median:  5.0",
>                 "  Stats for abs(obtained - expected) / abs(expected):",
>                 "    Number of (differing) non-zero expected results: 200 / 201 (99.5%)",
>                 "    Relative errors are computed for the non-zero expected results.",
>                 "    Max:     0.007142857142857143",
>                 "    Mean:    0.006286830640674575",
>                 "    Median:  0.006253911138923655",
>                 "  Individual errors:",
>                 "    Only showing first 100 mismatches.",
>                 "          Index              Obtained              Expected            Difference",
>                 "              0                     5                     0                     5",
>                 "            700                   705                   700                     5",
>                 "            701                   706                   701                     5",
>         )
> >       assert expected in obtained_error_msg
> E       AssertionError: assert 'data2:\n  Shape: (5000,)\n  Number of differences: 201 / 5000 (4.0%)\n  Statistics are computed for differing element...          700                     5\n            701                   706                   701                     5' in 'Values are not sufficiently close.\nTo update values, use --force-regen option.\n\ndata1:\n  Shape: (5000,)\n  Number...      797                     5\n            798                   803                   798                     5\n\n'
> 
> tests/test_ndarrays_regression.py:141: AssertionError
> =============================== warnings summary ===============================
> .pybuild/cpython3_3.13_pytest-regressions/build/tests/test_dataframe_regression.py: 15 warnings
> .pybuild/cpython3_3.13_pytest-regressions/build/tests/test_filenames.py: 6 warnings
> .pybuild/cpython3_3.13_pytest-regressions/build/tests/test_num_regression.py: 27 warnings
>   /build/reproducible-path/pytest-regressions-2.5.0+ds/.pybuild/cpython3_3.13_pytest-regressions/build/pytest_regressions/dataframe_regression.py:250: DeprecationWarning: Data type alias 'a' was deprecated in NumPy 2.0. Use the 'S' alias instead.
>     assert array.dtype not in ["m", "M", "O", "S", "a", "U", "V"], (
> 
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED tests/test_ndarrays_regression.py::test_common_case - AssertionError: ...
> ================== 1 failed, 71 passed, 48 warnings in 3.43s ===================
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /build/reproducible-path/pytest-regressions-2.5.0+ds/.pybuild/cpython3_3.13_pytest-regressions/build; python3.13 -m pytest tests
> dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.13 returned exit code 13


The full build log is available from:
http://qa-logs.debian.net/2025/04/14/pytest-regressions_2.5.0+ds-2_testing-i386.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20250414;users=lu...@debian.org
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20250414&fusertaguser=lu...@debian.org&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS. You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.
