Your message dated Sat, 17 Aug 2024 14:50:58 +0000
with message-id <e1sfkle-009qto...@fasolo.debian.org>
and subject line Bug#1076827: fixed in python-scrapy 2.11.2-2
has caused the Debian Bug report #1076827,
regarding python-scrapy: FTBFS: Tries to access Internet during build
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case, it is now your responsibility to reopen the
Bug report if necessary, and/or fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact ow...@bugs.debian.org
immediately.)


-- 
1076827: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1076827
Debian Bug Tracking System
Contact ow...@bugs.debian.org with problems
--- Begin Message ---
Package: src:python-scrapy
Version: 2.11.2-1
Severity: serious
Tags: ftbfs

Dear maintainer:

During a rebuild of all packages in unstable, your package failed to build:

--------------------------------------------------------------------------------
[...]
 debian/rules binary
dh binary --buildsystem=pybuild
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
I: pybuild base:311: python3.12 setup.py config
running config
   dh_auto_build -O--buildsystem=pybuild
I: pybuild base:311: /usr/bin/python3 setup.py build
running build
running build_py
creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy
copying scrapy/interfaces.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy
copying scrapy/mail.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy

[... snipped ...]

KeyError: "namespace: urls key: {'url': 
'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 174, in 
_new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 73, in 
create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/socket.py", line 964, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 716, in 
urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 405, in 
_make_request
    self._validate_conn(conn)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 1059, 
in _validate_conn
    conn.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 363, in 
connect
    self.sock = conn = self._new_conn()
                       ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 186, in 
_new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object 
at 0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary 
failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 800, in 
urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 592, in 
increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: 
HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded 
with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by 
NewConnectionError('<urllib3.connection.HTTPSConnection object at 
0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary 
failure in name resolution'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/suffix_list.py", line 46, in 
find_first_response
    return cache.cached_fetch_url(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 220, in 
cached_fetch_url
    return self.run_and_cache(
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 211, in 
run_and_cache
    result = func(**kwargs)
             ^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 229, in 
_fetch_url
    response = session.get(url, timeout=timeout)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 589, in 
request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: 
HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded 
with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by 
NewConnectionError('<urllib3.connection.HTTPSConnection object at 
0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary 
failure in name resolution'))
------------------------------ Captured log call -------------------------------
ERROR    tldextract:suffix_list.py:50 Exception reading Public Suffix List url file:///usr/share/publicsuffix/effective_tld_names.dat
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('file:///usr/share/publicsuffix/effective_tld_names.dat', 'https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: urls key: {'url': 
'file:///usr/share/publicsuffix/effective_tld_names.dat'}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/suffix_list.py", line 46, in 
find_first_response
    return cache.cached_fetch_url(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 220, in 
cached_fetch_url
    return self.run_and_cache(
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 211, in 
run_and_cache
    result = func(**kwargs)
             ^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 230, in 
_fetch_url
    response.raise_for_status()
  File "/usr/lib/python3/dist-packages/requests/models.py", line 1021, in 
raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: None for url: None
DEBUG    urllib3.connectionpool:connectionpool.py:1020 Starting new HTTPS 
connection (1): publicsuffix.org:443
ERROR    tldextract:suffix_list.py:50 Exception reading Public Suffix List url 
https://publicsuffix.org/list/public_suffix_list.dat
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: publicsuffix.org-tlds key: {'urls': 
('file:///usr/share/publicsuffix/effective_tld_names.dat', 
'https://publicsuffix.org/list/public_suffix_list.dat', 
'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 
'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: urls key: {'url': 
'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 174, in 
_new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 73, in 
create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/socket.py", line 964, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 716, in 
urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 405, in 
_make_request
    self._validate_conn(conn)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 1059, 
in _validate_conn
    conn.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 363, in 
connect
    self.sock = conn = self._new_conn()
                       ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 186, in 
_new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object 
at 0x7f8330b887d0>: Failed to establish a new connection: [Errno -3] Temporary 
failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 800, in 
urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 592, in 
increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='publicsuffix.org', 
port=443): Max retries exceeded with url: /list/public_suffix_list.dat (Caused by 
NewConnectionError('<urllib3.connection.HTTPSConnection object at 
0x7f8330b887d0>: Failed to establish a new connection: [Errno -3] Temporary 
failure in name resolution'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/suffix_list.py", line 46, in 
find_first_response
    return cache.cached_fetch_url(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 220, in 
cached_fetch_url
    return self.run_and_cache(
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 211, in 
run_and_cache
    result = func(**kwargs)
             ^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 229, in 
_fetch_url
    response = session.get(url, timeout=timeout)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 589, in 
request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='publicsuffix.org', 
port=443): Max retries exceeded with url: /list/public_suffix_list.dat (Caused by 
NewConnectionError('<urllib3.connection.HTTPSConnection object at 
0x7f8330b887d0>: Failed to establish a new connection: [Errno -3] Temporary 
failure in name resolution'))
DEBUG    urllib3.connectionpool:connectionpool.py:1020 Starting new HTTPS 
connection (1): raw.githubusercontent.com:443
ERROR    tldextract:suffix_list.py:50 Exception reading Public Suffix List url 
https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: publicsuffix.org-tlds key: {'urls': 
('file:///usr/share/publicsuffix/effective_tld_names.dat', 
'https://publicsuffix.org/list/public_suffix_list.dat', 
'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 
'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 209, in 
run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 111, in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
KeyError: "namespace: urls key: {'url': 
'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 174, in 
_new_conn
    conn = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 73, in 
create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/socket.py", line 964, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 716, in 
urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 405, in 
_make_request
    self._validate_conn(conn)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 1059, 
in _validate_conn
    conn.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 363, in 
connect
    self.sock = conn = self._new_conn()
                       ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 186, in 
_new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object 
at 0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary 
failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 800, in 
urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 592, in 
increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: 
HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded 
with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by 
NewConnectionError('<urllib3.connection.HTTPSConnection object at 
0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary 
failure in name resolution'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tldextract/suffix_list.py", line 46, in 
find_first_response
    return cache.cached_fetch_url(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 220, in 
cached_fetch_url
    return self.run_and_cache(
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 211, in 
run_and_cache
    result = func(**kwargs)
             ^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/tldextract/cache.py", line 229, in 
_fetch_url
    response = session.get(url, timeout=timeout)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 589, in 
request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: 
HTTPSConnectionPool(host='raw.githubusercontent.com', port=443): Max retries exceeded 
with url: /publicsuffix/list/master/public_suffix_list.dat (Caused by 
NewConnectionError('<urllib3.connection.HTTPSConnection object at 
0x7f8330b89ca0>: Failed to establish a new connection: [Errno -3] Temporary 
failure in name resolution'))
=============================== warnings summary ===============================
scrapy/spidermiddlewares/offsite.py:15
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/spidermiddlewares/offsite.py:15: ScrapyDeprecationWarning: The scrapy.spidermiddlewares.offsite module is deprecated, use scrapy.downloadermiddlewares.offsite instead.
    warnings.warn(

tests/test_addons.py: 2 warnings
tests/test_crawler.py: 2 warnings
tests/test_downloaderslotssettings.py: 1 warning
tests/test_extension_periodic_log.py: 16 warnings
tests/test_spider.py: 6 warnings
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/utils/request.py:254: ScrapyDeprecationWarning: '2.6' is a deprecated value for the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting.
It is also the default value. In other words, it is normal to get this warning if you have not defined a value for the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting. This is so for backward compatibility reasons, but it will change in a future version of Scrapy. See the documentation of the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting for information on how to handle this deprecation.
    return cls(crawler)

tests/test_contracts.py::ContractsManagerTest::test_returns_async
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/contracts/__init__.py:170: RuntimeWarning: coroutine 'TestSpider.returns_request_async' was never awaited
    results.addError(case, sys.exc_info())

tests/test_crawl.py: 2 warnings
tests/test_downloader_handlers.py: 161 warnings
tests/test_downloader_handlers_http2.py: 34 warnings
tests/test_webclient.py: 4 warnings
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/core/downloader/contextfactory.py:90: DeprecationWarning: Passing method to twisted.internet.ssl.CertificateOptions was deprecated in Twisted 17.1.0. Please use a combination of insecurelyLowerMinimumTo, raiseMinimumTo, and lowerMaximumSecurityTo instead, as Twisted will correctly configure the method.
    return CertificateOptions(

tests/test_downloadermiddleware_offsite.py::test_process_request_invalid_domains
tests/test_downloadermiddleware_offsite.py::test_request_scheduled_invalid_domains
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/downloadermiddlewares/offsite.py:67: UserWarning: allowed_domains accepts only domains, not URLs. Ignoring URL entry http:////b.example in allowed_domains.
    warnings.warn(message)

tests/test_feedexport.py::StdoutFeedStoragePreFeedOptionsTest::test_init
tests/test_feedexport.py::StdoutFeedStoragePreFeedOptionsTest::test_init
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: StdoutFeedStorageWithoutFeedOptions does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::FileFeedStoragePreFeedOptionsTest::test_init
tests/test_feedexport.py::FileFeedStoragePreFeedOptionsTest::test_init
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: FileFeedStorageWithoutFeedOptions does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_from_crawler
tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_from_crawler
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: S3FeedStorageWithoutFeedOptionsWithFromCrawler.from_crawler does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_init
tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_init
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: S3FeedStorageWithoutFeedOptions does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_from_crawler
tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_from_crawler
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: FTPFeedStorageWithoutFeedOptionsWithFromCrawler.from_crawler does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_init
tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_init
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/extensions/feedexport.py:49: ScrapyDeprecationWarning: FTPFeedStorageWithoutFeedOptions does not support the 'feed_options' keyword argument. Add a 'feed_options' parameter to its signature to remove this warning. This parameter will become mandatory in a future version of Scrapy.
    warnings.warn(

tests/test_utils_datatypes.py::CaseInsensitiveDictTest::test_getdefault
tests/test_utils_datatypes.py::CaselessDictTest::test_getdefault
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:95: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = CaselessDict()

tests/test_utils_datatypes.py::CaseInsensitiveDictTest::test_setdefault
tests/test_utils_datatypes.py::CaselessDictTest::test_setdefault
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:101: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = CaselessDict({"a": 1, "b": 2})

tests/test_utils_datatypes.py::CaselessDictTest::test_caseless
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:79: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class()

tests/test_utils_datatypes.py::CaselessDictTest::test_contains
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:132: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class()

tests/test_utils_datatypes.py::CaselessDictTest::test_copy
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:185: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    h1 = self.dict_class({"header1": "value"})

tests/test_utils_datatypes.py::CaselessDictTest::test_copy
tests/test_utils_datatypes.py::CaselessDictTest::test_copy
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/utils/datatypes.py:55: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    return self.__class__(self)

tests/test_utils_datatypes.py::CaselessDictTest::test_delete
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:89: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class({"key_lower": 1})

tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/utils/datatypes.py:80: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    return cls((k, value) for k in keys)

tests/test_utils_datatypes.py::CaselessDictTest::test_fromkeys
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:122: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    instance = self.dict_class()

tests/test_utils_datatypes.py::CaselessDictTest::test_init_dict
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:24: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class(seq)

tests/test_utils_datatypes.py::CaselessDictTest::test_init_mapping
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:49: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class(seq)

tests/test_utils_datatypes.py::CaselessDictTest::test_init_mutable_mapping
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:74: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class(seq)

tests/test_utils_datatypes.py::CaselessDictTest::test_init_pair_sequence
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:30: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class(seq)

tests/test_utils_datatypes.py::CaselessDictTest::test_normkey
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:149: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict()

tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:161: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict({"key": 1})

tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:165: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict()

tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:170: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict()

tests/test_utils_datatypes.py::CaselessDictTest::test_normvalue
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:175: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = MyDict()

tests/test_utils_datatypes.py::CaselessDictTest::test_pop
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/tests/test_utils_datatypes.py:137: ScrapyDeprecationWarning: scrapy.utils.datatypes.CaselessDict is deprecated, please use scrapy.utils.datatypes.CaseInsensitiveDict instead
    d = self.dict_class()

tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_none
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_none
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
tests/test_utils_misc/test_return_with_argument_inside_generator.py::UtilsMiscPy3TestCase::test_generators_return_something
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build/scrapy/utils/misc.py:249: DeprecationWarning: ast.NameConstant is deprecated and will be removed in Python 3.14; use ast.Constant instead
    value is None or isinstance(value, ast.NameConstant) and value.value is None

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED tests/test_crawl.py::CrawlTestCase::test_unbounded_response - twisted....
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_basic
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_complex_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookie_redirect_different_domain
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookie_redirect_different_domain_forcing_get
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookie_redirect_same_domain
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookie_redirect_same_domain_forcing_get
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_cookiejar_key
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_do_not_break_on_non_utf8_header
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_dont_merge_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_invalid_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_local_domain
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_merge_request_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_primitive_type_cookies
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_request_cookies_encoding
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_server_set_cookie_domain_public_period
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_server_set_cookie_domain_suffix_private
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_server_set_cookie_domain_suffix_public_period
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_server_set_cookie_domain_suffix_public_private
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_setting_disabled_cookies_debug
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_setting_enabled_cookies_debug
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_user_set_cookie_domain_public_period
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_user_set_cookie_domain_suffix_private
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_user_set_cookie_domain_suffix_public_period
FAILED tests/test_downloadermiddleware_cookies.py::CookiesMiddlewareTest::test_user_set_cookie_domain_suffix_public_private
= 25 failed, 3187 passed, 306 skipped, 5 deselected, 21 xfailed, 277 warnings in 379.84s (0:06:19) =
E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_scrapy/build; python3.12 -m pytest --ignore tests/test_command_check.py -k 'not (test_start_requests_laziness or test_utf16)'
dh_auto_test: error: pybuild --test -i python{version} -p 3.12 returned exit code 13
make: *** [debian/rules:17: binary] Error 25
dpkg-buildpackage: error: debian/rules binary subprocess returned exit status 2
--------------------------------------------------------------------------------

The above is just how the build ends; it is not necessarily the most
relevant part of the log.

For a full build log, please see:

https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/python-scrapy.html

Note: You can reproduce this easily by trying to build the package using
the sbuild unshare backend.
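
A side note on the mechanism, for whoever picks this up: the failing
cookie tests go through tldextract, which tries the three Public Suffix
List URLs shown in the tracebacks in order, and only the first of them
is a local file (/usr/share/publicsuffix/effective_tld_names.dat, the
copy shipped by the publicsuffix package). Below is a minimal sketch of
the offline path, assuming tldextract's suffix_list_urls and
fallback_to_snapshot parameters as they appear in the cache keys above;
the variable name is mine:

    import tldextract

    # Restrict tldextract to the local Public Suffix List copy installed
    # by the Debian "publicsuffix" package, so it never opens a socket.
    offline_extract = tldextract.TLDExtract(
        suffix_list_urls=["file:///usr/share/publicsuffix/effective_tld_names.dat"],
        # If the local file is missing, fall back to tldextract's bundled
        # snapshot instead of fetching over HTTPS.
        fallback_to_snapshot=True,
    )

    print(offline_extract("www.example.co.uk").suffix)  # -> "co.uk"

With that file present in the build chroot, the first URL succeeds and
the HTTPS fallbacks are never reached.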

About the archive rebuild: The build was performed on AWS virtual
machines of type m6a.large and r6a.large, using sbuild and a reduced
chroot with only build-essential packages.

If you could not reproduce the bug, please contact me privately, as I
am willing to provide ssh access to a virtual machine where the bug is
fully reproducible.

If this is really a bug in one of the build-depends, please use
reassign and affects, so that it remains visible on the BTS web page
for this package.

Thanks.

--- End Message ---
--- Begin Message ---
Source: python-scrapy
Source-Version: 2.11.2-2
Done: Andrey Rakhmatullin <w...@debian.org>

We believe that the bug you reported is fixed in the latest version of
python-scrapy, which is due to be installed in the Debian FTP archive.

A summary of the changes between this version and the previous one is
attached.

Thank you for reporting the bug, which will now be closed.  If you
have further comments, please address them to 1076...@bugs.debian.org,
and the maintainer will reopen the bug report if appropriate.

Debian distribution maintenance software
pp.
Andrey Rakhmatullin <w...@debian.org> (supplier of updated python-scrapy package)

(This message was generated automatically at their request; if you
believe that there is a problem with it please contact the archive
administrators by mailing ftpmas...@ftp-master.debian.org)


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

Format: 1.8
Date: Sat, 17 Aug 2024 18:51:11 +0500
Source: python-scrapy
Architecture: source
Version: 2.11.2-2
Distribution: unstable
Urgency: medium
Maintainer: Debian Python Team <team+pyt...@tracker.debian.org>
Changed-By: Andrey Rakhmatullin <w...@debian.org>
Closes: 1076827
Changes:
 python-scrapy (2.11.2-2) unstable; urgency=medium
 .
   * Add B-D: publicsuffix to stop requiring network access (Closes: #1076827).
   * Set CRYPTOGRAPHY_OPENSSL_NO_LEGACY=1 to work around #1078747.
Checksums-Sha1:
 a245fa3fc671a920f941ac04a1993cb70ed3c559 3496 python-scrapy_2.11.2-2.dsc
 739a40064cfc4a802d5efb8a23d14d9eb6b35147 10628 python-scrapy_2.11.2-2.debian.tar.xz
 d59bdc85aa5a046e4a5e08c64205a76d5bf35b75 11063 python-scrapy_2.11.2-2_amd64.buildinfo
Checksums-Sha256:
 1c8a9f3d4c56328e74a99045d5cdcd069382c6c47cbdafd570c0760c7d2140b5 3496 python-scrapy_2.11.2-2.dsc
 5c475bed0ab11e082e42215503d88728150d7200157d3f042ec67f9939d8303a 10628 python-scrapy_2.11.2-2.debian.tar.xz
 5cc998023c413803c123fd7fd8a6947e0b53dafc2498c7dd09ab67b3708bb5bb 11063 python-scrapy_2.11.2-2_amd64.buildinfo
Files:
 a43a574e5fb97a768f7d40a4c8091d2f 3496 python optional python-scrapy_2.11.2-2.dsc
 b02133791fab2b6fb633edfc45425f3c 10628 python optional python-scrapy_2.11.2-2.debian.tar.xz
 edc70e2cc64eba20fff184ecd4ddd224 11063 python optional python-scrapy_2.11.2-2_amd64.buildinfo

-----BEGIN PGP SIGNATURE-----

iQIzBAEBCgAdFiEEolIP6gqGcKZh3YxVM2L3AxpJkuEFAmbAriAACgkQM2L3AxpJ
kuGt6A//W/QXObLk+9Y5H18VuR1rVrymvLZL3tTZcKpPiB8IDwtA7SnFUCoMMNi1
ps0vr7uRK6QN4yP6eSAEZEOZX/xSV57lhqUzpXOR0zWXcomF37H6KyvNqd6W9MfV
5EF9aTKSy6zg+7YrZZuDSTM+LPhykLPKeTBKjqFytiWn+fX5GWuFI5Yon4K4Eitj
DHRoDgjvIHFZnv3UY1CHL90bKwk3cHl+J/tvLuYmSC5DsF2Sm+GYHnx/nQ6wplWA
p/u+JPJKTGcMUPpBKu7aE3SEs3B8/wD7CM7/4PId9AVC5fcHivUQr0Aucm9MjSSH
v+GXNjPjkACkTWFWtGBAPMzogfHe1nnibhQNJLcmM7rD4C2M93kTzLqt1nqNlvA9
zSlYWmIWQ1okenjW9rZD26Yo0VCiBCQblWt1dHC+MHkXQMeR1g5/KLzG3uAIctPL
sZzw49cF5eB/TPS3IlqYe3Ti5fey5u0eVaTVfPTjxbfYO7BSviFuEqRN7w6oJ+in
VD19IMY0DMScSXY4tmb4GvrIj0tBlvkKPQlE/wuPLlS5WaZZgOOduV/R5Wnuolnn
V64wtvaPaQA5TSgZQXAyspgvcOYNIplnbyocGaN5an21TGEjKGmF4sbiTcFDXT2m
klqmfRylDLOFgt/i5c+7sq9R4ooJilfw8MQgp+4mBpO4vDcwftk=
=6zYy
-----END PGP SIGNATURE-----



--- End Message ---
