Package: src:vcr.py
Version: 6.0.2-2
Severity: serious
Tags: ftbfs trixie sid

Dear maintainer:

During a rebuild of all packages in unstable, your package failed to build:

--------------------------------------------------------------------------------
[...]
 debian/rules clean
dh clean --buildsystem=pybuild
   dh_auto_clean -O--buildsystem=pybuild
I: pybuild base:311: python3.12 setup.py clean 
/usr/lib/python3/dist-packages/setuptools/_distutils/dist.py:261: UserWarning: Unknown distribution option: 'tests_require'
  warnings.warn(msg)
running clean
removing '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_vcr/build' (and everything under it)
'build/bdist.linux-x86_64' does not exist -- can't clean it
'build/scripts-3.12' does not exist -- can't clean it
I: pybuild base:311: python3.13 setup.py clean 
/usr/lib/python3/dist-packages/setuptools/_distutils/dist.py:261: UserWarning: Unknown distribution option: 'tests_require'
  warnings.warn(msg)
running clean

[... snipped ...]

    
        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]
    
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
----------------------------- Captured stderr call -----------------------------
127.0.0.1 - - [18/Jan/2025 15:58:39] "GET / HTTP/1.1" 200 9358
_____________________________ test_domain_redirect _____________________________

    def test_domain_redirect():
        """Ensure that redirects across domains are considered unique"""
        # In this example, seomoz.org redirects to moz.com, and if those
        # requests are considered identical, then we'll be stuck in a redirect
        # loop.
        url = "http://seomoz.org/";
        with vcr.use_cassette("tests/fixtures/wild/domain_redirect.yaml") as 
cass:
>           requests.get(url, headers={"User-Agent": "vcrpy-test"})

tests/integration/test_wild.py:20: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3/dist-packages/requests/api.py:73: in get
    return request("get", url, params=params, **kwargs)
/usr/lib/python3/dist-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3/dist-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3/dist-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3/dist-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fdee7e97ed0>
conn = <vcr.patch.VCRRequestsHTTPConnectiontests/fixtures/wild/domain_redirect.yaml object at 0x7fdee7e6f380>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'vcrpy-test', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = <vcr.patch.VCRRequestsHTTPConnectiontests/fixtures/wild/domain_redirect.yaml object at 0x7fdee7e6f380>
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our
        pool.
    
        :param conn:
            a connection from one of our connection pools
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param url:
            The URL to perform the request on.
    
        :param body:
            Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param response_conn:
            Set this to ``None`` if you will handle releasing the connection or
            set the connection to have the response release it.
    
        :param preload_content:
          If True, the response's body will be preloaded during construction.
    
        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.
    
        :param enforce_content_length:
            Enforce content length checking. Body returned by server must match
            value of Content-Length header, if present. Otherwise, raise error.
        """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
    
        try:
            # Trigger any extra validation we need to do.
            try:
                self._validate_conn(conn)
            except (SocketTimeout, BaseSSLError) as e:
                self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
                raise
    
        # _validate_conn() starts the connection to an HTTPS proxy
        # so we need to wrap errors with 'ProxyError' here too.
        except (
            OSError,
            NewConnectionError,
            TimeoutError,
            BaseSSLError,
            CertificateError,
            SSLError,
        ) as e:
            new_e: Exception = e
            if isinstance(e, (BaseSSLError, CertificateError)):
                new_e = SSLError(e)
            # If the connection didn't successfully connect to it's proxy
            # then there
            if isinstance(
                new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
            ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
                new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
            raise new_e
    
        # conn.request() calls http.client.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        try:
            conn.request(
                method,
                url,
                body=body,
                headers=headers,
                chunked=chunked,
                preload_content=preload_content,
                decode_content=decode_content,
                enforce_content_length=enforce_content_length,
            )
    
        # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
        # legitimately able to close the connection after sending a valid response.
        # With this behaviour, the received response is still readable.
        except BrokenPipeError:
            pass
        except OSError as e:
            # MacOS/Linux
            # EPROTOTYPE and ECONNRESET are needed on macOS
            # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
            # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
            if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
                raise
    
        # Reset the timeout for the recv() on the socket
        read_timeout = timeout_obj.read_timeout
    
        if not conn.is_closed:
            # In Python 3 socket.py will catch EAGAIN and return None when you
            # try and read into the file pointer created by http.client, which
            # instead raises a BadStatusLine exception. Instead of catching
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout
    
        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise
    
        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]
    
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
_________________________________ test_cookies _________________________________

tmpdir = local('/tmp/pytest-of-buildd/pytest-1/test_cookies0')
httpbin = <pytest_httpbin.serve.Server object at 0x7fdee80b6a50>

    def test_cookies(tmpdir, httpbin):
        testfile = str(tmpdir.join("cookies.yml"))
        with vcr.use_cassette(testfile):
            with requests.Session() as s:
>               s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2")

tests/integration/test_wild.py:67: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3/dist-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3/dist-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3/dist-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3/dist-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3/dist-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fdee7dd90f0>
conn = <vcr.patch.VCRRequestsHTTPConnection/tmp/pytest-of-buildd/pytest-1/test_cookies0/cookies.yml object at 0x7fdee7e6d550>
method = 'GET', url = '/cookies/set?k1=v1&k2=v2', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = <vcr.patch.VCRRequestsHTTPConnection/tmp/pytest-of-buildd/pytest-1/test_cookies0/cookies.yml object at 0x7fdee7e6d550>
preload_content = False, decode_content = False, enforce_content_length = True

[... _make_request traceback identical to the one above ...]

E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3/dist-packages/urllib3/connectionpool.py:551: AttributeError
----------------------------- Captured stderr call -----------------------------
127.0.0.1 - - [18/Jan/2025 15:58:39] "GET /cookies/set?k1=v1&k2=v2 HTTP/1.1" 302 203
=========================== short test summary info ============================
FAILED tests/integration/test_urllib3.py::test_status_code[http] - AttributeE...
FAILED tests/integration/test_urllib3.py::test_headers[http] - AttributeError...
FAILED tests/integration/test_urllib3.py::test_body[http] - AttributeError: '...
FAILED tests/integration/test_urllib3.py::test_auth[http] - AttributeError: '...
FAILED tests/integration/test_urllib3.py::test_auth_failed[http] - AttributeE...
FAILED tests/integration/test_urllib3.py::test_post[http] - AttributeError: '...
FAILED tests/integration/test_urllib3.py::test_gzip[http] - AttributeError: '...
FAILED tests/integration/test_urllib3.py::test_status_code[https] - Attribute...
FAILED tests/integration/test_urllib3.py::test_headers[https] - AttributeErro...
FAILED tests/integration/test_urllib3.py::test_body[https] - AttributeError: ...
FAILED tests/integration/test_urllib3.py::test_auth[https] - AttributeError: ...
FAILED tests/integration/test_urllib3.py::test_auth_failed[https] - Attribute...
FAILED tests/integration/test_urllib3.py::test_post[https] - AttributeError: ...
FAILED tests/integration/test_urllib3.py::test_gzip[https] - AttributeError: ...
FAILED tests/integration/test_proxy.py::test_use_proxy - AttributeError: 'VCR...
FAILED tests/integration/test_urllib3.py::test_cross_scheme - AttributeError:...
FAILED tests/integration/test_urllib3.py::test_https_with_cert_validation_disabled
FAILED tests/integration/test_wild.py::test_domain_redirect - AttributeError:...
FAILED tests/integration/test_wild.py::test_cookies - AttributeError: 'VCRHTT...
=========== 19 failed, 265 passed, 3 skipped, 19 deselected in 2.03s ===========
E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_vcr/build; python3.13 -m pytest --ignore tests/integration/test_aiohttp.py --ignore tests/integration/test_tornado.py --ignore tests/integration/test_requests.py -m "not online" -k "not test_basic_json_use and not test_load_cassette_with_custom_persister"
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.13" returned exit code 13
make: *** [debian/rules:21: binary] Error 25
dpkg-buildpackage: error: debian/rules binary subprocess returned exit status 2
--------------------------------------------------------------------------------

The above is just how the build ends and not necessarily the most relevant part.
If required, the full build log is available here:

https://people.debian.org/~sanvila/build-logs/202501/
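
For what it's worth, the failure pattern suggests (this is an educated
guess, not a verified diagnosis) that urllib3 2.3 started logging
response.version_string at the end of connectionpool._make_request(), and
that vcrpy's stubbed VCRHTTPResponse simply does not provide that
attribute: all 19 failures stop at the same log.debug() call. A shim along
the following lines might be enough to restore compatibility. It is only a
sketch: the class below is a simplified stand-in for vcrpy's response
stub, and the integer "version" attribute and the mapping are assumptions,
not the actual upstream fix.

class VCRHTTPResponse:  # simplified stand-in for vcrpy's response stub
    def __init__(self, recorded_response: dict):
        self.recorded_response = recorded_response
        # The real stub mimics http.client, which stores the protocol
        # version as an integer (10 for HTTP/1.0, 11 for HTTP/1.1);
        # that attribute is assumed to exist here.
        self.version = 11

    @property
    def version_string(self) -> str:
        # Translate the integer version into the "HTTP/x.y" form that
        # urllib3's debug log line expects.
        return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            self.version, "HTTP/?"
        )

If that guess is right, the change belongs in the response stub itself and
should make all of the above failures disappear at once.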

About the archive rebuild: The build was made on virtual machines from AWS,
using sbuild and a reduced chroot with only build-essential packages.

If you could not reproduce the bug, please contact me privately, as I
am willing to provide ssh access to a virtual machine where the bug is
fully reproducible.

If this is really a bug in one of the build-depends, please use reassign
and add an affects on src:vcr.py, so that this remains visible on the
BTS web page for this package.

Thanks.
