Your message dated Sat, 03 Aug 2024 14:35:57 +0000
with message-id <e1safrr-009but...@fasolo.debian.org>
and subject line Bug#1073442: fixed in python-requests-toolbelt 1.0.0-3
has caused the Debian Bug report #1073442,
regarding python-requests-toolbelt: FTBFS: dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.11" returned exit code 13
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case it is now your responsibility to reopen the
Bug report if necessary, and/or fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact ow...@bugs.debian.org
immediately.)


-- 
1073442: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1073442
Debian Bug Tracking System
Contact ow...@bugs.debian.org with problems
--- Begin Message ---
Source: python-requests-toolbelt
Version: 1.0.0-2
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: lu...@debian.org
Usertags: ftbfs-20240615 ftbfs-trixie

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.


Relevant part (hopefully):
>  debian/rules binary
> dh binary --with python3 --buildsystem=pybuild
>    dh_update_autotools_config -O--buildsystem=pybuild
>    dh_autoreconf -O--buildsystem=pybuild
>    dh_auto_configure -O--buildsystem=pybuild
>       pybuild --configure -i python{version} -p "3.12 3.11"
> I: pybuild base:311: python3.12 setup.py config 
> running config
> I: pybuild base:311: python3.11 setup.py config 
> running config
>    dh_auto_build -O--buildsystem=pybuild
>       pybuild --build -i python{version} -p "3.12 3.11"
> I: pybuild base:311: /usr/bin/python3.12 setup.py build 
> running build
> running build_py
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/streaming_iterator.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/exceptions.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/_compat.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/sessions.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/source.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/socket_options.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/ssl.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/fingerprint.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/host_header_ssl.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/x509.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/adapters
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/guess.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/_digest_auth_compat.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/http_proxy_digest.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/handler.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/auth
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/tee.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/stream.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/downloadutils
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/decoder.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/encoder.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/multipart
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/pool.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/thread.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/threaded
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/dump.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/user_agent.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/formdata.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/deprecated.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/utils
> running egg_info
> creating requests_toolbelt.egg-info
> writing requests_toolbelt.egg-info/PKG-INFO
> writing dependency_links to requests_toolbelt.egg-info/dependency_links.txt
> writing requirements to requests_toolbelt.egg-info/requires.txt
> writing top-level names to requests_toolbelt.egg-info/top_level.txt
> writing manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> reading manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> reading manifest template 'MANIFEST.in'
> no previously-included directories found matching 'docs/_build'
> warning: no previously-included files matching '*.py[cdo]' found anywhere in 
> distribution
> warning: no previously-included files matching '__pycache__' found anywhere 
> in distribution
> warning: no previously-included files matching '*.so' found anywhere in 
> distribution
> warning: no previously-included files matching '*.pyd' found anywhere in 
> distribution
> adding license file 'LICENSE'
> adding license file 'AUTHORS.rst'
> writing manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> /usr/lib/python3/dist-packages/setuptools/command/build_py.py:204: _Warning: 
> Package 'requests_toolbelt.cookies' is absent from the `packages` 
> configuration.
> !!
> 
>         
> ********************************************************************************
>         ############################
>         # Package would be ignored #
>         ############################
>         Python recognizes 'requests_toolbelt.cookies' as an importable 
> package[^1],
>         but it is absent from setuptools' `packages` configuration.
> 
>         This leads to an ambiguous overall configuration. If you want to 
> distribute this
>         package, please make sure that 'requests_toolbelt.cookies' is 
> explicitly added
>         to the `packages` configuration field.
> 
>         Alternatively, you can also rely on setuptools' discovery methods
>         (for example by using `find_namespace_packages(...)`/`find_namespace:`
>         instead of `find_packages(...)`/`find:`).
> 
>         You can read more about "package discovery" on setuptools 
> documentation page:
> 
>         - 
> https://setuptools.pypa.io/en/latest/userguide/package_discovery.html
> 
>         If you don't want 'requests_toolbelt.cookies' to be distributed and 
> are
>         already explicitly excluding 'requests_toolbelt.cookies' via
>         `find_namespace_packages(...)/find_namespace` or 
> `find_packages(...)/find`,
>         you can try to use `exclude_package_data`, or 
> `include-package-data=False` in
>         combination with a more fine grained `package-data` configuration.
> 
>         You can read more about "package data files" on setuptools 
> documentation page:
> 
>         - https://setuptools.pypa.io/en/latest/userguide/datafiles.html
> 
> 
>         [^1]: For Python, any directory (with suitable naming) can be 
> imported,
>               even if it does not contain any `.py` files.
>               On the other hand, currently there is no concept of package data
>               directory, all directories are treated like packages.
>         
> ********************************************************************************
> 
> !!
>   check.warn(importable)
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/cookies
> copying requests_toolbelt/cookies/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/cookies
> copying requests_toolbelt/cookies/forgetful.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build/requests_toolbelt/cookies
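
The setuptools notice above is only about package discovery, not the test
failure itself, but for reference: one way to make the "Package would be
ignored" warning go away is to let setuptools find every requests_toolbelt
sub-package, cookies included. The following is a hypothetical setup.py
sketch, not the toolbelt's actual configuration, which may list its packages
differently:

    # Hypothetical setup.py excerpt: discover all sub-packages so that
    # requests_toolbelt.cookies is named in the `packages` configuration.
    from setuptools import find_packages, setup

    setup(
        name="requests-toolbelt",
        packages=find_packages(
            include=["requests_toolbelt", "requests_toolbelt.*"],
        ),
        # ... remaining metadata unchanged ...
    )

Explicitly appending 'requests_toolbelt.cookies' to an existing packages list
would silence the warning just as well.
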
> I: pybuild base:311: /usr/bin/python3 setup.py build 
> running build
> running build_py
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/streaming_iterator.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/exceptions.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/_compat.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> copying requests_toolbelt/sessions.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/source.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/socket_options.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/ssl.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/fingerprint.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/host_header_ssl.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> copying requests_toolbelt/adapters/x509.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/adapters
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/guess.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/_digest_auth_compat.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/http_proxy_digest.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> copying requests_toolbelt/auth/handler.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/auth
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/tee.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/downloadutils
> copying requests_toolbelt/downloadutils/stream.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/downloadutils
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/decoder.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/multipart
> copying requests_toolbelt/multipart/encoder.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/multipart
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/pool.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/threaded
> copying requests_toolbelt/threaded/thread.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/threaded
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/dump.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/user_agent.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/formdata.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> copying requests_toolbelt/utils/deprecated.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/utils
> running egg_info
> writing requests_toolbelt.egg-info/PKG-INFO
> writing dependency_links to requests_toolbelt.egg-info/dependency_links.txt
> writing requirements to requests_toolbelt.egg-info/requires.txt
> writing top-level names to requests_toolbelt.egg-info/top_level.txt
> reading manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> reading manifest template 'MANIFEST.in'
> no previously-included directories found matching 'docs/_build'
> warning: no previously-included files matching '*.py[cdo]' found anywhere in 
> distribution
> warning: no previously-included files matching '__pycache__' found anywhere 
> in distribution
> warning: no previously-included files matching '*.so' found anywhere in 
> distribution
> warning: no previously-included files matching '*.pyd' found anywhere in 
> distribution
> adding license file 'LICENSE'
> adding license file 'AUTHORS.rst'
> writing manifest file 'requests_toolbelt.egg-info/SOURCES.txt'
> /usr/lib/python3/dist-packages/setuptools/command/build_py.py:204: _Warning: 
> Package 'requests_toolbelt.cookies' is absent from the `packages` 
> configuration.
> !!
> 
>         
> ********************************************************************************
>         ############################
>         # Package would be ignored #
>         ############################
>         Python recognizes 'requests_toolbelt.cookies' as an importable 
> package[^1],
>         but it is absent from setuptools' `packages` configuration.
> 
>         This leads to an ambiguous overall configuration. If you want to 
> distribute this
>         package, please make sure that 'requests_toolbelt.cookies' is 
> explicitly added
>         to the `packages` configuration field.
> 
>         Alternatively, you can also rely on setuptools' discovery methods
>         (for example by using `find_namespace_packages(...)`/`find_namespace:`
>         instead of `find_packages(...)`/`find:`).
> 
>         You can read more about "package discovery" on setuptools 
> documentation page:
> 
>         - 
> https://setuptools.pypa.io/en/latest/userguide/package_discovery.html
> 
>         If you don't want 'requests_toolbelt.cookies' to be distributed and 
> are
>         already explicitly excluding 'requests_toolbelt.cookies' via
>         `find_namespace_packages(...)/find_namespace` or 
> `find_packages(...)/find`,
>         you can try to use `exclude_package_data`, or 
> `include-package-data=False` in
>         combination with a more fine grained `package-data` configuration.
> 
>         You can read more about "package data files" on setuptools 
> documentation page:
> 
>         - https://setuptools.pypa.io/en/latest/userguide/datafiles.html
> 
> 
>         [^1]: For Python, any directory (with suitable naming) can be 
> imported,
>               even if it does not contain any `.py` files.
>               On the other hand, currently there is no concept of package data
>               directory, all directories are treated like packages.
>         
> ********************************************************************************
> 
> !!
>   check.warn(importable)
> creating 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/cookies
> copying requests_toolbelt/cookies/__init__.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/cookies
> copying requests_toolbelt/cookies/forgetful.py -> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build/requests_toolbelt/cookies
>    dh_auto_test -O--buildsystem=pybuild
>       pybuild --test --test-pytest -i python{version} -p "3.12 3.11"
> I: pybuild base:311: cd 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build; python3.12 
> -m pytest -k 'not test_reads_open_file_objects and not 
> test_reads_open_file_objects_using_to_string and not 
> test_reads_open_file_objects_with_a_specified_filename'
> ..........................................FF......ssss.................. [ 43%]
> ................................FF.FF..............................sss.. [ 86%]
> ......................                                                   [100%]
> =================================== FAILURES ===================================
> ___________________ TestDumpRealResponses.test_dump_response ___________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfbd2140>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfbd2140>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfbd2140>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_dump.TestDumpRealResponses object at 0x7fc0bfc36db0>
> 
>     def test_dump_response(self):
>         session = requests.Session()
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('https://httpbin.org/get')
> 
> tests/test_dump.py:376: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
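
All of the failures here are the same symptom: the betamax cassette replays a
body that is 5 bytes shorter than the Content-Length it declares, and urllib3
2.x enforces Content-Length while streaming, so the short read surfaces as
IncompleteRead, is wrapped into ProtocolError, and requests finally reports it
as ChunkedEncodingError. Below is a minimal sketch of that chain, assuming
urllib3 >= 2 (where enforce_content_length defaults to on) and using made-up
numbers that mirror the log (234 bytes delivered, 239 promised):

    import io

    from urllib3.exceptions import ProtocolError
    from urllib3.response import HTTPResponse

    # A response whose header promises 5 more bytes than the body holds,
    # roughly what a stale cassette hands back to requests.
    resp = HTTPResponse(
        body=io.BytesIO(b"x" * 234),
        headers={"Content-Length": "239"},
        status=200,
        preload_content=False,
        request_method="GET",
    )

    try:
        # stream() keeps reading until Content-Length is satisfied, so the
        # short body raises IncompleteRead, which _error_catcher re-raises
        # as ProtocolError -- the same chain shown in these tracebacks.
        for _ in resp.stream(10240):
            pass
    except ProtocolError as exc:
        print(exc)

Re-recording the cassettes (or correcting their recorded Content-Length) would
make the replayed bodies consistent again; the sketch only shows why requests
ends up reporting a ChunkedEncodingError.
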
> _____________________ TestDumpRealResponses.test_dump_all ______________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfac0ee0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfac0ee0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfac0ee0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_dump.TestDumpRealResponses object at 0x7fc0bfc37890>
> 
>     def test_dump_all(self):
>         session = requests.Session()
>         recorder = get_betamax(session)
>         with recorder.use_cassette('redirect_request_for_dump_all'):
> >           response = session.get('https://httpbin.org/redirect/5')
> 
> tests/test_dump.py:392: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:724: in send
>     history = [resp for resp in gen]
> /usr/lib/python3/dist-packages/requests/sessions.py:265: in resolve_redirects
>     resp = self.send(
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _____________ TestBasedSession.test_prepared_request_override_base _____________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa79b70>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa79b70>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa79b70>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_prepared_request_override_base>
> 
>     def test_prepared_request_override_base(self):
>         session = sessions.BaseUrlSession('https://www.google.com')
>         request = Request(method="GET", url="https://httpbin.org/get")
>         prepared_request = session.prepare_request(request)
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.send(prepared_request)
> 
> tests/test_sessions.py:53: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _______________ TestBasedSession.test_prepared_request_with_base _______________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa7a560>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa7a560>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa7a560>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_prepared_request_with_base>
> 
>     def test_prepared_request_with_base(self):
>         session = sessions.BaseUrlSession('https://httpbin.org')
>         request = Request(method="GET", url="/get")
>         prepared_request = session.prepare_request(request)
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.send(prepared_request)
> 
> tests/test_sessions.py:37: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _________________ TestBasedSession.test_request_override_base __________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa1d2a0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa1d2a0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfa1d2a0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_request_override_base>
> 
>     def test_request_override_base(self):
>         session = sessions.BaseUrlSession('https://www.google.com')
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('https://httpbin.org/get')
> 
> tests/test_sessions.py:27: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> requests_toolbelt/sessions.py:76: in request
>     return super(BaseUrlSession, self).request(
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ___________________ TestBasedSession.test_request_with_base ____________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfc545e0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfc545e0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fc0bfc545e0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_request_with_base>
> 
>     def test_request_with_base(self):
>         session = sessions.BaseUrlSession('https://httpbin.org/')
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('/get')
> 
> tests/test_sessions.py:15: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> requests_toolbelt/sessions.py:76: in request
>     return super(BaseUrlSession, self).request(
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> =============================== warnings summary 
> ===============================
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_auth.py: 3 warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_downloadutils.py: 9 
> warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_dump.py: 2 warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_fingerprintadapter.py:
>  1 warning
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_forgetfulcookiejar.py:
>  1 warning
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_multipart_encoder.py:
>  3 warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_sessions.py: 4 
> warnings
> .pybuild/cpython3_3.12_requests-toolbelt/build/tests/test_ssladapter.py: 1 
> warning
>   /usr/lib/python3/dist-packages/betamax/adapter.py:105: DeprecationWarning: 
> datetime.datetime.utcnow() is deprecated and scheduled for removal in a 
> future version. Use timezone-aware objects to represent datetimes in UTC: 
> datetime.datetime.now(datetime.UTC).
>     now = datetime.utcnow()
> 
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info 
> ============================
> FAILED tests/test_dump.py::TestDumpRealResponses::test_dump_response - 
> reques...
> FAILED tests/test_dump.py::TestDumpRealResponses::test_dump_all - 
> requests.ex...
> FAILED 
> tests/test_sessions.py::TestBasedSession::test_prepared_request_override_base
> FAILED 
> tests/test_sessions.py::TestBasedSession::test_prepared_request_with_base
> FAILED tests/test_sessions.py::TestBasedSession::test_request_override_base
> FAILED tests/test_sessions.py::TestBasedSession::test_request_with_base - 
> req...
> 6 failed, 153 passed, 7 skipped, 3 deselected, 24 warnings in 3.55s
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_requests-toolbelt/build; python3.12 
> -m pytest -k 'not test_reads_open_file_objects and not 
> test_reads_open_file_objects_using_to_string and not 
> test_reads_open_file_objects_with_a_specified_filename'
> I: pybuild base:311: cd 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build; python3.11 
> -m pytest -k 'not test_reads_open_file_objects and not 
> test_reads_open_file_objects_using_to_string and not 
> test_reads_open_file_objects_with_a_specified_filename'
> ..........................................FF......ssss.................. [ 
> 43%]
> ................................FF.FF..............................sss.. [ 
> 86%]
> ......................                                                   
> [100%]
> =================================== FAILURES 
> ===================================
> ___________________ TestDumpRealResponses.test_dump_response 
> ___________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa9f60>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa9f60>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa9f60>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_dump.TestDumpRealResponses object at 0x7fa19c731590>
> 
>     def test_dump_response(self):
>         session = requests.Session()
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('https://httpbin.org/get')
> 
> tests/test_dump.py:376: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _____________________ TestDumpRealResponses.test_dump_all 
> ______________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c98b460>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c98b460>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c98b460>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_dump.TestDumpRealResponses object at 0x7fa19c733110>
> 
>     def test_dump_all(self):
>         session = requests.Session()
>         recorder = get_betamax(session)
>         with recorder.use_cassette('redirect_request_for_dump_all'):
> >           response = session.get('https://httpbin.org/redirect/5')
> 
> tests/test_dump.py:392: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:724: in send
>     history = [resp for resp in gen]
> /usr/lib/python3/dist-packages/requests/sessions.py:724: in <listcomp>
>     history = [resp for resp in gen]
> /usr/lib/python3/dist-packages/requests/sessions.py:265: in resolve_redirects
>     resp = self.send(
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _____________ TestBasedSession.test_prepared_request_override_base 
> _____________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c79bc10>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c79bc10>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c79bc10>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_prepared_request_override_base>
> 
>     def test_prepared_request_override_base(self):
>         session = sessions.BaseUrlSession('https://www.google.com')
>         request = Request(method="GET", url="https://httpbin.org/get";)
>         prepared_request = session.prepare_request(request)
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.send(prepared_request)
> 
> tests/test_sessions.py:53: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _______________ TestBasedSession.test_prepared_request_with_base 
> _______________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa8340>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa8340>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19caa8340>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_prepared_request_with_base>
> 
>     def test_prepared_request_with_base(self):
>         session = sessions.BaseUrlSession('https://httpbin.org')
>         request = Request(method="GET", url="/get")
>         prepared_request = session.prepare_request(request)
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.send(prepared_request)
> 
> tests/test_sessions.py:37: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> _________________ TestBasedSession.test_request_override_base 
> __________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19cae93c0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19cae93c0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19cae93c0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_request_override_base>
> 
>     def test_request_override_base(self):
>         session = sessions.BaseUrlSession('https://www.google.com')
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('https://httpbin.org/get')
> 
> tests/test_sessions.py:27: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> requests_toolbelt/sessions.py:76: in request
>     return super(BaseUrlSession, self).request(
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ___________________ TestBasedSession.test_request_with_base 
> ____________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c57e7a0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c57e7a0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, 
> > self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(234 
> bytes read, 5 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
> self = <urllib3.response.HTTPResponse object at 0x7fa19c57e7a0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the 
> ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between 
> SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and 
> reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") 
> from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: 
> IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 bytes 
> read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_sessions.TestBasedSession 
> testMethod=test_request_with_base>
> 
>     def test_request_with_base(self):
>         session = sessions.BaseUrlSession('https://httpbin.org/')
>         recorder = get_betamax(session)
>         with recorder.use_cassette('simple_get_request'):
> >           response = session.get('/get')
> 
> tests/test_sessions.py:15: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> /usr/lib/python3/dist-packages/requests/sessions.py:602: in get
>     return self.request("GET", url, **kwargs)
> requests_toolbelt/sessions.py:76: in request
>     return super(BaseUrlSession, self).request(
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> _ 
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection 
> broken: IncompleteRead(234 bytes read, 5 more expected)', IncompleteRead(234 
> bytes read, 5 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> =========================== short test summary info 
> ============================
> FAILED tests/test_dump.py::TestDumpRealResponses::test_dump_response - 
> reques...
> FAILED tests/test_dump.py::TestDumpRealResponses::test_dump_all - 
> requests.ex...
> FAILED 
> tests/test_sessions.py::TestBasedSession::test_prepared_request_override_base
> FAILED 
> tests/test_sessions.py::TestBasedSession::test_prepared_request_with_base
> FAILED tests/test_sessions.py::TestBasedSession::test_request_override_base
> FAILED tests/test_sessions.py::TestBasedSession::test_request_with_base - 
> req...
> 6 failed, 153 passed, 7 skipped, 3 deselected in 3.76s
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_requests-toolbelt/build; python3.11 
> -m pytest -k 'not test_reads_open_file_objects and not 
> test_reads_open_file_objects_using_to_string and not 
> test_reads_open_file_objects_with_a_specified_filename'
>       rm -fr -- /tmp/dh-xdg-rundir-dPeX5v9W
> dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 
> 3.11" returned exit code 13


The full build log is available from:
http://qa-logs.debian.net/2024/06/15/python-requests-toolbelt_1.0.0-2_unstable.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20240615;users=lu...@debian.org
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20240615&fusertaguser=lu...@debian.org&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.

--- End Message ---
--- Begin Message ---
Source: python-requests-toolbelt
Source-Version: 1.0.0-3
Done: Emmanuel Arias <eam...@debian.org>

We believe that the bug you reported is fixed in the latest version of
python-requests-toolbelt, which is due to be installed in the Debian FTP 
archive.

A summary of the changes between this version and the previous one is
attached.

Thank you for reporting the bug, which will now be closed.  If you
have further comments please address them to 1073...@bugs.debian.org,
and the maintainer will reopen the bug report if appropriate.

Debian distribution maintenance software
pp.
Emmanuel Arias <eam...@debian.org> (supplier of updated 
python-requests-toolbelt package)

(This message was generated automatically at their request; if you
believe that there is a problem with it please contact the archive
administrators by mailing ftpmas...@ftp-master.debian.org)


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

Format: 1.8
Date: Sat, 03 Aug 2024 10:21:09 -0300
Source: python-requests-toolbelt
Architecture: source
Version: 1.0.0-3
Distribution: unstable
Urgency: medium
Maintainer: Debian Python Team <team+pyt...@tracker.debian.org>
Changed-By: Emmanuel Arias <eam...@debian.org>
Closes: 1073442
Changes:
 python-requests-toolbelt (1.0.0-3) unstable; urgency=medium
 .
   * d/control: Bump Standards-Version to 4.7.0 (from 4.6.2; no further changes
     needed).
   * d/copyright: Update copyright year for debian/* files
   * d/rules: Ignore test_dump_response, test_dump_all,
     test_prepared_request_override_base, test_prepared_request_with_base,
     test_request_override_base, and test_request_with_base: tests that use an
     internet connection (Closes: #1073442).
     - d/tests/pytest: Ignore these tests also in autopkgtest.
Checksums-Sha1:
 3ca70bd17e22d8c837b5bbb81b60d262fe2cebdc 2490 
python-requests-toolbelt_1.0.0-3.dsc
 dbd16513eeb658284bd3eedf631578d333b970b2 4424 
python-requests-toolbelt_1.0.0-3.debian.tar.xz
 124ebfd896d8e0ecb19bb3f4763cbf2d30538aa8 8279 
python-requests-toolbelt_1.0.0-3_amd64.buildinfo
Checksums-Sha256:
 df7a35aedab823c5ce43f008a953c04d9bcd538ad2c61c3ef67d2c3522468868 2490 
python-requests-toolbelt_1.0.0-3.dsc
 8e62d83d16fadea8b3c538b17709de8febfa1ee3f1e43a98efc7ae502e9766dc 4424 
python-requests-toolbelt_1.0.0-3.debian.tar.xz
 3cd4b8b70906996227f8261b89b37a421c1e3c4ba725df10d8c85cf5ec9c7298 8279 
python-requests-toolbelt_1.0.0-3_amd64.buildinfo
Files:
 caf5927f1751fd59b95c3db7cec6d41a 2490 python optional 
python-requests-toolbelt_1.0.0-3.dsc
 ff7d27a25efebeb2949d463eb03f7b31 4424 python optional 
python-requests-toolbelt_1.0.0-3.debian.tar.xz
 0ba763a0287a67957872f987647aabce 8279 python optional 
python-requests-toolbelt_1.0.0-3_amd64.buildinfo

-----BEGIN PGP SIGNATURE-----

iQJGBAEBCgAwFiEEE3lnVbvHK7ir4q61+p3sXeEcY/EFAmauOjESHGVhbWFudUBk
ZWJpYW4ub3JnAAoJEPqd7F3hHGPxAOoP/2S8pU5UGY+htqhlTijXUf0gDa8Ns4IL
R4aJXTTlOsOW4eJx6/0NjGN1VJGviTKjGl7dY8fg0vDYD7Zhl2dkZzgHlE38xVmd
31nFw09051Q8ls6lN5EtR0wM13foxPGvAW9P/WE5pvvTFNjpCfWMAPG5JHPFYfRB
YDWbu95NpgQooGS4udErfusBKufBIU3ktGhI0BKOTvahq1901o/GDHpzHGJeR4rO
vu8uyGXuKB9TeWAeV/XcXVvNKCwFa0bfGCUcInyzYb+xnD+QhhFuTA9kFaKSaQNb
G0Broi4cb+TLR5wt2s3l3V4sFakpxuLsCEjIRPZpc4Nf+vIIhfvHXs3I8WcTKq/5
gL7ANUMOrrUjFajZSWFgvLekdhX2JliE620ayjKog1FMGVPn7eUFbC4UTYZPA9JB
P+YggymrWsdn4CEm18l3nG/HJ5eGkVlt/iWSW+1gJknDWbRWItine8aUPsPMT0j3
GXL9+0Uzjo06vZ5HaDhA6VYDepRarVhGX5hMN+evmPS13uDRvTUT9F6hbJkioChL
1o7wIBTtfua2Dl83htu6CouW/JGejgJ89/ZsXqDwzOR0QVqMjsBuF+ZEKwCZqTko
hhsiBzbrL4PXygUwUkTaotM5gDTD8TZHj39qYA0zVQgCQh4Gio4XRz2OjzojXJHE
e83CH9ylC/HE
=BEgx
-----END PGP SIGNATURE-----
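
For reference, the deselection described in the changelog has the same effect
as extending the -k expression already visible in the build log. A purely
illustrative sketch of running the suite with those six network-dependent
tests excluded (hypothetical; the actual change is carried in d/rules and
d/tests/pytest and may be expressed differently there):

    import sys

    import pytest

    # Existing deselection from the build log plus the six tests that need
    # network access; names taken from the failure summaries above.
    DESELECT = (
        "not test_reads_open_file_objects "
        "and not test_reads_open_file_objects_using_to_string "
        "and not test_reads_open_file_objects_with_a_specified_filename "
        "and not test_dump_response and not test_dump_all "
        "and not test_prepared_request_override_base "
        "and not test_prepared_request_with_base "
        "and not test_request_override_base "
        "and not test_request_with_base"
    )

    if __name__ == "__main__":
        sys.exit(pytest.main(["-k", DESELECT]))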



--- End Message ---
