Source: python-scrapy
Version: 2.5.1-1
Severity: serious
Justification: FTBFS
Tags: bookworm sid ftbfs
User: lu...@debian.org
Usertags: ftbfs-20211220 ftbfs-bookworm
Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.

Relevant part (hopefully):
> debian/rules binary
> dh binary --with python3,bash_completion,sphinxdoc --buildsystem=pybuild
> dh_update_autotools_config -O--buildsystem=pybuild
> dh_autoreconf -O--buildsystem=pybuild
> dh_auto_configure -O--buildsystem=pybuild
> I: pybuild base:237: python3.10 setup.py config
> running config
> I: pybuild base:237: python3.9 setup.py config
> running config
> dh_auto_build -O--buildsystem=pybuild
> I: pybuild base:237: /usr/bin/python3.10 setup.py build
> running build
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/dupefilters.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/signalmanager.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/item.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/extension.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/resolver.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/middleware.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/__main__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/interfaces.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/cmdline.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/shell.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/exceptions.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/statscollectors.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/robotstxt.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/spiderloader.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/crawler.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/squeues.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/responsetypes.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/pqueues.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/link.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/signals.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/mail.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/exporters.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/logformatter.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/settings
> copying scrapy/settings/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/settings
> copying scrapy/settings/default_settings.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/settings
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/settings.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/runspider.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/shell.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/startproject.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/edit.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/parse.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/view.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/list.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/genspider.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/fetch.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/crawl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/check.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/bench.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> copying scrapy/commands/version.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/commands
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/cookies.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/retry.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/httpcache.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/defaultheaders.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/robotstxt.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/redirect.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/httpproxy.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/downloadtimeout.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/httpcompression.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/stats.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/httpauth.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/ajaxcrawl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/useragent.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/decompression.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/pipelines
> copying scrapy/pipelines/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/pipelines
> copying scrapy/pipelines/media.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/pipelines
> copying scrapy/pipelines/images.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/pipelines
> copying scrapy/pipelines/files.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/pipelines
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/loader
> copying scrapy/loader/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/loader
> copying scrapy/loader/processors.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/loader
> copying scrapy/loader/common.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/loader
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/logstats.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/telnet.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/httpcache.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/memusage.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/corestats.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/debug.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/spiderstate.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/throttle.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/feedexport.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/memdebug.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/closespider.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> copying scrapy/extensions/statsmailer.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/depth.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/urllength.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/httperror.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/referer.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/offsite.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spidermiddlewares
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/selector
> copying scrapy/selector/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/selector
> copying scrapy/selector/unified.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/selector
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http
> copying scrapy/http/cookies.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http
> copying scrapy/http/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http
> copying scrapy/http/headers.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http
> copying scrapy/http/common.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/gz.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/template.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/datatypes.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/deprecate.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/iterators.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/test.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/project.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/ftp.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/response.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/console.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/httpobj.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/engine.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/log.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/versions.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/ssl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/decorators.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/serialize.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/trackref.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/display.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/reactor.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/python.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/ossignal.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/sitemap.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/defer.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/job.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/testproc.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/asyncgen.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/boto.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/curl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/url.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/testsite.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/py36.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/conf.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/misc.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/spider.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/benchserver.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/request.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/reqser.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> copying scrapy/utils/signal.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/linkextractors
> copying scrapy/linkextractors/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/linkextractors
> copying scrapy/linkextractors/lxmlhtml.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/linkextractors
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spiders
> copying scrapy/spiders/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spiders
> copying scrapy/spiders/feed.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spiders
> copying scrapy/spiders/init.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spiders
> copying scrapy/spiders/sitemap.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spiders
> copying scrapy/spiders/crawl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/spiders
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/contracts
> copying scrapy/contracts/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/contracts
> copying scrapy/contracts/default.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/contracts
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core
> copying scrapy/core/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core
> copying scrapy/core/engine.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core
> copying scrapy/core/scraper.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core
> copying scrapy/core/spidermw.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core
> copying scrapy/core/scheduler.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/request
> copying scrapy/http/request/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/request
> copying scrapy/http/request/rpc.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/request
> copying scrapy/http/request/json_request.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/request
> copying scrapy/http/request/form.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/request
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/response
> copying scrapy/http/response/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/response
> copying scrapy/http/response/text.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/response
> copying scrapy/http/response/html.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/response
> copying scrapy/http/response/xml.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/http/response
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/contextfactory.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/middleware.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/webclient.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/tls.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/http2
> copying scrapy/core/http2/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/http2
> copying scrapy/core/http2/stream.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/http2
> copying scrapy/core/http2/agent.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/http2
> copying scrapy/core/http2/protocol.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/http2
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/ftp.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/http.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/http10.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/s3.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/datauri.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/file.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/http11.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/http2.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/handlers
> running egg_info
> creating Scrapy.egg-info
> writing Scrapy.egg-info/PKG-INFO
> writing dependency_links to Scrapy.egg-info/dependency_links.txt
> writing entry points to Scrapy.egg-info/entry_points.txt
> writing requirements to Scrapy.egg-info/requires.txt
> writing top-level names to Scrapy.egg-info/top_level.txt
> writing manifest file 'Scrapy.egg-info/SOURCES.txt'
> reading manifest file 'Scrapy.egg-info/SOURCES.txt'
> reading manifest template 'MANIFEST.in'
> warning: no files found matching 'requirements-*.txt'
> warning: no files found matching 'license.txt' under directory 'scrapy'
> no previously-included directories found matching 'docs/build'
> warning: no files found matching '*' under directory 'bin'
> warning: no previously-included files matching '__pycache__' found anywhere
> in distribution
> warning: no previously-included files matching '*.py[cod]' found anywhere in
> distribution
> adding license file 'LICENSE'
> adding license file 'AUTHORS'
> writing manifest file 'Scrapy.egg-info/SOURCES.txt'
> copying scrapy/VERSION ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> copying scrapy/mime.types ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project
> copying scrapy/templates/project/scrapy.cfg ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/items.py.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/middlewares.py.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/pipelines.py.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/settings.py.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project/module
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project/module/spiders
> copying scrapy/templates/project/module/spiders/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/project/module/spiders
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/spiders
> copying scrapy/templates/spiders/basic.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/spiders
> copying scrapy/templates/spiders/crawl.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/spiders
> copying scrapy/templates/spiders/csvfeed.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/spiders
> copying scrapy/templates/spiders/xmlfeed.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/templates/spiders
> I: pybuild base:237: /usr/bin/python3 setup.py build
> running build
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/dupefilters.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/signalmanager.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/item.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/extension.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/resolver.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/middleware.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/__main__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/interfaces.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/cmdline.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/shell.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/exceptions.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/statscollectors.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/robotstxt.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/spiderloader.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/crawler.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/squeues.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/responsetypes.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/pqueues.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/link.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/signals.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/mail.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/exporters.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/logformatter.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/settings
> copying scrapy/settings/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/settings
> copying scrapy/settings/default_settings.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/settings
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/settings.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/runspider.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/shell.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/startproject.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/edit.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/parse.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/view.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/list.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/genspider.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/fetch.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/crawl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/check.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/bench.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> copying scrapy/commands/version.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/commands
> creating
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/cookies.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/retry.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/httpcache.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/defaultheaders.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/robotstxt.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/redirect.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/httpproxy.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/downloadtimeout.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/httpcompression.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/stats.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/httpauth.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/ajaxcrawl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/useragent.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> copying scrapy/downloadermiddlewares/decompression.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/pipelines
> copying scrapy/pipelines/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/pipelines
> copying scrapy/pipelines/media.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/pipelines
> copying scrapy/pipelines/images.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/pipelines
> copying scrapy/pipelines/files.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/pipelines
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/loader
> copying scrapy/loader/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/loader
> copying scrapy/loader/processors.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/loader
> copying scrapy/loader/common.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/loader
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/logstats.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/telnet.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/httpcache.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/memusage.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/corestats.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/debug.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/spiderstate.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/throttle.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/feedexport.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/memdebug.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/closespider.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> copying scrapy/extensions/statsmailer.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/depth.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/urllength.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/httperror.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/referer.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spidermiddlewares
> copying scrapy/spidermiddlewares/offsite.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spidermiddlewares
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/selector
> copying scrapy/selector/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/selector
> copying scrapy/selector/unified.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/selector
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http
> copying scrapy/http/cookies.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http
> copying scrapy/http/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http
> copying scrapy/http/headers.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http
> copying scrapy/http/common.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/gz.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/template.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/datatypes.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/deprecate.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/iterators.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/test.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/project.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/ftp.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/response.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/console.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/httpobj.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/engine.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/log.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/versions.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/ssl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/decorators.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/serialize.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/trackref.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/display.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/reactor.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/python.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/ossignal.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/sitemap.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/defer.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/job.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/testproc.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/asyncgen.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/boto.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/curl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/url.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/testsite.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/py36.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/conf.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/misc.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/spider.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/benchserver.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/request.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/reqser.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> copying scrapy/utils/signal.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/utils
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/linkextractors
> copying scrapy/linkextractors/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/linkextractors
> copying scrapy/linkextractors/lxmlhtml.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/linkextractors
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spiders
> copying scrapy/spiders/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spiders
> copying scrapy/spiders/feed.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spiders
> copying scrapy/spiders/init.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spiders
> copying scrapy/spiders/sitemap.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spiders
> copying scrapy/spiders/crawl.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/spiders
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/contracts
> copying scrapy/contracts/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/contracts
> copying scrapy/contracts/default.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/contracts
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core
> copying scrapy/core/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core
> copying scrapy/core/engine.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core
> copying scrapy/core/scraper.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core
> copying scrapy/core/spidermw.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core
> copying scrapy/core/scheduler.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/request
> copying scrapy/http/request/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/request
> copying scrapy/http/request/rpc.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/request
> copying scrapy/http/request/json_request.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/request
> copying scrapy/http/request/form.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/request
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/response
> copying scrapy/http/response/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/response
> copying scrapy/http/response/text.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/response
> copying scrapy/http/response/html.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/response
> copying scrapy/http/response/xml.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/http/response
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/contextfactory.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/middleware.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/webclient.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader
> copying scrapy/core/downloader/tls.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/http2
> copying scrapy/core/http2/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/http2
> copying scrapy/core/http2/stream.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/http2
> copying scrapy/core/http2/agent.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/http2
> copying scrapy/core/http2/protocol.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/http2
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/ftp.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/http.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/http10.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/s3.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/datauri.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/file.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/http11.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> copying scrapy/core/downloader/handlers/http2.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/handlers
> running egg_info
> writing Scrapy.egg-info/PKG-INFO
> writing dependency_links to Scrapy.egg-info/dependency_links.txt
> writing entry points to Scrapy.egg-info/entry_points.txt
> writing requirements to Scrapy.egg-info/requires.txt
> writing top-level names to Scrapy.egg-info/top_level.txt
> reading manifest file 'Scrapy.egg-info/SOURCES.txt'
> reading manifest template 'MANIFEST.in'
> warning: no files found matching 'requirements-*.txt'
> warning: no files found matching 'license.txt' under directory 'scrapy'
> no previously-included directories found matching 'docs/build'
> warning: no files found matching '*' under directory 'bin'
> warning: no previously-included files matching '__pycache__' found anywhere
> in distribution
> warning: no previously-included files matching '*.py[cod]' found anywhere in
> distribution
> adding license file 'LICENSE'
> adding license file 'AUTHORS'
> writing manifest file 'Scrapy.egg-info/SOURCES.txt'
> copying scrapy/VERSION ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> copying scrapy/mime.types ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project
> copying scrapy/templates/project/scrapy.cfg ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/items.py.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/middlewares.py.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/pipelines.py.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project/module
> copying scrapy/templates/project/module/settings.py.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project/module
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project/module/spiders
> copying scrapy/templates/project/module/spiders/__init__.py ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/project/module/spiders
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/spiders
> copying scrapy/templates/spiders/basic.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/spiders
> copying scrapy/templates/spiders/crawl.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/spiders
> copying scrapy/templates/spiders/csvfeed.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/spiders
> copying scrapy/templates/spiders/xmlfeed.tmpl ->
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/templates/spiders
> dh_auto_test -O--buildsystem=pybuild
> I: pybuild pybuild:286: cd /<<PKGBUILDDIR>>/tests/keys; cat
> example-com.key.pem example-com.cert.pem >cert.pem
> I: pybuild base:237: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build;
> python3.10 -m pytest --ignore tests/test_command_check.py -k 'not
> (test_squeues.py and (test_peek_fifo or test_peek_one_element or
> test_peek_lifo))'
> ============================= test session starts
> ==============================
> platform linux -- Python 3.10.1, pytest-6.2.5, py-1.10.0, pluggy-0.13.0
> rootdir: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build, configfile:
> pytest.ini
> collected 2664 items / 24 deselected / 2640 selected
>
> docs/intro/tutorial.rst ........................ [
> 0%]
> docs/topics/commands.rst .......... [
> 1%]
> docs/topics/debug.rst ..... [
> 1%]
> docs/topics/developer-tools.rst ..... [
> 1%]
> docs/topics/request-response.rst . [
> 1%]
> scrapy/pqueues.py . [
> 1%]
> scrapy/core/downloader/handlers/http11.py . [
> 1%]
> scrapy/downloadermiddlewares/ajaxcrawl.py . [
> 1%]
> scrapy/extensions/httpcache.py . [
> 1%]
> scrapy/http/cookies.py . [
> 1%]
> scrapy/pipelines/media.py . [
> 1%]
> scrapy/utils/deprecate.py . [
> 1%]
> scrapy/utils/misc.py . [
> 2%]
> scrapy/utils/python.py ..... [
> 2%]
> scrapy/utils/template.py . [
> 2%]
> scrapy/utils/url.py . [
> 2%]
> tests/test_closespider.py .... [
> 2%]
> tests/test_command_fetch.py .... [
> 2%]
> tests/test_command_parse.py .............. [
> 3%]
> tests/test_command_shell.py ................ [
> 3%]
> tests/test_command_version.py .. [
> 3%]
> tests/test_commands.py ...........................................s.s.ss [
> 5%]
> s.ssss...... [
> 6%]
> tests/test_contracts.py ......... [
> 6%]
> tests/test_core_downloader.py . [
> 6%]
> tests/test_crawl.py .....................sssssss..........x. [
> 7%]
> tests/test_crawler.py ................................. [
> 9%]
> tests/test_dependencies.py s. [
> 9%]
> tests/test_downloader_handlers.py ...................................... [
> 10%]
> ........................................................................ [
> 13%]
> ........................................................................ [
> 16%]
> ..s.....s................................. [
> 17%]
> tests/test_downloader_handlers_http2.py ..............................s. [
> 19%]
> .....x.....ssssss......................x.....ssssss..................... [
> 21%]
> .x.....ssssss......................x.....ssssss.......................s. [
> 24%]
> .... [
> 24%]
> tests/test_downloadermiddleware.py .........s [
> 25%]
> tests/test_downloadermiddleware_ajaxcrawlable.py ..... [
> 25%]
> tests/test_downloadermiddleware_cookies.py ......xx...x..... [
> 25%]
> tests/test_downloadermiddleware_decompression.py ... [
> 25%]
> tests/test_downloadermiddleware_defaultheaders.py .. [
> 26%]
> tests/test_downloadermiddleware_downloadtimeout.py .... [
> 26%]
> tests/test_downloadermiddleware_httpauth.py ........ [
> 26%]
> tests/test_downloadermiddleware_httpcache.py ........................... [
> 27%]
> ..... [
> 27%]
> tests/test_downloadermiddleware_httpcompression.py ................s.... [
> 28%]
> [
> 28%]
> tests/test_downloadermiddleware_httpproxy.py .......... [
> 28%]
> tests/test_downloadermiddleware_redirect.py ........................ [
> 29%]
> tests/test_downloadermiddleware_retry.py ............................... [
> 30%]
> [
> 30%]
> tests/test_downloadermiddleware_robotstxt.py ..........sssssssssssssssss [
> 31%]
> sss [
> 32%]
> tests/test_downloadermiddleware_stats.py ... [
> 32%]
> tests/test_downloadermiddleware_useragent.py ..... [
> 32%]
> tests/test_dupefilters.py ......... [
> 32%]
> tests/test_engine.py ...... [
> 32%]
> tests/test_engine_stop_download_bytes.py ............ [
> 33%]
> tests/test_engine_stop_download_headers.py ............ [
> 33%]
> tests/test_exporters.py ................................................ [
> 35%]
> ........................................................................ [
> 38%]
> .................................... [
> 39%]
> tests/test_extension_telnet.py ... [
> 39%]
> tests/test_feedexport.py .........................sss................... [
> 41%]
> ................... [
> 42%]
> tests/test_http2_client_protocol.py .......................... [
> 43%]
> tests/test_http_cookies.py ............ [
> 43%]
> tests/test_http_headers.py ................. [
> 44%]
> tests/test_http_request.py ............................................. [
> 46%]
> ........................................................................ [
> 48%]
> ....................................................... [
> 50%]
> tests/test_http_response.py ............................................ [
> 52%]
> ........................................................................ [
> 55%]
> ........................................................................ [
> 58%]
> . [
> 58%]
> tests/test_item.py ........................ [
> 59%]
> tests/test_link.py ... [
> 59%]
> tests/test_linkextractors.py ................................. [
> 60%]
> tests/test_loader.py ................................................... [
> 62%]
> ................... [
> 63%]
> tests/test_loader_deprecated.py ........................................ [
> 64%]
> ............... [
> 65%]
> tests/test_logformatter.py .................... [
> 65%]
> tests/test_mail.py ...... [
> 66%]
> tests/test_middleware.py .... [
> 66%]
> tests/test_pipeline_crawl.py ........ [
> 66%]
> tests/test_pipeline_files.py ........................ss [
> 67%]
> tests/test_pipeline_images.py ................... [
> 68%]
> tests/test_pipeline_media.py ................................. [
> 69%]
> tests/test_pipelines.py s... [
> 69%]
> tests/test_proxy_connect.py ... [
> 69%]
> tests/test_request_attribute_binding.py ....... [
> 70%]
> tests/test_request_cb_kwargs.py F [
> 70%]
> tests/test_request_left.py .... [
> 70%]
> tests/test_responsetypes.py ....... [
> 70%]
> tests/test_robotstxt_interface.py .s..s..ssssssssssssss.....s. [
> 71%]
> tests/test_scheduler.py .................. [
> 72%]
> tests/test_selector.py ....... [
> 72%]
> tests/test_signals.py s [
> 72%]
> tests/test_spider.py ................................................... [
> 74%]
> ................................ [
> 75%]
> tests/test_spidermiddleware.py .... [
> 75%]
> tests/test_spidermiddleware_depth.py . [
> 75%]
> tests/test_spidermiddleware_httperror.py ............ [
> 76%]
> tests/test_spidermiddleware_offsite.py ........ [
> 76%]
> tests/test_spidermiddleware_output_chain.py ......... [
> 77%]
> tests/test_spidermiddleware_referer.py ................................. [
> 78%]
> .... [
> 78%]
> tests/test_spidermiddleware_urllength.py .. [
> 78%]
> tests/test_spiderstate.py ... [
> 78%]
> tests/test_squeues.py ......x.............x.............x.............x. [
> 80%]
> ............x.............x.................x.................x......... [
> 83%]
> ........x.................x.................x............x......... [
> 85%]
> tests/test_stats.py .... [
> 85%]
> tests/test_toplevel.py ...... [
> 86%]
> tests/test_urlparse_monkeypatches.py . [
> 86%]
> tests/test_utils_asyncio.py .. [
> 86%]
> tests/test_utils_conf.py .................... [
> 87%]
> tests/test_utils_console.py .ss [
> 87%]
> tests/test_utils_curl.py ............. [
> 87%]
> tests/test_utils_datatypes.py ......................... [
> 88%]
> tests/test_utils_defer.py ..........x [
> 88%]
> tests/test_utils_deprecate.py ................ [
> 89%]
> tests/test_utils_display.py ........ [
> 89%]
> tests/test_utils_gz.py ...... [
> 90%]
> tests/test_utils_httpobj.py . [
> 90%]
> tests/test_utils_iterators.py ..............x................. [
> 91%]
> tests/test_utils_log.py .......... [
> 91%]
> tests/test_utils_project.py ..... [
> 91%]
> tests/test_utils_python.py .................... [
> 92%]
> tests/test_utils_reqser.py .............. [
> 93%]
> tests/test_utils_request.py .... [
> 93%]
> tests/test_utils_response.py ..... [
> 93%]
> tests/test_utils_serialize.py ...... [
> 93%]
> tests/test_utils_signal.py ....s. [
> 94%]
> tests/test_utils_sitemap.py .......... [
> 94%]
> tests/test_utils_spider.py .. [
> 94%]
> tests/test_utils_template.py . [
> 94%]
> tests/test_utils_trackref.py ..... [
> 94%]
> tests/test_utils_url.py ................................................ [
> 96%]
> ...s........... [
> 97%]
> tests/test_webclient.py ................ [
> 97%]
> tests/test_cmdline/__init__.py ..... [
> 97%]
> tests/test_cmdline_crawl_with_pipeline/__init__.py .. [
> 97%]
> tests/test_settings/__init__.py ............................... [
> 99%]
> tests/test_spiderloader/__init__.py ............ [
> 99%]
> tests/test_utils_misc/__init__.py ........ [
> 99%]
> tests/test_utils_misc/test_return_with_argument_inside_generator.py ...
> [100%]
>
> =================================== FAILURES
> ===================================
> ____________ CallbackKeywordArgumentsTestCase.test_callback_kwargs
> _____________
>
> result = None, g = <generator object DownloadHandlers._close at
> 0x7f483145c350>
> status = _CancellationStatus(deferred=<Deferred at 0x7f48393e1120 current
> result: (<bound method DownloadHandlers._close of <sc...s.DownloadHandlers
> object at 0x7f483a1b1c00>>, None)>, waitingOn=<DeferredList at 0x7f48393e0580
> current result: None>)
>
> @failure._extraneous
> def _inlineCallbacks(result, g, status):
> """
> Carry out the work of L{inlineCallbacks}.
>
> Iterate the generator produced by an C{@}L{inlineCallbacks}-decorated
> function, C{g}, C{send()}ing it the results of each value C{yield}ed
> by
> that generator, until a L{Deferred} is yielded, at which point a
> callback
> is added to that L{Deferred} to call this function again.
>
> @param result: The last result seen by this generator. Note that
> this is
> never a L{Deferred} - by the time this function is invoked, the
> L{Deferred} has been called back and this will be a particular
> result
> at a point in its callback chain.
>
> @param g: a generator object returned by calling a function or method
> decorated with C{@}L{inlineCallbacks}
>
> @param status: a L{_CancellationStatus} tracking the current status
> of C{g}
> """
> # This function is complicated by the need to prevent unbounded
> recursion
> # arising from repeatedly yielding immediately ready deferreds. This
> while
> # loop and the waiting variable solve that by manually unfolding the
> # recursion.
>
> waiting = [True, # waiting for result?
> None] # result
>
> while 1:
> try:
> # Send the last result back as the result of the yield
> expression.
> isFailure = isinstance(result, failure.Failure)
> if isFailure:
> result = result.throwExceptionIntoGenerator(g)
> else:
> > result = g.send(result)
> E StopIteration
>
> /usr/lib/python3/dist-packages/twisted/internet/defer.py:1418: StopIteration
>
> During handling of the above exception, another exception occurred:
>
> result = None, g = <generator object ExecutionEngine.start at 0x7f483145d380>
> status = _CancellationStatus(deferred=<Deferred at 0x7f4839e4b6a0 current
> result: None>, waitingOn=<Deferred at 0x7f484818fc10 current result: None>)
>
> @failure._extraneous
> def _inlineCallbacks(result, g, status):
> """
> Carry out the work of L{inlineCallbacks}.
>
> Iterate the generator produced by an C{@}L{inlineCallbacks}-decorated
> function, C{g}, C{send()}ing it the results of each value C{yield}ed
> by
> that generator, until a L{Deferred} is yielded, at which point a
> callback
> is added to that L{Deferred} to call this function again.
>
> @param result: The last result seen by this generator. Note that
> this is
> never a L{Deferred} - by the time this function is invoked, the
> L{Deferred} has been called back and this will be a particular
> result
> at a point in its callback chain.
>
> @param g: a generator object returned by calling a function or method
> decorated with C{@}L{inlineCallbacks}
>
> @param status: a L{_CancellationStatus} tracking the current status
> of C{g}
> """
> # This function is complicated by the need to prevent unbounded
> recursion
> # arising from repeatedly yielding immediately ready deferreds. This
> while
> # loop and the waiting variable solve that by manually unfolding the
> # recursion.
>
> waiting = [True, # waiting for result?
> None] # result
>
> while 1:
> try:
> # Send the last result back as the result of the yield
> expression.
> isFailure = isinstance(result, failure.Failure)
> if isFailure:
> result = result.throwExceptionIntoGenerator(g)
> else:
> > result = g.send(result)
> E StopIteration
>
> /usr/lib/python3/dist-packages/twisted/internet/defer.py:1418: StopIteration
>
> During handling of the above exception, another exception occurred:
>
> result = None, g = <generator object Crawler.crawl at 0x7f483145d460>
> status = _CancellationStatus(deferred=<Deferred at 0x7f483a1b2f20 current
> result: None>, waitingOn=<Deferred at 0x7f4839e4b6a0 current result: None>)
>
> @failure._extraneous
> def _inlineCallbacks(result, g, status):
> """
> Carry out the work of L{inlineCallbacks}.
>
> Iterate the generator produced by an C{@}L{inlineCallbacks}-decorated
> function, C{g}, C{send()}ing it the results of each value C{yield}ed
> by
> that generator, until a L{Deferred} is yielded, at which point a
> callback
> is added to that L{Deferred} to call this function again.
>
> @param result: The last result seen by this generator. Note that
> this is
> never a L{Deferred} - by the time this function is invoked, the
> L{Deferred} has been called back and this will be a particular
> result
> at a point in its callback chain.
>
> @param g: a generator object returned by calling a function or method
> decorated with C{@}L{inlineCallbacks}
>
> @param status: a L{_CancellationStatus} tracking the current status
> of C{g}
> """
> # This function is complicated by the need to prevent unbounded
> recursion
> # arising from repeatedly yielding immediately ready deferreds. This
> while
> # loop and the waiting variable solve that by manually unfolding the
> # recursion.
>
> waiting = [True, # waiting for result?
> None] # result
>
> while 1:
> try:
> # Send the last result back as the result of the yield
> expression.
> isFailure = isinstance(result, failure.Failure)
> if isFailure:
> result = result.throwExceptionIntoGenerator(g)
> else:
> > result = g.send(result)
> E StopIteration
>
> /usr/lib/python3/dist-packages/twisted/internet/defer.py:1418: StopIteration
>
> During handling of the above exception, another exception occurred:
>
> self = <tests.test_request_cb_kwargs.CallbackKeywordArgumentsTestCase
> testMethod=test_callback_kwargs>
>
> @defer.inlineCallbacks
> def test_callback_kwargs(self):
> crawler = self.runner.create_crawler(KeywordArgumentsSpider)
> with LogCapture() as log:
> yield crawler.crawl(mockserver=self.mockserver)
> self.assertTrue(all(crawler.spider.checks))
> self.assertEqual(len(crawler.spider.checks),
> crawler.stats.get_value('boolean_checks'))
> # check exceptions for argument mismatch
> exceptions = {}
> for line in log.records:
> for key in ('takes_less', 'takes_more'):
> if key in line.getMessage():
> exceptions[key] = line
> self.assertEqual(exceptions['takes_less'].exc_info[0], TypeError)
> > self.assertEqual(
> str(exceptions['takes_less'].exc_info[1]),
> "parse_takes_less() got an unexpected keyword argument 'number'"
> )
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_request_cb_kwargs.py:161:
>
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> _
> /usr/lib/python3/dist-packages/twisted/trial/_synctest.py:434: in assertEqual
> super(_Assertions, self).assertEqual(first, second, msg)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <tests.test_request_cb_kwargs.CallbackKeywordArgumentsTestCase testMethod=test_callback_kwargs>
> msg = '"KeywordArgumentsSpider.parse_takes_less()[40 chars]ber\'" != "parse_takes_less() got an unexpected keyw[17 chars]ber...d argument \'number\'\n? -----------------------\n+ parse_takes_less() got an unexpected keyword argument \'number\'\n'
>
> def fail(self, msg=None):
> """
> Absolutely fail the test. Do not pass go, do not collect $200.
>
> @param msg: the message that will be displayed as the reason for the
> failure
> """
> > raise self.failureException(msg)
> E twisted.trial.unittest.FailTest: "KeywordArgumentsSpider.parse_takes_less()[40 chars]ber'" != "parse_takes_less() got an unexpected keyw[17 chars]ber'"
> E - KeywordArgumentsSpider.parse_takes_less() got an unexpected keyword argument 'number'
> E ? -----------------------
> E + parse_takes_less() got an unexpected keyword argument 'number'
>
> /usr/lib/python3/dist-packages/twisted/trial/_synctest.py:377: FailTest
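The diff above is the whole FTBFS: since Python 3.10, the TypeError raised for an unexpected keyword argument is prefixed with the method's qualified name (class included), while the test still asserts the bare Python 3.9 wording. A minimal standalone reproduction (class and method names borrowed from the failing test; this snippet does not touch Scrapy):

```python
class KeywordArgumentsSpider:
    def parse_takes_less(self, response):
        pass

try:
    # 'number' is not accepted by parse_takes_less(), so this raises TypeError.
    KeywordArgumentsSpider().parse_takes_less(response=None, number=1)
except TypeError as exc:
    message = str(exc)

# Python 3.9:  parse_takes_less() got an unexpected keyword argument 'number'
# Python 3.10: KeywordArgumentsSpider.parse_takes_less() got an unexpected
#              keyword argument 'number'
print(message)
```

Comparing with `assertIn`/`str.endswith` instead of strict equality would keep the test green on both interpreter versions.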
> ----------------------------- Captured stderr call -----------------------------
> 2021-12-20 18:52:59 [scrapy.crawler] INFO: Overridden settings:
> {}
> 2021-12-20 18:52:59 [scrapy.extensions.telnet] INFO: Telnet Password: 9d1b91e73a378260
> 2021-12-20 18:52:59 [scrapy.middleware] INFO: Enabled extensions:
> ['scrapy.extensions.corestats.CoreStats',
> 'scrapy.extensions.telnet.TelnetConsole',
> 'scrapy.extensions.memusage.MemoryUsage',
> 'scrapy.extensions.logstats.LogStats']
> ------------------------------ Captured log call -------------------------------
> INFO scrapy.crawler:crawler.py:59 Overridden settings:
> {}
> INFO scrapy.extensions.telnet:telnet.py:55 Telnet Password: 9d1b91e73a378260
> INFO scrapy.middleware:middleware.py:45 Enabled extensions:
> ['scrapy.extensions.corestats.CoreStats',
> 'scrapy.extensions.telnet.TelnetConsole',
> 'scrapy.extensions.memusage.MemoryUsage',
> 'scrapy.extensions.logstats.LogStats']
> =============================== warnings summary ===============================
> ../../../../../../usr/lib/python3/dist-packages/_pytest/config/__init__.py:1233
> /usr/lib/python3/dist-packages/_pytest/config/__init__.py:1233:
> PytestConfigWarning: Unknown config option: flake8-ignore
>
> self._warn_or_fail_if_strict(f"Unknown config option: {key}\n")
>
> ../../../../../../usr/lib/python3/dist-packages/_pytest/config/__init__.py:1233
> /usr/lib/python3/dist-packages/_pytest/config/__init__.py:1233:
> PytestConfigWarning: Unknown config option: flake8-max-line-length
>
> self._warn_or_fail_if_strict(f"Unknown config option: {key}\n")
>
> ../../../../../../usr/lib/python3/dist-packages/sybil/integration/pytest.py:58: 45 warnings
> /usr/lib/python3/dist-packages/sybil/integration/pytest.py:58:
> PytestDeprecationWarning: A private pytest class or function was used.
> self._request = fixtures.FixtureRequest(self)
>
> scrapy/utils/display.py:8
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils/display.py:8:
> DeprecationWarning: The distutils package is deprecated and slated for
> removal in Python 3.12. Use setuptools or check PEP 632 for potential
> alternatives
> from distutils.version import LooseVersion as parse_version
>
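For the distutils deprecation flagged here, PEP 632 points to the third-party `packaging` library as the replacement for `distutils.version`. A drop-in sketch (assuming `packaging` is available; this is not necessarily the change Scrapy made upstream):

```python
# Replacement for: from distutils.version import LooseVersion as parse_version
from packaging.version import parse as parse_version

# Versions compare numerically per component, not lexicographically,
# so 2.5.1 correctly sorts before 2.10.0.
assert parse_version("2.5.1") < parse_version("2.10.0")
```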
> ../../../../../../usr/lib/python3.10/importlib/__init__.py:126
> /usr/lib/python3.10/importlib/__init__.py:126: ScrapyDeprecationWarning:
> Module `scrapy.utils.py36` is deprecated, please import from
> `scrapy.utils.asyncgen` instead.
> return _bootstrap._gcd_import(name[level:], package, level)
>
> ../../../../../../usr/lib/python3.10/asynchat.py:48
> /usr/lib/python3.10/asynchat.py:48: DeprecationWarning: The asyncore module
> is deprecated. The recommended replacement is asyncio
> import asyncore
>
> ../../../../../../usr/lib/python3/dist-packages/pyftpdlib/handlers.py:5
> /usr/lib/python3/dist-packages/pyftpdlib/handlers.py:5: DeprecationWarning:
> The asynchat module is deprecated. The recommended replacement is asyncio
> import asynchat
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25:
> DeprecationWarning: twisted.test.proto_helpers.StringTransport was deprecated
> in Twisted 19.7.0: Please use twisted.internet.testing.StringTransport
> instead.
> from twisted.test.proto_helpers import (
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25:
> DeprecationWarning: twisted.test.proto_helpers.waitUntilAllDisconnected was
> deprecated in Twisted 19.7.0: Please use
> twisted.internet.testing.waitUntilAllDisconnected instead.
> from twisted.test.proto_helpers import (
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25:
> DeprecationWarning: twisted.test.proto_helpers.EventLoggingObserver was
> deprecated in Twisted 19.7.0: Please use
> twisted.internet.testing.EventLoggingObserver instead.
> from twisted.test.proto_helpers import (
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1643
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1643:
> DeprecationWarning: twisted.web.client.HTTPPageGetter was deprecated in
> Twisted 16.7.0: please use https://pypi.org/project/treq/ or
> twisted.web.client.Agent instead
> protocolClass = client.HTTPPageGetter
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1672
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1672:
> DeprecationWarning: twisted.web.client.HTTPPageGetter was deprecated in
> Twisted 16.7.0: please use https://pypi.org/project/treq/ or
> twisted.web.client.Agent instead
> protocolClass = client.HTTPPageGetter
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1703
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1703:
> DeprecationWarning: twisted.web.client.HTTPPageDownloader was deprecated in
> Twisted 16.7.0: please use https://pypi.org/project/treq/ or
> twisted.web.client.Agent instead
> protocolClass = client.HTTPPageDownloader
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1713
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1713:
> DeprecationWarning: twisted.web.client.HTTPPageDownloader was deprecated in
> Twisted 16.7.0: please use https://pypi.org/project/treq/ or
> twisted.web.client.Agent instead
> protocolClass = client.HTTPPageDownloader
>
> tests/test_closespider.py: 4 warnings
> tests/test_command_fetch.py: 4 warnings
> tests/test_command_parse.py: 14 warnings
> tests/test_command_shell.py: 16 warnings
> tests/test_command_version.py: 2 warnings
> tests/test_contracts.py: 1 warning
> tests/test_crawl.py: 31 warnings
> tests/test_crawler.py: 4 warnings
> tests/test_downloader_handlers.py: 203 warnings
> tests/test_downloader_handlers_http2.py: 180 warnings
> tests/test_downloadermiddleware_robotstxt.py: 6 warnings
> tests/test_engine.py: 3 warnings
> tests/test_engine_stop_download_bytes.py: 6 warnings
> tests/test_engine_stop_download_headers.py: 6 warnings
> tests/test_feedexport.py: 33 warnings
> tests/test_http2_client_protocol.py: 26 warnings
> tests/test_logformatter.py: 2 warnings
> tests/test_pipeline_crawl.py: 8 warnings
> tests/test_pipeline_files.py: 5 warnings
> tests/test_pipeline_media.py: 14 warnings
> tests/test_pipelines.py: 3 warnings
> tests/test_proxy_connect.py: 3 warnings
> tests/test_request_attribute_binding.py: 7 warnings
> tests/test_request_cb_kwargs.py: 1 warning
> tests/test_request_left.py: 4 warnings
> tests/test_scheduler.py: 1 warning
> tests/test_spidermiddleware.py: 4 warnings
> tests/test_spidermiddleware_httperror.py: 3 warnings
> tests/test_spidermiddleware_output_chain.py: 9 warnings
> tests/test_utils_defer.py: 2 warnings
> tests/test_utils_signal.py: 1 warning
> tests/test_webclient.py: 13 warnings
> /usr/lib/python3/dist-packages/twisted/python/threadable.py:107:
> DeprecationWarning: currentThread() is deprecated, use current_thread()
> instead
> return threadingmodule.currentThread().ident
>
> tests/test_crawl.py: 2 warnings
> tests/test_downloader_handlers.py: 143 warnings
> tests/test_downloader_handlers_http2.py: 150 warnings
> tests/test_proxy_connect.py: 2 warnings
> tests/test_webclient.py: 4 warnings
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/core/downloader/contextfactory.py:54:
> DeprecationWarning: Passing method to
> twisted.internet.ssl.CertificateOptions was deprecated in Twisted 17.1.0.
> Please use a combination of insecurelyLowerMinimumTo, raiseMinimumTo, and
> lowerMaximumSecurityTo instead, as Twisted will correctly configure the
> method.
> return CertificateOptions(
>
> tests/test_crawler.py::CrawlerRunnerHasSpider::test_crawler_process_asyncio_enabled_true
> tests/test_utils_asyncio.py::AsyncioTest::test_install_asyncio_reactor
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/utils/reactor.py:65:
> DeprecationWarning: There is no current event loop
> event_loop = asyncio.get_event_loop()
>
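The "There is no current event loop" warning comes from calling `asyncio.get_event_loop()` with no loop running, which Python 3.10 deprecates. A warning-free sketch of the usual replacement pattern (the helper name is mine, and this is not necessarily the fix Scrapy adopted):

```python
import asyncio

def get_or_create_event_loop() -> asyncio.AbstractEventLoop:
    """Return the running loop if there is one, else create and install a new one."""
    try:
        return asyncio.get_running_loop()
    except RuntimeError:  # no loop running in this thread
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        return loop

loop = get_or_create_event_loop()
```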
> tests/test_downloadermiddleware_httpauth.py::HttpAuthMiddlewareLegacyTest::test_auth
> tests/test_downloadermiddleware_httpauth.py::HttpAuthMiddlewareLegacyTest::test_auth_already_set
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/downloadermiddlewares/httpauth.py:32:
> ScrapyDeprecationWarning: Using HttpAuthMiddleware without http_auth_domain
> is deprecated and can cause security problems if the spider makes requests to
> several different domains. http_auth_domain will be set to the domain of the
> first request, please set it to the correct value explicitly.
> warnings.warn('Using HttpAuthMiddleware without http_auth_domain is
> deprecated and can cause security '
>
> tests/test_exporters.py::PythonItemExporterTest::test_export_binary
> tests/test_exporters.py::PythonItemExporterDataclassTest::test_export_binary
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/exporters.py:307:
> ScrapyDeprecationWarning: PythonItemExporter will drop support for binary
> export in the future
> warnings.warn(
>
> tests/test_feedexport.py: 33 warnings
> tests/test_pipeline_files.py: 30 warnings
> tests/test_pipeline_images.py: 24 warnings
> /usr/lib/python3/dist-packages/botocore/httpsession.py:62:
> DeprecationWarning: ssl.PROTOCOL_TLS is deprecated
> context = SSLContext(ssl_version or ssl.PROTOCOL_SSLv23)
>
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: LogOnStoreFileStorage does not support the
> 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
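The ScrapyDeprecationWarning above (repeated below for the other storage test doubles) spells out the fix for custom feed storages: accept a `feed_options` keyword. A sketch of a conforming constructor (the class name is hypothetical and the snippet is independent of Scrapy):

```python
class MyFeedStorage:
    """Feed storage stub whose signature accepts the 'feed_options'
    keyword argument that newer Scrapy versions pass in."""

    def __init__(self, uri, *, feed_options=None):
        self.uri = uri
        self.feed_options = feed_options or {}

storage = MyFeedStorage("stdout:", feed_options={"format": "json"})
```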
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: FailingBlockingFeedStorage does not support the
> 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: DummyBlockingFeedStorage does not support the
> 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py: 12 warnings
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:247:
> ScrapyDeprecationWarning: The `FEED_URI` and `FEED_FORMAT` settings have
> been deprecated in favor of the `FEEDS` setting. Please see the `FEEDS`
> setting docs for more details
> exporter = cls(crawler)
>
> tests/test_feedexport.py::StdoutFeedStoragePreFeedOptionsTest::test_init
> tests/test_feedexport.py::StdoutFeedStoragePreFeedOptionsTest::test_init
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: StdoutFeedStorageWithoutFeedOptions does not
> support the 'feed_options' keyword argument. Add a 'feed_options' parameter
> to its signature to remove this warning. This parameter will become mandatory
> in a future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::FileFeedStoragePreFeedOptionsTest::test_init
> tests/test_feedexport.py::FileFeedStoragePreFeedOptionsTest::test_init
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: FileFeedStorageWithoutFeedOptions does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_from_crawler
> tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_from_crawler
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning:
> S3FeedStorageWithoutFeedOptionsWithFromCrawler.from_crawler does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_init
> tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_init
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: S3FeedStorageWithoutFeedOptions does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_from_crawler
> tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_from_crawler
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning:
> FTPFeedStorageWithoutFeedOptionsWithFromCrawler.from_crawler does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_init
> tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_init
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: FTPFeedStorageWithoutFeedOptions does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_http_response.py: 32 warnings
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_http_response.py:137:
> ScrapyDeprecationWarning: Response.body_as_unicode() is deprecated, please
> use Response.text instead.
> self.assertEqual(response.body_as_unicode(), body_unicode)
>
> tests/test_http_response.py::TextResponseTest::test_unicode_body
> tests/test_http_response.py::HtmlResponseTest::test_unicode_body
> tests/test_http_response.py::XmlResponseTest::test_unicode_body
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_http_response.py:348:
> ScrapyDeprecationWarning: Response.body_as_unicode() is deprecated, please
> use Response.text instead.
> self.assertTrue(isinstance(r1.body_as_unicode(), str))
>
> tests/test_http_response.py::TextResponseTest::test_unicode_body
> tests/test_http_response.py::HtmlResponseTest::test_unicode_body
> tests/test_http_response.py::XmlResponseTest::test_unicode_body
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_http_response.py:349:
> ScrapyDeprecationWarning: Response.body_as_unicode() is deprecated, please
> use Response.text instead.
> self.assertEqual(r1.body_as_unicode(), unicode_string)
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_item.py:331:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(BaseItem(), BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
> tests/test_item.py::BaseItemTest::test_isinstance_check
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_item.py:332:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(SubclassedBaseItem(), BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_item.py:333:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(Item(), BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_item.py:334:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(SubclassedItem(), BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_item.py:337:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(BaseItem(), _BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_item.py:338:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(SubclassedBaseItem(), _BaseItem))
>
> tests/test_proxy_connect.py::ProxyConnectTestCase::test_https_connect_tunnel
> tests/test_proxy_connect.py::ProxyConnectTestCase::test_https_tunnel_without_leak_proxy_authorization_header
> /usr/lib/python3/dist-packages/service_identity/pyopenssl.py:49:
> SubjectAltNameWarning: Certificate with CN 'mitmproxy' has no
> `subjectAltName`, falling back to check for a `commonName` for now. This
> feature is being removed by major browsers and deprecated by RFC 2818.
> service_identity will remove the support for it in mid-2018.
> cert_patterns=extract_ids(connection.get_peer_certificate()),
>
> tests/test_utils_deprecate.py::WarnWhenSubclassedTest::test_warning_on_instance
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_utils_deprecate.py:113:
> MyWarning: tests.test_utils_deprecate.UserClass inherits from deprecated
> class tests.test_utils_deprecate.Deprecated, please inherit from
> tests.test_utils_deprecate.NewName. (warning only on first subclass, there
> may be others)
> class UserClass(Deprecated):
>
> tests/test_utils_python.py::UtilsPythonTestCase::test_weakkeycache
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build/tests/test_utils_python.py:163:
> ScrapyDeprecationWarning: The WeakKeyCache class is deprecated
> wk = WeakKeyCache(lambda k: next(_values))
>
> -- Docs: https://docs.pytest.org/en/stable/warnings.html
> =========================== short test summary info ============================
> FAILED tests/test_request_cb_kwargs.py::CallbackKeywordArgumentsTestCase::test_callback_kwargs
> = 1 failed, 2522 passed, 95 skipped, 24 deselected, 22 xfailed, 1162 warnings in 337.48s (0:05:37) =
> E: pybuild pybuild:355: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_scrapy/build; python3.10 -m pytest --ignore tests/test_command_check.py -k 'not (test_squeues.py and (test_peek_fifo or test_peek_one_element or test_peek_lifo))'
> I: pybuild pybuild:286: cd /<<PKGBUILDDIR>>/tests/keys; cat example-com.key.pem example-com.cert.pem >cert.pem
> I: pybuild base:237: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build; python3.9 -m pytest --ignore tests/test_command_check.py -k 'not (test_squeues.py and (test_peek_fifo or test_peek_one_element or test_peek_lifo))'
> ============================= test session starts ==============================
> platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.0
> rootdir: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build, configfile: pytest.ini
> collected 2664 items / 24 deselected / 2640 selected
>
> docs/intro/tutorial.rst ........................ [ 0%]
> docs/topics/commands.rst .......... [ 1%]
> docs/topics/debug.rst ..... [ 1%]
> docs/topics/developer-tools.rst ..... [ 1%]
> docs/topics/request-response.rst . [ 1%]
> scrapy/pqueues.py . [ 1%]
> scrapy/core/downloader/handlers/http11.py . [ 1%]
> scrapy/downloadermiddlewares/ajaxcrawl.py . [ 1%]
> scrapy/extensions/httpcache.py . [ 1%]
> scrapy/http/cookies.py . [ 1%]
> scrapy/pipelines/media.py . [ 1%]
> scrapy/utils/deprecate.py . [ 1%]
> scrapy/utils/misc.py . [ 2%]
> scrapy/utils/python.py ..... [ 2%]
> scrapy/utils/template.py . [ 2%]
> scrapy/utils/url.py . [ 2%]
> tests/test_closespider.py .... [ 2%]
> tests/test_command_fetch.py .... [ 2%]
> tests/test_command_parse.py .............. [ 3%]
> tests/test_command_shell.py ................ [ 3%]
> tests/test_command_version.py .. [ 3%]
> tests/test_commands.py ...........................................s.s.ss [ 5%]
> s.ssss...... [ 6%]
> tests/test_contracts.py ......... [ 6%]
> tests/test_core_downloader.py . [ 6%]
> tests/test_crawl.py .....................sssssss..........x. [ 7%]
> tests/test_crawler.py ................................. [ 9%]
> tests/test_dependencies.py s. [ 9%]
> tests/test_downloader_handlers.py ...................................... [ 10%]
> ........................................................................ [ 13%]
> ........................................................................ [ 16%]
> ..s.....s................................. [ 17%]
> tests/test_downloader_handlers_http2.py ..............................s. [ 19%]
> .....x.....ssssss......................x.....ssssss..................... [ 21%]
> .x.....ssssss......................x.....ssssss.......................s. [ 24%]
> .... [ 24%]
> tests/test_downloadermiddleware.py .........s [ 25%]
> tests/test_downloadermiddleware_ajaxcrawlable.py ..... [ 25%]
> tests/test_downloadermiddleware_cookies.py ......xx...x..... [ 25%]
> tests/test_downloadermiddleware_decompression.py ... [ 25%]
> tests/test_downloadermiddleware_defaultheaders.py .. [ 26%]
> tests/test_downloadermiddleware_downloadtimeout.py .... [ 26%]
> tests/test_downloadermiddleware_httpauth.py ........ [ 26%]
> tests/test_downloadermiddleware_httpcache.py ........................... [ 27%]
> ..... [ 27%]
> tests/test_downloadermiddleware_httpcompression.py ................s.... [ 28%]
> [ 28%]
> tests/test_downloadermiddleware_httpproxy.py .......... [ 28%]
> tests/test_downloadermiddleware_redirect.py ........................ [ 29%]
> tests/test_downloadermiddleware_retry.py ............................... [ 30%]
> [ 30%]
> tests/test_downloadermiddleware_robotstxt.py ..........sssssssssssssssss [ 31%]
> sss [ 32%]
> tests/test_downloadermiddleware_stats.py ... [ 32%]
> tests/test_downloadermiddleware_useragent.py ..... [ 32%]
> tests/test_dupefilters.py ......... [ 32%]
> tests/test_engine.py ...... [ 32%]
> tests/test_engine_stop_download_bytes.py ............ [ 33%]
> tests/test_engine_stop_download_headers.py ............ [ 33%]
> tests/test_exporters.py ................................................ [ 35%]
> ........................................................................ [ 38%]
> .................................... [ 39%]
> tests/test_extension_telnet.py ... [ 39%]
> tests/test_feedexport.py .........................sss................... [ 41%]
> ................... [ 42%]
> tests/test_http2_client_protocol.py .......................... [ 43%]
> tests/test_http_cookies.py ............ [ 43%]
> tests/test_http_headers.py ................. [ 44%]
> tests/test_http_request.py ............................................. [ 46%]
> ........................................................................ [ 48%]
> ....................................................... [ 50%]
> tests/test_http_response.py ............................................ [ 52%]
> ........................................................................ [ 55%]
> ........................................................................ [ 58%]
> . [ 58%]
> tests/test_item.py ........................ [ 59%]
> tests/test_link.py ... [ 59%]
> tests/test_linkextractors.py ................................. [ 60%]
> tests/test_loader.py ................................................... [ 62%]
> ................... [ 63%]
> tests/test_loader_deprecated.py ........................................ [ 64%]
> ............... [ 65%]
> tests/test_logformatter.py .................... [ 65%]
> tests/test_mail.py ...... [ 66%]
> tests/test_middleware.py .... [ 66%]
> tests/test_pipeline_crawl.py ........ [ 66%]
> tests/test_pipeline_files.py ........................ss [ 67%]
> tests/test_pipeline_images.py ................... [ 68%]
> tests/test_pipeline_media.py ................................. [ 69%]
> tests/test_pipelines.py s... [ 69%]
> tests/test_proxy_connect.py ... [ 69%]
> tests/test_request_attribute_binding.py ....... [ 70%]
> tests/test_request_cb_kwargs.py . [ 70%]
> tests/test_request_left.py .... [ 70%]
> tests/test_responsetypes.py ....... [ 70%]
> tests/test_robotstxt_interface.py .s..s..ssssssssssssss.....s. [ 71%]
> tests/test_scheduler.py .................. [ 72%]
> tests/test_selector.py ....... [ 72%]
> tests/test_signals.py s [ 72%]
> tests/test_spider.py ................................................... [ 74%]
> ................................ [ 75%]
> tests/test_spidermiddleware.py .... [ 75%]
> tests/test_spidermiddleware_depth.py . [ 75%]
> tests/test_spidermiddleware_httperror.py ............ [ 76%]
> tests/test_spidermiddleware_offsite.py ........ [ 76%]
> tests/test_spidermiddleware_output_chain.py ......... [ 77%]
> tests/test_spidermiddleware_referer.py ................................. [ 78%]
> .... [ 78%]
> tests/test_spidermiddleware_urllength.py .. [ 78%]
> tests/test_spiderstate.py ... [ 78%]
> tests/test_squeues.py ......x.............x.............x.............x. [ 80%]
> ............x.............x.................x.................x......... [ 83%]
> ........x.................x.................x............x......... [ 85%]
> tests/test_stats.py .... [ 85%]
> tests/test_toplevel.py ...... [ 86%]
> tests/test_urlparse_monkeypatches.py . [ 86%]
> tests/test_utils_asyncio.py .. [ 86%]
> tests/test_utils_conf.py .................... [ 87%]
> tests/test_utils_console.py .ss [ 87%]
> tests/test_utils_curl.py ............. [ 87%]
> tests/test_utils_datatypes.py ......................... [ 88%]
> tests/test_utils_defer.py ..........x [ 88%]
> tests/test_utils_deprecate.py ................ [ 89%]
> tests/test_utils_display.py ........ [ 89%]
> tests/test_utils_gz.py ...... [ 90%]
> tests/test_utils_httpobj.py . [ 90%]
> tests/test_utils_iterators.py ..............x................. [ 91%]
> tests/test_utils_log.py .......... [ 91%]
> tests/test_utils_project.py ..... [ 91%]
> tests/test_utils_python.py .................... [ 92%]
> tests/test_utils_reqser.py .............. [ 93%]
> tests/test_utils_request.py .... [ 93%]
> tests/test_utils_response.py ..... [ 93%]
> tests/test_utils_serialize.py ...... [ 93%]
> tests/test_utils_signal.py ....s. [ 94%]
> tests/test_utils_sitemap.py .......... [ 94%]
> tests/test_utils_spider.py .. [ 94%]
> tests/test_utils_template.py . [ 94%]
> tests/test_utils_trackref.py ..... [ 94%]
> tests/test_utils_url.py ................................................ [ 96%]
> ...s........... [ 97%]
> tests/test_webclient.py ................ [ 97%]
> tests/test_cmdline/__init__.py ..... [ 97%]
> tests/test_cmdline_crawl_with_pipeline/__init__.py .. [ 97%]
> tests/test_settings/__init__.py ............................... [ 99%]
> tests/test_spiderloader/__init__.py ............ [ 99%]
> tests/test_utils_misc/__init__.py ........ [ 99%]
> tests/test_utils_misc/test_return_with_argument_inside_generator.py ... [100%]
>
> =============================== warnings summary ===============================
> ../../../../../../usr/lib/python3/dist-packages/_pytest/config/__init__.py:1233
> /usr/lib/python3/dist-packages/_pytest/config/__init__.py:1233:
> PytestConfigWarning: Unknown config option: flake8-ignore
>
> self._warn_or_fail_if_strict(f"Unknown config option: {key}\n")
>
> ../../../../../../usr/lib/python3/dist-packages/_pytest/config/__init__.py:1233
> /usr/lib/python3/dist-packages/_pytest/config/__init__.py:1233:
> PytestConfigWarning: Unknown config option: flake8-max-line-length
>
> self._warn_or_fail_if_strict(f"Unknown config option: {key}\n")
>
> ../../../../../../usr/lib/python3/dist-packages/sybil/integration/pytest.py:58: 45 warnings
> /usr/lib/python3/dist-packages/sybil/integration/pytest.py:58:
> PytestDeprecationWarning: A private pytest class or function was used.
> self._request = fixtures.FixtureRequest(self)
>
> ../../../../../../usr/lib/python3.9/importlib/__init__.py:127
> /usr/lib/python3.9/importlib/__init__.py:127: ScrapyDeprecationWarning:
> Module `scrapy.utils.py36` is deprecated, please import from
> `scrapy.utils.asyncgen` instead.
> return _bootstrap._gcd_import(name[level:], package, level)
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25:
> DeprecationWarning: twisted.test.proto_helpers.StringTransport was deprecated
> in Twisted 19.7.0: Please use twisted.internet.testing.StringTransport
> instead.
> from twisted.test.proto_helpers import (
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25:
> DeprecationWarning: twisted.test.proto_helpers.waitUntilAllDisconnected was
> deprecated in Twisted 19.7.0: Please use
> twisted.internet.testing.waitUntilAllDisconnected instead.
> from twisted.test.proto_helpers import (
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:25:
> DeprecationWarning: twisted.test.proto_helpers.EventLoggingObserver was
> deprecated in Twisted 19.7.0: Please use
> twisted.internet.testing.EventLoggingObserver instead.
> from twisted.test.proto_helpers import (
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1643
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1643:
> DeprecationWarning: twisted.web.client.HTTPPageGetter was deprecated in
> Twisted 16.7.0: please use https://pypi.org/project/treq/ or
> twisted.web.client.Agent instead
> protocolClass = client.HTTPPageGetter
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1672
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1672:
> DeprecationWarning: twisted.web.client.HTTPPageGetter was deprecated in
> Twisted 16.7.0: please use https://pypi.org/project/treq/ or
> twisted.web.client.Agent instead
> protocolClass = client.HTTPPageGetter
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1703
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1703:
> DeprecationWarning: twisted.web.client.HTTPPageDownloader was deprecated in
> Twisted 16.7.0: please use https://pypi.org/project/treq/ or
> twisted.web.client.Agent instead
> protocolClass = client.HTTPPageDownloader
>
> ../../../../../../usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1713
> /usr/lib/python3/dist-packages/twisted/web/test/test_webclient.py:1713:
> DeprecationWarning: twisted.web.client.HTTPPageDownloader was deprecated in
> Twisted 16.7.0: please use https://pypi.org/project/treq/ or
> twisted.web.client.Agent instead
> protocolClass = client.HTTPPageDownloader
>
> tests/test_crawl.py: 2 warnings
> tests/test_downloader_handlers.py: 143 warnings
> tests/test_downloader_handlers_http2.py: 150 warnings
> tests/test_proxy_connect.py: 2 warnings
> tests/test_webclient.py: 4 warnings
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/core/downloader/contextfactory.py:54:
> DeprecationWarning: Passing method to
> twisted.internet.ssl.CertificateOptions was deprecated in Twisted 17.1.0.
> Please use a combination of insecurelyLowerMinimumTo, raiseMinimumTo, and
> lowerMaximumSecurityTo instead, as Twisted will correctly configure the
> method.
> return CertificateOptions(
>
> tests/test_downloadermiddleware_httpauth.py::HttpAuthMiddlewareLegacyTest::test_auth
> tests/test_downloadermiddleware_httpauth.py::HttpAuthMiddlewareLegacyTest::test_auth_already_set
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/downloadermiddlewares/httpauth.py:32:
> ScrapyDeprecationWarning: Using HttpAuthMiddleware without http_auth_domain
> is deprecated and can cause security problems if the spider makes requests to
> several different domains. http_auth_domain will be set to the domain of the
> first request, please set it to the correct value explicitly.
> warnings.warn('Using HttpAuthMiddleware without http_auth_domain is
> deprecated and can cause security '
>
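The HttpAuthMiddleware warning above points at the new `http_auth_domain`
spider attribute introduced in Scrapy 2.5.1. A minimal sketch of the fix it
asks for, using a plain class as a stand-in for `scrapy.Spider` so the
example stays self-contained (the attribute names are the real Scrapy ones):

```python
# Sketch of a spider pinning its HTTP auth credentials to one domain,
# as the ScrapyDeprecationWarning in the log requests. A plain class
# stands in for scrapy.Spider to keep the example dependency-free.
class ExampleSpider:  # in a real project: class ExampleSpider(scrapy.Spider)
    name = "example"
    http_user = "user"
    http_pass = "secret"
    # Without this attribute, HttpAuthMiddleware sends the credentials
    # to every domain the spider contacts -- the security problem the
    # warning describes.
    http_auth_domain = "example.com"
```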
> tests/test_exporters.py::PythonItemExporterTest::test_export_binary
> tests/test_exporters.py::PythonItemExporterDataclassTest::test_export_binary
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/exporters.py:307:
> ScrapyDeprecationWarning: PythonItemExporter will drop support for binary
> export in the future
> warnings.warn(
>
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
> tests/test_feedexport.py::FeedExportTest::test_export_no_items_multiple_feeds
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: LogOnStoreFileStorage does not support the
> 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_failing_logs_blocking_feed_storage
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: FailingBlockingFeedStorage does not support the
> 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
> tests/test_feedexport.py::FeedExportTest::test_multiple_feeds_success_logs_blocking_feed_storage
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: DummyBlockingFeedStorage does not support the
> 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py: 12 warnings
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:247:
> ScrapyDeprecationWarning: The `FEED_URI` and `FEED_FORMAT` settings have
> been deprecated in favor of the `FEEDS` setting. Please see the `FEEDS`
> setting docs for more details
> exporter = cls(crawler)
>
> tests/test_feedexport.py::StdoutFeedStoragePreFeedOptionsTest::test_init
> tests/test_feedexport.py::StdoutFeedStoragePreFeedOptionsTest::test_init
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: StdoutFeedStorageWithoutFeedOptions does not
> support the 'feed_options' keyword argument. Add a 'feed_options' parameter
> to its signature to remove this warning. This parameter will become mandatory
> in a future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::FileFeedStoragePreFeedOptionsTest::test_init
> tests/test_feedexport.py::FileFeedStoragePreFeedOptionsTest::test_init
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: FileFeedStorageWithoutFeedOptions does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_from_crawler
> tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_from_crawler
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning:
> S3FeedStorageWithoutFeedOptionsWithFromCrawler.from_crawler does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_init
> tests/test_feedexport.py::S3FeedStoragePreFeedOptionsTest::test_init
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: S3FeedStorageWithoutFeedOptions does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_from_crawler
> tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_from_crawler
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning:
> FTPFeedStorageWithoutFeedOptionsWithFromCrawler.from_crawler does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
> tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_init
> tests/test_feedexport.py::FTPFeedStoragePreFeedOptionsTest::test_init
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/scrapy/extensions/feedexport.py:38:
> ScrapyDeprecationWarning: FTPFeedStorageWithoutFeedOptions does not support
> the 'feed_options' keyword argument. Add a 'feed_options' parameter to its
> signature to remove this warning. This parameter will become mandatory in a
> future version of Scrapy.
> warnings.warn(
>
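All of the `feed_options` warnings above describe the same mechanical fix:
custom feed storage classes (and their `from_crawler` factories) should grow
a `feed_options` keyword argument. A hypothetical storage class sketching the
change — the class name and body are illustrative, only the signatures match
what the warnings ask for:

```python
# Hypothetical custom feed storage updated for the 'feed_options'
# keyword argument that the ScrapyDeprecationWarnings in the log say
# will become mandatory in a future Scrapy version.
class MyFeedStorage:
    # Before: def __init__(self, uri):
    def __init__(self, uri, *, feed_options=None):
        self.uri = uri
        self.feed_options = feed_options or {}

    @classmethod
    def from_crawler(cls, crawler, uri, *, feed_options=None):
        # The *WithFromCrawler warnings show that from_crawler
        # needs the same parameter and must pass it through.
        return cls(uri, feed_options=feed_options)
```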
> tests/test_http_response.py: 32 warnings
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_http_response.py:137:
> ScrapyDeprecationWarning: Response.body_as_unicode() is deprecated, please
> use Response.text instead.
> self.assertEqual(response.body_as_unicode(), body_unicode)
>
> tests/test_http_response.py::TextResponseTest::test_unicode_body
> tests/test_http_response.py::HtmlResponseTest::test_unicode_body
> tests/test_http_response.py::XmlResponseTest::test_unicode_body
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_http_response.py:348:
> ScrapyDeprecationWarning: Response.body_as_unicode() is deprecated, please
> use Response.text instead.
> self.assertTrue(isinstance(r1.body_as_unicode(), str))
>
> tests/test_http_response.py::TextResponseTest::test_unicode_body
> tests/test_http_response.py::HtmlResponseTest::test_unicode_body
> tests/test_http_response.py::XmlResponseTest::test_unicode_body
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_http_response.py:349:
> ScrapyDeprecationWarning: Response.body_as_unicode() is deprecated, please
> use Response.text instead.
> self.assertEqual(r1.body_as_unicode(), unicode_string)
>
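The `Response.body_as_unicode()` deprecation is a one-line rename in test
code: `Response.text` decodes the raw body with the response encoding, which
is exactly what `body_as_unicode()` returned. What the migration amounts to,
sketched without requiring Scrapy itself:

```python
# What Response.text does for a TextResponse, sketched without
# Scrapy: decode the raw body bytes using the response encoding.
# body_as_unicode() returned exactly this value, hence the rename.
raw_body = "caf\u00e9".encode("utf-8")  # stand-in for response.body
encoding = "utf-8"                      # stand-in for response.encoding
text = raw_body.decode(encoding)        # equivalent of response.text
```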
> tests/test_item.py::BaseItemTest::test_isinstance_check
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_item.py:331:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(BaseItem(), BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
> tests/test_item.py::BaseItemTest::test_isinstance_check
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_item.py:332:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(SubclassedBaseItem(), BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_item.py:333:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(Item(), BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_item.py:334:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(SubclassedItem(), BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_item.py:337:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(BaseItem(), _BaseItem))
>
> tests/test_item.py::BaseItemTest::test_isinstance_check
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_item.py:338:
> ScrapyDeprecationWarning: scrapy.item.BaseItem is deprecated, please use
> scrapy.item.Item instead
> self.assertTrue(isinstance(SubclassedBaseItem(), _BaseItem))
>
> tests/test_proxy_connect.py::ProxyConnectTestCase::test_https_connect_tunnel
> tests/test_proxy_connect.py::ProxyConnectTestCase::test_https_tunnel_without_leak_proxy_authorization_header
> /usr/lib/python3/dist-packages/service_identity/pyopenssl.py:49:
> SubjectAltNameWarning: Certificate with CN 'mitmproxy' has no
> `subjectAltName`, falling back to check for a `commonName` for now. This
> feature is being removed by major browsers and deprecated by RFC 2818.
> service_identity will remove the support for it in mid-2018.
> cert_patterns=extract_ids(connection.get_peer_certificate()),
>
> tests/test_utils_deprecate.py::WarnWhenSubclassedTest::test_warning_on_instance
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_utils_deprecate.py:113:
> MyWarning: tests.test_utils_deprecate.UserClass inherits from deprecated
> class tests.test_utils_deprecate.Deprecated, please inherit from
> tests.test_utils_deprecate.NewName. (warning only on first subclass, there
> may be others)
> class UserClass(Deprecated):
>
> tests/test_utils_python.py::UtilsPythonTestCase::test_weakkeycache
>
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.9_scrapy/build/tests/test_utils_python.py:163:
> ScrapyDeprecationWarning: The WeakKeyCache class is deprecated
> wk = WeakKeyCache(lambda k: next(_values))
>
> -- Docs: https://docs.pytest.org/en/stable/warnings.html
> = 2523 passed, 95 skipped, 24 deselected, 22 xfailed, 451 warnings in 342.63s (0:05:42) =
> dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.10 3.9" returned exit code 13
The full build log is available from:
http://qa-logs.debian.net/2021/12/20/python-scrapy_2.5.1-1_unstable.log
A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!
If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects
If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.
--- End Message ---