On Mon, Oct 07, 2024 at 10:38:13AM +0200, Santiago Vila wrote:
> During a rebuild of all packages in unstable, your package failed to build:
[...]
> pydoctor/test/__init__.py:13: in <module>
>     from pydoctor.templatewriter import IWriter, TemplateLookup
> pydoctor/templatewriter/__init__.py:423: in <module>
>     from pydoctor.templatewriter.writer import TemplateWriter
> pydoctor/templatewriter/writer.py:10: in <module>
>     from pydoctor.templatewriter import (
> pydoctor/templatewriter/search.py:16: in <module>
>     from lunr import lunr, get_default_builder
> /usr/lib/python3/dist-packages/lunr/__init__.py:1: in <module>
>     from lunr.lunr import lunr, get_default_builder
> /usr/lib/python3/dist-packages/lunr/lunr.py:1: in <module>
>     from lunr import languages as lang
> /usr/lib/python3/dist-packages/lunr/languages/__init__.py:34: in <module>
>     import nltk  # type: ignore
> /usr/lib/python3/dist-packages/nltk/__init__.py:156: in <module>
>     from nltk.stem import *
> /usr/lib/python3/dist-packages/nltk/stem/__init__.py:34: in <module>
>     from nltk.stem.wordnet import WordNetLemmatizer
> /usr/lib/python3/dist-packages/nltk/stem/wordnet.py:13: in <module>
>     class WordNetLemmatizer:
> /usr/lib/python3/dist-packages/nltk/stem/wordnet.py:48: in WordNetLemmatizer
>     morphy = wn.morphy
> /usr/lib/python3/dist-packages/nltk/corpus/util.py:120: in __getattr__
>     self.__load()
> /usr/lib/python3/dist-packages/nltk/corpus/util.py:86: in __load
>     raise e
> /usr/lib/python3/dist-packages/nltk/corpus/util.py:81: in __load
>     root = nltk.data.find(f"{self.subdir}/{self.__name}")
> /usr/lib/python3/dist-packages/nltk/data.py:579: in find
>     raise LookupError(resource_not_found)
> E   LookupError:
> E   **********************************************************************
> E     Resource wordnet not found.
> E     Please use the NLTK Downloader to obtain the resource:
> E
> E     >>> import nltk
> E     >>> nltk.download('wordnet')
> E     
> E     For more information see: https://www.nltk.org/data.html
> E
> E     Attempted to load corpora/wordnet
> E
> E     Searched in:
> E       - '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_pydoctor/nltk_data'
> E       - '/usr/nltk_data'
> E       - '/usr/share/nltk_data'
> E       - '/usr/lib/nltk_data'
> E       - '/usr/share/nltk_data'
> E       - '/usr/local/share/nltk_data'
> E       - '/usr/lib/nltk_data'
> E       - '/usr/local/lib/nltk_data'
> E   **********************************************************************

I assume this is because some downloadable data went away, though I'm
not certain.  Still, we obviously shouldn't have an implicit dependency
on downloaded data during package builds.

Carsten, what would you think of this patch to python-lunr, which fixes
both pydoctor and twisted (and probably a number of other packages,
since mkdocs also depends on python3-lunr)?  nltk is an optional
dependency of lunr, so Suggests seems like the right representation of
it.
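For reference, the reason demoting nltk is enough appears to be that
lunr already guards its nltk import (the traceback points at line 34 of
lunr/languages/__init__.py, which suggests the import sits inside a
try block).  Here's a sketch of that pattern -- illustrative only, not
the exact upstream code, and the helper name is mine.  The key point is
that a missing nltk raises ImportError, which such a guard catches,
whereas an installed nltk whose wordnet data was never downloaded fails
at import time with LookupError, as in the traceback above, which slips
straight past it:

```python
def optional_import(module_name):
    """Return True if module_name imports cleanly, False if absent.

    Only ImportError is treated as "dependency not installed".  A
    LookupError raised during import (as above, where nltk is present
    but the wordnet corpus is missing) is NOT caught and propagates to
    the caller.  That's why removing nltk from the build environment
    fixes the build, while merely lacking the downloaded data does not.
    """
    try:
        __import__(module_name)
        return True
    except ImportError:
        return False

# e.g. LANGUAGE_SUPPORT = optional_import("nltk")
```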

diff --git a/debian/changelog b/debian/changelog
index 5f97ca6..e3b3bcd 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -1,3 +1,9 @@
+python-lunr (0.7.0-2) UNRELEASED; urgency=medium
+
+  * Demote python3-nltk from Depends to Suggests.
+
+ -- Colin Watson <cjwat...@debian.org>  Mon, 07 Oct 2024 11:10:01 +0100
+
 python-lunr (0.7.0-1) unstable; urgency=medium
 
   * [fc4a05d] New upstream version 0.7.0
diff --git a/debian/control b/debian/control
index ebff949..0838508 100644
--- a/debian/control
+++ b/debian/control
@@ -45,10 +45,11 @@ Description: Python implementation of Lunr.js (Documentation)
 Package: python3-lunr
 Architecture: all
 Depends:
- python3-nltk,
  ${misc:Depends},
  ${python3:Depends},
-Suggests: python-lunr-doc
+Suggests:
+ python-lunr-doc,
+ python3-nltk,
 Description: Python implementation of Lunr.js (Python3 version)
  This package includes the Python version of Lunr.js aims to bring the simple
  and powerful full text search capabilities into Python guaranteeing results as

Thanks,

-- 
Colin Watson (he/him)                              [cjwat...@debian.org]
