Re: Upstreams with "official" tarballs differing from their git
Hello,

On Sun 16 Feb 2025 at 06:18am -06, rhys wrote:

> The potential for additional function is not relevant.
>
> If the upstream intends to distribute it with a tarball, that's the "golden"
> package that downstream should base code upon.

The Debian project officially disagrees with you. The preferred form for
modification, which is what NEW cares about, is determined by upstream's
actual practices, not by their fiat. We frequently reject packages from
NEW because they contain minified files; the fix is to add the
corresponding source to debian/missing-sources/.

-- 
Sean Whitton
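For anyone who hasn't dealt with such a rejection before, a minimal sketch of the missing-sources convention; the file names below are invented for illustration, not taken from any particular package:

    # the minified artefact shipped by upstream stays where it is:
    js/widget.min.js
    # its readable counterpart is added under debian/, mirroring the path:
    debian/missing-sources/js/widget.js

When the minified file isn't actually used by the package, another common route is to repack the tarball and drop it via Files-Excluded in debian/copyright instead.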
Bug#1096128: ITP: python-nbstripout -- strip output from Jupyter and IPython notebooks
Package: wnpp
Severity: wishlist
Owner: Francesco Ballarin
X-Debbugs-Cc: debian-devel@lists.debian.org, francesco.balla...@unicatt.it

* Package name    : python-nbstripout
  Version         : 0.8.1
  Upstream Contact: Florian Rathgeber
* URL             : https://github.com/kynan/nbstripout
* License         : MIT
  Programming Lang: Python
  Description     : Strip output from Jupyter and IPython notebooks

Reads a notebook from a file or stdin, strips output and some metadata,
and writes the "cleaned" version of the notebook to the original file or
stdout. Intended to be used as a Git filter or pre-commit hook for users
who don't want to track output in Git.

Maintainer: Debian Python Team, at
https://salsa.debian.org/python-team/packages/python-nbstripout
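A quick illustration of the git-filter use mentioned above (the commands are the ones documented by upstream; the notebook name is a placeholder):

    # register nbstripout as a git clean filter in the current repository
    nbstripout --install

    # or strip a single notebook in place
    nbstripout analysis.ipynb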
Automatically detecting header-only libraries and setting Static-Built-Using automatically
Hello fellow Debian people,

[Please keep me in CC as I'm not subscribed to -devel]

As the primary maintainer of a header-only C/C++ package, I'm really
happy to see the progress in formalizing the use of 'Static-Built-Using'
[0] to aid in determining when packages that use header-only libraries
[1] need a binNMU rebuild.

However, I'm concerned about implementing this, and I would like to find
an automatic solution so that we don't have to manually update every
package that Build-Depends on a header-only library. While
`dh_builtusing` [2] makes this a lot easier, maintainers still have to
identify which of their dependencies are header-only libraries, not to
mention which of their packages need upgrading.

I would like to help extend debhelper to detect these automatically and
add the appropriate 'Static-Built-Using' lines to the binary package
control file (DEBIAN/control) [3], so that no effort is required from
the maintainers of packages that Build-Depend on header-only libraries.

Please join me at https://bugs.debian.org/1096114 to discuss and
co-ordinate the implementation of an automatic fix.

[0] https://bugs.debian.org/1069256
[1] and other situations; see Maytham Alsudany's current proposed policy
    changes at
    https://bugs.debian.org/cgi-bin/bugreport.cgi?att=1;bug=1069256;filename=0001-Require-use-of-Static-Built-Using-to-declare-statica.patch;msg=95
[2] https://manpages.debian.org/testing/dh-builtusing/dh_builtusing.1.en.html
[3] https://www.debian.org/doc/debian-policy/ch-controlfields.html#debian-binary-package-control-files-debian-control

Cheers,
-- 
Michael R. Crusoe
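For readers who haven't seen the field in the wild yet, a hedged illustration of what the automatically generated stanza could look like in a binary package's DEBIAN/control; package names and versions are invented for the example:

    Package: mytool
    Version: 1.2.3-1
    Architecture: amd64
    Depends: libc6 (>= 2.36)
    Static-Built-Using: nlohmann-json (= 3.11.3-1), catch2 (= 3.4.0-3)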
Re: Upstreams with "official" tarballs differing from their git
On Sun, Feb 16, 2025 at 10:39:59AM +0800, Sean Whitton wrote:
> I think that basing our work on upstream Git makes our source packages
> more useful, and more accurately reflects our commitment to providing
> the preferred form of modification for everything in our archive.
>
> If our work is based on upstream Git then users can clone source
> packages from salsa (or, better, 'dgit clone' if the maintainer has
> used 'dgit push-source') and can use powerful tools like 'git blame'
> and 'git bisect' to understand their bug.
>
> With tarballs the granularity of these tools is so much less.

This is a false dichotomy, though. It's perfectly possible to use both
in conjunction with each other, by importing a tarball on top of an
upstream git tag so that the differences between them are represented by
a git commit. There are various tools in Debian to help with this.

-- 
Colin Watson (he/him)                              [cjwat...@debian.org]
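One concrete way to do that with git-buildpackage, as a sketch; the remote URL, tag and tarball names are placeholders:

    # make the upstream history and tags available locally
    git remote add upstream https://github.com/example/project.git
    git fetch upstream --tags

    # import the release tarball as a commit on top of the matching tag,
    # so the delta between tag and tarball is a single, inspectable commit
    gbp import-orig --upstream-vcs-tag=v1.2.3 ../project_1.2.3.orig.tar.gz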
Re: Bug#1093192: #1093192 "ITS: vtgrab": no uploaders specified?
On Sun, Feb 16, 2025 at 05:53:37PM +0100, Chris Hofstaedtler wrote:
> On Sun, Feb 16, 2025 at 05:44:34PM +0100, наб wrote:
> > On Sun, Feb 16, 2025 at 05:29:26PM +0100, Chris Hofstaedtler wrote:
> > > On Sun, Feb 16, 2025 at 05:18:39PM +0100, наб wrote:
> > > > Quoting the relevant:
> > > > > It is recommended to choose between one of the two following schemes:
> > > > > 2. Put the mailing list address in the Maintainer field.
> > > > >    In the Uploaders field, put the team members who care for the
> > > > >    package.
> > > >
> > > > In the packages salvaged into the salvage team we have a choice between:
> > > [..]
> > > > 3. Maintainer: salvage team
> > > [..]
> > > > 3 is a better fit for what I term dead-end packages
> > > >    (ones that truly no-one cares about, with no upstream,
> > > >    or no maintainer, or no utility, or otherwise 0 forward motion;
> > > >    and with little potential to generate bugs except 1 FTBFS/decade).
> > > >    This is most of the salvage team packages.
> > >
> > > Why are what you call "dead-end packages" "salvaged" at all? I seem
> > > to recall that the salvaging process is for packages you actually
> > > want to maintain.
> > Because a more aggressive RM RoQA policy got me yelled at last time
> > for making work for the ftpmasters, so I stopped arguing for RMs
> > and do Andreas' preferred methodology of salvaging everything.
> > Doing this allows packages that tend to be in a functionally-orphaned
> > state to be team-maintained in the long term. This satisfies the salvage
> > criteria as I see them and I have an equal interest in every weird
> > ancient FTBFS these packages generate.
>
> If you are still interested in them, then properly document this and
> add yourself to Uploaders:
>
> Not doing this seems like a clear abuse of the ITS process to me.

Yes, indeed. The ITS process is not a process to orphan packages; it is
for taking over maintainership. Long-term interest in the package's
maintainership is required. This is explicitly spelled out in the
procedures; refer to
https://www.debian.org/doc/manuals/developers-reference/pkgs.en.html#package-salvaging

The only process that leads to orphaned packages and has project
consensus is by maintainer action or the MIA process. If a package is no
longer useful, it should be removed. A removal can be announced through
the BTS in advance, for example something like
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1091838

-- 
tobi
Bug#1096105: ITP: ggml -- Tensor library for machine learning
Package: wnpp
Severity: wishlist
Owner: Christian Kastner
X-Debbugs-Cc: debian-devel@lists.debian.org, debian...@lists.debian.org

* Package name    : libggml
* Version         : 0.0+git
* Upstream Contact: The ggml authors
* URL             : https://github.com/ggml-org/ggml
* License         : MIT
* Programming Lang: C/C++
* Description     : Tensor library for machine learning

libggml is a tensor library for machine learning, currently used by
llama.cpp and whisper.cpp.

Features:
 * Low-level cross-platform implementation
 * Integer quantization support
 * Broad hardware support
 * Automatic differentiation
 * ADAM and L-BFGS optimizers
 * No third-party dependencies
 * Zero memory allocations during runtime

This library is not yet stable, so it will only ship private interfaces.
The main advantage of this is to avoid duplication between llama.cpp and
whisper.cpp, and to simplify their build processes, instead of having
embedded copies of libggml.

This package will be maintained under the Debian Deep Learning Team.
Bug#1096132: ITP: golang-github-code-hex-go-generics-cache -- Key:value store/cache library written in Go generics
Package: wnpp
Severity: wishlist
Owner: Martina Ferrari

* Package name    : golang-github-code-hex-go-generics-cache
  Version         : 1.5.1-1
  Upstream Author : Kei Kamikawa
* URL             : https://github.com/Code-Hex/go-generics-cache
* License         : Expat
  Programming Lang: Go
  Description     : Key:value store/cache library written in Go generics

This package provides an in-memory key:value store/cache that is
suitable for applications running on a single machine. This in-memory
cache uses Go Generics (https://go.dev/blog/generics-proposal),
introduced in golang 1.18.
 .
 Supports LRU, LFU, FIFO, MRU, and Clock cache replacement policies.

Note: This library is a new build dependency for Prometheus.
Bug#1096131: ITP: golang-github-r3labs-sse -- Server Sent Events server and client for Golang
Package: wnpp
Severity: wishlist
Owner: Nicolas Peugnet

* Package name    : golang-github-r3labs-sse
  Version         : 2.10.0-1
  Upstream Author : R3 Labs
* URL             : https://github.com/r3labs/sse
* License         : MPL-2.0
  Programming Lang: Go
  Description     : Server Sent Events server and client for Golang

With server-sent events, it's possible for a server to send new data to
a web page at any time, by pushing messages to the web page.
 .
 This library provides a server and a client implementation of SSE for
 Go.

This is a dependency of docker-compose v2.
Re: Bug#1093192: #1093192 "ITS: vtgrab": no uploaders specified?
On Sun, Feb 16, 2025 at 05:44:34PM +0100, наб wrote:
> On Sun, Feb 16, 2025 at 05:29:26PM +0100, Chris Hofstaedtler wrote:
> > On Sun, Feb 16, 2025 at 05:18:39PM +0100, наб wrote:
> > > Quoting the relevant:
> > > > It is recommended to choose between one of the two following schemes:
> > > > 2. Put the mailing list address in the Maintainer field.
> > > >    In the Uploaders field, put the team members who care for the
> > > >    package.
> > >
> > > In the packages salvaged into the salvage team we have a choice between:
> > [..]
> > > 3. Maintainer: salvage team
> > [..]
> > > 3 is a better fit for what I term dead-end packages
> > >    (ones that truly no-one cares about, with no upstream,
> > >    or no maintainer, or no utility, or otherwise 0 forward motion;
> > >    and with little potential to generate bugs except 1 FTBFS/decade).
> > >    This is most of the salvage team packages.
> >
> > Why are what you call "dead-end packages" "salvaged" at all? I seem
> > to recall that the salvaging process is for packages you actually
> > want to maintain.
> Because a more aggressive RM RoQA policy got me yelled at last time
> for making work for the ftpmasters, so I stopped arguing for RMs
> and do Andreas' preferred methodology of salvaging everything.
> Doing this allows packages that tend to be in a functionally-orphaned
> state to be team-maintained in the long term. This satisfies the salvage
> criteria as I see them and I have an equal interest in every weird
> ancient FTBFS these packages generate.

If you are still interested in them, then properly document this and add
yourself to Uploaders:

Not doing this seems like a clear abuse of the ITS process to me.

Otherwise if you just want to "create facts", do an O: upload and set
Maintainer: Debian QA Group.

Chris
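For reference, the two shapes of debian/control being contrasted here look roughly like this; the team name and the addresses other than the QA group's are placeholders:

    # team maintenance, with the interested human declared:
    Maintainer: Some Salvage Team <some-team@lists.debian.org>
    Uploaders: Interested Developer <dev@debian.org>

    # orphaning via an O:/QA upload instead:
    Maintainer: Debian QA Group <packages@qa.debian.org>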
Re: Upstreams with "official" tarballs differing from their git
Hi, Le 2025-02-16 17:00, Julien Puydt a écrit : I have always considered the true way to know if you have the source of a package is: imagine you're stuck somewhere with a Debian mirror and no external link. Could you take over the development of the package at hand? Written like that it looks dramatic, but in that project there are only 5 occurences of %%VERSION_NUM%%. That's only 4 too many, and then it would not be much different than many other project releases where a script (not always provided), a bot (same) or plain old manual intervention (not always fully documented) is used to adjust a hardcoded version number in some file somewhere at release time. But it's true that if you're stranded somewhere you may have some free time at hand to implement your own layer of templating ^ ^. Cheers, -- Julien Plissonneau Duquène
Re: Upstreams with "official" tarballs differing from their git
Julien Puydt writes:

> If you use autotools, you start with configure.ac and .in files. If
> upstream prepared the tree (as they generally do), you also get a
> libtool/configure, etc. From this, you can run ./configure && make &&
> make install ; that will do substitutions in the .in files. And if you
> want to start anew, you can use autoconf/autoheader/whatever to
> re-create the configure/libtool script and then ./configure && make &&
> make install.

This is a long-standing misunderstanding in the Debian community. There
is no guarantee that autoconf/autoheader/whatever re-create all
generated autotools files. In fact, there are several examples of
situations where they are not re-generated (e.g., modified aclocal *.m4
files without a bumped serial number); this is intentional upstream
behaviour from autotools and there are no signs that this will change.

To be certain not to get pre-generated files, one approach is to not use
tarballs with pre-generated content at all, but to insist on packaging
autotools projects based on git content; few upstreams commit generated
files into git, so this ought to be the safest approach. Look at
'inetutils' in Debian for an example.

Another is to manually prepare a list of files to 'rm' in debian/rules,
or to use Files-Excluded in debian/copyright, to remove all generated
autotools files.

/Simon
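A sketch of that last approach; which files are actually generated varies per project, so the list below is only the usual suspects:

    # in the debian/copyright header stanza, so uscan repacks without them:
    Files-Excluded: configure
                    aclocal.m4
                    config.h.in
                    Makefile.in
                    */Makefile.in
                    build-aux/ltmain.sh

    # ...or the equivalent removal in debian/rules before dh_autoreconf runs:
    execute_before_dh_autoreconf:
    	rm -f configure aclocal.m4 config.h.in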
Re: Upstreams with "official" tarballs differing from their git
On Sunday 16 February 2025 at 06:18 -0600, rhys wrote:
>
> If the upstream intends to distribute it with a tarball, that's the
> "golden" package that downstream should base code upon.
>
> Going around that decision means subjecting all of Debian to code
> pulled from their repo outside of their distribution process.

I don't understand: it never was about packaging a random git commit.

Upstream works on code in a preferred form, and tags a definite commit
as being version 3.14159. This gets into some tarballs already in some
cases (github comes to mind); call that source-level-zero.

From this, upstream runs some tool (generally triggered by their
tagging); often some autotools, 'dune' on the package which started this
thread, but rust/js/ruby/whatever also have their own. This tool turns
the git commit tree into some pre-compiled tree, put into another
tarball. Call this source-level-one.

The point I made about the elpi package is that we want to ship
source-level-zero, as that's what upstream works on. This is in line
with Debian shipping autotools-based packages but re-running the
autotools before they run configure again.

> How much function is lost as a result is nothing compared to the
> instability of packages that can result from distributing code that
> was not meant for distribution.

Well, "use what upstream publishes" sounds nice until you look at the
landscape:
- many Python upstreams use pypi and consider that their pypi package is
  what they publish;
- many JavaScript upstreams use npm and consider that their npm package
  is what they publish;
- many Rust upstreams use cargo and consider that their cargo package is
  what they publish;
- many OCaml upstreams use opam and consider that their opam package is
  what they publish;
- many Coq upstreams use a sub-distribution of opam (the Coq Platform)
  and consider this is what they publish;
- etc.

There is a huge confusion in many upstreams' heads that putting their
software on people's computers is what "distributing" means. The fact
that their software is only semi-coherent with the system (only within a
certain programming language boundary) is a problem. Debian is a
whole-system distribution; we make sure software works in a coherent
system-wide way, and basing our packages on code which has already been
pre-packaged for a subpar (language-limited) distribution isn't a good
option.

Cheers,

J.Puydt
Re: Upstreams with "official" tarballs differing from their git
On Sunday 16 February 2025 at 14:51 +0000, Colin Watson wrote:
>
> This is a false dichotomy, though. It's perfectly possible to use both
> in conjunction with each other, by importing a tarball on top of an
> upstream git tag so that the differences between them are represented
> by a git commit. There are various tools in Debian to help with this.

Actually, it's a bit more complex.

If you use autotools, you start with configure.ac and .in files. If
upstream prepared the tree (as they generally do), you also get a
libtool/configure, etc. From this, you can run ./configure && make &&
make install; that will do substitutions in the .in files. And if you
want to start anew, you can use autoconf/autoheader/whatever to
re-create the configure/libtool script and then ./configure && make &&
make install. The .in files are at hand to start again; we have the
developer sources and more. So at this point, no real dichotomy on what
you use as source.

But in the case at hand, where the 'dune' tool is used to configure and
compile the 'elpi' software (and others), things work differently.

If you go here: https://github.com/LPCIC/elpi/tags and get the .tar.gz,
you'll get an extract of the git tree, which lacks the versioning
information but has the %%VERSION_NUM%% (and other) placeholders ready
for substitution. That's what I want to use. So my problem was to
force-feed the missing git information to dune so it can actually make
those substitutions.

Stéphane's suggestion was to use the .tbz taken here:
https://github.com/LPCIC/elpi/releases/tag/v2.0.7
where the substitutions have already been done. But contrary to the
autotools situation, there's no going back: the substitutions are not
just ready to be applied, they are made, done and gone! If I wanted to
re-version as 2.0.7-debian or whatever, that isn't a possibility:
%%VERSION_NUM%% doesn't appear anymore in the tree. The versioning
information is soldered all over the place.

So here there is a clear dichotomy: depending on the tarball, you don't
get the same thing.

I'm hopeful dune upstream will accept my proposal and provide a
force-feeding mechanism to use the real source tree painlessly, because
I really think considering this .tbz a source tarball is incorrect.

I have always considered the true way to know if you have the source of
a package is: imagine you're stuck somewhere with a Debian mirror and no
external link. Could you take over the development of the package at
hand?

Cheers,

J.Puydt
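(A quick way to tell which of the two kinds of tarball you are holding, assuming the watermark names mentioned above:)

    # source-level-zero (tag export): the placeholders are still present
    grep -r '%%VERSION_NUM%%' . | head

    # source-level-one (release .tbz): the same grep comes back empty,
    # because the version string is already baked into the files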
Re: Bug#932103: RFP: fuidshift -- remap a filesystem tree to shift one set of UID/GID ranges to another
Hello,

On Monday 12 August 2019 at 00:04:37 UTC+1, Nicholas D Steeves wrote:

> LXD is going to eventually be packaged for Debian, so bin:fuidshift
> from src:lxd makes sense to me.

I see that fuidshift is now part of the lxd-tools package, so I think
that this RFP can be closed.

Regards,
Vincent
Bug#1096137: ITP: golang-github-landlock-lsm-go-landlock -- A Go library for the Linux Landlock sandboxing feature
Package: wnpp
Severity: wishlist
Owner: Simon Josefsson

* Package name    : golang-github-landlock-lsm-go-landlock
  Version         : 0.0~git20241014.479ddab-1
  Upstream Author : Landlock
* URL             : https://github.com/landlock-lsm/go-landlock
* License         : Expat
  Programming Lang: Go
  Description     : A Go library for the Linux Landlock sandboxing feature

The Go-Landlock library restricts the current process's ability to use
files, using Linux 5.13's Landlock feature. (Package documentation:
https://pkg.go.dev/github.com/landlock-lsm/go-landlock/landlock)

Needed by sbctl:
https://lists.debian.org/debian-go/2025/02/msg00025.html

https://salsa.debian.org/go-team/packages/golang-github-landlock-lsm-go-landlock
https://salsa.debian.org/jas/golang-github-landlock-lsm-go-landlock/-/pipelines/

/Simon
Bug#1096146: ITP: python-pyramid-retry -- Retry policy for the Pyramid web framework
Package: wnpp
Severity: wishlist
Owner: Maximilian Engelhardt
X-Debbugs-Cc: debian-devel@lists.debian.org, m...@daemonizer.de

* Package name    : python-pyramid-retry
  Version         : 2.1.1
  Upstream Contact: Pylons Project
* URL             : https://github.com/Pylons/pyramid_retry
* License         : Expat
  Programming Lang: Python
  Description     : Retry policy for the Pyramid web framework

pyramid_retry is an execution policy for Pyramid that wraps requests and
can retry them a configurable number of times under certain "retryable"
error conditions before indicating a failure to the client.

I use pyramid_retry in a private project and thus want to package it in
Debian. If possible, I want to maintain this package as part of the
Debian Python Team and I will need a sponsor once the package is ready.
Bug#1096145: ITP: python-pylons-sphinx-themes -- Sphinx themes for Pylons related projects
Package: wnpp
Severity: wishlist
Owner: Maximilian Engelhardt
X-Debbugs-Cc: debian-devel@lists.debian.org, m...@daemonizer.de

* Package name    : python-pylons-sphinx-themes
  Version         : 1.0.13
  Upstream Contact: Pylons Project
* URL             : https://github.com/Pylons/pylons-sphinx-themes
* License         : Pyramid (other)
  Programming Lang: Python
  Description     : Sphinx themes for Pylons related projects

The following Pylons Sphinx Themes are provided by this package:
 * pylons   - the generic Pylons Project documentation theme
 * pyramid  - the specific Pyramid documentation theme
 * pylonsfw - the specific Pylons Framework documentation theme

This package is a dependency for building the documentation of
python-pyramid-retry, which I also intend to package.

If possible, I want to maintain this package as part of the Debian
Python Team and I will need a sponsor once the package is ready.
Re: Packages with a history of security issues and whose packaged version is not up to date
On Feb 14, Colin Watson wrote:

> But it doesn't. Santiago's using the data from the security tracker to
> determine whether CVEs are open.

And in the case of one of my own packages these CVEs have not yet been
fixed upstream, not even in an unreleased branch.

-- 
ciao,
Marco
Re: Upstreams with "official" tarballs differing from their git
On Sunday 16 February 2025 at 19:50 +0100, Julien Plissonneau Duquène wrote:
> Hi,
>
> On 2025-02-16 17:00, Julien Puydt wrote:
>
> > I have always considered the true way to know if you have the
> > source of a package is: imagine you're stuck somewhere with a
> > Debian mirror and no external link. Could you take over the
> > development of the package at hand?
>
> Written like that it looks dramatic, but in that project there are
> only 5 occurrences of %%VERSION_NUM%%. That's only 4 too many, and
> then it would not be much different from many other project releases
> where a script (not always provided), a bot (same) or plain old
> manual intervention (not always fully documented) is used to adjust a
> hardcoded version number in some file somewhere at release time. But
> it's true that if you're stranded somewhere you may have some free
> time at hand to implement your own layer of templating ^ ^.

You miss parts of the picture:
- the previous version used %%VERSION_NUM%% in only three places; the
  new one uses it more, so it broke my previous hack;
- there are other things than the substitutions done by dune when
  compiling the package, which do not break the build, but will break
  some dependent packages later on with strange and misleading errors.

My current hack, creating a fake .git, is in fact much more efficient
and less fragile.

As mentioned somewhere in the thread, I proposed to dune upstream a
simple mechanism to bypass this git-reliance issue, which will make
packaging much cleaner.

Cheers,

J.Puydt
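For the curious, a rough reconstruction of what such a fake-.git hack can look like; this is a sketch based on how dune's watermark substitution normally derives the version from git metadata, not the actual rules used in the elpi package:

    # inside the unpacked tag-export tree, before building
    git init -q .
    git add -A
    git -c user.name=packager -c user.email=packager@localhost \
        commit -qm 'throwaway import for dune subst'
    git tag -a v2.0.7 -m v2.0.7
    dune subst        # fills in %%VERSION_NUM%% and the other watermarks
    rm -rf .git       # drop the temporary repository again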
Bug#1096150: ITP: forwords -- A very simple but effective tool to learn foreign language words
Package: wnpp
Severity: wishlist
Owner: Alexander Fomin
X-Debbugs-Cc: debian-devel@lists.debian.org, fomin_a...@yahoo.com

* Package name    : forwords
  Version         : 1.0.2
  Upstream Contact: Alexander Fomin
* URL             : https://salsa.debian.org/AlexFomin/forwords
* License         : GPL-3+
  Programming Lang: C++
  Description     : A very simple but effective tool to learn foreign language words

This program is a very simple but quite effective tool when you are
learning a foreign language. It helps you to expand your foreign word
bank with three simple tests. The program is written using the Qt
library and has some speech ability via espeak.

This approach to the learning process has been used for more than 20
years and it has proven its effectiveness, but the program was never
published. At this time (2024) the program has been rebuilt, refurbished
and published.

I will maintain this package myself. A sponsor is required.
Bug#1096151: ITP: python-briefcase -- Convert Python project to native application
Package: wnpp
Severity: wishlist
Owner: Josenilson Ferreira da Silva
X-Debbugs-Cc: debian-devel@lists.debian.org, nilsonfsi...@hotmail.com

* Package name    : python-briefcase
  Version         : 0.3.22
  Upstream Contact: Russell Keith-Magee
* URL             : https://github.com/beeware/briefcase
* License         : BSD-3-Clause
  Programming Lang: Python
  Description     : Convert Python project to native application

Briefcase is a tool from the BeeWare project that allows you to package
and distribute Python applications as native executables for different
operating systems. With it, a developer can turn a Python project into
an application that can be installed and run like any other common
software on Windows, macOS, Linux, iOS, and Android.
 .
 The tool automates the process of packaging Python applications,
 generating packages in the appropriate format for each operating
 system. This means that it not only compiles Python code into an
 executable format, but also structures the project to meet the specific
 requirements of each platform.
 .
 It generates installation files compatible with each operating system
 (such as .exe, .dmg, .deb, .apk, among others), and it packages
 applications with graphical interfaces so that they are
 indistinguishable from native applications.
 .
 Main features:
  - Independence from the development environment: the application can
    be developed on any operating system and, with Briefcase, can be
    packaged for multiple platforms, without having to be recreated from
    scratch for each of them.
  - Native package generation: creates ready-to-install and ready-to-use
    packages, without the need to install Python separately.
  - Integration with GUI frameworks: support for Toga (from BeeWare) and
    other GUI libraries such as PySide.
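For context, the upstream workflow this tool provides looks roughly like this (command names as documented by Briefcase; project-specific details omitted):

    # scaffold a new project and run it in development mode
    briefcase new
    briefcase dev

    # generate the platform-specific project, compile it, and build an installer
    briefcase create
    briefcase build
    briefcase package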