Bug#1079754: ITP: scaphandre -- electric power/energy consumption monitoring agent
Package: wnpp
Severity: wishlist
Owner: Jonas Smedegaard
X-Debbugs-Cc: debian-devel@lists.debian.org

* Package name    : scaphandre
  Version         : 1.0.0
  Upstream Contact: Benoit Petit
* URL             : https://github.com/hubblo-org/scaphandre
* License         : Apache-2.0
  Programming Lang: Rust
  Description     : electric power/energy consumption monitoring agent

 Scaphandre is a metrology agent dedicated to electric power and energy
 consumption metrics. The goal of the project is to permit any company or
 individual to measure the power consumption of its tech services and get
 this data in a convenient form, sending it through any monitoring or data
 analysis toolchain.

This package will be maintained in the collaborative Debian section of
Salsa, at https://salsa.debian.org/debian/scaphandre
Re: DEP-18: Git and GitLab usage in other Linux distros (Re: Representing Debian Metadata in Git)
Hi,

Quoting Otto Kekäläinen (2024-08-27 08:42:53)
> > Before pushing for new ways of representing Debian stuff in git, I
> > think it would be a good idea to learn from all the other distros and
> > distro-like systems successfully using git [1]. Debian is not the only
> > distro that wants to use git to capture changes and encourage
> > contributions to its packages.
> >
> > Chris
> >
> > [1] alpine, homebrew, freebsd ports come to mind immediately. nixos
> > and others too.
>
> …this is the right attitude and I wanted to cater to it and summarize
> how packaging sources look in various distros.

Thank you for your investigations.

> - The number of contributors/maintainers is low everywhere. Ending
>   single-person maintainership is not going to happen any time soon, but
>   hopefully we can work towards first increasing the pool of contributors
>   who participate, and then expand on practices around Merge Requests and
>   reviews and maybe have some kind of formal sign-offs from at least two
>   people before upload. Initially, perhaps only for the top-150 packages.
>   But before we can institute review workflows, we need to have more
>   unification around the version control and basic packaging workflows.

I'm still dubious any "2 people sign-off" can work [1]. In your
investigations, did you find other distributions which implemented this
successfully?

I think "work towards easier collaboration" and "require more than one
person for every commit/upload" are two very different things which should
be discussed independently.

Thanks!

cheers, josch

[1] My own experience with this comes from my contributions to devscripts,
which is in the debian group and thus "team" maintained; probably all of
you have it installed and should feel responsible for it (right?).
Nevertheless, my MRs mostly get zero replies, so I usually just merge them
after waiting a couple of months. The situation is a bit better for sbuild,
but not by much.
Re: Validating tarballs against git repositories
On Mon, Aug 26, 2024 at 09:57:41PM -0700, Russ Allbery wrote:
> My guess is that the sweet spots are --depth=1 and a full checkout, it's
> not generally possible to tell which a given package needs in advance (in
> other words, it's best handled as a configuration option), and it's
> probably not worth the effort to mess around with any intermediate depth.
> I suspect we'll find that the vast majority of packages work fine with
> --depth=1, and the remaining cases should just use a full checkout to
> avoid creating fragile assumptions that may work today and break tomorrow.

Could --shallow-since= (one month ago, or one year ago) be another
possible thing to consider?

Enrico

--
GPG key: 4096R/634F4BD1E7AD5568 2009-05-08 Enrico Zini
Re: Validating tarballs against git repositories
* Enrico Zini [2024-08-27 11:00]:
> On Mon, Aug 26, 2024 at 09:57:41PM -0700, Russ Allbery wrote:
> > My guess is that the sweet spots are --depth=1 and a full checkout, it's
> > not generally possible to tell which a given package needs in advance (in
> > other words, it's best handled as a configuration option), and it's
> > probably not worth the effort to mess around with any intermediate depth.
> > I suspect we'll find that the vast majority of packages work fine with
> > --depth=1, and the remaining cases should just use a full checkout to
> > avoid creating fragile assumptions that may work today and break tomorrow.
>
> Could --shallow-since= (one month ago, or one year ago) be another
> possible thing to consider?

I recommend: --filter=blob:none

Cheers

Jochen
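As an illustration of the clone variants being weighed in this subthread,
the invocations might look roughly as follows. The repository URL and the
cut-off date are placeholders, not recommendations from anyone in the
thread, and the three clones are alternatives rather than steps to run in
sequence:

    # Placeholder URL -- substitute the actual upstream or packaging repo.
    URL=https://example.org/some-project.git

    # Shallow clone with only the tip commit of the default branch.
    git clone --depth=1 "$URL"

    # Shallow clone truncated by date rather than by commit count
    # (git also accepts relative dates such as "one year ago").
    git clone --shallow-since=2024-01-01 "$URL"

    # Partial ("blobless") clone: full commit and tag history, but file
    # contents are fetched on demand; needs a server that supports
    # partial clone.
    git clone --filter=blob:none "$URL"

    # If full history turns out to be needed later (e.g. for changelog or
    # release-note generation), a shallow clone can be completed in place.
    cd some-project && git fetch --unshallow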
Bug#1079769: ITP: python-farama-notifications -- Notifications for all Farama Foundation maintained libraries
Package: wnpp
Severity: wishlist
Owner: Dong Xu
X-Debbugs-Cc: debian-devel@lists.debian.org

* Package name    : python-farama-notifications
  Version         : 0.0.4
  Upstream Contact: Jordan Terry
* URL             : https://github.com/Farama-Foundation/Farama-Notifications
* License         : MIT
  Programming Lang: Python
  Description     : Notifications for all Farama Foundation maintained libraries

 This package allows for providing notifications on import to all Farama
 Packages.

This is a dependency of the Python packages gymnasium and stable-baselines3,
which are important libraries for deep reinforcement learning and which I
use daily. Deep reinforcement learning is an important technology powering
things like AlphaGo and AlphaFold, and gaming agents in video games like
Dota 2 are usually powered by deep reinforcement learning too. As far as I
know, libraries related to machine learning are still rare in the Debian
archive, so I want to package them for Debian.

I will maintain this package by myself for now. I intend to join the Python
team later, and may then maintain this package together with other team
members. I would also appreciate a sponsor, if you can help. Thank you.
Re: Validating tarballs against git repositories
On 2024-08-26 21:28:38 -0700 (-0700), Otto Kekäläinen wrote:
> On Tue, 2 Apr 2024 at 17:19, Jeremy Stanley wrote:
> > On 2024-04-02 16:44:54 -0700 (-0700), Russ Allbery wrote:
> > [...]
> > > I think a shallow clone of depth 1 is sufficient, although that's not
> > > sufficient to get the correct version number from Git in all cases.
> > [...]
> >
> > Some tools (python3-reno, for example) want to inspect the commits
> > and historical tags on branches, in order to do things like
> > assembling release notes documents. I don't know if any reno-using
> > projects packaged in Debian get release notes included, but if they
> > do then shallow clones would break that process. The python3-pbr
>
> You could use --depth=99 perhaps?
>
> Usually the difference of having depth=1 or 99 isn't that big unless
> there was a recent large refactoring. Git repositories that are very
> big (e.g. LibreOffice, MariaDB) have hundreds of thousands of commits,
> and by doing a depth=99 clone you avoid 99.995% of the history, and in
> projects where changelog/release notes is based on git commits, then
> 99 commits is probably enough.
[...]

Maybe, but only if you want authorship, release note or changelog
generation truncated at 99 commits worth of information at build
time. Take OpenStack Nova for example, which has historically
averaged around a thousand non-merge commits between major releases
every 6 months; --depth=99 would be an order of magnitude too low to
find just one major release's worth of notes and tags on a stable
branch.

Granted, this is why upstream in OpenStack we recommend package
maintainers use our source distribution changelogs rather than
rebuilding source themselves from code from version control. Our
community considers our version control to be an implementation
detail of our development workflow, not the primary means of
supplying source code to downstream consumers. Where version control
is concerned, we consider our source code to include the full Git
metadata and not merely the files held in the worktree. For a
hassle-free source distribution, we extract that metadata and
incorporate the relevant parts as files in our signed source
tarballs.
--
Jeremy Stanley
Re: Debian 10 "buster" moved to archive.debian.org
Hello Ansgar,

On Sat, Mar 23, 2024 at 09:30:49AM +0100, Ansgar 🙀 wrote:
> Debian 10 "buster" has moved to archive.debian.org in order to free
> space on the main mirror network. We plan to start removing files for
> non-LTS architectures in about two weeks; the existing Release files
> will then refer to no longer existing files on the main mirror network.
>
> An exception is the security archive (which already has no non-LTS
> architectures): we will only archive it after LTS support ended.
>
> For LTS users this does not require any changes.

I'd like to understand the process a little better:

1. First, the *non*-LTS architectures (mips, ppc, s390, …) were moved in
   2024-03.
2. LTS for buster ended recently, on 2024-06-30 ¹.
3. The Debian 13 "trixie" release is expected in 2025.
4. ELTS for buster probably runs until 2029-06 ².

When will the *LTS* architectures (x86, arm, …) also be removed? My guess
would be some time after 2., either whenever more free space is needed, or
at the latest around 3.

Thank you for your excellent work, and hopefully for an answer.

Philipp

1: https://wiki.debian.org/LTS
2: https://wiki.debian.org/LTS/Extended
Packages "confirmed" and ready for DD review/possible upload - Debian Mentors - 2024-08-27
Dear all DDs,

Below is the link to the page of packages currently tagged "confirmed" as
being in good order and awaiting a DD review and possible upload. If DDs
could spare the time to pick up a package or two and finish off the package
mentoring process, it would be greatly appreciated.

https://bugs.debian.org/cgi-bin/pkgreport.cgi?include=tags%3Aconfirmed;package=sponsorship-requests

Please check that another DD is not already involved in the package.

P.S. I have emailed some team lists, as we have packages in a variety of
languages that may interest DDs from those teams.

Regards

Phil

--
"I play the game for the game’s own sake"
Arthur Conan Doyle - The Adventure of the Bruce-Partington Plans

--
Buy Me A Coffee: https://buymeacoffee.com/kathenasorg
Internet Relay Chat (IRC): kathenas
Matrix: #kathenas:matrix.org
Website: https://kathenas.org
Instagram: https://instagram.com/kathenasorg/
Threads: https://www.threads.net/@kathenasorg
Re: Validating tarballs against git repositories
On Aug 27, 2024 12:07 PM, Jeremy Stanley wrote:
>
> On 2024-08-26 21:28:38 -0700 (-0700), Otto Kekäläinen wrote:
> > On Tue, 2 Apr 2024 at 17:19, Jeremy Stanley wrote:
> > > On 2024-04-02 16:44:54 -0700 (-0700), Russ Allbery wrote:
> > > [...]
> > > > I think a shallow clone of depth 1 is sufficient, although that's not
> > > > sufficient to get the correct version number from Git in all cases.
> > > [...]
> > >
> > > Some tools (python3-reno, for example) want to inspect the commits
> > > and historical tags on branches, in order to do things like
> > > assembling release notes documents. I don't know if any reno-using
> > > projects packaged in Debian get release notes included, but if they
> > > do then shallow clones would break that process. The python3-pbr
> >
> > You could use --depth=99 perhaps?
> >
> > Usually the difference of having depth=1 or 99 isn't that big unless
> > there was a recent large refactoring. Git repositories that are very
> > big (e.g. LibreOffice, MariaDB) have hundreds of thousands of commits,
> > and by doing a depth=99 clone you avoid 99.995% of the history, and in
> > projects where changelog/release notes is based on git commits, then
> > 99 commits is probably enough.
> [...]
>
> Maybe, but only if you want authorship, release note or changelog
> generation truncated at 99 commits worth of information at build
> time. Take OpenStack Nova for example, which has historically
> averaged around a thousand non-merge commits between major releases
> every 6 months; --depth=99 would be an order of magnitude too low to
> find just one major release's worth of notes and tags on a stable
> branch.
>
> Granted, this is why upstream in OpenStack we recommend package
> maintainers use our source distribution changelogs rather than
> rebuilding source themselves from code from version control. Our
> community considers our version control to be an implementation
> detail of our development workflow, not the primary means of
> supplying source code to downstream consumers. Where version control
> is concerned, we consider our source code to include the full Git
> metadata and not merely the files held in the worktree. For a
> hassle-free source distribution, we extract that metadata and
> incorporate the relevant parts as files in our signed source
> tarballs.
> --
> Jeremy Stanley

All you wrote is precisely why I am not using these tarballs. I know we
don't agree... :)

Also, the FTP masters do NOT want the ChangeLog files, as they are too big
and provide no value when one can check the git repo to find the same info.

Thomas Goirand (zigo)
Re: Validating tarballs against git repositories
On 2024-08-27 19:41:54 +0200 (+0200), tho...@goirand.fr wrote:
[...]
> All you wrote is precisely why I am not using these tarballs. I know we
> don't agree... :)
>
> Also, the FTP masters do NOT want the ChangeLog files, as they are too
> big and provide no value when one can check the git repo to find the
> same info.

Sure, but the assembled release notes are not nearly as large as the
changelog while still relying on having Git history available to build,
and the generated authorship list is referred to in the license
information for at least one OpenStack project as a stand-in for
referencing Git committer metadata.

To put it another way: upstream in OpenStack, when the project was started
in 2010, we were aware that package maintainers preferred signed and
clearly versioned tarballs for every release, so that's what we structured
our workflows and tooling around providing. In the meantime, package
maintainers decided to take advantage of the fact that we use Git
repositories in our development workflow, but the release process we
settled on isn't designed with that in mind, and changing workflows and
processes in a developer community that size is sometimes like trying to
steer a train.
--
Jeremy Stanley
Bug#1079821: ITP: python-aiosomecomfort -- client for Honeywell's US-based cloud devices
Package: wnpp
Severity: wishlist
Owner: Thomas Goirand
X-Debbugs-Cc: debian-devel@lists.debian.org

* Package name    : python-aiosomecomfort
  Version         : 0.0.25
  Upstream Author : Mike Kasper
* URL             : https://github.com/mkmer/AIOSomecomfort
* License         : GPL-3
  Programming Lang: Python
  Description     : A client for Honeywell's US-based cloud devices

 This package provides a client for Honeywell's US-based cloud devices.
 This is for the US model and website. Be aware that EU models are
 different!
 .
 This package is a dependency of Home Assistant.

I intend to maintain this package within the Home Assistant team.
Re: Validating tarballs against git repositories
On 8/27/24 22:30, Jeremy Stanley wrote:
> On 2024-08-27 19:41:54 +0200 (+0200), tho...@goirand.fr wrote:
> [...]
> > All you wrote is precisely why I am not using these tarballs. I know
> > we don't agree... :)
> >
> > Also, the FTP masters do NOT want the ChangeLog files, as they are too
> > big and provide no value when one can check the git repo to find the
> > same info.
>
> Sure, but the assembled release notes are not nearly as large as the
> changelog while still relying on having Git history available to build,
> and the generated authorship list is referred to in the license
> information for at least one OpenStack project as a stand-in for
> referencing Git committer metadata.
>
> To put it another way: upstream in OpenStack, when the project was
> started in 2010, we were aware that package maintainers preferred signed
> and clearly versioned tarballs for every release, so that's what we
> structured our workflows and tooling around providing. In the meantime,
> package maintainers decided to take advantage of the fact that we use
> Git repositories in our development workflow, but the release process we
> settled on isn't designed with that in mind, and changing workflows and
> processes in a developer community that size is sometimes like trying to
> steer a train.

Well, I don't want to just package the generated stuff; I would prefer to
have the tools to generate them myself from source at build time. And
that's what has been bothering me since the beginning: I do not know how
to do that currently, neither for the authorship list nor for the release
notes. In both cases, the Git repo is needed, and that doesn't fit a
packaging workflow at all, unless I embed all of the .git folder in the
source package. This was true in 2010, and still is in 2024...

As a consequence, I decided not to care, as I haven't found a solution to
"build from source", so I'm not packaging the release notes and authorship
list. It's probably my fault that I didn't contribute some fixes to reno,
though.

Cheers,

Thomas Goirand (zigo)
Re: Validating tarballs against git repositories
On 2024-08-28 00:21:27 +0200 (+0200), Thomas Goirand wrote:
[...]
> Well, I don't want to just package the generated stuff; I would prefer
> to have the tools to generate them myself from source at build time.
> And that's what has been bothering me since the beginning: I do not
> know how to do that currently, neither for the authorship list nor for
> the release notes. In both cases, the Git repo is needed, and that
> doesn't fit a packaging workflow at all, unless I embed all of the .git
> folder in the source package. This was true in 2010, and still is in
> 2024...
>
> As a consequence, I decided not to care, as I haven't found a solution
> to "build from source", so I'm not packaging the release notes and
> authorship list. It's probably my fault that I didn't contribute some
> fixes to reno, though.

For release notes, you can run `reno report` in a (non-shallow) Git
checkout of any reno-using project and redirect its stdout to a file.
Alternatively, PBR will call reno itself if it's found in the build
environment when generating an sdist (this may require adding reno to the
install_requires for the project when using build isolation). For example,
invoking `pyproject-build --sdist` will call SetupTools (PBR is a
SetupTools plugin) to generate all of AUTHORS, ChangeLog, and
RELEASENOTES.rst. Maybe pybuild from the dh-python package can do the
same?

But to bring the subthread back on topic: yes, all of this absolutely
depends on having the Git branch history present, because the information
required for all of these is the Git metadata itself. Storing another copy
of the same data in flat files inside the worktree would be both
duplicative and laggy, since some process would need to commit those
files, and there lurks the specter of Gödel's First Incompleteness
Theorem. Much in the same way Debian package maintainers have an aversion
to reusing pre-generated files when they can be trivially recreated at
package build time, some upstream projects have an aversion to checking
generated files into version control if they can be recreated from the
existing contents of version control (not merely the versioned files but
also the accompanying metadata).
--
Jeremy Stanley
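For anyone who wants to try the steps Jeremy describes, a minimal sketch
could look like the following. Nova is used only as an example of a
reno-using project, the sketch assumes reno and the "build" frontend
(providing pyproject-build) are already installed, and the file names are
the ones PBR conventionally generates:

    # Full, non-shallow clone: reno and PBR both read commits and tags from
    # the Git history, so --depth=1 is not enough here.
    git clone https://opendev.org/openstack/nova
    cd nova

    # Assemble the release notes from Git history, redirecting stdout to a
    # file as described above.
    reno report > RELEASENOTES.rst

    # Alternatively, build an sdist: PBR (a SetupTools plugin) regenerates
    # AUTHORS and ChangeLog from the Git metadata while doing so, and the
    # release notes as well when reno is importable in the build environment.
    pyproject-build --sdist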
DEP-18 discussion summary (Re: Request for feedback on draft: DEP-18: Enable true open collaboration on all Debian packages)
Hi!

While I intend to continue iterating on DEP-18, here is a summary for those
who did not wade through the 140+ messages on the topic. Unfortunately, the
summary itself is also a bit long :)

Summary of the mailing list discussion in
https://lists.debian.org/debian-devel/2024/07/msg00429.html

## Overall Sentiment

There was a broad consensus that Debian workflows are too fractured and
would benefit from more standardization and unification. However, there
were differing opinions on the right approach to achieve this.

Soren Stoutner expressed this sentiment clearly:

> 1. Debian workflows are too fractured. The project would be better if we
> asked people to standardize around a single (or a small number) of
> workflows. To do so, the workflow would need to be flexible enough to
> handle the wide range of technical needs of all the packages and upstream
> configurations.
> 2. Standardizing around a single (or small number of) workflows will make
> some people unhappy. But that is an acceptable price to pay because of
> the general benefit to the project, as long as the correct solution is
> adopted. Unity is more important than minority opinions on this
> particular issue.

Similarly, Andrey Rakhmatullin argued that while some may resign, "the
'1000 DDs status quo' problem also means that more people leave than join
_anyway_. Not the unity per se, but having significantly lower barriers to
start contributing" is important.

## Git and GitLab Usage

Multiple participants noted that most other Linux distributions use Git as
the primary version control system, often with GitLab or GitHub for
collaboration. Debian's multi-branch approach with pristine-tar was seen as
somewhat unique.

There were differing views on whether Debian should move closer to the more
common Git-based workflows used elsewhere, or keep its own custom
approaches, of which git-buildpackage and dgit are the most common (both
with multiple ways to use them).

## Mandatory vs Optional Policies

Some participants, like Salvo Tomaselli, felt DEP-18 was too prescriptive
in mandating specific tools and workflows, and that a more flexible,
optional approach would be better:

> Keep in mind that unhappy people quit. I don't think that unity is so
> important that we're willing to sacrifice project members.

Others, including Soren Stoutner, argued that standardization was
important, even if it made some people initially unhappy, as long as the
right solution was adopted: "Unity is more important than minority opinions
on this particular issue."

## Maintainer Workflows

There were concerns that requiring specific Git and GitLab practices could
create burdens for existing maintainers, especially single-person
maintainers. Sean Whitton described his own preferences as a maintainer:

> I am happy to use salsa for git hosting and access management. I love
> that I can easily grant push access to my non-DD team members. But, I
> turn off salsa MRs for the repos of all packages I regularly upload. I
> would hope that this DEP can be written such as to account for these
> sorts of choices.

Fabio Fantoni suggested allowing maintainers to specify their preferred
collaboration methods in a machine-readable way, for example through a
"Collaboration-Policy" field in debian/control.

## Performance and Reliability

Multiple participants, including Salvo Tomaselli, Johannes Schauer Marin
Rodrigues, Andrea Pappacoda, and Gioele Barabucci, complained about
Salsa/GitLab being slow or unreliable at times, which deterred
contribution. Improvements to performance and uptime were seen as
important. In response, Otto Kekäläinen noted that the Salsa admins had
posted about upcoming hardware upgrades and other improvements to address
these issues at https://salsa.debian.org/salsa/support/-/issues.

## Machine-Readable Metadata

Fabio Fantoni and Niels Thykier proposed including more machine-readable
metadata about packaging workflows (e.g. in debian/control) to help
automate contributor onboarding. Niels Thykier outlined some specific
examples of information that could be captured:

> Does this package use `gbp dch` (or some other mechanism) to generate the
> changelog, OR should I include a changelog entry with my patch? Does this
> package use some form of automatic formatting that I should apply when I
> do my changes (if `wrap-and-sort`, then which options)? Does the
> maintainer prefer MRs via salsa, or the BTS with patches, for when I want
> to submit my changes for review?

## Overall

There seemed to be general agreement that improving collaboration is
important, but the right approach is still being debated.

## Mailing list participants

- Jonas Smedegaard
- Salvo Tomaselli
- Luca Boccassi
- Charles Plessy
- Marco d'Itri
- Sean Whitton
- Marc Haber
- Jeremy Stanley
- Shengjing Zhu
- Noah Meyerhans
- PICCA Frederic-Emmanuel
- Fabio Fantoni
- Kentaro Hayashi
- Tobia