Re: Remove clamav-unofficial-sigs

2016-04-06 Thread Mathieu Parent
2016-04-06 6:55 GMT+02:00 Paul Wise :
> On Wed, 2016-04-06 at 00:35 +0200, Marco d'Itri wrote:

I didn't know about those third-party signatures. This is good news for me.

>> I was discussing this yesterday with Paul.
>> While the current package has some issues I believe that it is already
>> quite useful as is, so if the alternative is to remove it from the
>> archive then I am going to adopt it.
>
> It would be great if you could join pkg-clamav and adopt it, as I'm not
> particularly interested at this point in time.
>
> Personally I am still waiting for clamav freshclam to properly support
> third-party signatures, so clamav-unofficial-sigs can be a config file.

Is there a tracking bug for this? How can we help?

>> I have some doubts about the quality of this fork, so I plan to
>> investigate in detail what has changed before blindly adopting it.
>
> I was also unimpressed when I looked at the fork, resulting in me
> putting it further down my TODO pile.

OK

Regards

-- 
Mathieu Parent



Re: Remove clamav-unofficial-sigs

2016-04-06 Thread Paul Wise
On Wed, Apr 6, 2016 at 3:47 PM, Mathieu Parent wrote:
> 2016-04-06 6:55 GMT+02:00 Paul Wise:
>> Personally I am still waiting for clamav freshclam to properly support
>> third-party signatures, so clamav-unofficial-sigs can be a config file.
>
> Is there a tracking bug for this? How can we help?

This was an upstream initiative that now appears to be completely
removed from their website. Some references still exist on archive.org
though:

https://wayback.archive.org/web/http://www.clamav.net/lang/en/2011/07/25/clamav-0-97-2-is-now-available/
https://wayback.archive.org/web/http://www.clamav.net/lang/en/download/cvd/3rdparty/

CCing Luca from the clamav project, perhaps he has some news about this.

-- 
bye,
pabs

https://wiki.debian.org/PaulWise



Re: Packages without long term stable releases

2016-04-06 Thread Raphael Hertzog
On Tue, 05 Apr 2016, Simon McVittie wrote:
> This sounds quite a lot like the "rolling" suite that gets proposed
> every few years, with the possible exception that some proposals
> for "rolling" have had it bypass unstable while unstable is frozen,
> and it sounds as though this doesn't.
> 
> I think this would be good to have, particularly if the backports team
> can treat it as a valid source for backports to stable, at least for
> packages that are not also present in testing. If a package

+1

That said, I'm a bit concerned by the idea of splitting up repositories
when all the packages are combined in unstable at the start.

At that point the testing vs rolling distinction does not make much sense,
because we already have a reasonably up-to-date system on which newer
versions can be packaged.

The problems really start later, when testing has become stable... it gets
harder and harder to build updated versions of the "rolling" packages
as the "core" is no longer getting updates (like the newer libstdc++ or
python that packages such as chromium or firefox tend to require - both
real examples).

That's why it makes sense to filter out leaf packages that have no
explicit LTS support at the time we change testing into stable. We should
avoid making promises that we can't honor.

Thus we need a way to tag packages so that everybody can know whether a
package has proper LTS support (either from upstream or from Debian).

And we need mechanisms/procedures to ensure that all packages which are
build dependencies and libraries are part of the stable core.

Cheers,
-- 
Raphaël Hertzog ◈ Debian Developer

Support Debian LTS: http://www.freexian.com/services/debian-lts.html
Learn to master Debian: http://debian-handbook.info/get/



Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Craig Small
On Wed, Apr 6, 2016 at 3:26 PM Paul Wise  wrote:

> Sounds a lot like some of these should be added to bapase:
>
> https://udd.debian.org/cgi-bin/bapase.cgi


I think you mean https://udd.debian.org/bapase.cgi
but yes, looks like a good idea.
 - Craig
-- 
Craig Small (@smallsees)   http://enc.com.au/   csmall at : enc.com.au
Debian GNU/Linux   http://www.debian.org/   csmall at : debian.org
GPG fingerprint:5D2F B320 B825 D939 04D2  0519 3938 F96B DF50 FEA5


Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Felipe Sateler
On Wed, 06 Apr 2016 00:18:10 +0200, Ondřej Surý wrote:

> Hey,
> 
> while doing some work on PHP transitions, saving courier-imap, finally
> packaging seafile since they finally stopped violating GPL, I found
> quite a lot of bitrot in some (mostly leaf) packages. Packages untouched
> for years after initial upload, packages with unreachable maintainers,
> etc[1].
> 
> I totally understand that our QA team can't solve all of this, but I
> have a couple of automated ideas that might help:

This is something we really need to start thinking about. Tasks that
involve more than a few packages usually require a large number of NMUs,
which is quite sad.

> * Some automated check that would mark the package as outdated. Outdated
> packages won't make it into stable and would be removed from unstable.
> Some indicators that package might be outdated:
>  - big difference (in time, in version numbers?) between upstream
>  version and Debian version
>  - no upload in a long time

s/upload/maintainer upload/

>  - some really outdated standards version
>  - some really outdated dh compat level
>  - using outdated packaging tools (and please don't go into the 1.0 vs
>  3.0 fight again here :-)
>  - something with being a leaf library and not used by anybody else for
>  a long time (combine that with popcon, f.e.?)
>  - other indicators

- Is maintained by the QA group (for longer than X time?)
- Is orphaned (for longer than X time?)
- Is RFA (for longer than X time? Or maybe it should auto-move to
  orphaned)

Essentially, if nobody steps up to maintain the packages, then they 
should go.

- Maintainer does not respond to bug reports in a timely manner (e.g., 1.5
  months, calculated per package).

I think that maintainer responsiveness should be the key metric, not
up-to-dateness (i.e. the maintainer may be holding back for good reasons,
but those reasons should be explained).

This should also help detecting teams that have effectively become empty.

> 
> * Package marked as "outdated" would:
>  a) not be able to enter "stable"
>  b) not be able to enter "testing"
>  c) would be removed from "unstable"

Adding to the testing autoremoval queue would be a great start.
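
As a very rough sketch of what such an automated check might look at
(the thresholds and the scoring below are invented purely for
illustration, and the parsing relies on python-debian), something along
these lines could be run over an unpacked source package:

#!/usr/bin/env python3
# Illustrative sketch only: flag an unpacked source package against a few of
# the "outdated" indicators discussed above.  Thresholds are invented for the
# example; the parsing uses python-debian (python3-debian).
from datetime import datetime, timezone
from debian.changelog import Changelog
from debian.deb822 import Deb822

def staleness_indicators(srcdir):
    indicators = []

    # Last upload date, from the newest debian/changelog entry.
    with open(srcdir + "/debian/changelog") as f:
        cl = Changelog(f, max_blocks=1)
    last_upload = datetime.strptime(cl.date, "%a, %d %b %Y %H:%M:%S %z")
    age_days = (datetime.now(timezone.utc) - last_upload).days
    if age_days > 5 * 365:                      # hypothetical threshold
        indicators.append("no upload for %d days" % age_days)

    # Standards-Version from the source paragraph of debian/control.
    with open(srcdir + "/debian/control") as f:
        source = Deb822(f)
    std = source.get("Standards-Version", "0")
    if std < "3.9":                             # crude string compare, sketch only
        indicators.append("old Standards-Version " + std)

    # debhelper compat level, if the package has debian/compat at all.
    try:
        with open(srcdir + "/debian/compat") as f:
            compat = int(f.read().strip())
        if compat < 7:                          # hypothetical threshold
            indicators.append("old debhelper compat level %d" % compat)
    except FileNotFoundError:
        indicators.append("no debian/compat (not using debhelper?)")

    return indicators

print(staleness_indicators("./somepackage-1.0"))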

-- 
Saludos,
Felipe Sateler



Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Neil Williams
On Wed, 6 Apr 2016 15:27:48 + (UTC)
Felipe Sateler  wrote:

> On Wed, 06 Apr 2016 00:18:10 +0200, Ondřej Surý wrote:
> 
> > Hey,
> > 
> > while doing some work on PHP transitions, saving courier-imap,
> > finally packaging seafile since they finally stopped violating GPL,
> > I found quite a lot of bitrot in some (mostly leaf) packages.
> > Packages untouched for years after initial upload, packages with
> > unreachable maintainers, etc[1].

'Unreachable maintainer' is not the same as 'invalid maintainer email
address' (as the latter is RC). It is much harder to identify "unreachable";
that's why we have an MIA process, after all. So I think that needs to be
dropped from this metric.

> > I totally understand that our QA team can't solve all of this, but I
> > have a couple of automated ideas that might help:  
> 
> This is something we really need to start thinking about. Tasks that
> involve more than a few packages usually require a large number of
> NMUs, which is quite sad.

Removal from testing would be the way to work down the dependency
chain, so the metrics should be based on the version of the package in
testing. Yet there would also need to be a way of stopping the package
from immediately migrating back into testing, as none of the current
migration excuses would apply.

If the idea of marking packages as "not for testing" gets implemented,
then all packages which have been removed from testing, which still
exist in unstable and which cannot migrate back into testing but which
are not marked as "not for testing" could then be removed from the
archive (unstable) after a specified period of time. That may actually
be sufficient - it starts with leaf packages, then next time around,
there's a whole new bunch of leaf packages that were formerly used by
the first bunch to be removed.

(Before you know it, Debian fits on a single CD again - all of Debian.)
;-)

> > * Some automated check that would mark the package as outdated.
> > Outdated packages won't make it into stable and would be removed
> > from unstable. Some indicators that package might be outdated:
> >  - big difference (in time, in version numbers?) between upstream
> >  version and Debian version

This only matters if someone cares enough to file a "please upgrade" bug.
Upstream could change the versioning scheme and completely throw off the
metric, e.g. 0.1 uploaded in Debian, 2016.4 upstream.

> >  - no upload in a long time  
> 
> s/upload/maintainer upload/

One key part of the metric would be >2 NMUs without maintainer upload.

"No maintainer upload" alone is an insufficient indicator - uploading every
package once a year "just because" does not help anyone. It's another reason
why simply having an outdated Standards-Version is also insufficient.
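
A quick sketch of how that ">2 NMUs" count could be derived from
debian/changelog (the NMU detection below, based on the "non-maintainer
upload" phrase or an NMU-style revision, is only a heuristic, again
using python-debian):

#!/usr/bin/env python3
# Sketch of the ">2 NMUs without a maintainer upload" idea: count consecutive
# NMU-looking entries at the top of debian/changelog.  The detection here is
# a heuristic, not an exact rule.
import re
from debian.changelog import Changelog

def leading_nmu_count(changelog_path):
    with open(changelog_path) as f:
        cl = Changelog(f)
    count = 0
    for block in cl:
        text = "\n".join(block.changes()).lower()
        version = str(block.version)
        is_nmu = ("non-maintainer upload" in text
                  or "+nmu" in version
                  or re.search(r"-[0-9.]+\.[0-9]+$", version))
        if not is_nmu:
            break            # reached the last maintainer upload
        count += 1
    return count

n = leading_nmu_count("./somepackage-1.0/debian/changelog")
print("%d NMU(s) since the last maintainer upload" % n)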

> >  - some really outdated standards version
> >  - some really outdated dh compat level
> >  - using outdated packaging tools (and please don't go into the 1.0
> > vs 3.0 fight again here :-)
> >  - something with being a leaf library and not used by anybody else
> > for a long time (combine that with popcon, f.e.?)

Auto-removal from testing covers that issue - work down the dependency
chain. Don't rely on popcon: it is indicative only and cannot be used
in any automated metric. Reverse dependencies are what matter here for
determining "usage", more accurately read as "necessary".
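
As an example of the kind of data that could feed such a check, a rough
sketch that lists direct reverse dependencies from the local apt cache
via python-apt (the target package name is purely illustrative):

#!/usr/bin/env python3
# Sketch: list binary packages whose candidate version declares a direct
# dependency on a given package, using python-apt against the local apt
# cache.  The target package name below is only an example.
import apt

def reverse_depends(target):
    cache = apt.Cache()
    rdeps = set()
    for pkg in cache:
        cand = pkg.candidate
        if cand is None:
            continue
        for dep in cand.dependencies:
            if any(base.name == target for base in dep.or_dependencies):
                rdeps.add(pkg.name)
    return sorted(rdeps)

rdeps = reverse_depends("libpng16-16")
print("%d packages depend on libpng16-16" % len(rdeps))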

> >  - other indicators  
> 
> - Is maintained by the QA group (for longer than X time?)
> - Is orphaned (for longer than X time?)
> - Is RFA (for longer than X time? Or maybe it should auto-move to
>   orphaned)
> 
> Essentially, if nobody steps up to maintain the packages, then they 
> should go.
> 
> - Maintainer does not respond to bug reports in a timely manner (e.g.,
> 1.5 months, calculated per package).
> 
> I think that maintainer responsiveness should be the key metric, not
> up-to-dateness (i.e. the maintainer may be holding back for good
> reasons, but those reasons should be explained).

That could lead to a lot of ping messages in bug reports which might
not be that useful. It could also lead to maintainers closing bugs
which may have previously been left open as wontfix or wishlist. The
severity of the bug may need to be considered.

How do we assess responsiveness on those packages which have 0 bugs?

This does need to be about the package quality, not the maintainer. If
there is a stack of bugs with no response, it is very different to a
package with a couple of wishlist issues. So more than just
responsiveness, it needs to take account of the number and severity of
the bugs to which there has not been a response. There may also need to
be some protection from the implications of severity-ping-pong.
Overall, I think this is an unreliable metric and should not be used.

> This should also help detecting teams that have effectively become
> empty.

That is not the same as low quality packages.

Packages with NMUs not resolved by the maintainer are a much better
metric. The bugs are closed, so responsiveness would not be counted,
but the pack

Bug#820217: ITP: mercurial-extension-utils -- This module contains a group of reusable functions for writing Mercurial extensions.

2016-04-06 Thread Christoph Mathys
Package: wnpp
Severity: wishlist
Owner: Christoph Mathys 

* Package name: mercurial-extension-utils
  Version : 1.2.0
  Upstream Author : Marcin Kasperski 
* URL : https://pypi.python.org/pypi/mercurial_extension_utils
* License : BSD
  Programming Lang: Python
  Description : This module contains functions for writing Mercurial 
extensions.

Contains functions used by the Mercurial extension mercurial-keyring. They
are mostly tiny utilities related to configuration processing or
location matching. They either extend the Mercurial APIs a bit (like a
function to iterate over config items which match a regexp), or support
tasks which aren't strictly Mercurial-related, but happen repeatedly
during extension writing (like



Re: MBF Announcement: Transition libpng12 -> libpng16

2016-04-06 Thread Tobias Frost
Hello -devel,

Note that libpng1.6 is now in sid, so the libpng 1.6 transition has
finally started. 

To keep the transition short, please keep an eye on your packages; of course
we will also do NMUs when needed.

The transition tracker is here:
https://release.debian.org/transitions/html/libpng1.6.html


As announced, I will now raise the remaining bugs to RC severity; none of
the affected packages are in testing right now:

#814879: timidity: FTBFS with libpng16 / does not specify libpng-dev B-D,
 
#809874: root-system: FTBFS with libpng16, 

#810209: yt: Please update dependency on libpng-dev

#741894: libtk-img: FTBFS with libpng16,

#809935: fw4spl: FTBFS with libpng16,
 
#816115: openvrml-dev: Please depend on libpng-dev instead of libpng12-dev

--
tobi



Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Wookey
+++ Ondřej Surý [2016-04-06 00:18 +0200]:
> Hey,
> 
> while doing some work on PHP transitions, saving courier-imap, finally
> packaging seafile since they finally stopped violating GPL, I found
> quite a lot of bitrot in some (mostly leaf) packages. Packages untouched
> for years after initial upload, packages with unreachable maintainers,
> etc[1].

As a porter I've seen a lot of this too. 
 
> I have a feeling that we are hoarding packages, but the overall quality
> varies a lot (not pointing fingers here). The feeling I have now was the
> same when I was doing the Berkeley DB transition (and I really wish I had
> just filed a couple more ROMs/RQAs then instead of fixing the outdated
> software in the archive).

Well, I don't think it's necessarily bad when someone who has nothing
to do with the package just fixes an issue they know about (I've done
quite a lot of 'make dh_autoreconf work' and 'make multiarch work' and
'make cross-building work' NMUs for example, as well as more specific
'make work on arm*' uploads). It's understandable that maintainers
often ignore that stuff because they don't understand it, and don't
want to break things.

What I don't know is whether anyone in the world actually uses this
software or cares about it, and the relative benefits of updating it or
removing it. Am I completely wasting my time fixing up some old package
that doesn't build on arm64?

> * Not really sure if we have packages so "rock-stable" that they still
> work even though they haven't been touched in years,

I think we do have some of these, but I don't know which ones they
are...

What is frustrating about largely-unmaintained software is that it
would often be much quicker and easier to fix most of the issues by
throwing away the old packaging and just making it a dh package (or at
least one using debhelper at all). But this is strongly discouraged in
NMUs, so it adds a lot of work for porters, who have to do things the
hard way, and who in an old package often find a load of other things
have broken along the way, which then have to be fixed too in order to
upload. Here is a good example where what should have been a simple
autoreconf patch led to a rabbit-hole of woe due to accumulated FTBFS
problems with new gcc and java:
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=755840

openvrml used to be important, but maybe it's not worth the effort to
fix anymore (although I've now done 90% of it - if someone did the
last 10% it'd be good for another release at least). It does seem
clear that the maintainer isn't taking much/any notice.

> * Some automated check that would mark the package as outdated. 

I would certainly find this useful as some kind of metric.  I'm not
sure I agree with all your scoring items in detail, but they are clearly
indicative.

We have got a lot of cruft, and perhaps removing some of it is
actually a sensible use of people's time.

> .. perhaps be more aggressive in
> removing software that's no longer useful and just lies in the archive
> dormant.

The fact that Debian has a lot of software is a genuine benefit. Just
because stuff is old, does not mean it is no longer useful. The
problem is that we don't really know how to distinguish between
old-and-just-cruft and old-and-still-handy.

I do agree that we could remove more than we currently do, probably with very
little real fallout, and a corresponding increase in overall quality.

Wookey
-- 
Principal hats:  Linaro, Debian, Wookware, ARM
http://wookware.org/




Bug#820245: ITP: groestlcoin -- peer-to-peer network based digital currency

2016-04-06 Thread Jonas Smedegaard
Package: wnpp
Severity: wishlist
Owner: Jonas Smedegaard 


* Package name: groestlcoin
  Version : 2.11.0
  Upstream Author : The Groestlcoin developers
* URL : http://www.groestlcoin.org/
* License : Expat
  Programming Lang: C++
  Description : peer-to-peer network based digital currency

 Groestlcoin is an experimental new digital currency that enables
 instant payments to anyone, anywhere in the world. Groestlcoin uses
 peer-to-peer technology to operate with no central authority: managing
 transactions and issuing money are carried out collectively by the
 network. Groestlcoin Core is the name of open source software which
 enables the use of this currency.
 .
 Grøstl is a cryptographic hash function submitted to a NIST competition
 - and was chosen as one of the five finalists of the competition.  The
 name is a word play on the researchers being from Austria (the dish
 "gröstl") and Denmark (the letter "ø").

The package will be maintained in the Debian Bitcoin Team.




Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Steffen Möller


On 06/04/16 21:19, Wookey wrote:
>> > .. perhaps be more aggressive in
>> > removing software that's no longer useful and just lies in the archive
>> > dormant.
> The fact that Debian has a lot of software is a genuine benefit. Just
> because stuff is old, does not mean it is no longer useful. The
> problem is that we don't really know how to distinguish between
> old-and-just-cruft and old-and-still-handy.
The popcon stats may help.

For the packages in Debian Science and Debian Med, I tend to think that
we accommodate a bunch of packages that are mostly of historic value
now. People may use them to check how well their new methods compare
against the old stuff, but the packages themselves may not be used that
much, and the authors never did much maintenance beyond their scientific
questions anyway - which is also because of the grant-driven funding
schemes and the scientists moving institutions after some 1-5 years.
Those archaeological gems I consider to be valuable, in particular when
the original binaries were only offered for the then-common but today
unseen platforms like DEC, SGI and Sun. So, we have old-and-just-cruft,
old-and-still-handy, and some first-step-on-the-moon kind of packages.

Steffen

That said, now, with Debian-Astro established, might we possibly find
someone to adopt the code and emulators for the Apollo missions
(http://www.ibiblio.org/apollo/) for us? And, no, I do not really think
that footprints on the moon are good for much scientific benchmarking.
Although, who knows, some extraterrestrials may find those more easily
accessible than any such on earth. Uh, bed time.



Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Brian May
Felipe Sateler  writes:
>>  - no upload in a long time
>
> s/upload/maintainer upload/

In the past I have maintained some important packages by doing regular
NMUs when the maintainer is not responsive (including sending emails asking
to take over the package). So just because the *maintainer* hasn't made an
take over the package). So just because the *maintainer* hasn't made an
upload in a long time doesn't mean that nobody cares about the package.

If there aren't RC bugs to be fixed and the package is up-to-date, an
unresponsive maintainer still doesn't mean nobody cares about the
package; however, it could mean another maintainer (or team maintenance)
is desirable.
-- 
Brian May 



Bug#820256: ITP: esp-r -- Building performance modelling software

2016-04-06 Thread Wookey
Package: wnpp
Severity: wishlist
Owner: Wookey 

* Package name: esp-r
  Version : 12.3
  Upstream Author : Energy Systems Research Unit, University of Strathclyde
* URL : http://www.esru.strath.ac.uk/Programs/ESP-r_central.htm
* License : GPL2+
  Programming Lang: C, Fortran
  Description : Building performance modelling software

 ESP-r is a multi-domain building performance simulation program. It
 can model heat, air, moisture, light and electrical power flows at
 user-specified spatial and temporal resolution. It comprises a
 central Project Manager around which are arranged support databases,
 a simulator, various performance assessment tools and a variety of
 third party applications for CAD, visualisation and report
 generation.


This is useful scientific/building software, and the only Free
Software for doing thermal modelling on GNU/Linux (until Therm get
round to their proposed licence change 'sometime in the next two
years').

It has taken me several years (of very sporadic efforts) to get this
packaged as upstream have such a shitty build system. It has improved
significantly (to merely crappy) since I first started prodding it so
the effort is now much less herculean.



Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Ben Hutchings
On Thu, 2016-04-07 at 01:05 +0200, Steffen Möller wrote:
> 
> On 06/04/16 21:19, Wookey wrote:
> > 
> > > 
> > > > 
> > > > .. perhaps be more aggressive in
> > > > removing software that's no longer useful and just lies in the archive
> > > > dormant.
> > The fact that Debian has a lot of software is a genuine benefit. Just
> > because stuff is old, does not mean it is no longer useful. The
> > problem is that we don't really know how to distinguish between
> > old-and-just-cruft and old-and-still-handy.
> The popcon stats may help.
> 
> For the packages in Debian Science and Debian Med I tend to think that
> it accommodates a bunch of packages that mostly are of historic value
> now. People may  use them to compare how well their new methods compare
> against the old stuff
[...]

Given the low quality and lack of unit tests in many scientific
applications, how confident can we be that the 'old' packages (that
have now built with newer toolchains and libraries) actually still
produce the same results they used to?  If we are not, even that
historic value is lost.

Ben.

-- 
Ben Hutchings
Who are all these weirdos? - David Bowie, reading IRC for the first time



Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Ben Hutchings
On Thu, 2016-04-07 at 01:02 +, Potter, Tim (HPE Linux Support)
wrote:
> On 7 Apr 2016, at 10:52 AM, Ben Hutchings 
> wrote:
> 
> > 
> > Given the low quality and lack of unit tests in many scientific
> > applications, how confident can we be that the 'old' packages (that
> > have now built with newer toolchains and libraries) actually still
> > produce the same results they used to?  If we are not, even that
> > historic value is lost.
> 
> Full archive rebuilds are done every so often.  The switchover to the
> gcc-5 toolchain
> was an example and everything was rebuilt at least once during that
> time.  My understanding
> is that packages are dropped if they don't build in this case, and
> no-one steps up to fix them
> within a reasonable (months) period of time.

You are missing the point, which is that while they still build with
the new toolchain (possibly after a developer without intimate
knowledge of the program makes a best-effort fix) we don't know that
they behave the same way.

Ben.

-- 
Ben Hutchings
Who are all these weirdos? - David Bowie, reading IRC for the first time




Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Potter, Tim (HPE Linux Support)
On 7 Apr 2016, at 10:52 AM, Ben Hutchings  wrote:

> Given the low quality and lack of unit tests in many scientific
> applications, how confident can we be that the 'old' packages (that
> have now built with newer toolchains and libraries) actually still
> produce the same results they used to?  If we are not, even that
> historic value is lost.


Full archive rebuilds are done every so often.  The switchover to the gcc-5
toolchain was an example and everything was rebuilt at least once during that
time.  My understanding is that packages are dropped if they don't build in
this case, and no-one steps up to fix them within a reasonable (months)
period of time.


Tim.




Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Potter, Tim (HPE Linux Support)
On 7 Apr 2016, at 11:18 AM, Ben Hutchings  wrote:
> On Thu, 2016-04-07 at 01:02 +, Potter, Tim (HPE Linux Support)
> wrote:
>> On 7 Apr 2016, at 10:52 AM, Ben Hutchings 
>> wrote:
>>> Given the low quality and lack of unit tests in many scientific
>>> applications, how confident can we be that the 'old' packages (that
>>> have now built with newer toolchains and libraries) actually still
>>> produce the same results they used to?  If we are not, even that
>>> historic value is lost.
>> 
>> Full archive rebuilds are done every so often.  The switchover to the
>> gcc-5 toolchain
>> was an example and everything was rebuilt at least once during that
>> time.  My understanding
>> is that packages are dropped if they don't build in this case, and
>> no-one steps up to fix them
>> within a reasonable (months) period of time.
> 
> You are missing the point, which is that while they still build with
> the new toolchain (possibly after a developer without intimate
> knowledge of the program makes a best-effort fix) we don't know that
> they behave the same way.

OK - good point.  I wonder if there is any information about how many packages
run unit tests?  It would be interesting to see the data.

I get the impression that more upstream packages have built-in tests that
can be run as part of dpkg-buildpackage (e.g Python, Perl, Ruby, Java and
Go) but maybe that's just because I've been working in those environments
recently.


Tim.




Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Neil Williams
On Thu, 7 Apr 2016 01:36:51 +
"Potter, Tim (HPE Linux Support)"  wrote:

> On 7 Apr 2016, at 11:18 AM, Ben Hutchings  wrote:
> > On Thu, 2016-04-07 at 01:02 +, Potter, Tim (HPE Linux Support)
> > wrote:  
> >> On 7 Apr 2016, at 10:52 AM, Ben Hutchings 
> >> wrote:  
> >>> Given the low quality and lack of unit tests in many scientific
> >>> applications, how confident can we be that the 'old' packages
> >>> (that have now built with newer toolchains and libraries)
> >>> actually still produce the same results they used to?  If we are
> >>> not, even that historic value is lost.  
> >> 
> >> Full archive rebuilds are done every so often.  The switchover to
> >> the gcc-5 toolchain
> >> was an example and everything was rebuilt at least once during that
> >> time.  My understanding
> >> is that packages are dropped if they don't build in this case, and
> >> no-one steps up to fix them
> >> within a reasonable (months) period of time.  
> > 
> > You are missing the point, which is that while they still build with
> > the new toolchain (possibly after a developer without intimate
> > knowledge of the program makes a best-effort fix) we don't know that
> > they behave the same way.  
> 
> OK - good point.  I wonder if there is any information about how many
> packages run unit tests?  It would be interesting to see the data.

Not for packages which build and execute unit tests during the build -
although failures there cause an FTBFS, so the archived bug history
will give some indication of whether the tests are likely to fail with
toolchain changes.

Clearer records exist at ci.debian.net.

> I get the impression that more upstream packages have built-in tests
> that can be run as part of dpkg-buildpackage (e.g Python, Perl, Ruby,
> Java and Go) but maybe that's just because I've been working in those
> environments recently.

Not all unit test suites can run during the build.
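
A crude way to put a number on the declared-test side would be to count
"Testsuite:" fields in a Sources index (a sketch using python-debian; the
file path is illustrative, and this says nothing about test suites that
only run at build time):

#!/usr/bin/env python3
# Sketch: count source packages declaring a "Testsuite:" field (what
# ci.debian.net picks up) in an uncompressed Sources index.
from debian.deb822 import Sources

total = with_tests = 0
with open("Sources") as f:    # e.g. dists/unstable/main/source/Sources
    for src in Sources.iter_paragraphs(f):
        total += 1
        if "autopkgtest" in src.get("Testsuite", ""):
            with_tests += 1

print("%d/%d source packages declare an autopkgtest suite"
      % (with_tests, total))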

-- 


Neil Williams
=
http://www.linux.codehelp.co.uk/





Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Neil Williams
On Thu, 7 Apr 2016 01:05:48 +0200
Steffen Möller  wrote:

> On 06/04/16 21:19, Wookey wrote:
> >> > .. perhaps be more aggressive in
> >> > removing software that's no longer useful and just lies in the
> >> > archive dormant.  
> > The fact that Debian has a lot of software is a genuine benefit.
> > Just because stuff is old, does not mean it is no longer useful. The
> > problem is that we don't really know how to distinguish between
> > old-and-just-cruft and old-and-still-handy.  
> The popcon stats may help.

Really, NO.

popcon is indicative only and the stats are only useful as metrics when
popcon counts become "significant". Any popcon score which is less than
1% or 5% of the archive is not much better than a guess. The problem is
that those are precisely the packages where removal is likely.

Also, packages may provide services to a lot more users than just the
ones who have both the package and popcon installed. There are many
packages which depend on a webserver of some kind - a single install can
serve many thousands of users.

popcon data can be handy for humans who bother to dig into the
package-specific context, but it is not sufficiently reliable for
automated metrics.

-- 


Neil Williams
=
http://www.linux.codehelp.co.uk/





Re: Overall bitrot, package reviews and fast(er) unmaintained package removals

2016-04-06 Thread Ole Streicher
"Potter, Tim (HPE Linux Support)"  writes:
> On 7 Apr 2016, at 11:18 AM, Ben Hutchings  wrote:
>> You are missing the point, which is that while they still build with
>> the new toolchain (possibly after a developer without intimate
>> knowledge of the program makes a best-effort fix) we don't know that
>> they behave the same way.
>
> OK - good point.  I wonder if there is any information about how many packages
> run unit tests?  It would be interesting to see the data.

When we speak about "historic" science packages, they often don't have
this. What they usually have is a kind of manual test, where one has to
look at some graphs, interpret them and decide whether the result is OK.

And usually the code quality is quite poor; there are workarounds and
uncommented dirty hacks to speed things up that really cannot give
exactly the same result on modern computers.

However, scientists still trust these programs, and so it makes sense
to keep them. And often, there is just no modern replacement.

Best regards

Ole