On Thu, 28.02.2019 at 04:13, Joshua Kinard wrote:
> On 2/25/2019 05:18, Alexander Tsoy wrote:
> > On Mon, 25/02/2019 at 13:07 +0300, Alexander Tsoy wrote:
> > > On Thu, 21/02/2019 at 04:36 -0500, Joshua Kinard wrote:
> > > > Does anyone have an idea why util-linux's build time would go up
> > > > significantly from 2.32.x to 2.33.x?  It may be a MIPS thing, as
> > > > my x86_64 box shows no discernible change in build times between
> > > > the same versions.  Can any other archs check w/ genlop to see if
> > > > they see a large jump in build time?
> > > > 
> > > > 'genlop -t util-linux' output on my SGI system (some entries
> > > > removed for brevity):
> > > > 
> > > >      Thu Feb  1 11:26:33 2018 >>> sys-apps/util-linux-2.31.1
> > > >        merge time: 27 minutes and 48 seconds.
> > > > 
> > > >      Sat Mar 31 08:07:20 2018 >>> sys-apps/util-linux-2.32
> > > >        merge time: 28 minutes and 44 seconds.
> > > > 
> > > >      Mon Aug 27 06:21:30 2018 >>> sys-apps/util-linux-2.32.1
> > > >        merge time: 32 minutes and 58 seconds.
> > > > 
> > > >      Tue Nov 13 10:03:58 2018 >>> sys-apps/util-linux-2.33
> > > >        merge time: 1 hour, 19 minutes and 49 seconds.
> > > > 
> > > >      Fri Jan 11 09:20:21 2019 >>> sys-apps/util-linux-2.33.1
> > > >        merge time: 1 hour, 23 minutes and 37 seconds.
> > > > 
> > > >      Thu Feb 21 04:14:33 2019 >>> sys-apps/util-linux-2.33.1
> > > >        merge time: 1 hour, 25 minutes and 15 seconds.
> > > > 
> > > 
> > > 2.33 was changed to use the python-r1 eclass instead of
> > > python-single-r1. And the increase in build time seems to be
> > > caused by an out-of-source build for each Python implementation.
> > > Some libraries are built several times (for the native ABI + for
> > > each Python implementation).
> > 
> > And there is an additional configure run for each Python
> > implementation.
> 
> Hmm, this might explain things, somewhat.  I think there's possibly
> some truth to the getcwd bit discussed earlier, but that may be
> limited to glibc only.

Right, util-linux doesn't conduct that test; coreutils and tar do,
and maybe some others.

That doesn't mean running with sandbox doesn't have a slowdown effect -
it most certainly does, just hopefully not as drastic as in that
particular case. That one involves glibc's own generic getcwd being
slow with long paths, and sandbox calling it three times for its
access checks even for a mkdir call, just to error out with
ENAMETOOLONG, in addition to the many getcwd calls the configure check
itself does. So it's slow even without sandbox, but with sandbox that
slowness is doubled or more.

That has made me wonder if, by hitting some more ENAMETOOLONGs
earlier, the test could finish earlier instead of slowly spinning
through paths whose length falls between PATH_MAX and PATH_MAX*2,
which is where it's slow.

But I'm not sure why these m4 macros seem to call getcwd after each
mkdir+chdir etc. just to get a boolean configure check result. I
didn't look into that specific case; I only debugged a test case that
just loops mkdir+chdir.

Someone should maybe convert these projects to meson and do that check
smarter :D

> util-linux-2.33.1 on my uclibc-ng chroot took about ~25mins.  Have to
> re-time the glibc build to see if it's something w/ the libc
> implementation.
> 
> Temp workaround I guess is to cut down on the PYTHON_TARGETS before
> my next catalyst attempt.  2.7 + 3.7 should be enough...

Personally, I seem to get by with just USE=-python on util-linux (the
flag actually isn't globally enabled on my systems, it seems).
Otherwise, sure, trimming the targets helps if the slowness is in
configure.
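For reference, the two workarounds discussed in this thread would look
roughly like this in Portage's configuration (file paths and the
target list here are illustrative, not from the original mails):

```
# /etc/portage/package.use/util-linux -- drop the Python bindings
# entirely, so configure and the build run only once:
sys-apps/util-linux -python

# Or, in /etc/portage/make.conf, trim the global target list:
PYTHON_TARGETS="python2_7 python3_7"
```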


Mart
