On 2019/01/16 19:09, Otto Moerbeek wrote:
> On Wed, Jan 16, 2019 at 01:25:25PM +0000, Stuart Henderson wrote:
> 
> > On 2019/01/04 08:09, Otto Moerbeek wrote:
> > > On Thu, Dec 27, 2018 at 09:39:56AM +0100, Otto Moerbeek wrote:
> > > 
> > > > 
> > > > Very little feedback so far. This diff can only give me valid feedback
> > > > if the coverage of systems and use cases is wide.  If I do not get
> > > > more feedback, I have to base my decisions on my own testing, which
> > > > will benefit my systems and use cases, but might harm yours.
> > > > 
> > > > So, ladies and gentlemen, start your tests!
> > > 
> > > Another reminder. I'd like to make progress on this, which means I
> > > need tests for various use cases.
> > 
> > I have a map-based website I use that is quite good at stressing things
> > (high spin% cpu) and have been timing from opening chromium (I'm using
> > it for the test because it typically performs less well than firefox).
> > Time is real time from starting the browser, set to 'start with previously
> > opened windows' and the page open, until the page reports that it has
> > finished loading (i.e. fetching data from the server and rendering it).
> > 
> > It's not a perfect test - it depends on network/server conditions etc.,
> > and it's a visualisation of conditions in a game, so it may change
> > slightly from run to run - but there shouldn't be huge changes between
> > the times I've run it, and it's a bit more repeatable than a subjective
> > "does the browser feel slow".
> > 
> > 4x "real" cores, Xeon E3-1225v3, 16GB ram (not going into swap).
> > 
> > I've mixed up the test orders so it's not 3x +++, 2x ++, 3x + etc in order,
> > more like +++, -, '', -, ++ etc.
> > 
> >  +++        90      98      68
> >  ++ 85      82
> >  +  87      56      71
> >  '' 76      60      69      88
> >  -  77      74      85
> >  -- 48      86      77      67
> > 
> > So while it's not very consistent, the fastest times I've seen are on
> > runs with fewer pools, and the slowest times on runs with more pools,
> > with '' possibly seeming a bit more consistent from run to run. But
> > there's not enough consistency in any of it to draw a clear conclusion
> > (and I get the impression it would be hard to tell without an automated
> > test that can be repeated many times, with statistical analysis of the
> > results).
> > 
> 
> Thanks for testing. To be clear: this is with the diff I posted and not
> the committed code, right? (There is a small change in the committed
> code to make the default what a single '+' was with the diff.)
> 
>       -Otto
> 

Ah I missed that it was committed (and thought that the diff as sent
was in snapshots) - this was the committed version then.

(It took a while to test as I was trying to think of something where
I actually had a chance of noticing a difference!).
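The statistical analysis mentioned above could start as simply as summarising each setting's runs. A minimal sketch (not part of the original mails), using only the times from the table in this thread:

```python
# Summarise the page-load timings per malloc setting from the table above.
# Times (seconds) are copied directly from the mail; nothing else is assumed.
from statistics import mean, stdev

runs = {
    "+++": [90, 98, 68],
    "++":  [85, 82],
    "+":   [87, 56, 71],
    "''":  [76, 60, 69, 88],
    "-":   [77, 74, 85],
    "--":  [48, 86, 77, 67],
}

for setting, times in runs.items():
    # stdev() needs at least two samples, which every row here has.
    print(f"{setting:>3}  n={len(times)}  "
          f"mean={mean(times):5.1f}s  stdev={stdev(times):4.1f}s")
```

With samples this small and standard deviations this large, the per-setting means overlap heavily, which matches the conclusion in the thread that no clear winner can be called without many more automated runs.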
