The good-sounding thing - replaying production logs - you can do that easily with 
JMeter, either from the GUI or from the command line.

Cheers,

Siegfried Goeschl

> On 06 Apr 2015, at 21:35, Davis, Daniel (NIH/NLM) [C] <daniel.da...@nih.gov> 
> wrote:
> 
> This sounds really good:
> 
> "For load testing, we replay production logs to test that we meet the SLA at 
> a given traffic level."
> 
> The rest sounds complicated.   Ah well, that's the job.
> 
> -----Original Message-----
> From: Walter Underwood [mailto:wun...@wunderwood.org] 
> Sent: Monday, April 06, 2015 2:48 PM
> To: solr-user@lucene.apache.org
> Subject: Re: Measuring QPS
> 
> We built a servlet request filter that is configured in front of the Solr 
> servlets. It reports response times to metricsd, using the Codahale library.
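> 
> A minimal sketch of that kind of filter, assuming the Dropwizard (Codahale) Metrics 
> library; the class, registry, and metric names below are illustrative only, not our 
> actual code:
> 
>     import java.io.IOException;
>     import javax.servlet.*;
>     import com.codahale.metrics.MetricRegistry;
>     import com.codahale.metrics.Timer;
> 
>     // Illustrative filter: wraps every request to the Solr servlets in a timer.
>     public class ResponseTimeFilter implements Filter {
> 
>         private final MetricRegistry registry = new MetricRegistry();
>         private Timer requests;
> 
>         @Override
>         public void init(FilterConfig config) {
>             // Counts, rates, and percentiles are all derived from this one timer.
>             requests = registry.timer("solr.requests");
>         }
> 
>         @Override
>         public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
>                 throws IOException, ServletException {
>             final Timer.Context ctx = requests.time();
>             try {
>                 chain.doFilter(req, resp);   // let Solr handle the request
>             } finally {
>                 ctx.stop();                  // record the elapsed time
>             }
>         }
> 
>         @Override
>         public void destroy() {}
>     }
> 
> Shipping the numbers to metricsd/Graphite is then a matter of attaching one of the 
> stock reporters to the registry.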
> 
> That gives us counts, rates, and response time metrics. We mostly look at 
> percentiles, because averages are thrown off by outliers. Average is just the 
> wrong metric for a one-sided distribution like response times.
> 
> We use Graphite to display the 95th percentile response time for each request 
> handler. We use Tattle for alerting on those metrics.
> 
> We also use New Relic for a different look at the performance. It is good at 
> tracking from the front end through to Solr.
> 
> For load testing, we replay production logs to test that we meet the SLA at a 
> given traffic level.
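> 
> A bare-bones sketch of that kind of replay, assuming the SolrJ 4.x client (to match 
> the versions in this thread) and a file with one raw query string per line; the file 
> name and core URL are placeholders:
> 
>     import java.nio.charset.StandardCharsets;
>     import java.nio.file.Files;
>     import java.nio.file.Paths;
>     import java.util.List;
>     import org.apache.solr.client.solrj.SolrQuery;
>     import org.apache.solr.client.solrj.impl.HttpSolrServer;
>     import org.apache.solr.client.solrj.response.QueryResponse;
> 
>     // Illustrative replay loop: fires logged queries at a test core and prints QTime.
>     public class LogReplay {
>         public static void main(String[] args) throws Exception {
>             HttpSolrServer solr = new HttpSolrServer("http://localhost:8983/solr/test");
>             List<String> queries =
>                     Files.readAllLines(Paths.get("production-queries.txt"), StandardCharsets.UTF_8);
>             for (String q : queries) {
>                 QueryResponse rsp = solr.query(new SolrQuery(q));
>                 System.out.println(rsp.getQTime() + " ms  q=" + q);
>             }
>             solr.shutdown();
>         }
>     }
> 
> A single-threaded loop like this only checks correctness and rough latency; for real 
> SLA testing you would drive it at the production request rate with multiple threads 
> or a load tool.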
> 
> Walter Underwood
> wun...@wunderwood.org
> http://observer.wunderwood.org/  (my blog)
> 
> On Apr 6, 2015, at 11:31 AM, Davis, Daniel (NIH/NLM) [C] 
> <daniel.da...@nih.gov> wrote:
> 
>> OK,
>> 
>> I have a lot of chutzpah posting that here ;)    The other guys answering 
>> the questions can probably explain it better.
>> I love showing off, however, so please forgive me.
>> 
>> -----Original Message-----
>> From: Davis, Daniel (NIH/NLM) [C]
>> Sent: Monday, April 06, 2015 2:25 PM
>> To: solr-user@lucene.apache.org
>> Subject: RE: Measuring QPS
>> 
>> It's very common to do autocomplete based on popular queries/titles over some 
>> sliding time window.   Some enterprise search systems even apply age weighting 
>> so that they don't need to re-index but can continuously add to the index.   
>> This way, they can do autocomplete based on what's popular these days.
>> 
>> We use relevance/field boosts/phrase matching etc. to get the best guess about 
>> what results users want to see.   This is similar - we use relevance and field 
>> boosting to guess what users want to search for.   Zipf's law applies to 
>> searches as well as to results.
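>> 
>> As an example, a guess-what-they-want query with field boosts and phrase matching 
>> might look like this with SolrJ and the edismax parser (the field names and boost 
>> values are just placeholders, not our production settings):
>> 
>>     import org.apache.solr.client.solrj.SolrQuery;
>> 
>>     public class BoostedQueryExample {
>>         // Illustrative edismax query: boost title matches, boost exact phrases even more.
>>         static SolrQuery buildQuery(String userInput) {
>>             SolrQuery query = new SolrQuery(userInput);
>>             query.set("defType", "edismax");
>>             query.set("qf", "title^4 keywords^2 body");   // per-field boosts
>>             query.set("pf", "title^10 body^2");           // phrase-match boosts
>>             query.setRows(10);
>>             return query;
>>         }
>>     }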
>> 
>> -----Original Message-----
>> From: Siegfried Goeschl [mailto:sgoes...@gmx.at]
>> Sent: Monday, April 06, 2015 2:17 PM
>> To: solr-user@lucene.apache.org
>> Subject: Re: Measuring QPS
>> 
>> Hi Daniel,
>> 
>> interesting - I never thought of using it for autocompletion, only for keeping 
>> track of user behaviour :-)
>> 
>> * the numbers are helpful for the online advertisement team to sell 
>> campaigns
>> * it is used for sanity checks - sensible queries returning no results 
>> or returning too many results
>> 
>> Cheers,
>> 
>> Siegfried Goeschl
>> 
>>> On 06 Apr 2015, at 20:04, Davis, Daniel (NIH/NLM) [C] 
>>> <daniel.da...@nih.gov> wrote:
>>> 
>>> Siegfried,
>>> 
>>> It is early days as yet.   I don't think we need a code drop.   AFAIK, none 
>>> of our current Solr applications autocomplete the search box based on 
>>> popular query/title keywords.   We have other applications that do that, 
>>> but they don't use Solr.
>>> 
>>> Thanks again,
>>> 
>>> Dan
>>> 
>>> -----Original Message-----
>>> From: Siegfried Goeschl [mailto:sgoes...@gmx.at]
>>> Sent: Monday, April 06, 2015 1:42 PM
>>> To: solr-user@lucene.apache.org
>>> Subject: Re: Measuring QPS
>>> 
>>> Hi Dan,
>>> 
>>> at willhaben.at (a customer of mine) two SOLR components were written 
>>> for SOLR 3 and ported to SOLR 4:
>>> 
>>> 1) SlowQueryLog, which dumps long-running search requests into a log file 
>>> (a rough sketch follows after this list)
>>> 
>>> 2) Most Frequent Search Terms, which allows querying & filtering the most 
>>> frequent user search terms from the browser
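>>> 
>>> A rough sketch of the SlowQueryLog idea as a SearchComponent (this is not the 
>>> willhaben code, just one plausible shape for it; the threshold and names are 
>>> made up):
>>> 
>>>     import java.io.IOException;
>>>     import org.apache.solr.handler.component.ResponseBuilder;
>>>     import org.apache.solr.handler.component.SearchComponent;
>>>     import org.slf4j.Logger;
>>>     import org.slf4j.LoggerFactory;
>>> 
>>>     // Illustrative last-component: logs any request that has already taken too long
>>>     // by the time this component runs.
>>>     public class SlowQueryLogComponent extends SearchComponent {
>>> 
>>>         private static final Logger log = LoggerFactory.getLogger(SlowQueryLogComponent.class);
>>>         private static final long THRESHOLD_MS = 1000;   // made-up threshold
>>> 
>>>         @Override
>>>         public void prepare(ResponseBuilder rb) throws IOException {
>>>             // nothing to do before the query runs
>>>         }
>>> 
>>>         @Override
>>>         public void process(ResponseBuilder rb) throws IOException {
>>>             long elapsed = System.currentTimeMillis() - rb.req.getStartTime();
>>>             if (elapsed > THRESHOLD_MS) {
>>>                 log.warn("slow query ({} ms): {}", elapsed, rb.req.getParamString());
>>>             }
>>>         }
>>> 
>>>         @Override
>>>         public String getDescription() {
>>>             return "Logs long-running search requests";
>>>         }
>>> 
>>>         // Depending on the SOLR version, a few more SolrInfoMBean methods may need stubs.
>>>     }
>>> 
>>> It would be registered as a searchComponent in solrconfig.xml and appended to the 
>>> request handler's last-components list.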
>>> 
>>> Some notes along the way:
>>> 
>>> 
>>> * For both components I have the go-ahead to open-source them, but I never 
>>> had enough time to do that (shame on me) - see 
>>> https://issues.apache.org/jira/browse/SOLR-4056
>>> 
>>> * The Most Frequent Search Terms component actually mimics a SOLR server that 
>>> you feed the user search terms into, so this might be a better solution in the 
>>> long run. But it requires a separate SOLR core & ingest, plus a GUI (check out 
>>> SILK or ELK) - in other words, more moving parts in production :-)
>>> 
>>> * If there is sufficient interest I can make a code drop on GitHub
>>> 
>>> Cheers,
>>> 
>>> Siegfried Goeschl
>>> 
>>> 
>>> 
>>>> On 06 Apr 2015, at 16:25, Davis, Daniel (NIH/NLM) [C] 
>>>> <daniel.da...@nih.gov> wrote:
>>>> 
>>>> Siegfried,
>>>> 
>>>> This is a wonderful find.   The second presentation is a nice write-up of a 
>>>> large number of free tools.   The first presentation prompts a question - did 
>>>> you add custom request handlers/code to automate finding the best user search 
>>>> terms?   Did any of your custom work end up in Solr?
>>>> 
>>>> Thank you so much,
>>>> 
>>>> Dan
>>>> 
>>>> P.S. - your first presentation takes me back to seeing "Angriff der 
>>>> Klonkrieger" in Berlin after a conference - Hayden Christensen was less 
>>>> annoying in German, because my wife and I don't speak German ;)   I 
>>>> haven't thought of that in a while.
>>>> 
>>>> -----Original Message-----
>>>> From: Siegfried Goeschl [mailto:sgoes...@gmx.at]
>>>> Sent: Saturday, April 04, 2015 4:54 AM
>>>> To: solr-user@lucene.apache.org
>>>> Subject: Re: Measuring QPS
>>>> 
>>>> Hi Dan,
>>>> 
>>>> I'm using JavaMelody for my SOLR production servers - it gives you the 
>>>> relevant HTTP stats (what's happening now & historical data) plus JVM 
>>>> monitoring as an additional benefit. The servers are deployed on Tomcat, 
>>>> so I'm of little help regarding Jetty - having said that:
>>>> 
>>>> * you need two JARs (javamelody & jrobin)
>>>> * tinker with web.xml (roughly as sketched below)
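>>>> 
>>>> The web.xml part is roughly the standard JavaMelody filter registration (shown 
>>>> here from memory - check the JavaMelody docs for the exact setup):
>>>> 
>>>>     <!-- JavaMelody monitoring filter in front of the webapp -->
>>>>     <filter>
>>>>         <filter-name>javamelody</filter-name>
>>>>         <filter-class>net.bull.javamelody.MonitoringFilter</filter-class>
>>>>     </filter>
>>>>     <filter-mapping>
>>>>         <filter-name>javamelody</filter-name>
>>>>         <url-pattern>/*</url-pattern>
>>>>     </filter-mapping>
>>>>     <listener>
>>>>         <listener-class>net.bull.javamelody.SessionListener</listener-class>
>>>>     </listener>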
>>>> 
>>>> Here are two of my presentations mentioning JavaMelody (plus some 
>>>> other stuff)
>>>> 
>>>> http://people.apache.org/~sgoeschl/presentations/solr-from-development-to-production-20121210.pdf
>>>> http://people.apache.org/~sgoeschl/presentations/jsug-2015/jee-performance-monitoring.pdf
>>>> 
>>>> Cheers,
>>>> 
>>>> Siegfried Goeschl
>>>> 
>>>>> On 03 Apr 2015, at 17:53, Shawn Heisey <apa...@elyograg.org> wrote:
>>>>> 
>>>>> On 4/3/2015 9:37 AM, Davis, Daniel (NIH/NLM) [C] wrote:
>>>>>> I wanted to gather QPS for our production Solr instances, but I was 
>>>>>> surprised that the Admin UI did not contain this information.   We are 
>>>>>> running a mix of versions, but mostly 4.10 at this point.   We are not 
>>>>>> using SolrCloud at present; that's part of why I'm checking - I want to 
>>>>>> validate the size of our existing setup and what sort of SolrCloud setup 
>>>>>> would be needed to centralize several of them.
>>>>>> 
>>>>>> What is the best way to gather QPS information?
>>>>>> 
>>>>>> What is the best way to add information like this to the Admin UI, if I 
>>>>>> decide to take that step?
>>>>> 
>>>>> As of Solr 4.1 (three years ago), request rate information is 
>>>>> available in the admin UI and via JMX.  In the admin UI, choose a 
>>>>> core from the dropdown, click on Plugins/Stats, then QUERYHANDLER, 
>>>>> and open the handler you wish to examine.  You have 
>>>>> avgRequestsPerSecond, which is calculated for the entire runtime of 
>>>>> the SolrCore, as well as 5minRateReqsPerSecond and 
>>>>> 15minRateReqsPerSecond, which are far more useful pieces of information.
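>>>>> 
>>>>> The same numbers can also be pulled over HTTP from the core's /admin/mbeans 
>>>>> handler (cat=QUERYHANDLER&stats=true).  A small SolrJ sketch, with a made-up 
>>>>> core name and URL:
>>>>> 
>>>>>     import org.apache.solr.client.solrj.impl.HttpSolrServer;
>>>>>     import org.apache.solr.client.solrj.request.QueryRequest;
>>>>>     import org.apache.solr.common.params.ModifiableSolrParams;
>>>>>     import org.apache.solr.common.util.NamedList;
>>>>> 
>>>>>     // Illustrative stats scraper: asks the mbeans handler for query-handler stats.
>>>>>     public class QpsCheck {
>>>>>         public static void main(String[] args) throws Exception {
>>>>>             HttpSolrServer solr = new HttpSolrServer("http://localhost:8983/solr/mycore");
>>>>>             ModifiableSolrParams params = new ModifiableSolrParams();
>>>>>             params.set("cat", "QUERYHANDLER");
>>>>>             params.set("stats", "true");
>>>>>             QueryRequest req = new QueryRequest(params);
>>>>>             req.setPath("/admin/mbeans");
>>>>>             NamedList<Object> response = solr.request(req);
>>>>>             // The rate values live under solr-mbeans -> QUERYHANDLER -> <handler> -> stats.
>>>>>             System.out.println(response);
>>>>>             solr.shutdown();
>>>>>         }
>>>>>     }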
>>>>> 
>>>>> https://issues.apache.org/jira/browse/SOLR-1972
>>>>> 
>>>>> Thanks,
>>>>> Shawn
>>>>> 
>>>> 
>>> 
>> 
> 
