For a project I'm working on, we store the user's query in a separate core
that we also use to provide autocomplete functionality. So far the frontend
app is responsible for sending the query to Solr, meaning: 1. execute the
query against our search core, and 2. send an update request to store the
query in the separate core. We use deduplication (provided by Solr) to avoid
storing the same query several times. We don't do exactly what you're after,
but it wouldn't be too hard to tag each query with a timestamp field and
build analytics on top of that. Off the top of my head, we could wrap the
logic that currently lives in the frontend app in a custom SearchComponent
that automatically sends the search query to the other core for storage,
abstracting all of this away from the client app; a rough sketch of that
idea is below. Keep in mind that the considerations about data volume that
Shawn raised still apply.
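
Something along these lines, assuming a Solr 5.x-era SolrJ client (the core
URL and field names are placeholders, and depending on your Solr version the
component may need a couple of extra SolrInfoMBean methods). It would also
still need to be registered in solrconfig.xml and added to the request
handler's component list:

import java.io.IOException;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;
import org.apache.solr.common.params.CommonParams;
import org.apache.solr.handler.component.ResponseBuilder;
import org.apache.solr.handler.component.SearchComponent;

/**
 * Sketch: copies every incoming query into a separate "queries" core so it
 * can later be mined for autocomplete suggestions or trending terms.
 */
public class QueryRecordingComponent extends SearchComponent {

  // Core that stores the raw queries; placeholder URL, adjust to your setup.
  private final SolrClient queryStore =
      new HttpSolrClient("http://localhost:8983/solr/queries");

  @Override
  public void prepare(ResponseBuilder rb) throws IOException {
    // Nothing to do before the search runs.
  }

  @Override
  public void process(ResponseBuilder rb) throws IOException {
    String q = rb.req.getParams().get(CommonParams.Q);
    if (q == null || q.trim().isEmpty()) {
      return;
    }
    try {
      SolrInputDocument doc = new SolrInputDocument();
      doc.addField("query", q);
      doc.addField("timestamp", new java.util.Date());
      // Dedup and commits are left to the target core's update chain
      // (e.g. SignatureUpdateProcessorFactory plus autoCommit).
      queryStore.add(doc);
    } catch (Exception e) {
      // Recording the query must never break the actual search.
    }
  }

  @Override
  public String getDescription() {
    return "Stores incoming queries in a separate core";
  }
}

In a real deployment you would probably make the add asynchronous (or use
ConcurrentUpdateSolrClient) so that recording the query never adds latency
to the search itself, and the target core would carry the dedup update chain
plus the timestamp field.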

Hope it helps, 

----- Original Message -----
From: "Shawn Heisey" <apa...@elyograg.org>
To: solr-user@lucene.apache.org
Sent: Sunday, February 8, 2015 11:03:33 AM
Subject: [MASSMAIL]Re: Trending functionality in Solr

On 2/7/2015 9:26 PM, S.L wrote:
> Is there a way to implement trending functionality using Solr, to give
> results using a query for, say, the most searched terms in the past hour
> or so? If the most searched terms is not possible, is it possible to at
> least get the results for the last 100 terms?

I'm reasonably sure that the only thing Solr has out of the box that can
record queries is the logging feature that defaults to INFO.  That data
is not directly available to Solr, and it's not in a good format for
easy parsing.
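
For what it's worth, scraping that log is doable but brittle. A rough
sketch, assuming the request log lines contain a "params={q=...&...}"
section (the exact layout varies with the Solr version and logging
configuration, so the pattern would need adjusting):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Counts q= parameters found in a Solr request log (line format assumed). */
public class QueryLogCounter {
  // Request log lines usually contain "params={q=...&...}"; adapt as needed.
  private static final Pattern Q_PARAM =
      Pattern.compile("params=\\{[^}]*?\\bq=([^&}]+)");

  public static void main(String[] args) throws IOException {
    Map<String, Integer> counts = new HashMap<>();
    for (String line : Files.readAllLines(Paths.get(args[0]))) {
      Matcher m = Q_PARAM.matcher(line);
      if (m.find()) {
        counts.merge(m.group(1), 1, Integer::sum);
      }
    }
    // Print the 20 most frequent query strings.
    counts.entrySet().stream()
        .sorted(Map.Entry.<String, Integer>comparingByValue().reversed())
        .limit(20)
        .forEach(e -> System.out.println(e.getValue() + "\t" + e.getKey()));
  }
}

That only recovers raw q strings; time bucketing, decoding and any
normalization would have to be layered on top.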

Queries are not stored anywhere else by Solr.  From what I understand,
analysis is a relatively easy part of the equation, but the data must be
available first, which is the hard part.  Storing it in RAM is pretty
much a non-starter -- there are installations that see thousands of
queries every second.

This is an area for improvement, but the infrastructure must be written
from scratch.  All work on this project is volunteer.  We are highly
motivated volunteers, but extensive work like this is difficult to fit
into donated time.

Many people who use Solr are already recording all queries in some other
system (like a database), so it is far easier to implement analysis on
that data.
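
For example, if every query is written to a hypothetical
query_log(term, searched_at) table as it executes, "most searched terms in
the past hour" becomes a simple aggregation. The table, column names and
interval syntax below are assumptions and will vary by database:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

/** Top searched terms in the last hour, from a hypothetical query_log table. */
public class TopTerms {
  public static void main(String[] args) throws Exception {
    // args[0] is the JDBC URL of whatever database holds the query log.
    String sql =
        "SELECT term, COUNT(*) AS hits "
      + "FROM query_log "
      + "WHERE searched_at > NOW() - INTERVAL '1 hour' "
      + "GROUP BY term ORDER BY hits DESC LIMIT 10";
    try (Connection conn = DriverManager.getConnection(args[0]);
         PreparedStatement ps = conn.prepareStatement(sql);
         ResultSet rs = ps.executeQuery()) {
      while (rs.next()) {
        System.out.println(rs.getInt("hits") + "\t" + rs.getString("term"));
      }
    }
  }
}

An index on the timestamp column keeps that query cheap even at high query
volumes.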

Thanks,
Shawn
