Increase Search Speed for multiple solr request for different time range query

2010-12-25 Thread saureen

I am using the fq parameter in my Solr query to get results based on a date range. Say I am searching for the term "iphone" and want results from 2010-12-01 to 2010-12-25; fq sets my start and end dates. I do this in a loop for different time frames with the same search keyword. The loop runs 10 to 12 times, i.e.:

q=iphone
fq=date:[2010-12-01T00:00:00Z TO 2010-12-25T00:00:00Z]
rows=500

This is taking a long time to process. I am also sharding based on the time range.

Please suggest possible steps to optimize the Solr search for the above scenario.
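For context, the per-range loop described above can be sketched roughly as follows. This is only an illustration, not the poster's actual code: the host, port, core path, and the specific date windows are all assumptions.

```python
from urllib.parse import urlencode

# Hypothetical date windows, one Solr request per window (assumed values;
# the post says the loop runs 10 to 12 times).
windows = [
    ("2010-12-01T00:00:00Z", "2010-12-03T00:00:00Z"),
    ("2010-12-03T00:00:00Z", "2010-12-05T00:00:00Z"),
]

def build_query_url(start, end, keyword="iphone", rows=500):
    """Build the Solr select URL for one time window."""
    params = {
        "q": keyword,
        "fq": "date:[%s TO %s]" % (start, end),
        "rows": rows,
    }
    # "localhost:8983/solr" is an assumed host and core path.
    return "http://localhost:8983/solr/select?" + urlencode(params)

for start, end in windows:
    print(build_query_url(start, end))
```

Each iteration is a full round trip to Solr, which is why issuing 10-12 of these sequentially adds up.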


-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/Increase-Search-Speed-for-multiple-solr-request-for-different-time-range-query-tp2144426p2144426.html
Sent from the Solr - User mailing list archive at Nabble.com.


solr speed issues..

2011-01-14 Thread saureen

I am working on an application that fetches results from Solr based on a date parameter. Earlier I used sharding to fetch the results, but that made things too slow, so instead of sharding I query three different cores with the same parameters and merge the results. Things are still slow.
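Merging the per-core responses while preserving the created-descending sort order amounts to a k-way merge. The sketch below shows one way to do it; the document structure and field values are assumptions for illustration, not the poster's actual code.

```python
import heapq

def merge_core_results(core_responses, rows=1000):
    """Merge docs from several cores, each already sorted by 'created'
    descending, into one list in the same order, truncated to 'rows'."""
    merged = heapq.merge(
        *core_responses,
        key=lambda doc: doc["created"],
        reverse=True,  # inputs are newest-first; keep that order
    )
    return list(merged)[:rows]

# Assumed example documents from three cores:
core1 = [{"title": "a", "created": "2011-01-10"}, {"title": "b", "created": "2011-01-02"}]
core2 = [{"title": "c", "created": "2011-01-08"}]
core3 = [{"title": "d", "created": "2011-01-12"}, {"title": "e", "created": "2011-01-01"}]
print(merge_core_results([core1, core2, core3], rows=3))
```

Because each core's response is already sorted, the merge itself is cheap; the dominant cost stays in the three Solr calls and the transfer of up to 1000 docs each.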

For one call I generally get around 500 to 1000 docs from Solr, so I include the following parameters in the URL for the Solr call:

sort=created+desc
json.nl=map
wt=json
rows=1000
version=1.2
omitHeader=true
fl=title
start=0
q=apple
qt=standard
fq=created:[date1 TO date2]


It is taking a long time to get the results; any solution to the above problem would be great.

-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/solr-speed-issues-tp2254823p2254823.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: solr speed issues..

2011-01-15 Thread saureen

I am using three Solr cores, and all the cores are on the same machine.

My configuration is such that I have one machine where the code resides, and 10 other machines that contain the indexed data in a cluster. The code machine picks a random machine to fetch Solr data from.

The reason the number of rows is 500 for my application is that I select a text field and, based on those 500 results, calculate a word-frequency array. I cannot reduce the rows below 500, or the application will not work as expected.
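The word-frequency step described here can be sketched roughly as below. The field name, the tokenization, and the sample documents are all assumptions; the poster's actual code is not shown in the thread.

```python
import re
from collections import Counter

def word_frequency(docs, field="title"):
    """Count word occurrences across a text field of the fetched docs."""
    counts = Counter()
    for doc in docs:
        # Lowercase and split on word characters; a real implementation
        # would likely mirror the Solr analysis chain for this field.
        counts.update(re.findall(r"\w+", doc.get(field, "").lower()))
    return counts

docs = [{"title": "Apple releases iPhone"}, {"title": "apple stock rises"}]
print(word_frequency(docs).most_common(2))
```

Since only the term counts are needed, the per-document text can be discarded as soon as it is counted, which keeps client-side memory flat even at 500-1000 rows per call.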

The data comes over the network in compressed form, and the data is also stored in compressed form.

Also, the memory allocated to the JVM is 1 GB, which is fair enough.

So, without reducing the rows fetched per Solr call, I need to speed up the process. Please suggest a solution in this direction.
-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/solr-speed-issues-tp2254823p2260692.html
Sent from the Solr - User mailing list archive at Nabble.com.


Detect Out of Memory Errors

2011-01-27 Thread saureen

Hi,

Is there a way to detect out-of-memory errors in Solr, so that I could implement some functionality such as restarting Tomcat or alerting myself via email whenever such an error is detected?
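One common approach is at the JVM level: the HotSpot flags `-XX:+HeapDumpOnOutOfMemoryError` and `-XX:OnOutOfMemoryError="<command>"` can capture a heap dump or run a restart/alert script when an OutOfMemoryError is thrown. Alternatively, a small watcher can tail the Tomcat log. A sketch of the log-scanning approach follows; the log path and the reaction (restart or email) are assumptions for illustration.

```python
import time

def is_oom_line(line):
    """Return True if a log line indicates an OutOfMemoryError."""
    return "java.lang.OutOfMemoryError" in line

def watch_log(path="/var/log/tomcat/catalina.out"):  # assumed log location
    """Tail the log file and react whenever an OOM line appears."""
    with open(path) as f:
        f.seek(0, 2)  # start at the current end of the file
        while True:
            line = f.readline()
            if not line:
                time.sleep(1)
                continue
            if is_oom_line(line):
                # Here one could restart Tomcat (e.g. via subprocess)
                # or send an alert email (e.g. via smtplib).
                print("OOM detected:", line.strip())
```

Note that once the JVM has thrown an OutOfMemoryError its state is unreliable, so the JVM-flag route (which acts from outside the dying process) is generally considered safer than handling it in-process.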
-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/Detect-Out-of-Memory-Errors-tp2362872p2362872.html
Sent from the Solr - User mailing list archive at Nabble.com.