On Mon, Feb 6, 2012 at 5:53 PM, XJ <oleol...@gmail.com> wrote:
> Yes, as I mentioned in a previous email, we do dismax queries (with
> different mm values) and Solr function queries (map, etc.) with math
> calculations (sum, product, log). I understand those are expensive.
> But worst case that should only double the time, not go from 200ms to
> 1200ms, right?
You mention dismax... but I assume that's the main query and you sort
by score (which is fine). The only issue with relevancy queries is if
you sorted by one that was not the main query - this is not yet
optimized. But for straight function queries that don't contain
embedded relevancy queries, I would definitely not expect the
degradation you are seeing - hence we should try to get to the bottom
of this.

-Yonik
lucidimagination.com

> XJ
>
> On Mon, Feb 6, 2012 at 2:37 PM, Yonik Seeley <yo...@lucidimagination.com> wrote:
>>
>> On Mon, Feb 6, 2012 at 5:35 PM, XJ <oleol...@gmail.com> wrote:
>> > Hm... just looked at the log: only 112 matched, and start=0, rows=30.
>>
>> Are any of the sort criteria sort-by-function with anything complex
>> (like an embedded relevance query)?
>>
>> -Yonik
>> lucidimagination.com
>>
>> > On Mon, Feb 6, 2012 at 1:33 PM, Yonik Seeley <yo...@lucidimagination.com> wrote:
>> >>
>> >> On Mon, Feb 6, 2012 at 3:30 PM, oleole <oleol...@gmail.com> wrote:
>> >> > Thanks for your reply. Yeah, that's the first thing I tried (adding
>> >> > fsv=true to the query), and it surprised me too. Could it be due to
>> >> > our using many complex sortings (20 sortings with dismax, and,
>> >> > or...)? Is there anything that can be optimized? It looks like it's
>> >> > calculated twice in Solr?
>> >>
>> >> It currently does calculate it twice... but only for those documents
>> >> being returned (which should not be significant).
>> >> What is "rows" set to?
>> >>
>> >> -Yonik
>> >> lucidimagination.com
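For reference, a minimal sketch (not from the thread) of the kind of
request under discussion: a dismax main query sorted by score plus a
straight function-query sort built from map/sum/product/log, with
fsv=true to return the per-document sort values. The Solr URL, field
names, query terms, and mm value are all hypothetical placeholders.

import requests

SOLR_URL = "http://localhost:8983/solr/select"  # assumed local Solr core

params = {
    "defType": "dismax",
    "q": "ipod charger",       # hypothetical main query, sorted by score
    "qf": "title^2 body",      # hypothetical query fields
    "mm": "2<75%",             # minimum-should-match; XJ varies this
    # Sort by the main query's score (the cheap, optimized case), then
    # by a straight function query with no embedded relevance query:
    "sort": "score desc, product(log(sum(popularity,1)), map(price,0,0,1)) asc",
    "fsv": "true",             # return field sort values per document
    "start": 0,
    "rows": 30,
    "wt": "json",
}

resp = requests.get(SOLR_URL, params=params)
resp.raise_for_status()

# With fsv=true the response should carry a "sort_values" section; per
# Yonik, these values are currently computed a second time, but only for
# the rows actually returned, so that overhead should be small.
print(resp.json().get("sort_values"))

The case Yonik flags as not yet optimized would be sorting by a
relevance query other than the main one, e.g. adding something like
sort=query($qq) desc with qq={!dismax qf=title}accessory to the request.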