TermsQueryParser I think is somewhat new. Have you tried that one?
https://cwiki.apache.org/confluence/display/solr/Other+Parsers#OtherParsers-TermsQueryParser
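For example, a filter query against the terms parser looks roughly like this
(a sketch only; the 'id' field name is an assumption, and values are
comma-separated by default):

...&fq={!terms f=id}12345,67890,54321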
Regards,
Alex.
Sign up for my Solr resources newsletter at http://www.solr-start.com/
On 13 January 2015 at 12:54, rashmy1 wrote:
Hello,
We have a similar requirement where a large list of IDs needs to be sent to
Solr in a filter query.
Could someone please help me understand whether this feature is now supported
in the newer versions of Solr?
Thanks
OK, thank you Otis, I *think* this should be easy to add - I can try. We
were calling them 'private library' searches.
roman
On Mon, Jul 8, 2013 at 11:58 PM, Otis Gospodnetic <
otis.gospodne...@gmail.com> wrote:
> Hi Roman,
>
> I referred to something I called "server-side named filters". It
>
Hi Roman,
I referred to something I called "server-side named filters". It
matches the feature described at
http://www.elasticsearch.org/blog/terms-filter-lookup/
Would be a cool addition, IMHO.
Otis
--
Solr & ElasticSearch Support -- http://sematext.com/
Performance Monitoring -- http://semat
Roman,
It's covered in http://wiki.apache.org/solr/ContentStream
| For POST requests where the content-type is not
"application/x-www-form-urlencoded", the raw POST body is passed as a
stream.
So, there is no need for encoding of binary data inside the body.
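For illustration, reading such a raw body on the Solr side could look roughly
like this (a minimal sketch with made-up names; assumes commons-io on the
classpath, as in a standard Solr distribution):

import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

import org.apache.commons.io.IOUtils;
import org.apache.solr.common.util.ContentStream;
import org.apache.solr.request.SolrQueryRequest;

public class RawBodyUtil {
  // Concatenates all raw POST bodies of the request into one string.
  // With a non-form content type Solr leaves the body untouched, so the
  // payload (an id list, a base64 bitset, ...) arrives exactly as sent.
  public static String readRawBody(SolrQueryRequest req) throws IOException {
    StringBuilder sb = new StringBuilder();
    Iterable<ContentStream> streams = req.getContentStreams();
    if (streams != null) {
      for (ContentStream cs : streams) {
        try (InputStream in = cs.getStream()) {
          sb.append(new String(IOUtils.toByteArray(in), StandardCharsets.UTF_8));
        }
      }
    }
    return sb.toString();
  }
}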
Regarding encoding, I have a pos
Hello Mikhail,
Yes, GET is limited, but POST is not - so I just wanted it to work the same
way in both. But I am not sure I understand your question completely. Could
you elaborate on the parameters/body part? Is there no need for encoding of
binary data inside the body? Or do you mean
Hello Roman,
Have you considered passing the long id sequence as the request body and
accessing it internally in Solr as a content stream? That would make base64
encoding unnecessary. AFAIK URL length is limited somehow, anyway.
On Tue, Jul 2, 2013 at 9:32 PM, Roman Chyla wrote:
> Wrong link to the parser, should be:
Wrong link to the parser, should be:
https://github.com/romanchyla/montysolr/blob/master/contrib/adsabs/src/java/org/apache/solr/search/BitSetQParserPlugin.java
On Tue, Jul 2, 2013 at 1:25 PM, Roman Chyla wrote:
> Hello @,
>
> This thread 'kicked' me into finishing some long-past task of
> sendi
Hello @,
This thread 'kicked' me into finishing some long-past task of
sending/receiving a large boolean (bitset) filter. We have been using bitsets
with Solr before, but now I sat down and wrote it as a qparser. The use
cases, as you have discussed, are:
- necessity to send a long list of ids as a
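For anyone curious, the core encode/decode step can be sketched with plain
java.util classes (this is not the actual plugin code, just an illustration;
java.util.Base64 needs Java 8, older JVMs would use commons-codec instead):

import java.util.Base64;
import java.util.BitSet;

public class BitsetIdList {
  public static void main(String[] args) {
    // Client side: set one bit per (numeric) id and base64 the backing bytes.
    BitSet ids = new BitSet();
    for (int id : new int[] {1, 5, 42, 100_000}) {
      ids.set(id);
    }
    String param = Base64.getEncoder().encodeToString(ids.toByteArray());

    // Server side: decode the parameter back into a bitset and use it to
    // build the filter (e.g. by mapping ids to Lucene docs in the qparser).
    BitSet decoded = BitSet.valueOf(Base64.getDecoder().decode(param));
    System.out.println(decoded.cardinality() + " ids, " + param.length() + " chars");
  }
}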
Not necessarily. If the auth tokens are available on some
other system (DB, LDAP, whatever), one could get them
in the PostFilter and cache them somewhere since,
presumably, they wouldn't be changing all that often. Or
use a UserCache and get notified whenever a new searcher
was opened and regenera
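A bare-bones sketch of that pattern, assuming Solr 4.x-era APIs and a
pre-computed set of allowed Lucene doc ids (the AclPostFilter name and the
set itself are illustrative only):

import java.io.IOException;
import java.util.Set;

import org.apache.lucene.search.IndexSearcher;
import org.apache.solr.search.DelegatingCollector;
import org.apache.solr.search.ExtendedQueryBase;
import org.apache.solr.search.PostFilter;

public class AclPostFilter extends ExtendedQueryBase implements PostFilter {

  // Global Lucene doc ids the current user may see; in practice this would
  // be looked up from the DB/LDAP and cached, e.g. in a user cache that is
  // regenerated whenever a new searcher is opened.
  private final Set<Integer> allowedDocs;

  public AclPostFilter(Set<Integer> allowedDocs) {
    this.allowedDocs = allowedDocs;
  }

  @Override
  public boolean getCache() {
    return false; // post filters must not be cached
  }

  @Override
  public int getCost() {
    return 200; // cost >= 100 => applied after all other filters
  }

  @Override
  public DelegatingCollector getFilterCollector(IndexSearcher searcher) {
    return new DelegatingCollector() {
      @Override
      public void collect(int doc) throws IOException {
        // docBase is maintained per segment by DelegatingCollector
        if (allowedDocs.contains(docBase + doc)) {
          super.collect(doc); // let the document through
        }
      }
    };
  }
}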
Hi,
The unfortunate thing about this is that you still have to *pass* that
filter from the client to the server every time you want to use that
filter. If that filter is big/long, passing that in all the time has
some price that could be eliminated by using "server-side named
filters".
Otis
--
You might consider "post filters". The idea
is to write a custom filter that gets applied
after all other filters etc. One use-case
here is exactly ACL lists, and it can be quite
helpful if you're not doing *:* type queries.
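In practice a filter like the sketch above is produced by a custom query
parser and attached as a non-cached, high-cost fq, along the lines of (the
'acl' parser name is made up):

...&fq={!acl cache=false cost=200}&...

cache=false keeps it out of the filter cache, and a cost of 100 or more is
what makes Solr apply it as a post filter after all other filters.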
Best
Erick
On Mon, Jun 17, 2013 at 5:12 PM, Otis Gospodnetic
wrote:
> Btw.
Btw. ElasticSearch has a nice feature here. Not sure what it's
called, but I call it "named filter".
http://www.elasticsearch.org/blog/terms-filter-lookup/
Maybe that's what OP was after?
Otis
--
Solr & ElasticSearch Support
http://sematext.com/
On Mon, Jun 17, 2013 at 4:59 PM, Alexandre R
On Mon, Jun 17, 2013 at 12:35 PM, Igor Kustov wrote:
> So I'm using query like
> http://127.0.0.1:8080/solr/select?q=*:*&fq={!mqparser}id:%281%202%203%29
If the IDs are purely numeric, I wonder if the better way is to send a
bitset. So, bit 1 is on if ID:1 is included, bit 2000 is on if ID:2000
is included.
nonono, mate! I warned you before with 'Mind term encoding due to field type!'
You need to obtain the schema from the request, then access the field type and
convert the external string representation into the (possibly tricky) encoded
bytes via readableToIndexed(); see FieldType.getFieldQuery().
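In code that conversion is roughly the following (a sketch; the helper name
and the 'id' default are made up):

import org.apache.solr.request.SolrQueryRequest;
import org.apache.solr.schema.FieldType;
import org.apache.solr.schema.SchemaField;

public class TermEncoding {
  // External (readable) value -> internal indexed form for the given field.
  // E.g. for a TrieIntField the plain string "42" becomes the prefix-coded term.
  public static String toIndexed(SolrQueryRequest req, String field, String external) {
    SchemaField sf = req.getSchema().getField(field);
    FieldType ft = sf.getType();
    return ft.readableToIndexed(external);
  }
}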
btw, it's a really frequent
Meanwhile I'm currently trying to write a custom QParser which will use
FieldCacheTermsFilter.
So I'm using query like
http://127.0.0.1:8080/solr/select?q=*:*&fq={!mqparser}id:%281%202%203%29
And I couldn't make it work - I just couldn't find a proper constructor and
also I'm not sure that I'm filtering
> Where do the long list of IDs come from?
I'm indexing a database, so the id list is a security access control list.
--
View this message in context:
http://lucene.472066.n3.nabble.com/Solr-large-boolean-filter-tp4070747p4070964.html
Sent from the Solr - User mailing list archive at Nabble.com.
Whenever I see one of these "big" query filters, my first thought is that
there is something wrong with the application data model.
Where do the long list of IDs come from? Somebody must be generating and/or
storing them, right? Why not store them in Solr, right in the data model?
Maybe store
Right.
FieldCacheTermsFilter is an option. You need to create your own QParserPlugin
which yields a FieldCacheTermsFilter and hook it in as ..&fq={!idsqp
cache=false}&..
Mind disabling caching! Mind term encoding due to the field type!
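A rough sketch of such a plugin (Lucene/Solr 4.x-era APIs; the comma-separated
input and the f=... local param are my own conventions, not a given):

import org.apache.lucene.search.FieldCacheTermsFilter;
import org.apache.lucene.search.Query;
import org.apache.solr.common.params.SolrParams;
import org.apache.solr.common.util.NamedList;
import org.apache.solr.request.SolrQueryRequest;
import org.apache.solr.schema.FieldType;
import org.apache.solr.search.QParser;
import org.apache.solr.search.QParserPlugin;
import org.apache.solr.search.SolrConstantScoreQuery;
import org.apache.solr.search.SyntaxError;

public class IdsQParserPlugin extends QParserPlugin {

  @Override
  public void init(NamedList args) {}

  @Override
  public QParser createParser(String qstr, SolrParams localParams,
                              SolrParams params, SolrQueryRequest req) {
    return new QParser(qstr, localParams, params, req) {
      @Override
      public Query parse() throws SyntaxError {
        String field = localParams.get("f", "id");
        String[] external = qstr.split(",");

        // Convert external values into their indexed form (mind the field type!)
        FieldType ft = req.getSchema().getField(field).getType();
        String[] indexed = new String[external.length];
        for (int i = 0; i < external.length; i++) {
          indexed[i] = ft.readableToIndexed(external[i].trim());
        }
        // Wrap the (uncached!) FieldCache-based filter as a constant-score query.
        return new SolrConstantScoreQuery(new FieldCacheTermsFilter(field, indexed));
      }
    };
  }
}

It would then be registered in solrconfig.xml with something like
<queryParser name="idsqp" class="com.example.IdsQParserPlugin"/> and invoked
exactly as in the fq above.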
I also suggest checking how much time it spends on tokenization. Once a day I've
got