Re: Maximum number of SolrCloud collections in limited hardware resource

2018-07-12 Thread Sharif Shahriar
Thanks a lot Shawn for your detailed reply.



--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html


Re: Number of fields in a solrCloud collection config

2018-07-12 Thread Sharif Shahriar
Thanks a lot Shawn for your reply. I'm using SolrCloud v7.3 with the
schemaless approach: I add documents, and new fields are automatically added
to the managed-schema file. I've tried it several times and it stops at around
13,500 fields.
If I try to add fields using the SolrCloud API I can add them, but I cannot
add them through schemaless document indexing.

Regards,
Sharif
 





Unified highlighter

2018-07-12 Thread Julien Massiera

Hi Solr community,

I would like some help with a strange behavior that I observe on the 
unified highlighter.


Here is the configuration of my highlighter:

on
unified
false


content_fr content_en exactContent
true
CHARACTER
html
200
51200


I indexed some html documents from the www.datafari.com website.

The problem is that on some documents (not all), there is not enough 
"context" wrapping the found search terms.


For example, by searching "France labs", here is the highlighting 
obtained for a certain document:


"content_en":["France class=\"em\">Labs"]


Now, if I perform the same query but with the hl.bs.type set to SENTENCE 
instead of CHARACTER, I obtain the following highlighting for the same 
document :


"content_en":["Trusted by About Contact Home Migrating GSA © 2018 Datafari by class=\"em\">France Labs"]


This is way better but I strongly prefer using the WORD or CHARACTER 
types because highlighting can be too big with the SENTENCE or LINE 
types, depending on the indexed documents.


I tried changing the hl.bs.type to WORD and also increasing the 
hl.fragsize up to 1000, but with any hl.bs.type other than SENTENCE or 
LINE, the highlighting is limited to the found words only, which is not 
enough for what I need.


Is there something I am missing in the configuration? For info, I am 
using Solr 6.6.4.


Thanks for your help.

Julien



Re: Solr7.3.1 Installation

2018-07-12 Thread tapan1707
Thanks, everyone for your replies.

I tried using the "-Dtests.badapples=false" flag but it didn't help.

I will try again without the *ant test* command.





Hackdays in October, London & Montreal

2018-07-12 Thread Charlie Hull

Hi all,

A couple of years ago I ran two free Lucene Hackdays in London and 
Boston (the latter just before Lucene Revolution). Here's what we got up 
to with the kind support of Alfresco, Bloomberg, BA Insight and 
Lucidworks 
http://www.flax.co.uk/blog/2016/10/21/tale-two-cities-two-lucene-hackdays/


I'd like to do this again during the weeks of 8th and 15th October in 
London and Montreal (so just before the Activate event). It's a great 
chance to get together IRL with other Lucene/Solr/Elasticsearch hackers! 
I have a venue for London but a sponsor for evening curry/drinks would 
be wonderful, and for Montreal I still need a venue and evening sponsor 
- do let me know if you or your employer can help.


I'll post again once there are more details and with a call for ideas as 
to what we should work on.


Best

Charlie
--
Charlie Hull
Flax - Open Source Enterprise Search

tel/fax: +44 (0)8700 118334
mobile:  +44 (0)7767 825828
web: www.flax.co.uk


Re: Hackdays in October, London & Montreal

2018-07-12 Thread Sameer Maggon
Charlie,

I might be able to get sponsorship from SearchStax for eve/drinks in
Montreal. Do you want to start a thread offline?

Sameer.

-- 
Sameer Maggon
https://www.searchstax.com


SOLR json request API syntax on facet domain with filter & field list

2018-07-12 Thread jeebix
Hello everybody,

I am working with Solr version 6.0.0 and use the JSON Request API to facet the
results. Here is an example query:

{
  "query": "object_type_s:contact",
  "params": {"wt": "json", "start": 0, "rows": 10},
  "filter": [],
  "facet": {
    "byNestedGammes": {
      "type": "terms",
      "field": "gamme",
      "domain": {"blockChildren": "object_type_s:contact"},
      "sort": "index",
      "limit": -1
    },
    "byEtatMarketing": {"type": "terms", "field": "etat_marketing_s", "sort": "count", "limit": -1},
    "bySuite": {"type": "terms", "field": "kits_sans_suite_ss", "sort": "count", "limit": -1},
    "byAgeInactivite": {
      "type": "terms",
      "field": "age_inactivite_s",
      "domain": {"excludeTags": "byAgeInactivite"},
      "sort": "count",
      "limit": -1
    },
    "byTTC": {
      "type": "terms",
      "field": "TTC_i",
      "domain": {"blockChildren": "object_type_s:contact", "filter": "enseigne_s:INI"},
      "sort": "index",
      "limit": -1
    }
  },
  "sort": ""
}

For the "byTTC" facet, the "domain" parameter with "blockChildren" returns
all values of TTC_i, a field nested in the parent document. Example:

{
  "id": "693897.870735",
  "asso_i": 693897,
  "personne_i": 870735,
  "etat_technique_s": "avec_documents",
  "etat_marketing_s": "actif",
  "type_parent_s": "Société",
  "groupe_type_parent_s": "SOCIETE",
  "_childDocuments_": [
    {
      "kit_sans_suite": ["false"],
      "TTC_i": 300,
      "type_cde_s": "DEVIS",
      "object_type": ["order"],
      "enseigne_s": "INI"
    }
  ]
}

I don't understand why the results are the same for the facet byTTC with or
without the filter parameter in the domain
("domain":{"blockChildren":"object_type_s:contact","filter":"enseigne_s:INI"}).

It should restrict the results to cases where a child document's "enseigne_s"
has the value "INI", but it does not...

I thought I had to add the field list parameter to return child documents
along with parent documents (as in the Solr admin UI: "fl":"*,[child
parentFilter=object_type_s:contact]"), but I can't find the syntax for that
parameter in the JSON request API...
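For what it's worth, in the JSON Request API the field list is usually passed
through a top-level "fields" key rather than inside "params". This is a sketch
based on later documentation and may not apply to 6.0.0:

```json
{
  "query": "object_type_s:contact",
  "fields": "*,[child parentFilter=object_type_s:contact]",
  "params": {"wt": "json", "start": 0, "rows": 10}
}
```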

Thanks for your help.

Best

JB





Local Params handling (6.5->7.x)

2018-07-12 Thread Adam Constabaris
Hi folks,

I am trying to track down what might have changed, and where I could tweak a
configuration that worked under Solr 6.5.1 but is not working as
expected under 7.3.1 or 7.4.0. We have default "qf" and "pf" defined for
our select handler, along with "special" versions of those for use in
particular kinds of queries; our application (based on Project Blacklight)
makes extensive use of local parameter expansion.

The default query parser is edismax, and we are sending in queries like:

"q={! qf=$special_qf pf=$special_pf}query terms"

What used to happen, and what we expect given the documentation at
https://lucene.apache.org/solr/guide/6_6/local-parameters-in-queries.html
(6.6) and
https://lucene.apache.org/solr/guide/7_3/local-parameters-in-queries.html
(7.3), is that this should set qf and pf to their 'special' variants, with
the values pulled from solrconfig.xml/configoverlay.json (as
appropriate for the version).  We can achieve the intended effect via:

q=query terms&pf=${special_pf}&qf=${special_qf}

(and we have tested this and it yields the expected behavior), but
this seems more involved than figuring out *why* the old
syntax isn't working and what we could change in our configset to make it
work again.  Our select handler configuration doesn't look particularly
weird:

The actual values of qf/pf and special_qf/pf are really big, but I created
a 'smaller' configuration for the select handler:

"requestHandler": {
  "/select": {
    "name": "/select",
    "class": "solr.SearchHandler",
    "defaults": {
      "defType": "edismax",
      "echoParams": "explicit",
      "rows": 10,
      "q.alt": "*:*",
      "mm": "6<90%",
      "facet.mincount": "1",
      "qf": "original_qf_notexpanded^1.5",
      "pf": "original_pf_notexpanded^1.5",
      "sample_qf": "sample_qf_expanded^2",
      "sample_pf": "sample_pf_expanded^2"
    }
  }
},



What we are seeing when we turn debugging on and look at the parsed query
is something like:

"parsedquery":"+(DisjunctionMaxQuery(((original_qf_notexpanded:[[7b
21] TO [7b 21]])^1.5))
DisjunctionMaxQuery(((original_qf_notexpanded:[[71 66 3d 24 73 61 6d
70 6c 65 5f 71 66] TO [71 66 3d 24 73 61 6d 70 6c 65 5f 71 66]])^1.5))
DisjunctionMaxQuery(((original_qf_notexpanded:[[70 66 3d 24 73 61 6d
70 6c 65 5f 70 66 7d 71 75 65 72 79] TO [70 66 3d 24 73 61 6d 70 6c 65
5f 70 66 7d 71 75 65 72 79]])^1.5))
DisjunctionMaxQuery(((original_qf_notexpanded:[[74 65 72 6d 73] TO [74
65 72 6d 73]])^1.5)))~4 ()",
"parsedquery_toString":"+original_qf_notexpanded:[[7b 21] TO
[7b 21]])^1.5) ((original_qf_notexpanded:[[71 66 3d 24 73 61 6d 70 6c
65 5f 71 66] TO [71 66 3d 24 73 61 6d 70 6c 65 5f 71 66]])^1.5)
((original_qf_notexpanded:[[70 66 3d 24 73 61 6d 70 6c 65 5f 70 66 7d
71 75 65 72 79] TO [70 66 3d 24 73 61 6d 70 6c 65 5f 70 66 7d 71 75 65
72 79]])^1.5) ((original_qf_notexpanded:[[74 65 72 6d 73] TO [74 65 72
6d 73]])^1.5))~4) ()",

the expanded fields are the ones from the default `qf` and `pf`
settings, and so it looks like the local param syntax is not even
recognized; it's somehow being "short-circuited".  Debug output
indicates the edismax parser is still used in this case, and this is true
even if we change our query string to something like:

q={!lucene ...}

or

q={!type=lucene ...}

We've tried a number of variations, including

(a) setting the 'sow' parameter to true and false, both inside the
expression and as a standalone parameter

(b) changing the luceneMatchVersion in the solrconfig (originally it
was 6.0.0; we have changed it to match the Solr versions, both by
re-uploading the configset and by creating a copy and modifying it before
upload).

What else should I look at? Changing the search component stack?

thanks!

AC


Re: Local Params handling (6.5->7.x)

2018-07-12 Thread Joel Bernstein
This may be the issue:

https://issues.apache.org/jira/browse/SOLR-11501

If it turns out that this is causing the problem, please create a JIRA. It's
important to discuss how SOLR-11501 is affecting real deployments.

Joel Bernstein
http://joelsolr.blogspot.com/



copy field

2018-07-12 Thread Anil
Hi,

I have a date field which needs to be copied to a different field with a
different format/value. Is there any way to achieve this using copyField, or
does it need to be done when creating the Solr document itself?

Let's say createdDate is 10-23-2017 10:15:00; it needs to be copied to the
transformedDate field as 10-23-2017.

Please help. Thanks.

Regards,
Anil


Re: copy field

2018-07-12 Thread Andrea Gazzarini
Hi Anil,
The copyField directive is not what you're looking for, because it doesn't
change the stored value of a field.

What you need is an Update Request Processor, which is a kind of
interceptor in the indexing chain (i.e. it allows you to change an incoming
document before it gets indexed).
Unfortunately, as far as I know there's no processor available for
doing what you need in the example you described, but consider that writing
a new processor is a trivial thing.

Andrea
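If writing Java is not an option, Solr's stock stateless script processor can
express this kind of per-document transformation. A hedged sketch of the
solrconfig.xml wiring (the chain name and script file name are illustrative):

```xml
<updateRequestProcessorChain name="reformat-dates">
  <processor class="solr.StatelessScriptUpdateProcessorFactory">
    <str name="script">reformat-dates.js</str>
  </processor>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>
```

The script receives each document before indexing and can read createdDate,
reformat it, and set transformedDate.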



Re: copy field

2018-07-12 Thread Erick Erickson
This seems like an XY problem, you've asked how to do X without
explaining _why_ (the Y).

If this is just because you want to search the field without having
to specify the full string, consider a DateRangeField.

Best,
Erick



Re: Local Params handling (6.5->7.x)

2018-07-12 Thread Adam Constabaris
Thanks Joel, that's it!

I had tried all sorts of variants on the search terms but somehow missed
that issue (probably because my eyes glazed over before getting to the 7.2
upgrade notes).   While we were negatively impacted by the change, given
the existence of a simple workaround (anybody coming across the issue in
the future: defType=lucene in your query parameters), I don't think I'd
advocate for a behavioral change.  It might be worth adding a note to the
Local Params page in the reference guide, though, as I imagine edismax is a
fairly popular defType.
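For future readers, the workaround looks roughly like this (a sketch: the
parser is named explicitly in the local params while defType is forced back to
lucene so the local-params prefix is parsed at all):

```
q={!edismax qf=$special_qf pf=$special_pf}query terms&defType=lucene
```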

cheers,

AC


Re: copy field

2018-07-12 Thread Gus Heck
XY question notwithstanding, this is exactly the sort of thing one might
want to do in an indexing pipeline. For example:

https://github.com/nsoft/jesterj/blob/master/code/ingest/src/main/java/org/jesterj/ingest/processors/SimpleDateTimeReformatter.java




-- 
http://www.the111shift.com


CloudSolrClient URL Too Long

2018-07-12 Thread Joe Obernberger

Hi - I'm using SolrCloud 7.3.1 and calling a search from Java using:

org.apache.solr.client.solrj.response.QueryResponse response = 
CloudSolrClient.query(ModifiableSolrParams)


If the ModifiableSolrParams are long, I get an error:
Bad Message 414 reason: URI Too Long

I have the maximum number of terms set to 1024 (default), and I'm using 
about 500 terms.  Is there a way around this?  The total query length is 
10,131 bytes.


Thank you!

-Joe



Re: CloudSolrClient URL Too Long

2018-07-12 Thread Alexandre Rafalovitch
Have you tried using POST instead of GET:
https://lucene.apache.org/solr/7_3_0//solr-solrj/org/apache/solr/client/solrj/SolrClient.html#query-java.lang.String-org.apache.solr.common.params.SolrParams-org.apache.solr.client.solrj.SolrRequest.METHOD-

Also, if you have a set of parameters that do not change between
calls, you can define them as default (or compulsory) values in
solrconfig.xml for a custom query handler definition. This includes
variable substitutions.
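A sketch of that suggestion (the handler name and parameter choices are
illustrative):

```xml
<requestHandler name="/bigquery" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="echoParams">explicit</str>
    <int name="rows">10</int>
  </lst>
  <!-- invariants cannot be overridden by the request -->
  <lst name="invariants">
    <str name="fl">id,score</str>
  </lst>
</requestHandler>
```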

Regards,
   Alex.



Re: copy field

2018-07-12 Thread Terry Steichen
Gus,

Perhaps you might try the technique described in the forwarded exchange
below.  It has been working very nicely for me.

Terry


 Forwarded Message 
Subject:Re: Changing Field Assignments
Date:   Tue, 12 Jun 2018 12:21:16 +0900
From:   Yasufumi Mizoguchi 
Reply-To:   solr-user@lucene.apache.org
To: solr-user@lucene.apache.org



Hi,

You can do that via adding the following lines in managed-schema.

  
  
  

After adding the above and re-indexing docs, you will get the result like
following.

{
  "responseHeader": {
    "status": 0,
    "QTime": 0,
    "params": {
      "q": "*:*",
      "indent": "on",
      "wt": "json",
      "_": "1528772599296"}},
  "response": {"numFound": 2, "start": 0,
    "docs": [
      {
        "id": "test2",
        "meta_creation_date": ["2018-04-30T00:00:00Z"],
        "meta_creation_date_range": "2018-04-30T00:00:00Z",
        "_version_": 1603034044781559808},
      {
        "id": "test",
        "meta_creation_date": ["1944-04-02T00:00:00Z"],
        "meta_creation_date_range": "1944-04-02T00:00:00Z",
        "_version_": 1603034283921899520}]
  }}
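The three schema lines mentioned above were stripped in transit. Based on the
surrounding description and the result, they presumably resembled something
like this (the field-type name and attribute values are assumptions):

```xml
<fieldType name="dateRange" class="solr.DateRangeField"/>
<field name="meta_creation_date_range" type="dateRange" indexed="true" stored="true"/>
<copyField source="meta_creation_date" dest="meta_creation_date_range"/>
```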

thanks,
Yasufumi


2018年6月12日(火) 5:04 Terry Steichen :

> I am using Solr (6.6.0) in the automatic mode (where it discovers
> fields).  It's working fine with one exception.  The problem is that
> the discovered "meta_creation_date" field is assigned the type
> TrieDateField.
>
> Unfortunately, that type is limited in a number of ways (like sorting,
> abbreviated forms and etc.).  What I'd like to do is have that
> ("meta_creation_date") field assigned to a different type, like
> DateRangeField.
>
> Is it possible to accomplish this (during indexing) by creating a copy
> field to a different type, and using the copy field in the query?  Or
> via some kind of function operation (which I've never understood)?
>
>





Re: CloudSolrClient URL Too Long

2018-07-12 Thread Shawn Heisey

Add a parameter to the query call.  Change this:

client.query(query)

to this:

client.query(query, METHOD.POST)

The import you'll need for that is
org.apache.solr.client.solrj.SolrRequest.METHOD.

What you're running into is the length limit on the HTTP request of 8192
characters.  This is the limit that virtually all webservers, including
the Jetty that Solr includes, have configured.  Changing the request
method to POST puts all those parameters into the request body.  Solr's
default limit on the size of the request body is 2 megabytes.
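If changing the request method is not an option, the header limit itself is
configurable in the bundled Jetty. In recent Solr versions, server/etc/jetty.xml
exposes it roughly like this (a sketch; verify the property name against the
installed version):

```xml
<Set name="requestHeaderSize">
  <Property name="solr.jetty.request.header.size" default="8192"/>
</Set>
```

POST is still the more robust fix, since it avoids URL-length limits in any
proxies between client and server as well.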

Thanks,
Shawn



Re: Number of fields in a solrCloud collection config

2018-07-12 Thread Sharif Shahriar
Hi Erick,
Thank you for your suggestions. 

Regards,
Sharif





Re: copy field

2018-07-12 Thread Anil
Hi Erick,

I have a schema with a date field (tweetedDate) in the 10-23-2017 10:15:00
format. The same schema is used across a number of collections, and I have to
index the date field with a different format in each collection.

Let's say the collections are collection-day, collection-hour, etc.

If the date field (tweetedDate) value is 10-23-2017 10:15:00:
date format in collection-day  - 10-23-2017 *00:00:00*
date format in collection-hour - 10-23-2017 *10:00:00*

This way I can use that date field for faceting and aggregation.

My idea is to create a new date field (newTweetedDate) which holds the
transformed value used for aggregation.

One way of indexing newTweetedDate is to perform the transformation based
on the collection and set it on the Solr document before indexing.
I am checking whether there is an alternative where newTweetedDate is
derived from tweetedDate automatically by the schema, using copyField or
some other feature.

Hope this is clear. Thanks.

Regards,
Anil
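The per-collection transformation described above can be sketched in plain Java
with java.time; the input pattern MM-dd-yyyy HH:mm:ss is an assumption taken
from the example values, and DAYS/HOURS correspond to collection-day and
collection-hour:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.temporal.ChronoUnit;

public class DateTruncate {

    // Pattern assumed from the example values in this thread; adjust to the real format.
    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("MM-dd-yyyy HH:mm:ss");

    // Truncate a date string to the given unit:
    //   ChronoUnit.DAYS  -> collection-day
    //   ChronoUnit.HOURS -> collection-hour
    public static String truncate(String value, ChronoUnit unit) {
        LocalDateTime parsed = LocalDateTime.parse(value, FMT);
        return parsed.truncatedTo(unit).format(FMT);
    }

    public static void main(String[] args) {
        System.out.println(truncate("10-23-2017 10:15:00", ChronoUnit.DAYS));  // 10-23-2017 00:00:00
        System.out.println(truncate("10-23-2017 10:15:00", ChronoUnit.HOURS)); // 10-23-2017 10:00:00
    }
}
```

This is the kind of logic an update processor (or the indexing client) would
run per collection before the document reaches Solr.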





Re: copy field

2018-07-12 Thread Anil
Thanks Andrea. I will write an update processor in the indexing pipeline.

I feel this would be a very good feature to support.

Thanks,
Anil



Enabling auto-purging for documents which are already indexed

2018-07-12 Thread Adarsh_infor
Hi All,

I have an index which has been lying in production for quite some time. Now we
need to delete documents based on a date range, i.e., after 270 days we should
be able to delete the old documents. I have heard about the Time to Live
feature, and I need to know a couple of things before trying it:

1. All the examples talk about seconds; can we specify day counts?
2. Time to Live needs a couple of new fields in the index, which means a schema
change; do we then need to re-index the entire existing index?
3. Is there a way to enable auto-purging without a re-index activity?


thanks 
Adarsh
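For reference, Solr ships a processor aimed at exactly this use case:
DocExpirationUpdateProcessorFactory. A hedged solrconfig.xml sketch (the chain
and field names are illustrative); note that Solr date math accepts day units,
so a _ttl_ value such as +270DAYS should work, but the TTL/expiration fields do
need to exist on the documents, so existing documents would likely have to be
re-indexed (or updated) to carry them:

```xml
<updateRequestProcessorChain name="add-expiration" default="true">
  <processor class="solr.processor.DocExpirationUpdateProcessorFactory">
    <int name="autoDeletePeriodSeconds">86400</int> <!-- sweep for expired docs once a day -->
    <str name="ttlFieldName">_ttl_</str>
    <str name="expirationFieldName">expire_at_dt</str>
  </processor>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>
```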






