trying DIH but get 'Sorry, no dataimport-handler defined!'

2016-05-24 Thread scott.chu

I do the following things:

* I create a folder: D:\solr-6.0.0\myconfigsets\testdih.
* Copy D:\portable_sw\solr-6.0.0\example\example-DIH\solr\db\conf to 
D:\solr-6.0.0\myconfigsets\testdih.
* Go into D:\solr-6.0.0\myconfigsets\testdih\conf and edit db-data-config.xml 
as follows (I am pretty sure the MySQL environment is OK):

  <dataConfig>
    <dataSource ... url="jdbc:mysql://localhost:3306/test" user="hello" password="hellothere" />
    ... (the entity and field definitions were lost when this message was archived) ...
  </dataConfig>

* Then I copy mysql-connector-java-5.0.8-bin.jar to 
D:\portable_sw\solr-6.0.0\server\solr-webapp\webapp\WEB-INF\lib.
* I check solrconfig.xml and see these relevant lines:

  <lib dir="..." regex="solr-dataimporthandler-.*\.jar" />
  ...
  ...
  <requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
    <lst name="defaults">
      <str name="config">db-data-config.xml</str>
    </lst>
  </requestHandler>

* cd to D:\solr-6.0.0 and issue 'bin\solr start'; it starts OK.
* Issue 'bin\solr create_core -c testdih -d myconfigsets\testdih\conf' to 
create a core. That is OK, too.

* The solr.log has these log messages:

2016-05-24 15:59:24,781 INFO  (coreLoadExecutor-6-thread-1) [   ] 
o.a.s.c.SolrResourceLoader Adding 
'file:/D:/portable_sw/solr-6.0.0/dist/solr-dataimporthandler-6.0.0.jar' to 
classloader
2016-05-24 15:59:24,781 INFO  (coreLoadExecutor-6-thread-1) [   ] 
o.a.s.c.SolrResourceLoader Adding 
'file:/D:/portable_sw/solr-6.0.0/dist/solr-dataimporthandler-extras-6.0.0.jar' 
to classloader

* So I think the DIH jars are loaded OK.

I go to localhost:8983 in the browser and select the core 'testdih', then click 
the 'Dataimport' item, but the right pane shows "Sorry, no dataimport-handler defined!".

What am I missing?


scott.chu,scott@udngroup.com
2016/5/24 (Tue)


How to set id in DIHConfiguration?

2016-05-24 Thread Andreas Meyer
Hello!

I am trying to import email from an IMAP maildir and see the following in
the log:

2016-05-24 08:46:11.655 INFO  (qtp428746855-14) [   x:ccc] 
o.a.s.h.d.DataImporter Loading DIH Configuration: mail-data-config.xml
2016-05-24 08:46:11.685 INFO  (qtp428746855-14) [   x:ccc] 
o.a.s.h.d.c.DIHConfiguration id is a required field in SolrSchema . But not 
found in DataConfig
2016-05-24 08:46:11.691 INFO  (qtp428746855-14) [   x:ccc] 
o.a.s.h.d.DataImporter Data Configuration loaded successfully

Where do I set this id? In the entity within the mail-data-config.xml?
In the managed-schema it is there.
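For what it's worth, the uniqueKey is typically supplied by mapping one of the entity's columns onto the schema's "id" field inside the DIH config. A minimal sketch; the entity attributes, host, and column name here are illustrative assumptions, not Andreas' actual config:

```xml
<!-- mail-data-config.xml sketch: map an entity column onto the schema's
     uniqueKey "id". The column name used here is an assumption. -->
<document>
  <entity name="mail" processor="MailEntityProcessor"
          user="someone@example.com" password="..."
          host="imap.example.com" protocol="imaps" folders="inbox">
    <field column="messageId" name="id"/>
  </entity>
</document>
```

Without such a mapping, DIH logs the "id is a required field in SolrSchema" warning shown above, because the schema declares the field but the data config never populates it.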


Kind regards

  Andreas


Re: trying DIH but get 'Sorry, no dataimport-handler defined!'

2016-05-24 Thread scott.chu

I tried running the example by issuing "bin\solr create_core -c exampledih -d 
example\example-DIH\solr\db\conf". It also shows the same error. Did I issue the 
wrong command?

scott.chu,scott@udngroup.com
2016/5/24 (Tue)


Re: trying DIH but get 'Sorry, no dataimport-handler defined!'

2016-05-24 Thread kostali hassan
If you have the config in the path server/solr/configsets/testdih/conf, you should
write this on your command line:
'bin\solr create -c your_core -d testdih -p 8983' to create a core with
the example config testdih.



Re: Auto Suggestion in solr

2016-05-24 Thread Mugeesh Husain
Thanks Erick for the reply. Actually I am using Solr 4.4, and the solr.SuggestComponent
class is not available in Solr 4.4.

Can I implement this in my Solr 4.4 lib, and how?

If possible, please share any article on how people have done suggestions in 4.4.



Thanks
Mugeesh



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Auto-Suggestion-in-solr-tp4278458p4278756.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: More Like This on not new documents

2016-05-24 Thread Vincenzo D'Amore
Thanks Nick,

I don't plan to index the document, the document is a kind of disposable
object. And it is based on the user query.

I have seen that page, but I didn't get how to pass the document (my disposable
object) via the stream.body parameter.

Googling, I found this: https://issues.apache.org/jira/browse/SOLR-5351

I see Solr committers have been working on this bug recently, but for now
only the first field is handled by MLT.

So it is not clear how to use the stream.body parameter.
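For reference, the wiki's content-stream approach passes the raw text of the not-yet-indexed document as the stream.body parameter (or as the POST body), and the handler builds the "like" query from it. A sketch of such a request; the host, core, field names, and text are illustrative assumptions:

```
http://localhost:8983/solr/mlt?mlt.fl=body
    &mlt.mintf=1&mlt.mindf=1
    &mlt.interestingTerms=list
    &stream.body=the%20text%20of%20the%20disposable%20document
```

With this form no indexing of the disposable document is needed; the limitation discussed above (only the first mlt.fl field being used for streamed content) is what SOLR-5351 tracks.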

Best regards,
Vincenzo

On Fri, May 13, 2016 at 7:03 PM, Nick D  wrote:

> https://wiki.apache.org/solr/MoreLikeThisHandler
>
> Bottom of the page, using context streams. I believe this still works in
> newer versions of Solr. Although I have not tested it on a new version of
> Solr.
>
> But if you plan on indexing the document anyways then just indexing and
> then passing the ID to mlt isn't a bad thing at all.
>
> Nick
>
> On Fri, May 13, 2016 at 2:23 AM, Vincenzo D'Amore 
> wrote:
>
> > Hi all,
> >
> > anybody know if is there a chance to use the mlt component with a new
> > document not existing in the collection?
> >
> > In other words, if I have a new document, should I always first add it to
> > my collection and only then, using the mlt component, have the list of
> > similar documents?
> >
> >
> > Best regards,
> > Vincenzo
> >
> >
> > --
> > Vincenzo D'Amore
> > email: v.dam...@gmail.com
> > skype: free.dev
> > mobile: +39 349 8513251
> >
>



-- 
Vincenzo D'Amore
email: v.dam...@gmail.com
skype: free.dev
mobile: +39 349 8513251


RE: Import html data in mysql and map schemas using only SolrCELL+TIKA+DIH [scottchu]

2016-05-24 Thread Markus Jelsma
Hello - did you find this manual page? It explains how HTML can be uploaded.

https://cwiki.apache.org/confluence/display/solr/Uploading+Data+with+Solr+Cell+using+Apache+Tika

Markus

 
 
-Original message-
> From:scott.chu 
> Sent: Tuesday 24th May 2016 7:48
> To: solr-user 
> Subject: Re: Import html data in mysql and map schemas using only 
> SolrCELL+TIKA+DIH [scottchu]
> 
> Can anyone show me an example or a short how-to? I intend to use 
> Solr 5 or later to carry it out.
> 
> 
> scott.chu,scott@udngroup.com
> 2016/5/24 (週二)
> - Original Message - 
> From: scott(自己) 
> To: solr-user 
> CC: 
> Date: 2016/5/20 (週五) 14:17
> Subject: Import html data in mysql and map schemas using only 
> SolrCELL+TIKA+DIH [scottchu]
> 
> 
> 
> I have a mysql table with over 300M blog articles. The records are in html 
> format. Is it possible to import these records using only Solr CELL+TIKA+DIH 
> to some Solr with schema? I mean when importing, I can map schema on mysql to 
> schema in Solr? 
> 
> scott.chu,scott@udngroup.com 
> 2016/5/20 (週五) 
> 
> 
> 


Re(2): Import html data in mysql and map schemas using onlySolrCELL+TIKA+DIH [scottchu]

2016-05-24 Thread scott.chu

I read that but can't quite understand the steps! That's why I'm asking for help here.

scott.chu,scott@udngroup.com
2016/5/24 (Tue)


Re: trying DIH but get 'Sorry, no dataimport-handler defined!'

2016-05-24 Thread scott.chu

I happened to find the problem. It seems to come from the HTML page that 
shows the DIH function. I use the Maxthon browser, which has a function to 
switch between IE mode and non-IE mode (actually the WebKit engine). I happened 
to switch back to non-IE mode, and the error message is gone and everything is 
OK now! TOO WEIRD!


scott.chu,scott@udngroup.com
2016/5/24 (Tue)


Caused by: java.sql.SQLException: Unknown character set index for field '224' received from server. when do DIH [scottchu]

2016-05-24 Thread scott.chu

I tested a table with only 2 records. The table's properties are: MyISAM, utf8 
character set, utf8_general_ci collation. The contents are HTML source with 
Chinese characters. I didn't specify any schema.xml but only set up 
db-data-config.xml as follows:

  <dataConfig>
    ... (the dataSource and entity definitions were lost when this message was archived) ...
  </dataConfig>

When I do DIH, the solr.log shows:

"Unable to execute SELECT * FROM ..." 
...
...
...
Caused by: java.sql.SQLException: Unknown character set index for field '224'.

What does this error message mean?

scott.chu,scott@udngroup.com
2016/5/24 (Tue)


[Solr 6] Legacy faceting Term Enum method VS DocValues

2016-05-24 Thread Alessandro Benedetti
Hi guys,
I have been thinking about this for a while, and yesterday I took a look
at the code:

I was wondering if the termEnum approach is still a valid alternative to
docValues when we have low-cardinality fields.

The reason I am asking is that yesterday I ran into this piece of
code:

org/apache/solr/request/SimpleFacets.java:448

if (method == FacetMethod.ENUM && sf.hasDocValues()) {
  // only fc can handle docvalues types
  method = FacetMethod.FC;
}

So it seems that, if you enable docValues in the schema, they are always
going to be used, even if the method selected is term enum.

So does it mean, in case we have enough disk space, that it is always
advisable to use docValues now?

Of course I know that would be great to move as soon as possible to the new
json facet API approach.

P.S. still verifying the famous legacy facet degradation on latest Solr
compared to old Solr4.

Cheers
-- 
--

Benedetti Alessandro
Visiting card : http://about.me/alessandro_benedetti

"Tyger, tyger burning bright
In the forests of the night,
What immortal hand or eye
Could frame thy fearful symmetry?"

William Blake - Songs of Experience -1794 England


Re: Atomic updates and "stored"

2016-05-24 Thread Mark Robinson
Thanks Erick!

Best,
Mark

On Mon, May 23, 2016 at 1:35 PM, Erick Erickson 
wrote:

> Yes, currently when using Atomic updates _all_ fields
> have to be stored, except the _destinations_ of copyField
> directives.
>
> Yes, it will make your index bigger. The effects on speed are
> probably minimal though. The stored data is in your *.fdt and
> *.fdx segment files and is only referenced to pull
> the top N docs back; it's not referenced for _search_ at all.
>
> Coming Real Soon will be updateable DocValues, which may
> be what you really need.
>
> Best,
> Erick
>
> On Mon, May 23, 2016 at 6:13 AM, Mark Robinson 
> wrote:
> > Hi,
> >
> > I have some 150 fields in my schema out of which about 100 are dynamic
> > fields which I am not storing (stored="false").
> > In case I need to do an atomic update to one or two fields which belong
> to
> > the stored list of fields, do I need to change my dynamic fields (100 or
> so
> > now not "stored") to stored="true"?
> >
> > If so wouldn't it considerably increase index size and affect performance
> > in the negative?
> >
> > Is there any way currently to do partial/ atomic updates to one or two
> > fields (which I will make stored="true") without having to make my now
> > stored="false" fields to stored="true" just
> > to accommodate atomic updates.
> >
> > Could some one pls give your suggestions.
> >
> > Thanks!
> > Mark.
>
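An atomic update of the kind Erick describes can be sketched as a JSON update document; the core name, field names, and values below are illustrative assumptions:

```json
[
  {"id":    "doc1",
   "price": {"set": 99.0},
   "tags":  {"add": "sale"}}
]
```

POSTed to /solr/<core>/update with Content-Type: application/json, this rewrites the whole stored document under the hood, which is why all fields except copyField destinations must be stored for atomic updates to work.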


dynamicfields, copyfields, and multivalues

2016-05-24 Thread John Blythe
hi all,

i'm going mad over something that seems incredibly simple. in an attempt to
maintain some order to my growing data, i've begun to employ dynamicFields.
basic stuff here, just using *_s, *_d, etc. for my strings, doubles, and
other common datatypes.

i have these stored but not indexed. i'm then using copyFields to copy them
over to other fields. for instance, vendorItem_s would copy into
vendorItem_ng (ngram), vendorItem (a simple text_split), etc.

one of the fields keeps throwing an error on my import attempts. i'm
guessing it'll occur on other fields, too, but this one throws the first
tripwire. it's my uom field, unit of measure. it contains a single value
per row/doc, e.g. "EA" or "BX."

i get the following error on import: "multiple values encountered for non
multiValued copy field uom: EA"

i don't want this field to be multivalued and there is no reason it needs
to be. is this something innate to dynamicField or copyField?

thanks for any info-


Re: Import html data in mysql and map schemas using onlySolrCELL+TIKA+DIH [scottchu]

2016-05-24 Thread Scott Chu

Just to let everybody know: I use DIH + template (without Tika and Solr Cell; I 
really don't understand that part in the reference guide) to achieve what I want. 
But I still need to test more varied forms of HTML source.

Scott Chu,scott@udngroup.com
2016/5/24 (Tue)

p.s. There is really a lot of extensive, worthy stuff in Solr. If the project 
team could provide some "dictionary" of it, it would be a "Santa Claus" gift for 
us Solr users. Ha! Just a Christmas wish! Sigh! I know it's quite impossible. I 
would really like to study these features one after another and learn about all 
of them. However, Internet IT moves too fast to leave time to digest all of the 
great stuff in Solr.
- Original Message - 
From: scott.chu 
To: solr-user 
CC: 
Date: 2016/5/21 (週六) 03:39
Subject: Re: Import html data in mysql and map schemas using 
onlySolrCELL+TIKA+DIH [scottchu]



For this project, I intend to use Solr 5.5 or Solr 6. I know how to modify 
config to go back to use ClassicIndex, ie. manual schema.xml. 

Scott Chu,scott@udngroup.com 
2016/5/21 (週六) 
- Original Message - 
From: Siddhartha Singh Sandhu 
To: solr-user ; scott.chu 
CC: 
Date: 2016/5/21 (週六) 03:33 
Subject: Re: Import html data in mysql and map schemas using only 
SolrCELL+TIKA+DIH [scottchu] 


You will have to configure your schema.xml in Solr. 

What version are you using? 

On Fri, May 20, 2016 at 2:17 AM, scott.chu  wrote: 


> 
> I have a mysql table with over 300M blog articles. The records are in html 
> format. Is it possible to import these records using only Solr 
> CELL+TIKA+DIH to some Solr with schema? I mean when importing, I can map 

> schema on mysql to schema in Solr? 
> 
> scott.chu,scott@udngroup.com 
> 2016/5/20 (週五) 
> 







Re: Import html data in mysql and map schemas using onlySolrCELL+TIKA+DIH [scottchu]

2016-05-24 Thread Tom Evans
On Tue, May 24, 2016 at 3:06 PM, Scott Chu  wrote:
> p.s. There're really many many extensive, worthy stuffs in Solr. If the
> project team can provide some "dictionary" of them, It would be a "Santa 
> Claus"
> for we solr users. Ha! Just a X'mas wish! Sigh! I know it's quite not 
> possbile.
> I really like to study them one after another, to learn about all of them.
> However, Internet IT goes too fast to have time to congest all of the great
>  stuffs in Solr.

The reference guide is both extensive and also broadly informative.
Start from the top page and browse away!

https://cwiki.apache.org/confluence/display/solr/Apache+Solr+Reference+Guide

It's handy to keep the glossary nearby for any terms that you don't recognise:

https://cwiki.apache.org/confluence/display/solr/Solr+Glossary

Cheers

Tom


Re: dynamicfields, copyfields, and multivalues

2016-05-24 Thread John Blythe
never mind, the issue ended up being that i had the copyField for that uom
field in two places and hadn't realized it, doh!
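The failure mode John hit can be reproduced with a schema fragment like the following; the dynamic source name uom_s is an assumption based on his *_s convention, and the duplication is the bug:

```xml
<!-- Two copyField directives with the same destination: at index time Solr
     copies uom_s into uom twice, so the single-valued uom receives two values
     and the update fails with "multiple values encountered for non
     multiValued copy field uom: EA". -->
<copyField source="uom_s" dest="uom"/>
...
<copyField source="uom_s" dest="uom"/>  <!-- accidental duplicate -->
```

So the behavior is not innate to dynamicField or copyField; a single-valued destination simply cannot receive more than one copy per document.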

-- 
*John Blythe*
Product Manager & Lead Developer

251.605.3071 | j...@curvolabs.com
www.curvolabs.com

58 Adams Ave
Evansville, IN 47713



Re: Auto Suggestion in solr

2016-05-24 Thread Erick Erickson
Well, theoretically you could back-port the suggester stuff, but
I'd guess it would be very, very difficult. It'd be far easier to
upgrade to, say, 4.10.

But there's a different problem there. Until, I think 5.1, the
suggester would rebuild itself on startup, and that could
take a very long time. This was true even if you had
buildOnStartup=false.

I don't have anything off the top of my head for suggesters
that far back, but I do know people would use spellcheck
as one form of suggester.

Best,
Erick



Is it possible to pass parameters through solrconfig.xml ?

2016-05-24 Thread vitaly bulgakov
I need to pass a parameter to one of my searchComponent classes from the
solrconfig.xml file.
Please advise me how to do it, if it is possible.
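Solr hands any child elements of a component's definition to the component as init arguments. A sketch; the component class and parameter names are made up for illustration:

```xml
<searchComponent name="mycomponent" class="com.example.MySearchComponent">
  <!-- these typed parameters arrive in the component's init(NamedList) call -->
  <str name="myParam">some value</str>
  <int name="myLimit">10</int>
</searchComponent>
```

Inside the component, override init(NamedList args) and read the values with args.get("myParam").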





Re: Auto Suggestion in solr

2016-05-24 Thread Mugeesh Husain
Thank you so much for steering me away from upgrade trouble.

I have implemented auto suggester following this wiki page
https://wiki.apache.org/solr/Suggester.

But I am looking for results populated based on some ranking or
boosting (some business logic). How do I implement ranking-based auto-suggest
results? There is no such parameter in the WFSTLookupFactory class either.

Where should I add ranking or boosting logic in this kind of suggester?
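With the wiki-era Suggester, one place to inject a ranking is the dictionary itself: a file-based dictionary lets each term carry a precomputed weight, and the FST/WFST lookups rank suggestions by that weight. A sketch under those assumptions; the component name, field, and file path are illustrative:

```xml
<searchComponent class="solr.SpellCheckComponent" name="suggest">
  <lst name="spellchecker">
    <str name="name">suggest</str>
    <str name="classname">org.apache.solr.spelling.suggest.Suggester</str>
    <str name="lookupImpl">org.apache.solr.spelling.suggest.fst.WFSTLookupFactory</str>
    <!-- sourceLocation points at a dictionary file in the conf dir; each line
         is "term<TAB>weight", and suggestions are ranked by that weight -->
    <str name="sourceLocation">suggest-dictionary.txt</str>
    <str name="buildOnCommit">true</str>
  </lst>
</searchComponent>
```

The weights in that file are where business-logic boosts can be precomputed before each rebuild.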



Thanks





Re: [Solr 6] Migration from Solr 4.10.2

2016-05-24 Thread Alessandro Benedetti
Update: it seems clear I ran into the bad
https://issues.apache.org/jira/browse/SOLR-8096 :

Just adding some additional information, as I just hit the issue
with Solr 6.0:
Static index, around 50 * 10^6 docs, 20 fields to facet, 1 of them with high
cardinality, on top of grouping.
Grouping was not affecting it at all.

All the symptoms are there, Solr 4.10.2 around 150 ms and Solr 6.0 around
550 ms .
The 'fieldValueCache' seems to be unused (no inserts nor lookups) in Solr
6.0.
In Solr 4.10 the 'fieldValueCache' is in heavy use with a
cumulative_hitratio of 0.96 .
Switching from enum to fc to fcs to uif did not change that much.

Moving to docValues didn't improve the situation that much (but I was on
an optimized index, so I need to try the multi-segmented one, according
to Mikhail Khludnev's contribution in Solr 5.4.0).

Moving to field collapsing brought the query down to 110-120 ms (but this is
normal; we were faceting on 260 / 1 million original docs).
Adding facet.threads=NCores brought the query time down to 100 ms; in
combination with field collapsing we reached 80-90 ms when warmed.
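The facet.threads knob mentioned above is just a request parameter; a sketch of such a request, where the host, collection, and field names are illustrative assumptions:

```
http://localhost:8983/solr/mycollection/select?q=*:*&facet=true
    &facet.field=field1&facet.field=field2
    &facet.method=uif
    &facet.threads=4
```

facet.threads lets Solr compute the listed facet fields in parallel, which helps most when many fields are faceted in one request, as in the 20-field case described here.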

What are the plans for the future related to this?
Do we want to deprecate the legacy facet implementation and move
everything to JSON facets (like what happened with UIF)?
So backward compatible, but a different implementation?

I think migration should be a transparent process.


Cheers

On Mon, May 23, 2016 at 6:49 PM, Alessandro Benedetti <
benedetti.ale...@gmail.com> wrote:

> Furthermore I was checking the internals of the old facet implementation (
> which comes when using the classic request parameter based,  instead of the
> json facet). It seems that if you enable docValues even with the enun
> method passed as parameter , actually fc with docValues will be used.
> i will give some report on the performance we get with docValues.
>
> Cheers
> On 23 May 2016 16:29, "Joel Bernstein"  wrote:
>
>> If you can make min/max work for you instead of sort then it should be
>> faster, but I haven't spent time comparing the performance.
>>
>> But if you're using the top_fc with the min/max param the performance
>> between Solr 4 & Solr 6 should be very close as the data structures behind
>> them are the same.
>>
>>
>>
>>
>>
>>
>> Joel Bernstein
>> http://joelsolr.blogspot.com/
>>
>> On Mon, May 23, 2016 at 3:34 PM, Alessandro Benedetti <
>> abenede...@apache.org
>> > wrote:
>>
>> > Hi Joel,
>> > thanks for the reply, actually we were not using field collapsing
>> before,
>> > we basically want to replace grouping with that.
>> > The grouping performance between Solr 4 and 6 are basically comparable.
>> > It's surprising I got so big degradation with the field collapsing.
>> >
>> > So basically the comparison we did were based on the Solr4 queries ,
>> > extracted from logs, and modified slightly to include field collapsing
>> > parameter.
>> >
>> > To build the tests to compare Solr 4.10.2 to Solr 6 we basically
>> proceeded
>> > in this way :
>> >
>> > 1) install Solr 4.10.2 and Solr 6.0.0
>> > 2) migrate the index with the related lucene tool ( 4.10.2 -> 5.5.0 ->
>> Solr
>> > 6.0 )
>> > 3) switch on/off the 2 instances and repeating the tests both with cold
>> > instances and warm instances.
>> >
>> > This means that the query looks the same.
>> > I have not double checked the results but only the timings.
>> > I will provide additional feedback to see if the query are producing
>> > comparable results as well.
>> >
>> > Related your suggestion about the top_fc, thanks, I will try that .
>> > I actually discovered that a little bit after I posted the mailing list
>> ( I
>> > think exactly from another post of yours :) )
>> >
>> > Not sure if setting up docValues for the field we use to collapse could
>> > give some benefit as well.
>> >
>> > I keep you updated,
>> >
>> > Cheers
>> >
>> > On Mon, May 23, 2016 at 2:48 PM, Joel Bernstein 
>> > wrote:
>> >
>> > > Were you using the sort param or min/max param in Solr 4 to select the
>> > > group head? The sort work came later and I'm not sure how it compares
>> in
>> > > performance to the min/max param.
>> > >
>> > > Since you are collapsing on a string field you can use the top_fc hint
>> > > which will use a top level field cache for the collapse. This is
>> faster
>> > at
>> > > query time then the default which uses MultiDocValue ordinal map.
>> > >
>> > > The docs cover the top_fc hint.
>> > >
>> > >
>> >
>> https://cwiki.apache.org/confluence/display/solr/Collapse+and+Expand+Results
>> > >
>> > >
>> > >
>> > > Joel Bernstein
>> > > http://joelsolr.blogspot.com/
>> > >
>> > > On Mon, May 23, 2016 at 12:14 PM, Alessandro Benedetti <
>> > > abenede...@apache.org> wrote:
>> > >
>> > > > Let's add some additional details guys :
>> > > >
>> > > > 1) *Faceting*
>> > > > Currently the facet method used is "enum" and it runs over 20 fields
>> > more
>> > > > or less.
>> > > > Mainly using it on lo

Re: Solr mysql Json import

2016-05-24 Thread vsriram30
It looks like it is available through an HTTP POST request, as given in
https://lucidworks.com/blog/2014/08/12/indexing-custom-json-data/

Hence I assume a corresponding JSON data import from MySQL should also be
available. Can someone point me to related docs?

Thanks,
Sriram





Re: [Solr 6] Migration from Solr 4.10.2

2016-05-24 Thread Mikhail Khludnev
Alessandro,

I checked with Solr 6.0 distro on techproducts.
Faceting on cat with uif hits fieldValueCache
http://localhost:8983/solr/techproducts/select?facet.field=cat&facet.method=uif&facet=on&indent=on&q=*:*&wt=json

fieldValueCache
- class:org.apache.solr.search.FastLRUCache
- description:Concurrent LRU Cache(maxSize=1, initialSize=10,
minSize=9000, acceptableSize=9500, cleanupThread=false)
- src:
- version:1.0 stats:

   - cumulative_evictions:0
   - cumulative_hitratio:0.5
   - cumulative_hits:1
   - cumulative_inserts:2
   - cumulative_lookups:2
   - evictions:0
   - hitratio:0.5
   - hits:1
   - inserts:2
   - item_cat:
   
{field=cat,memSize=4665,tindexSize=46,time=28,phase1=27,nTerms=16,bigTerms=2,termInstances=21,uses=0}
   - lookups:2
   - size:1

Beware: for example, the field manu_exact doesn't hit the fieldValueCache, because
it is single-valued and goes to FacetFieldProcessorDV instead of
FacetFieldProcessorUIF. And cat is multivalued, so it hits UIF; see
org.apache.solr.search.facet.FacetField.createFacetProcessor(FacetContext);
it might help to just debug there.

In summary, uif works and you have a chance to hit it. Good luck!

On Tue, May 24, 2016 at 7:43 PM, Alessandro Benedetti <
benedetti.ale...@gmail.com> wrote:

> Update: it seems clear I ran into the bug
> https://issues.apache.org/jira/browse/SOLR-8096 :
>
> Just adding some additional information, as I just hit the issue
> with Solr 6.0 :
> Static index, around 50 * 10^6 docs, 20 fields to facet, 1 of them with high
> cardinality on top of grouping.
> Grouping was not a factor at all.
>
> All the symptoms are there, Solr 4.10.2 around 150 ms and Solr 6.0 around
> 550 ms .
> The 'fieldValueCache' seems to be unused (no inserts nor lookups) in Solr
> 6.0.
> In Solr 4.10 the 'fieldValueCache' is in heavy use with a
> cumulative_hitratio of 0.96 .
> Switching from enum to fc to fcs to uif did not change that much.
>
> Moving to DocValues didn't improve the situation that much (but I was on
> an optimized index, so I need to try the multi-segmented one, according
> to Mikhail Khludnev's contribution in Solr 5.4.0).
>
> Moving to field collapsing moved down the query to 110-120 ms ( but this is
> normal, we were faceting on 260 / 1 million original docs)
> Adding facet.threads=NCores moved down the queryTime to 100 ms, in
> combination with field collapsing we reached 80-90 ms when warmed.
>
> What are the plans for the future related to this?
> Do we want to deprecate the legacy facets implementation and move
> everything to JSON facets (like it happened with UIF)?
> So backward compatible but with a different implementation?
>
> I think migrations should be a transparent process.
>
>
> Cheers
>
> On Mon, May 23, 2016 at 6:49 PM, Alessandro Benedetti <
> benedetti.ale...@gmail.com> wrote:
>
> > Furthermore I was checking the internals of the old facet implementation
> (
> > which comes when using the classic request parameter based,  instead of
> the
> > json facet). It seems that if you enable docValues even with the enum
> > method passed as parameter , actually fc with docValues will be used.
> > i will give some report on the performance we get with docValues.
> >
> > Cheers
> > On 23 May 2016 16:29, "Joel Bernstein"  wrote:
> >
> >> If you can make min/max work for you instead of sort then it should be
> >> faster, but I haven't spent time comparing the performance.
> >>
> >> But if you're using the top_fc with the min/max param the performance
> >> between Solr 4 & Solr 6 should be very close as the data structures
> behind
> >> them are the same.
> >>
> >>
> >>
> >>
> >>
> >>
> >> Joel Bernstein
> >> http://joelsolr.blogspot.com/
> >>
> >> On Mon, May 23, 2016 at 3:34 PM, Alessandro Benedetti <
> >> abenede...@apache.org
> >> > wrote:
> >>
> >> > Hi Joel,
> >> > thanks for the reply, actually we were not using field collapsing
> >> before,
> >> > we basically want to replace grouping with that.
> >> > The grouping performance between Solr 4 and 6 are basically
> comparable.
> >> > It's surprising I got so big degradation with the field collapsing.
> >> >
> >> > So basically the comparison we did were based on the Solr4 queries ,
> >> > extracted from logs, and modified slightly to include field collapsing
> >> > parameter.
> >> >
> >> > To build the tests to compare Solr 4.10.2 to Solr 6 we basically
> >> proceeded
> >> > in this way :
> >> >
> >> > 1) install Solr 4.10.2 and Solr 6.0.0
> >> > 2) migrate the index with the related lucene tool ( 4.10.2 -> 5.5.0 ->
> >> Solr
> >> > 6.0 )
> >> > 3) switch on/off the 2 instances and repeating the tests both with
> cold
> >> > instances and warm instances.
> >> >
> >> > This means that the query looks the same.
> >> > I have not double checked the results but only the timings.
> >> > I will provide additional feedback to see if the query are producing
> >> > comparable results as well.
> >> >
> >> > R

Re: [Solr 6] Migration from Solr 4.10.2

2016-05-24 Thread Alessandro Benedetti
Mikhail, you have been really helpful!

On Tue, May 24, 2016 at 9:38 PM, Mikhail Khludnev <
mkhlud...@griddynamics.com> wrote:

> Alessandro,
>
> I checked with Solr 6.0 distro on techproducts.
> Faceting on cat with uif hits fieldValueCache
>
> http://localhost:8983/solr/techproducts/select?facet.field=cat&facet.method=uif&facet=on&indent=on&q=*:*&wt=json
>
> fieldValueCache
> - class:org.apache.solr.search.FastLRUCache
> - description:Concurrent LRU Cache(maxSize=1, initialSize=10,
> minSize=9000, acceptableSize=9500, cleanupThread=false)
> - src:
> - version:1.0 stats:
>
>- cumulative_evictions:0
>- cumulative_hitratio:0.5
>- cumulative_hits:1
>- cumulative_inserts:2
>- cumulative_lookups:2
>- evictions:0
>- hitratio:0.5
>- hits:1
>- inserts:2
>- item_cat:
>
>  
> {field=cat,memSize=4665,tindexSize=46,time=28,phase1=27,nTerms=16,bigTerms=2,termInstances=21,uses=0}
>- lookups:2
>- size:1
>
> Beware: for example, the field manu_exact doesn't hit the field value cache, because
> it is single valued and goes to FacetFieldProcessorDV instead of
> FacetFieldProcessorUIF.  And cat is multivalued and hits UIF.


That makes complete sense!
I think the query I was debugging today contained only single-valued
fields.
On the other hand, the Solr 4.10.2 version I was testing had a schema
with the same fields but set multi-valued.

It seems to me that proceeding with UIF is the most reasonable approach
in my case, as it will automatically redirect to the proper method
depending on multi-valued/single-valued.
Today I was mainly testing with FCS (but I optimised the index in my
experiments, so basically FCS = FC).
Tomorrow I will try on a fresh, non-optimised index.
I have 3 additional questions:

1) Let's assume we set DocValues for the fields involved .
If some field is misconfigured (set as multivalued in the schema but actually
single valued), according to the code we are going to hit UIF. Is this going
to cause unnecessary usage of the FieldValueCache, and slowness in
comparison with the DV approach, which would have been the correct algorithm to apply?

2) Thanks to facet.threads I got a huge benefit on a single query with
FC. Should I expect to see even more benefit if I have a segmented index?
(Today I was playing with an optimised one.)

3) In my experiments today, in Solr 4.10.2 I was getting better results
with the enum approach (the overall cardinality of the fields involved was
pretty low). Using the enum approach in Solr 6 with no DocValues was worse
in comparison to Solr 4 (we know that with the legacy facet approach, if
you set docValues and the field is multi-valued, we always redirect to DV).
This bit seems a little unrelated to the well-known bug, as to
my knowledge the enum approach should make massive use of the
filterCache, and the fieldValueCache should not be involved.
Do you know why the termEnum approach has been involved in the regression
in recent Solr releases?

Thank you very much again!

see
> org.apache.solr.search.facet.FacetField.createFacetProcessor(FacetContext)
> it might help to just debug there.
>
> In summary, uif works and you have a chance to hit it. Good luck!
>
> On Tue, May 24, 2016 at 7:43 PM, Alessandro Benedetti <
> benedetti.ale...@gmail.com> wrote:
>
> > Update: it seems clear I ran into the bug
> > https://issues.apache.org/jira/browse/SOLR-8096 :
> >
> > Just adding some additional information, as I just hit the issue
> > with Solr 6.0 :
> > Static index, around 50 * 10^6 docs, 20 fields to facet, 1 of them with
> high
> > cardinality on top of grouping.
> > Grouping was not a factor at all.
> >
> > All the symptoms are there, Solr 4.10.2 around 150 ms and Solr 6.0 around
> > 550 ms .
> > The 'fieldValueCache' seems to be unused (no inserts nor lookups) in Solr
> > 6.0.
> > In Solr 4.10 the 'fieldValueCache' is in heavy use with a
> > cumulative_hitratio of 0.96 .
> > Switching from enum to fc to fcs to uif did not change that much.
> >
> > Moving to DocValues didn't improve the situation that much (but I was on
> > an optimized index, so I need to try the multi-segmented one, according
> > to Mikhail Khludnev's contribution in Solr 5.4.0).
> >
> > Moving to field collapsing moved down the query to 110-120 ms ( but this
> is
> > normal, we were faceting on 260 / 1 million original docs)
> > Adding facet.threads=NCores moved down the queryTime to 100 ms, in
> > combination with field collapsing we reached 80-90 ms when warmed.
> >
> > What are the plans for the future related to this?
> > Do we want to deprecate the legacy facets implementation and move
> > everything to JSON facets (like it happened with UIF)?
> > So backward compatible but with a different implementation?
> >
> > I think migrations should be a transparent process.
> >
> >
> > Cheers
> >
> > On Mon, May 23, 2016 at 6:49 PM, Alessandro Benedetti <
> > b

Re: Adding information to Solr response in custom filter query code?

2016-05-24 Thread Chris Hostetter

: Is there any way a custom search component can access data created in custom
: post filter query so that the data can be added to the response?

Yes - this is exactly what the examples I mentioned in my 
previous message do -- as I said before...

>> Take a look at the CollapseQParser (which is used as a PostFilter) and 
>> the ExpandComponent (which modifies the results after QueryComponent 
>> has run).   


-Hoss
http://www.lucidworks.com/


Re: Is it possible to pass parameters through solrconfig.xml ?

2016-05-24 Thread Chris Hostetter

your question confuses me - pass "through" from where?  

when search components are defined in solrconfig.xml, they can be declared 
with any init params you want which will be passed to the init() method.   
Both the sample_techproducts_configs and data_driven_schema_configs that 
come with Solr show off examples of this (via SpellCheckComponent & 
QueryElevationComponent)

SearchComponents can also access any request params via the 
SolrQueryRequest (see ResponseBuilder.req).  These could also include 
default/invariant/appends params if they are defined on the requestHandler 
used (or in the recently added "initParams" option in solrconfig.xml)
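The two mechanisms Hoss describes can be sketched in solrconfig.xml roughly like this; the component name, class, and parameter names are made up for illustration, not taken from any shipped config:

```xml
<!-- Hypothetical component: init params declared here arrive in init(NamedList). -->
<searchComponent name="mycomponent" class="com.example.MyComponent">
  <str name="myParam">someValue</str>
  <int name="maxItems">10</int>
</searchComponent>

<!-- Request params (including defaults declared here) are readable at query
     time via rb.req.getParams() inside the component. -->
<requestHandler name="/select" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="my.request.param">default-value</str>
  </lst>
  <arr name="last-components">
    <str>mycomponent</str>
  </arr>
</requestHandler>
```

The init params cover configuration fixed at startup, while the defaults/invariants/appends route covers values that may vary per request.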


: Date: Tue, 24 May 2016 09:08:30 -0700 (MST)
: From: vitaly bulgakov 
: Reply-To: solr-user@lucene.apache.org
: To: solr-user@lucene.apache.org
: Subject: Is it possible to pass parameters through solrconfig.xml ?
: 
: I need to pass a parameter to one of my searchComponent classes from
: the solrconfig.xml file.
: Please advise me how to do it, if possible.
: 
: 
: 
: --
: View this message in context: 
http://lucene.472066.n3.nabble.com/Is-it-possible-to-pass-parameters-through-solrconfig-xml-tp4278852.html
: Sent from the Solr - User mailing list archive at Nabble.com.
: 

-Hoss
http://www.lucidworks.com/


debugging solr query

2016-05-24 Thread Jay Potharaju
Hi,
I am trying to debug Solr performance problems on an old version of Solr,
4.3.1.
The queries are taking really long, in the range of 2-5 seconds!
Running a filter query with only one condition also takes about a second.

There is memory available on the box for solr to use. I have been looking
at the following link but was looking for some more reference that would
tell me why a particular query is slow.

https://wiki.apache.org/solr/SolrPerformanceProblems

Solr version:4.3.1
Index size:128 GB
Heap:65 GB
Index size:75 GB
Memory usage:70 GB

Even though the available memory is high, not all of it is being used. I
would expect the complete index to be in memory, but it doesn't look like it
is. Any recommendations?

-- 
Thanks
Jay


Re: debugging solr query

2016-05-24 Thread Ahmet Arslan


Hi,

Is it the QueryComponent taking time?
Or other components?

Also make sure there is plenty of RAM for OS cache.

Ahmet

On Wednesday, May 25, 2016 1:47 AM, Jay Potharaju  wrote:



Hi,
I am trying to debug Solr performance problems on an old version of Solr,
4.3.1.
The queries are taking really long, in the range of 2-5 seconds!
Running a filter query with only one condition also takes about a second.

There is memory available on the box for solr to use. I have been looking
at the following link but was looking for some more reference that would
tell me why a particular query is slow.

https://wiki.apache.org/solr/SolrPerformanceProblems

Solr version:4.3.1
Index size:128 GB
Heap:65 GB
Index size:75 GB
Memory usage:70 GB

Even though the available memory is high, not all of it is being used. I
would expect the complete index to be in memory, but it doesn't look like it
is. Any recommendations?

-- 
Thanks
Jay


Re: debugging solr query

2016-05-24 Thread Erick Erickson
Try adding debug=timing, that'll give you an idea of what component is
taking all the time.
From there, it's "more art than science".

But you haven't given us much to go on. What is the query? Are you grouping?
Faceting on high-cardinality fields? Returning 10,000 rows?

Best,
Erick
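With debug=timing, the response carries a per-component breakdown under debug/timing. A small sketch of picking out the slowest component; the response below is hand-written for illustration (real responses have the same nesting but different components and values):

```python
import json

# Hand-written sample of the debug.timing section returned with debug=timing.
sample = json.loads("""
{
  "debug": {
    "timing": {
      "time": 1250.0,
      "prepare": {"time": 5.0, "query": {"time": 3.0}, "facet": {"time": 2.0}},
      "process": {"time": 1245.0, "query": {"time": 40.0},
                  "facet": {"time": 1200.0}, "debug": {"time": 5.0}}
    }
  }
}
""")

process = sample["debug"]["timing"]["process"]
# Skip the aggregate "time" entry; compare per-component times.
components = {name: v["time"] for name, v in process.items() if name != "time"}
slowest = max(components, key=components.get)
print(slowest, components[slowest])
```

In this made-up sample, faceting dominates the process phase, which is where you would then dig in.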

On Tue, May 24, 2016 at 4:52 PM, Ahmet Arslan  wrote:
>
>
> Hi,
>
> Is it QueryComponent taking time?
> Or other components?
>
> Also make sure there is plenty of RAM for OS cache.
>
> Ahmet
>
> On Wednesday, May 25, 2016 1:47 AM, Jay Potharaju  
> wrote:
>
>
>
> Hi,
> I am trying to debug solr performance problems on an old version of solr,
> 4.3.1.
> The queries are taking really long, in the range of 2-5 seconds!
> Running a filter query with only one condition also takes about a second.
>
> There is memory available on the box for solr to use. I have been looking
> at the following link but was looking for some more reference that would
> tell me why a particular query is slow.
>
> https://wiki.apache.org/solr/SolrPerformanceProblems
>
> Solr version:4.3.1
> Index size:128 GB
> Heap:65 GB
> Index size:75 GB
> Memory usage:70 GB
>
> Even though the available memory is high, not all of it is being used. I
> would expect the complete index to be in memory, but it doesn't look like it
> is. Any recommendations?
>
> --
> Thanks
> Jay


Re: ngroup for MLT results

2016-05-24 Thread Zheng Lin Edwin Yeo
Anyone has any information on this?

Regards,
Edwin

On 21 May 2016 at 14:50, Zheng Lin Edwin Yeo  wrote:

> Hi,
>
> Would like to check, is there a way to do ngrouping for MLT queries?
> I tried the following query with the MoreLikeThisHandler, but I could not
> get any ngroup results.
>
>
> http://localhost:8983/solr/collection1/mlt?q=testing&json.facet={ngroups:%22unique(signature)%22}&start=0
> 
>
> I can only get the ngroup results like this when I use the normal
> SearchHandler
>
>   "facets":{
> "count":15177209,
> "ngroups":9181621}}
>
>
> Regards,
> Edwin
>


Issues with coordinates in Solr during updating of fields

2016-05-24 Thread Zheng Lin Edwin Yeo
Hi,

I have an implementation of storing the coordinates in Solr during
indexing.
During indexing, I will only store the value in the field name ="gps". For
the field name = "gps_0_coordinate" and "gps_1_coordinate", the value will
be auto filled and indexed from the "gps" field.

   
   
   
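The schema lines above were stripped by the mail archiver; a typical LatLonType setup along the lines Edwin describes looks like the following. This is a reconstruction under assumptions, not Edwin's actual schema, and the attribute values may differ:

```xml
<!-- Hypothetical reconstruction: field and type names follow the usual
     LatLonType convention; actual attributes in the original schema may differ. -->
<field name="gps" type="location" indexed="true" stored="true"/>
<dynamicField name="*_coordinate" type="tdouble" indexed="true" stored="false"/>
<fieldType name="location" class="solr.LatLonType" subFieldSuffix="_coordinate"/>
```

For what it's worth, the error described below is commonly reported when the generated *_coordinate subfields are stored: an atomic update then re-submits the stored subfield values alongside the ones regenerated from "gps". Keeping the subfields stored="false" is the workaround usually suggested, though whether it applies here depends on the actual schema.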

But when I tried to do an update on any other fields in the index, Solr
will try to add another value in the "gps_0_coordinate" and
"gps_1_coordinate". However, as these 2 fields are not multi-Valued, it
will lead to an error:
multiple values encountered for non multiValued field gps_0_coordinate:
[1.0,1.0]

Does anyone know how we can solve this issue?

I am using Solr 5.4.0

Regards,
Edwin