Ok, thanks. I deduped the jarfiles and now only have the
solr-dataimporthandler-8.5.2.jar in server\lib folder.
The errors are now gone on the admin page.
But it also states "No cores available", and when I try to create a new core
`mytest` (whose files are already on my disk) I get
me really
weird problems, and I can see this issue being a result of that. For
most DIH uses, you only need the "solr-dataimporthandler-X.Y.Z.jar"
file. For some DIH use cases (but not most of them) you might also need
the extras jar.
Thanks,
Shawn
I'm migrating from solr v4.3.1 to v8.5.2 and using this guide:
https://lucene.apache.org/solr/guide/8_5/installing-solr.html
I can't get data-import handler to work. I wanted to create a new core and then
copy my old data-config fields into it to get me a quick start.
1. I ran command `solr crea
stored. Also note that cursorMark support was added a bit later to entity
processor, so if you are running a bit older version of Solr, you might not
have cursors - I've found it the hard way.

Emir
--
Monitoring - Log Management - Alerting - Anomaly Detection
Solr & Elasticsearch Consulting Support Training - http://sematext.com/

> On 27 Apr 2020, at 13:11, Bjarke Buur Mortensen wrote:
>
> Hi list,
Hi list,
Let's say I add a copyField to my solr schema, or change the analysis chain
of a field or some other change.
It seems to me to be an alluring choice to use a very simple
dataimporthandler to reindex all documents, by using a SolrEntityProcessor
that points to itself. I have just
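For reference, a minimal self-referencing data-config of the kind described above might look like this. This is only a sketch: the URL, core name and row count are placeholder values, and every field you want to carry over must be stored.

```xml
<dataConfig>
  <document>
    <!-- Reads documents back from this same core and re-indexes them.
         URL, core name and rows are placeholders. -->
    <entity name="reindex"
            processor="SolrEntityProcessor"
            url="http://localhost:8983/solr/mycore"
            query="*:*"
            rows="1000"
            fl="*"/>
  </document>
</dataConfig>
```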
Hi,
We have an old collection running on a very old solr version, 5.3.
Now, we have a need to update the url string inside db-data-config.xml for the
DataImportHandler.
I see that this version does not support downconfig and upconfig as well as
current versions do.
I was able to downconfig using zkcli.sh scripts. But I notice that zookeeper
is not storing all
Karl, what would you do if that own implementation stalls in GC, or knocks
Solr over?
On Thu, Feb 6, 2020 at 1:04 PM Karl Stoney
wrote:
> Spoke too soon, looks like it memory leaks. After about 1.3m the old gc
> times went through the roof and solr was almost unresponsive, had to
> abort. We'
Egor, would you mind sharing some best practices regarding cursorMark in
SolrEntityProcessor?
On Thu, Feb 6, 2020 at 1:04 PM Karl Stoney
wrote:
> Spoke too soon, looks like it memory leaks. After about 1.3m the old gc
> times went through the roof and solr was almost unresponsive, had to
> abo
Spoke too soon, looks like it leaks memory. After about 1.3m the old gc times
went through the roof and solr was almost unresponsive, had to abort. We're
going to write our own implementation to copy data from one core to another
that runs outside of solr.
On 06/02/2020, 09:57, "Karl Stoney"
I cannot believe how much of a difference that cursorMark and sort order made.
Previously it died about 800k docs, now we're at 1.2m without any slowdown.
Thank you so much
On 06/02/2020, 08:14, "Mikhail Khludnev" wrote:
Hello, Karl.
Please check these:
Hello, Karl.
Please check these:
https://lucene.apache.org/solr/guide/6_6/pagination-of-results.html#constraints-when-using-cursors
https://lucene.apache.org/solr/guide/6_6/uploading-structured-data-store-data-with-the-data-import-handler.html#solrentityprocessor
cursorMark="true"
Good luck.
On
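Put together, the cursor-based SolrEntityProcessor setup suggested in that reply might be sketched as follows. The URL and core name are hypothetical; cursorMark paging requires a sort on the uniqueKey field.

```xml
<!-- Sketch: deep paging through the source core with a cursor.
     cursorMark="true" must be paired with a sort on the uniqueKey. -->
<entity name="copy"
        processor="SolrEntityProcessor"
        url="http://localhost:8983/solr/source_core"
        query="*:*"
        sort="id asc"
        cursorMark="true"
        rows="1000"/>
```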
Hey All,
I'm trying to implement a simplistic reindex strategy to copy all of the data
out of one collection, into another, on a single node (no distributed queries).
It's approx 4 million documents, with an index size of 26gig. Based on your
experience, I'm wondering what people feel sensible
The DataImportHandler, an optional but popular module to pull in data from
databases and other sources, has a feature in which the whole DIH
configuration can come from a request's "dataConfig" parameter. The debug
mode of the DIH admin screen uses this to allow convenient debugging
> > dataimport?command=full-import&clean=false&commit=true
> >
> > If nevertheless nothing imported, please check the log
> > --
> > Vadim
> >
> > > -----Original Message-----
> > > From: Joakim Hansson [mailto:joakim.hansso...@gmail.com]
> > > Sent: Tuesday, February 12, 2019 12:47 PM
> > > To: solr-user@lucene.apache.org
> > > Subject: What's the deal with dataimporthandler overwriting indexes?
Hi Joakim,
This might not be what you expect but it is expected behaviour. When you do
clean=true, DIH will first delete all records. That is how it works in both M/S
and Cloud. The diff might be that you disabled replication or disabled auto
commits in your old setup so it is not visible. You c
Hi!
We are currently upgrading from solr 6.2 master slave setup to solr 7.6
running solrcloud.
I don't know if I've missed something really trivial, but every time I start
a full import (dataimport?command=full-import&clean=true&optimize=true) the
old index gets overwritten by the new import.
In 6.2
You very often get _much_ more detail in the Solr log rather than the
admin UI BTW
On Mon, Sep 10, 2018 at 8:50 AM Monique Monteiro
wrote:
>
> Hi Andrea,
>
> In fact, I had to add *logLevel="debug"* to the DIH configuration. Just
> checking "Debug" on console is not enough. I also checked "V
Hi Andrea,
In fact, I had to add *logLevel="debug"* to the DIH configuration. Just
checking "Debug" on console is not enough. I also checked "Verbose". Now,
a database exception (locked account) is shown.
Thanks!
Monique
On Mon, Sep 10, 2018 at 12:26 PM Andrea Gazzarini
wrote:
> I cannot gi
I cannot give you detailed instructions as I don't have in front of me a
Solr console with the dataimport enabled, but I remember that there's a
detailed section which reports a lot of information.
In the meantime: shooting in the dark, if the query is working and
"total rows fetched" = 0, the
This is shown in the section "Raw Debug-Response".
On Mon, Sep 10, 2018 at 12:20 PM Andrea Gazzarini
wrote:
> Hi Monique, this is the output; when you check the debug checkbox
> another section is printed
>
> Andrea
>
> On 10/09/2018 17:19, Monique Monteiro wrote:
> > Text:
> >
> > { "responseHe
Hi Monique, this is the output; when you check the debug checkbox
another section is printed
Andrea
On 10/09/2018 17:19, Monique Monteiro wrote:
Text:
{ "responseHeader": { "status": 0, "QTime": 463 }, "initArgs": [ "defaults",
[ "config", "data-cnpj-config.xml" ] ], "command": "full-import",
Text:
{ "responseHeader": { "status": 0, "QTime": 463 }, "initArgs": [ "defaults",
[ "config", "data-cnpj-config.xml" ] ], "command": "full-import", "mode":
"debug", "documents": [], "verbose-output": [], "status": "idle",
"importResponse": "", "statusMessages": { "Time Elapsed": "0:0:0.432", "To
Copy and paste the text of the error. Pictures of text aren’t very useful, even
when
they do make it through the mail reflector.
Also, expand the error (the small info button) to get a stack trace.
wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/ (my blog)
> On Sep
Hi Monique,
I think you cannot attach files / images, please post, if available, the
url of the image or a text description.
Andrea
On 10/09/2018 17:05, Monique Monteiro wrote:
Hi Andrea,
Solr console doesn't return a very different information even with
debug mode enabled:
image.png
On
Hi Andrea,
Solr console doesn't return much different information even with debug
mode enabled:
[image: image.png]
On Mon, Sep 10, 2018 at 12:00 PM Andrea Gazzarini
wrote:
> You can check the solr.log or the solr-console.log. Another option is to
> activate the debug mode in the Solr console
You can check the solr.log or the solr-console.log. Another option is to
activate the debug mode in the Solr console before running the data import.
Andrea
On 10/09/2018 16:57, Monique Monteiro wrote:
Hi all,
I have a data import handler configured with an Oracle SQL query which
works like a
Hi all,
I have an Oracle SQL query which works like a charm when run directly
against the database. However, when the same query is configured in
Solr's data import handler, nothing happens, and it returns:
"*Total Requests made to DataSource*": "1",
"*Total Rows Fetched*": "0",
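A data-config along these lines is the usual shape for a JDBC import (a hedged sketch: the driver URL, credentials, table and field names are invented). With Oracle it is also worth remembering that unquoted column names come back upper-cased, which can silently leave every field unmapped.

```xml
<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="oracle.jdbc.OracleDriver"
              url="jdbc:oracle:thin:@//dbhost:1521/ORCL"
              user="solr_user" password="secret"/>
  <document>
    <!-- Oracle returns unquoted column names in upper case, so the
         column attributes are upper-cased here. -->
    <entity name="doc" query="SELECT id, title FROM docs">
      <field column="ID" name="id"/>
      <field column="TITLE" name="title"/>
    </entity>
  </document>
</dataConfig>
```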
Hi Shawn,
Thank you very much for your response!
Solr DataImportHandler could not directly index MongoDB collections, so I
used the open source SolrMongoImporter project
(https://github.com/james75/SolrMongoImporter) on top of Solr DIH to
directly index data of MongoDB collections.
What
On 8/13/2018 8:56 AM, Wendy2 wrote:
> Hi Solr users: I encountered the following error when indexing MongoDB data
> by using Solr DataImportHandler: org.apache.solr.common.SolrException:
> TransactionLog doesn't know how to serialize class org.bson.types.ObjectId;
> try implementing ObjectResolver?
The b
Update:
I resolved this issue by checking key:value to convert ObjectId to String:

    if (value instanceof ObjectId) {
        map.put(key, value.toString());  // toString() already returns String
    } else {
        // ..
    }

A happy Solr user :-)
Hi Solr users,
I encountered the following error when indexing MongoDB data by using Solr
DataImportHandler:
org.apache.solr.common.SolrException: TransactionLog doesn't know how to
serialize class org.bson.types.ObjectId; try implementing ObjectResolver?
Is there any fix or workaround for this issue?
Original message
From: Mikhail Khludnev
Date: 5/27/18 3:23 PM (GMT-05:00)
To: solr-user
Subject: [EXTERNAL] Re: How to merge child documents using DataImportHandler
Hello, Abhijit.
Have you tried to drop
Hello, Abhijit.
Have you tried to drop some of child=true? They usually cause slicing to
separate documents, rather than default "merge to root" mode.
On Sun, May 27, 2018 at 9:48 PM, Abhijit Pawar
wrote:
>
> Hello,
>
> I am using DataImportHandler to index data from m
Hello,
I am using DataImportHandler to index data from mongoDB.
Here's what my data-source-config file looks like:
entityA (Root Entity) - *products*
entityB (child=true, pk=unique field) - *skus*
entityC - *attributevalues*
entityD - *attributenames*
en
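For illustration, the child="true" layout under discussion looks roughly like this when written against a SQL source (names and queries are invented; the Mongo importer follows the same entity nesting):

```xml
<document>
  <entity name="products" query="SELECT id, name FROM products">
    <field column="id" name="id"/>
    <!-- child="true" emits each sku as a nested child document
         instead of merging its fields into the product. -->
    <entity name="skus" child="true"
            query="SELECT sku_id, price FROM skus
                   WHERE product_id = '${products.id}'">
      <field column="sku_id" name="sku_id"/>
    </entity>
  </entity>
</document>
```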
DIH does string replacement
https://github.com/apache/lucene-solr/blob/8b9c2a3185d824a9aaae5c993b872205358729dd/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/SqlEntityProcessor.java#L73
Hard refactoring is required to make it use preparedStatement.
However there
On 5/2/2018 1:03 PM, Mike Konikoff wrote:
> Is there a way to configure the DataImportHandler to use bind variables for
> the entity queries? To improve database performance.
Can you clarify where these variables would come from and precisely what
you want to do?
From what I can tel
Is there a way to configure the DataImportHandler to use bind variables for
the entity queries? To improve database performance.
Thanks,
Mike
I'm getting incorrect reported time deltas on the admin console for
"indexing since" and "started". It looks like DIH is converting the last
start time to UTC:
Last Update: 09:57:15
Indexing completed. Added/Updated: 94078 documents. Deleted 0 documents.
(Duration: 06s)
Requests: 1 , Fet
Hi,
Struggling to import an XML containing an XSL transformation from
DataImport.
Do we need to run in Cloud mode for this ?
When I start solr in DIH mode, my other Cores are not visible.
1) My SolrConfig.XML has this:
rahul-data-config.xml
2) My rahul-data-config.xml looks
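For context, registering DIH in solrconfig.xml usually takes this shape (a sketch only; the lib path depends on the install layout, and rahul-data-config.xml is the config file named above):

```xml
<!-- Load the DIH contrib jar; the dir value depends on where
     Solr is installed. -->
<lib dir="${solr.install.dir:../../../..}/dist/"
     regex="solr-dataimporthandler-.*\.jar"/>

<requestHandler name="/dataimport"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">rahul-data-config.xml</str>
  </lst>
</requestHandler>
```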
On 1/3/2018 2:20 PM, Tech Id wrote:
I stumbled across https://wiki.apache.org/solr/DataImportHandler and found
it matching my needs exactly.
So I just wanted to confirm if it is an actively supported plugin, before I
start using it for production.
Are there any users who have had a good or a
hough, provides some
bits you should be aware of.
Best,
Erick
On Wed, Jan 3, 2018 at 1:20 PM, Tech Id wrote:
> Hi,
>
> I stumbled across https://wiki.apache.org/solr/DataImportHandler and found
> it matching my needs exactly.
> So I just wanted to confirm if it is an actively suppor
Hi,
I stumbled across https://wiki.apache.org/solr/DataImportHandler and found
it matching my needs exactly.
So I just wanted to confirm if it is an actively supported plugin, before I
start using it for production.
Are there any users who have had a good or a bad experience with DIH ?
Thanks
l] Re: Request return behavior when trigger
DataImportHandler updates
On 12/1/2017 9:37 AM, Nathan Friend wrote:
> When triggering the DataImportHandler to update via an HTTP request, I've
> noticed the handler behaves differently depending on how I specify the
> request parameters.
Hello,
When triggering the DataImportHandler to update via an HTTP request, I've
noticed the handler behaves differently depending on how I specify the request
parameters.
If I use query parameters, e.g.
http://localhost:8983/solr/mycore/dataimport?command=full-import, the HTTP
reque
has an idea why this NPE occurs it would be great. Do I
> perhaps have to add something to solrconfig.xml?
>
> Thanks,
> Birgit
>
>
>
> -Original Message-
> From: Jamie Jackson [mailto:jamieja...@gmail.com]
> Sent: Tuesday, September 19, 2017 6:54 PM
> To: s
To: solr-user@lucene.apache.org
Subject: [bulk]: [bulk]: Re: Dates and DataImportHandler
As far as I understood, you can use the locale so that DIH saves the last index
time for the given time zone and not for UTC. So if you set the locale
according to the timezone of your DB you don't need to convert dates for
comparison
To: solr-user@lucene.apache.org
Subject: [bulk]: Re: [bulk]: Dates and DataImportHandler
FWIW, I know mine worked, so maybe try:
I can't conceive of what the locale would possibly do when a dateFormat is
specified, so I omitted the attribute. (Maybe one can specify dateFormat *or*
locale -- it se
Hi folks,
My DB server is on America/Chicago time. Solr (on Docker) is running on
UTC. Dates coming from my (MariaDB) data source seem to get translated
properly into the Solr index without me doing anything special.
However when doing delta imports using last_index_time (
http://wiki.apache.org/
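One knob relevant to this thread is DIH's propertyWriter element, which controls how last_index_time is formatted when written to dataimport.properties. A hedged sketch (the driver, connection details, date format and locale are illustrative, not taken from the thread):

```xml
<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="org.mariadb.jdbc.Driver"
              url="jdbc:mariadb://dbhost:3306/mydb"
              user="solr" password="secret"/>
  <!-- Controls the format/locale used when persisting
       last_index_time; values here are placeholders. -->
  <propertyWriter type="SimplePropertiesWriter"
                  dateFormat="yyyy-MM-dd HH:mm:ss"
                  locale="en_US"/>
  <document>
    <entity name="item" query="SELECT id, title FROM item"/>
  </document>
</dataConfig>
```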
It is not working if a cron job is given. It is executing the other entities
as well. Is there any solution?
--
View this message in context:
http://lucene.472066.n3.nabble.com/DataImportHandler-full-import-of-a-single-entity-tp2258037p4349551.html
Sent from the Solr - User mailing list archive
Hello,
I am running *Solr 3.5* and using Data Import Handler. I am using the
following query -
Although the FULL Import is running fine but the delta import is having
trouble. Here is what I am experiencing -
1. Delta Imports are working in cumulative fashion - any increment
(delta) is t
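The usual delta-import wiring, for comparison, is sketched below with invented table and column names. Note that pk must match the case of the id column returned by deltaQuery.

```xml
<!-- deltaQuery finds changed ids since the last run;
     deltaImportQuery re-fetches each changed row by id. -->
<entity name="item" pk="id"
        query="SELECT id, title FROM item"
        deltaQuery="SELECT id FROM item
                    WHERE last_modified &gt; '${dataimporter.last_index_time}'"
        deltaImportQuery="SELECT id, title FROM item
                          WHERE id = '${dih.delta.id}'"/>
```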
On 4/1/2017 4:17 PM, marotosg wrote:
> I am trying to load a big table into Solr using DataImportHandler and Mysql.
> I am getting OutOfMemory error because Solr is trying to load the full
> table. I have been reading different posts and tried batchSize="-1".
> https:
Hello, Sergio.
Have you tried Integer.MIN_VALUE (-2147483648)? See
https://dev.mysql.com/doc/connector-j/5.1/en/connector-j-reference-implementation-notes.html
On Sun, Apr 2, 2017 at 1:17 AM, marotosg wrote:
> Hi,
>
> I am trying to load a big table into Solr using DataImportHa
Hi,
I am trying to load a big table into Solr using DataImportHandler and Mysql.
I am getting OutOfMemory error because Solr is trying to load the full
table. I have been reading different posts and tried batchSize="-1".
https://wiki.apache.org/solr/DataImportHandlerFaq
Do you hav
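For MySQL the batchSize="-1" setting belongs on the dataSource element; DIH passes it through as the JDBC fetchSize, which Connector/J interprets as Integer.MIN_VALUE, i.e. stream rows one at a time instead of loading the whole table. A sketch with placeholder connection details:

```xml
<dataSource type="JdbcDataSource"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://dbhost:3306/mydb"
            user="solr" password="secret"
            batchSize="-1"/>
```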
the entire text file as it is, I am able to achieve
this in Solr Standalone
For testing my code in SolrCloud I just kept a small file with 3 characters in
it , Solr does not throw any error but also not indexing the file
I tried below approaches
1. Issue with Dataimporthandler -- Zookeeper is
> Thank you I will follow Erick's steps
> BTW I am also trying to ingesting using Flume , Flume uses Morphlines along
> with Tika
> Even Flume SolrSink will have the same issue?
Yes, when using Tika you run the risk of it choking on a document, eating CPU
and/or RAM until everything dies. This i
On 2/8/2017 9:08 AM, Anatharaman, Srinatha (Contractor) wrote:
> Thank you for your reply
> Other archive message you mentioned is posted by me only
> I am new to Solr, When you say process outside Solr program. What exactly I
> should do?
>
> I am having lots of text document which I need to inde
document
I was able to successfully do this in standalone Solr.
-Original Message-
From: Allison, Timothy B. [mailto:talli...@mitre.org]
Sent: Wednesday, February 08, 2017 1:56 PM
To: solr-user@lucene.apache.org
Subject: RE: DataImportHandler - Unable to load Tika Config Processing
> It is *strongly* recommended to *not* use the Tika that's embedded within
> Solr, but instead to do the processing outside of Solr in a program of your
> own and index the results.
+1
http://mail-archives.apache.org/mod_mbox/lucene-solr-user/201601.mbox/%3CBY2PR09MB11210EDFCFA297528940B07C
Solr?
Regards,
~Sri
-Original Message-
From: Shawn Heisey [mailto:apa...@elyograg.org]
Sent: Wednesday, February 08, 2017 9:46 AM
To: solr-user@lucene.apache.org
Subject: Re: DataImportHandler - Unable to load Tika Config Processing Document
# 1
On 2/6/2017 3:45 PM, Anatharaman, Srinatha
On 2/6/2017 3:45 PM, Anatharaman, Srinatha (Contractor) wrote:
> I am having below error while trying to index using dataImporthandler
>
> Data-Config file is mentioned below. zookeeper is not able to read
> "tikaConfig.xml" on below statement
>
> processor="TikaEntityProcessor" tikaConfig="tikaConfig.xml"
Hi,
I am having below error while trying to index using dataImporthandler
Data-Config file is mentioned below. zookeeper is not able to read
"tikaConfig.xml" on below statement
processor="TikaEntityProcessor" tikaConfig="tikaConfig.xml"
Please help
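A typical shape for this kind of config is sketched below (baseDir, file pattern and field names are placeholders): the TikaEntityProcessor sits inside a FileListEntityProcessor, and tikaConfig must resolve to a file the core's resource loader (ZooKeeper, in cloud mode) can actually see.

```xml
<dataConfig>
  <dataSource type="BinFileDataSource"/>
  <document>
    <entity name="files" processor="FileListEntityProcessor"
            baseDir="/data/docs" fileName=".*\.txt"
            rootEntity="false" dataSource="null">
      <!-- Each file found above is parsed by Tika. -->
      <entity name="doc" processor="TikaEntityProcessor"
              url="${files.fileAbsolutePath}" format="text">
        <field column="text" name="content"/>
      </entity>
    </entity>
  </document>
</dataConfig>
```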
Thanks a lot Shawn.
Regards,
Prateek Jain
-Original Message-
From: Shawn Heisey [mailto:apa...@elyograg.org]
Sent: 23 December 2016 01:36 PM
To: solr-user@lucene.apache.org
Subject: Re: DataImportHandler | Query | performance
On 12/23/2016 5:15 AM, Prateek Jain J wrote:
> We n
similar
> pattern.
>
> 2. This causes SOLR to hang and cause OOM in some cases due to too
> many FileDescriptors opened (sometimes, due to other issues)
>
> We would like to know if using DataImportHandler gives us any advantage? I
> just gave a quick glance on Solr Wiki
(sometimes, due to other issues)
We would like to know if using DataImportHandler gives us any advantage? I just
gave a quick glance at the Solr Wiki but it is not clear if it offers any
advantages in terms of performance (in this scenario).
Regards,
Prateek Jain
Hi All,
I am facing an issue with a custom EntityProcessor for DataImportHandler.
*My Requirement:*
My requirement is to process a main file along with its associated chunk
files (child files) placed in a folder. The file-related information is
part of a JSON file plac
Seem you might be right, according to the source:
https://github.com/apache/lucene-solr/blob/master/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java#L662
Sometimes, the magic (and schemaless is rather magical) fails when
combined with older assumptions
Hi Alex,
thanks for your answer.

Yes my solrconfig.xml contains the add-unknown-fields-to-the-schema.

I created my core using this command:

curl http://192.168.99.100:8999/solr/admin/cores?action=CREATE&name=solrexchange&instanceDir=/opt/solr/server/solr/solrexchange&configSet=data_driven_schema_configs_custom

I am using the example configset data_driven_schema_configs and I simply
added:

regex="so
a because it is designed to work
> with only extracting relevant fields from the database even with
> 'select *' statement.
>
>
> Regards,
> Alex.
>
> Newsletter and resources for Solr beginners and intermediates:
> http://www.solr-start.com/
>
>
Hi,
It seems that using the DataImportHandler with an XPathEntityProcessor config
with a managed-schema setup only imports the id and version field.
data-config.xml:
processor="XPathEntityProcessor" stream="true" forEach="/p
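A fuller sketch of such a config, with invented paths and fields (with managed-schema it can help to map every wanted column explicitly):

```xml
<dataConfig>
  <dataSource type="FileDataSource" encoding="UTF-8"/>
  <document>
    <entity name="page" processor="XPathEntityProcessor"
            stream="true" forEach="/pages/page"
            url="/data/pages.xml">
      <!-- Explicit field mappings from XPath to schema fields. -->
      <field column="id" xpath="/pages/page/id"/>
      <field column="title" xpath="/pages/page/title"/>
    </entity>
  </document>
</dataConfig>
```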
ish it?
>
> Thanks a lot in advance.
>
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/DataImportHandler-Automatic-scheduling-of-delta-imports-in-Solr-in-windows-7-tp4130565.html
> Sent from the Solr - User mailing list archive at Nabble.com.
a mileage can vary
http://blog.griddynamics.com/2015/07/how-to-import-structured-data-into-solr.html
On Fri, Jan 22, 2016 at 8:29 PM, Brian Narsi wrote:
> What are the various ways DataImportHandler can be scaled?
>
> Thanks
>
--
Sincerely yours
Mikhail Khludnev
Principal En
On 1/22/2016 10:29 AM, Brian Narsi wrote:
What are the various ways DataImportHandler can be scaled?
I'm not very familiar with how DIH interacts with SolrCloud. I know you
can use it with SolrCloud, but nothing else. Assuming you're not
running SolrCloud, the following inform
What are the various ways DataImportHandler can be scaled?
Thanks
> Hi,
> I'm trying to import my data from an sql database using the
> dataimporthandler. For some nested entity I want to use the cache to cache
> the result of my stored procedure. My config looks like this
>
>
> >
> >
> >
>
Hi,
I'm trying to import my data from an sql database using the
dataimporthandler. For some nested entity I want to use the cache to cache
the result of my stored procedure. My config looks like this
>
>
>
>
>cacheLookup="pr
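The cached-child-entity pattern being described is usually spelled like this (a sketch; entity names and the query standing in for the stored procedure are placeholders):

```xml
<document>
  <entity name="product" query="SELECT id FROM product">
    <!-- The child result set is fetched once, cached in memory,
         then joined to each parent row via cacheLookup. -->
    <entity name="prices"
            processor="SqlEntityProcessor"
            query="SELECT product_id, price FROM price_view"
            cacheImpl="SortedMapBackedCache"
            cacheKey="product_id"
            cacheLookup="product.id"/>
  </entity>
</document>
```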
Hmmm, not entirely sure. It's perfectly reasonable to use the core
admin API, just
be careful with it especially for things like reload, it's pretty easy
to have your cluster
in an inconsistent state.
Looks like the collections RELOAD command sends requests out all
replicas at once.
Under the cov
Does the collection reload do a rolling reload of each node or does it do them
all at once? We were planning on using the core reload on each system, one at a
time. That would make sure the collection stays available.
I read the documentation, it didn’t say anything about that.
wunder
Walter Un
Please be very careful using the core admin UI for anything related to
SolrCloud. In fact, I try to avoid using it at all.
The reason is that it is very low-level, and it is very easy to use it
incorrectly. For instance, reloading a core in a multi-replica setup
(doesn't matter whether it's several
Mikhail,
I solved the problem: I had run putfile with the wrong path. /synonyms.txt
should be /configs/gettingstarted/synonyms.txt .
Regards,
Hangu
On Wed, Oct 21, 2015 at 4:17 PM, Hangu Choi wrote:
> Mikhail,
>
> I didn't understatnd that's what I need to do. thank you.
>
> but at the first moment, I am n
Mikhail,
I didn't understand that's what I needed to do. Thank you.
but at the first moment, I am not doing well..
I am testing to change configuration in solrcloud, through this command
./zkcli.sh -zkhost localhost:9983 -cmd putfile /synonyms.txt
/usr/local/solr-5.3.1-test/server/scripts/cloud-s
did you try something like
$> zkcli.sh -zkhost localhost:2181 -cmd putfile /solr.xml /path/to/solr.xml
?
On Mon, Oct 19, 2015 at 11:15 PM, hangu choi wrote:
> Hi,
>
> I am trying to start SolrCloud with embedded ZooKeeper.
>
> I know how to config solrconfig.xml and schema.xml, and other things
Hi,
I am trying to start SolrCloud with embedded ZooKeeper.
I know how to config solrconfig.xml and schema.xml, and other things for
data import handler.
but when I trying to config it with solrCloud, I don't know where to start.
I know there is no conf directory in SolrCloud because conf direct
Hello everyone,
I need to run the following query to import my index from a H2 database:
but if I start to full-import nothing happens. The last information from my log
file is the following:
[25.09.2015 20:06:24.418 INFO commitScheduler-11-thread-1
o.a.s.u.DirectUpdateHandler2.commit:548] star