Have you done what the message says and looked at your Solr log? If so,
what information is there?
> On Dec 23, 2020, at 5:13 AM, DINSD | SPAutores wrote:
Hi,
I'm trying to install the package "data-import-handler", since it was
discontinued from the core Solr distro.
https://github.com/rohitbemax/dataimporthandler
However, as soon as the first command is carried out
solr -c -Denable.packages=true
I get this screen in the web interface
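For reference, the usual Solr package-manager flow for a package like this looks roughly as follows (a sketch: the repository URL is left as a placeholder to be taken from the project's README, and the collection name is made up):

  # start Solr in cloud mode with the package manager enabled
  bin/solr start -c -Denable.packages=true

  # register the project's repository and install the package
  bin/solr package add-repo data-import-handler <repo-url-from-the-project-README>
  bin/solr package install data-import-handler

  # make the handler available to a collection
  bin/solr package deploy data-import-handler -collections mycollection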
Have you tried escaping that character?
> On Dec 18, 2020, at 2:03 AM, basel altameme wrote:
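For what it's worth, XML attribute values cannot contain a raw '<', so inside data-config.xml the operator has to be written with character entities; a sketch with made-up table and column names:

  <entity name="item"
          query="SELECT id, name FROM my_view WHERE status &lt;&gt; 'archived'"/>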
Dear,
While trying to import & index data from a MySQL DB custom view I am facing the
error below:
Data Config problem: The value of attribute "query" associated with an element
type "entity" must not contain the '<' character.
Please note that in my SQL statements I am using '<>' as an operator fo
Yes, absolutely to what Eric said. We goofed on news / release highlights
on how to communicate what's happening in Solr. From a Solr insider point
of view, we are "deprecating" because strictly speaking, the code isn't in
our codebase any longer. From a user point of view (the audience of news
You don’t need to abandon DIH right now…. You can just use the Github hosted
version…. The more people who use it, the better a community will form
around it! It’s a bit chicken and egg: since no one is actively discussing
it, submitting PRs etc., it may languish. If you use it, and
On 11/29/2020 10:32 AM, Erick Erickson wrote:
And I absolutely agree with Walter that the DB is often where
the bottleneck lies. You might be able to
use multiple threads and/or processes to query the
DB if that’s the case and you can find some kind of partition
key.
IME the difficult part has
If you like Java instead of Python, here’s a skeletal program:
https://lucidworks.com/post/indexing-with-solrj/
It’s simple and single-threaded, but could serve as a basis for
something along the lines that Walter suggests.
And I absolutely agree with Walter that the DB is often where
the bottleneck lies.
I recommend building an outboard loader, like I did a dozen years ago for
Solr 1.3 (before DIH) and did again recently. I’m glad to send you my Python
program, though it reads from a JSONL file, not a database.
Run a loop fetching records from a database. Put each record into a synchronized
(thread-safe)
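A minimal SolrJ sketch of that producer/consumer pattern (not Walter's actual program; the JDBC URL, table, field names, core name, batch size and thread count are all made up): one thread streams rows out of the database into a bounded queue, and a few worker threads drain it and send batches to Solr.

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;
  import java.util.ArrayList;
  import java.util.List;
  import java.util.concurrent.ArrayBlockingQueue;
  import java.util.concurrent.BlockingQueue;
  import java.util.concurrent.ExecutorService;
  import java.util.concurrent.Executors;
  import java.util.concurrent.TimeUnit;

  import org.apache.solr.client.solrj.impl.HttpSolrClient;
  import org.apache.solr.common.SolrInputDocument;

  public class OutboardLoader {
    // sentinel document that tells a worker thread to stop
    private static final SolrInputDocument POISON = new SolrInputDocument();

    public static void main(String[] args) throws Exception {
      BlockingQueue<SolrInputDocument> queue = new ArrayBlockingQueue<>(10_000);
      int workers = 4;
      ExecutorService pool = Executors.newFixedThreadPool(workers);

      // consumers: drain the queue and send documents to Solr in batches
      for (int i = 0; i < workers; i++) {
        pool.submit(() -> {
          try (HttpSolrClient solr =
                   new HttpSolrClient.Builder("http://localhost:8983/solr/mycore").build()) {
            List<SolrInputDocument> batch = new ArrayList<>();
            while (true) {
              SolrInputDocument doc = queue.take();
              if (doc == POISON) break;
              batch.add(doc);
              if (batch.size() >= 1000) {
                solr.add(batch);
                batch.clear();
              }
            }
            if (!batch.isEmpty()) solr.add(batch);
          }
          return null;
        });
      }

      // producer: stream rows out of the database and turn them into documents
      try (Connection con =
               DriverManager.getConnection("jdbc:mysql://localhost:3306/test", "user", "pass");
           Statement st = con.createStatement();
           ResultSet rs = st.executeQuery("SELECT id, name FROM emp")) {
        while (rs.next()) {
          SolrInputDocument doc = new SolrInputDocument();
          doc.addField("id", rs.getString("id"));
          doc.addField("name", rs.getString("name"));
          queue.put(doc);
        }
      }

      // tell the workers to finish, wait for them, then commit once
      for (int i = 0; i < workers; i++) queue.put(POISON);
      pool.shutdown();
      pool.awaitTermination(1, TimeUnit.HOURS);
      try (HttpSolrClient solr =
               new HttpSolrClient.Builder("http://localhost:8983/solr/mycore").build()) {
        solr.commit();
      }
    }
  }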
I went through the same stages of grief that you are about to start
but (luckily?) my core dataset grew some weird cousins and we ended up
writing our own indexer to join them all together/do partial
updates/other stuff beyond DIH. It's not difficult to upload docs but
is definitely slower so far.
On 11/28/2020 5:48 PM, matthew sporleder wrote:
... The bottom of
that github page isn't hopeful however :)
Yeah, "works with MariaDB" is a particularly bad way of saying "BYO JDBC
JAR" :)
It's a more general question though: what is the path forward for users
with data in two places?
https://solr.cool/#utilities -> https://github.com/rohitbemax/dataimporthandler
You can import it in the many new/novel ways to add things to a solr
install and it should work like always (apparently). The bottom of
that github page isn't hopeful however :)
On Sat, Nov 28, 2020 at 5:21 PM Dmitri
Hi all,
trying to set up solr-8.7.0, contrib/dataimporthandler/README.txt says
this module is deprecated as of 8.6 and scheduled for removal in 9.0.
How do we pull data out of our relational database in 8.7+?
TIA
Dima
Hello, James.
DataImportHandler has a lock preventing concurrent execution. If you need
to run several imports in parallel on the same core, you need to duplicate
the "/dataimport" handler definition in solrconfig.xml. Then you can run them
in parallel. Regarding schema, I prefer the latter but mile
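A minimal sketch of what that duplication looks like in solrconfig.xml (the config file names are made up; each handler points at its own DIH config):

  <requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
    <lst name="defaults">
      <str name="config">dih-config-a.xml</str>
    </lst>
  </requestHandler>

  <requestHandler name="/dataimport2" class="org.apache.solr.handler.dataimport.DataImportHandler">
    <lst name="defaults">
      <str name="config">dih-config-b.xml</str>
    </lst>
  </requestHandler>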
Hello, I'm new to the group here so please excuse me if I do not have the
etiquette down yet.
Is it possible to have multiple entities (customer configurable, up to 40
atm) in a DIH configuration to be imported at once? Right now I have
multiple root entities in my configuration but they get inde
We are doing an hourly data import to our index; one or two requests per day
fail with the message "A command is still running...".
1. Does it mean the data import did not happen for the last hour?
2. If you look at the "Full Dump Started" time, it shows an older date,
I set up Solr on Ubuntu 18.04 and installed Java from apt-get with default-jre,
which installed version 11. So after a day of trying to make my Microsoft SQL
Server data import handler work and failing, I built a new VM and installed JRE
8 and then everything works perfectly.
The root of the
Hello everyone,
We are using Solr (7.1) in cloud mode and trying to get data from a Cassandra
source. Can't import data from Cassandra.
In the error logs:
Full Import
failed:org.apache.solr.handler.dataimport.DataImportHandlerException: Unable to
PropertyWriter implementation:SimplePropertiesWriter
Ok, what that means is you're letting Solr do its best to figure out
what fields you should have in the schema and how they're defined.
Almost invariably, you can do better by explicitly defining the fields
you need in your schema rather than enabling add-unknown. It's
fine for getting started,
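For example, explicit definitions in the managed-schema replace what add-unknown would have guessed (the field names here are made up; the types come from the default configset):

  <field name="title"         type="text_general" indexed="true" stored="true"/>
  <field name="price"         type="pfloat"       indexed="true" stored="true"/>
  <field name="last_modified" type="pdate"        indexed="true" stored="true"/>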
Hello, I managed to fix the problem. I'm using Solr 7.5.0. My problem was
that in the server logs I got "This IndexSchema is not mutable" (I did not
know about the logs folder, so I just found out 5 minutes ago). I fixed it
by modifying solrconfig.xml to "${update.autoCreateFields:false}" in the
add-unknown-fields update processor chain.
Which version of Solr is it? Because we have not used schema.xml for a
very long time. It has been managed-schema instead.
Also, have you tried using the DIH example that uses a database and
modifying it just enough to read data from your database. Even if it
has a lot of extra junk, this would test hal
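If it helps, 7.x releases still ship that example; it can be started with the command below, and the database configuration then lives under example/example-DIH/solr/db/conf/ (paths from memory, so double-check against your install):

  bin/solr -e dih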
Hello, I managed to set up a connection to my SQL Server to import data into
Solr. The idea is to import filetables, but for now I first want to get it
working using regular tables. So I created
*data-config.xml*
*schema.xml*
I added
Subject: Re: data-import-handler for solr-7.5.0
Ok, so then you can switch to debug mode and keep trying to figure it out. Also
try BinFileDataSource or URLDataSource, maybe it will have an easier way.
Or using relative path (example:
https://github.com/arafalov/solr-apachecon2018-presentation/blob
-----Original message-----
From: Alexandre Rafalovitch
Sent: 2 October 2018 18:18
To: solr-user
Subject: Re: data-import-handler for solr-7.5.0
The Admin UI for DIH will show you the config file it read. So, if nothing is there,
the path is most likely the issue.
You can also provide or update
"Total Documents Skipped":"0",
"Full Dump Started":"2018-10-02 16:15:21",
"":"Indexing completed. Added/Updated: 0 documents. Deleted 0 documents.",
"Committed":"2018-10-02 16:15:22",
"Time taken":
Hi,
I am having some problems getting the data-import-handler in Solr to work. I
have tried a lot of things but I simply get no response from Solr, not even an
error.
When calling the API:
http://localhost:8983/solr/nh/dataimport?command=full-import
{
"responseHeader":{
port Training - http://sematext.com/
> On 12 Sep 2018, at 05:53, Zimmermann, Thomas
> wrote:
>
> We have a Solr v7 Instance sourcing data from a Data Import Handler with a
> Solr data source running Solr v4. When it hits a single server in that
> instance directly, all document
We have a Solr v7 Instance sourcing data from a Data Import Handler with a Solr
data source running Solr v4. When it hits a single server in that instance
directly, all documents are read and written correctly to the v7. When we hit
the load balancer DNS entry, the resulting data import handler
Thank you both for the responses. I was able to get the import working
through telnet, and I'll see if I can get the post utility working as that
seems like a better option.
Thanks,
Adam
On Mon, Aug 20, 2018, 2:04 PM Alexandre Rafalovitch
wrote:
> Admin UI just hits Solr for a particular URL wi
Admin UI just hits Solr for a particular URL with specific parameters.
You could totally call it from the command line, but it _would_ need
to be an HTTP client of some sort. You could encode all of the
parameters into the DIH (or a new) handler, it is all defined in
solrconfig.xml (/dataimport is
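As a last resort on a machine with no curl, even a raw request typed into telnet works, since the handler only needs a plain HTTP GET (host, port and core name here are made up; finish the request with an empty line):

  telnet localhost 8983
  GET /solr/mycore/dataimport?command=full-import HTTP/1.1
  Host: localhost:8983
  Connection: close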
Adam,
On 8/20/18 1:45 PM, Adam Blank wrote:
Hi,
I'm running Solr 5.5.0 on AIX, and I'm wondering if there's a way to import
the index from the command line instead of using the admin console? I
don't have the ability to use a HTTP client such as cURL to connect to the
console.
Thank you,
Adam
But in my case I see output as below: the *:* query returns the child documents
(values such as "IT", "Data", "ITI", "Entry") and the parent documents (such as
"omkar") as separate documents in one flat list, with the documents of each
block sharing a _version_ value.
This is what nested docs look like. These are document blocks with the parent at
the end. Block Join Queries work on these blocks.
On Wed, Aug 8, 2018 at 12:47 PM omp...@rediffmail.com <omkar.pra...@gmail.com> wrote:
Thanks a lot Mikhail. But as per the documentation below, nested document
ingestion is possible. Is this a limitation of DIH?
https://lucene.apache.org/solr/guide/6_6/uploading-data-with-index-handlers.html#UploadingDatawithIndexHandlers-NestedChildDocuments
Also can block join query be used to get exp
It never works the way you expect. You need to search for parents and then
hook up [child]. I see some improvements are coming, but for now that is how it is.
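For example, a query over blocks like the ones shown earlier that returns parents and re-attaches their children could look like this (the field marking parent documents and the child field are made up; "which" must match all parents and only parents):

  q={!parent which="content_type:parent"}dept_name:IT
  fl=*,[child parentFilter=content_type:parent]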
On Mon, Aug 6, 2018 at 9:11 PM omp...@rediffmail.com wrote:
Thanks Mikhail, verbose did help. The _root_ field was missing in the schema, and I
also made some changes in the child entity. For example, I created id as an alias
for emp_id (in the child query), which is the id column of the parent table.
DIH has debug & verbose modes. Have you tried using them?
On Mon, Aug 6, 2018 at 4:11 PM omp...@rediffmail.com
wrote:
> Thanks Mikhail, i tried changing conf but that did not help
>
>
> driver="com.mysql.jdbc.Driver"
> url="jdbc:mysql://localhost:3306/test"
>
Thanks Mikhail, i tried changing conf but that did not help
Hi, Omkar.
Could it happen that child docs as well as parents are implicitly assigned the same
"id" field values and removed due to a uniqueKey collision?
On Sat, Aug 4, 2018 at 10:12 PM omkar.pra...@gmail.com <
omkar.pra...@gmail.com> wrote:
> I am using similar db-data config as below for indexing t
I am using a db-data config similar to the one below for indexing this parent-child
data. Solr version 6.6.2.
SELECT id as emp_id, name FROM emp;
+--------+--------+
| emp_id | name   |
+--------+--------+
|      1 | omkar  |
|      2 | ashwin |
+--------+--------+
2 rows in set (0.00 sec)
select * fro
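If I remember the DIH child-document support correctly (available in 6.x), the nested entity is marked with child="true"; a sketch along those lines, reusing the emp table above and a made-up dept table (driver and URL as quoted earlier in the thread):

  <dataConfig>
    <dataSource driver="com.mysql.jdbc.Driver"
                url="jdbc:mysql://localhost:3306/test"
                user="user" password="pass"/>
    <document>
      <entity name="parent" query="SELECT id AS emp_id, name FROM emp">
        <field column="emp_id" name="id"/>
        <field column="name"   name="name"/>
        <!-- child="true" asks DIH to index these rows as nested child documents -->
        <entity name="child" child="true"
                query="SELECT dept_id, dept_name FROM dept WHERE emp_id='${parent.emp_id}'">
          <field column="dept_id"   name="dept_id"/>
          <field column="dept_name" name="dept_name"/>
        </entity>
      </entity>
    </document>
  </dataConfig>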
Hi,
I am trying to use Tika OCR (Tesseract) in the data import handler
and found that processing English documents was quite good.
But I am struggling to process other languages such as
Japanese, Chinese, etc...
So, I want to know how to switch Tesseract-OCR's processing
language via data i
Have you tried changing the log level?
https://lucene.apache.org/solr/guide/7_2/configuring-logging.html
--
Rahul Singh
rahul.si...@anant.us
Anant Corporation
On Jul 8, 2018, 8:54 PM -0500, Yasufumi Mizoguchi ,
wrote:
> Hi,
>
> I am trying to indexing files into Solr 7.2 using da
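Following up on the log-level suggestion above, the DIH logger can also be raised at runtime through the logging admin endpoint rather than by editing the log4j config (host and level here are examples):

  curl "http://localhost:8983/solr/admin/info/logging?set=org.apache.solr.handler.dataimport:DEBUG"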
Hi,
I am trying to index files into Solr 7.2 using the data import handler with
onError=skip option.
But, I am struggling with determining the skipped documents as logs do not
tell which file was bad.
So, how can I know those files?
Thanks,
Yasufumi
Which mysql jdbc connector version do I need?
-----Original message-----
From: msaunier [mailto:msaun...@citya.com]
Sent: Thursday, April 26, 2018 13:13
To: solr-user@lucene.apache.org
Subject: RE: SolrCloud DIH (Data Import Handler) MySQL 404
Hello,
Where do I add that? In the Solr start command
Subject: Re: SolrCloud DIH (Data Import Handler) MySQL 404
Can you share more log lines around this odd NPE?
It might be necessary to restart the JVM with -verbose:class and look through its
output to find why it can't load this class.
On Wed, Apr 25, 2018 at 11:42 AM, msaunier wrote:
> Hello
: Re: SolrCloud DIH (Data Import Handler) MySQL 404
On 4/24/2018 2:03 AM, msaunier wrote:
> If I access to the interface, I have a null pointer exception:
>
> null:java.lang.NullPointerException
> at
> org.apache.solr.handler.RequestHandlerBase.getVersion(RequestHandlerBa
> se
On 4/24/2018 2:03 AM, msaunier wrote:
> If I access to the interface, I have a null pointer exception:
> null:java.lang.NullPointerException
> at org.apache.solr.handler.RequestHandlerBase.getVersion(RequestHandlerBase.java:233)
The line of code where this exception occurred uses fundamenta
-----Original message-----
Sent: Tuesday, April 24, 2018 09:25
To: solr-user@lucene.apache.org
Subject: RE: SolrCloud DIH (Data Import Handler) MySQL 404
Hello Shawn,
Thanks for your answers.
So, indexation_events.xml fi
or-java":{
"name":"mysql-connector-java",
"version":1},
"data-import-handler":{
"name":"data-import-handler",
"version":1}},
"requestHandler":{"/test_dih":{
"n
> at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
> at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
> at
On 4/23/2018 8:30 AM, msaunier wrote:
> I have added debug:
> curl "http://srv-formation-solr:8983/solr/arguments_test/test_dih?command=full-import&commit=true&debug=true"
> 500588true1DIH/indexation_events.xml
This is looking like a really nasty error that I cannot understand,
possibly caused by a
On 4/23/2018 6:12 AM, msaunier wrote:
> I have a problem with DIH in SolrCloud. I don't understand why, so I need
> your help.
> Solr 6.6 in Cloud.
> ##
> COMMAND:
> curl http://srv-formation-solr:8983/solr/test_dih?command=full-import
> RESULT:
> Error 404 Not F
-----Original message-----
Sent: Monday, April 23, 2018 14:47
To: solr-user@lucene.apache.org
Subject: RE: SolrCloud DIH (Data Import Handler) MySQL 404
I have corrected the url to: curl
http://srv-formation-solr:8983/solr/arguments_test/test_dih?command=full-import
And changed the overlay config
"/configs/arguments_test/DIH/ind
Sent: Monday, April 23, 2018 14:12
To: solr-user@lucene.apache.org
Subject: SolrCloud DIH (Data Import Handler) MySQL 404
Hello,
I have a problem with DIH in SolrCloud. I don't understand why, so I need
your help.
Solr 6.6 in Cloud.
##
COMMAND:
curl
I have added data-import-handler and mysql-connector-java runtimeLib
entries in the configoverlay.json file with the API.
4. I have created the DIH folder on the cloud with the zkcli.sh script.
5. I have pushed the DIH .xml configuration file with zkcli.
CONFIGOVERLAY CONTENT:
{
"runtimeLib"
Thanks Shawn.
I cannot complain because it actually worked well for me so far, but
I still do not understand: if Solr already paginates the results from the
full import, why not do the same for the delta? It is almost the same query:
`select id from t where t.lastmod > ${solrTime}`
`select * from t w
On 4/5/2018 7:31 PM, gadelkareem wrote:
> Why the deltaImportQuery uses "where id='${dataimporter.id}'" instead of
> something like where id IN ('${dataimporter.id})'
Because there's only one value for that property.
If the deltaQuery returns a million rows, then deltaImportQuery is going
to be e
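In other words, deltaQuery collects the changed ids and deltaImportQuery is then run once per id; a typical pairing looks like this (table and column names are made up, the placeholders are the standard DIH ones):

  <entity name="item"
          query="SELECT * FROM item"
          deltaQuery="SELECT id FROM item WHERE last_modified > '${dih.last_index_time}'"
          deltaImportQuery="SELECT * FROM item WHERE id='${dih.delta.id}'"/>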
Why the deltaImportQuery uses "where id='${dataimporter.id}'" instead of
something like where id IN ('${dataimporter.id})'
I still can't understand how Solr establishes the classpath.
I have a custom entity processor that subclasses EntityProcessorBase. When I
execute the /dataimport call I get
java.lang.NoClassDefFoundError:
org/apache/solr/handler/dataimport/EntityProcessorBase
no matter how I state in solrconf
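The usual way to put both the DIH jar and a custom jar on the core's classpath is a <lib> directive in solrconfig.xml; the first line below is the stock one from the shipped config, the second points at a made-up directory for the custom entity processor:

  <lib dir="${solr.install.dir:../../../..}/dist/" regex="solr-dataimporthandler-.*\.jar"/>
  <lib dir="/opt/solr/custom-libs" regex=".*\.jar"/>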
Hi,
I am getting the following error when I index data using the DataImporter.
I am using a FileDataSource in the data config file.
Here is the config file:
> > First, to connect to the DB & get data from DB tables.
> > And second, to get data from all PDF files using TikaEntityProcessor.
> >
> > Now the problem is there is no error in the console or anywhere, but
> > whenever I want to search using the "Query" tab it gives me the result of the
> > last Data Import. So let's say if I last imported data for tables then it gives
> > me results from the table, and if I imported PDF files then it searches
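A likely explanation (assuming the two entities are loaded with separate full imports): full-import defaults to clean=true, which wipes the index before importing, so each run deletes what the other loaded. Running the second import with clean=false keeps the earlier documents (core and entity names are made up):

  curl "http://localhost:8983/solr/mycore/dataimport?command=full-import&entity=pdf_files&clean=false"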
Hi,
I use DIH in solr-cloud mode (implicit route) in solr6.5.1.
When I start the import it works fine and I see the progress in the logfile.
However, when I click the "Refresh Status" button in the web-ui while the
import is running
I only see "No information available (idle)".
So I have to look
Hi Zac,
I think you have added the entity closing tag twice; that might be
causing the issue. It has been a long time, so not sure whether you are still
working on it or not.
--
View this message in context:
http://lucene.472066.n3.nabble.com/Using-the-Data-Import-Handler-with-SQLite
together with an
update processor that removes the temporary field (and possibly other
unwanted fields) seemed to work great for us.
--
View this message in context:
http://lucene.472066.n3.nabble.com/Version-conflict-during-data-import-from-another-Solr-instance-into-clean-Solr-tp4046937p4331876
>>> As per my experience DIH is the best for RDBMS to Solr indexing. DIH with
>>> caching has the best performance. DIH nested entities allow you to define
>>> simple queries.
>>> Also, SolrJ is good when you want your RDBMS updates made immediately
>>> available in Solr. DIH full import can be used to index all data the first
>>> time or to restore the index in case the index is corrupted.
>>>
>>> Thanks,
>>> Sujay
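The caching Sujay mentions is configured on the nested entity; a sketch using the item/feature tables from the stock DIH example (attribute values from memory, so worth double-checking):

  <entity name="item" query="SELECT id, name FROM item">
    <entity name="feature"
            processor="SqlEntityProcessor"
            query="SELECT item_id, description FROM feature"
            cacheKey="item_id"
            cacheLookup="item.id"
            cacheImpl="SortedMapBackedCache"/>
  </entity>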
Hi Daphne,
Are you using DSE?
Thanks & Regards,
Vishal
On Fri, Mar 17, 2017 at 7:40 PM, Liu, Daphne wrote:
I just want to share my recent project. I have successfully sent all our EDI
documents to Cassandra 3.7 clusters using Solr 6.3 Data Import JDBC Cassandra
connector indexing our documents.
Since Cassandra is so fast for writing, the compression rate is around 13% and all
my documents can be kept in
Hi,
I am new to Solr and am trying to move data from my RDBMS to Solr. I know
the available options are:
1) Post Tool
2) DIH
3) SolrJ (as ours is a J2EE application).
I want to know what is the recommended way for Data import in production
environment.
Will sending data via SolrJ in batches be
> -----Original Message-----
> From: Michael Tobias [mailto:mtob...@btinternet.com]
> Sent: Wednesday, March 15, 2017 2:36 PM
> To: solr-user@lucene.apache.org
> Subject: Data Import Handler on 6.4.1
>
> I am sure I am missin