solr-user@lucene.apache.org
Subject: [bulk]: RE: [bulk]: Re: Optimizing Dataimport from Oracle; cursor
sharing; changing oracle session parameters
Yes, I'm using the Data Import Handler, and I would prefer a solution for this
way of importing because it's already tested, the imported data is OK, and
only changed data is managed.
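For reference, the "only changed data" part is what DIH calls a delta import. A
minimal entity sketch of that setup (table and column names are placeholders,
not taken from this thread):

```xml
<entity name="item" pk="ID"
        query="SELECT * FROM item"
        deltaQuery="SELECT ID FROM item
                    WHERE last_modified &gt; '${dataimporter.last_index_time}'"
        deltaImportQuery="SELECT * FROM item WHERE ID = '${dih.delta.ID}'"/>
```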
-----Original Message-----
From: Shawn Heisey [mailto:apa...@elyograg.org]
Sent: Wednesday, 16 August 2017 5:42 a.m.
To: solr-user@lucene.apache.org
Subject: Re: Optimizing Dataimport from Oracle; cursor sharing; changing oracle
session parameters
This might be a hack, but the CSV importer is really fast. Run the query in
your favorite command line and export to CSV, then load it.
You can even make batches. Maybe use ranges of the ID, then delete by query for
that range.
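Walter's batching idea can be sketched as a small helper that carves the ID
space into ranges; each range gives you a WHERE clause for the CSV export and a
matching Solr delete-by-query if a batch ever has to be redone (the function
name and batch size here are my own):

```python
def id_range_batches(min_id, max_id, batch_size):
    """Yield (start, end, delete_query) for each ID range."""
    start = min_id
    while start <= max_id:
        end = min(start + batch_size - 1, max_id)
        # Solr delete-by-query that removes exactly this batch
        yield start, end, f"id:[{start} TO {end}]"
        start = end + 1

for start, end, dq in id_range_batches(1, 250, 100):
    print(start, end, dq)
# → 1 100 id:[1 TO 100]
#   101 200 id:[101 TO 200]
#   201 250 id:[201 TO 250]
```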
wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunde
On 8/15/2017 8:09 AM, Mannott, Birgit wrote:
> I'm using Solr 6.6.0 and I have to do a complex data import from an Oracle DB
> involving 3,500,000 data rows.
> For each row I have 15 additional entities. That means that more than 52
> million selects are sent to the database.
> For every select
Birgit,
any chance to utilise one of the caching strategies that DIH offers?
Like building a complete map for one of the subentities? That would mean
reading the whole table at the beginning and then only doing lookups by key.
Or getting data from subentities with joins in your main entity?
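The caching strategy described above maps to DIH's CachedSqlEntityProcessor:
the sub-entity's table is read once into an in-memory map, and subsequent rows
are served by key lookup instead of a fresh select. A sketch, with placeholder
table and column names:

```xml
<entity name="detail"
        processor="CachedSqlEntityProcessor"
        query="SELECT parent_id, value FROM detail"
        cacheKey="parent_id"
        cacheLookup="item.ID"/>
```

With 3.5 million parent rows this trades one select per row per sub-entity for
a single full-table scan, at the cost of holding the sub-entity table in
memory.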
Regards,
Birgit
-----Original Message-----
From: Erick Erickson [mailto:erickerick...@gmail.com]
Sent: Tuesday, August 15, 2017 4:33 PM
To: solr-user
Subject: [bulk]: Re: Optimizing Dataimport from Oracle; cursor sharing;
changing oracle session parameters
I presume you're using Data Import Handler? An alternative when you
get into complex imports is to use a SolrJ client; here's a sample.
That way you can use whatever tools the particular JDBC connector
allows, and it can be much faster.
https://lucidworks.com/2012/02/14/indexing-with-solrj/
Best,
Erick
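The gain Erick describes comes from controlling the JDBC side yourself: instead
of one select per sub-entity per row (15 x 3.5 million), a client can batch
child lookups with IN lists. A minimal sketch of that batching idea, with
sqlite3 standing in for Oracle and all table and column names invented:

```python
import sqlite3

def fetch_children_batched(conn, parent_ids, batch_size=500):
    """Fetch child rows for many parents with one query per batch
    instead of one query per parent row."""
    children = {}
    ids = list(parent_ids)
    for i in range(0, len(ids), batch_size):
        batch = ids[i:i + batch_size]
        placeholders = ",".join("?" * len(batch))
        rows = conn.execute(
            f"SELECT parent_id, name FROM child "
            f"WHERE parent_id IN ({placeholders})", batch)
        for parent_id, name in rows:
            children.setdefault(parent_id, []).append(name)
    return children

# Demo with an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE child (parent_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO child VALUES (?, ?)",
                 [(1, "a"), (1, "b"), (2, "c")])
print(fetch_children_batched(conn, [1, 2, 3]))  # → {1: ['a', 'b'], 2: ['c']}
```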
Hi,
I'm using Solr 6.6.0 and I have to do a complex data import from an Oracle DB
involving 3,500,000 data rows.
For each row I have 15 additional entities. That means that more than 52
million selects are sent to the database.
For every select that is done I optimized the Oracle execution path