On 2/20/2012 6:49 AM, v_shan wrote:
DIH still running out of memory for me, with Full Import on a database of
size 1.5 GB.
Solr version: 3_5_0
Note that I have already added batchSize="-1" but I am still getting the same error.
A few questions:
- How much memory have you given to the JVM running this Solr instance?
> DIH still running out of memory for me, with Full Import on a database of
> size 1.5 GB.
> Solr version: 3_5_0
> Note that I have already added batchSize="-1" but getting same error.
> Sharing my DIH config below.
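For reference, batchSize belongs on the JdbcDataSource element of the DIH config. A minimal sketch (driver class, URL, credentials, and the entity query are placeholders, not the poster's actual config):

```xml
<dataConfig>
  <!-- batchSize="-1" is handed to the JDBC driver as the fetch size;
       for MySQL, DIH translates it to Integer.MIN_VALUE to enable row streaming -->
  <dataSource type="JdbcDataSource"
              driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost/mydb"
              user="user" password="pass"
              batchSize="-1"/>
  <document>
    <entity name="item" query="select id, name from item">
      <field column="id" name="id"/>
      <field column="name" name="name"/>
    </entity>
  </document>
</dataConfig>
```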
Hi,
I tried batchSize=-1, but when I do that it uses all of MySQL's memory,
which is a problem for the MySQL database.
:s
Noble Paul നോബിള് नोब्ळ् wrote:
>
> I've moved the FAQ to a new Page
> http://wiki.apache.org/solr/DataImportHandlerFaq
> The DIH page is too big and editing has become harder
I've moved the FAQ to a new Page
http://wiki.apache.org/solr/DataImportHandlerFaq
The DIH page is too big and editing has become harder.
On Thu, Jun 26, 2008 at 6:07 PM, Shalin Shekhar Mangar
<[EMAIL PROTECTED]> wrote:
> I've added a FAQ section to DataImportHandler wiki page which captures
> questions on out-of-memory exceptions with both MySQL and MS SQL Server
> drivers.
Hi Grant,
How did you finally manage it?
I have the same problem with less data (8.5M); if I set batchSize=-1, it
slows the database down a lot, which is not good for the website and its
request queue.
What did you do?
Thanks,
Grant Ingersoll-6 wrote:
>
> I think it's a bit different.
I've added a FAQ section to DataImportHandler wiki page which captures
questions on out-of-memory exceptions with both MySQL and MS SQL Server
drivers.
http://wiki.apache.org/solr/DataImportHandler#faq
On Thu, Jun 26, 2008 at 9:36 AM, Noble Paul നോബിള് नोब्ळ्
<[EMAIL PROTECTED]> wrote:
> We must document this information in the wiki.
We must document this information in the wiki. We never had a chance
to play with MS SQL Server.
--Noble
On Thu, Jun 26, 2008 at 12:38 AM, wojtekpia <[EMAIL PROTECTED]> wrote:
>
> It looks like that was the problem. With responseBuffering=adaptive, I'm able
> to load all my data using the sqljdbc driver.
It looks like that was the problem. With responseBuffering=adaptive, I'm able
to load all my data using the sqljdbc driver.
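The responseBuffering=adaptive setting is a connection property of the Microsoft sqljdbc driver that makes it buffer only part of the result set instead of reading it all into memory. It can be set directly on the connection URL; a minimal sketch (host and database names are placeholders, and no connection is actually opened here):

```java
public class SqlServerUrl {
    // Builds a sqljdbc connection URL with adaptive response buffering,
    // so the driver streams large result sets instead of materializing them.
    static String buildUrl(String host, String db) {
        return "jdbc:sqlserver://" + host
                + ";databaseName=" + db
                + ";responseBuffering=adaptive";
    }

    public static void main(String[] args) {
        String url = buildUrl("localhost", "products");
        System.out.println(url);
        // With the real driver on the classpath you would then call:
        // Connection c = DriverManager.getConnection(url, user, pass);
    }
}
```

The same property can equally be appended to the url attribute of the DIH dataSource element.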
--
View this message in context:
http://www.nabble.com/DataImportHandler-running-out-of-memory-tp18102644p18119732.html
Sent from the Solr - User mailing list archive at Nabble.com.
Hi,
I don't think the problem is within DataImportHandler, since it just streams
the result set. The fetchSize is just passed as a parameter to
Statement#setFetchSize(), and the JDBC driver is supposed to honor it and
keep only that many rows in memory.
From what I could find about the Sql Server
I'm trying with batchSize=-1 now. So far it seems to be working, but very
slowly. I will update when it completes or crashes.
Even with a batchSize of 100 I was running out of memory.
I'm running on a 32-bit Windows machine. I've set the -Xmx to 1.5 GB - I
believe that's the maximum for my environment.
The latest patch sets fetchSize to Integer.MIN_VALUE if -1 is passed.
It was added specifically for the MySQL driver.
--Noble
On Wed, Jun 25, 2008 at 4:35 PM, Grant Ingersoll <[EMAIL PROTECTED]> wrote:
> I think it's a bit different. I ran into this exact problem about two weeks
> ago on a 13 million record DB.
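The translation Noble describes can be sketched as a small helper (the method name is mine for illustration, not the actual DIH code):

```java
public class FetchSize {
    // batchSize="-1" in the DIH config is translated to Integer.MIN_VALUE,
    // the magic value that makes MySQL Connector/J stream rows one by one;
    // any other value is passed through unchanged to Statement#setFetchSize.
    static int toFetchSize(int batchSize) {
        return (batchSize == -1) ? Integer.MIN_VALUE : batchSize;
    }

    public static void main(String[] args) {
        System.out.println(toFetchSize(-1));   // Integer.MIN_VALUE
        System.out.println(toFetchSize(500));
    }
}
```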
DIH does not modify SQL. This value is used as a connection property
--Noble
On Wed, Jun 25, 2008 at 4:40 PM, Grant Ingersoll <[EMAIL PROTECTED]> wrote:
> I'm assuming, of course, that the DIH doesn't automatically modify the SQL
> statement according to the batch size.
>
> -Grant
The OP is actually using Sql Server (not MySql) as per his mail.
On Wed, Jun 25, 2008 at 4:40 PM, Grant Ingersoll <[EMAIL PROTECTED]>
wrote:
> I'm assuming, of course, that the DIH doesn't automatically modify the SQL
> statement according to the batch size.
>
> -Grant
I'm assuming, of course, that the DIH doesn't automatically modify the
SQL statement according to the batch size.
-Grant
On Jun 25, 2008, at 7:05 AM, Grant Ingersoll wrote:
I think it's a bit different. I ran into this exact problem about
two weeks ago on a 13 million record DB. MySQL doesn't honor the fetch
size for its v5 JDBC driver.
I think it's a bit different. I ran into this exact problem about two
weeks ago on a 13 million record DB. MySQL doesn't honor the fetch
size for its v5 JDBC driver.
See http://www.databasesandlife.com/reading-row-by-row-into-java-from-mysql/
or do a search for MySQL fetch size.
You ac
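The streaming workaround discussed above comes down to setting the statement's fetch size to Integer.MIN_VALUE. A sketch of the setup, verified against a stub Statement built with java.lang.reflect.Proxy since no live MySQL connection is assumed here:

```java
import java.lang.reflect.Proxy;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class MySqlStreaming {
    // MySQL Connector/J only streams row by row when the statement is
    // forward-only, read-only, and the fetch size is exactly Integer.MIN_VALUE.
    static void enableStreaming(Statement st) throws SQLException {
        st.setFetchSize(Integer.MIN_VALUE);
    }

    public static void main(String[] args) throws Exception {
        List<Integer> recorded = new ArrayList<>();
        // Stub Statement that records setFetchSize calls (no real DB needed).
        Statement stub = (Statement) Proxy.newProxyInstance(
                Statement.class.getClassLoader(),
                new Class<?>[]{Statement.class},
                (proxy, method, margs) -> {
                    if (method.getName().equals("setFetchSize")) {
                        recorded.add((Integer) margs[0]);
                    }
                    return null; // acceptable for the void methods used here
                });
        enableStreaming(stub);
        System.out.println(recorded.get(0));
        // With a real connection, create the statement as:
        // conn.createStatement(java.sql.ResultSet.TYPE_FORWARD_ONLY,
        //                      java.sql.ResultSet.CONCUR_READ_ONLY)
    }
}
```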
It is batchSize="-1", not fetchSize. Alternatively, keep it at a very small value.
--Noble
On Wed, Jun 25, 2008 at 9:31 AM, Noble Paul നോബിള് नोब्ळ्
<[EMAIL PROTECTED]> wrote:
> DIH streams rows one by one.
> set the fetchSize="-1" this might help. It may make the indexing a bit
> slower but memory consumption would be low.
DIH streams rows one by one.
set the fetchSize="-1" this might help. It may make the indexing a bit
slower but memory consumption would be low.
The memory is consumed by the jdbc driver. try tuning the -Xmx value for the VM
--Noble
On Wed, Jun 25, 2008 at 8:05 AM, Shalin Shekhar Mangar
<[EMAIL PROTECTED]> wrote:
Setting the batchSize to a given value means that the JDBC driver will keep
that many rows in memory *for each entity* which uses that data source (if
correctly implemented by the driver). Not sure how well the SQL Server
driver implements this. Also keep in mind that Solr itself needs memory to
index documents.
This is a bug in MySQL. Try setting the fetch size on the Statement on
the connection to Integer.MIN_VALUE.
See http://forums.mysql.com/read.php?39,137457 amongst a host of other
discussions on the subject. Basically, it tries to load all the rows
into memory; the only alternative is to set the fetch size to
Integer.MIN_VALUE so that the driver streams the rows one at a time.