Have you investigated the logs of your servlet container? There's
probably an explanation in there for why the documents weren't
submitted.
Michael Della Bitta
Appinions | 18 East 41st St., Suite 1806 | New York, NY 10017
www.appinions.com
We would need the DIH and the schema
file, as Andy pointed out already.
Cheers,
Chantal
>
> -Original Message-
> From: ranmatrix S [mailto:ranmat...@gmail.com]
> Sent: Thursday, August 23, 2012 5:46 PM
> To: solr-user@lucene.apache.org
> Subject: Solr Index problem
Are you committing? You have to commit for the documents to actually be added.
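To make the commit point concrete, here is a toy model of the visibility rule only (this is deliberately not the real Solr client API): documents sent to Solr sit pending and become searchable only after a commit, which is exactly why an uncommitted import can report nothing added.

```python
# Toy model of Solr's commit semantics (not the real API):
# added documents stay pending until an explicit commit
# makes them visible to searches.
class ToyIndex:
    def __init__(self):
        self.pending = []
        self.searchable = []

    def add(self, doc):
        self.pending.append(doc)

    def commit(self):
        self.searchable.extend(self.pending)
        self.pending = []

idx = ToyIndex()
for i in range(9):
    idx.add({"id": i})
print(len(idx.searchable))  # 0 -- looks like "0 documents added"
idx.commit()
print(len(idx.searchable))  # 9 -- visible only after the commit
```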
On Aug 23, 2012, at 4:46 PM, ranmatrix S wrote:
> The schema and fields in db-data-config.xml are one and the same.
Please attach or post both the schema and the DIH config XML files so we can
see them. The DIH can be pretty tricky.
You say you can see that 9 records are returned back. How do you see that?
Hi,
I have set up Solr to index data from an Oracle DB through the DIH handler. However,
through the Solr admin I can see that the DB connection is successful and data is
retrieved from the DB, but it is not added into the index. The message is "0
documents added", even when I am able to see that 9 records are returned
back.
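For reference, a mismatch between the SELECT column names and the schema field names is a common cause of "0 documents added". A minimal DIH config of the kind being asked for might look like this; the table, column, and field names below are invented for illustration, not taken from the thread:

```xml
<!-- db-data-config.xml: illustrative sketch only; names are made up -->
<dataConfig>
  <dataSource driver="oracle.jdbc.OracleDriver"
              url="jdbc:oracle:thin:@//dbhost:1521/ORCL"
              user="solr" password="***"/>
  <document>
    <entity name="item" query="SELECT ID, TITLE FROM ITEMS">
      <!-- column is the SQL column, name is the Solr schema field;
           a case or spelling mismatch here silently drops the data -->
      <field column="ID" name="id"/>
      <field column="TITLE" name="title"/>
    </entity>
  </document>
</dataConfig>
```

Every field named here must also exist in schema.xml, and the uniqueKey field must be populated, or documents can be skipped without an error.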
>>> replace the lucene-2.9.jar with your lucene-3.0.3.jar in the WEB-INF/lib
>>> directory
>>>
>>> then use jar -cvf solr.war * to pack the war again
>>>
>>> deploy that war; hope that works
>>>
>>> -
>>> Thanx:
>>> Grijesh
>>> --
>>> View this message in context:
>>>
>>> http://lucene.472066.n3.nabble.com/SOLR-1-4-and-Lucene-3-0-3-index-problem-tp2396605p2403542.html
>>> Sent from the Solr - User mailing list archive at Nabble.com.
>>>
>>>
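Since a .war is just a zip archive, the swap-and-repack step above can also be scripted. Here is a sketch in Python that builds a throwaway stand-in war on the spot; the jar names mirror the ones above, but the archive contents are dummies:

```python
# Sketch: rebuild a .war (a zip archive) after swapping one jar.
import os
import tempfile
import zipfile

work = tempfile.mkdtemp()
war = os.path.join(work, "solr.war")

# Build a stand-in war containing an "old" lucene jar.
with zipfile.ZipFile(war, "w") as z:
    z.writestr("WEB-INF/lib/lucene-2.9.jar", b"old")
    z.writestr("WEB-INF/web.xml", b"<web-app/>")

# Repack: copy everything except the old jar, then add the new one.
new_war = os.path.join(work, "solr-new.war")
with zipfile.ZipFile(war) as src, zipfile.ZipFile(new_war, "w") as dst:
    for item in src.infolist():
        if item.filename != "WEB-INF/lib/lucene-2.9.jar":
            dst.writestr(item, src.read(item.filename))
    dst.writestr("WEB-INF/lib/lucene-3.0.3.jar", b"new")

# lucene-3.0.3.jar is in, lucene-2.9.jar is out
print(sorted(zipfile.ZipFile(new_war).namelist()))
```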
So I should use 1.4.1, and that is already built.
What if I use Solr 4? From the source code, do you know of any tutorial I
can use to learn how to build it using the NetBeans IDE?
I already have ant installed.
Or do you advise I go with 1.4.1?
Mambe Churchill Nanje
237 33011349,
AfroVisioN Foundation
Solr 1.4.x uses Lucene 2.9.x.
You could try trunk, which uses Lucene 3.0.3 and should be compatible,
if I'm correct.
Regards,
Peter.
I am sorry; I downloaded the Solr release version, as I don't know how to
build Solr myself, but I wrote my crawler with Lucene 3.x.
Now I need Solr to search this index, so I used the Solr 1.4 I downloaded
from the site as the most recent version.
Now I can't seem to read the index.
What problem are you trying to solve by using a Lucene 3.x index within
a Solr 1.4 system?
Upayavira
On Tue, 01 Feb 2011 14:59 +0100, "Churchill Nanje Mambe"
wrote:
> is there any way I can change the lucene version wrapped in side solr 1.4
> from lucene 2.x to lucene 3.x.
> any tutorials? As I am guessing that's where the index data doesn't match.
Is there any way I can change the Lucene version wrapped inside Solr 1.4
from Lucene 2.x to Lucene 3.x?
Any tutorials? As I am guessing that's where the index data doesn't match.
Something I also found out is that Solr 1.4 expects the index to be
luce_index_folder/index, while a Lucene 3.x index is just the folder itself.
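The layout difference described above can be worked around by moving the bare Lucene index into the index/ subdirectory Solr expects under its dataDir. A sketch, using temporary directories as stand-ins for the real paths:

```python
# Solr 1.4 expects <dataDir>/index/ to hold the segment files, while a
# raw Lucene index often *is* the directory itself. Move the bare index
# into the layout Solr expects (all paths here are throwaway stand-ins).
import os
import shutil
import tempfile

lucene_dir = tempfile.mkdtemp()          # pretend: the raw Lucene index
open(os.path.join(lucene_dir, "segments_1"), "w").close()

solr_data = tempfile.mkdtemp()           # pretend: the Solr core dataDir
index_dir = os.path.join(solr_data, "index")
os.makedirs(index_dir)
for name in os.listdir(lucene_dir):
    shutil.move(os.path.join(lucene_dir, name), index_dir)

print(os.listdir(index_dir))  # ['segments_1']
```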
I have the exact opposite problem where Luke won't even load the index but Solr
starts fine. I believe there are major differences between the two indexes that
are causing all these issues.
Adam
Hi guys,
I have developed a Java crawler and integrated the Lucene 3.0.3 API into it,
so it creates a Lucene index.
Now I wish to search this Lucene index using Solr. I tried to configure
solrconfig.xml and schema.xml, and everything seems to be fine,
but then Solr told me the index is corrupt, even though I can open it with Luke.
Correction: I indexed 17M docs, not 1.7M, so OutOfMemory happened when it
had finished indexing ~11.3M docs.
It is a new index.
I think this may be the reason:
On 7/18/07, Otis Gospodnetic <[EMAIL PROTECTED]> wrote:
> Why? Too small of a Java heap. :)
> Increase the size of the Java heap and lower the maxBufferedDocs
> number in solrconfig.xml and then try again.
On 18-Jul-07, at 2:58 PM, Yonik Seeley wrote:
On 7/18/07, Mike Klaas <[EMAIL PROTECTED]> wrote:
Could happen when doDeleting the pending docs too. James: try
sending commit every 500k docs or so.
Hmmm, right... some of the memory usage will be related to the treemap
keeping track of deleted documents.
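The "commit every 500k docs" suggestion amounts to a periodic-commit loop. A sketch with placeholder calls; send_to_solr and the commit are not a real client API, just markers for where those calls would go:

```python
# Periodic-commit pattern: flush accumulated state every N documents
# so memory tied to pending work is released along the way.
def index_all(docs, commit_every=500_000):
    commits = 0
    for n, doc in enumerate(docs, 1):
        # send_to_solr(doc) would go here (placeholder, not a real API)
        if n % commit_every == 0:
            commits += 1  # stand-in for issuing a commit
    return commits

print(index_all(range(1_700_000)))  # 3 commits: at 500k, 1.0M, 1.5M
```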
On 7/18/07, Otis Gospodnetic <[EMAIL PROTECTED]> wrote:
Why? Too small of a Java heap. :)
Increase the size of the Java heap and lower the maxBufferedDocs number in
solrconfig.xml and then try again.
If it only happens after a lot of docs, it's probably not
maxBufferedDocs, but when a big Lucene segment merge happens.
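A rough back-of-the-envelope calculation shows why maxBufferedDocs matters at these document sizes; the maxBufferedDocs value below is an assumed example, not a number from the thread:

```python
# Estimate of RAM held by buffered documents before a flush.
max_buffered_docs = 100_000      # hypothetical solrconfig.xml setting
avg_doc_bytes = 5 * 1024         # ~4k-5k per doc, as reported
buffer_mib = max_buffered_docs * avg_doc_bytes / 2**20
print(round(buffer_mib))  # ~488 MiB of heap just for the buffer
```

Either a bigger heap or a smaller buffer brings that number back under control, which is the advice given above.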
To: solr-user@lucene.apache.org
Sent: Wednesday, July 18, 2007 4:50:58 AM
Subject: solr index problem
When I index 1.7M docs at 4k-5k per doc, OutOfMemory happens when it has
finished indexing ~1.13M docs.
I just restart Tomcat, delete all locks, and restart the indexing.
No error or warning info appears until it finishes.
Does anyone know why, or has anyone had the same error?
--
regards
jl
/jira/browse/SOLR-240
- will
-Original Message-
From: Chris Hostetter [mailto:[EMAIL PROTECTED]
Sent: Friday, May 25, 2007 1:50 AM
To: solr-user@lucene.apache.org
Subject: Re: index problem with write lock
: i know how to fix it.
:
: but i just don't know why it happen.
:
: this solr error information:
:
: > Exception during commit/optimize:java.io.IOException: Lock obtain timed
: > out: SimpleFSLock@/usr/solrapp/solr21/data/index/write.lock
that's the problem you see ... but in normal Solr operation that shouldn't happen.
: when i index data with 45 solr boxs.(data have 1700w, freebsd6, java:
: diablo-1.5.0_07-b01, tomcat6), write lock will happen in the procedure.
1) bug reports about errors are nearly useless without a real error
message including a stack trace.
2) what do you mean you "index data with 45 solr boxs"?
I find it always happens when indexing has been running for a while;
for example, it will happen after 1-2 hours of indexing.
I find one interesting thing.
When I index data with 45 Solr boxes (data has 1700w (17M) rows; FreeBSD 6,
Java diablo-1.5.0_07-b01, Tomcat 6), a write lock happens during the procedure.
Reindexing with the Solr box that had the write-lock problem shows it working well.
It has happened several times, so I want to know why it happens.
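The "Lock obtain timed out" failure mode comes down to the lock file itself: only one writer can create write.lock, and any other writer pointed at the same index directory fails to acquire it. A simplified sketch of that mechanism; this is not Lucene's actual SimpleFSLock code, just the same create-exclusively idea:

```python
# Two writers, one index directory: only the first can create the
# lock file; the second would sit in Lucene's retry loop and
# eventually report "Lock obtain timed out".
import os
import tempfile

index_dir = tempfile.mkdtemp()
lock = os.path.join(index_dir, "write.lock")

def acquire(path):
    """Try to create the lock file exclusively, then release the fd."""
    try:
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except FileExistsError:
        return False

print(acquire(lock))  # True  -- first writer owns the lock
print(acquire(lock))  # False -- a second writer would time out waiting
```

With 45 boxes writing, any two that ever target the same index directory will hit exactly this.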
Problem fixed. Thanks, Yonik.
I use FreeBSD 6, Tomcat 6 (without install) + JDK 1.5_07 + PHP 5 + MSSQL.
I debugged my program and the data is OK before the update that does the
indexing, and the index process is OK, no errors.
But I find the index file is not what I wanted; it has changed.
In Tomcat 6's server.xml I added URIEncoding="UTF-8".