Thanks Chris, I found that in our application code it was related to an
optimistic concurrency failure.
On Mon, Mar 3, 2014 at 6:13 PM, Chris Hostetter wrote:
: Subject: java.lang.Exception: Conflict with StreamingUpdateSolrServer
the fact that you are using StreamingUpdateSolrServer isn't really a
factor here -- what matters is the data you are sending to Solr in the
updates...
location=StreamingUpdateSolrServer line=162 Status for: null is 409
2014-03-01 16:52:16,858 [1a44#fb2a/ActiveListingPump] priority=ERROR
app_name=listing-search-index thread=pool-15-thread-336
location=StreamingUpdateSolrServer line=304 error
java.lang.Exception: Conflict
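For reference, a 409 from Solr 4.x usually means the update carried a stale `_version_` value, which is exactly the optimistic concurrency failure described above. A minimal sketch of triggering and catching it (the URL, field values, and retry comment are illustrative assumptions, not taken from this thread; requires SolrJ 4.x on the classpath):

```java
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrException;
import org.apache.solr.common.SolrInputDocument;

public class VersionedUpdate {
    public static void main(String[] args) throws Exception {
        HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr");

        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "listing-1");
        doc.addField("_version_", 1234567890L); // stale version -> 409 Conflict

        try {
            server.add(doc);
        } catch (SolrException e) {
            if (e.code() == 409) {
                // Optimistic concurrency failure: re-read the current
                // document (and its _version_) and retry the update.
            }
        }
    }
}
```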
Thank you Gopal,
I've updated the wiki accordingly.
Best
Lewis
On Tue, Jan 22, 2013 at 5:58 PM, Gopal Patwa wrote:
These classes were deprecated in 4.0 and replaced with
ConcurrentUpdateSolrServer and HttpSolrServer
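A sketch of that migration (assuming SolrJ 4.x; the URL and tuning numbers are illustrative):

```java
import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrServer;
import org.apache.solr.client.solrj.impl.HttpSolrServer;

public class Solr4Servers {
    public static void main(String[] args) {
        // was: new CommonsHttpSolrServer(url)
        HttpSolrServer queries = new HttpSolrServer("http://localhost:8983/solr");

        // was: new StreamingUpdateSolrServer(url, queueSize, threadCount)
        ConcurrentUpdateSolrServer updates =
                new ConcurrentUpdateSolrServer("http://localhost:8983/solr", 20, 4);
    }
}
```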
On Tue, Jan 22, 2013 at 5:28 PM, Lewis John Mcgibbney <
lewis.mcgibb...@gmail.com> wrote:
Hi All,
As above, I am upgrading our application and I wish to give it a facelift
to use the new shiny 4.1.0 deps.
I notice that both of the classes above are no longer included in the
4.1.0 solr-solrj artifact.
Can someone please explain what I should replace this with?
I also notice that the w
Hi Lance,
As far as I can see, one document failing does not fail the entire update.
From my logs I can see the error logged, but indexing just
continues to the next document. This happens with the
StreamingUpdateSolrServer, which is multithreaded.
Thanks.
On Tue, Jun 19, 2012 at 9:58 AM, Lance Norskog wrote:
You should also call the glue code ;-):
Protocol.registerProtocol("http", http);
regards
Torsten
AddOn: You can also set a custom HTTP protocol factory for commons-httpclient
(which is used by StreamingUpdateSolrServer) to influence socket options,
for example:
final Protocol http = new Protocol("http",
MycustomHttpSocketFactory.getSocketFactory(), 80);
and MycustomHttpSocketFactory.getSocketFactory
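Putting the two snippets together, a sketch against commons-httpclient 3.x (the factory class name and the particular socket options are illustrative assumptions):

```java
import java.io.IOException;
import java.net.InetAddress;
import java.net.Socket;
import org.apache.commons.httpclient.params.HttpConnectionParams;
import org.apache.commons.httpclient.protocol.Protocol;
import org.apache.commons.httpclient.protocol.ProtocolSocketFactory;

public class CustomHttp {
    /** A socket factory that tunes every socket it hands out. */
    static class TunedSocketFactory implements ProtocolSocketFactory {
        public Socket createSocket(String host, int port) throws IOException {
            return tune(new Socket(host, port));
        }
        public Socket createSocket(String host, int port,
                InetAddress localAddr, int localPort) throws IOException {
            return tune(new Socket(host, port, localAddr, localPort));
        }
        public Socket createSocket(String host, int port, InetAddress localAddr,
                int localPort, HttpConnectionParams params) throws IOException {
            return tune(new Socket(host, port, localAddr, localPort));
        }
        private Socket tune(Socket s) throws IOException {
            s.setTcpNoDelay(true);  // example socket option
            s.setSoTimeout(30000);  // example read timeout, ms
            return s;
        }
    }

    public static void main(String[] args) {
        final Protocol http = new Protocol("http", new TunedSocketFactory(), 80);
        Protocol.registerProtocol("http", http); // the "glue code"
    }
}
```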
Am Freitag, den 15.06.2012, 18:22 +0100 schrieb Kissue Kissue:
> Hi,
>
> Does anybody know what the default connection timeout setting is for
> StreamingUpdateSolrServer? Can i explicitly set one and how?
>
> Thanks.
Use a custom HttpClient to set one (only snippets, shoul
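Filling in that idea (the reply above is truncated), with commons-httpclient 3.x you can configure timeouts on an HttpClient and hand it to the server; the 5-second values are illustrative:

```java
import org.apache.commons.httpclient.HttpClient;
import org.apache.solr.client.solrj.impl.StreamingUpdateSolrServer;

public class TimeoutSetup {
    public static void main(String[] args) throws Exception {
        HttpClient client = new HttpClient();
        // Connect timeout and read (socket) timeout, in milliseconds.
        client.getHttpConnectionManager().getParams().setConnectionTimeout(5000);
        client.getHttpConnectionManager().getParams().setSoTimeout(5000);

        StreamingUpdateSolrServer server = new StreamingUpdateSolrServer(
                "http://localhost:8983/solr", client, 20, 4);
    }
}
```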
You could instantiate an anonymous instance of StreamingUpdateSolrServer
that has a "handleError" method that then parses the exception message to
get the request URI. If there isn't enough information there, you could add
a dummy request option to your original request that
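The subclassing itself needs SolrJ on the classpath, but the message-parsing step can be sketched in plain Java (the class name, regex, and message format here are assumptions for illustration, not SolrJ API):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ErrorUriParser {
    private static final Pattern URI = Pattern.compile("https?://\\S+");

    /** Pull the first http(s) URI out of an exception message, or null. */
    public static String requestUriOf(String message) {
        if (message == null) return null;
        Matcher m = URI.matcher(message);
        return m.find() ? m.group() : null;
    }

    // Hypothetical wiring into the anonymous subclass described above
    // (requires solrj on the classpath):
    //
    // SolrServer server = new StreamingUpdateSolrServer(url, 20, 4) {
    //     @Override
    //     public void handleError(Throwable ex) {
    //         String uri = ErrorUriParser.requestUriOf(ex.getMessage());
    //         // record the failed request URI here
    //     }
    // };

    public static void main(String[] args) {
        System.out.println(
            requestUriOf("Conflict\n\nrequest: http://localhost:8983/solr/update"));
    }
}
```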
Hi,
Using the StreamingUpdateSolrServer, does anybody know how I can get the
list of documents that failed during indexing, so maybe I can index them
later? Is it possible? I am using Solr 3.5 with SolrJ.
Thanks.
Hi,
Does anybody know what the default connection timeout setting is for
StreamingUpdateSolrServer? Can I explicitly set one, and how?
Thanks.
Hi,
Can someone officially confirm that it is not supported by the current Solr
version
to use both EmbeddedSolrServer (for full indexing) and
StreamingUpdateSolrServer (for incremental indexing)
to update the same index?
How can I request an enhancement in the next version?
I think that this
Hi Ryan,
I see.
Yes, for incremental (hourly) indexing we use StreamingUpdateSolrServer,
and it is faster than EmbeddedSolrServer.
We are also using the embedded server for full indexing on a daily basis, and
it is efficient for full indexing as it can handle a large number of documents
in a better way
In general -- I would not suggest mixing EmbeddedSolrServer with a
different style (unless the other instances are read only). If you
have multiple instances writing to the same files on disk you are
asking for problems.
Have you tried just using StreamingUpdateSolrServer for the daily update?
I
Hi,
Any more thoughts??
Thanks,
PC Rao.
--
View this message in context:
http://lucene.472066.n3.nabble.com/EmbeddedSolrServer-and-StreamingUpdateSolrServer-tp3889073p3940383.html
Sent from the Solr - User mailing list archive at Nabble.com.
Hi Mikhail Khludnev,
Thank you for your help.
Let me explain the scenario with the JVM.
The JVM in which Tomcat is running is not restarted every time
StreamingUpdateSolrServer
runs, whereas EmbeddedSolrServer is a fresh JVM instance (new
process) every time.
In this scenario
--
Sincerely yours
Mikhail Khludnev
ge...@yandex.ru
<http://www.griddynamics.com>
Hi,
Any update?
Thanks,
PC Rao
Please let me know your thoughts.
Thanks,
PC Rao.
Hi Shawn,
Thanks for sharing your opinion.
Mikhail Khludnev, what do you think of Shawn's opinion?
Thanks,
PC Rao.
Hi Mikhail Khludnev,
Thank you for the reply.
I think the index is getting corrupted because StreamingUpdateSolrServer is
keeping reference
to some index files that are being deleted by EmbeddedSolrServer during
commit/optimize process.
As a result when I Index(Full) using EmbeddedSolrServer and
Hi,
Any update on this?
Please let me know if you need additional information on this.
Thanks,
PC Rao.
Hi,
I am using EmbeddedSolrServer for full indexing (Multi core)
and StreamingUpdateSolrServer for incremental indexing.
The steps involved are mentioned below.
Full indexing (Daily)
1) Start EmbeddedSolrServer
2) Delete all docs
3) Add all docs
4) Commit and optimize collection
5) Stop
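The five steps above might be sketched like this (assuming 3.x-era SolrJ, where the embedded core container reads solr.solr.home; the core name and field names are illustrative):

```java
import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer;
import org.apache.solr.common.SolrInputDocument;
import org.apache.solr.core.CoreContainer;

public class FullIndex {
    public static void main(String[] args) throws Exception {
        // 1) Start EmbeddedSolrServer (uses the solr.solr.home system property)
        CoreContainer container = new CoreContainer.Initializer().initialize();
        EmbeddedSolrServer server = new EmbeddedSolrServer(container, "core1");

        // 2) Delete all docs
        server.deleteByQuery("*:*");

        // 3) Add all docs (one shown; loop over the real source in practice)
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "doc-1");
        server.add(doc);

        // 4) Commit and optimize collection
        server.commit();
        server.optimize();

        // 5) Stop
        container.shutdown();
    }
}
```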
> I've written sample code around stopping and throwing an exception, but I
> guess it's not totally trivial. Other ideas for reporting errors have been
> thrown around in the past, but no work on it has gotten any traction.

It looks like StreamingUpdateSolrServer is not meant for situations where
strict error checking is required. I think the documentation should reflect
that. Would you be opposed to a javadoc update at the class level (plus a
wiki addition) like the follo
Like I said, you have to extend the class and override the error method.
Sent from my iPhone
On Mar 27, 2012, at 2:29 AM, Shawn Heisey wrote:
On 3/26/2012 10:25 PM, Shawn Heisey wrote:
The problem is that I currently have no way (that I know of so far) to
detect that a problem happened. As far as my code is concerned,
everything worked, so it updates my position tracking and those
documents will never be inserted. I have not yet de
I've been building a new version of my app that keeps our Solr indexes
up to date. I had hoped to use StreamingUpdateSolrServer instead of
CommonsHttpSolrServer for performance reasons, but I have run into a
showstopper problem that has made me revert to CHSS.
I have been relyi
: > Is there any way to get the threads within SUSS objects to immediately
: > exit without creating other issues? Alternatively, if immediate isn't
: > possible, the exit could take 1-2 seconds. I could not find any kind of
: > method in the API that closes down the object.
you should take
On 3/15/2012 5:53 PM, Shawn Heisey wrote:
I've got a build system that uses SolrJ. The currently running version
uses CommonsHttpSolrServer for everything. In the name of performance,
I am writing a new version that uses CommonsHttpSolrServer for queries
and StreamingUpdateSolr for updates.
One of the things that my program does is
What's the output of jstack $PID?
If the program does not exit, there must be some non-daemon threads
still running.
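The same information jstack shows can also be dumped from inside the JVM, which helps spot the non-daemon threads that keep the process alive:

```java
import java.util.Map;

public class ThreadReport {
    public static void main(String[] args) {
        // Print every live thread with its daemon flag; any non-daemon
        // thread other than main can prevent the JVM from exiting.
        for (Map.Entry<Thread, StackTraceElement[]> e
                : Thread.getAllStackTraces().entrySet()) {
            Thread t = e.getKey();
            System.out.printf("%s daemon=%b state=%s%n",
                    t.getName(), t.isDaemon(), t.getState());
        }
    }
}
```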
Here is how I was playing with it:
StreamingUpdateSolrServer solrServer = new
StreamingUpdateSolrServer("http://localhost:8983/solr/", 10, 1);
SolrInputDocument doc1 = new SolrInputDocument();
doc1.addField( "pk_id", "id1");
doc1
Hi,
I wrote a hello world program to add documents to solr server. When I
use CommonsHttpSolrServer, the program exits but when I
use StreamingUpdateSolrServer, the program never exits. And I couldn't find
a way to close it? Are there any best practices here? Do I have to do
anything differ
Hi,
I'm confused about using StreamingUpdateSolrServer and the commitWithin
parameter in conjunction with waitSearcher and waitFlush.
Does it make sense a request like this?
UpdateRequest updateRequest = new UpdateRequest();
updateRequest.setCommitWithin(12);
updateRequest.setWaitSearcher(
to CommonsHTTPServer and added a doc at a time there
> wouldn't be any ambiguity, but that would be very slow indeed
You can still use CommonsHttpSolrServer and use multiple threads to
increase concurrency.
While not quite as fast as StreamingUpdateSolrServer, it's much, much
faster th
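A sketch of that multi-threaded CommonsHttpSolrServer approach (SolrJ 1.4-era API; the pool size, URL, and ids are illustrative):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class ParallelAdds {
    public static void main(String[] args) throws Exception {
        // CommonsHttpSolrServer is thread safe, so threads can share one instance.
        final SolrServer server =
                new CommonsHttpSolrServer("http://localhost:8983/solr");
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (int i = 0; i < 1000; i++) {
            final int id = i;
            pool.submit(new Runnable() {
                public void run() {
                    try {
                        SolrInputDocument doc = new SolrInputDocument();
                        doc.addField("id", "doc-" + id);
                        server.add(doc); // each add succeeds or fails on its own
                    } catch (Exception e) {
                        // unlike with SUSS, the failed document is known here
                    }
                }
            });
        }
        pool.shutdown();
    }
}
```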
Hi Mark,
The implementation is logging anyway; we have subclassed
StreamingUpdateSolrServer and used handleError to log, but inspecting the
stack trace in the handleError method
does not give any clue about the document(s) that failed. We have a solution
that uses Solr as backend for indexing
The default impl logs with slf4j - just set up logging properly and you will see
the results.
Alternatively, you can subclass and implement that method however you'd like.
On Sep 5, 2011, at 6:36 PM, Leonardo Souza wrote:
Hi,
Inspecting StreamingUpdateSolrServer#handleError I can't see how to keep
track of failures; I'd like to discover
which documents failed during the request.
thanks in advance!
--
Leonardo S Souza
least one step missing in your process -- the XML you showed
us isn't the Solr XML message format, so it can't be the output
of using the StreamingUpdateSolrServer after you build up
SolrInputDocuments.
I suspect you have some code that parses your XML, and in that code you
ca
Hi All
I'm indexing a set of xml documents using StreamingUpdateSolrServer but
I'm having trouble indexing dates.
I get an error like:
SEVERE: error
java.lang.Exception: Invalid Date Math String:'2011-04-22T05:35:37Z
'
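Note that the quoted value in the error ends with a newline before the closing quote, which points at trailing whitespace in the raw field value. A plain-Java illustration of the symptom and the trim fix (this uses java.time for the demonstration, not Solr's own date parser):

```java
import java.time.Instant;
import java.time.format.DateTimeParseException;

public class DateFieldCheck {
    public static void main(String[] args) {
        // Mirrors the error above: the value carries a trailing newline.
        String raw = "2011-04-22T05:35:37Z\n";

        try {
            Instant.parse(raw);                  // rejected: trailing newline
        } catch (DateTimeParseException e) {
            System.out.println("raw value rejected");
        }

        Instant ok = Instant.parse(raw.trim()); // accepted after trimming
        System.out.println(ok);
    }
}
```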
Hi,
I'm using StreamingUpdateSolrServer to post a batch of content to SOLR1.4.1.
By looking at StreamingUpdateSolrServer code, it looks it only provides the
content to be streamed in XML format only.
Can we use it to stream data in binary format?
Hi all,
to improve crappy indexing speed I would like to use
StreamingUpdateSolrServer, but as a newbie I am not sure where to use it... I
have checked the wiki but all I get is how to implement it, not where to put
that method... Or maybe I am missing some facts...
anyway, anyone used
Hello.
i want to change my full-imports from DIH to use of Java and
StreamingUpdateSolrServer ...
is in the wiki a little how to or something similar ?
is it possible to use StreamingUpdateSolrServer with a php application ?
Yes. Each thread uses its own connection, and each becomes a new
thread in the servlet container.
On Mon, Mar 7, 2011 at 2:54 AM, Isan Fulia wrote:
Hi all,
I am using StreamingUpdateSolrServer with queuesize = 5 and threadcount=4
The number of connections created is the same as the thread count.
Does it create a new connection for every thread?
--
Thanks & Regards,
Isan Fulia.
Hi all.
I have designed a synchronizer that goes out to various databases,
extracts some data, does some processing, and then uses the
StreamingUpdateSolrServer to send the records to a Solr index. When
everything is up, it works just fine.
Now I'm trying to account for problems, li
There's an issue open for this:
https://issues.apache.org/jira/browse/SOLR-1565
I'm not sure off the top of my head how much is involved in making it
happen though.
-Yonik
http://www.lucidimagination.com
The streaming server won't use the RequestWriter you set; it uses a custom
XML request writer embedded in the StreamingUpdateSolrServer.
I was also hoping it would use a BinaryRequestWriter, but after digging
it turned out not to.
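For the non-streaming server, switching updates to the javabin protocol is one call (a sketch; the URL is illustrative, and per the reply above this has no effect on StreamingUpdateSolrServer):

```java
import org.apache.solr.client.solrj.impl.BinaryRequestWriter;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;

public class BinaryUpdates {
    public static void main(String[] args) throws Exception {
        CommonsHttpSolrServer server =
                new CommonsHttpSolrServer("http://localhost:8983/solr");
        // Send updates via the binary javabin format instead of XML.
        server.setRequestWriter(new BinaryRequestWriter());
    }
}
```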
On 1-7-2010 15:25, Jan Høydahl / Cominvent wrote:
Hi,
I had the impression that the StreamingUpdateSolrServer in SolrJ would
automatically use the /update/javabin UpdateRequestHandler. Is this not true?
Do we need to call
server.setRequestWriter(new BinaryRequestWriter()) for it to transmit content
with the binary protocol?
--
Jan Høydahl, sea
I'm prototyping using StreamingUpdateSolrServer. I want to send a commit
(or optimize) after I'm done adding all of my docs, rather than wait for the
autoCommit to kick in. However, since StreamingUpdateSolrServer is
multi-threaded, I can't simply call commit when I'm do
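One common pattern, sketched here since the replies in this archive are truncated: drain the background queue with blockUntilFinished() before committing (the URL and queue/thread sizes are illustrative):

```java
import org.apache.solr.client.solrj.impl.StreamingUpdateSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class DrainThenCommit {
    public static void main(String[] args) throws Exception {
        StreamingUpdateSolrServer server =
                new StreamingUpdateSolrServer("http://localhost:8983/solr", 20, 4);

        for (int i = 0; i < 100; i++) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "doc-" + i);
            server.add(doc);
        }

        // Wait for the worker threads to empty the queue, then commit once.
        server.blockUntilFinished();
        server.commit();
    }
}
```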
I'm trying to reproduce now... single thread adding documents to a
multithreaded client, StreamingUpdateSolrServer(addr,32,4)
I'm currently at the 2.5 hour mark and 100M documents - no issues so far.
-Yonik
Apache Lucene Eurocon 2010
18-21 May 2010 | Prague
On Thu, Apr 29, 2010
What is the garbage collection status when this happens?
What are the open sockets in the OS when this happens? Run 'netstat
-an | fgrep 8983' where 8983 is the Solr incoming port number.
A side note on sockets:
SUSS uses the MultiThreadedHttpConnectionManager but never calls
MultiThreadedHttpCo
On Thu, Apr 29, 2010 at 6:04 PM, Lance Norskog wrote:
> In solrconfig.xml, there is a parameter controlling remote streaming:
>
> <requestParsers enableRemoteStreaming="true"
>     multipartUploadLimitInKB="2048000" />
>
> 1) Is this relevant with the SUSS?
No, this relates to solr pulling data from another source (via stream.url
In solrconfig.xml, there is a parameter controlling remote streaming:
<requestParsers enableRemoteStreaming="true"
    multipartUploadLimitInKB="2048000" />
1) Is this relevant with the SUSS?
2) It seems to be 'true' in the example default, which may not be a good idea.
On Thu, Apr 29, 2010 at 2:12 PM, Yonik Seeley
wrote:
> On Fri, Apr 16, 2010 at 1:34 PM, Sascha
On Fri, Apr 16, 2010 at 1:34 PM, Sascha Szott wrote:
> In my case the whole application hangs and never recovers (CPU utilization
> goes down to near 0%). Interestingly, the problem reproducibly occurs only
> if SUSS is created with *more than 2* threads.
Is your application also using multiple t
Greetings!
I'm using StreamingUpdateSolrServer to index my daily Solr shards.
However, at midnight when I need to start indexing the next day's shard, is
there a way to reset the StreamingUpdateSolrServer URL to point to my new
shard, or is there a way to flush t
> Stephen, were you running stock Solr 1.4, or did you apply any of the
> SolrJ patches?
> I'm trying to figure out if anyone still has any problems, or if this
> was fixed with SOLR-1711:

I'm using the latest trunk version (rev. 934846) and constantly running into
the same problem. I'm using StreamingUpdateSolrServer with 3 threads and a
queue size of 20 (not really knowing if this configuration is optimal). My
multi-threaded application indexes 200k data items (bibliographic metadata
in Dublin Core format) and constantly hangs after running
Stephen, were you running stock Solr 1.4, or did you apply any of the
SolrJ patches?
I'm trying to figure out if anyone still has any problems, or if this
was fixed with SOLR-1711:
* SOLR-1711: SolrJ - StreamingUpdateSolrServer had a race condition that
could halt the streaming of docu
StreamingUpdateSolrServer logs "starting runner: ...", sends a POST
with ... and I guess also opens a new HTTP connection
every time it has managed to empty its queue. In
StreamingUpdateSolrServer.java it says this:
// info is ok since this should only happen once for each thread
I am trying to use the StreamingUpdateSolrServer to index a bunch of
bibliographic data and it is hanging up every time I run it. Sometimes
it hangs after about 100k records (after about 2 minutes), sometimes
after 4M records (after about 80 minutes) and all different intervals in
between. It
I'll have to defer that one for now.
2010/1/26 Tim Terlegård
trunk:
* SOLR-1595: StreamingUpdateSolrServer used the platform default character
set when streaming updates, rather than using UTF-8 as the HTTP headers
indicated, leading to an encoding mismatch. (hossman, yonik)
Could you try a recent nightly build (or build your own from trunk)
and see if it fi
<<< My indexing script has been running all
night and has accomplished nothing. I see lots of disk activity
though, which is weird.>>>
One explanation would be that you're memory-starved and
the disk activity you see is thrashing. How much memory
do you allocate to your JVM? A further indication
Hi,
I swapped our indexing process over to the streaming update server, but now I'm
seeing places where our indexing code adds several documents, but eventually
hangs. It hangs just before the completion message, which comes directly after
sending to solr. I found this issue in jira
https://is
The issue was sometimes a null result during facet navigation or simple
search; results were back after a refresh. We tried to change the cache
configuration, but saw the same behaviour.
That is strange. Just to make sure, you were using the same LBHttpSolrServer
instance for all requests, weren't you?
The issue was sometimes a null result during facet navigation or simple
search; results were back after a refresh. We tried to change the cache
configuration, but saw the same behaviour.
My implementation was (maybe wrong?):
LBHttpSolrServer solrServer = new LBHttpSolrServer(new HttpClient(), new
XMLResponse
On Mon, Jan 4, 2010 at 6:11 PM, Patrick Sauts wrote:
>
> I've also tested LBHttpSolrServer (we wanted to have it as a "backup" for
> HAproxy) and it appears not to be thread safe (what is also curious about
> it is that there's no way to manage the connection pool). If you're
> interested