Glad to hear it's working for you.
Erick
On Wed, Jul 22, 2015 at 10:56 AM, Vineeth Dasaraju wrote:
> Hi Erick,
>
> As you correctly pointed out, the main reason documents were
> disappearing was that I was assigning the same id to multiple documents.
> This got resolved after I used the UUID as suggested by Mohsen. Thank you
> for your inputs.
Sorry, the "ID" mistake was pointed out by Upayavira. Thank you Upayavira!
On Wed, Jul 22, 2015 at 10:56 AM, Vineeth Dasaraju wrote:
> Hi Erick,
>
> As you correctly pointed out, the main reason documents were
> disappearing was that I was assigning the same id to multiple documents.
> This got resolved after I used the UUID as suggested by Mohsen.
Hi Erick,
As you correctly pointed out, the main reason documents were
disappearing was that I was assigning the same id to multiple documents.
This got resolved after I used the UUID as suggested by Mohsen. Thank you
for your inputs.
Regards,
Vineeth
On Wed, Jul 22, 2015 at 9:39 AM, Erick Erickson wrote:
The other classic error is not sending the batch remaining at the end, but
at a glance that's not a problem for you: after the while loop
you send the batch, which catches any docs left over.
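A minimal, self-contained sketch of that whole pattern against the SolrJ 5.x
API, batching in groups of 1,000 with the leftover flush after the loop (the
URL, core name, and field names are made up for the example):

import java.util.ArrayList;
import java.util.List;
import java.util.UUID;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class BatchIndexer {
    public static void main(String[] args) throws Exception {
        // URL and core name are made up for the example.
        try (SolrClient client = new HttpSolrClient("http://localhost:8983/solr/mycore")) {
            List<SolrInputDocument> batch = new ArrayList<>();
            for (int i = 0; i < 10_500; i++) { // stand-in for the real document source
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", UUID.randomUUID().toString());
                doc.addField("title_s", "doc " + i);
                batch.add(doc);
                if (batch.size() == 1000) { // send in groups of 1,000
                    client.add(batch);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) { // the leftover partial batch after the loop
                client.add(batch);
            }
            client.commit();
        }
    }
}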
solr.user, might that be your problem? Because I've never seen
this happen.
On Tue, Jul 21, 2015 at 1:47 PM, Fadi M
In Java: UUID.randomUUID();
That is what I'm using.
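For example, a minimal sketch of putting that UUID on each document
(the title field is made up for the example):

import java.util.UUID;
import org.apache.solr.common.SolrInputDocument;

class UniqueIdExample {
    // Build a document whose id cannot collide with another's.
    static SolrInputDocument newDoc(String title) {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", UUID.randomUUID().toString());
        doc.addField("title_s", title);
        return doc;
    }
}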
Regards
> On 21 Jul 2015, at 22:38, Vineeth Dasaraju wrote:
>
> Hi Upayavira,
>
> I guess that is the problem. I am currently using a function for generating
> an ID. It takes the current date and time to milliseconds and generates the
> id.
Hi Upayavira,
I guess that is the problem. I am currently using a function for generating
an ID. It takes the current date and time to milliseconds and generates the
id. This is the function.
public static String generateID() {
    Date dNow = new Date();
    SimpleDateFormat ft = new SimpleDateFormat("yyyyMMddhhmmssSSS"); // format reconstructed; the original was cut off
    return ft.format(dNow);
}
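Worth noting why this collides: two documents built in the same millisecond
get the same id, so the later one silently overwrites the earlier. A quick
self-contained check, assuming the format string above:

import java.text.SimpleDateFormat;
import java.util.Date;

public class IdCollisionDemo {
    static String generateID() {
        // Millisecond-resolution timestamp, same scheme as above.
        return new SimpleDateFormat("yyyyMMddhhmmssSSS").format(new Date());
    }

    public static void main(String[] args) {
        // Back-to-back calls usually land in the same millisecond,
        // so the two ids collide.
        System.out.println(generateID().equals(generateID()));
    }
}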
Are you making sure that every document has a unique ID? Index into an
empty Solr, then look at your maxDocs vs numDocs. If they are different
(maxDocs is higher), then some of your documents have been deleted,
meaning some were overwritten.
That might be a place to look.
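One way to read both counters programmatically is a Luke request via SolrJ;
a sketch (URL and core name are made up, and the "numDocs"/"maxDoc" keys are
the ones the Luke handler returns in its index section):

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.LukeRequest;
import org.apache.solr.client.solrj.response.LukeResponse;
import org.apache.solr.common.util.NamedList;

public class DocCountCheck {
    public static void main(String[] args) throws Exception {
        // URL and core name are made up for the example.
        try (SolrClient client = new HttpSolrClient("http://localhost:8983/solr/mycore")) {
            LukeResponse resp = new LukeRequest().process(client);
            NamedList<Object> info = resp.getIndexInfo();
            // If maxDoc > numDocs, deleted (i.e. overwritten) documents exist.
            System.out.println("numDocs: " + info.get("numDocs"));
            System.out.println("maxDoc:  " + info.get("maxDoc"));
        }
    }
}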
Upayavira
On Tue, Jul 21
I can confirm this behavior: it shows up when sending JSON docs in a batch,
never happens when sending them one by one, but is sporadic with batches.
It's as if solr/jetty drops a couple of documents out of the batch.
Regards
> On 21 Jul 2015, at 21:38, Vineeth Dasaraju wrote:
>
> Hi,
>
> Thank you, Erick, for your inputs.
Hi,
Thank you, Erick, for your inputs. I tried creating batches of 1000 objects
and indexing them into solr. The performance is way better than before, but
I find that the number of indexed documents shown in the dashboard is lower
than the number of documents that I had actually indexed through solr.
First thing: it looks like you're only sending one document at a
time, perhaps with child objects. This is not optimal at all. I
usually batch my docs up in groups of 1,000, and there is anecdotal
evidence that there may (depending on the docs) be some gains above
that number. Gotta balance the batch size against memory, though.
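If you'd rather not manage the buffering yourself, SolrJ can also do the
batching for you; a sketch assuming ConcurrentUpdateSolrClient (URL, queue
size, and thread count are made up):

import java.util.UUID;
import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class ConcurrentIndexer {
    public static void main(String[] args) throws Exception {
        // Buffers up to 1,000 docs and sends them with 2 background threads.
        try (ConcurrentUpdateSolrClient client =
                 new ConcurrentUpdateSolrClient("http://localhost:8983/solr/mycore", 1000, 2)) {
            for (int i = 0; i < 10_000; i++) {
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", UUID.randomUUID().toString());
                doc.addField("title_s", "doc " + i);
                client.add(doc); // queued, not sent immediately
            }
            client.blockUntilFinished(); // wait for queued batches to drain
            client.commit();
        }
    }
}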
Hi,
I am trying to index JSON objects (which contain nested JSON objects and
arrays) into solr.
My JSON Object looks like the following (This is fake data that I am using
for this example):
{
  "RawEventMessage": "Lorem ipsum dolor sit amet, consectetur adipiscing elit. Aliquam dolor ..."
}
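One way to map nested objects like this onto Solr's flat documents is SolrJ
child documents. A sketch (the ids and the type field are made up; only
RawEventMessage comes from the sample above):

import java.util.UUID;
import org.apache.solr.common.SolrInputDocument;

class NestedDocExample {
    // Flatten one event plus a nested object into a parent/child pair.
    static SolrInputDocument toSolrDoc(String rawEventMessage) {
        SolrInputDocument parent = new SolrInputDocument();
        parent.addField("id", UUID.randomUUID().toString());
        parent.addField("RawEventMessage", rawEventMessage);

        SolrInputDocument child = new SolrInputDocument(); // one nested object
        child.addField("id", UUID.randomUUID().toString());
        child.addField("type_s", "childEvent"); // illustrative field
        parent.addChildDocument(child);
        return parent;
    }
}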