It's actually quite easy to index from a DB to Solr via SolrJ; here's
an example:
https://lucidworks.com/2012/02/14/indexing-with-solrj/
On Tue, Jul 10, 2018 at 6:06 AM, Shawn Heisey wrote:
> On 7/8/2018 9:44 AM, shruti suri wrote:
>>
>> I am using solr-6.1.0 version. This is the response I am getting.
On 7/8/2018 9:44 AM, shruti suri wrote:
I am using solr-6.1.0 version. This is the response I am getting. But every
time I run delta import, it fetches the same number of records but doesn't
commit them.
[DIH status response, markup stripped; per the standard status fields: Time
Elapsed 0:0:42.255, Total Requests made to DataSource 2, Total Rows Fetched
10208, Total Documents Processed 0, Total Documents Skipped 0, followed by
timestamps 2018-07-08 15:37:31, 2018-07-08 15:37:31, 2018-07-08 15:38:13,
2018-07-08 1]
Agreed. DIH is not an industrial-grade ETL tool, so you may want to consider
other options. You may want to look into Kafka Connect as an alternative; it
has connectors for JDBC into Kafka, and from Kafka into Solr.
--
Rahul Singh
rahul.si...@anant.us
Anant Corporation
On Jul 9, 2018, 6:14 AM -0500, Alex
I think you are moving so fast it is hard to understand where you need help.
Can you set up one clean, minimal issue (maybe as a test) and try our original
suggestions?
Otherwise, nobody has enough attention energy to figure out what is
happening.
And even then, this list is voluntary help, we are
Still not working; same issue, documents are not getting pushed to the index.
-
Regards
Shruti
I have also faced this problem when there is a composite primary key in the
table; below is the workaround I went with:
have deltaQuery retrieve the concatenated value with the time criteria (so that
it retrieves only modified rows), and use it in deltaImportQuery's WHERE clause.
On Sun, Jul 8, 2
The data config I am using now:
*managed-schema*
data_id
-
Regards
Shruti
My Oracle table doesn't have a primary key, and delta import requires one, so
I am creating it by concatenating 2 columns. For now, just for testing, I am
using only one column.
I am using solr-6.1.0 version. This is the response I am getting. But every
time I run delta import, i
Could you try something like this?
deltaQuery="select CONCAT(cast(col1 as varchar(10)) , name) as ID FROM
MyTable WHERE UPDATED_DATE > '${dih.last_index_time}'"
deltaImportQuery="select CONCAT(cast(col1 as varchar(10)) , name) as ID, *
from MyTable WHERE CONCAT(cast(col1 as varchar(10)) , name) =
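Spelled out as a complete entity, the pattern would look roughly like this (a
sketch only; the table and column names come from the snippet above, and the
${dih.delta.ID} reference assumes the usual DIH convention that the delta
variable is named after the pk alias, case-sensitively):

<entity name="MyTable" pk="ID"
        query="select CONCAT(cast(col1 as varchar(10)), name) as ID, * from MyTable"
        deltaQuery="select CONCAT(cast(col1 as varchar(10)), name) as ID
                    from MyTable WHERE UPDATED_DATE > '${dih.last_index_time}'"
        deltaImportQuery="select CONCAT(cast(col1 as varchar(10)), name) as ID, *
                          from MyTable
                          WHERE CONCAT(cast(col1 as varchar(10)), name) = '${dih.delta.ID}'"/>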
On 7/6/2018 5:53 AM, shruti suri wrote:
> Please help me with delta import from one Oracle table into Solr. I don't
> have any primary key in the table. We need to use a composite key using
> (LOCAL_MASTER_ID, LOCAL_ID).
>
>
> query="select * from dat
Which version of Solr is it, and in which way is it not working?
And shouldn't the deltaQuery and deltaImportQuery both have the "as ID" part?
Regards,
Alex.
On 6 July 2018 at 07:53, shruti suri wrote:
> HI,
>
> Please help me with delta import from one Oracle table into Solr. I don't
> have any
On 6/7/2018 12:22 AM, kotekaman wrote:
Should the delta-import use the timestamp in the SQL table?
The text above, and the subject, are the ONLY things I can see in this
message. Which makes this an extremely vague question. This wiki page
may be relevant:
https://wiki.apache.org/solr/Using
On 3/1/2017 8:48 AM, Liu, Daphne wrote:
> Hello Solr experts, Is there a place in Solr (Delta Import
> Datasource?) where I can adjust the JDBC connection frame size to 256
> mb ? I have adjusted the settings in Cassandra but I'm still getting
> this error. NonTransientConnectionException:
> org.ap
It also should be ${dataimporter.last_index_time}
Also, that's two queries - an outer query to get the IDs that are modified,
and another query (done repeatedly) to get the data. You can go faster
using a parameterized data import as described in the wiki:
http://wiki.apache.org/solr/DataImport
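The parameterized approach on that wiki page folds the two queries into one, so
each delta runs as a single full-import with clean=false. A minimal sketch,
assuming a last_modified column (table and column names hypothetical):

<!-- delta-via-full-import: all rows when clean=true, only changed rows when clean=false -->
<entity name="item" pk="id"
        query="select * from item
               where '${dih.request.clean}' != 'false'
                  or last_modified > '${dih.last_index_time}'"/>

You would then trigger deltas with /dataimport?command=full-import&clean=false.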
It looks like you are returning the transformed ID, along with some other
fields, in the deltaQuery command. deltaQuery should only return the ID,
without the "stk_" prefix, and then deltaImportQuery should retrieve the
transformed ID. I'd suggest:
I'm not sure which RDBMS you are using, bu
good.
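(The suggested queries did not survive the archive; from the description, the
shape would be something like the sketch below, with hypothetical table and
column names.)

deltaQuery="select id from stocks
            where last_modified > '${dih.last_index_time}'"
deltaImportQuery="select CONCAT('stk_', id) as id, other_fields
                  from stocks where id = '${dih.delta.id}'"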
On 8/5/2014 7:31 AM, Jako de Wet wrote:
> Thanks for the insight. Why the size increase when not specifying the clean
> parameter then? The PK for the documents remain the same throughout the
> whole import process.
>
> Should a full optimize combine all the results into one and decrease the
> phy
Hi Shawn
Thanks for the insight. Why the size increase when not specifying the clean
parameter then? The PK for the documents remain the same throughout the
whole import process.
Should a full optimize combine all the results into one and decrease the
physical size of the core?
On Tue, Aug 5, 2
On 8/5/2014 7:20 AM, Jako de Wet wrote:
> I have a Solr Index that has 20+ million products, the core is about 70GB.
>
> What I would like to do, is a weekly delta-import, but it seems to be
> growing in size each week. (Currently its running a full-import +
> clean=false)
>
> Shouldn't the Delta
: As Ahmet indicated, you must have a way to detect that deletions have
: happened. Marking rows as deleted with an active/inactive field is one
: way. Another way (the way that we use) is to have a delete trigger on
: the table that creates an entry in a delete tracking table.
If you have no c
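Either way, the detected deletions plug into DIH through deletedPkQuery. A
minimal sketch, assuming a delete trigger that fills an item_deletes tracking
table (all names hypothetical):

<entity name="item" pk="id"
        query="select * from item"
        deltaQuery="select id from item
                    where last_modified > '${dih.last_index_time}'"
        deltaImportQuery="select * from item where id = '${dih.delta.id}'"
        deletedPkQuery="select id from item_deletes
                        where deleted_at > '${dih.last_index_time}'"/>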
On 7/11/2014 1:04 AM, madhav bahuguna wrote:
> How do I make delta import detect deleted values?
> I do have a timestamp column to detect changes, but the requirement is such
> that rows will be deleted from the table. Every time I run delta import, the
> index still shows the deleted value.
> How d
Hi,
You need a soft-deletion column. Don't delete the entry; instead, mark it as
deleted/inactive, etc.
Ahmet
On Friday, July 11, 2014 10:12 AM, madhav bahuguna
wrote:
Hi,
How do I make delta import detect deleted values?
I do have a timestamp column to detect changes, but the requirement is s
I thought about this, and is it possible that for delta imports Solr is
expecting a persistent cache, like the old BerkeleyBackedCache? I can't
imagine any other reason why it wouldn't run the subentity queries on delta
imports.
- Drew
On Fri, Jun 13, 2014 at 4:37 PM, Drew Mazurek wrote:
> Sur
Sure... here's the document stanza, simplified:
In the log, I'm seeing the "profile" deltaQuery run (which in testing is
correctly returning 0 records), and then the "object" deltaQuery runs,
which returns 1 record. Then it logs that the delta import is
On 14 June 2014 00:36, Drew Mazurek wrote:
>
> A little more info... removing the cache fixes everything. Are delta
> queries incompatible with the cache? There isn't a lot of current
> documentation on this, far as I can tell.
Not quite au courant myself with 4.8.1 (need to install new Java so
A little more info... removing the cache fixes everything. Are delta
queries incompatible with the cache? There isn't a lot of current
documentation on this, far as I can tell.
Thanks,
Drew
On Fri, Jun 13, 2014 at 2:46 PM, Drew Mazurek wrote:
> I'm running into an issue with the SqlEntityPro
Hi,
I tried the way you said, but it's still not working. I am sharing the
screenshots for your reference.
Thanks for the help.
Hi,
I think you need to select * in deltaImportQuery. You are just selecting one
field in both delta*Query SQL statements.
On Thursday, June 5, 2014 3:34 PM, ajay59 wrote:
Hi,
We are using the SOLR 4.6 version and trying to implement Delta import
functionality .On implementing the delta impo
Hi Furkan,
sure, this is my data-config.xml:
...
...
...
Currently I have 2.1 million activities.
Thanks a lot,
Richard
2014-03-12 19:16 GMT-04:00 Furkan KAMACI :
> Hi;
>
> Could you send your data-config.xml?
>
> Tha
Hi;
Could you send your data-config.xml?
Thanks;
Furkan KAMACI
2014-03-13 1:01 GMT+02:00 Richard Marquina Lopez :
> Hi Ahmet,
>
> Thank you for your response; currently I have the following configuration
> for the JVM:
> -XX:+PrintGCDetails -XX:-UseParallelGC -XX:SurvivorRatio=8 -XX:NewRatio=2
> -XX:+Heap
Hi Ahmet,
Thank you for your response; currently I have the following configuration for
the JVM:
-XX:+PrintGCDetails -XX:-UseParallelGC -XX:SurvivorRatio=8 -XX:NewRatio=2
-XX:+HeapDumpOnOutOfMemoryError -XX:PermSize=128m -XX:MaxPermSize=256m
-Xms1024m -Xmx2048m
I have 3.67 GB of physical RAM and 2GB is assigned t
Hi Richard,
How much RAM do you assign to the Java heap? Try increasing it to 1 GB, for example.
Please see : https://wiki.apache.org/solr/ShawnHeisey
Ahmet
On Wednesday, March 12, 2014 10:53 PM, Richard Marquina Lopez
wrote:
Hi,
I have some problems when executing the delta import with 2 millio
I think the issue was with deltaImportQuery; it is case-sensitive. I was using
'${dataimporter.delta.clai_idn}'
instead of '${dataimporter.delta.CLAI_IDN}'
: Subject: "delta-import" giving Total Documents Processed = 0
: I am using solr 4.3.1; during delta-import, I am always getting "Total
: Documents Processed" as 0 even though it is getting the changed documents.
: And no error in the log. I tried with "dih" instead of "dataimporter", still
: same
I am aware of this. My actual delta query is like the one below; to test the
issue, I restricted the delta query to one record earlier.
deltaQuery ="select distinct clai_idn as clai_idn from claim_history where
TO_CHAR(EVENT_DTE , '-MM-D
Your delta query i.e.
deltaQuery ="select distinct clai_idn as clai_idn from claim_history where
clai_idn=29">
always gets only one row with a fixed "clai_idn". So here you fetch the same
row. What you would want is to get all rows after a p
I did try with the dih namespace and that didn't seem to make any difference. Since
the PK is a composite in my case, just specifying the bib_id was throwing an
exception stating "could not find the matching pk column" or something to that
effect. Although I realize the use cases for using one or th
On 6/2/2013 10:11 AM, PeriS wrote:
> I found using the strategy mentioned at
> http://wiki.apache.org/solr/DataImportHandlerDeltaQueryViaFullImport, it
> works for me. Not sure what the difference is between this one and writing
> individual queries for fetching the IDs first and then getting th
Shawn,
I found using the strategy mentioned at
http://wiki.apache.org/solr/DataImportHandlerDeltaQueryViaFullImport, it works
for me. Not sure what the difference is between this one and writing individual
queries for fetching the IDs first and then getting the data; I mean I know the
differen
Shawn,
The db-import-config.xml snippet can be found here: http://apaste.info/sTUw
Thanks
-Peri.S
On Jun 2, 2013, at 11:15 AM, Shawn Heisey wrote:
> On 6/2/2013 8:45 AM, PeriS wrote:
>> Ok, so I fixed the issue by providing the pk="" in the entity definition as
>> mentioned in
>> http://wiki.
On 6/2/2013 8:45 AM, PeriS wrote:
> Ok, so I fixed the issue by providing the pk="" in the entity definition as
> mentioned in
> http://wiki.apache.org/solr/DataImportHandler#Using_delta-import_command
>
> I also have a transformer declared for the entity and the DIH during the
> deltaImport do
Ok, so I fixed the issue by providing the pk="" in the entity definition as
mentioned in
http://wiki.apache.org/solr/DataImportHandler#Using_delta-import_command
I also have a transformer declared for the entity and the DIH during the
deltaImport doesn't seem to be passing all the fields to the
BTW, the primary key is a combination of 2 fields, so I'm not sure if that's
the issue.
On Jun 2, 2013, at 1:08 AM, PeriS wrote:
> I have configured the delta query properly, but not sure why the DIH is
> throwing the following error;
>
> SEVERE: Delta Import Failed
> java.lang.RuntimeException: ja
Hi Shawn;
and first off, thanks bunches for your pointers.
On Tue, 28 May 2013 09:31:54 -0600, Shawn Heisey wrote:
> My workaround was to store the highest indexed autoincrement value in
> a location outside Solr. In my original Perl code, I dropped it into
> a file on NFS. The latest iterati
On 5/28/2013 12:31 AM, Kristian Rink wrote:
(a) The usual tutorials outline something like
WHERE LASTMODIFIED > '${dih.last_index_time}
[snip]
(b) I see that "last_index_time" returns a particularly fixed format.
In our database, with a modestly more complex SELECT, we also could
figure out
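(If the fixed format is the obstacle: DIH in Solr 4.1+ lets you control how
last_index_time is written via a propertyWriter element in the data config; a
sketch:)

<propertyWriter dateFormat="yyyy-MM-dd HH:mm:ss" type="SimplePropertiesWriter"
                directory="conf" filename="dataimport.properties"/>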
> -Original Message-
> From: Keith Naas [mailto:keithn...@dswinc.com]
> Sent: Tuesday, May 14, 2013 3:31 PM
> Stepping through the code on a live instance we can see the cache being
> "disabled" by the destroy calls after each root doc. This destruction causes
> EntityProcessorBase to ch
-Original Message-
From: Keith Naas [mailto:keithn...@dswinc.com]
Sent: Tuesday, May 14, 2013 3:31 PM
To: solr-user@lucene.apache.org
Subject: RE: delta-import and cache (a story in conflict)
We had the same thought about the toString being the symptom.
We have performed heap analysis on four separate heap 4GB
nt.com]
Sent: Tuesday, May 14, 2013 4:08 PM
To: solr-user@lucene.apache.org
Subject: RE: delta-import and cache (a story in conflict)
The reason it is writing all the input fields for that document is that this
particular error message appends "doc" to the end, which is a subclass of
SolrIn
The reason it is writing all the input fields for that document is that this
particular error message appends "doc" to the end, which is a subclass of
SolrInputDocument, which has a "toString" that shows all the fields. Not sure
if this in particular changed, but I suspect this is a symptom, not a ca
I have made a unique key in schema.xml and now it's working for me.
Thanks a lot.
Regards,
-
Suneel Pandey
Sr. Software Developer
> but every time when I am executing delta-import through DIH
> it picked only changed data, which is OK, but rather than updating,
> it is adding duplicate records.
Do you have <uniqueKey>...</uniqueKey> defined in your schema.xml?
http://wiki.apache.org/solr/UniqueKey
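For reference, the relevant schema.xml pieces look like this (field name
hypothetical); the uniqueKey is what makes a re-imported row overwrite the old
document instead of adding a duplicate:

<field name="id" type="string" indexed="true" stored="true" required="true"/>
<uniqueKey>id</uniqueKey>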
Hi Guys,
I probably found a way to mimic the delta import for the FileEntityProcessor
(I have used it for XML files).
Adding this configuration in the xml-data-config:
And using the command:
command=full-import&clean=false
Solr adds to the index only the files that were changed from the
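(The stripped configuration presumably relies on FileListEntityProcessor's
newerThan attribute; a sketch of the idea, with hypothetical paths and
patterns:)

<entity name="files" processor="FileListEntityProcessor"
        baseDir="/var/data/xml" fileName=".*\.xml"
        recursive="true" rootEntity="false" dataSource="null"
        newerThan="${dih.last_index_time}">
  <!-- inner entity that actually parses each file goes here -->
</entity>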
I am using Solr 3.4 and configured my DataImportHandler to get some data from
MySQL as well as index some rich documents from the disk.
This is the part of the db-data-config file where I am indexing rich-text
documents.
http://localhost/resumes-new/resumes${re
When I set my fileSize field to type string, it shows the error I have posted
above. Then I changed it to slong and the result was severe; here is the log:
18 Nov, 2011 3:00:54 PM
org.apache.solr.response.BinaryResponseWriter$Resolver getDoc
WARNING: Error reading a field from document : SolrDocument[{}]
Sorry for disturbing you all... actually I had to add plong instead of type
string.
My problem is solved.
Be ready for a new thread.
CHEERS
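(In schema.xml terms the fix amounts to something like the line below; the
plong type name is as the poster reported, from the example schema:)

<field name="fileSize" type="plong" indexed="true" stored="true"/>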
I ran this command and can see the size of my files:
http://localhost:8080/solr/select?q=user&f.fileSize.facet.range.start=100
Great, thanks... string worked... I don't know why that did not work last time.
But when I do that in the browse section, I see the following output in my logs:
SEVERE: Exception during
And I also set my fileSize field to type long. "String" will not work, I think!
Size cannot be a string; it shows an error when using string as the type.
Thanks for your reply, I performed these steps.
In data-config.xml:
In schema.xml:
But still there is no response in the browse section. I edited facet_r
> Now, I want to index my files according to their size and
> facet them
> according to their size ranges. I know that there is an
> option of "fileSize"
> in FileListEntityProcessor but I am not getting any way to
> perform this.
> Is fileSize a metadata?
You don't need a dynamic field for this.
Thank you for your replies, guys; that helped a lot. Thanks,
"iorixxx", that was the command that worked out.
I also tried my Solr with MySQL and that worked too. Congo! :)
Now, I want to index my files according to their size and facet them
according to their size ranges. I know that t
And you cannot update-in-place. That is, you can't update
just selected fields in a document, you have to re-index the
whole document.
Best
Erick
On Mon, Nov 14, 2011 at 6:11 AM, Ahmet Arslan wrote:
>
>> Thanks for your reply...my
>> data-config.xml is
>>
>> > type="BinF
> Thanks for your reply...my
> data-config.xml is
>
> type="BinFileDataSource" name="bin"/>
>
> name="f" pk="id" processor="FileListEntityProcessor"
> recursive="true"
> rootEntity="false"
> dataSource="null" baseDir="/var/data/solr"
> fileName=
Thanks for your reply...my data-config.xml is
> Thanks for your reply Mr. Erick
> All I want to do is that I have indexed some of my pdf
> files and doc files.
> Now, any changes I make to them, I want a
> delta-import(incremental) so that
> I do not have to re index whole document by full import .
> Only changes made
> to these documents shou
and the changes are: file content, and maybe I change its author and headers
Thanks for your reply, Mr. Erick.
All I want to do is this: I have indexed some of my PDF and DOC files.
Now, for any changes I make to them, I want a delta-import (incremental) so that
I do not have to re-index the whole document set with a full import. Only
changes made to these documents should get updated.
Can you give more details about what you're trying to do? It
looks like you're using DataImportHandler? What defines a
document needing to be re-indexed? How do you expect to
be able to identify them???
Perhaps you can review:
http://wiki.apache.org/solr/UsingMailingLists
Best
Erick
On Sat, Nov
That did the trick! Thanks!
On Tue, Jul 12, 2011 at 11:34 AM, PeterKerk wrote:
> Hi Rahul,
>
> Not sure how I would do this "Try adding the primary key attribute to the
> root entity 'ad'"?
>
> In my entity ad I already have these fields (I left those out earlier for
> readability):
><-- this is primary key of ads tab
Hi Rahul,
Not sure how I would do this: "Try adding the primary key attribute to the
root entity 'ad'"?
In my entity ad I already have these fields (I left those out earlier for
readability):
<-- this is primary key of ads table
Is that what you mean?
And I'm using MSSQL2008
Thanks!
Hi Peter,
Try adding the primary key attribute to the root entity 'ad' and check if
delta import works.
By the way, which database are you using?
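(Concretely, since the original config was stripped from the archive, a sketch
with hypothetical table and column names:)

<!-- pk must name the column the deltaQuery returns -->
<entity name="ad" pk="id"
        query="select * from ads"
        deltaQuery="select id from ads where updated > '${dih.last_index_time}'"
        deltaImportQuery="select * from ads where id = '${dih.delta.id}'">
  <!-- field mappings here -->
</entity>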
On Tue, Jul 12, 2011 at 10:27 AM, PeterKerk wrote:
>
> I'm having an issue with a delta import.
>
> I have the following in my data-config.xml:
>
>
The SolrEntityProcessor would be a top-level entity. You would do a
query like this: &sort=timestamp,desc&rows=1&fl=timestamp. This gives
you one data item: the timestamp of the last item added to the index.
With this, the JDBC sub-entity would create a query that chooses all
rows with a timestamp
Thank you for your response.
In what way is 'timestamp' not perfect?
I've looked into the SolrEntityProcessor and added a timestamp field to our
index.
However I'm struggling to work out a query to get the max value of the
timestamp field,
and does the SolrEntityProcessor entity appear before the
The timestamp thing is not perfect. You can instead do a search
against Solr and find the latest timestamp in the index. SOLR-1499
allows you to search against Solr in the DataImportHandler.
On Fri, Jan 21, 2011 at 2:27 AM, btucker wrote:
>
> Hello
>
> We've just started using solr to provide sea
id in last_id_table (for the next
delta-import) in addition to returning the data from the query.
Ephraim Ofir
-Original Message-
From: Shawn Heisey [mailto:s...@elyograg.org]
Sent: Friday, September 10, 2010 4:54 AM
To: solr-user@lucene.apache.org
Subject: Re: Delta Import with some
> Can you provide a sample of passing the parameter via URL? And how using it
> would look in the data-config.xml
http://wiki.apache.org/solr/DataImportHandler#Accessing_request_parameters
On 9/9/2010 1:23 PM, Vladimir Sutskever wrote:
Shawn,
Can you provide a sample of passing the parameter via URL? And how using it
would look in the data-config.xml
Here's the URL that I send to do a full build on my last shard:
http://idxst5-a:8983/solr/build/dataimport?command=full-import
Subject: Re: Delta Import with something other than Date
On 9/8/2010 4:32 PM, David Yang wrote:
> I have a table that I want to index, and the table has no datetime
> stamp. However, the table is append only so the primary key can only go
> up. Is it possible to store the last primary key
On 9/8/2010 4:32 PM, David Yang wrote:
I have a table that I want to index, and the table has no datetime
stamp. However, the table is append only so the primary key can only go
up. Is it possible to store the last primary key, and use some delta
query="select id where id>${last_id_value}"
I
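A sketch of how that could be wired up, following the last_id_table idea
mentioned above in this digest (last_id_table and all column names
hypothetical):

<entity name="item" pk="id"
        query="select * from item"
        deltaQuery="select id from item
                    where id > (select last_id from last_id_table)"
        deltaImportQuery="select * from item where id = '${dih.delta.id}'"/>
<!-- after each import, set last_id_table.last_id to max(item.id), outside DIH -->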
On 09.09.2010, at 00:44, David Yang wrote:
> Currently DIH delta import uses the SQL query of type "select id from
> item where last_modified > ${dataimporter.last_index_time}"
> What I need is some field like ${dataimporter.last_primary_key}
> wiki.apache.org/solr/DataImportHandler
> I am thinki
Of course you can store whatever you want in a solr index. And if you
store an integer as a Solr 1.4 "int" type, you can certainly query for
all documents that have greater than some specified integer in a field.
You can't use SQL to query Solr though.
I'm not sure what you're really asking?
Hi,
Make sure you use a proper "ID" field, which does *not* change even if the
content in the database changes. In this way, when your delta-import fetches
changed rows to index, they will update the existing rows in your index.
--
Jan Høydahl, search solution architect
Cominvent AS - www.comin
Short answer is no, there isn't a way. Solr doesn't have the concept of
'Update' to an indexed document. You need to add the full document (all
'columns') each time any one field changes. If doing that in your
DataImportHandler logic is difficult you may need to write a separate Update
Service tha
I found my problem! It was a bad custom EntityProcessor I wrote.
My EntityProcessor wasn't checking for hasNext() on the Iterator from my
FileImportDataImportHandler, it was just returning next(). The second bug
was that when the Iterator ran out of records it was returning an empty
Map (it now r
I'm not certain, but I think what you want is something like this...
deltaQuery="select '${dataimporter.request.do_this_id}'"
deltaImportQuery="select ... from destinations
where DestID='${dataimporter.delta.id}'
"
...and then hit the handler with a URL like..
/da
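The truncated URL would look something like this (core path hypothetical; the
do_this_id parameter matches the request variable in the snippet above):

/dataimport?command=delta-import&do_this_id=42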
The only way is to backport the patch to 1.3. If you are comfortable
doing that, just modify the relevant code and do an "ant dist" to get
the jar.
On Tue, Aug 18, 2009 at 11:42 AM, djain101 wrote:
>
> How can i get the version of DIH which fixes this issue and is compatible
> with 1.3?
>
>
> Noble
How can I get the version of DIH which fixes this issue and is compatible
with 1.3?
Noble Paul നോബിള് नोब्ळ्-2 wrote:
>
> OK, I thought you were using an older version of 1.4. the new DIH is
> not compatible with 1.3
>
> On Tue, Aug 18, 2009 at 11:37 AM, djain101
> wrote:
>>
>> I replaced th
OK, I thought you were using an older version of 1.4. The new DIH is
not compatible with 1.3.
On Tue, Aug 18, 2009 at 11:37 AM, djain101 wrote:
>
> I replaced the dataimporthandler.jar from 8/7/2009 build in WEB-INF/lib of
> solr.war but on restarting of JBOSS, it threw me following exception but i
I replaced the dataimporthandler.jar from the 8/7/2009 build in WEB-INF/lib of
solr.war, but on restarting JBoss, it threw the following exception; if
I revert back to the 1.3 jar, then it loads the class fine. Is there any
compatibility issue between the latest dataimporthandler.jar and solr1.3.war?
INFO:
http://people.apache.org/builds/lucene/solr/nightly/
you can just replace the dataimporthandler jar in your current
installation and it should be fine
On Tue, Aug 18, 2009 at 11:18 AM, djain101 wrote:
>
> Can you please point me to the url for downloading latest DIH? Thanks for
> your help.
>
>
>
Can you please point me to the URL for downloading the latest DIH? Thanks for
your help.
Noble Paul നോബിള് नोब्ळ्-2 wrote:
>
> you can take a nightly of DIH jar alone. It is quite stable
>
> On Tue, Aug 18, 2009 at 8:21 AM, djain101 wrote:
>>
>> Looks like this issue has been fixed on Sept 20, 2
you can take a nightly of DIH jar alone. It is quite stable
On Tue, Aug 18, 2009 at 8:21 AM, djain101 wrote:
>
> Looks like this issue has been fixed on Sept 20, 2008 against issue SOLR-768.
> Can someone please let me know which one is a stable jar after Sept 20,
> 2008.
>
>
>
> djain101 wrote:
>
Looks like this issue was fixed on Sept 20, 2008 against issue SOLR-768.
Can someone please let me know which jar is stable after Sept 20,
2008?
djain101 wrote:
>
> After debugging dataimporter code, i found that it is a bug in the
> dataimporter 1.3 code itself. doFullImport() in
After debugging the dataimporter code, I found that it is a bug in the
dataimporter code itself. doFullImport() in the DataImporter class is not
loading the last index time, whereas doDeltaImport() is. The code snippet from
doFullImport() is:
if (requestParams.commit)
setIndexStartTime(new Date());
Yes, the database and Solr are different machines and their dates are not
synchronized. Could that be the issue? Why would the date difference between
the Solr and DB machines prevent the timestamp from the dataimport.properties
file from being applied?
Thanks,
Dharmveer
Avlesh Singh wrote:
>
> Solr and your database are di
Solr and your database are different machines? If yes, are their dates
synchronized?
If you have access to your database server logs, looking at the queries that
DIH generated might help.
Cheers
Avlesh
On Mon, Aug 17, 2009 at 11:40 PM, djain101 wrote:
>
> Any help?
Any help?
Thanks for your response. It is not empty; it contains the following:
#Sat Aug 15 16:44:18 PDT 2009
last_index_time=2009-08-15 16\:44\:17
Noble Paul നോബിള് नोब्ळ्-2 wrote:
>
> actually your dataimport.properties is empty , I guess that is the reason
>
> On Sun, Aug 16, 2009 at 5:19 AM, djain101