No, actually I've worked on replication as well, so both answers are interesting.
Ok, just saw that; I have to create a cron job that uses wget to hit the
delta-import every 5 minutes or so.
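Something like this crontab entry is what I have in mind (just a sketch; the
host, port, and the "books" core name are assumptions based on the URL below):

# every 5 minutes, trigger the delta-import and discard the response body
*/5 * * * * wget -q -O /dev/null "http://localhost:8983/solr/books/dataimport?command=delta-import"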

Am I doing something wrong or not?
Every time I start a delta-import manually
(.../dataimport?command=delta-import)
and then go back to check the status at http://.../solr/books/dataimport,
it's still running, as if it never finishes:

<str name="status">busy</str>
<str name="importResponse">A command is still running...</str>
<lst name="statusMessages">
<str name="Time Elapsed">0:13:54.194</str>
<str name="Total Requests made to DataSource">881696</str>
<str name="Total Rows Fetched">2418310</str>
<str name="Total Documents Processed">125956</str>
<str name="Total Documents Skipped">0</str>
<str name="Delta Dump started">2008-09-17 17:24:07</str>
<str name="Identifying Delta">2008-09-17 17:24:07</str>
<str name="Deltas Obtained">2008-09-17 17:24:49</str>
<str name="Building documents">2008-09-17 17:24:49</str>
<str name="Total Changed Documents">390796</str>
</lst>


This happens even right after a full-import. Where can I check (in the stats
or somewhere else) what the delta-import actually changed?

Could it be looping because, I have to admit, I didn't put a parentDeltaQuery
in my data-config (see the sketch below my current config)? Is that the reason?

 <entity name="books"
            pk="books.book_id"
            transformer="RegexTransformer"
            deltaQuery="SELECT book_id FROM book INNER JOIN user
USING(user_id)
                          WHERE book.modified >
'${dataimporter.last_index_time}'
                            OR user.modified  >
'${dataimporter.last_index_time}'"
            query="SELECT ..."
  >
 ....
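
If it helps to show what I mean, here is roughly how I understand
parentDeltaQuery would be used, following the pattern from the
DataImportHandler wiki (just a sketch; the user child entity, its columns,
and the user_id join are assumptions about my own schema):

 <entity name="books"
         pk="book_id"
         query="SELECT ... FROM book"
         deltaQuery="SELECT book_id FROM book
                     WHERE modified > '${dataimporter.last_index_time}'">
   <!-- child entity: parentDeltaQuery tells DIH which parent book rows
        must be re-indexed when a user row changes -->
   <entity name="user"
           pk="user_id"
           query="SELECT ... FROM user WHERE user_id = '${books.user_id}'"
           deltaQuery="SELECT user_id FROM user
                       WHERE modified > '${dataimporter.last_index_time}'"
           parentDeltaQuery="SELECT book_id FROM book
                             WHERE user_id = '${user.user_id}'"/>
 </entity>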

Shalin Shekhar Mangar wrote:
> 
> On Wed, Sep 17, 2008 at 8:12 PM, sunnyfr <[EMAIL PROTECTED]> wrote:
> 
>>
>> From what I understand, a collection is a Lucene collection, i.e. a
>> directory of files; these comprise the indexed and returnable data of a
>> Solr search repository.
>>
>> I just want to be sure, because this page talks about replication:
>>
>> http://wiki.apache.org/solr/CollectionDistribution#head-9f393ae2a6230fe23e422f1583f31edbff7b1007
>>
>> It synchronizes master and slave and applies different jobs to check
>> snapshots.
>> But my question about updates was essentially about keeping my MySQL
>> database and Solr's indexes in sync.
>> I just want to know: if somebody adds a book in my database, how can I be
>> sure it will be updated and committed in my indexes?
>>
> 
> 
> I had assumed that you wanted to sync the index between a master and
> slaves.
> Now I realize that your question was different.
> 
> Look at DataImportHandler and delta imports. You can run delta-imports
> through a cron job to keep Solr in sync with the database, at a frequency
> that matches how often the database is updated.
> 
> http://wiki.apache.org/solr/DataImportHandler
> 
> 
> 
> 
> -- 
> Regards,
> Shalin Shekhar Mangar.
> 
> 
