Hi Gora and Roman,
Thank you for your valuable comments; I am trying to follow your suggestions.
I will notify you when it's done.
If I face any problems integrating Solr, please help me in the same way in
the future.
Thank you,
Ashim
Hi,
It is probably correct to revisit your design/requirements, but if you
still find you need it, then there may be a different way.
DIH uses a writer to commit documents; you can detect errors inside it
and try to recover - i.e., in some situations you want to commit
instead of calling rollback.
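As a rough illustration of the "commit what succeeded instead of rolling everything back" idea, here is a minimal sketch. The `Writer` class and `index_all` function are hypothetical stand-ins, not DIH's actual writer interface:

```python
# Hypothetical sketch: keep successfully indexed documents on failure
# rather than discarding them with a rollback.

class Writer:
    """Toy writer with commit/rollback semantics (illustrative only)."""
    def __init__(self):
        self.pending = []      # docs added since the last commit
        self.committed = []    # docs that have been made durable

    def add(self, doc):
        self.pending.append(doc)

    def commit(self):
        self.committed.extend(self.pending)
        self.pending = []

    def rollback(self):
        self.pending = []      # discard everything since the last commit


def index_all(writer, docs):
    """Index docs one by one; on the first bad document, commit the
    documents that succeeded instead of rolling them back."""
    for doc in docs:
        try:
            if doc.get("id") is None:
                raise ValueError("document has no id")
            writer.add(doc)
        except ValueError:
            writer.commit()    # keep the good documents
            return False
    writer.commit()
    return True
```

The point of the pattern is only the `except` branch: recover by committing the partial batch rather than calling rollback.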
On 21 January 2013 17:06, ashimbose wrote:
[...]
> Here I used two data config
> 1. data_conf1.xml
> 2. data_conf2.xml
[...]
Your configuration looks fine.
> Either one of them runs fine at a single instant. That is,
> if I run the first dataimport, it will index successfully; if after that I run
> the second dataimport [...]
Do you have programming skills? If so, I'd suggest you write your own
importer that allows you to control precisely what it is you are trying
to do. The DIH, in my book, is a great generic tool for low- to
medium-complexity tasks. It very much appears you are pushing beyond its
limits [...]
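A custom importer along those lines can be quite small. The sketch below uses Python with sqlite3 standing in for the real JDBC source; the Solr URL, core name, and table layout are assumptions to adapt to your setup:

```python
# Minimal custom-importer sketch (alternative to DIH).
# sqlite3 stands in for the real data source; the Solr URL and core
# name ("collection1") are placeholders.
import json
import sqlite3
import urllib.request


def fetch_docs(conn, table):
    """Read every row of one table and flatten it into a Solr-style dict."""
    # Table name should come from a trusted, known list of tables.
    cur = conn.execute("SELECT * FROM %s" % table)
    cols = [d[0] for d in cur.description]
    return [dict(zip(cols, row)) for row in cur.fetchall()]


def post_docs(docs, solr_url="http://localhost:8983/solr/collection1"):
    """POST the documents to Solr's JSON update handler and commit."""
    req = urllib.request.Request(
        solr_url + "/update?commit=true",
        data=json.dumps(docs).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Because you own the loop, you decide per table how to batch, retry, or skip on error - exactly the control DIH does not give you.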
Hi Gora,
Thank you for your suggestion.
I have tried your option below,
>* Have never tried this, but one can set up multiple request handlers
> in solrconfig.xml for each DIH instance that one plans to run.
> These can run in parallel rather than the sequential indexing of
> root entities [...]
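The multi-handler setup suggested above would look roughly like this in solrconfig.xml; the handler names and config file names are illustrative:

```xml
<!-- One DIH request handler per data-config file; each can be
     triggered independently, e.g. /dataimport1?command=full-import -->
<requestHandler name="/dataimport1"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">data_conf1.xml</str>
  </lst>
</requestHandler>

<requestHandler name="/dataimport2"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">data_conf2.xml</str>
  </lst>
</requestHandler>
```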
Hi Gora,
I will have archive data in some format like relational data, provided by
the client. It may have some relations, but I will not know them. I have to
index that data without restoring it to my SQL DB. That means I have to read
the data from the archive file directly, and this will be fully automated.
[...]
On 18 January 2013 15:04, ashimbose wrote:
> Hi Gora,
>
> Thank you for your reply again,
>
> Joining is not possible in my case, because there is no relation between
> the tables. Is joining possible without any relation in this Solr case?
No, one needs some kind of a relationship to join.
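For reference, when tables do have a relationship, a DIH join is expressed with nested entities in the data-config file; the table and column names below are made up:

```xml
<document>
  <!-- Outer entity: one Solr document per parent row -->
  <entity name="parent" query="SELECT id, title FROM parent_table">
    <!-- Inner entity: joined via the parent's primary key -->
    <entity name="child"
            query="SELECT note FROM child_table WHERE parent_id='${parent.id}'"/>
  </entity>
</document>
```

Without some shared key like `parent_id`, there is nothing for the inner query to join on.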
On 18 January 2013 13:16, ashimbose wrote:
> Hi Gora,
>
> Thank you for your quick reply.
>
> I have only one data source, but more than 300 tables. I have put each
> table individually in data-config.xml.
>
> But when I try to do a full import, it shows only that much: 169.
> This 169 means I took 169 tables from my data source, and each of the 169 [...]
On 18 January 2013 12:49, ashimbose wrote:
> Hi Otis,
>
> Thank you for your reply.
>
> But I am unable to get any search results related to the error code. It does
> not respond for more than 168 data sources. I have tested it. If you have any
> other solution, please let me know.
Not sure about the l
ashimbose,
It is possible that this is happening because Solr reaches a point where
it is doing so many simultaneous merges that ongoing indexing is stopped
until a huge merge finishes. This causes the JDBC driver to time out
and disconnect, and there is no viable generic way to recover from that.
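If runaway merges are indeed stalling indexing, one knob worth trying is the merge scheduler configuration in solrconfig.xml; the values below are illustrative starting points, not recommendations:

```xml
<indexConfig>
  <!-- Allow more merges to queue up and cap concurrent merge threads,
       so indexing is throttled less aggressively during big merges -->
  <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
    <int name="maxMergeCount">6</int>
    <int name="maxThreadCount">2</int>
  </mergeScheduler>
</indexConfig>
```

Raising the JDBC driver's own connection/read timeout on the DIH data source is the complementary fix on the database side.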
Hi,
It looks like this is the cause:
JBC0016E: Remote call failed
(return code=-2,220). SDK9019E: internal error SDK9019X:
Interestingly, Google gives just 1 hit for the above as query - your post.
But it seems you should look up what the above codes mean first...
Otis
--
Solr & ElasticSearch [...]