Thanks everyone. I got it working.
--
Craig Hoffman
w: http://www.craighoffmanphotography.com
FB: www.facebook.com/CraigHoffmanPhotography
TW: https://twitter.com/craiglhoffman
> On Oct 30, 2014, at 1:48 PM, Shawn Heisey wrote:
>
> You probably just need to put double quotes around the URL.
On 10/30/2014 1:27 PM, Craig Hoffman wrote:
> Thanks! One more question. wget seems to be choking on my URL, in particular
> on the # and & characters. What's the best way to escape them?
>
> http://:8983/solr/#/articles/dataimport//dataimport?command=full-import&clean=true&optimize=true
You probably just need to put double quotes around the URL.
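For example, a quoted invocation might look like this (a sketch only: <host> and <core> are placeholders in the style of the crontab entry further down this thread, not your actual host or core name):

/usr/bin/wget -O /dev/null "http://<host>:8983/solr/<core>/dataimport?command=full-import&clean=true&optimize=true"

Without the quotes the shell treats & as the background operator and cuts the URL off at the first &, which is why wget appears to choke. Note also that anything after # is a URL fragment and never gets sent to the server, so the sketch uses the plain /solr/<core>/dataimport endpoint rather than the admin-UI /#/ address.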
Thanks! One more question. wget seems to be choking on my URL, in particular
on the # and & characters. What's the best way to escape them?

http://:8983/solr/#/articles/dataimport//dataimport?command=full-import&clean=true&optimize=true
--
Craig Hoffman
w: http://www.craighoffmanphotography.com
FB: www.facebook.com/CraigHoffmanPhotography
TW: https://twitter.com/craiglhoffman
Simply add this line to your crontab, using the crontab -e command:

0,30 * * * * /usr/bin/wget http://<host>:8983/solr/<core>/dataimport?command=full-import

This will run a full import every 30 minutes. Replace <host> and <core> with your configuration.
*Using delta-import command*
A delta import can be started by hitting http://<host>:8983/solr/<core>/dataimport?command=delta-import (same placeholders as above).
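A scheduled version of that might look like the following (again a sketch: the 15-minute interval is an arbitrary choice of mine):

*/15 * * * * /usr/bin/wget -O /dev/null "http://<host>:8983/solr/<core>/dataimport?command=delta-import"

The quotes are not strictly required while the URL contains no &, but they keep the line safe if you later append parameters such as &clean=false.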
Do you mean DataImportHandler? If so, you can create full and
incremental queries and trigger them - from CRON - as often as you
would like. E.g. 1am nightly.
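As a concrete sketch of the nightly case (placeholders only; the endpoint form follows the crontab entry quoted elsewhere in this thread):

0 1 * * * /usr/bin/wget -O /dev/null "http://<host>:8983/solr/<core>/dataimport?command=full-import&clean=true"

paired, if needed, with a more frequent entry running command=delta-import for incremental updates between full rebuilds.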
Regards,
Alex.
On 30 October 2014 14:17, Craig Hoffman wrote:
> The data gets into Solr via a MySQL script.
Then you have to run it again and again.

On 30 Oct 2014 at 19:18, "Craig Hoffman" wrote:
> The data gets into Solr via a MySQL script.
> --
> Craig Hoffman
> w: http://www.craighoffmanphotography.com
> FB: www.facebook.com/CraigHoffmanPhotography
> TW: https://twitter.com/craiglhoffman
The data gets into Solr via a MySQL script.
--
Craig Hoffman
w: http://www.craighoffmanphotography.com
FB: www.facebook.com/CraigHoffmanPhotography
TW: https://twitter.com/craiglhoffman
> On Oct 30, 2014, at 12:11 PM, Craig Hoffman wrote:
>
> Right, of course. The data changes every few days.
Right, of course. The data changes every few days. According to this
article, you can run a CRON Job to create a new index.
http://www.finalconcept.com.au/article/view/apache-solr-hints-and-tips
On Thu, Oct 30, 2014 at 12:04 PM, Alexandre Rafalovitch wrote:
> You don't "reindex Solr". You reindex data into Solr.
You don't "reindex Solr". You reindex data into Solr. So this depends on
where your data is coming from and how often it changes. If the data does
not change, there is no point re-indexing it. And how do you get the data
into Solr in the first place?
Regards,
Alex.
Personal: http://www.outerthoughts.co