Hi,
I set up Solr 4.2 under Apache Tomcat on a Windows machine. I created a
solr.xml context file under catalina/localhost that holds the solr/home
path. I have only one core, so the solr.xml under the Solr instance
directory looks like:
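[The file contents did not come through in the archive. For reference, a
typical single-core solr.xml for Solr 4.2 looks roughly like this; the
core name and instance directory are placeholders, not the poster's
actual values:

<?xml version="1.0" encoding="UTF-8"?>
<solr persistent="true">
  <cores adminPath="/admin/cores" defaultCoreName="collection1">
    <core name="collection1" instanceDir="collection1"/>
  </cores>
</solr>

And a typical Tomcat context file under catalina/localhost/solr.xml,
again with placeholder Windows paths:

<?xml version="1.0" encoding="UTF-8"?>
<Context docBase="C:/solr/solr.war" debug="0" crossContext="true">
  <Environment name="solr/home" type="java.lang.String"
               value="C:/solr/home" override="true"/>
</Context>]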
After starting the Tomcat service, I did not find the core on the admin
page. I checked catalina.out and there are no exceptions at all.
Yes, I do. I installed the Solr example instance.
Engy.
--
View this message in context:
http://lucene.472066.n3.nabble.com/Can-not-find-solr-core-on-admin-page-after-setup-tp4098236p4098380.html
Sent from the Solr - User mailing list archive at Nabble.com.
Hi Bayu,
I did that, but for Solr 4.2 the catalina.out has no exceptions at all.
Thanks
--
View this message in context:
http://lucene.472066.n3.nabble.com/Can-not-find-solr-core-on-admin-page-after-setup-tp4098236p4098385.html
Sent from the Solr - User mailing list archive at Nabble.com.
Thank you, Jack. So I need to move the data held on those nodes into HDFS.
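[If copying into HDFS is the route taken, the copy step itself is a
single command; the source and target paths below are placeholders:

# Copy an NFS/local directory into HDFS (paths are examples)
hadoop fs -put /mnt/data/docs /user/engy/docs
# Verify the copy
hadoop fs -ls /user/engy/docs]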
--
View this message in context:
http://lucene.472066.n3.nabble.com/Solr-indexer-and-Hadoop-tp4072951p4073013.html
Sent from the Solr - User mailing list archive at Nabble.com.
Michael,
I understand from your post that I can use the current storage without
moving it into Hadoop (HDFS). I already have the storage mounted via NFS.
Does your map function read from the mounted storage directly? If
possible, can you please illustrate that in more detail?
Thanks
Engy
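[Michael's actual code does not appear in the thread. The usual pattern
behind this question is to give the MapReduce job a small text file
listing the NFS paths, one per line, and have each map task open the
path with ordinary Java I/O; this works only if the mount is visible on
every task node. A rough sketch, with the class name and paths invented
for illustration:

import java.io.File;
import java.io.IOException;
import org.apache.commons.io.FileUtils;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Sketch only: the job's input is a plain text file with one NFS path
// per line, so the data itself never has to live in HDFS.
public class NfsReadMapper extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        // Each input line is a path such as /mnt/nfs/docs/a.txt; the
        // same mount must exist on every task node.
        File f = new File(line.toString().trim());
        String body = FileUtils.readFileToString(f);
        context.write(new Text(f.getName()), new Text(body));
    }
}]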
Hi All,
I have TBs of data that need to be indexed. I am trying to use Hadoop to
index them. I am still a newbie.
I thought that the map function would read the data from the hard disks
and the reduce function would index it. The problem I am facing is how to
read the data from hard disks which are not in HDFS.
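[For reference, a minimal sketch of the reduce side of that plan,
indexing into a running Solr instance with SolrJ; the Solr URL and field
names are placeholders, not the thread's actual setup:

import java.io.IOException;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

// Sketch only: each (filename, contents) pair from the mappers becomes
// one Solr document, sent to a running Solr core over HTTP.
public class SolrIndexReducer extends Reducer<Text, Text, Text, Text> {
    private HttpSolrServer solr;

    @Override
    protected void setup(Context context) {
        // Placeholder URL; point this at your own core.
        solr = new HttpSolrServer("http://solrhost:8983/solr/collection1");
    }

    @Override
    protected void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        try {
            for (Text value : values) {
                SolrInputDocument doc = new SolrInputDocument();
                // Field names assumed; ids should be unique per document.
                doc.addField("id", key.toString());
                doc.addField("content", value.toString());
                solr.add(doc);
            }
        } catch (SolrServerException e) {
            throw new IOException(e);
        }
    }

    @Override
    protected void cleanup(Context context) throws IOException {
        try {
            solr.commit();  // flush pending documents once per reduce task
        } catch (SolrServerException e) {
            throw new IOException(e);
        }
    }
}

Batching the adds and committing once per task in cleanup() avoids the
cost of a commit per document.]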