Re: Solr 4.8.1 : Response Code 500 when creating the new request handler

2015-02-28 Thread Aman Tandon
Thanks Chris, yes, after providing the qf it is working fine.

With Regards
Aman Tandon

On Wed, Feb 18, 2015 at 12:25 AM, Chris Hostetter 
wrote:

>
> : 1. Look further down in the stack trace for the "caused by" that details
> : the specific cause of the exception.
>
> : I am still not able to find the cause of this.
>
> Jack is referring to the log file from your server ... sometimes there
> are more details there.
>
> : Sorry, but I don't know how it is a non-standard approach. Please guide me here.
>
> I'm not sure what Jack was referring to -- I don't see anything
> "non-standard" about how you have your handler configured.
>
> : We are trying to find all the results, so we are using q.alt=*:*.
> : There are some products in our company for which we want to find all
> : the results *whose type is garments*, and I forgot to mention we are
> : trying to fetch only 6 rows. So using this request handler we are
> : returning the 6 rows.
>
> Jack's point here is that you have specified a q.alt in your "invariants"
> but you have also specified it in the query params -- which will be
> totally ignored.  What specifically is your goal of having that query
> param in the sample query you tried?
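>
> For example, with an invariants block like this (just a sketch -- the
> archive stripped the markup from your actual config):
>
>   <lst name="invariants">
>     <str name="q.alt">type:garments</str>
>   </lst>
>
> ...any q.alt=*:* you put on the request URL is silently discarded,
> because invariants always win over request params.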
>
> As a general debugging tip: Did you try ignoring your custom
> requestHandler, and just running a simple /select query with all of those
> params specified in the URL?  ... it can help to try and narrow down the
> problem -- in this case, I'm pretty sure you would have gotten the same
> error, and then the distractions of the "invariants" question would have
> been irrelevant.
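>
> Something like this, for example (the core name here is just a
> placeholder):
>
>   http://localhost:8983/solr/collection1/select?defType=edismax&q.alt=*:*&rows=6&debugQuery=true
>
> If that fails the same way, the handler config itself is off the hook.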
>
>
> Looking at the source code for 4.8.1 it appears that the error you are
> seeing is edismax doing a really bad job of trying to report an error in
> parsing the "qf" param -- which you haven't specified at all in your
> params:
>
>   try {
>     queryFields = DisMaxQParser.parseQueryFields(req.getSchema(),
>         solrParams);  // req.getSearcher() here causes searcher refcount imbalance
>   } catch (SyntaxError e) {
>     throw new RuntimeException();
>   }
>
> ...if you add a "qf" param with the list of fields you want to search (or
> a 'df' param to specify a default field), I suspect this error will go away.
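>
> For example, in the handler's "defaults" (the field names here are just
> placeholders -- use whatever searchable fields your schema actually has):
>
>   <lst name="defaults">
>     <str name="defType">edismax</str>
>     <str name="qf">title description</str>
>   </lst>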
>
>
> I filed a bug to fix this terrible code to give a useful error msg in the
> future...
>
> https://issues.apache.org/jira/browse/SOLR-7120
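>
> A fix along these lines would at least surface the real problem (just a
> sketch, not necessarily the exact patch that will land):
>
>   } catch (SyntaxError e) {
>     throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
>         "Error parsing fields in the 'qf' param: " + e.getMessage(), e);
>   }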
>
>
>
>
> : > 3. You have q.alt in invariants, but also in the actual request, which is a
> : > contradiction in terms - what is your actual intent? This isn't the cause
> : > of the exception, but does raise questions of what you are trying to do.
> : > 4. Why don't you have a q parameter for the actual query?
> : >
> : >
> : > -- Jack Krupansky
> : >
> : > On Sat, Feb 14, 2015 at 1:57 AM, Aman Tandon 
> : > wrote:
> : >
> : > > Hi,
> : > >
> : > > I am using Solr 4.8.1, and when I am creating the new request
> : > > handler I am getting the following error:
> : > >
> : > > *Request Handler config:*
> : > >
> : > > [handler config; the XML markup was eaten by the mail archive. The
> : > > surviving values are: edismax, on, *:*, 0.01, and an invariants
> : > > section containing type:garments]
> : > >
> : > > *Error:*
> : > >
> : > > java.lang.RuntimeException
> : > >   at org.apache.solr.search.ExtendedDismaxQParser$ExtendedDismaxConfiguration.<init>(ExtendedDismaxQParser.java:1455)
> : > >   at org.apache.solr.search.ExtendedDismaxQParser.createConfiguration(ExtendedDismaxQParser.java:239)
> : > >   at org.apache.solr.search.ExtendedDismaxQParser.<init>(ExtendedDismaxQParser.java:108)
> : > >   at org.apache.solr.search.ExtendedDismaxQParserPlugin.createParser(ExtendedDismaxQParserPlugin.java:37)
> : > >   at org.apache.solr.search.QParser.getParser(QParser.java:315)
> : > >   at org.apache.solr.handler.component.QueryComponent.prepare(QueryComponent.java:144)
> : > >   at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:197)
> : > >   at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
> : > >   at org.apache.solr.core.SolrCore.execute(SolrCore.java:1952)
> : > >   at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:774)
> : > >   at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:418)
> : > >   at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:207)
> : > >   at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
> : > >   at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
> : > >   at org.eclipse.jetty.s

Is it possible to use multiple index data directory in Apache Solr?

2015-02-28 Thread Jou Sung-Shik
I'm new to Apache Lucene/Solr.

I'm trying to move from Elasticsearch to Apache Solr.

So I have a question about the following index data location configuration.


*in Elasticsearch*

# Can optionally include more than one location, causing data to be striped across
# the locations (a la RAID 0) on a file level, favoring locations with most free
# space on creation. For example:
#
# path.data: /path/to/data1,/path/to/data2

*in Apache Solr*

<dataDir>/var/data/solr/</dataDir>


I want to configure multiple index data directory like Elasticsearch in
Apache Solr.

Is it possible?

How can I reach this goal?





-- 
-
BLOG : http://www.codingstar.net
-


Re: Is it possible to use multiple index data directory in Apache Solr?

2015-02-28 Thread Shawn Heisey
On 2/28/2015 8:03 PM, Jou Sung-Shik wrote:
> *in Elasticsearch*
> 
> # Can optionally include more than one location, causing data to be striped across
> # the locations (a la RAID 0) on a file level, favoring locations with most free
> # space on creation. For example:
> #
> # path.data: /path/to/data1,/path/to/data2
> 
> *in Apache Solr*
> 
> <dataDir>/var/data/solr/</dataDir>
> 
> 
> I want to configure multiple index data directory like Elasticsearch in
> Apache Solr.
> 
> Is it possible?
> 
> How can I reach this goal?

I don't believe this is possible in Solr.

How exactly does ES split the index files when multiple paths are
configured?  I am very curious about exactly how this works.  Google is
not helping me figure it out.  I even grabbed the ES master branch and
wasn't able to trace how path.data is used after it makes it into the
environment.

In truth, for most people I do not really see this feature as all that
much of an advantage.  For best performance, you want to completely
avoid hitting the disk at all -- the index should be entirely cached in
RAM.  When that is achieved, disk performance won't matter.  It could
help in situations where the total index data on a single server is far
too big to ever fit into RAM, or where each disk is small.

Thanks,
Shawn



Re: Getting started with Solr

2015-02-28 Thread Baruch Kogan
Thanks for bearing with me.

I start Solr with `bin/solr start -e cloud' with 2 nodes. Then I get this:

*Welcome to the SolrCloud example!*


*This interactive session will help you launch a SolrCloud cluster on your
local workstation.*

*To begin, how many Solr nodes would you like to run in your local cluster?
(specify 1-4 nodes) [2] *
*Ok, let's start up 2 Solr nodes for your example SolrCloud cluster.*

*Please enter the port for node1 [8983] *
*8983*
*Please enter the port for node2 [7574] *
*7574*
*Cloning Solr home directory /home/ubuntu/crawler/solr/example/cloud/node1
into /home/ubuntu/crawler/solr/example/cloud/node2*

*Starting up SolrCloud node1 on port 8983 using command:*

*solr start -cloud -s example/cloud/node1/solr -p 8983   *

I then go to http://localhost:8983/solr/admin/cores and get the following:


*This XML file does not appear to have any style information associated
with it. The document tree is shown below.*

*[core status response; the XML markup was flattened by the mail archive.
It lists four cores, all living under
/home/ubuntu/crawler/solr/example/cloud/node1/solr/:
testCollection_shard1_replica1, testCollection_shard1_replica2,
testCollection_shard2_replica1, and testCollection_shard2_replica2, each
with a 71-byte NRTCachingDirectory(MMapDirectory) index. There is no
gettingstarted core among them.]*

I do not seem to have a gettingstarted collection.

Sincerely,

Baruch Kogan
Marketing Manager
Seller Panda 
+972(58)441-3829
baruch.kogan at Skype

On Fri, Feb 27, 2015 at 12:00 AM, Erik Hatcher 
wrote:

> I’m sorry, I’m not following exactly.
>
> Somehow you no longer have a gettingstarted collection, but it is not
> clear how that happened.
>
> Could you post the exact script steps you used that got you this error?
>
> What collections/cores does the Solr admin show you have?  What are the
> results of http://localhost:8983/solr/admin/cores ?
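>
> (For collections specifically, the Collections API should also work --
> something like http://localhost:8983/solr/admin/collections?action=LIST )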
>
> —
> Erik Hatcher, Senior Solutions Architect
> http://www.lucidworks.com 
>
>
>
>
> > On Feb 26, 2015, at 9:58 AM, Baruch Kogan 
> wrote:
> >
> > Oh, I see. I used the start -e cloud command, then ran through a setup
> with
> > one core and default options for the rest, then tried to post the json
> > example again, and got another error:
> > ubuntu@ubuntu-VirtualBox:~/crawler/solr$ bin/post -c gettingstarted
> > example/exampledocs/*.json
> > /usr/lib/jvm/java-7-oracle/bin/java -classpath
> > /home/ubuntu/crawler/solr/dist/solr-core-5.0.0.jar -Dauto=yes
> > -Dc=gettingstarted -Ddata=files org.apache.solr.util.SimplePostTool
> > example/exampledocs/books.json
> > SimplePostTool version 5.0.0
> > Posting files to [base] url
> > http://localhost:8983/solr/gettingstarted/update...
> > Entering auto mode. File endings considered are
> >
> xml,json,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
> > POSTing file books.json (application/json) to [base]
> > SimplePostTool: WARNING: Solr returned an error #404 (Not Found) for url:
> > http://localhost:8983/solr/gettingstarted/update
> > SimplePostTool: WARNING: Response: 
> > 
> > 
> > Error 404 Not F