Re: Does defType override other settings for default request handler

2012-07-24 Thread amitesh116
Thanks for your reply. It worked for me. For now, I am using it in the query
string only, but I plan to move it into solrconfig.xml.
Thanks again!
~Amitesh
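
For reference, moving defType from the query string into solrconfig.xml usually
means adding it to the request handler defaults; a minimal sketch (the handler
name and the edismax parser are only examples, not Amitesh's actual config):

<requestHandler name="/select" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="defType">edismax</str>
  </lst>
</requestHandler>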





Re: NumberFormatException while indexing TextField with LengthFilter and then copying to tfloat

2012-07-24 Thread Chantal Ackermann
Hi Hoss,

thank you for the quick response and the explanations!

> My suggestion would be to modify the XPath expression you are using to 
> pull data out of your original XML files and ignore  ""
> 

I don't think this is possible. That would include text() in the XPath which is 
not handled by the XPathRecordReader. I've checked in the code, as well, and 
the JavaDoc does not list this possibility. I've tried those patterns:

/issues/issue/estimated_hours[text()]
/issues/issue/estimated_hours/text()

No value at all will be added for that field for any of the documents 
(including those that do have a value in the XML).

> Alternatively: there are some new UpdateProcessors available in 4.0 that 
> let you easily prune field values based on various criteria (update 
> porcessors happen well before copyField)...
> 
> http://lucene.apache.org/solr/api-4_0_0-ALPHA/org/apache/solr/update/processor/RemoveBlankFieldUpdateProcessorFactory.html

Thanks for pointing me to it. I've switched to 4.0.0-ALPHA (hoping the ALPHA 
doesn't show itself too often ;-) ).

For anyone interested, my DataImportHandler setup in solrconfig.xml now reads:

<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="update.chain">emptyFieldChain</str>
    <str name="config">data-config.xml</str>
    <str name="clean">true</str>
    <str name="commit">true</str>
    <str name="optimize">true</str>
  </lst>
</requestHandler>

Works as expected!

And kudos to those working on the admin frontend, as well! The new admin is 
indeed slick!



> But i can certainly understand the confusion, i've opened SOLR-3657 to try 
> and improve on this.  Ideally the error message should make it clear that 
> the "value" from "source" field was copied to "dest" field which then 
> encountered "error"
> 

Thank you! Good Exception messages are certainly helpful!

Chantal



Re: [Announce] Solr 4.0-ALPHA with RankingAlgorithm 1.4.4 with Realtime NRT available for download

2012-07-24 Thread Nagendra Nagarajayya

Hi Yonik:

Please see my comments below:

On 7/23/2012 8:52 AM, Yonik Seeley wrote:

On Mon, Jul 23, 2012 at 11:37 AM, Nagendra Nagarajayya
  wrote:

Realtime NRT algorithm enables NRT functionality in
Solr by not closing the Searcher object  and so is very fast. I am in the
process of contributing the algorithm back to Apache Solr as a patch.

Since you're in the process of contributing this back, perhaps you
could explain your approach - it never made sense to me.

Replacing the reader in an existing SolrIndexSearcher as you do means
that all the related caches will be invalid (meaning you can't use
solr's caches).  You could just ensure that there is no auto-warming
set up for Solr's caches (which is now the default), or you could
disable caching altogether.  It's not clear what you're comparing
against when you claim it's faster.


Solr with RankingAlgorithm does not replace the reader in the 
SolrIndexSearcher object. All it does is override the 
IndexSearcher.getIndexReader() method so as to supply an NRTReader if 
realtime is enabled. All direct references to the "reader" member have 
been replaced with getIndexReader() method calls.


The performance is better because the SolrIndexSearcher is not closed every 
second as with soft-commit. SolrIndexSearcher is a heavy object with caches, 
etc., and is reference counted. So every second this object needs to be 
closed and re-allocated, the indexes need to be re-opened, and the caches are 
invalidated, all while waiting for existing searchers to complete, making 
this very expensive. Realtime NRT does not close the SolrIndexSearcher object 
but makes available a new NRTReader with the document updates, i.e. 
getIndexReader() returns a new NRTReader.



There are also consistency and concurrency issues with replacing the
reader in an existing SolrIndexSearcher, which is supposed to have a
static view of the index.  If a reader replacement happens in the
middle of a request, it's bound to cause trouble, including returning
the wrong documents!


The reader member is not replaced in the existing SolrIndexSearcher 
object. The IndexSearcher.getIndexReader() method has been overridden in 
SolrIndexSearcher, and all direct reader member accesses have been replaced 
with getIndexReader() method calls, allowing an NRT reader to be supplied 
when realtime is enabled. Concurrency is handled by the getNRTReader() 
method, with the static index view now increased to the granularity 
provided by the NRTIndexReader.


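For context, the underlying Lucene NRT mechanism being referred to here is the
cheap reader reopen via openIfChanged; a plain-Lucene sketch (Lucene 4.0 API,
not Solr-RA code, and the class and field names are purely illustrative):

import java.io.IOException;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.search.IndexSearcher;

public final class NrtReaderHolder {
  private final IndexWriter writer;
  private DirectoryReader current;

  public NrtReaderHolder(IndexWriter writer) throws IOException {
    this.writer = writer;
    this.current = DirectoryReader.open(writer, true); // initial NRT view
  }

  // Hands out a searcher over the latest NRT view; the reader is reopened
  // (cheaply) only if the writer has new changes, instead of closing and
  // re-warming a whole searcher object.
  public synchronized IndexSearcher searcher() throws IOException {
    DirectoryReader newer = DirectoryReader.openIfChanged(current, writer, true);
    if (newer != null) {
      current.close(); // real code must reference-count readers still in use
      current = newer;
    }
    return new IndexSearcher(current);
  }
}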

Regards,

Nagendra Nagarajayya
http://solr-ra.tgels.org
http://rankingalgorithm.tgels.org


-Yonik
http://lucidimagination.com


Re: [Announce] Solr 4.0-ALPHA with RankingAlgorithm 1.4.4 with Realtime NRT available for download

2012-07-24 Thread Yonik Seeley
On Tue, Jul 24, 2012 at 8:24 AM, Nagendra Nagarajayya
 wrote:
> SolrIndexSearcher is a heavy object with caches, etc.

As I've said, the caches are configurable, and it's trivial to disable
all caching (to the point where the cache objects are not even
created).

> The reader member is not replaced in the existing SolrIndexSearcher object.
> The IndexSearcher.getIndexReader() method has been overriden in
> SolrIndexSearcher and all direct reader member access has been replaced with
> a getIndexReader() method call allowing a NRT reader to be supplied when
> realtime is enabled.

In a single Solr request (that runs through multiple components like
query, highlight, facet, and response writing),
does IndexSearcher.getIndexReader() always return the same reader?  If
not, this breaks pretty much every standard solr component - but it
will only be apparent under load, and if you are carefully sanity
checking the results.

-Yonik
http://lucidimagination.com


Re: [Announce] Solr 4.0-ALPHA with RankingAlgorithm 1.4.4 with Realtime NRT available for download

2012-07-24 Thread Nagendra Nagarajayya
Thanks Mark! I am already working with the Apache Software Foundation on the 
mark and am following the usage they suggested.


Regards,

Nagendra Nagarajayya
http://solr-ra.tgels.org
http://rankingalgorithm.tgels.org


On 7/23/2012 12:15 PM, Mark Miller wrote:

On Jul 23, 2012, at 11:27 AM, Nagendra Nagarajayya wrote:


I am not sure why any one will get offended by an announcement that NRT 
functionality was available with older releases.

FWIW, I'm not offended - I don't mind if third parties post announcements if 
they are related to Solr.

I just want to make sure it's very clear that it's a third party announce so 
there is no confusion - people that don't follow the lists on a daily basis 
read these things. A lot of these emails end up archived on various sites that 
collect mailing lists. It's easy to run into them without the proper context.

I think part of the confusion is the naming. Technically, Apache does not allow the use 
of Apache marks as part of a third party name. Instead, the name should be something like 
"Product X, powered by Solr"

See http://www.apache.org/foundation/marks/faq/#products

- Mark Miller
lucidimagination.com


Upgrade solr 1.4.1 to 3.6

2012-07-24 Thread alexander81
Hi,
after this upgrade I need to see if the cache has to be rebuilt or not.

regards





problem using lat long field

2012-07-24 Thread yair even-zohar
I'm using solr 3.5 via SIREn
I modified the schema.xml by adding:

 
     
 


My code is the following: 


SolrInputDocument document = new SolrInputDocument();
SolrInputField inputField = new SolrInputField("coordinate");
String latlon = "40.19,-73.978551";

inputField.addValue(latlon, 1.0F);
document.put("coordinate", inputField);

SolrServer server1 = new CommonsHttpSolrServer("http://localhost:8080/siren");
final UpdateRequest request = new UpdateRequest();
request.add(document);
request.process(server1);

and the error I'm getting is below.
Help is appreciated.

org.apache.solr.common.SolrException: Bad Request

Bad Request

request: http://localhost:8080/siren/update?wt=javabin&version=2
    at 
org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:432)
    at 
org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:246)
    at 
org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105)
    at com.fetch5.siren.SirenHelper.indexTabularFile(SirenHelper.java:195)
    at com.fetch5.siren.SirenHelper.indexFile(SirenHelper.java:55)
    at com.fetch5.siren.SirenHelper.main(SirenHelper.java:299)

Thanks
-Yair


Re: Upgrade solr 1.4.1 to 3.6

2012-07-24 Thread Michael Della Bitta
Hello Alexander,

The cache gets dumped when you restart for the upgrade. Do you mean the index?

Michael Della Bitta


Appinions, Inc. -- Where Influence Isn’t a Game.
http://www.appinions.com


On Tue, Jul 24, 2012 at 7:43 AM, alexander81  wrote:
> Hi,
> after this upgrade I need to see if the cache has to be rebuilt or not.
>
> regards
>
>
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Upgrade-solr-1-4-1-to-3-6-tp3996952.html
> Sent from the Solr - User mailing list archive at Nabble.com.


SOLR 4.0-ALPHA : DIH : Indexed and Committed Successfully but Index is empty

2012-07-24 Thread Chantal Ackermann
Hi there,

sorry for the length - it is mostly (really) log output. The basic issue is 
reflected in the subject: DIH runs fine, but even with an extra optimize on top 
(which should not be necessary given my DIH config) the index remains empty.

(I have changed from 3.6.1 to 4.0-ALPHA because of Hoss' answer to my question 
"NumberFormatException while indexing TextField with LengthFilter" (on this 
same list). Earlier today I had an index set up with 4.0-ALPHA and could verify 
that Hoss' suggestion works. But now I can't seem to get that index filled 
another time.
SOLR runs inside Jetty which is started via "mvn jetty:run-war". SOLR_HOME is 
set to a subdirectory of maven's target dir. I have been using this setup 
successfully with SOLR 3.* for some time now. While configuring the index, I 
often do a "mvn clean; mvn jetty:run-war" so SOLR_HOME including the index is 
completely removed and recreated from scratch.)


After running a full import of DIH on core "issues" using:
http://localhost:9090/solr/issues/dataimport?command=full-import&importfile=/absolute/path/to/issues.xml

I get the response:

<response>
  <lst name="responseHeader">
    <int name="status">0</int>
    <int name="QTime">1</int>
  </lst>
  <lst name="initArgs">
    <lst name="defaults">
      <str name="update.chain">emptyFieldChain</str>
      <str name="config">data-config.xml</str>
      <str name="clean">true</str>
      <str name="commit">true</str>
      <str name="optimize">true</str>
    </lst>
  </lst>
  <str name="status">idle</str>
  <lst name="statusMessages">
    <str name="Total Requests made to DataSource">0</str>
    <str name="Total Rows Fetched">294</str>
    <str name="Total Documents Skipped">0</str>
    <str name="Full Dump Started">2012-07-24 15:46:27</str>
    <str name="">Indexing completed. Added/Updated: 294 documents. Deleted 0 documents.</str>
    <str name="Committed">2012-07-24 15:46:28</str>
    <str name="Optimized">2012-07-24 15:46:28</str>
    <str name="Total Documents Processed">294</str>
    <str name="Time taken">0:0:0.605</str>
  </lst>
  <str name="WARNING">This response format is experimental. It is likely to change in the future.</str>
</response>



Meaning that everything went fine including commit and optimize and the index 
should now contain 294 documents. Well, it doesn't.
Trying to get it working again, I have now replaced large parts of my 
solrconfig.xml with the new parts taken from the current 4.0-ALPHA 
(https://builds.apache.org/job/Solr-trunk/ws/checkout/) but this doesn't change 
a thing. The schema version is set to 1.5.



When starting the server it outputs:

24.07.2012 16:00:16 org.apache.solr.core.SolrCore 
INFO: [issues] Opening new SolrCore at target/classes/core_issues/, 
dataDir=target/classes/core_issues/data/
…
24.07.2012 16:00:16 org.apache.solr.core.SolrCore getNewIndexDir
WARNUNG: New index directory detected: old=null 
new=target/classes/core_issues/data/index/
24.07.2012 16:00:16 org.apache.solr.core.SolrCore initIndex
WARNUNG: [issues] Solr index directory 'target/classes/core_issues/data/index' 
doesn't exist. Creating new index...
24.07.2012 16:00:16 org.apache.solr.core.SolrDeletionPolicy onCommit
INFO: SolrDeletionPolicy.onCommit: commits:num=1

commit{dir=/path/to/maven-project/target/classes/core_issues/data/index,segFN=segments_1,generation=1,filenames=[segments_1]
24.07.2012 16:00:16 org.apache.solr.core.SolrDeletionPolicy updateCommits
INFO: newest commit = 1
…
24.07.2012 16:00:16 org.apache.solr.search.SolrIndexSearcher 
INFO: Opening Searcher@920ab60 main
24.07.2012 16:00:16 org.apache.solr.core.SolrCore registerSearcher
INFO: [issues] Registered new searcher Searcher@920ab60 
main{StandardDirectoryReader(segments_1:1)}
24.07.2012 16:00:16 org.apache.solr.update.CommitTracker 
INFO: Hard AutoCommit: if uncommited for 15000ms; 
24.07.2012 16:00:16 org.apache.solr.update.CommitTracker 
INFO: Soft AutoCommit: disabled
24.07.2012 16:00:16 org.apache.solr.handler.dataimport.DataImportHandler 
processConfiguration
INFO: Processing configuration from solrconfig.xml: 
{update.chain=emptyFieldChain,config=data-config.xml,clean=true,commit=true,optimize=true}
24.07.2012 16:00:16 org.apache.solr.handler.dataimport.DataImporter 
loadDataConfig
INFO: Data Configuration loaded successfully
24.07.2012 16:00:16 org.apache.solr.core.QuerySenderListener newSearcher
INFO: QuerySenderListener sending requests to Searcher@920ab60 
main{StandardDirectoryReader(segments_1:1)}
24.07.2012 16:00:16 org.apache.solr.core.CoreContainer register
INFO: registering core: issues



When running the DIH full import, the log output is:

24.07.2012 16:00:31 org.apache.solr.handler.dataimport.DataImporter doFullImport
INFO: Starting Full Import
24.07.2012 16:00:31 org.apache.solr.core.SolrCore execute
INFO: [issues] webapp=/solr path=/dataimport 
params={command=full-import&importfile=/path/to/maven-project/src/test/resources/issues.xml}
 status=0 QTime=4 
24.07.2012 16:00:31 org.apache.solr.handler.dataimport.SimplePropertiesWriter 
readIndexerProperties
WARNUNG: Unable to read: dataimport.properties
24.07.2012 16:00:32 org.apache.solr.handler.dataimport.DocBuilder finish
INFO: Import completed successfully
24.07.2012 16:00:32 org.apache.solr.handler.dataimport.SimplePropertiesWriter 
readIndexerProperties
WARNUNG: Unable to read: dataimport.properties
24.07.2012 16:00:32 org.apache.solr.handler.dataimport.SimplePropertiesWriter 
persist
INFO: Wrote last indexed time to dataimport.properties
24.07.2012 16:00:32 org.apache.solr.handler.dataimport.DocBuilder execute
INFO: Time taken = 0:0:0.566



I'm a bit confused that it does not output the commit logs, but it didn't do 
that before either, if I remember correctly. When I issue an "Opt

Re: What API can I use to partially update a field in Solr 4.0

2012-07-24 Thread Alexandre Rafalovitch
I asked that question just a couple of days ago. Check the archive for the
full conversation from July 14-16th.

In summary, what you want is only possible if all (ALL!) fields are
stored (not just indexed). And you may want to look at:
https://issues.apache.org/jira/browse/SOLR-139

Regards,
   Alex.
Personal blog: http://blog.outerthoughts.com/
LinkedIn: http://www.linkedin.com/in/alexandrerafalovitch
- Time is the quality of nature that keeps events from happening all
at once. Lately, it doesn't seem to be working.  (Anonymous  - via GTD
book)


On Tue, Jul 24, 2012 at 11:52 AM, Xiao Li  wrote:
> Hi,
>
> I am a Java programmer using Solr. I have 2 million documents in the solr
> index. I need to change only one filed for all docs. Does Solr have a
> partial update Java API?
>
> XIao
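
For reference, in the absence of a true partial-update API, the usual
workaround is a read-modify-write over the stored fields; a rough SolrJ sketch
for a single document (the URL, id, and field names are placeholders, and
every field must be stored or data will be lost):

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.SolrInputDocument;

public class ReindexOneField {
  public static void main(String[] args) throws Exception {
    SolrServer server = new HttpSolrServer("http://localhost:8983/solr");

    // Fetch the stored copy of the document.
    SolrDocument found = server.query(new SolrQuery("id:doc-42")).getResults().get(0);

    // Copy every stored field back into a new input document.
    // Note: fields populated via copyField should be skipped if they are stored.
    SolrInputDocument doc = new SolrInputDocument();
    for (String f : found.getFieldNames()) {
      doc.addField(f, found.getFieldValues(f));
    }
    doc.removeField("_version_");            // avoid optimistic-locking conflicts in 4.x
    doc.setField("category", "new-value");   // the one field being changed

    // Re-adding a document with the same uniqueKey replaces it.
    server.add(doc);
    server.commit();
  }
}

For two million documents this would of course be done in batches, paging
through the index rather than fetching one document at a time.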


geospatial / coordinates java example anyone?

2012-07-24 Thread yair even-zohar
Can someone please send a simple java example for indexing and querying a 
latitude, longitude coordinate on SolrDocument.
That is, assume we have a document and we want to simply add the lat,lon as 
field to the document and then query according to distance too

Thanks
-Yair    

Re: geospatial / coordinates java example anyone?

2012-07-24 Thread Hasan Diwan
On 24 July 2012 09:55, yair even-zohar  wrote:

> Can someone please send a simple java example for indexing and querying a
> latitude, longitude coordinate on SolrDocument.
> That is, assume we have a document and we want to simply add the lat,lon
> as field to the document and then query according to distance too
>
> There are examples on the wiki. Please see
http://wiki.apache.org/solr/SpatialSearch/ -- H

-- 
Sent from my mobile device
Envoyait de mon portable
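
For reference, a minimal SolrJ sketch along the lines of the wiki examples; the
field name "store", its solr.LatLonType definition in schema.xml, the "id"
field, and the server URL are all assumptions:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class SpatialExample {
  public static void main(String[] args) throws Exception {
    SolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr");

    // Index: a LatLonType field takes a single "lat,lon" string value.
    SolrInputDocument doc = new SolrInputDocument();
    doc.addField("id", "1");
    doc.addField("store", "40.19,-73.978551");
    server.add(doc);
    server.commit();

    // Query: everything within 5 km of the given point, sorted by distance.
    SolrQuery q = new SolrQuery("*:*");
    q.addFilterQuery("{!geofilt}");
    q.set("sfield", "store");
    q.set("pt", "40.19,-73.978551");
    q.set("d", "5");
    q.set("sort", "geodist() asc");
    System.out.println(server.query(q).getResults().getNumFound());
  }
}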


Re: geospatial / coordinates java example anyone?

2012-07-24 Thread Ram Marpaka
Yair 


1. In case you have not seen this, you can download and play with the 
example: 

http://www.ibm.com/developerworks/opensource/library/j-spatial/ 


2. Download solr-example.tgz from the link below, which has input XML files; 
index them using DIH (it doesn't use SolrJ, but I think you will get some idea) 
and try:

http://www.nsshutdown.com/blog/index.php?itemid=87 

 
Thanks
Ram M Marpaka



 From: Hasan Diwan 
To: solr-user@lucene.apache.org; yair even-zohar  
Sent: Tuesday, 24 July 2012 10:01 AM
Subject: Re: geospatial / coordinates java example anyone?
 
On 24 July 2012 09:55, yair even-zohar  wrote:

> Can someone please send a simple java example for indexing and querying a
> latitude, longitude coordinate on SolrDocument.
> That is, assume we have a document and we want to simply add the lat,lon
> as field to the document and then query according to distance too
>
> There are examples on the wiki. Please see
http://wiki.apache.org/solr/SpatialSearch/ -- H

-- 
Sent from my mobile device
Envoyait de mon portable

Re: "Invalid or unreadable WAR file : .../solr.war" when starting solr 3.6.1 app on Tomcat 7 (upgrade to 7.0.29/upstream)?

2012-07-24 Thread Chris Hostetter

: I removed distro packaged Tomcat from the equation,
...
: replacing it with an upstream instance
...
: Repeating the process, at attempt to 'start' the /solr webapp, there's
: no change.  I still get
...
:   java.lang.IllegalArgumentException: Invalid or unreadable WAR
:   file : /srv/solr_home/solr.war

Are you sure you didn't accidentally corrupt the war file in some way?

what is the md5 or sha1sum of the war file you have?
does "jar tf solr.war" give you any errors?

..

I just used the following steps (basically the same as yours just 
different paths) and got solr running in tomcat 7.0.29 with no 
problems 

hossman@frisbee:/var/tmp$ ls -al
total 110188
drwxrwxrwt  2 rootroot 4096 Jul 24 10:37 .
drwxr-xr-x 13 rootroot 4096 Jul 18 09:34 ..
-rw-rw-r--  1 hossman hossman 105132366 Jul 24 10:29 apache-solr-4.0.0-ALPHA.tgz
-rw-rw-r--  1 hossman hossman   7679160 Jul  3 04:25 apache-tomcat-7.0.29.tar.gz
-rw-rw-r--  1 hossman hossman   183 Jul 24 10:29 solr-context-file.xml
hossman@frisbee:/var/tmp$ tar -xzf apache-solr-4.0.0-ALPHA.tgz 
hossman@frisbee:/var/tmp$ tar -xzf apache-tomcat-7.0.29.tar.gz 
hossman@frisbee:/var/tmp$ cp -r apache-solr-4.0.0-ALPHA/example/solr solr-home
hossman@frisbee:/var/tmp$ cp 
apache-solr-4.0.0-ALPHA/dist/apache-solr-4.0.0-ALPHA.war solr.war
hossman@frisbee:/var/tmp$ sha1sum solr.war 
51c9e4bf6f022ea3873ee315eb08a96687e71964  solr.war
hossman@frisbee:/var/tmp$ md5sum solr.war 
a154197657bb5cb9cee13fb5cfca931b  solr.war
hossman@frisbee:/var/tmp$ cat solr-context-file.xml 

   

hossman@frisbee:/var/tmp$ mkdir -p apache-tomcat-7.0.29/conf/Catalina/localhost/
hossman@frisbee:/var/tmp$ cp solr-context-file.xml 
apache-tomcat-7.0.29/conf/Catalina/localhost/solr.xml
hossman@frisbee:/var/tmp$ ./apache-tomcat-7.0.29/bin/catalina.sh start
Using CATALINA_BASE:   /var/tmp/apache-tomcat-7.0.29
Using CATALINA_HOME:   /var/tmp/apache-tomcat-7.0.29
Using CATALINA_TMPDIR: /var/tmp/apache-tomcat-7.0.29/temp
Using JRE_HOME:/usr/lib/jvm/java-6-openjdk-amd64/
Using CLASSPATH:   
/var/tmp/apache-tomcat-7.0.29/bin/bootstrap.jar:/var/tmp/apache-tomcat-7.0.29/bin/tomcat-juli.jar
hossman@frisbee:/var/tmp$ 


...and now solr is up and running on http://localhost:8080/solr/ and there 
are no errors in the logs.




-Hoss


Re: "Invalid or unreadable WAR file : .../solr.war" when starting solr 3.6.1 app on Tomcat 7 (upgrade to 7.0.29/upstream)?

2012-07-24 Thread k9157
Hi,

On Tue, Jul 24, 2012, at 10:50 AM, Chris Hostetter wrote:
>
> : I removed distro packaged Tomcat from the equation,
>   ...
> : replacing it with an upstream instance
>   ...
> : Repeating the process, at attempt to 'start' the /solr webapp, there's
> : no change.  I still get
>   ...
> :   java.lang.IllegalArgumentException: Invalid or unreadable WAR
> :   file : /srv/solr_home/solr.war
>
> Are you sure you didn't accidentally corrupt the war file in some way?

I DL'd solr, and cp'd its .war over.  Not much opportunity for
corruption.  Nonetheless, checking the .war integrity

unzip -t /srv/solr_home/solr.war | grep -v OK
Archive:  /srv/solr_home/solr.war
No errors detected in compressed data of
/srv/solr_home/solr.war.

> what is the md5 or sha1sum of the war file you have?

sha1sum /usr/local/apache-solr-3.6.1/example/webapps/solr.war
e1da03c97be88e2e72d120b6ba32de1133603766 
/usr/local/apache-solr-3.6.1/example/webapps/solr.war
sha1sum /srv/solr_home/solr.war
e1da03c97be88e2e72d120b6ba32de1133603766 
/srv/solr_home/solr.war

> does "jar tf solr.war" give you any errors?

nope.

jar tf /data/webapps/solr_home/solr.war
META-INF/
META-INF/MANIFEST.MF
WEB-INF/
WEB-INF/web.xml
WEB-INF/lib/
WEB-INF/lib/commons-codec-1.6.jar
WEB-INF/lib/commons-fileupload-1.2.1.jar
WEB-INF/lib/commons-httpclient-3.1.jar
WEB-INF/lib/commons-io-2.1.jar
WEB-INF/lib/commons-lang-2.6.jar
WEB-INF/lib/geronimo-stax-api_1.0_spec-1.0.1.jar
WEB-INF/lib/guava-r05.jar
WEB-INF/lib/jcl-over-slf4j-1.6.1.jar
WEB-INF/lib/log4j-over-slf4j-1.6.1.jar
WEB-INF/lib/slf4j-api-1.6.1.jar
WEB-INF/lib/slf4j-jdk14-1.6.1.jar
WEB-INF/lib/wstx-asl-3.2.7.jar
WEB-INF/lib/lucene-analyzers-3.6.1.jar
WEB-INF/lib/lucene-core-3.6.1.jar
WEB-INF/lib/lucene-grouping-3.6.1.jar
WEB-INF/lib/lucene-highlighter-3.6.1.jar
WEB-INF/lib/lucene-kuromoji-3.6.1.jar
WEB-INF/lib/lucene-memory-3.6.1.jar
WEB-INF/lib/lucene-misc-3.6.1.jar
WEB-INF/lib/lucene-phonetic-3.6.1.jar
WEB-INF/lib/lucene-queries-3.6.1.jar
WEB-INF/lib/lucene-spatial-3.6.1.jar
WEB-INF/lib/lucene-spellchecker-3.6.1.jar
WEB-INF/lib/apache-solr-core-3.6.1.jar
WEB-INF/lib/apache-solr-solrj-3.6.1.jar
admin/
admin/dataimport.jsp
admin/debug.jsp
admin/replication/
WEB-INF/weblogic.xml
admin/_info.jsp
admin/action.jsp
admin/analysis.jsp
admin/analysis.xsl
admin/distributiondump.jsp
admin/favicon.ico
admin/form.jsp
admin/get-file.jsp
admin/get-properties.jsp
admin/header.jsp
admin/index.jsp
admin/jquery-1.4.3.min.js
admin/meta.xsl
admin/ping.jsp
admin/ping.xsl
admin/raw-schema.jsp
admin/registry.jsp
admin/registry.xsl
admin/replication/header.jsp
admin/replication/index.jsp
admin/schema.jsp
admin/solr-admin.css
admin/solr_small.png
admin/stats.jsp
admin/stats.xsl
admin/tabular.xsl
admin/threaddump.jsp
admin/threaddump.xsl
favicon.ico
index.jsp
META-INF/LICENSE.txt
META-INF/NOTICE.txt

> I just used the following steps (basically the same as yours just
> different paths) and got solr running in tomcat 7.0.29 with no
> problems

Well, *and* using a different version of solr ...



Re: Query for records that have more than N values in a multi-valued field

2012-07-24 Thread Chris Hostetter

: I have a multivalued field and I want to find records that have (for
: example) at least 3 values in that list. Is there an easy way to do
: it?

Typically this is solved by adding another field that contains the count 
-- I thought i had already written an update processor in Solr 4.0.0-ALPHA 
that would do this, but apparently i didn't, so i'll add one in SOLR-3670

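Until such a processor exists, the count can simply be added on the client at
index time; a SolrJ sketch (the "tags" and "tags_count" field names are
placeholders):

import java.util.List;
import org.apache.solr.common.SolrInputDocument;

public class CountFieldExample {
  public static SolrInputDocument withCount(String id, List<String> tags) {
    SolrInputDocument doc = new SolrInputDocument();
    doc.addField("id", id);
    for (String t : tags) {
      doc.addField("tags", t);
    }
    // Sidecar field holding the number of values, so "at least 3 values"
    // becomes a plain range query: fq=tags_count:[3 TO *]
    doc.addField("tags_count", tags.size());
    return doc;
  }
}
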

-Hoss


field type for text search

2012-07-24 Thread Xiao Li
I have used Solr 3.4 for a long time. Recently, when I upgraded to Solr 4.0
and reindexed the whole data set, I found that fields specified as the string
type cannot be searched via the q parameter. If I just change the type to
text_general, it works. So my question is: for Solr 4.0, must I set the field
type to text_general for text search?


Re: "Invalid or unreadable WAR file : .../solr.war" when starting solr 3.6.1 app on Tomcat 7 (upgrade to 7.0.29/upstream)?

2012-07-24 Thread Chris Hostetter

: > I just used the following steps (basically the same as yours just
: > different paths) and got solr running in tomcat 7.0.29 with no
: > problems
: 
: Well, *and* using a different version of solr ...

Whoops ... sorry, somehow i totally missed that. I just tried again with 
Solr 3.6.1 and everything still worked fine...

hossman@frisbee:/var/tmp$ ls
apache-solr-3.6.1.tgz  apache-solr-4.0.0-ALPHA.tgz  apache-tomcat-7.0.29.tar.gz 
 solr-context-file.xml
hossman@frisbee:/var/tmp$ tar -xzf apache-solr-3.6.1.tgz 
hossman@frisbee:/var/tmp$ tar -xzf apache-tomcat-7.0.29.tar.gz 
hossman@frisbee:/var/tmp$ cp apache-solr-3.6.1/dist/apache-solr-3.6.1.war 
solr.war
hossman@frisbee:/var/tmp$ sha1sum solr.war 
e1da03c97be88e2e72d120b6ba32de1133603766  solr.war
hossman@frisbee:/var/tmp$ mkdir -p apache-tomcat-7.0.29/conf/Catalina/localhost/
hossman@frisbee:/var/tmp$ cp solr-context-file.xml 
apache-tomcat-7.0.29/conf/Catalina/localhost/solr.xml
hossman@frisbee:/var/tmp$ cp -r apache-solr-3.6.1/example/solr solr-home
hossman@frisbee:/var/tmp$ ./apache-tomcat-7.0.29/bin/catalina.sh start
Using CATALINA_BASE:   /var/tmp/apache-tomcat-7.0.29
Using CATALINA_HOME:   /var/tmp/apache-tomcat-7.0.29
Using CATALINA_TMPDIR: /var/tmp/apache-tomcat-7.0.29/temp
Using JRE_HOME:/usr/lib/jvm/java-6-openjdk-amd64/
Using CLASSPATH:   
/var/tmp/apache-tomcat-7.0.29/bin/bootstrap.jar:/var/tmp/apache-tomcat-7.0.29/bin/tomcat-juli.jar

...solr was up and running at http://localhost:8080/solr/ ... no errors in 
the logs.



-Hoss


Re: "Invalid or unreadable WAR file : .../solr.war" when starting solr 3.6.1 app on Tomcat 7 (upgrade to 7.0.29/upstream)?

2012-07-24 Thread Chris Hostetter

: Whoops ... sorry, somehow i totally missed that. I just tried again with 
: Solr 3.6.1 and everything still worked fine...

BTW: I ran all these steps on Solr 3.6.1 using both...

java version "1.6.0_24"
OpenJDK Runtime Environment (IcedTea6 1.11.3) (6b24-1.11.3-1ubuntu0.12.04.1)
OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)

...and...

java version "1.7.0_03"
OpenJDK Runtime Environment (IcedTea7 2.1.1pre) (7~u3-2.1.1~pre1-1ubuntu3)
OpenJDK 64-Bit Server VM (build 22.0-b10, mixed mode)

(when i tested the steps using Solr 4.0.0-ALPHA before i only tested using 
1.6.0_24)


-Hoss


Re: "Invalid or unreadable WAR file : .../solr.war" when starting solr 3.6.1 app on Tomcat 7 (upgrade to 7.0.29/upstream)?

2012-07-24 Thread k9157
On Tue, Jul 24, 2012, at 11:43 AM, Chris Hostetter wrote:
> 
> : Whoops ... sorry, somehow i totally missed that. I just tried again
> with Solr 3.6.1 and everything still worked fine...
> 
> BTW: I ran all these steps on Solr 3.6.1 using both...
> 
> java version "1.6.0_24"
> OpenJDK Runtime Environment (IcedTea6 1.11.3)
> (6b24-1.11.3-1ubuntu0.12.04.1)
> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
> 
>   ...and...
> 
> java version "1.7.0_03"
> OpenJDK Runtime Environment (IcedTea7 2.1.1pre)
> (7~u3-2.1.1~pre1-1ubuntu3)
> OpenJDK 64-Bit Server VM (build 22.0-b10, mixed mode)
> 
> (when i tested the steps using Solr 4.0.0-ALPHA before i only tested
> using 
> 1.6.0_24)

Hm.  Works on Ubuntu.  Useful info.

I'm suspicious now, of an Opensuse-ism.  Maybe lingering pkg pollution
of the java stack on this box.

Since I'm headed in the direction of a local stack anyway, I'll try this
on a 'clean' VM that I build up, building all java* -- other than the
JDK -- from src, and see what happens.

Thanks again.


Re: "Invalid or unreadable WAR file : .../solr.war" when starting solr 3.6.1 app on Tomcat 7 (upgrade to 7.0.29/upstream)?

2012-07-24 Thread Chris Hostetter

: I'm suspicious now, of an Opensuse-ism.  Maybe lingering pkg pollution
: of the java stack on this box.

I would be more suspicious of lingering pkg pollution in tomcat ... i'm 
assuming you already tried the solr tutorial using the provided jetty 
first ... right?  

if that works it's unlikely that the JVM is having a problem reading the 
WAR.


-Hoss


Re: Redirecting SolrQueryRequests to another core with Handler

2012-07-24 Thread Chris Hostetter

: Managed to do this in the end by reconstructing a new SolrQueryRequest
: with a SolrRequestParsers (method buildRequestFrom()) and then calling
: core.execute();
: Took some fiddling but seems to be working now! :)

FWIW, i think the simplest way to do something like this would be...

CoreContainer cc = req.getCore().getCoreDescriptor().getCoreContainer();
SolrCore other = cc.getCore("the_other_core_name");  // ref-counted, must be closed
try {
  LocalSolrQueryRequest oreq = new LocalSolrQueryRequest(other, ...);
  try {
SolrQueryResponse orsp = new SolrQueryResponse(); 
other.execute(oreq, orsp);
// do something with orsp
  } finally {
oreq.close();
  }
} finally {
  other.close();
}



-Hoss


Re: [Announce] Solr 4.0-ALPHA with RankingAlgorithm 1.4.4 with Realtime NRT available for download

2012-07-24 Thread Andy
Nagendra,

Does RankingAlgorithm work with faceting which requires the use of cache? As 
new documents are added or updated, the cache will be constantly invalidated. 
So how would RankingAlgorithm work in this case?



 From: Nagendra Nagarajayya 
To: solr-user@lucene.apache.org 
Sent: Tuesday, July 24, 2012 8:24 AM
Subject: Re: [Announce] Solr 4.0-ALPHA with RankingAlgorithm 1.4.4 with 
Realtime NRT available for download
 
Hi Yonik:

Please see my comments below:

On 7/23/2012 8:52 AM, Yonik Seeley wrote:
> On Mon, Jul 23, 2012 at 11:37 AM, Nagendra Nagarajayya
>   wrote:
>> Realtime NRT algorithm enables NRT functionality in
>> Solr by not closing the Searcher object  and so is very fast. I am in the
>> process of contributing the algorithm back to Apache Solr as a patch.
> Since you're in the process of contributing this back, perhaps you
> could explain your approach - it never made sense to me.
> 
> Replacing the reader in an existing SolrIndexSearcher as you do means
> that all the related caches will be invalid (meaning you can't use
> solr's caches).  You could just ensure that there is no auto-warming
> set up for Solr's caches (which is now the default), or you could
> disable caching altogether.  It's not clear what you're comparing
> against when you claim it's faster.

Solr with RankingAlgorithm does not replace the reader in SolrIndexSearcher 
object. All it does is override the IndexSearcher.getIndexReader() method so as 
to supply a NRTReader if realtime is enabled. All direct references to the 
"reader" member has been replaced with a getIndexReader() method access.

The performance is better as SolrIndexSearcher is not closed every 1 sec as in 
soft-commit. SolrIndexSearcher is a heavy object with caches, etc. and is 
reference counted. So every 1 sec this object needs to closed, re-allocated and 
the indexes need to be re-opened, caches invalidated, while waiting for 
existing searchers to complete, making this very expensive. realtime NRT does 
not close the SolrIndexSearcher object but makes available a new NRTReader with 
document updates ie. getIndexReader() returns a new NRTReader.

> There are also consistency and concurrency issues with replacing the
> reader in an existing SolrIndexSearcher, which is supposed to have a
> static view of the index.  If a reader replacement happens in the
> middle of a request, it's bound to cause trouble, including returning
> the wrong documents!

The reader member is not replaced in the existing SolrIndexSearcher object. The 
IndexSearcher.getIndexReader() method has been overriden in SolrIndexSearcher 
and all direct reader member access has been replaced with a getIndexReader() 
method call allowing a NRT reader to be supplied when realtime is enabled. The 
concurrency is handled by the getNRTReader() method, with the static index view 
now increased to the granularity provided by the NRTIndexReader.


Regards,

Nagendra Nagarajayya
http://solr-ra.tgels.org
http://rankingalgorithm.tgels.org

> -Yonik
> http://lucidimagination.com
> 
> 

Copy lucene index into Solr

2012-07-24 Thread spredd1208
Is there a best practice to copy a lucene index which is built using the core
API of Lucene 3.6 into a
solr server (also 3.6) and then have it work?

I cannot find a mapping anywhere of lucene fields to solr fields and what
the corresponding schema.xml would look like.

This seems like something that should be on page 1 of the solr docs, but I
cannot find it ANYWHERE!

Is this possible? If not I would really like to know now, because I will
need to toss SOLR before getting stuck in a dead end.

Thanks.





Re: SOLR 4.0-ALPHA : DIH : Indexed and Committed Successfully but Index is empty

2012-07-24 Thread Chris Hostetter

: emptyFieldChain
...
: 294
...
: When running the DIH full import, the log output is:
...
: I'm a bit confused that it does not output the commit logs but it didn't 
: do that before, neither, if I remember correctly. When I issue an 

It may not commit by default unless you explicitly use "commit=true" when 
running the full-import (i don't remember what the default is) ... but 
that log output (or lack of it) is still highly suspicious.  can you 
please post what your "emptyFieldChain" looks like in solrconfig.xml?

Did you perhaps forget to include RunUpdateProcessorFactory at the end?

(i don't remember what exactly my advice to you was before, i may have 
posted a snippet of example syntax on how to do something with some new 
UpdateProcessors w/o posting a full chain that includes things you almost 
always want like LogUpdateProcessorFactory & RunUpdateProcessorFactory)
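
For reference, a complete chain along those lines might look like the sketch
below (not Chantal's actual config); without the final RunUpdateProcessorFactory
the documents are never handed to the index writer, which would explain a
"successful" import and an empty index:

<updateRequestProcessorChain name="emptyFieldChain">
  <processor class="solr.RemoveBlankFieldUpdateProcessorFactory"/>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>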


-Hoss


Re: Copy lucene index into Solr

2012-07-24 Thread Upayavira
Solr places constraints upon what you can do with your lucene index
(e.g. You must conform to a schema). If your Lucene index cannot be
mapped to a schema, then it cannot be used within Solr.

Upayavira

On Tue, Jul 24, 2012, at 11:05 PM, spredd1208 wrote:
> Is there a best practice to copy a lucene index which is built using the
> core
> API of Lucene 3.6 into a
> solr server (also 3.6) and then have it work?
> 
> I cannot find a mapping anywhere of lucene fields to solr fields and what
> the corresponding schema.xml would look like.
> 
> This seems like something that should be on page 1 of the solr docs, but
> I
> cannot find it ANYWHERE!
> 
> Is this possible? If not I would really like to know now, because I will
> need to toss SOLR before getting stuck in a dead end.
> 
> Thanks.
> 
> 
> 
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/Copy-lucene-index-into-Solr-tp3997078.html
> Sent from the Solr - User mailing list archive at Nabble.com.


SOLR - Query - Filtering on grouped result

2012-07-24 Thread SARKAR, Subrata
Hi,
Is there a way to make a SOLR query which will filter the results returned by 
the grouping?

I have the following three documents in SOLR, and I want to pick up the 
latest document based on RqDate which is associated with OrganizationIdRef:154.

So I wrote my query to group the results by UserId and sort them by RqDate 
(group=true&group.field=UserId&group.sort=RqDate desc), and it returns 
[Document 3], which is fine.

Then, in order to check whether the last document I got is associated with 
OrganizationIdRef:154 or not, I added a filter query (fq=OrganizationIdRef:154). 
But instead of returning the 0 results I am expecting, it returns [Document 2].

Is there a way I can filter on the grouped result in SOLR? Please help.

Document 1:
  
153154
2012-07-07T12:50:02.323Z
1
  
Document 2:
  
153154
2012-08-07T12:31:46.885Z
1
  

Document 3: [latest based on RqDate]
  
153
2012-09-07T12:23:31.583Z
1
  

Thanks & Regards,
Subrata Sarkar
subrata.sar...@oup.com
+44 (0)1865 354 486 (Office)
+44 (0)7741 559 346 (Cell)


Oxford University Press (UK) Disclaimer

This message is confidential. You should not copy it or disclose its contents 
to anyone. You may use and apply the information for the intended purpose only. 
OUP does not accept legal responsibility for the contents of this message. Any 
views or opinions presented are those of the author only and not of OUP. If 
this email has come to you in error, please delete it, along with any 
attachments. Please note that OUP may intercept incoming and outgoing email 
communications.


Re: Copy lucene index into Solr

2012-07-24 Thread spredd1208
So on the lucene side I am only using:

http://lucene.apache.org/core/3_6_0/api/core/org/apache/lucene/document/Field.html#Field(java.lang.String,
java.lang.String, org.apache.lucene.document.Field.Store,
org.apache.lucene.document.Field.Index)

and
http://lucene.apache.org/core/3_6_0/api/core/org/apache/lucene/document/NumericField.html#NumericField(java.lang.String,
org.apache.lucene.document.Field.Store, boolean)

to index my fields.

How does this line up with solr?

What do I use for my field types in my schema and how will this affect my
use and/or limits of using solr's query and index configurations?





Re: PathHierarchyTokenizerFactory behavior

2012-07-24 Thread Chris Hostetter

: Modifying the field definition to 

Alok: thanks for reporting this.  I've opened an issue to improve the 
example and the docs...

https://issues.apache.org/jira/browse/SOLR-3674


-Hoss


Re: Copy lucene index into Solr

2012-07-24 Thread Lance Norskog
It will be easier to understand these if you make an index in Solr and
examine it with the Luke program. Build Solr, then:
  cd solr/example
  java -jar start.jar
and in another window:
  cd solr/exampledocs
  sh post.sh *.xml

Now use Luke on the index in solr/example/solr/collection1/data/index

You will see what fields are made by indexing.

In general, very few people move binary indexes from Lucene to Solr or
between Solr versions. Lucene is not a database. There are people who
use it as their system of record, but that is not wise. You should be
able to reindex everything from your original data.
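
As for how those two constructors line up with schema.xml, a rough sketch
using the stock example-schema types (the field names are placeholders; note
that Solr's trie types default to precisionStep="8" while NumericField
defaults to 4, which matters if you try to range-query a pre-built index):

<!-- new Field("title", value, Field.Store.YES, Field.Index.ANALYZED) -->
<field name="title" type="text_general" indexed="true" stored="true"/>

<!-- new Field("id", value, Field.Store.YES, Field.Index.NOT_ANALYZED) -->
<field name="id" type="string" indexed="true" stored="true"/>

<!-- new NumericField("price", Field.Store.YES, true).setDoubleValue(9.99) -->
<field name="price" type="tdouble" indexed="true" stored="true"/>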

On Tue, Jul 24, 2012 at 6:11 PM, spredd1208  wrote:
> So on the lucene side I am only using:
>
> http://lucene.apache.org/core/3_6_0/api/core/org/apache/lucene/document/Field.html#Field(java.lang.String,
> java.lang.String, org.apache.lucene.document.Field.Store,
> org.apache.lucene.document.Field.Index)
>
> and
> http://lucene.apache.org/core/3_6_0/api/core/org/apache/lucene/document/NumericField.html#NumericField(java.lang.String,
> org.apache.lucene.document.Field.Store, boolean)
>
> to index my fields.
>
> How does this line up with solr?
>
> What do I use for my field types in my schema and how will this affect my
> use and/or limits of using solr's query and index configurations?
>
>
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Copy-lucene-index-into-Solr-tp3997078p3997105.html
> Sent from the Solr - User mailing list archive at Nabble.com.



-- 
Lance Norskog
goks...@gmail.com