Hi All,
I am using Apache Solr 3.1 and trying to cache 50 GB of records, but it is
taking more than 20 hours, which makes updating records very painful.
1. Is there any way to reduce the caching time, or is this time normal for 50 GB
of records?
2. What is delta-import? Would this be helpful for me to cache on
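For reference, delta-import is a DataImportHandler feature that re-indexes only the rows changed since the last import instead of the whole data set. A minimal data-config.xml sketch is below; the table and column names (item, id, name, last_modified) and the JDBC settings are only examples and would need to match your own database:

<dataConfig>
  <dataSource driver="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost/mydb" user="user" password="pass"/>
  <document>
    <entity name="item" pk="id"
            query="SELECT id, name FROM item"
            deltaQuery="SELECT id FROM item WHERE last_modified &gt; '${dataimporter.last_index_time}'"
            deltaImportQuery="SELECT id, name FROM item WHERE id='${dataimporter.delta.id}'"/>
  </document>
</dataConfig>

It is then triggered through the /dataimport handler with command=delta-import.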
Looks like you are using a java.util.logging system property but pointing it to a log4j
config file?
On Feb 24, 2012, at 6:42 PM, Anand Henry wrote:
> Forgot to mention, I also append arg
> '-Djava.util.logging.config.file=log4j.properties' on start up.
>
> On Fri, Feb 24, 2012 at 3:38 PM, Anand Henry wr
I'm always excited to try the new stuff. We've been running off of a
fairly old version from solrbranch, which I'd like to move off of;
this is the first step.
Are there any outstanding issues that I should be aware of? I've only
run simple tests thus far, but plan to run some more comprehensive
t
No problem! I only wish I had found what the issue was sooner - I suspected a
couple of other issues we found could have been related, but since I had not
duplicated it I could not be sure. The early use and feedback is invaluable,
though.
On Feb 24, 2012, at 10:53 PM, Jamie Johnson wrote:
> Pullin
Pulling the latest version seems to have fixed whatever issue
previously existed, so everything appears to be working properly. I'm
seeing updates make it to the downed server once it recovers, and
deletes (even by query, which I wasn't sure would work) are being
forwarded as well. So everything l
I'm pulling the latest now. Once I've rebuilt and set up the test I'll
forward all the logs on to you. Again thanks for looking into this.
On Fri, Feb 24, 2012 at 9:20 PM, Mark Miller wrote:
>
> On Feb 22, 2012, at 9:54 PM, Jamie Johnson wrote:
>
>> Perhaps if you could give me the steps you're
On Feb 22, 2012, at 9:54 PM, Jamie Johnson wrote:
> Perhaps if you could give me the steps you're using to test I can find
> an error in what I'm doing.
I've tested a few ways - one of them as Sami has already explained.
Could you try one more time from the latest? A few issues have been addre
Forgot to mention, I also append arg
'-Djava.util.logging.config.file=log4j.properties' on start up.
On Fri, Feb 24, 2012 at 3:38 PM, Anand Henry wrote:
> Hi,
>
> How do you configure the default Solr app running on Jetty to send logs to
> syslog? I am using Solr 3.4.
>
> I understand I have to direc
Hi Umesh,
if you want to completely separate both Solr instances for any reason,
feel free to do so.
However, *I* feel very comfortable with managing multiple SolrCores
within the same ServletContainer instance (i.e. Tomcat).
The only reasons why I ever set up multiple instances were
security cons
Hi Em,
Both the indices will be used by different applications and as of now we do
not have any need to combine the results.
Thanks,
Umesh
Lucene Revolution will be here May 9-10 in Boston (with training classes
offered on May 7-8). Reserve your spot today with Early Bird pricing of $575.
Committers and accepted speakers are entitled to free admission. The CFP is
open and we’re actively seeking submissions from the Community.
Subm
The key piece is "ZkSolrResourceLoader does not support getConfigDir()".
Apparently DIH is doing something that requires getting the local config dir
path - but in SolrCloud mode this lives in ZooKeeper, not on the local filesystem.
Could you make a JIRA issue for this? I could look into a workaround depend
On Feb 24, 2012, at 6:55 AM, Em wrote:
> You need a log for failover.
There is a transaction log.
- Mark Miller
lucidimagination.com
I have a situation where I want to show the term counts as is done in the
TermsComponent, but *only* for terms that are *matched* in a query, so I
get something returned like this (pseudo code):
q=title:(golf swing)
title: golf legends show how to improve your golf swing on the golf course
...ot
You've got to add firstSearcher/newSearcher queries
that do the expensive parts. Where you're probably
taking a wrong turn is thinking that queries
"load the index into memory". Nothing of the sort
happens. q=*:* is particularly unhelpful since it's
a "constant score query" which basically short-ci
I'm seeing some problems warming up Solr on startup. Currently warmup
consists of two parts: running queries on startup programmatically, and
then running a script to perform queries. The programmatic warmup seems to
warm up Solr fine in terms of making queries via the Solr admin tool, but
when I d
Obviously it'd be great if someone else was able to confirm this in their
setup as well.
But with different environments, payload sizes, etc., I'm not sure how
easily it can be reproduced elsewhere.
On Fri, Feb 24, 2012 at 2:46 PM, Brian G wrote:
> Erick -
>
> That is exactly what we ar
Erick -
That is exactly what we are seeing.
this is in our solrconfig.xml:
<enableLazyFieldLoading>false</enableLazyFieldLoading>
and our response times have decreased drastically. I'm on my 40th-ish test
today and the response times are still 10+ seconds faster on the higher
payload than they were when it was set to true.
Smaller payloads a
Hi Umesh,
what does your access pattern look like?
Do you need to combine results from different indices?
Kind regards,
Em
On 24.02.2012 18:37, Umesh_ wrote:
> All,
>
> I was trying to find information about the pluses/deltas of using
> - Solr multicore, where each core has a different index
> - Mu
Let me echo this back to see if I have it right, because it's *extremely*
weird if I'm reading it correctly.
In your solrconfig.xml file, you changed this line:
<enableLazyFieldLoading>true</enableLazyFieldLoading>
to this:
<enableLazyFieldLoading>false</enableLazyFieldLoading>
and your response time DECREASED? If you can confirm that
I'm reading it right, I'll open up a JIRA.
Best
Erick
On
I'm not sure what would constitute a low vs. high hit rate (and eviction
rate), so we've kept the setting at LRUCache instead of FastLRUCache for now.
But I will say we did turn the LazyFieldLoading option off and wow - a huge
increase in performance on the newer nightly build we are using (the one
f
jconsole should just be there in your Java SDK; you shouldn't have to install
anything. Connecting remotely is a little trickier - here's the reference:
http://docs.oracle.com/javase/1.5.0/docs/guide/management/agent.html
I cheat and disable authentication, see the "disabling security" section, but
All,
I was trying to find information about the pluses/deltas of using
- Solr multicore, where each core has a different index
- Multiple instances, where each Solr instance has its own index
So currently we have a Master/Slave configuration in place where we just
have one index. But we need to support
The empty path message is because Nutch is unable to find a URL in the URL
location that you provide.
Kindly ensure there is a URL there.
On Fri, Feb 24, 2012 at 11:24 AM, naptowndev wrote:
> Another question I have is regarding solr.LRUCache vs. solr.FastLRUCache.
> Would there be reason to implement (or not implement) fastLRU on the
> documentcache?
LRUCache can be faster if the hit rate is really low (i.e. the
eviction rate is h
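For context, both cache implementations are configured the same way in solrconfig.xml; only the class attribute differs. A sketch, where the sizes are only placeholder values:

<!-- plain synchronized LRU cache -->
<documentCache class="solr.LRUCache" size="15000" initialSize="15000" autowarmCount="0"/>
<!-- or the mostly lock-free variant, usually the better choice at high hit ratios -->
<documentCache class="solr.FastLRUCache" size="15000" initialSize="15000" autowarmCount="0"/>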
Yonik -
Thanks, we'll give that a try (re: lazyFieldLoading).
And no, the * is not in our config...that must have come over from pasting
it in from the file. Odd.
Another question I have is regarding solr.LRUCache vs. solr.FastLRUCache.
Would there be reason to implement (or not implement) fas
On Fri, Feb 24, 2012 at 10:25 AM, naptowndev wrote:
> Our current config for that is as follows:
> initialSize="*15000*" autowarmCount="*0*" />
>
> It's the same for both instances
I assume the asterisks are for emphasis and are not actually present
in your config?
> And lazyfieldloading is e
Product_ID was defined - turns out I was encapsulating all data when
encapsulation was only required for a couple of field types.
Erik Hatcher walked me through everything I was doing wrong and brought me to a
solution.
Thank you for responding, though - 'tis greatly appreciated.
Thanks,
Patrick Mc
Thanks again.
We're trying to get our ops team to install jconsole for us so we can take
a look at the GC stuff.
Your comment about the documentCache is intriguing, for sure.
We just ran a couple of tests against the older 4.x build we have (that's
been returning quicker) and the newer in that we
Yonik,
thanks for sharing deeper details about how SolrCloud is going to work.
Do you plan to release any wiki updates about the small details, so that
other developers are able to get up to speed with what you've already done
there?
I think small guides and the mention of class names and their
r
Hi Per,
if you are evaluating with your ProductOwner whether he/she wants to
contribute back:
Try not to see it only as a gift to the community for a highly useful
product, but also as protection of your investment.
What you are going to customize will be deeply integrated in Solr - in
Watch out for StringField, that may be where you're having
trouble. Take a close look at your admin/analysis page. If
"Pass by Value" is matching on a string field when quoted,
that'll explain why it isn't matching when not quoted.
The problem here is that the query parser (before it gets to
the f
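To make the distinction concrete: in schema.xml a string field is indexed as one untokenized term, while a text field is run through an analyzer. A sketch, where the field names are only examples:

<!-- exact, untokenized: "Pass by Value" is indexed as a single term -->
<field name="title_exact" type="string" indexed="true" stored="true"/>
<!-- tokenized/analyzed: "pass", "by", "value" become separate terms -->
<field name="title" type="text" indexed="true" stored="true"/>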
On Fri, Feb 24, 2012 at 8:59 AM, Per Steffensen wrote:
> We might make it "outside" Solr/Lucene but I
> hope to be able to convince my ProductOwner to make it as a Solr-feature
> contributing it back - especiallly if the Solr community agrees that it
> would be a nice and commonly usable feature.
Per Steffensen wrote:
Em wrote:
This is a really cool feature!
Thanks for pointing us in that direction!
A feature where you can flag your "index" operation to provide "create
semantics" would be cool. When setting the "create-semantics" flag, an
"index" operation will fail if a document wit
Yonik Seeley wrote:
On Fri, Feb 24, 2012 at 9:04 AM, Per Steffensen wrote:
Cool. We have a test doing exactly that - indexing 2000 documents into Solr,
kill-9'ing Solr in the middle of the process, starting Solr again and
checking that 2000 documents will eventually be searchable. It lights
On Fri, Feb 24, 2012 at 9:04 AM, Per Steffensen wrote:
> Cool. We have a test doing exactly that - indexing 2000 documents into Solr,
> kill-9'ing Solr in the middle of the process, starting Solr again and
> checking that 2000 documents will eventually be searchable. It lights red as
> it is right
Hi,
I had posted two different query strings by mistake. Please find below the
correct strings when "Pass by value" is the search term.
String given without quotes:
webapp=/solr path=/select/
params={facet=true&f.typeFacet.facet.mincount=1&qf=name^2.3+text+x_name^0.3+id^0.3+xid^0.3&hl.fl=*&hl=true&f.rFacet.fac
Yonik Seeley wrote:
On Fri, Feb 24, 2012 at 6:55 AM, Em wrote:
However, regarding a versioning system, one always has to keep in mind
that an uncommitted document is not guaranteed to be persisted in the index.
We now have durability via an update log.
With a recent nightly trunk build
Em wrote:
This is a really cool feature!
Thanks for pointing us in that direction!
A feature where you can flag your "index" operation to provide "create
semantics" would be cool. When setting the "create-semantics" flag, an
"index" operation will fail if a document with a similar id (or whatev
On Fri, Feb 24, 2012 at 6:55 AM, Em wrote:
> However, regarding a versioning system, one always has to keep in mind
> that an uncommitted document is not guaranteed to be persisted in the index.
We now have durability via an update log.
With a recent nightly trunk build, you can send a document to
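For anyone looking for it, the update log is switched on in the updateHandler section of solrconfig.xml on trunk builds. A minimal sketch; exact defaults may differ by build:

<updateHandler class="solr.DirectUpdateHandler2">
  <!-- enables the transaction log used for durability and recovery -->
  <updateLog/>
</updateHandler>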
> I want to include my own filter for analysis of tokens while
> indexing the
> documents in SOLR.
http://wiki.apache.org/solr/SolrPlugins
This is a really cool feature!
Thanks for pointing us in that direction!
As the "Quick Start" says, a document does not need a commit nor a
soft-commit or anything else to be available via RealTimeGet.
However, regarding a versioning system, one always has to keep in mind
that an uncommitted docum
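For reference, realtime get is exposed through its own request handler in solrconfig.xml and queried by id. A minimal sketch:

<requestHandler name="/get" class="solr.RealTimeGetHandler">
  <lst name="defaults">
    <str name="omitHeader">true</str>
  </lst>
</requestHandler>

A lookup then looks like http://localhost:8983/solr/get?id=mydoc and returns the latest version of the document, even before a commit.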
Hi Per,
> Can you give a code-pointer to where I can find the pending-set stuff?
> Does solr use this pending-set for query responses, so that solr deliver
> 100% real-time search results?
As of Solr 3.5 it can be found within the DirectUpdateHandler and
DirectUpdateHandler2 classes.
I am currentl
On Fri, Feb 24, 2012 at 12:06 PM, Per Steffensen wrote:
> Sami Siren wrote:
>
Given that you've set a uniqueKey-field and there already exists a
document with that uniqueKey, it will delete the old one and insert the
new one. There is really no difference between the semantics - upd
For example, this way:
1. Implement a filter factory:
[code]
package com.mycomp.solr.analysis;
import org.apache.lucene.analysis.TokenStream;
import org.apache.solr.analysis.BaseTokenFilterFactory;
import org.apache.solr.common.ResourceLoader;
import org.apache.solr.util.plugin.ResourceLoaderAware;
// minimal sketch of how the class could continue; MyTokenFilterFactory and
// MyTokenFilter are example names for your own factory and TokenFilter
public class MyTokenFilterFactory extends BaseTokenFilterFactory implements ResourceLoaderAware {
  public void inform(ResourceLoader loader) { /* load resources here, e.g. word lists */ }
  public TokenStream create(TokenStream input) { return new MyTokenFilter(input); }
}
Hi, all,
I tried to upgrade Tika 0.8 to Tika 0.10 on Solr 3.3.0, but failed. Following
are some technical details.
Has anyone tried similar things before? Please advise. Thank you.
1. Replace the following jars in /contrib/extraction/
fontbox-1.6.0, jempbox-1.6.0, pdfbox-1.6.0, tika-core-0.10,
tika-p
Hi, all,
I tried to upgrade Tika 0.8 to Tika 0.10 on Solr 3.3.0, following similar
steps, but failed.
1. Replace the following jars in /contrib/extraction/
fontbox-1.6.0, jempbox-1.6.0, pdfbox-1.6.0, tika-core-0.10,
tika-parsers-0.10;
2. Copy all the jars in /contrib/langid/* from solr3.5.0
Sami Siren wrote:
Given that you've set a uniqueKey-field and there already exists a
document with that uniqueKey, it will delete the old one and insert the
new one. There is really no difference between the semantics - updates
do not exist.
To create a UNIQUE-constraint as you know it from a dat
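For completeness, the overwrite-on-add behaviour described here hinges on the uniqueKey declaration in schema.xml. A minimal sketch, with "id" being the conventional example field name:

<field name="id" type="string" indexed="true" stored="true" required="true"/>
<uniqueKey>id</uniqueKey>

Any add whose id matches an existing document replaces that document; without a uniqueKey defined, adds simply accumulate.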
Dear all,
I want to include my own filter for analysis of tokens while indexing the
documents in SOLR.
Is there any explicit interface for programming compatible filters in SOLR?
Please let me know the steps to be followed to use my own filters in the
schema.xml file. I mean, if I create a Java class
>> Given that you've set a uniqueKey-field and there already exists a
>> document with that uniqueKey, it will delete the old one and insert the
>> new one. There is really no difference between the semantics - updates
>> do not exist.
>> To create a UNIQUE-constraint as you know it from a database
Em wrote:
Hi Per,
I want an error to occur if a document with the same id already
exists, when my intent is to INSERT a new document. When my intent is
to UPDATE a document in solr/lucene I want the old document already
in solr/lucene deleted and the new version of this document added
(exact
Hi, Dmitry
Thank you. It solved my problem.
Best Regards,
Bing
Thanks!!
This is a Tomcat issue and not Solr: URIEncoding="UTF-8" is missing in the
Tomcat server.xml.
Frederic
2012/2/23 Em
> Hi Frederic,
>
> I saw similar issues when sending such a request without proper
> URL-encoding. It is important to note that the URL-encoded string
> already has to be a
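For reference, the attribute goes on the HTTP Connector in Tomcat's conf/server.xml; the port and timeout values below are just the stock defaults:

<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443"
           URIEncoding="UTF-8"/>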