On Thu, Aug 13, 2009 at 3:08 AM, John Lowe wrote:
> Hmmm...perhaps my original note was a bit TLTR. Trying again:
>
> The v1.3 docs say that one can pass one's own parameters in to DIH via the
> HTTP request:
>
DIH in Solr 1.3 had a bug due to which request parameters in variables were
not resolved.
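For context, the pattern under discussion looks roughly like this in data-config.xml (a minimal sketch; the entity, column, and "category" parameter names are hypothetical), invoked with a request such as /dataimport?command=full-import&category=books:

<dataConfig>
  <dataSource driver="org.hsqldb.jdbcDriver" url="jdbc:hsqldb:/tmp/example" user="sa"/>
  <document>
    <!-- ${dataimporter.request.category} should be replaced with the "category"
         parameter passed on the /dataimport request -->
    <entity name="item"
            query="select id, name from item where category='${dataimporter.request.category}'">
      <field column="id" name="id"/>
      <field column="name" name="name"/>
    </entity>
  </document>
</dataConfig>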
On Thu, Aug 13, 2009 at 4:08 AM, Erik Hatcher wrote:
> My hunch, though I'll try to make some time to test this out thoroughly, is
> that the entity is parsed initially with variables resolved, but not per
> request. Variables/expressions do get expanded for fields of course, but
> perhaps not for other high-level attributes?
Isn't it better to make a jar of PlaintextEntityProcessor and drop it into
solr.home/lib?
On Tue, Aug 11, 2009 at 11:05 PM, Sascha Szott wrote:
> Hi Noble,
>
> Noble Paul wrote:
>>
>> isn't it possible to do this by having two datasources (one Jdbc and
>> another File) and two entities. The outer en
*You should try to generate heap dumps and analyze the heap using a tool
like the Eclipse Memory Analyzer. Maybe it helps spotting a group of
objects holding a large amount of memory*
The tool that I used also allows capturing heap snapshots. Eclipse had a
lot of prerequisites. You need to appl
Thanks!
How about for WebLogic?
Francis
-Original Message-
From: ant [mailto:dormant.m...@gmail.com]
Sent: Wednesday, August 12, 2009 8:44 PM
To: solr-user@lucene.apache.org
Subject: Re: Example dir
http://wiki.apache.org/solr/SolrJetty#head-5663a826c263727cad83bc58cac0cb02f53d6a80
SolrJetty
and others
http://wiki.apache.org/solr/SolrInstall#head-ec97d15a70656e9c0308009db70d71af3efc7cd2
2009/8/13 Francis Yakin
>
> Does anyone have any input on this? I would really appreciate it.
>
> Thanks
>
> Francis
Does anyone have any input on this? I would really appreciate it.
Thanks
Francis
-Original Message-
From: Francis Yakin [mailto:fya...@liquid.com]
Sent: Wednesday, August 12, 2009 3:39 PM
To: 'solr-user@lucene.apache.org'
Subject: Example dir
As of right now, when I installed and configured the So
Could anybody provide me with a complete data import handler example with
Oracle, if there is one?
thanks
rashid
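For what it's worth, a minimal Oracle setup might look roughly like the sketch below (the connection URL, credentials, table, and column names are hypothetical; the Oracle JDBC driver jar has to be on Solr's classpath, e.g. dropped into the lib directory, and the handler is registered in solrconfig.xml pointing at this file):

<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="oracle.jdbc.driver.OracleDriver"
              url="jdbc:oracle:thin:@localhost:1521:XE"
              user="scott" password="tiger"/>
  <document>
    <!-- Oracle tends to return column names in upper case, so map them explicitly -->
    <entity name="product"
            query="SELECT id, name, price FROM products">
      <field column="ID" name="id"/>
      <field column="NAME" name="name"/>
      <field column="PRICE" name="price"/>
    </entity>
  </document>
</dataConfig>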
Note that depending on the profile of your field (full text and how many
unique terms on average per document), the improvements from 1.4 may not
apply, as you may exceed the limits of the new faceting technique in Solr
1.4.
-Stephen
On Wed, Aug 12, 2009 at 2:12 PM, Erik Hatcher wrote:
> Yes, in
Hello,
We have Solr running with the defaultOperator set to "AND". I am not
able to get any results for queries like q=( Ferrari AND ( "599 GTB
Fiorano" OR "612 Scaglietti" OR F430 )), which contain "(" for
grouping. Does anyone have ideas for a workaround?
Thanks
madhu
Ah - right - sorry, getting lost looking at the diff I guess.
It looks to me like you are never actually returning NO_MORE_DOCS in
nextDoc based on the exception.
I can't see how that is at the moment though.
- Mark
Stephen Duncan Jr wrote:
Doesn't that happen with the call to
this.termDoc
Is there a way to do this currently? If a shard takes an
inordinate amount of time compared to the other shards, it's useful
to see the various qtimes per shard, with the aggregated results.
As of right now, when I installed and configured Solr, I got the "example"
dir (like /opt/apache-solr-1.3.0/example).
How can I change that to something else, because "example" to me is not real?
Thanks
Francis
My hunch, though I'll try to make some time to test this out
thoroughly, is that the entity is parsed initially with variables
resolved, but not per request. Variables/expressions do get expanded
for fields of course, but perhaps not for other high-level attributes?
Erik
On Aug 12
I believe it will be, but I am not sure of the procedure for
distributing it. I think if you register but don't show up, you will get a
notification.
-Grant
On Aug 10, 2009, at 12:26 PM, Lucas F. A. Teixeira wrote:
Hello Grant,
Will the webinar be recorded and available to download later
somep
For your fields with many terms you may want to try Bobo
http://code.google.com/p/bobo-browse/ which could work well with your
case.
On Wed, Aug 12, 2009 at 12:02 PM, Fuad Efendi wrote:
> I am currently faceting on tokenized multi-valued field at
> http://www.tokenizer.org (25 mlns simple docs)
>
Hi Alan,
Solr 1.4 does not contain near-realtime search capabilities, and
calling commit too often can be quite detrimental, as
indexing and search performance could degrade sharply. That being
said, most of the NRT functionality is not too difficult to add,
except for per-segment caching SOLR-
: the search results. In particular, I hypothesize that, for a somewhat
: heterogeneous index (heterogeneous in terms of which fields a given
: record might contain), that the following rule might be helpful: Facet
: on a given field to the extent that it is frequently set in the
: documents match
Hmmm...perhaps my original note was a bit TLTR. Trying again:
The v1.3 docs say that one can pass one's own parameters in to DIH via
the HTTP request:
http://wiki.apache.org/solr/DataImportHandler#head-520f8e527d9da55e8ed1e274e29709c8805c8eae
So if I have a URL like the following to dataimp
: I noticed in recent SVN versions the example/solr/bin dir has been empty.
FYI: example/solr/bin is empty, because the scripts aren't part of the
example anymore ... but the scripts themselves are still included in the
releases (src/scripts)
-Hoss
Hi,
Does the latest version of Solr 1.4 dev (including DIH) take advantage of
Lucene's Near Realtime Search features? I've read several past postings
about providing near-real time search using a small and large index and was
wondering if that will still be necessary when Solr 1.4 releases.
Thanks
Is there a way to do this via a URL?
got it, thanks!
Is the whole updateHandler interface documented somewhere currently on the
wiki?
I think it would benefit all users to have this laid out nicely; I was
looking around for this and couldn't
easily find it there.
-Chak
Yonik Seeley-2 wrote:
>
> Then replace commit with optimize
I am currently faceting on tokenized multi-valued field at
http://www.tokenizer.org (25 mlns simple docs)
It uses some home-made quick fixes similar to SOLR-475 (SOLR-711) and
non-synchronized cache (similar to LingPipe's FastCache, SOLR-665, SOLR-667)
Average "faceting" on query results: 0.2 - 0
Same works with optimize... /solr/update?optimize=true
Erik
On Aug 12, 2009, at 2:43 PM, KaktuChakarabati wrote:
Hey Yonik,
Thanks for the quick reply. However, my first question was more
specific:
* I'm not worried about a commit but about the *optimize* operation
which I
might w
We have been using Solr 1.3 with JDK 1.6 for quite some time in production, with no
issues yet.
Thanks,
Kalyan Manepalli
-Original Message-
From: vaibhav joshi [mailto:callvaib...@hotmail.com]
Sent: Wednesday, August 12, 2009 1:21 PM
To: solr-user@lucene.apache.org
Subject: Solr 1.3 and JDK1.6
Then replace commit with optimize?
curl 'http://localhost:8983/solr/update?optimize=true'
-Yonik
http://www.lucidimagination.com
On Wed, Aug 12, 2009 at 2:43 PM, KaktuChakarabati wrote:
>
> Hey Yonik,
> Thanks for the quick reply. However, my first question was more specific:
> * I'm not worrie
Oracle JRockit (Mission Control 1.3) latest-greatest (Java 6), -server, AMD64,
SLES 10
Solr 1.3/1.4
Tomcat 6.0.20, APR
No problems at all. But you need licensing for production.
JRockit seems to be at least 20 times faster than SUN's JVM.
P.S.
I only had constant problems with latest Apache Http
Yes, increasing the filterCache size will help with Solr 1.3
performance.
Do note that trunk (soon Solr 1.4) has dramatically improved faceting
performance.
Erik
On Aug 12, 2009, at 1:30 PM, Jérôme Etévé wrote:
Hi everyone,
I'm using some faceting on a solr index containing ~ 160K documents.
Hey Yonik,
Thanks for the quick reply. However, my first question was more specific:
* I'm not worried about a commit but about the *optimize* operation, which I
might want to run very infrequently
with respect to commits (e.g. I can commit every 15 minutes but optimize
once a day).
Yonik Seeley-
A script isn't really needed for something as simple as a commit:
curl 'http://localhost:8983/solr/update?commit=true'
-Yonik
http://www.lucidimagination.com
On Wed, Aug 12, 2009 at 2:27 PM, KaktuChakarabati wrote:
>
> Hey,
> I noticed in recent SVN versions the example/solr/bin dir has been emp
Hi,
I'm running Solr 1.3 with Java 1.6 (java -version reports "1.6...").
No problem to report.
Cheers.
J
2009/8/12 vaibhav joshi :
>
> Hi
>
> I am using Solr 1.3 (official released version) and JDK 1.5. My company is
> moving towards upgrading all systems to JDK 1.6. Is it safe to upgrade to
> JDK
Hey,
I noticed in recent SVN versions the example/solr/bin dir has been empty.
I understand the various snappulling scripts are basically deprecated since
replication is now handled in-process; however, I was wondering what the
state of the
optimize script is, i.e. how do I control optimization myse
Hi
I am using Solr 1.3 (official released version) and JDK 1.5. My company is
moving towards upgrading all systems to JDK 1.6. Is it safe to upgrade to JDK 1.6
with Solr 1.3 WARs? Are there any compatibility issues with JDK 1.6?
Thanks
Vaibhav
Jerome,
Yes, you need to increase the filterCache size to something close to the
number of unique facet elements. But also consider the RAM required to
accommodate the increase.
I did see a significant performance gain by increasing the filterCache size
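For reference, the filterCache is declared in the <query> section of solrconfig.xml; a sketch with purely illustrative sizes (size should be in the ballpark of the number of unique values being faceted on, and keep an eye on the heap):

<filterCache
    class="solr.LRUCache"
    size="16384"
    initialSize="4096"
    autowarmCount="1024"/>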
Thanks,
Kalyan Manepalli
-Origina
Hi everyone,
I'm using some faceting on a solr index containing ~ 160K documents.
I perform facets on multivalued string fields. The number of possible
different values is quite large.
Enabling facets degrades the performance by a factor of 3.
Because I'm using solr 1.3, I guess the faceting mak
Doesn't that happen with the call to
this.termDocs.next()
? Since I'm essentially delegating the tracking to the TermDocs object?
On Wed, Aug 12, 2009 at 10:08 AM, Mark Miller wrote:
> First thing I see (and it may be it) is that nextDoc must also set the doc
> - not just return it.
>
>
> --
First thing I see (and it may be it) is that nextDoc must also set the
doc - not just return it.
--
- Mark
http://www.lucidimagination.com
Stephen Duncan Jr wrote:
On Tue, Aug 11, 2009 at 2:09 PM, Stephen Duncan Jr
wrote:
This is with trunk for Solr 1.4. It happened both wi
Thanks a lot, Jason!
I'll look into MergePolicy in depth;
I am temporarily using ramBufferSizeMB=8192 & mergeFactor=10, and it looks like I
get a constant few thousand docs per second with very rare merges (already
3 hours, 8Gb index size, > 30 mlns docs)
I don't do "delete"; I execute SOLR /update and
Hi Grant,
Looks like I temporarily solved the problem with not-so-obvious settings:
ramBufferSizeMB=8192
mergeFactor=10
Starting from scratch on different hardware (with much more RAM and CPU;
regular SATA), I have added/updated 30 million docs within 3 hours...
without any merge yet! Index s
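For reference, the settings mentioned above live in the indexDefaults (or mainIndex) section of solrconfig.xml; a sketch with the values quoted in this thread (a buffer this large is unusual, assumes a correspondingly large JVM heap, and may not be honored in full by every Lucene version):

<indexDefaults>
  <!-- buffer roughly this many MB of documents in RAM before flushing a segment -->
  <ramBufferSizeMB>8192</ramBufferSizeMB>
  <!-- merge segments once this many accumulate at one level -->
  <mergeFactor>10</mergeFactor>
</indexDefaults>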
Thank you very much for replying. I'm really astounded by the speed with which you
guys reply to other people's problems :clap:. I managed to make it work with
WAPT and started testing.
Cheers
Viorel Hojda
Shalin Shekhar Mangar wrote:
>
> On Wed, Aug 12, 2009 at 5:07 PM, viorelhojda
> wrote:
>
Forwarding the ApacheCon announcement. Also note we have a lot of
Lucene ecosystem talks and a meetup scheduled, as well as training on
both Lucene and Solr, so I hope you will join us.
Cheers,
Grant
Begin forwarded message:
From: Sally Khudairi
Date: August 7, 2009 9:55:10 PM EDT
To: an
Some thoughts below, sorry for the late reply...
On Aug 6, 2009, at 2:27 PM, Mark Bennett wrote:
I'm investigating a problem I bet some of you have hit before, and
exploring
several options to address it. I suspect that this specific IDF
scenario is
common enough that it even has a name, t
Thank you for your suggestions. Since my initial email, I have
found (or guessed at) two other leads:
- The CustomScoreQuery of Lucene :
http://lucene.apache.org/java/2_4_0/api/org/apache/lucene/search/function/package-summary.html
- To create a temporary index, at query-time, containing the doc
Yes, it points to the directory containing solr.xml.
But the Solr folder has the conf and bin folders.
Is anything else required for multicore?
2009/8/12 Noble Paul നോബിള് नोब्ळ्
> It somehow looks like Solr started as a single core. Are you
> sure solr.solr.home points to a directory whi
On Wed, Aug 12, 2009 at 5:07 PM, viorelhojda wrote:
>
> Hello. I'm trying to do some speed tests on the SOLR server (Average
> Response
> Time, etc.). I tried all kinds of tools and software, but I didn't manage,
> because at one point some errors appeared. The "*:*" query works just
> fine,
> but w
2009/8/12 Fabrice Estiévenart
> Hello,
>
> I need to sort my hits according to a rate of "popularity" which
> dynamically and periodically changes. Thus, I can't store this popularity in
> the index and I have to get it, from memory, at query-time.
>
> Is it possible with Solr ? Thank you.
>
Not
OK, I'm going to go and implement this solution then. Thanks for the help :)
On Wed, Aug 12, 2009 at 2:11 PM, Avlesh Singh wrote:
> >
> > > multiValued="true" />
> >
> I think this should be good enough with the "number of days since ..." kind
> of data.
>
> Cheers
> Avlesh
>
> On Wed, Aug 12,
>
> multiValued="true" />
>
I think this should be good enough with the "number of days since ..." kind
of data.
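For concreteness, such a field might be declared in schema.xml roughly as follows (a sketch; the field name is hypothetical, assuming one integer "days since some epoch" value per reserved day):

<field name="reserved_days" type="sint" indexed="true" stored="false"
       multiValued="true"/>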
Cheers
Avlesh
On Wed, Aug 12, 2009 at 5:26 PM, Constantijn Visinescu
wrote:
> Hmm .. I looked up the transformer class and that should work, thanks for
> the tip :)
>
> Thinking abou
It somehow looks like Solr started as a single core. Are you
sure solr.solr.home points to a directory which contains
solr.xml?
On Wed, Aug 12, 2009 at 4:00 PM, deepak agrawal wrote:
> Hi all,
>
> I have a single Solr instance. Now I just want to use Solr multicore.
> So for thi
Hmm .. I looked up the transformer class and that should work, thanks for
the tip :)
Thinking about it a bit more, wouldn't it be easier to just add a field like
And use a function in the transformer to convert the day to an int (number of
days since Jan 1st 2000 or something).
Then if I use the
Hello. I'm trying to do some speed tests on the SOLR server (Average Response
Time, etc.). I tried all kinds of tools and software, but I didn't manage,
because at one point some errors appeared. The "*:*" query works just fine,
but when I try to test another query, the tool complains that the link (UR
>
> Is it possible with Solr ?
>
My understanding says, No.
I am waiting to be surprised.
Cheers
Avlesh
2009/8/12 Fabrice Estiévenart
> Hello,
>
> I need to sort my hits according to a rate of "popularity" which
> dynamically and periodically changes. Thus, I can't store this popularity in
> t
>
> I'd have to add the year because things might end up reserved far ahead, or
> they might be reserved again on the same day next year.
>
If that is the case, then yes, you'll have to take the year into account too.
Wouldn't it be terribly inefficient when I have 10million+ documents in my
> index an
On Tue, Aug 11, 2009 at 6:13 PM, Shalin Shekhar Mangar <
shalinman...@gmail.com> wrote:
> On Tue, Aug 11, 2009 at 7:08 PM, Constantijn Visinescu
> wrote:
>
> >
> >
> > Room1
> > 2000-08-01T00:00:00Z
> > 2000-08-31T23:59:59Z
> >
> >
> > Room2
> > 2000-08-01T00:00:00Z
> > 2000-08-13T23:59:59Z
> >
On Aug 11, 2009, at 5:30 PM, Bill Au wrote:
It looks like things have changed a bit since this subject was last
brought
up here. I see that there is support in Solr/Lucene for indexing
payload
data (DelimitedPayloadTokenFilterFactory and
DelimitedPayloadTokenFilter).
Overriding the Simil
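For reference, a field type wired up with the delimited payload filter might be declared roughly like this in schema.xml (a sketch; the type name is illustrative, and making the payloads influence scoring still requires overriding Similarity and using a payload-aware query):

<fieldtype name="payloads" stored="false" indexed="true" class="solr.TextField">
  <analyzer>
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <!-- tokens of the form "term|1.5" get the part after "|" stored as a float payload -->
    <filter class="solr.DelimitedPayloadTokenFilterFactory" encoder="float"/>
  </analyzer>
</fieldtype>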
Hi,
This seems like a bit of an unconventional suggestion but it just might
work.
I'd have to add the year because things might end up reserved far ahead, or
they might be reserved again on the same day next year.
I do have 2 questions:
1) Wouldn't it be terribly inefficient when I have 10millio
Hi all,
I have a single Solr instance. Now I just want to use Solr multicore.
So for this I just changed solr.xml -
Now I have a solr directory (solr/conf/) and a data directory at the above
given path. When I am using
http://localhost:8080/solr/admin/
it is working fine.
B
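For reference, a multicore solr.xml typically looks roughly like this (a sketch; core names and instanceDir values are hypothetical, and each instanceDir needs its own conf/ directory):

<solr persistent="true" sharedLib="lib">
  <cores adminPath="/admin/cores">
    <core name="core0" instanceDir="core0"/>
    <core name="core1" instanceDir="core1"/>
  </cores>
</solr>

With that in place, the cores are reached at /solr/core0/admin/ and /solr/core1/admin/ rather than /solr/admin/.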
>
> Searches would be for documents (rooms) that don't have certain dates in
> their multi-valued fields for a particular month.
> E.g. if you wanted to find out rooms available on the 15th, 16th and 17th of
> August, the query could be:
> q=!(+reserved_dates_August:15 +reserved_dates_August:16
> +r
Rahul R wrote:
> I tried using a profiling tool - Yourkit. The trial version was free for 15
> days. But I couldn't find anything of significance.
You should try to generate heap dumps and analyze the heap using a tool
like the Eclipse Memory Analyzer. Maybe it helps spotting a group of
objects
Hello,
I need to sort my hits according to a rate of "popularity" which
dynamically and periodically changes. Thus, I can't store this
popularity in the index and I have to get it, from memory, at query-time.
Is it possible with Solr? Thank you.
Fabrice