have you given the url like
&qf=field1^100+field2^200
i also first tried something like &qf="field1^100 field2^200", which never
worked; when i used the + sign it works.
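The `+` works because it is the URL encoding of a space: qf expects space-separated `field^boost` pairs, and quote characters are passed through to Solr literally. A small sketch of building such a URL safely (host and field names are illustrative):

```python
from urllib.parse import urlencode

# qf takes space-separated "field^boost" pairs; urlencode turns the
# space into '+' and the '^' into %5E, which Solr decodes back.
params = {"q": "ipod", "defType": "dismax", "qf": "field1^100 field2^200"}
query_string = urlencode(params)
# query_string contains: qf=field1%5E100+field2%5E200
```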
markus
> -Original Message-
> From: Mark Fletcher [mailto:mark.fletcher2...@gmail.com]
> Sent: Sat
yes i am using solr.xml, although there is only one core defined
at the moment. so reloading is only possible with a core-setup, right?
> -Original Message-
> From: Ahmet Arslan [mailto:iori...@yahoo.com]
> Sent: Saturday, 10 April 2010 15:55
> To: solr-user@lucene.apache.org
>
Hi everyone,
Some of you may have an idea about the best solution to achieve this simple
(in theory at least) goal:
I want a B2C website that uses Solr as its search engine service, but:
- it shouldn't expose Solr explicitly (I mean no clear URL pointing
to the Solr instance)
- keep cont
Hi,
I do not have an answer to your questions.
But, I have the same issue/problem you have.
It would be good if the Solr community would agree on and share an approach
for benchmarking Solr. Indeed, it would be good to have a benchmark for
"information retrieval" systems. AFAIK there isn't one. :-/
T
Hi,
You can use Siege [1] in a similar manner to AB; it supports newline-separated
URL files and can pick random URLs.
[1]:http://freshmeat.net/projects/siege/
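In miniature, Siege's random-URL mode amounts to reading a newline-separated file of URLs and firing requests at randomly chosen entries; a minimal sketch (URLs are illustrative):

```python
import random

# A newline-separated URL list, as Siege would read from a file.
urls_text = """\
http://localhost:8983/solr/select?q=ipod
http://localhost:8983/solr/select?q=video
"""
urls = urls_text.splitlines()

def next_url(rng=random.Random()):
    # Pick a random URL per request, spreading load across queries.
    return rng.choice(urls)
```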
Cheers,
On Saturday 10 April 2010 03:46:56 Blargy wrote:
> I am about to deploy Solr into our production environment and I woul
On 04/11/2010 10:12 PM, Blargy wrote:
Mark,
Cool. I didn't think that was the expected behavior. Will you guys at Lucid
be rolling this patch into your 1.4 distribution?
I don't know the release plans, but I'm sure this patch would be
included in the next release.
As per your 1.5 comme
I've got a very simple perl script (most of the work is done with
modules) that I wrote which forks off multiple processes and throws
requests at Solr, then gives a little bit of statistical analysis at the
end. I have planned on sharing it from the beginning, I just have to
clean it up for pu
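A rough Python analogue of that forking approach, assuming a hypothetical `fetch` stand-in for the real HTTP request: several workers hammer the search endpoint concurrently and simple timing statistics come out at the end.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(query):
    # Stand-in for a real request, e.g.:
    # urllib.request.urlopen(f"http://localhost:8983/solr/select?q={query}")
    start = time.time()
    return time.time() - start

def run(queries, workers=4):
    # Issue all queries across a pool of workers and collect latencies.
    with ThreadPoolExecutor(workers) as pool:
        times = list(pool.map(fetch, queries))
    return {"n": len(times), "avg": sum(times) / len(times)}
```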
Hi All,
I'd like to know if anyone else is experiencing the same problem we are
facing
basically, we are running a query with field collapsing (Solr 1.4 with patch
236). The response tells us that there are about 2700 documents matching our
query.
However, I cannot get past the 431st do
I don't think the behavior is correct. The first example, with just one gap,
does NOT match. The second example, with an extra second gap, DOES match. It
seems that the term collapsing ("eighteenth-century" --> "eighteenthcentury")
somehow throws off the position sequence, forcing you to add
MitchK, I need a range filter, not a token filter. I'm trying to
filter on ranges. If you have code, I'd love to see it.
Israel, I've been using the JTeam spatial plugin as a model for my
work. I initially tried to use the plugin, but there were some serious
bugs in the code. I was able to make so
Lukas Kahwe Smith:
> On 07.04.2010, at 14:24, Lukas Kahwe Smith wrote:
> > For Solr the idea is also just copy the index files into a new directory
> > and then use http://wiki.apache.org/solr/CoreAdmin#RELOAD after updating
> > the config file (I assume its not possible to hot swap like with MySQL
> yes i am using solr.xml, although
> there is only one core defined
> at the moment. so reloading is only possible with a
> core-setup, right?
Yes.
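The CoreAdmin RELOAD call referred to above can be built like this; the core name "core0" and host/port are assumptions, so use whatever name your solr.xml declares:

```python
# Hedged sketch of the CoreAdmin RELOAD request; with a single core
# defined in solr.xml, reload it by the name given there.
core_name = "core0"
reload_url = (
    "http://localhost:8983/solr/admin/cores"
    f"?action=RELOAD&core={core_name}"
)
# real version: urllib.request.urlopen(reload_url)
```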
I have been using JMeter to perform some load testing. In your case you might
like to take a look at
http://jakarta.apache.org/jmeter/usermanual/component_reference.html#CSV_Data_Set_Config
. This will allow you to use a random item from your query list.
Regards,
Kallin Nagelberg
-Original
Hi,
could I interest you in this project?
http://github.com/thkoch2001/lucehbase
The aim is to store the index directly in HBase, a database system modelled
after Google's Bigtable, designed to store data on the scale of terabytes or petabytes.
Best regards, Thomas Koch
Lance Norskog:
> The 2B limitation i
Sorry I was backwards with my response, but the behavior is definitely
correct here.
On Mon, Apr 12, 2010 at 9:46 AM, Demian Katz wrote:
> I don't think the behavior is correct. The first example, with just one
> gap, does NOT match. The second example, with an extra second gap, DOES
> match.
Paolo Castagna wrote:
I do not have an answer to your questions.
But, I have the same issue/problem you have.
Some related threads:
- http://markmail.org/message/pns4dtfvt54mu3vs
- http://markmail.org/message/7on6lvabsosvj7bc
- http://markmail.org/message/ftz7tkd7ekhnk4bc
- http://markmail
Shawn Heisey wrote:
Anyone got a recommendation about where to put it on the wiki?
There are already two related pages:
- http://wiki.apache.org/solr/SolrPerformanceFactors
- http://wiki.apache.org/solr/SolrPerformanceData
Why not create a new page?
- http://wiki.apache.org/solr/Benchm
Giovanni Fernandez-Kincade wrote:
Is there any way to configure autocommit to expungeDeletes? Looking at the code,
it seems that there isn't...
Hi Gio.,
I don't think Solr supports it. Please open an issue.
Koji
--
http://www.rondhuit.com/en/
On 4/12/2010 8:51 AM, Paolo Castagna wrote:
There are already two related pages:
- http://wiki.apache.org/solr/SolrPerformanceFactors
- http://wiki.apache.org/solr/SolrPerformanceData
Why not create a new page?
- http://wiki.apache.org/solr/BenchmarkingSolr (?)
Done. I hope you like i
I have some delimited data that I would like to import but am having issues
getting the regex patterns to work properly with Solr. The following is just
one example of the issues I have experienced.
The regex required for this example should be very simple (delimited data).
I have some regex pat
Hi,
I am running Solr 1.2 (I will be updating in due course).
I am having a few issues with snapshots after a postCommit or
postOptimize; neither appears to work. In my solrconfig.xml I have the
following
-
But a snapshot never gets taken. It's most likely something that I haven't
spo
I am trying to boost relevancy based on a date field with dismax, and
I've included the requestHandler config below. The post_date field in
my database is simple UNIX time, seconds since epoch. It's in a MySQL
bigint field, so I've stored it as a tlong in Solr. This field is
required by our
The lines you have enclosed are commented out by the
Bill
On Mon, Apr 12, 2010 at 1:32 PM, william pink wrote:
> Hi,
>
> I am running Solr 1.2 ( I will be updating in due course)
>
> I am having a few issues with doing the snapshots after a postCommit or
> postOptimize neither appear to work i
On 4/12/2010 11:55 AM, Shawn Heisey wrote:
[NOW-6MONTHS TO NOW]^5.0 ,
[NOW-1YEARS TO NOW-6MONTHS]^3.0
[NOW-2YEARS TO NOW-1YEARS]^2.0
[* TO NOW-2YEARS]^1.0
And here we have the perfect example of something I mentioned a while
ago - my Thunderbird (v3.0.4 on Win7) turning Solr boost syntax into
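For reference, the boost-query ranges quoted above (before the mail client mangled them) would normally appear as dismax bq clauses; the field name post_date is taken from the earlier message, and this is a sketch rather than the exact config:

```python
# Each clause boosts documents whose post_date falls in the range;
# newer ranges get higher boosts.
bq_clauses = [
    "post_date:[NOW-6MONTHS TO NOW]^5.0",
    "post_date:[NOW-1YEARS TO NOW-6MONTHS]^3.0",
    "post_date:[NOW-2YEARS TO NOW-1YEARS]^2.0",
    "post_date:[* TO NOW-2YEARS]^1.0",
]
bq_param = "&".join("bq=" + c for c in bq_clauses)
```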
--- On Mon, 4/12/10, Ahmet Arslan wrote:
> From: Ahmet Arslan
> Subject: Re: AW: refreshing synonyms.txt - or other configs
> To: solr-user@lucene.apache.org
> Date: Monday, April 12, 2010, 5:08 PM
>
> > yes i am using solr.xml, although
> > there is only one core defined
> > at the moment. s
On 04/12/2010 02:16 PM, Ahmet Arslan wrote:
--- On Mon, 4/12/10, Ahmet Arslan wrote:
From: Ahmet Arslan
Subject: Re: AW: refreshing synonyms.txt - or other configs
To: solr-user@lucene.apache.org
Date: Monday, April 12, 2010, 5:08 PM
yes i am using solr.xml, although
there is only
forgot to mention that I DID use replaceWith="$1" in tests where the pattern
was like "(.*)(\|.*)" in order to only get the first group
> You could make your own little plugin RequestHandler that
> did the reload
> though. The RQ could get the CoreContainer from the
> SolrCore retrieved
> from the request, and then call reload on the "" core.
>
Awesome, this piece of code reloaded schema.xml, stopwords.txt etc. Thanx!
public c
Just a correction to what David says... text_rev is for allowing
wildcard at the beginning of a term, not the end. Wildcards at the
end work on standard text field types.
Erik
On Apr 9, 2010, at 3:08 PM, Smiley, David W. wrote:
If the user query is not going to have wildcards then
Hi,
- I'm a bit confused about how analyzers apply filters to a query. I know they
are applied in the order in which they are declared, but still: does the
search result
include only the final output of the filter chain, or does
Solr add the matched terms to the result set at each filter step?
- Does Dismax reque
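As a conceptual sketch of the first question: each filter transforms the whole token stream in turn, and only the final stream is matched against the index; intermediate outputs are not added to the result set. (Filter names below are toy stand-ins, not Solr classes.)

```python
STOPWORDS = {"the"}

def lowercase(tokens):
    return [t.lower() for t in tokens]

def remove_stop(tokens):
    return [t for t in tokens if t not in STOPWORDS]

# Filters run in declared order; only the final stream survives.
tokens = ["The", "Quick", "Fox"]
for f in (lowercase, remove_stop):
    tokens = f(tokens)
# tokens is now ["quick", "fox"]
```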
I'm using HAProxy with 5 second healthcheck intervals and haven't seen
any problems on Solr 1.4.
My HAProxy config looks like this:
listen solr :5083
option httpchk GET /solr/parts/admin/ping HTTP/1.1\r\nHost:\ www
server solr01 192.168.0.101:9983 check inter 5000
server solr02 192.168.0.10
Talking from general regex-ness, you might be hitting the "greedy match"
issue. That is, .* matches everything. Have you tried ".*?"
Warning: this may be totally off base
HTH
Erick
On Mon, Apr 12, 2010 at 5:22 PM, Gerald wrote:
>
> forgot to mention that I DID use replaceWith="$1" in tests
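Erick's greedy-match point in miniature: greedy `.*` swallows everything up to the last pipe, while lazy `.*?` stops at the first one.

```python
import re

s = "abc|xyz|123"
# Greedy: ".*" consumes as much as possible before the final "|".
greedy = re.match(r"(.*)\|", s).group(1)   # "abc|xyz"
# Lazy: ".*?" consumes as little as possible, stopping at the first "|".
lazy = re.match(r"(.*?)\|", s).group(1)    # "abc"
```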
Tim Underwood wrote:
>
> Have you tried hitting /admin/ping (which handles checking for the
> existence of your health file) instead of
> /admin/file?file=healthcheck.txt?
>
Ok, this is what I was looking for. I was wondering if the way I was doing it
was the preferred way or not.
I didn't even
Is this at the broker or the shard? If it is at the shard, does the
log record the full stack trace of the parsing exception?
Do simpler function queries work?
On Sun, Apr 11, 2010 at 5:57 PM, Shawn Heisey wrote:
> Adding it to the main core looks like it works, without the dismax handler
> even
You can set a default value for a field in the schema.xml. I don't
know if an empty string will work, or if you have to create a special
value that means 'null'.
On Sun, Apr 11, 2010 at 11:07 PM, bbarani wrote:
>
> Hi,
>
> I have some dynamic fields something as key value pair in my SOLR data
> c
So, you have a user-visible field that is in 'seconds since epoch'.
You would like to index this, and the Solr date features are handy.
And, you do not have an application prepping the data to index, nor do
you have a UI that can receive Solr dates and turn them back into
'seconds since epoch'.
Du
([^|]*)\|([^|]*)--- abc|xyz|123 => abc / xyz
non-pipe characters (0 or more), then a pipe, then non-pipe characters (0 or
more), with the third segment left unconsumed.
Does the DIH debugger console help see the strings?
On Mon, Apr 12, 2010 at 6:02 PM, Erick Erickson wrote:
> Talking from general regex-ness, you might be h
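Lance's pattern applied to his own example: the first group captures the text before the first pipe, the second group the text between the first and second pipes, and the rest of the string is simply never matched.

```python
import re

# ([^|]*)\|([^|]*) against "abc|xyz|123" yields "abc" and "xyz";
# the trailing "|123" falls outside the match.
m = re.match(r"([^|]*)\|([^|]*)", "abc|xyz|123")
first, second = m.group(1), m.group(2)
```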