Ryan,
It turned out that another multiValued field was causing my problem. This field
was no longer configured in my schema.
My dynamic "catch-all" field of type "ignored" was not multiValued; making
that field multiValued solved my problem.
Regards, Rene
-Original Message-
From: Ryan
Hi Ryan,
Thanks for the inputs. These are the steps I followed to solve this
issue.
1. Make a logging properties file, say solrLogging.properties. We can copy the
default logging properties file available in the JAVA_HOME/jre/lib folder.
The default Java logging file will look like the following:
##
Solr 1.3 uses Java logging. Most app containers (Tomcat, Resin, etc.)
give you a way to configure that. Also check:
http://java.sun.com/j2se/1.4.2/docs/guide/util/logging/overview.html#1.8
You can make runtime changes from the /admin/ logging tab. However,
these changes are not persisted wh
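For example, if the container config is off limits, the level can also be dropped programmatically at webapp startup with plain java.util.logging; a minimal sketch (the listener class name and the WARNING level are just illustrative, not something Solr ships):

import java.util.logging.Level;
import java.util.logging.Logger;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

// Hypothetical listener: register it in web.xml so it runs when the webapp starts.
public class SolrLogLevelListener implements ServletContextListener {

    // Keep a strong reference so the logger (and its level) is not garbage collected.
    private static final Logger SOLR_LOGGER = Logger.getLogger("org.apache.solr");

    public void contextInitialized(ServletContextEvent event) {
        // Drop everything below WARNING for all org.apache.solr.* loggers.
        SOLR_LOGGER.setLevel(Level.WARNING);
    }

    public void contextDestroyed(ServletContextEvent event) {
        // nothing to clean up
    }
}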
Hi,
I was trying to do a performance test on Solr web application.
If I run the performance tests, a lot of logging is happening, due to which
I am getting log files in GBs.
Is there any clean way of deactivating logging or changing the log level
to, say, ERROR?
Is there any property f
Derek,
q=+referring:XXX +question:YYY
(of course, you'll have to URL-encode that query string)
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
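If you build the query through SolrJ instead of pasting it into a URL, the client handles the encoding for you; a minimal sketch, with the host and the XXX/YYY values as placeholders:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class ReferringQuestionSearch {
    public static void main(String[] args) throws Exception {
        CommonsHttpSolrServer server =
                new CommonsHttpSolrServer("http://localhost:8983/solr");
        // Both clauses are required; SolrJ URL-encodes the '+' and ':' for us.
        SolrQuery query = new SolrQuery("+referring:XXX +question:YYY");
        QueryResponse response = server.query(query);
        System.out.println("hits: " + response.getResults().getNumFound());
    }
}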
- Original Message
> From: Derek Springer
> To: solr-user@lucene.apache.org
> Sent: Monday, December 15, 2008 3:40:55 PM
Sammy:
http://markmail.org/search/solr+function+query+recip?page=1
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
> From: Sammy Yu
> To: solr-user@lucene.apache.org
> Sent: Monday, December 15, 2008 5:28:17 PM
> Subject: Re: Standard request with
Are the queries being fired wrong/different when you try a full-import?
On Tue, Dec 16, 2008 at 9:57 AM, sbutalia wrote:
>
> I've had a chance to play with this more and noticed the query does run fine,
> but it only updates the records that are already indexed; it doesn't add new
> ones.
>
> The
I've had a chance to play with this more and noticed the query does run fine,
but it only updates the records that are already indexed; it doesn't add new
ones.
The only option that I've found so far is to do a full-import with the
"clean=false" attribute and created_date > last_indexed_date...
I
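In case it helps anyone else, kicking that off is just an HTTP call to the DataImportHandler; a minimal sketch assuming the handler is mounted at /dataimport on localhost:

import java.io.InputStream;
import java.net.URL;

public class TriggerImport {
    public static void main(String[] args) throws Exception {
        // clean=false keeps the existing documents instead of wiping the index first.
        URL url = new URL("http://localhost:8983/solr/dataimport"
                + "?command=full-import&clean=false&commit=true");
        InputStream in = url.openStream();   // issuing the GET starts the import
        in.close();                          // DIH keeps running asynchronously; poll with command=status
    }
}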
Jeff,
Thanks.
It would be nice if you could just review the config syntax and see if all
possible use cases are covered. Is there any scope for improvement?
On Tue, Dec 16, 2008 at 5:45 AM, Jeff Newburn wrote:
> It does appear to be working for us now. The files replicated out
> appropriately which
I do not observe anything wrong.
You can also specify the 'deltaImportQuery' and try it,
something like
On Tue, Dec 16, 2008 at 5:54 AM, sbutalia wrote:
>
> I have a parent entity that grabs a list of records of a certain type from 1
> table... and a sub-entity that queries another table
Hi everybody,
So I have applied Ivan's latest patch to a clean 1.3.
I built it using 'ant compile' and 'ant dist', got the solr build.war
file.
Moved that into the Tomcat directory.
Modified my solrconfig.xml to include the following:
class="org.apache.solr.handler.component.CollapseC
It seems like maybe the fragmenter parameters just don't get displayed with
echoParams=all set. It may only display as far as the request handler's
parameters. The reason I think this is that I tried increasing
hl.fragsize to 1000 and the results were returned correctly (much larger
snippets), s
I have a parent entity that grabs a list of records of a certain type from 1
table... and a sub-entity that queries another table to retrieve the actual
data... for various reasons I cannot join the tables... the 2nd SQL query
converts the rows into XML to be processed by a custom transformer (
Would this mean that, for example, if we wanted to search productId
(long) we'd need to make a field type that had stopwords in it rather
than simply using (long)?
Thanks for your time!
Matthew Runo
Software Engineer, Zappos.com
mr...@zappos.com - 702-943-7833
On Dec 12, 2008, at 11:56 PM,
It does appear to be working for us now. The files replicated out
appropriately which is a huge help. Thanks to all!
-Jeff
On 12/13/08 9:42 AM, "Shalin Shekhar Mangar" wrote:
> Jeff, SOLR-821 has a patch now. It'd be nice to get some feedback if
> you manage to try it out.
On Thu, Dec 11,
Thanks for this tip, it's very helpful. Indeed, it looks like none of the
highlighting parameters are being included. It's using the correct request
handler and hl is set to true, but none of the highlighting parameters from
solrconfig.xml are in the parameter list.
Here is my query:
http://local
Try adding echoParams=all to your query to verify the params that the
solr request handler is getting.
-Yonik
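A minimal SolrJ sketch of setting (and echoing) the highlighting params from the client side, in case the solrconfig.xml defaults are not being picked up; host and field names below are placeholders:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class HighlightQuery {
    public static void main(String[] args) throws Exception {
        CommonsHttpSolrServer server =
                new CommonsHttpSolrServer("http://localhost:8983/solr");
        SolrQuery query = new SolrQuery("iphone");
        query.setHighlight(true);
        query.set("hl.fl", "bodytext");   // field to highlight (placeholder name)
        query.set("hl.snippets", 3);      // ask for up to 3 snippets per field
        query.set("hl.fragsize", 100);
        query.set("echoParams", "all");   // echo every effective parameter in the response header
        QueryResponse rsp = server.query(query);
        System.out.println(rsp.getHighlighting());
    }
}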
On Mon, Dec 15, 2008 at 6:10 PM, Mark Ferguson
wrote:
> Hello,
>
> In my solrconfig.xml file I am setting the attribute hl.snippets to 3. When
> I perform a search, it returns only a sin
You actually don't need to escape most characters inside a character class;
the escaping of the period was unnecessary.
I've tried using the example regex ([-\w ,/\n\"']{20,200}), and I'm _still_
getting lots of highlighted snippets that don't match the regex (starting
with a period, etc.) Has any
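If it helps to sanity-check the pattern outside Solr, the same character class can be compiled with java.util.regex and run against a returned snippet; a small sketch (the snippet text is made up):

import java.util.regex.Pattern;

public class FragmentRegexCheck {
    public static void main(String[] args) {
        // Same pattern as hl.regex.pattern: 20-200 characters from the character class.
        Pattern p = Pattern.compile("[-\\w ,/\\n\"']{20,200}");
        String snippet = ". starts with a period, so the whole snippet cannot match";
        // matches() requires the entire snippet to fit the pattern.
        System.out.println(p.matcher(snippet).matches());   // prints false
    }
}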
Hello,
In my solrconfig.xml file I am setting the attribute hl.snippets to 3. When
I perform a search, it returns only a single snippet for each highlighted
field. However, when I set the hl.snippets field manually as a search
parameter, I get up to 3 highlighted snippets. This is the configuratio
>
> No need to re-index with this change.
> But you will have to re-index any documents that got cut off of course.
>
> -Yonik
>
Ok, thanks...
I hoped to reindex the documents over the existing index (with incremental
updates... while Solr is running) ... and without deleting the index folder
B
On Mon, Dec 15, 2008 at 5:28 PM, Antonio Zippo wrote:
>>
>> Check your solrconfig.xml:
>>
>> 1
>>
>> That's probably the truncating factor. That's the maximum number of terms,
>> not bytes or characters.
>>
> Thanks... I think it could be the problem.
> I tried to count whitespace i
Can you try with the latest nightly build?
That may help pinpoint if it's index file locking contention, or OS
disk cache misses when reading the index. If the time never recovers,
it suggests the former.
-Yonik
On Mon, Dec 15, 2008 at 5:14 PM, Sammy Yu wrote:
> Hi guys,
> I have a typical ma
>
> Check your solrconfig.xml:
>
> 1
>
> That's probably the truncating factor. That's the maximum number of terms,
> not bytes or characters.
>
> Erik
>
Thanks... I think it could be the problem.
I tried to count the whitespace in a single text and it's over 55,000 ... but so
Hey guys,
Thanks for the response, but how would I make recency a factor in
scoring documents with the standard request handler?
The query (title:iphone OR bodytext:iphone OR title:firmware OR
bodytext:firmware) AND _val_:"ord(dateCreated)"^0.1
seems to do something very similar to just sorting b
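For reference, the usual way to make recency a bounded boost rather than an outright sort is to wrap the date ordinal in recip(); a sketch of such a query, with the constants and the ^0.3 weight purely illustrative:

import org.apache.solr.client.solrj.SolrQuery;

public class RecencyBoostedQuery {
    public static void main(String[] args) {
        // recip(rord(dateCreated),1,1000,1000) is close to 1.0 for the newest documents
        // and decays toward 0 for older ones, so text relevance still matters.
        SolrQuery query = new SolrQuery(
                "(title:iphone OR bodytext:iphone OR title:firmware OR bodytext:firmware)"
                + " AND _val_:\"recip(rord(dateCreated),1,1000,1000)\"^0.3");
        System.out.println(query.getQuery());
    }
}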
Hi guys,
I have a typical master/slave setup running with Solr 1.3.0. I did
some basic scalability tests with JMeter, tweaked our environment,
and determined that we can handle approximately 26 simultaneous
threads and get end-to-end response times of under 200ms even with
typically every 5 mi
Check your solrconfig.xml:
1
That's probably the truncating factor. That's the maximum number of
terms, not bytes or characters.
Erik
On Dec 15, 2008, at 5:00 PM, Antonio Zippo wrote:
Hi all,
I have a TextField containing over 400k of text.
When I try to search a w
Hi all,
I have a TextField containing over 400k of text.
When I try to search for a word, Solr doesn't return any result, but if I search
for a single document, I can see that the word exists there.
So I suppose that Solr has a TextField size limit (the field is indexed
using a tokeniz
>>: I need to tokenize my field on whitespaces, html, punctuation, apostrophe
>>
>>: but if I use HTMLStripStandardTokenizerFactory it strips only html
>>: but no apostrophes
> you might consider using one of the HTML Tokenizers, and then use a
> PatternReplaceFilterFactory ... or if you kno
Thanks for the tip, I appreciate it!
However, does anyone know how to articulate the syntax of "(This AND That)
OR (Something AND Else)" into a query string?
i.e. q=referring:### AND question:###
On Mon, Dec 15, 2008 at 12:32 PM, Stephen Weiss wrote:
> I think in this case you would want to ind
I think in this case you would want to index each question with the
possible referrers (by title might be too imprecise; I'd go with
filename or ID) and then do a search like this (assuming in this case
it's by filename)
q=(referring:TomCruise.html) OR (question: Tom AND Cruise)
Which see
Hey all,
I'm having trouble articulating a query and I'm hopeful someone out there
can help me out :)
My situation is this: I am indexing a series of questions that can either be
asked from a main question entry page, or a specific subject page. I have a
field called "referring" which indexes the
Thanks Yonik for the clarification.
Yonik Seeley wrote:
A solr core is like a separate solr server... so create a new
CommonsHttpSolrServer that points at the core.
You probably want to create and reuse a single HttpClient instance for
the best efficiency.
-Yonik
On Mon, Dec 15, 2008 at 11:06
Hi -
In CoreContainer.java :: register, the documentation says it would
return a previous core having the same name if it existed *and
returnPrev == true*.
* @return a previous core having the same name if it existed and
returnPrev==true
*/
public SolrCore register(String name, SolrCor
What do you see in the admin schema browser?
/admin/schema.jsp
When you select the field "names", do you see the property
"Multivalued"?
ryan
On Dec 15, 2008, at 10:55 AM, Schilperoort, René wrote:
Sorry,
Forgot the most important detail.
The document I am adding contains multiple "names
A solr core is like a separate solr server... so create a new
CommonsHttpSolrServer that points at the core.
You probably want to create and reuse a single HttpClient instance for
the best efficiency.
-Yonik
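A minimal sketch of that, with placeholder host and core names, sharing one multi-threaded HttpClient between the per-core servers:

import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.MultiThreadedHttpConnectionManager;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;

public class MultiCoreClients {
    public static void main(String[] args) throws Exception {
        // One HttpClient with a pooled connection manager, reused across both cores.
        HttpClient httpClient =
                new HttpClient(new MultiThreadedHttpConnectionManager());

        // Each core is addressed like a separate Solr server: baseUrl + "/" + coreName.
        CommonsHttpSolrServer core0 =
                new CommonsHttpSolrServer("http://localhost:8983/solr/core0", httpClient);
        CommonsHttpSolrServer core1 =
                new CommonsHttpSolrServer("http://localhost:8983/solr/core1", httpClient);

        core0.ping();   // sanity check that each core answers
        core1.ping();
    }
}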
On Mon, Dec 15, 2008 at 11:06 AM, Kay Kay wrote:
> Hi -
> I am looking at the article
Hi -
I am looking at the article here with a brief introduction to SolrJ:
http://www.ibm.com/developerworks/library/j-solr-update/index.html?ca=dgr-jw17Solr&S_Tact=105AGX59&S_CMP=GRsitejw17#solrj
In case we have multiple SolrCores in the server application - (since
1.3) - how do I specif
Sorry,
Forgot the most important detail.
The document I am adding contains multiple "names" fields:
sInputDocument.addField("names", value);
sInputDocument.addField("names", value);
sInputDocument.addField("names", value);
There is no problem when a document only contains one value in the names f
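For context, this is roughly the full pattern in SolrJ; the Bad Request goes away once the "names" field (or the dynamic field it falls back to) is declared multiValued in schema.xml. The host, id and values below are placeholders:

import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class MultiValuedAdd {
    public static void main(String[] args) throws Exception {
        CommonsHttpSolrServer server =
                new CommonsHttpSolrServer("http://localhost:8983/solr");

        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "doc-1");
        // Repeated addField calls with the same name append values;
        // the schema field must be multiValued="true" or Solr answers 400 Bad Request.
        doc.addField("names", "first value");
        doc.addField("names", "second value");
        doc.addField("names", "third value");

        server.add(doc);
        server.commit();
    }
}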
Hi all,
When adding documents to Solr using SolrJ, I receive the following exception:
org.apache.solr.common.SolrException: Bad Request
The field is "configured" as follows:
Any suggestions?
Regards, Rene
Hi all,
Whilst Solr is a great resource (a big thank you to the developers), it
presents me with a couple of issues.
The need for hierarchical facets is, I would say, a fairly crucial missing
piece, but it has already been pointed out
(http://issues.apache.org/jira/browse/SOLR-64).
The other issu
Have you tried using the
options in schema.xml? After the indexing, take a look at the
fields DIH has generated.
Bye,
L.M.
2008/12/15 jokkmokk :
>
> HI,
>
> I'm desperately trying to get the dataimport handler to work, however it
> seems that it just ignores the field name mapping.
> I
On Dec 15, 2008, at 8:20 AM, Jacob Singh wrote:
Hi Erik,
Sorry I wasn't totally clear. Some responses inline:
If the file is visible from the Solr server, there is no need to
actually
send the bits through HTTP. Solr's content stream capabilities
allow a file
to be retrieved from Solr it
In the solrconfig.xml (scroll all the way to the bottom, and I believe
the example has some commented out)
On Dec 15, 2008, at 5:45 AM, ayyanar wrote:
I'm no QueryParser expert, but I would probably start w/ the default
query parser in Solr (LuceneQParser), and then progress a bit to the
Di
See also http://wiki.apache.org/solr/SolrResources
On Dec 15, 2008, at 2:57 AM, Andre Hagenbruch wrote:
Sajith Vimukthi wrote:
Hi Sajith,
I need some sample code of some examples done using solr. I need to
get an
idea on how I can use solr i
http://lucene.apache.org/solr/tutorial.html
On Dec 15, 2008, at 12:56 AM, Sajith Vimukthi wrote:
Hi all,
Can one of you give me some sample code for a search function done
with Solr
so that I can get an idea of how I can use it?
Regards,
Sajith Vimukthi Weerakoon
Associate Software Eng
Hi Erik,
Sorry I wasn't totally clear. Some responses inline:
> If the file is visible from the Solr server, there is no need to actually
> send the bits through HTTP. Solr's content stream capabilities allow a file
> to be retrieved from Solr itself.
>
Yeah, I know. But in my case not possible
Sorry, I'm using the 1.3.0 release. I've now worked around that issue by
using aliases in the SQL statement so that no mapping is needed. This way it
works perfectly.
best regards
Stefan
Shalin Shekhar Mangar wrote:
>
> Which solr version are you using?
>
Which solr version are you using?
On Mon, Dec 15, 2008 at 6:04 PM, jokkmokk wrote:
>
> HI,
>
> I'm desperately trying to get the dataimport handler to work, however it
> seems that it just ignores the field name mapping.
> I have the fields "body" and "subject" in the database and those are call
Hi,
I'm desperately trying to get the dataimport handler to work; however, it
seems that it just ignores the field name mapping.
I have the fields "body" and "subject" in the database and those are called
"title" and "content" in the solr schema, so I use the following import
config:
Jacob,
Hmmm... seems the wires are still crossed and confusing.
On Dec 15, 2008, at 6:34 AM, Jacob Singh wrote:
This is indeed what I was talking about... It could even be handled
via some type of transient file storage system. This might even be
better to avoid the risks associated with uplo
Hi Erik,
This is indeed what I was talking about... It could even be handled
via some type of transient file storage system. This might even be
better to avoid the risks associated with uploading a huge file across
a network and might (have no idea) be easier to implement.
So I could send the fi
I found the following solution in the forum for using BoostingTermQuery in Solr:
"I ended up subclassing QueryParser and overriding newTermQuery() to create
a BoostingTermQuery instead of a plain ol' TermQuery. Seems to work. "
http://www.nabble.com/RE:-using-BoostingTermQuery-p19651792.html
I ha
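A sketch of what that subclass might look like against the Lucene 2.4-era API (package names and constructor are from memory, so treat this as an outline rather than a drop-in):

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.index.Term;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.payloads.BoostingTermQuery;

// Produces payload-aware BoostingTermQuery instances instead of plain TermQuery.
public class BoostingTermQueryParser extends QueryParser {

    public BoostingTermQueryParser(String defaultField, Analyzer analyzer) {
        super(defaultField, analyzer);
    }

    protected Query newTermQuery(Term term) {
        return new BoostingTermQuery(term);
    }
}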
> I'm no QueryParser expert, but I would probably start w/ the default
> query parser in Solr (LuceneQParser), and then progress a bit to the
> DisMax one. I'd ask specific questions based on what you see there.
> If you get far enough along, you may consider asking for help on the
> java-use
On Dec 15, 2008, at 3:13 AM, Chris Hostetter wrote:
: If I can find the bandwidth, I'd like to make something which allows
: file uploads via the XMLUpdateHandler as well... Do you have any
ideas
the XmlUpdateRequestHandler already supports file uploads ... all
request
handlers do using
Just my thoughts on the matter:
the designer of the runner-up logo and the 3rd-place logo is also
responsible for 5 other logos that made it onto the list. They are
basically different versions of the same concept. If you add up the
scores for logos 2, 3, 6, 8, 11, 20 and 23 you will see a sc
: ohk.. that means I can't use colon in the fieldname ever in such a scenario
: ?
In most internals, the lucene/solr code base allows *any* character in the
field name, so you *can* use colons in field names, but many of the
"surface" features (like the query parser) treat colon's as special
c
: If I can find the bandwidth, I'd like to make something which allows
: file uploads via the XMLUpdateHandler as well... Do you have any ideas
the XmlUpdateRequestHandler already supports file uploads ... all request
handlers do using the ContentStream abstraction...
http://wiki.apache
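As a concrete illustration, one way to hand a handler a file that is already visible to the Solr server is the stream.file parameter (remote streaming has to be enabled on the requestParsers element in solrconfig.xml); a minimal sketch with a made-up path:

import java.io.InputStream;
import java.net.URL;
import java.net.URLEncoder;

public class StreamFileUpdate {
    public static void main(String[] args) throws Exception {
        // Assumes enableRemoteStreaming="true" in solrconfig.xml.
        String path = URLEncoder.encode("/data/exports/docs.xml", "UTF-8");
        URL url = new URL("http://localhost:8983/solr/update"
                + "?stream.file=" + path + "&commit=true");
        InputStream in = url.openStream();   // Solr reads the file locally, not over this connection
        in.close();
    }
}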