Hi
I have a distributed messaging solution where I need to distinguish
between adding a document and just
trying to update it.
Scenario:
1. message sent for document to be updated
2. meanwhile another message is sent for document to be deleted and is
executed before 1
As a result when 1 comes instead o
Also related: the possibility of updating a document without supplying
all required fields,
but just the uniqueKey and some other data.
Julian Davchev wrote:
> Hi
> I have a distributed messaging solution where I need to distinguish
> between adding a document and just
> trying to update it.
>
> Scenario:
>
Yes, please report this to the Tika project.
Erik
On Mar 31, 2010, at 9:31 PM, Ross wrote:
Does anyone have any thoughts or suggestions on this? I guess it's
really a Tika problem. Should I try to report it to the Tika project?
I wonder if someone could try it to see if it's a genera
Hello community,
I was hunting a ghost bug for the last few days. As some of you might have
noticed, I have written some postings because of unexpected
dismax-handler behaviour and some other problems. However, there was no
error in my code nor in my schema.xml.
It seems that the resource-
On Thu, Apr 1, 2010 at 7:06 AM, MitchK wrote:
>
> It seems that the resource-loader has a little bug. The first line
> of a file you want to load with the getLine() method of ResourceLoader [1]
> has to be commented out with "#". If not, the first line seems to be
> ignored or
> somethi
I used Notepad++ to create the file, and yes, you might be right. I will test
whether that was the problem.
If yes, do you know whether scripting languages like PHP or JavaScript also
set a BOM when they create a UTF-8-encoded file/text?
It's probably worth making a note of this behaviour somewhere in the
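Notepad++ (and Windows Notepad) can write UTF-8 with a byte-order mark, the three bytes EF BB BF, while PHP's and JavaScript's usual file-writing APIs typically do not add one by themselves. A minimal check for the mark, using a hypothetical temp file standing in for the resource file in question:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class BomCheck {
    // The UTF-8 byte-order mark is the three bytes EF BB BF.
    public static final byte[] BOM = { (byte) 0xEF, (byte) 0xBB, (byte) 0xBF };

    public static boolean hasUtf8Bom(File f) throws IOException {
        try (InputStream in = new FileInputStream(f)) {
            byte[] head = new byte[3];
            return in.read(head) == 3
                    && head[0] == BOM[0] && head[1] == BOM[1] && head[2] == BOM[2];
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical stand-in for the stopwords/synonyms file in question.
        File f = File.createTempFile("bom", ".txt");
        try (FileOutputStream out = new FileOutputStream(f)) {
            out.write(BOM);                           // what a BOM-writing editor does
            out.write("stopword1\n".getBytes("UTF-8"));
        }
        System.out.println(hasUtf8Bom(f)); // true
        f.delete();
    }
}
```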
Hi Chris,
Actually I need time up to seconds granularity, so did you mean I should index
the field after converting it into seconds?
Ankit
-----Original Message-----
From: Chris Hostetter [mailto:hossman_luc...@fucit.org]
Sent: Wednesday, March 31, 2010 10:05 PM
To: solr-user@lucene.apache.org
Sub
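If the goal is a date field with seconds granularity, one option is to format the timestamp into Solr's ISO-8601 date form truncated at seconds before indexing. A sketch of just the formatting step (everything around it is assumed):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class SolrDate {
    // Solr date fields expect ISO-8601 in UTC; a pattern that stops at
    // seconds drops the millisecond part, giving seconds granularity.
    public static String toSolrSeconds(Date d) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.format(d);
    }

    public static void main(String[] args) {
        System.out.println(toSolrSeconds(new Date(0L))); // 1970-01-01T00:00:00Z
    }
}
```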
hello.
i don't really understand much of this conversation :D
but i think you can help me. i have an idea for my suggestions.
does it make sense to group my suggestions with patch-236?
i tested it, and it didn't work completely well =(
my problem is that i have too many product names with too-long nam
I did something similar.
The only difference with my setup is that I have two columns: one that
stores the date the document was first created and a second that stores the
date it was last updated, as unix timestamps.
So my query to find documents that are older than 4 hours would be very easy.
T
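A sketch of how such a cutoff query could be built, assuming a hypothetical last_updated field holding unix seconds (the field name and layout are my own, not from the thread):

```java
public class StaleDocsQuery {
    // Hypothetical field "last_updated", indexed as unix seconds.
    public static String olderThan(long nowSeconds, long maxAgeSeconds) {
        long cutoff = nowSeconds - maxAgeSeconds;
        // Everything last updated at or before the cutoff counts as stale.
        return "last_updated:[* TO " + cutoff + "]";
    }

    public static void main(String[] args) {
        long now = System.currentTimeMillis() / 1000L; // unix seconds
        System.out.println(olderThan(now, 4 * 3600L)); // docs older than 4 hours
    }
}
```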
One of the most requested features in Lucene/SOLR is to be able
to update only selected fields rather than the whole document. But
that's not how it works at present. An update is really a delete and
an add.
So for your second message, you can't do a partial update, you must
"update" the whole doc
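In practice that means an update message must resend every field, not just the changed ones. A minimal sketch of such an add message, with hypothetical field names:

```xml
<add>
  <doc>
    <!-- hypothetical schema: "id" is the uniqueKey field -->
    <field name="id">doc-42</field>
    <!-- every stored field must be resent, even the unchanged ones -->
    <field name="title">New title</field>
    <field name="body">Unchanged body text, resent anyway</field>
  </doc>
</add>
```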
Don't do that. For many reasons. By trying to batch so many docs
together, you're just *asking* for trouble. Quite apart from whether it'll
work once, having *any* HTTP-based protocol work reliably with 13G is
fragile...
For instance, I don't even want to have to know whether the XML parsing in
SOLR p
Hi,
I am trying to create search functionality the same as that of the
Lucidimagination search.
As of now I have formed the facet query as below:
http://localhost:8080/solr/db/select?q=*:*&fq={!tag=3DotHierarchyFacet}3DotHierarchyFacet:ABC&facet=on&facet.field={!ex=3DotHierarchyFacet}3DotHierarchyFace
: q=last_update_date:[NOW-7DAYS TO NOW-4HOURS]
:
: does not return a correct recordset. I would expect to get all documents
: with last_update_date in the specified range. Instead solr returns all
: documents that exist in the index which is not what I would expect.
: Last_update_date is SolrDate
: Yes, please report this to the Tika project.
except that when i run "tika-app-0.6.jar" on a text file like the one Ross
describes, i don't get the error he describes, which means it may be
something off in how Solr is using Tika.
Ross: I can't reproduce this error on the trunk using the exam
Hi Chris, thanks for looking at this.
I'm using Solr 1.4.0 including the Tika that's in the tgz file which
means Tika 0.4.
I've now discovered that only two letters are required. A single line
with XE will crash it.
This fails:
r...@gamma:/home/ross# hexdump -C test.txt
00000000  58 45 0a                                          |XE.|
00000003
: Subject: add/update document as distinct operations? Is it possible?
: References:
:
: In-Reply-To:
:
http://people.apache.org/~hossman/#threadhijack
Thread Hijacking on Mailing Lists
When starting a new discussion on a mailing list, please do not reply to
an existing message, inste
Hi,
We have an application which uses SolrSharp to get the details from SOLR.
Currently, since we are in the testing stage, we would like to know what query
is being passed to SOLR from our application without debugging the application
each time.
Is there a way to view the queries passed to SOLR on
The error might be that your http client doesn't handle really large
files (32-bit overflow in the Content-Length header?) or something in
your network is killing your long-lived socket? Solr can definitely
accept a 13GB xml document.
I've uploaded large files into Solr successfully, including re
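As a rough illustration of the 32-bit overflow guess (just the arithmetic, nothing from Solr itself):

```java
public class ContentLengthOverflow {
    public static void main(String[] args) {
        long thirteenGb = 13L * 1024 * 1024 * 1024; // 13,958,643,712 bytes
        // A 32-bit signed int tops out at 2,147,483,647, so any client that
        // stores Content-Length in an int cannot represent a 13GB body.
        System.out.println(thirteenGb > Integer.MAX_VALUE); // true
    }
}
```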
The default jetty.xml sets up a request logger that logs to
"logs/yyyy_mm_dd.request.log" relative to the directory jetty is
started from. Look for NCSARequestLog in your jetty.xml. If SolrSharp
uses GETs (not POSTs) you can look at the urls in the log and
pull out the "q" and "fq" parameters wh
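For reference, the NCSARequestLog stanza in the Jetty 6 jetty.xml that ships with the Solr 1.4 example looks roughly like this; the exact arguments vary between Jetty versions, so treat it as a sketch:

```xml
<Ref id="RequestLog">
  <Set name="requestLog">
    <New id="RequestLogImpl" class="org.mortbay.jetty.NCSARequestLog">
      <Arg>logs/yyyy_mm_dd.request.log</Arg>
      <Set name="retainDays">90</Set>
      <Set name="append">true</Set>
      <Set name="extended">false</Set>
    </New>
  </Set>
</Ref>
```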
: select/?q=video&qt=dismax&qf=titleMain^2.0+titleShort^5.3&debugQuery=on
...
:
: +(titleMain:video^2.0)~0.01 (titleMain:video^2.0)~0.01
:
...
: My solrconfig for the dismax handler:
..what about schema.xml? ... what do the field (and corresponding
fieldtype) for titleShort loo
Hi,
I can see all GET requests properly in SOLR but couldn't find any POST
requests issued from SolrSharp.
If I issue a search directly in SOLR (not from the application) I can see
logs like the one below,
127.0.0.1 - - [02/04/2010:03:33:23 +0000] "GET /solr/db/select?q=test
But when the search happens
Hi Eric, Shawn,
Thank you for your reply.
Luckily, on just the second attempt my 13GB SOLR XML (more than a million
docs) went into SOLR without any problem, and I uploaded another two sets of
1.2 million+ docs without any hassle.
I will try more, smaller XMLs next time
Are function queries possible using the MLT request handler? How about using
the _val_ hack? Thanks for your help
--
View this message in context:
http://n3.nabble.com/MoreLikeThis-function-queries-tp692377p692377.html
Sent from the Solr - User mailing list archive at Nabble.com.