Does an alternative to waitFlush exist?
In my setup this command is very useful for my NRT setup. Is nobody here with the
same problem?
--
View this message in context:
http://lucene.472066.n3.nabble.com/Alternative-to-waitFlush-in-Solr4-0-tp3991489.html
Sent from the Solr - User mailing list archive at Nabble.com.
Not sure if this is the right forum to post this question. If not, please
excuse me.
I'm trying to use the DataImportHandler with
processor="CachedSqlEntityProcessor" to speed up import from an RDBMS. While
processor="CachedSqlEntityProcessor" is much faster than
processor="SqlEntityProcessor", the
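For reference, a cached child-entity setup in DIH typically looks something like the sketch below (the datasource, table, and column names here are invented for illustration; older DIH versions use the `where="parent_id=parent.id"` form instead of cacheKey/cacheLookup):

```xml
<dataConfig>
  <dataSource driver="org.hsqldb.jdbcDriver" url="jdbc:hsqldb:file:/data/db" user="sa"/>
  <document>
    <entity name="parent" query="SELECT id, name FROM parent">
      <!-- The child query is run once and cached; rows are then looked
           up in memory by parent_id instead of issuing one SQL query
           per parent row. -->
      <entity name="child"
              processor="CachedSqlEntityProcessor"
              query="SELECT parent_id, value FROM child"
              cacheKey="parent_id"
              cacheLookup="parent.id"/>
    </entity>
  </document>
</dataConfig>
```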
It _sounds_ like you changed your schema.xml file in the process.
Essentially, any time
you change your schema, re-indexing everything is called for.
FWIW,
Erick
On Tue, Jun 26, 2012 at 3:51 PM, zsy715 wrote:
> I think I have fixed the problem by just importing data again. Finally~~
>
> --
> Vie
We are using the edismax query parser with mm=100%. However, when a CJK
query (ABC) gets tokenized by the CJKBigramFilter into [AB] [BC], instead of
a Boolean AND of [AB] AND [BC], which is what we expect with mm=100%, it
gets searched as a Boolean "OR" query.
For example searching for "Daya
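For context, the stock text_cjk analysis chain shipped with the Solr example schema looks roughly like this (the poster's actual field type may differ):

```xml
<fieldType name="text_cjk" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <!-- Normalize half/full-width forms, lowercase, then emit
         overlapping bigrams for CJK runs: ABC -> AB, BC -->
    <filter class="solr.CJKWidthFilterFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <filter class="solr.CJKBigramFilterFactory"/>
  </analyzer>
</fieldType>
```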
Actually, it works when I type model:* but not for model:F100 (F100 is a model
number).
--
View this message in context:
http://lucene.472066.n3.nabble.com/solr-querying-doesn-t-return-result-tp3991424p3991426.html
Sent from the Solr - User mailing list archive at Nabble.com.
I think I have fixed the problem by just importing data again. Finally~~
--
View this message in context:
http://lucene.472066.n3.nabble.com/solr-querying-doesn-t-return-result-tp3991424p3991433.html
Sent from the Solr - User mailing list archive at Nabble.com.
Regarding the large number of files, even after optimize, we found that
when rebuilding a large, experimental 1.7TB index on Solr 3.5, instead of
Solr 1.4.1, there were a ton of index files, thousands, in 3.5, when there
used to be just 10 (or 11?) segments worth (as expected with mergeFactor
set t
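For reference, in Solr 3.x these knobs live under indexDefaults in solrconfig.xml. The default TieredMergePolicy caps ordinary (non-optimize) merges at maxMergedSegmentMB, and with compound files disabled each segment is spread over roughly ten separate files, which multiplies the file count. A sketch (values shown are the usual defaults, not a recommendation):

```xml
<indexDefaults>
  <!-- Set to true to pack each segment into a single .cfs file -->
  <useCompoundFile>false</useCompoundFile>
  <mergePolicy class="org.apache.lucene.index.TieredMergePolicy">
    <int name="maxMergeAtOnce">10</int>
    <int name="segmentsPerTier">10</int>
    <!-- Normal merges will not produce segments larger than this -->
    <double name="maxMergedSegmentMB">5120.0</double>
  </mergePolicy>
</indexDefaults>
```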
Hey guys,
I am new to Solr. Recently, I configured Solr to connect to an HSQL
database. I have a table ac_model which has two columns: one is id_ac_model
and the other is ac_model. I run example-DIH and it works fine when I type
*:* and acid:* (or a number), but it doesn't work when I type mode
Well, you'd have to understand the whole way the index structure is laid
out to do binary editing, and I don't know it well enough to even offer
a rough idea. There are detailed docs hanging around _somewhere_ that
will give you the formats, or you could go at the code. But that's probably
pretty h
So, I tried 'optimize', but it failed because of lack of space on the first
machine. I then moved the whole thing to a different machine where the index
was pretty much the only thing and was using about 37% of disk, but it still
failed because of a "No space left on device" IOException. Also, the
Hi Dmitry,
I have tried what you suggest, but this will filter out any possible
'continuation' of the search query.
Let's make an example. Let's suppose we are looking for 'red rachel zoe'.
The user will type something like:
"red r"
Given that we have an autocomplete_facet facet where we copy name,
Hi,
Could you preserve in the facet.prefix what the user has typed so far? Or
is this breaking your requirement?
/Dmitry
On Tue, Jun 26, 2012 at 5:08 PM, Ugo Matrangolo wrote:
> Hi,
>
> We are using SOLR to build a simple search engine on our e-commerce site.
> We also implemented an autocomple
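For the archives, Dmitry's suggestion amounts to a request along these lines (the field name is taken from the thread; the rest is illustrative). facet.prefix matches the user's input against the indexed terms of the facet field:

```
q=*:*&rows=0
&facet=true
&facet.field=autocomplete_facet
&facet.prefix=red r
```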
In my older version of Solr this was possible, but it seems not to be possible
in this new one. =(
--
View this message in context:
http://lucene.472066.n3.nabble.com/FileNotFoundException-during-commit-concurrences-process-tp3991384p3991388.html
Sent from the Solr - User mailing list archive at Nabble.com.
Hello again.
This is my exception, with SolrVersion 4.0.0.2012.04.26.09.00.41:
SEVERE: Exception while solr commit.
java.io.FileNotFoundException: _8l.cfs
at org.apache.lucene.store.FSDirectory.fileLength(FSDirectory.java:266)
at org.apache.lucene.index.SegmentInfo.sizeInBytes(Seg
Hi,
We are using SOLR to build a simple search engine on our e-commerce site.
We also implemented an autocompletion feature using faceting following
exactly what is described in the book 'Apache SOLR 3 Enterprise Search
Server' (page 221).
What we do is that we fill an autocomplete_facet with our
see: http://solr.pl/en/2011/07/18/deep-paging-problem/
Best
Erick
On Tue, Jun 26, 2012 at 6:37 AM, Alok Bhandari
wrote:
> Hello Erick,
>
> thanks for the prompt reply you are giving. I have tried the options
> suggested by you but no luck for me this time.
>
> I am facing following issues
>
> 1)
You need to spend quite a bit of time on the admin/analysis page, examining
what different analysis chains do. This is one of the tricky parts of
understanding Solr.
Jack already mentioned n-grams, that's a good place to start.
If your part numbers are always at the beginning, you can just
use wi
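A minimal sketch of the n-gram approach Jack mentioned, using edge n-grams at index time only (the field name and gram sizes are illustrative). With this, 'ALT1' is indexed as a prefix gram of 'ALT15SBW' and matches directly:

```xml
<fieldType name="text_partno" class="solr.TextField" positionIncrementGap="100">
  <analyzer type="index">
    <tokenizer class="solr.KeywordTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <!-- ALT15SBW -> al, alt, alt1, alt15, ... (up to maxGramSize) -->
    <filter class="solr.EdgeNGramFilterFactory" minGramSize="2" maxGramSize="10"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.KeywordTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
</fieldType>
```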
I think I may have identified a bug with FVH. So I have two questions:
1) Does anyone know how to make FVH return a highlighted snippet when the
query matches all of one string in a multivalued field?
2) If not, does anyone know how to make DIH concatenate all the values in a
multivalued field int
Hi,
I'm indexing about 200,000 files (average size of 1 MB) with the Tika
processor. At some point Solr started hanging. The log is only reporting:
INFO: [] webapp=/solr path=/replication
params={command=indexversion&wt=javabin} status=0 QTime=0
Jun 26, 2012 2:34:00 PM org.apache.solr.core
No way to connect to:
http://wiki.apache.org/
wiki.apache.org is offline.
No ETA on when the wiki will be back, we are working on it right now
--
View this message in context:
http://lucene.472066.n3.nabble.com/solr-wiki-is-down-tp3991339.html
Sent from the Solr - User mailing list archive at Nabble.com.
Hi,
We have about 500 top queries for each core in our test cluster. They are
routed to the default request handler and are not distributed; faceting is
enabled, the most common filters are present, and rows is not specified, so
the default of 10 applies. After server start-up we only see a few items being
Hi, thanks for the suggestion. But my requirement is to search with the part
number, say 'ALT1', instead of the whole 'ALT15SBW'. If I do so, the search
result is empty. I have changed the schema.xml file for fieldtype
"wc_txt", "solr.TextField", passing
Still the search result is empty for the above condition.
Try using expand=true at both index and query time. Then all terms will
be there in all queries.
Upayavira
On Tue, Jun 26, 2012, at 01:42 AM, flyingeagle-de wrote:
> Hello,
>
> I've a problem using the synonyms.txt and a default "and" in my search.
>
> Using a lot of fields in my query I want t
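Concretely, expand=true is set on the synonym filter inside the field type's analyzers, in both the index and query chains, e.g.:

```xml
<filter class="solr.SynonymFilterFactory" synonyms="synonyms.txt"
        ignoreCase="true" expand="true"/>
```

With the line `test1, test2, test3` and expand=true, each of the three terms maps to all three, so the index side and the query side see the same set of terms and the default "and" behaves consistently.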
Presumably your slaves are the only ones receiving queries.
When you query an index, Solr caches things, which takes up memory, so
memory usage is greater on the slaves than on the master.
You should look at the types of queries you are using, and see which
caches you are building up (e.g. sorting, filt
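The caches in question are configured per core in solrconfig.xml; the stock example settings look like this (the sizes are the defaults from the example config, to be tuned against your actual query mix):

```xml
<query>
  <filterCache class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="0"/>
  <queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
  <documentCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
  <!-- Note: the Lucene FieldCache used for sorting is allocated outside
       these settings and can dominate heap use on query-serving slaves -->
</query>
```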
Hi
I have been running a Solr 3.5 server with a master-slave and repeater-slave
architecture for the last year on Linux machines. I recently found out that my
slaves (both) are using high virtual memory which keeps increasing, while the
master and repeater both seem normal. My index size is quite normal in
Hello,
I have a problem using synonyms.txt and a default "and" in my search.
Using a lot of fields in my query I want them all to be combined with "and"
except the synonyms.
Having this entry in the synonyms.txt
...
test1, test2, test3
...
and querying for test1 only matches when the document
On 26 June 2012 11:54, Sudhir Kumar wrote:
> Hi,
>
>
>
> Solr Magento integration is done, but after indexing there are no results.
> In Tomcat logs there is no exception.
What are you using for the integration? How are you indexing
data? It might be better to ask on a Magento-specific list.
> But, I have jus