Sorry, my bad :(. Thanks for the help, it worked. I had completely overlooked
the defType parameter.
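For reference, a request along the lines of what was missing, with defType set to dismax so qf is actually honoured (the core URL and field name are placeholders, not from the thread):

  http://localhost:8983/solr/select?q=9065&defType=dismax&qf=product_code

Without defType=dismax the standard parser ignores qf and searches the default field, which can make hits show up from fields outside the intended one if the default field is a catch-all copy of everything.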
I am searching for 9065, so it's not about case sensitivity. My search runs
across all the field names instead of being limited to the one field specified
in the qf param (using defType dismax).
Hi,
I am trying to achieve an exact-match search on a text field. I am using a
copyField, copying it to a string field and using that for the search.
I now want to do an exact match on the imprint field and am trying to search
using the query below, but the results are not limited to exact imprint matches.
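A sketch of the schema.xml arrangement described above (the imprint_exact name is made up for illustration):

  <field name="imprint" type="text" indexed="true" stored="true"/>
  <field name="imprint_exact" type="string" indexed="true" stored="false"/>
  <copyField source="imprint" dest="imprint_exact"/>

The exact match then has to go against the string copy with the whole value quoted, e.g. q=imprint_exact:"the exact imprint value", rather than against the analyzed text field.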
Thanks Stefan.
So if I use a second fq parameter, will SOLR apply an AND across both fq
parameters?
I have multiple indexed values, so when I search for q=time, does SOLR return
results with Time in any of the indexed values? Sorry for the silly questions.
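For what it is worth, each fq parameter acts as an independent filter and a document has to match all of them, so the effect is an AND. A request like the following (the second filter is only illustrative) returns only documents satisfying both filters:

  q=time&fq=supplierid:1001&fq=published_on:[2011-01-01T00:00:00Z TO *]

Whether q=time also matches "Time" in the indexed values depends on the field's analyzer; with a lowercasing text field it will.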
Hi,
I am trying to understand why the two queries return different results. To me
they look similar; can someone help me understand the difference in the
results?
Query1: facet=true&q=time&fq=supplierid:1001&start=0&rows=10&sort=published_on desc
Query2: facet=true&q=time&fq=supplierid:1001+pu
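Query2 is cut off above, but if the extra clause ended up inside the same fq as supplierid:1001, that alone can explain the difference: a single fq containing two bare clauses is parsed with the default operator (OR unless configured otherwise), while separate fq parameters are always intersected. Roughly (the second clause is only a guess at what was intended):

  fq=supplierid:1001 published_on:[* TO NOW]       one filter, default operator applies
  fq=supplierid:1001&fq=published_on:[* TO NOW]    two filters, both must match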
I commented out the autocommit option and tried uploading the file (a smaller
file now, 5 million records) and hit an OOM again:
Jun 17, 2011 2:32:59 PM org.apache.solr.common.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space
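With autocommit disabled, nothing gets committed until the end of the load, so re-enabling it is worth trying so that documents are committed periodically during a long CSV load rather than all at the end. A sketch of the solrconfig.xml block (the limits are only illustrative):

  <updateHandler class="solr.DirectUpdateHandler2">
    <autoCommit>
      <maxDocs>100000</maxDocs>
      <maxTime>60000</maxTime>
    </autoCommit>
  </updateHandler>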
I did that, but when I split them into files of 5 million records each, the
first file went through fine; when I started processing the second file, SOLR
hit an OOM again:
org.apache.solr.common.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space
        at org.apache.lucene.index.FreqProxTermsWri
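One thing worth trying between the split files is an explicit commit, so the first batch is fully committed before the next upload starts (host and path are assumptions about the setup):

  curl 'http://localhost:8983/solr/update' -H 'Content-Type: text/xml' --data-binary '<commit/>'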
Yes Eric, after changing the lock type to single, I got an OOM after loading
5.5 million records. I am using the curl command to upload the CSV.
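For reference, the kind of curl invocation meant here (host, core path, and file name are placeholders); commit=true asks Solr to commit once the stream has been processed:

  curl 'http://localhost:8983/solr/update/csv?commit=true' --data-binary @records.csv -H 'Content-Type: text/plain; charset=utf-8'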
We just started using SOLR. I am trying to load a single file with 20 million
records into SOLR using the CSV uploader. I keep getting an out-of-memory
error after loading 7 million records. Here is the config:
1
6
I also encountered a LockObtainFailedException
org.apache.
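Besides splitting the input, the heap given to the servlet container is usually the other knob to look at for this kind of OutOfMemoryError; for example, with the Jetty that ships in the Solr example directory (the sizes are only illustrative):

  java -Xms512m -Xmx2048m -jar start.jar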