I do the following things:
* I create a folder: D:\solr-6.0.0\myconfigsets\testdih.
* Copy D:\portable_sw\solr-6.0.0\example\example-DIH\solr\db\conf to
D:\solr-6.0.0\myconfigsets\testdih.
* Go into D:\solr-6.0.0\myconfigsets\testdih\conf and edit db-data-config.xml
as follows (I am pretty sure mysq
Hello!
I am trying to import email from an IMAP maildir and see the following in
the log:
2016-05-24 08:46:11.655 INFO (qtp428746855-14) [ x:ccc]
o.a.s.h.d.DataImporter Loading DIH Configuration: mail-data-config.xml
2016-05-24 08:46:11.685 INFO (qtp428746855-14) [ x:ccc]
o.a.s.h.d.c.DIHConfigu
I try to run the example by issuing "bin\solr create_core -c exampledih -d
example\example-DIH\solr\db\conf". It also shows the same error. Am I issuing
the wrong command?
scott.chu, scott@udngroup.com
2016/5/24 (Tue)
- Original Message -
From: scott (myself)
To: solr-user
CC:
Date: 2016/5/24 (Tue)
If you have this path, server/solr/configsets/testdih/conf, you should
write this on your command line:
'bin\solr>solr create -c your_core -d testdih -p 8983' to create a core with
an example config testdih.
2016-05-24 9:35 GMT+01:00 scott.chu :
>
> I do the following things:
>
> * I create a folder:
Thanks Erick for the reply. Actually I am using Solr 4.4, and the
solr.SuggestComponent class is not available in Solr 4.4.
Can I implement this in my Solr 4.4 lib, and how?
If possible, please share any earlier article on how people have used suggestions in 4.4.
Thanks
Mugeesh
Thanks Nick,
I don't plan to index the document; the document is a kind of disposable
object, and it is based on the user query.
I have seen that page, but I didn't get how to pass the document (my disposable
object) via the stream.body parameter.
Googling, I found this https://issues.apache.org/jira/browse
Hello - did you find this manual page? It explains how HTML can be uploaded.
https://cwiki.apache.org/confluence/display/solr/Uploading+Data+with+Solr+Cell+using+Apache+Tika
Markus
-Original message-
> From:scott.chu
> Sent: Tuesday 24th May 2016 7:48
> To: solr-user
> Subject: Re:
I read that but can't quite understand the steps! That's why I'm asking for help here.
scott.chu, scott@udngroup.com
2016/5/24 (Tue)
- Original Message -
From: Markus Jelsma
To: solr-user ; solr-user
CC:
Date: 2016/5/24 (Tue) 17:52
Subject: RE: Import html data in mysql and map schemas using
I happened to find the problem. The problem seems to come from the HTML page that
shows the DIH function. I use the Maxthon browser, which has a function that can
switch between IE mode and non-IE mode (actually the WebKit engine). I happened
to switch back to non-IE mode and the error message is gone an
I tested a table with only 2 records. The table's properties are: MyISAM, utf8
character set, utf8_general_ci collation. The contents are HTML source with
Chinese characters. I don't specify any schema.xml but only set up
db-data-config.xml as follows:
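A minimal sketch of that kind of db-data-config.xml, assuming a hypothetical MySQL
database "mydb" and a table "news" with "id", "title" and "html_body" columns:

<dataConfig>
  <dataSource type="JdbcDataSource" driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost:3306/mydb" user="user" password="pass"/>
  <document>
    <entity name="news" query="SELECT id, title, html_body FROM news">
      <field column="id" name="id"/>
      <field column="title" name="title"/>
      <!-- html_body holds the HTML source with Chinese characters -->
      <field column="html_body" name="content"/>
    </entity>
  </document>
</dataConfig>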
Hi guys,
It has been a while that I was thinking about this, and yesterday I took a look
into the code:
I was wondering if the termEnum approach is still a valid alternative to
docValues when we have low-cardinality fields.
The reason I am asking is that yesterday I ran into this piece of
code
Thanks Eric!
Best,
Mark
On Mon, May 23, 2016 at 1:35 PM, Erick Erickson
wrote:
> Yes, currently when using Atomic updates _all_ fields
> have to be stored, except the _destinations_ of copyField
> directives.
>
> Yes, it will make your index bigger. The effects on speed are
> probably minimal t
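In schema.xml terms the rule looks roughly like this (field names below are only
illustrative placeholders): every field involved in atomic updates is stored, while a
copyField destination may stay unstored:

<field name="title" type="text_general" indexed="true" stored="true"/>
<field name="price" type="tfloat" indexed="true" stored="true"/>
<!-- copyField destination: allowed to remain stored="false" with atomic updates -->
<field name="text_all" type="text_general" indexed="true" stored="false" multiValued="true"/>
<copyField source="title" dest="text_all"/>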
Hi all,
I'm going mad over something that seems incredibly simple. In an attempt to
maintain some order in my growing data, I've begun to employ dynamicFields.
Basic stuff here, just using *_s, *_d, etc. for my strings, doubles, and
other common datatypes.
I have these stored but not indexed. I'm
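Roughly the stock-style declarations, as a sketch (the exact type names depend on
the schema version):

<dynamicField name="*_s" type="string" indexed="false" stored="true"/>
<dynamicField name="*_d" type="tdouble" indexed="false" stored="true"/>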
Just to let everybody know: I use DIH + template (without Tika and Solr Cell; I
really don't understand that part of the reference guide) to achieve what I want.
But I still need to test more varied forms of HTML source.
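Something along these lines, combining TemplateTransformer and HTMLStripTransformer
(table and field names below are only placeholders):

<entity name="news" transformer="TemplateTransformer,HTMLStripTransformer"
        query="SELECT id, html_body FROM news">
  <field column="id" name="id"/>
  <!-- TemplateTransformer: build a synthetic field from other values -->
  <field column="source" name="source" template="mysql-news-${news.id}"/>
  <!-- HTMLStripTransformer: strip tags from the HTML body before indexing -->
  <field column="html_body" name="content" stripHTML="true"/>
</entity>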
Scott Chu, scott@udngroup.com
2016/5/24 (Tue)
p.s. There're really many many exte
On Tue, May 24, 2016 at 3:06 PM, Scott Chu wrote:
> p.s. There's really a lot of extensive, worthwhile stuff in Solr. If the
> project team could provide some "dictionary" of it, it would be a "Santa Claus"
> for us Solr users. Ha! Just an X'mas wish! Sigh! I know it's quite impossible.
> I
Never mind, the issue ended up being that I had the copyField for that uom
field in two places and hadn't realized it, doh!
--
*John Blythe*
Product Manager & Lead Developer
251.605.3071 | j...@curvolabs.com
www.curvolabs.com
58 Adams Ave
Evansville, IN 47713
On Tue, May 24, 2016 at 9:05 AM, J
Well, theoretically you could back-port the suggester stuff, but
I'd guess it would be very, very difficult. It'd be far easier to
upgrade to, say, 4.10.
But there's a different problem there. Until, I think 5.1, the
suggester would rebuild itself on startup, and that could
take a very long time.
I need to pass a parameter to one of my searchComponent classes from the
solrconfig.xml file.
Please advise me how to do it, if it is possible.
Thank you so much for pulling me out of the upgrading trouble.
I have implemented the auto-suggester following this wiki page:
https://wiki.apache.org/solr/Suggester.
But I am looking for results that should be populated based on some ranking or
boosting (some business logic). How to implement ranking-based a
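For what it's worth, on a newer Solr (4.7+) the ranking part is usually handled by a
DocumentDictionaryFactory with a weightField; a rough solrconfig.xml sketch, with
placeholder field names:

<searchComponent name="suggest" class="solr.SuggestComponent">
  <lst name="suggester">
    <str name="name">mySuggester</str>
    <str name="lookupImpl">FuzzyLookupFactory</str>
    <str name="dictionaryImpl">DocumentDictionaryFactory</str>
    <!-- field the suggestions are drawn from -->
    <str name="field">title</str>
    <!-- numeric field that ranks/boosts suggestions, e.g. a popularity score -->
    <str name="weightField">popularity</str>
    <str name="suggestAnalyzerFieldType">string</str>
  </lst>
</searchComponent>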
Update: it seems clear I ran into the bad
https://issues.apache.org/jira/browse/SOLR-8096 :
Just adding some additional information, as I just ran into the issue
with Solr 6.0:
Static index, around 50 * 10^6 docs, 20 fields to facet, 1 of them with high
cardinality, on top of grouping.
Gropin
Looks like it is available through an HTTP POST request, as described in
https://lucidworks.com/blog/2014/08/12/indexing-custom-json-data/
Hence I assume a corresponding JSON data import from MySQL should also be
available. Can someone point me to related docs?
Thanks,
Sriram
Alessandro,
I checked with Solr 6.0 distro on techproducts.
Faceting on cat with uif hits fieldValueCache
http://localhost:8983/solr/techproducts/select?facet.field=cat&facet.method=uif&facet=on&indent=on&q=*:*&wt=json
fieldValueCache
- class:org.apache.solr.search.FastLRUCache
- description:Conc
Mikhail, you have been really helpful!
On Tue, May 24, 2016 at 9:38 PM, Mikhail Khludnev <
mkhlud...@griddynamics.com> wrote:
> Alessandro,
>
> I checked with Solr 6.0 distro on techproducts.
> Faceting on cat with uif hits fieldValueCache
>
> http://localhost:8983/solr/techproducts/select?facet.
: Is there any way a custom search component can access data created in custom
: post filter query so that the data can be added to the response?
Yes - this is exactly what the example I mentioned in my
previous message does -- as I said before...
>> Take a look at the CollapseQParser (which is u
Your question confuses me - pass "through" from where?
When search components are defined in solrconfig.xml, they can be declared
with any init params you want, which will be passed to the init() method.
Both the sample_techproducts_configs and data_driven_schema_configs that
come with Solr
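As a rough illustration (component and parameter names are placeholders), a declaration
like this makes the values show up in the NamedList handed to init():

<searchComponent name="myComponent" class="com.example.MySearchComponent">
  <!-- arbitrary init params; passed to the component's init(NamedList) -->
  <str name="myParam">someValue</str>
  <int name="maxItems">10</int>
</searchComponent>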
Hi,
I am trying to debug Solr performance problems on an old version of Solr,
4.3.1.
The queries are taking really long - in the range of 2-5 seconds!
Running a filter query with only one condition also takes about a second.
There is memory available on the box for Solr to use. I have been looking
Hi,
Is it the QueryComponent taking time? Or other components?
Also make sure there is plenty of RAM for the OS cache.
Ahmet
On Wednesday, May 25, 2016 1:47 AM, Jay Potharaju wrote:
Hi,
I am trying to debug solr performance problems on an old version of solr,
4.3.1.
The queries are taking really
Try adding debug=timing, that'll give you an idea of what component is
taking all the time.
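For example (collection name below is just a placeholder), a request like
http://localhost:8983/solr/mycollection/select?q=*:*&fq=type:book&debug=timing
returns a "timing" section in the debug block with per-component prepare/process times.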
From there, it's "more art than science".
But you haven't given us much to go on. What is the query? Are you grouping?
Faceting on high-cardinality fields? Returning 10,000 rows?
Best,
Erick
On Tue, May
Does anyone have any information on this?
Regards,
Edwin
On 21 May 2016 at 14:50, Zheng Lin Edwin Yeo wrote:
> Hi,
>
> Would like to check, is there a way to do ngrouping for MLT queries?
> I tried the following query with the MoreLikeThisHandler, but I could not
> get any ngroup results.
>
>
> http:
Hi,
I have an implementation for storing coordinates in Solr during
indexing.
During indexing, I will only store the value in the field named "gps". For
the fields named "gps_0_coordinate" and "gps_1_coordinate", the values will
be auto-filled and indexed from the "gps" field.
But
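That behaviour matches the usual LatLonType setup, roughly like this (the subfield
type name varies between schemas):

<fieldType name="location" class="solr.LatLonType" subFieldSuffix="_coordinate"/>
<field name="gps" type="location" indexed="true" stored="true"/>
<!-- *_coordinate subfields are filled automatically from the "gps" value -->
<dynamicField name="*_coordinate" type="tdouble" indexed="true" stored="false"/>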