Thanks Alex.
I have gone through it. I am building a use case where a cache in my
application has to be updated whenever a Solr document is added or updated.
I am not sure the event listeners of the update handler fit this use case.
What do you say? Please share your ideas.
Thanks,
Anil
Have you looked at UpdateRequestProcessors?
Also check solrconfig.xml for listeners that fire on commit.
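Roughly, a custom chain in solrconfig.xml could look like the sketch below. The com.example.MyCacheNotifyProcessorFactory class is hypothetical (something you would write yourself); only the Log and Run processors are stock Solr.

```xml
<!-- Sketch only: com.example.MyCacheNotifyProcessorFactory is an assumed
     user-written factory; solr.LogUpdateProcessorFactory and
     solr.RunUpdateProcessorFactory ship with Solr. -->
<updateRequestProcessorChain name="cache-notify">
  <processor class="com.example.MyCacheNotifyProcessorFactory"/>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>

<!-- Wire the chain into the update handler so it runs on every add/update: -->
<requestHandler name="/update" class="solr.UpdateRequestHandler">
  <lst name="defaults">
    <str name="update.chain">cache-notify</str>
  </lst>
</requestHandler>
```

Your factory would see each document before it is indexed, which is where you could notify your cache.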
Regards,
Alex
On 26 Mar 2016 3:54 pm, "Anil" wrote:
> Hi,
>
> Does Solr support event (document create, update, delete) listeners?
>
> Thanks,
> Anil
>
Hi,
Does Solr support event (document create, update, delete) listeners?
Thanks,
Anil
Hello. My company uses Solr-4.10 in a distributed environment. I have
written a SearchComponent which contains a cache which is loaded at
start-up. The cache is only used on the searchers, never on the
aggregators. Is there some way I can signal that the cache should be loaded
only if the Solr i
Found the JIRA: https://issues.apache.org/jira/browse/SOLR-7042
It looks like you can try adding
-format solr
to your bin/post command line to get back to normal "solr JSON"
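For example, assuming the collection and file from your earlier message:

```
# -format solr tells bin/post the file is already in Solr's native
# JSON update format, skipping the generic JSON-splitting behavior.
bin/post -c my_collection -format solr ../data/data-solr.json
```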
-Yonik
On Fri, Mar 25, 2016 at 8:43 PM, Yonik Seeley wrote:
> On Fri, Mar 25, 2016 at 6:19 PM, Alisa Z. wrote:
>> Hi
On Fri, Mar 25, 2016 at 6:19 PM, Alisa Z. wrote:
> Hi all,
> It is partially a question, partially a discussion.
> I am working with documents with deep levels of nesting. The documents are in
> a single JSON file (see a sample below).
>
> When I was on Solr 5.3.1,
> solr-5.3.1$ bin/post -c my_c
On 3/25/2016 5:32 PM, Shawn Heisey wrote:
> You are correct. Solr (Lucene, actually) only maintains compatibility
> with indexes from the previous major version, so 6.x will read 5.x
> indexes, but not indexes built by 4.x or earlier.
Further clarification on this specific point. This is your TL
On 3/24/2016 9:45 AM, Jack Krupansky wrote:
> Does anybody know if we have doc on the recommended process for upgrading
> data after upgrading Solr? Sure the upgraded version will work fine with
> that old data, but unless the data is upgraded, the user can't then upgrade
> to the next major releas
Further experiments:
-- updated the schema to account for multiple values:
curl -X POST -H 'Content-type:application/json' --data-binary '{
  "add-dynamic-field": {
    "name": "*type_s",
    "type": "string",
    "indexed": true,
    "multiValued": true
  }
}' http://localhost:8985/solr/my_coll/schema
Hi all,
It is partially a question, partially a discussion.
I am working with documents with deep levels of nesting. The documents are in a
single JSON file (see a sample below).
When I was on Solr 5.3.1,
solr-5.3.1$ bin/post -c my_collection ../data/data-solr.json
caused no problems.
Now, I
bq: (collections API won't work because i use compositeId routing mode)
This had better NOT be true, or SolrCloud is horribly broken. compositeId is
the default and it's tested all the time by unit tests. So is implicit, for
that matter.
One question I have is that you've specified a route field
Hi Alisa,
The issue here is still open so it seems highly unlikely that it would even
get to 6.0, which is around the corner. I think this would only be out with
6.1 at the earliest.
On Fri, Mar 25, 2016 at 11:12 AM, Alisa Z. wrote:
> Mikhail,
> Thank you for the answer.
> I'd be happy to cont
Mikhail,
Thank you for the answer.
I'd be happy to contribute tons of test cases on nested structures and their
querying and faceting...
I am working on a case of moving very nested data structures to Solr (and the
other option is ES...) but so far Solr seems to be quite behind... It's great
Hi guys,
I am trying to set up a SolrCloud environment of two Solr 5.4.1 nodes,
but the data are always indexed on the first node although the unique id
is a GUID.
It looks like I can't add an additional node. Could you tell me where
I'm wrong?
I tried to set up a collection named "db" with
I've never created a Jira issue for Solr. I have the option to create a
Service Desk Request. Which one will route to the Solr board? Kylin, Atlas,
Apache Infrastructure, Ranger
--
View this message in context:
http://lucene.472066.n3.nabble.com/Can-Solr-recognize-daylight-savings-time-tp426604
If possible, log in UTC. Daylight time causes amusing problems in logs, like
one day with 23 hours and one day with 25.
You can always convert to local time when you display it.
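For example, in Java (the timestamp and zone below are made up for illustration):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class UtcDisplay {
    public static void main(String[] args) {
        // Log and store the UTC instant; convert only when displaying.
        Instant logged = Instant.parse("2016-03-25T15:49:00Z");
        ZonedDateTime local = logged.atZone(ZoneId.of("America/New_York"));
        // On this date America/New_York is on daylight time (UTC-4),
        // so the local wall-clock hour is 11.
        System.out.println(local);
    }
}
```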
wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/ (my blog)
> On Mar 25, 2016, at 8:49
On 3/25/2016 9:32 AM, Shawn Heisey wrote:
> Apparently we *can* use java system properties in the log4j config, so
> saying there's no generalized solution available was premature.
Second followup:
The info I looked at about using sysprops had no version number for
log4j, and was talking about th
Of course! Thanks for your help Shawn.
Thank you Shawn; but if I use the Solarium PHP client in production, what do I
have to do in this case?
2016-03-25 13:44 GMT+00:00 Shawn Heisey :
> On 3/25/2016 5:44 AM, Moncif Aidi wrote:
> > I'm using Solr 5.4.1 for indexing thousands of documents, and it works
> > perfectly. The issue comes when so
Some documents have content that cannot be extracted and gets stuck in the
Solr JVM; I get this error:
24/03/2016 à 19:26:59 ERROR null DocBuilder Exception while processing:
files document : null:org.apache.solr.handler.
dataimport.DataImportHandlerException: Unable to read content Processing
Document # 1
On 3/25/2016 9:24 AM, Shawn Heisey wrote:
> Solr's main log is written by log4j. With our current log4j version and
> configuration method, we cannot use environment variables in the log4j
> config, so no generalized solution is possible. Upgrading log4j is on
> the todo list, but that requires s
On 3/25/2016 8:26 AM, tedsolr wrote:
> My Solr logs are an hour behind. I have set this property to log in local
> time:
> SOLR_TIMEZONE="EST"
> but I cannot find a property that will "turn on" daylight saving time. If
> there isn't a Solr property, maybe there's an Apache log4j setting?
Solr's main log
My Solr logs are an hour behind. I have set this property to log in local
time:
SOLR_TIMEZONE="EST"
but I cannot find a property that will "turn on" daylight saving time. If
there isn't a Solr property, maybe there's an Apache log4j setting?
thanks,
v5.2.1
On 3/25/2016 7:29 AM, Raveendra Yerraguntla wrote:
> I got both the replies. Most likely we might have used some of the NFS
> options. I will try them early next week.
Running on NFS is not advised. You can make it work, but Solr doesn't
like it.
If you're trying to use NFS to share an index dir
On 3/25/2016 7:50 AM, Shawn Heisey wrote:
> If you're trying to use NFS to share an index directory between Solr
> nodes, don't do that. Each Solr node needs its own copy of all index
> data. Getting *this* to work *might* be possible, but even when it
> works, it's not very stable.
Followup on th
On 3/25/2016 2:04 AM, fabigol wrote:
> What I want to do is create the different links between the entities
> which I'm going to index. Therefore, I have a root entity and child entities
> as shown in the XML file.
>
> But my main problem is the number of documents. In fact, when I want to
> ind
On 3/25/2016 5:44 AM, Moncif Aidi wrote:
> I'm using Solr 5.4.1 for indexing thousands of documents, and it works
> perfectly. The issue comes when some documents are not well formatted or
> contain special characters, which makes Solr hang or block on some
> particular documents, and it give
Thanks Shawn.
I got both the replies. Most likely we might have used some of the NFS
options. I will try them early next week.
Thanks
Ravi
On Wed, Mar 23, 2016 at 9:50 AM, Shawn Heisey wrote:
> On 3/23/2016 6:00 AM, Raveendra Yerraguntla wrote:
> > I am using Solr 5.4 in solr cloud mode in a 8
Hi,
I'm using Solr 5.4.1 for indexing thousands of documents, and it works
perfectly. The issue comes when some documents are not well formatted or
contain special characters, which makes Solr hang or block on some
particular documents, and it gives these errors when viewing the log:
I want
Hi Georg,
One solution that could work on the existing schema is to use query faceting
and queries like (for USER_ID = 1, bucket 100 to 200):
price_1:[100 TO 200] OR (-price_1:[* TO *] AND price:[100 TO 200])
The same query is used for filtering. What you should test is whether
performance is acceptable.
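As request parameters that would look roughly like the following (URL encoding omitted; facet.query is a standard Solr parameter, and the price_1 per-user field is assumed from your schema):

```
q=*:*&facet=true
&facet.query=price_1:[100 TO 200] OR (-price_1:[* TO *] AND price:[100 TO 200])
&facet.query=price_1:[200 TO 300] OR (-price_1:[* TO *] AND price:[200 TO 300])
```

Each facet.query returns the count for one bucket, falling back to the default price field when the user-specific field is absent.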
Hi,
I wonder if the function ${dataimporter.functions.escapeSql()} is available
in Solr 5.3.1.
Whenever I use it in my data import handlers, Solr replaces
'${dataimporter.functions.escapeSql(field)}' with '' (an empty string).
How can I escape strings when building SQL queries in the DIH config file?
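For reference, I am calling it roughly like this in the DIH config (the table and field names here are just examples, not my real schema):

```xml
<!-- Example only: items/name are made-up table and column names. -->
<entity name="item"
        query="SELECT id, name FROM items
               WHERE name = '${dataimporter.functions.escapeSql(item.name)}'"/>
```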
Hi,
What I want to do is create the different links between the entities
which I'm going to index. Therefore, I have a root entity and child entities
as shown in the XML file.
But my main problem is the number of documents. In fact, when I want to
index 3 months of data I have no problem (5 mill