That did the trick. The GC tuning options also seem to be working, but
I guess we'll see when traffic ramps back up on Monday. Thanks for all
your help!
On 7/18/2015 8:16 AM, Shawn Heisey wrote:
The first thing I'd try is removing the UseLargePages option and see if
it goes away.
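For reference, a sketch of where that flag typically lives when Solr is started with bin/solr (the file name and the surrounding default flags are assumptions from the 5.x layout, not taken from this thread):

```shell
# solr.in.sh -- read by bin/solr at startup (5.x layout assumed).
# If GC_TUNE currently includes -XX:+UseLargePages, remove just that
# flag and leave the rest, e.g.:
GC_TUNE="-XX:NewRatio=3 \
         -XX:+UseConcMarkSweepGC \
         -XX:+ParallelRefProcEnabled"
```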
SOLR-4470 is about:
Support for basic auth in internal Solr requests.
What is wrong with the internal requests?
Can someone help simplify this: would it ever be possible to run with basic auth?
What workarounds are there?
Regards
Could you post your clusterstate.json? Or at least the "live nodes"
section of your ZK config? (adminUI>>cloud>>tree>>live_nodes). The
addresses of my nodes are things like 192.168.1.201:8983_solr. I'm
wondering if you're taking your node names from the information ZK
records or assuming it's 127.0.
P.S.
"It ain't the things ya don't know that'll kill ya, it's the things ya
_do_ know that ain't so"...
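To illustrate the point about node names, a toy sketch (the node addresses are made up; real entries come from ZK's live_nodes):

```python
# Hypothetical live_nodes entries as seen under adminUI>>cloud>>tree>>live_nodes.
live_nodes = ["192.168.1.201:8983_solr", "192.168.1.202:8983_solr"]

def hosts(nodes):
    """Strip the port and the _solr context suffix, leaving the host part."""
    return [n.split(":")[0] for n in nodes]

# If this prints loopback addresses, the node registered itself in ZK under
# a name that other nodes cannot reach.
print(hosts(live_nodes))
```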
On Sat, Jul 18, 2015 at 12:46 PM, Erick Erickson
wrote:
> Could you post your clusterstate.json? Or at least the "live nodes"
> section of your ZK config? (adminUI>>cloud>>tree>>live_nodes. Th
bq: So I want to allow people to upload any CSV/XML/JSON to solr they want so
having a predefined schema isn't going to cut it
Piling on to Shawn's excellent comments, I would really advise against this.
Sure, you could make everything a text field using the * catch-all, but:
If a field
If you fire things up in cloud mode via the example method, you should
be seeing something like ./example/cloud/node1/logs,
./example/cloud/node2/logs etc. If you're in non-cloud mode you should
see something like ./server/logs.
There is some logging in the admin UI; see the "Logging" selection.
Be
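Those locations, restated as shell (paths assumed from the bin/solr example layouts; adjust to your install):

```shell
# Where bin/solr writes its logs (paths assumed from the 5.x examples):
CLOUD_LOGS="./example/cloud/node1/logs ./example/cloud/node2/logs"
STANDALONE_LOG="./server/logs/solr.log"
# Watch requests arrive in non-cloud mode (needs a running server):
# tail -f "$STANDALONE_LOG"
```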
I will try that. Traffic is pretty dead over the weekend, so I probably
won't be able to tell if it's effective or not until Monday.
Thanks again!
On 7/18/2015 8:16 AM, Shawn Heisey wrote:
On 7/18/2015 12:42 AM, Jeremy Ashcraft wrote:
I turned on GC logging and verified that it's definitely being caused by a GC pause.
Hello!
Before, when start.jar was used, the requests rolled by in the console. When I
start with the new method (bin/solr) they do not. There is no relevant log
file anywhere either... How can I log / watch requests?
Best would be to get them in the admin GUI
cheers
Cool, just curious
Thanks Eric
Sent from my iPhone
> On 18-Jul-2015, at 10:23 am, Erick Erickson wrote:
>
> No idea what you mean by "chance of these deleted docs
> getting re-indexed". Solr shouldn't be doing this by itself.
> Certainly if your indexing process sends them in again
> they'll
On 7/18/2015 9:49 AM, Charlie Hubbard wrote:
> So I want to allow people to upload any CSV/XML/JSON to solr they want so
> having a predefined schema isn't going to cut it. After reading about my
> options I figured my choices were schema-less mode and dynamic fields using
> the * with a type othe
Thanks Eric,
The strange thing is that although I have set the log level to "ALL" I see
no error messages in the logs (apart from the line saying that the response
is a 400 one).
I'm quite confident the configset does exist as the collection gets created
fine if I don't specify the createNodeSet
So I want to allow people to upload any CSV/XML/JSON to solr they want so
having a predefined schema isn't going to cut it. After reading about my
options I figured my choices were schema-less mode and dynamic fields using
the * with a type other than ignore. I know the docs say schema-less isn't
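For the dynamic-field route, a minimal schema.xml sketch (the "text_general" type name is from the stock default configs, not from this thread):

```xml
<!-- schema.xml: catch-all dynamic field. Any field name not otherwise
     declared is indexed and stored as general text. -->
<dynamicField name="*" type="text_general" indexed="true" stored="true"
              multiValued="true"/>
```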
Thank you very much. I'm trying to use MapReduceIndexerTool and have downloaded
the search-mr-1.1.0.jar. Another question: how can I download the dependent
jars of search-mr-1.1.0? I tried to use the mvn command to download the
dependencies with the pom in
http://grepcode.com/snapshot/repos
No idea what you mean by "chance of these deleted docs
getting re-indexed". Solr shouldn't be doing this by itself.
Certainly if your indexing process sends them in again
they'll be re-indexed, there's no notion of "never index
this doc again".
Why? Are you seeing some symptom or are you just
curi
Dear Ali,
I'm not sure I understand what you are trying to do, please correct me if I
misunderstood:
given a document indexed into lucene you want to retrieve the top-k terms
with highest tf-idf right?
Could you please post your code somewhere? I don't understand what is
"mlt" :)
Cheers,
Diego
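If it is the tf-idf computation itself that's in question, here is a self-contained sketch (toy data, plain Python rather than the Lucene API):

```python
import math
from collections import Counter

# Toy corpus; each document is a token list (hypothetical data).
docs = [
    ["solr", "index", "search", "solr"],
    ["lucene", "index", "terms"],
    ["solr", "terms", "ranking"],
]

def top_k_tfidf(doc_id, k):
    """Top-k terms of one document by a classic tf-idf weighting."""
    n = len(docs)
    tf = Counter(docs[doc_id])                      # term frequency in the doc
    df = Counter(t for d in docs for t in set(d))   # document frequency
    scored = {t: tf[t] * math.log(n / df[t]) for t in tf}
    return sorted(scored, key=scored.get, reverse=True)[:k]

print(top_k_tfidf(0, 2))
```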
On 7/18/2015 12:42 AM, Jeremy Ashcraft wrote:
> I turned on GC logging and verified that it's definitely being caused by
> a GC pause. I tried the tuning option from the article and get this
> warning:
>
> OpenJDK 64-Bit Server VM warning: Failed to reserve shared memory
> (errno = 1).
>
> any r
Would the MapReduceIndexerTool be an option?
http://www.cloudera.com/content/cloudera/en/documentation/cloudera-search/v
1-latest/Cloudera-Search-User-Guide/csug_mapreduceindexertool.html
On 7/18/15, 9:38 AM, "步青云" wrote:
>I need help. I have several hundreds of GB files in hdfs and I want to
>creat
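A command-line sketch of what that looks like (the jar name, paths, and ZK address are placeholders; the flag names are from the Cloudera Search docs linked above, so double-check them against your version):

```
# Build Lucene indexes on the cluster, then merge them into a live
# SolrCloud collection with --go-live. All values below are placeholders.
hadoop jar search-mr-1.1.0-job.jar org.apache.solr.hadoop.MapReduceIndexerTool \
  --morphline-file morphline.conf \
  --zk-host zk1.example.com:2181/solr \
  --collection my_collection \
  --output-dir hdfs://namenode/tmp/outdir \
  --go-live \
  hdfs://namenode/input/files
```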
I need help. I have several hundred GB of files in HDFS and I want to create
indexes for these files so that I can search quickly. How can I create indexes
for these files in HDFS? I know the Tika embedded in Solr can extract the content
of files in the local file system, and then Solr would create index
I like the idea.
I am hearing nowadays about SolrMeter; can't we accomplish this in SolrMeter?
Thanks,
Naga
> On 18-Jul-2015, at 8:51 am, Alexandre Rafalovitch wrote:
>
> I haven't found one. I have a project plan for something just like
> this but it is one of many Solr-related ideas. Mine
I haven't found one. I have a project plan for something just like
this but it is one of many Solr-related ideas. Mine is actually around
the idea of small multiples with several similar stacks next to each
other and seeing how the same text/query runs differently with minor
variations.
If people r
I know the Solr Admin panel has a way to test the current index and query
filters already in place in a schema file, but I was wondering if there is
a convenient "playground" for testing index and query filters?
I'm imagining a utility where you can select a set of index and query
filters, and th
Thank you. That helped
On Tue, Jul 14, 2015 at 5:02 PM, Chris Hostetter
wrote:
>
> : Are there any examples/documentation for IntervalFaceting using dates
> that
> : I could refer to?
>
> You just specify the interval set start & end as properly formatted date
> values. This example shows some
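For instance, a request-parameter sketch along those lines (the field name and dates are made up; the bracket syntax is per the interval faceting docs, so verify against your Solr version):

```
q=*:*&facet=true
&facet.interval=manufacturedate_dt
&facet.interval.set=[2014-01-01T00:00:00Z,2015-01-01T00:00:00Z)
&facet.interval.set=[2015-01-01T00:00:00Z,*]
```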
Hello everyone,
Good day. I'm new to Solr and have some questions. Could anyone help me?
I want to index files in HDFS using Solr. And I know that we can use
"solr.extraction.ExtractingRequestHandler" to directly index files in the local
file system. But this doesn't work for the files in