Hi,
I could index around 10 documents in a couple of hours, but after that
indexing became very slow (around just 15-20 documents per minute).
I have taken care of garbage collection.
I am passing the below parameters to Solr:
-Xms6144m -Xmx6144m -XX:MaxPermSize=128m -XX:+UseConcMarkSweepGC
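One thing worth checking for this kind of slowdown (as the reply further down asks, "Do you have Solr auto-commit enabled?") is hard auto-commit, so the transaction log and in-memory buffers are flushed periodically during bulk indexing. A minimal solrconfig.xml fragment; the maxDocs/maxTime values here are illustrative, not recommendations:

```xml
<!-- Illustrative only: periodic hard commits during bulk indexing.
     Tune maxDocs/maxTime to your workload. -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxDocs>10000</maxDocs>
    <maxTime>60000</maxTime>            <!-- milliseconds -->
    <openSearcher>false</openSearcher>  <!-- don't reopen a searcher on every commit -->
  </autoCommit>
</updateHandler>
```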
Hi,
I noticed that Solr indexes all folders and files, including
hidden files.
Can anyone help me avoid indexing hidden files?
Thanks,
Ameya
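Solr itself only indexes what the crawler hands it, so hidden files are normally filtered on the crawler side before documents are posted. A minimal sketch of the idea (illustrative only, not ManifoldCF's actual API), assuming "hidden" means a dot-prefixed name:

```python
import os

def visible_files(root):
    """Walk a directory tree, yielding paths of non-hidden files only."""
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune hidden directories in place so os.walk never descends into them.
        dirnames[:] = [d for d in dirnames if not d.startswith(".")]
        for name in filenames:
            if not name.startswith("."):
                yield os.path.join(dirpath, name)
```

Note that on Windows "hidden" is a file attribute rather than a naming convention, so a real crawler would check that attribute as well.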
Hi
I am getting an exception: processing of a multipart/form-data request
failed.
My solrconfig.xml contains:
Please find below the stack trace.
ERROR - 2014-07-30 13:52:05.013; org.apache.solr.common.SolrException;
null:org.apache.commons.fileupload.FileUploadBase$IOFileUploadException:
> you should make sure that you have enough system memory available for file
> caching to hold the entire Solr index.
>
> Do you have Solr auto-commit enabled?
>
> -- Jack Krupansky
>
> -Original Message- From: Ameya Aware
> Sent: Tuesday, July 29, 2014 3:01
Yeah, I tried that. With the null output connector all the files get crawled
in just one hour.
On Tue, Jul 29, 2014 at 4:00 PM, Toke Eskildsen
wrote:
> Ameya Aware [ameya.aw...@gmail.com] wrote:
> > I am using Apache ManifoldCF framework which connects to my local system
> >
, 2014 at 2:49 PM, Toke Eskildsen
wrote:
> Ameya Aware [ameya.aw...@gmail.com] wrote:
>
> [Solr -Xmx5120m]
>
> > I need to index around 30 documents but with the above parameters
> > performance is very poor, around 15000-20000 documents per hour.
>
> 4-5 documents
Hi,
I am running Solr with the below parameters:
-XX:MaxPermSize=128m -Xms5120m -Xmx5120m -XX:+UseConcMarkSweepGC
-XX:CMSInitiatingOccupancyFraction=70 -XX:NewRatio=3
-XX:MaxTenuringThreshold=8 -XX:+CMSParallelRemarkEnabled
-XX:+ParallelRefProcEnabled -XX:+UseLargePages -XX:+AggressiveOpts
-XX:-UseGC
Yes, I intended to post this query there.
By mistake, I put it here.
Apologies,
Ameya
On Mon, Jul 28, 2014 at 11:07 AM, Jack Krupansky
wrote:
> Or are you using ManifoldCF?
>
> -- Jack Krupansky
>
> -Original Message- From: Rafał Kuć
> Sent: Monday, July 28, 2014 11:00 AM
> To: so
Hi,
I am seeing a considerable decrease in the speed of indexing documents.
I am using PostgreSQL.
Is this the right time to vacuum PostgreSQL, given that I have been using it
for a week?
Also, to invoke a full vacuum, do I just need to go to the PostgreSQL command
prompt and run the "VACUUM FULL" command?
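For reference, plain VACUUM and VACUUM FULL behave quite differently; a short sketch of both, run from psql (behavior as documented by PostgreSQL, not specific to this setup):

```sql
-- Plain vacuum: marks dead row versions reusable; runs concurrently
-- with normal reads and writes. ANALYZE also refreshes planner stats.
VACUUM ANALYZE;

-- Full vacuum: rewrites each table and returns space to the OS, but
-- takes an ACCESS EXCLUSIVE lock on the table while it runs.
VACUUM FULL;
```

If autovacuum is enabled (the default in modern PostgreSQL), a manual VACUUM FULL is rarely needed and will block the application while it holds its locks.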
ack trace for your OOM message? Are you
> seeing this on the client or server side?
>
> Thanks,
> Greg
>
> On Jul 25, 2014, at 10:21 AM, Ameya Aware wrote:
>
> > Hi,
> >
> > I am in process of indexing lot of documents but after around 90
Hi,
I am in the process of indexing a lot of documents, but after around 90
documents I am getting the below error:
java.lang.OutOfMemoryError: Requested array size exceeds VM limit
I am passing the below parameters to Solr:
java -Xms6144m -Xmx6144m -XX:MaxPermSize=512m
-Dcom.sun.management.jmxremote -X
ConcMarkSweepGC
> -XX:+CMSIncrementalMode -XX:+CMSParallelRemarkEnabled
> -XX:+UseCMSInitiatingOccupancyOnly -XX:CMSInitiatingOccupancyFraction=70
> -XX:ConcGCThreads=6 -XX:ParallelGCThreads=6
>
> Marcello
>
>
> On 07/24/2014 03:53 PM, Ameya Aware wrote:
>
> I did not make any other ch
I did not make any change other than this; the rest of the settings are
default.
Do I need to set a garbage collection strategy?
On Thu, Jul 24, 2014 at 9:49 AM, Marcello Lorenzi
wrote:
> Hi,
> Did you set a Garbage collection strategy on your JVM ?
>
> Marcello
>
>
> On 07/24
Hi
I am in the process of indexing around 200,000 documents.
I have increased the Java heap space to 4 GB using the below command:
java -Xmx4096M -Xms4096M -jar start.jar
Still, after indexing around 15,000 documents, it gives a Java heap space
error again.
Any fix for this?
Thanks,
Ameya
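Heap pressure during bulk indexing often comes from accumulating too many documents in one request; sending bounded batches (with periodic commits) keeps client-side memory flat. A minimal chunking sketch, assuming documents are plain dicts; the send_batch call is a placeholder, not a real Solr client:

```python
def batches(docs, size):
    """Yield successive lists of at most `size` documents."""
    batch = []
    for doc in docs:
        batch.append(doc)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# Hypothetical usage: post each batch, commit once at the end.
# for batch in batches(all_docs, 1000):
#     send_batch(solr_url, batch)   # placeholder, e.g. an HTTP POST to /update
```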
Hi,
I am having trouble indexing documents using Solr.
After every 15-20 documents, Solr gives the below log:
INFO - 2014-07-23 15:38:50.715; org.apache.solr.core.SolrDeletionPolicy;
newest commit generation = 994
INFO - 2014-07-23 15:38:50.718; org.apache.solr.search.SolrIndexSearcher
ng to load "com/uwyn/jhighlight/renderer/XhtmlRendererFactory"
> but that is not a class which is shipped or used by Solr. I think you have
> some custom plugins (a highlighter perhaps?) which uses that class and the
> classpath is not setup correctly.
>
>
> On Wed, Jul 23
Hi
I am running into the below error while indexing a file in Solr.
Can you please help fix this?
ERROR - 2014-07-22 16:40:32.126; org.apache.solr.common.SolrException;
null:java.lang.RuntimeException: java.lang.NoClassDefFoundError:
com/uwyn/jhighlight/renderer/XhtmlRendererFactory
at
org.apache
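As the reply elsewhere in the thread notes, a NoClassDefFoundError for com/uwyn/jhighlight/... usually means the extraction (Solr Cell/Tika) contrib jars are not on Solr's classpath. One common fix is a pair of <lib> directives in solrconfig.xml; the paths below are the stock example layout and may need adjusting to your install:

```xml
<!-- Illustrative paths: load the extraction contrib and its dependencies
     (Tika, jhighlight, etc.). Adjust dir to your directory layout. -->
<lib dir="../../contrib/extraction/lib" regex=".*\.jar" />
<lib dir="../../dist/" regex="solr-cell-\d.*\.jar" />
```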
So can I get past this exception by increasing the heap size somewhere?
Thanks,
Ameya
On Tue, Jul 22, 2014 at 2:00 PM, Shawn Heisey wrote:
> On 7/22/2014 11:37 AM, Ameya Aware wrote:
> > i am running into java heap space issue. Please see below log.
>
> All we have here is a
Hi
I am running into a Java heap space issue. Please see the log below.
ERROR - 2014-07-22 11:38:59.370; org.apache.solr.common.SolrException;
null:java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space
at
org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:790
, Alexandre Rafalovitch
wrote:
> Nothing gets indexed automatically. So you must be doing something (e.g.
> Nutch). Tell us what that something is first so we know your baseline
> setup.
>
> Regards,
> Alex
> On 21/07/2014 9:43 pm, "Ameya Aware" wrote:
>
Hi,
How can I stop the content of a file from being indexed?
Will removing the content field from schema.xml do the job?
Thanks,
Ameya
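Besides editing schema.xml, with the ExtractingRequestHandler the extracted body can be redirected into a field that is neither indexed nor stored. A sketch of the relevant schema.xml pieces, using the stock "ignored" field-type pattern from Solr's examples:

```xml
<!-- schema.xml: catch-all type and dynamic field that drop their content -->
<fieldType name="ignored" class="solr.StrField"
           indexed="false" stored="false" multiValued="true" />
<dynamicField name="ignored_*" type="ignored" />
```

Then pass fmap.content=ignored_content on the /update/extract request so the extracted body is mapped into the ignored field and discarded.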
Got it. Thanks man.
Ameya
On Thu, Jul 10, 2014 at 3:36 PM, Ahmet Arslan
wrote:
> Hi,
>
> With that dynamic filed approach, you can see all these fields' values in
> result/response page. e.g. q=*:*&fl=*
>
> Ahmet
>
>
>
> On Thursday, July 10, 2014
o use the value that sent with literal.Modified parameter.
>
> First, try literalsOverride=true parameter.
>
> If that does not work, use FirstFieldValueUpdateProcessorFactory to reduce
> multivalued modified field to a single valued one.
>
> ahmet
>
>
> On Thurs
wiki.apache.org/solr/ExtractingRequestHandler#Input_Parameters
>
>
>
> By the way when I work with solr-cell I add following dynamic field
>
> indexed="true" />
>
> for debugging purposes. This will show you all field generated solr-cell
> and mcf. I select
ou use f.map setting in solrconfig.xml?
>
> I think it is better to send us the solrconfig.xml file where the Solr Cell
> handler is defined.
>
>
> On Thursday, July 10, 2014 12:18 AM, Ameya Aware
> wrote:
>
>
>
> Hi,
>
> Please have look at the below part taken from
Hi,
Please have a look at the part below, taken from the solr.log file.
INFO - 2014-07-09 15:30:56.243;
org.apache.solr.update.processor.LogUpdateProcessor;
[collection1] webapp=/solr path=/update/extract params={literal.deny_token_
document=DEAD_AUTHORITY&literal.DocIcon=docx&resource.name=Anarchism-