Hello, all:
My configuration works nicely with solr 4.4. I am encountering a configuration
error when I try to upgrade from 4.4 to 4.6. All I did was the following:
a) Replace the 4.4 solr.war file with the 4.6 solr.war in the tomcat/lib
folder. I am using version 6.0.36 of tomcat.
b) I repla
om scratch
--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com
On 9 May 2013 at 01:54, William Pierce wrote:
The reason I placed the solr.war in tomcat/lib was -- I guess -- because
that's the way I had always done it since the 1.3 days. Our tomcat instance(s)
run nothing other
ceDir/lib?
--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com
On 8 May 2013 at 21:15, William Pierce wrote:
Thanks, Alex. I have tried placing the jars in a folder under
solrhome/lib or under the instanceDir/lib with appropriate declarations in
the solrconfig.xml. I ca
. Have you tried putting those jars
somewhere else and using "lib" directive in solrconfig.xml instead to
point to them?
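For reference, the approach Alex suggests is a <lib> directive in solrconfig.xml; a minimal sketch, with illustrative paths (relative paths are resolved against instanceDir):

```
<!-- solrconfig.xml: load the DIH jars from a directory outside tomcat/lib.
     Both dir values below are placeholders for your own layout. -->
<lib dir="../../dist/" regex="solr-dataimporthandler-.*\.jar" />
<lib dir="/path/to/shared/solr-libs" regex=".*\.jar" />
```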
Regards,
Alex.
On Wed, May 8, 2013 at 2:07 PM, William Pierce
wrote:
I have gotten solr 4.3 up and running on tomcat7/windows7. I have added
the two dataimport ha
Hi,
I have gotten solr 4.3 up and running on tomcat7/windows7. I have added the
two dataimport handler jars (found in the dist folder of my solr 4.3 download)
to the tomcat/lib folder (where I also placed the solr.war).
Then I added the following line to my solrconfig.xml:
dih
Two suggestions: a) Noticed that your dih spec in the solrconfig.xml seems
to refer to "db-data-config.xml" but you said that your file was
db-config.xml. You may want to check this to make sure that your file
names are correct. b) What does your log say when you ran the import
process?
Our setup on ec2 is as follows:
a) mysql master on ebs volume.
b) solr master on its own ebs volume
c) solr slaves do not use ebs -- but rather use the ephemeral instance
stores. There is a small period of time where the solr slave has to re-sync
the data from the solr master.
Cheers,
Bill
I have used solr extensively for our sites (and for the clients I work
with). I think it is great! If you do an item-by-item feature list
comparison, I think you will find that solr stacks up quite well. And the
price, of course, cannot be beat!
However, there are a few intangibles that ma
hit the
http://localhost:8080/postingsmaster/replication using a browser from
the slave box. If you are able to hit it, what do you see?
On Tue, Dec 8, 2009 at 3:42 AM, William Pierce
wrote:
Just to make doubly sure, per tck's suggestion, I went in and
explicitly
added in the port in the ma
a 1.6.
Thanks,
- Bill
--
From: "William Pierce"
Sent: Monday, December 07, 2009 2:03 PM
To:
Subject: Re: Exception encountered during replication on slave...Any clues?
tck,
thanks for your quick response. I am running on the default por
7, 2009 at 4:44 PM, William Pierce
wrote:
Folks:
I am seeing this exception in my logs that is causing my replication to
fail. I start with a clean slate (empty data directory). I index the
data on the postingsmaster using the dataimport handler and it succeeds.
When the replication sl
Folks:
I am seeing this exception in my logs that is causing my replication to fail.
I start with a clean slate (empty data directory). I index the data on the
postingsmaster using the dataimport handler and it succeeds. When the
replication slave attempts to replicate it encounters this
Folks:
In my db I currently have fields that represent bitmasks. Thus, for example,
a value of the mask of 48 might represent an "undergraduate" (value = 16) and
"graduate" (value = 32). Currently, the corresponding field in solr is a
multi-valued string field called "EdLevel" which will h
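The mask-to-labels expansion described above can be sketched in plain Java before the document is posted to Solr. The class and method names are hypothetical; only the values 16 = "undergraduate" and 32 = "graduate" come from the message:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class EdLevelMask {
    // Bit value -> label, in a fixed order so output is deterministic.
    private static final Map<Integer, String> LABELS = new LinkedHashMap<>();
    static {
        LABELS.put(16, "undergraduate");
        LABELS.put(32, "graduate");
    }

    // Returns every label whose bit is set in the mask; the result feeds
    // the multi-valued "EdLevel" field.
    public static List<String> decode(int mask) {
        List<String> values = new ArrayList<>();
        for (Map.Entry<Integer, String> e : LABELS.entrySet()) {
            if ((mask & e.getKey()) != 0) {
                values.add(e.getValue());
            }
        }
        return values;
    }

    public static void main(String[] args) {
        // 48 = 16 | 32, so both labels are emitted.
        System.out.println(decode(48));
    }
}
```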
Have you gone through the solr tomcat wiki?
http://wiki.apache.org/solr/SolrTomcat
I found this very helpful when I did our solr installation on tomcat.
- Bill
--
From: "Jill Han"
Sent: Friday, December 04, 2009 8:54 AM
To:
Subject: RE: search
config. It should be
fine
On Tue, Dec 1, 2009 at 6:59 AM, William Pierce
wrote:
Hi, Joe:
I tried with the "fetchIndex" all lower-cased, and still the same result.
What do you specify for masterUrl in the solrconfig.xml on the slave?
it
seems to me that if I remove the element, I
...
Thanks,
- Bill
------
From: "William Pierce"
Sent: Monday, November 30, 2009 1:47 PM
To:
Subject: How to avoid hardcoding masterUrl in slave solrconfig.xml?
> Folks:
>
> I do not want to hardcode the masterUrl in the solrconfig.xml of my
> slave.
> If the masterUrl tag
Folks:
Sorry for this repost! It looks like this email went out twice
Thanks,
- Bill
--
From: "William Pierce"
Sent: Monday, November 30, 2009 1:47 PM
To:
Subject: How to avoid hardcoding masterUrl in slave solrconfig.xml?
Fo
Folks:
I do not want to hardcode the masterUrl in the solrconfig.xml of my slave. If
the masterUrl tag is missing from the config file, I am getting an exception in
solr saying that the masterUrl is required. So I set it to some dummy value,
comment out the poll interval element, and issue
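One way to avoid the hardcoded value is Solr's ${property:default} substitution in solrconfig.xml, with the value supplied externally (as a system property, or a per-core properties file, depending on your Solr version). A sketch of the slave side; the property name and URLs are illustrative:

```
<!-- Slave solrconfig.xml: masterUrl comes from a property with a default. -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="slave">
    <str name="masterUrl">${solr.master.url:http://localhost:8080/solr/replication}</str>
    <str name="pollInterval">00:00:60</str>
  </lst>
</requestHandler>
```

The slave would then be started with something like -Dsolr.master.url=http://master:8080/postingsmaster/replication.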
Folks:
Reading the wiki, I saw the following statement:
"Force a fetchindex on slave from master command :
http://slave_host:port/solr/replication?command=fetchindex
It is possible to pass on extra attribute 'masterUrl' or other attributes
like 'compression' (or any other parameter which
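Concretely, the override described in that wiki passage looks like this (host names and ports are placeholders):

```
http://slave_host:8080/solr/replication?command=fetchindex&masterUrl=http://master_host:8080/solr/replication
```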
Folks:
For those of you experienced linux-solr hands, I am seeking recommendations
for which file system you think would work best with solr. We are currently
running with Ubuntu 9.04 on an amazon ec2 instance. The default file system I
think is ext3.
I am, of course, seeking t
.(ApplicationFilterConfig.java:108)
at
org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3800)
Cheers,
- Bill
--
From: "William Pierce"
Sent: Monday, November 09, 2009 12:49 PM
To:
Subject: Solr Internal exception
Sorry, folks...I saw that there were two copies sent out...Been having
some email snafus at my end...so I apologize in advance for the duplicate
email.
- Bill
--
From: "William Pierce"
Sent: Monday, November 09, 2009 12:49 PM
To
Folks:
I am encountering an internal exception running solr on an Ubuntu 9.04 box,
running tomcat 6. I have deposited the solr nightly bits (as of October 7)
into the folder: /usr/share/tomcat6/lib
The exception from the log says:
Nov 9, 2009 8:26:13 PM org.apache.catalina.core.StandardConte
I'd recommend two ways: The way I do it in my app is that I have written a
MySql function to transform the column as part of the select statement. In
this approach, your select query would look like so:
select col1, col2, col3, spPrettyPrintCategory(category) as X, col4,
col5 from table
it appears that DIH uses a PreparedStatement in
the JdbcDataSource.
I set the batchSize parameter to -1 and it solved my problem.
Regards.
Gilbert.
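For reference, the parameter Gilbert mentions goes on the DIH data source in data-config.xml; with the MySQL driver, batchSize="-1" makes it stream rows instead of buffering the whole result set in memory. Connection details below are placeholders:

```
<dataSource type="JdbcDataSource"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost/mydb"
            user="user" password="pass"
            batchSize="-1" />
```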
William Pierce wrote:
Folks:
My db contains approx 6M records -- on average each is approx 1K bytes.
When I use the DIH, I reliably get an OOM
Folks:
My db contains approx 6M records -- on average each is approx 1K bytes. When
I use the DIH, I reliably get an OOM exception. The machine has 4 GB ram,
my tomcat is set to use max heap of 2G.
The option of increasing memory is not tenable because as the number of documents
grows I wi
Folks:
If I issue two optimize requests with no intervening changes to the index,
will the second optimize request be smart enough to not do anything?
Thanks,
Bill
Congratulations on this...What dotnet library did you use? We are also
using solr in our windows2003/C# environment but currently simply use HTTP
to query and the Dataimport handler to update the indices...
- Bill
--
From: "Robert Petersen"
S
itely be printed
2009/10/17 Noble Paul നോബിള് नोब्ळ् :
It is strange that LogTransformer did not log the data.
On Fri, Oct 16, 2009 at 5:54 PM, William Pierce
wrote:
Folks:
Continuing my saga with DIH and use of its special commands. I have
verified that the script functionality is indee
1:16 PM
To:
Subject: Re: Using DIH's special commands...Help needed
On Fri, Oct 16, 2009 at 5:54 PM, William Pierce
wrote:
Folks:
Continuing my saga with DIH and use of its special commands. I have
verified that the script functionality is indeed working. I also
verified
that '
value
(finest) and still no output.
Thanks,
- Bill
--
From: "Noble Paul നോബിള് नोब्ळ्"
Sent: Thursday, October 15, 2009 10:05 PM
To:
Subject: Re: Using DIH's special commands...Help needed
use LogTransformer to see if the val
Bill
--
From: "Shalin Shekhar Mangar"
Sent: Thursday, October 15, 2009 1:42 PM
To:
Subject: Re: Using DIH's special commands...Help needed
On Fri, Oct 16, 2009 at 12:46 AM, William Pierce
wrote:
Thanks for your help. Here is my DI
d_table
where (IndexingStatus = 1 or IndexingStatus = 4) ">
Thanks,
- Bill
--
From: "Shalin Shekhar Mangar"
Sent: Thursday, October 15, 2009 11:03 AM
To:
Subject: Re: Using DIH's special commands...
--
From: "Shalin Shekhar Mangar"
Sent: Thursday, October 15, 2009 10:03 AM
To:
Subject: Re: Using DIH's special commands...Help needed
On Thu, Oct 15, 2009 at 6:25 PM, William Pierce
wrote:
Folks:
I see in the DIH wiki that th
Folks:
I see in the DIH wiki that there are special commands which according to the
wiki
"Special commands can be given to DIH by adding certain variables to the row
returned by any of the components."
In my use case, my db contains rows that are marked "PendingDelete". How do
I use the
dea
> to
> support it in someway.
> I will post this thread on the dev-mailing list to seek opinion.
>
> Cheers
> Avlesh
>
> On Wed, Oct 14, 2009 at 11:39 PM, William Pierce wrote:
>
>> Thanks, Avlesh. Yes, I did take a look at the event listeners. As I
>>
Had a look at EventListeners in
DIH? http://wiki.apache.org/solr/DataImportHandler#EventListeners
Cheers
Avlesh
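For reference, the hooks on that wiki page are declared on the <document> element of data-config.xml; a sketch, with a hypothetical listener class name:

```
<!-- data-config.xml: fire a callback when the import ends, instead of
     polling /dataimport for status. The class name is a placeholder. -->
<document onImportEnd="com.example.ImportEndListener">
  <!-- entities as before -->
</document>
```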
On Wed, Oct 14, 2009 at 11:21 PM, William Pierce
wrote:
Folks:
I am pretty happy with DIH -- it seems to work very well for my
situation.
Thanks!!!
The one issue I see has
Folks:
I am pretty happy with DIH -- it seems to work very well for my situation.
Thanks!!!
The one issue I see has to do with the fact that I need to keep polling
<>/dataimport to check if the data import completed successfully. I need
to know when/if the import is completed (successfull
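A rough sketch of the completion check being polled for, assuming the stock XML status response of /dataimport. The class and method names are hypothetical, and a real client would use an XML parser rather than string search:

```java
public class DihStatus {
    // Pulls the value of <str name="status">...</str> out of the response.
    public static String extractStatus(String responseXml) {
        String marker = "<str name=\"status\">";
        int start = responseXml.indexOf(marker);
        if (start < 0) return "unknown";
        start += marker.length();
        int end = responseXml.indexOf("</str>", start);
        return end < 0 ? "unknown" : responseXml.substring(start, end);
    }

    // DIH reports "busy" while an import is running and "idle" otherwise.
    public static boolean isFinished(String responseXml) {
        return "idle".equals(extractStatus(responseXml));
    }

    public static void main(String[] args) {
        System.out.println(isFinished("<str name=\"status\">idle</str>"));
    }
}
```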
Oops...My bad! I didn't realize that by changing the subject line I was
still "part" of the thread whose subject I changed!
Sorry folks! Thanks, Hoss for pointing this out!
- Bill
--
From: "Chris Hostetter"
Sent: Tuesday, October 13, 2009 11:
Folks:
During query time, I want to dynamically compute a document score as
follows:
a) Take the SOLR score for the document -- call it S.
b) Lookup the "business logic" score for this document. Call it L.
c) Compute a new score T = func(S, L)
d) Return the documents sorted by T.
I h
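If func can be additive, one low-effort sketch is a dismax boost function, which adds the value of a function query to the relevance score S at query time. Here L is assumed to live in a hypothetical indexed field named BusinessScore:

```
q=some+query&defType=dismax&bf=BusinessScore
```

Anything func can not express through function queries would need a custom ValueSource or search component on the Solr side.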
those
machines.
Good luck!
Lance Norskog
On Sat, Oct 10, 2009 at 5:57 PM, William Pierce
wrote:
Oh and one more thing...For historical reasons our apps run using msft
technologies, so using SolrJ would be next to impossible at the present
time
Thanks in advance for your
Oh and one more thing...For historical reasons our apps run using msft
technologies, so using SolrJ would be next to impossible at the present
time
Thanks in advance for your help!
-- Bill
--
From: "William Pierce"
Sent: Saturda
Folks:
I have a corpus of approx 6 M documents each of approx 4K bytes.
Currently, the way indexing is set up I read documents from a database and
issue solr post requests in batches (batches are set up so that the
maxPostSize of tomcat which is set to 2MB is adhered to). This means that
in
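The batching constraint described above can be sketched as a size-capped splitter. The class name and byte limit are illustrative, and since Tomcat's maxPostSize counts the whole request body, a real client would also budget for the XML envelope around the documents:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class BatchBuilder {
    // Splits docs (already serialized) into batches whose combined UTF-8
    // size stays at or under maxBytes. A doc larger than maxBytes still
    // gets its own batch rather than being dropped.
    public static List<List<String>> batch(List<String> docs, int maxBytes) {
        List<List<String>> batches = new ArrayList<>();
        List<String> current = new ArrayList<>();
        int currentBytes = 0;
        for (String doc : docs) {
            int size = doc.getBytes(StandardCharsets.UTF_8).length;
            // Start a new batch when adding this doc would exceed the cap.
            if (!current.isEmpty() && currentBytes + size > maxBytes) {
                batches.add(current);
                current = new ArrayList<>();
                currentBytes = 0;
            }
            current.add(doc);
            currentBytes += size;
        }
        if (!current.isEmpty()) batches.add(current);
        return batches;
    }
}
```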
Folks:
Are there good rules of thumb for when to optimize? We have a large index
consisting of approx 7M documents and we currently have it set to optimize
once a day. But sometimes there are very few changes that have been
committed during a day and it seems like a waste to optimize (esp. s
Folks:
In our app we index approx 50 M documents every so often. One of the fields
in each document is called "CompScore" which is a score that our back-end
computes for each document. The computation of this score is heavy-weight
and is done only approximately once every few days. When d
Hi, Mark:
Thanks for the update...Looking forward to 1.4!
Cheers,
- Bill
--
From: "Mark Miller"
Sent: Wednesday, January 07, 2009 4:48 PM
To:
Subject: Re: Plans for 1.3.1?
William Pierce wrote:
Thanks, Ryan!
It is great
Thanks, Ryan!
It is great that Solr replication (SOLR-561) is included in this release.
One thing I want to confirm (if Noble, Shalin et al) can help:
I had encountered an issue a while back (in late October I believe) with
using SOLR-561. I was getting an error (AlreadyClosedException) from
That is fantastic! Will the Java replication support be included in this
release?
Thanks,
- Bill
--
From: "Ryan McKinley"
Sent: Wednesday, January 07, 2009 11:42 AM
To:
Subject: Re: Plans for 1.3.1?
there are plans for a regular release (1.4)
Paul നോബിള് नोब्ळ्" <[EMAIL PROTECTED]>
Sent: Saturday, November 15, 2008 11:40 PM
To:
Subject: Re: Fatal exception in solr 1.3+ replication
Is this issue visible consistently? I mean, are you able to
reproduce this easily?
On Fri, Nov 14, 2008 at 11:15 PM, William Pierce &
opens, which means our
IndexReaders don't attempt to close their underlying Directories now.
I can probably send you a patch for the revision you're on to hide this as
well, but I'm already in the doghouse on cleaning right now ;) The way my
brain works, I'll probably be back to this l
l goodness).
So how does the refcount on the Directory hit 0? I can't find or duplicate
it yet...
Trunk may actually still hide the issue (possibly), but something really
funky seems to have gone on and I can't find it yet. Do you have any
custom code interacting with solr?
- Mark
Will
of that before).
At worst, if/when a fix is discovered, you will probably be able to apply
just the fix to the revision you're working with.
- Mark
William Pierce wrote:
Mark,
Thanks for your response --- I do appreciate all you volunteers working
to provide such a nice system!
Anyway, I
unexpectedly...I'll try to take a further look over the
weekend.
- Mark
William Pierce wrote:
Folks:
I am using the nightly build of 1.3 as of Oct 23 so as to use the
replication handler. I am running on windows 2003 server with tomcat
6.0.14. Everything was running fine until I n
Folks:
I am using the nightly build of 1.3 as of Oct 23 so as to use the replication
handler. I am running on windows 2003 server with tomcat 6.0.14. Everything
was running fine until I noticed that certain updated records were not showing
up on the slave. Further investigation showed me t
I am using tomcat 6.0.14 without any problems on windows 2003 R2 server. I
am also using the 1.3 patch (using the nightly build of 10/23) for
master-slave replication... That's been working great!
-- Bill
--
From: "Otis Gospodnetic" <[EMAIL PRO
2008 at 2:27 AM, William Pierce <[EMAIL PROTECTED]>
wrote:
I tried the nightly build from 10/18 -- I did the following:
a) I downloaded the nightly build of 10/18 (the zip file).
b) I unpacked it and copied the war file to my tomcat lib folder.
c) I made the relevant changes in th
uration
If you are using a nightly you can try the new SolrReplication feature
http://wiki.apache.org/solr/SolrReplication
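From that wiki page, the master side of the feature boils down to registering the handler in solrconfig.xml; a sketch (the confFiles list is illustrative):

```
<!-- Master solrconfig.xml -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="master">
    <str name="replicateAfter">commit</str>
    <str name="confFiles">schema.xml,stopwords.txt</str>
  </lst>
</requestHandler>
```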
On Thu, Oct 23, 2008 at 4:32 AM, William Pierce <[EMAIL PROTECTED]>
wrote:
Otis,
Yes, I had forgotten that Windows will not permit me to overwrite files
currentl
try the new SolrReplication feature
http://wiki.apache.org/solr/SolrReplication
On Thu, Oct 23, 2008 at 4:32 AM, William Pierce <[EMAIL PROTECTED]>
wrote:
Otis,
Yes, I had forgotten that Windows will not permit me to overwrite files
currently in use. So my copy scripts are failing. Windows will
ng it? Are you
deleting index files from the index dir on Q that are no longer in the
index dir on U?
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: William Pierce <[EMAIL PROTECTED]>
To: solr-user@lucene.apache.org
Sent: Wednesday
Folks:
I have two instances of solr running one on the master (U) and the other on
the slave (Q). Q is used for queries only, while U is where updates/deletes
are done. I am running on Windows so unfortunately I cannot use the
distribution scripts.
Every N hours when changes are committed
Folks:
I have an odd situation that I am hoping someone can shed light on.
I have a solr app running under tomcat 6.0.14 (on a windows xp sp3
machine).
The app is declared in the tomcat config file as follows:
In file "merchant.xml" for the "merchant" app:
I have of course created the
your* shards and *your* type of queries is by benchmarking.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
- Original Message
From: William Pierce <[EMAIL PROTECTED]>
To: solr-user@lucene.apache.org
Sent: Thursday, May 15, 2008 12:23:03 PM
Subject: Some advice on s
Folks:
We are building a search capability into our web and plan to use Solr. While
we have the initial prototype version up and running on Solr 1.2, we are now
turning our attention to sizing/scalability.
Our app in brief: We get merchant sku files (in either xml/csv) which we
process an
separate _request_ after your
, or putting a commit into the same request. Solr only supports
one command (add or commit, but not both) per request.
Erik
On May 13, 2008, at 10:36 AM, William Pierce wrote:
Thanks for the comments
The reason I am just adding one document followed by a com
t=true in the URL of the add command.
-Yonik
On Tue, May 13, 2008 at 9:31 AM, Alexander Ramos Jardim
<[EMAIL PROTECTED]> wrote:
Maybe a delay in commit? How much time elapsed between commits?
2008/5/13 William Pierce <[EMAIL PROTECTED]>:
> Hi,
>
> I am having problems with
Hi,
I am having problems with Solr 1.2 running tomcat version 6.0.16 (I also tried
6.0.14 but same problems exist). Here is the situation: I have an ASP.net
application where I am trying to add a single document to an
index. After I add the document and issue the commit, I can see (in the
solr