hat I need to set the path to the
> dataimport.properties file?
>
> On Tue, Mar 3, 2009 at 8:03 PM, Noble Paul നോബിള് नोब्ळ्
> wrote:
>> I do not see anything wrong with this. It should have worked. Can you
>> check that dataimport.properties is created (by DIH)?
I have reopened the issue. We will fix it completely in a day or two.
On Wed, Mar 4, 2009 at 6:30 PM, Walter Ferrara wrote:
> tried with
>
>
> but no luck, the dataDir parameter seems ignored, no matter what is written
> there
>
> On Wed, Mar 4, 2009 at 12:58 P
d jetty
> container. Modify the example/conf/schema.xml file and add your own fields
> etc. Read through the DataImportHandler wiki page and at the
> example/example-DIH directory in the solr zip/tarball.
>
> If you have a specific doubt/question, ask on the list.
>
> --
> Regards,
> Shalin Shekhar Mangar.
>
>
>
>
--
--Noble Paul
solrj API. Any help will be
> appreciated.
>
> Thanks,
>
--
--Noble Paul
can we have
> on this server?
>
> *Server config*
>
> # OS: Red Hat Enterprise Linux ES 4 - 64 Bit
> # Processor: Dual AMD Opteron Dual Core 270 2.0 GHz
> # 4GB DDR RAM
> # Hard Drive: 73GB SCSI
> # Hard Drive: 73GB SCSI
>
> thanks
>
--
--Noble Paul
Also not entirely true for the solrj client. Assuming the response
>> includes the standard solr data structures (NamedList, DocList, Doc, etc...)
>> the solrj client will parse the response.
>>
>
> Yes, apologies for the wrong characterization. I meant to say that one
> cannot use the nice API methods to navigate the results.
>
> --
> Regards,
> Shalin Shekhar Mangar.
>
--
--Noble Paul
ntext:
> http://www.nabble.com/problem-using-dataimporthandler-tp22406450p22406450.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>
--
--Noble Paul
.SQLServerDriver"
> url="jdbc:sqlserver://192.168.1.120:1433;DatabaseName=g2" user="sa"
> password="mima1234@"/>
>
>
> >
>
>
>
>
>
>
>
> Not sure what is
> --
> View this message in context:
> http://www.nabble.com/DataImportHandler-that-uses-JNDI-lookup-tp22408996p22408996.html
>
>
--
--Noble Paul
Suggestions on how to do the delete bit from within an entity?
>
> Regards Fergus.
>
>
>
> --
>
> =======
> Fergus McMenemie Email:fer...@twig.me.uk
> Techmore Ltd Phone:(UK) 07721 376021
>
> Unix/Mac/Intranets Analyst Programmer
> ===
>
--
--Noble Paul
ex is an optional value of a regex to
>>> identify documents which when matched should
>>> be deleted from the index **PLANNED**
>>>
>>> allowRegex a required regex to identify the portion
>>> of the A
hook in your own for
> debugging purposes? I can't seem to locate the options in the Wiki or
> remember if it was available.
>
> Thanks.
>
> - Jon
>
--
--Noble Paul
anaged
> to do last round.
>
> (I'm using something along these lines with my current, non-DIH-based
> indexing scheme.)
>
> Am I making sense here?
>
> Chris
>
--
--Noble Paul
be relative to baseDir. Required.
>>>>>
>>>>> manifestAddRegex is a required regex to identify lines
>>>>> which when matched should cause docs to
>>>>> be added to the index.
>>>>>
&g
There is no harm in hosting Solr along with other webapps.
On Tue, Mar 10, 2009 at 5:14 AM, jlist9 wrote:
> Is it a bad idea to embed my webapp in solr jetty? Or is it always
> better to use a separate web server if I'm serving the result from a
> web server?
>
> Thanks
>
--
--Noble Paul
d as it's getting requests from 100 machines
Set up repeater nodes: configure a few slaves as masters also, and make some
slaves replicate from these slaves instead of the primary master.
>
> Thoughts?
>
> Thanks,
> Sameer.
> --
> http://www.productification.com
>
--
--Noble Paul
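A repeater node in the Java-based replication is simply one whose ReplicationHandler is configured as both a slave (of the primary master) and a master (for its downstream slaves). A minimal sketch of the solrconfig.xml fragment, with hostnames and the poll interval as illustrative values:

```xml
<!-- solrconfig.xml on a repeater node: it pulls from the primary master
     and serves the index to its own slaves. Hostnames are examples. -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="master">
    <str name="replicateAfter">commit</str>
  </lst>
  <lst name="slave">
    <str name="masterUrl">http://primary-master:8983/solr/replication</str>
    <str name="pollInterval">00:00:60</str>
  </lst>
</requestHandler>
```

The downstream slaves then point their own masterUrl at the repeater instead of the primary master.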
but I want to replace this with an environment variable, like:
> ${solr.data.dir:"%SOLR_DATA%"}
>
> How is it possible in solr 1.3
>
> Thanks
> con
> --
> View this message in context:
> http://www.nabble.com/Custom-path-for-solr-lib-and-data-folder-tp22450530p22450530.html
>
>
--
--Noble Paul
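For reference, Solr's ${...} substitution in solrconfig.xml resolves JVM system properties (with an optional default after the colon), not OS environment variables directly. A common pattern, assuming the property name solr.data.dir:

```xml
<!-- solrconfig.xml: use the system property solr.data.dir,
     falling back to ./solr/data when it is not set -->
<dataDir>${solr.data.dir:./solr/data}</dataDir>
```

The environment variable then has to be passed through as a system property when starting the container, e.g. `java -Dsolr.data.dir=$SOLR_DATA -jar start.jar`.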
Thanks in advance...
> Ashish
>
>
>
>
> --
> View this message in context:
> http://www.nabble.com/SolrJ-XML-indexing-tp22450845p22450845.html
>
>
--
--Noble Paul
>>>> >>
>>> >
>>> > Unfortunately, no. The fix is in trunk but the trunk DataImportHandler
>>> > uses
>>> > a new rollback operation which is not supported by Solr 1.3 release.
>>> >
>>
>> However you should be able to backport the changes in SOLR-742 to Solr 1.3
>> code.
>>
>> --
>> Regards,
>> Shalin Shekhar Mangar.
>>
>
>
--
--Noble Paul
> will get events before and after the import? Correct me if I'm wrong: there
> are currently (1.4) preImportDeleteQuery and postImportDeleteQuery hooks for
> the entire import, just nothing on the entity level?
>
> - Jon
>
> On Mar 9, 2009, at 2:48 PM, Noble Paul നോബിള് नोब्ळ् w
>>
>> I'm not sure if this is the right forum for this, but I'm wondering if I
>> could get a rough timeline of when version 1.4 of Solr might be out?
>> I'm trying to figure out whether we will be able to use the new built-in
>> replication as opposed to the current rsync collection distribution.
>>
>>
>>
>> Thanks,
>>
>> Laurent
>
>
--
--Noble Paul
IH, but if you can put your xml in a file
behind an http server then you can fire a command to DIH to pull data
from the url quite easily.
>
> Regards,
> CI
>
--
--Noble Paul
:
> http://www.nabble.com/Solr-1.3-and-Solr-1.4-difference--tp22471477p22471477.html
>
>
--
--Noble Paul
.nabble.com/SolrJ-%3A-EmbeddedSolrServer-and-database-data-indexing-tp22488697p22489420.html
>
>
--
--Noble Paul
e2 -> Table2Instance3 -> Table3Instance4
>
> I wanted to have a single document per root object instance (in this case
> per Table1 instance) but with the values from the different lines returned.
>
> Is it possible to have this behavior in DataImportHandler? How?
>
> Thanks in advance,
> Rui Pereira
>
--
--Noble Paul
>
> this is pretty easy to understand: no ClobTransformer implementation is
> found in the classpath.
>
> The question is: is there any default ClobTransformer shipped with Solr, or
> do I have to implement a custom one?
>
> Thanks,
> Giovanni
>
--
--Noble Paul
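For later readers: the 1.4 trunk DIH does ship a ClobTransformer, which turns a CLOB value into a String when a field is flagged with clob="true". A minimal data-config sketch (entity, query, and column names here are illustrative, not from the thread):

```xml
<!-- data-config.xml fragment: convert an Oracle CLOB column to a String.
     Table and column names are examples only. -->
<entity name="item" transformer="ClobTransformer"
        query="select ID, DESCRIPTION from ITEM">
  <field column="DESCRIPTION" clob="true" />
</entity>
```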
Solr and then have more that
> one document with the same id, but then I don't know if in delta-imports the
> documents outdated or deleted are updated (updated document is added and
> then we would have the outdated and the updated document in the index) or
> removed.
>
> No
queries are
> dynamically generated, taking a determinate topology into consideration.
>
> Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> have one root entity which just does a "select id from Table1".
>> Then have a child entity which does all the joins and returns all other
>
any rollbacks, B) get DIH to do
> auto-commits, and C) make my custom transformer update the DB marker
> only immediately after an auto-commit.
>
> On Mon, Mar 9, 2009 at 9:27 PM, Noble Paul നോബിള് नोब्ळ्
> wrote:
>> I recommend writing a simple transformer which can write an entry
Does this solve your problem?
https://issues.apache.org/jira/browse/SOLR-1065
On Wed, Mar 11, 2009 at 11:52 PM, Noble Paul നോബിള് नोब्ळ्
wrote:
> On Tue, Mar 10, 2009 at 12:17 PM, CIF Search wrote:
>> Just as you have an xslt response writer to convert Solr xml response to
&
In Solr, commits are already expensive, so a second's delay may
be alright.
>
> 4. Has anyone else on the list attempted to do this? The intent
> here is to achieve optimal performance while have the freshest data
> possible if that's possible.
>
>
>
> Thanks,
> Laurent
>
>
--
--Noble Paul
I am not able to
> UPDATE an existing document or REMOVE a document that is not anymore in the
> DB.
>
> What am I missing? How should I specify my deltaQuery?
>
> Thanks a lot in advance!
>
> Giovanni
>
--
--Noble Paul
DB anymore. Is this possible?
>
> I am no DB expert...so ANY tip is very welcome!
>
> Thanks,
> Giovanni
>
>
> On 3/18/09, Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> Are you sure your schema.xml has a uniqueKey field, so that docs can be UPDATEd?
>>
>> to remove deleted doc
LEFT JOIN
> Place Place1 ON SubPlace1.PLACEID = Place1.PLACEID LEFT JOIN PType PType2 ON
> Place1.PTYPEID = PType2.PTYPEID LEFT JOIN SType SType3 ON Sub0.STYPEID =
> SType3.STYPEID WHERE Sub0.SUBID = ${3142.SUBID}">
>
>
>
>
>
>
>
> I get results when running the deltaQuery manually, but Solr doesn't import
> anything!!!
> What am I doing wrong?!
>
> Thanks in advance,
> Rui Pereira
>
--
--Noble Paul
nt.vauthrin=disney@lucene.apache.org
> [mailto:solr-user-return-19721-laurent.vauthrin=disney....@lucene.apache.org]
> On Behalf Of Noble Paul ??? ??
> Sent: Tuesday, March 17, 2009 9:04 PM
> To: solr-user@lucene.apache.org
> Subject: Re: More replication questions
>
> On Wed, M
>> I have a URL the responds with a well formatted solr add xml (I'm able
>>>> to add it by POSTing). But when I try to add it using
>>>> http://localhost:8983/solr/dataimport?command=full-import i get a null
>>>> pointer exception.
>>>
>>>
>>> You need to use XPathEntityProcessor. If you do not specify a processor, the
>>> default is SqlEntityProcessor (used for DB imports).
>>>
>>> Add the attribute processor="XPathEntityProcessor" to the entity and try.
>>>
>>> --
>>> Regards,
>>> Shalin Shekhar Mangar.
>>>
>>
>
--
--Noble Paul
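Following Shalin's advice above, a data-config for pulling a Solr add-XML document over HTTP might look like the sketch below. The URL is an assumption; useSolrAddSchema tells XPathEntityProcessor that the input is already in Solr's add/doc format, so no per-field xpath mappings are needed:

```xml
<dataConfig>
  <dataSource type="HttpDataSource" encoding="UTF-8" />
  <document>
    <!-- URL is illustrative; point it at the server that emits <add><doc>... -->
    <entity name="docs" processor="XPathEntityProcessor"
            url="http://localhost/feed/add.xml"
            useSolrAddSchema="true" stream="true" />
  </document>
</dataConfig>
```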
eems-to-work-tp22597630p22597630.html
>
>
--
--Noble Paul
Sorry, the whole thing was commented out. I did not notice that. I'll
look into that.
2009/3/20 Noble Paul നോബിള് नोब्ळ् :
> You have set autoCommit every x minutes; it must have invoked commit
> automatically.
>
>
> On Thu, Mar 19, 2009 at 4:17 PM, sunnyfr wrote:
>>
> at org.apache.solr.core.SolrResourceLoader.inform(SolrResourceLoader.java:388)
> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:571)
> at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:121)
> at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:69)
>
> Thanks
>
--
--Noble Paul
ation-%3A-segment-optimized-automaticly-td22601442.html
>
> thanks a lot Paul
>
>
> Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> Sorry, the whole thing was commented out. I did not notice that. I'll
>> look into that.
>>
>> 2009/3/20 Noble Paul നോബിള് नोब्ळ् :
s it problematic to delete documents from a determinate
> entity while importing?
Solr does not have an issue, but be aware that the commit may be
happening after the import; if that is OK for your data then it
should be OK.
>
> Thanks in advance,
> Rui Pereira
>
--
--Noble Paul
eld, you can try
>>> to use MappingCharFilter
>>> instead of ISOLatin1AccentFilter. Add the following line to
>>> mapping-ISOLatin1Accent.txt:
>>>
>>> "è" => "e"
>>>
>>> and add the following fieldType:
>>>
>>> >> positionIncrementGap="100" >
>>>
>>> >> mapping="mapping-ISOLatin1Accent.txt"/>
>>>
>>>
>>>
>>>
>>> MappingCharFilter and mapping-ISOLatin1Accent.txt are in nightly build.
>>>
>>> Koji
>>>
>>>
>>>
>>>
>>
>> --
>> View this message in context:
>> http://www.nabble.com/Problem-with-UTF-8-and-Solr-ISOLatin1AccentFilterFactory-tp22607642p22616220.html
>>
>
>
--
--Noble Paul
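The archive stripped the XML tags from Koji's quoted config above. Based on the surrounding text, the fieldType was presumably along these lines (the field type name and tokenizer choice are assumptions):

```xml
<!-- Reconstructed sketch of the charFilter-based fieldType described above -->
<fieldType name="text_mapped" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <charFilter class="solr.MappingCharFilterFactory"
                mapping="mapping-ISOLatin1Accent.txt" />
    <tokenizer class="solr.WhitespaceTokenizerFactory" />
  </analyzer>
</fieldType>
```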
:
>>
>> ---
>> String addContent = ""
>> +"123> name="includes">eaiou with circumflexes:âîôû"
>> +"";
>> DirectXmlRequest up = new DirectXmlRequest( "/update", addContent );
>> server.request( up );
>> ---
>>
>> thanks for help
>
>
--
--Noble Paul
er_set are in UTF-8.
>
> I think that the dataimporter gets data in ISO, so I just wrote a custom
> transformer to change the row's charset from ISO to UTF-8, and now it works.
>
> --> Noble Paul : I use SOLR 1.4 Nighty 2009-03-18 build. i have to download
> the last one to apply y
search those indexes by keyword. I tried to search like
> http://localhost:8080/solr/select/?q=Peter. But I got zero response.
> Can anyone help me understand why this is?
> Thanks in advance.
>
--
--Noble Paul
ot;/>
>
> person_id
>
>
> It is indexed properly. But I am not getting any results. Do I need to do an
> explicit commit?
> Or am I querying correctly with http://localhost:8080/solr/select/?q=Peter ?
>
>
>
> 2009/3/22 Noble Paul നോബിള് नोब्ळ्
>
>> it i
import?command=delta-import&optimize=false
>> >
>> > Solrconfig.xml : autocommit turnd off
>> >
>> >
>> >
>> >
>> > Maybe it comes from lucene parameters?
>> >
>> > false
>> > 50
>> > 50
>> >
>> >
>> > 2147483647
>> > 1
>> >
>> > Thanks a lot for your help,
>> > Sunny
>> >
>> >
>> >
>> >
>>
>> --
>> View this message in context:
>> http://www.nabble.com/Problem-for-replication-%3A-segment-optimized-automaticly-tp22601442p22649412.html
>>
>>
>
>
> --
> Regards,
> Shalin Shekhar Mangar.
>
--
--Noble Paul
sAdd(RunUpdateProcessorFactory.java:59)
> at org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:67)
> at org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:263)
> at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:377)
> at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:225)
>
>
>
--
--Noble Paul
hitsOnPage.add(values);
> System.out.println(values.get("displayname") + " (" +
> values.get("displayphone") + ")");
> }
> }
> catch (SolrServerException e)
> {
> e.printStackTrace();
> }
>
> }
>
> public static void main(String[] args)
> {
> SolrjTest solrj = new SolrjTest();
> solrj.query("Glenn");
> }
> }
>
>
>
--
--Noble Paul
If the DIH status does not say that it optimized, it is Lucene
merging the segments.
On Mon, Mar 23, 2009 at 8:15 PM, sunnyfr wrote:
>
> I checked this out but it doesn't say anything about optimizing.
> I'm sure it's the Lucene part about merging, or I don't know.
View this message in context:
> http://www.nabble.com/Delta-import-tp22663196p22663196.html
>
>
--
--Noble Paul
the XML
parsing overhead, and the data size is small (so less bandwidth).
>
> Thanks,
> Siddharth
>
--
--Noble Paul
Mar 23 00:27:55 PDT 2009
>> > Current Replication Status Start Time: Mon Mar 23 00:22:55 PDT 2009
>> > Files Downloaded: 12 / 163
>> > Downloaded: 4.12 MB / 1.41 GB [0.0%]
>> > Downloading File: _5no.tis, Downloaded: 0 bytes / 629.57 KB [0.0%]
>> > Time Elapsed: 26371s, Estimated Time Remaining: 9216278s, Speed: 163
>> > bytes/s
>> >
>> >
>> >
>> > --
>> > Jeff Newburn
>> > Software Engineer, Zappos.com
>> > jnewb...@zappos.com - 702-943-7562
>> >
>>
>
>
>
> --
> Regards,
> Shalin Shekhar Mangar.
>
--
--Noble Paul
>> ?
>>>
>>>
>> Not really, I've seen 1000x faster. Try firing a few of those queries on
>> the
>> database directly. Are they slow? Is the database remote?
>>
>> --
>> Regards,
>> Shalin Shekhar Mangar.
>>
>>
>
> --
> View this message in context:
> http://www.nabble.com/Delta-import-tp22663196p22710222.html
>
>
--
--Noble Paul
u whether a commit/optimize
>>> was performed
>>>
>>> On Fri, Mar 20, 2009 at 7:07 PM, sunnyfr wrote:
>>>>
>>>> Thanks I gave more information there :
>>>> http://www.nabble.com/Problem-for-replication-%3A-segment-optimized-automaticly-td
dule a delta-import every Sunday morning at 7am or perhaps every hour
> without human intervention. Writing a cron job to do this wouldn't be
> difficult. I'm just wondering is this a built in feature?
>
> Tricia
>
--
--Noble Paul
ache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
> at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:446)
> at java.lang.Thread.run(Unknown Source)
> 13:32:35,196 ERROR [STDERR] 25/Mar/2009 13:32:35
> org.apache.solr.core.SolrCore execute
> INFO: [] webapp=/apache-solr-nightly path=/update
> params={topologyid:3142=} status=400
> QTime=16
>
> Thanks in advance,
> Rui Pereira
>
--
--Noble Paul
// TODO(RP): treat exception
> } catch (TransformerConfigurationException e) {
> // TODO(RP): treat exception
> } catch (TransformerException e) {
> // TODO(RP): treat exception
> }
>
>
> I changed t
field), this is, a way to continue with the next entity.
> Thanks in advance,
> Rui Pereira
>
--
--Noble Paul
Set<String> filesToCopy = new HashSet<String>();
>
> http://www.nabble.com/file/p22734005/ReplicationHandler.java
> ReplicationHandler.java
>
> Thanks a lot,
>
>
>
>
>
> Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> Thanks, James.
>>
>> If this is true, the place to fix this
show me some examples?
> Thanks in advance,
> Rui Pereira
>
--
--Noble Paul
w.nabble.com/replication-requesthandler-solr1.4-slow-answer-tp22721744p22721744.html
>
>
--
--Noble Paul
the latest nightly should do fine
On Fri, Mar 27, 2009 at 1:59 PM, sunnyfr wrote:
>
> Sorry, but which one should I take?
> And where exactly?
>
>
> Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> This fix is in the trunk;
>> you may not need to apply the patch.
&
n the solrconfig.xml and default handlers)
>> that could help optimize my indexing process?
>
> Increase ramBufferSizeMB as much as you can afford.
> Comment out maxBufferedDocs, it's deprecated.
> Increase mergeFactor slightly.
> Consider the CSV approach.
> Index with multiple threads (match the number of CPU cores).
> If you are using Solrj, use the Streaming version of SolrServer.
> Give the JVM more memory (you'll need it if you increase ramBufferSizeMB)
>
> Otis
> --
> Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
>
>
--
--Noble Paul
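Otis's checklist above maps onto a handful of solrconfig.xml settings; a sketch with purely illustrative values (tune them to your heap and hardware):

```xml
<!-- solrconfig.xml indexing settings; the numbers are examples, not advice -->
<indexDefaults>
  <!-- a larger RAM buffer means fewer flushes; give the JVM matching heap -->
  <ramBufferSizeMB>256</ramBufferSizeMB>
  <!-- maxBufferedDocs is deprecated; leave it out entirely -->
  <!-- slightly higher mergeFactor defers merge cost during bulk indexing -->
  <mergeFactor>15</mergeFactor>
</indexDefaults>
```

On the client side, the streaming SolrJ server (StreamingUpdateSolrServer in the 1.4 trunk) keeps several update requests in flight instead of one HTTP round-trip per batch.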
>
--
--Noble Paul
//www.nabble.com/solrReplication-solr1.4-slave-is-slower-during-replication-tp22769716p22769716.html
>
>
--
--Noble Paul
DateFormatTransformer
> and without? You can also try the workaround I had suggested in the email I
> mentioned above to see if that solves the problem.
>
> --
> Regards,
> Shalin Shekhar Mangar.
>
--
--Noble Paul
gt;>> The question is how this can be achieved with the new
>>> SearchComponent architecture.
>>>
>>> Any inputs would be appreciated!
>>>
>>> Alex
>>
>>--
>>Grant Ingersoll
>>http://www.lucidimagination.com/
>>
>>Search the Lucene ecosystem (Lucene/Solr/Nutch/Mahout/Tika/Droids)
>>using Solr/Lucene:
>>http://www.lucidimagination.com/search
>
>
--
--Noble Paul
e unclear about the difference between an
> indexed field and a stored field.
> Can anyone explain to me what is meant by indexed and stored fields? And what
> is meant by storing large fields outside of Solr (do I need to set
> stored="false" in schema.xml)?
>
> Thanks in advance.
>
>
--
--Noble Paul
http://wiki.apache.org/solr/DIHCustomTransformer
On Tue, Mar 31, 2009 at 7:38 PM, Radha C. wrote:
> Hi,
>
> Is there any documentation available for usage of transformers in
> dataimport. If so can anyone tell me the url ?
>
> Thanks
>
--
--Noble Paul
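The wiki page linked above describes a DIH custom transformer as just a class with a transformRow(Map) method that DIH finds reflectively. A minimal, hypothetical example (the class name and column are illustrative, not part of Solr):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical DIH transformer: trims whitespace from every String value
// in a row. DIH invokes transformRow via reflection, so no Solr interface
// is required; wire it up with transformer="com.example.TrimTransformer"
// on the <entity> in data-config.xml.
public class TrimTransformer {
    public Object transformRow(Map<String, Object> row) {
        for (Map.Entry<String, Object> e : row.entrySet()) {
            if (e.getValue() instanceof String) {
                e.setValue(((String) e.getValue()).trim());
            }
        }
        return row;
    }

    // Standalone check of the transform logic on a fake row.
    public static void main(String[] args) {
        Map<String, Object> row = new HashMap<String, Object>();
        row.put("name", "  Peter  ");
        new TrimTransformer().transformRow(row);
        System.out.println(row.get("name"));
    }
}
```

Returning the (mutated) row keeps the document; returning null from transformRow would cause DIH to skip it.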
my index.
>
> replaceWith="Video" />
>
> However, the value in the index was set to "VideoVideo" for all documents.
>
> Any idea why this DIH instruction would make the constant value appear twice?
>
> Thanks,
> Wesley.
>
>
>
--
--Noble Paul
===
> Fergus McMenemie Email:fer...@twig.me.uk
> Techmore Ltd Phone:(UK) 07721 376021
>
> Unix/Mac/Intranets Analyst Programmer
> ===
>
--
--Noble Paul
ata directory. My question is how
> can I create cores on fly and have them point to different data
> directories so each core write index in different location?
>
> Thanks,
> -vivek
>
--
--Noble Paul
Tomcat is protected with a password, i.e. we have to give the username
> and password when trying to access the web applications that are deployed in
> it. My doubt is: how do we overcome this when Solr tries to access a resource
> from Tomcat?
> I did try to add the username and password i
olrServer.java:245)
> at org.apache.solr.client.solrj.request.UpdateRequest.process(UpdateRequest.java:243)
> at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:48)
> at SolrIndexTest.main(SolrIndexTest.java:46)
> Java Result: 1
> Java Result: 1
>
>
--
--Noble Paul
e-
> From: Radha C. [mailto:cra...@ceiindia.com]
> Sent: Wednesday, April 01, 2009 12:28 PM
> To: solr-user@lucene.apache.org
> Subject: RE: Runtime exception when adding documents using solrj
>
>
> I am using Solr 1.3 version
>
> _
>
> From: Noble Paul നോബിള്
ava code..
>>
>> -Original Message-
>> From: Radha C. [mailto:cra...@ceiindia.com]
>> Sent: Wednesday, April 01, 2009 12:28 PM
>> To: solr-user@lucene.apache.org
>> Subject: RE: Runtime exception when adding documents using solrj
>>
>>
>>
;1.6.0_12"
> Java(TM) SE Runtime Environment (build 1.6.0_12-b04)
> Java HotSpot(TM) 64-Bit Server VM (build 11.2-b01, mixed mode)
>
> Any idea why this could be happening?
>
>
>
>
> --
> View this message in context:
> http://www.nabble.com/performance-tests-with-DataImportHandler-and-full-import-tp22823145p22823145.html
>
>
--
--Noble Paul
7)
>
> I don't have any "Long" member variable in my java object - so not
> sure where is this coming from. I've checked the schema.xml to make
> sure the data types are ok. I'm adding 15K objects at a time - I'm
> assuming that should be ok.
>
> Any ideas?
>
> Thanks,
> -vivek
>
--
--Noble Paul
my index is the following:
>>
>>
>> 2002-12-18T00:00:00Z
>>
>>
>>
>> 2002
>>
>>
>>
>> 2002-12-18T05:00:00Z
>>
>>
>> You'll notice that the hour (HH) in original_air_date_d is set to
>> 05. It should still be 00. I have noticed that it changes to either 04 or
>> 05 in all cases within my index.
>>
>> In my schema the dynamic field "*_d"
>>
>>
>> Thanks,
>> Wesley.
>>
>>
>
>
--
--Noble Paul
on( UpdateRequest.ACTION.COMMIT, false, false );
>> UpdateResponse res = req.process(server);
>>
>> Initially the POJO was getting added, when it was not a composite POJO.
>> After trying to have a composite POJO, things are not working.
>> What is it that I am doing wrong?
>>
>> Any help will be appreciated.
>>
>>
>>
>
>
> -
> Regards,
> Praveen
> --
> View this message in context:
> http://www.nabble.com/Composite-POJO-support-tp22841854p22845799.html
>
>
--
--Noble Paul
> Solr Implementation Version: 1.4-dev exported - root - 2009-01-22 13:51:22
>
> http://www.nabble.com/file/p22846336/CPU.jpg CPU.jpg
> --
> View this message in context:
> http://www.nabble.com/JVM-best-tune--help-...-solr1.4-tp22846336p22846336.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>
--
--Noble Paul
ld be fine.
>
> I wonder whether I shouldn't give 7G as the JVM's Xmx; I don't know,
> but the slave also has a little problem during replication from the master.
>
>
>
> Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> If you are looking at the QTime on the master it is likely to be
&g
lder merged, so every time
> it brings back 10G of data.
>
> And during this time the response times of my requests are very slow.
> What can I check?
>
> Thanks Paul
>
> 2009/4/2 Noble Paul നോബിള് नोब्ळ्
>>
>> slave would not show increased request times because of repl
in your environment. I suggest that you
> look in the documentation of the jdbc driver. However, I'm quite
> certain that there exists a pure java jdbc driver for Oracle too.
>
> --
> Regards,
> Shalin Shekhar Mangar.
>
--
--Noble Paul
the nightly war NOT the right one to use?
>>
>> Thanks for your help.
>>
>> - ashok
>>
>>
>>
>
> --
> View this message in context:
> http://www.nabble.com/Oracle-Clob-column-with-DIH-does-not-turn-to-String-tp22859837p22859865.html
>
>
--
--Noble Paul
?) the source and
> replace the jar for DIH, right? I can try - for the first time.
> - ashok
>
> Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> This looks strange. Apparently the Transformer did not get applied. Is
>> it possible for you to debug ClobTransformer adding(System.out
On Fri, Apr 3, 2009 at 11:28 AM, Praveen Kumar Jayaram
wrote:
>
>
> Thanks for the reply Noble Paul.
> In my application I will be having multiple types of object and the number
> of properties in each object will vary.
> So I have made them as FieldType and defined in schema.xm
know
> first_date_d instruction is valid, but it just disappears.
>
> Any thoughts?
>
> On 4/1/09 11:59 PM, "Noble Paul നോബിള് नोब्ळ्"
> wrote:
>
>> I guess dateFormat does the job properly but the returned value is
>> changed according to timezone.
>&g
I see none of these prints coming up in my
> 'catalina.out' file. Is that the right file to be looking at?
>
> As an aside, is 'catalina.out' the ONLY log file for SOLR? I turned on the
> logging to 'FINE' for everything. Also, these settings seem to go away
uot; user="remedy" password="y"/>
>
>
>
>
>
>
>
>
>
> ===
>
> A search result on the field short_desc:
> --
>
>
> 1.8670129
> oracle.sql.c...@155e3ab
> 4486
> Develop Rating functi
he 'war' that download came with. Thanks Noble.
>
> Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> And which version of Solr are you using?
>>
>> On Fri, Apr 3, 2009 at 10:09 PM, ashokc wrote:
>>>
>>> Sure:
>>>
>>> data-c
ex has no data in the field 'projects'. Is it NOT
> possible to create multi-valued fields with DIH?
>
> Thanks
> --
> View this message in context:
> http://www.nabble.com/Multi-valued-fields-with-DIH-tp22877509p22877509.html
>
>
--
--Noble Paul
haps this transformer can be modified to be case-insensitive for
> the column names. If you had written it perhaps it is a quick change for
> you?
>
> Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> I guess you can write a custom transformer which gets a String out of
>> the orac
g to do this? I am very new to Solr, so please don't mind if I ask
> dumb questions.
>
> Please give some sample example if possible.
>
>
>
> Noble Paul നോബിള് नोब्ळ् wrote:
>>
>> On Fri, Apr 3, 2009 at 11:28 AM, Praveen Kumar Jayaram
>> wrote:
>>>
>
uestions: if command=full-import, this should effectively mean that
> all DIH configurations are executed in sequential order. Is that correct? I
> am not seeing that behaviour at present.
>
> Thanks,
> Wesley
>
>
--
--Noble Paul
g and skipping a file if it has already been
> indexed
> and not changed since?
>
>
> Thank you.
>
> Regards,
> Veselin K
>
>
--
--Noble Paul
> 0.21
> (error executing: uname -a)
> (error executing: ulimit -n)
> (error executing: uptime)
>
> Thanks
>
> --
> View this message in context:
> http://www.nabble.com/solr-1.4-memory-jvm-tp22913742p22913742.html
>
>
--
--Noble Paul
x.
> which uses a big part of the CPU, as you can see on the first part of
> the graph:
> http://www.nabble.com/file/p22925561/cpu_.jpg cpu_.jpg
> In the first part of this graph (the blue part) it's just
> replication, no requests at all.
> Normally I've 20 reque