DIH streams 1 row at a time.
DIH is just a component in Solr. Solr indexing also takes a lot of memory
On Tue, Apr 14, 2009 at 12:02 PM, Mani Kumar wrote:
> Yes, it's throwing the same OOM error and from the same place...
> Yes, I will try increasing the size... just curious: how does this dataimport
> work?
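For MySQL, row-at-a-time streaming only happens if the JDBC driver is asked for it. A minimal data-config sketch, assuming the MySQL Connector/J driver (URL, database, and credentials below are placeholders):

```xml
<!-- batchSize="-1" tells DIH's JdbcDataSource to request row streaming
     from the driver (for MySQL this means fetchSize = Integer.MIN_VALUE),
     so the whole result set is never buffered in memory -->
<dataSource type="JdbcDataSource"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost/mydb"
            user="root" password=""
            batchSize="-1"/>
```

Without this, the MySQL driver reads the entire result set into memory before handing over the first row, which is the classic cause of OOM during full-import.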
Wow, that was pretty straightforward. Sorry I didn't catch that on the wiki
on my first few go rounds, I'll navigate harder next time.
Thanks.
Isaac
On Sun, Apr 12, 2009 at 11:40 PM, Shalin Shekhar Mangar <
shalinman...@gmail.com> wrote:
> On Mon, Apr 13, 2009 at 5:35 AM, Isaac Foster wrote:
Yes, it's throwing the same OOM error and from the same place...
Yes, I will try increasing the size... just curious: how does this dataimport
work?
Does it load the whole table into memory?
Is there any estimate of how much memory it needs to create an index for 1 GB
of data?
thx
mani
On Tue, Apr 14, 2009 at 11:36 AM, Mani Kumar wrote:
> Hi Shalin:
> Yes, I tried with the batchSize="-1" parameter as well.
>
> Here is the config I tried with:
>
> <dataSource type="JdbcDataSource" batchSize="-1"
>     driver="com.mysql.jdbc.Driver"
>     url="jdbc:mysql://localhost/mydb_development"
>     user="root" password="**" />
>
> I hope I have used the batchSize parameter in the right place.
On Tue, Apr 14, 2009 at 11:30 AM, Gargate, Siddharth wrote:
> Hi all,
>I am testing indexing with 2000 text documents of size 2 MB
> each. These documents contain words created with random characters. I
> observed that the Tomcat memory usage goes on increasing slowly. I tried
> by removing all the cache configuration, but still memory usage increases.
Hi Shalin:
Yes, I tried with the batchSize="-1" parameter as well.
Here is the config I tried with:
I hope I have used the batchSize parameter in the right place.
Thanks!
Mani Kumar
On Tue, Apr 14, 2009 at 11:24 AM, Shalin Shekhar Mangar <
shalinman...@gmail.com> wrote:
What is the content of your text file?
Solr does not directly index files.
--Noble
On Tue, Apr 14, 2009 at 3:54 AM, Alex Vu wrote:
> Hi all,
>
> Currently I wrote an xml file and schema.xml file. What is the next step to
> index a txt file? Where should I put my txt file I want to index?
>
> thank you,
Hi all,
I am testing indexing with 2000 text documents of size 2 MB
each. These documents contain words created with random characters. I
observed that the Tomcat memory usage goes on increasing slowly. I tried
by removing all the cache configuration, but still memory usage
increases. Once
On Tue, Apr 14, 2009 at 11:18 AM, Mani Kumar wrote:
> Here is the stack trace:
>
> Notice in the stack trace: "at
> com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:1749)"
>
> It looks like it's trying to read the whole table into memory at once, and
> that's why it's getting OOM.
>
>
Mani, the data-
Hi Noble:
But the question is: how much memory? Are there any rules of thumb, so that
I can estimate how much memory it requires?
Yeah, I can increase it up to 800 MB max; will try it and let you know.
Thanks!
Mani
2009/4/14 Noble Paul നോബിള് नोब्ळ्
> DIH itself may not be consuming so much memory.
Here is the stack trace:
Notice in the stack trace: "at
com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:1749)"
It looks like it's trying to read the whole table into memory at once, and
that's why it's getting OOM.
Apr 14, 2009 11:15:01 AM org.apache.solr.handler.dataimport.DataImporter
doFullImpo
DIH itself may not be consuming so much memory. It also includes the
memory used by Solr.
Do you have a hard limit of 400 MB, or is it possible to increase it?
On Tue, Apr 14, 2009 at 11:09 AM, Mani Kumar wrote:
> Hi ILAN:
>
> Only one query is required to generate a document ...
> Here is my data-config.xml
Hi ILAN:
Only one query is required to generate a document ...
Here is my data-config.xml
and other useful info:
mysql> select count(*) from items;
+----------+
| count(*) |
+----------+
|   900051 |
+----------+
1 row in set (0.00 sec)
Each
On Tue, Apr 14, 2009 at 7:14 AM, vivek sar wrote:
> Some more update. As I mentioned earlier we are using multi-core Solr
> (up to 65 cores in one Solr instance with each core 10G). This was
> opening around 3000 file descriptors (lsof). I removed some cores and
> after some trial and error I found at 25 cores the system seems to work
What about:
fieldA:value1 AND fieldB:value2
This can also be written as:
+fieldA:value1 +fieldB:value2
On Apr 13, 2009, at 9:53 PM, Johnny X wrote:
I'll start a new thread to make things easier, because I've only
really got
one problem now.
I've configured my Solr to search on all fiel
I'll start a new thread to make things easier, because I've only really got
one problem now.
I've configured my Solr to search on all fields, so it will only search for
a specific query in a specific field (e.g. q=Date:"October" will only
search the 'Date' field, rather than all the others).
The
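One common way to get a single query to hit every field, assuming the individual fields are already defined in schema.xml, is a catch-all field filled by copyField and set as the default. A sketch (the name text_all is made up; field names are taken from the message above):

```xml
<!-- schema.xml sketch: copy each searchable field into one catch-all -->
<field name="text_all" type="text" indexed="true" stored="false"
       multiValued="true"/>
<copyField source="Date" dest="text_all"/>
<copyField source="Content" dest="text_all"/>
<defaultSearchField>text_all</defaultSearchField>
```

With that in place, q=October searches text_all across everything copied into it, while q=Date:October still targets a single field.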
Some more update. As I mentioned earlier we are using multi-core Solr
(up to 65 cores in one Solr instance with each core 10G). This was
opening around 3000 file descriptors (lsof). I removed some cores and
after some trial and error I found at 25 cores system seems to work
fine (around 1400 file descriptors).
Hi all,
Currently I wrote an xml file and schema.xml file. What is the next step to
index a txt file? Where should I put my txt file I want to index?
thank you,
Alex V.
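Solr 1.x accepts documents over HTTP as XML add messages, so one route is to wrap the text file's contents in a document and POST it to the update handler. A sketch (the field names below are illustrative and must match your schema.xml):

```xml
<add>
  <doc>
    <field name="id">doc1</field>
    <field name="text">contents of the txt file go here</field>
  </doc>
</add>
```

The txt file itself can live anywhere on disk; what matters is posting its contents (e.g. with the example post.jar or curl) and then sending a commit so the document becomes searchable.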
Depending on your dataset and how your queries look you may very likely
need to increase to a larger heap size. How many queries and rows are
required for each of your documents to be generated?
Ilan
On 4/13/09 12:21 PM, Mani Kumar wrote:
Hi Shalin:
Thanks for quick response!
By default it was set to 1.93 MB.
Sorry, should have added that you should set the qt param:
http://wiki.apache.org/solr/CoreQueryParameters#head-2c940d42ec4f2a74c5d251f12f4077e53f2f00f4
-Grant
On Apr 13, 2009, at 1:35 PM, Fink, Clayton R. wrote:
The query method seems to only support "solr/select" requests. I
subclassed SolrRequest
Hi Shalin:
Thanks for quick response!
By default it was set to 1.93 MB.
But I also tried it with the following command:
$ ./apache-tomcat-6.0.18/bin/startup.sh -Xmn50M -Xms300M -Xmx400M
I also tried tricks given on
http://wiki.apache.org/solr/DataImportHandlerFaq page.
What should I try next?
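One thing to check: Tomcat's startup.sh does not forward arbitrary arguments to the JVM, so the -Xm* flags in the command above were most likely never applied. The usual approach (paths illustrative, matching the install above) is to export them first:

```shell
# startup.sh ignores JVM flags passed as arguments;
# set them through CATALINA_OPTS (or JAVA_OPTS) instead
export CATALINA_OPTS="-Xmn50m -Xms300m -Xmx400m"
./apache-tomcat-6.0.18/bin/startup.sh
```

The Solr admin page (or jconsole) can confirm the heap actually in effect.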
On Mon, Apr 13, 2009 at 11:57 PM, Mani Kumar wrote:
> Hi All,
> I am trying to set up a Solr instance on my MacBook.
>
> I get the following errors when I'm trying to do a full db import... please
> help me on this:
>
> java.lang.OutOfMemoryError: Java heap space
>at
>
> org.apache.solr.handler.d
I am using Tomcat ...
On Mon, Apr 13, 2009 at 11:57 PM, Mani Kumar wrote:
> Hi All,
> I am trying to set up a Solr instance on my MacBook.
>
> I get the following errors when I'm trying to do a full db import... please
> help me on this:
>
> Apr 13, 2009 11:53:28 PM
> org.apache.solr.handler.dataimport.
Hi All,
I am trying to set up a Solr instance on my MacBook.
I get the following errors when I'm trying to do a full db import... please
help me on this:
Apr 13, 2009 11:53:28 PM org.apache.solr.handler.dataimport.JdbcDataSource$1
call
INFO: Creating a connection for entity slideshow with URL:
jdbc:mysq
Interesting. Do you know if it's possible to get the HTTP headers with
Solrj?
Yonik Seeley wrote:
On Fri, Apr 10, 2009 at 11:58 AM, Richard Wiseman
wrote:
Is it possible for a Solr client to determine if the index has changed since
the last time it performed a query? For example, is it p
Here is some more information about my setup,
Solr - v1.4 (nightly build 03/29/09)
Servlet Container - Tomcat 6.0.18
JVM - 1.6.0 (64 bit)
OS - Mac OS X Server 10.5.6
Hardware Overview:
Processor Name: Quad-Core Intel Xeon
Processor Speed: 3 GHz
Number Of Processors: 2
Total Number Of Cores: 8
L
The query method seems to only support "solr/select" requests. I subclassed
SolrRequest and created a request class that supports "solr/autoSuggest" -
following the pattern in LukeRequest. It seems to work fine for me.
Clay
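For comparison, instead of subclassing SolrRequest, the stock select handler can dispatch to a custom handler via the qt parameter; a sketch of such a request (host, port, handler name, and query are assumptions, not taken from this thread):

```
http://localhost:8080/solr/select?qt=/autoSuggest&q=ca
```

SolrJ should be able to set the same parameter on a stock query object via set("qt", "/autoSuggest"), which avoids the custom request class when the default path is acceptable.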
-Original Message-
From: Grant Ingersoll [mailto:gsing...@apac
A further update on this is that (when 'Date' is searched using the same URL
as posted in the previous message), whether Date is of type string or text,
the full (exact) content of a field has to be searched to return a result.
Why is this not the case with Content? I tried changing the default
Do you know the specific syntax when querying different fields?
http://localhost:8080/solr/select/?q=Date:%222000%22&version=2.2&start=0&rows=10&indent=on
doesn't appear to return anything when I post it in my browser, when it
should, but (as before) if you change 'Date' to 'Content' it works!
(
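The behavior described above is consistent with the field types involved: a string field is indexed as a single exact token, while a text field is tokenized by its analyzer. A schema.xml sketch (type names as commonly defined in the example schema; whether these match this user's schema is an assumption):

```xml
<!-- 'string' is not analyzed: Date:"2000" only matches documents whose
     entire Date value is exactly 2000 -->
<field name="Date" type="string" indexed="true" stored="true"/>
<!-- 'text' is tokenized: individual words inside Content match -->
<field name="Content" type="text" indexed="true" stored="true"/>
```

If partial matches on Date are wanted, re-declaring it with a tokenized type and re-indexing is the usual fix.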
Hi there
I installed Solr on Tomcat 6, and whenever I click search it displays the
XML as if I am editing it. Is that normal?
I added a connector line in my server.xml below (the XML itself was stripped
by the archive).
On Apr 13, 2009, at 11:20 AM, Johnny X wrote:
Also, in reference to the other question, I'm currently trying to
edit the
main search page to search multiple fields.
Essentially, I detect if each field has been posted or not using:
if ($_POST['FIELD'] != '') {
$query = $query . '+FIELDNAME:' . $_POST['FIELD'];
Also, in reference to the other question, I'm currently trying to edit the
main search page to search multiple fields.
Essentially, I detect if each field has been posted or not using:
if ($_POST['FIELD'] != '') {
    $query = $query . ' +FIELDNAME:' . $_POST['FIELD'];
}
Once it's processed all the
I changed the ISBN to lowercase (and the other fields as well) and it works !
Thanks very much !
--
View this message in context:
http://www.nabble.com/DataImportHandler-with-multiple-values-tp23022195p23023374.html
Sent from the Solr - User mailing list archive at Nabble.com.
It is likely that your query did not return any data. Just run the
query separately and see if it really works.
Or try it out in debug mode. It will tell you which query was run and
what got returned.
--Noble
2009/4/13 Vincent Pérès :
>
> Hello,
>
> I'm trying to import a simple book table with
2009/4/13 Vincent Pérès
>
> <dataSource type="JdbcDataSource" driver="com.mysql.jdbc.Driver"
>     url="jdbc:mysql://localhost:33061/completelynovel" user="root" password=""
>     />
>
> [entity and field definitions stripped by the archive]
>
> Ten fields are well created...
Hello,
I'm trying to import a simple book table with the full-import command. The
data is stored in MySQL.
It worked well when I tried to import a few fields from the 'book' table:
title, author, publisher, etc.
Now I would like to create a facet (multi valued field) with the categories
which bel
On Mon, Apr 13, 2009 at 12:36 PM, vivek sar wrote:
> I index in 10K batches and commit after 5 index cycles (after 50K). Is
> there any limitation that I can't search during commit or
> auto-warming? I got 8 CPU cores and only 2 were showing busy (using
> top) - so it's unlikely that the CPU was pegged.
I index in 10K batches and commit after 5 index cycles (after 50K). Is
there any limitation that I can't search during commit or
auto-warming? I got 8 CPU cores and only 2 were showing busy (using
top) - so it's unlikely that the CPU was pegged.
2009/4/12 Noble Paul നോബിള് नोब्ळ् :
> If you use S