Hi Hugh,

Thanks for the link to that script. It looks like it should save me a lot
of hassle. However, at the moment I'm running into another small problem.
I've followed the instructions in the README.txt to the letter, and
executed the following command to initialise the installation:

sh ./dbpedia_install.sh localhost:1111 alex *** alex...@gmail.com

Note that I'm running this from Cygwin on Vista (32-bit) - I don't think
that should cause too much of a problem, though.

The command fails almost instantly, printing the following to the log file:

===============================
  Install started
  Sat Sep 5 16:50:49 GMTDT 2009
===============================
Checking for VOS setup
Starting Virtuoso server, please wait ...
Cannot start Virtuoso server, please consult dbpedia.log file
Started.
This script must be started in server working directory

The dbpedia dir containing all the files exists under
C:\virtuoso-opensource, and it is the current directory when I run the
installation script.
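
In case it helps, here is exactly what I'm doing in the Cygwin shell. The
.ini listing is only my own sanity check, on the assumption that "server
working directory" means the directory holding the server's configuration
file:

    cd /cygdrive/c/virtuoso-opensource/dbpedia
    # my assumption: the server working directory should contain the
    # Virtuoso .ini configuration file
    ls *.ini
    sh ./dbpedia_install.sh localhost:1111 alex *** alex...@gmail.com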

Do I have the dbpedia dir located under the wrong path? Or is one of my
command-line arguments wrong, perhaps?

Regards,
Alex

From: Hugh Williams [mailto:hwilli...@openlinksw.com] 
Sent: 05 September 2009 15:53
To: Alex
Cc: virtuoso-users@lists.sourceforge.net
Subject: Re: [Virtuoso-users] File Size Limit

Hi Alex,

If a local DBpedia instance is what you are trying to set up, we have the
following installation script, used for loading the DBpedia 3.2 datasets:

            http://s3.amazonaws.com/dbpedia-data/dbpedia_load.tar.gz

Loading the DBpedia datasets can take many hours, depending on the speed
of the machine, as you have seen yourself with the ttlp_* functions our
installer script uses.

We also provide a DBpedia Virtuoso Amazon EC2 AMI that enables users to
instantiate a running DBpedia instance in the cloud in less than an hour,
as detailed at:

            http://virtuoso.openlinksw.com/dataspace/dav/wiki/Main/VirtEC2AMIDBpediaInstall

Best Regards

Hugh Williams
Professional Services
OpenLink Software
Web: http://www.openlinksw.com
Support: http://support.openlinksw.com
Forums: http://boards.openlinksw.com/support

On 5 Sep 2009, at 13:04, Alex wrote:

Hi Hugh, Egon,

I've tried using DB.DBA.TTLP_MT_LOCAL_FILE on a large .n3 file (1GB+),
and it seems to be working (or at least doing something), having escaped
the \ characters in the path. However, it takes an absurdly long time to
finish (I had to terminate the task), whereas uploading via WebDAV to
rdf_sink was very quick (< 30 seconds for a 1GB file).
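
For reference, the call I have been issuing looks roughly like this (the
file path and graph URI stand in for my local setup, the dba/dba login is
just the stock default, and I am going by the argument order of source
file, base URI, target graph):

    $ cat load.sql
    -- backslashes in the Windows path must be doubled inside the SQL
    -- string; the file's directory must also be listed under DirsAllowed
    -- in the server's .ini file
    DB.DBA.TTLP_MT_LOCAL_FILE ('C:\\dbpedia\\infobox.n3', '',
                               'http://dbpedia.org');
    -- persist the loaded triples to disk
    checkpoint;
    $ isql localhost:1111 dba dba load.sql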

Firstly, are the two methods equivalent? What I want to do is simply
make the RDF triple store (for DBpedia) available via a SPARQL endpoint.
There must be a recommended way of doing this - is this method it?
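
For what it's worth, once the data is in, I plan to sanity-check the
endpoint with a simple count query along these lines (assuming Virtuoso's
default HTTP port of 8890 and the stock /sparql service):

    curl 'http://localhost:8890/sparql' \
         --data-urlencode 'query=SELECT COUNT(*) WHERE { ?s ?p ?o }' \
         -H 'Accept: application/sparql-results+json'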

Thanks,
Alex
