Is my
command-line option wrong perhaps?
Regards,
Alex
*From:* Hugh Williams [mailto:hwilli...@openlinksw.com]
*Sent:* 05 September 2009 15:53
*To:* Alex
*Cc:* virtuoso-users@lists.sourceforge.net
*Subject:* Re: [Virtuoso-users] File Size Limit
Hi Alex,
If a local DBpedia instance is what you are trying to set up, we have the
following installation script used for loading the DBpedia 3.2 datasets:
http://s3.amazonaws.com/dbpedia-data/dbpedia_load.tar.gz
The loading of the DBpedia datasets can take many hours, depending on […]
Hi Hugh, Egon,
I've tried using the DB.DBA.TTLP_MT_LOCAL_FILE function on a large .n3
file (1GB+), and it seems to be working (or at least doing something),
having escaped the \ in the path. However, it is taking an absurdly long
time to actually finish (I had to terminate the task), whereas uploading
via […] overlooking something major here.
Regards,
Alex
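As an aside, the backslash escaping mentioned above can be sketched with a small helper that builds the TTLP_MT_LOCAL_FILE statement. This is a sketch only: the Windows path, the graph IRI, and the empty base-URI argument are illustrative assumptions, not values from this thread; only the function name comes from the messages above.

```python
# Sketch: building the SQL statement for Virtuoso's TTLP bulk loader.
# The path and graph IRI below are hypothetical examples.

def ttlp_statement(path, graph, base=""):
    # Backslashes in a Windows path must be doubled inside the
    # single-quoted SQL string literal, as noted above.
    escaped = path.replace("\\", "\\\\")
    return "DB.DBA.TTLP_MT_LOCAL_FILE('{}', '{}', '{}');".format(
        escaped, base, graph)

print(ttlp_statement("C:\\data\\dbpedia.n3", "http://dbpedia.org"))
# → DB.DBA.TTLP_MT_LOCAL_FILE('C:\\data\\dbpedia.n3', '', 'http://dbpedia.org');
```

The resulting statement would then be run from an isql session against the server.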
-----Original Message-----
From: Hugh Williams [mailto:hwilli...@openlinksw.com]
Sent: 05 September 2009 02:06
To: Alex
Cc: virtuoso-users@lists.sourceforge.net
Subject: Re: [Virtuoso-users] File Size Limit
Hi Alex,
Are you uploading these 4GB files via WebDAV? You might want to try
the Virtuoso TTLP family of functions for uploading such large
datasets as detailed at:
http://docs.openlinksw.com/virtuoso/fn_ttlp_mt_local_file.html
Best Regards
Hugh Williams
Professional Services
OpenLink Software
Hello,
I've recently been experimenting with using Virtuoso OSE as an RDF
store/SPARQL server, and seem to have things mainly set up now. I have
uploaded various N3 files to rdf_sink via WebDAV (the DBpedia store, to
be specific), and all files have succeeded except for the two largest
ones.
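Since only the two largest files fail, one common workaround (not one suggested in this thread) is to split a large N-Triples-style file, where each triple sits on its own line, into parts that stay under the upload limit. A minimal sketch, assuming one-triple-per-line input and an illustrative size cap:

```python
# Sketch: split a line-oriented RDF dump (one triple per line) into
# parts no larger than max_bytes each. File names are illustrative.
import os

def split_ntriples(src, max_bytes, dst_pattern="part_%03d.n3"):
    part, size, out = 0, 0, None
    names = []
    with open(src, "rb") as f:
        for line in f:
            # Rotate to a new part before this line would exceed the cap.
            if out is None or size + len(line) > max_bytes:
                if out:
                    out.close()
                name = dst_pattern % part
                names.append(name)
                out = open(name, "wb")
                part, size = part + 1, 0
            out.write(line)
            size += len(line)
    if out:
        out.close()
    return names
```

Note this only works for N-Triples-like serializations; a general N3/Turtle file with prefix declarations or multi-line statements cannot be split naively on line boundaries.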