Dear Ivan,
I tried your suggestions, but unfortunately to no avail :-(.
Best,
Thomas
Ivan Mikhailov wrote:
Thomas,
I'm sorry, I have no good idea how to debug it further. I'm pretty sure
that the problem is in storing the string session on disk, because this
is the only thing that could demonstrate the problem in 1 minute. Even
if the box loads 5 triples per second, it can't exceed the 4Gb boundary
within a minute.
Dear Ivan,
I can easily copy files of up to several GBs into the directory where
the temporary session file is written. Interestingly enough, my
settings work fine for loading 5.000.000 triples (~500 MB), which takes
about 10 mins, but if I try to load the 25.000.000-triple file, it fails.
Thomas,
Please check if the root Makefile of your build contains something like
CCPLATFORMDEFS = -Dlinux -D_GNU_SOURCE -DFILE64 -D_LARGEFILE64_SOURCE
Without detected support of FILE64, the compiled server cannot keep
files that are longer than 4Gb.
Even in bad case you still can deal with b
Hi,
unfortunately my load problem doesn't seem to be fixed yet. I can load
datasets of up to approx. 500 MB (~ 5.000.000 triples), but when trying
to load a file with 25.000.000 triples (~ 2.7 GB), I get the following
error message:
13:39:06 Can't write to file
/data/SP2B/sp2b/bench/sparqle
Thanks Ivan!
After switching to DB.DBA.TTLP_MT(), even loading 5.000.000 triples
(approx. 500 MB) works just fine in about 10 mins.
Best regards,
Thomas
Ivan Mikhailov wrote:
Thomas,
Yes, DB.DBA.TTLP_MT() is more suitable. Among other things, it does not
try to fit everything into a single transaction.
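For illustration, a minimal call of this kind could look like the sketch
below; the file name and graph IRI are placeholders (not values from this
thread), and the file has to sit in a directory the server is allowed to
read (DirsAllowed in the ini file):

  -- read the Turtle file on the server side and load it into the named graph;
  -- TTLP_MT() does not wrap the whole load into one transaction
  DB.DBA.TTLP_MT (file_to_string_output ('data.ttl'), '', 'http://example.org/graph');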
Best Regards,
Ivan Mikhailov,
OpenLink Software.
On Fri, 2008-02-22 at 15:03 +0100, Thomas Hornung wrote:
Hi,
I have a problem when I try to load a "large" (~109MB, approx. 1.000.000
triples) RDF data set into Virtuoso. Ideally I'd like to load files of
up to several GBs into Virtuoso.
I always get the same error message:
Connected to OpenLink Virtuoso
Driver: 05.00.3026 OpenLink Virtuoso ODBC Driver