Hi all,
I want to confirm the issue raised by Roland Cornelissen yesterday.
In my case, all is fine with gzipped .nt files up to 80MB (15M
triples), but trying to load a 250MB .nt file using ld_dir() in VOS
version:
"Version 07.00.3202-pthreads for Darwin as of Apr 7 2013" (see
attached config
Hi Roland,
If you can provide your dataset to test with that would be best as that is what
you are having the issue with ...
We will also test with the RDF/XML version as that should also load, unless
there are genuine issues with the datasets in this form, which the creators of
the dataset shoul
Hi Hugh,
No, there were no errors reported in virtuoso.log.
In the meantime I have repeated the same procedure on stable/vos6 where
all went well; all data loaded and available.
The dataset I am talking about is VIAF RDF data, publicly available here
[1].
I have preprocessed this set to erase s
Hi Roland,
Are any errors reported in the "virtuoso.log" file while the load is in
progress? Also, is this a publicly available dataset you are loading that we
could try locally?
Best Regards
Hugh Williams
Professional Services
OpenLink Software, Inc. // http://www.openlinks
Hi Hugh,
That saved me precious time. In the meantime I have upgraded to VOS7 and
tried to load the large set using the Bulk Loader.
But it is not working. See the log below; this is what I get, and Virtuoso
slowly dies after accumulating about 1GB in the db. The service is not
available any more, I have to
Hi Roland,
You should use the Virtuoso RDF Bulk Loader for loading such large datasets, as
detailed at:
http://virtuoso.openlinksw.com/dataspace/doc/dav/wiki/Main/VirtBulkRDFLoader
From the error you report it would appear a checkpoint is being performed in
the middle of your load, so
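For reference, the bulk loader workflow on that page boils down to a short isql session along these lines (the directory, file mask, and graph IRI below are placeholders, not taken from this thread):

```sql
-- Register all matching files in a server-accessible directory for loading
ld_dir ('/data/viaf', '*.nt.gz', 'http://example.org/viaf');

-- Run the loader; on a multi-core machine several rdf_loader_run()
-- calls can be started in parallel isql sessions
rdf_loader_run ();

-- Make the loaded data durable once the load has finished
checkpoint;
```

The directory must also be listed in the DirsAllowed parameter of virtuoso.ini, or ld_dir() will refuse to register the files.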
Hi,
I am trying to load a large dataset (~50GB) and bumped into:
*** Error 40001: [Virtuoso Driver][Virtuoso Server]SR325: Transaction
aborted due to a database checkpoint or database-wide atomic operation.
Please retry transaction
at line 10 of Top-Level:
ttlp_mt (file_to_string_output ('/usr/
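A ttlp_mt() call of this shape (the path and graph IRI here are hypothetical) loads the whole file in one large transaction, which is exactly the pattern that a scheduled checkpoint can abort with SR325. One workaround, if the bulk loader cannot be used, is to disable the automatic checkpoint for the duration of the load:

```sql
-- Disable the scheduled automatic checkpoint for the duration of the load
checkpoint_interval (-1);

-- Single-transaction load of an N-Triples file (path hypothetical)
ttlp_mt (file_to_string_output ('/data/viaf/viaf.nt'),
         '', 'http://example.org/viaf');

checkpoint;               -- checkpoint manually once the load is done
checkpoint_interval (60); -- restore the configured interval (e.g. 60 min)
```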