Thanks a lot for your help, Patrick!
Yes, my mistake, it is the BTC dataset, not DBpedia.
I changed the literal types from XML to Plain and the errors disappeared.
But now I get a new error:
/btc2014_unzipped/01/data.nq-10
http://fake-latest.org
2 2015.9.22 23:10.20 322216000 2015.
Why not include an option to ignore such errors? That is, an option to load
the dataset despite errors, or at least to skip the triples containing
errors while still loading the rest of the dataset?
Obviously, any dataset created by humans will contain such errors; it
cannot be perfect...
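For context, I am loading the files with the standard bulk loader; a
minimal sketch of what I run (the target graph IRI is just a placeholder
here):

SQL> ld_dir ('/btc2014_unzipped/01', '*.nq', 'http://example.org/btc2014');
SQL> rdf_loader_run ();
-- list the files the loader flagged with an error:
SQL> SELECT ll_file, ll_state, ll_error
       FROM DB.DBA.load_list
      WHERE ll_error IS NOT NULL;

If I understand the loader correctly, a parse failure is recorded per file
in DB.DBA.load_list (ll_error); something like that at the level of
individual triples is what I am asking for.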
On 20 September 2015 at 17
Dear Mr. Williams,
Thank you very much for the information.
Thanks & Regards,
Jyoti
On Sun, Sep 20, 2015 at 6:34 AM, Hugh Williams wrote:
> Hi Jyoti,
>
> Virtuoso does not cache query result sets per se, but rather the
> database's working set for a given query workload gets loaded into memory
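(For anyone else following along: a quick way to see how much of that
working set is actually resident is the stock status report from isql;
this is the standard built-in, nothing specific to this setup:

SQL> status();

Its output includes buffer usage, which shows how much of the database is
currently held in memory.)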
Hi Hugh,
Thank you for all your responses so far, we really appreciate it.
Another issue came up with backup: this time I tried to do an online
backup and got this error:
"SQL> backup_online('virt-inc_dump_01082015#',
100,0,vector('/usr/local/var/lib/virtuoso_backup'));
*** Error 42000:
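The rest of the error text got cut off, but I wonder if this could be a
stale backup context left over from an earlier, interrupted backup? If so,
I assume clearing the context and retrying would help; a sketch of what I
would try (backup_context_clear() is the built-in I found in the docs,
whether it applies here is a guess):

SQL> backup_context_clear();
SQL> backup_online('virt-inc_dump_01082015#', 100, 0,
       vector('/usr/local/var/lib/virtuoso_backup'));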
I'm running Ubuntu 14.04.2 (Debian version Jessie) and tried to install
Virtuoso Open Source version 7.2.1. During installation it seemed to be
missing some of the VAD packages, so I found another post about Debian
packages here:
http://serverfault.com/questions/631673/virtuoso-opensource-7-1-how-do-
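In case it helps anyone hitting the same problem: once the .vad files are
on disk, I gather they can be installed from isql with the vad_install()
built-in (the path below is just an example; it depends on where your
packages put the VAD files):

SQL> vad_install('/usr/share/virtuoso-opensource-7/vad/conductor_dav.vad', 0);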