Re: [Virtuoso-users] Inference performance

2011-05-09 Thread Roberto García
> Please provide the command used for creating the rule sets and confirm they
> exist in the “sys_rdf_schema” table?

Yes, I used:

rdfs_rule_set ('http://semantic.eurobau.com/data/schema/rules/', 'http://semantic.eurobau.com/data/schema/');

And they exist in that table.

> Based on the details bel...
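For context, a rule set created this way is not applied automatically; it is typically enabled per query with the input:inference pragma. A minimal sketch, using the graph IRIs from the message (the SELECT query itself is illustrative, not from the thread):

```
-- Create the rule set from the loaded ontology graph (command from the message)
rdfs_rule_set ('http://semantic.eurobau.com/data/schema/rules/',
               'http://semantic.eurobau.com/data/schema/');

-- Enable it per query via the inference pragma (query shape is hypothetical)
SPARQL
DEFINE input:inference 'http://semantic.eurobau.com/data/schema/rules/'
SELECT ?s
WHERE { ?s a <http://purl.org/goodrelations/v1#BusinessEntity> }
LIMIT 10;
```

Without the DEFINE line, the same query returns only asserted triples, which is a common cause of "inference not working" reports.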

Re: [Virtuoso-users] Problems loading large .ttl file

2011-05-09 Thread Peter DeVries
Sorry, I should have been clearer. I create a dump file using this command:

dump_one_graph ('urn:org:linkedopenspeciesdata:dataspace:taxonconcept', 'txn_ses');

I take the output .ttl file, rename it to txn_ses.ttl, and then gzip it before copying it to my web server and public endpoint.
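The rename/compress/copy steps described above can be sketched in shell. The txn_ses.ttl filename is from the message; the dump output name, host, and destination path are placeholders:

```shell
# Compress the renamed dump before publishing.
# -k keeps the original .ttl alongside the .gz (GNU/BSD gzip).
gzip -k txn_ses.ttl     # produces txn_ses.ttl.gz

# Copy step (host and path are hypothetical):
# scp txn_ses.ttl.gz user@webserver:/var/www/data/
```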

Re: [Virtuoso-users] Inference performance

2011-05-09 Thread Hugh Williams
Hi Roberto,

I presume you have created the RDFS rule sets for the loaded ontologies using the rdfs_rule_set() function as detailed at:

http://docs.openlinksw.com/virtuoso/rdfsparqlrule.html

Please provide the command used for creating the rule sets and confirm they exist in the “sys_rdf_schema” table...
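The check Hugh asks for can be done from isql by querying the system table directly; a minimal sketch (column set may vary by Virtuoso version):

```
-- List the inference rule sets currently registered with the server
SELECT * FROM DB.DBA.SYS_RDF_SCHEMA;
```

Each row pairs a rule-set name with the ontology graph it was built from, so the IRIs passed to rdfs_rule_set() should appear here.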

[Virtuoso-users] Inference performance

2011-05-09 Thread Roberto García
Dear all,

I'm using OpenSource Virtuoso 6.1.3 with data from the Semantic Eurobau initiative. I first loaded the ontologies used in this dataset: GoodRelations, a small Eurobau-specific ontology, and FreeClass, the biggest one with around 5000 classes:

DB.DBA.RDF_LOAD_RDFXML_MT(file_o...
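The load call is truncated in the archive; its usual shape, as documented for Virtuoso's RDF/XML loader, is sketched below (file path and graph IRI are placeholders, not from the message):

```
-- Load an ontology file into a named graph
-- (path and graph IRI are hypothetical)
DB.DBA.RDF_LOAD_RDFXML_MT (
  file_to_string_output ('/path/to/ontology.owl'),   -- file contents
  '',                                                -- base IRI
  'http://semantic.eurobau.com/data/schema/');       -- target graph
```

Note that file_to_string_output() can only read from directories listed in the server's DirsAllowed setting in virtuoso.ini.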

Re: [Virtuoso-users] Problems loading large .ttl file

2011-05-09 Thread Hugh Williams
Peter, On 8 May 2011, at 18:26, Peter DeVries wrote: > Hi, > > I am trying to upload a new version of my TaxonConcept dataset. (VOS 6.1.3) > > There is one large .ttl file that is giving me problems. > > On my staging machine I run this procedure which loads each of the ~100,000 > rdf files.