Hi Pierre,
Thanks for your notes re. the doc.
Please find my comments below:
On Tue, 2011-02-15 at 16:12, Hugh Williams wrote:
>
> Being a very large dataset, have you tuned your Virtuoso Server for
> running on the target OS as detailed at:
> http://docs.openlinksw.com/virtuoso/rdfperformancetuning.html
For the record (because I did not see it at first).
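The tuning document linked above largely comes down to sizing the NumberOfBuffers and MaxDirtyBuffers entries in virtuoso.ini against the machine's RAM (Virtuoso caches the database in 8 KB pages). A minimal sketch of that sizing arithmetic follows; the fractions are assumptions loosely based on the guide's example table, and the linked document remains the authoritative reference.

    # Illustrative arithmetic only: size the virtuoso.ini buffer settings
    # against available RAM. Fractions below are assumptions, not the
    # guide's exact recommendation.
    RAM_GB = 128                   # assumed machine size (cf. the 128 GB box mentioned in this thread)
    PAGE_BYTES = 8 * 1024          # Virtuoso database page size
    BUFFER_FRACTION = 2 / 3        # rough share of RAM given to the buffer pool
    DIRTY_FRACTION = 3 / 4         # MaxDirtyBuffers as a share of NumberOfBuffers

    number_of_buffers = int(RAM_GB * 2**30 * BUFFER_FRACTION / PAGE_BYTES)
    max_dirty_buffers = int(number_of_buffers * DIRTY_FRACTION)

    print(f"NumberOfBuffers = {number_of_buffers}")
    print(f"MaxDirtyBuffers = {max_dirty_buffers}")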
Hi,
Thanks for all these suggestions; we will be looking into this and I'll
come back to you.
Regards,
Pierre
On Tue, 2011-02-15 at 12:19 -0500, Marc-Alexandre Nolin wrote:
> Hi,
>
> To push the loading capacity of the open source Virtuoso I use 2
> things at Bio2RDF.
>
> 1) A good-sized server (24 cores, 128 GB RAM). I can't do much for you
> here. The more RAM the better, and the more cores, the more parallel
> loading... "up to a certain point with the free version"
>
> 2) Exploit the
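One common way to turn extra cores into faster loading with the open-source bulk loader is to run several rdf_loader_run() sessions at once; each call picks not-yet-loaded files from the shared load list. A minimal sketch under assumed defaults (standard isql client on the PATH, default dba credentials, files already registered with ld_dir() as in the sketch after Hugh's message below); worker count and connection details are placeholders to adjust.

    # Run several bulk-loader workers in parallel, one isql session per worker,
    # then checkpoint once they all finish.
    import subprocess

    ISQL = ["isql", "1111", "dba", "dba"]   # port / user / password: adjust to your server
    WORKERS = 6                             # tune to the number of cores available

    # Concurrent rdf_loader_run() sessions divide the registered files between
    # themselves, which is what keeps multiple cores busy during the load.
    sessions = [subprocess.Popen(ISQL + ["exec=rdf_loader_run();"]) for _ in range(WORKERS)]
    for s in sessions:
        s.wait()

    # Make the loaded data durable once all workers are done.
    subprocess.run(ISQL + ["exec=checkpoint;"], check=True)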
Pierre,
Being a very large dataset, have you tuned your Virtuoso Server for running on
the target OS as detailed at:
http://docs.openlinksw.com/virtuoso/rdfperformancetuning.html
You can also use the bulk loader scripts we use for loading large datasets, if
not already doing so, at:
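Assuming the bulk loader procedures (ld_dir, rdf_loader_run) are installed on the server, driving them boils down to registering the files and then running the loader. A minimal sketch under those assumptions; the directory, file pattern and graph IRI below are placeholders, and the directory must be listed under DirsAllowed in virtuoso.ini for the server to read it.

    # Drive the Virtuoso bulk loader through the isql client.
    import subprocess

    ISQL = ["isql", "1111", "dba", "dba"]   # port / user / password: adjust to your server

    def isql_exec(statement: str) -> None:
        """Run a single SQL statement through the Virtuoso isql client."""
        subprocess.run(ISQL + [f"exec={statement}"], check=True)

    # 1) Register the files to load (placeholder path, pattern and target graph IRI).
    isql_exec("ld_dir('/data/uniprot', '*.rdf.gz', 'http://example.org/uniprot');")

    # 2) Load everything registered above, then make the result durable.
    isql_exec("rdf_loader_run();")
    isql_exec("checkpoint;")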
Hi,
I have been trying to load the UniProt RDF file into Virtuoso. UniProt
provides a rather big file [1] which uncompressed is ~133 GB.
I have been trying to load it into Virtuoso (6.1.2), but it seems that
Virtuoso's performance drops after a while and eventually hangs.
We tried to load it using