Thomas,

Please check whether the root Makefile of your build contains something like

  CCPLATFORMDEFS = -Dlinux -D_GNU_SOURCE -DFILE64 -D_LARGEFILE64_SOURCE

If FILE64 support was not detected when the server was compiled, the server cannot handle files larger than 4 GB. Even in that bad case you can still deal with big database files by striping the database across several smaller files. For more precise diagnostics I need your Makefile.

A further question is the disk space for temporary session files. They reside in the directory specified by the TempSesDir parameter in the [Parameters] section of the configuration file (say, virtuoso.ini); the server's working directory is used by default. Can you write, say, a copy of your source-form RDF files to that directory?
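For illustration, the two settings together might look roughly like this in virtuoso.ini. The segment sizes, stripe file names, and the /data/tmp path below are only placeholders for your installation, so please check the striping syntax against the documentation for your server version:

  [Database]
  Striping = 1

  [Striping]
  ; each segment is spread over two stripe files, so no single
  ; file needs to grow past the 4 GB boundary
  Segment1 = 1024M, /data/db/virt-seg1-1.db, /data/db/virt-seg1-2.db
  Segment2 = 1024M, /data/db/virt-seg2-1.db, /data/db/virt-seg2-2.db

  [Parameters]
  ; temporary session files go here instead of the working directory
  TempSesDir = /data/tmp

With striping enabled the server spreads the database over the listed stripe files instead of a single virtuoso.db, so each individual file can stay below the 4 GB limit.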
Best Regards,

Ivan Mikhailov,
OpenLink Software.

On Wed, 2008-03-12 at 12:51 +0100, Thomas Hornung wrote:
> Hi,
>
> unfortunately my load problem doesn't seem to be fixed yet. I can load
> datasets of up to approx. 500 MB (~5,000,000 triples), but when trying
> to load a file with 25,000,000 triples (~2.7 GB), I get the following
> error message:
>
> 13:39:06 Can't write to file
> /data/SP2B/sp2b/bench/sparqlengines/virtuoso//sesnZpG6K
> File size limit exceeded
>
> I already checked, and there are no user limits on file size. The file
> system in use is ext3, which should support files of at least up to
> 16 GB, so this cannot be the problem either.
>
> The weirdest thing is that the files
>
> virtuoso.trx
> virtuoso.db
>
> do not grow from their initial size after the load starts, as they
> normally did for the other file sizes.
>
> I am using the following load command, which works fine for the
> "smaller" files:
>
> ttlp_mt
> (file_to_string_output('/data/SP2B/sp2b/bench/data/rdf/dblp25000000.n3'),
> '', 'http://my_graph', 0);
>
> Am I missing something obvious here?
>
> Best regards,
> Thomas
>
> Ivan Mikhailov wrote:
> > Thomas,
> >
> > Yes, DB.DBA.TTLP_MT() is more suitable. Among other things, it does not
> > try to fit everything into a single transaction.
> >
> > Best Regards,
> >
> > Ivan Mikhailov,
> > OpenLink Software.
> >
> > On Fri, 2008-02-22 at 15:03 +0100, Thomas Hornung wrote:
> >> Hi,
> >>
> >> I have a problem when I try to load a "large" (~109 MB, approx. 1,000,000
> >> triples) RDF data set into Virtuoso. Ideally I'd like to load files of
> >> up to several GB into Virtuoso.
> >>
> >> I always get the same error message:
> >>
> >> Connected to OpenLink Virtuoso
> >> Driver: 05.00.3026 OpenLink Virtuoso ODBC Driver
> >>
> >> *** Error 40005: [Virtuoso Driver][Virtuoso Server]SR325: Transaction
> >> aborted because it's log after image size went above the limit
> >> at line 2 of Command-Line-Load virtuoso_cmds/load_1000000.virtuoso:
> >> ttlp (file_to_string_output(...)
> >>
> >> *** Error 40005: [Virtuoso Driver][Virtuoso Server]SR325: Transaction
> >> aborted because it's log after image size went above the limit
> >> at line 2 of Command-Line-Load virtuoso_cmds/load_1000000.virtuoso:
> >> ttlp (file_to_string_output(...)
> >> OK
> >>
> >> Where can I increase the image/log size? Or am I using the wrong
> >> command, i.e. is ttlp the wrong command for bulk loading?
> >>
> >> Thanks,
> >> Thomas