I am combining some GIS data where each layer is divided into around a thousand
separate shapefiles by mapsheet.
We are in the same boat. For the moment I'm exploring the approach of aggregating map sheets into super-tiles until we approach the 2 GB shapefile limit. I haven't found a good method for predicting when we'll hit that limit, though, so it's error-prone with a lot of repeats.
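One rough way to predict the limit in advance is to sum the component file sizes of the candidate tiles before merging. This is only a sketch under an assumption: that a merged shapefile's .shp and .dbf components are roughly the sum of the inputs' sizes (per-file headers are small, and this only holds when the attribute schemas match). The function names here are hypothetical, not part of any existing tool:

```python
import os

# 2 GiB limit that applies to each shapefile component (.shp, .dbf)
SHP_LIMIT = 2 * 1024 ** 3

def merged_size_estimate(shp_paths):
    """Estimate the .shp and .dbf sizes of merging the given tiles.

    Assumes merged size ~= sum of input component sizes, which is a
    simplification (headers and schema differences are ignored).
    """
    shp_total = 0
    dbf_total = 0
    for path in shp_paths:
        base, _ = os.path.splitext(path)
        shp_total += os.path.getsize(base + ".shp")
        dbf_total += os.path.getsize(base + ".dbf")
    return shp_total, dbf_total

def fits_in_shapefile(shp_paths, limit=SHP_LIMIT):
    """True if neither estimated component would exceed the 2 GB limit."""
    shp_total, dbf_total = merged_size_estimate(shp_paths)
    return shp_total < limit and dbf_total < limit
```

Running this over each proposed super-tile's member list before calling the merge would at least flag the groupings that are certain to fail.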

Looking forward, the as-yet unreleased file geodatabase API/spec would be a solution. ESRI has said they will release it, but it's been several years since they first announced it, and when it finally is released there is no guarantee it will be under license terms that open-source projects can use. That isn't to say it will be unusable either; we just don't know.

I've been reading about Binary XML, which has an open source library published by CubeWerx, http://www.cubewerx.com/bxml. They say it is a drop-in replacement for XML. Perhaps binary GML would be a good format for >2 GB data packages? How much work would it be to add this to GDAL/OGR? I invite developers to submit price estimates to me. I don't have an active project assigned to this, or a budget, but I can't get one without an idea of what to ask for either.

best regards,

matt wilkie
--------------------------------------------
Geomatics Analyst
Information Management and Technology
Yukon Department of Environment
10 Burns Road * Whitehorse, Yukon * Y1A 4Y9
867-667-8133 Tel * 867-393-7003 Fax
http://environmentyukon.gov.yk.ca/geomatics/
--------------------------------------------
_______________________________________________
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev