What's the task? What about batching the geometry and/or fields? Can you
run on just the first feature? Does that work?
How many features, and how big is the task?
On Wed, 27 Sept 2023, 13:16 Scott via gdal-dev wrote:
> Any tips for using ogr2ogr to use only a specified amount of RAM? I'm
> not having any luck.
Increasing swap was my last resort as I'd prefer not to do this across
different systems. However, that's exactly what I'll do!
Thanks for the help, people!
On 9/28/23 11:36, Cainã K. Campos via gdal-dev wrote:
On 28/09/2023 at 20:17, Scott wrote:
I believe you could try increasing your swap space; on Linux it is pretty
straightforward, and with an SSD or NVMe drive it will perform reasonably
well. Free disk space is a must-have for this to work, as you are going to
need about 10-20 GB of disk space as swap, according to Even's calculations,
on top of the 8 GB you have.
Thanks for digging into that, Even!
Can I create my new .fgb in sections?
If I limit the number of source rows with -sql, doing that multiple
times with -update, will it still build the entire R-tree when writing
to the destination?
I'm looking for a way to get the desired results.
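To sketch what a sectioned copy could look like: the OGR SQL dialect supports LIMIT and OFFSET, and ogr2ogr -append can add chunks to a destination whose driver supports appending (GeoPackage does; as far as I know the FlatGeobuf driver does not, which is why this hypothetical helper targets a .gpkg). The layer, column, and output names are taken from commands in this thread; the helper itself is illustrative only.

```python
# Hypothetical sketch: generate one ogr2ogr invocation per row chunk.
# The first command creates the destination; later ones use -append.
def chunk_commands(layer, column, src, dst, total_rows, chunk_size):
    cmds = []
    for offset in range(0, total_rows, chunk_size):
        sql = (f"SELECT {column} FROM {layer} "
               f"LIMIT {chunk_size} OFFSET {offset}")
        flags = "-append " if offset else ""
        cmds.append(f'ogr2ogr {flags}-sql "{sql}" -nln footprints {dst} {src}')
    return cmds

for cmd in chunk_commands("bfp_USA", "area_in_meters",
                          "USA.fgb", "footprints.gpkg", 300, 100):
    print(cmd)
```

Note this only covers the chunked-write part: converting the resulting .gpkg to .fgb afterwards would still build the spatial index in one pass.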
OK, that now makes sense. Writing a .fgb file falls into the exceptions
where RAM consumption can matter, as it involves building a packed
Hilbert R-tree in memory. With the current implementation, you need at
least the number of features times some constant amount of RAM.
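To put rough numbers on "number of features times some constant", here is a back-of-the-envelope sketch (mine, not taken from GDAL's source). It assumes each index entry costs about 40 bytes (a four-double bounding box plus an 8-byte offset, as in the FlatGeobuf spec) and a node branching factor of 16; both figures, and the feature counts, are assumptions rather than measured values.

```python
# Rough estimate of the RAM a packed Hilbert R-tree needs while being built.
# 40 bytes/entry and branching factor 16 are assumptions, not GDAL constants.
def rtree_ram_bytes(n_features, node_size=40, branching=16):
    total, level = 0, n_features
    while level > 1:
        total += level * node_size      # all entries at this level
        level = -(-level // branching)  # ceil division: size of parent level
    return total + node_size            # plus the root node

for n in (1_000_000, 130_000_000):
    print(f"{n:>11,} features -> {rtree_ram_bytes(n) / 2**30:.2f} GiB")
```

Under these assumptions a dataset on the order of 10^8 features needs several GiB for the index alone, before any sorting buffers, which is consistent with the swap sizes discussed above.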
I get the same error on Amazon Linux 2.
Also, on either OS with a source .parquet instead of .fgb.
On 9/28/23 10:17, Scott via gdal-dev wrote:
USA.fgb is 36 GB. I've renamed it from its original source which can be
found here:
https://beta.source.coop/vida/google-microsoft-open-buildings
ogr2ogr -sql "select area_in_meters from bfp_USA" -nln footprints
footprints.fgb ~/Downloads/USA.fgb
GDAL 3.7.1
OS Debian Buster
On 28/09/2023 at 18:47, Scott via gdal-dev wrote:
I should have been more specific.
One particular machine has 8GB of memory. When I try to do the most
simple ogr2ogr command on large files, the host runs out of memory
(vmstat shows this) and ogr2ogr terminates with 'Killed', nothing more.
The data formats I have experienced this with are
Sent: Wednesday, 27 September 2023 6:16
To: gdal-dev@lists.osgeo.org
Subject: [gdal-dev] Using ogr2ogr with limited memory
Any tips for using ogr2ogr to use only a specified amount of RAM? I'm not
having any luck.
Hello,
You might want to say which drivers you're using. I can imagine PostGIS with
PG_USE_COPY using more memory with a larger batch size (ogr2ogr -gt), for
example. But most likely you're not going to be able to do anything about it.
So how much memory are we talking about, and which formats?
Best regards,
Abel
-Original Message-
From: gdal-dev On behalf of Scott via gdal-dev
Sent: Wednesday, 27 September 2023 5:16
To: gdal-dev@lists.osgeo.org
Subject: [gdal-dev] Using ogr2ogr with limited memory
Any tips for using ogr2ogr to use only a specified amount of RAM? I'm
not having any luck.
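On the original question itself: ogr2ogr has no flag to cap its own RAM, but on Linux you can cap the process's address space so it fails fast with an allocation error instead of being OOM-killed. A sketch using only Python's standard library; the 6 GiB figure and the ogr2ogr arguments (shown in a comment) are assumptions drawn from this thread.

```python
# Run a command under an address-space cap (Linux) using the stdlib.
import resource
import subprocess
import sys

def run_capped(cmd, max_bytes):
    def cap():  # applied in the child just before exec
        resource.setrlimit(resource.RLIMIT_AS, (max_bytes, max_bytes))
    return subprocess.run(cmd, preexec_fn=cap)

# Real use would look like (not executed here):
# run_capped(["ogr2ogr", "-nln", "footprints", "footprints.fgb", "USA.fgb"],
#            6 * 1024**3)

# Demo with a Python child: a 1 GiB allocation under a 512 MiB cap dies
# with a MemoryError instead of triggering the kernel OOM killer.
result = run_capped([sys.executable, "-c", "bytearray(1024**3)"],
                    512 * 1024 * 1024)
print("child exit code:", result.returncode)
```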
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
https://lists.osgeo.org/mailman/listinfo/gdal-dev