Hi Nikos,
I am using a similar approach to yours to extract data from OSM files to a GIS
format. I chose SpatiaLite as a format since it is superior to the Shapefile
format in all areas.
FYI here’s what I do:
osmconvert -b="5.5,49,8,50.5" -o=saarland.osm.pbf "europe-latest.osm.pbf"
ogr2ogr --co
I'm pretty sick today, so I'll hold off making the change until tomorrow
(unless anyone wants to beat me to it)
-kurt
On Tue, Nov 7, 2017 at 4:02 PM, Roarke Gaskill
wrote:
> > Unfortunately given how the GRIB degrib and underlying libraries used
> are done, you cannot get metadata without proces
> Unfortunately given how the GRIB degrib and underlying libraries used are
done, you cannot get metadata without processing the whole file.
Oh, so the OOM would happen even when calling gdalinfo? I was assuming it
was later processing that was getting the OOM.
On Tue, Nov 7, 2017 at 5:58 PM, Even
On Tuesday 7 November 2017 17:45:57 CET Roarke Gaskill wrote:
> It seems inappropriate (even as a quick hack) to put the size check in the
> grib parser.
If we default to knMaxAlloc = INT_MAX, given that *ndpts is g2int, then the
check will be a no-op. Actually the compiler and/or static analyze
It seems inappropriate (even as a quick hack) to put the size check in the
grib parser. With the check there, you are not able to run simple utilities
like gdalinfo on large files. What if I wanted to use gdalinfo to find out
if the file is too big to process?
If it is decided to continue with thi
On Tuesday 7 November 2017 15:34:15 CET Kurt Schwehr wrote:
> Why not something like this and just let me pick a small number?
>
> #ifdef GRIB_MAX_ALLOC
> const int knMaxAlloc = GRIB_MAX_ALLOC;
> #else
> const int knMaxAlloc = some massive number;
> #endif
>
Yes, looks reasonable. I guess "some m
Why not something like this and just let me pick a small number?
#ifdef GRIB_MAX_ALLOC
const int knMaxAlloc = GRIB_MAX_ALLOC;
#else
const int knMaxAlloc = some massive number;
#endif
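A minimal, self-contained sketch of how such a build-time cap might gate the allocation (the helper `alloc_points` and its placement are hypothetical; the real check under discussion lives in g2_unpack5.c):

```c
#include <limits.h>
#include <stdint.h>
#include <stdlib.h>

/* Allow the cap to be overridden at build time,
   e.g. cc -DGRIB_MAX_ALLOC=1000000 ... */
#ifdef GRIB_MAX_ALLOC
static const int knMaxAlloc = GRIB_MAX_ALLOC;
#else
static const int knMaxAlloc = INT_MAX; /* effectively no cap for a 32-bit count */
#endif

/* Hypothetical guard: reject point counts outside (0, knMaxAlloc]
   before allocating, instead of letting a corrupted count drive an OOM. */
static int32_t *alloc_points(int32_t ndpts)
{
    if (ndpts <= 0 || ndpts > knMaxAlloc)
        return NULL;
    return (int32_t *)malloc((size_t)ndpts * sizeof(int32_t));
}
```

Building with -DGRIB_MAX_ALLOC picks the small number; without it the guard still rejects non-positive counts but otherwise passes everything through, as Even notes above.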
On Tue, Nov 7, 2017 at 3:18 PM, Even Rouault
wrote:
> On Tuesday 7 November 2017 14:21:27 CET Kurt Schwehr wrote:
On Tuesday 7 November 2017 14:21:27 CET Kurt Schwehr wrote:
> Yeah, 1 << ## would have been better. Sigh.
>
> But really, someone needs to work through the logic of this stuff and do
> something that actually works through what is reasonable. Code that is
> intelligent and explains why it is doing
Yeah, 1 << ## would have been better. Sigh.
But really, someone needs to work through the logic of this stuff and do
something that actually works through what is reasonable. Code that is
intelligent and explains why it is doing what it does is far preferable.
OOM is not okay in the world I work in, so I tr
Wouldn't the max size be limited by the number of bytes read? So in
this case 4 bytes.
http://www.nco.ncep.noaa.gov/pmb/docs/grib2/grib2_sect5.shtml
Looking at netcdf's implementation they treat the value as a 32 bit signed
int.
https://github.com/Unidata/thredds/blob/5.0.0/grib/src/main/ja
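To illustrate the point about the 4-byte field: 4 octets read as a signed 32-bit integer (the treatment the netCDF reader uses) can never exceed INT32_MAX, which bounds the count regardless of any extra cap. A sketch, where `be32` is an illustrative helper and not the g2clib decoder:

```c
#include <stdint.h>

/* Decode 4 big-endian octets as a signed 32-bit integer, mirroring how a
   4-octet GRIB2 count is bounded when treated as signed. Illustrative
   only; not the actual g2clib routine. */
static int32_t be32(const unsigned char b[4])
{
    return (int32_t)(((uint32_t)b[0] << 24) | ((uint32_t)b[1] << 16) |
                     ((uint32_t)b[2] << 8) | (uint32_t)b[3]);
}
```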
On Tuesday 7 November 2017 13:51:30 CET Kurt Schwehr wrote:
> It's possible to cause massive allocations with a tiny corrupted grib file
> causing an out-of-memory. I found that case with the llvm ASAN fuzzer. If
> you have a specification that gives a more reasoned maximum or a better
> overall ch
It's possible to cause massive allocations with a tiny corrupted grib file
causing an out-of-memory. I found that case with the llvm ASAN fuzzer. If
you have a specification that gives a more reasoned maximum or a better
overall check, I'm listening. I definitely think the sanity checking can
be
> 2. Is OGR handling well the conversion from .osm to ESRI Shapefiles?
Yes, but within the limits of the shapefile, and particularly .dbf, format:
limitation to 254 characters for field values, 10 characters for field
names... which are easily violated by OSM extracts.
Spatialite, GeoPackage,
Vincent,
digging into the GDAL 2.2.0 NEWS file, I see:
"""
ASRP driver:
* fix georeferencing of polar arc zone images (#6560)
So if there's an issue it is mostly related to that change, rather than on the
gdalwarp /
GDALSuggestedWarpOutput side of things
And that fix was backported to GD
Hi,
Why is the number of points greater than 33554432 considered nonsense?
https://github.com/OSGeo/gdal/blob/trunk/gdal/frmts/grib/degrib18/g2clib-1.0.4/g2_unpack5.c#L77
Thanks,
Roarke
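For what it's worth, the constant in question is an exact power of two, which is presumably why the shifted form was suggested upthread; per Kurt's earlier message it does not appear to come from a GRIB2 specification limit. A quick check:

```c
/* The cap questioned above: 33554432 is exactly 1 << 25 (2^25). */
static const int kGribPointCap = 33554432;
static const int kGribPointCapShifted = 1 << 25;
```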
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
https://lists.
Still no ideas about this bug?...
Gabriel
On 02/11/2017 at 16:00, Gabriel Vatin wrote:
Hi all,
I'm working with Python 3.6 32-bit, GDAL package
(GDAL-2.2.2-cp36-cp36m-win32.whl) I've installed from here:
https://www.lfd.uci.edu/~gohlke/pythonlibs/
I also installed the gdal-202-1800-co
Dear OSM experts,
a reproducible workflow that extracts specific tags from OSM's Planet
data set includes:
1. downloading OSM Planet in the form of a pbf file
2. converting pbf to o5m using `osmconvert`
3. extracting areas of interest into separate `aoi_*.o5m` files, based on
boundaries in the form of .poly
Peter Baumann wrote on 07.11.2017 at 16:29:
Ari-
On 11/07/2017 09:56 AM, Ari Jolma wrote:
To get multidimensional data into GDAL 2D domain, one may need to slice the
data at some point in the non x/y dimensions. I'm testing this with the
Rasdaman server. My code currently creates a KVP
"SUBS
Ari-
On 11/07/2017 09:56 AM, Ari Jolma wrote:
> Thanks for the replies before. I believe I'm getting closer, but slowly.
>
> In my earlier question I was referring to three CRSs. I now believe that it is
> only two since I had overlooked requirement 23, which to my understanding
> requires that t
Thanks for the replies before. I believe I'm getting closer, but slowly.
In my earlier question I was referring to three CRSs. I now believe that
it is only two since I had overlooked requirement 23, which to my
understanding requires that the CRS of the coverage (top level boundedBy
element)