Dear all,
I've been looking around the GDAL codebase and I can't see anything that
would deal with wide/multibyte characters in file names/paths on Windows.
Would you please confirm that on Windows GDAL will only be able to open
files with pure ASCII names, and not names in international character sets?
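For concreteness, the kind of test I have in mind looks roughly like this
(a sketch using the Python bindings; the non-ASCII file name below is just
an invented example):

    from osgeo import gdal

    # Hypothetical file whose name contains non-ASCII (Polish) characters.
    path = u'C:\\dane\\zażółć.tif'
    # Pass the name as a UTF-8 byte string; does it reach the driver intact?
    ds = gdal.Open(path.encode('utf-8'))
    print('open failed' if ds is None else 'open succeeded')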
-M
___
Mateusz Loskot wrote:
>
> Frank Warmerdam wrote:
>> Mateusz Loskot wrote:
>>> Hi,
>>>
>>> When the upcoming 1.7.0 will get its own branch in SVN?
>>
>> Mateusz,
>>
>> My normal practice is to produce a 1.7 branch at the point the first
>> RC is prepared.
>
> Frank,
>
> Great. Thanks!
>
Fra
Even,
We use the JP2ECW driver.
I did the valgrind test and did not see any reported leak. Here is some of
the output from valgrind:
==11469== Invalid free() / delete / delete[]
==11469==    at 0x4CE: free (in /usr/lib64/valgrind/amd64-linux/vgpreload_memcheck.so)
==11469==    by 0x95D1CDA:
Jason Roberts wrote:
> Mateusz,
>
> Thank you very much for your insight. I have a few more questions I'm
> hoping you could answer.
>
>> An alternative is to try to divide the tasks: 1. Query features from
>> the data source using the spatial index capability of the data source.
>> 2. Having only subject featu
Jan Hartmann wrote:
> On 13-1-2010 2:33, Mateusz Loskot wrote:
>>
>> OGR does not provide any spatial indexing layer common to various
>> vector datasets. For many simple formats it performs a
>> brute-force selection.
>>
> Just curious, would it make sense / be possible to implement indexing
Ozy,
The interesting info is that your input image is JPEG2000 compressed.
This explains why you were able to read a scanline-oriented NITF with
blockwidth > . My guess would be that the leak is in the JPEG2000
driver in question, so this may be more a problem on the reading part
than on
Greg,
You've probably missed that the issue raised by Ozy was with NITF, not
with GeoTIFF.
Motion: Frank Warmerdam is authorized to negotiate a paid maintainer
contract with Chaitanya Kumar CH for up to $9,360 USD at $13 USD/hr over
six months, and would be acting as supervisor, operating under the terms
of RFC 9 (GDAL Paid Maintainer Guidelines).
---
Folks,
Chaitanya's current paid ma
>
> Date: Wed, 13 Jan 2010 10:27:43 -0500
> From: "Jason Roberts"
> Subject: RE: [gdal-dev] Open source vector geoprocessing libraries?
> To: "'Peter J Halls'"
> Cc: 'gdal-dev'
> Message-ID: <008001ca9464$f4059f10$dc10dd...@roberts@duke.edu>
> Content-Type: text/plain; charset="US-ASCII"
>
Update:
after more than 20 minutes of being unresponsive, the OS finally regained
functionality and promptly killed gdal_translate about 80% of the way into
the process.
On Wed, Jan 13, 2010 at 11:14 AM, ozy sjahputera wrote:
> Hi Even,
>
> yes, I tried:
> gdal_translate -of "NITF" -co "ICORDS=G"
Hi Even,
yes, I tried:
gdal_translate -of "NITF" -co "ICORDS=G" -co "BLOCKXSIZE=128" -co
"BLOCKYSIZE=128" NITF_IM:0:input.ntf output.ntf
I monitored the memory use with top and it was steadily increasing until it
reached 98.4% (I have 8 GB of RAM and 140 GB of local disk for swap etc.)
before the
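If it helps to narrow this down, I could also read the input block by block
with the Python bindings and no output at all, to see whether memory grows
on the reading side alone (just a sketch; the block size simply mirrors the
creation options above):

    from osgeo import gdal

    src = gdal.Open('NITF_IM:0:input.ntf')
    band = src.GetRasterBand(1)
    bs = 128
    for yoff in range(0, src.RasterYSize, bs):
        rows = min(bs, src.RasterYSize - yoff)
        for xoff in range(0, src.RasterXSize, bs):
            cols = min(bs, src.RasterXSize - xoff)
            # Read one window and discard it; if memory still climbs,
            # the growth is on the reading/decoding side.
            band.ReadRaster(xoff, yoff, cols, rows)
    src = None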
Martin Chapman wrote:
Frank,
In the file NITFDatasetCreate.cpp, in the function NITFDatasetCreate(), if
the compression option is set to C8 (JPEG2000), it looks like you:
1. get a handle to an installed J2K driver if available.
2. test for metadata creation capability.
3. create the
Jan Hartmann wrote:
Is that so? Reading the OGR API tutorial
(http://www.gdal.org/ogr/ogr_apitut.html), I see that all geometries,
from whatever input source, are represented internally through a generic
OGRGeometry pointer; OGRGeometry is a virtual base class for all concrete
geometry classes (http://www.g
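In the Python bindings that genericity looks roughly like this (a sketch;
the file name is only a placeholder):

    from osgeo import ogr

    ds = ogr.Open('parcels.shp')            # placeholder data source
    layer = ds.GetLayer(0)
    for feature in layer:
        geom = feature.GetGeometryRef()     # generic geometry handle
        if geom is None:
            continue
        # Dispatch on the concrete type behind the generic interface.
        if geom.GetGeometryType() == ogr.wkbPolygon:
            print('polygon, area = %g' % geom.GetArea())
        else:
            print(geom.GetGeometryName())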
As a practical matter, I do not see this restriction in GDAL. On Thu 21
Sep 2006, I created with gdal_merge.py a 3 GB .tif having 18,400 columns by
52,800 rows by RGB. On Thu 11 Dec 2009, gdal_translate processed a 150 GB
untiled .tif to a tiled .tif with 260,000 columns by 195,000 rows. Gr
Jason,
Jason Roberts wrote:
Peter,
are you constrained to retaining your data in an ArcGIS compatible format?
We are attempting to build tools that can work with data stored in a variety
of formats. Our current user community uses mostly shapefiles, ArcGIS
personal geodatabases, and ArcGIS
Hi Duarte,
Thanks for the suggestions.
I took a look at GeoKettle. Here are some relevant excerpts from a document:
"GeoKettle is a ... powerful, metadata‐driven spatial ETL tool dedicated to the
integration of different spatial data sources for building/updating geospatial
data warehouses. At
Jan Hartmann wrote:
On 13-1-2010 15:49, Ari Jolma wrote:
Jan Hartmann wrote:
Just curious, would it make sense / be possible to implement
indexing in OGR, something like a
generalized version of MapServer's shptree, the "quadtree-based
spatial index for a shapefile"?
http://mapserver.or
Peter,
> are you constrained to retaining your data in an ArcGIS compatible format?
We are attempting to build tools that can work with data stored in a variety
of formats. Our current user community uses mostly shapefiles, ArcGIS
personal geodatabases, and ArcGIS file geodatabases. Many of them
On 13-1-2010 15:49, Ari Jolma wrote:
Jan Hartmann wrote:
Just curious, would it make sense / be possible to implement indexing
in OGR, something like a
generalized version of MapServer's shptree, the "quadtree-based
spatial index for a shapefile"?
http://mapserver.org/utilities/shptree.
Mateusz,
Thank you very much for your insight. I have a few more questions I'm hoping
you could answer.
> An alternative is to try to divide the tasks:
> 1. Query features from the data source using the spatial index capability
> of the data source.
> 2. Having only subject features selected, apply geometric pr
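If I understand the suggestion, with the Python bindings it would look
roughly like this (a sketch; the file name and the subject geometry are
placeholders):

    from osgeo import ogr

    ds = ogr.Open('roads.shp')               # placeholder data source
    layer = ds.GetLayer(0)
    subject = ogr.CreateGeometryFromWkt(
        'POLYGON((0 0,0 10,10 10,10 0,0 0))')  # hypothetical subject geometry

    # Step 1: coarse selection through the data source's spatial filter
    # (this is where a format's own spatial index would be used).
    layer.SetSpatialFilter(subject)

    # Step 2: exact geometric predicate on the reduced candidate set.
    hits = []
    for feature in layer:
        geom = feature.GetGeometryRef()
        if geom is not None and geom.Intersects(subject):
            hits.append(feature.GetFID())
    print('%d features intersect the subject' % len(hits))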
Jan Hartmann wrote:
On 13-1-2010 2:33, Mateusz Loskot wrote:
OGR does not provide any spatial indexing layer common to various
vector datasets. For many simple formats it performs a brute-force
selection.
Just curious, would it make sense / be possible to implement indexing
in OGR, some
On 13-1-2010 2:33, Mateusz Loskot wrote:
OGR does not provide any spatial indexing layer common to various
vector datasets. For many simple formats it performs a brute-force
selection.
Just curious, would it make sense / be possible to implement indexing in
OGR, something like a
genera
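For shapefiles at least, I think OGR can already build the MapServer-style
.qix index through the Shapefile driver's SQL dialect, roughly like this
(a sketch; the file and layer names are placeholders):

    from osgeo import ogr

    # Open the shapefile in update mode so the index file can be written.
    ds = ogr.Open('parcels.shp', 1)          # placeholder shapefile
    # Shapefile-driver SQL; writes a MapServer-compatible .qix index file.
    ds.ExecuteSQL('CREATE SPATIAL INDEX ON parcels')
    ds = None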
Jason,
Have you looked at GeoKettle [1]? And recently I found GearScape [2], which
seemed very interesting to me. Though neither is based on python...
Duarte Carreira
[1] - http://sourceforge.net/projects/geokettle/
[2] - http://www.fergonco.es/gearscape/index.php
__
Frank,
In the file NITFDatasetCreate.cpp, in the function NITFDatasetCreate(), if the
compression option is set to C8 (JPEG2000), it looks like you:
1. get a handle to an installed J2K driver if available.
2. test for metadata creation capability.
3. create the nitf file.
4.
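For reference, the same kind of capability check can be sketched from the
Python bindings (the list of candidate JPEG2000 drivers below is only an
assumption):

    from osgeo import gdal

    # Candidate JPEG2000 drivers, in a hypothetical order of preference.
    for name in ('JP2ECW', 'JP2KAK', 'JP2MrSID', 'JPEG2000'):
        drv = gdal.GetDriverByName(name)
        if drv is None:
            continue                         # driver not built in
        create = drv.GetMetadataItem('DCAP_CREATE') == 'YES'
        copy = drv.GetMetadataItem('DCAP_CREATECOPY') == 'YES'
        print('%s: Create=%s, CreateCopy=%s' % (name, create, copy))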
Greg Coats writes:
> I find that with GDAL version 1.6.3, released 2009/11/19, gdal_translate
> fully supports reading and writing a 150 GB GeoTIFF image of 260,000 columns
> by 195,000 rows by RGB. Greg
Hi,
The problem is not the image size itself. It may be related to, as
mentioned earlier,
Jason,
are you constrained to retaining your data in an ArcGIS compatible format?
If so and if you do not have ArcSDE, then what follows may not be much help.
Otherwise, I think it likely that you will find using a DBMS as your data
repository advantageous for many reasons. Apart from the