Even,
we just upgraded to GDAL 1.7. I tested gdal_translate and CreateCopy() again,
and it still dies under similar conditions.
Since valgrind did not detect any memory leak related to CreateCopy(), I
suspect this problem is caused by poor memory management in CreateCopy(). It
seems to be con...
Even,
We use the JP2ECW driver.
I did the valgrind test and did not see any reported leak. Here is some of
the output from valgrind:
==11469== Invalid free() / delete / delete[]
==11469==    at 0x4CE...: free (in /usr/lib64/valgrind/amd64-linux/vgpreload_memcheck.so)
==11469==    by 0x95D1CDA: ...
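
For reference, a memcheck run of this kind is usually invoked along these lines
(the program name here is only a placeholder for whatever small test driver calls
CreateCopy()):

valgrind --leak-check=full --show-reachable=yes ./nitf_copy_test NITF_IM:0:input.ntf output.ntf
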
Ozy,
The interesting info is that your input image is JPEG2000-compressed.
This explains why you were able to read a scanline-oriented NITF with
blockwidth > ... My guess would be that the leak is in the JPEG2000
driver in question, so this may be more a problem on the reading part
than on...
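
One way to check that, assuming the source can be read as 8-bit data, is a small
read-only pass that walks the image in 128x128 windows without writing anything;
if memory still climbs during this, the growth is on the decode side. A minimal
sketch (file name is a placeholder):

#include "gdal_priv.h"
#include "cpl_conv.h"

int main()
{
    GDALAllRegister();

    GDALDataset *poDS = (GDALDataset *) GDALOpen("NITF_IM:0:input.ntf", GA_ReadOnly);
    if (poDS == NULL)
        return 1;

    GDALRasterBand *poBand = poDS->GetRasterBand(1);
    const int nXSize = poDS->GetRasterXSize();
    const int nYSize = poDS->GetRasterYSize();
    GByte *pabyBuf = (GByte *) CPLMalloc(128 * 128);   /* assumes Byte data */

    /* Pure read pass: decode every 128x128 window, discard the pixels. */
    for (int nYOff = 0; nYOff < nYSize; nYOff += 128)
    {
        for (int nXOff = 0; nXOff < nXSize; nXOff += 128)
        {
            const int nXReq = MIN(128, nXSize - nXOff);
            const int nYReq = MIN(128, nYSize - nYOff);
            poBand->RasterIO(GF_Read, nXOff, nYOff, nXReq, nYReq,
                             pabyBuf, nXReq, nYReq, GDT_Byte, 0, 0);
        }
    }

    CPLFree(pabyBuf);
    GDALClose(poDS);
    return 0;
}
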
Greg,
You've probably missed that the issue raised by Ozy was with NITF, not
with GeoTIFF.
> As a practical matter, I do not see this restriction in GDAL. On
> Thu 21 Sep 2006, I created with gdal_merge.py a 3 GB .tif having
> 18,400 columns by 52,800 rows by RGB. On Thu 11 Dec
> 2009, gdal...
Update:
after more than 20 minutes of being non-responsive, the OS finally regained
functionality and promptly killed gdal_translate about 80% of the way into the
process.
On Wed, Jan 13, 2010 at 11:14 AM, ozy sjahputera wrote:
> Hi Even,
>
> yes, I tried:
> gdal_translate -of "NITF" -co "ICORDS=G"
Hi Even,
yes, I tried:
gdal_translate -of "NITF" -co "ICORDS=G" -co "BLOCKXSIZE=128" -co
"BLOCKYSIZE=128" NITF_IM:0:input.ntf output.ntf
I monitored the memory use with top, and it was steadily increasing until it
reached 98.4% (I have 8 GB of RAM and 140 GB of local disk for swap etc.)
before the...
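
One thing worth ruling out here is the GDAL raster block cache: its size can be
capped (value in MB) with the GDAL_CACHEMAX configuration option, for example:

gdal_translate --config GDAL_CACHEMAX 512 -of "NITF" -co "ICORDS=G" -co "BLOCKXSIZE=128" -co "BLOCKYSIZE=128" NITF_IM:0:input.ntf output.ntf

If memory still grows far past that cap, the allocations are coming from somewhere
other than the block cache.
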
As a practical matter, I do not see this restriction in GDAL. On Thu 21
Sep 2006, I created with gdal_merge.py a 3 GB .tif having 18,400 columns by
52,800 rows by RGB. On Thu 11 Dec 2009, gdal_translate processed a 150 GB
untiled .tif to a tiled .tif with 260,000 columns by 195,000 rows. Greg
I find that with GDAL version 1.6.3, released 2009/11/19, gdal_translate fully
supports reading and writing a 150 GB GeoTIFF image of 260,000 columns by 195,000
rows by RGB. Greg
$ tiffinfo dcua0002_BigTiffYES.tif
Image Width: 260000 Image Length: 195000
Tile Width: 512 Tile Length: 512
Resolution: ...
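
A tiled BigTIFF with that layout can be produced with the standard GTiff creation
options; the exact command isn't shown above, but it would be something along the
lines of (the input name is a placeholder):

gdal_translate -of GTiff -co "TILED=YES" -co "BLOCKXSIZE=512" -co "BLOCKYSIZE=512" -co "BIGTIFF=YES" input.tif dcua0002_BigTiffYES.tif
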
Ozy,
Did you try with gdal_translate -of NITF src.tif output.tif -co
BLOCKSIZE=128? Does it give similar results?
I'm a bit surprised that you even managed to read a 40K x 100K NITF
file organized as scanlines. There was a limit until very recently that
prevented reading blocks whose o...
I was trying to make a copy of a very large NITF image (about 40K x 100K
pixels) using GDALDriver::CreateCopy(). The new file was set to have a
different block size (the input was a scanline image; the output is to have a
128x128 block size). The program keeps getting killed by the system (Linux).
I monitor the...
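
For context, a minimal sketch of that kind of CreateCopy() call, using the
documented NITF creation options (file names are placeholders), looks roughly
like this:

#include "gdal_priv.h"
#include "cpl_string.h"

int main()
{
    GDALAllRegister();

    /* Open the first image segment of the source NITF read-only. */
    GDALDataset *poSrc = (GDALDataset *) GDALOpen("NITF_IM:0:input.ntf", GA_ReadOnly);
    if (poSrc == NULL)
        return 1;

    GDALDriver *poDriver = GetGDALDriverManager()->GetDriverByName("NITF");

    /* Request a 128x128 blocked output instead of the scanline layout. */
    char **papszOptions = NULL;
    papszOptions = CSLSetNameValue(papszOptions, "ICORDS", "G");
    papszOptions = CSLSetNameValue(papszOptions, "BLOCKXSIZE", "128");
    papszOptions = CSLSetNameValue(papszOptions, "BLOCKYSIZE", "128");

    GDALDataset *poDst = poDriver->CreateCopy("output.ntf", poSrc, FALSE,
                                              papszOptions, GDALTermProgress, NULL);

    CSLDestroy(papszOptions);
    if (poDst != NULL)
        GDALClose(poDst);
    GDALClose(poSrc);
    return 0;
}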