RE: [gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Josh.Vote
Thanks for the suggestions -

> I would like to suggest that you do a gdal_translate from a subset of the
> ERS file at the bottom right corner of the source just to ensure that it
> isn't a problem with reading past the 4GB mark in the ERS file.

I just managed to run 'gdal_translate -of netCDF -s
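For reference, a subset test along those lines might look roughly like the sketch below; the -srcwin offsets are hypothetical and would depend on the raster dimensions that gdalinfo reports for the source.

  # Hypothetical offsets: translate a 1000x1000 pixel window near the
  # bottom-right corner of the ERS source (run gdalinfo input.ers first
  # to get the real raster size and adjust the offsets accordingly).
  gdal_translate -of netCDF -srcwin 49000 39000 1000 1000 input.ers subset.nc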

Re: [gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Nikolaos Hatzopoulos
What kind of netCDF file is causing the problem? Is it a netCDF-4 or netCDF-3 file? There is a compile option in netCDF-4, --enable-netcdf-4.

On Tue, Apr 19, 2011 at 11:35 AM, Kyle Shannon wrote:
> Josh,
> As Frank said, file a ticket and provide the output of ncdump -h
> yourfile.nc with the ticke
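One quick way to answer the netCDF-3 vs. netCDF-4 question, assuming the netCDF command-line tools are installed, is ncdump -k, which prints a file's on-disk format; the --enable-netcdf-4 switch mentioned above is passed when configuring the netCDF library build, not to GDAL.

  # Report the file's format kind: "classic", "64-bit offset", or "netCDF-4".
  ncdump -k output.nc

  # Compile-time option for the netCDF C library build itself:
  ./configure --enable-netcdf-4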

Re: [gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Kyle Shannon
Josh,
As Frank said, file a ticket and provide the output of ncdump -h yourfile.nc with the ticket. I will take a look at it as soon as I can, although I am pretty busy. Thanks.

kss

/**
 *
 * Kyle Shannon
 * ksshan...@gmail.com
 *
 */

On Tue, Apr 19, 2011 at 09:57, Frank Warmerdam wrote:
>
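For completeness, the header dump being requested is just the following; ncdump -h prints dimensions, variables, and attributes without dumping the data, so the output stays small enough to attach to a ticket (the output filename is arbitrary).

  # Dump only the header of the problem file and save it for the ticket.
  ncdump -h yourfile.nc > yourfile_header.txt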

Re: [gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Frank Warmerdam
On 11-04-19 05:01 AM, josh.v...@csiro.au wrote:
Hi, I’m new to GDAL so please forgive any glaring ignorance :) Currently I have an 8GB ER Mapper (ERS) dataset that I want to convert to a NetCDF file with gdal_translate which always results in a segfault when using the following command. gdal_tr

[gdal-dev] gdal_translate segfault with large netCDF

2011-04-19 Thread Josh.Vote
Hi, I'm new to GDAL so please forgive any glaring ignorance :) Currently I have an 8GB ER Mapper (ERS) dataset that I want to convert to a NetCDF file with gdal_translate, which always results in a segfault when using the following command:

gdal_translate -of netCDF input.ers output.nc

Whereas
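A couple of basic diagnostic checks, sketched under the assumption that nothing is known yet about where the crash happens: gdalinfo confirms the 8GB ERS source itself reads cleanly, and GDAL's standard --debug switch often shows the last operation attempted before a segfault.

  # Verify the ERS source opens and note its raster size and band types.
  gdalinfo input.ers

  # Re-run the translation with debug output enabled to narrow down the crash.
  gdal_translate --debug ON -of netCDF input.ers output.nc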