Hi,

Using the C# GDAL FileGDB API v1.5 (https://trac.osgeo.org/gdal/wiki/FileGDB) 
I have written a simple test application that writes and re-reads a polygon 
geometry. The test FileGDB uses WKID 21781 (CH1903_LV03), an XYScale of 
20'000 and an XYTolerance of 0.0004. 
Using the test application I wrote a polygon with the following 
coordinates: 
    POLYGON((
        660440.402         232655.598,
        660419.723456789   232646.789153456,
        660398.588123      232712.410135,
        660477.806         232751.485, ...))
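
For reference, the write path looks roughly like this. This is a minimal 
sketch using the GDAL C# bindings (OSGeo.OGR / OSGeo.OSR); the dataset and 
layer names are placeholders, and the XORIGIN/YORIGIN values are made-up 
stand-ins (the FileGDB driver wants origin and scale set together), not 
what my real application uses:

    using OSGeo.OGR;
    using OSGeo.OSR;

    Ogr.RegisterAll();
    Driver drv = Ogr.GetDriverByName("FileGDB");
    DataSource ds = drv.CreateDataSource("test.gdb", null);

    SpatialReference srs = new SpatialReference("");
    srs.ImportFromEPSG(21781);                    // CH1903_LV03

    // Placeholder origins: per the FileGDB driver docs,
    // XORIGIN/YORIGIN/XYSCALE must be set together.
    string[] lco = {
        "XYTOLERANCE=0.0004",
        "XORIGIN=0", "YORIGIN=0",
        "XYSCALE=20000"
    };
    Layer layer = ds.CreateLayer("polygons", srs, wkbGeometryType.wkbPolygon, lco);

    string wkt = "POLYGON ((660440.402 232655.598,"
               + " 660419.723456789 232646.789153456,"
               + " 660398.588123 232712.410135,"
               + " 660477.806 232751.485,"
               + " 660440.402 232655.598))";      // ring closed for validity
    Geometry geom = Ogr.CreateGeometryFromWkt(ref wkt, srs);

    Feature feature = new Feature(layer.GetLayerDefn());
    feature.SetGeometry(geom);
    layer.CreateFeature(feature);
    ds.Dispose();                                 // flush to disk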

Reading the polygon back with ogrinfo yields the following coordinates:
    MULTIPOLYGON (((
        660440.402         232655.598,
        660419.723450001   232646.78915,
        660398.588100001   232712.410149999,
        660477.806         232751.485, ...)))
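
For completeness, the same rounded values can also be printed through the 
OGR C# bindings instead of ogrinfo; a sketch, with the layer name 
"polygons" again a placeholder:

    using System;
    using OSGeo.OGR;

    Ogr.RegisterAll();
    DataSource ds = Ogr.Open("test.gdb", 0);      // 0 = read-only
    Layer layer = ds.GetLayerByName("polygons");

    Feature feature;
    while ((feature = layer.GetNextFeature()) != null)
    {
        string wkt;
        feature.GetGeometryRef().ExportToWkt(out wkt);
        Console.WriteLine(wkt);                   // same text ogrinfo reports
    }
    ds.Dispose();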

Re-reading the geometry with the C# Esri.FileGDBAPI yields the following coordinates:
    point[0]  660440.40199999884   232655.59800000116
    point[1]  660419.72345000133   232646.78914999962
    point[2]  660398.58810000122   232712.41014999896
    point[3]  660477.80600000173   232751.4849999994 
    ...

Given the XYScale of 20'000 it is expected that the position 660398.588123 
is snapped to 660398.58810. But looking at the value 660398.588100001, the 
coordinates do not seem to be stored as whole grid numbers. 
With the Esri.FileGDBAPI the values differ even more. It looks like a 
floating-point accuracy problem, but I don't really understand why. How are 
the coordinates stored in a FileGDB, and why do the values read back differ 
between ogrinfo and the Esri.FileGDBAPI? 
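
To illustrate what I would expect, here is a standalone sketch of my 
understanding of the snapping: coordinates are stored as integer counts on 
a 1/XYScale grid relative to some origin and converted back on read. The 
origin of 0 is a made-up placeholder; real FileGDBs use CRS-specific 
origins I don't know:

    using System;

    double xyScale = 20000.0;          // grid resolution 1/20000 = 0.00005 units
    double xOrigin = 0.0;              // placeholder; the real origin is CRS-specific

    double x = 660398.588123;
    long grid = (long)Math.Round((x - xOrigin) * xyScale);   // 13207971762
    double restored = xOrigin + grid / xyScale;

    Console.WriteLine(grid);                     // the integer itself is exact
    Console.WriteLine(restored.ToString("G17")); // 660398.58810000005, because
                                                 // 660398.5881 has no exact
                                                 // double representation

Even in this ideal case the restored double is not exactly 660398.58810, 
simply because that decimal has no exact binary representation; whether the 
larger differences I observe come from a non-zero origin entering the 
reconstruction is part of what I don't understand.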

Thanks in anticipation
Benno Wiss