Hello Tamas and Dennis,

This is my personal opinion and there may be tons of people who disagree, but I think it is saner to always assume UTF-8 encoding. There is functionality among the various drivers to accept/return strings in different encodings (through the CPL_Recode mechanism), but they all do it with different parameters (e.g. "PGCLIENTENCODING" in PostgreSQL, "SHAPE_ENCODING" in ESRI Shapefiles, etc.). That is overly complicated for programmatic access.
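To illustrate the inconsistency (a hedged sketch, not a recommendation: the option names are real GDAL/OGR config options, but the file and connection names are made up):

```
# Shapefile driver: source encoding is controlled by SHAPE_ENCODING
ogr2ogr --config SHAPE_ENCODING CP1252 out.gpkg in.shp

# PostgreSQL driver: a different, driver-specific option entirely
ogr2ogr --config PGCLIENTENCODING LATIN1 out.gpkg PG:"dbname=mydb"
```

Every driver a caller targets needs its own special-cased knob, which is exactly the complexity the bindings would have to reproduce.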
Strings being set should use UTF-8 encoding, and strings being read should be assumed to be UTF-8. I am not sure about column names, layer names and such, but I would rather find the drivers that are not assuming UTF-8 and fix those than do some extra complicated magic in the C# bindings. It simplifies the logic greatly. If the underlying data store uses Windows-1252 encoding internally, that should be handled and abstracted away by the driver itself (by converting to UTF-8), not by the bindings.

My two cents,
- Ragi
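P.S. The convention proposed above can be sketched in a few lines. This is a hypothetical helper pair (the names `to_driver`/`from_driver` are mine, not part of the GDAL C# bindings): every string crossing the bindings boundary is UTF-8 on the way in and assumed UTF-8 on the way out, with no per-driver branching.

```python
def to_driver(s: str) -> bytes:
    """Strings being set always cross the boundary as UTF-8 bytes."""
    return s.encode("utf-8")

def from_driver(b: bytes) -> str:
    """Strings being read are always assumed to be UTF-8."""
    return b.decode("utf-8")

# Round-trips survive non-ASCII field values such as accented names.
name = "Kürtőskalács"
assert from_driver(to_driver(name)) == name
```

If a data store is internally Windows-1252, the driver would recode to and from UTF-8 on its side, so callers of the bindings never see anything but these two functions.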