We’re trying to gdal.Polygonize a feature that exceeds SQLite’s default maximum string/blob length of 1,000,000,000 bytes. Curiously, we’re using the memory dataset, but it appears to use SQLite under the hood, as evidenced by the error message:

ERROR 1: In GetNextRawFeature(): sqlite3_step() : string or blob too big
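
For context, here is a minimal sketch of roughly what we’re doing (the input raster filename and the DN field layout are placeholders, not our actual code):

    from osgeo import gdal, ogr, osr

    gdal.UseExceptions()

    src = gdal.Open("input.tif")  # placeholder input raster
    band = src.GetRasterBand(1)

    # Vector output goes to the in-memory driver; this is where the
    # SQLite-backed implementation surfaces the length limit once a
    # polygonized geometry blob exceeds 1,000,000,000 bytes.
    mem = ogr.GetDriverByName("Memory").CreateDataSource("out")
    srs = osr.SpatialReference(wkt=src.GetProjection())
    layer = mem.CreateLayer("polygonized", srs=srs, geom_type=ogr.wkbPolygon)
    layer.CreateField(ogr.FieldDefn("DN", ogr.OFTInteger))

    # A feature whose geometry blob exceeds the limit triggers the
    # sqlite3_step() error quoted above.
    gdal.Polygonize(band, None, layer, 0)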
The SQLite docs state that the limit can be changed at runtime via the sqlite3_limit(db, SQLITE_LIMIT_LENGTH, ...) interface (https://www.sqlite.org/c3ref/limit.html and https://www.sqlite.org/c3ref/c_limit_attached.html#sqlitelimitlength). Maybe there could be some way to leverage PRELUDE_STATEMENTS to achieve this, but we are not “opening” a dataset - we’re creating one directly from the memory driver.
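
For reference, the runtime limit the SQLite docs describe is the same one Python’s own sqlite3 module exposes as Connection.setlimit (Python 3.11+). This snippet just illustrates the interface on a plain SQLite connection; it is not something GDAL exposes, as far as we can tell:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    print(conn.getlimit(sqlite3.SQLITE_LIMIT_LENGTH))  # 1000000000 by default

    # Request a higher cap. Note that sqlite3_limit() silently truncates to
    # the compile-time hard maximum (SQLITE_MAX_LENGTH), which also defaults
    # to 1,000,000,000 bytes, so the particular SQLite build matters here too.
    conn.setlimit(sqlite3.SQLITE_LIMIT_LENGTH, 2_000_000_000)
    print(conn.getlimit(sqlite3.SQLITE_LIMIT_LENGTH))

(As far as we can tell there is also no PRAGMA for this limit, so PRELUDE_STATEMENTS SQL alone may not be able to change it.)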

What is the best way to work around this?

Best,
Jesse

Lead Computer Scientist
Science Systems and Applications, Inc.
Dr Compton Tucker Team
NASA Goddard Space Flight Center
_______________________________________________
gdal-dev mailing list
gdal-dev@lists.osgeo.org
https://lists.osgeo.org/mailman/listinfo/gdal-dev