#36144: DatabaseOperations.bulk_batch_size() should consider more database
limits on SQLite and Oracle
-------------------------------------+-------------------------------------
     Reporter:  Sarah Boyce          |                    Owner:  (none)
         Type:                       |                   Status:  new
  Cleanup/optimization               |
    Component:  Database layer       |                  Version:  5.1
  (models, ORM)                      |
     Severity:  Normal               |               Resolution:
     Keywords:                       |             Triage Stage:
                                     |  Unreviewed
    Has patch:  0                    |      Needs documentation:  0
  Needs tests:  0                    |  Patch needs improvement:  0
Easy pickings:  0                    |                    UI/UX:  0
-------------------------------------+-------------------------------------
Description changed by Sarah Boyce:

Old description:

> `DatabaseOperations.bulk_batch_size()` is used to calculate the maximum
> batch size when doing operations such as bulk_update and bulk_create
>
> When investigating the impact of composite primary keys on the maximum
> batch size calculation for `bulk_update()`, it became clear that there
> are more database limits that need to be considered when calculating the
> maximum batch size in order to have a bullet proof solution.
>
> One possible limit in play on SQLite is `SQLITE_MAX_EXPR_DEPTH` which is
> 1000 (see https://www.sqlite.org/limits.html#max_expr_depth).
> On Oracle, we found that a query could error with the ambiguous message:
> `ORA-00907: missing right parenthesis`, which may be due to hitting some
> limit (possibly documented here:
> https://docs.oracle.com/en/database/oracle/oracle-database/23/lnpls
> /plsql-program-limits.html)
>
> We may need to revisit the API design.
>
> PR discussion:
> https://github.com/django/django/pull/19088#discussion_r1929940327

New description:

 `DatabaseOperations.bulk_batch_size()` is used to calculate the maximum
 batch size when doing operations such as `bulk_update()` and
 `bulk_create()`.
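 The role of the calculation can be illustrated with a minimal sketch (not
 Django's actual implementation; the constant and helper names here are
 hypothetical): the batch size is capped so that one bind parameter per
 field per row stays under a per-query parameter limit, and callers then
 slice the objects into batches of that size.

```python
# Illustration only, not Django's code: cap batch size by a per-query
# bind-parameter limit, then slice objects into batches of that size.
SQLITE_MAX_VARIABLE_NUMBER = 999  # SQLite's historical default host-parameter limit


def bulk_batch_size(num_fields, num_objs):
    """Largest batch size keeping one parameter per field per row
    under the host-parameter limit."""
    if num_fields == 0:
        return num_objs
    return max(1, SQLITE_MAX_VARIABLE_NUMBER // num_fields)


def batched(objs, batch_size):
    """Slice objs into consecutive batches of at most batch_size."""
    return [objs[i:i + batch_size] for i in range(0, len(objs), batch_size)]


objs = list(range(2500))
size = bulk_batch_size(num_fields=3, num_objs=len(objs))
print(size)                      # 333 (999 // 3)
print(len(batched(objs, size)))  # 8 batches
```

 A composite primary key multiplies the per-row parameter count, which is
 what surfaced the gaps in this calculation.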

 When investigating the impact of composite primary keys on the maximum
 batch size calculation for `bulk_update()`, it became clear that more
 database limits need to be considered when calculating the maximum batch
 size in order to have a bulletproof solution.

 One possible limit in play on SQLite is `SQLITE_MAX_EXPR_DEPTH`, which
 defaults to 1000 (see https://www.sqlite.org/limits.html#max_expr_depth).
 On Oracle, we found that a query could fail with the ambiguous error
 `ORA-00907: missing right parenthesis`, which may be due to hitting some
 limit (possibly documented here:
 https://docs.oracle.com/en/database/oracle/oracle-database/23/lnpls/plsql-program-limits.html).

 We may need to revisit the API design.

 PR discussion:
 https://github.com/django/django/pull/19088#discussion_r1929940327
 Ticket which sparked the discussion/discovery: #36118

-- 
Ticket URL: <https://code.djangoproject.com/ticket/36144#comment:1>
Django <https://code.djangoproject.com/>
The Web framework for perfectionists with deadlines.
