On 3/4/06, Luke Plant <[EMAIL PROTECTED]> wrote:

> But your patch fixes things for me - thanks!

Good to hear. I've committed it to the magic-removal branch as r2487. Thanks for
your (and Malcolm's) help in tracking this one down.

> I haven't been able to find a nice solution. You can't even do
> something like Members.objects.all()[0:50].delete() and loop until done,
> since SQL DELETE statements don't support LIMIT.

I've attached a patch that should fix this: any delete of more than 100
objects, or any attempt to reference or update more than 100 objects,
will be broken down into multiple SQL requests. The value of 100 is
customizable by modifying
django.db.models.query.GET_ITERATOR_CHUNK_SIZE; this mirrors the
chunking strategy already used by __iter__.
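For reference, the general shape of the chunking idea looks something
like this (a rough sketch using the QuerySet API, not the attached
patch itself; chunked_delete is just an illustrative name):

    from django.db.models.query import GET_ITERATOR_CHUNK_SIZE

    def chunked_delete(queryset, chunk_size=GET_ITERATOR_CHUNK_SIZE):
        """Delete everything in queryset, a bounded number of rows per query."""
        while True:
            # Grab one batch of primary keys; the slice keeps the SQL bounded
            # even though DELETE itself has no LIMIT clause.
            pks = list(queryset.values_list('pk', flat=True)[:chunk_size])
            if not pks:
                break
            # Issue one DELETE ... WHERE pk IN (...) per batch.
            queryset.model.objects.filter(pk__in=pks).delete()

So chunked_delete(Members.objects.all()) would remove the rows 100 at a
time instead of in a single unbounded statement.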

The unit tests all pass with this change. However, I don't have a
readily available source of 20000 interrelated objects to stress-test
the algorithm. Could you please check that it 1) actually fixes your
problem, and 2) gives you the right results?

Thanks,
Russ Magee %-)



Attachment: delete-chunk.patch
