On Mon, Oct 3, 2011 at 1:31 PM, Cal Leeming [Simplicity Media Ltd]
<cal.leem...@simplicitymedialtd.co.uk> wrote:
> So, came up against a strange thing today.
> Database backend is MySQL 5.5 (Percona variant)
>
> If I attempt an __in query with a list containing 50 thousand entries,
> the query doesn't fail, but it returns no results. If I split the list
> into chunks (say 100 entries each), it works fine.
> For example:
> _list = ['user1', 'user2']  # imagine this list has exactly 50 thousand entries
> Members.objects.filter(username__in=_list)  # no results
> Members.objects.filter(username__in=_list[:100])  # 100 results
> I can provide further details, but this is just a preliminary email to
> see whether this is expected behavior or actually a bug.

I'm guessing it's MySQL being "special" again. I tried IN queries with
100k arguments on PostgreSQL and they worked fine. The MySQL documentation
states that "The number of values in the IN list is only limited by
the max_allowed_packet value."
(http://dev.mysql.com/doc/refman/5.5/en/comparison-operators.html#function_in),
so it's worth investigating your max_allowed_packet setting.
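
For what it's worth, here's a rough, untested sketch of both sides of
that: chunking the __in lookup the way Cal described, and reading
max_allowed_packet over the current connection. The Members/username
names just follow the example above, the "myapp" import path is made up,
and the chunk size of 1000 is an arbitrary choice, not a recommendation:

from itertools import islice

from django.db import connection

from myapp.models import Members  # hypothetical app path


def in_chunks(iterable, size):
    # Yield successive lists of at most `size` items from `iterable`.
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk


def members_by_username(usernames, chunk_size=1000):
    # Run several smaller __in queries instead of one huge one.
    results = []
    for chunk in in_chunks(usernames, chunk_size):
        results.extend(Members.objects.filter(username__in=chunk))
    return results


def show_max_allowed_packet():
    # Read MySQL's max_allowed_packet (in bytes) on the current connection.
    cursor = connection.cursor()
    cursor.execute("SHOW VARIABLES LIKE 'max_allowed_packet'")
    name, value = cursor.fetchone()
    return int(value)

Comparing the size of the SQL Django generates for the full list against
that value should show fairly quickly whether the packet limit is the
culprit.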

Jacob
