Ben Sturmfels <b...@sturm.com.au> writes:

>> The fix seems to be to double most of the backslashes in robot-detection.py
>
> I'd suggest removing the backslashes instead.
>
> Doubling the backslashes is for when you want literal backslash
> characters in the text. Here the backslashes are intended to
> escape characters like hyphens:

Oops, my mistake. These strings are used as regular expressions, so removing the
backslashes would cause the patterns not to match properly.
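
For example (a minimal sketch; the \b escape here is my own
illustration, not necessarily a pattern from robot-detection.py):

    import re

    # In a plain string, Python converts '\b' to a backspace character
    # before the regex engine ever sees it, so this never matches:
    re.search('\bgooglebot\b', 'googlebot')   # -> None

    # In a raw string the backslashes survive, and the regex engine
    # treats '\b' as a word boundary:
    re.search(r'\bgooglebot\b', 'googlebot')  # -> match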

I'd suggest instead marking all these strings as "raw" strings, e.g.:

robot_useragents = [
        ...
        r'googlebot',
        r'google\-sitemaps',
        r'gullive',
        ...
        ]

This is equivalent to doubling the backslashes, just a little more
readable.
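
You can check the equivalence in an interpreter:

    >>> r'google\-sitemaps' == 'google\\-sitemaps'
    True
    >>> import re
    >>> re.search(r'google\-sitemaps', 'google-sitemaps') is not None
    True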

-- 
Ben Sturmfels

Sturm Software Engineering
www.sturm.com.au
+61 3 9024 2467
