Package: moinmoin-common
Version: 1.5.3-1
Severity: wishlist
Tags: patch

I have run into an annoyance with one of my MoinMoin-based wikis: I cannot
use wget to retrieve pages in raw or printable format.

It turns out this is an apparently deliberate measure in
MoinMoin/request.py, meant to keep robots from doing harmful things --
however, the list of harmless things a robot can do has grown without the
code reflecting this.

I have attached a patch that I've tested.  It works fine for me.

-- System Information:
Debian Release: testing/unstable
  APT prefers unstable
  APT policy: (500, 'unstable'), (500, 'testing'), (500, 'stable')
Architecture: powerpc (ppc)
Shell:  /bin/sh linked to /bin/bash
Kernel: Linux 2.6.14-2-powerpc-smp
Locale: LANG=C, LC_CTYPE=en_US.UTF-8 (charmap=UTF-8)
--- MoinMoin/request.py 2006-04-15 12:21:49.000000000 -0400
+++ MoinMoin/request.py.new     2006-07-06 19:12:14.000000000 -0400
@@ -883,9 +883,12 @@
         qs = self.query_string
         if ((qs != '' or self.request_method != 'GET') and
             not 'action=rss_rc' in qs and
-            # allow spiders to get attachments and do 'show'
+            # Allow spiders to get attachments and view the page in any format.
             not ('action=AttachFile' in qs and 'do=get' in qs) and
-            not 'action=show' in qs
+            not 'action=show' in qs and
+            not 'action=raw' in qs and
+            not 'action=print' in qs and
+            not 'action=format' in qs
             ):
             forbidden = self.isSpiderAgent
 
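
For reference, the amended check can be exercised outside of MoinMoin as a
standalone predicate.  The sketch below simply mirrors the patched condition;
is_robot_forbidden() and the example query strings are hypothetical names for
illustration, not part of the MoinMoin API.

# Standalone sketch of the patched condition in MoinMoin/request.py.
# is_robot_forbidden() is a hypothetical helper that returns True when a
# spider user agent should be refused for the given query string.
def is_robot_forbidden(query_string, request_method='GET', is_spider=True):
    qs = query_string
    if ((qs != '' or request_method != 'GET') and
        not 'action=rss_rc' in qs and
        # Allow spiders to get attachments and view the page in any format.
        not ('action=AttachFile' in qs and 'do=get' in qs) and
        not 'action=show' in qs and
        not 'action=raw' in qs and
        not 'action=print' in qs and
        not 'action=format' in qs
        ):
        return is_spider
    return False

# The formats added by the patch stay accessible to robots...
assert not is_robot_forbidden('action=raw')
assert not is_robot_forbidden('action=print')
assert not is_robot_forbidden('action=show')
# ...while other actions, e.g. editing, are still refused.
assert is_robot_forbidden('action=edit')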
