On Tue, 22 Apr 2025 at 13:36, Guinevere Larsen via Gcc <gcc@gcc.gnu.org> wrote:
>
> On 4/21/25 12:59 PM, Mark Wielaard wrote:
> > Hi hackers,
> >
> > TLDR; When using https://patchwork.sourceware.org or Bunsen
> > https://builder.sourceware.org/testruns/ you might now have to enable
> > javascript. This should not impact any scripts, just browsers (or bots
> > pretending to be browsers). If it does cause trouble, please let us
> > know. If this works out we might also "protect" bugzilla, gitweb,
> > cgit, and the wikis this way.
> >
> > We don't like to have to do this, but as some of you might have noticed
> > Sourceware has been fighting the new AI scraperbots since start of the
> > year. We are not alone in this.
> >
> > https://lwn.net/Articles/1008897/
> > https://arstechnica.com/ai/2025/03/devs-say-ai-crawlers-dominate-traffic-forcing-blocks-on-entire-countries/
> >
> > We have tried to isolate services more and block various ip-blocks
> > that were abusing the servers. But that has helped only so much.
> > Unfortunately the scraper bots are using lots of ip addresses
> > (probably by installing "free" VPN services that use normal user
> > connections as exit point) and pretending to be common
> > browsers/agents. We seem to have to make access to some services
> > depend on solving a javascript challenge.
>
> Jan Wildeboer, on the fediverse, has a pretty interesting lead on how AI
> scrapers might be doing this:
> https://social.wildeboer.net/@jwildeboer/114360486804175788 (this is the
> last post in the thread because it was hard to actually follow the
> thread given the number of replies, please go all the way up and read
> all 8 posts).
>
> Essentially, there's a library developer that pays developers to just
> "include this library and a few more lines in your TOS". This library
> then allows the app to sell the end-user's bandwidth to clients of the
> library developer, allowing them to make requests. This is how big
> companies are managing to have so many IP addresses, so many of those
> being residential IP addresses, and it also means that by blocking those
> IP addresses we will be - necessarily - blocking real user traffic to
> our platforms.
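For anyone wondering what "solving a javascript challenge" means in
practice: as I understand it, tools like Anubis hand the browser a small
hashcash-style proof-of-work puzzle. The sketch below is not Sourceware's
or Anubis's actual implementation, just an illustration of the idea (the
names and the 16-bit difficulty are made up): the server issues a random
challenge, the client's JavaScript brute-forces a nonce until the hash has
enough leading zero bits, and the server checks the answer with a single
hash before granting access.

    # Rough sketch only, not Sourceware's or Anubis's real code.
    import hashlib
    import os

    def make_challenge() -> str:
        # Server side: issue a random challenge string.
        return os.urandom(16).hex()

    def verify(challenge: str, nonce: int, difficulty_bits: int = 16) -> bool:
        # Server side: a single hash checks the client's answer.
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        return int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0

    def solve(challenge: str, difficulty_bits: int = 16) -> int:
        # Client side (done by the browser's JavaScript): brute-force a nonce.
        nonce = 0
        while not verify(challenge, nonce, difficulty_bits):
            nonce += 1
        return nonce

    challenge = make_challenge()
    nonce = solve(challenge)         # costs the client some CPU time
    assert verify(challenge, nonce)  # costs the server one hash

The cost is barely noticeable for a person loading a page once, but it
adds up quickly for a bot fetching every URL on the site, which is
presumably the point.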
It seems to me that blocking real users *who are running these shady
apps* is perfectly reasonable. They might not realise it, but those
users are part of the problem. If we block them, maybe they'll be
incentivised to stop using the shady apps. And if users stop using
those apps, maybe those app developers will stop bundling the libraries
that piggyback on users' bandwidth.

>
> I'm happy to see that the sourceware is moving to a more comprehensive
> solution, and if this is successful, I'd suggest that we also try to do
> that to the forgejo instance, and remove the IPs blocked because of this
> scraping.

For now, maybe. This thread already explained how to get around Anubis
by changing the UserAgent string - how long will it be until these
peer-to-business network libraries figure that out?
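To make that concern concrete, here is a minimal sketch (purely
illustrative, not Anubis's actual policy code) of a gate that decides who
gets challenged from the User-Agent header alone; if the policy only
challenges browser-like agents, a scraper that sends a non-browser string
never even sees the puzzle.

    # Illustrative only; the real policy is configurable and more involved.
    def needs_challenge(user_agent: str) -> bool:
        # Assume the gate only challenges clients that claim to be browsers.
        return "Mozilla" in user_agent

    # A bot pretending to be a browser gets the proof-of-work challenge...
    assert needs_challenge("Mozilla/5.0 (X11; Linux x86_64) Chrome/124.0")
    # ...while the same bot with a made-up UserAgent string is waved through.
    assert not needs_challenge("definitely-not-a-browser/1.0")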