Re: limit_req per subnet?

2016-12-14 Thread c0nw0nk
proxy_cache / fastcgi_cache of the page output will help. Flood all you want: Nginx handles flooding and lots of connections fine; your back end is the weakness / bottleneck that is allowing them to be successful in affecting your service. You could also use the secure_link module to help on your ind…
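A minimal fastcgi_cache sketch along those lines (the zone name, socket path, and timings here are illustrative, not from the post):

```nginx
# Illustrative cache zone; tune path, size and validity for your site.
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=PAGES:10m inactive=60m;

server {
    listen 80;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_pass unix:/run/php-fpm.sock;   # assumed backend socket

        fastcgi_cache PAGES;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 10m;
        # Keep serving stale pages while the backend is busy or down,
        # which is what blunts a flood aimed at the backend.
        fastcgi_cache_use_stale error timeout updating http_500;
    }
}
```

With this in place, repeated hits on the same URL are answered from the cache and never reach the backend.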

Re: limit_req per subnet?

2016-12-14 Thread shiz
I've implemented something based on https://community.centminmod.com/threads/blocking-bad-or-aggressive-bots.6433/ Works perfectly fine for me. Posted at Nginx Forum: https://forum.nginx.org/read.php?2,271483,271535#msg-271535

Re: limit_req per subnet?

2016-12-14 Thread lists
By the time you get to UA, nginx has done a lot of work. You could 444 based on UA, then read that code in the log file with fail2ban or a clever script. That way you can block them at the firewall. It won't help immediately with the sequential number, but that really won't be a problem.
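A hedged sketch of the 444-by-user-agent idea (the bot patterns below are placeholders, not a recommendation):

```nginx
# Placeholder patterns; match whatever user-agents you want to drop.
map $http_user_agent $bad_bot {
    default              0;
    ~*(semrush|mj12bot)  1;
}

server {
    listen 80;

    if ($bad_bot) {
        return 444;   # close the connection without sending a response
    }
}
```

fail2ban (or the "clever script") can then watch the access log for status 444 entries and add firewall rules for the offending addresses.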

Re: limit_req per subnet?

2016-12-14 Thread Grant
>> I rate limit them using the user-agent
>
> Maybe this is the best solution, although of course it doesn't rate
> limit real attackers. Is there a good method for monitoring which UAs
> request pages above a certain rate so I can write a limit for them?

Actually, is there a way to limit rate…

Re: limit_req per subnet?

2016-12-14 Thread Grant
> I rate limit them using the user-agent

Maybe this is the best solution, although of course it doesn't rate limit real attackers. Is there a good method for monitoring which UAs request pages above a certain rate so I can write a limit for them?

- Grant

Re: limit_req per subnet?

2016-12-14 Thread Grant
> I'm no fail2ban guru. Trust me. I'd suggest going on serverfault. But my
> other post indicates semrush resides on AWS, so just block AWS. I doubt there
> is any harm in blocking AWS since no major search engine uses them.
>
> Regarding search engines, the reality is only Google matters. Just l…

Re: nginx x-accel-redirect request method named location

2016-12-14 Thread hemendra26
hemendra26 Wrote:
> I was using nginx x-accel-redirect as an authentication frontend for
> an external db resource.
>
> In my python code I would do the following:
>
> /getresource/
>
> def view(self, req, resp):
>     name = get_dbname(r…

Re: limit_req per subnet?

2016-12-14 Thread shiz
I rate limit them using the user-agent Posted at Nginx Forum: https://forum.nginx.org/read.php?2,271483,271524#msg-271524 ___ nginx mailing list nginx@nginx.org http://mailman.nginx.org/mailman/listinfo/nginx
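One way to rate limit by user-agent, sketched under the assumption that a per-UA key is acceptable (the zone size and rate are made up):

```nginx
# Key the shared zone on the user-agent string instead of the address.
limit_req_zone $http_user_agent zone=perua:10m rate=30r/m;

server {
    listen 80;

    location / {
        limit_req zone=perua burst=10 nodelay;
    }
}
```

Note that keying on $http_user_agent lumps every client sending the same UA into one bucket, which is exactly the point for a bot crawling from many sequential IPs (and why it does nothing against an attacker who randomizes the UA).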

Re: limit_req per subnet?

2016-12-14 Thread lists
I'm no fail2ban guru. Trust me. I'd suggest going on serverfault. But my other post indicates semrush resides on AWS, so just block AWS. I doubt there is any harm in blocking AWS since no major search engine uses them. Regarding search engines, the reality is only Google matters. Just look at y…

Re: limit_req per subnet?

2016-12-14 Thread lists
They claim to obey robots.txt. They also claim not to use consecutive IP addresses. https://www.semrush.com/bot/ Some dated posts (2011) indicate semrush uses AWS. I block all of AWS IP space and can say I've never seen a semrush bot. So that might be a solution. I got the AWS IP space from s…
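A sketch of the AWS-blocking approach using the geo module. The CIDRs below are placeholders, not a current AWS range list; AWS publishes its ranges at https://ip-ranges.amazonaws.com/ip-ranges.json, from which a real list can be generated:

```nginx
# Placeholder CIDRs; generate the real entries from AWS's published ranges.
geo $is_aws {
    default        0;
    52.0.0.0/11    1;
    54.64.0.0/12   1;
}

server {
    listen 80;

    if ($is_aws) {
        return 444;   # drop the connection without a response
    }
}
```

Because geo does a prefix lookup in a static table, even a very long range list costs little per request.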

access_logging in the stream block

2016-12-14 Thread kms-pt
Hello, Just wondering if anyone knows whether access_log can be configured in the stream block. We are looking to implement a TCP stream, which works, but we also have the requirement of logging the connections, transactions, etc. I know error_log can be enabled, but I have found no documentation stat…
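For what it's worth, access_log inside stream {} is supported as of nginx 1.11.4 via ngx_stream_log_module. A minimal sketch (the listener port, backend address, and log path are illustrative):

```nginx
stream {
    # Variables come from the stream modules; $session_time is the
    # connection duration in seconds with millisecond resolution.
    log_format basic '$remote_addr [$time_local] '
                     '$protocol $status $bytes_sent $bytes_received '
                     '$session_time';

    server {
        listen 12345;
        access_log /var/log/nginx/stream-access.log basic;
        proxy_pass 10.0.0.2:12345;   # assumed backend
    }
}
```

On older nginx the directive is simply not recognized in that context, which would explain finding no documentation for it in earlier manuals.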

Re: limit_req per subnet?

2016-12-14 Thread Grant
> I am curious what is the request uri they was hitting. Was it a dynamic page
> or file or a static one.

It was semrush and it was all manner of dynamic pages.

- Grant

Re: limit_req per subnet?

2016-12-14 Thread Grant
> Did you see if the IPs were from an ISP? If not, I'd ban the service using
> the Hurricane Electric BGP as a guide. At a minimum, you should be blocking
> the major cloud services, especially OVH. They offer free trial accounts, so
> of course the hackers abuse them.

What sort of sites run…

Re: nginx upgrade fails due bind error on 127.0.0.1 in a FreeBSD jail

2016-12-14 Thread Maxim Dounin
Hello!

On Wed, Dec 14, 2016 at 06:36:14AM -0500, Alt wrote:

> steveh Wrote:
> > listen 443 default_server accept_filter=httpready ssl;
> > listen 80 default_server accept_filter=httpready;
>
> Not related to your problem: I think you'l…

Re: How to use Nginx as a proxy for S3 compatible storage with version 4 signature?

2016-12-14 Thread Reinis Rozitis
> It would be easy to proxy requests like this:
> https://mydomain.com//
> but with version4 we need to send requests like:
> https://.mydomain.com/
> The problem is that s3storage is a private node which hasn't a public domain.
> Only Nginx (which is a public node) can see s3storage.
> Does som…

How to use Nginx as a proxy for S3 compatible storage with version 4 signature?

2016-12-14 Thread Alexandr Porunov
Hello, I want to use Nginx as a proxy for a private S3-compatible storage (i.e. it isn't s3.amazon.com but has exactly the same API). I am a novice with Nginx so I am not sure I will explain this correctly, but I will try. I have 3 nodes: mydomain.com - the node with nginx; s3storage - private storage wi…
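One possible shape for the virtual-hosted-style (v4) proxying, assuming nginx can resolve s3storage internally; the regex capture and header handling are a sketch, not a tested setup:

```nginx
server {
    listen 443 ssl;
    # Capture the bucket name from the client's subdomain.
    server_name ~^(?<bucket>[a-z0-9.-]+)\.mydomain\.com$;

    location / {
        # Forward the bucket-qualified host the storage expects.
        proxy_set_header Host $bucket.s3storage;
        proxy_pass http://s3storage;
    }
}
```

The caveat with signature version 4 is that the Host header is part of the signed string, so the client must sign for whatever hostname the storage ultimately verifies; rewriting Host in the proxy only works if the signatures are computed against that rewritten name.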

Re: nginx upgrade fails due bind error on 127.0.0.1 in a FreeBSD jail

2016-12-14 Thread Alt
Hello !

steveh Wrote:
> listen 443 default_server accept_filter=httpready ssl;
> listen 80 default_server accept_filter=httpready;

Not related to your problem: I think you'll want "accept_filter=dataready" for your SSL configuration. Best…
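The reason, briefly: on FreeBSD the httpready accept filter only wakes nginx once a complete HTTP request has arrived, which never happens on a TLS port because the first bytes from the client are the TLS handshake, not HTTP. dataready fires as soon as any data arrives:

```nginx
# FreeBSD accept filters: dataready for TLS, httpready for plain HTTP.
listen 443 default_server accept_filter=dataready ssl;
listen 80  default_server accept_filter=httpready;
```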