It’s very easy to get caught up in the trap of having unrealistic mental
models of how web servers work. If your host is a
recent (< 5 years) single host, then you can probably serve 300,000
requests per second for your robots.txt file. That’s because the f
Hi all, I’ve recently deployed a rate-limiting configuration aimed at
protecting myself from spiders.
nginx version: nginx/1.15.1 (RPM from nginx.org)
I did this based on the excellent Nginx blog post at
https://www.nginx.com/blog/rate-limiting-nginx/ and have consulted the
documentation for l
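The message is cut off before the actual configuration appears, but the linked blog post centers on the `limit_req_zone`/`limit_req` directives. A minimal sketch of that style of rate limiting, with an illustrative zone name, rate, and burst (none of these are the poster's real values):

```nginx
# Track clients by IP in a 10 MB shared zone, allowing 10 requests/second.
limit_req_zone $binary_remote_addr zone=spiders:10m rate=10r/s;

server {
    listen 80;

    location / {
        # Permit short bursts of 20 requests without delay; reject the
        # excess with 429 instead of the default 503.
        limit_req zone=spiders burst=20 nodelay;
        limit_req_status 429;
    }
}
```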
Hi Igor,
Config is reloaded using
/usr/sbin/nginx -s reload
This is invoked from a Python/shell script (nginx is installed by a web
control panel).
The top-level Nginx config is in the gist below
https://gist.github.com/AnoopAlias/ba5ad6749a586c7e267672ee65b32b3a
It further includes ~8k serv
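The gist itself is not reproduced in the thread, but the description suggests a top-level config that pulls in thousands of generated per-vhost files. A hypothetical sketch of that layout (all paths are assumptions, not taken from the gist), with a config test before the reload so a broken generated file can't take down the running instance:

```nginx
# /etc/nginx/nginx.conf (illustrative)
http {
    include /etc/nginx/conf.d/*.conf;          # shared settings
    include /etc/nginx/sites-enabled/*.conf;   # one generated file per vhost
}
```

A control-panel script would then typically run `/usr/sbin/nginx -t && /usr/sbin/nginx -s reload` rather than reloading unconditionally.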
Hello,
Thanks for that input. I changed my config as you suggested and also set
fastcgi_param SCRIPT_FILENAME to /path/to/the/index.php;
but I feel this should be done by the rewrite directive instead.
Now that this is working, can someone please tell me if I can use
the rewrite
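The question is cut off, but one common pattern that avoids hard-coding SCRIPT_FILENAME is to rewrite all requests to a front controller and let the fastcgi variables resolve normally. A sketch under that assumption (the socket path and document root are illustrative, not from the thread):

```nginx
server {
    root /path/to/the;   # directory containing index.php (assumed)

    location / {
        # Route everything that isn't a real file to the front controller.
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        include fastcgi_params;
        # With root set correctly, SCRIPT_FILENAME no longer needs
        # to be hard-coded to a single file.
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php-fpm.sock;   # illustrative socket path
    }
}
```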
Anoop,
I suppose most of your 10k servers are very similar, right?
Please post the top-level configuration and a typical server{} block.
Also, how do you reload the configuration? With 'service nginx reload' or
maybe some other command?
It looks like you have a lot of fragmented memory and only 4 GB
> I tried the rewrite directive, and tried giving alias inside the location
> block; neither worked. The thing is, if I remove the /v11/ from the location
> and the URL, it works without any issues. What is the right way to do this?
The try_files directive probably needs to be changed to:
try_files $uri $uri/
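The suggested fix is truncated, but for serving an app under a URI prefix like /v11/, one plausible shape is a prefix location with alias (the filesystem path and fallback here are assumptions for illustration; note that try_files combined with alias has historically had quirks, so this needs testing against the actual setup):

```nginx
location /v11/ {
    # Map /v11/foo to /var/www/app/foo (path is illustrative).
    alias /var/www/app/;
    # Fall back to the app's front controller, keeping the /v11/ prefix
    # so the fallback resolves within this same location.
    try_files $uri $uri/ /v11/index.php?$args;
}
```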