>> I do think this is related to 'proxy_read_timeout 60m;' leaving too
>> many connections open. Can I somehow allow pages to load for up to
>> 60m but not bog my server down with too many connections?
>
> Pardon me, but why on earth do you have an environment in which an HTTP
> request can take an hour?
>> I've been struggling with http response time slowdowns and
>> corresponding spikes in my TCP Queuing graph in munin. I'm using
>> nginx as a reverse proxy to apache which then hands off to my backend,
>> and I think the proxy_read_timeout line in my nginx config is at least
>> contributing to the issue.
It is most probably a question more suitable for an Odoo mailing list.
---
*B. R.*
On Sun, Sep 25, 2016 at 2:50 AM, Grant wrote:
> I've been struggling with http response time slowdowns and
> corresponding spikes in my TCP Queuing graph in munin. I'm using
> nginx as a reverse proxy to apache which then hands off to my backend,
> and I think the proxy_read_timeout line in my nginx config is at least
> contributing to the issue. Here
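
[Editor's note: the question quoted above — keeping a one-hour read timeout for a few slow pages without letting long-lived connections pile up server-wide — can be sketched with stock nginx directives (`proxy_read_timeout`, `limit_conn_zone`, `limit_conn`). A minimal sketch; the location path, zone name, and backend address are placeholders, not taken from the poster's config:]

```nginx
http {
    # One shared zone tracking concurrent connections per client IP.
    limit_conn_zone $binary_remote_addr zone=perip:10m;

    server {
        listen 80;

        # Default: normal timeout, at most 10 connections per client.
        location / {
            limit_conn perip 10;
            proxy_read_timeout 60s;
            proxy_pass http://127.0.0.1:8080;
        }

        # Only the slow endpoint gets the one-hour read timeout,
        # and even there each client is capped at 2 connections.
        location /slow-report/ {
            limit_conn perip 2;
            proxy_read_timeout 60m;
            proxy_pass http://127.0.0.1:8080;
        }
    }
}
```

This scopes the expensive timeout to the one location that needs it instead of applying it globally, and caps how many of those slow connections any single client can hold open.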