Stream response from upstream to all proxy_cache_lock'ed clients

2021-01-29 Thread loopback_proxy
Hi, I was looking into using the proxy_cache_lock mechanism to collapse upstream requests and reduce traffic. It works great right out of the box, but one issue I found was that if there are n client requests proxy_cache_locked, only one of those clients gets the response as soon as the upstream send
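For context, a minimal sketch of the kind of request-collapsing setup being described; the cache zone name, paths, and upstream address are placeholders, not taken from the thread:

    # hypothetical sketch; proxy_cache_path belongs in the http context
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=demo_cache:10m max_size=1g;

    location / {
        proxy_cache demo_cache;
        proxy_cache_lock on;               # concurrent misses for one key wait for a single upstream fetch
        proxy_cache_lock_timeout 5s;       # how long the waiting requests may block
        proxy_pass http://192.0.2.10:8080; # placeholder upstream
    }

With proxy_cache_lock on, only the first request for an uncached key goes upstream; the thread is about how the waiting requests receive the response once it arrives.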

proxy_cache without proxy_buffers

2019-02-26 Thread loopback_proxy
I am wondering if Nginx will ever support caching without buffering responses. Buffering the full response before sending the data out to the client increases the first-byte latency (aka TTFB). In a perfect world, if nginx could stream the data to the cache file and to the client simultaneously, that would
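The question hinges on the coupling between caching and buffering: proxy_cache only stores responses that nginx buffers, so disabling buffering also disables the cache. A hedged sketch of the relevant directives, with illustrative values and a placeholder zone and upstream:

    # hypothetical sketch; "demo_cache" is a zone defined elsewhere with proxy_cache_path
    location / {
        proxy_buffering on;        # default; setting it to "off" also disables caching
        proxy_buffers 8 16k;       # in-memory buffers for the upstream response
        proxy_cache demo_cache;
        proxy_pass http://192.0.2.10:8080;
    }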

Re: nginx cache issue (http upstream cache: -5)

2018-02-05 Thread loopback_proxy
Ahhh interesting, that did the trick. Thank you so much. I have also been trying to understand the internals of nginx caching and how it works. I read the nginx blog about the overall architecture and the nginx man page about the proxy_cache_* directives. I am looking for the internal architecture of

Re: N00b: Forwarding the full request to upstream server

2018-02-05 Thread loopback_proxy
You could just do proxy_pass http://192.168.10.34$request_uri. See this for more: https://nginx.org/en/docs/http/ngx_http_core_module.html#var_request_uri Posted at Nginx Forum: https://forum.nginx.org/read.php?2,278344,278347#msg-278347
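A sketch of the one-line suggestion placed in a fuller location block; the IP address comes from the reply, everything else is illustrative:

    # hypothetical expansion of the reply's suggestion
    location / {
        # $request_uri is the full original request URI, including the query string
        proxy_pass http://192.168.10.34$request_uri;
    }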

nginx cache issue (http upstream cache: -5)

2018-02-05 Thread loopback_proxy
I am new to nginx caching but have worked with nginx a lot. I tried enabling the caching feature in our repository but it never worked, so I thought I would pull a fresh copy of nginx and turn it on. I ended up with the same issue. For some reason, nginx is not able to create the cache file in the cache dir
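For reference, a minimal hedged caching setup of the sort the post appears to be debugging; the zone name, paths, and upstream are placeholders. The cache directory must exist and be writable by the nginx worker user, and a response is only stored if it has a validity time, either from proxy_cache_valid or from cacheable upstream headers:

    # hypothetical minimal setup; proxy_cache_path belongs in the http context
    proxy_cache_path /var/cache/nginx/demo levels=1:2 keys_zone=demo:10m inactive=60m;

    location / {
        proxy_cache demo;
        proxy_cache_valid 200 302 10m;     # without this or Cache-Control/Expires from upstream, nothing is stored
        proxy_pass http://192.0.2.10:8080; # placeholder upstream
    }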