Does no one have any insights on this?
BTJ
Hello!
On Fri, Sep 19, 2014 at 12:50 PM, igorhmm wrote:
> I don't know how to reproduce it, not yet :-)
>
> I couldn't identify which worker was responding either, but I can see with
> strace warnings in the old worker about EAGAIN (Resource temporarily
> unavailable). I can see that because the old worker
But I cannot switch to proxy_cache, because we're mirroring the mp4 files
for random seeking using the mp4 module and proxy_cache doesn't support random
seeking. Is there a way I can use a bash script with proxy_store? I want
the following logic to prevent duplicate downloads:
You can try to put
But I cannot switch to proxy_cache, because we're mirroring the mp4 files
for random seeking using the mp4 module and proxy_cache doesn't support random
seeking. Is there a way I can use a bash script with proxy_store? I want the
following logic to prevent duplicate downloads:
1st user:
client (r
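For context, the usual proxy_store arrangement for this kind of mirroring looks
roughly like the sketch below. The directives are standard nginx; the paths, the
origin address, and the location names are placeholders invented for the example,
not taken from the thread:

    # Serve the local copy if it already exists; otherwise fetch it from
    # the origin and store it on disk so later requests are served locally.
    location /videos/ {
        root /data/mirror;               # placeholder storage path
        mp4;                             # random seeking works on the stored file
        try_files $uri @fetch;
    }

    location @fetch {
        proxy_pass http://origin.example.com;      # placeholder origin
        proxy_store /data/mirror$uri;
        proxy_store_access user:rw group:rw all:r;
        proxy_temp_path /data/tmp;                 # placeholder temp path
    }

Note that this does not serialize concurrent first requests: several clients
asking for a file that is not stored yet will each trigger their own upstream
fetch into proxy_temp_path, which is exactly the duplicate-download problem
described above.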
On Tuesday 23 September 2014 19:34:23 shahzaib shahzaib wrote:
> @Valentine, is proxy_cache_lock supported with proxy_store ?
No. But if you're asking, then you're using the wrong tool.
The proxy_store feature is designed to be very simple and stupid.
To meet your needs you should use the proxy_cache
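As a rough illustration of that suggestion: with proxy_cache, the
proxy_cache_lock directive lets a single request populate a cache entry while
other requests for the same key wait. The sketch below uses invented zone names,
sizes, and paths; only the directives themselves are standard nginx, and a cached
entry is not a plain file on disk, so this does not by itself solve the mp4
random-seeking constraint mentioned earlier in the thread:

    # In the http{} block: a hypothetical cache zone.
    proxy_cache_path /data/cache levels=1:2 keys_zone=mirror:10m
                     max_size=50g inactive=7d;

    location /videos/ {
        proxy_pass http://origin.example.com;   # placeholder origin
        proxy_cache mirror;
        proxy_cache_key $uri;
        proxy_cache_valid 200 7d;
        proxy_cache_lock on;             # only one request fetches a given key
        proxy_cache_lock_timeout 60s;    # the others wait up to this long
    }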
@Valentine, is proxy_cache_lock supported with proxy_store?
On Tue, Sep 23, 2014 at 7:03 PM, Valentin V. Bartenev wrote:
> On Tuesday 23 September 2014 00:06:56 shahzaib shahzaib wrote:
> > Is there any way with nginx that I could put a hold on the subsequent
> > requests and only proxy the si
On Tuesday 23 September 2014 00:06:56 shahzaib shahzaib wrote:
> Is there any way with nginx that I could put a hold on the subsequent
> requests and only proxy the single request for the same file, in order to
> prevent filling up the tmp folder? The tmp folder keeps filling up due to
> multiple users
Hi people,
@BR: I haven't found anything in the logs related to this problem, but I'm
still investigating and trying to reproduce it.
@oscaretu: this looks like a nice tool, thanks for the recommendation.
@dewanggaba: I'm using the reload command. We can't use restart because that
would kill all established connections.
Hi,
I am working on an nginx module that has this flow:
receive user request -> parse request -> read local file -> process file ->
response to client
This flow is working well, but now we have another requirement. After processing
this file and building the response into an ngx_chain, I want to write the response c
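For reference, the standard way a content handler hands a finished chain to
nginx is ngx_http_send_header() followed by ngx_http_output_filter(). The
following is a minimal sketch of that pattern with a made-up payload and handler
name; it is not the poster's module:

    #include <ngx_config.h>
    #include <ngx_core.h>
    #include <ngx_http.h>

    /* Minimal content handler: wrap one in-memory buffer in a chain link
     * and pass it down the output filter chain. */
    static ngx_int_t
    ngx_http_example_handler(ngx_http_request_t *r)
    {
        static u_char  body[] = "processed file contents\n";  /* placeholder payload */
        ngx_buf_t     *b;
        ngx_chain_t    out;
        ngx_int_t      rc;

        b = ngx_calloc_buf(r->pool);
        if (b == NULL) {
            return NGX_HTTP_INTERNAL_SERVER_ERROR;
        }

        b->pos = body;                       /* first byte of the response data */
        b->last = body + sizeof(body) - 1;   /* one past the last byte */
        b->memory = 1;                       /* read-only, in-memory buffer */
        b->last_buf = 1;                     /* last buffer of the response */

        out.buf = b;
        out.next = NULL;

        r->headers_out.status = NGX_HTTP_OK;
        r->headers_out.content_length_n = sizeof(body) - 1;

        rc = ngx_http_send_header(r);
        if (rc == NGX_ERROR || rc > NGX_OK || r->header_only) {
            return rc;
        }

        return ngx_http_output_filter(r, &out);
    }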