Hello,
I don't think the way you currently use nginx as a cache proxy is best practice.
Serving a large file and storing the whole file in the cache under a large number of
requests is like burning your disk: even if the nginx cache manager can delete and
refill the cache fast enough, it will keep writing and deleting files on disk.
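To give a rough idea, one way to limit that churn is cache tuning along these lines
(just a sketch; the path, zone name, sizes and upstream below are placeholders, not
your actual setup):

    # Cap the cache size and only cache files that are requested more than once,
    # so one-off large downloads don't keep rewriting the disk.
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=big_cache:10m
                     max_size=10g inactive=30m use_temp_path=off;

    server {
        location /files/ {
            proxy_pass http://backend;       # placeholder upstream
            proxy_cache big_cache;
            proxy_cache_min_uses 2;          # skip caching one-hit large files
            proxy_cache_valid 200 10m;
        }
    }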
No, it's a browser limitation: a browser only opens a limited number of simultaneous
connections to the same host, so the cap you are seeing comes from the client side,
not from nginx.
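For reference, an MJPEG stream is usually proxied with buffering turned off so frames
are not delayed; a minimal sketch (the location name and port are hypothetical,
standing in for one of the 90xx sources):

    location /cam1/ {
        proxy_pass http://192.168.1.2:9001;  # hypothetical port from the 90xx range
        proxy_http_version 1.1;
        proxy_buffering off;                 # pass frames through as they arrive
        proxy_read_timeout 1h;               # the stream is long-lived
    }

Even with this in place, each browser window only opens its usual small number of
connections per host.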
--
Hưng
> On Aug 22, 2019, at 18:44, glareboa wrote:
>
> Is there a limitation in nginx on the number of simultaneous connections via
> proxy_pass "http://192.168.1.2:90xx"?
>
> The source http://192.168.1.2:90xx transmits MJPEG at 200 kB/sec.
>
> The browser
So I just point it to the folder in tmp?
>
> Thanks
> Andrew
>
> From: nginx on behalf of Hung Nguyen
>
> Sent: Monday, June 17, 2019 12:01 PM
> To: nginx@nginx.org
> Subject: Re: Securing URLs with the Secure Link Module in NGINX
>
> Hi,
>
> Actually yo
Hi,
Actually, you can use a module developed by Kaltura called the secure token module
(1). This module can examine your response to see its content type; if it matches
the configured parameter, it will automatically inject the secure params into the
HLS playlist. When you use this module, please note you don't use anything
Just for your information,
the cookie is stored on the client side, so there is no need to share it between servers.
> On Mar 22, 2019, at 1:47 PM, David Ni wrote:
>
> Hi Experts,
> Who can help with this? Thanks very much.
>
> At 2019-03-15 16:36:22, "David Ni" wrote:
> Hi Nginx Experts,
> I have on
Congratulations to NGINX and the team,
Hopefully this change won't change the open source way of nginx, and the team members
will stay the same as well. Maxim Dounin, Arut... are the most incredible
developers I've known.
Keep doing your great things,
--
Hưng
> On Mar 12, 2019, at 03:16, Igor Sysoev wrote:
Hi,
I am working on an nginx module that has this flow:
receive user request -> parse request -> read local file -> process file ->
respond to the client
This flow is working well. But now we have another requirement. After processing
this file and building the response into an ngx_chain, I want to write the response c
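For what it's worth, the "respond to the client" step in that flow is normally the
standard send-header/output-filter sequence. A minimal sketch in C, assuming the
processed data is already in memory (the function and variable names are made up for
illustration):

    /* Send an in-memory, already-processed buffer as the response body. */
    static ngx_int_t
    send_processed_response(ngx_http_request_t *r, u_char *data, size_t len)
    {
        ngx_buf_t   *b;
        ngx_chain_t  out;
        ngx_int_t    rc;

        b = ngx_create_temp_buf(r->pool, len);
        if (b == NULL) {
            return NGX_HTTP_INTERNAL_SERVER_ERROR;
        }

        b->last = ngx_cpymem(b->pos, data, len);
        b->last_buf = 1;            /* last buffer of the whole response */
        b->last_in_chain = 1;

        out.buf = b;
        out.next = NULL;

        r->headers_out.status = NGX_HTTP_OK;
        r->headers_out.content_length_n = len;

        rc = ngx_http_send_header(r);
        if (rc == NGX_ERROR || rc > NGX_OK || r->header_only) {
            return rc;
        }

        return ngx_http_output_filter(r, &out);
    }

This would be called from the content handler once the file has been processed and
the response body built.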