Hello, Mohan.
Have you tried making simultaneous requests against the server from another
computer, using curl from the command line? If the requests work from the
second computer, the problem is not in the server but in the client. Perhaps
you are looking for a problem in nginx that
Hi Peter,
Thanks for your reply.
I am not using a script; I am building a streamer project where I use
libcurl to download content from an nginx server.
Since the content I am downloading is HLS, I download every ~5 sec.
During the stress test I am seeing a "couldn't connect to server" error
I’m guessing that you have a script that keeps executing curl. What you can do
is use curl -K ./fileWithListOfUrls.txt
and the one curl process will visit each URL in turn, reusing the socket (aka
HTTP keep-alive).
That said, curl isn’t a great workload simulator and, in the long run, you can
get
Hi Liu,
On the client side I have increased the file descriptor value to 1, but the
issue is still the same.
I also increased the FD limit on the server side, but the issue still
continues.
I followed the link below to increase the FD limit:
Linux Increase The Maximum Number Of Open Files / File Descriptors (FD)
It seems like your client has reached the limit of max open files.
From the shell where you start your client program, run ‘ulimit -a’ to check
the settings.
You can also check the files opened by your client in /proc/<pid>/fd/.
Increasing that value is simple; you can change it temporarily or save it to
a config
Hi Team,
I am trying to execute ~1000 curl requests from my CentOS machine to my nginx
server in ~5 sec.
The same exercise continues every ~5 sec.
I am using libcurl to make the HTTP requests.
During this process I see that most of my requests fail with the reason:
Failure Curl Error Code[ 7 ] Reason[ Couldn't connect to server ]
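For anyone who wants to reproduce this pattern from the shell rather than
through libcurl, a rough sketch of the test described above (the URL and the
concurrency level are placeholders):

    # ~1000 requests per burst, one burst every ~5 seconds
    while true; do
      seq 1 1000 | xargs -P 100 -I{} \
        curl -s -o /dev/null -w "%{http_code}\n" http://<server>/hls/test.ts
      sleep 5
    done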