On 10/10/2013 2:24 PM, Maxim Dounin wrote:

Hello!

On Thu, Oct 10, 2013 at 01:35:00PM -0400, itpp2012 wrote:

> > Correct. One nginx process can handle multiple requests, it's one
> > PHP process which limits you.
>
> Not really, use the NTS version of php, not the TS, and use a pool as
> suggested, e.g.:
>
> # loadbalancing php
On Thu, Oct 10, 2013 at 01:35:00PM -0400, itpp2012 wrote:

> Correct. One nginx process can handle multiple requests, it's one
> PHP process which limits you.

Not really, use the NTS version of php, not the TS, and use a pool as
suggested, e.g.:

# loadbalancing php
upstream myLoadBalancer {
    server 127.0.0.1:19001 weight=1 fail_timeout=5;
    [...]
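The quoted config is cut off in the archive. As a complete nginx fragment, the pool being suggested might look roughly like this — a sketch, where only the first `server` line comes from the original post; the extra workers, ports, and the `location` block are illustrative assumptions:

```nginx
# Hypothetical completion of the "myLoadBalancer" pool above.
upstream myLoadBalancer {
    server 127.0.0.1:19001 weight=1 fail_timeout=5;
    server 127.0.0.1:19002 weight=1 fail_timeout=5;  # assumed extra workers
    server 127.0.0.1:19003 weight=1 fail_timeout=5;
}

server {
    listen 80;
    root /var/www/html;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        # Requests are distributed round-robin across the php-cgi workers,
        # so one slow PHP request no longer blocks the others.
        fastcgi_pass myLoadBalancer;
    }
}
```

Each upstream entry is a separate NTS php-cgi process listening on its own port, so two simultaneous requests land on two different PHP workers.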
On 10/10/2013 11:26 AM, Maxim Dounin wrote:

Hello!

On Thu, Oct 10, 2013 at 11:13:40AM -0400, Ben Johnson wrote:

[...]

> Well, after all of the configuration changes, both to nginx and PHP, the
> solution was to add the following header to the response:
>
> header('Content-Encoding: none;');

Just in case: this is very, very wrong. There is no "none" content
coding in HTTP, so this only "works" by side effect, and it can confuse
clients that try to act on the unknown encoding. [...]
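If the underlying goal was to stop nginx buffering the FastCGI response, a more conventional route than inventing a content coding is to disable the buffering explicitly. A sketch, assuming nginx 1.5.6 or later for `fastcgi_buffering`; the address and location pattern are illustrative:

```nginx
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_pass 127.0.0.1:9000;

    # Pass response data to the client as it arrives from PHP
    # instead of collecting it in buffers first.
    fastcgi_buffering off;   # nginx 1.5.6 and later
    gzip off;                # gzip adds its own buffering
}
```

Alternatively, PHP can emit an `X-Accel-Buffering: no` response header, which nginx honors as a per-response override without any config change.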
On 10/8/2013 11:48 AM, Maxim Dounin wrote:

Hello!

On Mon, Oct 07, 2013 at 10:57:14PM -0400, B.R. wrote:

[...]

> I then noticed on the capture that PHP was rightfully sending the content
> in 2 parts as expected, but somehow nginx was still waiting for the last
> part to arrive before sending content to the client.

What makes you think [...]
Hello!
On Mon, Oct 07, 2013 at 03:22:15PM -0400, Ben Johnson wrote:
[...]
> Sorry to bump this topic, but I feel as though I have exhausted the
> available information on this subject.
>
> I'm pretty much in the same boat as Roger from
> http://stackoverflow.com/questions/4870697/php-flush-that [...]
Hello,
On Mon, Oct 7, 2013 at 5:35 PM, Francis Daly wrote:
> Run the fastcgi server like this:
>
> env -i php-cgi -d cgi.fix_pathinfo=0 -q -b 9009
>
> Use an nginx config which includes something like this:
>
I would recommend being careful about that experiment, since there is a
high probability [...]
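Francis's `env -i php-cgi ... -b 9009` command above starts a single standalone FastCGI listener. Combined with the pool idea from earlier in the thread, one might script several such listeners. A dry-run sketch (ports and the use of `echo` are assumptions; remove `echo` to actually launch the workers):

```shell
# Print (dry run) the commands that would start three NTS php-cgi
# listeners, one per port, for an nginx upstream pool.
for port in 19001 19002 19003; do
    echo env -i php-cgi -d cgi.fix_pathinfo=0 -q -b "$port"
done
```

Running each listener under `env -i` keeps inherited environment variables from influencing the test, which is the point of Francis's experiment.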
On Mon, Oct 07, 2013 at 03:22:15PM -0400, Ben Johnson wrote:

Hi there,

> On 9/16/2013 1:19 PM, Ben Johnson wrote:
> > For whatever reason, nginx *always* buffers the output, even when I set [...]
> > Is it possible to disable PHP output buffering completely in nginx?

Have you shown that the initial [...]
Have you seen this one:
http://stackoverflow.com/questions/8882383/how-to-disable-output-buffering-in-php
Also try php NTS; it might also be that a flush only works with non-fcgi.
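The Stack Overflow link above comes down to turning off every buffering layer PHP itself controls. As a php.ini sketch — these are the commonly suggested values, not settings verified against this thread's setup:

```ini
; php.ini - disable PHP's own output buffering layers
output_buffering = Off         ; no userland output buffer
zlib.output_compression = Off  ; compression implies buffering
implicit_flush = On            ; flush after every output call
```

Even with all three set, the FastCGI layer and nginx can still buffer, which is why the thread keeps circling back to the nginx side.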
Posted at Nginx Forum:
http://forum.nginx.org/read.php?2,242895,243487#msg-243487
On 9/16/2013 1:19 PM, Ben Johnson wrote:

Hello,

In an effort to resolve a different issue, I am trying to confirm that
my stack is capable of servicing at least two simultaneous requests for
a given PHP script.

To confirm this, I have written a simple PHP script that runs for a
specified period of time and outputs the number [...]
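Ben's script is truncated in the archive. A minimal sketch of that kind of concurrency probe — the duration, output format, and use of time() are assumptions, not his actual code:

```php
<?php
// Hypothetical version of the test script described above: run for
// $duration seconds and emit one line per second, flushing each time.
// Two overlapping requests should both print lines in real time if the
// stack really handles them simultaneously.
header('Content-Type: text/plain');

$duration = 10;            // seconds to run
$start    = time();

while (time() - $start < $duration) {
    echo "elapsed: ", (time() - $start), "s\n";
    flush();               // push output toward the FastCGI layer
    sleep(1);
}
echo "done\n";
```

If the second request only starts producing output after the first one finishes, requests are being serialized somewhere — the single-PHP-process bottleneck discussed earlier in the thread.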