[PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
Hi..

For a set of media import routines, I'm using a
javascript->php_on_apache->windows.bat->php_cli->curl->php_script invocation
method.
It seems long-winded, but it's designed to have different parts of the import
process run on different servers.

I'm stuck at getting curl_exec() to return the data of the final php_script
that does the importing.
The script itself runs fine, and I've logged the data that curl_exec()
is supposed to receive with file_put_contents("/some/debug.txt",
json_encode($returnArray)). Those debug printouts show that the script is
just a few tiny steps away from a cascade of "return" statements, followed by
an echo(json_encode($returnArray)) and a normal end of the php_script at the
end of the call chain.

However, curl_exec() seems to hang completely. I've added over a dozen
"debuginfo -> file on server" statements, and the one that should fire
straight after curl_exec() never fires.

This only happens with large (1.8 GB) video files; a smaller (60 MB) video
file doesn't produce this problem, and the entire import routine works fine
then.

I'd very much appreciate any tips you might have for me.
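
For reference, here is a minimal sketch of the daemon-side call being described
(the URL, file paths and option values are placeholders made up for
illustration, not the actual code from this thread):

<?php
// Sketch of the calling side: capture the import script's echoed JSON and log
// around the call. URL and paths are hypothetical.
$ch = curl_init('http://example.com/site/cms/php/import_script.php?action=batchNext');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // collect the echoed JSON instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 0);        // no overall time limit for the long-running import

file_put_contents('/some/debug.txt', "before curl_exec\n", FILE_APPEND);
$json = curl_exec($ch);                      // the call that never returns for the 1.8 GB file
file_put_contents('/some/debug.txt', 'after curl_exec: ' . var_export($json, true) . "\n", FILE_APPEND);

$status = json_decode($json, true);          // the import script ends with echo(json_encode($returnArray))
curl_close($ch);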


Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread David Hutto
On Tue, Feb 8, 2011 at 7:24 AM, Tolas Anon  wrote:
> Hi..
>
> For a set of media import routines, i'm using a
> javascript->php_on_apache->windows.bat->php_cli->curl->php_script invocation
> method.
> It seems longwinded, but it's designed to have different parts of the import
> process run on different servers.
>
> I'm stuck at getting curl_exec() to return the data of the final php_script
> that does the importing.


This is the part that hits my non-professional's nerve: you want
something to return data, like a 'db', that does the 'final' importing.
Importing is more of a Python thing in my experience, and it is never done at
the end but at the beginning, so please explain why this is done last.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread David Hutto
On Tue, Feb 8, 2011 at 7:32 AM, David Hutto  wrote:
> On Tue, Feb 8, 2011 at 7:24 AM, Tolas Anon  wrote:
>> Hi..
>>
>> For a set of media import routines, i'm using a
>> javascript->php_on_apache->windows.bat->php_cli->curl->php_script invocation
>> method.
>> It seems longwinded, but it's designed to have different parts of the import
>> process run on different servers.
>>
>> I'm stuck at getting curl_exec() to return the data of the final php_script
>> that does the importing.
>
>
> this is the part that hits my non professionals nerve.you want
> something to return data like a 'db' that does the 'final' importing'.
> Importing is more of python from my exp,

In PHP, it would be include, and again it would be done initially.

> and it is never done at the
> end, but at the beginning, so please explain why it's last that this
> is done.
>



-- 
According to theoretical physics, the division of spatial intervals as
the universe evolves gives rise to the fact that in another timeline,
your interdimensional counterpart received helpful advice from me...so
be eternally pleased for them.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Ashley Sheridan
On Tue, 2011-02-08 at 13:24 +0100, Tolas Anon wrote:

> Hi..
> 
> For a set of media import routines, i'm using a
> javascript->php_on_apache->windows.bat->php_cli->curl->php_script invocation
> method.
> It seems longwinded, but it's designed to have different parts of the import
> process run on different servers.
> 
> I'm stuck at getting curl_exec() to return the data of the final php_script
> that does the importing.
> The script itself runs fine, and i've recorded good details that curl_exec()
> is supposed to catch with file_put_contents("/some/debug.txt",
> json_encode($returnArray)), and from those debug-printouts it's just a few
> tiny steps towards a cascade of "return" statements, followed by a
> echo(json_encode($returnArray)) and a normal end to the php_script at the
> end of the call chain.
> 
> However, curl_exec() seems to hang completely. I've added over a dozen
> "debuginfo -> file on server" statements, and the one that should fire
> straight after curl_exec() does not fire.
> 
> It does this only with large (1.8gb) video files, a smaller (60mb) video
> file doesn't produce this problem and the entire import routines work fine
> then.
> 
> I'd very much appreciate any tips you might have for me.


Let me see if I've got this right.

The windows.bat is processing the media file somehow, then calling a
php_cli script which makes a cURL call to another web-based PHP script?
Is this right? The final script, I assume, is sent some info from
the cURL call and uses it somehow (in a DB maybe?) before sending some sort
of message back to your cURL call. What is the code then doing with it
after that?

Thanks,
Ash
http://www.ashleysheridan.co.uk




[PHP] Paging and permissions

2011-02-08 Thread Arno Kuhl
I'm hoping some clever php gurus have been here before and are willing to
share some ideas.
 
I have a site where articles are assigned to categories in containers. An
article can be assigned to only one category per container, but one or more
containers. Access permissions can be set per article, per category and/or
per container, for one or more users and/or user groups. If an article is
assigned to 10 categories and only one of those has a permission denying
access, then the article can't be accessed even if browsing through one of
the other 9 categories. Currently everything works fine, with article titles
showing when browsing through category or search result lists, and a message
is displayed when the article is clicked if it cannot be viewed because of a
permission.
 
Now there's a requirement to not display the article title in category lists
and search results if it cannot be viewed. I'm stuck with how to determine
the number of results for paging at the start of the list or search. The
site is quite large (20,000+ articles and growing) so reading the entire
result set and sifting through it with permission rules for each request is
not an option. But it might be an option if done once at the start of each
search or list request, and then use that temporary modified result set for
subsequent requests on the same set. I thought of saving the set to a
temporary db table or file (not sure about overhead of
serializing/unserializing large arrays). A sizing exercise based on the
recordset returned for searches and lists shows a max of about 150MB for
20,000 articles and 380MB for 50,000 articles that needs to be saved
temporarily per search or list request - in the vast majority of cases the
set will be *much* smaller but it needs to cope with the worst case, and
still do so a year down the line.
 
All this extra work because I can't simply get an accurate number of results
for paging, because of permissions!
 
So my questions are:
1. Which is better (performance-wise) for this situation: file or db?
2. How do I prepare a potentially very large data set for a file, or for fast
writing to a new table? (I.e. I obviously don't want to write it record by
record; see the sketch after this list.)
3. Are there any other alternatives worth looking at?
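
On question 2, one common approach is a single multi-row INSERT into a
temporary table rather than one query per record. A sketch only; the table,
the column names and the PDO connection are assumptions, not the actual
schema:

<?php
// Sketch: bulk-write the permitted article IDs for one search in one statement.
function save_result_set(PDO $db, $searchId, array $articleIds)
{
    if (empty($articleIds)) {
        return;
    }
    $tuples = array();
    $params = array();
    foreach ($articleIds as $id) {
        $tuples[] = '(?, ?)';
        $params[] = $searchId;
        $params[] = (int) $id;
    }
    // One INSERT with many value tuples instead of 20,000 separate INSERTs.
    // For very large sets, chunk the IDs with array_chunk() to bound statement size.
    $sql = 'INSERT INTO tmp_search_results (search_id, article_id) VALUES '
         . implode(',', $tuples);
    $db->prepare($sql)->execute($params);
}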
 
TIA
 
Cheers
Arno


Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread David Hutto
On Tue, Feb 8, 2011 at 7:35 AM, Ashley Sheridan
 wrote:
> On Tue, 2011-02-08 at 13:24 +0100, Tolas Anon wrote:
>
>> Hi..
>>
>> For a set of media import routines, i'm using a
>> javascript->php_on_apache->windows.bat->php_cli->curl->php_script invocation
>> method.
>> It seems longwinded, but it's designed to have different parts of the import
>> process run on different servers.
>>
>> I'm stuck at getting curl_exec() to return the data of the final php_script
>> that does the importing.
>> The script itself runs fine, and i've recorded good details that curl_exec()
>> is supposed to catch with file_put_contents("/some/debug.txt",
>> json_encode($returnArray)), and from those debug-printouts it's just a few
>> tiny steps towards a cascade of "return" statements, followed by a
>> echo(json_encode($returnArray)) and a normal end to the php_script at the
>> end of the call chain.
>>
>> However, curl_exec() seems to hang completely. I've added over a dozen
>> "debuginfo -> file on server" statements, and the one that should fire
>> straight after curl_exec() does not fire.
>>
>> It does this only with large (1.8gb) video files, a smaller (60mb) video
>> file doesn't produce this problem and the entire import routines work fine
>> then.
>>
>> I'd very much appreciate any tips you might have for me.
>
>
> Let me see if I've got this right.
>
> The windows.bat is processing the media file somehow, then calling a
> php_cli script which makes a cURL call to another web-based PHP script?
> Is this right? The final script I assume is getting sent some info from
> the cURL call and is using it somehow (in a DB maybe?) before some sort
> of message back to your curl call. What is the code then doing with it
> after that?

Other than placing it in the main PHP file (index.php), at the position
you called it from and where it sits in precedence? Because in the end,
it is part of the page being returned to the user.

>
> Thanks,
> Ash
> http://www.ashleysheridan.co.uk
>
>
>



-- 
According to theoretical physics, the division of spatial intervals as
the universe evolves gives rise to the fact that in another timeline,
your interdimensional counterpart received helpful advice from me...so
be eternally pleased for them.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Paging and permissions

2011-02-08 Thread David Hutto
On Tue, Feb 8, 2011 at 7:36 AM, Arno Kuhl  wrote:
> I'm hoping some clever php gurus have been here before and are willing to
> share some ideas.
>
> I have a site where articles are assigned to categories in containers. An
> article can be assigned to only one category per container, but one or more
> containers. Access permissions can be set per article, per category and/or
> per container, for one or more users and/or user groups. If an article is
> assigned to 10 categories and only one of those has a permission denying
> access, then the article can't be accessed even if browsing through one of
> the other 9 categories. Currently everything works fine, with article titles
> showing when browsing through category or search result lists, and a message
> is displayed when the article is clicked if it cannot be viewed because of a
> permission.
>
> Now there's a requirement to not display the article title in category lists
> and search results if it cannot be viewed. I'm stuck with how to determine
> the number of results for paging at the start of the list or search. The
> site is quite large (20,000+ articles and growing) so reading the entire
> result set and sifting through it with permission rules for each request is
> not an option. But it might be an option if done once at the start of each
> search or list request, and then use that temporary modified result set for
> subsequent requests on the same set. I thought of saving the set to a
> temporary db table or file (not sure about overhead of
> serializing/unserializing large arrays). A sizing exercise based on the
> recordset returned for searches and lists shows a max of about 150MB for
> 20,000 articles and 380MB for 50,000 articles that needs to be saved
> temporarily per search or list request - in the vast majority of cases the
> set will be *much* smaller but it needs to cope with the worst case, and
> still do so a year down the line.
>
> All this extra work because I can't simply get an accurate number of results
> for paging, because of permissions!
>
> So my questions are:
> 1. Which is better (performance) for this situation: file or db?

Have you timed it yourself?

> 2. How do I prepare a potentially very large data set for file or fast
> writing to a new table (ie I obviously don't want to write it record by
> record)

Even the DBs can't insert as fast as the function presents data to them
and respond, so again... time it.

> 3. Are there any other alternatives worth looking at?

That is a question for the experienced PHP developers, but the above
still applies.

>
> TIA
>
> Cheers
> Arno
>



-- 
According to theoretical physics, the division of spatial intervals as
the universe evolves gives rise to the fact that in another timeline,
your interdimensional counterpart received helpful advice from me...so
be eternally pleased for them.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
On Tue, Feb 8, 2011 at 1:35 PM, Ashley Sheridan 
wrote:

> On Tue, 2011-02-08 at 13:24 +0100, Tolas Anon wrote:
>
> > Hi..
> >
> > For a set of media import routines, i'm using a
> > javascript->php_on_apache->windows.bat->php_cli->curl->php_script
> invocation
> > method.
> > It seems longwinded, but it's designed to have different parts of the
> import
> > process run on different servers.
> >
> > I'm stuck at getting curl_exec() to return the data of the final
> php_script
> > that does the importing.
> > The script itself runs fine, and i've recorded good details that
> curl_exec()
> > is supposed to catch with file_put_contents("/some/debug.txt",
> > json_encode($returnArray)), and from those debug-printouts it's just a
> few
> > tiny steps towards a cascade of "return" statements, followed by a
> > echo(json_encode($returnArray)) and a normal end to the php_script at the
> > end of the call chain.
> >
> > However, curl_exec() seems to hang completely. I've added over a dozen
> > "debuginfo -> file on server" statements, and the one that should fire
> > straight after curl_exec() does not fire.
> >
> > It does this only with large (1.8gb) video files, a smaller (60mb) video
> > file doesn't produce this problem and the entire import routines work
> fine
> > then.
> >
> > I'd very much appreciate any tips you might have for me.
>
>
> Let me see if I've got this right.
>
> The windows.bat is processing the media file somehow, then calling a
> php_cli script which makes a cURL call to another web-based PHP script?
> Is this right? The final script I assume is getting sent some info from
> the cURL call and is using it somehow (in a DB maybe?) before some sort
> of message back to your curl call. What is the code then doing with it
> after that?
>
> Thanks,
> Ash
> http://www.ashleysheridan.co.uk
>
>
>
Oops, I missed a step (php_daemon_script) in the chain of calls;

it's: javascript -> php_on_apache -> windows.bat -> php_cli ->
php_daemon_script -> curl_exec -> php_script

followed by calls-until-finished from javascript that read the status of
php_script via JSON files written to the server, and thus display the status
to the end-user..

The windows.bat just starts up cli-php with admin privileges, which executes
the php_daemon_script, which uses repeated curl calls to the import script
("php_script" at the end of my chain) that does all the work for a single
item in the total upload/import queue; it does video conversion with
exec(/path/to/ffmpeg), photo conversion with ImageMagick, and updates the db
with the PHP ADOdb library.
At the end of its work, php_script returns a simple and short status array
(json_encode()d) to the php_daemon_script via curl_exec(), and that dictates
whether the php_daemon_script should continue calling the (import) php_script
more times.

It's curl_exec() that hangs/freezes, both when using CURLOPT_RETURNTRANSFER=1
and when capturing output with ob_start() and ob_get_clean(). I've gathered
that much from my custom debug logs.
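
A minimal sketch of the daemon loop as described (reconstructed; the URL, the
debug path and the 'continue' key are hypothetical names, not the real code
from this thread):

<?php
// Sketch of the php_daemon_script loop described above.
set_time_limit(0); // the daemon itself must be allowed to run indefinitely

$importUrl = 'http://example.com/site/cms/php/import_script.php?action=batchNext';

do {
    $ch = curl_init($importUrl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // collect echo(json_encode($returnArray))
    curl_setopt($ch, CURLOPT_TIMEOUT, 0);        // one call can take 30+ minutes for a 2 GB video

    $json  = curl_exec($ch);                     // the call that hangs for the large file
    $error = curl_error($ch);
    curl_close($ch);

    file_put_contents('/path/to/daemon-debug.txt',
        date('c') . ' got: ' . var_export($json, true) . ' err: ' . $error . "\n",
        FILE_APPEND);

    $status = json_decode($json, true);
    // the import script's status array decides whether to process the next queue
    // item; the 'continue' flag here is a made-up name for that decision
} while (is_array($status) && !empty($status['continue']));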


Re: [PHP] Paging and permissions

2011-02-08 Thread Ashley Sheridan
On Tue, 2011-02-08 at 14:36 +0200, Arno Kuhl wrote:

> I'm hoping some clever php gurus have been here before and are willing to
> share some ideas.
>  
> I have a site where articles are assigned to categories in containers. An
> article can be assigned to only one category per container, but one or more
> containers. Access permissions can be set per article, per category and/or
> per container, for one or more users and/or user groups. If an article is
> assigned to 10 categories and only one of those has a permission denying
> access, then the article can't be accessed even if browsing through one of
> the other 9 categories. Currently everything works fine, with article titles
> showing when browsing through category or search result lists, and a message
> is displayed when the article is clicked if it cannot be viewed because of a
> permission.
>  
> Now there's a requirement to not display the article title in category lists
> and search results if it cannot be viewed. I'm stuck with how to determine
> the number of results for paging at the start of the list or search. The
> site is quite large (20,000+ articles and growing) so reading the entire
> result set and sifting through it with permission rules for each request is
> not an option. But it might be an option if done once at the start of each
> search or list request, and then use that temporary modified result set for
> subsequent requests on the same set. I thought of saving the set to a
> temporary db table or file (not sure about overhead of
> serializing/unserializing large arrays). A sizing exercise based on the
> recordset returned for searches and lists shows a max of about 150MB for
> 20,000 articles and 380MB for 50,000 articles that needs to be saved
> temporarily per search or list request - in the vast majority of cases the
> set will be *much* smaller but it needs to cope with the worst case, and
> still do so a year down the line.
>  
> All this extra work because I can't simply get an accurate number of results
> for paging, because of permissions!
>  
> So my questions are:
> 1. Which is better (performance) for this situation: file or db?
> 2. How do I prepare a potentially very large data set for file or fast
> writing to a new table (ie I obviously don't want to write it record by
> record)
> 3. Are there any other alternatives worth looking at?
>  
> TIA
>  
> Cheers
> Arno


How are you determining (logically, not in code) when an article is
allowed to be read?

Assume an article on "user permissions in mysql" is in a container
called 'databases' and in a second one called 'security' and both
containers are in a category called 'computers'

Now get a user called John who is in a group called 'db admins' and that
group gives him permissions to view all articles in the 'databases'
container and any articles in any container in the 'computers' category.
Now assume John also has explicit user permissions revoking that right
to view the article in any container.

What I'm getting at is what's the order of privilege for rights? Do
group rights for categories win out over those for containers, or do
individual user rights trump all of them overall?

I think once that's figured out, a lot can be done inside the query
itself to minimise the impact on the script getting the results.
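
As a rough illustration (the tables, columns and $db connection below are
assumptions, not Arno's actual schema, and only article-level denies are
shown; category and container denies would need similar clauses), the "one
deny anywhere blocks the article" rule from the original post could be pushed
into the query itself, which also gives an accurate total for paging:

<?php
// Sketch only: filter denied articles inside the query, following the
// "any deny wins" rule described in the original post.
$sql = "
    SELECT SQL_CALC_FOUND_ROWS a.id, a.title
    FROM   articles a
    JOIN   article_category ac ON ac.article_id = a.id
    WHERE  ac.category_id = :categoryId
      AND  NOT EXISTS (
               SELECT 1
               FROM   permissions p
               WHERE  p.article_id = a.id
                 AND  p.user_id    = :userId
                 AND  p.access     = 'deny'
           )
    ORDER  BY a.title
    LIMIT  :offset, :perPage";
// After fetching a page, SELECT FOUND_ROWS() gives the full count for the pager
// without sifting through the whole result set in PHP.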

Thanks,
Ash
http://www.ashleysheridan.co.uk




Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Ashley Sheridan
On Tue, 2011-02-08 at 13:50 +0100, Tolas Anon wrote:

> 
> 
> 
> On Tue, Feb 8, 2011 at 1:35 PM, Ashley Sheridan
>  wrote:
> 
> 
> On Tue, 2011-02-08 at 13:24 +0100, Tolas Anon wrote:
> 
> > Hi..
> >
> > For a set of media import routines, i'm using a
> >
> javascript->php_on_apache->windows.bat->php_cli->curl->php_script 
> invocation
> > method.
> > It seems longwinded, but it's designed to have different
> parts of the import
> > process run on different servers.
> >
> > I'm stuck at getting curl_exec() to return the data of the
> final php_script
> > that does the importing.
> > The script itself runs fine, and i've recorded good details
> that curl_exec()
> > is supposed to catch with
> file_put_contents("/some/debug.txt",
> > json_encode($returnArray)), and from those debug-printouts
> it's just a few
> > tiny steps towards a cascade of "return" statements,
> followed by a
> > echo(json_encode($returnArray)) and a normal end to the
> php_script at the
> > end of the call chain.
> >
> > However, curl_exec() seems to hang completely. I've added
> over a dozen
> > "debuginfo -> file on server" statements, and the one that
> should fire
> > straight after curl_exec() does not fire.
> >
> > It does this only with large (1.8gb) video files, a smaller
> (60mb) video
> > file doesn't produce this problem and the entire import
> routines work fine
> > then.
> >
> > I'd very much appreciate any tips you might have for me.
> 
> 
> 
> 
> Let me see if I've got this right.
> 
> The windows.bat is processing the media file somehow, then
> calling a
> php_cli script which makes a cURL call to another web-based
> PHP script?
> Is this right? The final script I assume is getting sent some
> info from
> the cURL call and is using it somehow (in a DB maybe?) before
> some sort
> of message back to your curl call. What is the code then doing
> with it
> after that?
> 
> Thanks,
> Ash
> http://www.ashleysheridan.co.uk
> 
> 
> 
> 
> oops, i missed a step (php_daemon_script) in the chain of calls;
> 
> it's : javascript -> php_on_apache -> windows.bat -> php_cli ->
> php_daemon_script -> curl_exec -> php_script
> 
> followed by calls-until-finished from javascript that read the status
> of php_script via json files written to the server and thus display
> the status to the end-user..
> 
> the windows.bat just starts up cli-php with admin privileges, which
> executes the php_daemon_script, which uses repetitive curl calls to
> the import-script ("php_script" at the end of my chain) that does all
> the work for a single item in the total upload/import queue; it does
> video conversion with exec(/path/to/ffmpeg), photo conversion with
> imagemagick, and updates the db with the php adodb library. 
> php_script at the end of it's work returns a simple and short status
> array (json_encode()d) to the php_daemon_script via curl_exec, that
> dictates if the php_daemon_script should continue calling the (import)
> php_script more times.
> 
> it's curl_exec that hangs/freezes, both with using
> CURLOPT_RETURNTRANSFER=1, or capturing output with ob_start() and
> ob_get_clean(). i've gathered that much from my custom debug logs.


I've done similar things with transcoders, and found that often the best
way is to send the job off to the transcoder and leave it. Don't poll
from your main app to see how it's getting along (it'll get annoyed
otherwise!).

When the script responsible for transcoding is done, though, it can
report back to the main app to let it know how it got on, and pass along
any details it needs that way.

This isn't great for getting things like transcode progress, but it will
reduce polling traffic, and transcodes can take a long time, especially
if more than one is running at the same time.
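
A sketch of that pattern (the URLs, field names and file paths are
hypothetical, not code from this thread): the dispatcher hands the job off
without waiting for the transcode, and the transcoder calls back when it has
finished.

<?php
// --- dispatcher side (sketch): submit the job and return immediately ---
$ch = curl_init('http://transcoder.example.com/start.php');
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'source'   => '/uploads/big-video.avi',
    'callback' => 'http://cms.example.com/transcode_done.php',
));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 10); // only waits for "job accepted", not for the transcode
$accepted = curl_exec($ch);
curl_close($ch);

// --- transcoder side (sketch): once ffmpeg is done, report back ---
$callbackUrl = 'http://cms.example.com/transcode_done.php'; // would come with the job request
$ch = curl_init($callbackUrl);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('status' => 'done', 'output' => '/videos/big-video.flv'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_exec($ch);
curl_close($ch);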

Thanks,
Ash
http://www.ashleysheridan.co.uk




Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread David Hutto
On Tue, Feb 8, 2011 at 8:02 AM, Ashley Sheridan 
wrote:

>  On Tue, 2011-02-08 at 07:59 -0500, David Hutto wrote:
>
> > it's : javascript -> php_on_apache -> windows.bat -> php_cli ->
> > php_daemon_script -> curl_exec -> php_script
> >
> > followed by calls-until-finished from javascript that read the status of
> > php_script via json files written to the server and thus display the
> status
> > to the end-user..
>
> 1: java event
> 2:java event calls a php function to apache.
>
>
> Java != Javascript
>
>
Honestly, coming from the HTML/CSS/JavaScript/PHP mentality, I've never used
Java enough to notice there was a difference (not that I don't know the
difference, I just misuse the term).



>
>   Thanks,
> Ash
> http://www.ashleysheridan.co.uk
>
>
>


Fwd: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
-- Forwarded message --
From: Tolas Anon 
Date: Tue, Feb 8, 2011 at 1:56 PM
Subject: Re: [PHP] curl_exec won't return (any data)
To: a...@ashleysheridan.co.uk




On Tue, Feb 8, 2011 at 1:50 PM, Tolas Anon  wrote:

>
>
> On Tue, Feb 8, 2011 at 1:35 PM, Ashley Sheridan 
> wrote:
>
>> On Tue, 2011-02-08 at 13:24 +0100, Tolas Anon wrote:
>>
>> > Hi..
>> >
>> > For a set of media import routines, i'm using a
>> > javascript->php_on_apache->windows.bat->php_cli->curl->php_script
>> invocation
>> > method.
>> > It seems longwinded, but it's designed to have different parts of the
>> import
>> > process run on different servers.
>> >
>> > I'm stuck at getting curl_exec() to return the data of the final
>> php_script
>> > that does the importing.
>> > The script itself runs fine, and i've recorded good details that
>> curl_exec()
>> > is supposed to catch with file_put_contents("/some/debug.txt",
>> > json_encode($returnArray)), and from those debug-printouts it's just a
>> few
>> > tiny steps towards a cascade of "return" statements, followed by a
>> > echo(json_encode($returnArray)) and a normal end to the php_script at
>> the
>> > end of the call chain.
>> >
>> > However, curl_exec() seems to hang completely. I've added over a dozen
>> > "debuginfo -> file on server" statements, and the one that should fire
>> > straight after curl_exec() does not fire.
>> >
>> > It does this only with large (1.8gb) video files, a smaller (60mb) video
>> > file doesn't produce this problem and the entire import routines work
>> fine
>> > then.
>> >
>> > I'd very much appreciate any tips you might have for me.
>>
>>
>> Let me see if I've got this right.
>>
>> The windows.bat is processing the media file somehow, then calling a
>> php_cli script which makes a cURL call to another web-based PHP script?
>> Is this right? The final script I assume is getting sent some info from
>> the cURL call and is using it somehow (in a DB maybe?) before some sort
>> of message back to your curl call. What is the code then doing with it
>> after that?
>>
>> Thanks,
>> Ash
>> http://www.ashleysheridan.co.uk
>>
>>
>>
> oops, i missed a step (php_daemon_script) in the chain of calls;
>
> it's : javascript -> php_on_apache -> windows.bat -> php_cli ->
> php_daemon_script -> curl_exec -> php_script
>
> followed by calls-until-finished from javascript that read the status of
> php_script via json files written to the server and thus display the status
> to the end-user..
>
> the windows.bat just starts up cli-php with admin privileges, which
> executes the php_daemon_script, which uses repetitive curl calls to the
> import-script ("php_script" at the end of my chain) that does all the work
> for a single item in the total upload/import queue; it does video conversion
> with exec(/path/to/ffmpeg), photo conversion with imagemagick, and updates
> the db with the php adodb library.
> php_script at the end of it's work returns a simple and short status array
> (json_encode()d) to the php_daemon_script via curl_exec, that dictates if
> the php_daemon_script should continue calling the (import) php_script more
> times.
>
> it's curl_exec that hangs/freezes, both with using CURLOPT_RETURNTRANSFER=1,
> or capturing output with ob_start() and ob_get_clean(). i've gathered that
> much from my custom debug logs.
>

Both php_daemon_script and php_script should use
ini_set('max_execution_time', 0);

but I just saw that php_daemon_script does not!!
That might be my error, I hope.

I'll let you know in about an hour when my test completes..
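
For reference, a sketch of the limits that typically matter when one HTTP
request runs for 30+ minutes (the URL is a placeholder and the values are
illustrative, not taken from this thread):

<?php
// In both php_daemon_script (CLI) and php_script (the web-side importer):
ini_set('max_execution_time', 0);   // same effect as set_time_limit(0); the CLI already defaults to 0

// On the daemon's curl handle:
$ch = curl_init('http://example.com/site/cms/php/import_script.php?action=batchNext');
curl_setopt($ch, CURLOPT_TIMEOUT, 0);         // no overall transfer time limit
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30); // but still bound the connect phase

// Note: Apache's own Timeout directive, and any router/proxy idle timeout
// between client and server, can still cut a long-silent connection even when
// PHP and cURL never time out.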


Re: [PHP] Paging and permissions

2011-02-08 Thread Tolas Anon
On Tue, Feb 8, 2011 at 1:36 PM, Arno Kuhl  wrote:

> I'm hoping some clever php gurus have been here before and are willing to
> share some ideas.
>
> I have a site where articles are assigned to categories in containers. An
> article can be assigned to only one category per container, but one or more
> containers. Access permissions can be set per article, per category and/or
> per container, for one or more users and/or user groups. If an article is
> assigned to 10 categories and only one of those has a permission denying
> access, then the article can't be accessed even if browsing through one of
> the other 9 categories. Currently everything works fine, with article
> titles
> showing when browsing through category or search result lists, and a
> message
> is displayed when the article is clicked if it cannot be viewed because of
> a
> permission.
>
> Now there's a requirement to not display the article title in category
> lists
> and search results if it cannot be viewed. I'm stuck with how to determine
> the number of results for paging at the start of the list or search. The
> site is quite large (20,000+ articles and growing) so reading the entire
> result set and sifting through it with permission rules for each request is
> not an option. But it might be an option if done once at the start of each
> search or list request, and then use that temporary modified result set for
> subsequent requests on the same set. I thought of saving the set to a
> temporary db table or file (not sure about overhead of
> serializing/unserializing large arrays). A sizing exercise based on the
> recordset returned for searches and lists shows a max of about 150MB for
> 20,000 articles and 380MB for 50,000 articles that needs to be saved
> temporarily per search or list request - in the vast majority of cases the
> set will be *much* smaller but it needs to cope with the worst case, and
> still do so a year down the line.
>
> All this extra work because I can't simply get an accurate number of
> results
> for paging, because of permissions!
>
> So my questions are:
> 1. Which is better (performance) for this situation: file or db?
> 2. How do I prepare a potentially very large data set for file or fast
> writing to a new table (ie I obviously don't want to write it record by
> record)
> 3. Are there any other alternatives worth looking at?
>
> TIA
>
> Cheers
> Arno
>

It seems to me you're making your setup needlessly complicated and restrictive.

And it's bad form to display articles in search results that aren't allowed
to be viewed..

Tell us more about why you want it to be so restrictive; I just don't
understand it.


Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread David Hutto
On Tue, Feb 8, 2011 at 7:50 AM, Tolas Anon  wrote:
> On Tue, Feb 8, 2011 at 1:35 PM, Ashley Sheridan 
> wrote:
>
>> On Tue, 2011-02-08 at 13:24 +0100, Tolas Anon wrote:
>>
>> > Hi..
>> >
>> > For a set of media import routines, i'm using a
>> > javascript->php_on_apache->windows.bat->php_cli->curl->php_script
>> invocation
>> > method.
>> > It seems longwinded, but it's designed to have different parts of the
>> import
>> > process run on different servers.
>> >
>> > I'm stuck at getting curl_exec() to return the data of the final
>> php_script
>> > that does the importing.
>> > The script itself runs fine, and i've recorded good details that
>> curl_exec()
>> > is supposed to catch with file_put_contents("/some/debug.txt",
>> > json_encode($returnArray)), and from those debug-printouts it's just a
>> few
>> > tiny steps towards a cascade of "return" statements, followed by a
>> > echo(json_encode($returnArray)) and a normal end to the php_script at the
>> > end of the call chain.
>> >
>> > However, curl_exec() seems to hang completely. I've added over a dozen
>> > "debuginfo -> file on server" statements, and the one that should fire
>> > straight after curl_exec() does not fire.
>> >
>> > It does this only with large (1.8gb) video files, a smaller (60mb) video
>> > file doesn't produce this problem and the entire import routines work
>> fine
>> > then.
>> >
>> > I'd very much appreciate any tips you might have for me.
>>
>>
>> Let me see if I've got this right.
>>
>> The windows.bat is processing the media file somehow, then calling a
>> php_cli script which makes a cURL call to another web-based PHP script?
>> Is this right? The final script I assume is getting sent some info from
>> the cURL call and is using it somehow (in a DB maybe?) before some sort
>> of message back to your curl call. What is the code then doing with it
>> after that?
>>
>> Thanks,
>> Ash
>> http://www.ashleysheridan.co.uk
>>
>>
>>
> oops, i missed a step (php_daemon_script) in the chain of calls;
>
> it's : javascript -> php_on_apache -> windows.bat -> php_cli ->
> php_daemon_script -> curl_exec -> php_script
>
> followed by calls-until-finished from javascript that read the status of
> php_script via json files written to the server and thus display the status
> to the end-user..

1: java event
2: java event calls a php function to apache.
3: which calls a windows.bat
4: calls a php command line
5: calls a php daemon, which is a 'waiting server process', listening
on a port.
6: you execute a command line statement
7: and it's for a php script to return something

>
> the windows.bat just starts up cli-php with admin privileges, which executes
> the php_daemon_script, which uses repetitive curl calls to the import-script
> ("php_script" at the end of my chain) that does all the work for a single
> item in the total upload/import queue;


 it does video conversion with
> exec(/path/to/ffmpeg), photo conversion with imagemagick, and updates the db
> with the php adodb library.
> php_script at the end of it's work returns a simple and short status array
> (json_encode()d) to the php_daemon_script via curl_exec, that dictates if
> the php_daemon_script should continue calling the (import) php_script more
> times.
>
> it's curl_exec that hangs/freezes, both with using CURLOPT_RETURNTRANSFER=1,
> or capturing output with ob_start() and ob_get_clean(). i've gathered that
> much from my custom debug logs.
>

What do you have at the beginning, and what do you want at the end of
this process? Break it down into simple steps. Then not only can
any PHP programmer help you, but any computer scientist can help you
work out the control-flow logic.


-- 
According to theoretical physics, the division of spatial intervals as
the universe evolves gives rise to the fact that in another timeline,
your interdimensional counterpart received helpful advice from me...so
be eternally pleased for them.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Ashley Sheridan
On Tue, 2011-02-08 at 07:59 -0500, David Hutto wrote:

> > it's : javascript -> php_on_apache -> windows.bat -> php_cli ->
> > php_daemon_script -> curl_exec -> php_script
> >
> > followed by calls-until-finished from javascript that read the
> status of
> > php_script via json files written to the server and thus display the
> status
> > to the end-user..
> 
> 1: java event
> 2:java event calls a php function to apache. 


Java != Javascript

Thanks,
Ash
http://www.ashleysheridan.co.uk




RE: [PHP] Secure monetary transactions

2011-02-08 Thread Bob McConnell
From: Paul M Foster

> I'm certain people on this list have set up this type of system for
> customers. So I have some questions:
> 
> 1) Does the usual online store software (osCommerce or whatever)
include
> "secure" pages for acceptance of credit cards? I know they have the
> capability to pass this info securely off to places like authorize.net
> for processing.
> 
> 2) Assuming a customer website, probably hosted in a shared hosting
> environment, with appropriate ecommerce store software, how does one
> deal with PCI compliance? I mean, the customer would have no control
> over the data center where the site is hosted. Moreover, they would
> probably have little control over the updating of insecure software,
as
> demanded by PCI. They likely don't have the facilities to do the type
of
> penetration testing PCI wants. So how could they (or how do you) deal
> with the potentially hundreds of questions the PCI questionnaire asks
> about all this stuff? How do you, as a programmer doing this for a
> customer, handle this?

1) No.

2) PCI compliance is neither simple nor cheap. If you have not done it
before, hire a consultant that has and have them train you. You will
also need annual refresher courses and a good auditor to validate your
site every month.

You will need to change data centers, as you need one that is PCI
compliant for the pages that will handle protected information. There
are requirements for physical security of those servers as well as the
software that runs on them. You also have a choice of maintaining your
own servers or finding a managed hosting service that will maintain them
for you.

One of the requirements is that you must maintain separate servers for
development and testing. You also need to establish a formal
development, test and deployment process. The developers are not allowed
to have any access to the production servers. We have four sets,
development, QA test, User Acceptance Test and production. The latter
two are exposed to the Internet, while the first two are internal only.

We have several sites that are now PCI compliant. It took us eight
months after the decision to get the first one online and certified.
Most of that was training and waiting for the audits and certification,
as we nearly passed the initial validation on the first try. But we had
to change hosting providers twice to find one that we were comfortable
with.

After that is all said and done, keep in mind that the primary purpose
of the PCI requirements is to mitigate the financial liability of the
credit card issuers. If anything goes wrong at your end that exposes
privileged data, you will be financially responsible for the damages. So
make sure you go above and beyond those requirements to protect
yourself.

Bob McConnell

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Secure monetary transactions

2011-02-08 Thread David Hutto
On Tue, Feb 8, 2011 at 8:29 AM, Bob McConnell  wrote:
> From: Paul M Foster
>
>> I'm certain people on this list have set up this type of system for
>> customers. So I have some questions:
>>
>> 1) Does the usual online store software (osCommerce or whatever)
> include
>> "secure" pages for acceptance of credit cards? I know they have the
>> capability to pass this info securely off to places like authorize.net
>> for processing.
>>
>> 2) Assuming a customer website, probably hosted in a shared hosting
>> environment, with appropriate ecommerce store software, how does one
>> deal with PCI compliance? I mean, the customer would have no control
>> over the data center where the site is hosted. Moreover, they would
>> probably have little control over the updating of insecure software,
> as
>> demanded by PCI. They likely don't have the facilities to do the type
> of
>> penetration testing PCI wants. So how could they (or how do you) deal
>> with the potentially hundreds of questions the PCI questionnaire asks
>> about all this stuff? How do you, as a programmer doing this for a
>> customer, handle this?
>
> 1) No.
>
> 2) PCI compliance is neither simple nor cheap. If you have not done it
> before, hire a consultant that has and have them train you. You will
> also need annual refresher courses and a good auditor to validate your
> site every month.
>
> You will need to change data centers, as you need one that is PCI
> compliant for the pages that will handle protected information. There
> are requirements for physical security of those servers as well as the
> software that runs on them. You also have a choice of maintaining your
> own servers or finding a managed hosting service that will maintain them
> for you.
>
> One of the requirements is that you must maintain separate servers for
> development and testing. You also need to establish a formal
> development, test and deployment process. The developers are not allowed
> to have any access to the production servers. We have four sets,
> development, QA test, User Acceptance Test and production. The latter
> two are exposed to the Internet, while the first two are internal only.
>
> We have several sites that are now PCI compliant. It took us eight
> months after the decision to get the first one online and certified.
> Most of that was training and waiting for the audits and certification,
> as we nearly passed the initial validation on the first try. But we had
> to change hosting providers twice to find one that we were comfortable
> with.
>
> After that is all said and done, keep in mind that the primary purpose
> of the PCI requirements is to mitigate the financial liability of the
> credit card issuers. If anything goes wrong at your end that exposes
> privileged data, you will be financially responsible for the damages. So
> make sure you go above and beyond those requirements to protect
> yourself.
>
> Bob McConnell

1. The client is responsible for the procurement of the hardware and
software they want used.

2. Programmers are to work in a secure environment where reliable
technologies are provided for them to develop with.

3. The client is always right, so they're always to blame as well,
according to their own procured wisdom.

>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>



-- 
According to theoretical physics, the division of spatial intervals as
the universe evolves gives rise to the fact that in another timeline,
your interdimensional counterpart received helpful advice from me...so
be eternally pleased for them.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
OK, I've done another run, this time with both php_daemon_script and
php_script allowed to run indefinitely, and I've triple-checked my
results..

- All the calls to convert and import the ~2 GB video file in
php_script (the import-worker script) complete just fine; they update
a debug file on the server with correct results. From there it's just
a few simple steps back to curl_exec(), which I've triple-checked to
be followed correctly, with more print-to-separate-debug-file
statements.

- The call to curl_exec() launched by php_daemon_script never
completes. I've added a print-to-debug-file statement right after
curl_exec() there, and it does NOT update that debug file. Previous
experience shows that this is true for curl_setopt
(CURLOPT_RETURNTRANSFER, 1) as well as for
ob_start(); curl_exec(blah); file_put_contents("/path/to/debug-file-N.txt", ob_get_clean()).

- The php_daemon_script continues to run in my debug window (I now
launch windows.bat manually); it does not crash or end, it freezes. And
from experience I can tell you that it will stay frozen for several
hours, not doing anything anymore.

I've run out of ideas :(

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
For completeness, I'll add that I'm using libcurl 7.21.3, and (again)
that these import routines work without problems for smaller video
files (tested with a ~60 MB video file).

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
Also: the video file itself is converted correctly too, and inserted
into my db correctly.
I can even view the converted FLV, and it has the correct length and everything!

It's just that the frigging import won't continue with the rest of the
files in the queue..
It's maddening! ;-)

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
On Tue, Feb 8, 2011 at 3:39 PM, Tolas Anon  wrote:
> also: the vid file itself is converted correctly too, and inserted
> into my db correctly.
> i can even view the converted flv and it has the correct length and 
> everything!
>
> it's just that the frigging import won't continue with the rest of the
> files in the queue..
> it's maddening! ;-)
>

eh, "view the converted flv" _in_ the cms that does the importing, as
an end-user of the cms would.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
The one thing I can think of is that curl_exec() somehow stops
listening for results and hangs the calling PHP script
(php_daemon_script in this case) if it does not receive any data for
more than a few minutes (converting the 60 MB video file takes about a
minute, and the 2 GB file takes well over 30 minutes)..

However, I haven't been able to find any bug reports via Google that
describe this behaviour..

Could it be I found a new bug in libcurl?...
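
If the problem really is something on the path giving up on a connection that
stays silent for half an hour (only a guess at this point), one common
workaround is to have the long-running import script emit a little padding
while ffmpeg runs, so the connection is never idle for minutes. A hypothetical
sketch, not code from this thread; the daemon would trim() the response before
json_decode():

<?php
// Sketch: stream a byte every now and then during the long conversion.
while (ob_get_level() > 0) {
    ob_end_flush();          // drop PHP's output buffers so bytes actually reach the client
}
ob_implicit_flush(true);

$proc = popen('/path/to/ffmpeg -i /path/in.avi /path/out.flv 2>&1', 'r');
while (!feof($proc)) {
    fgets($proc);            // consume ffmpeg's progress output line by line
    echo ' ';                // one space of padding per line read
    flush();
}
pclose($proc);

echo json_encode(array('status' => 'ok')); // the real status payload comes last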

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Paging and permissions

2011-02-08 Thread David Harkness
On Tue, Feb 8, 2011 at 4:36 AM, Arno Kuhl  wrote:

> But it might be an option if done once at the start of each
> search or list request, and then use that temporary modified result set for
> subsequent requests on the same set.


Instead of serializing the articles, you only need their IDs. Using

$sql .= ' where id in (' . implode(',', $ids) . ')';

you can load the data for a page of results in a single query. Storing the
IDs is much cheaper than the articles.

If the permissions are fairly static (i.e. access for user X to article Y
doesn't change every two minutes) you could create a calculated permission
table as a many-to-many between user and article. Here's the logic flow for
a query:

1. Run the query to find matching article IDs
2. Load permissions from table for all IDs
3. For each article without a calculated permission, calculate it and insert
a row (do a batch insert to save time)

If you flag the query in the middle tier as having been processed as above,
you can join to the calculated permissions each time you need another page.
The downside is that the code that runs the queries has to operate in two
modes: raw and joined to the permissions. If most users end up querying for
all articles, the table could grow. Plus you need to purge rows any time the
permissions for an article/user changes which could get fairly complicated.
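
A rough sketch of that flow (table names, columns and the calculate_permission()
helper are invented for illustration; $db is assumed to be a PDO connection):

<?php
// 1. Run the search and collect only the matching article IDs.
$userId = 42; // hypothetical current user
$ids = $db->query("SELECT id FROM articles WHERE title LIKE '%mysql%'")
          ->fetchAll(PDO::FETCH_COLUMN);
if (empty($ids)) {
    exit("no matching articles\n");
}

// 2. Load the already-calculated permissions for those IDs.
$in   = implode(',', array_map('intval', $ids));
$stmt = $db->prepare("SELECT article_id, allowed FROM calc_permission
                      WHERE user_id = ? AND article_id IN ($in)");
$stmt->execute(array($userId));
$known = $stmt->fetchAll(PDO::FETCH_KEY_PAIR); // article_id => allowed

// 3. Calculate the missing ones and batch-insert them.
$values = array();
foreach (array_diff($ids, array_keys($known)) as $id) {
    $allowed    = calculate_permission($userId, $id); // placeholder for the existing rule engine
    $known[$id] = $allowed;
    $values[]   = sprintf('(%d, %d, %d)', $userId, $id, $allowed ? 1 : 0);
}
if ($values) {
    $db->exec('INSERT INTO calc_permission (user_id, article_id, allowed) VALUES '
              . implode(',', $values));
}

// Subsequent pages can then join against calc_permission, so COUNT(*) stays accurate.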

On Tue, Feb 8, 2011 at 5:17 AM, Tolas Anon  wrote:

> And it's bad form to display articles in search results that aren't allowed
> to be viewed.


On Tue, Feb 8, 2011 at 4:36 AM, Arno Kuhl  wrote:

> Now there's a requirement to not display the article title in category
> lists
> and search results if it cannot be viewed.


:)

David


Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Mujtaba Arshad
nice quintuple posting.

On Tue, Feb 8, 2011 at 9:46 AM, Tolas Anon  wrote:

> the one thing i can think of is that curl_exec() somehow stops
> listening for results and hangs the calling php script
> (php_daemon_script in this case) if it does not receive any data for
> more than a few minutes (converting the 60mb vid file takes about a
> minute, and the 2gb script well over 30 minutes)..
>
> however i haven't been able to find any bugreports via google that
> describe this bug..
>
> could it be i found a new bug in libcurl?...
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>


-- 
Mujtaba


Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
Things I've checked in the meanwhile:

curl_setopt($ch, CURLOPT_TIMEOUT, 0); and
curl_setopt($ch, CURLOPT_TIMEOUT, 9); give the same freezing
results as before.

I've used a packet sniffer to analyze the data flow on my port 80, with
all other HTTP apps, including the browser, not in memory/running, so I could
get a clear view.
It turns out I do get the traffic my php_daemon_script relies on and
should get from curl_exec() when the 2 GB video file has finished
converting & importing, flowing from a 192.xyz.xyz.xyz address (which
is my Apache) to an 82.xyz.xyz.xyz address (which also resolves to my
Apache).

I currently run the entire site, upload and import on the same machine
and instance of Apache.
The import was run from that windows.bat again, which I had
running in a normal DOS window.
I made sure the sniffer caught both the initialization of the
offending 2 GB convert call and its termination, which I tracked by
watching the server filesystem (the size of the converted video, and
the fact that it got moved into its final directory for serving to the
end-user).

This Apache installation sits behind an ADSL modem that does
outside-to-inside port 80 forwarding to my Windows Apache
installation, and I use the outside-world domain name linked to my
ADSL IP for the site, upload and import.

However, this "wanted" traffic is reported by Wireshark (the sniffer)
as having an invalid 0x header checksum, which should be
"something else" (a hex value, of course).
The packet flow is like this:
192... -> 82... : HTTP/1.1 200 OK (text/html) {wanted data}
192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
192... -> 82... : HTTP > portno [FIN,ACK] seq=... ack=... win=. len=0
192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
192... -> 82... : HTTP > portno [RST,ACK] seq=... ack=... win=0 len=0
After this the http capturing goes silent again.

The wanted packet does not show up in the debug-info-to-file call made
by the php_daemon_script just after it does
$result=curl_exec(valid-settings);

I don't use sniffers often, so I have to ask:

Is this enough evidence to report it as a libcurl bug?
Or do I have to suspect php-cli and cmd.exe as well?
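
One more data point worth collecting from the PHP side before filing anything
(a suggestion, not something done in this thread) is libcurl's own verbose log
for the hanging transfer; the log path and URL below are placeholders:

<?php
// Sketch: capture libcurl's own view of the hanging transfer in a log file.
$log = fopen('/path/to/curl-verbose.txt', 'a');

$ch = curl_init('http://example.com/site/cms/php/import_script.php?action=batchNext');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_VERBOSE, 1);   // log request/response headers and connection events
curl_setopt($ch, CURLOPT_STDERR, $log); // send that log to the file instead of the console

$result = curl_exec($ch);
fwrite($log, 'errno=' . curl_errno($ch) . ' error=' . curl_error($ch) . "\n");
curl_close($ch);
fclose($log);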



P.S.;
Entire WAMP installed by WampServer2.1d-x64.exe
PHP Version 5.3.4

System  Windows NT NOOT 6.1 build 7600 (Unknow Windows version Home
Premium Edition) AMD64
Build Date  Dec 15 2010 23:40:06
CompilerMSVC9 (Visual C++ 2008)
Architecturex64

libcurl-7.21.3.0 added manually

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
On Tue, Feb 8, 2011 at 7:01 PM, Tolas Anon  wrote:
> Things I've checked in the meanwhile;
>
> curl_setopt($ch,CURLOPT_TIMEOUT, 0); and
> curl_setopt($ch,CURLOPT_TIMEOUT, 9); have the same freezing
> results as before
>
> I've used a packetsniffer to analyze the dataflow on my port 80, with
> all other http apps incl the browser not in memory/running, so i could
> get a clear view.
> Turns out i do get the traffic my php_daemon_script relies on and
> should get from curl_exec() when the 2gb video file has finished
> converting & importing, flowing from an 192.xyz.xyz.xyz address (which
> is my apache) to a 82.xyz.xyz.xyz address (also resolves to my
> apache).
>
> I currently run the entire site, upload and import on the same machine
> and instance of apache.
> And the import was run from that windows.bat again, which i had
> running as a normal dos window.
> I made sure the sniffer caught both the initialization of the
> offending 2gb-convert call, and it's termination, which i monitored by
> monitoring the server filesystem (the size of the converted vid, and
> the fact it got moved into it's final directory for serving to the
> end-user).
>
> This apache installation is on an adsl modem that does
> outside-to-inside port 80 forwarding to my windows apache
> installation, and i use the outside-world domain name linked to my
> adsl IP for the site and upload and import.
>
> However, this "wanted" traffic is reported by wireshark (the sniffer)
> as having an invalid 0x header checksum, which should be
> "something else" (a hex value of course).
> The packet flow is like this:
> 192... -> 82... : HTTP/1.1 200 OK (text/html) {wanted data}
> 192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
> 192... -> 82... : HTTP > portno [FIN,ACK] seq=... ack=... win=. len=0
> 192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
> 192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
> 192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
> 192... -> 82... : RETRANSMISSION HTTP/1.1 200 OK (text/html) {wanted data}
> 192... -> 82... : HTTP > portno [RST,ACK] seq=... ack=... win=0 len=0
> After this the http capturing goes silent again.
>
> The wanted packet does not show up in the debug-info-to-file call made
> by the php_daemon_script just after it does
> $result=curl_exec(valid-settings);
>
> I don't use sniffers often, so i have to ask;
>
> Is this enough evidence to report it as a libcurl bug?
> Or do i have to suspect php-cli and cmd.exe as well?
>
>
>
> P.S.;
> Entire WAMP installed by WampServer2.1d-x64.exe
> PHP Version 5.3.4
>
> System  Windows NT NOOT 6.1 build 7600 (Unknow Windows version Home
> Premium Edition) AMD64
> Build Date      Dec 15 2010 23:40:06
> Compiler        MSVC9 (Visual C++ 2008)
> Architecture    x64
>
> libcurl-7.21.3.0 added manually
>

I've been thinking about that reported bad checksum. libcurl may be
right to reject those packets.
And I guess that Apache determines that 0x checksum.
That is, _if_ I can trust the accuracy of that Wireshark app.

A few too many variables at work here for my liking, but this bug just
has to get fixed.

I guess I'll uninstall Wireshark, install a different sniffer app and
re-do the whole thing; I'll let you know the results when they are in.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
ok, did another run with a different packet analyzer, one that does

not have winPcap under the hood like WireShark does. I chose

microsoft network monitor 3.4 for the task.

The launch of the 2 GB video import (only the part done by curl_exec):

Format:
packetnumber [TAB] time_date_local_adjusted [TAB] time_offset [TAB] processname [TAB] source [TAB] destination [TAB] protocolname [TAB] description [TAB] conv_id [TAB] tolas_comment

134  19:50:08 8-2-2011  31.2774633  Unavailable  WindowsNameOfApacheServer  InternetDomainOfApacheServer  HTTP  HTTP:Request, POST /site/cms/php/php_(import_)script.php, Query:PHPSESSID=o365inhor0ln8mqdr0p8vb3rm7&action=batchNext  {HTTP:40, TCP:38, IPv4:8}

135  19:50:08 8-2-2011  31.2777843  Unavailable  InternetDomainOfApacheServer  WindowsNameOfApacheServer  TCP  TCP:Flags=...A, SrcPort=3573, DstPort=HTTP(80), PayloadLen=0, Seq=1169302766, Ack=4118332969, Win=260 (scale factor 0x8) = 66560  {TCP:39, IPv4:8}

136  19:50:08 8-2-2011  31.2781472  Unavailable  InternetDomainOfApacheServer  WindowsNameOfApacheServer  HTTP  HTTP:Request, POST /site/cms/php/php_(import_)script.php, Query:PHPSESSID=o365inhor0ln8mqdr0p8vb3rm7&action=batchNext  {HTTP:41, TCP:39, IPv4:8}

138  19:50:08 8-2-2011  31.4807963  Unavailable  WindowsNameOfApacheServer  InternetDomainOfApacheServer  TCP  TCP:Flags=...A, SrcPort=HTTP(80), DstPort=3573, PayloadLen=0, Seq=4118332969, Ack=1169303037, Win=260 (scale factor 0x8) = 66560  {TCP:39, IPv4:8}

139  19:50:08 8-2-2011  31.4859769  Unavailable  InternetDomainOfApacheServer  WindowsNameOfApacheServer  TCP  TCP:Flags=...A, SrcPort=HTTP(80), DstPort=3573, PayloadLen=0, Seq=4118332969, Ack=1169303037, Win=260 (scale factor 0x8) = 66560  {TCP:38, IPv4:8}

140  19:50:08 8-2-2011  31.6234279  adsl-router  WindowsNameOfApacheServer  TCP  TCP:Flags=..S., SrcPort=2456, DstPort=14013, PayloadLen=0, Seq=1335058890, Ack=0, Win=5840 ( Negotiating scale factor 0x2 ) = 5840  {TCP:42, IPv4:4}

141  19:50:11 8-2-2011  34.6104085  adsl-router  WindowsNameOfApacheServer  TCP  TCP:[SynReTransmit #140] Flags=..S., SrcPort=2456, DstPort=14013, PayloadLen=0, Seq=1335058890, Ack=0, Win=5840 ( Negotiating scale factor 0x2 ) = 5840  {TCP:42, IPv4:4}

Packet 137 = adsl-router discovery traffic, not relevant.





Then, after 45 minutes or so, the completion of the conversion and
import, with the wanted data sent back to the curl_exec() running from
php-cli, which was started by windows.bat.

Some other (network discovery) traffic is likely mixed in; I added it
to be on the safe side.

5901  20:32:45 8-2-2011  2588.7346959  Unavailable  WindowsNameOfApacheServer  InternetNameOfApacheServer  HTTP  HTTP:Response, HTTP/1.1, Status: Ok, URL: /site/cms/php/php_(import_)script.php  {HTTP:41, TCP:39, IPv4:8}  {{CONTAINS THE WANTED DATA}}

5902  20:32:45 8-2-2011  2588.7354838  InternetNameOfApacheServer  WindowsNameOfApacheServer  ICMP  ICMP:Destination Unreachable Message, Communication Administratively Prohibited, 82.161.37.94  {IPv4:8}

5903  20:32:48 8-2-2011  2591.6092751  FE80:0:0:0:E579:89FF:369D:668B  FF02:0:0:0:0:0:0:C  SSDP  SSDP:Request, M-SEARCH *  {HTTP:3, UDP:2, IPv6:1}

5904  20:32:48 8-2-2011  2591.7421833  Unavailable  WindowsNameOfApacheServer  InternetNameOfApacheServer  TCP  TCP:[ReTransmit #5901] Flags=...AP..., SrcPort=HTTP(80), DstPort=3573, PayloadLen=558, Seq=4118332969 - 4118333527, Ack=1169303037, Win=260 (scale factor 0x8) = 66560  {TCP:39, IPv4:8}

5905  20:32:48 8-2-2011  2591.7452755  InternetNameOfApacheServer  WindowsNameOfApacheServer  ICMP  ICMP:Destination Unreachable Message, Communication Administratively Prohibited, 82.161.37.94  {IPv4:8}

5906  20:32:49 8-2-2011  2592.4183261  192.168.178.1  WindowsNameOfApacheServer  TCP  TCP:Flags=..S., SrcPort=2119, DstPort=14013, PayloadLen=0, Seq=4038649047, Ack=0, Win=5840 ( Negotiating scale factor 0x2 ) = 5840  {TCP:1001, IPv4:4}

5907  20:32:51 8-2-2011  2594.2343496  Unavailable  WindowsNameOfApacheServer  InternetNameOfApacheServer  TCP  TCP:Flags=...A...F, SrcPort=HTTP(80), DstPort=3573, PayloadLen=0, Seq=4118333527, Ack=1169303037, Win=260 (scale factor 0x8) = 66560  {TCP:39, IPv4:8}

5908  20:32:51 8-2-2011  2594.2349872  InternetNameOfApacheServer  WindowsNameOfApacheServer  ICMP  ICMP:Destination Unreachable Message, Communication Administratively Prohibited, 82.161.37.94  {IPv4:8}

5909  20:32:52 8-2-2011  2595.4057840  192.168.178.1  WindowsNameOfApacheServer  TCP

Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
On Tue, Feb 8, 2011 at 8:54 PM, Tolas Anon  wrote:
> But in the meanwhile I found a new idea to try as well;
>
>        curl_setopt($ch, CURLOPT_HTTPHEADER, array(
>                        'Connection: Keep-Alive',
>                        'Keep-Alive: 300'
>        ));
>
> I already checked via phpinfo() that keep-alive is on in
> apache2handler, and no other mentions of "keepalive" or "keep alive"
> in the phpinfo() output.
>
> I'll post the results.
>

ehm

http://www.io.com/~maus/HttpKeepAlive.html :

HTTP/1.0

Under HTTP 1.0, there is no official specification for how keepalive
operates. It was, in essence, tacked on to an existing protocol. If
the browser supports keep-alive, it adds an additional header to the
request:
Connection: Keep-Alive

Then, when the server receives this request and generates a response,
it also adds a header to the response:
Connection: Keep-Alive

Following this, the connection is NOT dropped, but is instead kept
open. When the client sends another request, it uses the same
connection. This will continue until either the client or the server
decides that the conversation is over, and one of them drops the
connection.

-
HTTP/1.1

Under HTTP 1.1, the official keepalive method is different. All
connections are kept alive, unless stated otherwise with the following
header:
Connection: close

The Connection: Keep-Alive header no longer has any meaning because of this.
Additionally, an optional Keep-Alive: header is described, but is so
underspecified as to be meaningless. Avoid it.

-

And of course, my return data packet 5901 uses HTTP/1.1, so the test
I'm running now probably won't fix things.. :(((
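
If I really want to take keep-alive out of the equation, I suppose I
could also force the request down to HTTP/1.0 and explicitly ask for the
connection to be closed. Roughly this (untested sketch, same $ch handle
as in my earlier snippets):

    // force HTTP/1.0 so the implicit HTTP/1.1 keep-alive rules don't apply,
    // and ask the server to close the connection after the response
    curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_0);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Connection: close'));

No idea yet whether Apache will honour that for a response that takes 45
minutes to produce, but it would at least rule one variable out.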

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Steve Staples
On Tue, 2011-02-08 at 21:15 +0100, Tolas Anon wrote:
> On Tue, Feb 8, 2011 at 8:54 PM, Tolas Anon  wrote:
> > But in the meanwhile I found a new idea to try as well;
> >
> >        curl_setopt($ch, CURLOPT_HTTPHEADER, array(
> >                'Connection: Keep-Alive',
> >                'Keep-Alive: 300'
> >        ));
> >
> > I already checked via phpinfo() that keep-alive is on in
> > apache2handler, and no other mentions of "keepalive" or "keep alive"
> > in the phpinfo() output.
> >
> > I'll post the results.
> >
> 
> ehm
> 
> http://www.io.com/~maus/HttpKeepAlive.html :
> 
> HTTP/1.0
> 
> Under HTTP 1.0, there is no official specification for how keepalive
> operates. It was, in essence, tacked on to an existing protocol. If
> the browser supports keep-alive, it adds an additional header to the
> request:
> Connection: Keep-Alive
> 
> Then, when the server receives this request and generates a response,
> it also adds a header to the response:
> Connection: Keep-Alive
> 
> Following this, the connection is NOT dropped, but is instead kept
> open. When the client sends another request, it uses the same
> connection. This will continue until either the client or the server
> decides that the conversation is over, and one of them drops the
> connection.
> 
> -
> HTTP/1.1
> 
> Under HTTP 1.1, the official keepalive method is different. All
> connections are kept alive, unless stated otherwise with the following
> header:
> Connection: close
> 
> The Connection: Keep-Alive header no longer has any meaning because of this.
> Additionally, an optional Keep-Alive: header is described, but is so
> underspecified as to be meaningless. Avoid it.
> 
> -
> 
> And of course, my return data packet 5901 uses HTTP1.1, so the test
> i'm running now probably won't fix things.. :(((
> 


I've been sorta reading this (as I am sure most maybe stopped after the
4th consecutive post)... but what I am wondering is...

why can't you just write the output of what you're doing to a file, or
the db, and then query it along the way or when you need/want some
information on it?  Maybe I just haven't quite figured out, or got the
gist of, what you are trying to accomplish...
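
Something like this is what I have in mind (a very rough sketch; the
file name, URL, and $results variable are made up):

    // in the long-running import script: write progress/results out as you go
    file_put_contents('import-status.json', json_encode(array('done' => false)));
    // ... do the actual conversion/import work ...
    file_put_contents('import-status.json', json_encode(array(
        'done'   => true,
        'result' => $results,   // whatever you wanted curl_exec() to hand back
    )));

    // in the caller: kick the job off however you do now, then poll with short
    // requests instead of holding one connection open for an hour
    do {
        sleep(30);
        $status = json_decode(
            file_get_contents('http://import-server/site/cms/php/import-status.json'),
            true
        );
    } while (empty($status['done']));
    // $status['result'] now holds the data you were waiting for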

It also seems to me that this really isn't a PHP-specific issue, so all
the posts you're making don't really pertain to the PHP mailing list.
So (and sorry to say this) maybe stop posting all the incremental
updates, and when there is a major breakthrough, or someone has an idea
on how to help solve your issue, update us.

Steve.


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
On Tue, Feb 8, 2011 at 9:33 PM, Steve Staples  wrote:
> i've been sorta reading this (as I am sure most maybe stopped after the
> 4th consecutive post)...

Yea, I feel I gotta be complete so as to not waste the time of people
who do wanna help.

> but what I am wondering is...
>
> why can't you just write the output of the what you're doing to a file,
> or the db, and then query along the way or when you need/want some
> information on it??   Maybe i just haven't quite figured out, or got the
> gist of what you are trying to accomplish...

That's exactly what I'm doing.
I need curl_exec() to absorb the response to requests that take over
1 hr to complete, because I want to be able to run different parts of
the media import process on different servers.
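
For reference, the kind of client-side settings I'm talking about look
roughly like this (a sketch, not a paste of the real php-cli script; the
URL and values are placeholders):

    // php-cli side: make sure nothing on the client gives up early
    set_time_limit(0);                         // no execution time limit for the CLI script
    ini_set('default_socket_timeout', 7200);   // be generous with socket waits

    $ch = curl_init('http://import-server/site/cms/php/php_(import_)script.php');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // curl_exec() should return the body
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);   // but don't hang on the initial connect
    curl_setopt($ch, CURLOPT_TIMEOUT, 0);           // 0 = no overall transfer timeout
    $result = curl_exec($ch);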

> it also seems to me, that this really isn't a PHP specific issue, so all
> the posts that you're doing, really doesn't pertain to the PHP mailing
> list, so (and sorry to say this) maybe stop posting all the incremental
> updates you're doing, and when there is a major break through, or
> someone has an idea on how to help solve your issue, update us.
>
> Steve.

I wish to have a complete log of this bughunt somewhere online; one of
the many websites that archive the PHP mailing list content will do nicely.

I suppose I could've chosen to subscribe to the libcurl mailing list,
but this seemed a good place because I expected to find many people
here who use libcurl in different ways.
And at the start it could've been php-cli, or Apache, or libcurl; I
just didn't know.

Even though libcurl atm looks like the more likely suspect, I'll continue
this log here.

If you don't like it, don't read it.
With a decent mail reader it is shoved under one header anyway.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Secure monetary transactions

2011-02-08 Thread Donovan Brooke

Paul M Foster wrote:
[snip]

In essence, my customer is not responsible for any confidential/secure
information, which is all handled by the merchant gateway.

For whatever unknown reason, my customer has been convinced they should
go with a different merchant service company. However, this company
doesn't have the same kind of secure payment pages. (Yes, they're
legitimate, but they're simply a payment processor. They don't have the
additional site to accept manual input of payment information and such.)
I've explained to my customer that, in doing this, he will need:

[snip]

I've done quite a few of these... all of which could be questionable as
to PCI compliance... but first, why would you require an ecommerce app?
Most gateways come with an SDK with examples that you can start from.


For PCI compliance, go through the steps at the link Gary posted and see
where (if anywhere) issues arise.

Very basically: never store the credit card number in the clear, always
encrypt it, and I don't see a reason why this could not be done securely
as long as your shared environment is secured.
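
To be concrete about "encrypt it always", something in this spirit is
usually enough to start from (a bare-bones sketch only; key handling,
cipher choice, and adding an integrity check are up to you, and
$cardNumber is just a placeholder):

    // encrypt a sensitive value before it ever touches the db
    $key = 'a-real-key-kept-outside-the-db-and-web-root';
    $iv  = openssl_random_pseudo_bytes(16);                  // AES block size
    $enc = openssl_encrypt($cardNumber, 'aes-256-cbc', $key, 0, $iv);
    $stored = base64_encode($iv) . ':' . $enc;               // keep the IV with the ciphertext

    // ... and decrypt only at the moment you actually need the value
    list($iv64, $enc) = explode(':', $stored, 2);
    $plain = openssl_decrypt($enc, 'aes-256-cbc', $key, 0, base64_decode($iv64));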

If your shared environment is not secure and you require PCI compliance,
tell them they need to go to a VPS or something... about the same pricing.

Donovan



--
D Brooke

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread David Hutto
On Tue, Feb 8, 2011 at 3:47 PM, Tolas Anon  wrote:
> On Tue, Feb 8, 2011 at 9:33 PM, Steve Staples  wrote:
>> i've been sorta reading this (as I am sure most maybe stopped after the
>> 4th consecutive post)...
>
> yea, i feel i gotta be complete so as to not waste the time of people
> who do wanna help.

Personally, I feel that if you want to have a conversation that
pertains to the usefulness of the language, and thereby promote its
'consumer' benefits and branding, it's a necessity to relate it to
the language and solve the OP's problem programmatically, utilizing the
language itself.

>
>> but what I am wondering is...
>>
>> why can't you just write the output of the what you're doing to a file,
>> or the db, and then query along the way or when you need/want some
>> information on it??   Maybe i just haven't quite figured out, or got the
>> gist of what you are trying to accomplish...
>
> That's exactly what i'm doing.
> I need curl_exec() to absorb the response to requests that take over
> 1hr to complete, because i want to be able to run different parts of
> the media import process on different servers.
>
>> it also seems to me, that this really isn't a PHP specific issue, so all
>> the posts that you're doing, really doesn't pertain to the PHP mailing
>> list, so (and sorry to say this) maybe stop posting all the incremental
>> updates you're doing, and when there is a major break through, or
>> someone has an idea on how to help solve your issue, update us.
>>
>> Steve.
>
> I wish to have a complete log of this bughunt somewhere online, one of
> the many websites with the php mailing list content will do nicely.
>
> I suppose i could've chosen to subscribe to the libcurl mailinglist,
> but this seemed a good place because i thought to find many people
> that use libcurl in different ways here.
> And at the start it could've been the php-cli, or the apache, or the
> lib-curl, i just didn't know.
>
> Even though lib-curl atm looks the more likely suspect, i'll continue
> this log here.
>
> If you don't like it, don't read it.
> With a decent mail reader it is shoved under 1 header anyways.
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>



-- 
According to theoretical physics, the division of spatial intervals as
the universe evolves gives rise to the fact that in another timeline,
your interdimensional counterpart received helpful advice from me...so
be eternally pleased for them.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP] First PHP site - thanks - euca_phpmysql function library

2011-02-08 Thread Donovan Brooke

Hello,

Just wanted to say thanks to those that helped me get through my first 
PHP project (over the last month).


As with much of the work we server-side language people do, the
back-end (non-public) side of this site is perhaps the more interesting part.


However, here is the link to the site:

http://www.impactseven.org/

They have full control over the content in the admin pages, and much
of this content will soon change as I simply copy/pasted some of their 
old site's content to the database fields.


btw, I7 is a great source for working capital if you are in need,
and if you are in Wisconsin, USA. ;-)


Also, for good karma ;-), here is a link to a small function library 
containing just a few (mostly MySQL) functions that I created for this site:


http://www.euca.us/downloads/euca_phpmysql.zip (4KB)

(if used, please keep the 'www.euca.us' credit in place)

It has 4 functions:

dbconnect
global_id
list_formvars
list_vars

You can read all about them in the file, but here is the basic rundown.

dbconnect - basic connection/error reporting for MySQL
global_id - If you've ever run into data relations changing between
related tables, you may want to look into this one. ;-)
list_formvars - list all request vars (for testing) with the option to
display only certain matched vars.
list_vars - list all set vars (for testing) with option to display only
certain matched vars.

The latter two I usually post either at the end of the page, or at the
end of the page within  for testing/development purposes.
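
To give a rough idea of the flavor, list_formvars boils down to
something like this (a from-memory sketch, not the exact code that's in
the zip):

    // dump request vars for debugging; optionally show only those whose
    // name contains $match (sketch -- the real function differs a bit)
    function list_formvars($match = '')
    {
        echo "<pre>\n";
        foreach ($_REQUEST as $name => $value) {
            if ($match === '' || strpos($name, $match) !== false) {
                echo htmlspecialchars($name) . ' => '
                   . htmlspecialchars(print_r($value, true)) . "\n";
            }
        }
        echo "</pre>\n";
    }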


Lastly, I'm sure I will add to this library as time goes by, but if
you find that you've used it and made changes, drop me the file so I
can learn as well.

Thanks again!,
Donovan



--
D Brooke

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
On Tue, Feb 8, 2011 at 11:01 PM, Tolas Anon  wrote:
> Ok, another run, this time after adding the following to
> php_daemon_script;
>
>        curl_setopt($ch, CURLOPT_HTTPHEADER, array(
>                        'Connection: Keep-Alive',
>                        'Keep-Alive: 300'
>        ));
>
> Exec-sum:
> No change whatsoever in behaviour, curl_exec still freezes after
> processing the 2gb file completely. ~maus was right.
>
> Since i create the "folder" in the cms to hold the media-imported
> files before starting the import, i can view the offending 2gb video
> as an end-user of my cms. but files that were behind the 2gb video
> in the queue never get imported because curl_exec (or something
> feeding it) has frozen.
>
> It could be that apache sends the wrong packets, or libcurl rejects
> 'good' packets, or libcurl just froze completely. I might not have
> thought of all possibilities though, that's one of the reasons I
> post these mails.
>
> I've run out of ideas again..
> And since i've been at it since 05:00, i'll get some sleep now..
>
> opening bell of 2gb import, microsoft network monitor 3.4 again:
>
>
> 89268   21:48:23 8-2-2011       314.7693867     Unavailable-4920
>
> InternetDomainNameOfApacheServer        WindowsNameOfApacheServer
>
> HTTP    HTTP:Request, POST /site/cms/php/php_(import_)script.php,
>
> Query:PHPSESSID=bhgnqukbn4v0dbbv25i14ca2o4&action=batchNext
>
> {HTTP:112, TCP:110, IPv4:1} {{ALL GOOD DATA}}
>
>
> And the closing bell (the omissions are there because i let the
>
> sniffer app filter by process id -> ip-address) :
>
> 363     22:31:16 8-2-2011       676.1139552     Unavailable-4920
> WindowsNameOfApacheServer       InternetDomainOfApacheServer    HTTP
> HTTP:Response, HTTP/1.1, Status: Ok, URL:{OMMITTED BECAUSE OF MEMORY
> EXHAUSTION->KILL OF SNIFFER APP}        {HTTP:31, TCP:30, IPv4:29}
> {{CONTAINS THE WANTED DATA}}
>
> 367     22:31:19 8-2-2011       679.1089487     Unavailable-4920
> WindowsNameOfApacheServer       InternetDomainOfApacheServer    TCP
> TCP:[ReTransmit #363]Flags=...AP..., SrcPort=HTTP(80), DstPort=8372,
> PayloadLen=614, Seq=3326586000 - 3326586614, Ack=3291710338, Win=260
> {TCP:30, IPv4:29}
>
> 372     22:31:21 8-2-2011       681.6189851     Unavailable-4920
> WindowsNameOfApacheServer       InternetDomainOfApacheServer    TCP
> TCP:Flags=...A...F, SrcPort=HTTP(80), DstPort=8372, PayloadLen=0,
> Seq=3326586614, Ack=3291710338, Win=260 {TCP:30, IPv4:29}
>
> 376     22:31:25 8-2-2011       685.1089219     Unavailable-4920
> WindowsNameOfApacheServer       InternetDomainOfApacheServer    TCP
> TCP:[ReTransmit #363]Flags=...AP..F, SrcPort=HTTP(80), DstPort=8372,
> PayloadLen=614, Seq=3326586000 - 3326586615, Ack=3291710338, Win=260
> {TCP:30, IPv4:29}
>
> 384     22:31:37 8-2-2011       697.1089168     Unavailable-4920
> WindowsNameOfApacheServer       InternetDomainOfApacheServer    TCP
> TCP:[ReTransmit #363]Flags=...AP..F, SrcPort=HTTP(80), DstPort=8372,
> PayloadLen=614, Seq=3326586000 - 3326586615, Ack=3291710338, Win=260
> {TCP:30, IPv4:29}
>
> 400     22:32:01 8-2-2011       721.1089938     Unavailable-4920
> WindowsNameOfApacheServer       InternetDomainOfApacheServer    TCP
> TCP:[ReTransmit #363]Flags=...AP..F, SrcPort=HTTP(80), DstPort=8372,
> PayloadLen=614, Seq=3326586000 - 3326586615, Ack=3291710338, Win=260
> {TCP:30, IPv4:29}
>
> 431     22:32:49 8-2-2011       769.1090750     Unavailable-4920
> WindowsNameOfApacheServer       InternetDomainOfApacheServer    TCP
> TCP:[ReTransmit #363]Flags=...AP..F, SrcPort=HTTP(80), DstPort=8372,
> PayloadLen=614, Seq=3326586000 - 3326586615, Ack=3291710338, Win=260
> {TCP:30, IPv4:29}
>
> 485     22:33:49 8-2-2011       829.1041446     Unavailable-4920
> WindowsNameOfApacheServer       InternetDomainOfApacheServer    TCP
> TCP:Flags=...A.R.., SrcPort=HTTP(80), DstPort=8372, PayloadLen=0,
> Seq=3326586615, Ack=3291710338, Win=0   {TCP:30, IPv4:29}
>
> 550     22:35:38 8-2-2011       938.7487524     Unavailable-676
> WindowsNameOfApacheServer       InternetDomainOfApacheServer    TCP
> TCP:Flags=...A.R.., SrcPort=8372, DstPort=HTTP(80), PayloadLen=0,
> Seq=3291710338, Ack=3326586000, Win=0   {TCP:59, IPv4:29}
>
> 551     22:35:38 8-2-2011       938.7514250     Unavailable-4920
> InternetDomainOfApacheServer    WindowsNameOfApacheServer       TCP
> TCP:Flags=...A.R.., SrcPort=8372, DstPort=HTTP(80), PayloadLen=0,
> Seq=3291710338, Ack=3326586000, Win=0   {TCP:60, IPv4:29}
>
>
> Please note that packet 550 was apparently from a different cpu
> process.
>

oh, packet number 363 has HTTP/1.1 as the protocol version..

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
Ok, another run, this time after adding the following to
php_daemon_script:

        curl_setopt($ch, CURLOPT_HTTPHEADER, array(
                'Connection: Keep-Alive',
                'Keep-Alive: 300'
        ));

Exec-sum:
No change whatsoever in behaviour; curl_exec still freezes after
processing the 2gb file completely. ~maus was right.

Since I create the "folder" in the cms to hold the media-imported
files before starting the import, I can view the offending 2gb video
as an end-user of my cms. But files that were behind the 2gb video
in the queue never get imported, because curl_exec (or something
feeding it) has frozen.

It could be that Apache sends the wrong packets, or libcurl rejects
'good' packets, or libcurl has just frozen completely. I might not have
thought of all the possibilities though; that's one of the reasons I
post these mails.

I've run out of ideas again..
And since I've been at it since 05:00, I'll get some sleep now..

opening bell of 2gb import, microsoft network monitor 3.4 again:


89268  21:48:23 8-2-2011  314.7693867  Unavailable-4920  InternetDomainNameOfApacheServer  WindowsNameOfApacheServer  HTTP  HTTP:Request, POST /site/cms/php/php_(import_)script.php, Query:PHPSESSID=bhgnqukbn4v0dbbv25i14ca2o4&action=batchNext  {HTTP:112, TCP:110, IPv4:1}  {{ALL GOOD DATA}}


And the closing bell (the omissions are there because I let the
sniffer app filter by process id -> ip-address):

363  22:31:16 8-2-2011  676.1139552  Unavailable-4920  WindowsNameOfApacheServer  InternetDomainOfApacheServer  HTTP  HTTP:Response, HTTP/1.1, Status: Ok, URL:{OMITTED BECAUSE OF MEMORY EXHAUSTION->KILL OF SNIFFER APP}  {HTTP:31, TCP:30, IPv4:29}  {{CONTAINS THE WANTED DATA}}

367  22:31:19 8-2-2011  679.1089487  Unavailable-4920  WindowsNameOfApacheServer  InternetDomainOfApacheServer  TCP  TCP:[ReTransmit #363]Flags=...AP..., SrcPort=HTTP(80), DstPort=8372, PayloadLen=614, Seq=3326586000 - 3326586614, Ack=3291710338, Win=260  {TCP:30, IPv4:29}

372  22:31:21 8-2-2011  681.6189851  Unavailable-4920  WindowsNameOfApacheServer  InternetDomainOfApacheServer  TCP  TCP:Flags=...A...F, SrcPort=HTTP(80), DstPort=8372, PayloadLen=0, Seq=3326586614, Ack=3291710338, Win=260  {TCP:30, IPv4:29}

376  22:31:25 8-2-2011  685.1089219  Unavailable-4920  WindowsNameOfApacheServer  InternetDomainOfApacheServer  TCP  TCP:[ReTransmit #363]Flags=...AP..F, SrcPort=HTTP(80), DstPort=8372, PayloadLen=614, Seq=3326586000 - 3326586615, Ack=3291710338, Win=260  {TCP:30, IPv4:29}

384  22:31:37 8-2-2011  697.1089168  Unavailable-4920  WindowsNameOfApacheServer  InternetDomainOfApacheServer  TCP  TCP:[ReTransmit #363]Flags=...AP..F, SrcPort=HTTP(80), DstPort=8372, PayloadLen=614, Seq=3326586000 - 3326586615, Ack=3291710338, Win=260  {TCP:30, IPv4:29}

400  22:32:01 8-2-2011  721.1089938  Unavailable-4920  WindowsNameOfApacheServer  InternetDomainOfApacheServer  TCP  TCP:[ReTransmit #363]Flags=...AP..F, SrcPort=HTTP(80), DstPort=8372, PayloadLen=614, Seq=3326586000 - 3326586615, Ack=3291710338, Win=260  {TCP:30, IPv4:29}

431  22:32:49 8-2-2011  769.1090750  Unavailable-4920  WindowsNameOfApacheServer  InternetDomainOfApacheServer  TCP  TCP:[ReTransmit #363]Flags=...AP..F, SrcPort=HTTP(80), DstPort=8372, PayloadLen=614, Seq=3326586000 - 3326586615, Ack=3291710338, Win=260  {TCP:30, IPv4:29}

485  22:33:49 8-2-2011  829.1041446  Unavailable-4920  WindowsNameOfApacheServer  InternetDomainOfApacheServer  TCP  TCP:Flags=...A.R.., SrcPort=HTTP(80), DstPort=8372, PayloadLen=0, Seq=3326586615, Ack=3291710338, Win=0  {TCP:30, IPv4:29}

550  22:35:38 8-2-2011  938.7487524  Unavailable-676  WindowsNameOfApacheServer  InternetDomainOfApacheServer  TCP  TCP:Flags=...A.R.., SrcPort=8372, DstPort=HTTP(80), PayloadLen=0, Seq=3291710338, Ack=3326586000, Win=0  {TCP:59, IPv4:29}

551  22:35:38 8-2-2011  938.7514250  Unavailable-4920  InternetDomainOfApacheServer  WindowsNameOfApacheServer  TCP  TCP:Flags=...A.R.., SrcPort=8372, DstPort=HTTP(80), PayloadLen=0, Seq=3291710338, Ack=3326586000, Win=0  {TCP:60, IPv4:29}


Please note that packet 550 was apparently from a different cpu
process.
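
One more thing I can still try next: instead of letting curl_exec() sit
there forever when the final response stalls, tell libcurl to give up
once nothing has arrived for a long while, so I at least get an error
back to log. Roughly (untested sketch; the time has to be longer than
the ~45 minutes the server needs before it starts sending anything):

    // abort if less than 1 byte/sec comes in for 90 minutes straight,
    // so curl_exec() returns false plus an error instead of hanging forever
    curl_setopt($ch, CURLOPT_LOW_SPEED_LIMIT, 1);
    curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 5400);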

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread Tolas Anon
The one thing that strikes me as odd, before I go, is that I saw no
actual HTTP keep-alive traffic flowing... I might have missed it, but I
don't think so..
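
Next run I'll also have libcurl keep a copy of the exact request headers
it sent, so I can check whether the Keep-Alive headers actually went out
on the wire (this only reports back on runs where curl_exec() returns,
like the 60mb file). Sketch:

    // set before curl_exec(); afterwards curl_getinfo() can hand back
    // the request headers exactly as they were sent
    curl_setopt($ch, CURLINFO_HEADER_OUT, true);

    $result = curl_exec($ch);

    file_put_contents('/some/debug-headers.txt',
        curl_getinfo($ch, CURLINFO_HEADER_OUT));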

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] curl_exec won't return (any data)

2011-02-08 Thread David Hutto
On Tue, Feb 8, 2011 at 5:05 PM, Tolas Anon  wrote:
> The one thing that strikes me as odd, before i go, is that i saw no
> actual HTTP-KEEPALIVE traffic flowing... I might have missed it, but i
> don't think so..
>

Welcome to programming: on/off, true/false.
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>



-- 
According to theoretical physics, the division of spatial intervals as
the universe evolves gives rise to the fact that in another timeline,
your interdimensional counterpart received helpful advice from me...so
be eternally pleased for them.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php