Re: [PHP] PHP cron job optimization

2011-09-12 Thread Igor Escobar
Use PHP threads. Do the job separately, in parts; in other words, you
can't read all of them at once.

You can read a little more about php multithreading here:
http://blog.motane.lu/2009/01/02/multithreading-in-php/

You can also use a non-relational database like mongo or couchdb to keep track
of where you stopped and where you need to resume reading each RSS feed.
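
Something like this rough sketch is what I mean: each cron run only touches a
slice of the feed list and remembers where it stopped. Here a plain file stands
in for mongo/couchdb, and the paths, BATCH_SIZE and fetch_and_store_feed() are
made-up examples, not real project names:

<?php
// Rough sketch: each cron run handles only a slice of the feed list and
// remembers where it stopped, so no single run tries to fetch all 1000 feeds.
// The file paths, BATCH_SIZE and fetch_and_store_feed() are placeholders.

const BATCH_SIZE = 50;
const STATE_FILE = '/var/tmp/feed_offset.txt';

$feeds  = file('/etc/myapp/feeds.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$offset = is_readable(STATE_FILE) ? (int) file_get_contents(STATE_FILE) : 0;

foreach (array_slice($feeds, $offset, BATCH_SIZE) as $url) {
    fetch_and_store_feed($url); // whatever your collector does for one feed
}

// Advance the offset, wrapping around once every feed has been visited.
file_put_contents(STATE_FILE, (string) (($offset + BATCH_SIZE) % max(count($feeds), 1)));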

[]'s

Regards,
Igor Escobar
Software Engineer
+ http://blog.igorescobar.com
+ http://www.igorescobar.com
+ @igorescobar 





On Sat, Sep 10, 2011 at 10:37 PM, Stuart Dallas  wrote:

> On 10 Sep 2011, at 09:35, muad shibani wrote:
>
> > I want to design an application that reads news from RSS sources.
> > I have about 1000 RSS feeds to collect from.
> >
> > I will also use cron jobs every 15 minutes to collect the data.
> > The question is: is there a clever way to collect all those feed items
> > without exhausting the server? Any ideas?
>
> I designed a job queuing system a while back when I had a similar problem.
> You can read about it here: http://stut.net/2009/05/29/php-job-queue/. Set
> that type of system up and add a job for each feed, set to run every 15
> minutes. You can then watch the server and tune the number of concurrent job
> processors so you get the optimum balance between load and speed.
>
> -Stuart
>
> --
> Stuart Dallas
> 3ft9 Ltd
> http://3ft9.com/
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>


Re: [PHP] PHP cron job optimization

2011-09-12 Thread Igor Escobar
Another good point: always set a connection timeout when you're fetching the
RSS data, to avoid having your thread get stuck unnecessarily. Use cURL (it's
much faster than file_get_contents).

Multithreading in PHP with cURL http://devzone.zend.com/article/3341
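
For example, something along these lines (just a sketch; the URLs and the
timeout values are made up):

<?php
// Rough sketch: fetch a batch of feeds in parallel with curl_multi,
// with hard timeouts so one slow feed can't block the whole cron run.
// $urls is a made-up example list.

$urls = array('http://example.com/a.xml', 'http://example.com/b.xml');

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_CONNECTTIMEOUT => 5,   // give up connecting after 5 seconds
        CURLOPT_TIMEOUT        => 15,  // give up on the whole transfer after 15
    ));
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until every one has finished or timed out.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh, 1.0);
} while ($running > 0);

foreach ($handles as $url => $ch) {
    if (curl_errno($ch) === 0) {
        $xml = curl_multi_getcontent($ch);
        // ... parse and store $xml ...
    }
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);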


Regards,
Igor Escobar
Software Engineer
+ http://blog.igorescobar.com
+ http://www.igorescobar.com
+ @igorescobar 





On Mon, Sep 12, 2011 at 10:05 AM, Igor Escobar wrote:

> Use PHP threads. Do the job separately, in parts; in other words, you
> can't read all of them at once.
>
> You can read a little more about php multithreading here:
> http://blog.motane.lu/2009/01/02/multithreading-in-php/
>
> You can also use a non-relational database like mongo or couchdb to keep
> track of where you stopped and where you need to resume reading each RSS feed.
>
> []'s
>
> Regards,
> Igor Escobar
> Software Engineer
> + http://blog.igorescobar.com
> + http://www.igorescobar.com
> + @igorescobar 
>
>
>
>
>
>
> On Sat, Sep 10, 2011 at 10:37 PM, Stuart Dallas  wrote:
>
>> On 10 Sep 2011, at 09:35, muad shibani wrote:
>>
>> > I want to design an application that reads news from RSS sources.
>> > I have about 1000 RSS feeds to collect from.
>> >
>> > I will also use cron jobs every 15 minutes to collect the data.
>> > The question is: is there a clever way to collect all those feed items
>> > without exhausting the server? Any ideas?
>>
>> I designed a job queuing system a while back when I had a similar problem.
>> You can read about it here: http://stut.net/2009/05/29/php-job-queue/.
>> Set that type of system up and add a job for each feed, set to run every 15
>> minutes. You can then watch the server and tune the number of concurrent job
>> processors so you get the optimum balance between load and speed.
>>
>> -Stuart
>>
>> --
>> Stuart Dallas
>> 3ft9 Ltd
>> http://3ft9.com/
>> --
>> PHP General Mailing List (http://www.php.net/)
>> To unsubscribe, visit: http://www.php.net/unsub.php
>>
>>
>


Re: [PHP] PHP cron job optimization

2011-09-12 Thread Eric Butera
On Mon, Sep 12, 2011 at 9:37 AM, Igor Escobar  wrote:
> Another good point: always set a connection timeout when you're fetching the
> RSS data, to avoid having your thread get stuck unnecessarily. Use cURL (it's
> much faster than file_get_contents).
>
> Multithreading in PHP with cURL http://devzone.zend.com/article/3341
>
>
> Regards,
> Igor Escobar
> Software Engineer
> + http://blog.igorescobar.com
> + http://www.igorescobar.com
> + @igorescobar 
>
>
>
>
>
> On Mon, Sep 12, 2011 at 10:05 AM, Igor Escobar wrote:
>
>> Use PHP threads. Do the job separately, in parts; in other words, you
>> can't read all of them at once.
>>
>> You can read a little more about php multithreading here:
>> http://blog.motane.lu/2009/01/02/multithreading-in-php/
>>
>> You can also use a non-relational database like mongo or couchdb to keep
>> track of where you stopped and where you need to resume reading each RSS feed.
>>
>> []'s
>>
>> Regards,
>> Igor Escobar
>> Software Engineer
>> + http://blog.igorescobar.com
>> + http://www.igorescobar.com
>> + @igorescobar 
>>
>>
>>
>>
>>
>>
>> On Sat, Sep 10, 2011 at 10:37 PM, Stuart Dallas  wrote:
>>
>>> On 10 Sep 2011, at 09:35, muad shibani wrote:
>>>
>>> > I want to design an application that reads news from RSS sources.
>>> > I have about 1000 RSS feeds to collect from.
>>> >
>>> > I will also use cron jobs every 15 minutes to collect the data.
>>> > The question is: is there a clever way to collect all those feed items
>>> > without exhausting the server? Any ideas?
>>>
>>> I designed a job queuing system a while back when I had a similar problem.
>>> You can read about it here: http://stut.net/2009/05/29/php-job-queue/.
>>> Set that type of system up and add a job for each feed, set to run every 15
>>> minutes. You can then watch the server and tune the number of concurrent job
>>> processors so you get the optimum balance between load and speed.
>>>
>>> -Stuart
>>>
>>> --
>>> Stuart Dallas
>>> 3ft9 Ltd
>>> http://3ft9.com/
>>> --
>>> PHP General Mailing List (http://www.php.net/)
>>> To unsubscribe, visit: http://www.php.net/unsub.php
>>>
>>>
>>
>

Thread != Multi Process.
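
PHP doesn't have userland threads; the cURL article above is multiplexed I/O
in a single process, and the other common option from cron is forking separate
worker processes. A rough sketch of the fork flavour, assuming the pcntl
extension on the CLI ($feedUrls and process_chunk() are hypothetical):

<?php
// Sketch: parallelism via child *processes* (pcntl_fork), not threads.
// Requires the pcntl extension and the CLI SAPI; $feedUrls and
// process_chunk() are hypothetical placeholders.

$children = array();
foreach (array_chunk($feedUrls, 100) as $chunk) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    }
    if ($pid === 0) {            // child: handle its chunk, then exit
        process_chunk($chunk);
        exit(0);
    }
    $children[] = $pid;          // parent: remember the child, keep forking
}

foreach ($children as $pid) {    // parent waits for every child to finish
    pcntl_waitpid($pid, $status);
}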

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP] Stop PHP execution on client connection closed

2011-09-12 Thread Marco Lanzotti
Hi all, I'm new to the list and I already have a question for you.
I'm running a heavy query on my DB in a PHP script called via AJAX.
Because the client often aborts the AJAX connection to ask for a new query, I
need to stop the running query, otherwise the DB gets overloaded.
When the AJAX connection is aborted, the PHP script doesn't stop until it
sends some output to the client, so I have to wait for the query to finish
before I find out that the client aborted the connection.
How can I abort the query (or the script) when the AJAX connection is aborted?

Thank you,
Marco


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] PHP cron job optimization

2011-09-12 Thread Igor Escobar
@Eric ok ;)


Regards,
Igor Escobar
Software Engineer
+ http://blog.igorescobar.com
+ http://www.igorescobar.com
+ @igorescobar 





On Mon, Sep 12, 2011 at 10:52 AM, Eric Butera  wrote:

> On Mon, Sep 12, 2011 at 9:37 AM, Igor Escobar wrote:
> > Another good point: always set a connection timeout when you're fetching
> > the RSS data, to avoid having your thread get stuck unnecessarily. Use
> > cURL (it's much faster than file_get_contents).
> >
> > Multithreading in PHP with cURL http://devzone.zend.com/article/3341
> >
> >
> > Regards,
> > Igor Escobar
> > Software Engineer
> > + http://blog.igorescobar.com
> > + http://www.igorescobar.com
> > + @igorescobar 
> >
> >
> >
> >
> >
> > On Mon, Sep 12, 2011 at 10:05 AM, Igor Escobar wrote:
> >
> >> Use PHP threads. Do the job separately, in parts; in other words, you
> >> can't read all of them at once.
> >>
> >> You can read a little more about php multithreading here:
> >> http://blog.motane.lu/2009/01/02/multithreading-in-php/
> >>
> >> You can also use a non-relational database like mongo or couchdb to keep
> >> track of where you stopped and where you need to resume reading each RSS
> >> feed.
> >>
> >> []'s
> >>
> >> Regards,
> >> Igor Escobar
> >> Software Engineer
> >> + http://blog.igorescobar.com
> >> + http://www.igorescobar.com
> >> + @igorescobar 
> >>
> >>
> >>
> >>
> >>
> >>
> >> On Sat, Sep 10, 2011 at 10:37 PM, Stuart Dallas wrote:
> >>
> >>> On 10 Sep 2011, at 09:35, muad shibani wrote:
> >>>
> >>> > I want to design an application that reads news from RSS sources.
> >>> > I have about 1000 RSS feeds to collect from.
> >>> >
> >>> > I will also use cron jobs every 15 minutes to collect the data.
> >>> > The question is: is there a clever way to collect all those feed items
> >>> > without exhausting the server? Any ideas?
> >>>
> >>> I designed a job queuing system a while back when I had a similar
> >>> problem. You can read about it here:
> >>> http://stut.net/2009/05/29/php-job-queue/. Set that type of system up
> >>> and add a job for each feed, set to run every 15 minutes. You can then
> >>> watch the server and tune the number of concurrent job processors so
> >>> you get the optimum balance between load and speed.
> >>>
> >>> -Stuart
> >>>
> >>> --
> >>> Stuart Dallas
> >>> 3ft9 Ltd
> >>> http://3ft9.com/
> >>> --
> >>> PHP General Mailing List (http://www.php.net/)
> >>> To unsubscribe, visit: http://www.php.net/unsub.php
> >>>
> >>>
> >>
> >
>
> Thread != Multi Process.
>


[PHP] Re: Stop PHP execution on client connection closed

2011-09-12 Thread Al

See http://us2.php.net/manual/en/function.connection-aborted.php
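
A rough sketch of one way to use it with a long-running MySQL query (needs
mysqli with mysqlnd for MYSQLI_ASYNC; the credentials and the SQL are
placeholders, and the padding bytes may not suit every response format):

<?php
// Sketch: notice a dropped AJAX client while a long query runs, then kill
// the query server-side. Hostname, credentials and the SQL are placeholders.

ignore_user_abort(true);                  // keep running so we can clean up

$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$db->query('SELECT ... /* your heavy query */', MYSQLI_ASYNC);
$queryThread = $db->thread_id;            // connection running the query

do {
    // PHP only notices a dropped client when it tries to send output,
    // so push a byte of padding and flush before checking.
    echo ' ';
    flush();

    if (connection_aborted()) {
        $killer = new mysqli('localhost', 'user', 'pass', 'mydb');
        $killer->query('KILL QUERY ' . (int) $queryThread);
        exit;
    }

    $links = $errors = $reject = array($db);
} while (mysqli_poll($links, $errors, $reject, 1) === 0);   // poll every 1s

$result = $db->reap_async_query();
// ... build and send the real response from $result ...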

On 9/12/2011 10:40 AM, Marco Lanzotti wrote:

Hi all, I'm new to the list and I already have a question for you.
I'm running a heavy query on my DB in a PHP script called via AJAX.
Because the client often aborts the AJAX connection to ask for a new query, I
need to stop the running query, otherwise the DB gets overloaded.
When the AJAX connection is aborted, the PHP script doesn't stop until it
sends some output to the client, so I have to wait for the query to finish
before I find out that the client aborted the connection.
How can I abort the query (or the script) when the AJAX connection is aborted?

Thank you,
Marco



--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php