Thanks Paul and Renee!
Paul: I'm not up to par on my SQL joins; I'll have to get another book this
weekend. I knew that joins would be almost necessary for what I'm trying to
accomplish. Iterative development, while painstaking, is the hands-on
approach I'm used to, and it has proven effective.
OK: I just recompiled, under Ubuntu 8.04, the same PHP version that the
distribution provides, used the same php.ini, and it works... I didn't
apply the same patches or even set the same compilation options, but to
me it's "clear" that this is a distribution problem.
I still do not know what exa
On 28.01.2010 03:40, Paul M Foster wrote:
> On Wed, Jan 27, 2010 at 04:55:46PM -0600, Skip Evans wrote:
>
>> Hey all,
>>
>> I'm looking for recommendations on how to replace accented
>> characters, like e and u with those two little dots above
>> them, with the regular e and u characters.
>
> FWI
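The question quoted above (replacing accented characters such as é and ü with plain e and u) is commonly handled either with iconv transliteration or with an explicit lookup table. A minimal sketch, assuming UTF-8 input; the sample strings and the mapping table are invented:

```php
<?php
// Option 1: iconv transliteration. The output depends on the iconv
// implementation and the current locale, so set it explicitly.
setlocale(LC_CTYPE, 'en_US.UTF-8');
$text  = "über café naïve";
$ascii = iconv('UTF-8', 'ASCII//TRANSLIT//IGNORE', $text);

// Option 2: an explicit lookup table -- predictable, but you must list
// every character you care about.
$map   = array('ü' => 'u', 'é' => 'e', 'ï' => 'i', 'à' => 'a');
$plain = strtr($text, $map);

echo $plain, "\n"; // "uber cafe naive"
?>
```

The strtr() route is the safer of the two when the input alphabet is known, since iconv's transliteration varies between systems.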
On 26 January 2010 17:54, PEPITOVADECURT wrote:
> Does any report generator exist that exports directly to HTML/PHP?
Depending on your platform, you have the option of using an external
report generator and invoking it appropriately from within PHP.
I'm on Windows and use Crystal Reports via PHP.
> -Original Message-
> From: Rene Veerman [mailto:rene7...@gmail.com]
> Sent: 27 January 2010 22:46
>
> And if your script needs to pass large (> 5Mb) arrays around to
> functions, be sure to use passing-by-reference; failing to do so can
> double your memory requirements,
> possibly hitti
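On the pass-by-reference advice quoted above: PHP arrays are copy-on-write, so a by-value pass does not actually copy the array until the callee writes to it. A sketch for measuring this (function names are invented):

```php
<?php
// PHP arrays are copy-on-write: a by-value pass is cheap until the
// callee modifies its copy.

function read_only(array $a) { return count($a); }             // no write -> no copy
function modifies(array $a)  { $a[] = 'x'; return count($a); } // write forces a full copy

$big = range(1, 200000);

$before = memory_get_usage();
read_only($big);
// memory_get_usage() barely moves here: no copy was made.
$delta = memory_get_usage() - $before;

modifies($big);
// The copy made inside modifies() is freed on return, but it shows up
// in the peak figure:
echo $delta, " ", memory_get_peak_usage(), "\n";
?>
```

So the doubling only bites when the callee modifies the array; for read-only access, by-value is as cheap as by-reference.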
It's Rene, not Renee :p
curl is a method of fetching HTTP pages from within PHP (and other languages).
By parsing I meant "parse (process) an HTML page into (in my case) an
array of "hits" found on that page".
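For anyone unfamiliar with the workflow Rene describes, a minimal sketch: fetch a page with curl, then collect "hits" from the markup. The URL and the link-matching pattern are placeholders, not from the original mail:

```php
<?php
// Fetch a page body with curl.
$ch = curl_init('http://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

// Collect every link href as a "hit". A regex is enough for a sketch,
// though a real parser (DOMDocument) is more robust against odd markup.
preg_match_all('/<a[^>]+href="([^"]+)"/i', $html, $m);
$hits = $m[1];
```

DOMDocument plus getElementsByTagName('a') would be the sturdier choice for production parsing.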
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.p
Thanks for your research Mike, I'm a bit puzzled.
I have a custom random-array generator that I use to fill the
available memory to its max, about 1.5GB on a 2GB system, then pass
it to a recursive JSON string generator.
Not passing by reference did double my memory requirements, but I've
change
Ford, Mike wrote:
>> -Original Message-
>> From: Rene Veerman [mailto:rene7...@gmail.com]
>> Sent: 27 January 2010 22:46
>>
>> And if your script needs to pass large (> 5Mb) arrays around to
>> functions, be sure to use passing-by-reference; failing to do so can
>> double your memory requir
At 12:17 PM +0100 1/28/10, Marcus Gnaß wrote:
On 28.01.2010 03:40, Paul M Foster wrote:
On Wed, Jan 27, 2010 at 04:55:46PM -0600, Skip Evans wrote:
Hey all,
I'm looking for recommendations on how to replace accented
characters, like e and u with those two little dots above
them, with the
> -Original Message-
> From: Nathan Rixham [mailto:nrix...@gmail.com]
> Sent: 28 January 2010 13:43
>
> Ford, Mike wrote:
> >> -Original Message-
> >> From: Rene Veerman [mailto:rene7...@gmail.com]
> >> Sent: 27 January 2010 22:46
> >>
> >> And if your script needs to pass large (>
tedd wrote:
At 12:17 PM +0100 1/28/10, Marcus Gnaß wrote:
On 28.01.2010 03:40, Paul M Foster wrote:
On Wed, Jan 27, 2010 at 04:55:46PM -0600, Skip Evans wrote:
Hey all,
I'm looking for recommendations on how to replace accented
characters, like e and u with those two little dots above
t
Hi..
I've built http://mediabeez.ws/htmlMicroscope/ (LGPL), which provides
a way to look at very big arrays in the browser.
I'm kind of stuck at a 120-200MB data-size limit, and have exhausted
all my ideas on how to increase that limit further.
My reasoning is that I have a gigabyte free memory o
Nathan Rixham wrote:
Ford, Mike wrote:
-Original Message-
From: Rene Veerman [mailto:rene7...@gmail.com]
Sent: 27 January 2010 22:46
And if your script needs to pass large (> 5Mb) arrays around to
functions, be sure to use passing-by-reference; failing to do so can
double your memory re
Rene Veerman wrote:
Hi..
I've built http://mediabeez.ws/htmlMicroscope/ (lgpl), which provides
a way to look at very big arrays in the browser.
I'm kinda stuck at a 120 - 200Mb data-size limit, and have exhausted
all my ideas on how to increase that limit further.
My reasoning is that i have a
On Thu, Jan 28, 2010 at 11:41:43AM -, Ford, Mike wrote:
> > -Original Message-
> > From: Rene Veerman [mailto:rene7...@gmail.com]
> > Sent: 27 January 2010 22:46
> >
> > And if your script needs to pass large (> 5Mb) arrays around to
> > functions, be sure to use passing-by-reference;
On Thu, Jan 28, 2010 at 01:31:30AM -0800, Allen McCabe wrote:
>
> SIDE QUESTION: What do you think of my use of serialization? I don't see a
> need to store duplicate information in new tables, and thought serializing
> these one-shot reports was the best solution.
I couldn't really find a good re
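The serialization approach Allen asks about can be sketched as: freeze the one-shot report as a string and store it in a single TEXT column instead of normalized tables. The $report structure is invented for illustration; json_encode() is shown as a portable alternative to PHP's native format:

```php
<?php
// An invented one-shot report structure.
$report = array(
    'generated' => '2010-01-28',
    'rows'      => array(array('id' => 1, 'total' => 42.50)),
);

$frozen = serialize($report);       // PHP-only format
// $frozen = json_encode($report);  // portable alternative, readable from other languages

// ... store $frozen in a TEXT column; later, restore it:
$thawed = unserialize($frozen);
```

One trade-off worth noting: serialized columns cannot be queried or joined against, which is fine for frozen one-shot reports but rules out later aggregation in SQL.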
Paul M Foster wrote:
On Thu, Jan 28, 2010 at 11:41:43AM -, Ford, Mike wrote:
-Original Message-
From: Rene Veerman [mailto:rene7...@gmail.com]
Sent: 27 January 2010 22:46
And if your script needs to pass large (> 5Mb) arrays around to
functions, be sure to use passing-by-reference;
Paul M Foster wrote:
On Thu, Jan 28, 2010 at 01:31:30AM -0800, Allen McCabe wrote:
SIDE QUESTION: What do you think of my use of serialization? I don't see a
need to store duplicate information in new tables, and thought serializing
these one shot reports the best solution.
I couldn't reall
On Thu, Jan 28, 2010 at 4:04 PM, Robert Cummings wrote:
> Use memory_get_usage() to see just how much memory you're using. Somewhere in your
> script you are probably using much more memory than you think.
My functions do print the memory usage;
I'm just wondering why the array returned is 1/5th of t
Oh, I forgot to mention that Firefox takes about a gigabyte of memory
after having stalled at "200MB parsed" in a 330MB document..
And despite using setTimeout(), Firefox frequently freezes (for about
2 to 10 minutes) before updating the decoding-status display again.
I'd really appreciate someo
On Thu, Jan 28, 2010 at 10:49:17AM -0500, Robert Cummings wrote:
> Paul M Foster wrote:
>> On Thu, Jan 28, 2010 at 01:31:30AM -0800, Allen McCabe wrote:
>>
>>
>>
>>> SIDE QUESTION: What do you think of my use of serialization? I don't see a
>>> need to store duplicate information in new tables, a
Rene Veerman wrote:
> Oh, i forgot to mention that firefox takes about a gigabyte of memory
> after having stalled at "200mb parsed" in a 330mb document..
>
> And despite using setTimeout(), firefox frequently freezes (for about
> 2 to 10 minutes), before updating the decoding-status display again
Paul M Foster wrote:
On Thu, Jan 28, 2010 at 10:49:17AM -0500, Robert Cummings wrote:
Paul M Foster wrote:
On Thu, Jan 28, 2010 at 01:31:30AM -0800, Allen McCabe wrote:
SIDE QUESTION: What do you think of my use of serialization? I don't see a
need to store duplicate information in new tab
On 1/28/10 5:03 PM, Rene Veerman wrote:
> Oh, i forgot to mention that firefox takes about a gigabyte of memory
> after having stalled at "200mb parsed" in a 330mb document..
>
> And despite using setTimeout(), firefox frequently freezes (for about
> 2 to 10 minutes), before updating the decodin
On Thu, 2010-01-28 at 17:30 +0100, Jochem Maas wrote:
> On 1/28/10 5:03 PM, Rene Veerman wrote:
> > Oh, i forgot to mention that firefox takes about a gigabyte of memory
> > after having stalled at "200mb parsed" in a 330mb document..
> >
> > And despite using setTimeout(), firefox frequently f
At 200MB/330MB parsing, I have released 200MB of HTML comment nodes,
and should have accumulated only 200MB of JavaScript arrays/objects.
It's _just_ the data; no HTML has been generated yet.
I accept a 5x overhead for turning it into HTML, but wonder why
Firefox
a) stops updating the screen despite
On Thu, Jan 28, 2010 at 5:32 PM, Ashley Sheridan
wrote:
>
> You could page through the data and make it look like it's happening all in
> the browser with a bit of clever ajax
>
Ok, good point.
Maybe JSON transport plus JavaScript parsing just has its limit at just over
100 meg.
Accepting the fac
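Ashley's paging idea can be sketched server-side: return one slice of the data per request and let the browser fetch slices via ajax, so no single response approaches the 100MB+ limit. buildBigArray(), the page size, and the "page" parameter name are all invented:

```php
<?php
// Stand-in for the real data source.
function buildBigArray() {
    return range(1, 100000);
}

$perPage = 1000;
$page    = isset($_GET['page']) ? max(0, (int)$_GET['page']) : 0;

$data  = buildBigArray();
$slice = array_slice($data, $page * $perPage, $perPage);

// Each ajax request gets one page plus enough metadata to fetch the rest.
header('Content-Type: application/json');
echo json_encode(array(
    'page'  => $page,
    'pages' => (int)ceil(count($data) / $perPage),
    'data'  => $slice,
));
```

The browser side then requests page 0, renders it, and fetches the next page on demand, keeping both transfer size and in-browser memory bounded.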
At 9:28 AM -0500 1/28/10, Robert Cummings wrote:
tedd wrote:
At 12:17 PM +0100 1/28/10, Marcus Gnaß wrote:
On 28.01.2010 03:40, Paul M Foster wrote:
On Wed, Jan 27, 2010 at 04:55:46PM -0600, Skip Evans wrote:
Hey all,
I'm looking for recommendations on how to replace accented
characters
On Thu, Jan 28, 2010 at 12:31 AM, wrote:
> On Wed, 27 Jan 2010 10:21:00 -0800, deal...@gmail.com (dealtek) wrote:
>Opening tables, etc, wrongly generally messes the page up completely, but
> forgetting to close them again often has no visible effect at all
> -- until you
> make some in
On Thu, 28 Jan 2010 21:10:42 +0100, rene7...@gmail.com (Rene Veerman) wrote:
>On Thu, Jan 28, 2010 at 12:31 AM, wrote:
>> On Wed, 27 Jan 2010 10:21:00 -0800, deal...@gmail.com (dealtek) wrote:
>>Opening tables, etc, wrongly generally messes the page up completely, but
>> forgetting to close them
On Thu, Jan 28, 2010 at 02:38:52PM -0500, tedd wrote:
> My point was more to the theme that we are an eclectic group of people
> with a wide range of knowledge and skills. Individually we may have
> trouble finding our ass, but together we can find the answer to many
> things.
I just got th
On Fri, Jan 29, 2010 at 08:17:34AM +1100, clanc...@cybec.com.au wrote:
> On Thu, 28 Jan 2010 21:10:42 +0100, rene7...@gmail.com (Rene Veerman) wrote:
>
> >On Thu, Jan 28, 2010 at 12:31 AM, wrote:
> >> On Wed, 27 Jan 2010 10:21:00 -0800, deal...@gmail.com (dealtek) wrote:
> >>Opening tables, etc
Paul M Foster wrote:
On Thu, Jan 28, 2010 at 02:38:52PM -0500, tedd wrote:
My point was more to the theme that we are an eclectic group of people
with a wide range of knowledge and skills. Individually we may have
trouble finding our ass, but together we can find the answer to many
things.
Robert Cummings wrote:
Paul M Foster wrote:
On Thu, Jan 28, 2010 at 02:38:52PM -0500, tedd wrote:
My point was more to the theme that we are an eclectic group of people
with a wide range of knowledge and skills. Individually we may have
trouble finding our ass, but together we can find the
On Thu, 2010-01-28 at 16:23 -0500, Paul M Foster wrote:
> On Fri, Jan 29, 2010 at 08:17:34AM +1100, clanc...@cybec.com.au wrote:
>
> > On Thu, 28 Jan 2010 21:10:42 +0100, rene7...@gmail.com (Rene Veerman) wrote:
> >
> > >On Thu, Jan 28, 2010 at 12:31 AM, wrote:
> > >> On Wed, 27 Jan 2010 10:21
clanc...@cybec.com.au wrote:
On Thu, 28 Jan 2010 21:10:42 +0100, rene7...@gmail.com (Rene Veerman) wrote:
On Thu, Jan 28, 2010 at 12:31 AM, wrote:
On Wed, 27 Jan 2010 10:21:00 -0800, deal...@gmail.com (dealtek) wrote:
Opening tables, etc, wrongly generally messes the page up completely, but
On Thu, Jan 28, 2010 at 10:17 PM, wrote:
> On Thu, 28 Jan 2010 21:10:42 +0100, rene7...@gmail.com (Rene Veerman) wrote:
>
>>On Thu, Jan 28, 2010 at 12:31 AM, wrote:
>>> On Wed, 27 Jan 2010 10:21:00 -0800, deal...@gmail.com (dealtek) wrote:
>>>Opening tables, etc, wrongly generally messes the pa
Hey all -
I need a few million sample contact records - name, company, address, email,
web, phone, fax. ZIP codes and area codes and street addresses should be
correct and properly formatted, but preferably not real people or companies or
email addresses. But they'd work if you did address vali
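A home-grown alternative to the generator sites, sketched in PHP. The name and street pools are invented, and example.com/example.org plus the 555 phone exchange are reserved for exactly this kind of fake data:

```php
<?php
// Generate one plausible-but-fake contact record.
function fakeContact() {
    $first  = array('Alice', 'Bob', 'Carol', 'Dave');
    $last   = array('Smith', 'Jones', 'Brown', 'Davis');
    $street = array('Oak St', 'Main St', 'Elm Ave', 'Park Rd');

    $name = $first[array_rand($first)] . ' ' . $last[array_rand($last)];
    return array(
        'name'    => $name,
        'company' => $last[array_rand($last)] . ' LLC',
        'address' => sprintf('%d %s', mt_rand(1, 9999), $street[array_rand($street)]),
        'zip'     => sprintf('%05d', mt_rand(501, 99950)),
        'phone'   => sprintf('(%03d) 555-%04d', mt_rand(201, 989), mt_rand(0, 9999)),
        'email'   => strtolower(str_replace(' ', '.', $name)) . '@example.com',
        'web'     => 'http://www.example.org/',
    );
}

// A few million records should be streamed to disk, not held in RAM.
$fh = fopen('contacts.csv', 'w');
for ($i = 0; $i < 1000; $i++) {   // raise the count as needed
    fputcsv($fh, fakeContact());
}
fclose($fh);
```

Real, properly formatted ZIP-to-area-code pairing would need a lookup table, which the generator sites mentioned below already bundle.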
Brian Dunning wrote:
> Hey all -
>
> I need a few million sample contact records - name, company, address, email,
> web, phone, fax. ZIP codes and area codes and street addresses should be
> correct and properly formatted, but preferably not real people or companies
> or email addresses. But th
Brian Dunning wrote:
> Hey all -
>
> I need a few million sample contact records - name, company, address, email,
> web, phone, fax. ZIP codes and area codes and street addresses should be
> correct and properly formatted, but preferably not real people or companies
> or email addresses. But th
I remembered a coworker found an online resource that generated sample data
for you. I hit google and I think I found it:
http://www.generatedata.com/
I found it on a list of resources for data generation:
http://www.webresourcesdepot.com/test-sample-data-generators/
I've never used any of
Fakenamegenerator.com is pretty good for these kinds of records, with a lot
of variety, and you can change the order/formatting for them, but they do
limit free orders to 50k records.
http://www.fakenamegenerator.com/order.php
On Thu, Jan 28, 2010 at 8:06 PM, TG wrote:
> I remembered a coworker found an online
Hi php-dev pros,
I have an issue with catching an exception thrown from __autoload on PHP 5.3.1.
The manual states that an exception thrown from __autoload can be caught with
a try..catch statement, the same as in normal flow.
But I can't achieve that, even though I have copied the same sample code from the
manual.
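For reference, a sketch of the pattern under discussion (PHP 5.3+, where an exception thrown during autoloading can be caught by the caller). spl_autoload_register() is used here rather than defining __autoload() directly; the class name is invented:

```php
<?php
// An autoloader that throws when it cannot load the requested class.
spl_autoload_register(function ($name) {
    throw new Exception("Unable to load $name");
});

try {
    $obj = new MissingClass();   // triggers the autoloader, which throws
} catch (Exception $e) {
    echo $e->getMessage(), "\n"; // "Unable to load MissingClass"
}
```

Prior to PHP 5.3.0 such an exception could not be caught and resulted in a fatal error, which is why the PHP version matters in this thread.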
On Fri, 2010-01-29 at 13:02 +0800, Eric Lee wrote:
> Hi php-dev pros,
>
> I have an issue with catching an exception thrown from __autoload on PHP 5.3.1.
>
> The manual states that an exception thrown from __autoload can be caught with
> a try..catch statement, the same as in normal flow.
>
> But I'can ar
On 2010-1-29 13:19, Ashley Sheridan wrote:
> On Fri, 2010-01-29 at 13:02 +0800, Eric Lee wrote:
>
>
>> Hi php-dev pros,
>>
>> I have an issue with catching an exception thrown from __autoload on PHP 5.3.1.
>>
>> The manual states that an exception thrown from __autoload can be caught with
>> a try..catch s
On Fri, Jan 29, 2010 at 1:19 PM, Ashley Sheridan
wrote:
> On Fri, 2010-01-29 at 13:02 +0800, Eric Lee wrote:
>
> Hi php-dev pros,
>
> I have an issue with catching an exception thrown from __autoload on PHP 5.3.1.
>
> The manual states that an exception thrown from __autoload can be caught with
> try..
Hi all, and thanks to Ryan.
I apologize! I had missed the small class_exists call before it.
Thanks.
Regards,
Eric
2010/1/29 Ryan
> On 2010-1-29 13:19, Ashley Sheridan wrote:
> > On Fri, 2010-01-29 at 13:02 +0800, Eric Lee wrote:
> >
> >
> >> Hi php-dev pros,
> >>
> >> I got an issue