, but the
link you gave me is good enough for what I needed.
Again, thank you!
Stut wrote:
> Kenneth Andresen wrote:
>
>> What I would like to know is if there are some lists of all these base
>> domains, or maybe some function already doing what I would like to do?
Hello all,
I am trying to extract base domains and subdomains from URLs, and I
expect something already exists to do this.
I used parse_url($url) to get the host.
My thought is to use $domain_elements = array_reverse(explode('.', $host));
then simply check $domain_elements[0]
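For reference, a minimal sketch of that approach. The two-label heuristic is an assumption for illustration: it breaks on suffixes such as co.uk, which is exactly why a maintained list of "base domains" exists (the Public Suffix List at publicsuffix.org).

<?php
// Naive split of a host into base domain and subdomain.
// Real code should consult the Public Suffix List rather than
// assume the registrable domain is always the last two labels.
function split_domain($url)
{
    $host  = parse_url($url, PHP_URL_HOST);        // e.g. "www.example.com"
    $parts = array_reverse(explode('.', $host));   // ["com", "example", "www"]

    $base = $parts[1] . '.' . $parts[0];           // "example.com"
    $sub  = implode('.', array_reverse(array_slice($parts, 2))); // "www"

    return array('base' => $base, 'sub' => $sub);
}

print_r(split_domain('http://www.example.com/page'));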
Thanks Rob, that solved the problem for me!
Rob Richards wrote:
Kenneth Andresen wrote:
On Dec 22, 2005, at 7:08 PM, Kenneth Andresen wrote:
Hello,
why not simply convert the text to html
mb_convert_encoding($string, 'html', 'utf-8');
Best regards,
Kenneth
jonathan wrote:
I'm inserting some info into a mysql table which has the charset set
to utf-8.
When I do a select via the command-line from mysql, it looks like this:
Clam
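For reference, a minimal sketch of the conversion suggested above; the input string is a hypothetical example. In mbstring, 'html' is an alias for the 'HTML-ENTITIES' encoding, so multibyte characters become entities that display correctly regardless of the page's charset.

<?php
// Convert UTF-8 text to HTML entities ('html' is an mbstring
// alias for 'HTML-ENTITIES'); "é" becomes "&eacute;".
$string = "Clamé";   // hypothetical UTF-8 input
echo mb_convert_encoding($string, 'HTML-ENTITIES', 'UTF-8');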
I am having problems with the following code, where my output is simply
"#text Joe #text Smith #text unknown" when it should read
"firstname Joe lastname Smith address unknown".
What am I doing wrong?
$xmlstring =
"<person><firstname>Joe</firstname><lastname>Smith</lastname><address>unknown</address></person>";
$domdoc = new DomDocument();
$domdoc->loadXML($xmlstring);
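The XML string above is reconstructed from the expected output (its tags were lost in the archive), and the <person> wrapper in particular is a guess. Under that assumption, a minimal sketch of what likely went wrong: the values live in child text nodes, whose nodeName is "#text", so walking one level too deep prints "#text Joe" instead of "firstname Joe". Reading nodeName and nodeValue from the element nodes themselves gives the expected result:

<?php
$xmlstring =
    "<person><firstname>Joe</firstname><lastname>Smith</lastname>" .
    "<address>unknown</address></person>";   // <person> wrapper is a guess

$domdoc = new DomDocument();
$domdoc->loadXML($xmlstring);

foreach ($domdoc->documentElement->childNodes as $node) {
    if ($node->nodeType == XML_ELEMENT_NODE) {   // skip stray text nodes
        echo $node->nodeName, ' ', $node->nodeValue, ' ';
    }
}
// prints: firstname Joe lastname Smith address unknown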
Hello all,
I have been trying to get the Content-Encoding header from curl, but
have yet to manage it.
Using curl from the command line I have no problems, simply using:
curl --compressed page_to_get -o local_page_copy -D dumpheader.txt
The data gets compressed and uncompressed also in PHP, I am just l
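For reference, a minimal sketch (not from the thread) of one way to do the same with PHP's curl extension: ask for a compressed transfer with CURLOPT_ENCODING, keep the raw response headers with CURLOPT_HEADER, and pick the Content-Encoding line out of the header block. The URL is a placeholder.

<?php
$ch = curl_init('http://www.example.com/page_to_get');  // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);    // prepend response headers to the output
curl_setopt($ch, CURLOPT_ENCODING, '');    // '' = offer every encoding curl supports

$response   = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
curl_close($ch);

$headers = substr($response, 0, $headerSize);   // raw headers, pre-decompression
$body    = substr($response, $headerSize);      // body, already inflated by curl

if (preg_match('/^Content-Encoding:\s*(\S+)/mi', $headers, $m)) {
    echo "Content-Encoding: ", $m[1], "\n";
}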