In CF you can run a SQL statement and store its result in a variable that is not removed from the server's memory after the response is sent to the user. That variable stays available for a period of time that you declare when you create the query.

This is useful. Suppose you want to show the last 20 records added to your site: you can run the query once per hour. So if your site gets 1000 visitors per hour, you hit the actual database once per hour instead of 1000 times per hour.

Sounds great!

It also has a drawback, so you have to be smart about it. You should use this feature only if your query is fixed and returns a limited amount of data. For example, you cannot cache your search results, because each and every visitor will use his own keywords.
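The closest thing in plain PHP, as far as I know, is to cache the result set yourself, for example in a file that you refresh once an hour. A rough, hypothetical sketch (the function name, cache path and callback are made up for illustration):

<?php
// Hypothetical sketch: fake CF-style query caching in plain PHP by
// storing the result rows in a file and reusing them for one hour.
// $fetchLatestRecords is an assumed callback that runs the real SQL
// and returns the rows as an array.
function cachedLatestRecords($fetchLatestRecords,
                             $cacheFile = '/tmp/latest_records.cache',
                             $ttl = 3600)
{
    // Serve from the cache while it is younger than $ttl seconds.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return unserialize(file_get_contents($cacheFile));
    }

    // Cache miss or expired: hit the database once and refresh the file.
    $rows = call_user_func($fetchLatestRecords);
    file_put_contents($cacheFile, serialize($rows));
    return $rows;
}
?>

Every visitor after the first one inside the hour is then served from the file instead of the database, which gives the same one-query-per-hour effect described above, just without CF's built-in support.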
Regards,
Hamid Hossain
Saudi Arabia
----Original Message Follows----
From: "PHP general" <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
Subject: [PHP] what PHP really needs
Date: Fri, 23 Jan 2004 20:42:59 +0100
There's one really important thing missing in PHP as I see it: the ability to keep variables in memory for as long as the programmer chooses. If this were possible, some truly great optimizations could be done. Some things are very slow to create but very fast to work with. I wrote an XML class a couple of days ago, and while it's extremely quick to search and work with, it's sadly rendered pretty much useless, since building the tree it uses isn't fast enough.
I've heard there's a feature like this in Cold Fusion, which every Cold Fusion user seems to think of as the holy grail, and I would have to agree with them.
One thing I've heard they use this for is to load an entire database into system memory. I don't know exactly how it works, but imagine having the whole database in system memory. When you change data you update it both in system memory and on the drive, but when you select (which is what you mostly do), you just query the mirror in system memory.
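(To make that concrete, here is a rough, purely illustrative PHP sketch of the write-through pattern being described; the PDO handle, table name and id/name columns are assumptions, and, as the comments note, a plain PHP array like this only lives for a single request, which is exactly the limitation being complained about.)

<?php
// Illustrative sketch only: the write-through "memory mirror" pattern.
// Reads come from an in-memory array; writes go to both the array and
// the real database. The PDO handle, table name and columns (id, name)
// are assumptions. In stock PHP this array lives for one request only.
class MemoryMirror
{
    private $db;              // assumed PDO handle
    private $table;           // table being mirrored
    private $rows = array();  // in-memory copy, keyed by primary key

    public function __construct(PDO $db, $table)
    {
        $this->db    = $db;
        $this->table = $table;
        // Load the whole table into memory up front.
        foreach ($db->query("SELECT * FROM $table") as $row) {
            $this->rows[$row['id']] = $row;
        }
    }

    // Selects never touch the database.
    public function get($id)
    {
        return isset($this->rows[$id]) ? $this->rows[$id] : null;
    }

    // Writes update both the mirror and the copy on disk.
    public function setName($id, $name)
    {
        $this->rows[$id]['name'] = $name;
        $stmt = $this->db->prepare(
            "UPDATE {$this->table} SET name = ? WHERE id = ?");
        $stmt->execute(array($name, $id));
    }
}
?>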
So how cost-effective could this be? 1 GB of system memory is pretty much the minimum on a decent server today. Assuming the site generates approximately 1 million bytes of data every day (storing images and other massive data in the tables would perhaps not be appropriate), the site could be up and running for a thousand days before that gigabyte is used up. And if you only keep the tables that get queried a lot but don't get altered often, you could most likely come up with a great compromise.
I can't say for sure how much faster things would be, but I'm guessing at several thousand percent; then again, I might be way off.
The only drawback I can see is that there might be multithreading issues, so if this were implemented, a new keyword would probably have to be introduced to make data mutexed, or perhaps the other way around, to avoid too many people scratching their heads.
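(A small illustration of that locking concern, with the caveat that PHP has no shared cross-request variables at all, so this sketch fakes the shared state with a file and uses flock() as the "mutex"; the counter is just a made-up example.)

<?php
// Hypothetical sketch of why shared data would need a mutex: two
// requests incrementing the same value at the same time would lose
// updates without the exclusive lock. Shared state is faked with a
// file here, since PHP keeps nothing in memory between requests.
function incrementSharedCounter($file = '/tmp/shared_counter')
{
    $fp = fopen($file, 'c+');   // create if missing, open for read/write
    if ($fp === false) {
        return false;
    }

    flock($fp, LOCK_EX);        // exclusive lock: one writer at a time
    $value = (int) stream_get_contents($fp);
    $value++;

    // Rewind and overwrite while still holding the lock.
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) $value);

    flock($fp, LOCK_UN);        // release the lock
    fclose($fp);

    return $value;
}
?>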
/Sebastian Karlsson
-- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php