Let me tell you what I'm doing.
First, I'm looking for an excuse to learn Python.
Second, I'm running a server that takes requests from a remote server; each request is an HTTP GET. I used to have a C program I wrote (I've lost the source) that functioned as a web server.
It took the request on port 80 and parsed the GET; from the GET I constructed a MySQL query and spat the results back out.
Now, the logical solution is to run Apache to handle the webserver aspect and have it hand off to a CGI that runs the MySQL query and sends the results back to the requesting server. In this situation I could have thousands of requests per minute, and that bogs the server down: there is too much overhead when Apache calls the CGI. I've tried several different setups with Apache and with C, Perl, and PHP CGIs.
I found it to be much faster if I wrote the webserver myself and handled the MySQL call within the same thread.
A friend of mine told me that Python is good for writing network servers, so I thought I'd give it a try; instead of rewriting my C program, I wanted to take the opportunity to learn a bit of Python.
The only piece I don't have going the Python route is actually getting the GET.
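In case it helps to see what I'm aiming for, here is a rough, untested sketch of the kind of standalone server I have in mind, using the standard BaseHTTPServer and cgi modules to receive the GET and pull the query string apart. The port and the echo response are placeholders; the real version would hand the parsed values to a MySQL query (via MySQLdb or similar) instead of just printing them back.

# Rough sketch of a standalone server: receive the GET, parse the query
# string, and answer from the same process.  Untested; port and response
# are placeholders.
import BaseHTTPServer
import cgi
import urlparse

class QueryHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_GET(self):
        # self.path looks like "/lookup?var1=val1&var2=val2"
        query = urlparse.urlparse(self.path)[4]
        params = cgi.parse_qs(query)   # {'var1': ['val1'], 'var2': ['val2']}

        # This is where the MySQL query would be built and run; for now,
        # just echo the parsed parameters back to the caller.
        body = repr(params)

        self.send_response(200)
        self.send_header('Content-Type', 'text/plain')
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == '__main__':
    # Port 8080 for testing; binding to port 80 needs root.
    BaseHTTPServer.HTTPServer(('', 8080), QueryHandler).serve_forever()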
On 9/27/05, paul brian <[EMAIL PROTECTED]> wrote:
> Basically, what I need is a standalone CGI. Instead of the program passing
> the data off to a CGI, I want it to parse and handle the request directly.
Instead of which program?
HTTP requests are served by a web server (e.g. Apache), which, depending
on the type of request, passes the request on to whatever handles it.
As such, any HTTP request *must* first be handled by a web server, and
CGI scripts traditionally live in a cgi-bin directory on the server, so
a URL would look like http://www.example.com/cgi-bin/myscript.py
I think you have 3 options:
1. Use the cgi module in Python to create scripts like the one above (a
minimal sketch follows after this list). They will not be fast, but they
give you low-level access to the request. However, CGI was already out of
date about 8 years ago; it has some serious limitations, mostly on
speed/capacity.
2. Use a system like mod_python. This is better than CGI for lots of
reasons, mostly to do with speed. Here you also have access to the request
objects, but there is a bit of a learning curve.
3. Zope: higher level than even mod_python, and still more of a learning curve.
(There is a multitude of Python-based CGI replacements; Django,
Webware and others spring to mind, but there is no clear "winner"
amongst the community.)
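To make option 1 concrete, a minimal cgi-module script might look something
like the sketch below (untested, and the var1 parameter name is just the one
from your example). Drop it in cgi-bin, make it executable, and Apache does
the rest.

#!/usr/bin/env python
# Minimal CGI script (option 1): Apache starts a new process for every
# request, which is exactly the overhead you are seeing.
import cgi

form = cgi.FieldStorage()            # parses var1=val1&var2=val2&... for you
var1 = form.getvalue('var1', '')     # '' if the parameter was not supplied

print "Content-Type: text/plain"
print
print "var1 was:", var1
for key in form.keys():
    print key, "=", form.getvalue(key)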
I would recommend taking a weekend to install Apache and play with both
the cgi module and mod_python. mod_python is pretty good and fairly well
documented, as well as being fairly low-level.
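For comparison, a bare-bones mod_python handler (option 2) looks roughly
like the sketch below; Apache keeps the interpreter loaded and calls the
handler in-process, so there is no per-request startup cost as with CGI.
I am leaving out the Apache configuration (the PythonHandler directive and
friends); the mod_python docs cover that.

# Bare-bones mod_python handler sketch: Apache calls handler() for each
# request inside its own process, so there is no CGI-style fork/exec.
from mod_python import apache, util

def handler(req):
    params = util.FieldStorage(req)        # parsed query string / form data
    var1 = params.getfirst('var1', '')
    req.content_type = 'text/plain'
    req.write('var1 was: %s\n' % var1)
    return apache.OK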
I think there is a lot to do here - perhaps if you tell us exactly
what you need, we can point you at a solution. Some web hosts provide
mod_python or Zope hosting, and that might be a way to get up and
running faster.
On 9/27/05, Jerl Simpson <[EMAIL PROTECTED]> wrote:
> Hello,
>
> I have been looking through some of the HTTP projects and haven't quite
> found what I'm looking for.
> Basically, what I need is a standalone CGI. Instead of the program passing
> the data off to a CGI, I want it to parse and handle the request directly.
>
> The part I'm having trouble with is actually getting the request and parsing
> it.
>
> Let's say I have a URI that looks like:
> ?var1=val1&var2=val2&...varn=valn
>
> I'd like to find a way to get these into some data structure so I can use
> them to generate my output.
>
> It seems like a simple thing, but as I'm new to Python, I don't know where
> to start.
>
> Thank you for any help you can give.
>
>
> Jerl
>
--
--------------------------
Paul Brian
m. 07875 074 534
t. 0208 352 1741