[Tutor] Refreshing the interpreter environment
How do I refresh the interpreter environment without restarting it, if possible? For example, I'm using the interpreter to test a class I'm writing; importing and instantiating it reveals a typo; I go and fix the typo. Now, is there any way to reload the class afresh? Simply importing again doesn't seem to do it. Thanks in advance for your help. Lawrence Wang ___ Tutor maillist - Tutor@python.org http://mail.python.org/mailman/listinfo/tutor
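The usual answer is to reload the module rather than re-import it. Here's a minimal runnable sketch of the edit-reload cycle; `mymod` is a hypothetical module written to a temp directory so the example is self-contained (in Python 2, `reload()` is a builtin; in Python 3 it lives in `importlib`):

```python
# Sketch: reloading an edited module without restarting the interpreter.
# "mymod" is a throwaway module created on disk just for this demo.
import importlib
import os
import sys
import tempfile

sys.dont_write_bytecode = True  # avoid stale .pyc caching in this demo

tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "mymod.py")
with open(path, "w") as f:
    f.write("VALUE = 1\n")

sys.path.insert(0, tmpdir)
import mymod
print(mymod.VALUE)  # 1

# Simulate fixing a typo in the source file...
with open(path, "w") as f:
    f.write("VALUE = 2\n")

# ...then reload; a plain "import mymod" here would be a cached no-op.
importlib.invalidate_caches()
importlib.reload(mymod)
print(mymod.VALUE)  # 2
```

One caveat: reload rebinds names inside the module object, but instances you created before the reload still point at the old class, so re-instantiate after reloading.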
[Tutor] Importing from directories below yourself...
Say I have a directory tree like this:

foo
- bar
-- quux.py
- baz
-- glonk.py

From within glonk.py, how do I import quux.py? I've tried going to foo, running baz/glonk.py, and using "from bar import quux", but this doesn't seem to work. Thanks in advance! Lawrence
[Tutor] Closing BaseHTTPServer...
I've been using BaseHTTPServer (and subclassing BaseHTTPRequestHandler, of course) for a project at work. However, I can't seem to close my connections completely once I'm done with the server. I've tried:

server.server_close()
del server

but when I try to use the same port again, it complains that it's already bound; what's more, the interpreter hangs when I try to exit. Thanks in advance for any help you can offer! Lawrence
Re: [Tutor] Closing BaseHTTPServer...
Here's some more detail about how I've got things set up. I left a bunch of things out of my original email; sorry about that, I was rushed for time. I don't have access to the verbatim code right now, but here's the gist. I've subclassed threading.Thread like this:

class ServerThread(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        self.httpd = BaseHTTPServer.HTTPServer(server_address, RequestHandler)
    def run(self):
        self.httpd.serve_forever()
    def close(self):
        self.httpd.server_close()
        del self.httpd

I was expecting server_close() to stop serve_forever(), but this doesn't seem to be the case, since the threads still stick around after I run close() (that is, the extra python instances still show up in the list of processes). When I try to use the port again after closing, I get socket.error, 'address already in use' or something like that. I'm running Python 2.4 on Debian Linux, btw.

On 7/15/05, Danny Yoo <[EMAIL PROTECTED]> wrote:
> > I've been using BaseHTTPServer (and subclassing BaseHTTPRequestHandler,
> > of course) for a project at work. However, I can't seem to close my
> > connections completely once I'm done with the server. I've tried:
> >
> > server.server_close()
> > del server
> >
> > but when I try to use the same port again, it complains that it's
> > already bound; what's more, the interpreter hangs when I try to exit.
> > Thanks in advance for any help you can offer!
>
> Hi Lawrence,
>
> I'm not exactly sure if this is the issue you're running into; if you can
> show us code, that'll help. Are you sure nothing's running as a daemon
> afterwards? You may want to try the unix utility 'telnet' and just
> double-check that the server port is closed.
>
> If everything is truly closed, then it still usually takes a moment
> between restarts before the socket is available for use again. If we want
> to force the issue, we can use the socket.SO_REUSEADDR attribute.
> For a general idea of what the common problem is, see:
>
> http://www.unixguide.net/network/socketfaq/4.5.shtml
>
> Concretely, before binding the server's socket to a port, we may need to
> set a few socket parameters to reuse a port that's still in cooldown. If
> we have a socket object that hasn't been bound yet, then we can do
> something like this:
>
> ##
> socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
> socket.bind(server_address)
> ##
>
> We shouldn't have to do this either if we're using BaseHTTPServer, since
> the mechanism for calling SO_REUSEADDR already exists in
> SocketServer.TCPServer.server_bind.
>
> The thing that bothers me is that this should already be happening for
> you, since by default, allow_reuse_address is set to true for subclasses
> of HTTPServer. At least, this appears to be true in Python 2.3, according
> to this code snippet in the BaseHTTPServer code:
>
> ### BaseHTTPServer.py ###
> class HTTPServer(SocketServer.TCPServer):
>
>     allow_reuse_address = 1    # Seems to make sense in testing environment
>
>     def server_bind(self):
>         """Override server_bind to store the server name."""
>         SocketServer.TCPServer.server_bind(self)
>         host, port = self.socket.getsockname()[:2]
>         self.server_name = socket.getfqdn(host)
>         self.server_port = port
> ##
>
> So I think we need some more information before we nail down what's really
> happening. Show us some error messages, and some code, and we might be
> able to get a better idea of the situation.
>
> Good luck!
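For what it's worth, the hang described above is usually serve_forever() never returning: server_close() releases the socket but does not break the loop, so the thread lives on. A sketch of the clean shutdown sequence, written with the Python 3 names (BaseHTTPServer became http.server, and BaseServer.shutdown() was only added in 2.6, which is exactly what Python 2.4 was missing):

```python
# Sketch: run the server in a thread, stop the serve_forever() loop with
# shutdown(), then release the socket with server_close(). Port 0 asks the
# OS for a free port so the example doesn't collide with anything.
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):  # silence per-request logging in the demo
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
port = server.server_address[1]
t = threading.Thread(target=server.serve_forever)
t.start()

server.shutdown()      # makes serve_forever() return; server_close() alone won't
t.join()               # the thread actually exits now
server.server_close()  # release the listening socket

# Because HTTPServer sets allow_reuse_address = 1, rebinding works right away.
server2 = HTTPServer(("127.0.0.1", port), Handler)
server2.server_close()
print("rebound", port)
```

On 2.4, where shutdown() doesn't exist, the workaround is a handle_request() loop guarded by a flag, as in the StoppableXMLRPCServer post further down.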
[Tutor] weird socket errors on linux with asyncore
apologies if this doesn't belong on tutor. i have a long-running script that manages a bunch of sockets with asyncore, opening 600 connections every 30 seconds for short transactions, and every now and then (like anywhere from twice an hour to once every few hours) i get this weird error: "filedescriptor out of range in select()". i found a bug report that stated that this was an issue with python 2.4.3, so i upgraded to 2.5.1 -- but it's still happening. anyone seen this before? thanks in advance for any help you can provide. --lawrence
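That error comes from select() itself: it can only watch file descriptors numbered below FD_SETSIZE (typically 1024), so it can fire even with "only" 600 sockets once descriptor numbers creep upward, e.g. if some sockets aren't closed promptly. poll() has no such cap; with asyncore that means passing use_poll=True to asyncore.loop(). A tiny sketch of select.poll in isolation, using a pipe instead of sockets:

```python
# Sketch: poll() registers fds by number with no FD_SETSIZE ceiling,
# unlike select(), which fails on any fd >= FD_SETSIZE (commonly 1024).
import os
import select

r, w = os.pipe()
p = select.poll()
p.register(r, select.POLLIN)  # watch the read end for readability

os.write(w, b"x")
events = p.poll(1000)  # -> list of (fd, eventmask) pairs
print(events[0][0] == r)  # True: the read end is ready

os.close(r)
os.close(w)
```

The other half of the fix is making sure each 30-second batch really closes its sockets, so descriptor numbers get reused instead of climbing toward the limit.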
[Tutor] Closing SimpleXMLRPCServer properly
I have a SimpleXMLRPCServer, which I've tweaked thusly:

class StoppableXMLRPCServer(SimpleXMLRPCServer.SimpleXMLRPCServer):
    def serve_forever(self):
        """to stop this server: register a function in the class that
        uses it which sets server.stop to True."""
        self.stop = False
        while not self.stop:
            self.handle_request()

Here's the code where I start the server...

try:
    self.server.serve_forever()
finally:
    self.server.server_close()
    self.log('server closed')

From another thread, I set the server's stop attribute to True, so the server stops running. It exits the try block, runs server_close(), then I get the message 'server closed'...

...but when I try to use the port that the server's bound to again, it takes a very long time (while I try to use the port, catch the exception, sleep, try again) until it becomes free. Is there something else I need to call to ensure that the port is released cleanly? Is this an OS-specific thing out of my control? (I'm running Debian Linux.) Thanks in advance Lawrence
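The long wait is the OS keeping the old socket in TIME_WAIT, and the usual cure is SO_REUSEADDR, which SocketServer exposes as the allow_reuse_address class attribute; depending on the Python version, SimpleXMLRPCServer may not set it, inheriting TCPServer's default of False. A sketch using the Python 3 module name (SimpleXMLRPCServer moved to xmlrpc.server):

```python
# Sketch: allow_reuse_address = True makes server_bind() set SO_REUSEADDR
# before bind(), so the port is reusable immediately instead of after
# TIME_WAIT expires.
from xmlrpc.server import SimpleXMLRPCServer

class StoppableXMLRPCServer(SimpleXMLRPCServer):
    allow_reuse_address = True  # SO_REUSEADDR is applied before bind()

    def serve_until_stopped(self):
        self.stop = False
        while not self.stop:
            self.handle_request()  # blocks until one request arrives

server = StoppableXMLRPCServer(("127.0.0.1", 0), logRequests=False)
port = server.server_address[1]
server.server_close()

# Rebinding the same port right away now succeeds.
server2 = StoppableXMLRPCServer(("127.0.0.1", port), logRequests=False)
server2.server_close()
print("rebound", port)
```

A separate gotcha with the flag-checking loop: handle_request() blocks, so setting stop only takes effect after one more request comes in; setting a socket timeout (or sending the server a dummy request after flipping the flag) avoids that last hang.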
[Tutor] Getting info about processes via pid
So I have a list of pids, and I want to check whether the pids still refer to live processes or not. Currently I'm doing that thus:

pids = ['4550\n', ...]
procs = os.popen("ps ax|grep %s|awk '{ print $1 }'" % keyword).readlines()

and comparing the two lists. I'm wondering, though, if there's a way to do this that's less dependent on external programs. I thought os.waitpid() might work, but on *nix I can only call it for child processes. Lawrence Wang
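There is a standard trick that avoids ps entirely: on POSIX systems, os.kill(pid, 0) performs the existence and permission checks without delivering any signal, and it works for non-child processes. A sketch:

```python
# Sketch: probe a pid with "signal 0". ESRCH means no such process;
# EPERM means the process exists but belongs to another user.
import errno
import os

def pid_alive(pid):
    """Return True if a process with this pid currently exists."""
    try:
        os.kill(pid, 0)
    except OSError as e:
        if e.errno == errno.ESRCH:   # no such process
            return False
        if e.errno == errno.EPERM:   # exists, just not ours to signal
            return True
        raise
    return True

print(pid_alive(os.getpid()))  # True: this process certainly exists
```

With pids stored as strings like '4550\n', convert with int(pid) first (int() tolerates the trailing newline).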
Re: [Tutor] XML: Expletive Deleted
> >> for item in itemIDs:
> >>     print item
>
> yields
>
>
>
> Okay, no problem. Now all I have to do is figure out which
> particular.string.of.words.interconnected.by.periods to
> pass to extract the values.
>
> >> for item in itemIDs:
> >>     print item.nodeValue
>
> Seems logical:
>
> None
> None
> None
> None
> None

try dir(item) to see what attributes the item has, and try the ones that sound right. e.g.:

>>> from xml.dom.minidom import parse, parseString
>>> resp = parseString("<top><bottom>foo</bottom></top>")
>>> bottom = resp.getElementsByTagName("bottom")
>>> bottom
[<DOM Element: bottom at 0x...>]
>>> dir(bottom[0])
['ATTRIBUTE_NODE', ...long list snipped..., 'writexml']
>>> bottom[0].hasChildNodes()
True
>>> bottom[0].childNodes
[<DOM Text node "foo">]
>>> dir(bottom[0].childNodes[0])
['ATTRIBUTE_NODE', ...long list snipped..., 'writexml']
>>> bottom[0].childNodes[0].data
u'foo'

so you see, with "foo", there's an invisible text node holding it. it's one of the quirks of xml, i guess. then the attribute you're looking for is "data", not "nodeValue". in summary: instead of item.nodeValue, item.childNodes[0].data.
[Tutor] struct.calcsize curiosity...
>>> struct.calcsize('hq')
12
>>> struct.calcsize('qh')
10

why is this? is it platform-dependent? i'm on mac os x.
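Yes, it's platform-dependent: the default format mode ('@', native) inserts padding so each field starts at the platform's alignment for its type. In 'hq' the 2-byte h is followed by pad bytes so the 8-byte q lands on an aligned offset (alignment 4 on that Mac build, giving 2 + 2 + 8 = 12; on most x86-64 systems it's 8, giving 16); in 'qh' the h is already aligned at offset 8, so no padding is needed. Standard mode ('=') disables alignment entirely, which a short sketch makes visible:

```python
# Sketch: native mode ('@', the default) pads between fields for alignment;
# standard mode ('=') packs them with no padding at all.
import struct

native_hq = struct.calcsize('hq')   # h + pad + q: platform-dependent (12, 16, ...)
native_qh = struct.calcsize('qh')   # q + h: no padding needed
print(native_hq, native_qh)

# With '=', both orders are exactly 2 + 8 = 10 bytes on every platform.
print(struct.calcsize('=hq'))  # 10
print(struct.calcsize('=qh'))  # 10
```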