Re: wxFormBuilder
On Mar 20, 8:41 am, sturlamolden <[EMAIL PROTECTED]> wrote:
> I just discovered wxFormBuilder. After having tried several GUI
> builders for wx (including DialogBlocks, wxGlade, XRCed, Boa
> Constructor), this is the first one I can actually use.
>
> To use wxFormBuilder with wxPython, I generated an XRC resource and
> loaded it with wxPython. All the tedious GUI coding is gone :-)
>
> http://wxformbuilder.org/
> http://wiki.wxpython.org/index.cgi/XRCTutorial

I've stumbled across it myself and have found it superior to the others
so far. I just wish it could also crank out wxPython code. I'm still
hoping one day for a form designer closer to Visual Studio's in ease of
use. FarPy GUIE has the right idea, but nowhere near enough widget
support. And I'm not yet good enough at wxPython to be able to help
either project.
--
http://mail.python.org/mailman/listinfo/python-list
multiprocessing problems
Hi,

I decided to play around with the multiprocessing module, and I'm having
some strange side effects that I can't explain. It makes me wonder if
I'm just overlooking something obvious or not. Basically, I have a
script that parses through a lot of files, doing search and replace on
key strings inside each file. I decided to split the work up across
multiple processes, one per processor core (4 total). I've tried many
ways of doing this, from using a pool to calling out separate processes,
but the result has been the same: the computer crashes from endless
process spawning.

Here's the guts of my latest incarnation:

    def ProcessBatch(files):
        p = []
        for file in files:
            p.append(Process(target=ProcessFile, args=(file,)))
        for x in p:
            x.start()
        for x in p:
            x.join()
        p = []
        return

Now, the function calling ProcessBatch looks like this:

    def ReplaceIt(files):
        """
        Walks through all the files passed to it and verifies each one
        is a legitimate file to be processed (project file).

        @param files: files to be processed
        """
        processFiles = []
        for replacefile in files:
            if CheckSkipFile(replacefile):
                processFiles.append(replacefile)
            if len(processFiles) == 4:
                ProcessBatch(processFiles)
                processFiles = []
        # check for leftover files once the main loop is done and process them
        if len(processFiles) > 0:
            ProcessBatch(processFiles)
        return

Specs:
Windows 7 64-bit
Python v2.6.2
Intel i5

Thanks
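A minimal, runnable sketch of the batching pattern described above
(`process_file` is a hypothetical stand-in for the real search-and-replace
work; note that `args` must be a one-element tuple, not a bare string):

```python
from multiprocessing import Process

def process_file(path):
    # hypothetical stand-in for the real search-and-replace work
    print("processing %s" % path)

def process_batch(files):
    # args must be a tuple -- passing args=path would unpack the
    # string into one argument per character
    procs = [Process(target=process_file, args=(f,)) for f in files]
    for p in procs:
        p.start()
    for p in procs:
        p.join()  # wait for the whole batch before returning
    return procs

if __name__ == "__main__":
    process_batch(["a.txt", "b.txt", "c.txt", "d.txt"])
```

Joining every process before returning means each batch finishes
completely before the caller can start the next one.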
Re: multiprocessing problems
On Jan 19, 10:26 am, Adam Tauno Williams wrote:
> > I decided to play around with the multiprocessing module, and I'm
> > having some strange side effects that I can't explain. It makes me
> > wonder if I'm just overlooking something obvious or not. Basically,
> > I have a script that parses through a lot of files, doing search and
> > replace on key strings inside each file. I decided to split the work
> > up across multiple processes, one per processor core (4 total). I've
> > tried many ways of doing this, from using a pool to calling out
> > separate processes, but the result has been the same: the computer
> > crashes from endless process spawning.
>
> Are you hitting a ulimit error? The number of processes you can create
> is probably limited.
>
> TIP: close os.stdin on your subprocesses.
>
> > Here's the guts of my latest incarnation:
> >
> >     def ProcessBatch(files):
> >         p = []
> >         for file in files:
> >             p.append(Process(target=ProcessFile, args=(file,)))
> >         for x in p:
> >             x.start()
> >         for x in p:
> >             x.join()
> >         p = []
> >         return
> >
> > Now, the function calling ProcessBatch looks like this:
> >
> >     def ReplaceIt(files):
> >         processFiles = []
> >         for replacefile in files:
> >             if CheckSkipFile(replacefile):
> >                 processFiles.append(replacefile)
> >             if len(processFiles) == 4:
> >                 ProcessBatch(processFiles)
> >                 processFiles = []
> >         # check for leftover files once the main loop is done
> >         if len(processFiles) > 0:
> >             ProcessBatch(processFiles)
>
> According to this you will create processes in sets of four, but an
> unknown number of sets of four.

What would be the proper way to do only one set of 4, stop, then do
another set of 4? I'm trying to process only 4 files at a time before
doing another set of 4.
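One way to cap the work at four concurrent processes is a fixed-size
multiprocessing.Pool, which hands each worker a new file as soon as it
finishes the previous one instead of spawning batch after batch. A
sketch, where `process_file` is a hypothetical stand-in for the
per-file work:

```python
from multiprocessing import Pool

def process_file(path):
    # hypothetical stand-in for the per-file search-and-replace
    return path.lower()

def replace_all(files, workers=4):
    # exactly `workers` processes exist at any time; map() feeds them
    # files one by one, so no more than 4 files are in flight at once
    pool = Pool(processes=workers)
    try:
        return pool.map(process_file, files)
    finally:
        pool.close()
        pool.join()

if __name__ == "__main__":
    replace_all(["A.txt", "B.txt", "C.txt"])
```

This avoids the idle time of strict sets of four, where a whole batch
must finish before the next one starts.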
Re: multiprocessing problems
On Jan 19, 10:33 am, DoxaLogos wrote:
> On Jan 19, 10:26 am, Adam Tauno Williams wrote:
> > > [original post and code snipped]
> >
> > Are you hitting a ulimit error? The number of processes you can
> > create is probably limited.
> >
> > TIP: close os.stdin on your subprocesses.
> >
> > According to this you will create processes in sets of four, but an
> > unknown number of sets of four.
>
> What would be the proper way to do only one set of 4, stop, then do
> another set of 4? I'm trying to process only 4 files at a time before
> doing another set of 4.

I found out my problems. One thing I did was follow the test queue
example in the documentation, but the biggest problem turned out to be
a pool instantiated globally in my script, which was causing most of
the endless process spawning even with the "if __name__ ==
"__main__":" block.
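On Windows, multiprocessing starts each child by re-importing the
parent module, so a pool created at module level is re-created inside
every child, each of which then spawns children of its own. A minimal
sketch of the fix described above, with the pool created only under
the guard (`process_file` is a hypothetical stand-in):

```python
from multiprocessing import Pool

def process_file(path):
    return path.lower()  # hypothetical stand-in for the real work

# BAD: pool = Pool(4) here, at module level, would run again in every
# child when Windows re-imports this module -- endless process spawn.

def main():
    # the pool is created only when the script is run directly,
    # never during a child's re-import of the module
    pool = Pool(processes=4)
    try:
        return pool.map(process_file, ["A.txt", "B.txt"])
    finally:
        pool.close()
        pool.join()

if __name__ == "__main__":
    main()
```

The guard alone is not enough if the pool lives outside it: module-level
statements run on import regardless of `__name__`.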
