Re: building exe from script

2007-08-20 Thread nih
A bit more information is clearly needed...

1. Create a file called py2exeSetup.py with the following contents:

from distutils.core import setup
import py2exe

def compile(appName, console=False):
    OPTIONS = {"py2exe": {"compressed": 1, "optimize": 0, "bundle_files": 1}}
    ZIPFILE = None

    if console:
        setup(
            options=OPTIONS,
            zipfile=ZIPFILE,
            console=[appName]
        )
    else:
        setup(
            options=OPTIONS,
            zipfile=ZIPFILE,
            windows=[appName]
        )


2. Create another file called setup.py with the following contents:

import py2exeSetup

# To stop a wxPython program from opening a console window, just change the
# file extension from .py to .pyw

# Change Filename to your XXXApp.pyw filename
# Use py2exeSetup.compile('Filename.pyw', console=True) to show the console
py2exeSetup.compile('Filename.pyw')


3. Create a file called Compile.bat with the following contents:

REM change Filename to your XXXApp.exe filename
REM make sure that python is in your system path

python setup.py py2exe
copy dist\Filename.exe Filename.exe
if errorlevel 1 pause


4. Put all these files in the same folder as the program you want to convert
to an exe and run Compile.bat.
An *.exe file will then appear in the same folder as your Python files :)

I keep py2exeSetup.py in a separate 'Shared' folder so there is only one
copy of it; each of my programs then contains only setup.py and Compile.bat.
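
One thing worth noting: `import py2exeSetup` in step 2 only finds the module if that 'Shared' folder is on Python's module search path (via PYTHONPATH, or a sys.path tweak at the top of setup.py). A minimal sketch of the sys.path approach; the temp folder and the stub module here are stand-ins for illustration, not the real py2exeSetup.py:

```python
import os
import sys
import tempfile

# Stand-in for the 'Shared' folder: create a throwaway directory holding a
# stub py2exeSetup.py, just to demonstrate the sys.path mechanism.
shared = tempfile.mkdtemp()
with open(os.path.join(shared, 'py2exeSetup.py'), 'w') as f:
    f.write("def compile(appName, console=False):\n    return appName\n")

# In a real project this would be something like
# sys.path.insert(0, os.path.join('..', 'Shared'))
sys.path.insert(0, shared)
import py2exeSetup

print(py2exeSetup.compile('MyApp.pyw'))  # prints MyApp.pyw
```

Setting the PYTHONPATH environment variable to the Shared folder achieves the same thing without editing each setup.py.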


By the way, if you have a load of programs that you want to compile in one
go, add a file called CompileAll.bat in the root folder that all the
programs are in, with the following contents:

cd "ProgramOne"
call Compile.bat
cd ..

cd "ProgramTwo"
call Compile.bat
cd ..

REM this part runs once all the programs have been processed
echo all copied ok!
pause
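
If the batch file gets tedious to maintain by hand, the same loop can be generated. A hedged Python sketch (the folder layout and the Compile.bat name from step 3 are assumptions) that finds every program folder containing a Compile.bat and builds the command lines a CompileAll run would execute, without running anything:

```python
import os

def compile_commands(root):
    """Return the CompileAll-style command line for every subfolder of
    root that contains a Compile.bat (dry run: nothing is executed)."""
    cmds = []
    for entry in sorted(os.listdir(root)):
        bat = os.path.join(root, entry, 'Compile.bat')
        if os.path.isfile(bat):
            # Mirror the batch file's cd / call / cd .. pattern.
            cmds.append('cd "%s" & call Compile.bat & cd ..' % entry)
    return cmds
```

The resulting list can be written back out as CompileAll.bat, or each entry passed to subprocess on Windows.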


I know hardly anything about MS-DOS, but this works OK :/

gl!

<[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]
> Hello,
>
> Is there any solution for building an exe file from a Python script,
> something like bbfreeze? When a user writes a script in my program, it
> must compile the script into an exe without opening a console (cmd).
> I'm working on Windows XP SP2 and Python 2.5.
>
>
>
> Regards,
> Vedran
> 


-- 
http://mail.python.org/mailman/listinfo/python-list


Creating Dict of Dict of Lists with joblib and Multiprocessing

2016-04-20 Thread Sims, David (NIH/NCI) [C]
Hi,

Cross posted at
http://stackoverflow.com/questions/36726024/creating-dict-of-dicts-with-joblib-and-multiprocessing,
but thought I'd try here too as there have been no responses there so far.

I'm a bit new to Python and very new to parallel processing in Python. I
have a script that processes a data file and generates a dict of dicts.
However, since I need to run this task on hundreds to thousands of these
files and ultimately collate the data, parallel processing made a lot of
sense. The trouble is that I can't figure out how to create the data
structure. Minimal script without all the helper functions:

#!/usr/bin/python
import sys
import os
import re
import subprocess
import multiprocessing
from joblib import Parallel, delayed
from collections import defaultdict
from pprint import pprint

def proc_vcf(vcf, results):
    sample_name = vcf.rstrip('.vcf')
    results.setdefault(sample_name, {})

    # Run helper functions 'run_cmd()' and 'parse_variant_data()' to generate
    # a list of entries. Expect a dict of dict of lists.
    all_vars = run_cmd('vcfExtractor', vcf)
    results[sample_name]['all_vars'] = parse_variant_data(all_vars, 'all')

    # Run helper functions 'run_cmd()' and 'parse_variant_data()' to generate
    # a different list of data based on a different set of criteria.
    mois = run_cmd('moi_report', vcf)
    results[sample_name]['mois'] = parse_variant_data(mois, 'moi')
    return results

def main():
    input_files = sys.argv[1:]

    # collected_data = defaultdict(lambda: defaultdict(dict))
    collected_data = {}

    # Parallel processing version
    # num_cores = multiprocessing.cpu_count()
    # Parallel(n_jobs=num_cores)(delayed(proc_vcf)(vcf, collected_data)
    #                            for vcf in input_files)

    # for vcf in input_files:
    #     proc_vcf(vcf, collected_data)

    pprint(dict(collected_data))
    return

if __name__ == "__main__":
    main()


Hard to provide source data as it's very large, but basically the dataset
will generate a dict of dicts of lists containing two sets of data for each
input, keyed by sample and data type:

{ 'sample1' : {
      'all_vars' : [
          'data_val1',
          'data_val2',
          'etc'],
      'mois' : [
          'data_val_x',
          'data_val_y',
          'data_val_z']
  },
  'sample2' : {
      'all_vars' : [
          .
          .
          .
      ]
  }
}

If I run it without trying to multiprocess, no problem. But I can't figure
out how to parallelize this and create the same data structure. I've tried
using defaultdict to create a defaultdict in main() to pass along, as well
as a few other iterations, but I can't seem to get it right (I get key
errors, pickle errors, etc.). Can anyone help me with the proper way to do
this? I think I'm not making / initializing / working with the data
structure correctly, but maybe my whole approach is ill conceived?