Hi there, I've been using C/C++ for many years (Python I have only been reading about so far).
I have software written in C/C++ and am considering porting most of it to Python, since Python seems like a better choice for the decision-making portion of the code. I'm also thinking about having a MATLAB-like interface for reading, processing, and writing.

In my current C++ code I read the data into a vector of structs (each struct containing other simple vectors, strings, and structs), and it can be as large as 500 MB to 2 GB. The data is then processed (this requires random access into the vector) and the result is written out. I would like to turn this into modules callable from Python. The problem is that the vector of structs is very large. First, is it even possible to pass such a structure back and forth between Python and C/C++? And what would the overhead be of passing such a large structure to and from the C/C++ modules?

I imagine I'd use the newly written software this way:

>>> import c_stuff  # my C/C++ module
>>> largestuff = c_stuff.read(file)  # read from disk
>>> c_stuff.process(largestuff, someparams1)  # processing, requires random access to the vector
>>> c_stuff.process(largestuff, someparams2)
...
>>> c_stuff.process(largestuff, someparams10000)  # the whole thing may take a few minutes to days
>>>
>>> import python_stuff  # some module written in Python to process the data as well
>>>
>>> python_stuff.process(largestuff, otherparams1)  # it's important that this data can be read (and I hope written) by Python code
>>> python_stuff.process(largestuff, otherparams2)
>>>
>>> c_stuff.write(largestuff)  # write result

Thank you in advance,
Paul
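P.S. To make the layout concrete, here is a rough sketch of the kind of struct I mean and of one possible way to expose it to Python. The names (Record, Dataset, read_file, process, write_file) are placeholders, not my real code, and the choice of pybind11 is only an assumption about tooling (Boost.Python, SWIG, or ctypes could play the same role); the point is that the big vector stays inside C++ and Python only holds a handle to it.

// sketch.cpp -- illustrative only
// build (roughly): c++ -O2 -shared -fPIC $(python3 -m pybind11 --includes) \
//     sketch.cpp -o c_stuff$(python3-config --extension-suffix)
#include <pybind11/pybind11.h>
#include <string>
#include <vector>

namespace py = pybind11;

// One element of the large vector.
struct Record {
    std::string id;
    std::vector<double> samples;   // nested simple vector
};

// Owns the whole 500 MB - 2 GB vector of structs.
struct Dataset {
    std::vector<Record> records;
};

// Placeholder functions; the real I/O and processing would live here.
Dataset read_file(const std::string &path) {
    Dataset d;
    // ... parse 'path' into d.records ...
    return d;
}

void process(Dataset &d, double param) {
    // ... random access into d.records, modifying in place ...
}

void write_file(const Dataset &d, const std::string &path) {
    // ... write d.records back out ...
}

PYBIND11_MODULE(c_stuff, m) {
    // Dataset is exposed as an opaque handle: Python keeps a reference,
    // so c_stuff.process(largestuff, ...) passes a pointer, not a copy.
    py::class_<Dataset>(m, "Dataset");
    m.def("read", &read_file, "read a file into a Dataset");
    m.def("process", &process, "process a Dataset in place");
    m.def("write", &write_file, "write a Dataset to disk");
}

With an arrangement like this, each call from Python only hands over a reference to the Dataset; the data itself never crosses the Python/C++ boundary unless some function explicitly converts part of it, so the per-call overhead should be independent of the structure's size.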