On Sun, 9 Oct 2005, Frank Hoffsümmer wrote:
> thanks for your answers, looks like SQL is the ticket for such
> problems... because my data is updated quite often, it feels like an
> overhead for me to store the data persistently, because then I would
> have to manage updates to individual rows and columns. Right now, I
> simply construct the latest version of my data table from my source
> data whenever the script is run. This is also the reason why I am using
> python in the first place.
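
(If SQL does turn out to be the ticket, one way to get the aggregates
without managing a persistent store is to rebuild an in-memory SQLite
table on every run -- a rough sketch, with made-up column names and values:)

    import sqlite3

    # freshly parsed source rows on every run -- made-up example values
    rows = [
        ("widget", 10, 2.50),
        ("gadget", 3, 7.25),
        ("widget", 5, 3.10),
    ]

    conn = sqlite3.connect(":memory:")   # nothing is ever stored on disk
    conn.execute("CREATE TABLE data (name TEXT, qty INTEGER, price REAL)")
    conn.executemany("INSERT INTO data VALUES (?, ?, ?)", rows)

    for row in conn.execute("SELECT AVG(price), MIN(price), MAX(price) FROM data"):
        print(row)

(The same SELECT would work against a MySQL table if the data ever
outgrows memory.)
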
Hi Frank,
you can use Numeric.py; it is also powerful for handling averages, min,
max, matrix algebra, etc.
See: http://www.pfdubois.com/numpy/
You just need to use a list.
If your data is big and you need more power, I suggest you use a database
such as MySQL through mysqldb for python.
It is also fun to combine the two.
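
In case a quick start helps, here is roughly what that looks like
(sketched with numpy, the successor of Numeric.py, and made-up example rows):

    import numpy   # modern successor of Numeric.py; the idea is the same

    # one tuple per row: (qty, price) -- made-up example values
    rows = [(10, 2.50), (3, 7.25), (5, 3.10)]

    table = numpy.array(rows)         # shape: (number of rows, number of columns)
    print(table.mean(axis=0))         # column averages
    print(table.min(axis=0))          # column minima
    print(table.max(axis=0))          # column maxima
    print(numpy.sort(table[:, 1])[::-1][:10])   # "top 10" of the price column
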
> with the little I know about classes, I assume that I would then have
> a list of class instances as the representation of my tabular data,
> but given such a list of class instances, I would still need for
> loops to get to e.g. the minimal value of a certain attribute in all
> instances in that list.
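
Even with class instances you would not have to write those for loops
yourself -- the built-in min() and max() accept a key function. A tiny
sketch, with an invented Row class and attribute names:

    from operator import attrgetter

    class Row(object):
        def __init__(self, name, price):
            self.name = name
            self.price = price

    rows = [Row("widget", 2.50), Row("gadget", 7.25), Row("widget", 3.10)]

    cheapest = min(rows, key=attrgetter("price"))    # no explicit for loop
    average = sum(r.price for r in rows) / len(rows)
    print(cheapest.name, average)
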
On Sat, 8 Oct 2005, [ISO-8859-1] Frank Hoffs�mmer wrote:
Hello
I often find myself writing python programs to compute averages, min,
max, top10 etc. of the columns in a table of data.
In these programs, I always capture each row of the table in a tuple;
the table is then represented by a list of tuples.
Computing averages, min, max and other meta-information is then done
with for loops over that list.
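
(For concreteness, the list-of-tuples setup described above might look
roughly like this; the column layout and values are invented for the example:)

    # one tuple per row: (name, qty, price) -- made-up example values
    table = [
        ("widget", 10, 2.50),
        ("gadget", 3, 7.25),
        ("widget", 5, 3.10),
    ]

    prices = [row[2] for row in table]          # pull out one column
    print(sum(prices) / len(prices))            # average
    print(min(prices), max(prices))             # min and max
    print(sorted(prices, reverse=True)[:10])    # "top 10"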