I am wondering what will give me the best performance for storing information about the servers in our environment. Currently I store info about all servers in a single shelve file, but I have two concerns: 1) As the shelve file grows, will performance suffer? 2) What happens if two updates hit the shelve file at the same time? When a shelve file is opened, is the whole file read into memory?
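For context, here is roughly the access pattern in question -- a minimal sketch, with a made-up server record and file name (the real field names and paths are whatever my scripts use). One relevant point: shelve pickles each value individually on assignment, so opening the file does not load everything into memory; only the keys you touch are (de)serialized.

```python
import os
import shelve
import tempfile

# Hypothetical record; field names are illustrative only.
info = {"os": "Windows Server 2008", "ram_gb": 8, "ip": "10.0.0.5"}

# Temp directory just so this sketch is self-contained.
path = os.path.join(tempfile.mkdtemp(), "servers_shelf")

# shelve.open creates a dbm-backed file; each assignment pickles
# one value, and each lookup unpickles one value -- the whole file
# is NOT read into memory on open.
db = shelve.open(path)
try:
    db["server01"] = info        # write one record
    restored = db["server01"]    # read back just that record
finally:
    db.close()

print(restored["ram_gb"])
```

The concurrency worry is real, though: shelve does no locking of its own, so two processes writing at once can corrupt the file unless you serialize access yourself.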
If either scenario (1 or 2) is a real problem, should I think about creating one shelve file per server? I was also considering SQLite with a single database holding all the info. With that option I would not have to worry about concurrent updates, but as the file size increases, could I again expect performance to suffer? I am running Python 2.6 CGI scripts under Apache on Windows. They read server info from the current shelve file, or request it from a Python service that goes and collects the info and writes it into the shelve file.
--
http://mail.python.org/mailman/listinfo/python-list
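The SQLite option would look roughly like the sketch below (table and column names are made up for illustration; shown in Python 3 syntax, though the sqlite3 module is also in 2.6). SQLite serializes writers with a file lock, so two CGI processes updating at once won't corrupt the database -- one simply waits briefly for the other.

```python
import sqlite3

# In-memory DB keeps the sketch self-contained; the real scripts
# would pass a file path instead of ":memory:".
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE servers (
                    name   TEXT PRIMARY KEY,
                    os     TEXT,
                    ram_gb INTEGER)""")

# "with conn" wraps the statement in a transaction: it commits on
# success and rolls back on error, which is what makes concurrent
# updates from separate processes safe.
with conn:
    conn.execute("INSERT OR REPLACE INTO servers VALUES (?, ?, ?)",
                 ("server01", "Windows Server 2008", 8))

row = conn.execute("SELECT ram_gb FROM servers WHERE name = ?",
                   ("server01",)).fetchone()
print(row[0])
conn.close()
```

With an index on the primary key, lookups stay fast as the table grows, which addresses the file-size concern better than a flat shelve file would.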
