There is no way to share connections between processes. However, some drivers do allow you to share connections between threads, provided you use a connection pool so that each running thread acquires a connection safely. Here is what I use in such cases:
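To make the pooling idea concrete, here is a minimal thread-safe pool sketch. This is a hypothetical illustration, not the wheezy.core implementation linked below; the `Pool`, `pooled`, `acquire`, and `get_back` names are assumptions chosen to mirror the acquire/return workflow described here.

```python
# Minimal thread-safe pool sketch (hypothetical; see the linked
# wheezy.core pooling module for a real implementation).
import queue
from contextlib import contextmanager


class Pool:
    def __init__(self, factory, size):
        # Eagerly create `size` items; the queue is the synchronization
        # point, so no extra locking is needed.
        self._items = queue.Queue(maxsize=size)
        for _ in range(size):
            self._items.put(factory())

    def acquire(self, timeout=None):
        # Blocks until an item is free, so two threads never hold the
        # same connection at the same time.
        return self._items.get(timeout=timeout)

    def get_back(self, item):
        # Return the item so another thread can use it.
        self._items.put(item)


@contextmanager
def pooled(pool):
    # Guarantees the item is returned even if the request handler raises.
    item = pool.acquire()
    try:
        yield item
    finally:
        pool.get_back(item)


# Usage: each request-handling thread borrows a connection briefly.
# The factory here is a stand-in; in practice it would open a DB connection.
pool = Pool(factory=lambda: object(), size=4)
with pooled(pool) as conn:
    pass  # run queries on `conn`
```

The context manager is the important part: a connection that is never returned leaks a pool slot, and under load the pool eventually blocks every thread.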
https://bitbucket.org/akorn/wheezy.core/src/tip/src/wheezy/core/pooling.py

You acquire a connection from the pool and return it once you are done.

You can also consider using a threaded model instead of processes (this is supported by uWSGI) together with an in-memory cache to share data. The memory cache implementation that you can find here:

https://bitbucket.org/akorn/wheezy.caching/src/tip/src/wheezy/caching/memory.py

does not use serialization to store data, so you should be careful when working with objects returned from the cache. Read-only access is fine, but any operation that might change object state must be synchronized (to avoid conflicts in a multi-threaded environment).

Thanks.

Andriy

________________________________
> Date: Fri, 17 May 2013 12:36:36 -0700
> From: [email protected]
> To: [email protected]
> Subject: Re: [uWSGI] sharing objects amongst workers
>
> Sharing the DB connection probably is not a good idea, and having one
> connection on each worker is not such a big overhead. As for the shared
> array, that is very tricky if you don't have control over process
> spawning and that sort of thing; an easy way would be using a shared
> memory mechanism such as mmap:
>
> http://docs.python.org/2/library/mmap.html
>
> Other methods may work, such as the posix_ipc, sysv_ipc or shm modules.
>
>
> On Fri, May 17, 2013 at 12:15 PM, Andriy Kornatskyy
> <[email protected]> wrote:
> How about using memcache on the same host to share data between processes?
>
> Andriy
>
>
> ________________________________
>> From: [email protected]
>> Date: Fri, 17 May 2013 11:57:30 -0700
>> To: [email protected]
>> Subject: [uWSGI] sharing objects amongst workers
>>
>> We are using the pymongo driver with mongodb and, as it stands, it looks
>> like each worker gets its own connection. Is there any way to share a
>> database connection in memory among the workers? I've read about KSM
>> on the readthedocs page; is that applicable here?
>>
>> We also have a bunch of Python objects being stored in memory; would
>> KSM be helpful in sharing memory among the workers for this?
>>
>> For example: currently we have an identical list stored on each worker.
>> (We keep the list in a Python variable because it is large and we need
>> to access it quickly, so storing it in the uWSGI cache and pickling and
>> unpickling it on each request -- the only way I can think of to store a
>> Python object in the cache -- would be too slow.) We have 6 workers,
>> which means we are using several times the memory we actually need. Is
>> there any way to share a global Python object among all of the workers?
>> All we do is read from it; we only modify it when we completely refresh
>> it periodically.
>>
>> The overarching question is: how do I share DB connections and Python
>> objects globally, so each worker doesn't need its own copy of the
>> Python objects and we aren't keeping so many DB connections open?
>>
>> _______________________________________________
>> uWSGI mailing list
>> [email protected]
>> http://lists.unbit.it/cgi-bin/mailman/listinfo/uwsgi
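The caveat in Andriy's reply -- an in-memory cache that skips serialization hands every thread a reference to the same object, so reads are safe but mutations must be synchronized -- can be illustrated with this sketch. It is a hypothetical `MemoryCache`, not the wheezy.caching code; the pattern to note is that a periodic refresh replaces the cached object wholesale under a lock instead of mutating it in place.

```python
# Hypothetical sketch: a cache that stores object references directly
# (no pickling). Readers share the same object, so treat it as
# read-only; writers must synchronize.
import threading


class MemoryCache:
    def __init__(self):
        self._items = {}
        self._lock = threading.Lock()

    def get(self, key):
        # Returns the shared object itself, not a copy -- do not mutate it.
        return self._items.get(key)

    def set(self, key, value):
        # Writers take the lock so concurrent refreshes cannot interleave.
        with self._lock:
            self._items[key] = value


cache = MemoryCache()
cache.set('big-list', [1, 2, 3])   # initial load
shared = cache.get('big-list')     # every thread sees this same list

# Periodic refresh: build a NEW list and swap it in, rather than
# mutating `shared` while other threads are iterating over it.
cache.set('big-list', [4, 5, 6])
```

Swapping in a fresh object means threads still holding the old reference keep a consistent snapshot, which matches the "read-only access, refresh periodically" usage described in the original question.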
