On 24 October 2015 at 04:15, David Montgomery <[email protected]> wrote:
> Thanks. I am definitely looking for a better workflow. But I have to
> load from Redis a key that will turn into a hash every e.g. 5 minutes. I
> can't load the hash on every request. It will be potentially large.

I am not sure if I am missing something here, but this is what I would do:

1. Have a single process with a timer that recalculates the hash every 5 minutes.
2. That process stores the calculated hash in a Redis key.
3. All other processes read this key.

Redis calls are fast. Very fast. There is no need to be conservative with
them.

Hope this helps,
Mikko

> So if I don't use rbtimer, then each worker has to make a call to
> Redis after so many seconds. But this means a Redis call for every worker
> in a process rather than one. Or am I missing something?
>
> Thanks
>
> On Sat, Oct 24, 2015 at 7:40 AM, Mikko Ohtamaa <[email protected]> wrote:
>
>>> However, the worker has to use the updated hash to select items that
>>> might have been updated from the data that was supposed to be updated
>>> from the reloaded data. However, it is not. For me to get updated data
>>> that works, I have to do a touch reload, which defeats the purpose.
>>> Below is my logic. So how do I resolve this?
>>
>> I assume you are using a global updated_hash. Global variables are not
>> compatible with most web server process models. The global variable might
>> end up in a different process, a dead process, etc.
>>
>> I suggest you just keep *all* your data in Redis, and if you need to
>> maintain a state variable, create a single Redis key for it.
>>
>> As a golden rule, never write to global variables in Python web service
>> processes (well... technically you can do it, but you really must know
>> what you are doing).
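The single-writer pattern described above (one timer process recalculates and stores the hash; every worker reads the key when it needs it) could be sketched roughly as follows. This is only a sketch under assumptions: the key name 'meta_post' is invented, the store is passed in (e.g. a redis-py StrictRedis instance), and under uWSGI the writer would be driven by a timer in exactly one process, e.g. @timer(300, target='mule') from uwsgidecorators.

```python
import json

# Rough sketch of the single-writer pattern; 'meta_post' is an invented
# key name. `store` is anything with .set/.get, e.g. a redis-py
# StrictRedis client. Under uWSGI the writer side would run on a timer
# in exactly one process (e.g. a mule), never in every worker.

def store_hash(store, data):
    # Writer side: the single timer process serializes the freshly
    # calculated hash and stores it in one Redis key.
    store.set('meta_post', json.dumps(data, sort_keys=True))

def load_hash(store):
    # Reader side: every worker fetches the key on demand. Redis GETs
    # are cheap, so doing this per request is fine.
    raw = store.get('meta_post')
    return json.loads(raw) if raw else {}
```

Because the store is injected, the same functions work against a real Redis client in production or an in-memory stand-in when experimenting.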
>>
>> Thanks,
>> Mikko
>>
>>> Thanks
>>>
>>> from uwsgidecorators import *
>>>
>>> @rbtimer(120, target='workers')
>>> def load_redis(signum):
>>>     updated_hash = redis.get('data')
>>>     with open('/tmp/meta_post.json', 'w') as outfile:
>>>         json.dump(updated_hash, outfile, indent=4, sort_keys=True)
>>>     # <- THIS WORKS
>>>
>>> def foo(updated_hash):
>>>     # do stuff
>>>
>>> @get('/hash/')
>>> def hash():
>>>     json_data = foo(updated_hash)  # <- THIS USES OLD DATA
>>>     return json_data

--
Mikko Ohtamaa
http://opensourcehacker.com
http://twitter.com/moo9000
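One way David's quoted snippet could drop the stale module-level updated_hash is to read the key inside the request handler itself. A minimal sketch, not his actual code: the Redis client is passed in so the logic is easy to exercise, `foo` stands in for his own "do stuff" step, and the key 'data' matches the one his load_redis() reads.

```python
import json

def handle_hash_request(r, foo):
    # What the /hash/ handler body could do instead of touching a
    # per-worker global: fetch the key fresh on every request.
    # `r` is a redis-py client (anything with .get works); `foo` is
    # the caller's own processing step from the original snippet.
    raw = r.get('data')  # same key name as in load_redis()
    updated_hash = json.loads(raw) if raw else {}
    return foo(updated_hash)
```

Each request then sees whatever the timer process last wrote, with no global state shared between workers.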
_______________________________________________
uWSGI mailing list
[email protected]
http://lists.unbit.it/cgi-bin/mailman/listinfo/uwsgi
