antoine rouadec at gmail.com
Sun Jul 10 12:40:14 EDT 2005
ok, thanks for the advice and insight into mod_python. I'll do more
testing with the caching module; it looks like a solution to my current
problem. It's not truly enormous, but the loading makes the difference
between an instantaneous page and a one-second delay.

I'm forwarding your answer to the mailing list for future generations ;)
(or at least the ones stuck with apache 1.3 :)

On 7/10/05, Jorey Bump <list at joreybump.com> wrote:
> antoine wrote:
> > hum, ok, but I tried a few more times to access this page in
> > different ways and I can say that in some circumstances the processes
> > are being reused (and therefore my bigdict object is not reloaded, as
> > expected), so it is consistent with Graham's explanations.
>
> I may have misinterpreted your statement to mean that you would
> eventually begin using the object in a persistent manner, rather than
> occasionally access it, which is the case and consistent with both of
> our explanations.
>
> > The cached module "trick" should be more performant than the "global"
> > only if the modules are cached once for all the processes (and not
> > per process), I guess? Anyway, I'll try and see for myself.
>
> Well, they are cached per child, so you get a boost.
>
> > Should this module also use a global or not? Is a simple
> > implementation like:
>
> I wouldn't.
>
> > CacheBigdic.py
> > def getBigDic():
> >     return looong_and_painful_process()
> >
> > enough, or should I go for:
> >
> > CacheBigdic.py
> > bigdic = None
> > def getBigDic():
> >     global bigdic
> >     if not bigdic:
> >         return looong_and_painful_process()
> >     else:
> >         return bigdic
> >
> > and call this from index.py with
> >
> > import CacheBigdic
> > bigdic = CacheBigdic.getBigDic()
> > def index(req):
> >     ...
> >     use(bigdic)
> >     ...
>
> I'd do this, probably:
>
> # CacheBigdic.py
>
> bigdic = looong_and_painful_process()
>
> # publishedmodulewithuniquename.py
>
> import CacheBigdic
> def index(req):
>     ...
>     use(CacheBigdic.bigdic)
>     ...
>
> If you find errors appearing in your apache log under heavy load, you
> may need to alter CacheBigdic.py:
>
> try:
>     bigdic = looong_and_painful_process()
> except TheErrorYouSee:
>     bigdic = looong_and_painful_process()
>
> It looks redundant, but it's needed because the module is cached. In
> any case, it's important to protect yourself if your bigdic gets you in
> trouble.
>
> > I do not need a truly global var; the cached object is never modified
> > (and the looong_and_painful_process() is already nothing but a
> > (c)pickle ;).
>
> Then this should work, but be aware that if your bigdic is truly
> enormous, you should tuck it away and only use it when needed.
> Excessive handling of bigdic could result in a huge load and bring your
> system to exhaustion.
>
> A database is a good candidate for this.

--
Antoine
http://delaunay.org/antoine
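Below is a minimal, self-contained sketch of the per-child caching
pattern discussed above. It is not code from the thread:
looong_and_painful_process() is a stand-in that just builds a large dict
(in the thread it is essentially a cPickle load), and the lazy
getBigDic() variant adds the assignment to the global that the quoted
snippet omits, without which the expensive load would run again on every
call.

import time

def looong_and_painful_process():
    # Stand-in for the expensive load; in the thread this is essentially
    # an unpickle of a large dictionary.
    return dict((i, str(i)) for i in range(500000))

# --- CacheBigdic.py ---
# Eager variant (Jorey's suggestion): build the object once, at import
# time, and reuse it for every request served by the same Apache child.
bigdic = looong_and_painful_process()

# Lazy variant from the question, with the missing assignment to the
# global added so the load really happens only once per child.
_lazy_bigdic = None

def getBigDic():
    global _lazy_bigdic
    if _lazy_bigdic is None:
        _lazy_bigdic = looong_and_painful_process()
    return _lazy_bigdic

# --- index.py (published module) ---
# In a real mod_python setup this would be a separate file starting with
# "import CacheBigdic"; it is inlined here so the sketch runs as a plain
# script.
def index(req):
    return "bigdic has %d entries" % len(bigdic)

if __name__ == "__main__":
    # Outside Apache: the second call is effectively free because the
    # dict is already cached at module level.
    t = time.time()
    getBigDic()
    print("first call:  %.3fs" % (time.time() - t))
    t = time.time()
    getBigDic()
    print("second call: %.3fs" % (time.time() - t))
    print(index(None))

As Jorey notes, the module is cached per child, so with Apache 1.3's
prefork model the one-second load is paid once per process rather than
once per request.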