Graham Dumpleton
grahamd at dscpl.com.au
Mon Jan 15 04:14:36 EST 2007
On 15/01/2007, at 7:41 PM, export at hope.cz wrote:

> Graham,
> Thank you for your reply and help.
> So, now my input filter looks like this:
>
>     def inputfilter(filter):
>         if filter.req.method != 'POST':
>             filter.pass_on()
>             return
>
>         filter.req.log_error('first read')
>
>         s = filter.read()
>         while s:
>             filter.req.log_error('writing (%s)' % len(s))
>             filter.write(s)
>             s = filter.read()
>
>         if s is None:
>             filter.req.log_error('closing')
>             filter.close()
>
> When I check the error log I can see a list of
>
>     writing (8000)
>
> which could be good. I think it says that the data is read and written
> in 8000-byte chunks.
>
> But memory is still NOT released; it keeps increasing while the data
> is being read.
> Should memory be released after each write?
> What should I check to find out why resources are running out?
> Thank you for the help and reply.

Are you still running Django as the content handler? Are you sure that
the increase in memory size isn't something to do with Django?

Instead of running Django as the response handler, write a small custom
handler which does:

    def handler(req):
        req.content_type = 'text/plain'
        req.write(req.read())
        return apache.OK

You will then need to create a static HTML page with a POST form that
posts to the URL which matches the handler. This should have the result
of exercising the filter, with the contents of the POST being returned
as the response. See if that small example still uses increasing
resources.

Graham
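[Editor's sketch] The static test page Graham describes might look like
the following; the action URL "/echo" is an assumption here, and should
be changed to whatever URL your Apache configuration maps to the test
handler:

```html
<!-- Hypothetical test page: POSTs a file to the URL mapped to the
     echo handler. "/echo" is an assumed mount point. -->
<html>
  <body>
    <form action="/echo" method="post" enctype="multipart/form-data">
      <input type="file" name="payload">
      <input type="submit" value="POST to echo handler">
    </form>
  </body>
</html>
```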
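[Editor's sketch] The chunked read/write pattern in the input filter
above can also be exercised outside Apache. This standalone sketch uses
io.BytesIO stand-ins (not the real mod_python filter object, whose
read() takes no size argument) to show that an 8000-byte chunk loop
copies the whole body while never holding more than one chunk at a time:

```python
import io

CHUNK = 8000  # size of the chunks seen in the error log above


def copy_in_chunks(src, dst):
    """Echo src to dst one chunk at a time, like the filter's loop.

    Returns the largest amount of data held in memory at once, which
    should never exceed CHUNK regardless of the total body size.
    """
    max_held = 0
    s = src.read(CHUNK)
    while s:
        max_held = max(max_held, len(s))
        dst.write(s)
        s = src.read(CHUNK)
    return max_held


body = b'x' * 20000  # pretend POST body
src, dst = io.BytesIO(body), io.BytesIO()
held = copy_in_chunks(src, dst)
print(dst.getvalue() == body, held)  # -> True 8000
```

If a loop like this shows flat memory use but the real filter does not,
the growth is happening elsewhere (for example, in the content handler
buffering the request body).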