Joerg
jerch at rockborn.de
Sun Feb 15 22:19:34 EST 2009
On Monday, 16 February 2009 02:42:32, you wrote:
> 2009/2/16 Joerg <jerch at rockborn.de>:
> > hello,
> >
> > I've dumped a file upload (original size = 231KB) with the following
> > results:
> >
> > without filter:
> > - req.read() --> 231KB
> > - req.readline() --> 231KB
> > - req.readline(64000) --> 231KB
> > - via util.py --> 231KB
> >
> > with filter:
> > - req.read() --> 231KB
> > - req.readline() --> 231KB
> > - req.readline(64000) --> 64KB
>
> What about:
>
> s = req.readline(64000)
> while not s:
>     ...
>     # log some debug ....
>     s = req.readline(64000)
>
> Does that result in all data being read, or is that what your test was
> doing?

Hm, I don't know what you mean by the "while not" clause; the handler would
loop there forever.

I've inspected the data stream of an indexed text file to find the break
point. What I found out: if you set the size argument of req.readline(size)
fairly low, the stream breaks after 103888 bytes. Up to that point readline
works correctly, "resuming" the leftover line data on the next call, and so
on; all data after 103888 bytes is lost. If you set the size larger than
this boundary, only the first readline call succeeds and the rest of the
data is lost.

By the way, this break point is in the middle of a filter call, not at its
end (it was the 6th call here, with every call working on 1.4KB of stream
data).

Since this weird behavior only shows up with a filter running (I don't know
whether non-Python input filters are affected too), it is probably caused by
some code in the C part that is only active when a filter sits in front. It
might also be a buffer issue there, since the 103888 byte boundary is
reproducible.

jerch
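
For reference, a minimal handler to reproduce the byte-count test might look
like the sketch below (assuming mod_python 3.x; the handler body, the chunk
size of 64000, and the log message are illustrative, not taken from the
thread):

    from mod_python import apache

    def handler(req):
        # Read the request body in fixed-size readline() chunks and count
        # the bytes that actually arrive, so the totals with and without an
        # input filter in place can be compared.
        total = 0
        chunk = req.readline(64000)
        while chunk:                 # readline() returns '' at end of stream
            total += len(chunk)
            chunk = req.readline(64000)
        req.log_error("readline() delivered %d bytes" % total,
                      apache.APLOG_NOTICE)
        req.content_type = "text/plain"
        req.write("received %d bytes\n" % total)
        return apache.OK

Registered as a PythonHandler, the reported total should match the numbers
quoted above: the full 231KB without the filter, and the truncated figures
once the input filter is active.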