Daniel Nogradi
nogradi at gmail.com
Mon Feb 20 19:16:46 EST 2006
> There are still some problems with the FieldStorage class and large files.
> If you want to upload a
> large file, you MUST use a make_file callback to the FieldStorage
> constructor and NEVER call
> form["file"] for that field or your apache process will load the whole file
> into RAM.
>
> In other words:
> - Does it work with small (e.g. 10k) files?
> - Look at apache's memory usage.
There are two things I don't understand here. First, if you don't pass
a file callback function to FieldStorage it uses the default one,
which creates a temporary file, and that is where an uploaded file
from the posted form data gets written. So why would apache load that
into memory? The corresponding part of util.py is:
    # is this a file?
    if disp_options.has_key("filename"):
        if file_callback and callable(file_callback):
            file = file_callback(disp_options["filename"])
        else:
            file = self.make_file()

and

    def make_file(self):
        return tempfile.TemporaryFile("w+b")
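For illustration, a minimal callback of that shape could write each upload to a real file on disk instead of an anonymous temporary one. This is only a sketch; the upload directory and the name sanitization are my own assumptions, not anything mod_python prescribes:

```python
import os
import tempfile

# Hypothetical destination directory -- an assumption for this
# sketch, not something mod_python provides.
UPLOAD_DIR = tempfile.gettempdir()

def file_callback(filename):
    # Return an open binary file; FieldStorage writes the posted
    # data into it chunk by chunk, so the whole upload never has
    # to sit in RAM at once.
    safe_name = os.path.basename(filename)
    return open(os.path.join(UPLOAD_DIR, safe_name), "w+b")
```

Passing this as the file_callback argument to FieldStorage would then put uploads straight into UPLOAD_DIR under their original basename.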
The second thing is that in my example FieldStorage cannot possibly
be the source of any problems, since the whole request handler is only:
    from mod_python import apache

    def handler(req):
        req.content_type = "text/html"
        req.write("hello")
        return apache.OK
so FieldStorage is not even instantiated. My understanding is that
with such a handler the body of a POST request is simply discarded,
read neither into memory nor anywhere else.
Another thought in this direction: I plan to write an input filter for
large file uploads in order to avoid FieldStorage altogether. Has
anyone done such a thing?
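Independent of mod_python's filter API, the heart of such a filter would just be a fixed-size read loop that streams the request body to its destination, so memory use stays bounded no matter how large the upload is. A sketch of that loop, with the chunk size being my own arbitrary choice:

```python
def stream_to_file(source, dest, chunk_size=65536):
    """Copy a file-like request body to dest in fixed-size chunks.

    Only chunk_size bytes are ever held in memory at a time, so a
    multi-gigabyte upload costs no more RAM than a tiny one.
    Returns the total number of bytes copied.
    """
    total = 0
    while True:
        chunk = source.read(chunk_size)
        if not chunk:  # empty read signals end of body
            break
        dest.write(chunk)
        total += len(chunk)
    return total
```

Inside an actual mod_python input filter the source and dest would be the filter object and an on-disk file, but the bounded-memory loop is the same.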