Hector Muñoz
hectormunozh at gmail.com
Fri Oct 31 17:08:16 EDT 2008
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Hi all,
I'm uploading a very large file (150MB more or less) with mod_python.
I get the file object with:
fileitem = req.form['file']
and I save the file with:
fname = os.path.basename(fileitem.filename)
dir_path = os.path.join('/var/www','resources')
f = open(os.path.join(dir_path, fname), 'wb', 100000)
for chunk in fbuffer(fileitem.file):
    f.write(chunk)
f.close()
where fbuffer is:
def fbuffer(f, chunk_size=100000):
    while True:
        chunk = f.read(chunk_size)
        if not chunk:
            break
        yield chunk
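For what it's worth, the same chunked copy can be done with the standard library's shutil.copyfileobj, which runs an equivalent read/write loop internally so the upload is never held in memory at once. This is just a minimal self-contained sketch; save_upload and the in-memory source are illustrative, not part of the mod_python API:

```python
import io
import os
import shutil

def save_upload(src, dest_path, chunk_size=100000):
    """Copy a file-like object to dest_path in fixed-size chunks."""
    with open(dest_path, 'wb') as dst:
        # copyfileobj reads at most chunk_size bytes per iteration,
        # writing each chunk before reading the next one.
        shutil.copyfileobj(src, dst, chunk_size)

# Usage with an in-memory stand-in for fileitem.file:
data = os.urandom(1024 * 1024)  # 1 MB of test bytes
save_upload(io.BytesIO(data), '/tmp/upload.bin')
```

In the handler you would pass fileitem.file as src; the with-statement also guarantees the output file is closed even if a read fails mid-copy.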
This works fine with files smaller than 70MB, but when I try to send files
bigger than 70MB mod_python crashes with an I/O error.
I have read that very large files can cause problems with the
in-memory file handle. Is there another way to do this? How can I
solve this problem?
Thanks!
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v2.0.9 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org
iEYEARECAAYFAkkLc78ACgkQkXQuTK97GwMV8ACeMkC+z2UkILzuQBQ8wkLzYzPX
okYAn1Pswub0sUopMQwbyGgHTMGyQHLd
=t2np
-----END PGP SIGNATURE-----