Kurt Nordstrom
knordstrom at library.unt.edu
Thu Oct 18 18:38:24 EDT 2007
We're working on a webapp that will need to receive fairly large files
from its clients (it's an archival storage system). The logical way to
do this seems to be HTTP PUT, with appropriate code on the server side
to store the file in the proper place.
Borrowing and modifying some code that Jeremy Jones posted to this list
back in '05, I have been playing with this:
from mod_python import apache
import os

def handler(request):
    # request.content_type = "text/plain"
    content_length = int(request.headers_in["Content-Length"])
    outPath = "/home/webtmp/upload_out.dat"
    outFile = open(outPath, "wb")   # binary mode, since uploads aren't text
    conn = request.connection
    i = 0
    buf = ""
    while i < content_length:
        c = conn.read(1)            # pull one byte at a time off the connection
        if c == "":
            break
        buf = buf + c
        if len(buf) >= 102400:      # flush to disk in ~100k chunks
            outFile.write(buf)
            outFile.flush()
            buf = ""
        i += 1
    if len(buf):                    # write whatever is left in the buffer
        outFile.write(buf)
        outFile.flush()
    outFile.close()
    talkBack = "Received %s bytes\n" % os.stat(outPath).st_size
    request.set_content_length(len(talkBack))
    request.write(talkBack)
    return apache.OK
Basically it reads bytes one at a time from the connection and
accumulates them in a buffer, writing the buffer to the file in 100k
chunks (I'll probably up this once it actually works the way I want it
to; see the chunked-read sketch below for where I'd like to take it).
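For what it's worth, here's the chunked-read variant I've been
sketching as a next step. It reads through the request object's own
read() method, which (as I understand it) honors Content-Length and
returns an empty string once the body is exhausted, instead of pulling
bytes off the bare connection. Untested on big files; the output path
is the same scratch file as above:

from mod_python import apache
import os

CHUNK_SIZE = 102400  # same ~100k granularity as above

def handler(request):
    outPath = "/home/webtmp/upload_out.dat"
    outFile = open(outPath, "wb")
    while True:
        # request.read() tracks Content-Length itself, so there's
        # no manual byte counting; "" signals end of body
        chunk = request.read(CHUNK_SIZE)
        if not chunk:
            break
        outFile.write(chunk)
    outFile.close()
    talkBack = "Received %s bytes\n" % os.stat(outPath).st_size
    request.set_content_length(len(talkBack))
    request.write(talkBack)
    return apache.OK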
The byte-at-a-time version above seems to be working fine... up to a
point. For some reason, it stalls out on files larger than about 95
megs. Everything just freezes, with no activity from client or server,
until I kill the client (I'm using curl -T to PUT files onto the
system).
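For reference, the invocation is along these lines (host and filename
here are placeholders, not our real ones):

curl -T bigfile.dat http://ourserver/upload/

curl's -T (--upload-file) option issues a PUT of the named file to the
given URL.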
Any thoughts on what might be causing this? I've already checked the
Apache config for timeout and request-size limits, and both are set
well beyond anything that should be a problem here.
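Specifically, the directives I double-checked were along these lines
(the values shown are the stock defaults, not our exact settings):

Timeout 300
LimitRequestBody 0

As I understand it, LimitRequestBody 0 means no cap at all on the
request body (which is the default), and our Timeout is considerably
more generous than 300 seconds.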
--
===
Kurt Nordstrom
Programmer
University of North Texas Libraries
Digital Projects Unit
(940) 891-6747