[mod_python] Problems with headers in proxy handler

Sebastjan Trepca trepca at gmail.com
Wed Dec 7 08:14:00 EST 2005


Hi,

I'm writing a proxy handler which makes a request to a different page
and retrieves its content and headers. This is the source:

import urllib2
from mod_python import util

def handler(req):
    req.headers_out.clear()
    f = util.FieldStorage(req)
    url = f.get('url', None)
    if url:
        # If the target URL has a query string, split it off and pass it
        # as the request data; otherwise issue a plain GET.
        url = url.split('?')
        if len(url) > 1:
            u = url[0]
            p = url[1]
            ureq = urllib2.Request(u, p)
        else:
            u = url[0]
            ureq = urllib2.Request(u)
        # Forward the client's request headers to the target server.
        for h in req.headers_in:
            ureq.add_header(h, req.headers_in[h])
        r = urllib2.urlopen(ureq)
        # Copy the target's response headers back to the client,
        # then send the body.
        for h in r.headers:
            req.headers_out[h] = r.headers[h]
        data = r.read()
        req.write(data)
        return 0  # returning -2 doesn't help either
    return 0

But this seems to have problems with the headers: I only get back the basic
headers from my Apache server, not the ones from the proxied page, which is
of course wrong.
Any ideas why this is happening?
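
For reference, a minimal client-side check along these lines is what I have
in mind (the /proxy location and the example.com target are just
placeholders, not my real setup):

import urllib2

# Hypothetical test: assumes the handler above is mounted at /proxy and
# that the page to fetch is passed in the "url" query parameter.
resp = urllib2.urlopen('http://localhost/proxy?url=http://example.com/')

# Headers the proxy handler sends back; at the moment these are only the
# basic Apache ones, not the headers of the proxied page.
print resp.info()

# Body of the proxied page (this part does come through).
print resp.read()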

Thanks, Sebastjan
