[mod_python] modpython, mysqldb best practice

Jim Gallacher jpg at jgassociates.ca
Wed Jul 19 08:52:59 EDT 2006


Martijn Moeling wrote:
> Hi all,
> 
>  
> 
> I started the development of a specific kind of CMS over 2 years ago,
> and due to sparse documentation I am still puzzled about a few things.
> 
> Basically it consists of one .py file and a MySQL database with all the
> data and templates; ALL pages are generated on the fly.
> 
>  
> 
> First of all I am confused about PythonInterpPerDirectory and
> PythonInterpPerDirective, and how they apply to the way I use
> mod_python.
> 
>  
> 
> My Apache has no virtual hosts configured, since my CMS can handle this
> on its own by looking at req.host.
> 
> On one site running on my system ( http://www.mkbok.nl ) we use
> different subdomains, so basically the page to be built is derived from
> req.host (e.g. xxx.yyy.mkbok.nl), where xxx and yyy are variable.
> 
> This is done by DNS records like   *     A      ip1.ip2.ip3.ip4

You don't actually state what the problem is here. ;)

> My next problem seems to be MySQL and MySQLdb.
> 
> Since I do not know in advance which website is being requested
> (multiple are running on that server), I open a database connection, do
> my stuff and close the connection.
> 
> The database to select is again derived from req.host; here the domain
> name determines which database is used.
> 
>  
> 
> The system runs extremely well, but once in a while the webserver
> becomes so busy that it no longer responds to page requests.
> 
> We suspect that mysql is the problem here, since the only thing we can
> see is mysql consuming more and more swap space until at some point it
> runs out of resources and starts looping. At that point the system
> (Linux) keeps running, but at 100% CPU utilization, and we are unable to
> log in and investigate.
> 
> So logging in to the UPS remotely and powering the system down by
> virtually unplugging the cable is the only (and BAD) solution.

Ouch. Maybe you can run a cron job every 5 minutes to check the load and
try to catch the problem before you hit 100%? I'm not suggesting this is
a permanent solution, just do it until you can track down the cause.
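
Something along these lines, run from cron, might at least capture some
evidence before the box wedges completely. Just a rough sketch - the
threshold, log path and the action taken are all placeholders:

import os
import time

THRESHOLD = 4.0                      # 1-minute load average that counts as trouble
LOGFILE = '/var/log/load-check.log'  # wherever you want the evidence

def main():
    load1, load5, load15 = os.getloadavg()
    if load1 > THRESHOLD:
        f = open(LOGFILE, 'a')
        f.write('%s  load: %.2f %.2f %.2f\n'
                % (time.ctime(), load1, load5, load15))
        f.close()
        # You could also dump 'ps' output, mail yourself, or restart
        # apache here, before things get completely out of hand.

if __name__ == '__main__':
    main()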

Is there a chance that mysql is hitting its connection limit? (although
I'm not sure if that would cause the behaviour you describe).
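
It's easy enough to check - something like this (just a sketch, assuming
MySQLdb and an account that is allowed to see the server status):

import MySQLdb

conn = MySQLdb.connect(user='blah', passwd='blah')
cur = conn.cursor()

cur.execute("SHOW VARIABLES LIKE 'max_connections'")
print 'max_connections:      %s' % cur.fetchone()[1]

cur.execute("SHOW STATUS LIKE 'Max_used_connections'")
print 'max_used_connections: %s' % cur.fetchone()[1]

cur.execute("SHOW STATUS LIKE 'Threads_connected'")
print 'threads_connected:    %s' % cur.fetchone()[1]

cur.close()
conn.close()

If Max_used_connections is bumping up against max_connections, you'll at
least have a lead.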

> 
> So what is best practice when you have to connect to mysql with mysqldb
> in a mod_python environment, keeping in mind that the database
> connection has to be built every time a visitor requests a page? Think
> in terms of a "globally" available db or db.cursor connection.

I don't think the performance penalty for creating a connection to mysql
is too great - at least compared to some other databases. You might want
to google for more information.

> Since global variables are troublesome in the .py containing the
> handler, I use a class from which an instance is created every time a
> client connects, and the DB connection is global to that class. Is that
> wrong?

This looks OK.
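
For what it's worth, the pattern I imagine you mean looks something like
this - one instance per request, with the connection held on the instance
and closed by a registered cleanup. The names and the host-to-database
mapping here are made up purely for illustration:

from mod_python import apache
import MySQLdb

class Site:
    """One instance per request; the DB connection lives on the instance."""

    def __init__(self, req):
        self.req = req
        self.db = MySQLdb.connect(db=self.dbname_from_host(req.hostname),
                                  user='blah', passwd='blah')
        # Make sure the connection is closed even if run() raises.
        req.register_cleanup(self.cleanup, self.db)

    def cleanup(self, db):
        db.close()

    def dbname_from_host(self, hostname):
        # Purely illustrative - map the requested host to a database name.
        return hostname.split('.')[-2]

    def run(self):
        # ... build and send the page ...
        return apache.OK

def handler(req):
    return Site(req).run()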

>  
> 
> What happens if mod_python finds an error before my mysqldb connection
> is closed? (Not that this happens a lot, but it does happen, sorry.)

It depends on how you handle the exception. This is why you should close
the connection in a registered cleanup function, which will always run.

> 
> Also I do not understand the req.register_cleanup() method - what is
> cleaned up and what not?

Whatever function you register is run during the cleanup phase (unless
mod_python segfaults - but then you've got other problems). The cleanup
phase occurs after the response has been sent and anything registered is
guaranteed to run, regardless of what happens in prior phases. Typical
usage looks like this:

from mod_python import apache
import MySQLdb

def handler(req):
    conn = MySQLdb.connect(db='blah', user='blah', passwd='blah')
    # db_cleanup will be called with conn as its argument during the
    # cleanup phase, no matter what happens in the rest of the handler.
    req.register_cleanup(db_cleanup, conn)

    # ... do your request handling ...

    return apache.OK


def db_cleanup(connection):
    connection.close()

Jim

