Re: [mod_python] Multiple submissions from the same user

Dominique.Holzwarth at
Sat Apr 5 10:31:05 EDT 2008

Your idea with locking may have a point in general, but it's not really what I want, to be honest. Think of a form with a "show" button that sends a request to the server to fetch some data and display it to the user (as a new HTML response). What I'd like is that the user simply can't hit that show button 1000 times a second, so that a new request is never started before the previous one has actually been transferred back to the user.
Since it makes no sense to "spam-click" that show button (just as an example), it also makes no sense to me to bloat my scripts just to handle that case.
So what I was wondering was whether there is some mechanism within Apache or mod_python itself to simply ignore requests from a user while a request for that user is still being processed.
I know you can tune settings like "MaxClients", "ThreadsPerChild" and "MaxThreads", which are somehow related to how requests are processed. But I have to admit I don't really understand them fully, which is why I'm asking here... ;-)

From: Martin (gzlist) [gzlist at]
Sent: Friday, 4 April 2008 17:40
To: Holzwarth, Dominique (Berne Bauhaus)
Cc: mod_python at
Subject: Re: [mod_python] Multiple submissions from the same user

On 04/04/2008, <Dominique.Holzwarth at> wrote:
> Hello everyone
>  Does anyone know how I can configure apache and/or mod_python in a way, so that multiple requests from the same user are 'ignored' (python scripts not executed) if the answer of the first request wasn't completely sent to the browser yet?
>  I just stumbled over that problem while testing my web application which has a form with several "submit" buttons and when I click one of them - or different ones - very quick after each other ('spam clicking') then my scripts don't like that at all and gracefully die :-)

The solution is to stop the scripts dying. If, for instance, a script needs
to write to a file user/{username}, then running it twice at the same
time should cause the second instance either to fall over or to block
until the first finishes, but the first should be unaffected either
way. If that's not already the case, you need an explicit
locking mechanism:

def do_action_for(user):
    with lock_for(user):
        ...  # do the dangerous work while holding the per-user lock

Once that's in place, you can fix your problem by blocking on the
lock, then once the lock is acquired, checking to see if the action is
already done, and returning the result straight away:

import os

def do_action_for(user):
    with lock_for(user):
        result_filename = result_file_for(user)
        if not os.path.exists(result_filename):
            with open(result_filename, "w") as f:
                f.write(dangerous_stuff_for(user))
If you're cunning, you can combine the lock and result cache mechanism.
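One way to combine the two, sketched as a self-contained function (the dict-based store and the `compute` callback are stand-ins for the real per-user lock and the real work):

```python
import os
import threading

_guard = threading.Lock()  # stand-in for a real per-user lock

def cached_action_for(user, compute, cache_dir="/tmp/results"):
    """Combine the lock and the result cache: whoever acquires the lock
    first does the work; later callers find the file already written
    and just return its contents."""
    path = os.path.join(cache_dir, user)
    with _guard:
        if not os.path.exists(path):
            os.makedirs(cache_dir, exist_ok=True)
            with open(path, "w") as f:
                f.write(compute(user))
    with open(path) as f:
        return f.read()
```

With this shape, spam-clicking just makes the later requests block briefly on the lock and then read the cached result, instead of redoing the dangerous work.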

>  As a quick work-around I was thinking about simply disabling the submit buttons with javascript and the "onsubmit" event handler but as we all know javascript ain't THAT reliable so I was wondering if it's possible to control that problem on the server side.

Client-side scripting really isn't the answer to a server-side bug.
Website users will do unexpected things regardless: like trying to
delete the same item twice because they got confused by the back button.
Taking away their back button isn't the solution; you
need to make sure you cope with odd input.

>  Another, but similar question I have:
>  How can I configure apache / mod_python to allow only ONE active session per user at a time? At the moment I can login with same user/pw unlimited times which also ain't really good for my application =)

With sessions, logging in again is generally allowed, but it should
invalidate the old session. That's slightly annoying when you're testing
with two browsers at once, as they log each other out, but it's the
convention. You can implement it by checking whether the user already has
a session in the database before handing out a new one, and removing the
old one if they do.


More information about the Mod_python mailing list