Thursday, June 20, 2013

prevent multiple copies of a script running at the same time


Single-process servers (and cron jobs) are easier to write.  If a script fires up and starts processing a job that another copy of the same script is already working on, bad things happen.

The traditional solution to this is a bit of a hack.  The script fires up and writes its unique process ID (PID) to a magic file somewhere; when the script exits, it deletes the file.  This works somewhat, but fails if the script gets impolitely killed: the "lock" file still exists, but the associated process is no longer running.
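
For reference, here's a minimal sketch of that PID-file approach (the file location and helper names are assumptions for illustration, not code from this post):

import os, sys

PID_FILE = '/tmp/myscript.pid'   # hypothetical location

def acquire_pidfile_lock():
    # Race: another copy could slip in between the check and the write.
    if os.path.exists(PID_FILE):
        sys.exit('already running (or a stale lock file was left behind)')
    with open(PID_FILE, 'w') as f:
        f.write(str(os.getpid()))

def release_pidfile_lock():
    os.remove(PID_FILE)   # never runs if the script is killed with -9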

The following trick is more robust.  When the script fires up, it binds a socket to a name in the kernel's abstract socket namespace, in effect registering itself with a global list of strings held by the OS.  If the code can't register itself, it knows another copy of the program is already running, and should exit.  When the script dies or is killed, the kernel itself releases the name, so the script can run again.

Alas, this trick is Linux-only: the abstract socket namespace is a Linux extension.  A portable fallback is sketched at the end of the post.


#!/usr/bin/env python

import os, socket, sys, time


def get_lock(process_name=None):
    """
    Prevent multiple copies of a process from running at the same time.

    Returns a socket object on success (keep a reference to it!), or
    None if another copy already holds the lock.
    """
    # http://stackoverflow.com/questions/788411/check-to-see-if-python-script-is-running
    if not process_name:
        process_name = os.path.basename(sys.argv[0])
    lock_socket = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM)
    try:
        # The leading null byte puts the name in Linux's abstract socket
        # namespace: no filesystem entry, and the kernel frees the name
        # when the process exits, however it dies.
        lock_socket.bind('\0' + process_name)
        return lock_socket
    except socket.error:
        # The name is already bound: another copy of us is running.
        return None


def main():
    # Stash the socket somewhere long-lived; if it were garbage
    # collected, the lock would be silently released.
    sys.process_lock = get_lock()
    print(sys.process_lock)
    if sys.process_lock:
        time.sleep(30)


if __name__ == '__main__':
    main()
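
To see the lock in action, run the script in one terminal, then again in a second terminal during the 30-second sleep: the second copy prints None and does nothing.  The same effect can be shown in a single process (this snippet assumes the script above is saved as lockdemo.py; the name 'demo' is arbitrary):

from lockdemo import get_lock

first = get_lock('demo')
second = get_lock('demo')
print(first)     # a socket object: the lock was acquired
print(second)    # None: the abstract name '\0demo' is already bound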


Gist: https://gist.github.com/shavenwarthog/5824515
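
On non-Linux Unixes, a rough equivalent is flock() on an ordinary file: the kernel drops the lock when the process exits, even if it was killed, and the leftover file itself is harmless.  A minimal sketch, with an assumed lock-file path:

import fcntl

def get_flock(path='/tmp/myscript.lock'):   # hypothetical path
    lock_file = open(path, 'w')
    try:
        # Non-blocking exclusive lock; the kernel releases it when the
        # process exits, no matter how it dies.
        fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
        return lock_file    # keep a reference so the file stays open
    except IOError:
        return None         # another copy holds the lock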
