Right now I have a central module in a framework that spawns multiple processes using the Python 2.6 `multiprocessing` module. Because it uses `multiprocessing`, there is a module-level multiprocessing-aware log, `LOG = multiprocessing.get_logger()`. Per the docs, this logger has process-shared locks so that you don't garble things up in `sys.stderr` (or whatever filehandle) by having multiple processes writing to it simultaneously.
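For context, a minimal sketch of the setup described above might look like this (the handler configuration and `worker` function are my own illustrative assumptions, not part of the framework):

```python
import logging
import multiprocessing
import sys

# Module-level multiprocessing-aware logger, as in the central module.
LOG = multiprocessing.get_logger()

def configure_logging():
    # Hypothetical setup: route records to sys.stderr with the
    # process name included, so output from each process is labeled.
    handler = logging.StreamHandler(sys.stderr)
    handler.setFormatter(
        logging.Formatter("[%(levelname)s/%(processName)s] %(message)s"))
    LOG.addHandler(handler)
    LOG.setLevel(logging.INFO)

def worker(n):
    # Each spawned process logs through the same module-level LOG.
    LOG.info("working on item %d", n)

if __name__ == "__main__":
    configure_logging()
    procs = [multiprocessing.Process(target=worker, args=(i,))
             for i in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

The crux of the question is that only this module logs through `LOG`; the rest of the framework logs through plain `logging` loggers that know nothing about the spawned processes.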
The issue I have now is that the other modules in the framework are not multiprocessing-aware. The way I see it, I need to make all dependencies on this central module use multiprocessing-aware logging. That's annoying within
the framework, let alone for all clients of the framework. Are there alternatives I'm not thinking of?