Multiprocessing Logging in Python
When using Python's multiprocessing module, it's important to consider logging practices to avoid garbled output caused by multiple processes writing to the same file handle simultaneously. By default, the multiprocessing-aware logger returned by multiprocessing.get_logger() uses locks so that concurrent writes to sys.stderr are not interleaved mid-message.
However, modules that are not multiprocessing-aware would need to be modified to use this logger. To avoid changing them, consider alternative approaches:
Custom Logging Handler
One approach is to create a custom log handler that sends log records to the parent process over a multiprocessing queue. This lets modules keep using the standard logging module unchanged while the parent process performs the actual file I/O. Here's an implementation:
from logging.handlers import RotatingFileHandler
import multiprocessing, threading, logging

class MultiProcessingLog(logging.Handler):
    def __init__(self, name, mode, maxsize, rotate):
        logging.Handler.__init__(self)
        self._handler = RotatingFileHandler(name, mode, maxsize, rotate)
        self.queue = multiprocessing.Queue(-1)
        # Drain the queue in a daemon thread in the parent process.
        t = threading.Thread(target=self.receive)
        t.daemon = True
        t.start()

    def receive(self):
        # Parent side: write queued records with the real file handler.
        while True:
            self._handler.emit(self.queue.get())

    def emit(self, record):
        # Child side: resolve the message now (args and exc_info may
        # not pickle), then ship the record to the parent.
        record.msg = record.getMessage()
        record.args = None
        record.exc_info = None
        self.queue.put_nowait(record)
The handler receives log records from the child processes over the queue and writes them to the file in the parent, using the wrapped RotatingFileHandler. This centralizes logging without requiring changes to dependent modules.