I'm a little bit confused about how best to approach setting up logging in my Python application.
I'm using the IBPy module, which contains its own logging setup, shown here:
import logging
import os
format = '%(asctime)s %(levelname)-9.9s %(message)s'
datefmt = '%d-%b-%y %H:%M:%S'
##
# Default log level. Set IBPY_LOGLEVEL environment variable to
# change this default.
level = int(os.environ.get('IBPY_LOGLEVEL', logging.DEBUG))
def logger(name='ibpy', level=level, format=format,
           datefmt=datefmt):
    logging.basicConfig(level=level, format=format, datefmt=datefmt)
    return logging.getLogger(name)
My application consists of a bunch of files that are imported into a Jupyter notebook. Final execution takes place inside the notebook.
Currently, other modules are emitting messages at the DEBUG level, and my notebook is flooded with log output.
What is the right approach to configure logging at the module level?
IBPy's call to basicConfig has configured the root logger to catch events at DEBUG level and above and send them to stderr. Because every logger propagates to root by default, that effectively results in every event from every logger being printed. There are a few ways to get this back under control.
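To illustrate the effect (the logger name below is made up), once basicConfig has run, a DEBUG event from any logger anywhere in the process reaches root's handler:

import logging

# basicConfig has already attached a StreamHandler to root and set the
# level to DEBUG, so this propagates up and is written to stderr.
logging.getLogger('some.other.module').debug('this shows up in the notebook')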
Mix and match the following at the top of your own module.
Is ibpy polluting your console? Set just the 'ibpy' logger to WARNING. You will still get DEBUG and above from all other loggers.
logging.getLogger('ibpy').setLevel('WARNING')
Restrict the root logger to only process INFO and above. This effectively means you'll see INFO and above from all loggers.
logging.getLogger().setLevel('INFO')
Remove the handler from root so that nothing goes to stderr at all.
logging.getLogger().handlers.clear()
You may need to configure a new handler if you removed the one from root. Consider attaching it to your own logger rather than root if you want to avoid seeing logs from any other logger, as in the sketch below.
logger = logging.getLogger('mine')
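For example, a minimal sketch (the 'mine' name and the format string are placeholders, not anything IBPy requires): attach a StreamHandler directly to your own logger, so only its events are written out.

import logging

# Root's handler was cleared above, so give your own logger its own handler.
logger = logging.getLogger('mine')
logger.setLevel(logging.DEBUG)

handler = logging.StreamHandler()  # writes to stderr by default
handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(name)s %(message)s'))
logger.addHandler(handler)

logger.info('only events from "mine" (and its children) are printed now')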