
kn_example_python_log_to_file

KNIME / Python - Custom Logging Utility with Timestamped Log Files from within Python Script nodes



Logging levels provided by Python's logging library are:

logging.DEBUG: Detailed information, typically of interest only when diagnosing problems.

logging.INFO: Confirmation that things are working as expected.

logging.WARNING: An indication that something unexpected happened or indicative of some problem in the near future (e.g., 'disk space low'). The software is still working as expected.

logging.ERROR: Due to a more serious problem, the software has not been able to perform some function.

logging.CRITICAL: A very serious error, indicating that the program itself may be unable to continue running.

https://forum.knime.com/t/python-scripting/64621/4?u=mlauber71

There is a Jupyter notebook in the /data/ subfolder: "kn_example_python_log_to_file.ipynb"

```python
import knime.scripting.io as knio
import logging
import sys
import datetime

# Logging levels provided by Python's logging library are:
# logging.DEBUG: Detailed information, typically of interest only when diagnosing problems.
# logging.INFO: Confirmation that things are working as expected.
# logging.WARNING: An indication that something unexpected happened or indicative of some
#   problem in the near future (e.g., 'disk space low'). The software is still working as expected.
# logging.ERROR: Due to a more serious problem, the software has not been able to perform some function.
# logging.CRITICAL: A very serious error, indicating that the program itself may be unable to continue running.

desired_logging_level = logging.DEBUG


class StreamToLogger:
    def __init__(self, logger, level=logging.INFO):
        self.logger = logger
        self.level = level

    def write(self, message):
        if message.rstrip() != "":
            self._log_without_recursion(message.rstrip())

    def flush(self):
        pass

    def _log_without_recursion(self, message):
        # Temporarily restore the original streams so that logging itself
        # does not trigger another redirected write (infinite recursion).
        original_stdout = sys.stdout
        original_stderr = sys.stderr
        sys.stdout = sys.__stdout__
        sys.stderr = sys.__stderr__
        try:
            self.logger.log(self.level, message)
        finally:
            sys.stdout = original_stdout
            sys.stderr = original_stderr


def stop_logging():
    sys.stdout = sys.__stdout__
    sys.stderr = sys.__stderr__


timestamp = datetime.datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
logfile_path = knio.flow_variables['context.workflow.data-path']

# Create a filename with the timestamp
logfile = f"{logfile_path}logfile_{timestamp}.txt"

logger = logging.getLogger("my_logger")
logger.setLevel(desired_logging_level)

# Create file handler and add it to logger
fh = logging.FileHandler(logfile, mode='w')
fh.setLevel(desired_logging_level)
fh.setFormatter(logging.Formatter("%(asctime)s %(name)-12s %(levelname)-8s %(message)s",
                                  datefmt="%Y-%m-%d %H:%M:%S"))
logger.addHandler(fh)

# Redirect stdout and stderr to logger
sys.stdout = StreamToLogger(logger, desired_logging_level)
sys.stderr = StreamToLogger(logger, desired_logging_level)

# --------- start the Python code you want to log to the file
print(logfile_path)

input_table = knio.input_tables[0].to_pandas()

v_colums_to_filter = [col for col in input_table.columns if '<span' in col]

# export as Flow Variable
knio.flow_variables['colums_to_filter'] = v_colums_to_filter

# https://thispointer.com/python-pandas-drop-columns-in-dataframe-by-label-names-or-by-index-positions/
input_table = input_table.drop(v_colums_to_filter, axis='columns')

knio.output_tables[0] = knio.Table.from_pandas(input_table.copy())

logger.debug("This is a debug message.")
logger.info("This is an info message.")
logger.warning("This is a warning message.")
logger.error("This is an error message.")
logger.critical("This is a critical message.")

# ------------------ Stop logging to the file
stop_logging()
```

Summary:

This code provides a custom logging utility that captures and logs messages from the standard output (stdout) and standard error (stderr) streams, as well as directly logged messages using Python's built-in logging library. The logged messages are saved to timestamped log files. Here's a brief overview of the code components:

1. Define a StreamToLogger class that redirects stdout and stderr to a logger object. Its _log_without_recursion method temporarily disables the redirection to prevent infinite recursion while logging. The write method logs messages if they are not empty. The flush method does nothing, as it is not needed in this context.
2. Define a stop_logging function that stops the redirection of stdout and stderr by setting them back to their original values.
3. Generate a timestamped log file name using the current date and time.
4. Create a logger object and configure its logging level and handlers: set the logger's level to DEBUG and add a file handler with a specified logging format and the timestamped log file name.
5. Redirect stdout and stderr to StreamToLogger instances using the created logger.
6. Log messages with different logging levels (DEBUG, INFO, WARNING, ERROR, and CRITICAL).
7. Use print statements to demonstrate that the messages are captured by the logger.
8. Perform a simple operation (dropping the filtered columns from the input table) to show that code execution continues as expected.
9. Call the stop_logging function to stop logging messages to the file.
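Downstream of the Python Script node, the workflow lists the .txt log files in the /data/ folder and reads the newest one back in. Because the timestamp format `%Y-%m-%d_%H-%M-%S` sorts chronologically as a plain string, "pick the latest file" can be done with a simple name sort. A minimal sketch of that step in pure Python (the `demo_data` folder and the two dummy files are assumptions standing in for the workflow's /data/ folder):

```python
from pathlib import Path

# Stand-in for the workflow's /data/ folder
data_path = Path("demo_data")
data_path.mkdir(exist_ok=True)

# Create two dummy log files with timestamped names, as the script above would
for ts in ["2024-01-01_10-00-00", "2024-01-02_10-00-00"]:
    (data_path / f"logfile_{ts}.txt").write_text(f"log from {ts}\n")

# The timestamp format sorts chronologically as a string,
# so the last entry of a name sort is the newest file
latest = sorted(data_path.glob("logfile_*.txt"))[-1]
print(latest.name)        # logfile_2024-01-02_10-00-00.txt
print(latest.read_text())
```

In the workflow itself the same logic is expressed with List Files/Folders, Sorter, Row Filter, and File Reader nodes instead of code.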
Workflow annotations: Python advanced logging to a file; filter based on knio.flow_variables['colums_to_filter']; locate and create the /data/ folder with absolute paths; list the .txt log files, take the latest entry and its file path; read the latest log file.

Nodes

Table Creator, Python Script, Column Filter, Collect Local Metadata, List Files/Folders, Files/Folders Meta Info, Sorter, Row Filter, Table Row to Variable, File Reader
