Custom logger in Streamlit

Hello,

I’m trying to add a handler to the Streamlit logger (and other loggers, such as the ones that log response codes like 200 and 404) that emits JSON, to be sent to Elasticsearch. The aim is to build KPIs for the Streamlit app in Elasticsearch: for example, checking usage, load, how many files were sent, and more.

For this I started by listing all the Streamlit loggers, as in Please improve logging formatter with time · Issue #447 · streamlit/streamlit · GitHub:

import logging

loggers = [
    name for name in logging.root.manager.loggerDict if name.startswith("streamlit")
]

This returns a huge list, but I’m only keeping the few that I think are useful for my application.
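To see what level each of these loggers already has (which matters for the issue at the end of this post), a quick inspection loop like the one below can help. It is only a diagnostic aid, not part of my setup:

import logging

# Print each Streamlit logger alongside its current effective level
for name in sorted(
    n for n in logging.root.manager.loggerDict if n.startswith("streamlit")
):
    logger = logging.getLogger(name)
    print(name, logging.getLevelName(logger.getEffectiveLevel()))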

I’m using the json_log_formatter module to subclass its JSONFormatter class, tracking the log timestamp, the message, and the other fields I need.

Below is the file I created to write the JSON logs to a rotating file. You can use it by importing it in app.py and calling set_logging_format() in main() (see the usage sketch after the code). There is no need to launch Streamlit with the --logger.level=debug option.

import logging
import logging.handlers  # needed explicitly for TimedRotatingFileHandler
import os
from datetime import datetime
from pathlib import Path

import json_log_formatter

# from utils import settings  # app-specific import, not used in this snippet

# Module-level flag: evaluated once when the file is imported in app.py
START = True


def set_logging_format():
    global START
    # Attach the handlers only once (this function cannot be cached)
    if START:
        START = False
        loggers = [
            "streamlit.config",
            "streamlit",
            "streamlit.hashing",
            "streamlit.caching",
            "streamlit.report_thread",
            "streamlit.script_runner",
            "streamlit.report",
            "streamlit.watcher",
            "streamlit.report_session",
            "streamlit.metrics",
            "streamlit.server.routes",
            "streamlit.server",
            "streamlit.components.v1.components",
            "streamlit.components.v1",
            "streamlit.components",
            "streamlit.server.server",
            "streamlit.logger",
        ]

        Path("./my_log_path").mkdir(parents=True, exist_ok=True)
        log_file_path = os.path.join(
            "./my_log_path",
            datetime.now().strftime("logs_%Y_%m_%d_%H_%M_%S.log"),
        )

        formatter = AppJsonFormatLogger()
        json_handler = logging.handlers.TimedRotatingFileHandler(
            filename=log_file_path,
            when="D",  # rotate on a day-based schedule...
            interval=7,  # ...every 7 days
            backupCount=3,  # keep at most 3 rotated files
        )
        json_handler.setFormatter(formatter)

        for name in loggers:
            logger = logging.getLogger(name)
            logger.addHandler(json_handler)
            logger.setLevel(logging.DEBUG)

        return True
    else:
        return False


class AppJsonFormatLogger(json_log_formatter.JSONFormatter):
    def json_record(self, message: str, extra: dict, record: logging.LogRecord) -> dict:
        # Timestamp is taken when the record is formatted, i.e. at emit time
        time_obj = datetime.now()
        time = time_obj.strftime("%Y-%m-%d %H:%M:%S")
        extra["request_date"] = time
        extra["name"] = record.name
        extra["levelname"] = record.levelname
        extra["application_name"] = "My_Super_App_Name"
        extra["details"] = message
        return extra
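For completeness, here is a minimal sketch of the wiring in app.py. The module name json_logging is only a placeholder for whatever the file above is saved as:

import streamlit as st

from json_logging import set_logging_format  # placeholder name for the file above


def main():
    # Attaches the JSON handler once per process; safe to call on every rerun
    set_logging_format()
    st.title("My_Super_App_Name")


if __name__ == "__main__":
    main()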

Here is my issue: I cannot get all the loggers to work at the same time. I found out that the loggers already have a level set. For example, streamlit.server.server is at logging.INFO, and if I don’t set that logger to logging.DEBUG, its records never reach my file.

Why does the logger only pass records to my JSON handler depending on the level set on the logger? In Elasticsearch I don’t want streamlit.server.server flagged as DEBUG but as INFO, and if I keep its level at INFO, nothing gets written to my log file.
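Here is a minimal, self-contained sketch (standard logging only, no Streamlit and no JSON formatter) of the gating behaviour as I understand it: the logger’s own level filters records before any handler sees them, independently of the handler’s level:

import logging

demo = logging.getLogger("demo")
handler = logging.StreamHandler()
handler.setLevel(logging.DEBUG)  # the handler itself would accept every level
demo.addHandler(handler)

demo.setLevel(logging.INFO)
demo.debug("dropped: the logger's INFO level gates this before the handler runs")
demo.info("emitted, and record.levelname is INFO")

demo.setLevel(logging.DEBUG)
demo.debug("emitted, and record.levelname is DEBUG")

If that is what is happening here, setting the loggers to DEBUG should not change how records are flagged in Elasticsearch, since the levelname written by the formatter comes from each record, not from the logger’s threshold. But maybe I am missing something about how Streamlit configures these loggers?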
