
Bug: Unexpected logger behaviour. Unable to use built-in default logger in Litestar #3913

Open
RoTorEx opened this issue Dec 25, 2024 · 4 comments
Labels
Bug 🐛 This is something that is not working as expected

RoTorEx commented Dec 25, 2024

Description

Hi,

I'm currently using Litestar for my new services, and coming from FastAPI I can't get the logger to work correctly.

First: during development we use human-readable custom logs, while on the server we use JSON logs. I didn't find a way to achieve this in the documentation or in the sources, so I guess it isn't covered; consider this an additional mention of that gap.

Second: after several tests with Litestar, I could not suppress its default logger. I use a custom logger built on the standard library logger, with explicitly defined formatters, logging levels, handlers, and interceptors, but in Litestar it doesn't behave the way it does in the other parts of my service(s).

For example, during local development I expect an exception to produce an expanded log in my format, but instead I get the default Litestar log.

P.S. I'm not sure whether this is a bug or a feature, but either way I'd suggest considering not initializing the default logger, or accepting a pre-created, pre-configured user logger, as the FastStream framework does.
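For context, my setup boils down to the standard stdlib pattern sketched below: one configuration call owns the handlers, and every named logger (application or third-party) propagates into it. This is only an illustration of the pattern, not my production code; the logger name is made up.

```python
import logging
from logging.config import dictConfig


class ListHandler(logging.Handler):
    """Collects formatted records so the routing can be inspected."""

    def __init__(self) -> None:
        super().__init__()
        self.records: list[str] = []

    def emit(self, record: logging.LogRecord) -> None:
        self.records.append(self.format(record))


# One configuration call owns the whole logging tree.
dictConfig({"version": 1, "disable_existing_loggers": False, "root": {"level": "DEBUG"}})

sink = ListHandler()
logging.getLogger().addHandler(sink)

# Any named logger, including one a framework creates internally,
# propagates to the root handler without its own configuration.
logging.getLogger("some.framework").warning("routed through the shared sink")
```

The expectation is that a framework's loggers would reach the shared sink the same way, instead of requiring a second, framework-specific configuration.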

URL to code causing the issue

No response

MCVE

# log.py

import json
import logging
import sys

from datetime import datetime
from logging import Formatter, Handler, StreamHandler
from logging.config import dictConfig
from typing import TYPE_CHECKING, Any

import stackprinter

from loguru import logger as loguru_logger
from pydantic import BaseModel


class SerializeJsonLog(BaseModel):
    timestamp: str
    thread: int | str | None
    env: str
    level: str
    source: str
    message: str
    exceptions: list[str] | str | None = None
    trace_id: str | None = None
    span_id: str | None = None
    parent_id: str | None = None
    props: str | None = None


if TYPE_CHECKING:  # pragma: no cover
    from types import FrameType


LEVEL_TO_NAME: dict[int, str] = {
    logging.CRITICAL: "Critical",
    logging.ERROR: "Error",
    logging.WARNING: "Warning",
    logging.INFO: "Info",
    logging.DEBUG: "Debug",
    logging.NOTSET: "Trace",
}


class LoguruInterceptHandler(Handler):
    def emit(self, record: logging.LogRecord) -> None:
        # Get corresponding Loguru level if it exists.
        frame: FrameType | None
        level: str | int

        try:
            level = loguru_logger.level(record.levelname).name

        except ValueError:
            level = record.levelno

        # Find caller from where originated the logged message.
        frame = logging.currentframe()
        depth = 0

        while frame and (depth == 0 or frame.f_code.co_filename == logging.__file__):
            frame = frame.f_back
            depth += 1

        loguru_logger.opt(depth=depth, exception=record.exc_info).log(level, record.getMessage())


class JSONLogFormatter(Formatter):
    """Custom class-formatter for writing logs to json."""

    def __init__(self, app_env: str, *args: Any, **kwargs: Any) -> None:
        self.app_env: str = app_env

        super().__init__(*args, **kwargs)

    def format(self, record: logging.LogRecord, *args: Any, **kwargs: Any) -> str:
        """Write LogRecord to json."""

        log_object: dict = self._format_log_object(record)

        return json.dumps(log_object, ensure_ascii=False)

    def _format_log_object(self, record: logging.LogRecord) -> dict:
        now = datetime.fromtimestamp(record.created).astimezone().isoformat()

        if "/site-packages/" in record.pathname:
            parts = record.pathname.split("/site-packages/")[-1].split("/")
            module_path = ".".join(parts).replace(".py", "")

        else:
            module_path = record.name

        log_obj = SerializeJsonLog(
            timestamp=now,
            thread=record.process,
            env=self.app_env,
            level=LEVEL_TO_NAME[record.levelno],
            source=f"{module_path}:{record.funcName}:{record.lineno}",
            message=record.getMessage(),
        )

        if hasattr(record, "props"):
            log_obj.props = record.props

        if record.exc_info:
            # Stackprinter gets all debug information
            # https://github.com/cknd/stackprinter/blob/master/stackprinter/__init__.py#L28-L137

            log_obj.exceptions = str(
                stackprinter.format(
                    record.exc_info,
                    suppressed_paths=[
                        r"lib/python.*/site-packages/starlette.*",
                    ],
                    add_summary=False,
                ).split("\n"),
            )

        elif record.exc_text:
            log_obj.exceptions = record.exc_text

        # Pydantic model to dict
        json_log = log_obj.model_dump(
            exclude_unset=True,
            by_alias=True,
        )

        # getting additional fields
        if hasattr(record, "request_json_fields"):
            json_log.update(record.request_json_fields)

        return json_log


def configure_logging(app_env: str, log_level: str) -> None:
    base_logging_handlers: dict[str, Any] = {}

    loguru_logger.remove()

    if app_env in ["dev", "test"]:
        log_format = (
            "<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | "
            "<level>{level: <8}</level> | "
            "<cyan>{name}</cyan>:<cyan>{function}</cyan>:<cyan>{line}</cyan> | "
            "<level>{message}</level>"
        )

        loguru_logger.add(sys.stderr, level=log_level, format=log_format, colorize=True, serialize=False)

        base_logging_handlers.update({"intercept": {"()": LoguruInterceptHandler}})

    else:
        base_logging_handlers.update(
            {
                "stream": {
                    "class": "logging.StreamHandler",  # dictConfig expects a dotted-path string here
                    "stream": "ext://sys.stdout",
                    "level": log_level,
                    "formatter": "json",
                }
            }
        )

    handler_names = list(base_logging_handlers)

    cfg: dict[str, Any] = {
        "version": 1,
        "disable_existing_loggers": False,
        "handlers": base_logging_handlers,
        "formatters": {
            "json": {
                "()": JSONLogFormatter,
                "app_env": app_env,
            },
        },
        "loggers": {
            # * Our own services' loggers
            "base": {
                "handlers": handler_names,
                "level": log_level,
                "propagate": False,
            },
            "base.faststream": {
                "handlers": handler_names,
                "level": "WARNING",
                "propagate": False,
            },
            # * Root Python logger (libraries)
            "root": {
                "handlers": handler_names,
                "level": "WARNING",  # ? Setting this to `DEBUG` or `INFO` will show ALL logs
                "propagate": False,
            },
            # * Uvicorn loggers (server launch)
            "uvicorn": {
                "handlers": handler_names,
                "level": "DEBUG",
                "propagate": False,
            },
            "uvicorn.access": {
                "handlers": handler_names,
                "level": "DEBUG",
                "propagate": False,
            },
            "uvicorn.error": {
                "handlers": handler_names,
                "level": "DEBUG",
                "propagate": False,
            },
        },
    }

    dictConfig(cfg)


# main.py

from fastapi import APIRouter, FastAPI
from litestar import Litestar, get

from log import configure_logging


configure_logging("dev", "DEBUG")  # ! My custom logger dictConfig definition


@get("/")
async def failed_endpoint_litestar() -> dict[str, str]:
    _ = 1 / 0  # ! Nothing in stdout :'(

    return {"message": "Hello World"}


router = APIRouter()

@router.get("/")
async def failed_endpoint_fastapi() -> dict[str, str]:
    _ = 1 / 0  # ! I get my expected log in stdout here

    return {"message": "Hello World"}


litestar_app = Litestar(route_handlers=[failed_endpoint_litestar])

fastapi_app = FastAPI()
fastapi_app.include_router(router)
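(For reference, the JSON path of `JSONLogFormatter` can be exercised in isolation with the standard library alone. The trimmed sketch below keeps only a few of the `SerializeJsonLog` fields; the record values are made up.)

```python
import json
import logging


class TinyJSONFormatter(logging.Formatter):
    """Minimal stand-in for JSONLogFormatter: a few fields, serialized to JSON."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps(
            {
                "level": record.levelname,
                "source": f"{record.name}:{record.funcName}:{record.lineno}",
                "message": record.getMessage(),
            },
            ensure_ascii=False,
        )


# Build a record by hand so the formatter can be tested without a handler chain.
record = logging.LogRecord(
    name="base.demo",
    level=logging.ERROR,
    pathname="log.py",
    lineno=42,
    msg="boom: %s",
    args=("division by zero",),
    exc_info=None,
    func="handler",
)
line = TinyJSONFormatter().format(record)
print(line)
```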

Steps to reproduce

Litestar - `uv run python -m uvicorn src.report:litestar_app --host 0.0.0.0 --port 80 --reload`
FastAPI - `uv run python -m uvicorn src.report:fastapi_app --host 0.0.0.0 --port 80 --reload`

Screenshots

Below

Logs

Below

Litestar Version

litestar==2.13.0

Platform

  • Linux
  • Mac
  • Windows
  • Other (Please specify in the description above)
RoTorEx added the Bug 🐛 label on Dec 25, 2024
RoTorEx commented Dec 25, 2024

FastAPI browser output and app logs (expected behaviour):
[Screenshot: FastAPI - api]
[Screenshot: FastAPI - console]

Litestar browser output and app logs (nope):
[Screenshot: Litestar - api]
[Screenshot: Litestar - console]

euri10 (Contributor) commented Dec 25, 2024

Have you seen the 2nd "Attention" point here: https://docs.litestar.dev/latest/usage/logging.html, and does it solve your issue?

RoTorEx commented Dec 25, 2024

> Have you seen the 2nd "Attention" point here: https://docs.litestar.dev/latest/usage/logging.html, and does it solve your issue?

Yeah, sure, I've seen that page.

The problem is that Litestar pushes you to configure a separate logger just for it, and I don't want to maintain a dedicated logger only for Litestar when I already have a sufficient logger configuration for all application components.

Plus, it doesn't say anything about JSON logs.

RoTorEx commented Dec 25, 2024

The same goes for the StructlogPlugin: structlog (or loguru) is already set up in my logger configuration, yet Litestar requires configuring it again, which violates the single source of truth (SSOT) for logger initialization.
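To make the SSOT point concrete, here is what happens at the stdlib level when two components both call `dictConfig`. Whether Litestar's default setup does exactly this I can't verify from my side, so treat it as a sketch of the stdlib mechanism, not a claim about Litestar's internals:

```python
import logging
from logging.config import dictConfig

captured: list[str] = []


class ListHandler(logging.Handler):
    def emit(self, record: logging.LogRecord) -> None:
        captured.append(record.getMessage())


# 1) The application configures logging once (the SSOT).
dictConfig({"version": 1, "disable_existing_loggers": False})
app_logger = logging.getLogger("base")
app_logger.addHandler(ListHandler())
app_logger.setLevel(logging.INFO)

# 2) A later dictConfig call by some other component that leaves the
#    default disable_existing_loggers=True silently disables "base".
dictConfig({"version": 1})
app_logger.info("lost")  # dropped: the logger is now disabled

# 3) With disable_existing_loggers=False the earlier setup survives.
dictConfig({"version": 1, "disable_existing_loggers": False})
app_logger.info("kept")

print(captured)
```

So any framework that configures logging on its own behalf, after the application has already done so, can break the application's loggers unless it is careful with `disable_existing_loggers`.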
