9 changes: 3 additions & 6 deletions .devcontainer/devcontainer.json
@@ -26,18 +26,15 @@
"customizations": {
"vscode": {
"extensions": [
"charliermarsh.ruff",
"ms-azuretools.vscode-docker",
"ms-python.black-formatter",
"ms-python.flake8",
"ms-python.isort",
"ms-python.python",
"ms-python.mypy-type-checker",
"tamasfe.even-better-toml"
],
"settings": {
"editor.formatOnSave": true,
"editor.rulers": [
120
130
],
"editor.tabSize": 4,
"files.insertFinalNewline": true,
Expand All @@ -46,7 +43,7 @@
"editor.codeActionsOnSave": {
"source.organizeImports": "explicit"
},
"python.editor.defaultFormatter": "ms-python.black-formatter",
"python.editor.defaultFormatter": "charliermarsh.ruff",
"python.testing.unittestEnabled": false,
"python.testing.pytestArgs": [
"src/tests"
58 changes: 55 additions & 3 deletions README.md
@@ -3,8 +3,8 @@
![python version](https://img.shields.io/badge/python-3.13-brightgreen)
![fastAPI version](https://img.shields.io/badge/fastapi-0.95.2-brightgreen)


## Components

- REST API built with FastAPI and SQLAlchemy
- PostgreSQL database

@@ -40,22 +40,22 @@ variable loaded, otherwise the dev container might fail or not work as expected.
PROJECT_NAME=your-awesome-project code <path/to/repo>
```


## Migrations

We use Alembic as the database migration tool. You can run migration commands directly inside the dev container or use the shortcuts provided in the `exec.sh` script.

- `migrate` – Runs all migrations.
- `makemigrations` – Compares the current database state with the table metadata and generates the necessary migration files.
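
For orientation, `makemigrations` generates standard Alembic revision files. A minimal sketch of what one looks like (the revision ids, table, and column are illustrative assumptions, not taken from this template):

```python
"""Hedged sketch of a generated Alembic revision; ids and schema are illustrative."""
import sqlalchemy as sa
from alembic import op

revision = "abc123"  # illustrative revision id
down_revision = "def456"  # illustrative parent revision


def upgrade() -> None:
    # Apply the schema change detected against the table metadata.
    op.add_column("users", sa.Column("is_active", sa.Boolean(), nullable=False, server_default=sa.true()))


def downgrade() -> None:
    # Revert the change.
    op.drop_column("users", "is_active")
```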


## Code tools

Linters, formatters, etc.

- **ruff**: Linter and formatter
- **mypy**: Static type checker

### pre-commit

`pre-commit` is part of the `dev` group in the `pyproject.toml` and is installed by default.

Set up the `pre-commit` hooks, specified in `.pre-commit-config.yaml`:
@@ -77,9 +77,11 @@ There is a shortcut under the `/scripts` directory that runs all these tools for
![Screenshot](.docs/images/format.png)

## Tests

We use FastAPI's `TestClient` and `pytest` for testing. The `./exec.sh test` shortcut can be used to run all tests, or just run `test` inside the dev container.
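
For illustration, a minimal test sketch with `TestClient` (the `src.main` app import path and the endpoint are assumptions, not taken from this template):

```python
# Minimal sketch of a pytest test using FastAPI's TestClient.
# The app import path and the endpoint are illustrative assumptions.
from fastapi.testclient import TestClient

from src.main import app

client = TestClient(app)


def test_me_requires_auth() -> None:
    # Without credentials, a protected endpoint should return 401.
    response = client.get("/api/v1/users/me")
    assert response.status_code == 401
```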

## Shell

You can start an interactive Python shell inside the dev container in two ways:

1. Simply run `shell` inside the container.
@@ -94,17 +96,67 @@
![Screenshot](.docs/images/shell.png)

## Admin

The template includes an admin interface via [SQLAdmin](https://github.com/aminalaee/sqladmin). It's a flexible admin that can be configured in many ways.

*One note: be careful when adding relationships to the list or detail pages (especially large many-to-many / one-to-many relationships), because the resulting DB queries are not optimal in those cases (all the related objects would be loaded into memory).*
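
As a rough sketch of how a model view can be registered with SQLAdmin (the app, engine, and model import paths plus the columns are illustrative assumptions):

```python
# Hedged sketch of a SQLAdmin model view; import paths and columns are
# illustrative assumptions, not this template's actual layout.
from sqladmin import Admin, ModelView

from src.core.database import engine  # assumed engine location
from src.main import app  # assumed FastAPI app location
from src.models import User  # assumed model location

admin = Admin(app, engine)


class UserAdmin(ModelView, model=User):
    # Stick to scalar columns here; large relationships would load all
    # related objects into memory (see the note above).
    column_list = [User.id, User.email]


admin.add_view(UserAdmin)
```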

![Screenshot](.docs/images/admin.png)

## Celery

The template includes a simple example of distributed tasks using [Celery](https://docs.celeryq.dev/en/stable/). There's an example endpoint that sends a task to the queue, which the Celery worker then executes. You can monitor the worker with [Flower](https://flower.readthedocs.io/en/latest/): first execute `poetry run celery -A src.task_queue.celery_worker flower --loglevel=info`, then go to `localhost:5555`.

To implement a real-world task, add your logic to `src/task_queue/task.py` and then modify `src/api/v1/routers/task.py` accordingly.
Remember to always register your task modules in `src/task_queue/celery_worker.py` with `celery_worker.autodiscover_tasks(["path.to.your.task.module"])`.
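
A hedged sketch of what such a task could look like (the task name and body are illustrative; the `celery_worker` import follows the paths above):

```python
# Hedged sketch of a task in src/task_queue/task.py; the task itself is
# an illustrative assumption, not part of this template.
from src.task_queue.celery_worker import celery_worker


@celery_worker.task
def send_welcome_email(user_id: str) -> None:
    # Replace with real logic, e.g. render a template and call an email API.
    print(f"Sending welcome email to user {user_id}")
```

The endpoint in `src/api/v1/routers/task.py` could then enqueue it with `send_welcome_email.delay(str(user.id))`.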

## OpenTelemetry

A simple example of OpenTelemetry is included, using the native FastAPI instrumentor to collect basic request data; there is also a custom `@instrument` decorator to collect data from the controllers. A simple implementation monitors Celery by counting the total tasks executed. Given that OpenTelemetry does not ship a frontend, to see what is going on you should run `docker logs -f <otel-collector-container-id>`.
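
For orientation, a minimal sketch of manual instrumentation with the public OpenTelemetry API (the span and counter names are illustrative assumptions, not this template's):

```python
# Hedged sketch using the public OpenTelemetry API; instrument names are
# illustrative and not taken from this template.
from opentelemetry import metrics, trace

tracer = trace.get_tracer(__name__)
meter = metrics.get_meter(__name__)
tasks_executed = meter.create_counter(
    "celery.tasks.executed", description="Total Celery tasks executed"
)


def run_task() -> None:
    # Wrap the work in a span and bump the counter when it completes.
    with tracer.start_as_current_span("run_task"):
        ...  # task logic goes here
        tasks_executed.add(1)
```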

## Logging

This project uses [structlog](https://www.structlog.org/en/stable/) for structured logging.

Structured logs make it easier to parse, search, and analyze logs in production systems, especially when using centralized logging tools like Loki, ELK, or Datadog.

### Examples

1. Pass context information as keyword arguments

```python
import structlog

log = structlog.get_logger()
# Pass extra information as keyword arguments to log calls instead of embedding it in the log message itself
log.info("user_logged_in", user_id="1234", method="password")
```

2. Bind additional context information to be used by any logger instance in scope

```python
import structlog
from structlog.contextvars import bound_contextvars

async def route_handler(user_id: str, session) -> User:
    with bound_contextvars(user_id=user_id):
        logger = structlog.get_logger(__name__)
        logger.info("Handling request XYZ")  # Will include user_id in the log entry

        user = await fetch_user(user_id, session)

    return user


async def fetch_user(user_id: str, session) -> User | None:
    logger = structlog.get_logger(__name__)
    log = logger.bind(class_name=User.__name__)
    user = await User.objects(session).get(User.id == user_id)

    if user is None:
        # This will include class_name and also user_id in the logs, since this call
        # happens inside the bound_contextvars block of the calling handler.
        log.debug("Record not found")

    return user
```
20 changes: 19 additions & 1 deletion poetry.lock


1 change: 1 addition & 0 deletions pyproject.toml
@@ -38,6 +38,7 @@ python-jose = "^3.4.0"
redis = "^5.2.1"
sqladmin = "^0.20.1"
sqlalchemy = "^2.0.39"
structlog = "^25.3.0"
uvicorn = "^0.34.0"


22 changes: 16 additions & 6 deletions src/api/v1/routers/user.py
@@ -1,9 +1,11 @@
from typing import Any
from uuid import UUID

import structlog
from fastapi import APIRouter, Depends, Response
from fastapi_pagination import Page
from fastapi_pagination.ext.sqlalchemy import paginate
from structlog.contextvars import bound_contextvars

from src import models
from src.api.dependencies import db_session, get_user
@@ -39,14 +41,22 @@

@router.get("/me", response_model=schemas.User)
def me(user: models.User = Depends(get_user)) -> Any:
    logger = structlog.get_logger(__name__)
    logger.debug("Getting current user profile")
    return user


@router.get("/{user_id}/items", response_model=Page[Item])
async def get_public_items(user_id: UUID, session: AsyncSession = Depends(db_session)) -> Any:
    # We can't use the @instrument decorator here because it will collide with the
    # FastAPIInstrumentor and cause the span to be created twice.
    # So we need to create the span manually.
    with tracer_provider.get_tracer(__name__).start_as_current_span("get_public_items"):
        user = await models.User.objects(session).get_or_404(models.User.id == user_id)
        return await paginate(session, user.get_public_items())
    # Adding the handler name to the context information for loggers
    with bound_contextvars(method_handler="get_public_items"):
        logger = structlog.get_logger(__name__)
        logger.debug("Getting user public items")

        # We can't use the @instrument decorator here because it will collide with the
        # FastAPIInstrumentor and cause the span to be created twice.
        # So we need to create the span manually.
        with tracer_provider.get_tracer(__name__).start_as_current_span("get_public_items"):
            user = await models.User.objects(session).get_or_404(models.User.id == user_id)

        return await paginate(session, user.get_public_items())
55 changes: 44 additions & 11 deletions src/core/config.py
@@ -1,18 +1,52 @@
from enum import Enum
import logging
import sys
from enum import IntEnum, StrEnum
from typing import Any, Literal

from pydantic import PostgresDsn
from pydantic_settings import BaseSettings
from pydantic import PostgresDsn, model_validator
from pydantic_settings import BaseSettings, SettingsConfigDict


class LogLevel(str, Enum):
    critical = "CRITICAL"
    error = "ERROR"
    warning = "WARNING"
    info = "INFO"
    debug = "DEBUG"
class Env(StrEnum):
    dev = "dev"


class LogLevel(IntEnum):
    CRITICAL = logging.CRITICAL
    ERROR = logging.ERROR
    WARNING = logging.WARNING
    INFO = logging.INFO
    DEBUG = logging.DEBUG


class LogSettings(BaseSettings):
    # Makes the settings immutable and hashable (can be used as a dict key)
    model_config = SettingsConfigDict(frozen=True)

    log_level: LogLevel = LogLevel.INFO
    structured_log: bool | Literal["auto"] = "auto"
    cache_loggers: bool = True

    @property
    def enable_structured_log(self) -> bool:
        return not sys.stdout.isatty() if self.structured_log == "auto" else self.structured_log

    @model_validator(mode="before")
    @classmethod
    def parse_log_level(cls, data: Any) -> Any:
        if isinstance(data, dict) and isinstance(data.get("log_level"), str):
            data["log_level"] = LogLevel[data["log_level"]]

        return data


class Settings(BaseSettings):
    # Makes the settings immutable and hashable (can be used as a dict key)
    model_config = SettingsConfigDict(frozen=True)

    env: str = "dev"
    project_name: str

    # Database
    async_database_url: PostgresDsn
    database_pool_pre_ping: bool
@@ -21,8 +55,8 @@ class Settings(BaseSettings):
    database_max_overflow: int

    # Logging
    log_level: LogLevel = LogLevel.debug
    server_url: str
    log_settings: LogSettings = LogSettings()

    # Auth
    access_token_expire_minutes: float
@@ -36,7 +70,6 @@
    # OpenTelemetry
    otel_exporter_otlp_endpoint: str
    env: str = "dev"


settings = Settings()
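
A hedged usage sketch of the new `LogSettings` (values passed inline for illustration; in the running app they come from environment variables such as `LOG_LEVEL`, which pydantic-settings matches case-insensitively):

```python
# Hedged sketch: parse_log_level maps a level name onto the IntEnum via
# LogLevel["ERROR"], so string values from the environment work too.
from src.core.config import LogLevel, LogSettings

log_settings = LogSettings(log_level="ERROR", structured_log=False)
assert log_settings.log_level is LogLevel.ERROR
assert isinstance(log_settings.log_level, int)  # IntEnum members are ints
assert log_settings.enable_structured_log is False  # not "auto", so used as-is
```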
6 changes: 4 additions & 2 deletions src/core/database.py
@@ -1,8 +1,8 @@
import logging
import uuid
from datetime import datetime, timezone
from typing import Any, Dict, Generic, Sequence, Type, TypeVar

import structlog
from fastapi import HTTPException
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor
from sqlalchemy import ExceptionContext, func, select
@@ -27,6 +27,8 @@
from src.helpers.casing import snakecase
from src.helpers.sql import random_uuid, utcnow

logger = structlog.get_logger(__name__)

# Async engine and session
engine: AsyncEngine = create_async_engine(
    url=str(settings.async_database_url),
@@ -54,7 +56,7 @@ def _on_handle_error(context: ExceptionContext) -> None:
    Returns:
        None: this returns nothing.
    """
    logging.warning(f"handle_error event triggered for PostgreSQL engine: {context.sqlalchemy_exception}")
    logger.warning(f"handle_error event triggered for PostgreSQL engine: {context.sqlalchemy_exception}")
    if "Can't connect to PostgreSQL server on" in str(context.sqlalchemy_exception):
        # Setting is_disconnect to True should tell SQLAlchemy to treat this as a connection error and retry
        context.is_disconnect = True  # type: ignore
32 changes: 24 additions & 8 deletions src/core/security.py
@@ -58,16 +58,19 @@ def process_login(cls, user: User, response: Response) -> Token | None:
        return None

    async def get_user_from_token(self, token: str, session: AsyncSession) -> User:
        try:
            payload = jwt.decode(token=token, key=settings.jwt_signing_key, algorithms=self.algorithm)
            token_data = TokenPayload(**payload)
        except (JWTError, ValidationError):
            raise self.credentials_exception
        token_data = await self._validate_token(token=token)
        user = await User.objects(session).get(User.id == token_data.user_id)
        if not user:
            raise self.credentials_exception
        return user

    async def _validate_token(self, token: str) -> TokenPayload:
        try:
            payload = jwt.decode(token=token, key=settings.jwt_signing_key, algorithms=self.algorithm)
            return TokenPayload(**payload)
        except (JWTError, ValidationError):
            raise self.credentials_exception

    def _get_token_from_cookie(self, request: Request) -> str | None:
        token = request.cookies.get(self.cookie_name)
        return token
@@ -79,16 +82,29 @@ def _get_token_from_header(self, request: Request) -> str | None:
            return None
        return token

    def _get_token(self, request: Request) -> str:
    def _get_token(self, request: Request) -> str | None:
        token = None
        if self.accept_header:
            token = self._get_token_from_header(request)
        if not token and self.accept_cookie:
            token = self._get_token_from_cookie(request)
        if not token:
            raise self.credentials_exception

        return token

    async def __call__(self, request: Request, session: AsyncSession) -> User:
        token = self._get_token(request)
        if not token:
            raise self.credentials_exception

        return await self.get_user_from_token(token, session)

    async def get_token_payload(self, request: Request) -> TokenPayload | None:
        """
        Get the user token payload from the request auth token if present. Otherwise, return None.
        This method validates the token if present and raises a 401 error if it is invalid.
        """
        token = self._get_token(request)
        if not token:
            return None

        return await self._validate_token(token)