2 changes: 1 addition & 1 deletion .devcontainer/aliases-devcontainer
@@ -1,5 +1,5 @@
alias format='scripts/format.sh'
alias makemigrations='scripts/makemigrations.sh'
alias migrate='scripts/migrate.sh'
alias shell='poetry run python src/helpers/shell.py'
alias shell='uv run python src/helpers/shell.py'
alias test='scripts/test.sh'
4 changes: 2 additions & 2 deletions .devcontainer/devcontainer-start.sh
@@ -1,7 +1,7 @@
#!/usr/bin/bash

# Run start commands such as poetry install to keep dependencies updated and in sync with your lock file.
# Run start commands such as uv sync to keep dependencies updated and in sync with your lock file.

set -xeo pipefail

poetry install --no-ansi --no-root
uv sync --frozen --no-install-project --all-groups
6 changes: 2 additions & 4 deletions .github/dependabot.yml
@@ -2,8 +2,8 @@
version: 2

updates:
# Enable version updates for poetry
- package-ecosystem: "pip"
# Enable version updates for uv
- package-ecosystem: "uv"
commit-message:
include: "scope"
prefix: "chore"
@@ -14,8 +14,6 @@ updates:
- xmartlabs/python
schedule:
interval: "weekly"
# dependabot doesn't support poetry v2. See: https://github.com/dependabot/dependabot-core/issues/11237
open-pull-requests-limit: 0

# Enable version updates for Docker
- package-ecosystem: "docker"
12 changes: 6 additions & 6 deletions .github/workflows/python-app.yml
@@ -23,8 +23,8 @@ jobs:

- name: Install dependencies
run: |
pip install poetry
poetry install --no-root --with dev
pip install uv
uv sync --frozen --no-cache --no-install-project --all-groups

- name: Install pre-commit
run: pip install pre-commit
@@ -75,13 +75,13 @@

- name: Install dependencies
run: |
pip install poetry
poetry install --no-root --with dev
pip install uv
uv sync --frozen --no-cache --no-install-project --group dev --no-group types

- name: Run tests with coverage
run: |
poetry run coverage run -m pytest
poetry run coverage report -m --fail-under=80
uv run coverage run -m pytest
uv run coverage report -m --fail-under=80

docker-build:
name: Build Docker Image
23 changes: 9 additions & 14 deletions Dockerfile
@@ -22,23 +22,19 @@ RUN groupadd -g 10001 ${USER} \

USER ${USER}

ENV POETRY_HOME="/home/${USER}/.local/share/pypoetry"
ENV POETRY_NO_INTERACTION=1
ENV POETRY_VERSION=2.1.1
ENV UV_VERSION=0.7.18

# Poetry is installed in user's home directory, which is not in PATH by default.
# uv is installed in user's home directory, which is not in PATH by default.
ENV PATH="$PATH:/home/${USER}/.local/bin"
ENV PYTHONPATH=/opt/app/${PROJECT_NAME}

RUN pip install --upgrade pip \
&& pip install --user poetry==${POETRY_VERSION}
&& pip install --user uv==${UV_VERSION}

WORKDIR /opt/app/${PROJECT_NAME}
COPY --chown=${USER}:${USER} . .

RUN poetry install --no-root --without dev \
# clean up installation caches and artifacts
&& yes | poetry cache clear . --all
RUN uv sync --frozen --no-cache --no-install-project --no-default-groups


# ----
@@ -64,22 +60,21 @@ RUN echo '%sudo ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers
USER ${USER}

RUN mkdir -p /home/${USER}/.cache
RUN poetry install --no-root --all-groups
RUN uv sync --frozen --no-cache --no-install-project --all-groups

CMD ["sleep", "infinity"]

# ----
# Celery worker stage
FROM base AS celery_worker

CMD ["poetry", "run", "celery", "-A", "src.task_queue.celery_worker", "worker", "--loglevel=info"]
CMD ["uv", "run", "celery", "-A", "src.task_queue.celery_worker", "worker", "--loglevel=info"]

# ----
# Builder will package the app for deployment
FROM base AS builder

RUN poetry check \
&& poetry build --format wheel
RUN uv build --wheel

# ----
# Deployment stage to run in cloud environments. This must be the last stage, which is used to run the application by default
@@ -96,10 +91,10 @@ USER ${USER}
# TODO(remer): wheel version has to match what is set in pyproject.toml
COPY --from=builder /opt/app/${PROJECT_NAME}/dist/python_template-0.1.0-py3-none-any.whl /opt/app/${PROJECT_NAME}/dist/python_template-0.1.0-py3-none-any.whl

RUN poetry run pip install --no-deps dist/python_template-0.1.0-py3-none-any.whl
RUN uv run pip install --no-deps dist/python_template-0.1.0-py3-none-any.whl

EXPOSE 8000

ENTRYPOINT ["poetry", "run", "python", "-m", "uvicorn", "src.main:app"]
ENTRYPOINT ["uv", "run", "python", "-m", "uvicorn", "src.main:app"]

CMD ["--host", "0.0.0.0", "--port", "8000"]
6 changes: 3 additions & 3 deletions README.md
@@ -18,12 +18,12 @@ You can connect to the container once it's running using `scripts/exec.sh bash`

Once the containers and server are running, you can go to `http://localhost:8000/docs` to see the automatic interactive API documentation.

In case you don't want to use VS Code and dev containers, or you want to set up the environment in a different way, you can use the `Dockerfile` in the root of the repository to create the image with everything needed to run the project. The `docker-compose.yaml` and `.env.example` files in the `.devcontainer` folder serve as references for recreating other services like the database. Also, you will need to run the `poetry install --no-ansi --no-root` command manually to install all the required dependencies.
In case you don't want to use VS Code and dev containers, or you want to set up the environment in a different way, you can use the `Dockerfile` in the root of the repository to create the image with everything needed to run the project. The `docker-compose.yaml` and `.env.example` files in the `.devcontainer` folder serve as references for recreating other services like the database. Also, you will need to run the `uv sync --frozen --no-cache --no-install-project --all-groups` command manually to install all the required dependencies.

Alternatively, you must have:

- `Python >3.13`
- [Poetry](https://python-poetry.org/docs/#installation) (don't forget to install the dependencies from the lock file)
- [uv](https://docs.astral.sh/uv/getting-started/installation/) (don't forget to install the dependencies from the lock file)
- [PostgreSQL](https://www.postgresql.org/) database, setting the corresponding environment variables for the database connection.

For making code changes, installing `pre-commit` is necessary (see section [Code tools: pre-commit](#pre-commit))
@@ -105,7 +105,7 @@ The template includes an admin interface via [SQLAdmin](https://github.com/amina

## Celery

The template includes a simple example of distributed tasks using [Celery](https://docs.celeryq.dev/en/stable/). There's an example endpoint that sends a task to the queue, which the Celery worker then executes. You can monitor the worker with [Flower](https://flower.readthedocs.io/en/latest/); to do so, first execute `poetry run celery -A src.task_queue.celery_worker flower --loglevel=info` and then go to `localhost:5555`.
The template includes a simple example of distributed tasks using [Celery](https://docs.celeryq.dev/en/stable/). There's an example endpoint that sends a task to the queue, which the Celery worker then executes. You can monitor the worker with [Flower](https://flower.readthedocs.io/en/latest/); to do so, first execute `uv run celery -A src.task_queue.celery_worker flower --loglevel=info` and then go to `localhost:5555`.

In case you want to implement a real-world task, you should modify `src/task_queue/task.py` with your logic and then update `src/api/v1/routers/task.py`.
Remember to always add all your task modules to `src/task_queue/celery_worker.py` with `celery_worker.autodiscover_tasks(["path.to.your.task.module"])`.
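
As an illustration, here is a minimal sketch of what such a task module could look like. It assumes the Celery application defined in `src/task_queue/celery_worker.py` is exposed as `celery_worker`; the task name `generate_report` is purely hypothetical:

```python
# Hypothetical task in src/task_queue/task.py.
# Assumes the Celery app in src/task_queue/celery_worker.py is exported as `celery_worker`.
from src.task_queue.celery_worker import celery_worker


@celery_worker.task
def generate_report(report_id: int) -> str:
    # Replace this placeholder with your real task logic.
    return f"Report {report_id} generated"
```

The worker only picks the task up once its module is registered through the `autodiscover_tasks` call mentioned above; the endpoint in `src/api/v1/routers/task.py` can then enqueue it with, for example, `generate_report.delay(report_id)`.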