A project using Python and PostgreSQL to model an inventory tracking system.
- [ ] IaC to deploy to Production
- [ ] Docker Compose file
- [ ] Change `HTTPException` details to be more like FastAPI's
- [ ] Update request and response models to be more JSON-like (camelCase)
Design and create a REST API using FastAPI and SQLAlchemy: a quick step into using an ASGI framework and an exploration of the rich typing system that both projects have made use of.
Currently the tool of choice for Python environments is Poetry. It makes it simple to handle version pinning and even packaging as needed.
```shell
# Get started by installing a virtual env
poetry install
# Activate venv
poetry shell
# or use the venv Python without activation
poetry run python ...
```
To keep a single source of truth for the version, the semver is stored in `inven_api/__init__.py` instead of `pyproject.toml`, and I use Tiangolo's Poetry Version Plugin to load the version from there.
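For reference, the wiring for that setup looks roughly like the following; the version number is illustrative, and you should check the plugin's own README for the exact configuration it expects:

```toml
# pyproject.toml (sketch) - tell the plugin to read the version
# from the package's __init__.py rather than from [tool.poetry]
[tool.poetry-version-plugin]
source = "init"
```

with `inven_api/__init__.py` containing a line such as `__version__ = "0.1.0"`.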
Black has been a great formatter for all of my projects, and this one is no different. Ruff has been a great addition to Python projects, yet again showing a great integration of Python and Rust.
These are run on the code in this project using pre-commit. To enforce that usage:
```shell
# Make sure that dependencies have been previously installed
poetry run pre-commit install
poetry run pre-commit run --all-files
```
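A `.pre-commit-config.yaml` wiring up both tools might look roughly like this; the `rev` pins are illustrative, so match them to whatever versions the project actually uses:

```yaml
# Sketch of a pre-commit config running Black and Ruff on each commit
repos:
  - repo: https://github.com/psf/black
    rev: 23.9.1  # illustrative pin
    hooks:
      - id: black
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.0.292  # illustrative pin
    hooks:
      - id: ruff
```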
Since this does rely on PostgreSQL, it is better to run it as-is using Docker Compose. But you can edit the app to use a SQLite connection string in `./inven_api/common/.env.db`.
```shell
# edit DB connection string
vim ./inven_api/common/.env.db
poetry run uvicorn main:APP
# or run everything in containers instead
docker-compose up --build
```
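The swap between backends comes down to the SQLAlchemy URL. The exact keys in the project's `.env.db` are not shown here, but the two URL shapes look roughly like this (credentials match the dev container defaults used below):

```python
# Sketch: the two SQLAlchemy URL forms the app could be pointed at.
# The env-file keys and database name are assumptions for illustration.
from sqlalchemy import create_engine, text

# PostgreSQL, matching the dev container's default user/password
pg_url = "postgresql+psycopg2://postgres:postgres@localhost:5432/postgres"

# SQLite alternative that needs no running server
sqlite_url = "sqlite:///./inventory_dev.db"

# Quick smoke test against an in-memory SQLite engine
engine = create_engine("sqlite://")
with engine.connect() as conn:
    print(conn.execute(text("SELECT 1")).scalar())
```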
Instead of using `latest`, which at this time is 16, this project is going to use 15.4. Start the Postgres container for dev with:

```shell
docker run --name postgres-dev -e POSTGRES_PASSWORD=postgres -p 5432:5432 postgres:15.4
```
Then once the container is started, we can use `psql` for any later debugging to test our code executions. We need to specify localhost in this case; otherwise `psql` will try to use the default machine socket, which isn't present since Postgres is running in a container.

```shell
psql -h localhost -U postgres -W
```
It is better to call SQLAlchemy's `Base.metadata.create_all` from a separate script rather than tying it to server startup.
```shell
python3 create_database.py
```
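A `create_database.py` following that pattern might look roughly like this; the `Product` model is a hypothetical stand-in, since the real tables live in the application package:

```python
# Sketch of a standalone table-creation script. The model below is
# illustrative only; in the real project you would import Base (and
# the models registered on it) from the application package.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Product(Base):
    """Hypothetical model standing in for the project's real tables."""

    __tablename__ = "products"

    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)


def create_all_tables(url: str) -> None:
    """Create every table registered on Base.metadata."""
    engine = create_engine(url)
    Base.metadata.create_all(engine)


if __name__ == "__main__":
    # Point this at the same DB the API uses; SQLite shown for illustration
    create_all_tables("sqlite:///./inventory_dev.db")
```

Keeping this out of server startup means the API never races itself to create tables when multiple workers boot, and schema creation becomes an explicit, auditable step.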