This is the backend API for Ashes.live, a fan-developed deckbuilder and community site for the card game Ashes Reborn.
You must install the following to run the Ashes.live API locally:
- Docker
- Docker Compose (included in Docker Desktop on Windows and Mac)
- Make
That's it! For local development, all other code is executed in Docker via Make using the standard 3 Musketeers pattern.
Please note: in order to run Docker Desktop on Windows you will need a recent copy of Windows 10 with WSL 2 enabled. Because WSL 2 runs faster when files live on the Linux filesystem, you will probably want to clone this repo into your Linux file system, install `make` under your Linux distro (if necessary), and then execute your `make` commands from the WSL command line (accessible via `wsl` in PowerShell, or by opening the Linux terminal directly).
This means that on Windows you are typically:
- Windows: Installing and running Docker Desktop
- Windows: Using Visual Studio Code or PyCharm to edit files stored in the WSL file system (see below)
- Linux/WSL: Running `make` commands in a WSL command line instead of standard Windows cmd or PowerShell
- Linux/WSL: Performing git actions in WSL (no need to install git in Windows for this project)
However, if for whatever reason you do want to install `make` on Windows, this is an easy way:
- Install the Chocolatey package manager
- Run `choco install make` in an elevated command prompt
After installing the dependencies above:
- Create a copy of `.env.example` named `.env` in your root directory
- At minimum, update `POSTGRES_PASSWORD` and `SECRET_KEY` in `.env` (you can update other values if you wish; they aren't required to run locally, though). One way to generate a `SECRET_KEY` value is sketched below.
- Run `make` from the root project directory
This will build your main Docker container and display the available commands you can execute with `make`.
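If you need a value for `SECRET_KEY`, any long random string should work (this guide assumes no specific format is required; check `.env.example` for any guidance it provides). One quick way to generate such a value with Python:

```python
# Generates a random hex string suitable for use as a secret value.
# Assumption: SECRET_KEY only needs to be long and unpredictable.
import secrets

print(secrets.token_hex(32))
```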
Now that you have a functional API stack, you can run `make data` to create some example testing data in your database (note: this may fail if you have never run `make run` or `make db` prior to `make data`, because the Postgres database must be initialized first).
Please note: if you do not use the example data, you will need to install the `pgcrypto` extension before running any migrations (via the SQL `create extension pgcrypto;`).
At this point, you can execute `make up` to start a local development server and view your site's API documentation at http://localhost:8000.
From within the API docs, you can query the API directly and inspect its output. If you need to authenticate, use the email [email protected] as the username with the password `changeme` to log in as IsaacBot#30000. You must not make your API public without changing this password.
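If you would rather poke at the API from code than through the docs UI, here is a minimal sketch (it assumes you have `httpx` installed locally; `/health-check` is the same path referenced in the deployment notes below, and the generated docs remain the authoritative list of endpoints):

```python
# Minimal sketch of querying the local API from Python.
# Assumes httpx is installed; consult the generated docs at
# http://localhost:8000 for the full list of endpoints and auth details.
import httpx

BASE_URL = "http://localhost:8000"

response = httpx.get(f"{BASE_URL}/health-check")
print(response.status_code, response.text)
```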
If you are running a local development server to work on the front-end application, you're done!
If you wish to contribute to the API, read on!
You can use Visual Studio Code to develop directly within the Docker container, allowing you direct access to the Python environment (which means linting, access to Python tools, working code analysis for free, and bash shell access without needing to run a make command). To do so:
- Install Visual Studio Code, if you haven't already
- Install the Remote Development extension pack
- Outside VSCode, in your favored command line, execute `make run` to launch the API container
- Inside VSCode, use the Remote Explorer in the left sidebar to attach to the running API container (likely named `asheslive:dev`). You can find explicit instructions for this in the Visual Studio Code documentation
- If this is your first time attaching, open the Command Palette and type "container", then select "Remote-Containers: Open Container Configuration", replace the contents of the file with the following, save, and then close the window and re-attach to the container:
```json
{
"workspaceFolder": "/code",
"settings": {
"terminal.integrated.shell.linux": "/bin/bash",
"python.pythonPath": "/usr/local/bin/python3.8",
"python.linting.pylintEnabled": true,
"python.linting.enabled": true,
"editor.formatOnSave": true,
"python.formatting.provider": "black",
"editor.wordWrapColumn": 88
},
"remoteUser": "root",
"extensions": [
"EditorConfig.EditorConfig",
"ms-python.python"
]
}
```
You will need to start the API prior to launching VSCode in order to attach to it automatically. (I am looking into ways to improve this workflow, but short-term this is the easiest way to get things working consistently without requiring rebuilding the API with every Poetry change.)
Please note: you must run your `make` commands in an external shell! The VSCode Terminal in your attached container window will provide you access to the equivalent of `make shell`, but running the standard `make` commands there will result in Docker-in-Docker, which is not desirable in this instance.
You can use PyCharm to develop directly within the Docker container, allowing you access to the Python environment (which means linting, access to Python tools, etc.). To do so:
- Install PyCharm, if you haven't already
- In your favorite Terminal, run `make run` to ensure the local stack is running
- Open PyCharm's Settings (on Windows) or Preferences (on macOS)
- Under Project -> Python Interpreter, click the gear icon by the Python Interpreter dropdown and choose "Add..."
- Select "Docker Compose" as the type in the left sidebar
- Select `api` under the "Service" dropdown
- Apply your changes and close the settings
You will now have auto-completion, automatic imports, and code navigation capabilities in PyCharm. To enable local debugging:
- In the upper right of the main window, click "Add Configuration..."
- Click the "+" button and choose "Python" as the template
- Name your configuration whatever you like (e.g. `Local`)
- Select "Script path", switch it to "Module name", then enter `uvicorn` as the "Module name"
- Enter `api.main:app --reload --host 0.0.0.0 --port 8000` as the "Parameters"
- Choose the Python Interpreter you set up in the previous steps
- Apply your changes
- In your favorite Terminal, exit the running local stack (if it is still running)
- You can now launch a local stack (or debug a local stack) with the buttons in the upper right corner of the main window (the stack should auto-reload as you save files)
This project is configured to use `isort` and `black` for import and code formatting, respectively. You can trigger formatting across the full project using `make format`, or you can set up automatic formatting on a per-file basis within PyCharm:
- Open PyCharm's Settings (on Windows) or Preferences (on macOS)
- Under Tools -> File Watchers, click the "+" button and choose the "custom" template
- Name your File Watcher whatever you like (e.g. "isort & black")
- Configure the following settings:
  - File type: `Python`
  - Scope: `Project Files`
  - Program: `make` (macOS/Linux) or `wsl` (Windows)
  - Arguments: `format FILEPATH=$FilePathRelativeToProjectRoot$` (macOS/Linux) or `make format FILENAME="$UnixSeparators($FilePathRelativeToProjectRoot$)$"` (Windows)
  - Output paths to refresh: `$FilePath$`
  - Working Directory and Environment Variables -> Working directory: `$ProjectFileDir$`
  - Uncheck Advanced Options -> Auto-save edited files to trigger the watcher
  - Uncheck Advanced Options -> Trigger the watcher on external changes
If automatic formatting is behaving too slowly for your tastes, you can optionally install isort and black in your local environment and configure them that way:
- https://github.com/pycqa/isort/wiki/isort-Plugins
- https://black.readthedocs.io/en/stable/editor_integration.html
The Ashes.live API uses the FastAPI framework to handle view logic, and SQLAlchemy for models and database interaction. Pydantic is used for modeling and validating endpoint input and output. Pytest is used for testing.
The primary entrypoint for the application is `api/main.py`. This file defines the FastAPI app and attaches all site routers. Site modules are organized as follows:
- `api/views`: Route view functions, typically organized by base URL segment. Start here to trace a code path for a given endpoint.
- `api/models`: Data models used to persist to and represent info from the database
- `api/schemas`: Pydantic models used to validate and model endpoint input/output
- `api/tests`: Integration tests (with some unit tests where integration testing is not feasible)
- `api/services`: Functions for performing "business logic"; e.g. creating and modifying models, shared queries that span model relationships, etc.
- `api/utils`: Utility functions for doing a single small thing (placed here because multiple endpoints leverage the function, or to make testing easier)
Services and utility functions are quite similar. Generally speaking, if it's working with simple data, it's a utility. If it's manipulating models, it's probably a service.
You will likely leverage the following files, as well:
- `api/db.py`: Convenience access to SQLAlchemy objects and methods
- `api/depends.py`: View dependencies (e.g. to allow endpoints access to the logged-in user)
- `api/environment.py`: Exports the `settings` object for access to environment settings
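To illustrate how these layers interact, here is a hypothetical sketch of a view written in the style described above; none of the names below come from the actual codebase, which keeps its real schemas, services, and dependencies in the modules listed.

```python
# Hypothetical example only: illustrates how api/views, api/schemas,
# api/services, and api/depends relate. Real code lives in those modules.
from fastapi import APIRouter, Depends
from pydantic import BaseModel

router = APIRouter()


class ExampleOut(BaseModel):
    """Schema-style model (api/schemas): validates and documents endpoint output."""

    id: int
    title: str


def example_dependency() -> bool:
    """Dependency-style callable (api/depends): injected into views via Depends()."""
    return True


def lookup_example(example_id: int) -> dict:
    """Service-style function (api/services): would normally work with SQLAlchemy models."""
    return {"id": example_id, "title": "Placeholder"}


@router.get("/examples/{example_id}", response_model=ExampleOut)
def get_example(example_id: int, enabled: bool = Depends(example_dependency)):
    """View function (api/views): FastAPI validates the return value against ExampleOut."""
    return lookup_example(example_id)
```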
I am shooting to maintain 100% code coverage. When you submit a PR, I will expect you to include tests that fully cover your code. You can view line-by-line coverage information by executing `make test` and then loading `htmlcov/index.html` into your favorite browser.
Note that full code coverage simply means the tests must exercise all possible logic paths in your code. However, if you check the `api/tests` folder you will find that most existing tests are integration tests; they set up a scenario, query a single endpoint, and check that the status code is correct (typically no other information is verified). Tests do not need to exhaustively cover every eventuality; they simply need to ensure that all code paths are functional and appear to be working as expected.
Testing performs queries against an actual database, and every individual test starts with an empty slate (there is no pre-existing data, and data does not persist between tests).
In some instances, you may need to write unit tests instead (for instance, user badge generation logic does this). This will typically come up when you need to verify error handling within a service or utility function for failure states that are not possible to trigger externally.
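As a rough sketch of that integration-test style (the real tests in `api/tests` rely on the project's own fixtures and database setup, so treat this as illustrative only):

```python
# Illustrative only: set up a scenario, query one endpoint, assert on the status
# code. Real tests in api/tests use the project's shared fixtures and test database.
from fastapi.testclient import TestClient

from api.main import app

client = TestClient(app)


def test_health_check():
    response = client.get("/health-check")
    assert response.status_code == 200
```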
Migrations are handled by Alembic. To create a new migration:
- Add or edit models or properties in `api/models`. If you add a new model class, make sure to hoist the class to the root module in `api/models/__init__.py` or else it will not be detected by Alembic!
- Execute `make shell`
- In the shell, run the command `alembic revision --autogenerate -m "Short description here"`
- This will create a new file in `migrations/versions`; verify the contents and remove the "autogenerated" comments.
- You can now exit the Docker shell and run `make migrate` to update your local database!
You can find documentation for Alembic here: https://alembic.sqlalchemy.org/en/latest/
Make sure that your model classes all inherit from `api.db.AlchemyBase`! This is what allows SQLAlchemy and Alembic to map the class to a table definition.
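As a rough sketch (the class, table, and columns below are hypothetical, not taken from the actual codebase), a new model and its hoist might look like this:

```python
# api/models/example_thing.py (hypothetical)
from sqlalchemy import Column, Integer, String

from api.db import AlchemyBase  # inheriting AlchemyBase lets SQLAlchemy/Alembic map the table


class ExampleThing(AlchemyBase):
    __tablename__ = "example_thing"

    id = Column(Integer, primary_key=True)
    title = Column(String(255), nullable=False)
```

```python
# api/models/__init__.py (hypothetical addition)
# Hoist the class to the root module so Alembic's autogenerate can detect it.
from api.models.example_thing import ExampleThing
```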
The Ashes.live API uses Poetry for dependency management. To install a new dependency from outside of the container:
```sh
$ make shell
root@123:/code$ poetry add DEPENDENCY
```
(If you are developing within Visual Studio Code, you can open the built-in terminal and skip the `make shell` command.)
Then commit the changes in your updated `poetry.lock` and `pyproject.toml`. Please see the Poetry docs for other available commands.
You may wish to shut down your container, run `make build`, and relaunch it to ensure that newly added dependencies are available. If you pull down code and stuff starts failing in weird ways, you probably need to run `make build` and `make migrate`.
Please note: `make shell` will log you into the Docker container as the root user! This is unfortunately necessary to allow Poetry to function properly (I haven't found a good way yet to install initial dependencies as a non-root account and have them work, which means the shell has to be root in order to properly calculate the dependency graph).
The underlying Dockerfile pins its tooling to specific release versions. To update these tools, you must update their pinned version in `Dockerfile` and (for Poetry) in `pyproject.toml`, then rebuild your API container using `make build`.
Ashes.live is currently set up for deployment to Render.com. To deploy a copy of the site:
- Create a managed Postgres database
- Create a new Docker service pointing to your Ashes.live GitHub repo with the following settings:
  - Docker Command: `/bin/bash -c "cd /code && /gunicorn.sh"`
  - Health Check Path: `/health-check`
  - Environment Variables: at minimum, you must set the following environment variables (you can set others, if you like; the available keys are in `.env.example`):
    - ENV: `production`
    - POSTGRES_DB
    - POSTGRES_HOST
    - POSTGRES_PASSWORD
    - POSTGRES_USER
    - SECRET_KEY
- Once your Docker service has deployed, you can use the Shell tab to run Alembic migrations, or otherwise populate your database with initial data.
That's it!
Please note that the `.env` file is not populated in your production images. The `.env` file works locally because Docker Compose automatically loads its contents as environment variables, but when running in production mode Pydantic is not capable of reading an `.env` file with the current setup (which is why you must define your environment variables one-by-one in the Render control panel).
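For reference, environment-driven settings in Pydantic typically look something like the sketch below; the real definitions live in `api/environment.py` and may differ (and on Pydantic v2, `BaseSettings` has moved to the separate `pydantic-settings` package):

```python
# Illustrative only: the project's real settings object lives in api/environment.py.
# Each field is read from the matching environment variable (ENV, POSTGRES_DB, ...),
# which is why production deploys must define them in the Render control panel.
from pydantic import BaseSettings  # Pydantic v1; use pydantic_settings.BaseSettings on v2


class Settings(BaseSettings):
    env: str = "development"
    postgres_db: str = ""
    postgres_host: str = ""
    postgres_password: str = ""
    postgres_user: str = ""
    secret_key: str = ""


settings = Settings()
```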